Data orchestration, combined with deep learning process optimization, presents fabs with a largely untapped opportunity for next-level production. In the semiconductor industry, technological complexity and ballooning expenditures on capital equipment, operations, and materials are driving the adoption of AI and machine learning to unlock efficiencies across the entire value chain.

A 2021 McKinsey report investigated the value-creation potential of this increasingly prevalent technology for chipmakers, predicting that AI could contribute up to $40 billion annually within the next two years. Manufacturing is the most significant cost driver for the semiconductor industry, and McKinsey's research, unsurprisingly, identified manufacturing as the link in the semiconductor value chain that stands to realize the highest gains from advanced machine learning solutions. Recent AI endeavors in semiconductor fabrication are beginning to bear these findings out.

THE DEEP LEARNING IMPERATIVE

Engineers classifying defects with legacy computer vision now contend with the unprecedented transistor density and functional complexity of next-generation semiconductor devices. Such complexity renders this conventional brand of machine learning inadequate, and state-of-the-art AI is fast becoming essential for newer fabs to limit yield loss effectively. A recent article by industry veteran and engineer Anne Meixner in Semiconductor Engineering, for example, examines the current uptake and successes of advanced deep learning techniques. She points to the adoption of AI in fabs for defect classification and wafer disposition as a response to the limitations of Automated Optical Inspection (AOI).

And yet, much of the potential of AI to improve yield, reduce the Cost of Goods Sold (COGS), and shorten time to market for semiconductor foundries remains unexplored. A hesitancy to investigate new technological territory in this space might seem surprising, especially given that industrial data is the fuel for AI-driven manufacturing process optimization. After all, semiconductor foundries are among the most data-centric of all production environments.

Consider also that the convergence of Information Technology (IT) and Operational Technology (OT) presents novel opportunities to utilize this data. Used correctly, the combination unlocks previously hidden production value. Nevertheless, leveraging vast quantities of data with deep learning still poses significant challenges for any precision manufacturing operation. Proper retrieval, storage, traceability, selection, and contextualization are critical for AI deployments to drive production efficiencies.

DATA IS KEY

In particular, silicon industry experts readily acknowledge that the terabytes generated daily can be as much a source of torment for tool operators as they are a boon. Many thousands of production process signals typically emanate from hundreds of fab tools. In another of Meixner's recent investigations, 'Too Much Fab And Test Data, Low Utilization,' she summarizes the big data paradox that the semiconductor industry faces:

“Semiconductor manufacturing IT professionals are witnessing a sudden explosion of data across all product sectors. The industry is grappling with the management of that data, and at the same time pushing to collect more data because it could be useful.” 

DataProphet conducted its own research into the potential benefits of AI deployments for the chip-making process. Susan Wilkerson and Markus Keil, C-level semiconductor industry veterans from North America and Europe, respectively, echo the notion that much scope remains for deeper integration of data-driven solutions. Wilkerson, for example, points to the potential of mature data mining to improve deposition and etching processes in semiconductor foundries, producing better uniformity and more in-spec die towards the edge of the wafer:

“The industry seems like it’s just at the beginning of properly utilizing data to improve overall yield. From my experience, yield is often only tracked with Statistical Process Control Charts (SPCs). A lot of fabs could really benefit from seeing patterns in the data that would enable adjustments before chips go off-spec, thus preventing scrapped wafers.”
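The kind of early pattern Wilkerson describes can be made concrete with a simple run rule. The Python sketch below is a minimal illustration only, with invented etch-depth readings and a hypothetical centerline; it applies the classic Western Electric convention of flagging eight consecutive points on one side of the centerline, a drift signal that fires before any single reading breaches a control limit.

```python
import numpy as np

def drift_alarm(measurements, centerline, run_length=8):
    """Flag a sustained drift: `run_length` consecutive points on one
    side of the centerline, a classic Western Electric run rule that
    fires before any single point breaches a control limit."""
    signs = np.sign(np.asarray(measurements) - centerline)
    run, prev = 0, 0
    for i, s in enumerate(signs):
        run = run + 1 if s == prev and s != 0 else 1
        prev = s
        if run >= run_length:
            return i  # index where the drift becomes actionable
    return None

# Hypothetical etch-depth readings drifting upward but still in spec.
readings = [100.1, 99.9, 100.2, 100.3, 100.4, 100.3, 100.5, 100.6,
            100.4, 100.7, 100.8]
print(drift_alarm(readings, centerline=100.0))  # -> 9
```

In practice, fabs layer several such rules onto each chart, which is precisely why monitoring thousands of charts by hand does not scale.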

AI READINESS

Process and quality teams need role-based access to standardized and centralized plant data. This data needs traceability and human-interpretable context, based on targeted KPIs at the tool, segment, or plant level. All of this is readily achievable with existing data and the IIoT technologies available in the market today.

AI readiness is predicated on robust, adaptable data infrastructure. The vendor should build the IIoT platform from the ground up to be scalable, secure, and fault-tolerant, and the platform should ingest, encrypt, and centralize industrial data from thousands of devices across multiple plants. Ideally, live dashboards let users view and configure all production information relevant to them, and the web interface supports intuitive collaboration and coordination among teams.

However, before extracting maximum value from AI, manufacturers face a further hurdle: transitioning from data-derived compliance and control to continuous, AI-driven optimization. Achieving this optimization in fabs means including context in the data. This contextualization typically manifests as the attachment of quality, metrology, and process data to the material flow, whether at the die, wafer, or cassette level.
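As a minimal sketch of what such contextualization can look like in code, the Python dataclasses below attach process, metrology, and yield data to a wafer as it moves through the line. The field names and schema are invented for illustration and do not represent any standard or vendor format.

```python
from dataclasses import dataclass, field

@dataclass
class ProcessStep:
    tool_id: str
    recipe: str
    setpoints: dict        # e.g. {"rf_power_w": 450, "pressure_mtorr": 35}
    sensor_summary: dict   # aggregated trace statistics for the step

@dataclass
class Wafer:
    wafer_id: str
    cassette_id: str
    steps: list = field(default_factory=list)      # ordered process history
    metrology: dict = field(default_factory=dict)  # e.g. film thickness maps
    die_yield: dict = field(default_factory=dict)  # per-die pass/fail results

# Attaching process context as the wafer moves through the line:
w = Wafer(wafer_id="W-0042", cassette_id="C-07")
w.steps.append(ProcessStep("ETCH-03", "oxide_etch_v2",
                           {"rf_power_w": 450}, {"endpoint_s": 41.2}))
w.metrology["post_etch_cd_nm"] = 32.1
```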

Data orchestration with traceability and context positions manufacturers to utilize deep learning holistically. It is the gateway to AI-driven solutions that draw on data to unlock consistent production value.

THE DIFFICULTY OF LEVERAGING DATA FOR BETTER YIELD

Yield is crucial in fabs: any gains in learning cycles or throughput, both pre- and post-ramp-up, are financially impactful. The bottom-line significance of yield optimization will be especially obvious to fab professionals. Integration Engineers, Engineering Managers, and executives are acutely aware of the potential value of each in-spec die produced per wafer. A critical question for semiconductor foundries, then, is: how can they better utilize data to improve KPIs?

In any complex manufacturing process, an overwhelming preponderance of data has become standard, yet monitoring it all to glean patterns and trends across multiple charts is time-consuming and unwieldy. An Engineering Manager we spoke with confirmed this, acknowledging that sifting through the mass of data fabs currently produce to isolate yield impacts strains human expertise. A single piece of equipment can have upwards of 10,000 sensors generating data at varying frequencies, and numerous wafers move through hundreds of process steps before chip packaging.
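To illustrate one reason varying sensor frequencies strain analysis, the pandas sketch below (sensor names and values are invented stand-ins) aligns two traces sampled at different rates onto a common time base, a prerequisite for any multivariate model of the process.

```python
import pandas as pd

# Hypothetical traces from two sensors sampled at different rates.
idx_fast = pd.date_range("2024-01-01", periods=600, freq="100ms")
idx_slow = pd.date_range("2024-01-01", periods=60, freq="1s")
rf_power = pd.Series(450.0, index=idx_fast, name="rf_power_w")
pressure = pd.Series(35.0, index=idx_slow, name="pressure_mtorr")

# Resample both onto a common 1-second grid before modelling.
aligned = pd.concat([
    rf_power.resample("1s").mean(),
    pressure.resample("1s").mean(),
], axis=1)
print(aligned.head())
```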

We also learned that in semiconductor foundries, a yield management team often picks up a wafer defect months after the event. Teams with divergent expertise must then align around the origin of the problem. Models are built based on historical events and the physics of the devices, and analysis is conducted to determine whether the data supports or refutes the model. In the absence of data, tests interrogate the model's validity.

This time lag means that despite sophisticated and abundant data, diagnosis of root causes in fabs often happens as a reaction to a non-quality event, by which point the event has already incurred a cost. What is more, the severity of an undetected issue occurring early in a process often increases as it travels downstream.

This overdependence on a wide set of experts makes it a struggle to determine which actions are necessary to meaningfully and timeously impact chip production. The implication is that there is now simply too much data in fabs for traditional statistical process control techniques to drive manufacturing efficiency to the next level.

Complex manufacturing processes are inherently multivariate. Optimizing them requires setpoint adjustments that positively reinforce one another when applied, simultaneously establishing self-sustaining plant states. Legacy analytics techniques have reached an upper limit when it comes to setting new production benchmarks. Classical predictive AI tends to improve manufacturing performance over time, but it is not designed to maximize throughput holistically and ahead of production anomalies.
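A toy example makes the multivariate point concrete. In the sketch below, the yield surface and its coefficients are invented purely for illustration; an interaction term between two setpoints means that tuning one variable at a time lands on a different "optimum" than optimizing both jointly.

```python
from scipy.optimize import minimize

def neg_yield(x):
    """Toy concave yield surface with an interaction term between two
    setpoints. All coefficients are invented for illustration only."""
    t, p = x  # e.g. temperature and pressure, in arbitrary units
    return -(100 - (t - 2)**2 - (p - 3)**2 - 1.5 * (t - 2) * (p - 3))

# Tuning temperature alone while pressure sits off-nominal at 4.0
# lands on a different "best" temperature than the true joint optimum.
one_at_a_time = minimize(lambda t: neg_yield([t[0], 4.0]), x0=[0.0])
joint = minimize(neg_yield, x0=[0.0, 0.0])

print(one_at_a_time.x)  # temperature ~1.25, conditioned on bad pressure
print(joint.x)          # ~(2.0, 3.0): setpoints that reinforce each other
```

Coordinate-wise tuning conditioned on an off-nominal pressure recommends the wrong temperature; only the joint search recovers setpoints that reinforce one another.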

PRESCRIPTIVE AI WITH HUMANS-IN-THE-LOOP

The optimization of a complex process is guided not merely by data about current or imminent failures. It is achieved through a systemic apprehension of all interdependent variables, including those that have historically produced the best results.

AI-as-a-Service is particularly well suited to addressing the data management and yield-limiting challenges discussed above. A modular approach with humans-in-the-loop can seamlessly combine an adaptive IIoT platform with a proactive AI-driven solution. Provided it includes ongoing and complete post-installation support, the integration of additional data sources, and full model maintenance, AI-as-a-Service eliminates the risks of adopting AI and, in doing so, significantly reduces digital transformation costs.

Unsupervised deep learning serves prescriptive analytics best: it continually and pre-emptively determines the adjustable variables most likely to achieve an optimal production run. Semiconductor device makers can train such data-driven solutions on the historical and live data in their fabs and, in this way, discover the complex relationships between process variables and quality metrics. For example, a deep learning algorithm can investigate how the parameters of the lithography and etch & strip processes interact with quality variables such as yield, throughput, and metrology outputs.
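As one common unsupervised technique (the article does not specify any particular architecture), a small autoencoder can learn the joint structure of normalized process and quality data; runs that reconstruct poorly deviate from that learned structure and merit investigation. The PyTorch sketch below uses an invented feature count and random stand-in data.

```python
import torch
import torch.nn as nn

# A minimal autoencoder over process parameters (e.g. lithography and
# etch setpoints plus metrology outputs). The feature layout here is
# an invented placeholder, not a real fab schema.
N_FEATURES = 24  # hypothetical: concatenated setpoints + quality metrics

model = nn.Sequential(
    nn.Linear(N_FEATURES, 8), nn.ReLU(),
    nn.Linear(8, 3),                      # compressed latent state
    nn.Linear(3, 8), nn.ReLU(),
    nn.Linear(8, N_FEATURES),
)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

x = torch.randn(512, N_FEATURES)  # stand-in for normalized run data
for _ in range(200):
    opt.zero_grad()
    loss = loss_fn(model(x), x)
    loss.backward()
    opt.step()

# Runs that reconstruct poorly deviate from the learned joint structure
# of the process, a starting point for root-cause investigation.
recon_error = ((model(x) - x) ** 2).mean(dim=1)
```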

DEEP LEARNING PROCESS OPTIMIZATION

How does unsupervised deep learning for process optimization in fabs work? These state-of-the-art AI models are holistic solutions that work systemically to automate root-cause analysis. The algorithm discovers “Best of Best” (BoB) batches from a contiguous period, a process of discovery that ensures reproducibility in practice by separating the conditions that lead to poor-quality wafers from those that lead to high-quality ones.

Using the relationships it has learned about the historical BoB region, the model can reliably analyze current process and quality data. It then automatically delivers prescriptions to operators as a prioritized set of high-impact parametric adjustments. These adjustments move production towards the BoB state as quickly as possible without destabilizing or interrupting the current run.
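In highly simplified form, BoB discovery and prescription might look like the pandas sketch below. The column names, the top-quartile cutoff, and the magnitude-based ranking are all assumptions for illustration, not DataProphet's actual method.

```python
import pandas as pd

# Invented historical run data: two setpoints and a yield outcome.
runs = pd.DataFrame({
    "rf_power_w":     [440, 455, 450, 430, 460],
    "pressure_mtorr": [34, 36, 35, 33, 37],
    "yield_pct":      [91.0, 96.5, 95.8, 88.2, 96.1],
})

# "Best of Best": the top quartile of runs by yield.
bob = runs[runs["yield_pct"] >= runs["yield_pct"].quantile(0.75)]
bob_setpoints = bob.drop(columns="yield_pct").mean()

# Prescribe the largest gaps between current setpoints and the BoB
# centroid first (a crude stand-in for impact-based prioritization).
current = pd.Series({"rf_power_w": 438.0, "pressure_mtorr": 33.5})
deltas = (bob_setpoints - current).sort_values(key=abs, ascending=False)
print(deltas)  # prioritized adjustments toward the BoB region
```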

These setpoint prescriptions, which the advanced deep learning model generates, are proven to optimize production to meet and exceed the targeted KPIs.

In sum, unsupervised deep learning means data-driven discovery that continuously evaluates actual production processes and then delivers adaptive guidance. This prescriptive course correction takes the plant from its current mode of operation to its optimal one.

It should be emphasized that effective advanced deep learning not only anticipates yield-limiting events but prevents production loss from occurring in the first place.

DEPLOYING AI FOR ROI

Ideally, an AI-for-Manufacturing deployment realizes a return on investment within six months or less. It begins with a deep dive into current process and quality data systems to produce a clear report on data maturity for a prescriptive analytics solution.

Proper data management is synonymous with AI readiness. Once good data orchestration is established, fabs should insist on AI-as-a-Service with the following provisions: training, fixed performance benchmarking, maintenance, post-optimization remodeling, continual support, weekly updates upon installation, and a successful commissioning test.

Industry feedback strongly suggests that semiconductor device makers can benefit from leveraging their abundant data. AI readiness is especially pertinent given the data-related challenges fabs must now meet to transform digitally. As in other key verticals, if semiconductor foundries target data intelligently, marrying it with AI can pre-emptively maximize yield (i.e., functional dies per wafer). For fabs, displacing disruptive, costly, and lengthy troubleshooting has become a matter of urgency. With advanced deep learning, systemic yield-limiting events can be relegated to mere potentialities.