With three out of four manufacturing companies employing process optimization techniques, one can safely conclude that process optimization is a standard practice in the industry. The differentiating factor is how well the optimization is implemented and whether a company can optimize across multiple processes and large sections of a production plant.

Reducing waste, minimizing rework, and increasing output and quality depend heavily on the skills of manufacturing engineers, who, in turn, are enabled by the available toolset. Companies with better tools and a more experienced workforce will get optimization right and gain an upper hand in lowering production costs and improving output quality. However, human-led efforts in process optimization reach a plateau due to the limitations of classical techniques and the dependence on deep end-to-end system experience. Trying to surpass this plateau with the same approach will only yield diminishing returns.

If, during uneventful times, manufacturers have a window of tolerance for process inefficiencies, in periods of crisis – when revenue dries up and international supply chains collapse – process optimization becomes more than a competitive advantage: it becomes mission-critical. A 2020 McKinsey report states that even after recuperating from the initial COVID-19 fallout, 45% of manufacturers in Asia struggle with sudden material shortages.

With such a disruption, those who cannot improve overall equipment efficiency will not weather the storm. When working with fewer materials and less cash, every dollar saved will make a difference.

For both the short and long term, exceptional process optimization will help manufacturing companies stay afloat.


Up to the turn of the millennium, only expert engineers could identify process optimization opportunities – and only within a single process. Drawing on their experience and knowledge, they would conduct regular inspections to identify any patterns or trends that could lead to poor yields. The engineers’ ability to carry out inspections was limited to narrow time frames and to individual pieces of machinery or unit-process steps.

The deployment of sensors has enabled data gathering and helped refine the optimization process. Rather than relying on inspections, engineers could look at historical data to analyze trends and spot inconsistencies. Equipped with data and some basic processing tools, optimizers could extract more insights than physical inspections could provide. However, this process was still subject to a high error rate. This eventually gave rise to analytics platforms, which collate information and process data more efficiently to produce easy-to-read charts and graphs. There is still the question of how useful these charts are in actually devising actions that improve production processes.

Data is the foundation for informed decision-making, but it can only be put to use when insights from that data are actionable. Spreadsheets and graphs can’t help us understand data when dealing with hundreds of thousands of records, which is why 72% of the data in the manufacturing industry goes unused.

For a more systematic approach to process optimization, we can look at industry-specific methodologies. Plan, Do, Check, Act – or PDCA – is a quality management methodology that is used by manufacturers to continuously roll out process improvements. It describes – at a high level – a set of steps required to identify, implement and verify optimization opportunities. 
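At a high level, the PDCA cycle can be sketched as a simple control loop. The function names, the temperature parameter, and the toy yield function below are purely hypothetical illustrations, not part of any real optimization product:

```python
import random

def pdca_cycle(settings, measure_yield, propose_change, iterations=10):
    """Illustrative PDCA loop: Plan a change, Do (trial) it,
    Check the resulting yield, and Act by keeping or discarding it."""
    best_yield = measure_yield(settings)
    for _ in range(iterations):
        # Plan: propose a candidate adjustment to process settings
        candidate = propose_change(settings)
        # Do + Check: trial the candidate and measure its yield
        candidate_yield = measure_yield(candidate)
        # Act: adopt the change only if it improves the yield metric
        if candidate_yield > best_yield:
            settings, best_yield = candidate, candidate_yield
    return settings, best_yield

# Toy example: yield peaks when a (hypothetical) furnace temperature is 200
measure = lambda s: 100 - (s["temperature"] - 200) ** 2
propose = lambda s: {"temperature": s["temperature"] + random.choice([-5, 5])}
random.seed(0)
settings, best = pdca_cycle({"temperature": 180}, measure, propose, iterations=20)
```

Because the Act step only keeps improvements, the measured yield never regresses across cycles – the same ratchet-like property that makes PDCA suitable for continuous improvement.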

Although the PDCA methodology is more efficient when data, charts, and graphs are available for engineers to leverage, their expertise remains critical throughout all the PDCA steps. If we can manipulate the available data to produce actionable insights rather than just visual representations, we can offload some of the PDCA tasks from engineers to an application capable of handling the 72% of data that currently goes unused. This is where AI in manufacturing can bring considerable benefits. By making use of all the available data, AI solutions can prescribe actionable insights to engineers, supervisors, and operators.


Most references to AI in mainstream publications refer to Artificial General Intelligence – a type of AI that would replicate a human mind and all of its complexities. Researchers are continuously working toward this significant goal, and in that pursuit, we have developed technologies that can be applied today. The most notable one is Machine Learning (ML), a subset of AI, which learns from data – labeled or unlabeled, with or without human supervision – to produce an intelligent output.
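To make the supervised case concrete, here is a deliberately tiny sketch of learning from labeled data: a 1-nearest-neighbor classifier that tags a new sensor reading based on its closest labeled example. The readings and labels are invented for illustration:

```python
def nearest_neighbor_predict(train, labels, point):
    """Toy supervised learning: predict the label of `point` as the
    label of its closest labeled training example (1-nearest-neighbor)."""
    def dist(a, b):
        # Squared Euclidean distance between two feature vectors
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return labels[min(range(len(train)), key=lambda i: dist(train[i], point))]

# Labeled data: sensor readings tagged "ok" or "defect" (hypothetical)
readings = [(1.0, 0.9), (1.1, 1.0), (3.0, 2.9), (3.2, 3.1)]
labels = ["ok", "ok", "defect", "defect"]
print(nearest_neighbor_predict(readings, labels, (1.05, 0.95)))  # → ok
```

The "intelligence" here is simply a pattern generalized from examples; production ML systems apply the same principle with far richer models and far more data.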

When working within a specific set of rules and relevant data, machines can now outperform humans on tasks such as image and object recognition, cybersecurity threat detection, and stock market predictions by a considerable margin. These data-heavy computational tasks are where AI-driven technologies have the upper hand over human capabilities.

Through trial, error, and correction, ML applications can practice on thousands or millions of data points to fine-tune their ability to produce an output with the highest success rate. The manufacturing sector is an ideal space for machine learning solutions to lift optimization beyond the point where human potential maxes out. Whereas a well-oiled optimization process using non-AI tools achieves a 5-6% internal scrap rate and a 10-15% rework rate, a leading AI solution can bring internal scrap as low as 0% and rework down to 8%. Three months after deployment, our AI-led optimization system, DataProphet PRESCRIBE, achieved a rate of 0% scrap.


As the example above of a machine learning solution achieving 0% scrap shows, the potential for near-perfect optimization is there with the right solution. After all, we cannot expect humans to analyze 173,000 records with laser-like precision.

Even if AI can process thousands or millions of data points with high precision, the application must be able to deliver industry-specific benefits. As such, a machine learning solution can only deliver tangible results in the manufacturing industry if it can do the following:

  1. Learn complex relationships between process parameters and plant metrics: A suitable AI solution must be able to identify the dependencies between multiple process parameters directly from the data and have the ability to drive better production KPIs by understanding these relationships.  
  2. Discover the optimum operating paradigm across manufacturing steps: Analyzing the end-to-end manufacturing process to identify improvements all the way from raw materials to finished product.
  3. Provide operators with specific sequences of changes: Upon discovering the optimum operating paradigm, an excellent AI solution must do the heavy lifting and prescribe the most efficient (smallest) collection of step-by-step changes required to achieve operational efficiency.
  4. Prescribe changes to plant control configurations without additional human analysis: The AI solution must be able to suggest improvements solely based on data, without additional human efforts.
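The first and fourth capabilities can be sketched in miniature: fit a model that learns how process parameters relate to a plant metric, then search the learned model for the settings it predicts will minimize scrap, with no human analysis in the loop. Everything below – the parameters, the data, and the simple linear model – is a hypothetical illustration; real solutions use far more expressive models over many more parameters:

```python
import numpy as np

# Synthetic plant data: two process parameters and a scrap-rate metric.
# Assumed (purely illustrative) relationship: scrap rises with deviation
# of temperature from 200 and falls with higher pressure.
rng = np.random.default_rng(42)
temperature = rng.uniform(180, 220, 500)
pressure = rng.uniform(1.0, 2.0, 500)
scrap = 0.05 * np.abs(temperature - 200) - 0.3 * pressure + 1.0

# Learn the parameter-to-metric relationship with ordinary least squares
# over simple engineered features (|T - 200|, pressure, intercept).
X = np.column_stack([np.abs(temperature - 200), pressure,
                     np.ones_like(pressure)])
coef, *_ = np.linalg.lstsq(X, scrap, rcond=None)

# "Prescribe": evaluate the learned model over a grid of candidate
# settings and pick the one predicted to minimize scrap.
t_grid, p_grid = np.meshgrid(np.linspace(180, 220, 41),
                             np.linspace(1.0, 2.0, 11))
features = np.column_stack([np.abs(t_grid.ravel() - 200), p_grid.ravel(),
                            np.ones(t_grid.size)])
predicted_scrap = features @ coef
best = np.argmin(predicted_scrap)
best_temp, best_pressure = t_grid.ravel()[best], p_grid.ravel()[best]
```

On this noiseless toy data the fit recovers the assumed relationship and the prescription lands on the settings that the model predicts give the lowest scrap. The point of the sketch is the pipeline shape – learn relationships from data, then derive a concrete setting change from the learned model – not the particular model class.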

Simply gathering and analyzing data is not sufficient to help manufacturers navigate these turbulent times. While Artificial Intelligence or Machine Learning can be the answer to making sense of some data, a truly useful AI application must be designed in a way that provides tangible benefits to manufacturers.