AI prescriptions that optimize production are fueled by data that factories already own. They are also based on processes that manufacturers know intimately. For high-impact digital transformation, industrial AI specialists initiate and sustain collaborative relationships with subject matter experts (SMEs). These relationships foster mutual learning. Committed collaboration and knowledge sharing empower manufacturers to integrate data-driven decision-making on the shop floor. In the process, manufacturers also develop the digital maturity mindset so critical to flourishing during manufacturing’s great technological reset.

COMPUTATIONAL BUSINESS THINKING—THE NEW MUST-HAVE SKILL FOR INDUSTRIALS

In AI-as-a-Service: Change Management Wisdom For Manufacturing 4.0, we concluded that cultivating manufacturer belief in factory data began with drawing on existing plant wisdom and instilling user agency. We suggested that, in successful smart factory use cases, technology vendors collaborate with functional manufacturing experts to enact AI-led prescriptions that yield measurable results.

However, deeply embedding data-driven decision-making over the long term requires second-order digital understanding.

At this next level of digital maturity, the underlying metrics a data scientist uses must ultimately capture the imagination of factory personnel. These metrics generate insights into the manufacturing process’s key optimizable parameters. They also validate the machine learning models and measure their production impact. Taken together, the metrics are emblematic of an era-defining, commercially applicable paradigm: computational thinking. Applied to business problems, this mode of thinking draws on the pillars of computer science (algorithmic decomposition, abstraction, and pattern recognition) to create new value. Getting on board with an algorithmic mindset is a wise move for anyone whose business is increasingly bound to data-generating physical processes. This is certainly the case for today’s manufacturers.

There are two other good reasons why computational thinking, or more specifically algorithmic business thinking, deserves manufacturers’ deep engagement. Firstly, the mindset shift toward AI-enablement is not a fad; it is a long-term project that will ultimately define all 21st-century industries. Secondly, when it comes to enterprise-wide digital transformation, the clock is ticking.

The readiness is all.

But how can manufacturers get ahead of the data-driven enterprise curve?

CROSS-FERTILISATION BETWEEN TRADITIONAL INDUSTRIAL ENGINEERING AND DATA SCIENCE

Within the broader digital ecosystem, plant engineers, manufacturing executives, managers, and operators must forge their own path. At the level of production, this path harnesses computational thinking to continually optimize processes with state-of-the-art technology. The best manufacturing leaders do this in a spirit of collaboration and continuous learning with AI specialists.

During AI use cases, the data scientists and data engineers on one side, and the manufacturing SMEs on the other, form the two halves of the digital transformation equation. For this reason, both are indispensable to a project’s success. Ideally, a cross-fertilization of their respective expertise takes place:

  1. The manufacturing expert benefits from a grasp of data science sufficient to be convinced that the data-derived prescriptions are not only safe but can realistically deliver a significant KPI benefit, for example, improved cycle time and reduced costs.
  2. The data scientists and data engineers, in turn, are responsible for:
    • knowing enough about the underlying thermal/chemical/metallurgical production process.
    • conveying the relationship between the data and its utilization in a way that is accurate and relatable to the manufacturer.

Let’s look at some scenarios of engineering coming to terms with data science inside the plant.

CONTEXTUALIZING AI PRESCRIPTIONS FOR MANUFACTURERS

Frederick Theron, a chartered engineer and highly experienced AI-for-manufacturing project management office (PMO) lead, emphasizes the importance of contextualization when bringing AI into factories:

“DataProphet’s software generates instructions which, from the operator’s or the engineer’s point of view, may appear to come out of the blue. If they’ve been doing their job for 20 years, they have a very good idea of how to do it. Yet all of a sudden, a set of instructions comes that, from their perspective, has very little context. For this reason, an inherent, and completely justifiable, risk management mindset kicks in until we can comprehensively convey how the AI prescriptions were derived — what they mean, and how they should be implemented.”

Someone whose job it is to address these questions is Nelius Coomans, a data scientist at DataProphet and a chemical engineer. His insights on change management dialogue for digital adoption in manufacturing are based on thousands of hours of experience with customers, both remotely and on-site. Coomans describes a typical learning trajectory along which plant teams develop an AI mindset over time:

“We spent a lot of time with management in person, sometimes once a day or three times a week—sometimes on-site for eight to ten hours at a time. This was necessary in the early days because what we noticed was this pattern. Management would identify a problem, and we would create a solution or a tool for that problem, but then the people that work with that problem on a day-to-day basis wouldn’t adopt the technology. So, we would have to look at the PRESCRIBE front end with the teams, explain the Ranker, and then study the Ranker’s parameters with them and work with plant engineers and operators to strategize how they were going to apply the changes to the control plan—from day to day and week to week.”

ADDRESSING KNOWLEDGE DOMAIN SKEPTICISM TOWARDS DATA SCIENCE AT THE EDGE OF PRODUCTION

Jan Combrink, another highly experienced DataProphet data scientist (as well as a qualified mechatronic and electronic engineer), drills down on how critical it is that plant personnel buy into the underlying validity of data-driven insights into the material flow. Combrink observes that deep learning prescriptions can seem counterintuitive to plant operators:

“An operator’s sense of the process they’re overseeing is technical in the immediate sense; it’s an empirical understanding. And there is a little bit of a gap there between this paradigm and a computational paradigm, which is abstracted out from the process. The plant operator is like—‘Why would I make these changes?’”

Echoing this operator skepticism, Renita Raidoo, another of our data scientists and a trained mechanical engineer, makes the following observation:

“In more traditional manufacturing plants with very traditional engineers and technicians, there is sometimes greater apprehension when implementing AI-based prescriptions. There is a tendency to rely on years of experience and process knowledge without fully considering the benefits that new technology can bring to their operations.”

What is the best way to overcome this resistance?

According to Combrink, plant engineers can be good intermediaries because they find it easier to make the conceptual link between the data-driven prescriptions and the manufacturing process:

“A plant engineer has a theoretical sense—far removed from the actual process. Just a very high-level overview of everything modeled in a fine-grained manner, but abstracted. And I think that when a person has an extensive background in engineering, the mathematics of the optimization problem, if explained, makes sense to them.”

ABSTRACTION CLARIFIES A COMPLEX PHYSICAL PROCESS

On the question of abstraction in computer science, Paul McDonagh-Smith, MIT Sloan senior lecturer in IT and Executive Education, makes this critical distinction:

“We apply ‘Abstraction’ to separate signal from noise so we can focus on what’s important by removing the data and information that distract our focus.”

This principle has practical shop-floor relevance for cultivating a data-driven mindset around a production process. In manufacturing environments, data scientists rigorously abstract the essence of a production process from a mass of data that is beyond the scope of human comprehension, putting it into a human-interpretable form. Operators can then focus on data-driven optimization. As Raidoo points out:

“As a human being, no matter how smart you are, you can’t visualize 300 parameters at once and glean where a process is going.”

During smart factory use cases, plant engineers and process engineers can act as a bridge between industrial AI technology vendors and operators, in terms of:

  1. Vetting the viability of the data modeling as it relates to the physical process.
  2. Making the changes to plant control plans needed to apply the prescriptions.
  3. Encouraging operators to integrate AI prescriptions into their standard operating procedures.

On this second point, it’s important that personnel develop some appreciation of the way AI modeling maps possibilities for continual improvement. AI modeling does this by accounting for the myriad relationships between process parameters. A two-dimensional map of the outcomes plants have previously achieved (on a scale from suboptimal to optimal) gives manufacturers vision: it allows them to perceive at a glance where they are and where they need to be.
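To make this concrete, here is a minimal sketch in Python of how such a two-dimensional map might be built. It uses PCA from scikit-learn as one simple choice of projection; the synthetic data, the 300-parameter count echoing Raidoo’s observation, and the variable names are illustrative assumptions, not a description of DataProphet’s actual method.

```python
# Hypothetical sketch: projecting many process parameters onto a 2D "map"
# of historical outcomes. PCA is one simple choice of projection; the data
# and names are illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(seed=0)

# Stand-in history: 1,000 production runs, 300 process parameters each.
X = rng.normal(size=(1000, 300))
# Stand-in quality score per run (0 = suboptimal, 1 = optimal).
quality = rng.uniform(size=1000)

# Standardize, then project the 300 parameters down to 2 dimensions.
X_2d = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))

# Each historical run now has a position on a 2D map. On real plant
# history, coloring points by quality would show where the best-of-best
# (BOB) runs cluster.
best = X_2d[quality > 0.9]
print("BOB region centroid (map coordinates):", best.mean(axis=0))
```

On real data, the payoff is exactly the at-a-glance vision described above: the current run’s position relative to the BOB cluster shows where the plant is and where it needs to be.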

GETTING TECHNICAL WITH DATA-DRIVEN MANUFACTURING OPTIMIZATION 

However, as Combrink explains, changing everything all at once is not feasible when treating a plant as an optimization problem that AI can solve. Moving the current plant state towards the best-of-best (BOB) state must be done incrementally.

There is a good reason for this: complexity. Firstly, the production impact (e.g. in terms of yield or quality) of modifying an input variable (e.g. a parameter for volume, temperature, pressure, etc.) is not directly proportional to the size of the modification. In other words, the effects are non-linear. Secondly, production outcomes are the result of not one but many parameters interacting with each other during a manufacturing process.
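As a toy illustration of both effects, consider a hypothetical quality function of just two parameters. The same five-degree temperature step changes quality by different amounts depending on the starting point, and by a different amount again at a different pressure; none of the numbers describe a real process.

```python
# Toy illustration (not a real process model): quality responds
# non-linearly to temperature, and temperature interacts with pressure,
# so single-parameter intuition breaks down.
def quality(temp: float, pressure: float) -> float:
    return -((temp - 450) ** 2) / 100 + 0.05 * temp * pressure

print(quality(445, 2.0) - quality(440, 2.0))  # +5 degrees: gain of 1.25
print(quality(450, 2.0) - quality(445, 2.0))  # same +5 step: only 0.75
print(quality(445, 4.0) - quality(440, 4.0))  # same step, higher pressure: 1.75
```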

Non-linear data models such as fully-connected neural networks can capture the dynamic correlations between hundreds of input variables. From here, data scientists can construct a high-dimensional representation of a manufacturing process. Machine learning can then be utilized (see the sketch after this list):

  1. to discover hidden patterns in the process’s real-world behavior, historically and from production run to production run. 
  2. to generate insights from these patterns.
  3. to suggest incremental modifications for continually improving the plant state.
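Such a model might look like the following minimal sketch, which uses scikit-learn’s MLPRegressor as a stand-in for a fully-connected neural network. The synthetic data, feature count, and architecture are illustrative assumptions, not a description of any production system.

```python
# Minimal sketch: a fully-connected neural network learning the mapping
# from hundreds of process parameters to a production outcome.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(seed=1)
X = rng.normal(size=(5000, 300))  # 5,000 runs x 300 process parameters
# Synthetic outcome with non-linear and interacting terms, standing in
# for a real quality or yield measurement.
y = np.tanh(X[:, 0] * X[:, 1]) + 0.1 * X[:, 2] ** 2

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
model.fit(X_train, y_train)

# R^2 on held-out runs indicates how well the learned high-dimensional
# representation captures the process's behavior from run to run.
print("held-out R^2:", model.score(X_test, y_test))
```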

And yet, it’s important to remember that the resulting insights of non-linear modeling and deep learning discovery may not adhere to the rules manufacturing professionals intuitively expect.

Theron confirms this, and also points out that the technical ‘how to get there’ part is ultimately the responsibility of the plant’s operational team:

“You can’t roll up and say, ‘Change all of these things on a Monday morning.’ Some of the changes are subtle, and some of the changes are quite profound—sometimes orders of magnitude. Either way, there needs to be a systematic approach to how they are briefed and how they are communicated so the changes can be implemented incrementally. If you want to change a dial from 3 to 6 you don’t change the dial straight from 3 to 6, you change it from 3 to 3.5, from 4 to 4.25. You take these baby steps, or you’re supposed to take these baby steps, to get to where it is you want to go [i.e., the BOB region] and then slowly nudge the process onto the rails, onto a different track.”
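Theron’s ‘baby steps’ amount to a simple ramp schedule. The sketch below, with hypothetical names, generates the intermediate setpoints a team might step through on the way from a current value to a BOB target; it is an illustration of the idea, not DataProphet’s implementation.

```python
# Hypothetical sketch of the incremental approach: instead of jumping a
# setpoint straight from its current value to the BOB target, generate
# small intermediate steps that nudge the process over several shifts.
def ramp_schedule(current: float, target: float, step: float) -> list[float]:
    steps = []
    value = current
    while abs(target - value) > step:
        value += step if target > value else -step
        steps.append(round(value, 3))
    steps.append(target)
    return steps

# Move a dial from 3 to 6 in increments of 0.5, as in the quote above.
print(ramp_schedule(3.0, 6.0, 0.5))  # [3.5, 4.0, 4.5, 5.0, 5.5, 6.0]
```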

It is logistically important that operational teams see the prescriptions as steps toward a final destination. They must also realize that the AI Ranker surfaces the most important changes for that shift. Prior to this, the prescriptions need to have gone to the process engineer or the plant manager, whose responsibility it is to come up with a feasible operational plan that charts a path towards the optimal targets informed by the data modeling.
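For illustration only (this is not DataProphet’s actual Ranker), a ranking step of this kind might sort candidate parameter changes by their model-predicted KPI impact, so the shift team tackles the highest-value adjustments first. All names and numbers here are invented.

```python
# Illustrative only: how a ranking step might prioritize prescriptions.
# Each candidate change carries a model-predicted KPI impact; the shift
# team works down the list from the highest expected benefit.
prescriptions = [
    {"parameter": "furnace_temp", "change": +4.0, "predicted_kpi_gain": 1.8},
    {"parameter": "line_speed", "change": -0.2, "predicted_kpi_gain": 0.4},
    {"parameter": "coolant_flow", "change": +1.5, "predicted_kpi_gain": 0.9},
]

ranked = sorted(prescriptions, key=lambda p: p["predicted_kpi_gain"], reverse=True)
for i, p in enumerate(ranked, start=1):
    print(f"{i}. {p['parameter']}: {p['change']:+} (expected gain {p['predicted_kpi_gain']})")
```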

FINDING BOB—PART OF THE CONTINGENCY PLAN FOR OPERATIONAL EXCELLENCE 4.0

The contingency plan for an AI-enabled plant includes achieving the BOB targets. Ideally, as Coomans advises, the data-driven prescriptions form part of the day’s strategy meeting. In this context, the entire operational team for the shift can plan how to integrate AI prescriptions over the ensuing production period.

As we established in Conceptual Rebooting For The Data-Driven Continuous Improvement Journey, AI-driven optimization integrates with existing operational best work practices. In the same way, algorithmic business insights for manufacturing process optimization enhance current continuous improvement pillars; they do not displace them.

As with other industries, now is the time for leveraged industrial data to permeate professional communication and decision-making. This doubly applies to stakeholders in the manufacturing value chain, especially at the level of production. OEMs, machine builders, and Tier 1 suppliers are squarely in the eye of a storm fueled by supply chain upheaval, acute environmental challenges, and a rapidly shifting technological paradigm.

Collaborating around production data is a pathway to much-needed commercial agility and resilience for Manufacturing Excellence 4.0. Begin the journey now.