The previous article in our change management series made a case for embedding algorithmic business thinking in a plant’s standard operating procedures. With examples from AI use cases, we illustrated how collaborating around data-driven decision-making solved next-level optimization problems. However, we also concluded that successful AI enablement depends on synthesizing two knowledge domains: traditional industrial engineering and data science.

In this discussion, we take a systems-level view, looking at how the AI-as-a-Service (AIaaS) model can itself drive smart factory change.


Undoubtedly, an organizational culture shift (from the C-suite to the shop floor) needs to take place for the digital transformation of manufacturing. Thomas Meakin, Jeremy Palmer, Valentina Sartori, and Jamie Vickers summarize it well in “Winning with AI is a state of mind”:

“It means collaboration and continuous learning over individual knowledge and experience, with employees seeking out new data, skills, workflows, and technologies for driving ongoing performance improvements … organizations that embrace an AI-enabled mindset are better at meeting the most formidable technical and cultural challenges and making organizational changes so they can reap the full value that the data and technology offer.”

And yet, we should remember that the AI-oriented cultural change exemplified here is a by-product. It likely reflects the integrated learnings of collaborating stakeholders who have witnessed the results of their efforts during AI use cases.


DataProphet’s AIaaS model is guided by a commercial value proposition. The process of achieving this value inherently requires stakeholders (AI specialists, managers, IT systems owners, plant engineers, and operators) to collaborate around an agreed value uplift.

The gateway for this value uplift is factory data. To define a viable use case, DataProphet works with plant personnel to establish data integrity via a data readiness assessment. This typically necessitates a collaborative data discovery and improvement journey that ensures a plant’s data integrity and, by extension, plant teams’ awareness of their data as a value enabler.

And rather than overlaying a pre-ordained technology onto a plant environment (a solution without a problem), the AIaaS approach begins with assessing the data architecture of a factory to determine:

  1. the gaps that need to be filled to ensure a plant is ready to drive value with its data.
  2. whether a sufficient process and quality data pipeline exists for solving an optimization problem.


Data scientist Jan Combrink emphasizes how important it is for a factory to begin with a comprehensive data picture of its operation. He points out that to model a factory, he needs to be able to see how the different sub-processes in production relate to one another:

“Essentially, the view that you want is a complete mapping from process A all the way to process Y that then results in quality Z. The last variable in the chain is the quality outcome; quality is traced back to the last sub-process. And from there on that last sub-process has to be traced back to the previous sub process — all the way back to the input of the system. And if you don’t have sight of all of that for the same time period, then you don’t really have a view of how the entire process works; you can’t really model it. You can model some of the relationships, but that’s not going to allow you to optimize the entire plant.”
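The end-to-end mapping Jan describes can be sketched in a few lines of Python. The stage tables, part IDs, and parameter names below are hypothetical, invented purely for illustration; they are not DataProphet’s actual data model:

```python
# Hypothetical per-stage records keyed by part ID, as they might arrive
# from separate PLC/SCADA/MES exports (illustrative values only).
melting = {"P1": {"furnace_temp_c": 710}, "P2": {"furnace_temp_c": 695},
           "P3": {"furnace_temp_c": 702}}
casting = {"P1": {"fill_time_s": 12.4}, "P2": {"fill_time_s": 14.1}}
quality = {"P1": {"scrap": False}, "P2": {"scrap": True}}

def build_process_chain(*stages):
    """Join stage records on part ID. Only parts observed in every stage
    can be modelled end to end; a gap anywhere breaks the A-to-Z mapping."""
    common = set(stages[0])
    for stage in stages[1:]:
        common &= set(stage)
    return {
        part: {k: v for stage in stages for k, v in stage[part].items()}
        for part in sorted(common)
    }

chain = build_process_chain(melting, casting, quality)
```

Note that part P3 drops out of the joined chain because it has no casting or quality record, which is exactly the kind of broken traceability a data discovery exercise surfaces.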

Depending on the factory context, data is collected in various ways: via Programmable Logic Controllers (PLCs), Supervisory Control and Data Acquisition (SCADA) systems, and Manufacturing Execution Systems/Quality Management Systems (MES/QMS).

As Jan explains, at the start of every project, there is a handover of a plant’s data. In reality, this usually manifests as a disjointed set of spreadsheets and databases. For example, quality measurements of production phenomena arrive that were taken in a lab, written down by hand, and later entered manually. These data sets tend to be collated and handed over to DataProphet’s install teams without context, which is to say without the customer having a full picture of the data’s relevance. Indeed, some data may have been collected for purposes that are not even known to plant personnel: regulatory requirements, for example, or sensor measurements logged for some undefined future use, without a plan:

“What often happens is the client knows about the different data sources but has never really used the database themselves. Therefore, before we can deploy our solutions, we address the current problems with the way a factory has been gathering its data and address these with an agreed optimization goal in mind. The first things we address are whether the amount of time that they’ve recorded data for is too short and whether the process by which they have recorded the data is flawed.”
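The two checks Jan mentions, the length of the recording window and flaws in the recording process, can be illustrated with a minimal sketch. The thresholds (90 days, 24 hours) and the simulated outage are arbitrary placeholders for the example, not DataProphet criteria:

```python
from datetime import datetime, timedelta

def assess_log(timestamps, min_span_days=90, max_gap_hours=24):
    """Flag the two failure modes above: a recording window that is too
    short, and logging gaps that point to a flawed recording process."""
    ts = sorted(timestamps)
    span = ts[-1] - ts[0]
    gaps = [b - a for a, b in zip(ts, ts[1:])]
    return {
        "span_days": span.days,
        "span_ok": span >= timedelta(days=min_span_days),
        "max_gap_hours": max(gaps, default=timedelta(0)) / timedelta(hours=1),
        "gaps_ok": all(g <= timedelta(hours=max_gap_hours) for g in gaps),
    }

# 30 days of hourly readings with a three-day outage in the middle.
logs = [datetime(2023, 1, 1) + timedelta(hours=h) for h in range(720)]
logs = [t for t in logs if not datetime(2023, 1, 10) <= t < datetime(2023, 1, 13)]
report = assess_log(logs)
```

Here the report would fail on both counts: thirty days is well short of the ninety-day window, and the outage shows up as a 73-hour gap in the log.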

Manufacturers can also expedite this collaborative data management process by involving in-house data scientists. In our experience, their input can be significant. For example, they can create automated reports for their management teams on statistical correlations outside of what is captured by our AI models.
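As a rough illustration of such an automated report, the sketch below ranks process parameters by their Pearson correlation with a scrap flag. The parameter names, values, and reporting threshold are all invented for the example:

```python
def pearson(xs, ys):
    """Plain Pearson correlation coefficient (no third-party libraries)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

def correlation_report(columns, target, threshold=0.5):
    """List parameters whose correlation with the target (e.g. a scrap
    flag) exceeds the threshold, strongest relationships first."""
    return sorted(
        ((name, round(pearson(values, target), 3))
         for name, values in columns.items()
         if abs(pearson(values, target)) >= threshold),
        key=lambda item: -abs(item[1]),
    )

# Hypothetical per-part measurements and scrap outcomes (1 = scrapped).
params = {
    "furnace_temp_c": [710, 695, 702, 688, 715],
    "fill_time_s": [12.4, 14.1, 13.0, 14.5, 12.1],
}
scrap = [0, 1, 0, 1, 0]
report = correlation_report(params, scrap)
```

On this toy data, longer fill times correlate strongly with scrap, the kind of statistical lead a management team can hand to process engineers for investigation.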

With a clear picture of their data’s relevance and, potentially, its shortcomings, manufacturers gain clarity in evaluating an IIoT stack according to their unique data requirements and KPI objectives.


To establish data integrity prior to a commissioning test, DataProphet works with plant personnel to lay down five data connectivity building blocks:

  1. Perception
  2. Networking
  3. Storage
  4. Application
  5. Intelligence

At the perception layer, we collaborate to ensure flexible integration of sensors, machines, edge devices (with real-time AI and compute capabilities), and databases; this layer establishes the foundation for our adaptive, AI-ready IIoT platform. At the networking layer, seamless communication is ensured via numerous protocols. At the storage layer, edge-to-cloud (or edge-to-prem) integration provides safety, security, and remote access for an evolving production environment. Full connectivity happens at the upper layer, the application, where an IIoT platform executes its intelligence remotely.

It does so at the human-machine interface (HMI). At the HMI, our connectivity platform enables centralized, real-time reporting and intuitively hierarchical data visualization. It also provides operators with analytics capabilities, alarm system integration, and KPI monitoring. Because our connectivity solution is built with AI in mind, its intelligence layer harnesses the optimization and prescriptive maintenance capabilities of DataProphet PRESCRIBE.

For one manufacturing customer, DataProphet PRESCRIBE was deployed to reduce scrap produced during a light-alloy wheel manufacturing process. For this automotive OEM, the data discovery journey itself added significant value to the plant’s standard operating procedure by revealing critical areas for improvement in data logging across the production line. This was achieved by:

  • refining the data
  • improving logging resolution
  • logging additional parameters
  • implementing a more traceable and reliable method of recording visual scrap.

As indicated above, by the time a successful commissioning test has been completed in a factory, plant teams have, by necessity, also engaged in a KPI-contextualized data orchestration process. Through this process, they have adopted a digital maturity mindset. DataProphet’s AIaaS model reinforces a traceable line of value that runs from the digital transformation of their data, anchored in an underlying physical process, to a targeted enhancement of their production.

The proof is the measurable, sustainable improvements in throughput and quality that operational teams have actioned from data-derived prescriptions.


With a clear optimization objective in mind, manufacturing professionals and AI specialists are critical change agents in the 4IR partnership ecosystem. With buy-in from the C-suite, technology vendors and plant personnel accelerate the path along the digital adoption curve. AIaaS specialists act as industrial technology supersetters; functional manufacturing experts act as validators and pathfinders. Together, they collaborate and consult to forge a data pipeline from the material flow of production via advanced analytics. The result is rolling, digitally derived commercial value.

As we have observed, AIaaS for manufacturing works with manufacturers to centralize their systems so that IT and OT teams can align around smart factory initiatives. In the process, factory personnel become familiar with acting on AI-driven data insights. This empowers them to move from a system where reactive troubleshooting of production anomalies is the norm to one where the insights of deep learning discovery drive next-level continual improvement.


To turn a factory’s data into value, an AI-ready IIoT platform needs to be integrated with a plant’s existing data architecture to extract, transform, store, and leverage that data, from the industrial edge to the HMI. However, the importance of a digital maturity mindset cannot be ignored. Throughout the journey, an AIaaS vendor must collaborate in the training and ongoing support that plant teams need at every step.

Digital transformation of manufacturing starts with defining a business value. Following this, it involves honing the tool of data-driven technology to go after that value and measure the results achieved. Once initial momentum has been established for one use case, the proven solution can be rolled out to other products, sites, and factory fleets. 


However we name it (e.g., 4IR, autonomous manufacturing, or smart manufacturing), the realities of our age, a data-centric industrial inflection point, cannot be ignored. The manufacturers best poised at the nexus between the old way of doing things and next-era production will play a starring role.