Blog

Driving Operational Efficiency with Help from Industrial Generative AI Models

LLMs such as ChatGPT provide smart solutions for data-savvy engineers

Generative AI for industry is arguably the most exciting technology to emerge in decades. By late winter of 2023, OpenAI’s ChatGPT had already broken the record for the fastest-growing user base of any computer application in history. More than 100 million people were using the app weekly when OpenAI announced in early November that it was giving its ChatGPT Plus subscribers the ability to create their own GPTs, and the use cases, including those for industrial applications, keep growing.

This chart, courtesy of Gartner, shows the evolution of generative AI since the first advancements in natural language translation were made in 2010.

What Are GPTs and LLMs?

The rapid advancements in generative AI are the result of research breakthroughs since 2010 in transformers, the “T” in generative pre-trained transformer (GPT). A transformer is a type of neural network architecture that takes an input sequence and turns it into an output sequence. Transformers allow for parallel processing of unlabeled datasets, which means they can be trained quickly with relatively few resources. They are designed to process sequential data, whether that is a sentence in natural language or a collection of observations, such as time-series data.
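As an illustration of this sequence-in, sequence-out behavior, here is a minimal sketch that passes a small batch of time-series windows through a standard transformer encoder. It assumes PyTorch is available, and the layer sizes and data are invented for the example.

    import torch
    import torch.nn as nn

    # A toy batch: 8 windows, each 50 time steps of 16 sensor features.
    x = torch.randn(8, 50, 16)

    # One standard encoder layer; self-attention lets every time step attend to
    # every other step, so the whole window is processed in parallel.
    layer = nn.TransformerEncoderLayer(d_model=16, nhead=4, dim_feedforward=64, batch_first=True)
    encoder = nn.TransformerEncoder(layer, num_layers=2)

    y = encoder(x)      # the output sequence has the same shape as the input
    print(y.shape)      # torch.Size([8, 50, 16])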

The most basic type of neural network is the Multilayer Perceptron (MLP). The figure shows an input layer, multiple hidden layers, an output layer, and the target variable.
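For comparison, an MLP like the one in the figure can be written in a few lines. This is only a sketch, again assuming PyTorch; the layer sizes are arbitrary.

    import torch
    import torch.nn as nn

    # Input layer of 10 features, two hidden layers, one output for the target variable.
    mlp = nn.Sequential(
        nn.Linear(10, 32), nn.ReLU(),   # hidden layer 1
        nn.Linear(32, 16), nn.ReLU(),   # hidden layer 2
        nn.Linear(16, 1),               # output layer
    )

    prediction = mlp(torch.randn(4, 10))   # 4 samples, 10 features each
    print(prediction.shape)                # torch.Size([4, 1])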

The transformers used in generative AI learn the patterns and relationships between the components of a sequence, which means they can reproduce those patterns, in context, at query time. To get the desired result, they are trained on massive amounts of data, which gives them the name large language models (LLMs). Essentially, LLMs are trained on a large share of the written text available at the time of their creation, roughly the publicly available content on the Internet.

Practical Applications for Generative AI for Industry

Because a GPT is driven by natural language, no coding knowledge is required to use it, and it handles language tasks well. This makes a GPT ideal for changing the tone of a text, for example to soften it or make it sound more professional, and for producing usable drafts tailored to the requested style and length.

But the use cases for generative AI are more than just text generation. They include:

  • Discovery: Asking questions in a GPT prompt field to surface answers that might not otherwise have been found.
  • Summarization/simplification: Quickly generating abbreviated versions of lengthy articles and web pages, as well as creating outlines and extracting key points from content (a short sketch of this and the classification item follows this list).
  • Classification: Sorting topics for specific use cases.
  • Code generation: Creating portions of code or generating entire software programs.
  • Product design: Generating and refining new product ideas.
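To make the summarization and classification items concrete, the sketch below sends a short shift report to a general-purpose LLM through the OpenAI Python client. The model name, prompts, and report text are assumptions for illustration, not part of any TrendMiner product.

    from openai import OpenAI

    client = OpenAI()  # reads the OPENAI_API_KEY environment variable

    report = ("Pump P-101 tripped twice during the night shift and "
              "vibration levels stayed high afterwards.")

    # Summarization: ask for a one-sentence abstract of the report.
    summary = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; any chat model works
        messages=[
            {"role": "system", "content": "Summarize the report in one sentence."},
            {"role": "user", "content": report},
        ],
    )
    print(summary.choices[0].message.content)

    # Classification: ask for a single maintenance-priority label.
    label = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Classify the priority as low, medium, or high. Reply with one word."},
            {"role": "user", "content": report},
        ],
    )
    print(label.choices[0].message.content)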

Generative AI for Industrial Operations

This new technology is also useful for helping operational experts make improvements throughout the plant.

Because LLMs are great at generating natural language text, they are also good at structuring both natural and programming languages. They can be used to query a database or to help reason through a problem. In fact, they make a good copilot for engineers and data scientists.

By combining TrendMiner with generative AI, engineers and data scientists can produce snippets of Python code or even complete machine learning models.

At TrendMiner, AI research teams combined advanced industrial analytics software with an LLM to develop TrendMinerGPT. Data scientists, who are highly research-focused when it comes to designing models, can use it to generate all the Python code needed for a machine learning exercise. This dramatically speeds up a time-consuming task and creates models with the right technical functions in place. It also splits up datasets and generates the code for model deployment.
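The code TrendMinerGPT actually produces is not shown here, but the sketch below gives a flavor of the kind of scikit-learn snippet such an assistant might generate: it splits a dataset, trains a model, and saves it for deployment. The file name and columns are hypothetical.

    import pandas as pd
    import joblib
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import mean_absolute_error

    # Hypothetical export: one row per batch, sensor features plus a quality target.
    df = pd.read_csv("batch_history.csv")
    X, y = df.drop(columns="quality"), df["quality"]

    # Split the dataset, train a model, and report a quick error metric.
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
    model = RandomForestRegressor(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)
    print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))

    # Persist the model so it can be deployed behind a scoring endpoint.
    joblib.dump(model, "quality_model.joblib")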

Engineers also can use TrendMinerGPT as a coding copilot. They expect an interface where they can build visualizations, but sometimes they want something specific, tailored to their needs. Instead of learning to code a visualization from scratch, engineers can simply ask TrendMinerGPT to generate the code, including the tag name, and display the desired visualization. That visualization can then be embedded in the engineer’s dashboard and shared with other operational engineers.
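What the copilot returns depends on the request, but a hedged example of generated visualization code might look like the following, assuming the tag’s values have already been exported to a CSV file; the tag name and columns are invented.

    import pandas as pd
    import matplotlib.pyplot as plt

    # Hypothetical export of one tag's time-series values.
    df = pd.read_csv("TI-1042.csv", parse_dates=["timestamp"])

    fig, ax = plt.subplots(figsize=(10, 4))
    ax.plot(df["timestamp"], df["value"], label="TI-1042")  # invented tag name
    ax.set_xlabel("Time")
    ax.set_ylabel("Temperature (°C)")
    ax.set_title("Reactor temperature, last 24 hours")
    ax.legend()
    fig.savefig("TI-1042_trend.png")  # image that can be embedded in a dashboard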

Here, an operational expert has asked TrendMinerGPT to create a dashboard that shows the Mind Blower production line along with a list of all the batches, quality, and maintenance data. The engineer also has asked it to highlight the mash tun’s process parameters and its active monitors.

Building such a dashboard by hand can be a tedious task, but engineers can ask an LLM to do it instead. For example, an operational expert asks TrendMinerGPT to build a dashboard showing a specific production line that lists batch, quality, and maintenance data. They then ask it to highlight the mash tun’s process parameters and its active monitors. If the LLM has access to the correct data, it will generate a dashboard to those specifications.

Hands-free operation for inspection and maintenance is another use for TrendMinerGPT. Here, a field worker has logged an event using voice prompts while continuing an inspection.

Still another use for TrendMinerGPT is in inspection and maintenance. Workers in the field often need both hands for the task in front of them and have to interact with software in a different way. Through voice commands, an inspector could ask the LLM to log an inspection. For example, a worker notices heavy vibrations and loud noises during his rounds. He creates an inspection log note for the point where the anomalies are noticed, and TrendMinerGPT fills in the asset hierarchy details and the location. TrendMiner then saves this information as contextual data. But the AI takes the log note a step further and compares the new contextual data to maintenance situations from the past. It labels the priority based on this knowledge, which a user can confirm later, and it gathers prescriptive information based on successful use cases from the past.
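TrendMiner’s own matching logic is not public, but the comparison step can be sketched with a simple text-similarity search over past log notes; the notes and labels below are invented for illustration.

    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Historical log notes with the priority an expert assigned at the time.
    history = [
        ("Heavy vibration on pump bearing, loud knocking", "high"),
        ("Slight oil seepage at flange, monitored weekly", "low"),
        ("Motor running hot, tripped overload relay", "high"),
    ]
    new_note = "Strong vibration and loud noise near the compressor"

    # Vectorize old and new notes and find the most similar past situation.
    texts = [note for note, _ in history] + [new_note]
    vectors = TfidfVectorizer().fit_transform(texts)
    similarities = cosine_similarity(vectors[-1], vectors[:-1])[0]
    best = similarities.argmax()

    print("Closest past note:", history[best][0])
    print("Suggested priority:", history[best][1])  # confirmed by a user later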

Problem solving and general solutions are yet another use case for TrendMinerGPT. In this case, the LLM has generated a list of ways a user could improve performance. More specific queries lead to more specific results.

TrendMinerGPT could even help its users understand more about TrendMiner. Through the chat feature of the LLM, a chatbot can offer deeper and more tailored support, which leads to more creative use cases. These use cases can then be shared by uploading the knowledge to the LLM. When prompted, TrendMinerGPT can provide the steps of a use case to engineers who encounter a similar situation.

In the future, growing trust in autonomous systems may lead to automation in maintenance and other areas. For now, the best uses of AI for industry are anomaly detection, forecasting, classification of production quality data, and natural language processing for building reports and summaries.
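Of those near-term uses, anomaly detection is the easiest to sketch. The example below runs scikit-learn’s IsolationForest over simulated sensor readings; the data and contamination rate are illustrative assumptions.

    import numpy as np
    from sklearn.ensemble import IsolationForest

    # Simulated sensor readings: mostly normal operation with occasional spikes.
    rng = np.random.default_rng(0)
    readings = rng.normal(loc=70.0, scale=1.5, size=(500, 1))
    readings[::97] += 12.0                  # inject a few anomalies

    # Fit an unsupervised anomaly detector and flag suspicious samples.
    detector = IsolationForest(contamination=0.02, random_state=0)
    flags = detector.fit_predict(readings)  # -1 marks an anomaly
    print("Anomalous samples:", int((flags == -1).sum()))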

Research is ongoing into ways AI models can be chained together to work toward a bigger, singular outcome, where one LLM asks questions of other LLMs trained on different material. That second LLM could then request a specific AI function to classify or forecast another piece of information and feed the result back into a chain of events that eventually leads to an outcome. In theory, these outcomes could mimic human intelligence or even surpass what a human could do.

Is Your Organization Ready for AI for Industry?

While the world is not yet ready for autonomous AI for industry, there are many things industrial companies can do today to ensure they are ready when AI evolves. Embracing change, democratizing data to everyone who needs access, and overcoming organizational and technical challenges are part of preparing for industrial AI. In the next blog, we will explore how data-driven manufacturers have successfully completed their digitalization journeys and transformed into augmented factories.
