
Dassault Systèmes creates Big Context with generative AI

source link: https://diginomica.com/dassault-creates-big-context-generative-ai

The rise of ChatGPT and DALL-E has drawn attention to generative AI’s ability to create content such as text and imagery. Dassault Systèmes is betting that recent innovations in big data and big AI models can automate and enrich context at scale. It is developing AI-generated ontologies to project data from many sources onto virtual twins to enhance the context for front-line workers and executives.

Big Context helps experts in cost analysis, procurement, or engineering pull data from across the enterprise to make sense of the problem in front of them. The AI-generated ontologies can help cost analysis teams identify how price spikes affect parts, procurement teams identify suitable replacement parts, and engineering teams set up simulations to ensure performance.

Renault is using this approach to prioritize design changes in response to supply chain shocks, while a large cosmetics company is using it to link customer sentiment expressed across the web to variations in product formulations.

Morgan Zimmerman, CEO of Dassault Systèmes’ NETVIBES brand, says the core idea is to use semantic technology to overlay data spread across ERP, PLM, CRM, and third-party sources onto a standard projection. This is similar to how Google Maps overlays a wide variety of data onto a map to improve understanding and decision-making. He explains:

This is the equivalent of using Google Maps to bring data together. Finding a restaurant near a flower shop is a painful question if you get one Excel spreadsheet with the locations of restaurants and another with the locations of flower shops. But this becomes obvious as soon as they get projected on the map.

Now if you have IoT data, quality data, and cost data coming from different systems, they are hard to leverage and correlate with one another. But as soon as I have projected the data onto a virtual twin of the same aircraft, car, or factory, then they are correlated by nature. Because the virtual twin is model-based, you can play with it and do what-if scenarios. We are capturing enormous data sets coming from many systems. We are projecting it on the actual twin of the product, the factory, or the company, and then we are connecting people and process.
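A minimal sketch in Python, using entirely hypothetical systems and field names, illustrates why the correlation comes essentially for free once every record is keyed to the same component of the virtual twin:

```python
# Hypothetical records from three separate systems, each keyed by the same
# component identifier used in the virtual twin (all field names are illustrative).
iot_readings = [{"component_id": "door-panel-01", "vibration_mm_s": 4.2}]
quality_reports = [{"component_id": "door-panel-01", "defect_rate_pct": 0.8}]
cost_records = [{"component_id": "door-panel-01", "unit_cost_eur": 112.50}]

def project_onto_twin(*sources):
    """Merge records from many systems into one view per twin component."""
    twin_view = {}
    for source in sources:
        for record in source:
            component = twin_view.setdefault(record["component_id"], {})
            component.update({k: v for k, v in record.items() if k != "component_id"})
    return twin_view

# Once projected, the IoT, quality, and cost data are "correlated by nature":
# every attribute hangs off the same component of the virtual twin.
print(project_onto_twin(iot_readings, quality_reports, cost_records))
# {'door-panel-01': {'vibration_mm_s': 4.2, 'defect_rate_pct': 0.8, 'unit_cost_eur': 112.5}}
```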

Putting PLM front and center

Along with Siemens and PTC, Dassault is one of the leading PLM vendors and has been busy enhancing ancillary services for improving product design and manufacturing workflows. Over the years, it has built up various tools for contextualizing data related to concrete physical things. Zimmerman argues this provides a better starting point for contextualizing data for various kinds of experts within the company than systems like ERP, which focus on more abstract concepts such as transactions:

In the PLM, we are managing the complexity of the product structure. And we are capable of working with the infinite configuration of a vehicle that does not exist in an ERP. So, the PLM is shadowing a level of model on which we can map the data, which is far more defined than whatever you would find in an ERP.

In 2012, Dassault Systèmes acquired Netvibes, which helps pull data from across the web and internal systems into curated dashboards. In 2020, it acquired Proxem, which was developing AI-powered semantic processing tools to automatically map data from various sources into a common ontology.

The combination of automated data intelligence and AI-driven semantic data interpretation and automation led to the development of what Dassault calls a semantic data lake. Essentially, AI models are taught how to extract data from across the enterprise and third-party sources and then normalize it into a consistent data schema.
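As a rough illustration of that normalization step, the sketch below uses a hand-written mapping; in Dassault’s semantic data lake the source-to-ontology mapping is what the AI models learn, and the system and field names here are invented:

```python
# Simplified sketch of schema normalization: each source system describes the
# same concepts with its own field names, and normalization renames them to
# the shared ontology terms. The mappings below are purely illustrative.
FIELD_MAPPINGS = {
    "erp": {"mat_desc": "material", "unit_price": "cost_per_unit"},
    "plm": {"material_spec": "material", "est_cost": "cost_per_unit"},
}

def normalize(record, source_system):
    """Rename source-specific fields to the shared ontology terms."""
    mapping = FIELD_MAPPINGS[source_system]
    return {mapping.get(field, field): value for field, value in record.items()}

print(normalize({"mat_desc": "AlSi10Mg alloy", "unit_price": 3.40}, "erp"))
print(normalize({"material_spec": "Aluminium casting alloy", "est_cost": 3.55}, "plm"))
# Both records now expose 'material' and 'cost_per_unit', regardless of origin.
```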

Responding to price shocks

Supply chains are facing increasing waves of shocks in response to inflation, energy spikes, and the shift to net zero. These spikes can disproportionately affect the cost of manufacturing. Big Context can bring together data from many different systems to help executives discover pending issues and collaborate with teams from within a single system.

For example, Dassault Systèmes worked with Renault to track how increases in the price of raw materials like aluminium, cobalt, and lithium affected the cost of different cars. Renault wanted to determine how to prioritize new designs and configurations in response to price spikes. If aluminium goes up, building or selecting parts made from other materials may make sense. Similarly, a spike in lithium prices may raise the priority of new battery designs.

This information is buried across the enterprise or needs to be pulled from supplier websites. An internal recipe for molding a part might characterize aluminium content in one way, while a third-party component might only list it in a footnote buried in a product data sheet. Generative AI helps train a custom model to translate these different ways of describing the percentage of aluminium or other materials into a common ontology. Zimmerman notes that a large company commonly needs to translate tens of thousands of ways of describing raw materials into a few dozen core ingredients.
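A toy stand-in for such a model, using invented keyword aliases rather than a trained generative model, shows the shape of the problem: many surface descriptions collapsing onto a handful of canonical materials:

```python
# Illustrative only: a tiny rule-based stand-in for the kind of custom model
# that maps free-text material descriptions onto a small canonical vocabulary.
# The aliases below are invented examples, not Dassault's ontology.
ALIASES = {
    "aluminium": ["aluminium", "aluminum", "alsi", "6061"],
    "lithium": ["lithium", "li-ion", "lifepo"],
    "cobalt": ["cobalt"],
}

def canonical_material(description):
    """Map one of many supplier-specific descriptions to a core ingredient."""
    text = description.lower()
    for material, aliases in ALIASES.items():
        if any(alias in text for alias in aliases):
            return material
    return None  # unresolved descriptions go back to the model or a human reviewer

print(canonical_material("Body panel, 6061-T6 aluminum alloy, 37% recycled"))  # aluminium
print(canonical_material("Cell chemistry: LiFePO4, nominal 3.2 V"))            # lithium
```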

Another issue is that price spikes affect manufacturing costs differently depending on who makes the part. If Renault forges a part like a frame, then price spikes might directly affect the bottom line. In other cases, the material may be part of an assembly purchased from a supplier that has some leeway in deciding how much and when to pass on cost increases. In these instances, it is essential to bring in data about the supplier’s past behavior regarding raw material increases.

Process is key

Cost-monitoring executives can use the tools to keep tabs on changing commodity prices. When they identify a price spike, they can overlay it onto a virtual twin to help coordinate with procurement and simulation teams.

Once cost teams have identified raw materials and parts most susceptible to price spikes, they can overlay these onto a virtual twin and raise an issue ticket. The procurement team can then use information about the part and a database of comparable parts to see if any alternatives offer a significant cost advantage. If so, they flag the option directly onto the virtual twin and then hand it off to the design simulation team to ensure it meets performance quality objectives.
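A hypothetical data structure sketches the kind of information that travels with such an issue as it moves between the teams; none of the field names below come from Dassault’s tools:

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical shape of the issue raised on the virtual twin as it moves from
# the cost team to procurement and on to the design simulation team.
@dataclass
class CostIssue:
    twin_component_id: str    # the affected component of the virtual twin
    raw_material: str         # e.g. "aluminium"
    price_increase_pct: float # spike flagged by the cost-monitoring team
    candidate_parts: List[str] = field(default_factory=list)  # added by procurement
    simulation_passed: Optional[bool] = None                  # set by the simulation team

issue = CostIssue("door-panel-01", "aluminium", 18.0)
issue.candidate_parts.append("door-panel-01-steel-variant")   # procurement's alternative
issue.simulation_passed = True                                 # simulation sign-off
```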

The big deal in all this is that the cost analysis, procurement, and design simulation teams do not need to care where the data came from, since they do not have to go back to the source.

My take

The popular press is smitten with how generative AI can make more data. A better problem for most enterprises is figuring out how to make less data.

ChatGPT can quickly summarize a long document or email to help us understand the essential point. Generative AI ontologies can extend these kinds of summaries to other types of data as well.

One current of thought is that enterprises can dramatically improve their use of data through well-designed, consistent schemas across the enterprise. This approach may have some value in well-defined domains around business data.

But things are a bit fuzzier when it comes to data from physical things, supply chain partners, and third-party data providers. In these cases, AI-powered data translators could go a long way toward contextualizing information appropriately for the problem at hand.

Dassault Systèmes may have a leg up when it comes to combining AI ontologies with virtual and digital twins. But others are exploring the field as well. PLM vendor Siemens is also exploring how ChatGPT integration could enhance collaboration from within its extensive set of tools. Companies such as AtScale, Datameer, and Dremio are building out tools to help companies craft a semantic layer for translating across systems at scale.

