Why Your Digital Twin Needs an Intelligent Data Pipeline
What Is an Intelligent Data Pipeline?
An intelligent data pipeline is a system that transfers, processes, and buffers data, and takes actions based on the data values it carries. Usually built on a microservice-based architecture, intelligent data pipelines rely on total connectivity, lightweight edge deployments, and scalable templated solutions. They are an excellent modern alternative to traditional connectors, and their many advantages are detailed in several other documents on this site.
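To make the definition concrete, here is a minimal sketch of one pipeline stage that buffers incoming readings, processes them, and takes an action when a value meets a condition. All names (`PipelineStage`, `on_alert`, the threshold rule) are illustrative assumptions, not the API of any particular product:

```python
from collections import deque

class PipelineStage:
    """One stage of an intelligent data pipeline: buffer incoming
    readings, process them, and trigger an action on matching values.
    Hypothetical sketch -- names and structure are illustrative only."""

    def __init__(self, process, on_alert, threshold):
        self.buffer = deque(maxlen=1000)  # bounded buffer smooths ingest bursts
        self.process = process            # transformation applied to each reading
        self.on_alert = on_alert          # action taken when the condition is met
        self.threshold = threshold

    def ingest(self, reading):
        """Accept a reading from any upstream source."""
        self.buffer.append(reading)

    def drain(self):
        """Process buffered readings and act on their values."""
        results = []
        while self.buffer:
            value = self.process(self.buffer.popleft())
            if value > self.threshold:
                self.on_alert(value)      # conditional action based on data value
            results.append(value)
        return results

alerts = []
stage = PipelineStage(process=lambda r: r * 2, on_alert=alerts.append, threshold=5)
for r in [1, 2, 4]:
    stage.ingest(r)
print(stage.drain())  # [2, 4, 8]
print(alerts)         # [8]
```

In a real deployment each stage would run as its own microservice, but the shape is the same: buffered ingress, processing, and value-driven actions.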
Why Digital Twins Demand Smarter Data Infrastructure
In the context of a live digital twin, there are several key reasons why an intelligent data pipeline is a better option than traditional connectors or protocols such as MQTT or OPC-UA:
Intelligent Data Pipeline vs. Legacy Data Transfer Methods
| Capability | Intelligent data pipeline | Legacy method (connectors, MQTT, OPC-UA, etc.) |
| --- | --- | --- |
| Point-to-point data transfer | ✅ | ❌ |
| Simultaneous multi-source data gathering | ✅ | ❌ |
| Simultaneous multi-destination data sending | ✅ | ❌ |
| Sensor fusion | ✅ | ❌ |
| Conditional data forwarding | ✅ | ❌ |
| Universal namespace enforcement | ✅ | ⚠️ |
| Templated configurations for easy scaling | ✅ | ❌ |
| EdgeML ready | ✅ | ❌ |
| Compression/feature extraction | ✅ | ❌ |
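The "templated configurations for easy scaling" row above is worth a quick illustration. The sketch below assumes a single pipeline template with placeholders that is stamped out once per asset; the template fields, placeholder names, and `instantiate` helper are hypothetical, not a real configuration format:

```python
# Hypothetical pipeline template: placeholders are filled per deployment.
PIPELINE_TEMPLATE = {
    "sources": ["opcua://{site}/line-{line}/plc"],
    "buffer": {"max_items": 1000},
    "destinations": ["historian", "twin-model"],
    "forward_if": "temperature > {temp_limit}",
}

def instantiate(template, **params):
    """Fill the template's placeholders for one concrete deployment."""
    def fill(value):
        if isinstance(value, str):
            return value.format(**params)
        if isinstance(value, list):
            return [fill(v) for v in value]
        if isinstance(value, dict):
            return {k: fill(v) for k, v in value.items()}
        return value
    return fill(template)

# Scaling to a new production line is one call, not a new hand-built connector.
line_3 = instantiate(PIPELINE_TEMPLATE, site="plant-a", line=3, temp_limit=80)
print(line_3["sources"])     # ['opcua://plant-a/line-3/plc']
print(line_3["forward_if"])  # 'temperature > 80'
```

With a legacy connector, each new source/destination pairing typically means another bespoke integration; with a template, it means another set of parameters.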
How Intelligent Pipelines Power Live Digital Twins
Such a flexible data management infrastructure is essential in a complex, evolving environment. Because the benefits of a live digital twin come from its ability to handle large volumes of data very quickly, the edge ML, sensor fusion, and multi-source data ingress properties of intelligent data pipelines are essential to delivering near-real-time insights. Multi-destination and conditional forwarding then let you build a rich data set and send it directly to your modelling tool of choice, as shown below:

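The ingress-fuse-forward flow described above can be sketched as follows. The fusion rule (simple averaging of sources reporting the same quantity), the destination names, and the alert condition are all assumptions made for illustration:

```python
def fuse(readings):
    """Sensor fusion by simple averaging (assumption: all sources
    report the same physical quantity, e.g. temperature)."""
    return sum(readings.values()) / len(readings)

def forward(readings, destinations, alert_destination, limit):
    """Multi-source ingress -> fusion -> multi-destination egress,
    with conditional forwarding to an alerting destination."""
    fused = {"sources": sorted(readings), "value": fuse(readings)}
    sent = {dest: fused for dest in destinations}  # always-on destinations
    if fused["value"] > limit:                     # conditional forwarding
        sent[alert_destination] = fused
    return sent

# Three temperature sensors feeding one fused record to two destinations.
readings = {"temp-a": 70, "temp-b": 74, "temp-c": 72}
sent = forward(readings, ["twin-model", "historian"], "alerting", limit=80)
print(sorted(sent))           # ['historian', 'twin-model']
print(sent["twin-model"])     # {'sources': ['temp-a', 'temp-b', 'temp-c'], 'value': 72.0}
```

The fused record only reaches the alerting destination when the condition fires, which is exactly the conditional-forwarding behaviour listed in the comparison table.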
See Intelligent Data Pipelines in Action
Want to know how else an intelligent data pipeline can help when building a live digital twin? Check out the full whitepaper here! For other uses of intelligent data pipelines, such as semantic modelling, data management, and compatibility, see our other whitepapers in the resources tab here!
Key Definitions:
Digital twin: A virtual representation of what a complex real-world system was doing
Live digital twin: A virtual representation of what a complex real-world system is doing
Intelligent data pipeline: A microservice-based system that transfers, processes, and buffers data from many different sources to many possible destinations.