Our client had a complex industrial process they wanted to reproduce and analyse in a “digital twin”. Data collection was bottlenecked by the limited bandwidth of the available hardware, and the process generated more data than was required to summarise its state of operation. The client needed a real-time, intelligent compression technique to ensure sufficient data was logged for analysis.
Mind Foundry developed a bespoke intelligent sampling routine that was able to reduce streams of continuous data, sampled at arbitrary frequencies, by an order of magnitude appropriate to the complexity of the underlying signal in the data.
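The details of Mind Foundry's routine are not described here, but the general idea of reducing a stream by an amount that tracks the signal's complexity can be illustrated with a simple deadband compressor: a sample is retained only when it deviates from the last retained value by more than a tolerance, so smooth stretches of signal compress heavily while rapid changes are preserved. This is a minimal sketch, not the client's algorithm; the function name and tolerance parameter are illustrative.

```python
def deadband_compress(samples, epsilon):
    """Adaptive stream reduction (illustrative sketch only).

    Retains a (time, value) sample only when the value deviates from
    the last retained value by more than `epsilon`. Slowly varying
    signals are reduced heavily; complex signals keep more samples.
    """
    kept = []
    last = None
    for t, x in samples:
        if last is None or abs(x - last) > epsilon:
            kept.append((t, x))
            last = x
    return kept

# A near-constant signal with one step change keeps only two samples:
signal = [(i, 0.0 if i < 50 else 5.0) for i in range(100)]
reduced = deadband_compress(signal, epsilon=0.1)
```

Because each incoming sample is compared only against the last retained value, the routine runs in constant memory and constant time per sample, which is the property that makes this style of compression feasible on resource-constrained hardware.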
The technology was to be deployed on an “edge” device: the algorithm had to process a data stream generating several gigabytes per hour using a single core of a single processor running at less than 200 MHz. Nevertheless, the algorithm ran in real time under these severe constraints on computational resources.
The technology enabled the client to develop advanced health monitoring checks for their complex machinery using the newly available data.
Factor of 10 decrease in the size of the data being recorded, made possible by intelligent compression.