
Anomaly Detection in Manufacturing

A South Korean manufacturing conglomerate selected the SQream platform to analyze a petabyte-scale database of manufacturing machine sensor events from its production floor, ingested into 13K tables. The main purpose was to leverage SQream’s fast ingestion and processing capabilities to learn normal patterns and alert on anomalies before malfunctions occurred. SQream brings the power of the data center to the edge: terabytes of data can be ingested at the edge, at scale and with high performance, without the legacy data center infrastructure.

  • 13K tables continually analyzed
  • 17% improvement in Overall Equipment Efficiency
  • 2 petabytes of data analyzed; 2.5 trillion rows
  • 9X faster than the previous platform
Industry Vertical: Manufacturing

 

Why Do Anything?

The previous platform could not process the massive number of events and consolidate the insights for the decision makers.

Why Now?

The South Korean manufacturer was looking for a cost-effective solution that could be integrated easily into their existing ecosystem. They wanted an optimal solution that could grow to 3 PB and work with their existing AI/ML tools. SQream supported the process of monitoring quality (PMQ): sensorize (collect sensor data), discover (create baselines), and alert on time (using ML models). Adopting these methodologies with SQream’s best practices allowed their manufacturing teams to shorten manufacturing time, reduce costs, and reduce malfunction downtime.
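
The case study does not show the underlying logic, but the three stages can be illustrated with a short, hypothetical Python sketch: readings are collected (sensorize), summarized into a baseline (discovery), and compared against that baseline to raise alerts. The sensor values, threshold, and function names below are illustrative assumptions, not the customer's implementation.

```python
import numpy as np

# Hypothetical illustration of the three PMQ stages described above: sensorize
# (collect readings), discovery (build a baseline), and on-time alerting.

def build_baseline(readings: np.ndarray) -> tuple[float, float]:
    """Discovery: summarize normal behavior as a mean and standard deviation."""
    return float(readings.mean()), float(readings.std())

def is_anomalous(value: float, mean: float, std: float, threshold: float = 3.0) -> bool:
    """Alerting: flag readings more than `threshold` sigmas away from the baseline."""
    if std == 0.0:
        return value != mean
    return abs(value - mean) / std > threshold

# Sensorize (simulated here): historical readings from one machine sensor.
history = np.random.normal(loc=72.0, scale=1.5, size=10_000)
mean, std = build_baseline(history)

new_reading = 81.4
if is_anomalous(new_reading, mean, std):
    print(f"ALERT: reading {new_reading} deviates from baseline ({mean:.1f} ± {std:.1f})")
```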

Why SQream?

  • SQream has a proprietary algorithm for chunking data, which significantly reduces cost.
  • SQream can load dozens of terabytes per hour while automatically optimizing and compressing the data.
  • Ingestion volume easily scales up and out.
  • Ability to process petabytes of data across thousands of columns.
  • Rapid ingestion into thousands of columns.
  • Ability to rapidly join many tables containing large datasets.

Fastest time to insight on any size data

Business Challenge

A South Korean manufacturing conglomerate selected the SQream platform to analyze a petabyte-scale database of manufacturing machine sensor events from its production floor, ingested into 13K tables. The main goal was to leverage SQream’s fast ingestion and processing capabilities to learn normal patterns, identify malfunctions as early as possible, and detect deviations from those patterns in order to trigger additional scrutiny and reduce potential issues in supply chain operations.

Business Impact

The advent of smart sensor networks has opened up unprecedented possibilities in the industrial environment, leading to a surge in smart manufacturing and Industry 4.0.

The ever-growing amount of data enabled by such technologies paves the way for Machine Learning methods to extract information that would be impossible to obtain otherwise, even by human experts. The manufacturing team was asked to detect anomalies as early as possible by accumulating and analyzing machine sensor data. Since the manufacturing process was extremely complicated and involved thousands of sensors, the first step was to accumulate the sensor data from the different stages and create a baseline database. Once 1,000 TB of data had been collected, the SQream platform continually processed it and searched for deviations from the baseline.
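
The actual scan logic is not included in the case study; the sketch below, assuming hypothetical sensor_id and reading columns and tiny in-memory pandas frames standing in for the baseline database, illustrates the kind of per-sensor baseline-and-deviation check described here.

```python
import pandas as pd

# Illustrative sketch only: column names and the small in-memory frames are
# assumptions; the real workload spans ~13K tables and thousands of sensors.

# Historical readings used to build the baseline (discovery stage).
history = pd.DataFrame({
    "sensor_id": ["s1"] * 4 + ["s2"] * 4,
    "reading":   [70.1, 70.4, 69.8, 70.2, 12.0, 12.3, 11.9, 12.1],
})
baseline = (
    history.groupby("sensor_id")["reading"]
    .agg(mean="mean", std="std")
    .reset_index()
)

# Fresh readings arriving from the staging area.
fresh = pd.DataFrame({
    "sensor_id": ["s1", "s2"],
    "reading":   [70.3, 19.7],   # s2 is far from its baseline
})

# Score each reading against its per-sensor baseline and flag large deviations.
scored = fresh.merge(baseline, on="sensor_id")
scored["z"] = (scored["reading"] - scored["mean"]) / scored["std"]
alerts = scored[scored["z"].abs() > 3.0]
print(alerts)
```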

SQream Benefits

The South Korean manufacturer was looking for a cost-effective solution that was easy to deploy and could be integrated into their existing ecosystem. They were looking for an optimal solution that could scale up to 3 PB and work with their existing AI tools. They wanted to stay competitive, shorten manufacturing time, and reduce malfunction downtime.

SQream Solution Components
  • SQream at the Edge
  • SQream connectivity supports dozens of internal and external data sources
  • SQream mega loader
  • SQream BI/ML direct queries on data
  • Three nines uptime availability

Architecture Considerations

1. Sensor data is collected in the staging area into 13K tables.

2. Every 15 minutes, the SQream platform gathers the staged data using the COPY command and reads it (see the sketch after this list).

3. SQream compute then runs 200 different queries to aggregate and enrich the data.

4. Data ingestion is 9X faster than on the previous platform.
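
The exact ingestion and aggregation statements are not part of the case study. The following schematic Python sketch illustrates the 15-minute cycle from steps 2 and 3, assuming a generic DB-API connection (such as one opened with SQream's Python connector) and illustrative table names, file path, and COPY syntax rather than the customer's actual schema.

```python
import time

# Schematic sketch of the 15-minute ingest-and-aggregate cycle (steps 2 and 3).
# `conn` is assumed to be a DB-API 2.0 connection to the platform; the table
# names, file path, and COPY syntax below are illustrative assumptions only.

INGEST_SQL = "COPY staging_sensor_events FROM '/staging/sensor_events.csv'"

AGGREGATE_SQL = """
    INSERT INTO sensor_15min_agg
    SELECT sensor_id,
           AVG(reading) AS avg_reading,
           MAX(reading) AS max_reading
    FROM   staging_sensor_events
    GROUP  BY sensor_id
"""

def run_cycle(conn) -> None:
    """One pass: bulk-load the latest staged files, then aggregate and enrich.
    In production roughly 200 such aggregation queries run per cycle."""
    cur = conn.cursor()
    cur.execute(INGEST_SQL)      # step 2: ingest staged sensor files
    cur.execute(AGGREGATE_SQL)   # step 3: one of ~200 aggregation queries
    conn.commit()
    cur.close()

def main(conn) -> None:
    # Repeat the cycle every 15 minutes, as described in step 2.
    while True:
        run_cycle(conn)
        time.sleep(15 * 60)
```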