Intelligence to Manufacturing (I2M)

The I2M Platform is a corporate cloud SaaS for digitization, control, automation and optimization of manufacturing operations.

The Platform integrates IT/OT data sources across the enterprise and applies Artificial Intelligence to emulate the decision-making of operations, production and process experts.

Built on almost 30 years of expertise, it currently manages about USD 2 million of daily production in industries such as automotive, pharma, plastics and steel.


Strategic Operations Research (SOR)

Given a demand/capacity scenario (current or forecast), predicts the most profitable allocation of Production Orders (POs) across interdependent Work Centers.

Potential bottlenecks are detected and priced, along with a selection of quantified options to resolve each one. [...]

Work Centers can be factories within a supply chain, workcells within a factory or workshops within a workcell.
The time horizon depends on the input and is typically based on weekly, monthly, quarterly or yearly forecasts.

The System maintains a Master Production Schedule (MPS) that encompasses several alternative forecast scenarios, providing the most profitable solution for each of them:

  • Detects and predicts bottlenecks, describing their operational/financial impacts and quantifying possible solutions (e.g. improve KPIs, increase shifts, additional hours).

  • Optimizes production mix when demand exceeds capacity, determining the most profitable portfolio by comparing each SKU's margin versus opportunity costs of the required production slots.

  • Production Planning and Control (PPC), a daily iteration of the MPS based on current demand and inventory levels.
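The production-mix logic described above can be sketched as a greedy allocation: when demand exceeds capacity, rank SKUs by margin per production slot and fill capacity in that order. This is an illustrative simplification (names and figures are invented), not the Platform's actual optimizer, which also weighs opportunity costs across interdependent Work Centers.

```python
# Greedy production-mix sketch: allocate limited capacity (slots) to the
# SKUs with the highest margin per slot consumed. Illustrative only.

def optimize_mix(skus, capacity_slots):
    """skus: list of dicts with margin per unit, slots per unit and demand."""
    # Rank SKUs by margin earned per production slot consumed.
    ranked = sorted(skus, key=lambda s: s["margin"] / s["slots_per_unit"], reverse=True)
    plan, remaining = {}, capacity_slots
    for sku in ranked:
        units = min(sku["demand"], remaining // sku["slots_per_unit"])
        if units > 0:
            plan[sku["name"]] = units
            remaining -= units * sku["slots_per_unit"]
    return plan

demand = [
    {"name": "SKU-A", "margin": 12.0, "slots_per_unit": 2, "demand": 40},
    {"name": "SKU-B", "margin": 5.0, "slots_per_unit": 1, "demand": 100},
]
# With 100 slots: SKU-A yields 6.0/slot, SKU-B 5.0/slot, so SKU-A fills first.
print(optimize_mix(demand, 100))  # {'SKU-A': 40, 'SKU-B': 20}
```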

Simulations can also be based on alternate capacity scenarios to predict operational and financial impacts of investments and budget cuts (e.g. improving KPIs, acquiring assets, decreasing work shifts).

KPIs for each Work Center are tracked, analyzed and benchmarked to profile Corporate Capacity, improving the MPS's precision and alerting stakeholders if performance deteriorates or is insufficient to meet current production goals.

Lastly, the module also tracks and profiles inventory levels per SKU to optimize Warehouse Leveling, triggering additional Work Orders (WOs) for PPC based on demand (actual vs. forecast) and each Work Center's capacity/idle slots.
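The Warehouse Leveling trigger above can be sketched as a classic reorder-point check: issue a Work Order when stock cannot cover forecast demand over the replenishment lead time plus safety stock. The policy and figures below are illustrative assumptions, not the Platform's actual rules.

```python
# Warehouse-leveling sketch: trigger an additional Work Order (WO) when
# projected inventory falls below the reorder point. Illustrative only.

def needs_work_order(on_hand, daily_forecast, lead_time_days, safety_stock):
    """Reorder-point check: trigger a WO when on-hand stock cannot cover
    forecast demand over the lead time plus the safety-stock buffer."""
    reorder_point = daily_forecast * lead_time_days + safety_stock
    return on_hand < reorder_point

# 120 units on hand vs. a reorder point of 30*3 + 40 = 130 -> trigger a WO.
print(needs_work_order(on_hand=120, daily_forecast=30, lead_time_days=3, safety_stock=40))  # True
```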

Production Activity Control (PAC)

Controls Work in Progress (WIP) to optimize sequence/flow, detecting and mitigating disruptions in real time via rescheduling/rerouting.

Also provides data management for traceability, KPIs, standardized work instructions and maintenance interaction. [...]

Given short-term (e.g. daily) demand scheduled by PPC and mid/long-term demand forecast planned by MPS, tracks WIP to continuously predict the most profitable sequencing/flow of materials and resources within the shopfloor.

Disruptions and deviations (e.g. bottlenecks, downtime) are detected in real time, and the System's AI autonomously reschedules and reroutes WIP to minimize financial and operational impacts.

Centralizes Data Management for each P.O. (e.g. recipe, BoM, standardized work instructions, setpoint adjustments) and provides detailed Traceability, physically tracking each P.O. on the shopfloor and timestamping operations for analysis:

  • Compare this P.O. to others within the same SKU to uncover potential inefficiencies and noncompliances;

  • Constitute the overall profile of its SKU, which is used by SOR to prioritize production mix according to profitability;

  • Benchmark it against the same operations performed by different assets to identify degradation patterns.

Net Capacity Management tracks operational (e.g. OEE, TEEP, MTBF, MTTR, Net Run Rate) and financial (e.g. materials/energy cost, cost-per-unit) Key Performance Indicators (KPIs).

These KPIs are continuously analyzed to profile net capacity per asset and per SKU, updating the manufacturing model used by SOR to improve prediction accuracy.
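As an illustration of the operational KPIs tracked above, OEE follows the standard definition OEE = Availability × Performance × Quality. The function and input figures below are a minimal sketch, not the Platform's actual calculation pipeline.

```python
# OEE sketch using the standard definition:
# OEE = Availability x Performance x Quality. Figures are illustrative.

def oee(run_time, planned_time, actual_rate, ideal_rate, good_units, total_units):
    availability = run_time / planned_time    # uptime vs. planned production time
    performance = actual_rate / ideal_rate    # actual vs. ideal run rate
    quality = good_units / total_units        # first-pass yield
    return availability * performance * quality

# 7 h of an 8 h shift, running at 90% of ideal rate, with 95% good parts:
value = oee(run_time=7, planned_time=8, actual_rate=90, ideal_rate=100,
            good_units=95, total_units=100)
print(round(value, 3))  # 0.748
```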

Based on this information, Forecast Alarms inform stakeholders if a Work Center's performance deteriorates or if the factory's overall capacity is insufficient to meet its production goals. SOR is capable of predicting and pricing these bottlenecks.

Lastly, users can also issue, log and track operator activity and maintenance interaction.

Process Digitization & Analysis (PDA)

Digitizes manufacturing into a structured dataset that represents the physical process over time, as well as its corporate context.

This Digital Twin can be used to classify specific products (instances) or to understand overall/optimal process behavior (aggregate). [...]

Digital Twin Instance is a structured dataset that describes the physical processes which compose a given P.O., providing detailed traceability (e.g. timestamps, historian, continuous process features, discrete product features, KPIs).

Digital Twin Aggregate is the composition of several DTIs to characterize a given process, asset or SKU. The Inference Engine then clusterizes these DTIs, dividing them into Classes with similar productive behavior.

Once training is complete, the System is able to classify processes in real time: each DTI is indicated as a composition of classes found in the DTA. This greatly improves the statistical relevance of Quality Inspections, allowing teams to select parts that represent distinct production behavior patterns.
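The clustering idea behind the Inference Engine can be sketched with a minimal k-means over a single process feature (e.g. cycle time). This toy is an assumption for illustration; the actual engine operates on high-dimensional DTI feature vectors.

```python
# Sketch of clustering DTIs into classes with similar productive behavior,
# using a minimal 1-D k-means. Illustrative only.

def kmeans_1d(values, k=2, iters=20):
    # Seed centroids by sampling the sorted values at regular intervals.
    centroids = sorted(values)[:: max(1, len(values) // k)][:k]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[idx].append(v)
        # Move each centroid to the mean of its cluster (keep it if empty).
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

cycle_times = [10.1, 10.3, 9.9, 15.2, 15.0, 14.8]  # two distinct behaviors
centroids, clusters = kmeans_1d(cycle_times, k=2)
print(sorted(round(c, 2) for c in centroids))  # two class centers, near 10.1 and 15.0
```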

If the available process control data is sufficient to establish direct correlations between specific classes and undesirable productive behavior (e.g. noncompliances, inefficiency, asset degradation), the System's Predictor can detect them in real time, alerting stakeholders.

Assets also have DTAs, composed of the DTIs of all P.O.s they have produced. These can be benchmarked and tracked over time to identify deterioration patterns, alerting stakeholders and providing data-driven analytics for Predictive Maintenance procedures.

The DTA for each SKU includes KPIs for the overall process, which can be benchmarked to establish best practices (e.g. setpoints, procedures, recipes, BoMs, standardized work instructions).

"The Industrial Digital Thread addresses three significant challenges faced by manufacturers:

(1) Faulty production techniques lead to costly unplanned downtime of industrial assets with root causes that are difficult to determine. The causes of such failures are hidden in manufacturing processes. Careful analysis is required to return assets back to production but field engineers and service teams often lack data and insights needed to troubleshoot the underlying issues. The Digital Thread provides the data and analytics needed to understand root causes.

(2) Hundreds of thousands of parts go into a large, complex product such as an aircraft structure. These parts are provided by multiple supplier facilities spread across the globe. Issues with mechanical fit and quality often result in expensive returns and rework. The Digital Thread maintains visibility across the supply chain.

(3) Product defects could be the result of a fault in design, material, or supply chain. Expensive recalls are regular occurrences across industries. The Digital Thread helps to reduce variability and faults, and to trace the cause when they occur."
Management User


"The foundation [of Industry 4.0] is the availability of all relevant information in real time, by connecting all elements participating in the value chain, combined with the capability to deduce, from the data at any time, the optimal flow in the value chain.

By connecting humans, objects and systems, dynamic real-time optimized and self-organized intercompany value networks are created, which can be optimized according to different criteria - costs, reliability and resource consumption”.

Management User

How Industry 4.0 Solves Production Problems

Industry 4.0 is a convergence of Big Data, Artificial Intelligence (AI) and Manufacturing technologies that enables Systems to automate and optimize critical decision-making at previously impractical scale. [...]

Given the last decades of industrial automation, the concept is hardly new. Scale, however, is drastically different: Machine Learning (ML), AI optimized for Big Data, raises the bar for the inferences and predictions a software appliance can reliably make.

Information and knowledge lead you to profitable decisions, but only specialized Machine Learning can accurately and cost-effectively extract these insights from raw datasets.

Once modeled, trained and validated to be performed by scalable resources (e.g. cloud computation), these closed-loop ML algorithms perform several consecutive cycles per second, 24/7. After each cycle:

  • the System applies analytical outcomes and collects feedback data for the next cycle; and/or

  • someone, somewhere, clicks something.
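The closed-loop cycle described above (collect data, run the model, apply the outcome, feed results back) can be sketched as a skeleton. All function names here are illustrative stand-ins, not the Platform's API.

```python
# Skeleton of one closed-loop ML cycle: collect feedback data, run the model,
# apply the analytical outcome, and carry the updated state into the next cycle.

def run_cycle(state, read_sensors, model, apply_outcome):
    data = read_sensors()                    # collect feedback data
    decision = model(state, data)            # inference / prediction step
    state = apply_outcome(state, decision)   # apply the analytical outcome
    return state                             # next cycle starts from here

# Toy example: count how many cycles raised an alert (reading above a limit).
state = {"alerts": 0}
for reading in [90, 105, 98]:
    state = run_cycle(
        state,
        read_sensors=lambda r=reading: r,
        model=lambda s, d: d > 100,  # "alert" if the reading exceeds the limit
        apply_outcome=lambda s, alert: {"alerts": s["alerts"] + int(alert)},
    )
print(state)  # {'alerts': 1}
```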

Whether the AI relies on traditional automation or Industry 4.0, at some point stakeholders will apply their combined (yet unscalable) expertise to provide feedback on outcomes or use them for decision support.

If the first analysis these professionals would now make can also be modeled, trained and validated, another layer of knowledge work can be automated at scale.

Rather than KPIs and data visualization, Users are presented with the compound insights a process Expert would have drawn from the dataset (given unlimited time and computational resources).

These modular toolboxes of scalable knowledge automation are the devices and systems that compose Industry 4.0. What separates them is where the analysis is performed (e.g. cloud, sensors, devices, datacenter, smartphones) and their technological composition:

  • Manufacturing Model that translates technical expertise and data into AI-friendly frameworks. Defines the scope of the application.

  • Industrial Big Data for training/validating the model and as a continuous data source for analysis.

  • Machine Learning engines that enable the complex optimizations defined by the Manufacturing Model.

What isn't Industry 4.0

Companies and products situated at binary intersections of these areas of expertise often position themselves as Industry 4.0, yet fall short of delivering reliable, actionable analysis at scale. [...]

  • Know-How + Big Data: integrates IT and OT data sources, enabling parts of the Industrial Internet of Things (IIoT). However, analysis is left to the user, supported only by KPIs and data visualization. The data sources for your analysis scale; the analysis itself does not.

  • Know-How + Artificial Intelligence: automation projects that lack the Big Data (and thus Machine Learning) capability to train and validate models. Results do not scale reliably and iterations/improvements are cost-intensive (as is the Total Cost of Ownership).

  • Big Data + Artificial Intelligence: agnostic ML engines that require extensive customization and specialized labor to translate abstract results into practical knowledge. They also frequently lack Service Level Agreements (SLAs) compatible with manufacturing environments (e.g. real-time, cybersecurity, availability).

Advanced Analytics

Unlike previous industrial revolutions, Industry 4.0 does not require intensive hardware costs: if data is available, analysis can be outsourced to third-party SaaS platforms that continuously apply their algorithm toolboxes of AI and ML. [...]

  • Inference Engine: clusters historical Big Data to identify and describe patterns, composing a manufacturing model that characterizes and classifies production behavior.

  • Predictive Engine: analyzes live datastreams, continuously classifying operations to detect abnormal/undesirable process behavior in real time.

  • Prescriptive Engine: determines the most profitable way to resolve or manage operational restrictions and constraints (e.g. KPIs, sequencing, shifts, reserved slots).
“Industrial analytics can be applied to machine-streaming data received from disparate sources to detect, abstract, filter and aggregate event-patterns, and then to correlate and model them to detect event relationships, such as causality, membership, and timing characteristics.

Identifying meaningful events and inferring patterns can suggest large and more complex correlations so that proper responses can be made to these events.
Industrial analytics can also be used to discover and communicate meaningful patterns in data and to predict outcomes.”
Management User
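The real-time detection performed by a predictive engine can be sketched as a simple statistical check: classify each new reading against a profile learned from history and flag outliers. The threshold and data below are illustrative assumptions, not the Platform's actual models.

```python
# Predictive-engine sketch: flag a reading as abnormal when it deviates from
# the learned baseline by more than z_threshold standard deviations.

from statistics import mean, stdev

def is_abnormal(history, reading, z_threshold=3.0):
    mu, sigma = mean(history), stdev(history)
    return abs(reading - mu) > z_threshold * sigma

baseline = [50.1, 49.8, 50.3, 50.0, 49.9, 50.2]  # normal process behavior
print(is_abnormal(baseline, 50.1))  # False: within the learned profile
print(is_abnormal(baseline, 58.0))  # True: flagged as abnormal in real time
```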

Industrial SLA

Software as a Service (SaaS)

The Platform is delivered as SaaS, with a monthly subscription rather than onerous upfront capital investments.

There are also several technical benefits, as we maintain and update your cyber-infrastructure with State-of-the-Art technology and algorithms. [...]

Financially, this allows clients to achieve faster breakeven and improved Net Present Value, avoiding the Total Cost of Ownership pitfalls of traditional models (e.g. software depreciation, hardware, network, database, human capital, etc.).

Additionally, this business model provides a series of technical advantages for clients:

  • Platform upgrades: new AI and Machine Learning algorithms, new features and OS compatibility;

  • Data backup and automatic server monitoring with mobile push notifications and e-mail alerts;

  • High degree of cyber security, with edge computing (hybrid cloud architecture) and enterprise-grade security frameworks;

  • Technical support during business hours (standard) or full 24×7 (contractual addendum).

Machine-to-Machine API

The Platform is on the Enterprise Layer (ISA-95 LVL-4) and establishes autonomous communications with a variety of third-party IT/OT applications (systems and assets) on the corporate layer as well as on the factory level. [...]

Third-Party Application System Interface

  • LVL4 (ERP, CRM, BI, S&OP): I2M API (https); files: .xml, .json, .xlsx, .csv, .txt

  • LVL2 (SCADA, Historian): S7 Ethernet (ISO over TCP), Ethernet/IP, Modbus, OPC UA*, OPC DA**

* Industry 4.0 standard
** for legacy systems
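An LVL4 machine-to-machine exchange over the I2M API (https) might look like the sketch below. The endpoint path and payload fields are hypothetical, introduced only for illustration; consult the actual API documentation for real routes and schemas.

```python
# Sketch of building an HTTPS POST to a hypothetical I2M API route, using
# only the Python standard library. Route and fields are NOT the real API.

import json
from urllib.request import Request

def build_po_request(base_url, po):
    body = json.dumps(po).encode("utf-8")
    return Request(
        url=f"{base_url}/api/v1/production-orders",   # hypothetical route
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_po_request(
    "https://example.invalid",                        # placeholder host
    {"po_id": "PO-1001", "sku": "SKU-A", "quantity": 500},
)
print(req.get_method(), req.full_url)
```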

Cyber Security

Quantum-resistant cryptographic algorithms for encryption, key exchange, digital signature and hashing.
Compliant with the CNSA Suite, used by the NSA to protect National Security Systems (NSS). [...]

  • Advanced Encryption Standard (AES): symmetric block cipher used for information protection. Use 256-bit keys to protect up to TOP SECRET (FIPS Pub 197).

  • Elliptic Curve Diffie-Hellman (ECDH) Key Exchange: asymmetric algorithm used for key establishment. Use curve P-384 to protect up to TOP SECRET (NIST SP 800-56A).

  • Secure Hash Algorithm (SHA): algorithm used for computing a condensed representation of information. Use SHA-384 to protect up to TOP SECRET (FIPS Pub 180-4).

  • Rivest–Shamir–Adleman (RSA): asymmetric algorithm used for key establishment (minimum 3072-bit modulus to protect up to TOP SECRET, NIST SP 800-56B rev 1) and for digital signatures (FIPS Pub 186-4).
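As a small illustration of the SHA-384 hashing standard listed above, Python's standard-library hashlib can compute a condensed representation of a record. The other CNSA algorithms (AES-256, ECDH P-384, RSA-3072) require a dedicated cryptography library and are not shown here.

```python
# SHA-384 per FIPS Pub 180-4, via Python's standard library.
# The hashed record below is an arbitrary example.

import hashlib

digest = hashlib.sha384(b"production order PO-1001").hexdigest()
print(len(digest))  # 96 hex characters = 384 bits
```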

Edge Server

Distributed architecture allows the Platform to run efficiently at scale with minimal requirements for hardware and bandwidth, creating a sub-second layer of AI and ensuring availability even if offline from the cloud. [...]

Usage of the Edge Server depends on which modules are installed, and the architecture varies with the factory's Ethernet infrastructure.

Edge Server Minimal Requirements

IIoT Edge
  • Windows 10 (64-bit)
  • 4 vCPUs (2.0 GHz with AVX2 support)
  • 8 GB RAM and ~200 GB free disk

Enterprise Edge
  • Windows Server 2016 to 2019 (64-bit)
  • 8 vCPUs (2.0 GHz with AVX2 support)
  • 16 GB RAM and ~200 GB free disk


The System is process-independent (i.e. discrete manufacturing, continuous process and/or batch) and activation is a fast-track project of one to three months, depending on the complexity of your operations' structure. [...]

During activation, the I2M Team models the client's production and process structure and activates all data interfaces the System needs to run.

SaaS activation (1 week)
  I2M Team: license setup. SOR and PAC online.

Operations modeling (2-4 weeks)
  Agreement over configuration data such as: BUs, Plants, Lines, Operations, Routes, Machines, Process Types, recipes/BoMs, manufacturing data, process features, warehouse policy, SKU list (production costs, profit margins).

Operations data mask interface (2-4 weeks)
  Agreement over setup and input data: IP, protocol and tags.

OT data mask interface (2-4 weeks)
  Agreement over OT data: IP, protocol and tags. Edge and PDA online.

Edge server activation (1-2 weeks)
  Customer: host, industrial Ethernet infrastructure and IT configurations ready.
  I2M Team and customer automation team: OT interface commissioning.

IT middleware integration* (2-4 weeks)
  IT interface commissioning.

Data Science* (2-4 weeks)
  Module integration, analytic view adjustments, MCOTS dashboards. Enterprise integration online.

* only for Enterprise licenses

About Us

Application Highlights

Based on almost 30 years of expertise and globally launched at Hannover Messe 2018, the Platform has since been used in a wide spectrum of applications, currently managing about USD 2 million in production per day. [...]

  • Plastics: industrial IoT for critical shopfloor assets and AI management of packaging lines.

  • Steel: specialized AI algorithms for continuous optimization of steel rolling mills.

  • Automotive: traceability, process and production management from auto-parts factories in Brazil to OEMs in Europe and Asia.

  • Pharmaceutical: enterprise-wide planning and scheduling.

ISA Conference and Paper

DataBot's CTO Alexi Condor is scheduled to present his paper, based on the I2M Platform's architecture, at the upcoming IIoT & Smart Manufacturing Conference (19-21 October, Galveston, TX), organized by the International Society of Automation (ISA). [...]
(Product Director)
(Commercial Team)
(Technical Team)

+55 (12) 99721-0027 (Commercial)
+55 (12) 99261-2108 (Operations)
+55 (12) 3945-1385
+55 (12) 3945-1391

Sao Jose dos Campos Technological Park
500, Avenida Doutor Altino Bondensan, 12247-016,
São José dos Campos (SP) – Brazil