Vision, Data and AI: Connecting the Factory of the Future

Artificial intelligence and the Internet of Things are helping companies make the factory of the future a reality by harnessing the power of machine vision, connected devices, and real-time insights.

Designing and implementing smart factories requires manufacturers to optimize production while understanding how and where human workers are still needed alongside increasingly automated processes.

Machine vision and robotics are already part of the fabric of most advanced manufacturing operations. Automated systems do everything from performing assembly tasks to inspecting finished products as part of quality assurance. But now, AI-powered machine vision systems can operate autonomously. Cameras and local analysis are integrated throughout the production environment and connected to each other, and to an on-premises, industrial-grade server.

AI, specifically machine learning, enables all of the information flowing in from connected devices to be processed and analyzed automatically. That means data from hundreds, even thousands, of mechanical eyes that “see” operations throughout the manufacturing process. Today’s smart systems can process massive amounts of information and make instant adjustments – without human intervention.

Machine Vision Redefines What Is Possible

Machine vision has a multitude of practical applications on the factory floor. One example is its use in anomaly detection. Human inspection is subject to a certain amount of error, and limited by an inspector’s capacity to scrutinize only one thing at a time. With a machine vision algorithm in place, inspections can be completed as fast as the production line can move – with close to 100 percent accuracy. That means, for instance, before any component moves from point A to point B, it can be inspected for problems. Machine learning is built-in, so the flow of production data continuously updates the overall system.
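The inspection idea above can be sketched in a few lines. This is a deliberately simplified illustration, not a real production system: it compares each captured frame against a known-good reference image and flags parts whose average pixel deviation exceeds a tolerance. All names and the tolerance value are hypothetical; a deployed system would use a trained deep-learning model rather than raw pixel differences.

```python
# Minimal sketch of threshold-based visual anomaly detection.
# Names and tolerance are hypothetical; production systems would use
# a trained model instead of raw pixel comparison.

def mean_abs_deviation(frame, reference):
    """Average per-pixel difference between a captured frame and a
    known-good reference image (both flat grayscale pixel lists)."""
    return sum(abs(f - r) for f, r in zip(frame, reference)) / len(reference)

def inspect(frame, reference, tolerance=10.0):
    """Return True if the part passes inspection."""
    return mean_abs_deviation(frame, reference) <= tolerance

reference = [120, 120, 120, 120]   # "golden" image of a good part
good_part = [121, 119, 122, 118]   # minor sensor noise only
bad_part  = [120, 40, 200, 120]    # visible defect

print(inspect(good_part, reference))  # True  (passes)
print(inspect(bad_part, reference))   # False (flagged)
```

Because the check is a pure function of the frame, it can run at line speed on every part, which is the point made above about inspecting each component before it moves from point A to point B.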

The reality is that machine vision systems provide much higher levels of quality control than traditional manual inspections. In addition, quality control can now be implemented more often, and much earlier in the production process. Moving inspections further upstream means, for example, that a faulty part can be spotted before it is integrated with other components. Problem parts that used to be passed on to the next stage of the manufacturing process can be removed – saving time, optimizing productivity, and improving the overall quality of the finished product.

While the use of AI-driven systems is still very much in its early days, new developments are on the horizon. Intel, for example, has developed an open architecture for IoT applications that integrates hardware, software toolkits, and AI technologies on a single platform. Intel provides processors, accelerators, and software optimizations to deliver the computing capacity required at the edge. What’s more, the Intel® OpenVINO™ toolkit has optimization tools focused on edge-side deep learning that help developers convert vision data into business insights.

Moving Toward the Autonomous Factory

As device-to-cloud IoT technology evolves, the productivity implications are astonishing. For example, the future state of autonomous operations will integrate feedback loops from multiple data sources, including vision.

Machine learning capabilities will both manage and analyze inputs from multiple sensors in real time. Data will also be processed at the edge as needed. For instance, a specific piece of equipment on a manufacturing line might use edge compute capabilities to run algorithms for self-adjustment based on its own analysis of operating factors, such as line speed or environmental conditions. Learning at the edge lets manufacturers get actionable insights exactly where they are needed in the production process.

Predictive decision-making is also evolving as machines monitor themselves and each other for potential problems. For example, in an IoT-connected factory environment, a specific cutting machine can self-diagnose when a blade’s sharpness falls below a given specification, or when a component that runs the machine is overheating. In an autonomous environment, the machine can fix itself by, for example, going offline long enough to replace the dull blade or the failing component. Because the machine is connected to other smart devices, it can automatically route workflow to another machine while it is offline.
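The self-diagnosis and rerouting behavior described above can be sketched as follows. The thresholds, machine names, and queue hand-off are all hypothetical, chosen only to make the pattern concrete:

```python
# Hypothetical sketch of self-diagnosis with automatic rerouting:
# a cutting machine takes itself offline when its blade sharpness or
# temperature goes out of spec, and a peer picks up the queued jobs.

class CuttingMachine:
    MIN_SHARPNESS = 0.7   # fraction of nominal edge quality (assumed)
    MAX_TEMP_C = 85.0     # component overheating threshold (assumed)

    def __init__(self, name):
        self.name = name
        self.online = True
        self.queue = []

    def self_diagnose(self, sharpness, temp_c):
        """Go offline if either reading is out of spec."""
        self.online = (sharpness >= self.MIN_SHARPNESS
                       and temp_c <= self.MAX_TEMP_C)
        return self.online

def route(job, machines):
    """Send a job to the first online machine."""
    for m in machines:
        if m.online:
            m.queue.append(job)
            return m.name
    raise RuntimeError("no machine available")

a = CuttingMachine("cutter-A")
b = CuttingMachine("cutter-B")
a.self_diagnose(sharpness=0.55, temp_c=70.0)  # dull blade -> offline
print(route("job-42", [a, b]))                # prints "cutter-B"
```

The key design point is that diagnosis and rerouting are both automatic: no human schedules the maintenance window or reassigns the work.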

Assessing the Implications of Industry 4.0

These days, the IoT-connected business ecosystem is often called Industry 4.0. The term refers to the fourth industrial revolution, which is being fueled by an AI-driven digital transformation.

The industrial IoT that is emerging as part of Industry 4.0 is enabled by connectivity between everything from robotic production and inventory-management machines to computing in the cloud and at the edge. This connectivity promises to improve manufacturing efficiency and productivity.

But, as a recent study by Intel’s Internet of Things Group notes, Industry 4.0 is about more than just integrating smart machines into production environments. It’s about completely transforming the workplace as part of a “co-evolution” of workers and manufacturing operations.

The “eye of manufacturing” in the future is going to reflect a composite of human and machine vision. While technology is driving this evolution, it is ultimately people who will be at the heart of both the transition process and the factory of the future.

To stay informed about Intel IoT developments, subscribe to our RSS feed for email notifications of blog updates, or visit us on LinkedIn, Facebook, and Twitter.

About Christine Boles

Christine Boles is a vice president in the Internet of Things Group (IOTG) and general manager of Intel’s Industrial Solutions Division. Her organization is responsible for Intel’s Industrial IoT business within the manufacturing, energy, logistics, and commercial building segments, including the product and ecosystem strategies for this rapidly evolving space. Boles joined Intel in 1992 as an application engineer for 16-bit microcontrollers. For over 25 years, she has led the development, delivery, and enabling of customers and ecosystems for Intel-based solutions in multiple leadership roles. These solutions span a broad range of embedded and Internet of Things applications across many industries, including communications, storage, retail, imaging, and commercial buildings. Boles holds a Bachelor of Science in Electrical Engineering from the University of Cincinnati and an MBA from Arizona State University.
