Big data, analytics, and machine learning are starting to feel like ubiquitous business buzzwords, but they're not just overused abstract concepts—they represent huge changes in much of the technology we deal with in our daily lives. Some of those changes have been for the better, making our interactions with machines and information more natural and more powerful. Others have helped companies tap into consumers' relationships, behaviors, locations, and innermost thoughts in powerful and often disturbing ways. And the technologies have left a mark on everything from our highways to our homes.
It's no surprise that the concept of "information about everything" is being aggressively applied to manufacturing contexts. Just as they transformed consumer goods, smart, cheap, sensor-laden devices paired with powerful analytics and algorithms have been changing the industrial world as well over the past decade. The "Internet of Things" has arrived on the factory floor with all the force of a giant electronic Kool-Aid Man exploding through a cinderblock wall.
Tagged as "Industry 4.0" (hey, at least it's better than "Internet of Things"), this fourth industrial revolution has been unfolding over the past decade in fits and starts—largely because of the massive cultural and structural differences between the information technology that fuels the change and the "operational technology" that has been at the heart of industrial automation for decades.
As with other marriages of technology and artificial intelligence (or at least the limited learning algorithms we're all currently calling "artificial intelligence"), the potential payoffs of Industry 4.0 are enormous. Companies are seeing more precise, higher-quality manufacturing with lowered operational costs; less downtime because of predictive maintenance and intelligence in the supply chain; and fewer injuries on factory floors because of more adaptable equipment. And outside of the factory, other industries could benefit from having a nervous system of sensors, analytics to process "lakes" of data, and just-in-time responses to emergent issues—aviation, energy, logistics, and many other businesses that depend on reliable, predictable operations could also get a boost.
But the new way comes with significant challenges, not the least of which are the security and resilience of the networked nervous systems stitching all this new magic together. When human safety is on the line—both the safety of workers and people who live in proximity to industrial sites—those concerns can't be as easily set aside as mobile application updates or operating system patches.
Sensors and sensibility
The term "Industry 4.0" was coined by Acatech (the German government's academy of engineering sciences) in a 2011 national roadmap for use of embedded systems technology. Intended as a way to describe industrial "digitization," the term was applied to mark the shift away from simple automation with largely stand-alone industrial robots toward networked "cyber-physical systems"—information-based orchestration between systems and the humans working with them, based on a variety of sensor and human inputs.
It's a robot! It's stealing my job! (Actually, it's doing carbon fiber layup, which is exactly the kind of time-consuming task that we want robots to be doing.)
As a promotional document for the roadmap from the German Federal Ministry of Education and Research stated, "Machines that communicate with each other, inform each other about defects in the production process, identify and re-order scarce material inventories... this is the vision behind Industry 4.0."
In the Industry 4.0 future, smart factories using additive manufacturing—such as 3D printing through selective laser sintering—and other computer-driven manufacturing systems are able to adaptively manufacture parts on demand, direct from digital designs. Sensors keep track of needed components and order them based on patterns of demand and other algorithmic decision trees, taking "just-in-time" manufacturing to a new level of optimization. Optical sensors and machine-learning-driven systems monitor the quality of components with more consistency and accuracy than potentially tired and bored humans on the product line. Industrial robots work in synchronization with the humans handling more delicate tasks—or replace them entirely. Entire supply chains can pivot with the introduction of new products, changes in consumption, and economic fluctuation. And the machines can tell humans when the machines need to be fixed before they even break or tell people better ways to organize the line—all because of artificial intelligence processing the massive amounts of data generated by the manufacturing process.
That vision has driven a 1.15 billion Euro (approximately $1.3 billion) European Union effort called the European Factories of the Future Research Association. Similar "factory of the future" efforts have been funded by the US government—in particular, by the Department of Defense, which sees the technology as key to the defense industrial base.
The Defense Advanced Research Projects Agency (DARPA) has used research programs such as the Adaptive Vehicle Make project to seed development of advanced, information-integrated manufacturing projects and continues to look at Industry 4.0-enabling technologies such as effective human-machine teaming (the ability of machines to adapt to and work side by side with humans as partners rather than as tools) and smart supply chain systems based on artificial intelligence technology—an effort called LogX. Researchers at MITRE Corporation's Human-Machine Social Systems (HMSS) Lab have also been working on ways to improve how robotic systems interact with humans.
The brains of a wind turbine, pictured here, contain more industrial sensors than you can shake a stick at
As part of that work, MITRE has partnered with several robotics startups—including American Robotics, which has developed a fully automated drone system for precision agriculture. Called Scout, the system is an autonomous, weather-proofed unit that sits adjacent to fields. All a farmer has to do is program in drone flight times, and the AI handles drone flight planning and managing the flight itself, as well as the collection and processing of imagery and data, uploading everything to the cloud as it goes.
That level of autonomy allows farmers to simply look at data about crop health and other metrics on their personal devices, and then act upon that data—selectively applying pesticides, herbicides, or additional fertilizers if necessary. With some more machine learning juice, those are tasks that could eventually be handed off to other drones or robotic farming equipment once patterns and rules of their use are established.
Scout mirrors how human-machine teaming could work in the factory—with autonomous machines passing data to humans via augmented vision or other displays, letting humans make decisions based on their skills and knowledge of the domain, and then having humans and machines act upon the required tasks together. But that level of integration is still in its infancy.
Every sensor tells a story
One place where an embryonic form of human-machine teaming already takes place is in the world of retail: Walmart uses robots to scan store shelves for stock levels and has automated truck unloading (via a system called the "Fast Unloader") at many stores—using sensors and conveyor belts to sort shipments onto stocking carts. And robotic systems have already taken over the role of warehouse "picking" at Amazon, working with humans to retrieve and ship purchases.
Conversely, an element of Industry 4.0 that has evolved past the embryonic stage is the use of sensor data to drive plant operations—especially for the task of predictive maintenance. Unexpected equipment downtime is the bane of all industries, especially when the failure of a relatively minor part leads to the total failure of an expensive asset.
Ars' Lee Hutchinson stands in front of the creel cabinet that feeds carbon fiber to the robot that took all of our carbon fiber layup jobs.
By some estimates, about 80 percent of the time currently spent on industrial maintenance is purely reactive—time spent fixing things that broke. And nearly half of unscheduled downtime in industrial systems is the result of equipment failures, often with equipment late in its life cycle. Being able to predict failures and plan maintenance or replacement of hardware when it will have less impact on operations is the Holy Grail of plant operators.
It's also a goal that industry has been chasing for a very long time. The concept of computerized maintenance management systems (CMMS) has been around in some form since the 1960s, when early implementations were built around mainframes. But CMMS has almost always been a heavily manual process, relying on maintenance reports and data collected and fed into computers by humans—not capturing the full breadth and depth of sensor data being generated by increasingly instrumented (and expensive) industrial systems.
Doing something with that data to predict and prevent system failures has gotten increasingly important. As explained by MathWorks' Industry Manager Philipp Wallner, the mounting urgency is due to "[T]he growing complexity that we're seeing with electronic components in assets and devices, and the growing amount of software in them." And as industrial systems provide more data about their operations on the plant floor or in the field, that data needs to be processed to be useful to the operator—not just for predicting when maintenance needs to occur, but to optimize the way equipment is operated.
An airplane being assembled at an Airbus facility. The company is developing "smart tools" that use local and network intelligence as part of its own Industry 4.0 "factory of the future" initiative.
Predictive maintenance systems—such as IBM's Maximo, General Electric's Predix, and MathWorks' MATLAB Predictive Maintenance Toolbox—are an attempt to harness machine learning and simulation models to make that level of smartness possible. "Predictive maintenance is the leading application in making use of that data in the field," Wallner said, "especially in areas where components are really costly, such as wind energy. For equipment operators, it's a no-brainer."
It's a harder sell to equipment manufacturers, in some cases—especially because implementing the concept often involves providing detailed (and therefore proprietary and deeply guarded) modeling data for their products. And some equipment manufacturers might see predictive maintenance as a threat to their high-margin sales and maintenance business. However, some companies have already begun building their own lines of business based on predictive maintenance—such as General Electric.
GE first used Predix for internal purposes, such as planning maintenance of its fleet of jet engines—using "data lakes" of engine telemetry readings to help determine when to schedule aircraft for maintenance to minimize its impact on GE's customers. Using a library of data for each piece of supported equipment and a stream of sensor data, GE Software's data scientists built models—"digital twins" of the systems themselves—that can be used to detect early signs of part wear before things progress to part failure.
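The core of that "digital twin" approach can be sketched very roughly: compare what a model of the equipment says a sensor *should* read against what it actually reads, and watch the residuals. The sketch below is a deliberate over-simplification, with a toy linear "twin" and made-up numbers; GE's real twins are rich simulation models built from equipment libraries and telemetry history.

```python
# Hypothetical sketch of residual-based wear detection against a "digital twin".
# The "twin" here is a trivial stand-in (expected bearing temperature as a
# linear function of load); real twins are far richer simulation models.

def twin_predict(load_pct):
    """The model's expected bearing temperature (deg C) at a given load percentage."""
    return 40.0 + 0.35 * load_pct

def wear_alert(readings, threshold=3.0, window=5):
    """Flag early wear when the mean residual (actual minus predicted)
    over the last `window` samples exceeds `threshold` degrees."""
    residuals = [temp - twin_predict(load) for load, temp in readings]
    recent = residuals[-window:]
    return sum(recent) / len(recent) > threshold

# A drifting stream of (load %, measured deg C): temperatures creep above the
# twin's prediction as the part wears.
stream = [(60, 61.5), (60, 62.0), (62, 64.0), (61, 66.5), (60, 68.0)]
print(wear_alert(stream))  # True: residuals trend well above the model
```

The key design point is that the alert fires on sustained drift away from the model's prediction, not on any single noisy reading.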
But GE has also applied the same technique to other, less mechanical inputs—including using models for weather and tree growth data to predict when trees might become a threat to Hydro-Québec's power lines. And GE has expanded the role of Predix into the energy market, modeling power plant output and other factors to give energy traders a tool to help them make financial decisions. Predictive systems are also already having an impact on logistics—for example, at Amazon, which uses predictive models to power Amazon Prime's pre-staging of products closer to potential purchasers.
There are other approaches to prognostication, some of which bleed into managing the overall operation of the plant itself. IBM's Maximo APM, for example—based on IBM's Watson IoT platform—builds its baseline from sensors and other data from equipment on the factory floor to continuously refine its algorithms for maintenance. Another Maximo package focuses on overall plant operations, identifying process bottlenecks and other issues that could drive up operation costs. (L'Oreal has had success implementing Maximo and the Watson IoT platform as part of its own Industry 4.0 effort.)
Bridging the gap between data and knowledge
But there are several challenges that companies face in making predictive systems effective—the old computing proverb of "garbage in, garbage out" definitely still applies. MathWorks' Wallner noted that the main challenge is bridging the gap between the two knowledge domains needed to make predictive maintenance work. "How do you really enable the domain experts to work closely with the data scientists, or have one person do both? That's quite often the tension," Wallner explained. "You have two silos of knowledge, with one group having the pure data scientists and the other having domain experts with knowledge of the equipment they build, not talking to each other." The tools to create the models needed for operation must facilitate collaboration between those two camps, he said.
Even when there's good collaboration, there's another problem for many predictive models: while there's plenty of data available, most of it is about normal operations rather than failures (which is how it should be—a smoothly running plant shouldn't be suffering a lot of failures). "Often there's not enough failure data to train algorithms," Wallner said. "How do you train algorithms that need lots of data with a lack of failure data?"
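One common answer to that scarcity is to train only on normal operations: fit a statistical baseline to healthy behavior and flag anything that strays too far from it, with no failure examples required. The minimal sketch below uses a 3-sigma rule on vibration readings; the figures and threshold are invented for illustration, not drawn from any real plant.

```python
# Sketch of anomaly detection trained on normal-operation data only.
# The vibration figures and the 3-sigma limit are illustrative assumptions.
import statistics

def fit_baseline(normal_samples):
    """Summarize healthy behavior as a mean and standard deviation."""
    return statistics.fmean(normal_samples), statistics.stdev(normal_samples)

def is_anomalous(value, baseline, z_limit=3.0):
    """Flag readings more than z_limit standard deviations from normal."""
    mean, stdev = baseline
    return abs(value - mean) / stdev > z_limit

vibration_mm_s = [2.1, 2.0, 2.2, 1.9, 2.1, 2.0, 2.3, 2.1]  # healthy pump data
baseline = fit_baseline(vibration_mm_s)
print(is_anomalous(2.15, baseline))  # False: within the normal band
print(is_anomalous(4.8, baseline))   # True: far outside, worth a maintenance look
```

Production systems use far more sophisticated models, but the principle is the same: you only need to know what "normal" looks like to notice when a machine stops behaving normally.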
A time-sensitive networking switch used in an industrial control traffic network.
In some cases, manufacturers perform "run to fail" tests to collect data about how their equipment acts as components start to push outside of their normal operating parameters. But "run to fail" tests involve creating failures, and purposefully breaking costly and complicated manufacturing hardware is uncommon. "You don't want to run a scenario where you break your wind turbine," Wallner explained. "It's too expensive and dangerous." In these cases, the manufacturers' domain experts may have already built simulation models to test such conditions computationally—and those models can be incorporated into predictive maintenance systems with a bit of adaptation.
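That workaround can be illustrated with a toy example: when breaking a real wind turbine is off the table, generate synthetic "run to fail" trajectories from a simulation instead. The linear wear model, noise level, and time scale below are invented for the sketch, not taken from any manufacturer's model.

```python
# Hedged sketch of simulation-generated "run to fail" training data.
# The linear wear curve and Gaussian sensor noise are illustrative assumptions.
import random

def simulate_run_to_failure(seed, fail_at=1.0):
    """Return (hour, wear) samples up to the simulated failure point."""
    rng = random.Random(seed)
    samples = []
    for hour in range(0, 2000, 50):
        wear = hour / 1500 + rng.gauss(0, 0.02)  # wear fraction plus sensor noise
        samples.append((hour, wear))
        if wear >= fail_at:  # the component has "failed" in simulation
            break
    return samples

# A small synthetic training set of failure trajectories, one per seed.
runs = [simulate_run_to_failure(seed) for seed in range(5)]
print(len(runs), "synthetic runs; first run ends at hour", runs[0][-1][0])
```

Each simulated run ends in a failure, giving a predictive model the failure examples that real fleet telemetry rarely contains.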
The last gap to be bridged is how and where to process device data. In some cases, for safety or speed of response, the data from equipment needs to be analyzed very close to the industrial equipment itself—even having algorithms run on the embedded processor or programmable logic controller (PLC) that drives the machine. Other parts of analysis that are real-time but not directly safety-oriented might run on hardware nearby. But more long-term predictive analysis usually requires a lot of computing power and access to lots of other supporting data, and this usually means complex applications running in a company's datacenter or an industrial cloud computing system. Both GE's and IBM's predictive systems run in the cloud, while MathWorks' algorithms can run locally or in other clouds (including GE's Predix cloud).
In some cases, companies may run combinations of all of the above methods, or start off with "edge" systems handling predictions until they're more comfortable with using cloud solutions. "It makes sense to have some of the algorithm as close as possible to the equipment, to do things like data filtering," explained Wallner, "but have the predictive algorithm in the cloud." This gets you the best of both worlds.
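That split of responsibilities can be sketched in a few lines: a filter running near the equipment smooths and downsamples the raw sensor stream, and only the reduced data is shipped off to the heavier "cloud-side" predictive algorithm. The function names, window size, and operating limit below are assumptions made for the sketch.

```python
# Illustrative edge/cloud split: the edge node filters, the "cloud" predicts.
# All names and thresholds here are assumptions, not any vendor's API.

def edge_filter(raw, window=4):
    """Moving-average downsampler meant to run on hardware near the machine."""
    return [sum(raw[i:i + window]) / window
            for i in range(0, len(raw) - window + 1, window)]

def cloud_predict(filtered, limit=75.0):
    """Stand-in for a heavier cloud-side model: extrapolate the smoothed
    trend one step and warn if it would cross an operating limit."""
    trend = filtered[-1] - filtered[0]
    return filtered[-1] + trend > limit

raw_temps = [70, 71, 69, 70, 72, 73, 72, 73, 74, 75, 74, 75]
smoothed = edge_filter(raw_temps)
print(smoothed)            # 3 averaged samples shipped instead of 12 raw ones
print(cloud_predict(smoothed))
```

The edge stage cuts bandwidth and strips noise before anything leaves the plant, while the expensive trend analysis lives wherever the computing power is.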
The dangers of digitizing
While there is vast potential in the combination of information technology and operational technology that makes Industry 4.0 concepts like predictive maintenance possible, realizing that potential doesn't come without risks—especially if proper security measures aren't taken. Credible cyber-threats to industrial systems have been few so far, but new ones are emerging—including the "Triton" malware attacks that aimed to disable safety systems at multiple industrial sites and the "BlackEnergy" cyber-attacks in Ukraine that briefly took portions of the power grid down.
This is Baltimore, gentlemen. The gods will not save you...from ransomware. (And they won't save your factory from it, either, if you're not careful.)
Predictive modeling systems pose a lesser risk than those with direct control over equipment, but there's still reason for concern about potential access to raw analytics data from the factory floor. Such data won't immediately yield the blueprints for proprietary manufactured parts, but if it's subjected to "big data" analytics techniques, it might give an adversary (or a competitor) a wealth of information about the patterns of manufacturing operations, plant efficiency, and manufacturing process details that could be used for other purposes—including outright industrial espionage. Officials from the German Ministry of Education and Research noted in the ministry's Industry 4.0 report that "The most prevalent concern, especially among [subject matter experts], is that Industry 4.0's data is not secure, business secrets are lost, and carefully guarded companies' knowledge is revealed to the competition."
There are much greater threats, however, that could come from mixing operational technology with traditional IT, especially as autonomous systems are connected to existing industrial networks. Ransomware and other destructive malware could bring down control networks, as it did in Baltimore when a ransomware attack destroyed data from autonomous red light and speed camera sensors and shut down the CityWatch camera network. And there's the threat that controls themselves could eventually be targeted and manipulated, subverted, or sabotaged.
Much of what has protected operational technology from attacks thus far has been "security through obscurity." Industrial control protocols vary widely across equipment manufacturers. But blending the Internet of Things and other information technology with operational tech will require a great deal more attention to security—especially in applications where there's a threat to human lives. A malicious attack on safety systems could have "cyberphysical" ramifications beyond lost productivity or broken equipment in chemical, energy, and other industries where a failure could put the public at risk.
GE and others have tried to protect networks by isolating control systems from sensor data networks and by placing firewalls in front of older systems to block unwanted network traffic. Industrial cloud computing is generally partitioned from the Internet by virtual private networks and other measures.
But before industries hand over more jobs to autonomous software and hardware robots, a full assessment of the security for data and commands flowing to and from them is probably a good idea.