AI IN CONTROL

“Real-time rule engines” and “adaptive control” are two of today’s monikers for artificial intelligence (AI), fuzzy logic, and similar information technologies that were so widely touted in the 1980s. They still exist, but as Glenn Anderson, project sales manager for Omron Electronics LLC (Schaumburg, IL; www.omron.com) explains, “Customers are looking for off-the-shelf solutions to control problems. [They are saying] ‘Don’t give us a tool that we have to figure out how to use to solve our own problems. Give us a solution.’ People don’t really care how the function is implemented.” So just what are these rules-based solutions?

 

There Is A Difference—With Interface

In the 1980s, AI ran on proprietary hardware, databases, and operating systems. No longer. G2, the flagship inference engine from Gensym (Burlington, MA; www.gensym.com), exemplifies how several technological changes have made rules-based deployment easier and “open.” For starters, all of G2’s rules, procedures, and object models work in real time, which ensures that operational decisions and actions are executed rapidly. G2’s message management, reasoning tools, and distributed publish-and-subscribe facility make the inference engine well suited to scalable, distributed operator advisory and alarm management applications. G2 runs on Microsoft Windows, several UNIX variants, and Linux. Its reasoning-engine logic, objects, data structures, and user interface integrate with other systems through Microsoft’s ActiveX (think: Internet Explorer), COM, and .NET, as well as an alphabet soup of standards including HTTP, Java, ODBC, OPC, RMI, SQL, and XML. The user interface is a thin client based on Microsoft Windows; it can be embedded in Windows applications such as Microsoft Office or those created using Microsoft Visual Basic, C++, or .NET. Last, programming G2 involves graphical object-oriented modeling, natural language rules, generic rule and procedural logic, a built-in understanding of time, and the ability to incorporate programming changes without stopping to compile or relink code.
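For readers who have never seen one, the core of an inference engine is easy to sketch. What follows is a minimal forward-chaining rule engine in Python; it is illustrative only, since G2 uses its own graphical, natural-language rule notation, and the `Rule` class and tank-temperature facts here are hypothetical.

```python
# Minimal forward-chaining rule engine (a sketch, not Gensym's API).
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    condition: Callable[[dict], bool]   # tests the current fact base
    action: Callable[[dict], None]      # updates facts / raises alarms

def run_engine(facts: dict, rules: list[Rule], max_cycles: int = 100) -> dict:
    """Fire every rule whose condition holds; repeat until nothing changes."""
    for _ in range(max_cycles):
        fired = False
        for rule in rules:
            if rule.condition(facts) and not facts.get(f"_fired:{rule.name}"):
                rule.action(facts)
                facts[f"_fired:{rule.name}"] = True
                fired = True
        if not fired:
            break
    return facts

# Hypothetical operator-advisory rules, in the spirit of
# "if tank temperature exceeds 90, warn the operator."
rules = [
    Rule("high-temp-warning",
         lambda f: f["tank_temp_C"] > 90,
         lambda f: f.setdefault("alarms", []).append("Tank temperature high")),
    Rule("open-relief-valve",
         lambda f: "Tank temperature high" in f.get("alarms", []),
         lambda f: f.update(relief_valve="open")),
]

print(run_engine({"tank_temp_C": 95}, rules))
```

Run against a fact base reporting 95°C, the first rule raises an alarm and the second, chaining off that alarm, opens the relief valve; no procedural control flow was written, only rules.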

Inference engines work. For example, Toyota Motor Corp. uses Gensym’s G2 to plan its final assembly line. Assembly employs approximately 200 operators executing some 3,000 basic operations per vehicle, involving about 2,000 parts, across 100 to 200 vehicle variations. Production takt time: 1 to 3 minutes. Using incoming demand, the G2-based scheduler works out how a given automobile model is to be assembled, including which operators, which assembly stations, and where potential bottlenecks and operator conflicts will arise at various stages of assembly.
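One small piece of that calculation can be illustrated simply: a station whose assigned work content exceeds the line’s takt time cannot keep pace and is a potential bottleneck. Here is a toy version of that check in Python, with made-up station timings; the real G2 scheduler reasons over operators, stations, and the whole vehicle mix at once.

```python
# Toy bottleneck check (illustrative; the timings below are invented).
TAKT_TIME_S = 90  # assumed takt time: 1.5 minutes per vehicle

# Hypothetical work content (seconds per vehicle) at each station.
station_work = {"trim-1": 82, "trim-2": 95, "chassis-1": 74, "final-1": 91}

bottlenecks = {s: t for s, t in station_work.items() if t > TAKT_TIME_S}
print(f"Potential bottlenecks at takt {TAKT_TIME_S}s: {bottlenecks}")
```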

G2 is also the basis of Gensym’s latest version of its NeurOn-Line software. This neural-network application is for operations involving real-time predictions of product quality and process conditions. Engineers use it to create models of complex processes—“model-based reasoning” is an analytic technique becoming quite popular in factory control—that typically cannot be modeled using conventional analytical techniques. According to David Siegel, Gensym’s director of marketing, “The models are built through a training procedure that uses historical process data. Through these models, operators and control systems continuously receive real-time predictions of product quality and process variables that would be impractical to measure directly. Such predictions support more effective control of the variability in processes and enable improved product quality, higher production yields, and increased throughput.” The latest version of NeurOn-Line has a new recursive structure for predictive neural-network models. This lets models “extend the time horizon and accuracy of their predictions by using previous output values for new predictions,” explains Siegel.
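The recursive structure Siegel describes is, in essence, autoregressive multi-step prediction: each one-step prediction is fed back in as an input for the next step, stretching the forecast horizon. Below is a minimal sketch of that feedback loop, with a fixed linear map standing in for the trained neural network; the coefficients and data are invented.

```python
import numpy as np

def model(window: np.ndarray) -> float:
    """Stand-in for a trained one-step-ahead model (coefficients invented)."""
    weights = np.array([0.2, 0.3, 0.5])
    return float(weights @ window)

def predict_recursive(history: list[float], horizon: int) -> list[float]:
    """Extend the horizon by recycling each prediction as a new input."""
    window = list(history[-3:])
    preds = []
    for _ in range(horizon):
        y_next = model(np.array(window))
        preds.append(y_next)
        window = window[1:] + [y_next]  # previous output becomes an input
    return preds

# e.g. a predicted quality variable for the next five control intervals
print(predict_recursive([0.95, 0.97, 1.02], horizon=5))
```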

Volkswagen (VW) Group (Madrid, Spain) uses the inference engine from ILOG Inc. (Mountain View, CA; www.ilog.com) for new-car sequencing and production planning at the group’s SEAT Martorell and VW Navarra plants. These two locations produce 2,000 and 1,200 cars per day, respectively. Using ILOG’s optimization software, the two plants match assembly line resources to customer and dealer specifications, thereby stocking only the parts needed for current orders. In the past, building a schedule was a mostly manual, 90-minute job; now it takes 15 minutes. Planners used to take a full day to generate the next day’s production plan; with the ILOG system, they can generate it in half a day. The VW Navarra facility also uses ILOG Solver to sequence cars in minutes, a job that used to take six hours by hand.
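Car sequencing is usually posed as a constraint problem with capacity rules of the form “at most k cars carrying a given option in any n consecutive slots” (the sunroof station, say, can only keep up with every other car). Here is a minimal checker for that rule in Python; the option data is hypothetical, and this is not ILOG’s API.

```python
from collections import deque

def violates(seq: list[set[str]], option: str, k: int, n: int) -> bool:
    """True if more than k of any n consecutive cars carry the option."""
    window: deque = deque(maxlen=n)
    for car in seq:
        window.append(option in car)
        if sum(window) > k:
            return True
    return False

# Each car is the set of options it needs; the sunroof station handles
# at most 1 car in any 2 consecutive slots (k=1, n=2).
good = [{"sunroof"}, {"ac"}, {"sunroof"}, {"ac"}]
bad  = [{"sunroof"}, {"sunroof"}, {"ac"}]
print(violates(good, "sunroof", k=1, n=2))  # False: sunroofs are spaced out
print(violates(bad, "sunroof", k=1, n=2))   # True: two sunroofs in a row
```

A solver’s job is then to search for an order of the day’s cars in which no such rule is violated.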

 

Call Your Agent

In reality, rules-based technology “gets embedded in solutions so that the end user doesn’t even know there’s AI inside,” says Siegel. “I don’t know of many total standalone AI/expert system-type applications. They’re almost always a part of the larger picture.” Anderson agrees. He points out that control engineers showed long ago that fuzzy logic couldn’t do anything that mathematical modeling couldn’t do. More to the point, continues Anderson, “If people want to implement fuzzy logic control, they don’t have to buy Omron hardware or software to do it. They can write fuzzy logic algorithms in a programmable controller’s native programming language.” (That said, a systems developer writing a control application requiring fuzzy logic can get that capability by buying a multi-loop, general-purpose process control module for Omron’s CS1 PLC.)
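Anderson’s point is easy to demonstrate: the core of a fuzzy controller fits in a few dozen lines of whatever language is at hand. Here is a minimal sketch in Python rather than ladder logic or structured text; the membership breakpoints and rule table are illustrative, not taken from any Omron product.

```python
def tri(x: float, a: float, b: float, c: float) -> float:
    """Triangular membership function: 0 at a and c, peaking at 1 at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_fan_speed(temp_c: float) -> float:
    # Fuzzify: degree of membership in each input set.
    cold = tri(temp_c, -10, 10, 25)
    warm = tri(temp_c, 15, 25, 35)
    hot  = tri(temp_c, 30, 45, 70)
    # Rules: IF cold THEN fan=10%; IF warm THEN fan=50%; IF hot THEN fan=90%.
    # Defuzzify with a weighted average of the singleton outputs.
    num = cold * 10 + warm * 50 + hot * 90
    den = cold + warm + hot
    return num / den if den else 0.0

for t in (5, 22, 33, 50):
    print(f"{t} degC -> fan {fuzzy_fan_speed(t):.0f}%")
```

Because neighboring sets overlap, the output ramps smoothly between rules instead of snapping between setpoints, which is the practical appeal of the technique.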

Nevertheless, AI is on the factory floor. For instance, consider Omron’s discrete optical sensors for detecting and staging the sequence of objects in manufacturing. These sensors work with infrared or visible light pulses of 15,000 and 25,000 kHz. Two or more of these sensors mounted close together can interfere with one another because a spurious light pulse from one sensor can trigger its neighbor. With embedded fuzzy logic, each sensor now has the intelligence to prevent this mutual interference.

These days, the focus is on agent-based operations on the factory floor. One term for this is prognostics: “the ability to predict and prevent possible fault or system degradation before failures occur,” rather than the current practice of scheduled maintenance and reactive maintenance based on the “fail and fix approach,” explains Jay Lee, director of the National Science Foundation Industry/University Cooperative Research Center on Intelligent Maintenance Systems (IMS; www.imscenter.net) and a professor at the University of Cincinnati.

Some background: IMS is a multi-campus research center spanning the University of Wisconsin (Milwaukee), the University of Michigan (Ann Arbor), and now the University of Cincinnati. The center has more than 45 company members and sponsors that provide expertise and real-world testbeds, including General Motors (automotive assembly), Harley-Davidson (predictive maintenance for machine tools), Hitachi (intelligent condition-based monitoring and maintenance of gas turbines), Intel (semiconductor fab equipment and processes), and Rockwell Automation (energy and power systems, asset optimization and management, wireless machine sensors). With these companies, IMS is working to advance “infotronics”—the mix of industrial automation, integrated systems, and information technology—for prognostics, near-zero downtime, and ultimately “smart” factory control.

The IMS Center has developed a toolbox of algorithms. Of particular interest is the Watchdog Agent. This agent, explains Lee, “can assess and predict the process or equipment performance based on the inputs from the sensors mounted on it. Performance-related information is extracted from multiple sensor inputs through signal processing, feature extraction, and sensor fusion techniques. A performance evaluation module determines the current level of degradation of a system based on the overlap of recently observed signatures with normal operation signatures.” These same process signatures are used to forecast process and machine performance.
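Lee’s “overlap” can be made concrete in a few lines: treat the intersection of the recent feature distribution with the baseline (normal-operation) signature as a confidence value running from 1 (healthy) down toward 0 (degraded). The sketch below uses histogram intersection on synthetic vibration-feature data; it is one plausible overlap measure, not necessarily the one in the IMS toolbox.

```python
import numpy as np

def overlap_confidence(baseline: np.ndarray, recent: np.ndarray,
                       bins: int = 20) -> float:
    """Histogram intersection of two feature distributions, in [0, 1]."""
    lo = min(baseline.min(), recent.min())
    hi = max(baseline.max(), recent.max())
    b_hist, _ = np.histogram(baseline, bins=bins, range=(lo, hi))
    r_hist, _ = np.histogram(recent, bins=bins, range=(lo, hi))
    b_p = b_hist / b_hist.sum()
    r_p = r_hist / r_hist.sum()
    return float(np.minimum(b_p, r_p).sum())

rng = np.random.default_rng(0)
baseline = rng.normal(1.00, 0.05, 1000)  # feature from a healthy machine
recent   = rng.normal(1.12, 0.07, 200)   # same feature after some degradation

print(f"performance confidence: {overlap_confidence(baseline, recent):.2f}")
```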

Watchdog agents can reside in an existing microprocessor in a factory floor device or controller; no special chip required. You can have different Watchdog agents focus on different things: hydraulics, vibrations, tool wear, machine use, and inventory. The agents can be networked together, say in a CNC machine, “to monitor themselves by reasoning the current performance compared with previous performance,” explains Lee. That is, the agents sense process or equipment degradation well before performance becomes unacceptable. An onboard decision support tool then determines the most critical object or process in the system needing repair. It assesses the risks of making or not making that repair in a given time. Based on this self-assessment, the Watchdog Agent triggers service to be performed by itself or informs the user that service is required.
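The decision step might look like the following sketch, where each agent reports a confidence value for its subsystem and the worst one is checked against a service threshold; the numbers are hypothetical.

```python
# Hypothetical per-subsystem confidence values reported by Watchdog agents.
agents = {"hydraulics": 0.91, "spindle-vibration": 0.42, "tool-wear": 0.67}
SERVICE_THRESHOLD = 0.50  # assumed cutoff for triggering maintenance

worst_item, worst_cv = min(agents.items(), key=lambda kv: kv[1])
if worst_cv < SERVICE_THRESHOLD:
    print(f"Service request: {worst_item} (confidence {worst_cv:.2f})")
```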

Here’s a simple example of agent assessment. At a General Motors manufacturing plant, two sensors measured the spindle load in orthogonal directions during each boring cycle. Each cycle lasted about 35 seconds; 1,000 signals were collected over three 8-hour shifts. IMS found that as tool wear increased, the load measured by one of the sensors rose with it. Moreover, “other faults may be detected that are not related to tool wear,” says Lee. “Expert knowledge can be used to decide which features best represent normal processes and which best represent faulty processes.”
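Feature extraction of this kind reduces each 35-second cycle to one or two numbers, such as the root-mean-square (RMS) spindle load, which can then be trended across cycles. Here is a sketch on synthetic data in which the load drifts upward with wear; the sampling rate and drift are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

def cycle_rms(signal: np.ndarray) -> float:
    """One scalar feature per cycle: root-mean-square load."""
    return float(np.sqrt(np.mean(signal ** 2)))

features = []
for cycle in range(1000):                         # 1,000 cycles over 3 shifts
    wear = cycle / 1000                           # tool wear grows with use
    load = rng.normal(10 + 3 * wear, 0.5, 3500)   # ~35 s at 100 Hz (assumed)
    features.append(cycle_rms(load))

print(f"mean RMS, first 100 cycles: {np.mean(features[:100]):.2f}")
print(f"mean RMS, last 100 cycles:  {np.mean(features[-100:]):.2f}")
```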

A second IMS project is the Device-to-Business (D2B) platform, basically an autonomous intelligent agent that links factory floor devices directly to a business system, such as enterprise resource planning (ERP), thereby circumventing traditional factory supervisory control systems, such as programmable controllers. The agent compares performance data from these devices to historical trends and, when necessary, alerts the relevant parties. For example, one IMS project links D2B to an ERP system at Ford Motor Co.: the ERP system automatically cancels metal replenishment at the Dearborn-based Rouge Truck Plant if a stamping machine is about to fail.
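A minimal sketch of that device-to-business loop appears below; the `erp_cancel_replenishment` function is a placeholder standing in for a real ERP interface, and the health numbers are invented.

```python
HEALTH_FLOOR = 0.3  # assumed confidence level meaning "failure imminent"

def erp_cancel_replenishment(material: str) -> None:
    """Placeholder for a real ERP call (hypothetical interface)."""
    print(f"ERP: replenishment canceled for {material}")

def d2b_agent(material: str, health_history: list[float]) -> None:
    """Compare the latest reading against the recent trend; alert the ERP."""
    latest = health_history[-1]
    trend = sum(health_history[-5:]) / min(5, len(health_history))
    if latest < HEALTH_FLOOR and latest < trend:
        erp_cancel_replenishment(material)

d2b_agent("sheet steel", [0.9, 0.8, 0.6, 0.4, 0.25])
```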

In the past, a good operator could adjust machinery based on intangibles: sight, sound, vibration, smell, and touch. These are attributes that have not quite been fully captured through sensor, analytic, and inference-based response technologies. Yet. As was the expectation back in the 1980s, AI can still help fill the gap in skills and knowledge on the factory floor as the number of equipment operators drops and more manufacturing engineers leave the field through attrition.
