Its two vision inspection offerings work with and learn from human operators, and are easy to implement, the company said.
By Bruce Geiselman
Manufacturers are adopting artificial-intelligence (AI) technology during quality-control inspections to help improve efficiency, minimize errors, and reduce costs and scrap, according to advocates of the emerging technology.
Pleora Technologies, which offers AI inspection hardware and software to manufacturers in a variety of fields, including plastics processors, has recently seen increased interest in its products from manufacturers concerned about maintaining production quality while dealing with a lack of workers and rising costs.
“Expanding the off-the-shelf and customizable capabilities of our AI solutions, we’re helping manufacturers address immediate concerns related to quality, labor shortages and increasing costs while they accelerate their focus on collecting and analyzing inspection data to drive process improvements,” John Butler, VP of sales and marketing with Pleora, said in a press release.
Pleora offers two types of AI inspection systems, marketing manager Ed Goffin said.
Pleora’s eBUS AI Studio development software platform and AI Gateway edge processor hardware easily integrate into existing inspection systems for manufacturers that already have an automated vision inspection system. For manufacturers that don’t, the company offers a turnkey system, its Visual Inspection System, which is deployed to learn from and eventually support human inspectors.
The Visual Inspection System is a stand-alone system that includes a camera, display monitor, keyboard, mouse and cables, as well as a processing unit that is essentially the same as Pleora’s AI Gateway edge processor, running a light version of its eBUS AI Studio software. It works with human inspectors to eliminate subjective, inconsistent and stressful decisions, according to the company. It also reduces errors that could result in poor products being shipped to customers, and it can increase production and lower costs by reducing downtime, waste and scrap.
A human inspector places a manufactured item, such as an injection-molded or assembled plastic part, into an inspection station and manually looks for defects. At the same time, a camera that is part of the Visual Inspection System captures images of the part, and the system’s software compares them to images of a known good product.
The system highlights on a computer monitor any differences it detects between the part being inspected and the stored image or images of a known good part, then asks the inspector whether each difference constitutes a defect or an acceptable variation. Each time the inspector responds, the system learns to better differentiate between defects and acceptable variations. As the AI software learns, it begins offering suggestions on whether to accept or reject a part, but the human operator makes the final decision. Examples of acceptable variations might include the placement of a label on an assembled part or the shade of a colored plastic part.
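In rough terms, that comparison-and-feedback loop could look something like the sketch below, which uses the open-source OpenCV library purely as an illustration; the thresholds, the console prompt and the recorded features are hypothetical stand-ins, not Pleora’s actual software.

```python
# Illustrative sketch only (not Pleora's implementation): diff a captured image
# against one "known good" reference, mark differing regions, and record the
# operator's defect / acceptable-variation answer for later learning.
import cv2
import numpy as np

DIFF_THRESHOLD = 30      # per-pixel intensity difference considered notable (hypothetical)
MIN_REGION_AREA = 50     # ignore tiny regions caused by noise or lighting (hypothetical)

labeled_examples = []    # grows with every operator answer

def find_differences(reference: np.ndarray, captured: np.ndarray):
    """Return bounding boxes of regions where the captured part differs from the reference."""
    ref = cv2.cvtColor(reference, cv2.COLOR_BGR2GRAY)
    cap = cv2.cvtColor(captured, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(cv2.absdiff(ref, cap), DIFF_THRESHOLD, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= MIN_REGION_AREA]

def review_part(reference: np.ndarray, captured: np.ndarray):
    """Ask the operator to label each highlighted difference as defect or acceptable."""
    for (x, y, w, h) in find_differences(reference, captured):
        # Mark the region with a red box (a real system would display this to the operator).
        cv2.rectangle(captured, (x, y), (x + w, y + h), (0, 0, 255), 2)
        answer = input(f"Region at ({x},{y}), {w}x{h} px: defect? [y/n] ")
        labeled_examples.append({"area": w * h, "defect": answer.strip().lower() == "y"})

if __name__ == "__main__":
    # Synthetic stand-ins for a reference image and a captured part image.
    good = np.full((240, 320, 3), 200, dtype=np.uint8)
    part = good.copy()
    part[100:130, 150:180] = 40   # simulate a visible flaw
    review_part(good, part)
    print(labeled_examples)
```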
“They [Pleora’s clients] aren’t trying to replace the human, they’re just trying to give them a tool to make sure that their decisions are consistent and objective,” Goffin said. “There’s also tracking and reporting that can happen so that they can have some traceability on operator decisions within the system.”
For example, the Visual Inspection System can store photos of inspected parts along with bar codes that allow tracking of the parts. If, in the future, questions arise about a part’s quality, the company can examine the stored photos to determine if the parts were faulty when they left the plant or if problems might have occurred after they were shipped. It also allows the company to trace a product back to a particular inspector and possibly to the specific production equipment that manufactured the part.
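One simple way to picture that kind of traceability, purely as an illustration rather than Pleora’s implementation, is a small database table keyed by each part’s barcode; the table and column names below are hypothetical.

```python
# Illustrative sketch only: store inspection records keyed by barcode so a part
# can later be traced back to its photo, inspector and production equipment.
import sqlite3
from datetime import datetime, timezone

conn = sqlite3.connect("inspections.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS inspections (
        barcode      TEXT NOT NULL,
        image_path   TEXT NOT NULL,
        inspector    TEXT NOT NULL,
        machine_id   TEXT,
        decision     TEXT CHECK (decision IN ('accept', 'reject')),
        inspected_at TEXT NOT NULL
    )
""")

def record_inspection(barcode, image_path, inspector, machine_id, decision):
    """Save one inspection record for later traceability."""
    conn.execute(
        "INSERT INTO inspections VALUES (?, ?, ?, ?, ?, ?)",
        (barcode, image_path, inspector, machine_id, decision,
         datetime.now(timezone.utc).isoformat()),
    )
    conn.commit()

def trace_part(barcode):
    """Look up every stored inspection for a given part barcode."""
    return conn.execute(
        "SELECT * FROM inspections WHERE barcode = ?", (barcode,)
    ).fetchall()
```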
The manual Visual Inspection System is targeted toward smaller manufacturers that cannot justify the cost of an automated AI-based vision inspection system, as well as toward manufacturers that run frequent short production runs.
Some of Pleora Technologies’ customers use the manual Visual Inspection System to help train new employees who might not have the skills of veteran quality-control inspectors.
“We have a few examples where manufacturers have an inspector who has worked that job for decades, and they’re really good,” Goffin said. “They have all the knowledge of what a defective part is, whether a part is within tolerance, and what a good part is. They’ve been doing it for years. As these manufacturers bring on new inspectors, they don’t have that level of knowledge, but they can use the system trained on that really good inspector’s knowledge to then help the new inspectors with their decision-making processes.”
Pleora Technologies’ manual Visual Inspection System helps break down barriers that may have kept some manufacturers and plastics processors from adopting AI technologies, Goffin said.
“I think people are sort of scared away by the cost and complexity of AI, or the perceived notion that it’s expensive and complex,” he said.
Some automated AI systems involve a greater degree of complexity in designing the system, training employees and deploying algorithms, he said, and manufacturers often believe they need someone with doctorate-level AI expertise to adopt the technology.
“I think a lot of manufacturers look at AI and think it looks like an interesting technology, in terms of it can help me make some of these subjective decisions where my current machine vision system is maybe struggling, but it’s too complex, too costly,” he said. “What we’re trying to do, which is a little bit of a game changer, is the idea that it doesn’t have to be complex, especially if you start with a Visual Inspection System. You can get up and running, needing only one good image and then train based on operator decisions. Then, you can have the algorithm not take over but supplement some of those decision-making processes for your operator.”
For larger operations or companies that already have automated inspection systems in place, Pleora Technologies offers AI products that can be incorporated into existing operations.
“If you’re already running automated inspection, such as on a conveyor belt with products going under a camera for inspection, [we offer] an edge-processing solution to add AI into those applications,” Goffin said.
The company’s AI solution for existing automated inspection systems is built around Pleora’s eBUS AI Studio “no-code” development software, which allows non-experts and designers to develop, customize and train AI and computer vision add-ons without writing code, using a “drag-and-drop” programming interface.
The resulting add-ons, known as plug-ins, are then transferred to an AI Gateway edge processor that Pleora Technologies provides to the manufacturer. The AI Gateway is connected to and communicates with the customer’s cameras or vision systems, and its software processes the video from the cameras and decides whether the items being inspected meet programmed specifications. Goffin said the edge-processing approach lets customers scale AI into their applications without replacing existing infrastructure and end-user processes.
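Conceptually, the arrangement resembles the sketch below, in which hypothetical plug-in callables run over each camera frame on the edge device; the interface and names shown are illustrative assumptions, not the eBUS AI Studio or AI Gateway API.

```python
# Conceptual sketch only: an edge processor pulls frames from a camera source
# and runs a chain of inspection "plug-ins", each returning a pass/fail verdict.
from typing import Callable, List, Tuple
import numpy as np

# A plug-in here is simply a callable that inspects a frame and returns
# (passed, detail) -- a stand-in for trained AI or classical vision steps.
Plugin = Callable[[np.ndarray], Tuple[bool, str]]

def brightness_check(frame: np.ndarray) -> Tuple[bool, str]:
    """Toy rule: reject frames that are too dark to inspect reliably."""
    mean = float(frame.mean())
    return mean > 40.0, f"mean brightness {mean:.1f}"

def run_pipeline(frame: np.ndarray, plugins: List[Plugin]) -> bool:
    """Run every plug-in on a frame; the part fails if any plug-in rejects it."""
    for plugin in plugins:
        passed, detail = plugin(frame)
        if not passed:
            print(f"REJECT: {plugin.__name__} ({detail})")
            return False
    return True

if __name__ == "__main__":
    # Simulated camera frame; on a real gateway this would come from the
    # connected camera or vision system.
    frame = np.random.randint(0, 255, (480, 640, 3), dtype=np.uint8)
    print("PASS" if run_pipeline(frame, [brightness_check]) else "FAIL")
```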
The AI Gateway edge processor also communicates with the manufacturer’s existing computer system and can share data with enterprise resource planning and manufacturing analytics platforms to help monitor end-to-end quality and processes, and can be used for continuous offline plug-in training, according to the company.
“In other words, after the end user develops the plug-ins using Pleora’s eBUS AI Studio development software and uploads the plug-ins to the AI Gateway edge processor, the end user tests the system offline so that it is not actually taking any actions but is identifying suspicious parts and asking an operator for input on whether the part is defective or not,” Goffin said.
“Based on the operator’s responses, the software learns to identify truly defective parts that need to be removed from the conveyor belt. Once the end user is confident in the accuracy of the AI software, then the system can be put online and AI will make those decisions without operator input.”
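That offline-then-online progression could be pictured roughly as follows; the class, the agreement threshold and the switch-over logic are hypothetical stand-ins for illustration, not Pleora’s software.

```python
# Simplified sketch only (not Pleora's code): while offline, the system asks the
# operator about flagged parts and tracks agreement with its own predictions;
# once agreement is high enough, it switches online and decides on its own.
import random

class InspectionAI:
    def __init__(self, online_accuracy_target=0.95, min_samples=50):
        self.online = False
        self.agreements = 0
        self.total = 0
        self.online_accuracy_target = online_accuracy_target
        self.min_samples = min_samples

    def predict_defect(self, part) -> bool:
        # Placeholder for the trained model's prediction.
        return part.get("suspicious", False)

    def inspect(self, part, ask_operator):
        prediction = self.predict_defect(part)
        if self.online:
            return prediction  # online: AI decides without operator input
        # Offline: flag the part, ask the operator, and track agreement.
        operator_says_defect = ask_operator(part, prediction)
        self.total += 1
        self.agreements += int(operator_says_defect == prediction)
        if (self.total >= self.min_samples
                and self.agreements / self.total >= self.online_accuracy_target):
            self.online = True  # confident enough to go online
        return operator_says_defect

# Example: simulate an operator who happens to agree with every flag.
ai = InspectionAI()
for i in range(60):
    part = {"id": i, "suspicious": random.random() < 0.1}
    ai.inspect(part, ask_operator=lambda p, pred: pred)
print("Running online:", ai.online)
```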
“It lets you scale the system with new capabilities without replacing the end-to-end system,” Goffin said. “You can keep your machine vision cameras that you have installed. Usually, these systems are pretty complex. … You don’t want to replace that. So, edge processing lets you add in AI into an automated inspection process, while keeping existing processes and infrastructure.”