BMW has made history by becoming the first automaker in the world to adopt an automated, end-to-end digital method for finishing painted vehicle surfaces in regular production.
Robots controlled by artificial intelligence inspect each car individually to ensure it meets predefined quality criteria. As a result, operations are more reliable, lead times are shorter, and vehicle surface quality is consistently high. Data stored in the cloud also makes it possible to investigate the root causes of surface finish faults.
In the processing booth, four robots on the paint line are positioned around a freshly painted body. As if on cue, they begin working on the surface: sanding, polishing, switching attachments and sandpaper, and adjusting the polishing compound. Cameras document the activity.
“What is special about this is that the robots work on each body exactly where needed, because the tiny specks and bumps that can appear after the topcoat is applied, and that we want to remove, are in different spots on each vehicle,” says Stefan Auflitsch, manager of production paint application and finish at the facility. “Robots are normally programmed to follow the same pattern until they are reprogrammed. Artificial intelligence allows them to work in a more individualized way. Up to 1,000 vehicles can go through the finishing process on a working day, which means 1,000 different processes.”
BMW has used automated surface processing in series production at its Regensburg plant since March 2022, making it the first factory in the world to apply AI-based processing at this scale. The step is preceded by automated surface inspection, which has long been considered state-of-the-art in the automotive industry. During inspection, the features that need to be processed after the topcoat has been applied are identified and recorded.
During the automated surface inspection, the system first uses deflectometry to find deviations. Large monitors project black-and-white striped patterns onto the vehicle while cameras scan the surface; distortions in the reflected stripe pattern reveal even the smallest irregularities in the glossy paintwork. Like a well-trained eye, the cameras recognize areas that deviate from the ideal and send this information straight to a computer system. The computer records the precise location, shape, and size of each deviation, generates a digital 3D image from the data, and classifies the deviations against measurable standards. Every vehicle surface is thus inspected and then treated as needed to ensure customer satisfaction.
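BMW’s actual software is not public, so the following minimal Python sketch only illustrates the general idea of recording detected deviations and classifying them against measurable criteria. The class, field names, and threshold values are illustrative assumptions, not BMW’s real data model or standards.

```python
from dataclasses import dataclass

# Illustrative thresholds (assumed values, not BMW's actual standards).
MIN_DEFECT_DIAMETER_MM = 0.3   # deviations smaller than this are ignored
MAX_ROBOT_DIAMETER_MM = 5.0    # larger deviations go to a human finisher

@dataclass
class Deviation:
    """A single surface deviation found during inspection."""
    x_mm: float        # position in a body-fixed coordinate frame
    y_mm: float
    z_mm: float
    diameter_mm: float
    height_um: float   # how far the speck or bump rises above the topcoat

def classify(dev: Deviation) -> str:
    """Decide how a deviation should be handled, based on measurable criteria."""
    if dev.diameter_mm < MIN_DEFECT_DIAMETER_MM:
        return "ignore"          # below the visibility threshold
    if dev.diameter_mm > MAX_ROBOT_DIAMETER_MM:
        return "manual_finish"   # outside the robots' working range
    return "robot_finish"        # sand and polish automatically

if __name__ == "__main__":
    scan = [
        Deviation(x_mm=812.0, y_mm=140.5, z_mm=1103.2, diameter_mm=0.8, height_um=12.0),
        Deviation(x_mm=455.3, y_mm=-220.1, z_mm=980.0, diameter_mm=0.1, height_um=2.0),
    ]
    for dev in scan:
        print(dev, "->", classify(dev))
```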
“Today, the system already knows more than our best specialists combined. We fine-tuned it with the collective knowledge of our team; the equipment’s effectiveness depends on our colleagues’ expert understanding, which we incorporated into the programming,” explains project manager Daniel Poggensee. “On the basis of this information, the algorithm now recognizes and decides objectively which features require postprocessing.”
From the gathered data, the technology builds a unique profile for each body, which then serves as the basis for individualized surface processing. This means that no bump, however small, goes undetected.
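As a purely illustrative sketch of what such a per-body profile could look like in code, building on the hypothetical Deviation class and classify() helper above (the structure and field names are assumptions, not BMW’s data model):

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class BodyProfile:
    """Hypothetical per-body record that collects every detected deviation."""
    vin: str                                   # identifies the individual vehicle
    deviations: List[Deviation] = field(default_factory=list)

    def add(self, dev: Deviation) -> None:
        self.deviations.append(dev)

    def work_list(self) -> List[Deviation]:
        """Only the deviations the robots should actually process."""
        return [d for d in self.deviations if classify(d) == "robot_finish"]
```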
The new approach offers more than accurate defect recognition and a shorter process lead time: automated surface processing handles all detected deviations in the best possible sequence, promptly, reliably, and with consistently high quality.
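The article does not say how that sequence is computed. As a rough illustration of how detected deviations could be ordered to reduce robot travel between spots, here is a simple nearest-neighbor heuristic in Python; the function, coordinates, and the heuristic itself are assumptions, not BMW’s actual optimization.

```python
import math
from typing import List, Tuple

Point = Tuple[float, float, float]  # (x, y, z) position of a deviation in mm

def nearest_neighbor_order(devs: List[Point], start: Point = (0.0, 0.0, 0.0)) -> List[Point]:
    """Greedy ordering: always move to the closest unprocessed deviation.

    A simple stand-in for whatever sequencing the real system uses; it avoids
    obvious back-and-forth travel but is not globally optimal.
    """
    remaining = list(devs)
    order: List[Point] = []
    current = start
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(current, p))
        remaining.remove(nxt)
        order.append(nxt)
        current = nxt
    return order

if __name__ == "__main__":
    deviations = [(812.0, 140.5, 1103.2), (455.3, -220.1, 980.0), (460.0, -218.0, 985.0)]
    print(nearest_neighbor_order(deviations))
```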
The use of robots does have inherent limits, however. They cannot process the edges of the body or the last few millimeters next to doors and other attachments, and the fuel filler flap is too delicate for them. For this reason, the final touches and a thorough examination of the body are still carried out by qualified staff. The AI data is just as useful to these employees as it is to the robots: to make sure nothing is missed, a laser projector marks the relevant areas of the body surface.
Poggensee has further ambitions for the technology based on the project’s success.
“We anticipate soon being able to intervene in the process even earlier if there are any inconsistencies, which will enable us to prevent faults from occurring in the first place,” he says. “Thanks to the data in the cloud, we expect to be able to do that in the near future.”
The technology could also be used to record human actions automatically, so that workers no longer have to switch back and forth between the body and a computer to log what they have done.
Credits: Assembly Mag
Click on the following link Metrologically Speaking to read more such blogs about the Metrology Industry.