Success Story

Machine Learning Technology Helps STIHL Cut Cost, Time and Human Error

STIHL is a global name in the development, manufacturing and distribution of power tools for forestry and agriculture. As the world's top-selling chainsaw brand, STIHL makes quality assurance an integral part of its production process.


Overview: Manufacturing Challenge

Implement a fully automated inspection process that uses machine learning technology to accurately verify components are properly manufactured before they are used.

Benefits / Outcomes

  • 99.5% hit rate (classification accuracy)
  • Significant cost savings
  • Time saved through a fully automated inspection process

Customer

STIHL
Waiblingen, Germany

Partner

Rauscher GmbH

Industry

Manufacturing

Solutions

Aurora Design Assistant
Aurora Imaging CoPilot
Zebra 4Sight GPm vision controller

About STIHL

The STIHL Group has been a global name in the development, manufacturing and distribution of power tools for forestry and agriculture since 1926. Serving the professional forestry and agricultural sectors as well as construction and consumer markets, STIHL has been the world’s top-selling chainsaw brand since 1971.

With a worldwide sales volume of nearly 4 billion euros and a workforce of 16,722, STIHL oversees its own manufacturing plants in seven countries: Germany, the USA, Brazil, Switzerland, Austria, China and the Philippines.

Maintaining a high degree of manufacturing verticality ensures key knowledge is developed and upheld in-house.

Quality assurance is integral to STIHL’s production process; as part of an upgrade, STIHL sought a fully automated solution for visual quality assessment. “Previously, the objective visual quality test was carried out by people,” says Alexander Fromm, Engineer for Automation Systems at STIHL Group. “The test of a vision system, and its measure of success, was that the same assessment could at the very least be reproduced using neural network technology.”

The Challenge

The inspection that STIHL sought to enhance focuses on the production of gasoline suction heads, a component of the chainsaw. By filtering out dirt, wood shavings and other foreign particles, these suction heads ensure that no debris enters the combustion chamber, where it could damage the power tool.

The gasoline suction heads consist of a plastic body and a piece of fabric that is applied later in the assembly; at this stage they are a semi-finished product, and the inspection occurs midway through the production process. Each suction head features four footbridges that must be assessed independently after injection molding and before the part proceeds to the next step in the manufacturing process.

It is crucial to assess and classify the seams on these components to ensure that the footbridges are adequately positioned and sealed before they are used. The footbridge gives the component its stability, stretching the fabric of the filter and enclosing the fabric seam so it does not tear open.

Prior to the implementation of STIHL’s new system, human operators would perform an objective visual quality test to determine whether the components were adequate. Though the production machine was always automated, operator intervention was necessary when the output started to deviate from STIHL’s high quality standards. In those cases, the operator would need to visually inspect the batch of parts to determine whether the production machine itself was at fault.

STIHL sought a new solution, one that would replace the human element with machine vision based on deep learning, so the quality-assurance assessment could be automated to cut costs and save time. “When we began the process of considering a machine vision solution, every gasoline suction head was being checked by a human,” says Fromm. “However, the parts are very small and the error features are quite hard to detect, so we determined a need for deploying machine vision into the inspection process.” Two figures frame inspection performance: the slip rate, the share of bad parts erroneously classified as good, and the hit rate, the overall classification accuracy.
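To see how the two metrics relate, the sketch below works them out from a set of invented inspection counts; the numbers are purely illustrative and not STIHL’s production data.

```python
# Hit rate  = share of inspection decisions that are correct (accuracy).
# Slip rate = share of bad parts erroneously passed as good.
# All counts here are hypothetical, for illustration only.

def inspection_metrics(good_passed, good_rejected, bad_passed, bad_rejected):
    total = good_passed + good_rejected + bad_passed + bad_rejected
    hit_rate = (good_passed + bad_rejected) / total        # correct decisions
    slip_rate = bad_passed / (bad_passed + bad_rejected)   # escaped defects
    return hit_rate, slip_rate

hit, slip = inspection_metrics(good_passed=960, good_rejected=5,
                               bad_passed=1, bad_rejected=34)
print(f"hit rate: {hit:.1%}, slip rate: {slip:.1%}")  # hit rate: 99.4%, slip rate: 2.9%
```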

STIHL forged a relationship with Rauscher GmbH, a Zebra Registered Reseller, after meeting at a trade fair. The company appreciates being able to work with a single solutions provider, which has been instrumental in getting its systems up and running quickly. It is because of this positive history with Rauscher GmbH that STIHL chose to work with them on the development of the new system.

The Solution

“The inspection process for each part involves looking at four distinct footbridges, and the machine processes 60 parts per minute,” Fromm outlines. “The inspection therefore occurs at a rate of 240 images per minute.” Conventional image processing tools had been used to evaluate the parts; deep learning extends image processing capabilities in cases where conventional methods produce inconclusive results due to high natural variability. “STIHL determined that rule-based image processing is not appropriate, because the component images vary too much; even with hit rates ranging from 80% to 95%, the error rate is too high,” Fromm continues. “The new system would thus be required to yield fewer slips and a higher hit rate. The newly implemented classification steps yield hit rates of 99.5% accuracy, a tremendous improvement.”

STIHL’s new vision system comprises Aurora Design Assistant vision software running on a Zebra 4Sight GPm vision controller, selected for its I/O capabilities, PROFINET connections, and Power-over-Ethernet (PoE) support. The system also includes a PoE line-scan camera, a rotary table, an encoder, and ultra-high-intensity line lights (LL230 Series) from Advanced Illumination.
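The published details do not include STIHL’s actual project configuration in Aurora Design Assistant. Purely as an illustration of the cycle described above (60 parts per minute, four footbridge regions per part), a generic sketch with hypothetical camera and classifier stand-ins might look like this:

```python
import time

PARTS_PER_MINUTE = 60
REGIONS_PER_PART = 4                       # four footbridges per suction head
CYCLE_BUDGET_S = 60 / PARTS_PER_MINUTE     # 1.0 s per part, roughly 250 ms per footbridge image

def inspect_part(camera, classifier, roi_boxes):
    """Classify each footbridge region of one part; pass the part only if all four are good.

    `camera`, `classifier` and `roi_boxes` are hypothetical stand-ins for illustration,
    not the Aurora Design Assistant interfaces used in the real system.
    """
    assert len(roi_boxes) == REGIONS_PER_PART
    start = time.monotonic()
    frame = camera.grab()                                  # image acquisition, triggered per part
    labels = []
    for (x0, y0, x1, y1) in roi_boxes:                     # one region of interest per footbridge
        labels.append(classifier.predict(frame[y0:y1, x0:x1]))
    assert time.monotonic() - start < CYCLE_BUDGET_S, "inspection exceeded the cycle budget"
    return all(label == "IO" for label in labels)          # 'IO' = good, 'NIO' = not good
```

At 240 images per minute, each footbridge classification has roughly a quarter of a second available; that throughput is the constraint any implementation of this inspection has to respect.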

The Zebra Difference: Outcome and Benefits

Effective training of a neural network is not a trivial task: the images must be sufficient in number, appropriately labeled and representative of the expected application variations, and they must be captured on a setup that yields repeatable imaging conditions. With this in mind, the team at STIHL engaged Zebra’s vision experts to undertake the training of the convolutional neural network (CNN) on their behalf.

Fromm describes the collection of images as representing “a plastic part with a fabric seam, photographed from the inside. In the images, only the footbridge itself contains important information; everything else is irrelevant. To prepare the dataset, each footbridge is therefore cut out of the general image and sorted into folders classed as ‘good (IO)’ and ‘not good (NIO)’.” The team at STIHL manually labeled 2,000 representative parts, each with four images, for a total dataset of 8,000 images; without guidance, this level of intricacy would have been exceptionally challenging.
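As a rough sketch of the data-preparation step Fromm describes, the snippet below crops the four footbridge regions out of a full part image and sorts the crops into ‘IO’ and ‘NIO’ folders. The file layout and crop coordinates are invented for illustration; the real labels came from STIHL’s manual review of the 2,000 parts.

```python
from pathlib import Path
from PIL import Image

# Hypothetical crop coordinates (left, upper, right, lower) for the four footbridges.
FOOTBRIDGE_BOXES = [(40, 30, 200, 190), (220, 30, 380, 190),
                    (40, 210, 200, 370), (220, 210, 380, 370)]

def prepare_part(image_path: Path, labels: list[str], out_root: Path) -> None:
    """Crop the four footbridge regions from one part image and sort them
    into 'IO' (good) and 'NIO' (not good) folders for training."""
    full = Image.open(image_path)
    for idx, (box, label) in enumerate(zip(FOOTBRIDGE_BOXES, labels)):
        crop = full.crop(box)                       # keep only the footbridge region
        target = out_root / label                   # e.g. dataset/IO or dataset/NIO
        target.mkdir(parents=True, exist_ok=True)
        crop.save(target / f"{image_path.stem}_footbridge{idx}.png")
```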

The collection of 8,000 images was provided to the vision experts, who employed Aurora Imaging CoPilot’s interactive environment to train the CNN and produce a classification context file. The file was then returned to STIHL, imported into the Aurora Design Assistant software environment, and used to automatically classify new images into the pre-determined classes. Aurora Imaging CoPilot gives access to pre-defined CNN architectures and provides a user-friendly experience for building the image dataset necessary for training.
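STIHL’s network was trained by Zebra’s vision experts in Aurora Imaging CoPilot, and that workflow is not reproduced here. As a purely generic analogue of training a classifier on the folder-per-class dataset described above, a PyTorch sketch could look like this; the architecture, paths and hyperparameters are all illustrative assumptions.

```python
import torch
from torch import nn
from torchvision import datasets, models, transforms

# Generic stand-in for the training step, not Zebra's actual tooling.
# Expects the folder-per-class layout described above: dataset/IO, dataset/NIO.
tfm = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
data = datasets.ImageFolder("dataset", transform=tfm)          # classes inferred from folder names
loader = torch.utils.data.DataLoader(data, batch_size=32, shuffle=True)

model = models.resnet18(weights="IMAGENET1K_V1")               # small pre-defined CNN architecture
model.fc = nn.Linear(model.fc.in_features, len(data.classes))  # two outputs: IO vs NIO
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):                                         # illustrative epoch count
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()

torch.save(model.state_dict(), "footbridge_classifier.pt")
```

The saved weights file plays a role loosely analogous to the classification context file that Aurora Imaging CoPilot exports for use in Aurora Design Assistant.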

Fromm affirms that “our main point of contact was through Rauscher GmbH; we received quick answers and very good support. As the system was being brought online, the STIHL team undertook online training using the Vision Academy portal to help strengthen our knowledge of how best to use the Aurora Design Assistant machine vision software.”

Bringing it Online
With support, STIHL successfully navigated the challenge of establishing a correct and repeatable presentation of the footbridge so that images could be taken for training the CNN. Another challenge was simply collecting the sheer volume of images required, along with the careful cropping, sorting and labeling of those images.

“It was a challenge,” Fromm notes. “But the more time you invest in procuring good images, the better results you get!”

Conclusion
With the deployment of its new vision system, STIHL is overwhelmingly pleased with the enhancements that Aurora Design Assistant’s deep-learning-based classification tools have made to its quality-assurance measures. Plans are already underway to develop a second, similar system; image collection and CNN training have already begun.

“Deep learning technology extends the field of image processing into areas where conventional image processing yields inadequate results,” Fromm concludes. “Implementation of this new system, one that effectively deploys deep learning, has replaced the objective visual processes that STIHL had in place. As a result, we anticipate great improvements in our efficiency and the ability to take on new tasks, ensuring an overall improvement in the quality of our products.”