Quality and Performance Optimization through Generative Adversarial Networks based Anomaly Detection

Research output: Contribution to a scientific event (unpublished) › Poster › Peer-reviewed


Abstract

Over the past few decades, the industrial sector has evolved alongside new technologies, culminating in the current era of cyber-physical systems known as Industry 4.0. This paradigm incorporates a range of technologies, including machine learning and decision systems, with the primary objective of optimizing factory productivity. Anomaly detection is a crucial quality-control step and represents an important part of these opportunities. Our research focuses on improving anomaly detection, specifically for Printed Circuit Board Assembly (PCBA) manufacturing in the automotive industry, as well as for other kinds of industrial and medical images.
Automatic visual inspection is the most common approach to anomaly detection: a product image is judged by an algorithm, and an operator is asked to confirm any detected anomaly. Traditional algorithms compare images to a golden sample, but they suffer from practical drawbacks, such as a high false-positive rate that leads to unnecessary manual checks of PCBAs, or human misjudgment. Deep representation learning techniques have been proposed instead, in which a model is trained on the normal data distribution. The model reconstructs each new input image, and an anomaly score is computed from the differences between the input and reconstructed images. The difficulty of reconstructing complex images and the small gap between normal and abnormal variability remain major challenges. The anomaly-detection literature based on deep learning can be leveraged to address this issue, but adapting those methods to real-world industrial settings may require additional tuning.
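The reconstruction-based scoring described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes images are float arrays in [0, 1] and uses a simple mean absolute residual as the anomaly score (the actual work combines richer metrics, including network losses).

```python
import numpy as np

def anomaly_score(input_img: np.ndarray, recon_img: np.ndarray) -> float:
    """Mean absolute pixel-wise residual between an input image and its
    reconstruction by a model trained only on normal data. Higher scores
    indicate a larger deviation from the learned normal distribution."""
    residual = np.abs(input_img.astype(np.float64) - recon_img.astype(np.float64))
    return float(residual.mean())

# Toy example: a model trained on normal images reconstructs a near-normal
# image, so a defective input leaves a residual where the defect is.
normal_input = np.zeros((8, 8))
reconstruction = np.zeros((8, 8))       # ideal reconstruction of normality
defective_input = normal_input.copy()
defective_input[2:4, 2:4] = 1.0         # simulated defect patch

assert anomaly_score(normal_input, reconstruction) < anomaly_score(defective_input, reconstruction)
```

The defect patch raises the residual mean, which is what a downstream classifier or threshold can exploit.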
The idea of Generative Adversarial Networks (GANs) is to adversarially train a generator and a discriminator so that the generator captures, as faithfully as possible, the distribution of the normal data. For a challenging dataset such as the PCBA one, our approach represents the images as a composition of coherent, rich details through a Vector Quantized GAN (VQGAN). The residual image and the network losses then serve as metrics quantifying how far the inferred image deviates from normality. These metrics are finally fed to a binary extra-trees classifier to discriminate normal from abnormal products. A second approach takes the few available abnormal samples into consideration in order to train a CycleGAN. Its objective is to generate a normal version of the inferred image, thanks to a model previously trained to reconstruct images from both normal and abnormal examples. Based on a threshold set to reach zero false negatives (ZFN) on an FID or SSE anomaly score, the image is flagged as normal or abnormal.
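The zero-false-negative (ZFN) thresholding step can be illustrated with a small sketch. This is a hypothetical example, assuming precomputed anomaly scores (FID- or SSE-style) for a validation set: the threshold is set at the lowest score observed among abnormal samples, so that every abnormal image is flagged, at the cost of some false positives among normal images.

```python
import numpy as np

def zfn_threshold(abnormal_scores: np.ndarray) -> float:
    """Largest threshold that still catches every abnormal sample:
    any image scoring at or above it is flagged as abnormal,
    guaranteeing zero false negatives on this validation set."""
    return float(abnormal_scores.min())

def flag_abnormal(scores: np.ndarray, threshold: float) -> np.ndarray:
    """Boolean mask: True means the image is flagged as abnormal."""
    return scores >= threshold

# Hypothetical validation scores (not from the paper's datasets).
abnormal_scores = np.array([0.80, 0.55, 0.90])
normal_scores = np.array([0.10, 0.30, 0.60, 0.20])

t = zfn_threshold(abnormal_scores)
assert flag_abnormal(abnormal_scores, t).all()   # every defect is caught
false_positives = flag_abnormal(normal_scores, t).sum()  # price of ZFN
```

Normal images whose scores exceed the threshold become false positives; this trade-off explains why accuracy under the ZFN constraint is lower than unconstrained accuracy in the results below.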
The VQGAN-based approach achieves an accuracy of 95.69% overall, and 87.93% under the ZFN constraint, on the PCBA dataset. The CycleGAN-based approach reaches an average accuracy of 97.2% (85.4% under ZFN) for texture-like images across industrial and medical datasets.
As future work, the quality constraint will be applied directly at the training step, and the Augmented Lagrangian Method is under study to improve current performance. Integration of the method into a real-world production line will also be explored, taking into account the business philosophy, constraints, and workers' habits.
Original language: English
Publication status: Unpublished - 14 Apr 2023
Event: 21st Symposium on Intelligent Data Analysis (IDA 2023) - UCL, Louvain-la-Neuve, Belgium
Duration: 12 Apr 2023 - 14 Apr 2023
https://ida2023.org/


