The Future of Quality Detection and Vision Technology - An Interview with Dr Fabian Isernhagen

BAADER has a long history of providing vision systems for quality grading in distribution applications. Our globally recognized ClassifEYE® system has been in operation for many years. The latest innovation, AI-based ClassifEYE® 2.0, represents a substantial step forward, taking vision technology to the next level and enabling real-time processing optimization throughout the entire processing phase. In the following interview, Dr Fabian Isernhagen, Director of Vision and Machine Learning, offers insights into the BAADER development team.

Q1: What is your background, and what motivated your team to explore AI-based vision technology?

My background is in mathematics and computer science, with a Ph.D. in machine learning. When I started working in autonomous driving research, I supervised a bachelor’s student. We both worked on similar projects, focusing on camera and laser-based environmental perception for autonomous vehicles. However, there was a key difference: I had to use traditional algorithms because of regulations, while my student could use artificial intelligence. I reached a point where it became clear that classical algorithms simply could not match the capabilities of AI, especially in tasks like identifying street signs, detecting lanes, and recognizing objects.

When I joined BAADER in early 2018, I encountered a familiar choice: stick with traditional algorithms and image processing or, as Monty Python would say, “do something completely different.” I opted for the latter and initiated the training of neural networks to detect chickens, virtually decompose them, and individually assess the quality of their parts. This allows our new ClassifEYE® 2.0 to determine if a bird has imperfections, such as breast skin damage, unremoved feathers, or bruises, that require a product downgrade.
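
As a rough illustration of the kind of multi-stage pipeline described here (detect the bird, virtually decompose it into parts, then grade each part), the following minimal Python sketch uses stub functions in place of the trained networks. All function names, part labels, and defect categories are hypothetical placeholders, not the actual ClassifEYE® 2.0 implementation.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical defect labels; the real system's categories may differ.
DEFECTS = ("breast_skin_damage", "unremoved_feathers", "bruise")

@dataclass
class PartGrade:
    part: str                 # e.g. "breast", "wing", "leg"
    defects: List[str]        # defects detected on this part
    downgrade: bool           # whether the defects force a product downgrade

def detect_bird(frame) -> bool:
    """Stage 1: decide whether a bird is present in the camera frame.
    Stub standing in for a trained detection network."""
    return frame is not None

def decompose_into_parts(frame) -> List[str]:
    """Stage 2: virtually decompose the detected bird into parts.
    Stub standing in for a segmentation model."""
    return ["breast", "left_wing", "right_wing", "left_leg", "right_leg"]

def grade_part(frame, part: str) -> PartGrade:
    """Stage 3: assess the quality of a single part.
    Stub standing in for a per-part defect classifier."""
    defects: List[str] = []   # a real model would return predicted defects here
    return PartGrade(part=part, defects=defects, downgrade=bool(defects))

def grade_bird(frame) -> List[PartGrade]:
    """Run the full detect -> decompose -> grade pipeline on one frame."""
    if not detect_bird(frame):
        return []
    return [grade_part(frame, part) for part in decompose_into_parts(frame)]

if __name__ == "__main__":
    for grade in grade_bird(frame=object()):   # placeholder for a camera frame
        print(grade.part, "downgrade" if grade.downgrade else "ok", grade.defects)
```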

My team and I aim to transform the world of poultry processing. Currently, our customers produce based on input parameters in a strict and mostly rigid workflow: chicken in, product out. We solve problems where we must, but not where we can. When systems do not communicate with each other throughout the plant, we end up with multiple ‘data islands’, making it hard to create a flow of value on the chicken’s way through the factory. ClassifEYE® 2.0 introduces a paradigm shift by providing real-time insights and value throughout the entire processing chain.

BAADER’s vision and AI-based portfolio will expand with ClassifEYE® 2.0 products in the coming years, offering advanced technology at critical points in the production chain. This enables customers to detect and address issues in real time rather than relying on end-of-day reports, shifting from post-production responses to immediate alerts on deviations from target values.
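
To make the idea of an immediate alert on deviations from target values concrete, here is a minimal, hypothetical sketch. The target rate, window size, and alert channel are illustrative assumptions, not ClassifEYE® 2.0 parameters.

```python
from collections import deque

# Hypothetical target: at least 92% of birds grade as "A" over a rolling
# window of 500 birds. Real targets and alert channels are plant-specific.
TARGET_A_RATE = 0.92
WINDOW = 500

class DeviationAlert:
    """Raise an immediate alert when the rolling A-grade rate drops below target."""

    def __init__(self, target: float = TARGET_A_RATE, window: int = WINDOW):
        self.target = target
        self.results = deque(maxlen=window)

    def record(self, is_a_grade: bool) -> None:
        """Record one graded bird and check the rolling rate against the target."""
        self.results.append(is_a_grade)
        if len(self.results) == self.results.maxlen:
            rate = sum(self.results) / len(self.results)
            if rate < self.target:
                self.alert(rate)

    def alert(self, rate: float) -> None:
        # In production this might notify the line supervisor or a dashboard;
        # here it simply prints the deviation.
        print(f"ALERT: A-grade rate {rate:.1%} below target {self.target:.1%}")
```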

Q2: Could you explain the key components and technologies involved in ClassifEYE® 2.0?

Key components of ClassifEYE® 2.0 include a 1.4-megapixel colour camera, optical components like a lens and an LED flash, and state-of-the-art AI technology. The system mimics human perception, incorporating a camera, data transfer, and computer processing, adapting to environmental conditions. It leverages various imaging technologies, including grayscale and colour cameras, and has the potential to integrate ultrasonic, x-ray, infrared, and ultraviolet imaging. Using diverse data sources allows us to adapt to specific challenges, and we are actively exploring 3D and 4D data sources.

Q3: How does the AI-based vision system handle real-time monitoring, and what mechanisms are in place for generating reports?

During interactions with customers, it became evident that their most important requirement is real-time insight: not after the shift has ended, but immediately, enabling fast responses to critical and value-reducing processing issues. When we introduce the system to customers, explaining the advanced technology can be challenging. However, as soon as we showcase the dashboard and allow the customer to experience it firsthand, its value becomes immediately clear.

Q4: Can you share specific scenarios where this technology has demonstrated significant improvements within poultry processing?

Recently, we conducted an installation for a customer and after the initial setup and fine-tuning of the system’s settings, we handed over control to the customer and conducted training. The customer’s QA team was present and particularly inquired about the functionality they valued most: reporting and real-time performance. The QA team expressed great satisfaction with the system’s ability to generate reports with minute-level temporal resolution and to customize the arrangement of the most relevant information on the ClassifEYE® 2.0 Dashboard. The head of QA was enthusiastic about showcasing their personalized widget design on the dashboard. This level of acceptance and enthusiasm for the system affirmed that we were headed in the right direction with our approach.
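
As an illustration of what minute-level temporal resolution in reporting can mean in practice, the following sketch buckets per-bird results into one-minute windows. The data shapes and field names are hypothetical, not the actual ClassifEYE® 2.0 reporting format.

```python
from collections import defaultdict
from datetime import datetime
from typing import Dict, Iterable, Tuple

def minute_report(results: Iterable[Tuple[datetime, bool]]) -> Dict[str, Dict[str, int]]:
    """Aggregate per-bird (timestamp, downgraded) results into per-minute buckets."""
    report: Dict[str, Dict[str, int]] = defaultdict(lambda: {"total": 0, "downgraded": 0})
    for timestamp, downgraded in results:
        key = timestamp.strftime("%Y-%m-%d %H:%M")   # minute-level bucket
        report[key]["total"] += 1
        report[key]["downgraded"] += int(downgraded)
    return dict(report)

if __name__ == "__main__":
    sample = [
        (datetime(2024, 1, 8, 9, 0, 12), False),
        (datetime(2024, 1, 8, 9, 0, 47), True),
        (datetime(2024, 1, 8, 9, 1, 5), False),
    ]
    for minute, counts in minute_report(sample).items():
        print(minute, counts)
```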

Q5: What safety and ethical considerations have you taken into account when implementing AI, and are there any compliance challenges?

Ethical concerns have been raised regarding AI, particularly generative AI, which often faces ethical conflicts. Because our AI plays a vital role in processing, it too demands proper safeguards to fulfil its intended function. To address this, we place a strong emphasis on extensive testing, simulating potential scenarios that could occur during production with our system.

I am convinced that ClassifEYE® 2.0 will contribute to more ethical and morally acceptable processing. In terms of animal welfare, we see a promising future as our system is highly adaptable over time. Additionally, as we retain individual data for each bird analysed in our system, we will have the capability to trace these birds back to their respective farmers. Our plan includes installing ClassifEYE® 2.0 systems for quality control immediately after defeathering. This step is crucial in identifying deficiencies that are likely related to chicken breeding and live bird handling. Ultimately, this option will enable poultry processors to pay based on quality rather than just quantity, contributing to higher bird welfare and improved overall bird quality.

Currently, we do not face or expect regulatory challenges. As a matter of fact, I envision that with ClassifEYE® 2.0, we can set the stage for future regulatory and compliance requirements in poultry processing, specifically related to maintaining quality in food production. It would be remarkable to witness ClassifEYE® 2.0 become a mandatory hallmark of quality, aligning with our customers’ objectives. Not only is this a sound economic strategy, but it also underscores our commitment to delivering excellence.

Dr. Fabian Isernhagen
