Zebra Technologies, a leading digital solutions provider specializing in the intelligent connection of data, assets, and people, has unveiled a series of advanced AI features for its Aurora machine vision software. These enhancements introduce deep learning capabilities aimed at tackling complex visual inspection challenges. According to Zebra’s 2024 Manufacturing Vision Study, 61% of manufacturing leaders globally anticipate AI will drive growth by 2029. Zebra’s report on AI in the automotive industry adds that while AI, including deep learning, is already employed across the automotive supply chain, demand is growing for more advanced capabilities, and the new features are aimed at meeting that demand.
Zebra’s upgraded Aurora software suite now offers robust visual inspection solutions tailored for machine and line builders, engineers, programmers, and data scientists in industries such as automotive, electronics, semiconductor, food and beverage, and packaging. Key features include no-code deep learning optical character recognition (OCR), drag-and-drop environments, and extensive libraries that enable users to address complex use cases beyond the reach of traditional rules-based systems.
“Manufacturers are encountering persistent quality issues and new challenges as materials and sectors evolve, particularly in automotive and electronics,” said Donato Montanari, Vice President and General Manager of Machine Vision at Zebra Technologies. “They are seeking innovative solutions that enhance their existing toolsets with AI capabilities for more effective visual inspection, especially in complex scenarios.”
Aurora Design Assistant™
Zebra’s Aurora Design Assistant provides an integrated development environment where users can build applications by designing flowcharts rather than writing traditional code. It also allows for the creation of web-based human-machine interfaces (HMIs) for these applications. New features include deep learning object detection and an updated Aurora Imaging Copilot companion application with a dedicated workspace for training object detection models. Additional add-ons support deep learning model training with NVIDIA GPU cards and inference or prediction with NVIDIA GPUs and Intel integrated GPUs.
Aurora Vision Studio™
Aurora Vision Studio enables machine and computer vision engineers to swiftly develop, integrate, and monitor advanced vision applications. This hardware-agnostic software offers an intuitive graphical environment for creating sophisticated vision applications without coding. With over 3,000 proven filters, users can design tailored solutions through a streamlined three-step workflow: algorithm design, custom HMI creation, and deployment on a PC-based industrial computer. The deep learning toolchain has been upgraded with a new training engine that improves data balancing, resulting in faster and more reliable training; inference, though not training, is now also supported on Linux systems.
Aurora Imaging Library™
The Aurora Imaging Library software development kit is designed for experienced programmers working with C++, C#, and Python. It provides an extensive array of tools for processing and analyzing 2D and 3D images using both traditional and deep learning methods. Recent additions include anomaly detection tools for defect detection and assembly verification, utilizing unsupervised training with normal references. The deep learning-based OCR tool features a pre-trained neural network model capable of recognizing a wide range of characters and symbols without the need for specific font training, incorporating string models and constraints for more accurate readings.
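The anomaly detection approach described above trains only on defect-free "normal" references and flags anything that deviates from them. The sketch below is a minimal conceptual illustration of that idea using a per-pixel Gaussian model in plain NumPy; it is not the Aurora Imaging Library API, and all function names here are hypothetical.

```python
import numpy as np

def fit_normal_model(normal_images):
    # Hypothetical helper: learn per-pixel mean/std from defect-free references.
    stack = np.stack(normal_images).astype(float)
    return stack.mean(axis=0), stack.std(axis=0) + 1e-6  # avoid divide-by-zero

def anomaly_map(image, mean, std, z_thresh=4.0):
    # Flag pixels that deviate strongly from the learned normal appearance.
    z = np.abs(image.astype(float) - mean) / std
    return z > z_thresh

# Synthetic demo: twenty noisy "good" parts, then one with a bright blemish.
rng = np.random.default_rng(0)
normals = [50 + rng.normal(0, 2, (32, 32)) for _ in range(20)]
mean, std = fit_normal_model(normals)

defective = 50 + rng.normal(0, 2, (32, 32))
defective[10:14, 10:14] += 60  # injected defect
mask = anomaly_map(defective, mean, std)
```

Production tools use learned deep features rather than raw pixel statistics, but the workflow is the same: model only the normal appearance, then score deviations, so no labeled defect examples are required.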
Key Takeaways:
- Zebra’s Aurora machine vision software now includes advanced deep learning features to support manufacturers with visual inspection and quality challenges.
- The suite is designed to facilitate complex use cases with user-friendly tools for engineers, programmers, and data scientists.
- The 2024 Manufacturing Vision Study highlights a significant expectation for AI to drive industry growth by 2029.