Machine vision has been around for a long time. In fact, one may argue that the basic theoretical, optical, and mathematical principles of machine vision have been understood since the ancient Greeks discovered how the camera obscura works. But it took humankind many more centuries to deploy that technology on a massive scale.
The year was 1981 when Richard Lyon, an American scientist, invented the optical mouse while working at Xerox Palo Alto Research Center. Thirty-five years and several hundred million units later, this small companion of our digital life is by far the most successful application of machine vision technology today, certainly in terms of units sold and customer penetration. In geek speak, an optical mouse is nothing other than an image-acquisition device with an embedded vision processor running specialized image-processing software.
Another machine vision application, around since the 1970s and pervasive in our daily life, is the barcode reader. The familiar “beep” we hear when the cashier of our favorite grocery store scans our goods signals that the machine has “seen” a bunch of odd-looking bars and correctly decoded them into numbers. A few billion barcodes are read every day around the world.
In technical jargon, the optical mouse performs a function called “guidance” (e.g., guiding a cursor through websites), whereas the barcode reader performs a function called “identification” (of products, for instance).
From vision to classification
Another critical and widespread function performed by machine vision is “classification”. Humans are excellent at classification. One quick look at a basket full of apples, for instance, and we can immediately distinguish good apples from bad or damaged ones, red apples from green ones, and so on. The most basic form of classification is binary: yes or no. Binary classification answers questions such as “is this apple good?”
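As a toy illustration of what a binary classifier boils down to (the feature names and thresholds below are invented for the example, not real inspection criteria), the “is this apple good?” question is just a function from measurements to yes/no:

```python
# Toy binary classifier: decides "good" vs "not good" from two
# hypothetical apple measurements. Thresholds are invented for
# illustration only.
def is_good_apple(bruise_area_cm2: float, firmness: float) -> bool:
    # An apple passes only if it has little bruising AND is firm enough.
    return bruise_area_cm2 < 1.0 and firmness > 0.6

print(is_good_apple(0.2, 0.9))  # firm, barely bruised -> True
print(is_good_apple(3.5, 0.9))  # heavily bruised -> False
```

The entire difficulty of machine vision classification lies in computing reliable inputs like these from raw pixels.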
In a real-life setting such as the manufacturing industry, binary classification becomes critical because it answers the essential question of whether a product has been manufactured according to specification.
A problem to solve: flexibility in classification
For the last forty years, traditional machine vision has struggled with classifying goods and products, and for a good reason: classification is a very complex function to program. It requires an experienced programmer and a product domain expert to sit together and write out every rule and feature needed to distinguish a “good” object from a “not-good” one. Worse, the carefully crafted program then needs to be tweaked and adjusted whenever the objects to be classified change even slightly.
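A minimal sketch of that maintenance burden (all rule names and thresholds here are invented): every product variant needs its own hand-tuned rule set, and a variant the rules were not written for simply fails.

```python
# Sketch of a traditional rule-based classifier. Each product variant
# carries its own hand-tuned thresholds; introducing a new variant
# means a programmer and a domain expert must write a new entry.
RULES = {
    "red_apple":   {"min_red_ratio": 0.70, "max_blemish_px": 150},
    "green_apple": {"min_red_ratio": 0.00, "max_blemish_px": 150},
}

def passes(variant: str, red_ratio: float, blemish_px: int) -> bool:
    rule = RULES[variant]
    return (red_ratio >= rule["min_red_ratio"]
            and blemish_px <= rule["max_blemish_px"])

# A perfectly fine green-ish apple fails the red-apple rule set...
print(passes("red_apple", 0.60, 40))    # False
# ...and only passes once someone has hand-written a rule for its variant.
print(passes("green_apple", 0.60, 40))  # True
```

The fragility is structural: the knowledge lives in hand-written thresholds, so every change to the product means another round of rule tuning.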
In modern manufacturing, where personalization and customization are key, inflexible systems are not acceptable. For this reason, in the vast majority of plants around the world today, many binary classification tasks are performed manually by human inspectors, who pick up each manufactured product and check it carefully for potential defects. Regardless of the intrinsic economic value of the object, from inexpensive foods to top-of-the-line iPhones, visual inspection carried out by human operators is often a critical part of a plant’s quality control strategy. In fact, more than 50% of all quality checks in the world are still performed by humans.
The major problem with human inspection is its huge variability: not only across different production lines, shifts, and operators, but also within a single shift, as eyes tire and operators need frequent, long breaks to rest them.
Spotting defects is not an easy task, and performance is affected by many parameters: operator training and turnover, expertise, eye health, and (the inconsistency of) individual skill. This is exactly the problem that deevio is solving: our unique deep learning technology assists the hundreds of thousands of people around the world whose job is to stare eight hours a day at all sorts of products, checking for manufacturing defects.
deevio: improving quality control with deep learning
Our idea is simple. We assist quality control operators by placing a camera and a monitor next to their workbench. The operator places the object to be inspected under the camera. After a short analysis, the operator looks up at the monitor, where a magnified picture of the object is displayed with an overlay highlighting the critical areas and all defects detected by our deep learning model. It is then up to the operator to check each flagged area, confirm or dismiss the defect, and make the ultimate pass-or-fail decision.
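The inspection loop described above can be sketched as follows. The function and class names are hypothetical illustrations, not deevio’s actual API; the key point is that the model proposes defect regions and a confidence score, and only confident detections are shown to the operator, who makes the final call.

```python
# Sketch of the operator-assist loop: run the model on a captured
# image, keep only confident defect candidates, and hand them to the
# operator as highlighted regions. All names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Defect:
    x: int        # top-left corner of the defect region (pixels)
    y: int
    w: int        # region width and height (pixels)
    h: int
    score: float  # model confidence in [0, 1]

def inspect(image, model, threshold: float = 0.5):
    """Run the model and return only the regions worth highlighting."""
    candidates = model(image)  # model returns a list of Defect candidates
    return [d for d in candidates if d.score >= threshold]

# Dummy stand-in model for illustration: one confident hit, one weak one.
fake_model = lambda img: [Defect(10, 20, 5, 5, 0.9), Defect(50, 60, 3, 3, 0.2)]
flagged = inspect(None, fake_model)
print(len(flagged))  # only the high-confidence defect reaches the operator
```

Keeping the operator in the loop for the final pass-or-fail decision means the threshold can be tuned conservatively: a false alarm costs a glance, while a missed defect costs a faulty shipped product.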
The camera and the monitor are connected to deevio’s proprietary hardware, the AI-Box, which runs a customer-specific deep learning model trained to inspect that particular product.
Inside our AI-Box
The terms “deep learning”, “machine learning”, and “artificial intelligence” have been so hyped recently that there is no point in providing yet more definitions. But I will point out one fact that is perhaps not widely known: the remarkable progress deep learning has made in solving image classification tasks.
Nowadays, it is safe to say that deep-learning-enabled machines can detect, locate, and classify objects more accurately than humans, as shown in the figure below.
And this is exactly the foundation of our idea: by applying deep learning, we can quickly, efficiently, and inexpensively train a machine to visually inspect an object more accurately, consistently, and reliably than a human can.
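The shift from the rule-writing approach to training can be made concrete with a deliberately tiny example. Real systems learn deep convolutional networks on images; the two-feature logistic-regression toy below (all data invented) only illustrates the principle that the decision boundary is learned from labeled examples rather than hand-coded:

```python
# Minimal illustration of "training instead of programming": a tiny
# logistic-regression classifier learns a good/defective boundary from
# labeled examples, with no hand-written rules. Toy data, two features.
import math

# Labeled examples: (feature1, feature2) -> 1 = good, 0 = defective.
data = [((0.9, 0.1), 1), ((0.8, 0.2), 1), ((0.2, 0.9), 0), ((0.1, 0.8), 0)]

w = [0.0, 0.0]
b = 0.0
for _ in range(1000):  # plain gradient descent on the logistic loss
    for (x1, x2), y in data:
        p = 1 / (1 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))
        err = p - y
        w[0] -= 0.5 * err * x1
        w[1] -= 0.5 * err * x2
        b    -= 0.5 * err

def predict(x1: float, x2: float) -> bool:
    return 1 / (1 + math.exp(-(w[0] * x1 + w[1] * x2 + b))) > 0.5

print(predict(0.85, 0.15))  # True: learned to call this one "good"
print(predict(0.15, 0.85))  # False: learned to call this one defective
```

Retargeting the classifier to a new product variant then means collecting and labeling new examples and retraining, rather than rewriting rules.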
The implications are staggering: manufacturing is a 12-trillion-dollar industry employing 340 million workers in plants around the world. Of these, roughly one third work in quality assurance. In other words, we have the opportunity to change the working lives of over 100 million people. Wish us luck!
For more information regarding our solution, check out deevio.ai or shoot us an email at email@example.com.
Thursday, September 20, 2018