A University of Windsor PhD student has won a $10,000 scholarship in an AUTO21-sponsored competition.
AUTO21 is Canada's national automotive research program, which provides funding to more than 50 applied research and development projects at 45 universities across the country.
Siddhant Ahuja, a student in electrical and computer engineering, took the top prize in a recent technology competition for developing a visual sensor network that can track products and people through an automated production line.
His system begins with the tiny cameras found in cellphones. When linked up with high-power microprocessors, the cameras keep the robots working smoothly in three dimensions.
Vaughan: Sid, what's the invention?
Ahuja: The invention is essentially a miniaturization of a stereo vision system: two cameras looking at a subject in front of them and perceiving the distance between themselves and the subject, kind of like the way your eyes work.
The cameras are off the shelf and are actually the ones used in BlackBerry products. They are very tiny and are connected to an Analog Devices 1.2-gigahertz dual-core processor.
We have replaced big cameras, big computers and big networking equipment with one tiny package in order to integrate it with robots.
We built the software, integrated it with the hardware and everything.
It was a project done under the supervision of Dr. Jonathan Wu, who is my PhD Supervisor and also the Canada Research Chair in automotive sensors and systems.
Vaughan: What will it do?
Ahuja: It can help in automation, quality inspection and tracking objects in a factory setting.
It has built-in algorithms that we developed in this lab. Some detect distances, some track people, some are for quality inspection.
Vaughan: Well, I've been in lots of automated factories, with robots and laser measurement and inspection everywhere.
Ahuja: There are many robots and many lasers in a factory setting. The difference is that each robot and every motion is pre-programmed. A human operator has pre-programmed them to operate in certain ways, to go one millimetre along this axis or that axis.
The problem comes when the part the robot is trying to pick up moves a little bit. The robot has been pre-programmed to go to a certain point and pick it up; if the part isn't there, the whole line shuts down.
Lasers give you a one-dimensional scan. Our cameras provide a 3D scan, so whatever the module is looking at, it tells you the distances of all the points in its field of view.
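Ahuja's two-camera depth idea can be illustrated with the standard stereo triangulation formula: depth equals focal length times baseline divided by disparity. A minimal sketch in Python, where the focal length, baseline and disparity figures are hypothetical stand-ins rather than the actual module's parameters:

```python
# Minimal sketch of stereo depth: Z = f * B / d, where
# f is focal length in pixels, B is the baseline (distance between the
# two cameras) in metres, and d is the disparity (pixel offset of the
# same point between the left and right images).
# All numbers below are illustrative, not the real device's values.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Return the distance to a point, in metres, from its stereo disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive: the point must appear in both cameras")
    return focal_px * baseline_m / disparity_px

# Nearby points shift more between the two images (large disparity);
# distant points shift less (small disparity).
near = depth_from_disparity(focal_px=700.0, baseline_m=0.06, disparity_px=42.0)  # 1.0 m
far = depth_from_disparity(focal_px=700.0, baseline_m=0.06, disparity_px=21.0)   # 2.0 m
```

Halving the disparity doubles the computed distance, which is why stereo depth gets noisier for far-away objects.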
Vaughan: Can't the big automation companies like Siemens do that?
Ahuja: They have products for stereo vision, but our objective was to miniaturize it.
Their stereo vision products need to be hooked up to a computer, and those computers need to be connected through networking equipment to a central node. You are talking about a big infrastructure, and if the central computer shuts down, the whole production line shuts down.
Our module does not send raw data; it processes the data onboard and then sends the final analyzed result to the central computer.
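The difference between shipping raw frames and shipping onboard results can be sketched roughly in code. This is a hypothetical illustration for scale, not the module's actual software; the frame size and result record are invented:

```python
# Contrast the data volume of a raw camera frame with the small analyzed
# result an onboard processor could send instead. Numbers are illustrative.
RAW_FRAME_BYTES = 640 * 480 * 3  # one uncompressed VGA colour frame, about 900 KB

def analyze_frame_onboard() -> dict:
    """Stand-in for onboard processing: reduce a whole frame to a small
    record, e.g. a tracked object's ID and its estimated 3D position."""
    return {"object_id": 17, "position_m": (1.2, 0.4, 2.0)}

result = analyze_frame_onboard()
result_bytes = len(repr(result).encode())  # tens of bytes, not hundreds of kilobytes

# The central computer receives only small records like this, so
# per-camera network traffic drops by several orders of magnitude.
print(RAW_FRAME_BYTES, result_bytes)
```

This is the sense in which the central computer becomes less of a bottleneck: it aggregates results rather than decoding every camera's video stream.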
Vaughan: Wouldn't this have an application in high-tech surveillance?
Ahuja: Yes.
Take London, England, for example, where thousands and thousands of cameras on the streets are connected to huge data banks and huge server banks.
Ultimately what they are doing is sending raw data to a central computer. When they're trying to fix the central computer, update the software, whatever, the whole thing shuts down. So there's a huge amount of down time.
Our product analyzes the images right then and there, so you don't need to send the images to the central computer. You send the analysis to the central computer, not the raw images.
So if you want to track somebody with seamless tracking from camera to camera throughout the city, it's no problem.
Vaughan: Will you ever get the chance to see if this works in real life, as opposed to the laboratory?
Ahuja: The first thing we had to do was build the modules themselves. We have reached that stage.
The second step is to actually connect them in a network and that's what we're working on now.
So hopefully in a year or two, we would have a network of these cameras and we can then test it within the lab environment.
And then the next step would be to put it in a factory and test it out.
We have filed a provisional patent. It has been reviewed and published.
Vaughan: So why aren't the automation and surveillance companies beating down your door?
Ahuja: We are waiting for the call.
Michael Vaughan is co-host with Jeremy Cato of Car/Business, which appears Fridays at 8 p.m. on Business News Network and Saturdays at 2 p.m. on CTV.