Friday, July 23, 2010

Robots, with vision, at your service

A recent article and video in the New York Times describe Bandit, a robot built by researchers at the University of Southern California, which interacts with autistic children. Three-foot-tall Bandit can maintain “eye” contact with an autistic child and, sometimes, use playful or sympathetic actions to overcome withdrawn behavior.

Another robot, named RUBI—Robot Using Bayesian Inference—at the University of California, San Diego, images children’s faces, recognizes basic emotions from facial muscle movement, and responds with verbal and physical gestures of encouragement.

These service robots are part of a rapidly growing wave of robotic human helpers. In the classroom they may supplement the work of human teachers, during surgery they may perform delicate procedures, and on the battlefield they may help disarm a roadside bomb, as described in our June 2010 cover story.

The technological differences between these service robots--with their vision and image-processing functions--and robots used in industrial applications can be small. For example, a recent article on our website describes the work of researchers at the Technical University of Munich who are using cameras to capture non-verbal communication, such as gestures and facial expressions, as a way of interacting with robots. To date, they have demonstrated that their work can help both people who require assisted living and workers in automated production plants, where background noise may make speech recognition difficult.

Recently, European researchers have built a robot for 'on-demand' rubbish collection – just make a call and it will soon arrive at your door. It's ideal for collecting waste in the narrow streets of many historical towns.

About the size of a person, it can navigate the narrowest of alleys, stop outside your door, and take your rubbish away. And the best bit is this: you don't have to remember when to put your bin out. Simply make a telephone call, and soon the robot is waiting outside your door, ready to receive your rubbish.

Monday, July 19, 2010

July VSD online--Hyperspectral imaging, miniature autofocus lenses, 3-D vision

To capture continuous spectral bands from the UV to the far IR, hyperspectral imaging has become a powerful imaging tool. In our July issue, Rand Swanson at Resonon describes a compact hyperspectral imaging system that has been flown in a Cessna aircraft to monitor the spread of leafy spurge, an invasive weed that reduces grazing forage for livestock.

In our Product Focus article, editor Andy Wilson describes recent developments in miniaturized autofocus lenses. Whether based on electro-optical, electromechanical, thermo-optical, or acousto-mechanical techniques, these tunable optics will find cutting-edge applications in smart machine-vision systems, endoscopy systems, and mobile phones.

Our cover story shows how an optical tester based on an off-the-shelf camera system can be used to calibrate centering errors of lenses to ensure the imaging quality of an optical assembly or subassembly.

3-D vision remains one of the most alluring areas for innovation in machine vision development. While dual-camera and time-of-flight sensors are becoming increasingly important, other options such as the one described in an article about ISee3D now allow stereo images to be captured from a single camera/lens combination.

We also have articles on: inspection of wood surfaces for defects by researchers at AIDO in Spain; an algorithm that uses partial derivatives to improve edge detection; and an FFT processor that performs phase correlation.
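For readers unfamiliar with the last of those techniques, phase correlation is straightforward to illustrate: it recovers a pure translation between two images from the phase of their cross-power spectrum. The following NumPy sketch is a generic illustration only--it is not the FFT processor described in the article--and assumes same-size, real-valued images:

```python
import numpy as np

def phase_correlate(a, b):
    """Estimate the (dy, dx) translation of image `a` relative to image `b`."""
    # Cross-power spectrum, normalized so only phase information remains
    cross = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    cross /= np.abs(cross) + 1e-12
    # The inverse transform concentrates energy at the translation offset
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap indices past the midpoint back to signed shifts
    h, w = a.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return int(dy), int(dx)
```

For example, shifting a test image by (3, 5) pixels with `np.roll` and passing the shifted and original images to `phase_correlate` returns (3, 5). Dedicated FFT hardware of the kind the article covers accelerates exactly this transform-multiply-transform pattern.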

Wednesday, July 7, 2010

Blogging about machine vision – interested?

Some cynics I know mock the idea of blogging, but I think it’s a good way to explore a subject such as machine vision. And a blogger might even be paid the highest compliment--having his or her blog blogged about.

A case in point: editor Andy Wilson’s My View video blog on the Vision Systems Design website was recently blogged about by Laura Hoffman, who runs the Microscan blog SolutionConnection, along with colleagues such as John Agapakis.

Numerous other companies in the machine vision industry have blogs or are trying to figure out what they could write that wouldn’t rattle internal corporate feathers but would still be interesting. Like Microscan, Thor Vollset at ScorpionVision aims with his blog to keep readers up to date on the company and how customers can use its products.

Then there are system integrators who have occasional blogs on their own sites or on a magazine site. These include David Dechow at Aptura Machine Vision Solutions with his Regarding Machine Vision blog, and Ned Lecky at Lecky Integration and John Nagle at Nagle Research. Also, there are journalists who blog about related topics, such as Frank Tobe at Everything Robotic and Gabriele Jansen at Inspect-online blog.

And of course there is the popular and anonymous B Grey at machinevision4users blog, who presumably hides his or her identity out of concern about industry (or employer?) reaction. The postings vary from technical observations and comparisons to witty digs. But anonymity is both a shield and a crutch. Speaking as a journalist who must live with the consequences of what I write, I think B Grey should stand up and be counted.

Whether anonymous or very public, all bloggers can attest to the fact that it’s not easy to post frequently and have something interesting or new--or at least amusing--to say. Yet it can be quite rewarding, personally and professionally.

Blogs can become good networking and marketing tools that engage people. And you can re-post blogs to other social media sites such as LinkedIn, where you will find many relevant groups such as the Vision Systems Design Group, the Machine Vision Group, the Image Processing Group, and the 3D Machine Vision Group. Most of these groups have hundreds or even thousands of members.

If you have a comment on what I’ve written, please post it on this blog.

If you’re reading this and interested in contributing a regular or at least somewhat regular blog to Vision Systems Design, please let me know: