Tuesday, June 22, 2010

June issue: vision-guided robots, software, and Three Mile Island



Our June issue is now available on our website. The articles in it point to some of the many ways in which machine vision is evolving.

I glimpsed this potential in 1982, when I watched the feed from the first remote video camera lowered into a reactor vessel at Three Mile Island, after the nuclear accident had destroyed the reactor core in 1979. It took several years to develop the imaging equipment for this first foray and, in the years that followed, many cameras and robots would gather information about damage and perform cleanup operations in highly radioactive areas of the plant. Here's a picture of Rover, developed with Carnegie Mellon University.


Although robots were not then sophisticated enough to perform major operations—and stereo vision was practically a dream—the future of vision-guided robots was obvious. Some colleagues and I wrote a history of the cleanup, including the robotic and imaging technologies that were used. You can download a PDF of the history published by the Electric Power Research Institute by clicking HERE.


Remotely operated vehicles are now playing an increasing role in other crises. Our cover story in the June issue, for example, shows how 3-D displays can help remote operators in the military safely handle and dispose of explosive devices using robots.

Another article explains how single-sensor image fusion technology could enable simpler and more effective imaging of potential threats in security and defense operations.

Machine vision is not always on the front line of environmental and political challenges, however. Researchers from the University of Ilmenau in Germany are using image processing techniques to evaluate the quality of wheat after it is harvested.

And, as contributing editor Winn Hardin explains, manufacturers are using other machine vision techniques to ensure that the steel tubes produced for oil and gas production are of the highest quality.

This broadening range of biomedical, robotics, military, and aerospace applications is leading software vendors to expand the functionality of their products beyond simple measurement functions, as editor Andy Wilson writes in his Product Focus article on machine vision software.

Indeed, new opportunities for machine vision and image processing systems are emerging every year. To take advantage of these developments, however, suppliers of machine vision systems will have to look outside the box of conventional industrial manufacturing and into niche applications that run the gamut from agriculture to space exploration.

Friday, June 11, 2010

Art, vision, and ALS - forget eye-tracking for shoppers

We've seen eye-tracking systems that help determine the preferences of shoppers or website browsers. Here's one that could really benefit people who suffer from physical limitations: The EyeWriter project.

It's an ongoing collaborative research effort to empower people who are suffering from amyotrophic lateral sclerosis (ALS; aka Lou Gehrig's disease) with creative technologies. It allows graffiti writers and artists with paralysis resulting from ALS to draw using only their eyes.

The collaborative consists of members of Free Art and Technology (FAT), OpenFrameworks, the Graffiti Research Lab, and The Ebeling Group communities. They have teamed up with LA graffiti writer, publisher, and activist Tony Quan, aka TEMPTONE. He was diagnosed with ALS in 2003, a disease that has left him almost completely physically paralyzed… except for his eyes.

The long-term goal is to create a professional/social network of software developers, hardware hackers, urban projection artists and ALS patients from around the world who are using local materials and open source research to creatively connect and make eye art.

The Eyewriter from Evan Roth on Vimeo.


Click HERE to see the Specification Sheet for the EyeWriter, including the low-cost vision components that are needed.
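For readers curious about what a gaze-drawing loop looks like in software, here is a minimal sketch of the general idea, not the EyeWriter code itself: find the pupil in an eye-camera frame, map it through a calibration transform onto the drawing canvas, and append points to a stroke. The camera index, threshold value, and calibration points below are all assumptions for illustration, using OpenCV in Python.

```python
# Minimal gaze-drawing sketch (illustrative only; not the EyeWriter project's code).
import cv2
import numpy as np

# Hypothetical calibration: pupil positions recorded while the user looked at
# the four corners of the drawing canvas (these would come from a calibration step).
pupil_corners = np.float32([[210, 140], [430, 150], [425, 330], [205, 320]])
canvas_corners = np.float32([[0, 0], [640, 0], [640, 480], [0, 480]])
H = cv2.getPerspectiveTransform(pupil_corners, canvas_corners)

def find_pupil(gray):
    """Return (x, y) of the darkest blob -- a crude stand-in for pupil tracking."""
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    m = cv2.moments(max(contours, key=cv2.contourArea))
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

cap = cv2.VideoCapture(0)              # eye camera (index is an assumption)
canvas = np.zeros((480, 640, 3), np.uint8)
stroke = []

while True:
    ok, frame = cap.read()
    if not ok:
        break
    pupil = find_pupil(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))
    if pupil is not None:
        # Map the pupil position through the calibration homography onto the canvas.
        pt = cv2.perspectiveTransform(np.float32([[pupil]]), H)[0][0]
        stroke.append(tuple(int(v) for v in pt))
        if len(stroke) > 1:
            cv2.line(canvas, stroke[-2], stroke[-1], (0, 255, 0), 2)
    cv2.imshow("canvas", canvas)
    if cv2.waitKey(1) & 0xFF == 27:    # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```

The real system adds the pieces that matter most in practice, such as robust pupil detection under IR illumination, calibration, and dwell-based "pen up/pen down" control, but the map-gaze-to-canvas loop above is the core of the approach.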

Tuesday, June 8, 2010

The Vision Show revealed--on video

The most interesting thing for me about The Vision Show in Boston (May 25-27) was the simple fact that about 80 exhibitors put on a very upbeat and comprehensive showing of machine vision components available to integrators and end-users.

In one place you could see and touch the cameras, lighting, boards, cabling, and other components that you might want to design into your next system. In the technical sessions and tutorials, you could also learn many of the fundamentals of the technology and how products perform.

Here’s a video with Jeff Burnstein from the AIA talking about the show and what’s coming next.



Of course, the fact that most vendors were reporting good to great sales numbers really helped. The overall mood of the roughly 1900 attendees and exhibitors was so strikingly different from that during the depths of the recession that it was impossible not to get caught up in the good feelings.

Follow this link to our Video Showcase to see some of the videos that were made during the show.

Monday, June 7, 2010

British military envisions how imaging catches insurgents

The UK’s Defence Science and Technology Laboratory (DSTL) says that, within five years, new imaging technology combining very high-resolution digital cameras, multispectral imaging, and laser ranging will be used to recognize insurgents or terrorists.

DSTL, which develops and tests the latest technologies for the Ministry of Defence, had members of its staff act out insurgent-like behavior, while developers and engineers took on the role of "good guys", pursuing and monitoring them.



The military twist is that these high-tech surveillance techniques are being combined with software that can pick out unusual patterns in behavior, such as two vehicles meeting in a concealed area. The surveillance, DSTL says, will eventually help to "win the battle" against insurgency. For more information, read the excellent BBC News article.
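To make the "unusual pattern" idea concrete, here is a toy sketch of one such rule, not DSTL's software: flag an alert when two tracked vehicles come within a small distance of each other inside a region an analyst has marked as concealed. The IDs, coordinates, and region bounds are invented for the example.

```python
# Toy rule: alert when two vehicle tracks meet inside a marked "concealed" area.
import math

CONCEALED = (100, 100, 200, 180)      # x_min, y_min, x_max, y_max (example region)
MEETING_DISTANCE = 15.0               # same units as the track coordinates

def in_concealed(point):
    x, y = point
    x0, y0, x1, y1 = CONCEALED
    return x0 <= x <= x1 and y0 <= y <= y1

def vehicles_meeting(tracks):
    """tracks: {vehicle_id: (x, y)} for one time step.
    Returns ID pairs that are close together inside the concealed area."""
    ids = sorted(tracks)
    alerts = []
    for i, a_id in enumerate(ids):
        for b_id in ids[i + 1:]:
            a, b = tracks[a_id], tracks[b_id]
            if (math.dist(a, b) < MEETING_DISTANCE
                    and in_concealed(a) and in_concealed(b)):
                alerts.append((a_id, b_id))
    return alerts

# Vehicles 3 and 7 have stopped near each other inside the marked area.
print(vehicles_meeting({3: (150, 140), 7: (158, 146), 9: (400, 300)}))
```

A production system would of course work on tracks over time rather than single snapshots, but rule-based alerts layered on top of vehicle tracking are the general shape of this kind of behavior analysis.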