Tuesday, June 22, 2010

June issue: vision-guided robots, software, and Three Mile Island



Our June issue is now available on our website. The articles in it point to some of the many ways in which machine vision is evolving.

I glimpsed this potential in 1982, when I watched the feed from the first remote video camera lowered into a reactor vessel at Three Mile Island, after the nuclear accident had destroyed the reactor core in 1979. It took several years to develop the imaging equipment for this first foray and, in the years that followed, many cameras and robots would gather information about damage and perform cleanup operations in highly radioactive areas of the plant. Here's a picture of Rover, developed with Carnegie Mellon University.


Although robots were not then sophisticated enough to perform major operations—and stereo vision was practically a dream—the future of vision-guided robots was obvious. Some colleagues and I wrote a history of the cleanup, including the robotic and imaging technologies that were used. You can download a PDF of the history published by the Electric Power Research Institute by clicking HERE.


Remotely operated vehicles are now playing an increasing role in other crises. Our cover story in the June issue, for example, shows how 3-D displays can help remote operators in the military safely handle and dispose of explosive devices using robots.

Another article explains how single-sensor image fusion technology could enable simpler and more effective imaging of potential threats in security and defense operations.

Machine vision is not always on the front line of environmental and political challenges, however. Researchers from the University of Ilmenau in Germany are using image processing techniques to evaluate the quality of wheat after it is harvested.

And, as contributing editor Winn Hardin explains, manufacturers are using other machine vision techniques to ensure that the steel tubes produced for oil and gas production are of the highest quality.

This broadening range of biomedical, robotics, military, and aerospace applications is leading software vendors to expand the functionality of their products beyond simple measurement functions, as editor Andy Wilson writes in his Product Focus article on machine vision software.

Indeed, new opportunities for machine vision and image processing systems are occurring every year. To take advantage of these developments, however, suppliers of machine vision systems will have to look outside the box of conventional industrial manufacturing and into niche applications that span the gamut from agriculture to space exploration.

Friday, June 11, 2010

Art, vision, and ALS - forget eye-tracking for shoppers

We've seen eye-tracking systems that help determine the preferences of shoppers or website browsers. Here's one that could really benefit people who suffer from physical limitations: The EyeWriter project.

It's an ongoing collaborative research effort to empower people who are suffering from amyotrophic lateral sclerosis (ALS; aka Lou Gehrig's disease) with creative technologies. It allows graffiti writers and artists with paralysis resulting from ALS to draw using only their eyes.
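For readers curious about the vision side, the core step in a system like this is locating the dark pupil in each frame from an eye-pointed camera and mapping its position to drawing coordinates. Below is a minimal, hypothetical sketch of that step using OpenCV in Python; the actual EyeWriter software is built on openFrameworks, and the camera index and threshold value here are assumptions for illustration.

```python
# Minimal pupil-tracking sketch (illustrative only; the real EyeWriter is
# built on openFrameworks). Threshold the dark pupil in each frame from an
# eye-pointed camera and report its centroid, which a drawing application
# could map to a brush position on screen. Assumes OpenCV 4.
import cv2

cap = cv2.VideoCapture(0)                     # assumed camera index
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (7, 7), 0)
    # The pupil is the darkest large region; 40 is an assumed threshold
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        pupil = max(contours, key=cv2.contourArea)
        m = cv2.moments(pupil)
        if m["m00"] > 0:
            cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]
            print(f"pupil centre: ({cx:.0f}, {cy:.0f})")   # map to brush coords
    cv2.imshow("pupil mask", mask)
    if cv2.waitKey(1) & 0xFF == 27:           # press Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```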

The collaborative consists of members of the Free Art and Technology (FAT), OpenFrameworks, Graffiti Research Lab, and The Ebeling Group communities. They have teamed up with LA graffiti writer, publisher, and activist Tony Quan, aka TEMPTONE. He was diagnosed with ALS in 2003, a disease that has left him almost completely physically paralyzed… except for his eyes.

The long-term goal is to create a professional/social network of software developers, hardware hackers, urban projection artists and ALS patients from around the world who are using local materials and open source research to creatively connect and make eye art.

The Eyewriter from Evan Roth on Vimeo.


Click HERE to see the Specification Sheet for the Eyewriter, including the low-cost vision components that are needed.

Tuesday, June 8, 2010

The Vision Show revealed--on video

The most interesting thing for me about The Vision Show in Boston (May 25-27) was the simple fact that about 80 exhibitors put on a very upbeat and comprehensive showing of machine vision components available to integrators and end-users.

In one place you could see and touch the cameras, lighting, boards, cabling, and other components that you might want to design into your next system. In the technical sessions and tutorials, you could also learn many of the fundamentals of the technology and understand how the products perform.

Here’s a video with Jeff Burnstein from the AIA talking about the show and what’s coming next.



Of course, the fact that most vendors were reporting good to great sales numbers really helped. The overall mood of the roughly 1900 attendees and exhibitors was so strikingly different from the mood during the depths of the recession that it was impossible not to get caught up in the good feelings.

Follow this link to our Video Showcase to see some of the videos that were made during the show.

Monday, June 7, 2010

British military envisions how imaging catches insurgents

Using very high-resolution digital cameras, multispectral imaging, and laser ranging, the UK’s Defence Science and Technology Laboratory (DSTL) says that new imaging technology will be used within 5 years to recognize insurgents or terrorists.

DSTL, which develops and tests the latest technologies for the Ministry of Defence, had members of its staff act out insurgent-like behavior, while developers and engineers took on the role of "good guys", pursuing and monitoring them.



The military twist was that these high-tech surveillance techniques are being combined with software that can pick out unusual patterns in behavior--such as two vehicles meeting in a concealed area. The surveillance, DSTL says, will eventually help to "win the battle" against insurgency. For more information, read the excellent BBC News article.

Monday, May 24, 2010

Imaging battles Gulf oil disaster

Satellite imaging and particle image velocimetry are two of the imaging techniques being deployed against the oil spill in the Gulf of Mexico. A May 22 article in the New York Times describes several of the techniques that researchers are using to try to get an accurate measurement of the oil spill.

One approach is described in more detail by one of the researchers cited in the Times article, Steve Wereley of Purdue University, in a PowerPoint presentation entitled “Oil Flow Rate Analysis – Deepwater Horizons Accident”. He concludes that the Deepwater Horizon oil spill in the Gulf of Mexico is more than 50 times worse than BP's initial estimates.

Using an imaging technique called particle image velocimetry (PIV), Wereley analyzed video obtained from BP to compute the rate at which oil is flowing from the site. According to his presentation, Wereley estimates that between 56,000 and 84,000 barrels a day are currently pouring into the Gulf of Mexico. Doug Suttles, chief operating officer for BP, initially said he thought the estimate of 1,000 barrels a day was accurate, although BP now admits it underestimated the amount of oil leaking.

To obtain his figures, Wereley computed the average plume velocity of the oil using PIV techniques, multiplied this figure by the cross-sectional area to find the volume flow rate, and then converted this figure to barrels per day.
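For readers who want to see the arithmetic, here is a minimal sketch of that conversion in Python. The velocity and pipe diameter below are illustrative assumptions, not figures from Wereley's presentation.

```python
# Back-of-the-envelope version of the calculation described above:
# average plume velocity x cross-sectional area -> volumetric flow rate,
# then converted to barrels per day. Input values are assumptions.
import math

plume_velocity_m_per_s = 0.5     # assumed average plume velocity (m/s)
pipe_diameter_m = 0.53           # assumed riser pipe diameter (m)

area_m2 = math.pi * (pipe_diameter_m / 2) ** 2      # cross-sectional area
flow_m3_per_s = plume_velocity_m_per_s * area_m2    # volumetric flow rate

SECONDS_PER_DAY = 86_400
M3_PER_BARREL = 0.158987         # one oil barrel is about 0.159 cubic meters

barrels_per_day = flow_m3_per_s * SECONDS_PER_DAY / M3_PER_BARREL
print(f"Estimated flow: {barrels_per_day:,.0f} barrels per day")
```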

PIV is an optical method of flow visualization used to obtain instantaneous velocity measurements and related properties in fluids. Features in the fluid are imaged in successive frames, and the motion of those features between frames is used to calculate the velocity of the flow being studied.
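As a rough illustration of that principle (not Wereley's actual analysis), the sketch below cross-correlates a small interrogation window from two consecutive frames to estimate a pixel displacement and converts it to a local speed. The frame data, pixel scale, and frame interval are stand-in values.

```python
# Minimal PIV-style sketch: estimate how far features in an interrogation
# window move between two frames via FFT cross-correlation, then convert
# that pixel displacement to a flow speed.
import numpy as np

def window_displacement(frame_a, frame_b, y, x, size=64):
    """Return (dy, dx) pixel displacement of the window at (y, x)."""
    a = frame_a[y:y + size, x:x + size].astype(float)
    b = frame_b[y:y + size, x:x + size].astype(float)
    a -= a.mean()
    b -= b.mean()
    # Circular cross-correlation computed in the frequency domain
    corr = np.fft.ifft2(np.conj(np.fft.fft2(a)) * np.fft.fft2(b)).real
    corr = np.fft.fftshift(corr)
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return peak[0] - size // 2, peak[1] - size // 2

# Stand-in data: a synthetic frame and a copy shifted by (3, 5) pixels
frame_a = np.random.rand(480, 640)
frame_b = np.roll(frame_a, (3, 5), axis=(0, 1))

dy, dx = window_displacement(frame_a, frame_b, 200, 300)

METERS_PER_PIXEL = 0.002    # assumed spatial calibration
FRAME_INTERVAL_S = 1 / 30   # assumed 30 frames per second
speed = np.hypot(dy, dx) * METERS_PER_PIXEL / FRAME_INTERVAL_S
print(f"Window displacement: ({dy}, {dx}) px, local speed {speed:.2f} m/s")
```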

A live video of the oil leak, provided by BP over Ustream, is available at www.ustream.com (search: live oil spill cam).



All this imaging doesn't even take into account the dozen or so remote underwater vehicles that are now in operation near the sea bed, around the leak, streaming video back to a control center in Houston.



The oil spill is a disaster that imaging and machine vision may help us understand and mitigate.

Thursday, May 13, 2010

Machine vision lives in Iran

While cruising the Bosphorus I met a vision system integrator from Tehran. Kasra Ravanbakhsh is the co-founder and managing director of Kasra Hooshmand Engineering (KDI). I was of course taken with him, since he attributed his attendance at the EMVA Business Conference in Istanbul to seeing an advertisement for it in Vision Systems Design.

It turns out that, along with my own blog post about the conference and my travel home under The Volcanic Cloud, Kasra has written a blog post with many pictures from the EMVA conference.

KDI was formed in 2003 as a private joint stock company in Tehran. The company's previous name was Kasra Digital Instruments, and it still uses that abbreviation and logo.


Kasra says his company excels in developing machine vision systems, PC-based automation and monitoring, industrial automation, data acquisition, LabVIEW programming, microcontroller-based systems, and instrumentation. It is also very involved in cleanroom design and installation.
 
He also claims that KDI is the only professional developer of machine vision and real-time image processing-based inspection and control systems in Iran. KDI operates in industries such as pharmaceutical, glassware, packaging, military, aerospace, paper, food and beverage, and steel and aluminum production.

Kasra made many good contacts during the conference and perhaps opened the eyes of his new friends to some of the technical and intellectual life that stands just beyond their usual reach—not to mention some potential sales opportunities.

Europeans often note that their North American colleagues come from a “young” culture on the far side of the Atlantic. A bit of Persian history as described on the KDI website helps to put real antiquity into perspective!

Conard Holton
cholton@pennwell.com

Wednesday, May 12, 2010

An interactive atlas about global automation

If you manufacture automation equipment, including machine vision systems and robots, and you’re wondering where in the world to look for commercial growth opportunities, then you should review the Automation Atlas.

The Atlas shows the relative degree of automation in a country by charting the estimated number of industrial robots per 10,000 employees in processing industries. For more information and to use the Atlas, click here.



The Atlas was commissioned by the AUTOMATICA trade fair (held at Messe Munich, 7-11 June) and created by the statistical department of the International Federation of Robotics (IFR), which is sponsoring the co-located ROBOTIK conference. The very interesting conference program is now available on the IFR website.

The IFR says only one-third of companies use automation technologies such as industrial robots or process-integrated quality control. For example, according to the Automation Atlas, countries in Eastern Europe employ relatively little automation technology--fewer than 50 industrial robots per 10,000 employees in the processing industry. Only in Slovenia does the figure reach between 100 and 200.

And globally there are clearly opportunities for growth in the pharmaceutical, cosmetics, and medical equipment industries, where the number of industrial robots in use is estimated to be fewer than 50 per 10,000 employees. In contrast, there are an estimated 400 to 700 robots for the same number of employees in the automobile industry.

Conard Holton, cholton@pennwell.com
Vision Systems Design