Tuesday, September 25, 2012

Weisswurst, weissbier, and vision systems

Last week, I dispatched our industrious European Editor Dave Wilson off to the rather lovely Bavarian city of Munich to gain some insight into the work that is being undertaken by companies in the region.

During his brief sojourn in Germany, Dave met up with a number of outfits involved in the business of developing vision systems. One of these was Opto -- a small to medium-sized private enterprise with around 35 employees based in the town of Gräfelfing on the outskirts of Munich.

Now at first glance, it might seem that a company of such a size would not have a whole lot to discuss. But first appearances can be deceptive, as Dave discovered when Markus Riedi, the President of Opto, gave him a brief presentation on what the company had been up to over the years.

During that presentation, Dave realized that, while the company may be best known for the optical components that it markets, around 55 percent of its business actually comes from developing rather complex custom-built products. For these, it combines its expertise in optics, mechanics, software and electronics to deliver complete modules that its customers can integrate into their own machines.

Herr Riedi showed Dave several examples of the sorts of engineering projects that the company had undertaken. One was an integrated imaging module developed for the inspection of semiconductor dies. Another was an optical subsystem used to inspect pixels on an LCD screen. Then, there was an opto-mechanical module for integration into a laser eye surgery system. And, last but not least, was an imaging system the company had developed to image cells in an embryo incubation machine.



After the presentation, Herr Riedi told Dave that his company is very selective about the companies that it works with to develop products, and only targets areas where it can provide a lot of value-added expertise.

And the strategy appears to be paying off. From a 0.5m Euro business in 2006, Herr Riedi has grown the company into the 7m Euro business that it is today. By 2020, he told Dave, he hopes to push that figure up to the 20m Euro mark.

One way he plans to do that is to actively promote the products that his customers are manufacturing. The idea is a simple one -- the more products they sell, the more subsystems Opto sells. To that end, Riedi has already started to populate his company's web site with examples of the end-user products into which his complex optical subsystems have been designed.

Impressed with the caliber of companies like Opto, Dave is now looking forward to the day when he might take another trip to Bavaria to meet up with yet more folks involved in the imaging business. But although he tells me that his motives are purely altruistic, I have a suspicion that the quality of the local Bavarian weisswurst and weissbier might also have something to do with it.

Friday, September 21, 2012

Vision Systems in Action

As regular readers of this blog might recall, a few weeks ago I decided to hold a competition in which I challenged systems integrators to email me images of their very own vision systems in action.

To encourage readers to enter the aptly named "Vision Systems in Action 2012" competition, I promised that the winning images that we received would be published in an upcoming blog, providing the winners with lots of publicity and, potentially, a few sales leads as well.

Because the competition didn't come with any prizes, however, the response was less than spectacular. Nevertheless, the Vision Systems Design judging panel was impressed by the high standard and diversity of the photographs we did receive. And now, after several hours of deliberating over the entries, I'm pleased to say that our judges have chosen a winner as well as a runner-up.

The winner of the "Vision Systems in Action 2012" competition is none other than Earl Yardley, the Director of Industrial Vision Systems (Kingston Bagpuize, UK) who submitted a rather stunning image of a vision system his company has developed to inspect a medical device.



The judges unanimously decided that Yardley's photograph should take first prize, not only for its quality but also because it followed the overall brief set by the judging panel. They were particularly impressed by the photographer's use of lighting, as well as by the effective use of the color blue that dominated the image.

The runner-up in the "Vision Systems in Action 2012" competition was Vincent Marcoux, the sales and marketing co-ordinator of Telops (Quebec, Canada). He submitted an equally stunning picture of the Chateau Frontenac, which was designated a National Historic Site of Canada in 1980. Marcoux captured the image of the chateau using the company's very own 1280 x 1024-pixel HD-IR infrared camera.


The judges were extremely impressed by the beauty of the image, as well as by the sense of foreboding that it conveyed. Our panel was particularly taken by the effectiveness of the infrared imaging technique, as well as by the striking use of the color orange that dominated the image.

As the Editor-in-Chief of Vision Systems Design, I would like to thank everyone for their interest in the "Vision Systems in Action" competition and for taking the time and effort to participate. Perhaps next year, we shall do it again.

Thursday, September 20, 2012

Build your own supercomputer

Many image processing tasks are computationally intensive. As such, system integrators are always on the lookout for any means that will help them to accelerate their application software.

One way to do this is to determine whether an application could be optimized -- either by hand or by using optimization tools such as Vector Fabrics' (Eindhoven, The Netherlands) Pareon -- to enable it to take advantage of the many processing cores that are in the latest microprocessors from AMD and Intel.

If an application can easily be separated into a number of parallel tasks -- a class known in the industry as "embarrassingly parallel" problems -- then the only limitation the systems integrator faces is sourcing enough inexpensive processors to perform the task.
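
For readers who think in code, here is a minimal sketch in Python of what farming such an embarrassingly parallel job out to every core of a multicore CPU can look like. The inspect_frame() routine and the "frames" folder are hypothetical stand-ins for a real inspection step and a real image set.

from multiprocessing import Pool
from pathlib import Path

import cv2  # OpenCV, assumed to be installed


def inspect_frame(path):
    """Stand-in inspection step: count suspiciously bright pixels in one frame."""
    img = cv2.imread(str(path), cv2.IMREAD_GRAYSCALE)
    defect_pixels = int((img > 200).sum())
    return path.name, defect_pixels


if __name__ == "__main__":
    frames = sorted(Path("frames").glob("*.png"))
    with Pool() as pool:                      # one worker per CPU core by default
        results = pool.map(inspect_frame, frames)
    for name, count in results:
        print(f"{name}: {count} bright pixels")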

Fortunately, since the advent of the GPU, cores are plentiful. As such, many engineers are harnessing the power of graphics cards such as Nvidia's GeForce GTX 470 -- which sports no fewer than 448 CUDA cores and 1GByte of memory -- to vastly accelerate their image processing applications.
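
To give a flavor of how those CUDA cores get put to work, here is a minimal sketch in Python using the Numba compiler's CUDA support to threshold an image on the GPU. It is an illustration only -- not any vendor's own API -- and it assumes a CUDA-capable card and the numba and numpy packages are installed.

from numba import cuda
import numpy as np


@cuda.jit
def threshold_kernel(src, dst, thresh):
    # Each GPU thread handles exactly one pixel
    i, j = cuda.grid(2)
    if i < src.shape[0] and j < src.shape[1]:
        dst[i, j] = 255 if src[i, j] > thresh else 0


img = np.random.randint(0, 256, (1024, 1280), dtype=np.uint8)  # dummy frame
d_src = cuda.to_device(img)
d_dst = cuda.device_array_like(d_src)

threads = (16, 16)
blocks = ((img.shape[0] + 15) // 16, (img.shape[1] + 15) // 16)
threshold_kernel[blocks, threads](d_src, d_dst, 128)

result = d_dst.copy_to_host()
print(result.shape, result.dtype)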

Now in a few cases where engineers really need to harness even more hardware power, they have only one alternative -- build it themselves. That, indeed, is exactly what engineers at the Air Force Research Laboratory (Rome, NY, USA) have done.

Their massive supercomputer -- which was developed for the Air Force for image processing tasks -- is ranked among the forty fastest computers in the world. Yet, believe it or not, it was constructed by wiring together no fewer than 1,700 off-the-shelf PlayStation 3 gaming consoles!


Now if you are anything like me, you are probably wondering how you might be able to design such a beast yourself without shelling out an inordinate sum of money to buy so many Sony games consoles.

If so, you might like to check out the web page of Professor Simon Cox from the University of Southampton (Southampton, UK), who, together with a team of computer scientists at the university (and his six-year-old son James), has built a supercomputer out of Raspberry Pis, a rat's nest of cables and an awful lot of Lego.

"As soon as we were able to source sufficient Raspberry Pi computers we wanted to see if it was possible to link them together into a supercomputer. We installed and built all of the necessary software on the Pi starting from a standard Debian Wheezy system image," says Professor Cox.

The machine, named "Iridis-Pi" after the university's Iridis supercomputer, runs off a single 13A mains socket and uses the Message Passing Interface (MPI) to enable the processing nodes to communicate over Ethernet. The system has a total of 64 processors and 1TB of storage (a 16GByte SD card in each Raspberry Pi).
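
For those curious about the programming model, here is a minimal sketch in Python using the mpi4py bindings of the scatter/gather pattern that MPI clusters such as Iridis-Pi use to spread work across nodes over Ethernet. It is an illustrative toy rather than the Southampton team's actual code, and it assumes mpi4py and an MPI runtime are installed (launch with something like "mpiexec -n 64 python cluster_sketch.py").

from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this node's ID (0..63 on a 64-node cluster)
size = comm.Get_size()   # total number of nodes in the job

# The root node prepares the full workload: one chunk of dummy data per node
if rank == 0:
    workload = [np.full(1000, r, dtype=np.float64) for r in range(size)]
else:
    workload = None

# Every node receives its own chunk, processes it locally, and reports back
chunk = comm.scatter(workload, root=0)
local_result = float(chunk.sum())          # stand-in for real image processing
results = comm.gather(local_result, root=0)

if rank == 0:
    print("combined result from", size, "nodes:", sum(results))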

Now I'm not about to claim that this supercomputer is going to rank up there with the PlayStation-based system built for the US Air Force, but it certainly would be a fun project to build and experiment on. And at a price of under $4000, who wouldn't want to give it a go?

Fortunately, for those interested in doing so, the learned Professor has published a step-by-step guide so you can build your own Raspberry Pi supercomputer without too much effort.

The Southampton team wants to see the low-cost supercomputer used to enable students to tackle complex engineering and scientific challenges. Maybe the system isn't really the most cost-effective way to do that, but it certainly is inspirational.

Editor's note:  PA Consulting Group and the Raspberry Pi Foundation have teamed up to challenge schoolchildren, students and computer programmers to develop a useful application using a Raspberry Pi that will make the world a better place. I'm sure they would welcome ideas from the imaging community! Details on the competition can be found here.


Friday, September 7, 2012

Turn your iPhone into an IR camera

If you live in an old, drafty house like I do, you're probably not looking forward to another long, cold winter -- not least because you will inevitably find yourself shelling out exorbitant sums of money just to keep the place nice and toasty.

Fortunately, since the advent of thermal imaging cameras, it's now pretty easy to identify patterns of heat loss from your property and to then take some remedial action to fix any problems.

Due to the cost of the cameras, however, it's unlikely that you will want to go out and buy one yourself. It's more likely that you will call on the services of a professional home inspector or energy auditor, who will bring their own thermal imaging kit around to your property to perform the task.

Even a professional survey, however, isn't likely to come cheap, although probably a darned sight less expensive than buying your own camera.

Faced with these two alternatives, engineer Andy Rawson decided to turn his iPhone into a thermal camera by developing a custom-built hardware and software solution to interface with it.

More specifically, Rawson designed a PCB that sports a Melexis (Ieper, Belgium) MLX90620 FIRray device, which can measure thermal radiation between -20°C and 300°C thanks to its 16 x 4-element far infrared (FIR) thermopile sensor array. The thermal images captured by the sensor on Rawson's board are transmitted to the iPhone through its dock connector, after which they are overlaid onto the phone's display together with numerical temperature values.
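
To illustrate the kind of processing involved, here is a minimal sketch in Python/NumPy of how a 16 x 4 grid of temperature readings, like those produced by the MLX90620, might be mapped to a false-color overlay. The temperature values below are invented for the example, and the simple blue-to-red mapping is an assumption rather than Rawson's actual algorithm.

import numpy as np


def to_false_color(temps, t_min=-20.0, t_max=300.0):
    """Map an array of temperatures (deg C) onto blue-to-red RGB pixels."""
    norm = np.clip((temps - t_min) / (t_max - t_min), 0.0, 1.0)
    rgb = np.zeros(temps.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = (norm * 255).astype(np.uint8)          # red channel for hot
    rgb[..., 2] = ((1.0 - norm) * 255).astype(np.uint8)  # blue channel for cold
    return rgb


# Simulated 16 x 4 frame with one warm spot in an otherwise cool scene
frame = np.full((4, 16), 20.0)
frame[1:3, 6:9] = 65.0

# Blow each sensor element up to a 40 x 40 block so it can sit over a photo
overlay = np.kron(to_false_color(frame), np.ones((40, 40, 1), dtype=np.uint8))
print(overlay.shape)   # (160, 640, 3)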



Having developed the hardware and the software, Rawson says that he would now like to make and sell the systems so others can save money and energy. He figures he should be able to manufacture and sell them for around $150.

Nevertheless, this is also going to be an open-source hardware project, so if you want to make your own system, that's fine by him too. A man of his word, Rawson posted the iPhone code and the board layout on the internet this week. Interested readers can find it here.

While he might be a talented engineer, Rawson admits that he is terrible at dreaming up names for his projects! So he's encouraging people to submit names for the new design to his web site. The winner will receive one of the thermal imaging systems for free.

A video of the thermal imaging system in action can be seen on YouTube here.

Tuesday, September 4, 2012

Kinect comes home

Three Swedish researchers from the Centre for Autonomous Systems (CAS) at the Kungliga Tekniska Högskolan (KTH) in Stockholm, Sweden, are asking people to get involved in a crowdsourcing project to build a library of 3-D models of objects captured using their Microsoft Kinect cameras.

The idea behind the so-called Kinect@Home project -- which was started by Alper Aydemir, Rasmus Goransson and Professor Patric Jensfelt -- is to acquire a vast number of such models from the general public that robotics and computer vision researchers can then use to improve their algorithms.

The researchers chose the Microsoft Kinect camera for some pretty obvious reasons. Not only can it be used to capture both RGB images and depth values of objects, but since its launch it has also entered the homes of some 20 million people, making it a perfect piece of hardware for a crowdsourcing task.
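
For readers who want a feel for the raw data, here is a minimal sketch in Python of grabbing one RGB frame and one depth frame from a Kinect using the open-source libfreenect bindings. To be clear, this is not the Kinect@Home plug-in itself -- just an illustration of the RGB-D data the project collects -- and it assumes libfreenect and its Python wrapper are installed.

import freenect
import numpy as np

rgb, _ = freenect.sync_get_video()    # 640 x 480 x 3 array of 8-bit color
depth, _ = freenect.sync_get_depth()  # 640 x 480 array of 11-bit depth values

print("RGB frame:", rgb.shape, rgb.dtype)
print("Depth frame:", depth.shape, depth.dtype)

# Crude export: keep only pixels that returned a valid depth reading
valid = depth < 2047                  # 2047 marks "no reading" on the Kinect
rows, cols = np.nonzero(valid)
points = np.column_stack((cols, rows, depth[valid]))
np.savetxt("kinect_points.txt", points, fmt="%d")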

Before any image frames of an object captured by the Kinect can be uploaded to the Kinect@Home server, users first need to connect their Kinect camera to their PC and install a plug-in. Once they have done so, the website starts showing the live Kinect images in a browser to confirm that the software is working correctly.

Next, the plug-in can be used to start uploading captured frames of an object to the researchers' Kinect@Home server. After uploading is complete, optional metadata can be associated with the model of the object. As well as uploading their own models to the site, users can also download models created by others and import them into their own 3-D modeling software packages.

At present, the resolution of the models has been lowered so that they can be displayed over the web, but the researchers say that as they acquire faster servers and more bandwidth, this will change dramatically.

At present, the Kinect@Home browser plug-in only runs on a PC running Microsoft Windows Vista, 7 or 8, but the Swedish software engineers would be pleased to talk to any other software developers who might be interested in porting the plug-in to the Linux or Mac operating systems, as well as in providing support for Microsoft's Software Development Kit.

If you do give the software a try and your models look a bit messed up when they appear in the browser, it's probably your fault. To get the best results from the system, the software developers advise users to move their Kinect cameras slowly and not to point them towards blank walls or featureless, empty spaces.

Personally, I'm tempted to go out and buy a Kinect just to see what Kinect@Home is like. But if you already have one, you can try out the software here.