Thursday, December 29, 2011

Image processing is all in the mind?

If you are anything like me, you probably gave out one or two video games as presents to some of your younger relatives over the holiday season. If you did, however, you ought to be aware of the danger involved, and the potential repercussions of your actions.

Apparently, according to research carried out by academics in the UK and Sweden, some video game players are becoming so immersed in their virtual gaming environments that -- when they stop playing -- they transfer some of their virtual experiences to the real world.

That's right. Researchers led by Angelica Ortiz de Gortari and Professor Mark Griffiths from Nottingham Trent University's International Gaming Research Unit, and Professor Karin Aronsson from Stockholm University, have revealed that some gamers experience what they call "Game Transfer Phenomena" (GTP), which results in them doing things in the real world as if they were still in the game!

Extreme examples of GTP have included gamers thinking in the same way as when they were gaming, such as reaching for a search button when looking for someone in a crowd, or seeing energy boxes appear above people's heads.

Aside from the game players, though, I wonder if this research might also have some implications for software developers working in the vision systems business, many of whom also work long hours staring at computer screens, often taking their work home with them.

How many of these individuals, I wonder, also imagine that they are performing image-processing tasks when going about their daily routine? Have you, for example, ever believed that you were performing a hyperspectral analysis when considering whether or not to purchase apples in the supermarket, optical character recognition to check the sell-by date on the fruit, or even a histogram equalization on the face of the attractive young lady at the checkout line?
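For readers tempted to indulge that last daydream at their desks, here is a minimal sketch of histogram equalization using the OpenCV library in Python. The image file name is, of course, purely illustrative.

```python
# A minimal sketch of histogram equalization with OpenCV.
# The file name is hypothetical -- substitute any grayscale-friendly image.
import cv2

# Load the (entirely hypothetical) checkout snapshot as a grayscale image
face = cv2.imread("checkout_face.png", cv2.IMREAD_GRAYSCALE)

# Spread the intensity histogram across the full 0-255 range to boost contrast
equalized = cv2.equalizeHist(face)

cv2.imwrite("checkout_face_equalized.png", equalized)
```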

While Professor Griffiths noted that intensive gaming may lead to negative psychological, emotional, or behavioral consequences, the same might hold true for those of us who spend too much time at work developing image-processing software.

Thank goodness, then, that we will soon be able to look forward to a few more days' respite from our toils to celebrate the New Year.

Happy New Year.

Thursday, December 22, 2011

Christmas time is here again

It's that time of year again. That time when many of us will be erecting a fir in the corner, decking the halls with boughs of holly, and sitting back to enjoy a glass of mulled wine as we roast chestnuts over an open fire.

That's right. It's Christmas, the festive season in which we put our work to one side for a while to enjoy a few well-deserved days off to spend with our friends and family.



But before the festivities can begin, there are numerous chores that must be performed. And one of these, of course, is to send Christmas greetings to all our friends and colleagues.

Traditionally, such messages of comfort and joy have been sent via the postal service. After purchasing a box of Christmas cards, many of us spend hours writing individual messages inside them, after which the cards are duly inserted into envelopes, addressed, and taken down to the Post Office where they are mailed.

In these days of automation, however, some of us no longer leave the comfort of our armchairs to perform the task, preferring to use e-mail greeting cards instead. While such e-mail messages may never have quite the same personal appeal as a real piece of card with a Christmas scene printed upon it, they certainly are a cost-effective alternative to sending out the real thing.

With such electronic wizardry automating our traditional time-intensive Christmas labors, it's interesting to consider by what means we might be delivering our Christmas greetings to our friends and colleagues in the future.

Well, I think the folks at Edmund Optics might have found the answer. To distribute a Christmas message to their audience in the vision systems design industry, the innovative Edmund Optics team has produced a rather amusing video that they have uploaded onto YouTube where it can be viewed by all and sundry.

But this isn't just any video greeting. Oh no. The entertaining video features a number of Edmund Optics' employees playing a familiar Christmas tune on the company's own range of telecentric lenses. That's right. Watch carefully and you will see the so-called "Telecentric Bell Choir" ringing the lenses to play that Christmas favorite "Carol of the Bells."

From my perspective, this form of sending holiday greetings to friends and family is clearly the wave of the future. What's more -- for Edmund Optics at least -- it might be a way to generate a whole new market for its acoustically-enabled telecentric product line.

Happy holidays!

Wednesday, December 21, 2011

Recycling light bulbs with vision

The United Nations is urging countries across the globe to phase out old-style incandescent light bulbs and switch to low-energy compact fluorescent light (CFL) bulbs to save billions of dollars in energy costs as well as to help combat climate change.

One issue with such bulbs, however, is that they contain minute traces of mercury, and hence they should be recycled to prevent the release of mercury into the environment rather than simply being tossed into a dumpster.

This, of course, has created an enormous opportunity to automate the collection of old CFL bulbs -- an opportunity that one machine maker in the UK has clearly identified.

That's right. Partly thanks to the stealthy deployment of a machine vision system, London, UK-based Revend Recycling has now developed a machine to collect light bulbs in exchange for discount vouchers or other consumer rewards.



When using the so-called "Light Bulb Recycling Reverse Vending" machine, an individual is guided through the recycling process by a touch-screen menu. After the unwanted bulbs are placed into the machine, they are then identified by the vision system, after which the machine softly drops the bulbs into a storage container.

The machine then automatically dispenses a reward incentive voucher, which can be chosen from a large selection of different rewards on the touch-screen.

To enable recovery and recycling statistics to be collated, the data captured from every light bulb received are transmitted to a secure central database. An embedded computer system in the machine also monitors the fill level of the light bulb storage container and automatically sends a text or email message when it nears capacity, so that the container can be emptied.
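For those curious about how such fill-level monitoring and alerting might work, here is a rough sketch in Python. To be clear, this is not Revend's actual software: the capacity figure, threshold, mail server, and addresses are all assumptions made purely for illustration.

```python
# A hypothetical sketch of the fill-level alerting described above.
# Capacity, threshold, server, and addresses are all invented for illustration.
import smtplib
from email.message import EmailMessage

CONTAINER_CAPACITY = 500   # bulbs the storage container can hold (assumed)
ALERT_THRESHOLD = 0.9      # notify the operator at 90% full (assumed)

def record_bulb(count, database):
    """Log one accepted bulb to the central database and return the new total."""
    count += 1
    database.append({"event": "bulb_received", "total": count})
    return count

def check_capacity(count):
    """E-mail the operator when the storage container nears capacity."""
    if count >= CONTAINER_CAPACITY * ALERT_THRESHOLD:
        msg = EmailMessage()
        msg["Subject"] = "Recycling machine nearly full"
        msg["From"] = "machine@example.com"
        msg["To"] = "operator@example.com"
        msg.set_content(f"{count} of {CONTAINER_CAPACITY} bulbs collected; please empty.")
        with smtplib.SMTP("localhost") as server:
            server.send_message(msg)
```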

So far, the vision based recycling machine has proved to be a bit of a hit. The Scandinavian modern-style furniture and accessories store Ikea, for example, recently inked an agreement with Revend Recycling, and soon the store's customers in the UK, Germany, and Denmark will have the option to recycle used light bulbs with the machines. As an added feature, the recycling machines can also be purchased with an add-on system for collecting domestic batteries.

Friday, December 16, 2011

New standard aims to accelerate image processing

It goes without saying that computer vision has become an essential ingredient of many modern systems, where it has been used for numerous purposes including gesture tracking, smart video surveillance, automatic driver assistance, visual inspection, and robotics.

Many modern consumer computer-based devices -- from smart phones to desktop computers -- are capable of running vision applications, but to do so, they often require hardware-accelerated vision algorithms to work in real time.

Consequently, many hardware vendors have developed accelerated computer vision libraries for their products: CoreImage by Apple, IPP by Intel, NPP by Nvidia, IMGLIB and VLIB by TI, and the recently announced FastCV by Qualcomm.

As each of these companies develops its own API, however, the market fragments, creating a need for an open standard that will simplify the development of efficient cross-platform computer vision applications.

Now, Khronos' (Beaverton, OR, USA) vision working group aims to do just that, by developing an open, royalty-free cross-platform API standard that will be able to accelerate high-level libraries, such as the popular OpenCV open-source vision library, or be used by applications directly.
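The specification itself has yet to be written, of course, but to give a flavor of what is at stake, here is the sort of everyday OpenCV pipeline, written in Python, that a hardware-accelerated back end conforming to such a standard could in principle speed up without any change to the application code. The input file name is illustrative only.

```python
# A typical high-level vision pipeline that an accelerated back end could
# transparently offload to a GPU, DSP, or dedicated vision processor.
import cv2

frame = cv2.imread("frame.png")                   # hypothetical input frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)    # color conversion
blurred = cv2.GaussianBlur(gray, (5, 5), 1.5)     # noise reduction
edges = cv2.Canny(blurred, 50, 150)               # edge detection

cv2.imwrite("edges.png", edges)
```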

The folks at the Khronos Group say that any interested company is welcome to join the group to make contributions, influence the direction of the specification, and gain early access to draft specifications before a first public release within 12 months.

The vision working group will commence work during January 2012. More details on joining Khronos can be found at http://www.khronos.org/members/ or by e-mailing info@khronos.org.

Wednesday, December 14, 2011

Embedded vision is just a game

In the late 1970s, game maker Atari launched what was to become one of the most popular video games of the era -- Asteroids.

Those of our readers old enough to remember might recall how thrilling it was back then to navigate a spaceship through an asteroid field which was periodically traversed by flying saucers, shooting and destroying both while being careful not to collide with either.

Today, of course, with the advent of new home consoles such as the Sony Playstation, Microsoft Xbox, and Nintendo Wii -- and the availability of a plethora of more graphically pleasing and complex games -- one might be forgiven for thinking that games like Asteroids are ancient history.

Well, apparently not. Because thanks in part to some rather innovative Swedish image-processing technology, it looks as if old games might be about to make a comeback.

That’s right. This month, eye tracking and control systems developer Tobii Technology (Danderyd, Sweden) took the wraps off “EyeAsteroids,” a game it claimed was the world’s first arcade game run entirely by eye control.




In the company’s EyeAsteroids game, players have the chance to save the world (yet again) from an impending asteroid collision. As a slew of asteroids move closer to Earth, the gamer looks at them in order to fire a laser that destroys the rocks and saves the world from destruction.
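Tobii has not published the game's source code, but as a back-of-the-envelope illustration, gaze-driven targeting of this kind might boil down to something as simple as the following Python sketch, in which all of the coordinates, radii, and tolerances are invented for the example.

```python
# A speculative sketch of gaze-driven targeting: if the player's gaze point
# falls close enough to an asteroid, the "laser" fires at it.
import math

def find_gazed_asteroid(gaze_x, gaze_y, asteroids, tolerance=20):
    """Return the first asteroid within `tolerance` pixels of the gaze point."""
    for asteroid in asteroids:
        dist = math.hypot(asteroid["x"] - gaze_x, asteroid["y"] - gaze_y)
        if dist <= asteroid["radius"] + tolerance:
            return asteroid
    return None

# Example: the player's gaze lands at (310, 244) with two asteroids on screen
asteroids = [{"x": 300, "y": 240, "radius": 15}, {"x": 600, "y": 100, "radius": 25}]
target = find_gazed_asteroid(310, 244, asteroids)
if target is not None:
    print("Fire laser at", target["x"], target["y"])
```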

Henrik Eskilsson, the chief executive officer of Tobii Technology, believes the addition of eye control to computer games is the most significant development in the gaming industry since the introduction of motion control systems such as the Nintendo Wii. And if he’s right, that’s a big market opportunity for all sorts of folks that are associated with the vision systems business.

But perhaps more importantly, the game might make interested parties take a look at another of the company’s product offerings: an image recognition system that can find and track the eyes of drivers in order to inform an automotive safety system of the driver’s state, regardless of changes in environmental conditions.

Because while saving Earth from asteroids using the eyes might be good fun and games, saving lives on the highway through tracking the eyes of motorists is a much more distinguished achievement, and one that -- in the long run -- might also prove to be a more lucrative business opportunity.

Nevertheless, those folks more captivated by the former application of the technology will be only too pleased to know that the EyeAsteroids game is available for purchase by companies and individuals. Tobii Technology plans a limited production run of 50 units that will be available for $15,000 each.

Friday, December 9, 2011

It's a bug's life (revisited)

Remotely operated unmanned aerial vehicles (UAVs) equipped with wireless video and still cameras can be used in a variety of applications, from assisting the military and law enforcement agencies in surveillance duties, to inspecting large remote structures such as pipelines and electric transmission lines.

Typically, however, such vehicles are sizable, bulky beasts because they must carry a power source as well as large cameras. And that can limit the sorts of applications that they can effectively handle.

Now, however, it would appear that researchers at the University of Michigan College of Engineering have come up with an interesting idea that might one day see vehicles as small as insects carrying out such duties in confined spaces.

And that’s because the vehicles that they are proposing to "build" are, in fact, real insects that could be fitted out with the necessary technology to turn them into mobile reconnaissance devices.




The work is at an early stage of development at the moment, of course. To date, Professor Khalil Najafi, the chair of electrical and computer engineering, and doctoral student Erkan Aktakka have been figuring out ways to "harvest" energy from either the body heat or the movement of the insects as a means to power the cameras, microphones, and other sensors and communications equipment that they might carry.

As interesting as it all sounds, there are obviously bigger engineering challenges ahead than just conquering the energy harvesting issue. One obvious problem is how the researchers will eventually control the insects once they have been fitted out with their energy harvesting devices and appropriate vision systems.

Then again, they may not need to. If a plague of such insects were dropped in Biblical proportions upon a rogue state for clandestine monitoring purposes by our armed forces, the chances that one of them would reveal some useful information would be pretty high.

The religious and political consequences of letting loose high-tech pestilent biotechnology on such countries, however, might be so profound that the little fliers never get off the ground.

Editor's note: The research work at the university was funded by the Hybrid Insect Micro Electromechanical Systems program of the Defense Advanced Research Projects Agency under grant No. N66001-07-1-2006.

Wednesday, December 7, 2011

I can see clearly now

While I've always been short-sighted, until not long ago it was pretty easy for me to read books or magazines while wearing the same set of glasses that helped me see at great distances.

But over the past couple of years, it became apparent that I not only needed glasses to correct for myopia but also to assist with looking at things closer to hand.

To solve my dual myopic-hyperopic headache, I turned to my local optician, who suggested that a pair of bifocal or varifocal lenses might do the trick. And it did. Thanks to her recommendation, I now sport a rather expensive pair of glasses with varifocal lenses that enable me to focus on objects both far and near.

As great as these varifocals are, however, I accept that they aren't for everyone. In fact, some people dislike them as they find it difficult to get used to which areas of the lens they have to look through!

One optics company -- Roanoke, Virginia-based PixelOptics -- has come up with a unique solution to the problem: an electronic set of glasses called emPower that has a layer of liquid crystals in each lens that can instantly create a near-focus zone, either when the user touches a small panel on the side of the frames or in response to up-and-down movements of the head.
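PixelOptics has not disclosed how its firmware actually works, so the following Python sketch is purely speculative: it assumes head pitch is read from an onboard tilt sensor, and the threshold values are invented for illustration.

```python
# A speculative sketch of a head-tilt trigger for a near-focus zone.
# Threshold values and the notion of a pitch reading are assumptions.
TILT_ON_THRESHOLD = -25.0    # degrees of downward pitch that activates near focus
TILT_OFF_THRESHOLD = -10.0   # returning above this deactivates it again

def update_near_focus(pitch_degrees, near_focus_active):
    """Toggle the liquid-crystal near-focus zone based on head pitch,
    with hysteresis so the lens does not flicker around one threshold."""
    if not near_focus_active and pitch_degrees <= TILT_ON_THRESHOLD:
        return True          # looking down at a book: switch near focus on
    if near_focus_active and pitch_degrees >= TILT_OFF_THRESHOLD:
        return False         # looking back up: switch it off
    return near_focus_active
```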

Under development for 12 years, the new system, which is protected by nearly 300 patents and patent applications pending around the world, looks to be yet another interesting option for those folks with optical issues like mine.

I'd like to think that there might be some use for this technology in the industrial marketplace, too, but I haven't quite figured out where that might be yet.

I can't, for example, envisage any system integrator manually swiping cameras fitted with such lenses to change their focal length while inspecting parts at high speed in an industrial setting. Nor could I imagine that many engineers would build a system to move such a camera up and down to do the same -- an autofocus system would surely be a lot more effective!

Nevertheless, I'm keeping an open mind about the whole affair, because the imaging business is replete with individuals who can take ideas from one marketplace and put them to use in others.

Monday, December 5, 2011

In your eye

While industrial vision systems might seem pretty sophisticated beasts, none has really come close to matching the astonishing characteristics of the human eye.

Despite that fact, even the human eye is often less than perfect, as those suffering from short- or long-sightedness will testify. Those folks inevitably end up seeking to correct such problems by wearing either spectacles or contact lenses.

Not content with developing a contact lens to correct vision anomalies, however, a team of engineers at the University of Washington and Aalto University, Finland, have now developed a prototype contact lens that takes the concept of streaming real-time information into the eye a step closer to reality.

The contact lens itself has a built-in antenna to harvest power sent out by an external source, as well as an IC to store the energy and transfer it to a single blue LED that shines light into the eye.

One major problem the researchers had to overcome was the fact that the human eye -- with its minimum focal distance of several centimeters -- cannot resolve objects on a contact lens. Any information projected on to the lens would probably appear blurry. To resolve the issue, they incorporated a set of Fresnel lenses into the device to focus light from the LED onto the retina.

After demonstrating the operation and safety of the contact lens on a rabbit, they found that significant improvements are still necessary before fully functional, remotely powered displays become a reality. While the device, for example, could be wirelessly powered in free space from approximately 1 m away, that range was reduced to about 2 cm when the lens was placed on the rabbit's eye.

Another challenge facing the researchers is creating a future version of the contact lens with a display that can produce more than one color at a higher resolution. While the existing prototype has only a single controllable pixel, they believe that future devices might incorporate hundreds of pixels, allowing users to read e-mails or text messages.

I don’t know about you, but I can’t wait for the time in which I might be able to have such e-mails, text messages, or, heaven forbid, Twitter feeds, projected directly into my eye, especially when I am on vacation. No, I wouldn’t even put my pet rabbit through such an ordeal.



Shown: Conceptual model of a multipixel contact lens display. 1: LED. 2: Power harvesting circuitry. 3: Antenna. 4: Interconnects. 5: Transparent polymer. 6: Virtual image.