Today’s smartphones and tablets have made us all pretty familiar with touch screen technology, but its origin actually goes back to the mid-1960s. The first commercially available touch screens appeared some 20 years later, initially in Buick automobiles, though their reliability was questionable and they were expensive to replace.
As the technology evolved, touch screens were increasingly used in specialist applications such as industrial control, the restaurant trade, and later gaming devices, but it wasn’t until the introduction of Apple’s iPhone that touch screens took off as a mainstream input mechanism in residential and commercial applications.
Every smartphone and tablet has one now, and so do most of today’s laptops and notebooks. It’s no exaggeration to say they have become so second nature that it’s hard to imagine how we ever ran our lives without them.
The technologies used to create touch screens have evolved too, including some funky manifestations involving ultrasonic waves and acoustic pulses, but only three remain in common use: resistive overlays, which survive because they are moisture resistant (try using your smartphone with wet fingers); infra-red frames, which can be retro-fitted, including over stacked display panels; and the projected capacitive technology used today in every smartphone, tablet, notepad and touch laptop.
Touch Screens: Getting Smarter
All of this touch screen technology ultimately produces the same simple output: notification of a touch, or a released touch, and the x-y coordinate at which it occurs. This output is very similar to that of a one-button mouse, except that coordinates are only given while the display is actually being touched – as if the mouse button were being pressed. An important development has been support for a small number of simultaneous touches, each of which is reported independently. Interpreting these touches in software has allowed gestures, such as pinch and spread, to be supported, which significantly adds to the user experience.
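To make that concrete, here is a minimal sketch of how two simultaneous touch reports might be interpreted as a pinch or spread gesture. The function name and the movement threshold are illustrative assumptions, not any vendor’s actual API:

```python
import math

def classify_two_finger(p1_start, p2_start, p1_end, p2_end, threshold=10.0):
    """Classify a two-finger gesture as 'pinch', 'spread' or 'none' by
    comparing the distance between the two touch points at the start
    and at the end of the gesture (all coordinates in pixels)."""
    d_start = math.dist(p1_start, p2_start)
    d_end = math.dist(p1_end, p2_end)
    if d_end < d_start - threshold:
        return "pinch"   # fingers moved together, e.g. zoom out
    if d_end > d_start + threshold:
        return "spread"  # fingers moved apart, e.g. zoom in
    return "none"

# Two touches moving apart: a spread (zoom-in) gesture
print(classify_two_finger((100, 100), (200, 100), (80, 100), (240, 100)))
# prints 'spread'
```

Real gesture recognisers track the touch points continuously rather than just comparing endpoints, but the underlying idea – watch how the distance between reported coordinates changes – is the same.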
Using these technologies, we have simple two-dimensional touch, albeit with some limitations, particularly the size of display and the number of simultaneous users that can be supported.
In the last two years, touch walls have emerged as a market, particularly in large-scale data visualisation and collaboration applications. Typically deployed in the prestige meeting rooms and executive and customer briefing centres of large corporates, but also in museums and higher educational establishments, they require both massive scale and support for unlimited simultaneous users.
Which is where optical recognition touch screen technology comes in. Originally developed at the University of Helsinki in 2007, this technology uses cameras to literally ‘see’ what is touching the display surface. As with traditional LCD displays, the screen is illuminated with a white backlight; with optical recognition, however, infra-red light, invisible to the human eye, is added.
When an object touches the glass surface of the display, some of the infra-red light is reflected back and picked up by an array of infra-red cameras located on the backlight board. This is illustrated in Figure 1 below. Figure 2 is an example of an actual image seen by one of the infra-red cameras. By using infra-red light, none of the ambient light in the room is reflected and a clear picture of the object touching the display is visible. Sophisticated software algorithms then interpret the image, in this case identifying not only the five fingers, but also the palm of the hand as shown in Figure 3.
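The heart of that interpretation step is segmenting the camera image into bright ‘blobs’ of reflected infra-red light. The commercial algorithms are far more sophisticated – distinguishing fingers from palms, for instance – but the basic idea can be sketched as a connected-components pass over a thresholded frame. This is a toy illustration under assumed conditions, not the actual product code:

```python
def find_touch_blobs(image, threshold=128):
    """Find connected bright regions ('blobs') in a grayscale infra-red
    frame, given as a 2D list of 0-255 intensities. Returns a list of
    (centroid_x, centroid_y, pixel_count) tuples, one per blob."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    blobs = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                # Flood-fill the blob starting here, collecting its pixels
                stack, pixels = [(x, y)], []
                seen[y][x] = True
                while stack:
                    px, py = stack.pop()
                    pixels.append((px, py))
                    for nx, ny in ((px + 1, py), (px - 1, py),
                                   (px, py + 1), (px, py - 1)):
                        if (0 <= nx < w and 0 <= ny < h
                                and not seen[ny][nx]
                                and image[ny][nx] >= threshold):
                            seen[ny][nx] = True
                            stack.append((nx, ny))
                mx = sum(p[0] for p in pixels) / len(pixels)
                my = sum(p[1] for p in pixels) / len(pixels)
                blobs.append((mx, my, len(pixels)))
    return blobs

# Toy 6x8 frame with two bright reflections (e.g. two fingertips)
frame = [
    [0,   0,   0, 0, 0,   0,   0, 0],
    [0, 200, 210, 0, 0,   0,   0, 0],
    [0, 190, 220, 0, 0, 180,   0, 0],
    [0,   0,   0, 0, 0, 205, 195, 0],
    [0,   0,   0, 0, 0,   0,   0, 0],
    [0,   0,   0, 0, 0,   0,   0, 0],
]
print(len(find_touch_blobs(frame)))  # prints 2
```

Each blob’s centroid then becomes a reported touch coordinate, while its size and shape help classify what is touching – a fingertip, a palm, or a pen.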
One early concern was whether the interpretation of images in this way could be time consuming and a potential cause of latency, leading to seemingly unresponsive performance. In fact, it turns out to be the fastest of all touch screen technologies, with no perceptible lag, even when drawing at speed.
A further benefit is that, unlike all other touch technologies, optical recognition has virtually unlimited scalability. Rather than the typical ‘10 touch points’, any number of users can work simultaneously, the last receiving exactly the same experience as the first, with no danger of touches being confused.
Because optical recognition operates entirely behind the display, it has no impact on the display edges and thus can take advantage of the latest narrow bezel LCD technology. This makes optical recognition displays truly stackable, not only as flat displays, but also in curved wall configurations, an important consideration for the user’s field of view with a large wall.
The ability to identify objects touching the display, or even approaching it, opens a world of new opportunities. First, drawing tools such as pens and erasers can be identified immediately, with no need for mode-switching controls. Second, the ability to see objects close to the display allows the user’s position to be identified. Consider a horizontal table-top touch display in a retail environment.
When a customer touches the display, perhaps triggering a pop-up message, their orientation can be determined from the position of their wrist, which is visible even though it is not touching the display surface. The pop-up message can then be oriented so that it directly faces the user, wherever they are standing around the table. This makes for a more personal user experience.
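One way to sketch the geometry: treat the touch point and the wrist’s centroid (from the camera image) as a direction vector, and rotate the pop-up to match. The function name and angle convention below are assumptions for illustration, not the actual product algorithm:

```python
import math

def popup_rotation(touch_xy, wrist_xy):
    """Estimate the rotation (degrees) for a pop-up so it faces the
    user, given the touch point and the centroid of the wrist hovering
    near it. Convention (an assumption): screen y grows downwards, and
    0 degrees means the user stands at the bottom edge of the table."""
    dx = touch_xy[0] - wrist_xy[0]
    dy = touch_xy[1] - wrist_xy[1]
    # Angle of the wrist->touch direction, measured from 'straight up'
    return math.degrees(math.atan2(dx, -dy)) % 360

# Wrist directly below the touch point: user at the bottom edge
print(popup_rotation((100, 100), (100, 150)))  # prints 0.0
```

A user standing at a different edge of the table produces a different wrist-to-touch direction, and the pop-up is rotated accordingly.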
Perhaps the most exciting aspect of this touch screen technology is the ability to read printed QR-like codes and their orientation. The codes, which cost virtually nothing to reproduce, can be easily applied to merchandise and other objects. When placed on the display, the code is instantly recognised allowing relevant content to be immediately displayed. A great example application is kitchen design, where customers build their dream kitchen, placing miniature replicas of cabinets and appliances on the display surface, which then shows them dimensions, 3D renderings and so on.
Increasingly used for product launches and other high-profile applications, optical recognition takes touch screen technology to a new dimension. Perhaps the ‘touch screen’ moniker has become too narrow, and touch displays have evolved into intelligent surfaces?