Capacitive Touch Screens

Graham Roth
May 22, 2013

Submitted as coursework for PH250, Stanford University, Spring 2012

Introduction

Phones with touchscreens are practically ubiquitous nowadays. Since well before the Apple iPhone debuted in 2007, mobile phones with capacitive touch-sensing screens have been available from multiple manufacturers. These screens allow us to take full advantage of our most responsive output devices, our fingers, to create input for our mobile phones. Advancements in capacitive screen technology have allowed for multi-touch devices, some as early as 1984, culminating in the iPhone, Android phones, and the other smartphones of today.

Motivation

Fitts's Law is a mathematical model describing how quickly humans can use their bodies to generate information signals. In his paper on the information capacity of the human motor system in controlling the amplitude of movement, Paul Fitts demonstrated the mathematical relationship between the time it takes to move to and select a target and the distance to and size of that target. [3] In a later paper, Stuart Card et al. explored the design space of possible input devices, performing experiments to determine which body parts can generate information the fastest. [2] While the most common input device of the day, the mouse, is driven by the wrist, Card et al. showed that the wrist is only the second most efficient choice: the body parts with the shortest targeting and selection times are the fingers. [2]
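In its widely used "Shannon" formulation, that relationship can be written as MT = a + b log2(D/W + 1), where D is the distance to the target, W is its width, and a and b are constants that depend on the device and body part. The short sketch below uses purely illustrative constants (not values from either paper) to show how movement time grows with this index of difficulty.

    import math

    def fitts_movement_time(distance, width, a=0.1, b=0.15):
        """Estimate the time (s) to acquire a target of a given width (m)
        at a given distance (m), using the Shannon formulation of Fitts's
        Law: MT = a + b * log2(D/W + 1). The constants a and b are
        device- and body-part-specific; the defaults here are illustrative
        only."""
        index_of_difficulty = math.log2(distance / width + 1)  # bits
        return a + b * index_of_difficulty

    # A small, nearby target takes about as long to hit as a large,
    # distant one when the two share the same index of difficulty.
    print(fitts_movement_time(distance=0.02, width=0.01))  # close, thumb-sized target
    print(fitts_movement_time(distance=0.16, width=0.08))  # larger target, farther away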

History

Long before the era of the personal computer, and long before those computers were small enough to fit inside a cellular telephone, capacitive touch-pads were being used in synthesizers and other devices for creating electronic music. Light pens existed by 1952 that could select a point on a CRT screen by signaling back to the computer at the moment the electron beam refreshed the spot the pen was pointed at. In 1965, E. A. Johnson described in Electronics Letters a method for using capacitive touch-sensing to make a touch-sensitive screen, a system that would be used for air-traffic control in the UK until the late 1990s. [4]

From the late 1960s onward, touchscreens became more and more refined as research continued. Multi-touch-capable screens were demonstrated as early as 1984, though at that point most multi-touch devices were simply tablets and not screens. [2] In 1994, IBM and BellSouth announced the Simon, often called "the first smartphone." The Simon had few physical buttons and used a touchscreen as its primary source of input, combining the functions of a phone with those of a Personal Digital Assistant (PDA). [5]

Since then, touchscreens have become higher-resolution, cheaper to manufacture, and more sophisticated, eventually leading to the release of the most famous smartphone, the Apple iPhone, in 2007. Clearly inspired by the Simon, the iPhone has only one button on the main interface and relies on the touchscreen for most of its input. It was also the first phone to include multi-touch capability, implementing a "pinching" gesture for zooming in and out. Since 2007, the number of touchscreen-based smartphones on the market has exploded, and today they are one of the most common types of cell phone in use.

How Touchscreens Work

Capacitive touch-sensing comes in two varieties, but both take advantage of the relatively large capacitance of the human body to alter that of the screen. In both, the screen contains either a grid of transparent electrodes or two perpendicular layers of transparent, parallel conductive channels, forming a coordinate system.

The first method sometimes found in touchscreens is a self-capacitance system. The screen measures the capacitance between each electrode or wire and its surroundings; when a human finger (or anything else with a dielectric constant different from that of air) comes close, the capacitance between that electrode and the body changes. This change can be measured, and the location of the touching finger can be pinpointed. The screen can be calibrated so that it only responds to a change in capacitance approximately equal to that induced by a touching finger, thereby eliminating accidental responses to other environmental factors. One problem with using a grid of wires in a self-capacitance system is that the two layers operate independently of one another, so a second touch cannot be consistently and accurately pinpointed: "ghosting" artifacts indicate a finger touching the screen in a place where it is not.
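As a rough illustration of this ghosting problem, consider the independent row and column readings in the sketch below; the values, threshold, and grid size are hypothetical, chosen only to show how two real touches yield four equally plausible locations.

    from itertools import product

    # Hypothetical per-electrode capacitance changes (arbitrary units), read
    # independently for each row and each column of a self-capacitance grid.
    row_deltas = [0, 0, 9, 0, 0, 8, 0, 0]   # rows 2 and 5 show a change
    col_deltas = [0, 7, 0, 0, 0, 0, 10, 0]  # columns 1 and 6 show a change

    TOUCH_THRESHOLD = 5  # calibrated so only a finger-sized change registers

    active_rows = [i for i, d in enumerate(row_deltas) if d > TOUCH_THRESHOLD]
    active_cols = [j for j, d in enumerate(col_deltas) if d > TOUCH_THRESHOLD]

    # With one touch this yields exactly one (row, col) pair. With two touches
    # at (2, 1) and (5, 6), the independent axis readings also admit the
    # "ghost" locations (2, 6) and (5, 1); the controller cannot tell which
    # two of the four candidates are real.
    candidates = list(product(active_rows, active_cols))
    print(candidates)  # [(2, 1), (2, 6), (5, 1), (5, 6)]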

The other method often used is a mutual-capacitance system. With a grid of wires, the two axes of the grid work together: the wires running in one direction carry a driving signal, while those in the other direction sense the capacitance between them. Alternatively, a discrete capacitor can be placed at each grid location instead of an electrode. In either construction, when a finger comes close to the screen, its dielectric constant, being different from that of air, changes the local electric field and therefore the mutual capacitance of the wires or the capacitor array. The location data is then sent to the processor. Just as in a self-capacitance system, the sensitivity can be calibrated so as to respond only to a human finger.
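A minimal sketch of this kind of scan is shown below. The measure_node() function is a hypothetical placeholder for the hardware step of driving one row and sensing one column, not a real driver API; the point is that every row/column intersection is measured individually, so two simultaneous touches appear as two distinct changes in the map, with no ghosting.

    NUM_ROWS, NUM_COLS = 8, 8
    TOUCH_THRESHOLD = 5  # calibrated change (arbitrary units) for a finger

    def measure_node(row, col):
        """Placeholder: return the capacitance change at one row/column
        intersection. A finger near the node alters the mutual capacitance,
        reported here as a positive delta. Two pretend touches are hard-coded
        for illustration."""
        touched = {(2, 1), (5, 6)}
        return 9 if (row, col) in touched else 0

    def scan_grid():
        """Drive each row in turn and sense every column, building a full
        2-D map of capacitance changes, one value per intersection."""
        return [[measure_node(r, c) for c in range(NUM_COLS)]
                for r in range(NUM_ROWS)]

    delta_map = scan_grid()
    touches = [(r, c) for r in range(NUM_ROWS) for c in range(NUM_COLS)
               if delta_map[r][c] > TOUCH_THRESHOLD]
    print(touches)  # [(2, 1), (5, 6)]: both touches located unambiguously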

Once the processor receives data from the screen, it removes noise, calculates the center of each intended touch region, groups multiple touches if applicable, determines what gesture, if any, is being performed, and passes that information on to whatever software is running on the OS, allowing you to use gestures to control your phone. [6]
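Those steps might be sketched roughly as follows. The flood-fill grouping, signal-weighted centroid, and pinch test below are simplified illustrations of that kind of processing, not any particular phone's firmware; the noise floor and shrink ratio are assumed values.

    import math
    from collections import deque

    NOISE_FLOOR = 3  # readings at or below this are treated as noise

    def find_touches(delta_map):
        """Group adjacent above-noise nodes into blobs and return the
        signal-weighted centroid of each blob as an (x, y) touch point."""
        rows, cols = len(delta_map), len(delta_map[0])
        seen = set()
        touches = []
        for r in range(rows):
            for c in range(cols):
                if delta_map[r][c] <= NOISE_FLOOR or (r, c) in seen:
                    continue
                # Flood-fill one contiguous group of active nodes.
                blob, queue = [], deque([(r, c)])
                seen.add((r, c))
                while queue:
                    cr, cc = queue.popleft()
                    blob.append((cr, cc, delta_map[cr][cc]))
                    for nr, nc in ((cr+1, cc), (cr-1, cc), (cr, cc+1), (cr, cc-1)):
                        if (0 <= nr < rows and 0 <= nc < cols
                                and delta_map[nr][nc] > NOISE_FLOOR
                                and (nr, nc) not in seen):
                            seen.add((nr, nc))
                            queue.append((nr, nc))
                # Weighting by signal strength gives sub-node resolution.
                total = sum(w for _, _, w in blob)
                x = sum(cc * w for _, cc, w in blob) / total
                y = sum(cr * w for cr, _, w in blob) / total
                touches.append((x, y))
        return touches

    def is_pinch(prev_touches, curr_touches, shrink_ratio=0.8):
        """Report a pinch (zoom-out) gesture when two touch points move
        markedly closer together between successive frames."""
        if len(prev_touches) != 2 or len(curr_touches) != 2:
            return False
        def dist(a, b):
            return math.hypot(a[0] - b[0], a[1] - b[1])
        return dist(*curr_touches) < shrink_ratio * dist(*prev_touches)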

Conclusions

Touchscreens are becoming more and more prevalent every day. With increasingly fine capacitor grids in screens and increasingly accurate tracking algorithms, it won't be long until touchscreens are in almost every part of our lives. Though multi-touch systems have been slower to develop, they, too, are growing in prevalence, speed, and affordability. Where for years we were stuck using the mouse to navigate the digital realm, this developing technology allows us to truly harness the most efficient and useful output devices on our bodies: our fingers.

© Graham Roth. The author grants permission to copy, distribute and display this work in unaltered form, with attribution to the author, for noncommercial purposes only. All other rights, including commercial rights, are reserved to the author.

References

[1] "Touching the Future," The Economist: Technology Quarterly, 6 Sep 08

[2] S. K. Card, J. D. Mackinlay, and G. G. Robertson, "A Morphological Analysis of the Design Space of Input Devices," ACM Trans. Information Sys. 9, 99 (1991).

[3] P. M. Fitts, "The Information Capacity of the Human Motor System in Controlling the Amplitude of Movement," J. Exp. Psych. 47, 381 (1954).

[4] E. A. Johnson, "Touch Display: A Novel Input/Output Device for Computers," Electronics Lett. 1, 219 (1965).

[5] D. J. Allard et al., "Apparatus and Method for Marking Text on a Display Screen in a Personal Communications Device," U.S. Patent 5815142, 29 Sep 98.

[6] L. K. Baxter, Capacitive Sensors: Design and Applications (Wiley-IEEE, 1996).