I stumbled onto an article about touch screen technology through Twitter via Atmel. They gave a tiny little piece of history on touch screens and have a great infographic on it. I took one of the names, started searching, and found cool little nuggets of useless but fun information on the subject, so I wanted to compile it here. Most of it is just regurgitating Wikipedia, but it's still nice to have it all written up concisely and not so encyclopedic-sounding. If you'd rather read all this unfiltered, it's at Wikipedia here (about touch screens in general) and here (about multi-touch). I've just reorganized and distilled it all.

Accuracy is not guaranteed and was not at all verified. If I were to write a book about it, I'd go double-check all this stuff. This is a blog. It's not worth the pixels it's printed on.

As stated in the Atmel article, touch screens are EVERYWHERE now. So much so that children think screens that don't respond to touch are simply broken. A monitor without touch is, well, quaint. Remember that scene from the movie "Star Trek IV: The Voyage Home" where Scotty talks into the Macintosh mouse? "A keyboard... How quaint."
Touch Screen History and Timeline
This isn't an exhaustive list of key events. I whipped this together with quick searches and some pretty terse reading. (Read: I have a short attention span.)
2,000 B.C.: First touch interface experiments on man-eating tigers considered a failure due to the animals eating the scientists.
Prior to 1965: Touch interfaces only in science fiction.
1965-1967: The capacitive touchscreen was first described by the UK's E. A. Johnson in articles he published, including diagrams and photos of his work.
1968: The application of this technology to air traffic control systems was described.
Late 1960s: IBM began building the first touch screens.
1972: PLATO IV by Control Data used single-touch in a 16x16 array. Danish EE Bent Stumpe develops capacitive touch screens (based on his work in the 1960s when he worked at a television factory).
1973: CERN produces a transparent touch screen developed by their engineers Frank Beck and Bent Stumpe, based on work by Stumpe.
October 7, 1975: US patent #3,911,215 is awarded to American inventor G. Samuel Hurst for a resistive touch screen.
1982: First resistive touch screen is produced. Multi-touch is built using a camera and frosted glass by the University of Toronto's Input Research Group.
1983: Bell Labs publishes comprehensive discussion on touch screen UIs. Pinch-to-zoom and other gestures of today are influenced by Myron Krueger's video system "Video Place/Video Desk" (Myron talks about it on YouTube, at 4:34 you see pinch-to-scale).
1984: Bell Labs makes a touch screen that allows the user to change the size of an image with two hands (um, that would be multi-touch, Bob).
1985: Capacitive multi-touch tablet built by University of Toronto's Input Research Group.
1985-1989: GM uses touch screens in their Buick brand's Riviera and Reatta models of cars.
1991: Pierre Wellner publishes paper on multi-touch "Digital Desk" (working prototype is an actual desk with a camera above) which supports pinching gestures.
Circa 1991-1992: Sun's Star7 prototype PDA uses a touch screen.
1993: IBM's Simon, the first touchscreen phone, is released; Apple releases the Newton.
1996: Palm releases the first edition of its Palm Pilot, which has slick handwriting recognition. I loved mine. I have a color one in one of my junk bins. Sadness.
1999: Fingerworks begins work on multi-gesture input. (You'll see this name again down below.)
2001: Mitsubishi Electric Research Laboratories begins development on DiamondTouch, a multi-touch AND multi-user system that used capacitance (via floor mat and chairs). Microsoft starts development on the PixelSense tabletop touch system. Alias/Wavefront's PortfolioWall is made for collaborative teams.
2004: Nintendo DS handheld game system is released with a touch screen (and stylus). Tetris rocks.
2005: Apple acquires Fingerworks (and its patents), which developed multi-touch and multi-gesture technology between 1999 and 2005, filing patents for that work from 2001 to 2005. Apple patented refinements to the Fingerworks technologies.
2006: 200,000 touch-enabled mobile devices shipped.
2007: iPhone brings multi-touch into mainstream consumer consciousness (Apple incorrectly claims they invented multi-touch).
2008: Microsoft introduces the Surface (later renamed to PixelSense, I guess).
2012: Atmel releases its XSense flexible touch screen line of products based on copper mesh electrodes, which came out of the work done at Quantum Research Group Ltd. (acquired by Atmel in 2008 to become Atmel Technologies Ireland Limited).
2013: Estimated mobile touch devices shipped to hit 1.28 billion (with a B!).
July 17, 2019: Andy Frey predicts touch screen technology will be a big deal in the future.
Technologies Used in Touch Interfaces
These are all SUPER-duper-simplified explanations of the various technologies, so you EEs out there: Don't be whiny, bitchy commenters about the lack of depth or even slight imperfections in the descriptions. My audience will dig deeper on their own to find the real scoop. Consider this sloppy Cliff's Notes, but, like, sloppier and "good enough." Thanks. Love, Andy.
The predominant technology in mobile and tablet touch interfaces today (2013, in case you're reading this in 2064) is capacitive sensing. Capacitive sensing works by monitoring the capacitance at points on the screen and noting changes in that value: the human body changes the local capacitance when it nears or touches the sensor surface. Capacitance is (roughly) a measure of how much electric charge something can store. The time the sensor takes to discharge is known when no finger is around. If that time to discharge changes, the device knows someone is messin' with it.
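That discharge-timing trick can be sketched in a few lines. This is a toy illustration, not any real controller's firmware; the names and microsecond values are all made up.

```python
# Toy sketch of capacitive touch detection via discharge timing.
# A finger adds capacitance, so the pad takes longer to discharge.
# Baseline and threshold values below are illustrative, not from real hardware.

BASELINE_DISCHARGE_US = 50.0   # known discharge time with no finger nearby
TOUCH_THRESHOLD_US = 5.0       # how much extra time counts as a touch

def is_touched(measured_discharge_us: float) -> bool:
    """Report a touch when discharge takes noticeably longer than baseline."""
    return (measured_discharge_us - BASELINE_DISCHARGE_US) > TOUCH_THRESHOLD_US

print(is_touched(51.0))  # tiny drift, no touch -> False
print(is_touched(62.0))  # big slowdown, touch  -> True
```

Real controllers do this per-electrode, continuously re-calibrate the baseline, and filter noise, but the core idea is just "did the timing change?"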
Resistive sensing touch panels measure resistance at various points on the surface. When pressure (required for this to work) is applied to the surface, the resistance at that point changes from the norm and the device knows you pressed there. The device runs through all the intersections of resistances on the X and Y axes to find the points whose resistance values aren't normal.
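The X/Y scan described above boils down to comparing each intersection against a known baseline. Here's a minimal sketch with invented numbers; a real panel would read actual resistances through an ADC.

```python
# Toy sketch of scanning a resistive touch grid.
# Every (row, col) intersection has a known baseline resistance; a press
# changes the resistance there. All values are made up for illustration.

def find_touches(readings, baseline, tolerance=0.1):
    """Return (row, col) cells whose resistance deviates from baseline."""
    touches = []
    for y, row in enumerate(readings):
        for x, value in enumerate(row):
            if abs(value - baseline[y][x]) / baseline[y][x] > tolerance:
                touches.append((y, x))
    return touches

baseline = [[100.0] * 3 for _ in range(3)]   # quiet panel: uniform resistance
readings = [[100.0, 100.0, 100.0],
            [100.0, 62.0, 100.0],            # pressure changed this one
            [100.0, 100.0, 100.0]]
print(find_touches(readings, baseline))      # [(1, 1)]
```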
Optical (Like IR and Light and Stuff)
Usually involves either a frame around the outer edge of the panel or a camera above or behind the panel. With the frame style, emitters of IR shoot IR to receivers of IR. If the beams of IR are interrupted, BOOM! Touch. With the camera types, there is usually a panel that makes it easier for a camera to see fingertips squished onto the surface. The computer then recognizes the points where the fingertips are as touches.
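The frame style is easy to picture in code: a blocked beam on the X edge plus a blocked beam on the Y edge gives you a coordinate. A hedged sketch, with the beam states made up for illustration:

```python
# Toy sketch of an IR-frame touch panel.
# Each entry is True if the receiver still sees its emitter's beam,
# False if something (a finger) interrupted it.

def locate_touch(x_beams, y_beams):
    """Return (x, y) of the first blocked beam pair, or None if untouched."""
    xs = [i for i, clear in enumerate(x_beams) if not clear]
    ys = [i for i, clear in enumerate(y_beams) if not clear]
    if xs and ys:
        return (xs[0], ys[0])
    return None

x_beams = [True, True, False, True]  # beam 2 along X is blocked
y_beams = [True, False, True, True]  # beam 1 along Y is blocked
print(locate_touch(x_beams, y_beams))  # (2, 1)
```

One wrinkle this toy ignores: with beam grids, two simultaneous touches can produce ambiguous "ghost" points, which is one reason cameras took over for serious multi-touch.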
Wave or Acoustic
Ultrasonic waves are barfed out over the surface of the display. If something screws with the wave as it travels across the display, a touch is registered. The waves are absorbed or deflected by a finger or stylus.
More Details on Touch Tech
If you're interested in geeking out more than this little article offers, here are some of the links I scanned through to gather this info. Please don't use these links as accuracy weapons to correct my mistakes, mistypes, misconceptions, or any other misses. We nerds can be pretty annoyingly skeptical of the accuracy of claims made by people other than ourselves. If you're skeptical, go read this stuff on your own and write your own synopses.