How does the touch screen work?

The touch screen is the interface of the present. A few decades ago it seemed somewhat futuristic, but with mobile devices it has become quite a conventional user interface. What’s more, it has also made the leap to other areas, such as convertible or 2-in-1 laptops, etc. But… how does this interesting technology work?

What is a touch screen?

A touch screen is an input device that enables interaction with a computer, mobile phone or other electronic device through direct contact with the surface of the screen. Instead of using a keyboard or mouse, the user can tap, swipe, pinch or perform other gestures on the screen to carry out different actions and commands.
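To make the idea of gestures concrete, here is a minimal sketch that classifies a tap, swipe or pinch from start and end touch points. The event model (lists of (x, y) tuples, a pixel threshold for a tap) is an illustrative assumption, not a real touch API:

```python
import math

# Illustrative sketch (assumed event model, not a real touch API): classify a
# simple one- or two-finger gesture from start and end touch points, each
# given as an (x, y) tuple in screen pixels.
def classify_gesture(start_points, end_points, tap_threshold=10):
    if len(start_points) == 1:
        (x0, y0), (x1, y1) = start_points[0], end_points[0]
        # A short movement counts as a tap; a longer one as a swipe.
        if math.hypot(x1 - x0, y1 - y0) < tap_threshold:
            return "tap"
        return "swipe"
    if len(start_points) == 2:
        # Pinch: compare the finger separation before and after the movement.
        d0 = math.dist(start_points[0], start_points[1])
        d1 = math.dist(end_points[0], end_points[1])
        return "pinch_in" if d1 < d0 else "pinch_out"
    return "unknown"
```

Real gesture recognizers also track timing and intermediate samples, but the same principle applies: compare where the fingers started with where they ended.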

Touch screen types

There is not just one touch screen technology but several. To understand how touch screens work, which is the aim of this article, you first need to know the main types and how each of them operates:

Resistive

The most basic and common touch screens, such as those on many vending machines and ATMs, are of the resistive type. These screens work through the interaction of two electrically conductive layers that flex when touched: one resistive and one conductive, held apart by tiny spacers until contact is made. An electrical current constantly flows through these layers, and when you press the screen, the layers meet and the current at the point of contact changes. The software detects this disturbance at specific coordinates and executes the corresponding function.
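The paragraph above can be sketched in code. In a typical 4-wire resistive design, the controller applies a voltage gradient across one layer and reads, through the other layer, the voltage at the contact point; the ratio of that reading to the full scale gives the position. The ADC resolution, panel size and pressure threshold below are illustrative assumptions:

```python
# Illustrative sketch (not a real driver): converting the raw ADC readings of
# a 4-wire resistive touch controller into screen coordinates.

ADC_MAX = 4095                  # 12-bit ADC full-scale reading (assumed)
SCREEN_W, SCREEN_H = 800, 480   # panel resolution in pixels (assumed)

def adc_to_screen(adc_x, adc_y):
    """Map raw ADC samples (0..ADC_MAX) to pixel coordinates."""
    x = adc_x * (SCREEN_W - 1) // ADC_MAX
    y = adc_y * (SCREEN_H - 1) // ADC_MAX
    return x, y

def is_pressed(adc_z, threshold=100):
    """A pressure reading above the noise floor means the layers are touching."""
    return adc_z > threshold
```

A real driver would also debounce, average several samples to reduce noise, and apply a calibration matrix, but the core idea is this linear mapping from voltage to position.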

Capacitive

Unlike resistive touch screens, capacitive screens do not need to be pressed to produce a change in electrical flow. Instead, they can detect any electrically charged object, including human skin, which is made up of positively and negatively charged particles. Capacitive screens use materials such as copper or indium tin oxide to store electrical charges in an electrostatic grid of tiny wires, each thinner than a human hair.
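In simplified terms, a capacitive controller keeps a baseline capacitance for each cell of the grid and looks for the cell whose measurement deviates most from it when a finger approaches. The sketch below assumes this model; the grid values and threshold are made up for illustration:

```python
# Toy sketch of how a controller might locate a finger on a capacitive grid:
# each cell has a baseline capacitance, a nearby finger shifts the measured
# value, and the cell with the largest deviation marks the touch point.

def find_touch(baseline, measured, threshold=5):
    """Return (row, col) of the strongest deviation, or None if all cells
    stay within the noise threshold."""
    best, best_delta = None, threshold
    for r, (brow, mrow) in enumerate(zip(baseline, measured)):
        for c, (b, m) in enumerate(zip(brow, mrow)):
            delta = abs(m - b)
            if delta > best_delta:
                best, best_delta = (r, c), delta
    return best
```

Real controllers interpolate between neighbouring cells to get sub-cell precision and track several maxima at once for multi-touch, but locating the strongest deviation from baseline is the starting point.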

Surface acoustic wave

Surface Acoustic Wave (SAW) is a cost-effective alternative to Projected Capacitive Touch (PCAP) technology. It offers high optical clarity, requires only a narrow border around the screen, and is well suited to applications with bezels.

Perceived Pixel

Research and development into new touch screen technologies is ongoing, although capacitive touch remains the predominant industry standard today. One of the main challenges is adapting touch screens to larger surfaces, since the electric fields in larger screens often interfere with their sensing ability.

At the software level: how the touch screen works

At the software level, the operation of a touch screen involves several steps:

  1. Touch detection: The touch screen software continuously monitors input data from the screen hardware to detect any contact. This involves tracking the touch points and movements made by fingers or a stylus on the touch surface.
  2. Data interpretation: Once a touch is detected, the software interprets the captured data to determine the precise location of the touch on the screen. This involves calculating the X and Y coordinates of the contact point and, if pressure sensing is supported, the intensity of the touch.
  3. Data conversion: After interpreting the data, the software converts the information into digital signals that the operating system and running applications understand. This converted data is used to generate the commands and touch events sent to applications based on user interactions.
  4. Execution of actions: Once the commands and touch events are generated, the software sends them to the relevant applications to perform the corresponding actions. This may include scrolling, selecting interface elements, zooming, and dragging and dropping, among other touch interactions.
  5. User interface: The software also updates the user interface to reflect touch interactions. This may include highlighting or giving focus to selected items, adjusting the size and position of on-screen elements, or displaying context menus based on the actions performed by the user.
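The five steps above can be sketched as a small pipeline. The event model here (raw samples as dictionaries, a callback standing in for the application layer) is a simplifying assumption, not a real operating system API:

```python
# Minimal sketch of the five software steps: detect, interpret, convert,
# dispatch, and let the application update its UI via the callback.

def run_pipeline(raw_samples, on_event):
    previous = None
    for sample in raw_samples:
        # 1. Touch detection: skip hardware samples with no contact.
        if not sample.get("touching"):
            previous = None
            continue
        # 2. Data interpretation: extract coordinates (and pressure if present).
        x, y = sample["x"], sample["y"]
        pressure = sample.get("pressure", 1.0)
        # 3. Data conversion: build an event the application layer understands.
        kind = "touch_move" if previous else "touch_down"
        event = {"type": kind, "x": x, "y": y, "pressure": pressure}
        # 4. Execution of actions: dispatch the event to the application.
        on_event(event)
        # 5. User interface: the callback is where the app redraws in response.
        previous = (x, y)
```

A first sample with contact produces a "touch_down" event, subsequent contact samples produce "touch_move", and a sample without contact resets the state so the next touch starts a new sequence.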