
Interactive electronic skin lights up when touched

The interactive electronic skin developed at UC Berkeley (Photo: Ali Javey/Chuan Wang, Berkeley)

The stereotype of the clumsy robot may soon become a thing of the past thanks to ongoing research at the University of California, Berkeley, where a team of engineers has created a thin, interactive sensor network that can be layered onto surfaces of virtually any shape. The device gives immediate visual feedback via an LED when touched, and could be used to create smart bandages that monitor a patient's vitals in real time, wallpaper that acts as a touchscreen, or even to give humanoid robots that elusive "human touch."

Full control of the human hand takes up a very large portion of our motor cortex, and the delicate balance of forces, pressures and coordinated movements required to write on a piece of paper, cross our fingers or hit a tennis ball is extremely tough to replicate in a lab.

Focusing on factors which used to be overlooked in the field of robotics (such as palm flexibility), researchers such as former tennis pro Yoky Matsuoka have gone a long way toward improving flexibility and range of motion in robotic hands. Making a robot more sensitive to the touch, however, is proving a harder challenge.

This is where Berkeley's research comes in. A team of engineers led by Prof. Ali Javey has built on a previous design of theirs to create a matrix of interactive pressure sensors that, when touched, immediately respond by lighting up with an intensity proportionate to the pressure being applied. Unlike your stiff iPhone touchscreen, this system is highly flexible and can be easily laminated onto any surface, no matter how geometrically complex (read: a robotic hand).

Schematic layout of a single pixel, consisting of a nanotube thin-film-transistor, organic LED, and a pressure sensor integrated vertically on a polyimide substrate (Image: Ali Javey/Chuan Wang, Berkeley)

Manufacturing the device was relatively straightforward. The researchers deposited a thin layer of polymer on top of a silicon wafer and then used standard semiconductor manufacturing techniques to layer in a transistor, an organic LED and a pressure sensor on top of each other. Finally, they simply peeled off the plastic from the silicon base, leaving a freestanding film with a sensor network embedded in it.

The result is a highly interactive matrix of 16 by 16 sensor pixels with a response time of just one millisecond, a hundred times faster than previous attempts.
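The behavior described above, a 16 × 16 grid where each touched pixel lights up in proportion to the applied pressure, can be sketched in a few lines of code. This is purely an illustrative model, not the researchers' actual readout scheme: the grid size comes from the article, but the linear pressure-to-brightness mapping and the normalized units are assumptions.

```python
import numpy as np

GRID = 16          # the e-skin described is a 16 x 16 pixel matrix
MAX_PRESSURE = 1.0 # hypothetical normalized full-scale pressure

def led_brightness(pressure_map):
    """Map each pixel's sensed pressure to an LED intensity in [0, 1].

    A linear transfer characteristic is assumed for illustration only;
    the device's real pressure-sensor response is not reproduced here.
    """
    pressure_map = np.clip(pressure_map, 0.0, MAX_PRESSURE)
    return pressure_map / MAX_PRESSURE

# A touch at pixel (4, 7): only that pixel lights, proportional to force,
# mirroring the article's "light is emitted only where the surface is touched".
frame = np.zeros((GRID, GRID))
frame[4, 7] = 0.6
print(led_brightness(frame)[4, 7])  # 0.6
```

The key point the sketch captures is locality: brightness is computed per pixel, so untouched pixels stay dark while a pressed pixel's output scales with force.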

"The electronic components are all vertically integrated, which is a fairly sophisticated system to put onto a relatively cheap piece of plastic," says Javey. "What makes this technology potentially easy to commercialize is that the process meshes well with existing semiconductor machinery."

Currently, the engineers are working on a more advanced version of the sensors that can respond to temperature and light as well as touch. A paper on the research was recently published in the journal Nature Materials.

The video below shows the e-skin in action.

Source: University of California, Berkeley

Interactive E-Skin Developed at UC Berkeley
