Robots with Feelings

July 21, 2013 — A new milestone from engineers at UC Berkeley could help robots become more touchy-feely, literally.

A research team led by Ali Javey, UC Berkeley associate professor of electrical engineering and computer sciences, has created the first user-interactive sensor network on flexible plastic. The new electronic skin, or e-skin, responds to touch by instantly lighting up. The more intense the pressure, the brighter the light it emits.

"We are not just making devices; we are building systems," said Javey, who also has an appointment as a faculty scientist at the Lawrence Berkeley National Laboratory. "With the interactive e-skin, we have demonstrated an elegant system on plastic that can be wrapped around different objects to enable a new form of human-machine interfacing."
Well, science has certainly done it again. Now that robots can potentially feel objects and actions in much the same way humans do, we've engineered a way to hurt them. That's right: science is giving us a way to defend ourselves against robots by making them feel.

OK, that's a little ridiculous, but even so, the concept is fantastic. There are plenty of applications for this that go beyond robots. A thin sensor layer that gives visible feedback when you touch an object could also lead to more integrated touch pads, and even to interfaces built into buildings themselves.

I can't picture all of the applications of this technology yet, but I suspect it will have repercussions well beyond its intended use. Keep your eyes out for robots wearing these, and feel free to poke them like you would a friend with a sunburn. They can't really feel it... Can they?