From Corduroy to Clicks: Bringing the Real Sense of Touch to the Touchscreen

Touchscreens today offer beautiful, high-resolution graphics that respond instantly to the touch of a finger.  But they offer little to feel beyond a cold, hard, flat piece of glass.  Now imagine a touchscreen that goes one step further, one that gives tactile feedback: the sense of actually feeling what you are touching.  New research and an even newer private-sector initiative are making it happen, and we believe the results will be transformative.

But first, it is worth considering why you should even care about feeling the images on a touchscreen.

Most of us have had the mildly unsettling experience of being blindfolded or of wearing noise-cancelling headphones with no music, but few of us have ever been deprived, even for a moment, of the sense of touch.  Imagine the dentist’s Novocain applied to your fingertips.  What a strange lack of sensation that would be!

As it turns out, this has been done in the lab, and the results are much more than mildly unsettling:  they are totally debilitating.  We live in a physical world and we need touch to interact with the world:  to type, hold a spoon, write with a pen, button a button, handle coins, play tiddlywinks!  The list is endless.

Yet, we also live in the virtual world, and there we seem to do just fine without touch.  To the best of our knowledge, no one has ever run the Novocain experiment, but in all likelihood you could use a touchscreen just fine with a numbed-up finger.  Of course, you would need to keep your eyes on that screen at all times … but, let’s face it:  most of us do exactly that.

It doesn’t need to be that way.  Imagine feeling the keys on a keyboard, the clicks of a knob or a toggle, the fabric of a dress.  The touchscreen is crazy powerful because it presents us with a software-defined interface, and therefore it is infinitely flexible.  If only feel could also be software-defined.

Software-defined feel, also known as “haptics,” turns out to be a hard problem.  It is even harder if the thing you are touching must also be a high-resolution display with no moving parts.  How do you turn a lifeless piece of glass into an infinitely programmable haptic display?

There is a way, and it is pretty fascinating.  When a finger slides along a surface, much of what it feels is due to friction, the force that resists the finger’s motion.  Variations in friction are experienced as texture, shape, and events such as the click of a switch.  Moreover, friction can be modulated by generating an electric field between the skin and the surface.  It is basically the same physical effect as static cling, but the trick is to be able to control the strength of the friction very accurately and to be able to change it very, very fast.
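The friction-modulation idea can be put in rough numbers.  A common first-order model (an assumption here, not the Tanvas implementation) treats the fingertip and the conductive layer under the glass as a parallel-plate capacitor: the applied voltage pulls the skin toward the surface, and ordinary Coulomb friction scales with that extra normal force.  All of the geometry and material values below are illustrative guesses.

```python
# Sketch of the electrovibration model described above.  The finger pad and
# the screen's conductive layer are treated as a parallel-plate capacitor
# separated by a thin insulating layer; an applied voltage adds an
# electrostatic attraction, which in turn raises sliding friction.
# Every numeric value here is an illustrative assumption.

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def electrostatic_force(voltage, area=1e-4, gap=50e-6, eps_r=2.0):
    """Attractive force (N) for a finger-sized contact patch (area, m^2)
    across an insulating gap (m) with relative permittivity eps_r."""
    return EPS0 * eps_r * area * voltage**2 / (2 * gap**2)

def sliding_friction(normal_force, voltage, mu=0.5):
    """Total sliding friction (N): coefficient of friction times the sum of
    the finger's pressing force and the voltage-controlled attraction."""
    return mu * (normal_force + electrostatic_force(voltage))

baseline  = sliding_friction(0.5, voltage=0)    # field off
modulated = sliding_friction(0.5, voltage=200)  # field on
print(f"baseline {baseline*1e3:.1f} mN, modulated {modulated*1e3:.1f} mN")
```

Because the force scales with the square of the voltage, switching the voltage on and off at kilohertz rates as the finger moves lets software “paint” stronger and weaker friction across the glass, which is exactly the accurate, very fast control the paragraph above calls for.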

Variable friction technology was developed over the past decade in the Neuroscience and Robotics Laboratory at Northwestern University’s School of Engineering, where one research thrust is active sensing via touch.  The work began when a graduate student noticed that the tip of an ultrasonic spot remover (a now-defunct product, supplanted by “Tide to Go”) felt oddly slippery.  It turns out that a rapidly vibrating surface is in fact more slippery than the same surface when it is not vibrating.

Creating the vibrations costs a lot of energy, however, which motivated us to find another approach.  It turned out that an approach had already been discovered … in 1876.  That was the year that the Chicago inventor Elisha Gray (who famously submitted his patent for the telephone about an hour after Alexander Graham Bell) first noticed vibrations produced by his finger sliding across a ceramic-coated bathtub with an AC voltage applied.  We took this kernel of an idea and converted it into something that would actually work on a touchscreen.

The technology was eventually licensed to the startup company Tanvas, Inc., which was created specifically to commercialize haptic touchscreens and trackpads.

At the 2017 Consumer Electronics Show, the company unveiled a tablet featuring TanvasTouch technology.  Visitors were able to zip a virtual zipper, feel corduroy and moleskin swatches, control smartphone and automobile functions with haptic feedback, pluck the strings and feel the frets of a virtual guitar, and feel the ripples in a Zen-like pond.

Today, Tanvas is focused on bringing this technology to market via partnerships in the automotive, consumer electronics, and gaming markets.  The earliest that TanvasTouch will be in an auto dealer or a Best Buy near you is probably 2019, but all indications are that it will be worth the wait!

