From Corduroy to Clicks: Bringing the Real Sense of Touch to the Touchscreen

Touchscreens today offer beautiful, high-resolution graphics that respond instantly to the touch of a finger.  But they offer little to feel beyond a cold, hard, flat piece of glass.  Now imagine a touchscreen that goes one step further, one that gives tactile feedback: the sense of actually feeling what you are touching.  New research and an even newer private-sector initiative are making it happen, and we believe the results will be transformative.

But first, it is worth considering why you should even care about feeling the images on a touchscreen.

Most of us have had the mildly unsettling experience of being blindfolded or of wearing noise-cancelling headphones with no music, but few of us have ever been deprived, even for a moment, of the sense of touch.  Imagine the dentist’s Novocain applied to your fingertips.  What a strange lack of sensation that would be!

As it turns out, this has been done in the lab, and the results are much more than mildly unsettling:  they are totally debilitating.  We live in a physical world and we need touch to interact with the world:  to type, hold a spoon, write with a pen, button a button, handle coins, play tiddlywinks!  The list is endless.

Yet we also live in the virtual world, and there we seem to do just fine without touch.  To the best of our knowledge, no one has ever run the Novocain experiment, but in all likelihood you could use a touchscreen just fine with a numbed-up finger.  Of course, you would need to keep your eyes on that screen at all times … but, let’s face it:  most of us do exactly that.

It doesn’t need to be that way.  Imagine feeling the keys on a keyboard, the clicks of a knob or a toggle, the fabric of a dress.  The touchscreen is crazy powerful because it presents us with a software-defined interface, and is therefore infinitely flexible.  If only feel could also be software-defined.

Software-defined feel, also known as “haptics,” turns out to be a hard problem.  It is even harder if the thing you are touching must also be a high-resolution display with no moving parts.  How do you turn a lifeless piece of glass into an infinitely programmable haptic display?

There is a way, and it is pretty fascinating.  When a finger slides along a surface, much of what it feels is due to friction, the force that resists the finger’s motion.  Variations in friction are experienced as texture, shape, and events such as the click of a switch.  Moreover, friction can be modulated by generating an electric field between the skin and the surface: the field pulls the fingertip against the glass, increasing the resistance to sliding.  It is basically the same physical effect as static cling, but the trick is to control the strength of the friction very accurately and to change it very, very fast.
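To get a feel for the physics, the electrostatic pull between finger and screen can be crudely modeled as a parallel-plate attraction that grows with the square of the applied voltage; sliding friction then scales with the total normal force.  This is only an illustrative back-of-the-envelope sketch, not Tanvas’s actual design: the material constants, contact geometry, and voltages below are all assumed, round-number placeholders.

```python
# Toy model of electroadhesive friction modulation (illustrative only).
# An applied voltage V across a thin insulating layer adds an extra
# electrostatic normal force on the fingertip, roughly
#   F_es ~ eps0 * eps_r * A * V^2 / (2 * d^2)   (parallel-plate attraction),
# and friction multiplies the total normal force:
#   F_friction = mu * (F_press + F_es).
# All numeric values are assumed placeholders, not measured parameters.

EPS0 = 8.854e-12   # vacuum permittivity, F/m
EPS_R = 3.0        # assumed relative permittivity of the insulating layer
AREA = 1.0e-4      # assumed finger contact area, m^2 (about 1 cm^2)
GAP = 1.0e-5       # assumed effective insulator thickness, m (10 microns)
MU = 0.5           # assumed finger-on-glass friction coefficient
F_PRESS = 0.5      # assumed force of the finger pressing down, N

def electrostatic_force(volts: float) -> float:
    """Attractive force pulling the fingertip onto the screen, in newtons."""
    return EPS0 * EPS_R * AREA * volts**2 / (2 * GAP**2)

def friction_force(volts: float) -> float:
    """Resistive force felt by a sliding finger, in newtons."""
    return MU * (F_PRESS + electrostatic_force(volts))

# Sweeping the voltage sweeps the felt friction; switching it on and off
# thousands of times a second as the finger moves paints textures and clicks.
for v in (0.0, 50.0, 100.0):
    print(f"{v:5.1f} V -> friction {friction_force(v):.3f} N")
```

The key property the sketch captures is the one the paragraph above describes: friction is a smooth, fast function of a purely electrical input, so software can redraw the “feel” of the glass as quickly as it redraws the pixels.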

Variable-friction technology was developed over the past decade in our lab at the Neuroscience and Robotics Laboratory at Northwestern University’s School of Engineering, where one research thrust is active sensing via touch.  The work began when a graduate student noticed that the tip of an ultrasonic spot remover (a now-defunct product, supplanted by “Tide to Go”) felt oddly slippery.  It turns out that a rapidly vibrating surface is in fact more slippery than the same surface when it is not vibrating.

Creating the vibrations costs a lot of energy, however, which motivated us to find another approach.  It turned out that an approach had already been discovered … in 1876.  That was the year that the Chicago inventor Elisha Gray (who famously submitted his patent for the telephone about an hour after Alexander Graham Bell) first noticed vibrations produced by his finger sliding across a ceramic-coated bathtub with an AC voltage applied.  We took this kernel of an idea and converted it into something that would actually work on a touchscreen.

The technology was eventually licensed to the startup company Tanvas, Inc., which was created specifically to commercialize haptic touch screens and trackpads.

At the 2017 Consumer Electronics Show, the company unveiled a tablet featuring TanvasTouch technology.  Visitors were able to zip a virtual zipper, feel corduroy and moleskin swatches, control smartphone and automobile functions with haptic feedback, pluck the strings and feel the frets of a virtual guitar, and feel the ripples in a Zen-like pond.

Today, Tanvas is focused on bringing this technology to market via partnerships in the automotive, consumer electronics, and gaming markets.  The earliest that TanvasTouch will reach an auto dealer or a Best Buy near you is probably 2019, but all indications are that it will be worth the wait!
