From Corduroy to Clicks: Bringing the Real Sense of Touch to the Touchscreen

Touchscreens today offer beautiful, high-resolution graphics that respond instantly to the touch of a finger.  But they offer little to feel beyond a cold, hard, flat piece of glass.  Now imagine a touchscreen that goes one step further, one that gives tactile feedback: the sense of actually feeling what you are touching.  New research and an even newer private-sector initiative are making it happen, and we believe the results will be transformative.

But first, it is worth considering why you should even care about feeling the images on a touchscreen.

Most of us have had the mildly unsettling experience of being blindfolded or of wearing noise-cancelling headphones with no music, but few of us have ever been deprived, even for a moment, of the sense of touch.  Imagine the dentist’s Novocain applied to your fingertips.  What a strange lack-of-sensation that would be!

As it turns out, this has been done in the lab, and the results are much more than mildly unsettling:  they are totally debilitating.  We live in a physical world and we need touch to interact with the world:  to type, hold a spoon, write with a pen, button a button, handle coins, play tiddlywinks!  The list is endless.

Yet we also live in the virtual world, and there we seem to do just fine without touch.  To the best of our knowledge, no one has ever run the Novocain experiment, but in all likelihood you could use a touchscreen just fine with a numbed-up finger.  Of course, you would need to keep your eyes on that screen at all times … but, let’s face it:  most of us do exactly that.

It doesn’t need to be that way.  Imagine feeling the keys on a keyboard, the clicks of a knob or a toggle, the fabric of a dress.  The touchscreen is remarkably powerful because it presents us with a software-defined interface, and therefore it is infinitely flexible.  If only feel could also be software-defined.

Software-defined feel – also known as “haptics” – turns out to be a hard problem.  It is even harder if the thing you are touching should also be a high-resolution display with no moving parts.  How do you turn a lifeless piece of glass into an infinitely programmable haptic display?

There is a way, and it is pretty fascinating.  When a finger slides along a surface, much of what it feels is due to friction, the force that resists the finger’s motion.  Variations in friction are experienced as texture, shape, and events such as the click of a switch.  Moreover, friction can be modulated by generating an electric field between the skin and surface.  It is basically the same physical effect as static cling, but the trick is to be able to control the strength of the friction very accurately and to be able to change it very, very fast.
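To make the friction-modulation idea concrete, here is a minimal sketch of the underlying physics.  It treats the fingertip and the screen’s conductive layer as a parallel-plate capacitor: the applied voltage pulls the skin toward the glass, and that extra attraction adds to the normal load, raising sliding friction.  This is a common first-order model of electrostatic attraction, not Tanvas’s actual design, and every numeric parameter below is an assumed, order-of-magnitude placeholder.

```python
# Illustrative model of friction modulation by an applied voltage.
# All constants are assumptions for illustration only.

EPS0 = 8.854e-12   # vacuum permittivity, F/m
EPS_R = 3.0        # assumed relative permittivity of the insulating coating
AREA = 1.0e-4      # assumed fingertip contact area, m^2 (~1 cm^2)
GAP = 1.0e-5       # assumed effective dielectric thickness, m
MU = 0.5           # assumed finger-on-glass friction coefficient

def electrostatic_force(voltage: float) -> float:
    """Parallel-plate attraction (N) pulling the finger toward the surface."""
    return EPS0 * EPS_R * AREA * voltage ** 2 / (2.0 * GAP ** 2)

def friction_force(normal_force: float, voltage: float) -> float:
    """Coulomb friction, with the electrostatic attraction added to the load."""
    return MU * (normal_force + electrostatic_force(voltage))

if __name__ == "__main__":
    press = 0.5  # N, a light fingertip press (assumed)
    for v in (0.0, 50.0, 100.0):
        print(f"{v:5.0f} V -> friction {friction_force(press, v):.3f} N")
```

Because the attraction scales with the square of the voltage, rapidly switching the drive signal on and off lets software sweep friction up and down fast enough to paint textures and clicks under a sliding finger.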

Variable friction technology was developed over the past decade in our lab at the Neuroscience and Robotics Laboratory at Northwestern University’s School of Engineering, where one research thrust is active sensing via touch.  The work began when a graduate student noticed that the tip of an ultrasonic spot remover (a now-defunct product, supplanted by “Tide to Go”) felt oddly slippery.  It turns out that a rapidly vibrating surface is in fact more slippery than the same surface when it is not vibrating.

Creating the vibrations costs a lot of energy, however, which motivated us to find another approach.  It turned out that an approach had already been discovered … in 1876.  That was the year that the Chicago inventor Elisha Gray (who famously submitted his patent for the telephone about an hour after Alexander Graham Bell) first noticed vibrations produced by his finger sliding across a ceramic-coated bathtub with an AC voltage applied.  We took this kernel of an idea and converted it into something that would actually work on a touchscreen.

The technology was eventually licensed to the startup company Tanvas, Inc., which was created specifically to commercialize haptic touch screens and trackpads.

At the 2017 Consumer Electronics Show, the company unveiled a tablet featuring TanvasTouch technology.  Visitors were able to zip a virtual zipper, feel corduroy and moleskin swatches, control smartphone and automobile functions with haptic feedback, pluck the strings and feel the frets of a virtual guitar, and feel the ripples in a Zen-like pond.

Today, Tanvas is focused on bringing this technology to market via partnerships in the automotive, consumer electronics, and gaming markets.  The earliest that TanvasTouch will be in an auto dealer or a Best Buy near you is probably 2019, but all indications are that it will be worth the wait!



