Science AMA Series: We are SynTouch, engineers making tactile sensors that rival the human fingertip. We capture the sense of Touch, and use it to quantify product feel, build adaptive robotic and prosthetic hands, and drive VR haptic displays. AMA!

Abstract

We are SynTouch: the world leader in the technology of human touch. We invented the only sensor in the world that endows machines with the ability to replicate the human sense of touch. We call this emerging field Machine Touch. Like machine vision, it requires a combination of sensors and algorithms to capture a human sense and let us do useful things with the tactile information.

One core application of our technology is quantifying the dimensions of touch - we’ve created a taxonomy called the SynTouch Standard® that consists of fifteen dimensions of what humans feel. The information is captured by our BioTac Toccare®, which automakers, apparel makers, and consumer electronics companies use to define and improve the haptics of their products. Just as digital color meters capture RGB values and drive manufacturing decisions to ensure products ‘look right’, our technology provides the information to ensure they ‘feel right’.

Our technology also functions as the input for haptic displays for VR and telerobotics. This allows us to drive haptic displays with real-world data for anything from a surgical robot to a gaming device – and we’ve worked with both!

We’re also pursuing long-term projects to command robotic hands with tactile sense and reflexes. Our sensors allow robot hands to handle fragile objects better than currently available systems – one prime use case we’re pursuing now is deploying this technology in prosthetics to allow amputees to handle fragile objects without dropping or crushing them.

SynTouch was founded in 2008 by Professor Gerald Loeb and Ph.D. students Matthew Borzage, Jeremy Fishel, and Nicholas Wettels, all then at the University of Southern California. We’ve been recognized by Popular Mechanics, The Robot Report, and the World Economic Forum…

Happy to answer more questions, but we're getting busier with foot traffic right now.

Thank you for your interest!

In your opinion(s), how far are we away from having affordable, high-tech prosthetic arms? (Like Deus Ex, for example)

Purpatraitor

That's a tough question to give a good answer to. I hesitate to speculate incorrectly, but I'll give you my perspective as best I can.

There are a few excellent questions embedded here: one is cost, one is 'high tech', and the third is 'when?'

Costs are currently high because advanced prosthetics are made in relatively small volumes. We're in the fortunate situation of not having as many upper-limb amputees as would be required for this to be considered a large market, so traditional manufacturing will continue to be accompanied by high costs. If new manufacturing practices allow small-volume production to be done at low cost, that is the most likely way prices will come down. This is ongoing and has great synergy with the changes in manufacturing that we see happening right now.

'High tech' is a bit harder to see a direct path on. There are a number of arms out there that go above and beyond the traditional one-degree-of-freedom (1-DOF) powered hands. These are quite expensive, and the user interface can quickly become a barrier to giving an amputee the full advantage of their complexity. I truly hope these barriers can be overcome, and we're supplying sensors to a number of teams that are working on this.

SynTouch has taken a different approach: we've integrated sensors onto the basic 1-DOF hands that are the most commonly used powered prosthetics, and incorporated reflexes into them so they perform more like the way you expect your hand to behave. We were awarded a breakthrough technology award from Popular Mechanics for this work.

We also received funding from the Congressionally Directed Medical Research Programs (CDMRP) to create a batch of these modified hands and give them to amputee veterans to take home. We're going to find out whether the technology helps them be more confident with their hands, or find the hands less mentally taxing to use. If this is successful, we should be able to make the case that more veterans should have these hands.

There's a ton of good progress, but 'When?' is still a hard question.


Could you please elaborate on the 15 dimensions of touch?

Razasaza

The fifteen dimensions of touch are to feel what RGB values are to color. The SynTouch Standard comprises the fifteen dimensions that quantify how any material feels to a human.

There are a lot of measurement tools that analyze material properties, but their readings aren't well correlated with human perception - and there are some interesting reasons why that is.

In order to measure them you need a sensor with the same sensory capabilities and mechanical features as the human finger (i.e., the BioTac), a robot capable of making the same exploratory movements as a human, and processing software that emulates the signal processing of the brain. We built an instrument - the BioTac Toccare - that includes all of these key elements.

The resulting instrument provides fifteen standard dimensions that fall into five subgroups - texture (micro and macro), friction, compliance, thermal, and adhesion. You can read more about them here, at the bottom of the page.
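To make the structure concrete, here is a minimal sketch in Python - not SynTouch's actual data model - of how a fifteen-dimension feel profile grouped into those five subgroups might be represented. The individual dimension names below are hypothetical placeholders; only the subgroup names and the count of fifteen come from the description above.

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical layout: fifteen placeholder dimensions split across the five
# subgroups named above (texture, friction, compliance, thermal, adhesion).
SUBGROUP_DIMS: Dict[str, List[str]] = {
    "texture":    ["micro_roughness", "micro_coarseness", "macro_roughness", "macro_coarseness"],
    "friction":   ["sliding_resistance", "stiction", "stick_slip"],
    "compliance": ["deformation", "local_compliance", "relaxation"],
    "thermal":    ["thermal_cooling", "thermal_persistence"],
    "adhesion":   ["tackiness", "adhesive_force", "release"],
}
DIMENSION_ORDER = [d for dims in SUBGROUP_DIMS.values() for d in dims]
assert len(DIMENSION_ORDER) == 15  # fifteen dimensions in total

@dataclass
class FeelProfile:
    """One measured material: a score per dimension, like an RGB triple for color."""
    material: str
    scores: Dict[str, float]  # dimension name -> normalized score

    def vector(self) -> List[float]:
        """Flatten to a fixed-order 15-element vector for comparison or storage."""
        return [self.scores[d] for d in DIMENSION_ORDER]
```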

Thanks for the good question!


This AMA is being permanently archived by The Winnower, a publishing platform that offers traditional scholarly publishing tools to traditional and non-traditional scholarly outputs—because scholarly communication doesn’t just happen in journals.

To cite this AMA please use: https://doi.org/10.15200/winn.150357.79071

You can learn more and start contributing at authorea.com

redditWinnower


Thanks for being here today! What considerations go into designing technology that "feels right" and which are the greatest challenges?

p1percub

Designing technology that "feels right" - do you mean tactile/haptic feedback, tactors, etc., i.e. technology that is intended to literally recreate the sensations of touch?

The biggest consideration is how exactly we need to reproduce sensations to make them plausible for a given application. Low-fidelity black-and-white video was sufficient to sell millions of televisions. AM and FM radio music became ubiquitous far before we had perfect digital recordings and high-fidelity speakers. We still don't do a perfect job of reproducing video or audio, but we've learned enough about making displays and enough about the human nervous system to create lossy audio and video compression whose losses are imperceptible to humans, 3D displays, and other plausible sensory experiences.

We're just scratching the surface of doing the same for the world of touch. Do we need to play audio of a button click when you tap on your phone screen? Early smartphones did this to make people feel like there were 'buttons' on the screen. Can we use a limited number of vibrations to imitate a rich palette of textures? There is excellent work happening in this field right now to address these questions, and many teams (us included) are thinking about which applications can be addressed with current technology, and where we need to make something that feels closer to the real thing.
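As a purely conceptual sketch of the "limited number of vibrations" idea - this is not SynTouch's rendering pipeline, and the constants are illustrative assumptions - one common approach in vibrotactile rendering is to drive an actuator with a carrier vibration whose amplitude scales with measured roughness and with how fast the finger is sliding:

```python
import math

def vibration_sample(t: float, roughness: float, slide_speed: float,
                     carrier_hz: float = 250.0) -> float:
    """Actuator drive value at time t (seconds), normalized to [-1, 1].

    The 250 Hz carrier sits near the peak sensitivity of the skin's
    vibration-sensing (Pacinian) afferents; rougher surfaces and faster
    sliding produce a stronger buzz.
    """
    amplitude = min(1.0, roughness * slide_speed)
    return amplitude * math.sin(2.0 * math.pi * carrier_hz * t)

# Example: a 10 ms burst sampled at 8 kHz for a moderately rough surface.
burst = [vibration_sample(n / 8000.0, roughness=0.6, slide_speed=0.9)
         for n in range(80)]
```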


How small have you miniaturized your product?
Like, will we see robots with SynTouch pads on their manipulators anytime soon?

piedpipernyc

Regarding size:

We've designed miniaturized versions of the BioTac that would work for surgical robots, but we haven't built these - the effort to redesign them is substantial. BioTac sensors are designed to mimic the skin of your fingertip (glabrous skin), and we know that the size, shape, and mechanical features of the fingertip play a critical role in how the skin senses interaction with materials.

You may indeed start to see robots with our sensors on them - we've integrated with over a dozen of the best research platforms and have made our sensors available to countless academic and industrial labs.


Hi All, thanks for doing this AMA! Two Questions if that is ok!

  1. Does the current product allow for the sensation of temperature with the touch? (is this one of the 15 dimensions)

  2. As VR gaming improves, have you looked into the possibility of using this technology to create gloves, sleeves, or other things for VR games that would use your touch algorithms to create the in-game sensation on the gamer's actual body?

Thisisntmyaccount24

Regarding temperature: humans don't actually feel temperature; we feel the exchange of heat with objects.

This allows a fun parlor trick (fun for scientists?): if you ask people to touch a metal table leg and the wooden top of the table, they almost always describe the metal as 'cold' and the wood as 'warmer'... a thermometer would tell you they're wrong - both are at room temperature.

So why the difference? The metal conducts heat away from your body better and has a larger thermal mass, so you feel the heat of your hand decreasing and call that 'cold'.

So we actually have two dimensions of temperature: a transient effect and a persistent one. The BioTac Toccare's sensing head heats up to body temperature using resistive elements, and thermistors in it allow us to measure the rate of heat exchange. This lets us measure something - say, a leather jacket - and show that while it will feel cool when you first put it on, after a few minutes it will feel warm.
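A back-of-the-envelope illustration of that table-leg effect (this is not the BioTac's algorithm, and the material properties are rough textbook values): when skin touches an object, the interface temperature is weighted by each side's thermal effusivity e = sqrt(k·ρ·c), so the material that pulls heat away faster dominates.

```python
from math import sqrt

def effusivity(k: float, rho: float, c: float) -> float:
    """Thermal effusivity: sqrt(conductivity * density * specific heat)."""
    return sqrt(k * rho * c)

def contact_temp(t_skin: float, e_skin: float, t_obj: float, e_obj: float) -> float:
    """Interface temperature when two semi-infinite bodies first touch."""
    return (e_skin * t_skin + e_obj * t_obj) / (e_skin + e_obj)

e_skin = effusivity(k=0.37, rho=1100, c=3400)   # human skin, approximate
e_alu  = effusivity(k=237.0, rho=2700, c=900)   # aluminum table leg
e_wood = effusivity(k=0.16, rho=700, c=1700)    # wooden table top

print(contact_temp(33.0, e_skin, 21.0, e_alu))   # ~21.6 C: reads as "cold"
print(contact_temp(33.0, e_skin, 21.0, e_wood))  # ~29.8 C: reads as "warm"
```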

VR and telepresence are huge potential fields for us. We've created the machine equivalent of the finger - analogous to inventing the camera or the microphone for the eye or ear. Now we need excellent ways to play back those sensations to humans - the equivalent of the monitor or the speaker. In the world of touch those go by several names: tactors, haptic displays, haptic feedback devices, etc. However, our technology for acquiring tactile information is more refined than the technology for displaying it.

We think these teams are going to be successful though, and we're working with a number of them. When they're ready we have an entire library of literal textures for the VR gaming world to use.


Thank you for doing this AMA!

I can imagine there are a lot of different potential applications for this technology. Which one(s) is/are the most exciting or personally motivating for you? What inspired you to explore this area of research?

neurobeegirl

Thanks for your interest!

The technology was originally conceived at the DARPA revolutionizing prosthetics kickoff meeting - almost a decade ago. The task at hand (sorry) was to perfectly recreate the human arm for veteran amputees. One of the components of the challenge was to make sensors capable of measuring everything humans could feel.

There are a ton of application areas - it's a bit like we invented the camera or the microphone and now have many fields we can go into: prosthetics, autonomous robotic manipulation, social touch & telepresence, product design, etc.

One of the things I find personally fascinating is that we can now measure the entire world of materials. I'd like to complete a library of materials where you could look up anything and find materials that feel similar on the fifteen dimensions. Unlike color (which is essentially a continuum), I think we'll find areas in the dimensions that are empty ('here be dragons'), meaning nothing yet exists with that combination of traits. If we do, then maybe some talented materials scientist can invent a material that feels unlike anything we've felt before.
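As a sketch of what that lookup could look like (the library contents and scores below are made up for illustration), each material becomes a point in fifteen-dimensional space and "feels similar" becomes a nearest-neighbor query:

```python
import random
from math import dist  # Python 3.8+

random.seed(0)
# Toy library: each material gets a made-up 15-dimension profile in [0, 1].
library = {name: [random.random() for _ in range(15)]
           for name in ["denim", "cast aluminum", "nubuck leather", "silicone pad"]}

def feels_like(query, top_n=3):
    """Rank library materials by Euclidean distance to the query profile."""
    ranked = sorted(library.items(), key=lambda kv: dist(query, kv[1]))
    return [(name, round(dist(query, scores), 3)) for name, scores in ranked[:top_n]]

print(feels_like(library["denim"]))  # denim itself ranks first, at distance 0.0
```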

Thanks for the good questions!


Thanks for coming to talk with us!

How big a challenge is it to make sure that prosthetics don't apply too much force in ways that might break things or hurt someone? Is this easy, or is it a hard problem?

asbruckman

Very insightful - to expand on your question, we found that what amputees wanted most from their hands was to be able to handle objects without dropping or breaking them. Imagine watching your hand every time you grasped or held something, and still dropping objects despite the attention. It's mentally taxing and makes you reluctant to use your hand.

... and the sense of touch is critical if that's what you want your hands to do.

Essentially, we recreated the same tactile reflexes you expect to be present in your spinal cord - reflexes that modulate your intent to close and grasp with your hand. We've shown that this makes you quite capable of grabbing objects without dropping or breaking them.
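For a feel of what such a reflex looks like in code - this is a conceptual sketch with made-up thresholds and gains, not SynTouch's actual controller - a fast local loop can nudge the commanded grip force upward the moment the tactile sensor reports incipient slip, without waiting for the user to notice:

```python
def reflexive_grip(user_command: float, slip_detected: bool, current_force: float,
                   slip_gain: float = 0.15, max_force: float = 1.0) -> float:
    """Return the next grip-force setpoint, normalized to 0..1."""
    setpoint = max(user_command, current_force)  # ratchet: don't relax mid-grasp
    if slip_detected:
        setpoint += slip_gain                    # reflex: squeeze harder on slip
    return min(setpoint, max_force)

# Example: a light grasp plus a detected micro-slip bumps the setpoint from 0.3 to about 0.45.
print(reflexive_grip(user_command=0.3, slip_detected=True, current_force=0.3))
```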

We'll be giving this capability to a limited number of veterans in a study sponsored by the CDMRP, and we'll find out whether it works in their homes as well as it does in our lab.


What is currently your biggest hurdle for interfacing with severed motor and sensory nerves in amputees?

Skrute

We actually don't have any hurdles with this, because we aren't actively working on motor/sensory nerve interfaces ourselves. However, our clients are.

This is fantastically difficult research - there are issues with biocompatibility, damage to the nerves, etc. that must be solved just to get to the probably harder task of understanding which nerves you're interfaced with. It's not my area of research, though, so perhaps someone else can give the latest?

Two of the interesting uses for this class of interface are cochlear implants and retinal implants. In those applications the interfaces are a bit simpler (okay, still complex) because there's such a strong relationship between the stimulation location and the frequency of the evoked sound or the position in the stimulated image.



License

This article and its reviews are distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and redistribution in any medium, provided that the original author and source are credited.