Interactive touchless technologies are about to have their renaissance moment. We’re keen to understand how this technology is transforming the way people interact at conferences, sales, and training during the time of social distancing.

By now we all know the importance of washing our hands, wearing a mask, and maintaining physical distance in well-ventilated areas. Once conferences and in-person meetings begin to come back, a surge in the use of touchless technologies will be inevitable. While touching a screen will be unpalatable, people will still want to interact.

We spoke with SVP of Creative & Medical Science Kevin Millar and VP of Creative Innovation James Hackett about what the near-future holds across the touchless tech landscape.

Let’s go!


Touchless technologies that we’re all comfortable with already exist in many places, such as the automatic doors at the supermarket, or automatic faucets and hand dryers in public restrooms. From a User Experience standpoint, we aim to create interactions that are as simple and intuitive as those already in use.

Simplifying new interactions sometimes means leveraging controls that are already well known. BYOD, or Bring Your Own Device, works by turning your phone into a controller or input device, similar to placing a Starbucks order on your mobile device instead of interacting with the store’s cashier and debit machine. In real-life spaces with physical installations, INVIVO is exploring how to transfer as many interactions as possible to the user’s own mobile device – leveraging familiar interactions while keeping people at a safe distance.

An example of BYOD: using a phone to manipulate a screen.
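In code, the BYOD pattern amounts to a small command loop: the phone sends a command over the local network, and the booth display applies it. A minimal sketch, assuming a tiny `next`/`prev` vocabulary; the `BoothController` class and its names are invented for illustration, not a real API:

```python
class BoothController:
    """Applies simple commands received from a visitor's phone to booth content."""

    def __init__(self, slide_count: int):
        self.slide_count = slide_count
        self.current = 0  # index of the slide currently shown

    def handle(self, command: str) -> int:
        # A tiny, unambiguous command vocabulary keeps the phone-side UI simple.
        if command == "next":
            self.current = min(self.current + 1, self.slide_count - 1)
        elif command == "prev":
            self.current = max(self.current - 1, 0)
        return self.current


controller = BoothController(slide_count=5)
for cmd in ["next", "next", "prev"]:
    controller.handle(cmd)
print(controller.current)  # 1
```

In practice the commands would arrive over a web socket or HTTP from the page the visitor opened on their phone; the controller logic stays this simple.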

In the pandemic and—hopefully soon—post-pandemic world, touchless tech will find its way into all industries, not just healthcare. Brick-and-mortar stores are not going away, and in-person physician appointments cannot be totally replaced by virtual consultations. But we may find that some touch-based interfaces, like the menu at McDonald’s, won’t be nearly as popular anymore. Touchless kiosks, where hand gestures replace tapping a screen, may quickly begin to overtake large touch screens.

In some cases, touchless simply means using our voices, like when we place an order at a drive-through. Automation is replacing many instances of customer service, and the pandemic may help to accelerate that. However, voice and lip-sync technologies are likely still further off, given the challenges of local accents and the diversity of languages spoken in large cities. Interfaces often activate using only a limited vocabulary to prevent frustrating errors. Even so, as devices like Amazon Alexa, Google Home, Apple’s Siri, and Microsoft’s Cortana become increasingly popular, voice recognition will continue to advance.
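The limited-vocabulary principle is easy to sketch: match a transcript against a small command set and reject anything ambiguous rather than guess. A minimal illustration using Python's standard-library `difflib`; the commands and cutoff value are assumptions:

```python
import difflib

# A deliberately small command vocabulary; anything outside it is rejected
# rather than guessed at. The commands here are invented for illustration.
COMMANDS = ["order", "repeat order", "cancel", "checkout"]

def interpret(transcript: str, cutoff: float = 0.6):
    """Return the closest known command, or None if nothing matches well."""
    matches = difflib.get_close_matches(transcript.lower(), COMMANDS,
                                        n=1, cutoff=cutoff)
    return matches[0] if matches else None

print(interpret("cancle"))      # cancel  (tolerates a small transcription slip)
print(interpret("play music"))  # None    (outside the vocabulary)
```

Rejecting out-of-vocabulary input, instead of picking the least-bad match, is what keeps such interfaces from producing frustrating errors.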


If physically touching surfaces at a conference will be undesirable for quite some time while we all continue to mask up, how can we continue to have valuable interactions with our customers and colleagues?

One option that we’ve found successful is interactive wall displays with proximity-based activation. “If a person stands in certain areas of a conference booth, we have a pretty good idea of what content they’re looking at,” Kevin Millar shares. “By placing floor or overhead sensors throughout the booth, we can customize the experience so that the panel activates and shows the attendee the information that they were interested in.”

A person stands in front of an interactive wall, with a sensor on the floor.

In a crowded exhibit hall, having a booth that triggers and interacts with attendees as they walk by could make the difference in capturing and holding someone’s attention.
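The proximity-based activation described above reduces, at its simplest, to a mapping from the sensor that fired to the content zone it watches. A minimal sketch; the zone names and content strings are invented:

```python
# Each floor or overhead sensor is mapped to the content zone it watches;
# when a sensor fires, the nearby panel shows that zone's content.
ZONE_CONTENT = {
    "entrance": "Welcome loop",
    "efficacy_wall": "Phase III efficacy data",
    "moa_display": "Mechanism-of-action animation",
}

def panel_content(active_sensor):
    # Fall back to an ambient loop when no mapped sensor is triggered.
    return ZONE_CONTENT.get(active_sensor, "Ambient brand loop")

print(panel_content("moa_display"))  # Mechanism-of-action animation
print(panel_content(None))           # Ambient brand loop
```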

The Microsoft Azure Kinect is another sensor capable of detecting movement. Originally designed for the Xbox game system, the Kinect utilizes extremely accurate full-body tracking that goes well beyond following hand gestures. The Kinect, or similar devices like the Intel RealSense Depth Camera, can be put to work allowing people to interact with screens using larger gestures, giving them a feeling of relationship to the material by engaging more muscle groups.
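Full-body tracking decisions can be sketched independently of any particular SDK. Assuming joint positions have already been read from a depth sensor, here is an illustrative check for a raised hand; the joint names and coordinate convention are assumptions, not the Kinect API:

```python
# Joint positions are faked here as (x, y) metres with y increasing upward;
# a real SDK such as Azure Kinect Body Tracking exposes a full 3D skeleton.
def hand_raised(joints):
    """True when either hand is tracked above the head."""
    head_y = joints["head"][1]
    return joints["left_hand"][1] > head_y or joints["right_hand"][1] > head_y

pose = {"head": (0.0, 1.6), "left_hand": (-0.3, 1.0), "right_hand": (0.3, 1.9)}
print(hand_raised(pose))  # True
```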

For an experience closer to what people are used to with touchscreens, using infrared (IR) bezels may be another option. They act as an outer frame around the screen that senses when someone points to an object on screen, and activates it. Imagine hovering your hand above your phone to tap instead of leaving fingerprints (and germs!) on the glass.
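An IR bezel localizes a hover by noting which horizontal and vertical beams are interrupted; the crossing point becomes the "touch". A minimal sketch with an assumed beam count and geometry:

```python
def hover_point(broken_x_beams, broken_y_beams, width, height, n_beams=32):
    """Map interrupted IR beam indices on each axis to screen coordinates.

    Returns None unless beams are broken on both axes; otherwise uses the
    centre of the interrupted beams. Beam count and geometry are illustrative.
    """
    if not broken_x_beams or not broken_y_beams:
        return None
    x = sum(broken_x_beams) / len(broken_x_beams) / (n_beams - 1) * width
    y = sum(broken_y_beams) / len(broken_y_beams) / (n_beams - 1) * height
    return (x, y)

# A finger near the horizontal centre of the top edge of a 1920x1080 screen:
print(hover_point([15, 16], [0], 1920, 1080))  # (960.0, 0.0)
```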

That may be harder to do than it sounds—we’re all used to touching the glass. That’s when BYOD solutions will have a chance to shine. With a simple QR code, users can access controls to a booth installation. Conference venues that struggle to provide decent wi-fi could prove a barrier, but ultimately people are already very comfortable using their own phones as a tool. Transforming their phone into an extension of a booth kiosk could also have the advantage of connecting them with further information to peruse long after the conference is over.

A conference attendee stands inside a booth and gestures to manipulate what he sees.

“Using gesture-controlled input devices lacks the tactile sensation that we’ve come to expect, but physically touching objects in a shared space will likely remain unpalatable for a while. It’s time to try new ways of operating.”

-Kevin Millar, SVP of Creative & Medical Science


While floor sensors and interactive walls may delight and command attention in a conference space, Medical Science Liaisons (MSLs) or Sales Reps often need to interact with clients in a way that’s more personal.

Hand gesture tracking devices, such as Ultraleap’s family of sensors, may fit the bill. The Leap Motion sensor made a splash several years ago: it’s a small sensor bar that can be set down on a desk or anywhere else, and by making hand or finger movements above it, like swiping to advance a slide presentation, the Leap Motion responds without any direct handling of the screen. Ultraleap also makes the larger Stratos Inspire, which includes touchless haptics—a way to give sensory feedback without touching the screen.
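Swipe detection over such a sensor can be reduced to a simple decision on the palm's recent positions. A sketch assuming positions in millimetres, as Leap-style sensors report; the travel threshold is an illustrative guess, not a vendor default:

```python
def detect_swipe(x_positions_mm, min_travel_mm=80):
    """Classify a short history of palm x-positions as a swipe, or None.

    A gesture only counts as a swipe if the hand travelled far enough
    overall; small jitters return None instead of firing spuriously.
    """
    if len(x_positions_mm) < 2:
        return None
    travel = x_positions_mm[-1] - x_positions_mm[0]
    if abs(travel) < min_travel_mm:
        return None
    return "next" if travel > 0 else "previous"

print(detect_swipe([-50, -10, 40, 60]))  # next (hand travelled 110 mm right)
```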

“These devices offer gesture-based control similar to what you’d find in a virtual reality experience, without actually touching anything – and without the need for a shared headset,” Kevin Millar offers. “You can pair them up with anything, like a touch screen, or even a holographic display, allowing you to control the experience.”

Gesture tracking isn’t destined to ignore haptic feedback, either. One company has created an ultrasound device, called the M1, that lets you feel feedback from your gestures without touching anything more than air. Tapping into our spatial sense and sense of touch will allow for more intuitive and specific gestures.

Holographic demos are an immersive way to share interactive content during a masks-on, in-person visit with clients without asking them to put on an AR headset. The Looking Glass Display has a 36” screen for larger demonstrations, but the 12” version might be portable enough for an in-person engagement. Exploring a new drug’s MOA in 3D, while being able to manipulate the model without ever touching the device, could be a strong contender for a new standard industry demonstration.

There are other technologies we have been testing and keeping our eyes on, of course, such as the tiny Glamos sensor, which creates an RF field for gesture tracking, or the customizable commands of the speech-to-meaning Houndify system, which uses AI to enhance its effectiveness. Different types of conversations will call for specific combinations of tech to create a seamless and intuitive flow. Our team has the experience to navigate and determine the best option with you, as our long-time client-partners have discovered.

Illustration of a man gesturing at screen, manipulating a molecule toward a cell with visible DNA.


During the pandemic, many pharmaceutical companies have been investing in training for their representatives. And the current moment is not only a chance to train your team, it’s a chance to learn more about which training is most effective.

Gaze-based eye-tracking software by innovative companies like Tobii is able to provide data on where people are placing their focus, giving you a more granular set of analytics than whether or not a web page was loaded. It’s not simply theoretical, either. Tobii has launched its Pro Glasses 3 with a large field of view, 2 cameras, and 8 infrared illuminators per eye, along with an API so external developers can explore the technology’s possibilities for user testing.
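Turning raw gaze samples into the kind of region-level analytics described above can be sketched as counting samples per area of interest (AOI). The AOI names and geometry here are invented, not part of any vendor API:

```python
def gaze_share(samples, aois):
    """Fraction of gaze samples falling inside each area of interest.

    samples: iterable of (x, y) screen points; aois: {name: (x0, y0, x1, y1)}.
    """
    samples = list(samples)
    counts = {name: 0 for name in aois}
    for x, y in samples:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                counts[name] += 1
    n = len(samples) or 1  # avoid dividing by zero on an empty recording
    return {name: c / n for name, c in counts.items()}

aois = {"headline": (0, 0, 100, 50), "cta_button": (0, 50, 100, 100)}
print(gaze_share([(10, 10), (20, 20), (30, 70), (200, 200)], aois))
# {'headline': 0.5, 'cta_button': 0.25}
```

Real pipelines would first collapse samples into fixations, but the per-region aggregation step looks much like this.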

Perhaps the biggest innovation coming soon will belong to BYOD scenarios, at least if the rumors about Apple developing augmented reality glasses are true.

“Apple’s release of an AR headset will likely usher in a new era of virtual content,” suggests James Hackett. “The implications are wide-reaching. We’ve all gotten used to our phones as interfaces, and they can act as a processor-in-our-pocket, leaving a set of AR glasses comfortable and not too heavy, since they wouldn’t need as much hardware in their frames. Imagine a $500 device plus your $1,500 phone you already know how to use. Adoption could happen quickly.”

James continues, “If an AR cloud develops alongside 5G speeds, training could be done with contextually relevant 3D assets, really heavy, high-fidelity content projected seamlessly to the user in real-time.” The content would be downloaded or streamed from the cloud through the phone, which then sends it to the glasses. AR training in industry-specific scenarios will take off as fast as developers and animators can create it. Imagine being able to talk with a remote technician who can see what you’re seeing and highlight what to look for in real time in your field of vision – we have that capability right now. There’s even potential technology to use the glasses to track the position of your hand, allowing true mixed reality manipulation of the digital assets.

“Fifty years ago the idea of having an all-purpose computer in our pockets seemed like a pipe dream; now it’s hard for many of us to imagine daily routines without them. The potential versatility of augmented reality glasses is poised to enhance our work lives and alter our behaviour in a similar way.”

-James Hackett, VP of Creative Innovation

There are important considerations with any of these technologies—standards will need to be developed to ensure meta-level information is included for people with visual impairments, for example. In contextual training scenarios, imagine a user asking, “Siri, describe the room for me,” and a LiDAR system like the one on the iPad Pro mapping the room and describing the positions of objects and machinery.
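That accessibility idea can be sketched as a final step: once a LiDAR system has mapped object positions relative to the user, converting them into spoken-style directions is straightforward. An illustrative sketch using clock positions; all names and conventions here are assumptions:

```python
import math

def clock_position(x, y):
    """Clock-face direction of a point, with +y straight ahead and +x right."""
    angle = math.degrees(math.atan2(x, y)) % 360  # 0 degrees = straight ahead
    hour = round(angle / 30) % 12 or 12
    return f"{hour} o'clock"

def describe(objects):
    """Spoken-style descriptions from {name: (x, y)} positions in metres."""
    return [f"{name}, {math.hypot(x, y):.1f} m at {clock_position(x, y)}"
            for name, (x, y) in objects.items()]

print(describe({"centrifuge": (0.0, 2.0)})[0])  # centrifuge, 2.0 m at 12 o'clock
```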

If something like AR glasses paired to phones takes off in the next couple of years, touchless technology will suddenly be personal for many professionals in numerous settings.


Effective touchless technology is not just in the realm of conceptual possibility anymore. The INVIVO team has extensive experience with touchless interactivity, and we’re always trying out the latest tech and exploring best-case scenarios so we can keep you ahead of the competition.

Contact us to learn more, today. Okay, you’re going to have to touch your screen this time, but just tap or click right here. 😉

🔵🟢🟡 Ready to learn more about INVIVO? Watch our latest medical animation reel!