A Human-Machine Interface (HMI) is a user interface or dashboard that connects a person to a machine, system, or device.
How often do you interact with technology in your life? If you answered ‘every day’, then take a step back and think about how you’d like to interact with it and how you’d like to see it evolve in the next year. How about the next five years? Ten?
For me, technology isn’t evolving quickly enough. Sounds crazy, I know, but I envision a world where technology like touchless tech and kitchens with built-in holographic interfaces — to name a few — are the norm. I’m not sure how to get from point A to point B, but I definitely would like to try to pave the way… or at least start asking the right questions.
A Touchless User Interface is one that lets a person control an electronic device by gesture or voice, with no physical contact required to activate or operate it.
Back in 2015, Google unveiled Project Soli at Google I/O. Taken directly from Soli’s project website, “The Soli project uses radar to enable new types of intuitive interactions.” In layman’s terms, it’s a small chip that lets a user interact with a product without touching it. That might sound like something straight out of a sci-fi movie, but I’m extremely excited about the possibilities touchless interfaces open up.
Google is expected to release the Pixel 4 with a Soli radar chip built into the top bezel of the phone. It will enable face unlock, similar to the iPhone’s Face ID, but take it a step further by letting the user skip songs, snooze alarms, and silence phone calls with only a gesture. Google applied for a waiver to let the Soli sensors operate at higher power levels than previously allowed, and the Federal Communications Commission (FCC) granted it in January 2019.
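The interaction model described above boils down to a simple dispatch: the radar chip classifies a motion as a named gesture, and the phone maps that gesture to an action for whatever is currently on screen. Here is a minimal illustrative sketch of that idea; every name in it is hypothetical, not part of Soli’s actual SDK:

```python
# Illustrative gesture-to-action dispatch for a touchless UI.
# All names here are hypothetical, not Google's real API.

CONTEXT_ACTIONS = {
    # (current context, recognized gesture) -> action to perform
    ("music_player", "swipe_left"): "skip_song",
    ("alarm_ringing", "wave"): "snooze_alarm",
    ("incoming_call", "wave"): "silence_call",
}

def dispatch(context: str, gesture: str) -> str:
    """Map a recognized gesture to an action; ignore unknown combinations."""
    return CONTEXT_ACTIONS.get((context, gesture), "no_action")

print(dispatch("music_player", "swipe_left"))  # skip_song
print(dispatch("music_player", "wave"))        # no_action
```

The point of the table-driven design is that the same physical gesture (a wave) can mean different things depending on what the phone is doing at that moment.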
But it goes beyond cell phones and skipping songs. The Human-Machine Interface is becoming increasingly popular across every technical field and truly has a place in everything we interact with as humans. In fact, you might even have one in your kitchen or your car, and even Alexa falls into the touchless-device category (though voice-controlled devices are, for today, another topic altogether). Without going too in-depth, here are a few more fields where touchless technology is improving lives today.
With Hands-Free laws passing all over the United States, we’re likely to see an increase in Human-Machine Interfaces in vehicles. Volkswagen first debuted a gesture-controlled concept in 2015, and BMW showed a touchless dashboard concept in 2016 before bringing gesture control into its more recent models.
In both examples, the technology makes it possible to control displays and functions with hand movements, without touching anything. For example, a swipe toward the windshield closes the sunroof, while the same movement in the opposite direction opens it.
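The sunroof example shows a second wrinkle beyond simple gesture matching: the same gesture triggers opposite actions depending on its direction. A hedged sketch of that logic, with all names invented for illustration rather than taken from any automaker’s real API:

```python
# Hypothetical direction-aware gesture handler for the sunroof example.
# Gesture and direction labels are illustrative only.

def sunroof_command(gesture: str, direction: str) -> str:
    """Return a sunroof action for a recognized in-cabin gesture."""
    if gesture != "swipe":
        return "ignore"
    if direction == "toward_windshield":
        return "close_sunroof"
    if direction == "away_from_windshield":
        return "open_sunroof"
    return "ignore"

print(sunroof_command("swipe", "toward_windshield"))  # close_sunroof
print(sunroof_command("swipe", "away_from_windshield"))  # open_sunroof
```

Treating direction as part of the input, rather than defining two separate gestures, is what lets one natural motion map onto a pair of opposite actions.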
Microsoft released a white paper on Touchless Multifactor Authentication for the medical field stating that “…touchless MFA solutions can save time and increase convenience…while thwarting common types of breaches, including cybercrime hacking…” It even showed a drastic improvement in hygiene, since clinicians no longer had to break the sterile field during surgery and could access previously inaccessible IT resources using touchless technology such as scans.
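The core idea behind any MFA scheme, touchless or not, is requiring two or more independent factors before granting access. A minimal sketch of that decision logic, assuming hypothetical factor names (this is not Microsoft’s implementation):

```python
# Hedged sketch of the multifactor principle behind touchless MFA:
# access requires at least `required` distinct trusted factors.
# Factor names are illustrative, not from any real product.

TRUSTED_TOUCHLESS_FACTORS = {"badge_proximity", "face_scan", "voice_print"}

def authenticate(presented: set, required: int = 2) -> bool:
    """Grant access only if enough distinct trusted factors are verified."""
    verified = presented & TRUSTED_TOUCHLESS_FACTORS
    return len(verified) >= required

print(authenticate({"badge_proximity", "face_scan"}))  # True
print(authenticate({"face_scan"}))                     # False
```

For a surgeon mid-procedure, the value is that both factors (say, a badge in range plus a face scan) can be checked without a single touch, so the sterile field stays intact.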
Touchless technology is also a great way to interact with video games. The Xbox Kinect may not seem like anything special anymore, but implementing ultrasound technology to control video games is a huge step forward for Human-Machine Interface systems. The 2018 film “Ready Player One” showed what video games could easily look like not too far down the road. Immersive, interactive games have also produced major physical, social, and cognitive improvements for wheelchair users and people with limited motor skills.
While I can’t predict the future, I would like to see more interactive and immersive things popping up soon. Whether it’s holographic watches, desks with built-in sensors for gesture-controlling web pages and navigating computers, or something truly out of Minority Report, setting distinct goals for shaping how we view and interact with technology is something companies really need to consider and pursue.
And who knows, maybe you’re the next revolutionary genius.