Touchless UI in Insurance

This is a guest post written by KV Dipu, President & Head of Operations, Communities & Customer Service at Bajaj Allianz General Insurance Company.


COVID-19 has forced organizations to embark on new journeys in various customer-facing domains, and the fintech/financial services sector is no exception. Since it has become imperative for organizations to consider the health and safety of customers and employees, touchless technologies are gaining traction across the spectrum as a way to avoid physical contact. While touchless technologies are not new – smart homes, smart health, augmented and virtual reality games, smart cars, and so on are all part of our lives – they have gained a new lease of life and are now being adopted at scale.

Human communication takes many forms; we connect through gestures, speech, and touch, switching among them seamlessly. Human-computer interaction, however, is different and more complicated. Touchless UI is an approach to controlling and communicating with a device without typing or touching the screen to carry out a task. In the home insurance segment, where IoT devices provide safety alongside cover against theft or burglary, gesture control becomes an essential tool for protection against intruders: a gesture recognition system can differentiate between people and block unauthorized access. However, it is sensitive to lighting conditions, as the cameras require a suitable ambience.
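As a toy illustration (not any insurer's actual system), a gesture-based access check might compare a feature vector extracted from a captured gesture against enrolled users' gesture signatures, accepting only a sufficiently close match. The names, vectors, and threshold below are all hypothetical:

```python
import math

# Hypothetical enrolled gesture "signatures": feature vectors a vision
# model might extract from each user's registered unlock gesture.
ENROLLED = {
    "alice": [0.9, 0.1, 0.4],
    "bob":   [0.2, 0.8, 0.5],
}

def cosine(a, b):
    """Cosine similarity between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def authorize(captured, threshold=0.95):
    """Return the best-matching enrolled user, or None if no one clears the threshold."""
    best_user, best_score = None, 0.0
    for user, signature in ENROLLED.items():
        score = cosine(captured, signature)
        if score > best_score:
            best_user, best_score = user, score
    return best_user if best_score >= threshold else None
```

A real system would use embeddings from a trained vision model and a calibrated threshold; the point is simply that recognition reduces to a similarity test, which is exactly where poor lighting degrades the captured features.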

Gesture vs. voice commands

Gesture recognition relies on computer vision, whereas voice recognition relies on natural language processing and does not depend on lighting or cameras, which add to the cost. This cost advantage is a key factor driving the adoption of voice recognition in several fintech products, making processes more efficient and inexpensive. Gesture recognition, on the other hand, scores on its natural character: since users do not need to climb a steep learning curve, the need for change management is largely eliminated.

Typically, the development of such innovations is not without challenges. Incorporating gestures requires a deep study of human movement and careful feature engineering while developing deep learning algorithms. Error rates, false positives, and false negatives must be driven down to attain Six Sigma levels. In insurance, Advanced Driver Assistance Systems (ADAS) play a major role in third-party claims for commercial vehicles. Gestures are not restricted to hand or finger movements; eye gestures help in understanding driver behavior on the road. Eye movements feed into a driver score that lets data scientists assess whether the driver is drowsy, drunk, or distracted, and underwriters then use that score to arrive at a premium.
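To make the scoring idea concrete, here is a minimal sketch of how eye-gesture metrics might be reduced to a driver score and a premium loading. The metrics, weights, and bands are invented for illustration, not an actuarial model:

```python
def driver_score(blink_rate, gaze_off_road, eye_closure):
    """Toy driver score in [0, 100]; higher is safer.
    blink_rate: blinks per minute; gaze_off_road and eye_closure: fractions in [0, 1]."""
    penalty = 0.0
    if blink_rate > 20:                 # elevated blink rate can signal drowsiness
        penalty += 2.0 * (blink_rate - 20)
    penalty += 60.0 * gaze_off_road     # time looking away from the road (distraction)
    penalty += 80.0 * eye_closure       # eyes-closed time (drowsy or impaired)
    return max(0.0, 100.0 - penalty)

def premium_multiplier(score):
    """Map the driver score to an illustrative premium loading."""
    if score >= 80:
        return 0.9   # discount for consistently safe drivers
    if score >= 50:
        return 1.0
    return 1.3       # surcharge for risky driving patterns

score = driver_score(blink_rate=25, gaze_off_road=0.1, eye_closure=0.02)
# penalty = 10 + 6 + 1.6 = 17.6, so score ≈ 82.4 and the multiplier is 0.9
```

In practice the score would come from a trained model over far richer telemetry, but the pipeline shape – sensor metrics, risk score, rating factor – is the one the paragraph describes.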

UI design principles

The standard design principles for touchless UI are classified into three categories: action, navigation, and transform gestures.

  • Action gestures: tapping, long press, and swiping, which let users interact with GUI elements and access additional functionality.
  • Navigation gestures: scrolling, dragging, swiping, and pinching, which let users flip through a product with ease.
  • Transform gestures: compound gestures, pick up and move, and double tap, which let users zoom into and out of content, reorder content, rotate graphics, and so on.
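The three categories above can be modeled as a simple dispatch table that routes each recognized gesture to the right handler. The gesture names and mapping below are illustrative, not a standard API:

```python
from enum import Enum

class GestureCategory(Enum):
    ACTION = "action"          # tap, long press, swipe-to-act
    NAVIGATION = "navigation"  # scroll, drag, pinch
    TRANSFORM = "transform"    # double tap, pick up and move, compound gestures

# Hypothetical mapping from recognized gestures to the three categories.
GESTURE_MAP = {
    "tap": GestureCategory.ACTION,
    "long_press": GestureCategory.ACTION,
    "scroll": GestureCategory.NAVIGATION,
    "pinch": GestureCategory.NAVIGATION,
    "double_tap": GestureCategory.TRANSFORM,
    "rotate": GestureCategory.TRANSFORM,
}

def dispatch(gesture):
    """Route a recognized gesture name to its category's handler."""
    category = GESTURE_MAP.get(gesture)
    if category is None:
        return "unrecognized gesture"
    return f"routing '{gesture}' to the {category.value} handler"
```

Keeping the taxonomy in data rather than code is what lets designers evolve the gesture set – the point of the next paragraph – without rewriting the recognition pipeline.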

Interaction principles are changing by the day as human gestures keep evolving to keep pace with the new behavior brought in by millennials. Data scientists need to consider these changes while doing feature engineering. They also need to curate data sets accordingly as these data sets will eventually lead to the training for ML/DL algorithms.

Whilst the technology evolves, two aspects of customer experience need to be taken into consideration:

  • Ease of understanding: Gestures must be in line with the ones widely known and address gaps in existing ones. Cultural nuances (for instance, a gesture in country A can mean something radically different from the same gesture in country B) need to be taken into account.
  • Realistic responses: Latency is annoying and people switch products/services if the responses are not real time. Traditional authentication with user IDs and passwords/OTPs is not something users expect in the new era of banking and insurance. Gestures and voice commands enhance the customer experience by quick logins and payments. However, for certain aspects of banking, an extra layer of security is applied and the value of transactions is restricted to avoid risk.

By 2025, 72% of people across the globe are expected to have smartphones. As touchless UI/UX gains traction, we are in for the new normal – M2M (man to machine) interactions, human communication morphing from voice into gestures, design at the core of customer experience, and, most importantly, healthy and happy human beings!

KV Dipu is President & Head of Operations, Communities & Customer Service, spearheading digital transformation, leveraging start-ups from fintech/insuretech globally, at Bajaj Allianz General Insurance Company (a joint venture of Allianz, the world’s leading insurer, and Bajaj Finserv & ranked #8 amongst the global top 100 digital insurers).


Photo by Laura Barbato on Unsplash