Can you (or should you) trust new tech?
It’s 9.55 on a Thursday morning. Tens of thousands of people wait on the sidewalks outside the Las Vegas Convention Center, lines snaking around the block. Behind the locked doors of the convention halls, thousands of exhibitors prepare to show off new products to visitors.
Over four days, 175 000 industry professionals are exposed to more than 20 000 high-tech products, each of them hoping to spot the next big thing and anxious not to miss out on the tech of the future.
This year’s Consumer Electronics Show (CES), hosted in Las Vegas, boasted the largest show floor in the event’s 50-year history, and delegates were mesmerised by the ‘future’ on display.
But one thing was missing from the vast majority of new products: the element of trust. As new categories of technology emerge, and new products and services are launched into the market daily, the public grows nervous and distrustful.
The trust dilemma
The consequence of such distrust is that, while the CES show floors heaved with walking, talking and dancing robots, and with human beings lost in virtual reality headsets, such products will die when they reach the real world of consumer purchase decisions.
During one of the main CES conference sessions, this issue came to the fore as speakers and panelists debated how to build trust into emerging technologies. The session, hosted by the Mobile Ecosystem Forum (MEF), highlighted the role of trust across the boundaries of safety, compliance, law, security, privacy and ethics.
Rather than attacking futuristic technologies, it focused on the need for consumer trust as a cornerstone for industry collaboration.
“Building privacy and security is really core to creating the future of the industry,” said Rimma Perelmuter, CEO of the MEF. “We need to focus on raising awareness of how important trust is for consumers, not only in terms of best practice but also showcasing innovation.”
The artificial intelligence takeover
Artificial intelligence (AI) is about to emerge into the consumer mainstream. New phones from Huawei and Samsung use machine learning to optimise the phone’s performance continually, based on user behaviour. The latest devices from Sony and LG include Google Assistant, an AI-based voice activated service. Before long, AI will be a standard phone feature.
The trust challenge in such technology lies in the fact that AI will work best when data on user needs, behaviour and activity are processed in the cloud and tested against other aggregated data – which all requires consumer trust.
Commitment to safety and privacy
The challenge in the mobile ecosystem is to lock down every imaginable privacy and security breach. This requires more than software engineering or hardware design optimised for consumer protection. It needs education on protection and demands clear messaging.
The consumer protection messaging needs to be conceived at the design stage of the new product, technology or service. It must be part of the DNA built into the design, rather than a marketing message bolted onto the product as it is sent to market.
Advances in science and tech
Consider new medical technology. A chip designed by the Swiss Federal Institute of Technology can detect a molecule called troponin, which is released by the heart muscle when it begins malfunctioning – giving three to four hours’ warning before a heart attack. The chip is implanted just under the skin, and transmits a warning signal to a receiver on the skin, which in turn sends a signal to the user’s smartphone. Your phone warns you that you’re about to have a heart attack.
This may sound like expensive science fiction, but it’s hoped that the chips will become cheaper in the near future, allowing doctors to routinely implant them in at-risk patients.
But consider the trust factor. If nervous patients, or their distrustful families, are sold only on the technology, without trust being part of the very fabric of the product, they may well reject it for fear of hackers, radiation, interference and even privacy violations.
If trust is built into a product from the start, there’s greater opportunity to drive competitive advantage.