Since its inception, the wearables market has made notable progress in laying a solid foundation of awareness as developments in mobile connectivity and artificial intelligence have advanced. The success of devices such as the Apple Watch, Fitbit trackers and GoPro cameras is a clear indication that health and fitness-focused devices have paved the way for a second phase of growth in the wearables market.
It’s predicted that the number of connected wearable devices worldwide will reach 929 million by 2021 – an increase of over 185% on 2016 figures. While that growth is largely driven by smartwatches and wristbands, devices such as smart footwear, jewellery and attachable cameras are also becoming increasingly popular with consumers.
But there are some intriguing developments in the wearables market at the moment, particularly in the application of machine translation and language interpretation. In fact, startups have made considerable headway in the last 18 months developing complex wearable technology – some of it designed with translation at its core.
Translation on the go
Hearables – a common term for in-ear wearable devices – form a market that’s still in its infancy, although similar technology, such as hearing aids, has been around since the late 1890s. But it’s speech translation technology that’s slowly becoming a commercial feature in hearables today.
Speech translation comprises three essential parts: automatic speech recognition, machine translation and speech synthesis. Automatic speech recognition takes sound captured by the microphone on a device and transcribes it into words. These words are then translated into another language using machine translation, and the translated words are converted back into sound by a speech synthesiser on the device.
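To make the data flow concrete, here is a minimal sketch of that three-stage pipeline in Python. Real systems use trained models at each stage; every function, phrase and lookup below is a stand-in invented purely for illustration.

```python
# Toy sketch of the three-stage speech-translation pipeline:
# sound in -> text -> translated text -> sound out.
# All names and data here are illustrative stubs, not a real system.

def speech_to_text(audio: bytes) -> str:
    """Stage 1: automatic speech recognition (stubbed).

    A real ASR model would decode the waveform; here we pretend the
    microphone captured the phrase below."""
    return "where is the station"

def translate(text: str, target: str) -> str:
    """Stage 2: machine translation (toy phrasebook lookup)."""
    phrasebook = {
        ("where is the station", "es"): "donde esta la estacion",
    }
    return phrasebook.get((text, target), text)

def text_to_speech(text: str) -> bytes:
    """Stage 3: speech synthesis (stubbed as encoded bytes)."""
    return text.encode("utf-8")

def translate_speech(audio: bytes, target: str) -> bytes:
    """Chain the three stages together, as a hearable would."""
    recognized = speech_to_text(audio)
    translated = translate(recognized, target)
    return text_to_speech(translated)
```

The chaining is the point: each stage’s output is the next stage’s input, which is also why latency accumulates – a delay in any one stage delays the whole conversation.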
Waverly Labs, a US-based technology company, has developed a new wearable called Pilot, an earpiece designed to instantly translate speech – allowing the user to have a conversation with someone speaking another language.
Both users need to be wearing a Pilot earpiece, and translations are processed via an app on a Bluetooth-connected smartphone. Although the earpiece takes a few seconds to translate the other person’s speech, the result is an almost fluid conversation between speakers of two different languages.
The $4.5 million crowdfunded device currently supports Portuguese, Spanish, French and Italian, along with English, and is due to ship over 25,000 pre-orders in early 2018.
Similarly, Lingmo International unveiled its hearable, Translate One2One, at a United Nations event in Switzerland earlier this year. Powered by IBM’s artificial intelligence engine, Watson, the device impressed delegates by showcasing its ability to translate up to eight languages without the need for a smartphone or Wi-Fi connection. The Australian startup pushed back its commercial launch to late 2017 in order to work on additional features, including integrating Google Maps and adding Arabic as a ninth language.
The list of tech companies developing on-the-go in-ear translation devices doesn’t stop there. Both German-based Bragi and Japan-based Logbar have pioneered the same type of wearable translation technology, but focus on two different types of consumers to differentiate themselves in the market.
As an established wearables brand, Bragi has added The Dash Pro to its collection of in-ear devices. Aimed at health and fitness users, Bragi has designed their devices for customers who value comfort, style, durability and biometric tracking as core features – with translation as an added bonus.
Logbar’s handheld translation device, ili – which requires no Wi-Fi, 3G, or any other internet connection – is positioned as the ultimate accessory for travellers. Currently, the device only supports translation from English into Spanish, Mandarin and Japanese; however, its proprietary dictionary includes an extensive travel lexicon, and the company claims it outperforms all existing translation engines for the purpose of travel alone.
But the most commercial ‘babel fish-type’ solution to hit the market comes from internet behemoth Google. The Bluetooth-enabled Pixel Buds – set to rival Apple’s AirPods – are compatible with most Android smartphones, and owners of Google’s flagship Pixel handset can use the onboard Google Assistant to access real-time translation in over 40 languages via the Google Translate app.
We’ve previously mentioned the role of wearable technology in the workplace, including investment made by companies to improve their employees’ health using devices such as smartwatches. But until machine translation has been perfected to interpret the complexities and cultural nuances of human language, scenarios such as wearables translating speech in international business meetings are blue-sky thinking at best.
Unsurprisingly, it’s in the travel sector where we could see some real impact in the future. According to a 2017 travel report by APADMI, 54% of UK travellers want better mobile technology on holiday and 38% would like better tools to help them with language translations – signalling an appetite to enrich their experiences when travelling abroad.
Devices that function without the need for an internet connection, like the IBM Watson-powered Translate One2One, stand to be favourable contenders for users who want to enhance their travelling experiences without relying on memorising common phrases before their holiday – especially with complex East Asian languages.
But this could be a challenge in itself with these first-generation hearables. Approaching a stranger in a foreign country and speaking into a wearable device to communicate could be alienating in some cultures, or even offensive.
However, Logbar’s handheld ili device – which provides one-way only translations – proved to be quite a hit with reviewers who found the locals of Tokyo to be very responsive when using the device in tourist environments such as restaurants, hotels and local bars.
Wearables present a unique opportunity for developers to help achieve universal accessibility for people with disabilities, whether through tools for the hard of hearing or the legally blind.
Developed in South Korea, the Dot smartwatch allows users who are visually impaired to tell the time using four raised tactile cells to mimic braille. Like most smartwatches on the market, Dot is connected to a smartphone allowing additional functionality including sending and receiving SMS messages and receiving push notifications from popular apps.
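The core idea – four refreshable braille cells rendering the digits of the time – can be sketched in a few lines. The dot patterns below are the standard braille assignments for the digits (which reuse the patterns for the letters a–j); the function itself is an invented illustration, not Dot’s actual firmware.

```python
# Illustrative sketch: rendering HH:MM on a four-cell braille display.
# A braille cell has six dots, numbered 1-3 down the left column and
# 4-6 down the right. Digits use the standard patterns for letters a-j.
# The mapping function is hypothetical, for illustration only.

DIGIT_DOTS = {
    "1": {1}, "2": {1, 2}, "3": {1, 4}, "4": {1, 4, 5}, "5": {1, 5},
    "6": {1, 2, 4}, "7": {1, 2, 4, 5}, "8": {1, 2, 5}, "9": {2, 4},
    "0": {2, 4, 5},
}

def time_to_cells(hour: int, minute: int) -> list:
    """Map a time to four cells, each a set of raised-dot numbers."""
    digits = "%02d%02d" % (hour, minute)
    return [DIGIT_DOTS[d] for d in digits]
```

Each returned set would drive the actuator pins of one cell – the hardware raises exactly those dots and the wearer reads the four cells left to right.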
As the maker of the world’s first commercial micro-braille actuator, the company behind the smartwatch has pioneered a vision to encourage braille education in both Western and emerging markets – especially in rural areas where internet connections are scarce or unreliable. In fact, development is already underway on Dot Mini, a project focusing on affordable braille displays for developing countries to encourage braille learning.
In the US, undergraduate students Thomas Pryor and Navid Azodi collaborated to develop a pair of gloves that can translate American Sign Language (ASL) into text and speech. The device went on to win the Lemelson-MIT Student Prize in 2016.
The aptly named SignAloud gloves work by using sensors positioned on the hand to analyse hand position and movement. When in motion, the gloves send data to a computer via Bluetooth, where it is processed and the corresponding text and speech are sent back to the user.
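The recognition step at the computer end can be sketched as a nearest-match problem: compare incoming sensor readings against stored templates for known signs. The sensor layout, template values and signs below are all invented for illustration – the actual SignAloud system processes gesture sequences with statistical models rather than a simple distance check.

```python
# Minimal sketch of the sensor-to-text step: match a glove's sensor
# reading to the closest known sign. Templates and readings are
# hypothetical; a real system uses trained sequence models.

import math

# Hypothetical templates: five flex-sensor values per sign
# (0.0 = finger straight, 1.0 = fully bent).
TEMPLATES = {
    "hello": (0.1, 0.1, 0.1, 0.1, 0.1),
    "yes":   (0.9, 0.9, 0.9, 0.9, 0.9),
    "no":    (0.1, 0.1, 0.9, 0.9, 0.9),
}

def classify(reading):
    """Return the sign whose template is nearest the sensor reading."""
    return min(
        TEMPLATES,
        key=lambda sign: math.dist(reading, TEMPLATES[sign]),
    )
```

Once a sign is identified, producing the text is a lookup and producing speech is a synthesis step – the same final stage as in the hearables pipeline described earlier.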
The Microsoft Kinect, Nintendo Wii and the new iPhone X are just a few examples of gesture monitoring technology already in commercial use today and sign language translation devices – most of which cover the forearm – aren’t necessarily new to the academic world.
But the SignAloud gloves are compact and ergonomic enough to be used as an everyday accessory, allowing speakers of ASL in the future to communicate to the wider world without affecting the way they naturally speak to each other.
It’s an exciting time in the wearables market at the moment. With the right development, we could see a future where doctors are able to treat their patients in their native language and improve medication adherence. Teams working in public services such as local government could also reduce interpreting costs when dealing with linguistically and culturally diverse communities.
Wearables tackling the complex challenge of language translation and communication are still in their infancy, and many of the devices currently on the market have yet to see real commercial success. But it will be interesting to see how second- and third-generation versions of these devices fare as the appetite for smart wearable technology continues to grow.