
AI in the Metaverse: How it Could Break the Language Barrier

Written by Caryn Oram | February 06, 2023

The nature of the metaverse means that anyone can access it from anywhere, with all of us entering the same environment regardless of where we are geographically. Yet despite this ability to meet in one digital space, one problem has persisted: language. It is like people of many different nationalities meeting in one town square and trying to communicate without a common language that everyone shares. This is exactly what people have experienced in the metaverse.

Fortunately, various solutions have been proposed or implemented to try to remedy the problem, including:

Avatar language

Avatar language refers to the communication settings of the avatars - the digital representations of individuals - in a metaverse environment. When designing your avatar, you can set it to communicate in a specific language, allowing others to easily identify which language a particular avatar speaks. From this, people can choose whom to communicate with and how, for example by using translation services. The problem is that communication between people who do not share a common language still needs translation, which can end up being time-consuming and inefficient. Another issue with avatar language on its own is that it cannot convey non-verbal cues or emotions. To address this, some metaverses have created animations and emotes: an emote is a short animation in which the avatar acts out a gesture or movement to express an emotion. While this is clever, it is not as helpful for cross-language communication as a translation tool would be.
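As a rough illustration of how an avatar's language setting and emotes could be represented in code, here is a minimal TypeScript sketch. The type names, fields, and emote identifiers are hypothetical and are not drawn from any specific metaverse platform.

```typescript
// Hypothetical sketch of an avatar profile that advertises its spoken language
// and offers a small set of emotes for non-verbal expression.

type LanguageTag = "en" | "es" | "ja" | "ar"; // abbreviated BCP 47-style codes

interface Emote {
  name: string;        // e.g. "wave", "thumbs-up"
  animationId: string; // reference to the short animation the avatar plays
}

interface AvatarProfile {
  displayName: string;
  spokenLanguage: LanguageTag; // lets others see which language this avatar uses
  emotes: Emote[];             // non-verbal cues for when no shared language exists
}

const avatar: AvatarProfile = {
  displayName: "Caryn",
  spokenLanguage: "en",
  emotes: [
    { name: "wave", animationId: "anim_wave_01" },
    { name: "thumbs-up", animationId: "anim_thumbs_up_02" },
  ],
};

// Other users' clients can read the language tag before deciding how to communicate.
console.log(`${avatar.displayName} speaks: ${avatar.spokenLanguage}`);
```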

Learning other languages

Another solution that some metaverses have introduced is the option for individuals to learn other languages themselves. By interacting with language tools and games, people can communicate with speakers of different languages while learning at the same time. This is also a useful option, but, again, it does not offer a quick translation tool that would allow people to converse smoothly in a flowing, back-and-forth manner.

In-world translation

This solution offers the real-time translation that is likely to be the most attractive option for users. Various metaverse platforms have already implemented such real-time features, and this is where Verbum by OneMeta comes in.


OneMeta has recently launched its new product, Verbum. Debuted successfully at CES (the Consumer Electronics Show) 2023, Verbum is an AI platform that lets you hear someone speak in their own language and receive it in yours at near-real-time speed:

"Our AI-powered web platform automatically translates, transcribes, and delivers closed captioning for 82 languages during calls, meetings, events, and chats in near-real time."

This tool is robust and shows enormous potential. It can provide real-time translation for up to 50 participants in one setting, each in their own language, across 82 languages and 40 different dialects. The sheer scope of this makes the new technology a major breakthrough in digital communication in general and, while Verbum is currently aimed at Zoom-style call settings, the scope for implementing it in the metaverse is certainly there.
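To make the near-real-time workflow concrete, here is a minimal TypeScript sketch of how captioning a multilingual meeting could be structured: each utterance is transcribed once and then translated into every participant's preferred language. The transcribe and translate functions are hypothetical stand-ins for whatever speech-to-text and machine-translation services a platform like Verbum would use; this is not OneMeta's actual API.

```typescript
// Minimal sketch of near-real-time captioning for a multilingual meeting:
// each utterance is transcribed once, then translated into every participant's
// preferred language. `transcribe` and `translate` are stubs standing in for
// real speech-to-text and machine-translation services; this is not OneMeta's API.

interface Participant {
  id: string;
  language: string; // e.g. "en", "es", "pt-BR"
}

// Stub: a real implementation would stream audio to a speech-to-text service.
async function transcribe(audioChunk: ArrayBuffer, sourceLanguage: string): Promise<string> {
  return "transcribed text";
}

// Stub: a real implementation would call a machine-translation service.
async function translate(text: string, from: string, to: string): Promise<string> {
  return from === to ? text : `[${to}] ${text}`;
}

// Fan one utterance out to every participant as a caption in their own language.
async function captionUtterance(
  audioChunk: ArrayBuffer,
  speaker: Participant,
  participants: Participant[],
): Promise<Map<string, string>> {
  const transcript = await transcribe(audioChunk, speaker.language);
  const captions = new Map<string, string>();

  // Translating concurrently keeps captions close to real time,
  // even with several dozen participants in one session.
  await Promise.all(
    participants.map(async (p) => {
      captions.set(p.id, await translate(transcript, speaker.language, p.language));
    }),
  );
  return captions;
}
```

The key design point in this sketch is that transcription happens once per utterance while translation fans out per participant, which is what keeps captions close to real time even with dozens of listeners in different languages.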

Given that in-world translation tools are already being utilised in the metaverse, a highly capable technology like Verbum would be extremely well-received. And with more and more business activity taking place within the metaverse, the platform is becoming another business environment where communication between global organisations will be commonplace. The more efficient the translation tools that allow those enterprises to communicate, the better. Verbum is one to watch. We are excited to see what unfolds with this new tool!