This year, we returned to 4YFN as exhibitors alongside IQS, where we had the opportunity to showcase MouthX in its latest version right before the official market launch. Last year's edition served to announce the start of our pre-orders, and this year we will be shipping the first units. But what are the questions people ask most when they see a MouthX for the first time?
On social media we shared the reactions of visitors who saw it for the first time and tried to guess what it was. However, the people who came to our stand had many more questions that we've put together here.
The questions curious tech visitors had about MouthX
“How does it work?” is usually just the starting point for people who are drawn to our technology, as visitors to events like MWC or 4YFN tend to be.

Many visitors were able to see MouthX in action, and they were also curious about comfort, how easy it is to get used to the device, and additional technical details. We are used to receiving these kinds of questions, just like when we appeared on El Hormiguero: at the end of the day, MouthX introduces a type of technology that people have rarely seen before, one that can open many possibilities for human-device interaction in a natural and ergonomic way.
1. Is additional software required to integrate MouthX with other devices?
When people see MouthX in action, many ask whether they need to install extra software on their devices.
MouthX has its own dedicated app, available to every user of the device. The app allows users to personalise preferred commands, manage connected devices, store configuration data such as sensitivity and response speed, and much more.
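To give a flavour of the kind of configuration data the app stores, here is a minimal sketch in Python. The field names, value ranges and defaults below are illustrative assumptions for this example, not the app's real data model.

```python
from dataclasses import dataclass, field, asdict
import json


@dataclass
class MouthXProfile:
    """Illustrative user profile of the kind the companion app could store.

    All fields here are hypothetical examples, not the app's actual schema.
    """
    sensitivity: float = 0.5         # assumed scale: 0.0 (least) to 1.0 (most sensitive)
    response_speed_ms: int = 50      # assumed debounce window for gestures, in ms
    preferred_commands: dict[str, str] = field(
        default_factory=lambda: {"molar_press": "CLICK"}
    )

    def to_json(self) -> str:
        """Serialise the profile, e.g. to sync settings to the device."""
        return json.dumps(asdict(self))
```

A profile created with `MouthXProfile(sensitivity=0.8)` can then be serialised and restored per user, which is the pattern a settings app of this kind typically follows.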

2. Are there plans to integrate MouthX with AR/VR or other alternative control systems?
One of the most frequent questions, and one that helps guide our development roadmap, is whether augmented reality (AR) or virtual reality (VR) environments will work with MouthX in the future.
This is something some of our testers are already experimenting with. While AR or VR integration is not a built-in function of the device itself, it is clearly a very relevant area for future development.
3. What type of sensors does MouthX use to detect mouth movements?
Talking about the basics isn't so basic when a device is as complex as MouthX, but the way it works is easy to understand. The device detects intentional movements of the tongue, jaw and head, translating them into multidimensional digital commands. Specifically, the device distinguishes these movements through its lingual joystick (which we call J1), molar joystick (J2), and head-movement tracking system, allowing users to control digital interfaces with precision.
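To make the idea of translating movements into digital commands concrete, here is a minimal sketch in Python. The sensor names J1 and J2 come from the description above, but the value ranges, threshold and command names are illustrative assumptions, not MouthX's actual firmware logic.

```python
from dataclasses import dataclass


@dataclass
class SensorFrame:
    """One sample of raw readings (assumed range: -1.0 to 1.0 per axis)."""
    j1_x: float      # lingual joystick (J1), horizontal axis
    j1_y: float      # lingual joystick (J1), vertical axis
    j2: float        # molar joystick (J2), e.g. bite pressure
    head_yaw: float  # head-tracking rotation


def frame_to_command(frame: SensorFrame, threshold: float = 0.5) -> str:
    """Translate a sensor frame into a digital command.

    Only readings above `threshold` count as intentional movements;
    smaller involuntary motions are ignored. The command names are
    hypothetical examples, not the device's real command set.
    """
    if frame.j2 > threshold:
        return "CLICK"  # a firm molar press acts like a click
    if abs(frame.j1_x) > threshold:
        return "CURSOR_RIGHT" if frame.j1_x > 0 else "CURSOR_LEFT"
    if abs(frame.j1_y) > threshold:
        return "CURSOR_UP" if frame.j1_y > 0 else "CURSOR_DOWN"
    if abs(frame.head_yaw) > threshold:
        return "SWITCH_PANEL"  # a head turn could switch UI context
    return "IDLE"
```

For example, a strong rightward push of the lingual joystick, `frame_to_command(SensorFrame(0.8, 0.0, 0.0, 0.0))`, yields `"CURSOR_RIGHT"`, while readings below the threshold yield `"IDLE"`.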

4. What communication protocols does MouthX support (Bluetooth, USB, etc.), and what latency does it have?
MouthX communicates with external devices through Bluetooth Low Energy (BLE) operating in the 2.4 GHz ISM band (2400–2483.5 MHz). This allows it to connect with smartphones, tablets and computers, and users can switch between devices either with a light suction gesture or directly through the app.
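The suction-gesture switching described above can be sketched as a simple round-robin over paired devices. Everything in this example, from the class and method names to the gesture event, is an illustrative assumption rather than MouthX's actual Bluetooth stack.

```python
class DeviceSwitcher:
    """Cycles the active connection among paired devices.

    A light suction gesture (or an app command) advances to the next
    paired device, mimicking the switching behaviour described above.
    This is a hypothetical sketch, not the device's real implementation.
    """

    def __init__(self, paired_devices: list[str]):
        if not paired_devices:
            raise ValueError("at least one paired device is required")
        self._devices = paired_devices
        self._active = 0  # index of the currently connected device

    @property
    def active_device(self) -> str:
        return self._devices[self._active]

    def on_suction_gesture(self) -> str:
        """Advance to the next paired device and return its name."""
        self._active = (self._active + 1) % len(self._devices)
        return self.active_device
```

With three paired devices, each gesture moves the connection along: phone, then tablet, then laptop, then back to the phone.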
All in all, our participation at 4YFN showed us one thing: accessible technology is interesting and relevant for everyone. What began as a conversation about assistive technology quickly turned into broader discussions about the future of human-device interaction.
From curious visitors and startup founders to developers and tech enthusiasts, many people discovered for the first time that controlling technology with your mouth is not science fiction: it's already possible with MouthX. Because technology should not only be innovative, it should also be accessible, intuitive and capable of expanding how people interact with the digital world.
