NextMind Dev Kit Impressions


In one of the most sci-fi-esque videos shared in 2021 thus far, we see a macaque playing Pong with its mind: not in a movie, nor in a CGI demo, but in real life. I’m, of course, talking about the one shared by Elon Musk’s Neuralink. The animal manages to do this by means of a brain-computer interface (BCI) implanted in its skull, enabling its neural activity to interact with the game.

Neuralink’s near-term goal is to use the technology for therapeutic purposes in order to help those who can’t use their limbs to control certain software and hardware. But eventually, they will want people to have what Musk describes as “a Fitbit in your skull” to control the technology they use with their thoughts. And the entertainment industry, including gaming, is also closely eyeing BCIs.

Valve’s Gabe Newell has overtly expressed his interest in BCIs and, in one of his most recent interviews, shared part of his vision for integrating these devices to improve immersion in gaming far beyond what our “meat peripherals” currently allow. Valve has also partnered with OpenBCI to jointly develop hardware, likely a BCI-enabled VR headset. However, while Neuralink and Valve/OpenBCI have publicly shared their interests and intentions in this space, all of their progress happens behind closed doors, with their prototypes and dev kits inaccessible to most. Gabe Newell himself said that “if you're a software developer in 2022 who doesn't have one of these in your test lab, you're making a silly mistake”. So how can you dabble with BCIs if you can’t get access to one?

BCI startup NextMind might have a solution for you. Based in France, they recently released their $400 BCI dev kit, and I’ve had access to one myself along with some of its demos. Since it is a dev kit and not the final consumer version, I’ve opted for an impressions piece rather than a full review. Even the NextMind representative who walked me through the dev kit demos told me that the final product might not look like the dev kit at all and could even be integrated into another device. He compared it to the Oculus, which started out as a series of dev kits before launching as a consumer-ready product. This makes for a pretty accurate comparison, as BCI development for gaming and entertainment purposes is still in its infancy.



However, NextMind’s dev kit might as well have been the final product, as its form factor is surprisingly compact and the device itself is very light (135 x 66 x 55 mm and 60 grams). Unlike Neuralink’s BCI, the one from NextMind is non-invasive and is essentially a wearable, as it comes with a headband to which the device clips. I was quite impressed by how small, practical, and versatile NextMind designed their dev kit to be, as it can even be inconspicuously clipped to a cap or a VR headset (more on that later). If this is just the dev kit, there’s a good chance that the final device will be even slimmer (at worst, it will stay the same, which isn’t bad at all).

Once the headband is adjusted to a comfortable fit, the BCI stays firmly at the back of your head with its dry electrodes (the 9 protrusions) making contact with your scalp. These electrodes can then read your brain activity, more specifically, activity from your visual cortex, which the device translates to an action.



Wearing the headband with the electrodes poking at the back of your head isn't the most comfortable thing ever, but it definitely isn’t unbearable. The discomfort becomes more apparent over time (after wearing it for 10-15 minutes) and feels a bit like applying too much pressure directly on your scalp with a comb. That said, I do hope the consumer version is made more comfortable. The headband itself, on the other hand, is comfortable, as there is a thick layer of padding where it comes into contact with your forehead.

Regarding batteries, the device packs a 240 mAh one that can be charged in 2 hours via the USB-C port on its underside. Worry not, as the USB-C port is only used for charging; the device works completely wirelessly, pairing to a computer via Bluetooth Low Energy (BLE). Additionally, there’s only one physical button, on its right side, to power it on/off and to enter pairing mode.



On the software side of things, I had access to NextMind’s in-house demos which showcase the possibilities of the device. These are accessible via the NextMind Manager app, and each launch requires the user to calibrate the device. This might sound inconvenient, but it’s to ensure optimal performance of the BCI. Think of it the same way you adjust the play area every time you don a VR headset.

Calibration with NextMind’s BCI involves focusing on the center of the screen, where you can see three small rotating bars. If you are focusing right (no talking or moving), the bars join to form a triangle (they can split again if you lose focus). It might sound easy when reading about it, but it’s much tougher in practice. The NextMind representative who helped me during the demo walkthrough said it’s like learning to control a new sense, a claim that didn’t feel far-fetched once I’d been through it myself. Thankfully, the process does get easier with practice.

After completing my first calibration, I had access to the demos that NextMind prepared. There’s a “TV remote” app that let me play/pause TV channels, mute/unmute them, and select the one I wanted to watch, simply by looking at and focusing on the corresponding selection area on the screen. All the other demos were controlled in a similar fashion: a security lock whose PIN I could type in with my eyes, directing the paddle of a Brick Breaker clone with my gaze, selecting the beats in a music composer. You can find more of the demos in the video below:



NextMind was also kind enough to provide access to a work-in-progress VR game demo that uses their BCI, which I could play with my Oculus Quest tethered to my PC via Oculus Link. The game involved exploding aliens by focusing on their brains and moving through portals by focusing on them.



From those demos’ control descriptions, you might have realized something: the NextMind BCI functions much like an eye tracker. That’s essentially what it does, as it reads signals from the vision-processing area of the brain and translates them into information that is fed to a computer. When it works, it delivers more of a wow factor than an eye tracker, but it’s tougher to get the hang of.
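NextMind hasn’t published the internals of its decoding pipeline, so take this as a purely illustrative, simplified Python sketch of how a visual-cortex signal can in general be turned into an on-screen selection, in the style of frequency-tagged (SSVEP-like) decoding. The sampling rate, target names, and flicker frequencies below are made up for the example, and this is not NextMind’s actual algorithm.

```python
# Illustrative only: a toy frequency-tagged decoder, NOT NextMind's algorithm.
# Assumes `eeg` is a 1-D numpy array of samples from an occipital electrode.
import numpy as np

SAMPLE_RATE = 250                                        # Hz, assumed for the example
TAG_FREQS = {"play": 8.0, "pause": 10.0, "mute": 12.0}   # hypothetical per-target flicker rates

def decode_focus(eeg: np.ndarray) -> str:
    """Return the target whose flicker frequency carries the most power in the signal."""
    spectrum = np.abs(np.fft.rfft(eeg))
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / SAMPLE_RATE)
    scores = {}
    for target, f in TAG_FREQS.items():
        # Sum the spectral power in a narrow band around this target's flicker rate
        band = (freqs > f - 0.5) & (freqs < f + 0.5)
        scores[target] = spectrum[band].sum()
    return max(scores, key=scores.get)

# Example usage with two seconds of (random, placeholder) signal
print(decode_focus(np.random.randn(SAMPLE_RATE * 2)))
```

In this toy model, each selectable element flickers at its own rate and the decoder simply picks the element whose rate dominates the recorded signal; real systems layer filtering, artifact rejection, and confidence thresholds on top of that basic idea.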

Once I could control my focus, my experience with the provided demos was a fun one. The response is accurate and works as intended, and seeing things on-screen respond to my thoughts felt surreal and cool at the same time. It’s also something I might as well get used to, as we will eventually have access to more and more such experiences as they seamlessly integrate with our existing or future devices.

The technology behind it is quite impressive and goes beyond eye tracking. The true potential rests in what developers can create with it, and those creations should indeed prove interesting. NextMind told me that they plan to have a section on their website to showcase community projects, and it might be worth keeping an eye on if you are interested in the technology. When I asked the company whether they will limit their BCI to controlling software, they said that’s not necessarily the case, as it’s possible to pair it with hardware like smart switches or even coffee makers to initiate an action. This is somewhat akin to Macrotellect’s BrainLink Pro, which I tested before, though that one’s use is more limited.
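To give a rough idea of what that kind of hardware bridging could look like, here is a hypothetical Python sketch that maps a decoded focus target to a smart-plug toggle. The endpoint URL, device names, and the on_focus helper are all stand-ins I made up for the example; NextMind’s SDK doesn’t necessarily expose anything shaped like this.

```python
# Hypothetical bridge from a decoded "focus" event to real hardware.
# The URL, device names, and helper below are stand-ins, not part of NextMind's SDK.
import requests

SMART_PLUG_URL = "http://192.168.1.50/api/toggle"  # hypothetical smart-plug endpoint

def on_focus(target: str) -> None:
    """Trigger a hardware action based on which on-screen target was focused."""
    if target == "coffee_maker":
        # Ask the (hypothetical) smart plug to power on the coffee maker
        requests.post(SMART_PLUG_URL, json={"device": "coffee_maker"}, timeout=2)
    else:
        print(f"Focused on: {target}")

# Example: feed it the output of a decoder like the one sketched earlier
on_focus("coffee_maker")
```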

From what I experienced, I can also see NextMind’s BCI serving as a great accessory to other promising technologies. I particularly see it as a fit for phone-tethered mixed reality headsets like the Nreal Light or MAD Gaze GLOW Plus. These are more affordable, consumer-focused MR headsets, but they rely on the user controlling and navigating MR apps with their phone. Hand tracking could work, but it is mostly a work in progress on those devices and still requires physical input. NextMind’s BCI, on the other hand, is an accurate, ready-made option that could offer a totally hands-free way to perform simple navigation in MR. So in the future, we might increasingly control our devices with the power of our minds, much as we control them with physical input today.


That future is still at least a couple of years away, as practical applications for BCIs are rather limited, as exemplified by my experience with the small selection of NextMind-compatible software. However, this is to be expected, as the device is but a dev kit and will probably go through several iterations before a consumer unit hits the shelves; and by then, a thriving ecosystem could have developed around it.

If you are a developer and want to heed Gabe Newell’s advice, then NextMind’s dev kit presents itself as an adequate place to start developing BCI experiences. It packs promising tech in a compact form factor, and the company behind it has even released an SDK for developers to make use of.

While we don’t know what the future holds for NextMind itself (they might be acquired or develop their device alongside other hardware manufacturers), BCI seems like a promising way forward, as some big players are already investing heavily in the technology. So as a developer, it wouldn’t be amiss to start creating projects with the technology, and NextMind offers such an opportunity right now with its dev kit. And if you do end up developing for the device, do share your creations, as I look forward to the types of experiences that can be crafted with NextMind's BCI!



:arrow: NextMind Official Website
 
