DS emulation gets an augmented reality proof of concept


Homebrew and ROM hacking developers are always working on great things that bring new ways to experience classics on older consoles. One such example is DS emulation, which has seen its fair share of attention these past years, including the introduction of a new emulator in the form of melonDS, which seems to be on the fast track to becoming the de facto DS emulator. Today, a significant breakthrough in DS emulation has appeared, based on the work of melonDS.

Software developer Zhuowei Zhang has developed a proof of concept of an augmented reality feature for DS emulation, in which a holographic 3D view of the current game appears to float above the controller the player is using, when viewed through an AR-capable device.



As detailed in the tweet, the proof of concept works with melonDS and the melonDS core, utilizing MelonRipper for live extraction of the game's 3D models, with the augmented reality side built on iOS's RealityKit. The prototype currently has some downsides, which Zhang details in the GitHub repository for the project (a rough sketch of the MelonRipper-to-RealityKit conversion step follows the list):
  • the MelonRipper->RealityKit converter doesn't work very well (e.g. doesn't handle transparency)
  • there's a terrible memory leak (might be this?) that crashes the app after a few minutes
  • only tested with the camera position used by Mario Kart DS.
    • (for example, Pokemon Black and HeartGold use a different camera angle, and I had to remove the shader that crops the model for anything to show up)
  • no way to select rom/touchscreen input/etc. The ROM name is hardcoded to "rom.nds" in the app's folder in Files/iTunes.
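
To give a rough feel for the MelonRipper-to-RealityKit conversion step mentioned above, here is a minimal Swift sketch that turns one ripped frame of geometry (vertex positions and triangle indices) into a RealityKit model entity. The RippedFrame type, the helper name, and the hard-coded scale are illustrative assumptions for this article, not MelonRipper's or DSReality's actual formats or code.

Code:
import RealityKit
import UIKit

// Hypothetical shape of one frame of ripped DS geometry; a real dump would
// carry much more (vertex colors, texture coordinates, texture data, etc.).
struct RippedFrame {
    var positions: [SIMD3<Float>]   // vertex positions from the emulated scene
    var indices: [UInt32]           // triangle indices
}

// Convert one ripped frame into a RealityKit model entity.
func makeEntity(from frame: RippedFrame) throws -> ModelEntity {
    var descriptor = MeshDescriptor(name: "dsFrame")
    descriptor.positions = MeshBuffers.Positions(frame.positions)
    descriptor.primitives = .triangles(frame.indices)

    let mesh = try MeshResource.generate(from: [descriptor])
    // A flat material stands in for the DS textures here.
    let material = SimpleMaterial(color: .white, isMetallic: false)

    let entity = ModelEntity(mesh: mesh, materials: [material])
    entity.scale = SIMD3<Float>(repeating: 0.001)   // shrink DS units to metres (illustrative)
    return entity
}

In DSReality itself the dump comes live from the melonDS core, and the real converter carries over far more than this (textures in particular), which is where the transparency issues noted in the list above come in.
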
Despite these downsides, the project as it stands is already impressive, offering glimpses of augmented reality with DS titles like Mario Kart DS, Pokémon HeartGold, and Pokémon Black, with the games rendered as a floating scene in considerable detail (aside from the transparency issues mentioned in the repository).

The project is open source and available in Zhang's GitHub repository, linked below.

DSReality GitHub Repository
 

Jonna
Some sort of musician.
Member · Joined: May 15, 2015 · Messages: 1,234 · Trophies: 1 · Age: 35 · Location: Canada · Website: twitter.com · XP: 3,145 · Country: Canada
So is no one going to actually answer the questions or explain what's actually going on to those of us who are confused? At least 4 posts have said this, and you can throw mine in there as well. Right now, this just looks like someone went into Blender, combined real-life footage with a model animation they made, stuck it together, and is stating "this is possible right now." What's possible? A hologram? Or exactly what I'm seeing, in which case I need to tape a phone to my Switch and try to play the Switch while holding the phone to look at the augmented reality part through its screen? Is this actually possible in real time, or is this just saying it's only theoretically possible? The 3D model isn't fully matching up to the events happening in the game, and it's losing a lot of detail. I know that's probably a limitation of what you can extract from the emulation, but then what purpose does this serve? That's all assuming this isn't just a mock-up animation.

I'm just confused in general about this.
 

eyeliner
Has an itch needing to be scratched.
Member · Joined: Feb 17, 2006 · Messages: 2,891 · Trophies: 2 · Age: 44 · XP: 5,538 · Country: Portugal
It would be clearer if it used a game like Mario.
As it is, it just looks like a more mobile camera for the games.
 

sarkwalvein
There's hope for a Xenosaga port.
Member · Joined: Jun 29, 2007 · Messages: 8,508 · Trophies: 2 · Age: 41 · Location: Niedersachsen · XP: 11,234 · Country: Germany
Jonna said:
So is no one going to actually answer the questions or explain what's actually going on to those of us who are confused? […]
I haven't read the details, but from the video I get this:
- The Switch is just used as a gamepad; don't think about it.
- The smartphone attached to the Switch is only used to display a QR code; don't think about it. It may have other uses like sending gyro data, but that's not relevant for basic understanding, so don't think about it.
- A third device, the real device being used to play this, the device where the emulator runs, is also the device the video is recorded on. You are supposed to look at the screen of this device; the video is a screen recording. This device is an AR device, either glasses or just an iOS phone/tablet, that is recording video, locating the QR code, and augmenting the scene (i.e. putting the emulated scene above the "gamepad"). The gamepad (the Joy-Cons) is connected to this device, and that is where the person is playing the game.

The whole interesting point of this is the automatic decomposition and recreation of the emulated 3D scene, so that it can be fed into RealityKit to obtain an AR-like image with appropriate head tracking. This is the point. Everything else is irrelevant. This runs automatically from the DS emulation.

So, just to summarize (more or less repeating the last paragraph), what is possible right now, in real time, is the following:
The automatic decomposition and recreation of the emulated 3D scene from the 3D game, done in real time inside the emulator, so that it can be fed into RealityKit to obtain an AR-like image with appropriate head tracking and displayed on a device that supports RealityKit for AR, like an iPhone, iPad, or who knows what AR glasses (Apple Vision Pro??).
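
To make that a bit more concrete, here is a rough RealityKit sketch of the display side of that pipeline (assumed names throughout; this is not DSReality's actual code): an ARView tracks the QR-code image, an anchor entity hangs off it, and every emulated frame the freshly reconstructed scene entity gets swapped in, with head tracking handled by ARKit underneath.

Code:
import RealityKit
import UIKit

// Rough sketch of the AR side: track the QR-code image, hang an anchor off
// it, and swap in a freshly reconstructed scene entity every emulated frame.
// "AR Resources"/"qr-marker" and onNewSceneEntity are placeholder names.
final class DSSceneView {
    let arView = ARView(frame: .zero)      // camera feed + tracking via ARKit
    private let anchor: AnchorEntity
    private var currentScene: ModelEntity?

    init() {
        // The QR code shown on the phone is only a positioning marker:
        // the emulated scene is anchored to wherever that image is seen.
        anchor = AnchorEntity(.image(group: "AR Resources", name: "qr-marker"))
        arView.scene.addAnchor(anchor)
    }

    // Called once per emulated frame with the entity rebuilt from the ripped
    // 3D scene (assumed integration point).
    func onNewSceneEntity(_ entity: ModelEntity) {
        currentScene?.removeFromParent()   // drop last frame's geometry
        anchor.addChild(entity)            // display the new reconstruction
        currentScene = entity
    }
}

The interesting engineering is upstream of this: ripping the scene out of the emulator and rebuilding it fast enough to do that swap every frame.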
 
Last edited by sarkwalvein.

M7L7NK7
Well-Known Member
Member · Joined: Oct 16, 2017 · Messages: 3,904 · Trophies: 1 · Website: youtube.com · XP: 5,976 · Country: Australia
sarkwalvein said:
I haven't read the details, but from the video I get this: […]
Why was it so hard for this to be explained earlier? Nowhere was another device even mentioned; it literally looks like it projects it into the air without a headset.
 

orangy57
bruh
Member · Joined: Aug 17, 2015 · Messages: 916 · Trophies: 1 · Age: 21 · Location: New Jersey · XP: 2,949 · Country: United States
it's really impressive work for something so useless, I'd love to see it projected onto a table

this just seems like people desperately trying to find any use for the Apple AR headset tho
 

Jonna
Some sort of musician.
Member · Joined: May 15, 2015 · Messages: 1,234 · Trophies: 1 · Age: 35 · Location: Canada · Website: twitter.com · XP: 3,145 · Country: Canada
sarkwalvein said:
I haven't read the details, but from the video I get this: […]
Thank you so, so much! That clears it right up.
 

Paralel
Well-Known Member
Member · Joined: Sep 19, 2006 · Messages: 100 · Trophies: 1 · XP: 1,172 · Country: United States
sarkwalvein said:
I haven't read the details, but from the video I get this: […]

So one wouldn't need to show the controller if they didn't want to, and instead of using a phone they could just use a piece of paper with a QR code printed on it. The augmented reality projection would appear through the phone/iPad/whatever is running the DS emulation and providing the real-time display of the environment, just above where the paper QR code is located, since the code is used strictly as a positioning mechanism in the environment displayed on screen.

Not showing the controller and just using a paper QR code would have made this much, much easier to understand than having people think the two devices seen in the screen recording are somehow involved with what one is seeing in the recording.

As for what is actually seen in the video demonstration, it's rather impressive that it's able to make a rudimentary volumetric reconstruction of a DS game (more or less in real time) and have it appear as a virtual element in an AR environment.
 
Last edited by Paralel.

Jayro
MediCat USB Dev
Developer · Joined: Jul 23, 2012 · Messages: 12,983 · Trophies: 4 · Location: WA State · Website: ko-fi.com · XP: 17,020 · Country: United States
Sorry, but that's ugly and glitchy as hell. I don't like it one bit. DS emulation is already fantastic using DraStic. Whatever emulator they're using for this sucks.
 
