Hey, guys. Moron here, equipped with an ill-conceived concept of a theoretical 3DS application, and absolutely no practical or technical knowledge whatsoever.
So. Here I am, playing the Japan-only Legend of Xanadu via TemperPCE on my favorite electronic device in the entire world -- my trusty, reliable, fun-dispensing NN3DS -- thanks in no small part to all the actually intelligent, knowledgeable, and talented developers in this community who continually innovate and trick it out to its full potential.
Just trying to reach the first of the beautiful-looking side-scrolling sections I've seen screenshots of to experience it for myself. No fan translation for this one, though, and it's a bit of a text-heavy JRPG. So, tad bit of a language barrier going on. Thanks to the wonders of modern technology, however, I have a free phone application that uses the camera to translate Japanese text into English in real-time. Obviously, the results aren't necessarily the quality and thoughtful kind of translations that actual, dedicated fan localizations with a human touch tend to provide, but even when the English is occasionally utterly broken, it provides enough of a basis to decipher and at least piece together the general idea. Constantly hovering my phone screen over my 3DS screen can get a little tedious, but it's still amazing in and of itself that I can even do this at all. I'm thinking of maybe trying to do a run of SegaGaga with this method now. I don't see an actual translation for that one ever seeing the light of day with all the apparent technical hurdles associated with its English text-size that its various developers have faced during their numerous failed passes at it, and at this point, it's better than nothing.
Aaaand, that's when I got to wondering to myself, in my moron-mush brain, if it would be technically possible at all to create a similar sort of application natively for the 3DS. I have no idea how these phone apps actually work, of course. It just seems that they essentially interpret the text on-screen, match it against a source-language dictionary, correlate it with the appropriate equivalent in the target-language dictionary, and then simply output the swapped text to the user. These dictionaries would probably have to be built into this 3DS app somehow, and I would surmise it would have to work as some sort of Luma/Rosalina application acting as something of an overlay that you can use independently and in conjunction with any other app of your choosing... that is, if Luma can actually read the on-screen text in real time, if at all. No actual clue there (or anywhere). I'm just suspecting that maybe that's technically possible somehow, simply because you can take screenshots through Luma? I know that's pretty broad and general, though. Just because it can screen-cap a native image doesn't mean it's somehow capable of actually reading or interpreting any visual information on the screen.
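For what it's worth, the dictionary-swap step I'm imagining could be sketched roughly like this -- assuming some OCR stage has already pulled the text off the screen (which is the genuinely hard part), and with a toy glossary I made up purely for illustration, not a real dictionary:

```python
# Hypothetical sketch of the dictionary-substitution step such an app might
# run after OCR has extracted on-screen text. The tiny JP->EN glossary below
# is illustrative only -- a real app would ship a full built-in dictionary.

GLOSSARY = {
    "まほう": "magic",
    "つるぎ": "sword",
    "かいふく": "recovery",
}

def naive_translate(tokens):
    """Swap each token for its glossary entry, passing unknown tokens
    through unchanged -- which is exactly why this kind of word-by-word
    output often reads as broken English."""
    return [GLOSSARY.get(tok, tok) for tok in tokens]

print(naive_translate(["まほう", "の", "つるぎ"]))
# → ['magic', 'の', 'sword']  (the particle "の" isn't in the glossary)
```

Even this toy version shows why the phone apps stumble: straight word substitution ignores grammar entirely, so you get gist-level output at best, same as what I'm seeing with the camera app.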
Reading specific visual information on-screen, natively, in real-time, is likely a ridiculously fantastical pipe-dream of a concept on my part, now that I take into account all the potential variables in text size, pixel size, resolution, fonts, etc., etc., across wildly varying types of software. This is all just occurring to me now, and yeah... this theoretical native-screen-reading app I've dreamt up in my head is probably just not how this type of shit works or reads information, and any specific software would probably need to be individually programmed for compatibility to be effectively read via the game's code itself, rather than its on-screen visual output in real-time... which then pretty much renders my entire concept moot, I suppose. These cell phone translator apps use a camera and visual information from an actual lens, adjusting the text size accordingly with distance and whatnot. So, you probably need an actual camera lens to pull off reading real-time on-screen text. The 3DS has a camera of its own as well, of course. So maybe this theoretical application would actually be technically achievable through its camera in the same way the actual smartphone applications work. But who really needs that when we all already have our phones for that?
My pipe-dream is a Luma application powered by voodoo magic that can somehow detect, read, and translate native on-screen text information, and overlay the translated text box over the original foreign text in real-time as you're playing, just as the camera-operated smart phone apps do, and that you can utilize universally across pretty much anything. I'll go look for a genie lamp and get back to you guys.
Well, anyway. If there's anyone here who has any idea what they're actually talking about regarding the technical logistics of any of my nonsense here, and would care to indulge me with any genuine development insight or technical breakdowns to properly put my toddler's-understanding, cave-drawn development roadmap for this idea in clearer context, I would be very interested and appreciative. Or any other input, really.
Thanks, and enjoy your day.