At first glance, these codebases don't seem very helpful: it would still take a lot of work, rewriting nearly everything. Without going into detail, there's Windows-specific code, OpenGL, and at best SDL2 (the Wii only has SDL 1.2).
And again, it would have to be integrated into a music player. The Wii isn't a multi-tasking system; unlike on a PC, you can't read the audio output of other running programs.
Well, nothing's impossible, but maybe a much simpler visualizer could be written instead and integrated into wiimc etc.
Thanks for having a look. I meant to respond to your media player point earlier, but I'm having trouble editing from Android.
My thinking was that the media players are quite extensive pieces of software. This would only need to respond to audio: a mic, a network stream (Bluetooth?), or yes, an internal Ogg/FLAC/MP3 player; without the footprint of a full-featured media player, leaving room for the visualisation code.
It's a major bummer that this code is wayyyy outside what the Wii is capable of - although before it was open-sourced, it ran on much simpler hardware.
As for the "something" the Wii could do - could that something at least parse the user scripts, even if it couldn't render them to full effect?
Mulling the idea over: the killer app, the reason to leave a Wii plugged in and auto-booting to this, would be streaming audio from anywhere to an amp. Your laptop, your PC across the room, maybe your phone?
The visualisation would be more than a cherry on top; it's, like, the art of the 00s, right?
Interestingly, the earliest versions of this sort of visualisation, up to the Atari ST era, only responded to joysticks and other user inputs - rotating Wiimotes might be fun. There was also an analogue precedent using professional TV and video kit, still going today (now digital!), with units costing tens of thousands.