Are you doing your own custom memory management? libogc has this stupid quirk that means once you've dipped into MEM2, any unused memory in MEM1 remains unusable until all of MEM2 is freed again (which isn't likely to happen)... So, for example, if MEM1 had 4 MB free and you malloc'd 4.5 MB, a 4.5 MB chunk of MEM2 would be allocated and the 4 MB in MEM1 would effectively be thrown away. It's really stupid and could have been avoided if the devkitPro devs had implemented mmap() instead of sbrk().
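Something like the sketch below is what I have in mind: prefer MEM1 while an allocation still fits there, and only fall back to MEM2 afterwards, so the free space in MEM1 never gets stranded. It assumes libogc's SYS_GetArena*() accessors from ogc/system.h; wii_alloc() and the bump-only strategy are purely illustrative, not code from the port, and real code would also reserve the space via SYS_SetArena*Lo(), support freeing, and add locking.

```c
/* Illustrative two-arena bump allocator (assumption: libogc's SYS_GetArena*()
 * accessors; wii_alloc() is a made-up name, not part of the port). */
#include <gctypes.h>
#include <ogc/system.h>

static u8 *mem1_cur, *mem1_end;   /* remaining space in MEM1 */
static u8 *mem2_cur, *mem2_end;   /* remaining space in MEM2 */

void wii_alloc_init(void)
{
    mem1_cur = SYS_GetArena1Lo();
    mem1_end = SYS_GetArena1Hi();
    mem2_cur = SYS_GetArena2Lo();
    mem2_end = SYS_GetArena2Hi();
    /* Real code would also SYS_SetArena*Lo() here so libogc's own allocator
     * no longer hands this memory out behind our back. */
}

void *wii_alloc(u32 size)
{
    size = (size + 31) & ~31u;                   /* 32-byte alignment for DMA  */
    if ((u32)(mem1_end - mem1_cur) >= size) {    /* prefer MEM1 while it fits  */
        void *p = mem1_cur;
        mem1_cur += size;
        return p;
    }
    if ((u32)(mem2_end - mem2_cur) >= size) {    /* then fall back to MEM2     */
        void *p = mem2_cur;
        mem2_cur += size;
        return p;
    }
    return NULL;                                 /* both arenas exhausted      */
}
```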
Loaded the textures that come with initialization; 20.7 MB of DRAM left. It is getting challenging now. Hopefully Doom adds these to the image cache so that they can be destroyed and reloaded as needed... There's no need to keep stuff such as the splash screen and menu windows in memory all the time. Otherwise, building that will be the first thing to do after I get the client/server thing done. But I'm going to try displaying the loaded textures first; I think it is the splash screen and/or logo... Perhaps the console too. That stuff is still mocked, so I don't really know what I'll get to see.
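Roughly what I mean by purgeable UI textures, as a sketch only (this is not Doom 3's actual image cache; cached_image_t, load_image_file() and purge_ui_images() are made-up names for illustration): mark splash/menu/console art purgeable, free its pixel data when the game needs the RAM, and reload it from disk the next time it is drawn.

```c
#include <stdbool.h>
#include <stdlib.h>

typedef struct {
    const char *name;       /* file to reload from                  */
    void       *pixels;     /* NULL while purged                    */
    size_t      size;       /* size of the pixel data in bytes      */
    bool        purgeable;  /* true for splash/menu/console art     */
} cached_image_t;

/* Assumed loader; in the real port this would be whatever reads the image
 * files off the disc or SD card. */
extern void *load_image_file(const char *name, size_t *size);

void *image_pixels(cached_image_t *img)
{
    if (!img->pixels)                       /* reload on demand after a purge */
        img->pixels = load_image_file(img->name, &img->size);
    return img->pixels;
}

void purge_ui_images(cached_image_t *imgs, int count)
{
    for (int i = 0; i < count; i++) {
        if (imgs[i].purgeable && imgs[i].pixels) {
            free(imgs[i].pixels);           /* give the RAM back to the game  */
            imgs[i].pixels = NULL;
        }
    }
}
```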
I limit texture formats to I4 (4 bits per pixel intensity maps), IA4 (2x4 bits per pixel 2D normal maps), RGB565 (16-bit color maps) and RGB4443 (16-bit color+alpha). I'm also going to use the supplied DDS files (DXT compression), as the Wii supports DXT1, but these aren't being loaded yet.
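For the curious, the texture setup boils down to something like the sketch below using libogc's GX calls. Mapping the 16-bit color+alpha case to GX_TF_RGB5A3 is my assumption (that is GX's name for its 16-bit color+alpha format), and pick_gx_format()/upload_texture() are illustrative helpers, not actual code from the port.

```c
#include <ogc/gx.h>

typedef enum { TEX_INTENSITY, TEX_NORMAL, TEX_COLOR, TEX_COLOR_ALPHA } tex_kind_t;

static u8 pick_gx_format(tex_kind_t kind)
{
    switch (kind) {
    case TEX_INTENSITY: return GX_TF_I4;      /* 4 bpp intensity maps     */
    case TEX_NORMAL:    return GX_TF_IA4;     /* 2x4 bpp 2D normal maps   */
    case TEX_COLOR:     return GX_TF_RGB565;  /* 16-bit color maps        */
    default:            return GX_TF_RGB5A3;  /* 16-bit color + alpha     */
    }
}

/* data must already be in GX's tiled texture layout, 32-byte aligned and
 * flushed from the data cache (DCFlushRange) before drawing with it. */
void upload_texture(GXTexObj *obj, void *data, u16 width, u16 height, tex_kind_t kind)
{
    GX_InitTexObj(obj, data, width, height, pick_gx_format(kind),
                  GX_REPEAT, GX_REPEAT, GX_FALSE);
    GX_LoadTexObj(obj, GX_TEXMAP0);
}
```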
See, I would wait until the Wii U comes out and maybe port it to that, since it's a little bit more powerful than the first Wii, but like I said, good luck to ya (:

What is the point? No one has any clue how long it will be, if ever, until the Wii U gets hacked.
this will never run on the wii
Well, the Wii hardware is souped-up GameCube hardware, which was very close in terms of power to the Xbox. Its GPU and CPU are more capable than most developers give them credit for. Would it have to be toned down? Most likely, but at the same time it would probably still look pretty good. The Wii/GameCube hardware is more powerful than most think.
BTW, do you have a donation link? I'm sure many people (including myself) would love to donate to you just out of respect for your hard work.
People really have no sense of how powerful the Wii really is, or how powerful the GameCube was.
I understand. I have faith that you can make this happen. Even if your efforts fail, I would at least like to donate a little bit of something towards your effort, even if it is only beer money.

No, but in the real world my rate is between 60 and 70 EUR an hour, so you can start saving. Haha, no, really, I can't take any donations while I'm not certain whether this will succeed, and I wouldn't donate myself until at least the timedemo is running.
Comparing clock frequencies of different CPU/GPU architectures makes no sense at all, not even for a rough comparison. You'd need actual benchmarks to compare them, since different CPUs perform very different amounts of work per clock cycle.
THANK YOU! People really have no sense of how powerful the Wii really is, or how powerful the GameCube was. Both systems, when pushed hard enough, can really put out some beautiful sights. With that said, here are rough specs. Take it all with a grain of salt; we all know that raw numbers don't reflect real-world performance.
GPU
PlayStation 2: "Graphics Synthesizer", clocked at 147.456 MHz
GameCube: "Flipper" LSI, 162 MHz (co-developed by Nintendo and ArtX, later acquired by ATI)
Xbox: "NV2A" ASIC, 233 MHz (co-developed by Microsoft and Nvidia)
Wii: ATI "Hollywood" GPU, clocked at 243 MHz

CPU
PlayStation 2: 64-bit "Emotion Engine", clocked at 294.912 MHz (299 MHz on newer versions)
GameCube: IBM "Gekko" PowerPC, 486 MHz
Xbox: custom 32-bit Intel Pentium III, 733 MHz
Wii: PowerPC-based "Broadway" processor, clocked at 729 MHz
Not trying to be rude, but what about Q3? It would be easier to port!?

Probably; it has fewer lines of code to begin with.
Take Steam gifts, then at least someone can send you the full game.

Could be an idea, yes.