Hacking Doom 3 Source released, chances of a Wii port?

DRS

Member
Newcomer
Joined
Mar 31, 2012
Messages
24
Trophies
0
XP
23
Country
Netherlands
Are you doing your own custom memory management? libogc has this stupid quirk where, once you've dipped into MEM2, any unused memory in MEM1 remains unusable until all of MEM2 is freed again (which isn't likely to happen)... So for example, if MEM1 had 4MB free and you malloc'd 4.5MB, a 4.5MB chunk of MEM2 would be allocated and the 4MB in MEM1 would effectively be thrown away. It's really stupid and could have been avoided if the devkitPro devs had implemented mmap() instead of sbrk().

Yes, I use an lwp heap to back Doom's memory manager. I'm also using an old libogc, I guess, so I just get a bad alloc or a freeze at startup (before main()) when allocating too much. Did you suffer from it with the Descent port?
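
For reference, here's roughly how an lwp heap over MEM2 can back the engine's allocator so MEM1's free space stays usable (a minimal sketch; Mem2Init/Mem2Alloc/Mem2Free are simplified names, not my exact code):

```cpp
// Sketch: back Doom's allocator with an explicit MEM2 heap so MEM1's
// free space stays usable (sidesteps the sbrk() fallback quirk).
#include <ogc/lwp_heap.h>
#include <ogc/system.h>

static heap_cntrl g_mem2Heap;  // lwp heap control block covering MEM2

void Mem2Init() {
    void *lo  = SYS_GetArena2Lo();            // start of free MEM2
    void *hi  = SYS_GetArena2Hi();            // end of free MEM2
    u32 size  = (u32)((u8*)hi - (u8*)lo);
    SYS_SetArena2Lo(hi);                      // claim it so libogc won't touch it
    __lwp_heap_init(&g_mem2Heap, lo, size, 32);  // 32-byte pages (cache line)
}

void *Mem2Alloc(u32 size) { return __lwp_heap_allocate(&g_mem2Heap, size); }
void  Mem2Free(void *ptr) { __lwp_heap_free(&g_mem2Heap, ptr); }
```

Big, long-lived allocations can then be routed through this heap while small, hot ones stay on MEM1's normal malloc().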

Small update:
I dropped about 10-12 rendering classes (code) into the soup now. I must say that GL drawing is deferred quite well; I still haven't run into actual GL drawing yet, only some GL vertex buffer allocations. I again saved a few megs by ditching the file that contains SuperOptimizeOccluders. It allocates some large global arrays but is used only for offline rendering; hopefully only when using the editor and not while loading a map. But even without that, we only have 2MB of SRAM left now when initialized. Unbelievable, it eats and eats and eats! I need to move more data to DRAM, but it takes a lot of time to trace the exact allocations that matter. I was also thinking that perhaps the decls don't need to be read during initialization. If I'm right, it uses a 'find, or load otherwise' code design, so I might add an LFU pattern or something like the sketch below. But these already go into DRAM, so it won't save much for allowing more code.
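
Roughly what I have in mind for that LFU pattern on top of the find-or-load design (just a sketch; idDecl, LoadDeclFromDisk and the budget are stand-ins, not the real engine interfaces):

```cpp
// Sketch of a "find, or load otherwise" cache with LFU eviction.
// idDecl here is a stand-in for whatever the decl manager hands out.
#include <map>
#include <string>

struct idDecl;                                        // opaque; loaded on demand
idDecl *LoadDeclFromDisk(const std::string &name);    // hypothetical loader
void    FreeDecl(idDecl *decl);                       // hypothetical release

struct CachedDecl { idDecl *decl; unsigned hits; };

class DeclCache {
    std::map<std::string, CachedDecl> cache;
    size_t budget;                        // max number of resident decls
public:
    explicit DeclCache(size_t maxDecls) : budget(maxDecls) {}

    idDecl *Find(const std::string &name) {
        auto it = cache.find(name);
        if (it != cache.end()) {          // found: bump frequency and reuse
            ++it->second.hits;
            return it->second.decl;
        }
        if (cache.size() >= budget)       // full: evict least-frequently-used
            Evict();
        CachedDecl entry = { LoadDeclFromDisk(name), 1 };
        cache[name] = entry;
        return entry.decl;
    }
private:
    void Evict() {
        auto victim = cache.begin();
        for (auto it = cache.begin(); it != cache.end(); ++it)
            if (it->second.hits < victim->second.hits)
                victim = it;
        FreeDecl(victim->second.decl);
        cache.erase(victim);
    }
};
```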

BTW I estimate this at 1200 hours of work or so, and perhaps some timedemo playing without any shaders in 600-800 hours. Unless it runs out of memory before then, that is :)
 

snakepliskin2334

Well-Known Member
Member
Joined
Mar 25, 2012
Messages
226
Trophies
0
Age
36
XP
234
Country
United States
Loaded the textures that come with initialization; 20.7MB of DRAM left. It is getting challenging now :) Hopefully Doom adds these to the image cache so that they can be destroyed and reloaded as needed... It's not necessary to keep stuff such as the splash screen and menu windows in memory all the time. Otherwise, creating that will be the first thing to do after I get the client/server thing done; something like the sketch below. But I'm going to try displaying the loaded textures first; I think it is the splash screen and/or logo... Perhaps the console too. That stuff is still mocked so I don't really know what I'll get to see :)
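
Something like a purgeable image handle is what I mean (a sketch only; WiiImage and its helpers are made-up names, not Doom's actual image class):

```cpp
// Sketch: an image that can drop its texel data and reload it from disk
// on demand, so splash/menu textures don't pin DRAM for the whole session.
#include <cstdio>
#include <cstdlib>
#include <string>

struct WiiImage {
    std::string path;        // source file, kept so we can reload after a purge
    void *texels = nullptr;  // texel data, null while purged
    long  size   = 0;

    void Bind() {            // call right before drawing with this image
        if (!texels) Reload();
        // ...hand texels to GX here (GX_InitTexObj etc.)...
    }
    void Purge() {           // call when memory gets tight
        std::free(texels);
        texels = nullptr;
    }
private:
    void Reload() {          // transparently page the texels back in
        FILE *f = std::fopen(path.c_str(), "rb");
        if (!f) return;
        std::fseek(f, 0, SEEK_END);
        size = std::ftell(f);
        std::fseek(f, 0, SEEK_SET);
        texels = std::malloc(size);
        std::fread(texels, 1, size, f);
        std::fclose(f);
    }
};
```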

I limit texture formats to I4 (4-bit-per-pixel intensity maps), IA4 (2x4 bits per pixel, for 2D normal maps), RGB565 (16-bit color maps) and RGB5A3 (16-bit color+alpha). I'm also going to use the supplied DDS files (DXT compression), as the Wii supports DXT1, but these aren't being loaded yet.
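
The format pick could end up as simple as this (a sketch against the GX texture format constants; the ImageTraits struct is something I made up for illustration):

```cpp
#include <ogc/gx.h>

// Hypothetical description of a source image; not an engine type.
struct ImageTraits {
    bool intensityOnly;   // greyscale, e.g. light falloff maps
    bool normalMap;       // two-channel 2D normal map
    bool hasAlpha;        // needs an alpha channel
    bool dxtCompressed;   // came in as a DDS/DXT1 file
};

// Map an image onto the limited set of GX texture formats listed above.
static u8 PickGXFormat(const ImageTraits &t) {
    if (t.dxtCompressed) return GX_TF_CMPR;    // Wii's DXT1-style format
    if (t.intensityOnly) return GX_TF_I4;      // 4bpp intensity
    if (t.normalMap)     return GX_TF_IA4;     // 2x4bpp, two channels
    if (t.hasAlpha)      return GX_TF_RGB5A3;  // 16bpp color + alpha
    return GX_TF_RGB565;                       // 16bpp opaque color
}
```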

See, I would wait until the Wii U comes out and maybe port it to that, since it's a little bit more powerful than the first Wii. But like I said, good luck to ya (:
 

DRS

Member
Newcomer
Joined
Mar 31, 2012
Messages
24
Trophies
0
XP
23
Country
Netherlands


Turns out that Doom uses a unified renderer to render both the 2D and 3D stuff; I expected somewhat simpler code for 2D. Anyway, I ended up creating some lightweight GL methods so that GX is set up in the same order as GL-based cards are, and I didn't have to change much code by doing that. Setting the vertex arrays, the TEV 'shader', etc. is done in GX directly. I only support single-textured polygons for now since that's okay for the current goals.
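
The shim amounts to something like this (a simplified sketch of the idea; real state setup involves more than shown, and the Shim* names are mine):

```cpp
#include <ogc/gx.h>

// Minimal immediate-mode shim: GL-ish entry points that feed GX directly.
// Only position + one texcoord, single texture, as described above.
static void ShimSetupVertexFormat() {
    GX_ClearVtxDesc();
    GX_SetVtxDesc(GX_VA_POS,  GX_DIRECT);
    GX_SetVtxDesc(GX_VA_TEX0, GX_DIRECT);
    GX_SetVtxAttrFmt(GX_VTXFMT0, GX_VA_POS,  GX_POS_XYZ, GX_F32, 0);
    GX_SetVtxAttrFmt(GX_VTXFMT0, GX_VA_TEX0, GX_TEX_ST,  GX_F32, 0);
}

// Draw one single-textured triangle fan: GL-style arrays in, GX out.
static void ShimDrawTriFan(const float *xyz, const float *st, int count) {
    GX_Begin(GX_TRIANGLEFAN, GX_VTXFMT0, count);
    for (int i = 0; i < count; ++i) {
        GX_Position3f32(xyz[i*3+0], xyz[i*3+1], xyz[i*3+2]);
        GX_TexCoord2f32(st[i*2+0], st[i*2+1]);
    }
    GX_End();
}
```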

It doesn't work 100% correctly yet, though; the triangular stuff should be lines of text, I guess. I never played Doom 3 and the demo doesn't work on my Vista machine, so I really don't know what it should say!
 

the_randomizer

The Temp's official fox whisperer
Member
Joined
Apr 29, 2011
Messages
31,284
Trophies
2
Age
38
Location
Dr. Wahwee's castle
XP
18,969
Country
United States
this will never run on the wii :lol:


Well, the Wii hardware is souped-up GameCube hardware, which was very close to the Xbox in terms of power. Its GPU and CPU are more capable than most developers give them credit for. Would it have to be toned down? Most likely, but at the same time it would probably still look pretty good. The Wii/GameCube hardware is more powerful than most think.
 
  • Like
Reactions: 1 person

DeadlyFoez

XFlak Fanboy
Banned
Joined
Apr 12, 2009
Messages
5,920
Trophies
0
Website
DeadlyFoez.zzl.org
XP
2,875
Country
United States
Great job DRS. I can't wait to see you pull this off. I was never much of a Doom fan, but when you release this, I will certainly play it. Keep up the good work.

BTW, do you have a donation link? I'm sure many people (including myself) would love to donate to you just out of respect for your hard work.
 
  • Like
Reactions: 1 person

LightyKD

Future CEO of OUYA Inc.
Member
Joined
Jun 25, 2008
Messages
5,545
Trophies
2
Age
38
Location
Angel Grove, CA
XP
5,355
Country
United States
this will never run on the wii :lol:


Well, the Wii hardware is souped-up GameCube hardware, which was very close to the Xbox in terms of power. Its GPU and CPU are more capable than most developers give them credit for. Would it have to be toned down? Most likely, but at the same time it would probably still look pretty good. The Wii/GameCube hardware is more powerful than most think.


THANK YOU! People really have no sense of how powerful the Wii and the GameCube really are. Both systems, when pushed hard enough, can really push out some beautiful sights. With that said, here are rough specs. Take it all with a grain of salt; we all know that raw numbers don't reflect real-time performance.

GPU

PlayStation 2: "Graphics Synthesizer" clocked at 147.456 MHz
GameCube: 162 MHz "Flipper" LSI (co-developed by Nintendo and ArtX, later acquired by ATI)
Xbox: 233 MHz "NV2A" ASIC, co-developed by Microsoft and Nvidia
Wii: ATI "Hollywood" GPU clocked at 243 MHz

CPU

PlayStation 2: 64-bit "Emotion Engine" clocked at 294.912 MHz (299 MHz on newer versions)
GameCube: 486 MHz IBM "Gekko" PowerPC
Xbox: 32-bit 733 MHz custom Intel Pentium III
Wii: PowerPC-based "Broadway" processor clocked at 729 MHz
 
  • Like
Reactions: 1 person

DRS

Member
Newcomer
Joined
Mar 31, 2012
Messages
24
Trophies
0
XP
23
Country
Netherlands
A bit better already: it says initializing and then loading.



But then, after a while, it shows a new message with corrupted characters; at least no weird triangles anymore. That issue was because I didn't flush the CPU's data cache to actually move the vertices out to memory where the GPU can fetch them. Perhaps it still isn't completely right yet.
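
For anyone hitting the same thing: the CPU writes vertices through its data cache, so GX can fetch stale memory unless you flush. Something like this (a sketch; the buffer and function names are mine):

```cpp
#include <malloc.h>
#include <ogc/cache.h>
#include <ogc/gx.h>

// Vertex data the GPU will read should be 32-byte aligned (cache line
// size) and must be flushed out of the CPU data cache before drawing.
static float *verts = (float*)memalign(32, 1024 * sizeof(float));

void FinishVertexUpload(u32 bytesWritten) {
    DCFlushRange(verts, bytesWritten);  // push cached writes to real memory
    GX_InvVtxCache();                   // drop any stale data GX has cached
}
```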

BTW, do you have a donation link? I'm sure many people (including myself) would love to donate to you just out of respect for your hard work.

No, but in the real world my rate is between 60 and 70 EUR an hour, so you can start saving :) Haha, no really, I can't take any donations while I'm not certain whether this will succeed, and I wouldn't donate myself until the timedemo is running at least.

People really have no sense of how powerful the Wii and the GameCube really are.

Well, it lacks enough capabilities that it's not easy to get convinced of such a thing :)
 
  • Like
Reactions: 2 people

DeadlyFoez

XFlak Fanboy
Banned
Joined
Apr 12, 2009
Messages
5,920
Trophies
0
Website
DeadlyFoez.zzl.org
XP
2,875
Country
United States
No, but in the real world my rate is between 60 and 70 EUR an hour, so you can start saving :) Haha, no really, I can't take any donations while I'm not certain whether this will succeed, and I wouldn't donate myself until the timedemo is running at least.
I understand. I have faith that you can make this happen. Even if your efforts fail, I would at least like to donate a little bit of something towards your effort, even if it is only beer money.

Thank you for what you are doing.
 

Minox

Thanks for the fish
Former Staff
Joined
Aug 27, 2007
Messages
6,995
Trophies
2
XP
6,155
Country
Japan
Comparing clock frequencies of different CPU/GPU architectures makes no sense at all, not even for a rough comparison. You'd need actual benchmarks to compare them, since different CPUs do very different amounts of work per clock cycle.
 

LightyKD

Future CEO of OUYA Inc.
Member
Joined
Jun 25, 2008
Messages
5,545
Trophies
2
Age
38
Location
Angel Grove, CA
XP
5,355
Country
United States
Comparing clock frequencies of different CPU/GPU architectures makes no sense at all, not even for a rough comparison. You'd need actual benchmarks to compare them, since different CPUs do very different amounts of work per clock cycle.

I agree, which is why I wanted people to take that post with a grain of salt. We all know that numbers don't begin to show real-world ability.
 

gbatenp

Member
Newcomer
Joined
Nov 8, 2010
Messages
14
Trophies
0
XP
61
Country
United States
Don't mean to be rude, but what about Q3? It would be easier to port!?

Do it or not, we always have this: :P

http://www.youtube.com/watch?v=sfmmzobUVFA
 
  • Like
Reactions: 1 person

DRS

Member
Newcomer
Joined
Mar 31, 2012
Messages
24
Trophies
0
XP
23
Country
Netherlands
dont try to be rude but what about Q3 it would be easier to port!?
Probably; it has fewer lines of code to begin with.

And another step: this pic is actually served by the main loop. So that's good news; init is done and the main loop is running. As I write this it is still running; sometimes it switches to a corrupted screen again, but then back to this image. Have to look into that. I think the menu should pop up at some point, but most probably it isn't doing anything right now (i.e. mocked code only logs, it doesn't execute, and therefore doesn't advance the game). I currently log all the vertex calls, which makes things slow because it creates tons of text.
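
The tracing itself is just printf-style logging behind a switch; something like this sketch (VERTEX_TRACE is a made-up flag, not necessarily how I have it wired up):

```cpp
// Sketch: compile-time gate for the vertex-call tracing so normal runs
// don't drown in text. VERTEX_TRACE is a hypothetical switch.
#include <cstdio>

#define VERTEX_TRACE 1

#if VERTEX_TRACE
#define TRACE_VTX(...) std::printf(__VA_ARGS__)
#else
#define TRACE_VTX(...) ((void)0)
#endif

// usage: TRACE_VTX("vtx %f %f %f\n", x, y, z);
```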



I need to switch to faster media as well; the SD card I'm using is way too slow, and it isn't really feasible to work with if game startup already takes more than 10 minutes :) I started this run at 00:30 and it's 01:00 now, haha :)
 

DRS

Member
Newcomer
Joined
Mar 31, 2012
Messages
24
Trophies
0
XP
23
Country
Netherlands
Sooo, fixed a few things, went on, and found that the 2D rendering is kind of a struggle. Doom basically has a software(!) shader for 2D that allows materials to blend or mask textures and the framebuffer, using blend factors (alpha) supplied by either the texture or the framebuffer. All of this is configured and influenced by running scripts. The DOOM logo, for example, blends 4 rects on top of each other:
- an alpha mask to exclude everything outside the logo;
- followed by the DOOM 3 color texture, whose letters are drawn with a low alpha value;
- then blended with two animated effect textures using the previous stage's alpha as written to the framebuffer.

So the alpha mask was drawn just to make the effect texture display within the logo's bounds, and the low alpha within the logo was drawn just to keep the effect from overwriting the colors. The Wii (and any other GPU that can blend 4 or more textures per pass) can do such a thing in a single polygon pass, so I'll optimize it in the future.
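
In GX terms, the single-pass version would look roughly like this (a sketch; the exact TEV color/alpha equations for the logo effect would still need tuning):

```cpp
#include <ogc/gx.h>

// Sketch: collapse the four 2D blend passes into one polygon pass by
// chaining TEV stages. Texmaps 0-3 hold the mask, the color texture,
// and the two effect textures; the combiner modes here are a rough cut.
static void SetupLogoTev() {
    GX_SetNumTexGens(4);
    GX_SetNumTevStages(4);

    // stage 0: alpha mask
    GX_SetTevOrder(GX_TEVSTAGE0, GX_TEXCOORD0, GX_TEXMAP0, GX_COLOR0A0);
    GX_SetTevOp(GX_TEVSTAGE0, GX_REPLACE);
    // stage 1: DOOM 3 color texture, modulated with the mask
    GX_SetTevOrder(GX_TEVSTAGE1, GX_TEXCOORD1, GX_TEXMAP1, GX_COLOR0A0);
    GX_SetTevOp(GX_TEVSTAGE1, GX_MODULATE);
    // stages 2+3: animated effect textures blended on top
    GX_SetTevOrder(GX_TEVSTAGE2, GX_TEXCOORD2, GX_TEXMAP2, GX_COLOR0A0);
    GX_SetTevOp(GX_TEVSTAGE2, GX_BLEND);
    GX_SetTevOrder(GX_TEVSTAGE3, GX_TEXCOORD3, GX_TEXMAP3, GX_COLOR0A0);
    GX_SetTevOp(GX_TEVSTAGE3, GX_BLEND);
}
```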



Also, the screen nicely fades in and out now :)

But I still have a few bugs; it's not completely right yet: missing button text, some button corners overdrawn with black (an alpha issue!), and it only shows Mars' highlight texture, not its 3D body. The trace says the latter IS drawn though, so it's either again an alpha issue or a 3D drawing issue. With a bit of luck, a bump-mapped rotating Mars in a few weeks :)
 
  • Like
Reactions: 3 people
