Hacking Wii Emulation resolution

MultipedBeatle

Active Member
OP
Newcomer
Joined
Oct 25, 2016
Messages
37
Trophies
0
XP
534
Country
United States
This question has probably been asked before, but is there anything you can do on the Wii U about the vWii emulation resolution? Obviously, there are hardware limitations but if GameCube games ran full resolution on the original Wii I don't see why the Wii U couldn't run Wii games without that black border around the whole screen and maybe a resolution bump. I apologize if it's been asked before, but I just started a Galaxy 2 playthrough and it's been on my mind for a bit!
 

driverdis

I am Justice
Member
Joined
Sep 21, 2011
Messages
2,867
Trophies
2
Age
31
Location
1.048596β
XP
2,838
Country
United States
This question has probably been asked before, but is there anything you can do on the Wii U about the vWii emulation resolution? Obviously, there are hardware limitations but if GameCube games ran full resolution on the original Wii I don't see why the Wii U couldn't run Wii games without that black border around the whole screen and maybe a resolution bump. I apologize if it's been asked before, but I just started a Galaxy 2 playthrough and it's been on my mind for a bit!

The reason for this is that vWii is not emulated. vWii runs on the Wii U hardware in a special mode where the clock speed and core count are reduced, and the GPU is crippled.

Nintendont and Devolution run GameCube games in Wii mode, which is how they benefit from the faster clock speeds of the Wii. Something similar would need to happen for Wii games to benefit from the performance of the Wii U.

Someone would need to write the equivalent of Nintendont or Devolution for Wii and/or GameCube games that runs those games in Wii U mode instead of vWii. That would allow full GPU access and faster-clocked CPU support for Wii games, provided the work was done to let those games properly use features like higher resolutions or clock speeds without crashing, major bugs, or timing issues.

If anyone was up for such a crazy task, it would be the creator of Devolution, Tueidj, who put forth the first effort to run GameCube games in Wii mode, a concept that sounded insane at first and is now reality thanks to him. Unfortunately, he does not seem to be interested in projects like this anymore, and the homebrew community and other devs burned enough bridges with him over his practice of useless anti-piracy measures on old games and not open-sourcing everything. This is why Devolution is rarely used over Nintendont nowadays, and without the source we won't really know what kind of crazy coding went into Devolution, and it won't help to improve Nintendont.
 
D

Deleted User

Guest
Probably the only way to get improved-resolution Wii games is Dolphin on a PC, or an original Wii with component cables.

But I have Galaxy 2 on my Wii U and there's no black borders.
 

MultipedBeatle

Active Member
OP
Newcomer
Joined
Oct 25, 2016
Messages
37
Trophies
0
XP
534
Country
United States
The reason for this is that vWii is not emulated. vWii runs on the Wii U hardware in a special mode where the clock speed and core count are reduced, and the GPU is crippled.

Nintendont and Devolution run GameCube games in Wii mode, which is how they benefit from the faster clock speeds of the Wii. Something similar would need to happen for Wii games to benefit from the performance of the Wii U.

Someone would need to write the equivalent of Nintendont or Devolution for Wii and/or GameCube games that runs those games in Wii U mode instead of vWii. That would allow full GPU access and faster-clocked CPU support for Wii games, provided the work was done to let those games properly use features like higher resolutions or clock speeds without crashing, major bugs, or timing issues.

If anyone was up for such a crazy task, it would be the creator of Devolution, Tueidj, who put forth the first effort to run GameCube games in Wii mode, a concept that sounded insane at first and is now reality thanks to him. Unfortunately, he does not seem to be interested in projects like this anymore, and the homebrew community and other devs burned enough bridges with him over his practice of useless anti-piracy measures on old games and not open-sourcing everything. This is why Devolution is rarely used over Nintendont nowadays, and without the source we won't really know what kind of crazy coding went into Devolution, and it won't help to improve Nintendont.
I feel like if the community came together to start something like this, it would make some feel more inclined to buy a Wii U to play all the games from the GameCube and up through the Wii U. I'd do something but I've got no knowledge on the workings of the console or how to code at all. It makes me wonder if anybody would be willing to step up or kickstart it, because all I could do is present the idea and say "have fun".

--------------------- MERGED ---------------------------

Probably the only way to get improved-resolution Wii games is Dolphin on a PC, or an original Wii with component cables.

But I have Galaxy 2 on my Wii U and there's no black borders.
There's a black letterbox around the screen; I can send a picture when I get the chance. It might be because my Wii U is set to 1080p, or because I'm using the original disc.
 

V10lator

Well-Known Member
Member
Joined
Apr 21, 2019
Messages
2,646
Trophies
1
Age
36
XP
5,513
Country
Germany
Nintendont and Devolution run GameCube games in Wii mode, which is how they benefit from the faster clock speeds of the Wii. Something similar would need to happen for Wii games to benefit from the performance of the Wii U.
https://github.com/FIX94/c2w_patcher - "unlock the wii PPC clock multiplier to its wiiu speed, giving you 1.215ghz in wii mode"
https://github.com/FIX94/sign_c2w_patcher - "redirect cafe2wii to be loaded from any wii vc title you boot up, this allows for easy cafe2wii patches"
 
Last edited by V10lator,
  • Like
Reactions: XDeltaOne

driverdis

I am Justice
Member
Joined
Sep 21, 2011
Messages
2,867
Trophies
2
Age
31
Location
1.048596β
XP
2,838
Country
United States
https://github.com/FIX94/c2w_patcher - "unlock the wii PPC clock multiplier to its wiiu speed, giving you 1.215ghz in wii mode"
https://github.com/FIX94/sign_c2w_patcher - "redirect cafe2wii to be loaded from any wii vc title you boot up, this allows for easy cafe2wii patches"

While the processor's PPC and ARM speeds can be set to Wii U levels, the GPU is still not operating in Wii U mode, and this is primarily what will prevent higher-resolution rendering of Wii games.

This will not always be true, as seen on the Xbox, where a few games could run at 720p with extra RAM and/or the 1.4GHz mod without touching the GPU.
 

pedro702

Well-Known Member
Member
Joined
Mar 3, 2014
Messages
12,722
Trophies
2
Age
33
XP
8,706
Country
Portugal
While the processor's PPC and ARM speeds can be set to Wii U levels, the GPU is still not operating in Wii U mode, and this is primarily what will prevent higher-resolution rendering of Wii games.

This will not always be true, as seen on the Xbox, where a few games could run at 720p with extra RAM and/or the 1.4GHz mod without touching the GPU.
This is completely wrong...

The vWii doesn't run on a crippled GPU...

While the CPU is underclocked, and we can overclock it to Wii U speed,

the Wii U itself has 2 GPUs, the Wii GPU and the Wii U GPU, and vWii runs on the Wii GPU, which is the exact same GPU as in a regular Wii. The Wii GPU's maximum output is 480p and nothing can ever make it do 720p... The Xbox was capable of going to 1080i in its GPU settings for some games, while the Wii was not.
 
  • Like
Reactions: N7Kopper

V10lator

Well-Known Member
Member
Joined
Apr 21, 2019
Messages
2,646
Trophies
1
Age
36
XP
5,513
Country
Germany
the Wii U itself has 2 GPUs
Please tell me you're not repeating what TheChosen said but have a source (link) for that statement. This is a legit request, as to my knowledge the Wii U has exactly one GPU, called Latte, but that troll TheChosen spread a lot of false information before finally leaving GBATemp with the mission to destroy it via YouTube...
Wait, you're a Nintendont dev? Then I believe you without a source. Anyway, getting one would still be nice, as I would love to read more about the technical details.

the GPU is still not operating in Wii U mode and this is primarily what will prevent higher-resolution rendering of Wii games
That sounds technically more reasonable than having two GPUs (especially as Nintendo is known to build as cheaply as possible; a second GPU is way more expensive than software crippling, especially as the Wii and Wii U GPUs are manufactured for Nintendo only).
Now to what @driverdis said: well, the cafe2wii patch shows it is possible to manipulate cafe2wii, which should be responsible for all the hardware crippling. So it should be possible to extend it for that job.
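To picture what such a patcher does at its core: find a byte pattern in a binary and overwrite it. A purely hypothetical, host-side sketch of that general technique (the function name, pattern, and replacement are all made up; the real c2w_patcher targets specific instructions inside the cafe2wii binary, which I'm not claiming to know):

```c
#include <stddef.h>
#include <stdint.h>
#include <string.h>

/* Searches buf for the first occurrence of pattern and overwrites it
 * with replacement (same length). Returns the patch offset, or -1 if
 * the pattern wasn't found. */
ptrdiff_t patch_bytes(uint8_t *buf, size_t buf_len,
                      const uint8_t *pattern, const uint8_t *replacement,
                      size_t pat_len)
{
    if (pat_len == 0 || pat_len > buf_len)
        return -1;

    for (size_t i = 0; i + pat_len <= buf_len; i++)
    {
        if (memcmp(buf + i, pattern, pat_len) == 0)
        {
            memcpy(buf + i, replacement, pat_len);
            return (ptrdiff_t)i;
        }
    }
    return -1;
}
```

A real patch would of course need to know exactly which instructions to change and what side effects that has, which is the actual hard part.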


Does it also work when I start the vWii channel?
To my knowledge it's for Wii VC, but to be honest I'm not sure about that. Also not sure which more experienced Wii U developer to summon to get an answer to that question. @pedro702, do you know?

//EDIT: Also @pedro702, would it theoretically be possible to write a GX to GX2 wrapper and inject it into cafe2wii to get Wii games to use the Wii U GPU? Keep in mind it's a theoretical question, I'm not demanding an implementation.

//EDIT²: I read a bit on wiiubrew, but they have only very thin information about that. Also, they call GX and GX2 hardware, but I'm talking about the software APIs when asking if a wrapper could be possible.

//EDIT³: @pedro702, sorry for all these questions, but you seem to be the one with the most experience on this topic, so did I get this right?
- Latte contains the ARM CPU as well as two GPUs, called GX and GX2.
- Nintendo calls a GPU GX, and it also calls the Wii's graphics API GX.
- Nintendo calls a GPU GX2, and it also calls the Wii U's graphics API GX2.
If so, I would suggest calling the APIs SGX (Software GX) / SGX2 and the chips HGX (Hardware GX) / HGX2 when continuing this conversation, to prevent confusion.
 
Last edited by V10lator,

pedro702

Well-Known Member
Member
Joined
Mar 3, 2014
Messages
12,722
Trophies
2
Age
33
XP
8,706
Country
Portugal
Please tell me you're not repeating what TheChosen said but have a source (link) for that statement. This is a legit request, as to my knowledge the Wii U has exactly one GPU, called Latte, but that troll TheChosen spread a lot of false information before finally leaving GBATemp with the mission to destroy it via YouTube...
Wait, you're a Nintendont dev? Then I believe you without a source. Anyway, getting one would still be nice, as I would love to read more about the technical details.


That sounds technically more reasonable than having two GPUs (especially as Nintendo is known to build as cheaply as possible; a second GPU is way more expensive than software crippling, especially as the Wii and Wii U GPUs are manufactured for Nintendo only).
Now to what @driverdis said: well, the cafe2wii patch shows it is possible to manipulate cafe2wii, which should be responsible for all the hardware crippling. So it should be possible to extend it for that job.



To my knowledge it's for Wii VC, but to be honest I'm not sure about that. Also not sure which more experienced Wii U developer to summon to get an answer to that question. @pedro702, do you know?

//EDIT: Also @pedro702, would it theoretically be possible to write a GX to GX2 wrapper and inject it into cafe2wii to get Wii games to use the Wii U GPU? Keep in mind it's a theoretical question, I'm not demanding an implementation.

//EDIT²: I read a bit on wiiubrew, but they have only very thin information about that. Also, they call GX and GX2 hardware, but I'm talking about the software APIs when asking if a wrapper could be possible.

//EDIT³: @pedro702, sorry for all these questions, but you seem to be the one with the most experience on this topic, so did I get this right?
- Latte contains the ARM CPU as well as two GPUs, called GX and GX2.
- Nintendo calls a GPU GX, and it also calls the Wii's graphics API GX.
- Nintendo calls a GPU GX2, and it also calls the Wii U's graphics API GX2.
If so, I would suggest calling the APIs SGX (Software GX) / SGX2 and the chips HGX (Hardware GX) / HGX2 when continuing this conversation, to prevent confusion.
Just because they are named GX and GX2 doesn't mean they are similar in language; GX2 is very different, and most of it is undocumented, since barely anyone did any GX2 work on the Wii U, sadly.

From what FIX94 said back in the day, GX2 is very, very different from GX, so a wrapper, while not impossible, would be extremely hard work, if it would ever work at all.

Nintendont works because the Wii GPU is the exact same GPU the original GC had, the GX one; there are literally only minor differences in cache and such, but the language is exactly the same. The CPU was the only difference, and that is what Nintendont adapted for, making the correct calls because there were some different, newer instructions and so on.

So yeah, the Wii U has one CPU, which is downclocked for vWii (we have been able to overclock it with vWii injects to get the regular Wii U CPU speed for quite a while).

And 2 GPUs: the original Wii/GC GPU, called GX, and GX2. They are actually hardware, not software; it actually has 2 GPUs inside.
 
  • Like
Reactions: V10lator

V10lator

Well-Known Member
Member
Joined
Apr 21, 2019
Messages
2,646
Trophies
1
Age
36
XP
5,513
Country
Germany
Thanks for the info. It seems SGX uses a FIFO between the CPU and HGX. I guess one could implement such a queue on the CPU only when writing an SGX to SGX2 wrapper. The worker thread would then execute the commands via GX2. Will need to read more about SGX to fully understand it, though.
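To make that queue idea a bit more concrete, here is a minimal host-side sketch of a CPU-only command FIFO (everything here is hypothetical, nothing Wii U-specific; the CMD_* values just stand in for real GX commands):

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical command IDs standing in for real GX operations. */
enum { CMD_NOP, CMD_DRAW, CMD_FLUSH };

#define FIFO_CAPACITY 64

typedef struct {
    uint32_t cmds[FIFO_CAPACITY];
    size_t read, write; /* monotonically increasing; wrapped on access */
} CmdFifo;

/* CPU side: append a command. Returns -1 when full (a real wrapper
 * would stall here, similar to GX hitting its high watermark). */
int fifo_push(CmdFifo *f, uint32_t cmd)
{
    if (f->write - f->read == FIFO_CAPACITY)
        return -1;
    f->cmds[f->write % FIFO_CAPACITY] = cmd;
    f->write++;
    return 0;
}

/* Worker side: take the oldest command. In the wrapper, this is where
 * the matching GX2 call would be issued. Returns -1 when empty. */
int fifo_pop(CmdFifo *f, uint32_t *cmd_out)
{
    if (f->read == f->write)
        return -1;
    *cmd_out = f->cmds[f->read % FIFO_CAPACITY];
    f->read++;
    return 0;
}
```

On the real console the pop side would live on a worker thread, so the push side would additionally need the usual locking/semaphore plumbing.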

//EDIT:
most of it is undocumented, since barely anyone did any GX2 work on the Wii U, sadly.
I'm currently looking into the libgui code, because I'm a bit familiar with it. LibRetro seems to make a lot of use of GX2, but their doxygen is a nightmare (I found a damn good API doc for SGX: https://bot.libretro.com/doxygen/a06329.html#af02173c9ee890c6d481c14a6f16184af - still looking for one for SGX2 - @QuarkTheAwesome, do you by any chance have a link at hand? :)).

//EDIT²:
a wrapper, while not impossible, would be extremely hard work, if it would ever work at all

I'm still in my early explorations, but it seems the hardest part will be to implement the FIFO queue. Then one would need to implement the non-FIFO commands as wrappers over the queue commands (like GX_Flush sending a flush command into the queue, then waiting for it to be completed; the worker thread simply calls GX2Flush()). Also, yeah, things like initialisation seem to be handled differently, but that shouldn't be a big deal (have a look at my early, anything-but-finished GXFifoObj:
Code:
typedef struct
{
    // GX stuff
    void *fifo;
    size_t size;
    uint32_t high;
    uint32_t low;
    void *read_ptr;
    void *write_ptr;
 
    //GX2 stuff
    void *gx2_cmd_buffer;
    GX2ColorBuffer gx2_color_buffer;
} GXFifoObj;
As well as this early, also unfinished GX_Init():
Code:
MEMHeapHandle mem1_heap = NULL;
MEMHeapHandle bucket_heap = NULL;
void *VMEM_alloc(size_t size, uint32_t align)
{
    // align must be a power of two for the header trick below to work
    if (align < 4)
        align = 4;
  
    // Over-allocate by one alignment step so a small header fits in
    // front of the returned pointer without breaking its alignment.
    size += align;
    uint8_t area;
    uint8_t *base = (uint8_t *)MEMAllocFromExpHeapEx(mem1_heap, size, align);
    if(base == NULL)
    {
        // Let's use the bucket just like MEM1
        base = (uint8_t *)MEMAllocFromExpHeapEx(bucket_heap, size, align);
        if(base == NULL)
        {
            // Last chance to get memory: use slow MEM2
            base = (uint8_t *)MEMAllocFromDefaultHeapEx(size, align);
            if(base == NULL)
                return NULL;
          
            area = 2;
        }
        else
            area = 1;
    }
    else
        area = 0;
  
    uint8_t *ret = base + align;
    ret[-1] = area;                          // which heap the block came from
    ret[-2] = (uint8_t)__builtin_ctz(align); // log2(align), to recover base later
    return ret;
}
void VMEM_free(void *ptr)
{
    if(ptr == NULL)
        return;
  
    uint8_t *p = (uint8_t *)ptr;
    uint8_t *base = p - (1u << p[-2]);
    switch(p[-1])
    {
        case 0:
            MEMFreeToExpHeap(mem1_heap, base);
            return;
        case 1:
            MEMFreeToExpHeap(bucket_heap, base);
            return;
        default:
            MEMFreeToDefaultHeap(base);
    }
}
static inline void release_vmem()
{
    if(mem1_heap != NULL)
    {
        MEMDestroyExpHeap(mem1_heap);
        MEMFreeToFrmHeap(MEMGetBaseHeapHandle(MEMORY_ARENA_1), 3);
        mem1_heap = NULL;
    }
    if(bucket_heap != NULL)
    {
        MEMDestroyExpHeap(bucket_heap);
        MEMFreeToFrmHeap(MEMGetBaseHeapHandle(MEMORY_ARENA_FG_BUCKET), 3);
        bucket_heap = NULL;
    }
}
// File-scope state used below (this sketch keeps it global, like libgui does)
static void *tvScanBuffer, *drcScanBuffer;
static GX2ColorBuffer tvColorBuffer, drcColorBuffer;
static GX2DepthBuffer tvDepthBuffer, drcDepthBuffer;
static GX2ContextState *tvContextState, *drcContextState;
static GX2Sampler aaSampler;
static GX2Texture tvAaTexture;
static GX2DrcRenderMode drcScanMode = GX2_DRC_RENDER_MODE_SINGLE;
static glm::mat4 projectionMtx, viewMtx;
GXFifoObj *GX_Init(void *fifo, size_t fifo_size)
{
    for(int i = 0; i < 2; i++)
    {
        if((i == 0 && mem1_heap == NULL) || (i == 1 && bucket_heap == NULL))
        {
            MEMHeapHandle heap_handle = MEMGetBaseHeapHandle(i == 0 ? MEMORY_ARENA_1 : MEMORY_ARENA_FG_BUCKET);
            uint32_t allocatable_size = MEMGetAllocatableSizeForFrmHeapEx(heap_handle, 4);
            void *memory = MEMAllocFromFrmHeapEx(heap_handle, allocatable_size, 4);
            if(memory == NULL)
            {
                release_vmem();
                return NULL;
            }
          
            MEMHeapHandle heap = MEMCreateExpHeapEx(memory, allocatable_size, 0);
            if(heap == NULL)
            {
                MEMFreeToFrmHeap(heap_handle, 3);
                release_vmem();
                return NULL;
            }
          
            if(i == 0)
                mem1_heap = heap;
            else
                bucket_heap = heap;
        }
    }
  
    GXFifoObj *ret = VMEM_alloc(sizeof(GXFifoObj), 4);
    if(ret == NULL)
    {
        release_vmem();
        return NULL;
    }
  
    ret->gx2_cmd_buffer = VMEM_alloc(GX2_COMMAND_BUFFER_SIZE, 0x40);
    if(ret->gx2_cmd_buffer == NULL)
    {
        VMEM_free(ret);
        release_vmem();
        return NULL;
    }
  
    ret->fifo = fifo;
    ret->size = fifo_size;
    GX_InitFifoBase(ret, fifo, fifo_size);
  
    uint32_t gx2_init_attributes[9] = {
        GX2_INIT_CMD_BUF_BASE,
        (uint32_t)ret->gx2_cmd_buffer,
        GX2_INIT_CMD_BUF_POOL_SIZE,
        GX2_COMMAND_BUFFER_SIZE,
        GX2_INIT_ARGC,
        0,
        GX2_INIT_ARGV,
        (uint32_t)NULL,
        GX2_INIT_END
    };
    GX2Init(gx2_init_attributes);
  
    uint32_t tvWidth = 854;
    uint32_t tvHeight = 480;
    GX2TVRenderMode tvRenderMode = GX2_TV_RENDER_MODE_WIDE_480P; // vWii content is 480p wide
    uint32_t scanBufferSize;
    uint32_t scaleNeeded;
    GX2CalcTVSize(tvRenderMode, GX2_SURFACE_FORMAT_UNORM_R8_G8_B8_A8, GX2_BUFFERING_MODE_DOUBLE, &scanBufferSize, &scaleNeeded);
    tvScanBuffer = VMEM_alloc(scanBufferSize, GX2_SCAN_BUFFER_ALIGNMENT); // TODO
    GX2Invalidate(GX2_INVALIDATE_MODE_CPU, tvScanBuffer, scanBufferSize);
    GX2SetTVBuffer(tvScanBuffer, scanBufferSize, tvRenderMode, GX2_SURFACE_FORMAT_UNORM_R8_G8_B8_A8, GX2_BUFFERING_MODE_DOUBLE);
  
    GX2CalcDRCSize((GX2DrcRenderMode)drcScanMode, GX2_SURFACE_FORMAT_UNORM_R8_G8_B8_A8, GX2_BUFFERING_MODE_DOUBLE, &scanBufferSize, &scaleNeeded);
    drcScanBuffer = VMEM_alloc(scanBufferSize, GX2_SCAN_BUFFER_ALIGNMENT); // TODO
    GX2Invalidate(GX2_INVALIDATE_MODE_CPU, drcScanBuffer, scanBufferSize);
    GX2SetDRCBuffer(drcScanBuffer, scanBufferSize, (GX2DrcRenderMode)drcScanMode, GX2_SURFACE_FORMAT_UNORM_R8_G8_B8_A8, GX2_BUFFERING_MODE_DOUBLE);
  
    GX2AAMode tvAAMode = GX2_AA_MODE1X;
    GX2AAMode drcAAMode = GX2_AA_MODE4X;
  
    GX2InitColorBuffer(&tvColorBuffer, GX2_SURFACE_DIM_TEXTURE_2D, tvWidth, tvHeight, 1, GX2_SURFACE_FORMAT_UNORM_R8_G8_B8_A8, tvAAMode);
    tvColorBuffer.surface.image = VMEM_alloc(tvColorBuffer.surface.imageSize, tvColorBuffer.surface.alignment);
    GX2Invalidate(GX2_INVALIDATE_MODE_CPU, tvColorBuffer.surface.image, tvColorBuffer.surface.imageSize);
    GX2InitDepthBuffer(&tvDepthBuffer, GX2_SURFACE_DIM_TEXTURE_2D, tvColorBuffer.surface.width, tvColorBuffer.surface.height, 1, GX2_SURFACE_FORMAT_FLOAT_R32, tvAAMode);
    tvDepthBuffer.surface.image = VMEM_alloc(tvDepthBuffer.surface.imageSize, tvDepthBuffer.surface.alignment);
    GX2Invalidate(GX2_INVALIDATE_MODE_CPU, tvDepthBuffer.surface.image, tvDepthBuffer.surface.imageSize);
    // The DRC uses 4x AA here, so its depth buffer can't share the TV's image
    GX2InitDepthBuffer(&drcDepthBuffer, GX2_SURFACE_DIM_TEXTURE_2D, 854, 480, 1, GX2_SURFACE_FORMAT_FLOAT_R32, drcAAMode);
    drcDepthBuffer.surface.image = VMEM_alloc(drcDepthBuffer.surface.imageSize, drcDepthBuffer.surface.alignment);
    GX2Invalidate(GX2_INVALIDATE_MODE_CPU, drcDepthBuffer.surface.image, drcDepthBuffer.surface.imageSize);
  
    uint32_t size, align;
    GX2CalcDepthBufferHiZInfo(&tvDepthBuffer, &size, &align);
    tvDepthBuffer.hiZPtr = VMEM_alloc(size, align);
    GX2Invalidate(GX2_INVALIDATE_MODE_CPU, tvDepthBuffer.hiZPtr, size);
    GX2InitDepthBufferHiZEnable(&tvDepthBuffer, GX2_ENABLE);
  
    GX2InitColorBuffer(&drcColorBuffer, GX2_SURFACE_DIM_TEXTURE_2D, 854, 480, 1, GX2_SURFACE_FORMAT_UNORM_R8_G8_B8_A8, (GX2AAMode)drcAAMode);
    drcColorBuffer.surface.image = VMEM_alloc(drcColorBuffer.surface.imageSize, drcColorBuffer.surface.alignment);
    GX2Invalidate(GX2_INVALIDATE_MODE_CPU, drcColorBuffer.surface.image, drcColorBuffer.surface.imageSize);
  
    GX2CalcDepthBufferHiZInfo(&drcDepthBuffer, &size, &align);
    drcDepthBuffer.hiZPtr = VMEM_alloc(size, align);
    GX2Invalidate(GX2_INVALIDATE_MODE_CPU, drcDepthBuffer.hiZPtr, size);
    GX2InitDepthBufferHiZEnable(&drcDepthBuffer, GX2_ENABLE);
  
    uint32_t auxSize, auxAlign;
    GX2CalcColorBufferAuxInfo(&tvColorBuffer, &auxSize, &auxAlign);
    tvColorBuffer.aaBuffer = VMEM_alloc(auxSize, auxAlign);
    tvColorBuffer.aaSize = auxSize;
    memset(tvColorBuffer.aaBuffer, GX2_AA_BUFFER_CLEAR_VALUE, auxSize);
    GX2Invalidate(GX2_INVALIDATE_MODE_CPU, tvColorBuffer.aaBuffer, auxSize);
  
    GX2CalcColorBufferAuxInfo(&drcColorBuffer, &auxSize, &auxAlign);
    drcColorBuffer.aaBuffer = VMEM_alloc(auxSize, auxAlign);
    drcColorBuffer.aaSize = auxSize;
    memset(drcColorBuffer.aaBuffer, GX2_AA_BUFFER_CLEAR_VALUE, auxSize);
    GX2Invalidate(GX2_INVALIDATE_MODE_CPU, drcColorBuffer.aaBuffer, auxSize);
  
    tvContextState = (GX2ContextState*)VMEM_alloc(sizeof(GX2ContextState), GX2_CONTEXT_STATE_ALIGNMENT);
    GX2SetupContextStateEx(tvContextState, GX2_TRUE);
    GX2SetContextState(tvContextState);
    GX2SetColorBuffer(&tvColorBuffer, GX2_RENDER_TARGET_0);
    GX2SetDepthBuffer(&tvDepthBuffer);
  
    drcContextState = (GX2ContextState*)VMEM_alloc(sizeof(GX2ContextState), GX2_CONTEXT_STATE_ALIGNMENT);
    GX2SetupContextStateEx(drcContextState, GX2_TRUE);
    GX2SetContextState(drcContextState);
    GX2SetColorBuffer(&drcColorBuffer, GX2_RENDER_TARGET_0);
    GX2SetDepthBuffer(&drcDepthBuffer);
  
    GX2SetViewport(0.0f, 0.0f, tvColorBuffer.surface.width, tvColorBuffer.surface.height, 0.0f, 1.0f);
    GX2SetScissor(0, 0, tvColorBuffer.surface.width, tvColorBuffer.surface.height);
  
    projectionMtx = glm::perspective(45.0f, 1.0f, 0.1f, 100.0f);
  
    viewMtx = glm::mat4(1.0f);
    viewMtx = glm::translate(viewMtx, glm::vec3(0.0f, 0.0f, -2.5f));
    viewMtx = glm::rotate(viewMtx, DegToRad(25.0f), glm::vec3(1.0f, 0.0f, 0.0f));
  
    GX2InitSampler(&aaSampler, GX2_TEX_CLAMP_MODE_CLAMP, GX2_TEX_XY_FILTER_MODE_LINEAR);
    GX2InitTexture(&tvAaTexture, tvColorBuffer.surface.width, tvColorBuffer.surface.height, 1, 0, GX2_SURFACE_FORMAT_UNORM_R8_G8_B8_A8, GX2_SURFACE_DIM_TEXTURE_2D, GX2_TILE_MODE_DEFAULT);
    tvAaTexture.surface.image = tvColorBuffer.surface.image;
    tvAaTexture.surface.imageSize = tvColorBuffer.surface.imageSize;
    tvAaTexture.surface.mipmaps = tvColorBuffer.surface.mipmaps;
  
    return ret;
}
void GX_InitFifoBase(GXFifoObj *fifo, void *base, size_t size)
{
    size -= 1;
    GX_InitFifoLimits(fifo, size, size >> 1); // TODO: real GX uses different watermarks
    GX_InitFifoPtrs(fifo, base, base);
}
void GX_InitFifoLimits(GXFifoObj *fifo, uint32_t highwater, uint32_t lowwater)
{
    fifo->high = highwater;
    fifo->low = lowwater;
}
void GX_InitFifoPtrs(GXFifoObj *fifo, void *read, void *write)
{
    fifo->read_ptr = read;
    fifo->write_ptr = write;
}
void GX_GetFifoPtrs(GXFifoObj *fifo, void **read_out, void **write_out)
{
    *read_out = fifo->read_ptr;
    *write_out = fifo->write_ptr;
}
).
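The "GX_Flush as a wrapper over the queue" idea can be modeled host-side like this (all names are made up and GX2Flush() is only mocked by a counter; on the console the wait would block on the worker thread instead of draining the queue inline):

```c
#include <stddef.h>
#include <stdint.h>

enum { CMD_DRAW, CMD_FLUSH };

#define QUEUE_LEN 32

static uint32_t queue[QUEUE_LEN];
static size_t q_read, q_write;
static unsigned gx2_flush_calls; /* stands in for real GX2Flush() calls */

static void enqueue(uint32_t cmd)
{
    queue[q_write % QUEUE_LEN] = cmd;
    q_write++;
}

/* One worker step: execute a single queued command via the mocked GX2 API. */
static int worker_step(void)
{
    if (q_read == q_write)
        return 0;
    uint32_t cmd = queue[q_read % QUEUE_LEN];
    q_read++;
    if (cmd == CMD_FLUSH)
        gx2_flush_calls++; /* real code would call GX2Flush() here */
    /* CMD_DRAW would translate into GX2 draw calls, etc. */
    return 1;
}

/* Wrapped GX_Flush: enqueue the flush, then wait until the worker has
 * executed everything up to and including it. */
void wrapped_GX_Flush(void)
{
    enqueue(CMD_FLUSH);
    size_t target = q_write;
    while (q_read < target)
        worker_step();
}
```

The nice property is that every "synchronous" GX call gets the same shape: enqueue, then wait for the queue position to be consumed.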

//EDIT³: Okay, implementing the queue is a PITA. Anyway, there is an open-source project which should have a queue ready to use: Dolphin: https://github.com/dolphin-emu/dolphin/blob/master/Source/Core/VideoCommon/Fifo.cpp
I guess someone would have to write a GX2 backend for Dolphin, then extract the queue with all the graphics-related stuff and be done. There are developers who are experienced with doing such work, so let's try to summon @GaryOderNichts as well as @AboodXD and hope that my guess is correct and that one of them likes this idea.
 
Last edited by V10lator,

MultipedBeatle

Active Member
OP
Newcomer
Joined
Oct 25, 2016
Messages
37
Trophies
0
XP
534
Country
United States
Oh boy, if this ever became a thing the Wii U's popularity would go through the roof!

I got logged out of my account by accident and came back to see this thread with 1000 views! I totally agree with you though, especially since the Wii U is perfectly capable of running everything from the NES up to its own games. I severely doubt there's a way to easily get GX code running on the Wii U GPU without rewriting it entirely to run natively, but it would definitely be cool to see.

And 2 GPUs: the original Wii/GC GPU, called GX, and GX2. They are actually hardware, not software; it actually has 2 GPUs inside.

With those two GPUs inside, I have a feeling that there should be a way to brute-force GX to run on GX2, though with more work than it'd probably be worth. If I knew what I was doing with anything code-related I'd look into it, but you said yourself that GX2 is largely undocumented.

I hope that this thread sparked something in someone, so that in the future somebody can experience at least 720p Wii games natively through their Wii U.
 

Immortallix

Well-Known Member
Member
Joined
Mar 15, 2009
Messages
174
Trophies
1
XP
1,090
Country
United States
Didn't the Wii's source code leak in one of the Gigaleaks? Would it be any help in developing a CFW vWii that could output above 480p?
 

MultipedBeatle

Active Member
OP
Newcomer
Joined
Oct 25, 2016
Messages
37
Trophies
0
XP
534
Country
United States
Didn't the Wii's source code leak in one of the Gigaleaks? Would it be any help in developing a CFW vWii that could output above 480p?
The problem with that is copyright and all that; for it to be viable for release it would need to be reverse engineered, and that would take a lot of dedication for the Wii U. I don't remember seeing if the Wii source code got leaked or not, but I think vWii and the Wii U would be the biggest targets here.
 
