Thanks for clarifying. But to be clear, if I want to switch filters, I just have to save TwlBg.cxi again using your patch, and then the newly selected filter should apply automatically? If so, then I really appreciate the preview function you put into the patch.
Two more quick (hopefully) questions:
- Are you able to describe in more detail what each of the filters does? The names aren't totally obvious.
- When enabling the widescreen patch, there's also the option "enable GPU scaling (blurry, no filters)". What does that actually do, and should it be combined with your filters? Or does enabling this option just make widescreen as blurry as the original 4:3 option?
Yeah, you sadly need to re-patch it every time; the highlighted filter is the one that gets used. Perhaps I should change how it works. The filter selection was made back when the patch menu didn't exist, so the way it is now is just confusing: you press A to enable the patches, but you must highlight the filter you want to use before pressing START. Definitely confusing...
In hindsight, I should've added the preview function much earlier than I did... But at least it's in. Although I should probably finally fix the GBA previews at some point...
I've already optimized the code once, and it went down from 6m23s to 5m53s on old3DS, but the wait is still painful... It's probably still possible to make it go faster, but I'm reaching the limits of the CPU itself, so even if I had single-cycle memory, it would still be slow.
I'll try my best to remember the origin of each scale matrix, but here are the stories I can remember:
- Nintendo default - copypasted directly from TwlBg, it *is* the default
- Sono's crisp (original) - it uses the least amount of interpolation possible; it's basically the GBA upscale filter modified for DS(i). This one looks better with 2D games, but is also okay with 3D.
- Sono's crisp (tweaked) - you're right that you don't see the difference, as the difference is in the subpixels! This one looks better in 3D, as it has a slight antialiasing effect, but it looks horrible in 2D due to weird color bleed.
- Zero interpolation (double pixel) - it's Sono's crisp, but without any blurring. Just as its name says, the interpolated pixel is doubled instead of blurred.
- Linear interpolation 1 - this is as close to pure linear interpolation as I can get (the first output pixel is at the center of the first input pixel, meaning the first pixel is unchanged). If you hold X, you can see that Nintendo's default filter is actually sharper than pure linear, due to edge-detect. However, Nintendo did something weird with that edge-detect, as it causes artifacts around some edges, which looks bad in some cases.
- Linear interpolation 2 - a different kind of linear interpolation. I think this one might be broken, as the order of the matrix may be reversed, which creates a really weird jagged effect.
- Linear interpolation 3 - this one might also be broken due to a reversed matrix. It has the pixel center point shifted by half a pixel, so no output pixel falls exactly on the center of an input pixel, meaning the output is always blurry, on purpose.
- Linear interpolation 4 - this is Linear interpolation 3, but shifted half a pixel, so one of the output pixels lands on the center of an input pixel, meaning not everything is a blurfest. Like the rest, this one may be reversed too, creating weird jagged edges on perfectly diagonal lines.
- Sharpen test 1 - this is one of the broken filters (which has since been removed), but with an extremely aggressive edge-detect applied to it. Again, due to accidentally being reversed, diagonal edges are really jagged instead of perfectly diagonal. This one is primarily for 2D games with text.
- Linear sharpen 1 - this is basically Nintendo default, except recreated from scratch. Because it was added late, and not copypasted from another broken swapped matrix, this one is not reversed, so there are no jagged edges (other than a weird ringing artifact on diagonals every 4-5 pixels or so). You can see the difference this makes if you keep holding X to compare it with Nintendo's default.
- Darken crisp - this is Sono's crisp (original), but darkened by half. No pixel information is lost; all 64 shades are still displayed correctly. This was made before the Redshift patch, hence it's still there.
- Darken Nintendo - just like its name says, this is the Nintendo default matrix, but darkened.
- 4grid - these are experimental. This is an attempt at an LCD filter (search for "LCD shader" if you want to see examples), but on the 3DS it looks bad for some reason. It's most likely my fault for not understanding how this effect works.
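To make the sampling-offset difference between the linear filters concrete, here's a small Python sketch. This is my own illustration of generic 1D linear-interpolation tap weights, not the actual TwlBg scale matrix format: one mode aligns the first output pixel with the first input pixel (like Linear interpolation 1), the other shifts sampling by half a pixel (like Linear interpolation 3), so interior output pixels never land exactly on an input pixel.

```python
def linear_weights(src, dst, half_pixel_shift=False):
    """For each of dst output pixels, return (left_index, left_weight,
    right_weight) for 1D linear interpolation over src input pixels."""
    taps = []
    for x in range(dst):
        if half_pixel_shift:
            # Output pixel centers sit between input centers.
            pos = (x + 0.5) * src / dst - 0.5
        else:
            # First/last output pixels align exactly with first/last inputs.
            pos = x * (src - 1) / (dst - 1)
        pos = min(max(pos, 0.0), src - 1)  # clamp at the edges
        i = int(pos)
        frac = pos - i  # frac == 0 means the right neighbor gets no weight
        taps.append((i, 1.0 - frac, frac))
    return taps

# Upscaling 4 -> 6 pixels:
taps = linear_weights(4, 6)
# first tap copies input pixel 0 exactly: (0, 1.0, 0.0)
taps_shifted = linear_weights(4, 6, half_pixel_shift=True)
# every interior tap blends two inputs, so the result is always a bit blurry
```

The clamped edge pixels of the shifted variant still copy an input directly; it's the interior that can never be sharp.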
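As a quick sanity check of the "darkened by half with no pixel information lost" claim for Darken crisp, here's a back-of-the-envelope model (my own assumption: the DS's 6-bit color levels get expanded into an 8-bit range before the matrix halves them). Because consecutive 6-bit levels end up roughly 4 apart in 8-bit space, halving still leaves all 64 shades distinct.

```python
# Hypothetical model: a 6-bit DS channel (0..63) expanded to 8-bit,
# then darkened by half by the scale matrix.
shades = [round(v * 255 / 63) for v in range(64)]  # 6-bit -> 8-bit expansion
halved = [s // 2 for s in shades]                  # "darken by half"
print(len(set(halved)))  # 64 -- every shade is still distinguishable
```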
The GBA scale filters are self-explanatory. They may also be reversed, I'm not sure.
You can't preview them due to an age-old bug in the TWPatch scale matrix emulator, so you'll have to test them in-game.
But then again, GBA mode in TWPatch is intentionally left kind of undocumented. I'm rewriting AgbBg (and eventually AGB_FIRM as well), and there is also open_agb_firm, which is far superior; use that instead.
As for the widescreen and GPU patch, they are technically very complicated.
Oversimplified: there are two ways in Nintendo's code to scale the capture card output: using the capture card's scale matrices (the filter selection in TWPatch basically edits these), or using the GPU alone (the capture card upscaler is turned off, and the GPU upscales the image instead, using linear interpolation).
Nintendo actually wanted to implement GPU scaling, but the code is almost completely missing. Luckily, they left enough hints in the binary that I could painfully reimplement it. Sadly, too much of the code is missing, so I had to resort to destructive patches to make it work.
Ideally we'd use both the scale matrices *and* the GPU (upscale as high as we can using the scale matrices, then *downscale* with the GPU) for the highest-quality scaling, but I simply can't figure out how to hack it into Nintendo's janky code. There is also not enough RAM for a higher-resolution image.
No, the GPU scaling will always be blurry. If I switched from linear to nearest interpolation, rounding errors in how the GPU works would cause nasty jagged edges and mismatched pixels, so I can't do that. The reason it's blurrier than my filters is that the GPU does the highest-quality linear interpolation (a true 2D multi-sample instead of a fake 2x 1D interpolation), so that's sadly how it is.
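To illustrate why a true 2D multi-sample can beat two chained 1D passes, here's a toy model (my own illustration; the real capture and GPU pipelines are fixed-point hardware and surely differ in detail). Mathematically bilinear interpolation is separable, but if a two-pass scaler rounds its intermediate result to a whole 8-bit level between passes, it can drift a full level away from the single-rounding 2D answer:

```python
def q8(x):
    """Round to a whole 8-bit level (toy fixed-point quantizer)."""
    return int(x + 0.5)

def bilinear_2d(p00, p01, p10, p11, fx, fy):
    """True 2D multi-sample: weight all four neighbours at full precision,
    rounding only once at the very end."""
    top = p00 * (1 - fx) + p01 * fx
    bot = p10 * (1 - fx) + p11 * fx
    return q8(top * (1 - fy) + bot * fy)

def bilinear_2x1d(p00, p01, p10, p11, fx, fy):
    """Two chained 1D passes with quantization in between (toy model of a
    fixed-point two-pass scaler)."""
    top = q8(p00 * (1 - fx) + p01 * fx)  # horizontal pass, quantized
    bot = q8(p10 * (1 - fx) + p11 * fx)
    return q8(top * (1 - fy) + bot * fy)  # vertical pass

# A lone bright pixel sampled halfway between its four neighbours:
print(bilinear_2d(0, 1, 0, 0, 0.5, 0.5))    # 0 (true value 0.25 rounds down)
print(bilinear_2x1d(0, 1, 0, 0, 0.5, 0.5))  # 1 (intermediate rounding inflates it)
```

The two results differ by a whole level here, which is the kind of mismatched-pixel artifact that intermediate rounding can produce.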