PS1/2 [TUT] Quick 240p emulation guide

notimp (OP)
Didn't know it was a thing - until I fell in love with it. ;) I bought an OSSC recently just to play with it, and decided that the first old console I'd move back in from storage (I had used emulation exclusively for everything for years) should be a PS2 - so it's the first console I'm experiencing the OSSC with, and boy, what a stunner.

The short synopsis is that many (most?) consoles prior to the N64 rendered the image natively at roughly 240 lines (within a ~263-line NTSC frame; PAL titles use more) and sent it out as a progressive signal: every frame is drawn on the same set of scanlines instead of alternating between two offset fields like true 480i. On a CRT the lines in between therefore stayed dark - which is exactly the gap we now try to simulate with fake scanlines. So despite travelling over a connection designed for 480i, the picture isn't really interlaced at all.

Regardless, the point is that the entire image is stored in ~240 lines (NTSC; a bit more for PAL), which can be integer-scaled 5x to get a pretty decent 1080p image at roughly 3:2 aspect (a little gets cut off the top and bottom, probably - still have to double check) that isn't mangled by your TV's scaler. In fact, this is what the OSSC does, although not perfectly - see the tearing issue in the other thread. That issue could probably be worked around by outputting 1920x1200 to my LG OLED and then stretching the bars away in the TV's options (stretch to all sides, with manual adjustments), although this impacts image quality; at 1920x1200 the tearing does not occur, but black bars are introduced. That 240p image, integer-scaled to 1080p, looks money. It's crisp, it's sharp, it's vibrant (the OSSC takes limited-range YCbCr in, outputs full-range RGB, and does the Rec.601 to Rec.709 conversion correctly) - depending on the game I sometimes even leave dynamic range set to limited on my TV, crushing near blacks in the signal, but boy do some games look good that way...
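Just to make the integer-scaling arithmetic concrete, here's a quick back-of-the-envelope sketch (plain numbers, my own illustration - nothing taken from the OSSC docs):

```python
SOURCE_LINES = 240   # visible lines of a typical NTSC "240p" game
PANEL_LINES = 1080   # lines on a 1080p flat panel

for factor in (2, 3, 4, 5):
    out = SOURCE_LINES * factor
    if out > PANEL_LINES:
        lost = (out - PANEL_LINES) // factor  # source lines that no longer fit
        print(f"{factor}x -> {out} lines, {lost} source lines cropped")
    else:
        print(f"{factor}x -> {out} lines, {PANEL_LINES - out} lines left as black bars")
# 5x gives 1200 lines, so about 24 source lines (roughly 12 top, 12 bottom)
# fall outside a 1080p frame - hence the slight top/bottom cut mentioned above.
```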

As the resulting image is a little too sharp (jaggies, compression artefacts), we introduce fake scanlines using the OSSC - and boy, if that image at 18% to 50% scanline intensity (I most often use 25%, but that's with the black-crush dynamic range mismatch mentioned above; with the range set correctly, something closer to 50% would be appropriate) doesn't look gorgeous, I don't know what would. :)

Example:
https://streamable.com/jplgch

The game in the example is ICO, which renders at 240p natively - one of the few PS2 games that do. Most of them actually render in 480i, which the OSSC can't 5x-scale, so they look less impressive on current TVs. ;) (That said, my OLED doesn't accept the 3x-scaled signal from 480i or 576i (the OSSC reports the full line count, so 525i for NTSC and 625i for PAL), so I can only use 2x and sometimes 4x, which look significantly less impressive. 4x of deinterlaced 480i, by the way, produces a resolution that makes an LG OLED switch into a "PC mode" (much like 1920x1200), where you have to reapply your image/calibration settings once more, because they are grouped separately from the "TV resolutions" on that device.) But that's not all.

Using POPStarter we have access to quite a few titles of the PSX library running at full speed, most of which also render at 240p - those can be integer-scaled 5x as well.

Using emulators we have access to SNES and Genesis up to Sega CD games - also running at 240p.

Sega CD is where the PS2 struggles to keep up, emulation-wise. It could be the USB 1.1 ports on the PS2 that are the limiting factor; I haven't looked into that yet.
Regardless, the emulator to use is PicoDrive Standalone ( https://mundowiihack.wordpress.com/2015/08/03/picodrive-ps2/#more-466 ), and the CD images have to be prepared a little before they will play. There is a video that shows how, but it's in Russian, and since neither I nor probably most of you speak it, and the person explaining makes the whole process more complicated than it needs to be, here is how I ended up preparing my images:

The software needed is IsoBuster and anything that can convert .wav files to .mp3.

1. Open the Sega CD image in IsoBuster and, in its settings, uncheck "automatically just use .iso over .bin".
2. Right-click the CD once the image is loaded and extract the content twice: once as raw .bin and once as .iso. The .iso run gives you the data track as an .iso file (which you can delete), but also extracts the audio tracks as .wav, which you need. The .bin run gives you every track on the CD as a .bin, of which we only keep the first (the data track).
3. Rename the .bin and .wav files according to their names in the .cue, which lists one data track (.bin) and several further entries for the audio tracks (start time, length, index markers).
4. Those audio entries are still designated BINARY in the .cue: replace BINARY with MP3 in all but the first instance (the data track), and change the file endings of all the other entries to .mp3. The cue sheet example under "Compressed audio tracks" in this wiki is what we are aiming for structurally: https://www.dosbox.com/wiki/Cuesheet (just with .mp3 instead of .ogg).
5. Convert the .wav files to .mp3 (128 kbit/s max, otherwise the USB connection on the PS2 probably becomes the bottleneck again.. :) have to double check that) and place them in the same folder.
6. In PicoDrive, load the .cue with the "use .mp3" setting enabled, and use 16-bit accurate emulation and 44100 Hz whenever possible.

Games might stutter once in a while, but in general they are playable - and PicoDrive also has a setting to output 240p. Yay. :)
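If you have a whole stack of images to prepare, step 4 is easy to script. A minimal sketch (Python, my own throwaway approach - it assumes the audio files have already been converted and renamed to match the .cue entries, just with an .mp3 extension):

```python
import re
from pathlib import Path

def cue_audio_to_mp3(cue_path):
    """Rewrite a Sega CD .cue so every audio FILE entry points to an .mp3.

    Assumes the first FILE entry is the data track (kept as BINARY) and that
    the matching .mp3 files already sit next to the .cue with the same names.
    """
    out, file_no = [], 0
    for line in Path(cue_path).read_text().splitlines():
        m = re.match(r'(\s*)FILE\s+"(.+)"\s+BINARY\s*$', line, re.IGNORECASE)
        if m:
            file_no += 1
            if file_no > 1:  # every FILE after the data track is an audio track
                name = Path(m.group(2)).stem + ".mp3"
                line = f'{m.group(1)}FILE "{name}" MP3'
        out.append(line)
    target = Path(cue_path).with_name(Path(cue_path).stem + "_mp3.cue")
    target.write_text("\n".join(out) + "\n")

cue_audio_to_mp3("Snatcher.cue")  # hypothetical file name
```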

For PSX and POPStarter, just use a tool named CUE2POPS_2_3.exe, which creates .vcd files (compatible with POPStarter) from a .cue and .bin pair. The image has to consist of exactly one .bin before CUE2POPS_2_3.exe can convert it, and for that we use IsoBuster again - this time with "create image" and .bin as the option (right-click the CD once it's loaded). That produces an image with exactly one .cue and one .bin, which we can feed to CUE2POPS_2_3.exe, and that's basically it.
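If you have many discs to convert, the tool can also be driven from a small batch script. A minimal sketch (Python); the argument order shown (input .cue, then output .VCD) is my assumption, so check the tool's readme before relying on it:

```python
import subprocess
from pathlib import Path

# Hypothetical layout: a folder of single-bin rips, GameName.cue + GameName.bin.
for cue in Path("psx_images").rglob("*.cue"):
    vcd = cue.with_suffix(".VCD")
    # Assumed argument order (input .cue, then output .VCD) - verify with the readme.
    subprocess.run(["CUE2POPS_2_3.exe", str(cue), str(vcd)], check=True)
    print("converted", cue.name, "->", vcd.name)
```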

FMV games on PSX don't run well using this method, btw - The X-Files is a stutterfest, but then that one also uses a 320p native resolution, far exceeding 240p (and 263p ;) ). Other demanding games like Chrono Cross basically run flawlessly. There are no emulation settings available, btw; those games simply output in their 240p (and 263p ;) ) modes by default. :)

Long story short - with the right line doubler (an OSSC or a more expensive one ;) ) and the option to add scanlines in it, 240p looks money, even on LCDs and OLEDs. No CRT needed. :) All interlaced outputs look worse in my case - due to the TV's scaling, but also still worse when integer-scaled by the OSSC (to a lower-than-5x multiple).

Policenauts at 240p, 5x scaled with scanlines, I tell you... *Mmmoaaa* - you basically get back the feeling of having played those games on a CRT, just bigger than your CRT ever was.

It's the first time I have had that experience on a modern flat panel TV. And it's all down to integer-scaling 240p signals (bypassing the TV's internal scaler, up to 1080p at least) and adding scanlines - which normally I don't love, but in this case they look great and combat the aliasing.. :) Because PS2 and earlier games didn't have any anti-aliasing, and it shows.. ;)

edit: For reference, other people who have fallen in love with 240p. ;)
https://gbatemp.net/threads/wii-240p-emulators.473191/
edit: Fixed the wrong link just above this sentence. :)
 

Maeson
PS1 games (or any 240p games, for that matter) do look great through the OSSC. My current screen can't do 1200p, but they still look amazing at 960p. 480i PS2 games, on the other hand, are a bit of a lottery, depending on how they were programmed.

Many were 448i games, while others used oddball resolutions too, with very few being a true 480i. Combine that with the fact that many games (especially during the early years) didn't have any way to smooth that out, and pixels end up with uneven sizes and the overall image can look really bad. Plus, without deflicker filters, interlaced pictures can be hard on your eyes.

If there's a way to work around that with the OSSC I would love to find it. I haven't yet, and I've read quite a bit about it, with other people running into the same issues.

For emulation I use my Wii, though, for a variety of reasons. But yeah, the results are fantastic.

By the way, you link to this very post at the end. Is that intentional?
 

ciaomao
notimp said: (full opening post quoted above)



Thank you for sharing your thoughts. Do you have any ideas on this thread too? Thanks.
https://gbatemp.net/threads/wanted-ossc-custom-firmware-ossc_0-77-rg0d-bin.560494/
 

notimp (OP)
Thank you for sharing your thoughts. Do you have any ideas on this thread too? Thanks.
https://gbatemp.net/threads/wanted-ossc-custom-firmware-ossc_0-77-rg0d-bin.560494/
Sadly no. I might look into it if I have a spare moment, but I'm afraid I won't be of much use..

A slight correction to the Sega CD information above: audio compression to .mp3 isn't needed, nor is it recommended. Snatcher plays fine (with better audio quality) in its original .bin form.

NTSC Sega CD games stutter a bit more than PAL games (with auto frameskip on), so keep that in mind. The PAL version of Snatcher actually seems pretty playable to me... :)

That said, the Wii is likely the better console to go with for this (240p output on emulated games), because it doesn't rely on PicoDrive, which is not the best Genesis emulation out there (it misses some sound effects in Snatcher.. :) ).

But still - until I get that set up, I'll probably play through Snatcher (PAL version) on the PS2 in 240p (once more.. ;) it's not the first time I've played through the game.. :) ).

Also - and this is just a throwaway line - the sound quality of the spoken dialogue may actually be better in the PAL version of Snatcher. I've just listened to it with monitoring headphones (very transparent) and I think that's the case... Anyhow, that's probably all I have to add on this for a while.. :) I mainly wanted to share that 240p games with scanlines look very good on modern TVs through an OSSC. (The RetroTINK 5X-Pro is probably the better solution currently, but it's 3x the price - 300 USD plus - if it's even available. Just to throw that in here as well. The OSSC mainly 'suffers' from bob deinterlacing, so a progressive signal is always preferred, for motion performance alone. Not sure if 480i signals would look better on the RetroTINK 5X-Pro, but at some point you have to stop second-guessing your buying decisions.. ;) )
 

Maeson
It'd be weird if the PAL version had better audio than NTSC, but it is an interesting thing to try sometime.

The RetroTINK 5X-Pro seems to be really good, but a video by RetroRGB shows the OSSC offering a slightly better picture. The RetroTINK has some features the OSSC does not, though. Both devices are fantastic, but it's true one is quite a bit more expensive...

With that said, only PS2 games give me issues with the OSSC. If I played GC, Xbox, DC or Wii games at interlaced resolutions there wouldn't be much of an issue, because those systems use a smoothing filter when interlaced. Progressive modes are always better, though.

I'm surprised that there was an OSSC firmware with smoothing options. I can't seem to find anything either...
 

notimp (OP)
It's probably worth talking about TV setups with the OSSC at this point. With interlaced content on my LG OLED I actually prefer to scale 2x and then let the TV do the rest of the scaling, with "Super Resolution" on the LG OLED set to Low (not Off). It does some smoothing on its own (just a bit, not very much), which also helps with image definition (I'm sure they are doing edge sharpening as well). All other sharpening off, all other enhancements off.

525i scaled 2x (with this post-processing feature) looks better than 4x scaled on my setup (3x doesn't work on my TV - it doesn't like the resolution). Good enough, actually, that I played Death by Degrees today, PAL>NTSC patched and 16:9 patched (uses zoom), and the image quality with scanlines was still good enough that I actually admired the game art. The only setback is added input lag, 30-40 ms on my TV - so quite a bit - but for most things I play, that's fine.

I'm actually not a video purist at all. Once I learned to calibrate down to the 3D LUT level, I pivoted the other way and bought some external scalers just to play with them - today I have three external scalers in my video setup (on a separate lane, behind an HDMI splitter), so I can add them whenever I want. So 'the OSSC looks better' means very little to me on its own, because I know how subjective that stuff can be - yet I still get better results post-processing about 60-70% of my 1080p movie content. (I never went the 4K Blu-ray route, because of the nightmare that is HDR with a luminance (brightness) target that shifts over time.)

So when I say I enjoy the image that comes out of the OSSC, I mean it. It looks great. :) I probably could have achieved the same thing with a PC and emulation, but there you usually go the upscaled-internal-resolution, texture-filtering route, and I never quite saw the appeal of scanlines there. (Integer scaling on a PC is not a problem either. :) ) Maybe it's just the remote the OSSC comes with, which lets you adjust scanline intensity on the fly... That, and the black crush (dynamic range mismatch - OSSC outputs full, TV was set to limited), just produced something with great visual appeal on my TV, something that looked like the better CRT images I've seen.

All of that appeal is lost if I just change image modes to one that pumps up brightness, for example (my setup is calibrated to an 85 nit target, I believe, which is lower than the usual reference) - it's really the CRT-like experience that makes it here: the perceived image depth that 'just the right amount of scanlines' adds (because it then has an effect similar to dithering)... the... everything. It just worked this time around.. :)

Here - in this video I made a few hours ago, which actually turned out badly (none of the definition is captured, black levels are elevated) - I tried to capture some of that nuance. The video starts with the interlaced 525i signal scaled 2x, then I turn off the OSSC scaling (the TV's native scaling takes over all the way), then I turn it on again.

https://streamable.com/4e2i3r

I'm sure many people would say the passthrough setting "looks better" (colors pop more); I think the image looks more congruent when scaled 2x.

It's nuances. It's subjective. Yet the impression 240p content with scanlines left on me was real. There is something to it: if you get it integer-scaled by at least 5x, it really looks good. ;)

That's most of what I tried to say in this thread. :)

I'll make another thread on using cheats to patch games to 16:9, probably tomorrow - without needing to patch the game files - because again, it just works, and it leaves you the option to turn it off and on again. :) For that, your TV needs an image mode where you can stretch the picture in all four directions, though; otherwise you get random bars that throw off the aspect ratio (after aspect ratio patches) with the OSSC. LG OLEDs luckily have that feature, so I'll write about that one as well... because I'm sure not all people know about it. :)
 

notimp (OP)
I just bought and soft-modded a Wii to get a better frame of reference for what I was seeing, and for how it compares to 'one of the standards' of 240p emulation (a modded Wii :) - the next step up would probably be a MiSTer).

So, first up: these are impressions using an OSSC on an OLED with most enhancements turned off. Whatever other TVs' settings or characteristics might add, I can't account for.

The short story is that the PS2, when outputting 240p, seems to produce a cleaner picture than I can coax out of the Wii. I still have to double check that, because I got most of my impressions while playing Policenauts through POPStarter, and I wasn't able to compare against PSX games on the Wii - the only PSX emulator there that's worthwhile currently outputs in 480i (and presumably interlaced PAL resolutions) only. It should get a 480p patch in the next months, which will improve things, but not to the point where I could get 240p or 320p resolutions out of the Wii - the resolutions those older games actually render at, and which produce the best results with an OSSC.

So, comparing SNES and Sega CD games first: the OSSC scanline 'wow factor' was greater on PSX games than on SNES or Sega CD. And second, the PS2 - when outputting those games in 240p - was on par in visual fidelity (despite the performance issues on Sega CD), if not slightly better, than the image I could get out of the Wii.

With the Wii, you are essentially using this build of RA-HEXAECO (RA = RetroArch) - readme here: https://old.reddit.com/r/WiiHacks/comments/g5b8rc/a_mod_of_retroarch_with_autoresolution_switching/ - which has auto-switching to 240p (and slightly higher and slightly lower) resolutions built in (it can be enabled). It also has several settings that try to defeat the console's internal filtering.

A short explanation: when targeting a 480i resolution - as the PS2 and Wii did for most of their customer base - you are dealing with shimmering even on CRTs; think of non-anti-aliased edges against a bright background (sky) while the camera moves and the output pattern is interlaced. To compensate for shimmering, the Wii applies internal bilinear filtering before the image is sent out to the TV, which afaik doesn't even get disabled when you've put the console in 480p mode (the PAL resolution is slightly different, but also progressive). RA-HEXAECO (on Wii) has several options to deal with this.

First, you can enable point filtering, which at least disables the bilinear filtering, but the resulting image still isn't as crisp as it could be (see the little sketch below for what bilinear vs. point sampling does to an edge). Next, you can enable "trap filtering", which reduces color bleeding from pixel to pixel and just produces an overall cleaner look on all pixel-art games and systems.
After that, you can turn on Deflicker and Dithering.

Deflicker, even with a progressive output signal, has an effect similar to a (slight) sharpening filter: even with scanlines on top, the image doesn't blend as well after turning it on (you still see something akin to edge sharpening). But if you then turn on dithering, it 'blurs' the resulting image just enough that Deflicker plus Dithering is an option worth playing with. On Chrono Trigger (SNES) the resulting image looked better overall with those two enabled; on Snatcher (Sega CD) it looked worse.
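For anyone unfamiliar with the terms: point (nearest-neighbour) sampling keeps hard pixel edges, while bilinear sampling averages neighbouring pixels and softens them. A toy 1D sketch of the difference (just the general idea, not the Wii's actual filter):

```python
# Upscale a row of 4 pixels to 8 pixels with the two sampling methods.
row = [0, 0, 255, 255]          # a hard black-to-white edge

def point(src, out_len):        # nearest neighbour: each output copies one input pixel
    return [src[int(i * len(src) / out_len)] for i in range(out_len)]

def bilinear(src, out_len):     # blend the two nearest input pixels
    out = []
    for i in range(out_len):
        pos = i * (len(src) - 1) / (out_len - 1)
        lo, frac = int(pos), pos - int(pos)
        hi = min(lo + 1, len(src) - 1)
        out.append(round(src[lo] * (1 - frac) + src[hi] * frac))
    return out

print(point(row, 8))     # [0, 0, 0, 0, 255, 255, 255, 255] - the edge stays hard
print(bilinear(row, 8))  # [0, 0, 0, 73, 182, 255, 255, 255] - the edge gets blended
```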

The PS2, by contrast, doesn't have any image post-processing filters enabled at all, so none of that is an issue with pixel-based games. :)

It's neat to have those options and to be able to manipulate pixel-art games at that granularity (while playing with the intensity of the OSSC scanlines on top of it), but at no point did I get the "wow" feeling I got with Policenauts in 240p (or 320p, I can't remember the base rendering resolution - probably 320p) in POPStarter on the PS2. It's not bad looking, but... ;)

So to sum it up: if you own an OSSC, you have to play Policenauts in POPStarter with OSSC scanlines at least once. And if you have an OSSC, don't feel pressured to buy a Wii just to experience 240p (or 320p, or..) emulation on it. It's neat, and I'm not sure I'd get a better scanline image out of any of my other devices (the OSSC actually adds the scanlines per line, I believe, rather than as a simple overlay, and adjusting intensity with a remote adds quite a bit to the experience), but it's not as impressive as PSX games in POPStarter.

Why POPStarter? Because I've tried the hardware PSX backwards compatibility on the PS2 (which, using Tonyhax, works with backup CDs as well, and at a lower than expected noise level from the CD drive, though with my CD burner I eventually ran into read issues on burned discs) - and it actually outputs in 480i. Only POPStarter outputs in the 240-330p range, which gives the OSSC the best possible image to work with. And as others have said in here before, POPStarter's compatibility actually isn't the best (I would have loved to play the Persona 2 games on it, but they crash in the first battle, f.e.). On the other hand, games it can run - especially 2D or fake-3D (sprite-based) ones - look great, using their 240-330p render resolution and the OSSC to scale them 5x (or even 2x), with scanlines enabled.

That's it. :)

And if you are curious, buy a Wii - setting up RA-HEXAECO is a bit fiddly (mostly because of controller mappings), and you won't get game covers to look at if you only use RetroArch (which is bound to the RGUI menu), but where it's unbeatable is resuming from standby and going straight into a game in just a few seconds. All from the Wiimote (I currently play with Wiimote and Nunchuk; Pro Controllers for the Wiimote are already ordered) and the comfort of your couch. And again, it doesn't look bad, it just doesn't look as impressive. :)

edit: Oh, and if you have multiple consoles you want to run through an OSSC, this cable ( https://www.amazon.co.uk/XCM-Xbox-Xbox1-Component-Cable/dp/B0056AABYY ) isn't half as bad as the one-star review on Amazon makes it look.. ;) And it's shielded. :)

edit2: Oh, and don't buy a Wii U - its video output options are apparently worse for 240p gaming than the ones on the Wii. :)
 

notimp (OP)
I tested a bunch of the OSSC's advanced settings for 240p content on the Wii - with my TV (an LG OLED B6), all post-processing off except Super Resolution: Low, and still using the dynamic range mismatch (OSSC outputs full, TV is set to limited) on most games, because of the color pop.

After familiarizing myself with all the settings, the one that produced slightly better visual fidelity with 240p games was a rather simple one: leave optimization on Generic 4:3 in the OSSC and use 3x scaling.

On the OSSC I usually use native output (no scaling) for 480p content, and 2x for PS2 games and for PSX games played through the PS2's backwards compatibility mode. My go-to setting for PSX games played through POPStarter (if they are pixel based and output anywhere in the 240p to 330p range) is 5x.

So it took some time for me to discover 3x as a setting. ;) Basically, 240p games with a detail level lower than PSX sometimes look more congruent with scanlines if displayed at 3x: the scanlines get smaller and lines seem to flow better. At the same time, while in all other cases I tend to use 18% scanlines (except for 480p content, where I use none), in that 3x case 25% scanlines sometimes look better. So: 5x for high detail (the PSX generation); 2x, 3x or 5x for lower detail - usually 2x or 3x, with 2x mostly for interlaced output; and passthrough for 480p (quick summary below). :)
Don't forget that all of this then runs through the LG OLED's Super Resolution Low filtering, which adds a tiny bit of smoothing.
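Summed up as a quick reference (just my current preferences from the paragraphs above, on my particular TV - not a recommendation from the OSSC documentation):

```python
# My current go-to OSSC settings per source, as described above
# (personal preference on an LG B6, not anything from the OSSC manual).
ossc_presets = {
    "480p content":                        ("passthrough", "no scanlines"),
    "480i/576i (PS2, PS1 in BC mode)":     ("2x",          "18% scanlines"),
    "240p, simpler pixel art":             ("3x",          "25% scanlines"),
    "240p, PSX-level detail (POPStarter)": ("5x",          "18% scanlines"),
}

for source, (scale, lines) in ossc_presets.items():
    print(f"{source:38} -> {scale}, {lines}")
```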
-

I've also found that the Wii's 480p output with Wii games (especially in 16:9 mode) benefits from tweaking the OSSC's advanced settings (with the OSSC set not to scale the image) to get a more congruent-looking picture. A great game to test this with is Another Code for the Wii, because its art style is realistic cel shading with a subdued color palette and high quality pixel-art backgrounds, so it's quite easy to tell when the image gets blurrier, when it gets a little too harsh in separating hard edges, and when you'd like to add a little blur back for that congruent look. ;)

Here are the settings I ended up with:
https://i.imgur.com/Mx1PlRn.jpeg

If anyone wants to try them out as well.

I like it very much. :)

I also tried it with an mClassic in the chain, and I actually prefer not having it in my image chain - I like the 480p look I achieved without it better.

Also, just FYI: Final Fantasy 6 in RetroArch (RA-HEXAECO) on the Wii stutters frequently, even with frameskip set to 1 (mGBA core).

So I'd say: if you are into experiencing 240p scanlined content from the GBA or newer systems, go with a MiSTer FPGA and not a Wii.
-

That said, a hacked Wii is _wonderful_ - fast and easy to wake from standby, great for playing adventure games one-handed with your arm dangling behind the couch. ScummVM with the Wiimote (and the Motion Plus attachment on the Wiimote) works wonderfully. Playing Blade Runner on it is bliss - the best way to experience it I've found so far - with one downside: it crashes about once every half hour or so.. ;) (probably a memory leak); other ScummVM games run fine. (Also, use this ScummVM build - as of the time of me typing this: https://mundowiihack.wordpress.com/2020/11/05/wii-scummvm-v1-8-0/ (2.2.1) - because the current official release (2.2.0) crashes to a black screen when you exit a game back to the menu.) Full access to the GCN library. The best system to experience the Phoenix Wright titles (1-3) on (WiiWare editions). And then the N64 Virtual Console (VC) titles. The more I set up on it, the more I adore the system.

PSX emulation on it is trash (compatibility, and a vaseline-smeared 480i output), but aside from that - so many systems, such a great experience. :)

Pro Controllers can be had for 10 USD on AliExpress (likely not originals.. ;) ).

And it's cheap, cheap, cheap. I got mine for 60 EUR with two Wiimotes, two Motion Plus attachments and two plastic grips, in black. If you go for a white one, you've got it for 30-40 bucks. Modding is super easy (although the scene's documentation is a little scattered, so do only the necessary things first, and then inch your way further along toward getting all systems to run in emulation). So 60 USD, plus about 20 USD for a 256 GB SD card (on an Amazon sale.. ;) ), and maybe a 10 USD thumb drive - keeping Wii ISOs off the SD card lets you use the best loader (USB Loader GX), which also gets installed via ModMii (the method of choice when modding a Wii these days).

The number of systems it runs well is mind-boggling.

Probably go for an Xbox Series S instead if you are only interested in emulation - and not in Wii games (which really have to be experienced with a Wiimote), and not in 240p output (to then scale up with your own scalers; the OSSC plus that one setting in the OLED TV in my case) - but that's what, five times as expensive, for maybe adding PS2 emulation (which isn't that great) and access to current Xbox games (ok.. ;) ). As someone who has everything - Wii hacking is probably the best price/performance proposition I've ever experienced (it's stupidly good), and usability-wise it's wonderful. Oh, and the stock consoles are almost silent; no mods necessary.
--

Aside from the 480p settings, most of this post is Wii related, and this is a PS2 forum... but I liked it too much not to document it... ;) And since most of the stuff is 240p related, why not post it here.. :)

Best entry point for 240p gaming (and then scaling it yourself and adding scanlines) = Wii (because of price/performance). Then the PS2, because of the wow factor (on 2D games that run in POPStarter) and price/performance. And then a MiSTer FPGA - because let's not kid ourselves, that is not an entry point, that's the enthusiast league. (And it's the thing I'll probably end up with - but I don't mind waiting a bit; maybe the next generation of open source FPGA projects won't take five years to arrive, and in the meantime I'm playing Wii and PS2 (with a bunch of emulators for older systems). :)
 

Maeson
All the 240p content I run on my Wii is with 4x scaling, and I had to change minimal settings on the OSSC - most of it is done in the emulators themselves (to get a correct pixel aspect ratio and such). But again, my Wii is set to 4:3, so that also makes things easier.

You can play the GBA version of FF6 fine with mGBA's standalone homebrew application; pretty much every game I threw at it worked at full speed, only things like Monkey Ball Jr. gave it issues in my experience. The port in RetroArch for some reason doesn't perform as well, though...

https://mgba.io/

I did not know Blade Runner ran on the Wii - that's quite unexpected.

About PS1 emulators: yeah, the picture quality is certainly blurry. I tried Mega Man 8 yesterday and was kind of disappointed. The Wii runs at 480p while in the emulator, but the game seems to be rendered with something like a bilinear filter, which is most noticeable with 2D games. A 240p mode would be great.

I've played a few games there, but I'd much rather use my PS2 for PS1 games.

The N64 emulators, or at least Not64, can be forced to work at 240p... But on the other hand, with those emulators the internal resolution of games is doubled, like with N64 VC, so 3D games look better.

The Wii really is a very neat device for retro gaming - not only for the plethora of emulators and its own library and VC, but also for the backward compatibility with the GC. Even better, playing GC games through Nintendont you can use the extra power of the Wii to improve the frame rate of GC games, which is insane; it helps smooth out many games and gives them stable performance.

For softmodding, nowadays the place to look is https://wii.guide . It has all the relevant information for the basic process, plus some extra things you might want.

And recently, the official port of RetroArch also got a Sega 32X core (PicoDrive) that works surprisingly well too. I certainly did not expect to see things like Star Wars Arcade working, let alone working well. The 32X might not have the largest library, but the few great things it has are certainly welcome; it makes the already absurd list of things you can play on a Wii even longer.

Overall, Wii + PS2 is a really good combo of retro consoles, even if PS2 games' video resolutions kind of drive me crazy lately. It gives you so much to play it's mind blowing. I find myself so comfortable with it that I've kind of lost interest in looking for newer alternatives, lol. I have done almost zero research on MiSTer.

You're also right, the Wii is pretty darn silent (especially compared to my PS2 or my DC), even more so when loading games through SD or USB, and if you have one of the later revisions like mine, it also runs much cooler.
 

notimp (OP)
You can play the GBA version of FF6 fine with mGBA's standalone homebrew application; pretty much every game I threw at it worked at full speed, only things like Monkey Ball Jr. gave it issues in my experience. The port in RetroArch for some reason doesn't perform as well, though...
Thanks for the heads-up. :)

Also, thank you for mentioning that Not64 can be forced to output 240p. I'll have to try that as well.

I like some edge smoothing (handled by the LG OLED's Super Resolution Low), which is why 4x isn't my go-to - it just produces an output with very distinct pixel definition. I don't mind that in Policenauts (in 240p), obviously, since 5x together with scanlines produces the image I prefer - but on games with less complex 2D art I'd rather use 3x (which results in a 720p output) and then blow it up with some edge smoothing and blur. I'm not into the crisp pixel look, except when the game's art already used a vast color palette and complex artwork, probably made with dithering in mind. I.e., I preferred 3x on Snatcher.


Also - if you are using ScummVM (Wiimote navigation is great, btw) and you happen to own an OSSC - make sure you set the input from Rec.601 to Rec.709. PC games were made with the sRGB color space in mind, which uses Rec.709 primaries (red, green and blue color targets). You'll get far more accurate colors compared to when the OSSC thinks you are inputting NTSC.. (or PAL.. ;) ) ;)
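For anyone curious what that switch actually changes: Rec.601 and Rec.709 use different primaries and different luma weights, so decoding a signal with the wrong assumption shifts colors. A minimal sketch of the luma-matrix part of the difference (standard published coefficients, nothing OSSC-specific):

```python
# Luma weights differ between the two standards: Y = Kr*R + Kg*G + Kb*B
REC601 = dict(Kr=0.299,  Kb=0.114)    # SD-era video (what consoles are assumed to be)
REC709 = dict(Kr=0.2126, Kb=0.0722)   # HD / sRGB-era material

def luma(r, g, b, k):
    kg = 1.0 - k["Kr"] - k["Kb"]
    return k["Kr"] * r + kg * g + k["Kb"] * b

# A pure red pixel carries noticeably different luma under each matrix, so
# decoding with the wrong assumption shifts brightness and hue:
print(luma(1, 0, 0, REC601))  # 0.299
print(luma(1, 0, 0, REC709))  # 0.2126
```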
 

Maeson
I use 4x because it's what looks best on my screen, and 5x is not compatible. My TV (a quite old Samsung LCD) gives a very weird picture at 3x for some reason, kind of like when you try to play a PC game full screen at an odd resolution. Really bad looking; I was pretty disappointed.

But when I tried 4x, it was beautiful. I do prefer crisper pixel art, though; I am so done with blurry pictures and the "CRT experience"...

I only use scanlines sometimes. I want to like them more, but the way most people go about them confuses me tremendously: they set them so, so prominent that it darkens the picture so much it becomes hard to see details or to appreciate the colors.

Like, look at the Sonic 3 pictures here for example:
https://www.retrorgb.com/wiivsclassic.html

Sonic 3 is bright and vibrant, but there it looks lifeless to me.

I can't see myself enjoying games like that. When I use them I set them much lighter, and use the Multiplication option instead of Subtraction, so instead of drawing black lines they darken the colors underneath, which results in a more colorful picture (a rough sketch of the difference below).
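A minimal sketch of what I mean by that, assuming a simple per-pixel model of the two modes:

```python
# Rough idea of the two modes on the darkened lines (strength s in 0..1);
# this is my reading of subtraction vs. multiplication, not the OSSC's exact math.
def subtraction(v, s):
    return max(v - s * 255, 0)   # pull toward black by a fixed amount

def multiplication(v, s):
    return v * (1 - s)           # scale the underlying color instead

for v in (32, 128, 240):         # dark, mid and bright 8-bit values
    print(v, "->", round(subtraction(v, 0.5)), "vs", round(multiplication(v, 0.5)))
# Dark pixels go straight to 0 under subtraction, while multiplication keeps their
# relative brightness - which is why the picture stays more colorful.
```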

Good thing the OSSC gives you so much control over them.
 

SG854
notimp said: (full post on TV setups with the OSSC quoted above)
Capturing images of a CRT is difficult. Colors get thrown off, there's moire, and other issues.


I have a 1440p LCD, and the CRT does look more contrasty and better. Something about the CRT's native gamma response and the output of the consoles gives a more contrasty look. It seems like it's somewhere between 2.35 and 2.5.


My attempt at re-creating the CRT look on an LCD was to use a 3D LUT loaded into ReShade and then incorporate that into RetroArch. I target Rec.601 and a 2.5 gamma to match a CRT's contrasty look as closely as I can.

The standard is around 80 to 120 nits for SDR; usually, though, people target 100 nits. But it's only a recommendation, and only for a dark room - in a brighter room you have to output a brighter image to compensate. My CRT at max measured as high as 250-300 nits. And 240p with scanlines on a CRT looks so much brighter than scanline filters on an LCD; they darken the LCD way too much.


I use the CRT Royale Kurozumi shader on an LCD, which to me looks identical to a high-end Sony Trinitron.

But you can't have an experience better than a CRT unless you kill motion blur. The newer LG OLED panels (CX and C1) have a Black Frame Insertion mode that can reduce image persistence from 16 ms down to about 4 ms - very close to a CRT's ~1 ms. Or you can use a 360 Hz display and use BFI in RetroArch to cut blur by 83%. Those are the only options right now that come close to matching a CRT. The older OLED models aren't as good at blur handling.
 

notimp (OP)
Haha - :) It's fun to see how much our positions diverge. :)
I have a 1440p LCD, and the CRT does look more contrasty and better. Something about the CRT's native gamma response and the output of the consoles gives a more contrasty look. It seems like it's somewhere between 2.35 and 2.5.
Agreed, with a caveat. The number you are naming is just an average, and the range you give is quite a range. :) There is also the aspect that sample-and-hold devices can reproduce a completely linear gamma (from 0-100 IRE), while CRTs cannot. CRTs usually roll off at the low end, so you get perfect black, then an out-of-black curve that brightens much faster, while the majority of the curve sits at a pretty high gamma - let's say 2.4 or 2.5. So it's not only the average, it's also the shape of the gamma curve.

A short interlude: gamma was basically the inverse of the transfer function of the cameras of the time. The technology wasn't refined enough to capture light linearly, so when the material was reproduced (on a TV, let's say) the inverse of the capture curve was applied (gamma), and the resulting image came out close(r) to reality. Initially this was just a 'lucky match', as CRTs' native gamma characteristics almost naturally formed a good inverse of the cameras of the era (you couldn't alter a CRT's gamma in a fine-grained way anyway). All of this then 'was responsible' for the CRT characteristics of the time.

BT.1886 (a gamma function you might see on your TV, if it's newer) was designed to mirror that CRT gamma behaviour, with one immense flaw: it's highly dependent on display black level. With a near perfect black level, the function works as intended. With a much worse (higher) black level (let's say an IPS LCD without backlight dimming), the curve washes out near-black much too heavily (makes it too bright), so the resulting image doesn't resemble a CRT's performance at all.
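For reference, the standard's own formula makes that black-level dependence easy to see. A quick sketch using BT.1886's published constants (gamma 2.4, with a and b derived from the display's measured white and black levels):

```python
# BT.1886 EOTF: L = a * max(V + b, 0) ** g, with g = 2.4 and a, b derived
# from the display's measured white (lw) and black (lb) luminance in nits.
def bt1886(v, lw=100.0, lb=0.0, g=2.4):
    lw_g, lb_g = lw ** (1 / g), lb ** (1 / g)
    a = (lw_g - lb_g) ** g
    b = lb_g / (lw_g - lb_g)
    return a * max(v + b, 0.0) ** g

# The same 10% input signal on a perfect-black display vs. one with 0.1 nit black:
print(bt1886(0.10, lb=0.0))   # ~0.40 nits
print(bt1886(0.10, lb=0.1))   # ~1.07 nits - near-black gets lifted far more
```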
My attempt at re-creating the CRT look on a LCD was to use a 3D Lut loaded up into reshade then incorporate that into retroarch. I target Rec.601 and a 2.5 gamma to match as close to a CRT's contrasty look as I can.
A perfectly fine approach. I actually own a 3D LUT box myself and can create LUTs for it on an early-gen OLED, using specialized pattern sets (bright and dark patches alternating), since OLEDs show luminance drift on large pattern sets (1500 patches or so) even with the brightness compensation algorithms disabled, so you need to compensate for that. Long story short, the improvement in my case wasn't significant enough (compared to a normal 20-point calibration) for me to keep it in my chain - with one exception: the B6 I own shows much too saturated colors at very low IREs (the built-in LUTs of OLEDs weren't as fine-grained back then).
I've found that with movies, post-processing the image a little (mClassic and DVDO iScan Mini in the chain, which allow very gradual image post-processing: sharpening, blurring edges, and so on) offered much greater benefits in getting something that registers as 'lifelike' than the slight improvement in color accuracy a 3D LUT would have produced.
That said, having stepped away from the 3D LUT, I'm now not using configurations that mimic a CRT gamma curve; I'm basically using a flat 2.4 power gamma for movies and 2.2 for games. (I'm also not using any post-processing, except LG OLED Super Resolution set to Low, which blurs gradients and does some edge enhancement, but all of it 'minor', not very pronounced.)

Enter scanlines. Scanlines darken an image overall, but not quite linearly (I think?), and, long story short, they also change the perceived depth of an image. When the scanlines are 'right' on, let's say, a pixel-art image, its perceived depth usually (in my case) looks more natural than without them. Some of that may have to do with dithering, but I think it's mostly how (artificial) scanlines change my TV's effective gamma response. I like it very much. That's one part of it feeling 'like a CRT' in my case (the underlying gamma I have the TV running at is a flat 2.2 power law).

The next thing I do (sometimes) is produce a full-RGB/limited-RGB mismatch - which is not at all recommended. So the OSSC outputs full, the TV expects limited. This crushes all near blacks (you won't see them, they become black), which is not at all what you want - but it gives you an 'oversaturated' image pretty quickly, by pressing a few buttons on the TV remote. As not all pixel-art games worked with color palettes close to near-black (too few colors for that.. ;) ), it's not the end of the world - but that's just the excuse I'm telling myself for using this at all. When I see too much black crush in a scene or game ('darker mood'), or when colors become too artificial looking, I disable it (set it correctly). Setting gamma to a 2.4 power law didn't give me the same 'pop' (but it would have retained the near-black colors). (I can dim my room, so room light is not an issue here - I don't have to compensate for that. :) )
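To put numbers on what that mismatch does (standard 8-bit video levels; nothing specific to my TV):

```python
# What a TV set to "limited" does to an incoming full-range (0-255) signal:
# it assumes black = 16 and white = 235 and expands that window back to 0-255.
def expand_limited_to_full(v):
    return min(max(round((v - 16) * 255 / (235 - 16)), 0), 255)

for v in (0, 8, 16, 64, 128, 235, 255):
    print(v, "->", expand_limited_to_full(v))
# 0, 8 and 16 all land on 0 (near blacks crushed), 235 and above clip to 255,
# and everything in between gets stretched - hence the extra contrast "pop".
```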

Calibrating for Rec.601 is not something I need to do either, as the OSSC does the color conversion for you (I think). It has an 'input is Rec.709' option (off by default), which implies that it converts to Rec.709 internally before outputting the signal.

So all in all, I don't really need a 3D LUT. Scanlines alter gamma in a way I like, the OSSC does the color space conversion, and if I want more color pop I do the thing you aren't supposed to do (the full/limited mismatch that crushes near blacks). Or so I'm telling myself. ;)
The standard is around 80 to 120 nits for SDR; usually, though, people target 100 nits. But it's only a recommendation, and only for a dark room - in a brighter room you have to output a brighter image to compensate. My CRT at max measured as high as 250-300 nits. And 240p with scanlines on a CRT looks so much brighter than scanline filters on an LCD; they darken the LCD way too much.
I have a fully light-controlled environment. The OLED produces perfect blacks, and 100 nits seemed too bright in this environment for sustained use (I don't have any bias lighting mumbo jumbo.. ;) ), so I calibrated to a 100 IRE brightness target of, afair, 85 nits. Scanlines lower that even further.

Also, it's a little more than just a recommendation - it's literally what the color grading department saw when mastering the movie (though they usually saw it with bias lighting; I don't). Luminance also alters color perception, although it's the 'axis' (in three-dimensional color models) that impacts color perception the least. So usually we leave it to the user to adjust.. ;) And we also have to, because we can't tell people to live in a bat cave (lighting conditions).

One of the main benefits of using an OSSC is that you literally get a remote with a +/- rocker that regulates scanline intensity. This is great, by the way, and one of the reasons I called it the closest-to-CRT look I've seen in years. It's also what lets me fine-tune that 'image depth' (scanlines influencing the effective 2.2 gamma). Some games look better with 25% scanlines in my case, some with 18%; it depends on the game. (With the full/limited mismatch on, so the scanlines end up darker than they would be without it.)
But you can't have an experience better than a CRT unless you kill motion blur.
Very, very true. Sample-and-hold displays produce motion blur on your retina even if their pixel response is instant: the display holds each frame static while your eyes keep tracking the motion, so the image smears across the retina - basically motion blur. CRTs worked entirely differently (an electron beam flashes each line into existence and it decays back to black almost immediately ;) ), which means very short persistence and essentially no smear on the retina - so much higher motion resolution (the resolution you can still perceive while the image is in motion).

Notice also that I had my wow-effect moment with a game that consists mostly of static 2D images (Policenauts), where motion blur is not an issue. :)

Also, a higher resolution signal makes up for some of it. ;) (So a 1080p image in motion still resolves more detail than a 480p image did back in the day on a CRT. On pixel art of that era, though, you will probably still notice motion blur on anything that's not a CRT or not using BFI.)
The newer LG OLED panels (CX and C1) have a Black Frame Insertion mode that can reduce image persistence from 16 ms down to about 4 ms - very close to a CRT's ~1 ms. Or you can use a 360 Hz display and use BFI in RetroArch to cut blur by 83%. Those are the only options right now that come close to matching a CRT. The older OLED models aren't as good at blur handling.
Agree entirely. :) But it also makes the image flicker more than CRTs did back in the day (perception-wise), so companies are still tweaking their approaches and algorithms. :)
 

SG854
Haha - :) It's fun to see how much our positions diverge. :)

Agree, with a caveat. The number you are naming is just an average, and the range you are setting is quite a range. :) There is also the aspect that sample-and-hold devices are able to reproduce a completely linear gamma (from 0-100 IRE), while CRTs are not. CRTs usually roll off at the low end, so you usually have perfect black, then an out-of-black curve that gets brighter much faster, while the majority of the curve sits at a pretty high gamma - let's say 2.4 or 2.5. So it's not only the average, it's also the shape of the gamma curve.

Short interlude: display gamma basically was the inverse of the movie cameras' transfer curves (their OETFs). Tech was not so refined that light output could be captured linearly - so when the material was reproduced (on a TV, let's say), the inverse of the camera curve would be applied (gamma), and the resulting image would be close(r) to reality. Initially this was just a 'lucky match', as CRTs' native gamma characteristics almost 'naturally' produced a good inverse match to the movie cameras of the time (just saying that you could not alter a CRT's gamma on a fine-grained or detailed level). All of this then "was responsible" for the CRT characteristics of the time.
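
If it helps, the idea in code (heavily simplified - real camera curves such as BT.709's OETF are piecewise, and the exponents here are just the usual ballpark figures, not exact values from any spec):

```python
# Simplified camera/display round trip. The camera compresses scene light
# with roughly the inverse of the display's power-law gamma.

def camera_oetf(linear_light, encode_gamma=1 / 2.2):
    """Camera side: compress scene light into a video signal."""
    return linear_light ** encode_gamma

def crt_eotf(signal, decode_gamma=2.4):
    """Display side: expand the signal back into light (CRT-like power law)."""
    return signal ** decode_gamma

scene = 0.18  # mid grey, linear light
signal = camera_oetf(scene)
shown = crt_eotf(signal)
print(f"scene {scene:.3f} -> signal {signal:.3f} -> displayed {shown:.3f}")
# The deliberate mismatch (decode 2.4 vs encode 1/2.2) gives an end-to-end
# gamma slightly above 1.0, which suits dim viewing environments.
```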

BT.1886 (a gamma function that you might see on your TV, if newer) was designed to mirror CRT behavior on gamma, with one immense flaw: it's highly dependent on display black level. With a near-perfect black level, the function works as intended. With a much worse (higher) black level (let's say an IPS LCD without backlight dimming), the curve washes out near black much too heavily (makes it too bright), so the resulting image isn't anything that would resemble a CRT's performance.
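
For reference, the BT.1886 EOTF itself is short enough to write down, and a few lines of Python show how strongly the near-black part depends on the measured black level Lb (the 0.30 nit 'LCD' black level below is just a number I picked for illustration):

```python
# ITU-R BT.1886 EOTF: L = a * max(V + b, 0) ** 2.4, with a and b derived
# from the display's white (Lw) and black (Lb) luminance - which is exactly
# why the curve depends so heavily on black level.

def bt1886(v, lw=100.0, lb=0.0, gamma=2.4):
    a = (lw ** (1 / gamma) - lb ** (1 / gamma)) ** gamma
    b = lb ** (1 / gamma) / (lw ** (1 / gamma) - lb ** (1 / gamma))
    return a * max(v + b, 0.0) ** gamma

for v in (0.05, 0.10, 0.25, 0.50, 1.00):
    oled = bt1886(v, lb=0.0)   # near-perfect black: identical to a 2.4 power law
    lcd = bt1886(v, lb=0.30)   # example IPS-ish black level (arbitrary figure)
    print(f"V={v:.2f}  Lb=0.00: {oled:7.3f} nits   Lb=0.30: {lcd:7.3f} nits")
```

With Lb = 0 the shadows stay dark; with the higher black level the same near-black codes come out many times brighter, which is exactly the washout described above.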

Perfectly fine approach. I actually own a 3D LUT box myself, and am able to create LUTs for it on an early-gen OLED, with specialized pattern sets (bright and not-bright patches alternating), since OLEDs show luminance drift (even with brightness compensation algos disabled) on large pattern sets (1500 patches or so), so you need to compensate - and long story short - the improvement in my case wasn't significant enough (compared to a normal 20-point calibration) that I would have kept it in my chain. With one exception, and that is that the B6 I own shows much too saturated colors at very low IREs (the built-in LUTs of OLEDs weren't as fine-grained back then).
I've found that on movies, postprocessing the image a little (mClassic and DVDO iScan Mini in the chain - which allows for quite gradual image postprocessing (sharpening, blurring edges, bleh)) offered much greater benefits in getting something that registers as 'lifelike' than the slight improvement in color accuracy a 3D LUT would have produced.
That said, stepping away from the 3D LUT in my case - I'm now not using configurations that would mimic a CRT gamma (curve); I'm basically using 2.4 linear for movies and 2.2 linear for games. (Also I'm not using any postprocessing, except LG OLED Super Resolution set to Low, which blurs gradients and does some edge enhancement, but everything 'minor' (not very pronounced).)

Enter scanlines. Scanlines overall darken an image, but not quite linearly (I think?). Long story short, they also change depth perception on an image. And when scanlines are 'perfect' on, let's say, a pixel art image, its perceived depth usually (in my case) looks more natural than without scanlines. Some of that may have to do with dithering, but I think it's mostly how (artificial) scanlines change my TV's gamma response. I like it very much. That's one part of it feeling 'like a CRT' in my case (the underlying gamma I have the TV running on is 2.2 PLG (so linear)).

The next thing I do (sometimes) is to produce a full RGB/limited RGB mismatch - which is not at all recommended. So the OSSC outputs full, the TV expects limited. This crushes all near blacks (you won't see them, they become black), which is not at all what you want - but it gives you an 'oversaturated' image pretty quickly, by pressing a few buttons on the TV remote. As not all pixel art games worked with color palettes close to near black (too few colors for that.. ;) ), it's not the end of the world, but that's just the excuse I'm telling myself for using this at all. When I see too much black crush in a scene or game ('darker mood'), or when colors become too artificial looking, I disable it (set it correctly). Setting gamma to PLG 2.4 didn't give me the same 'pop' (but would have retained near-black colors). (I can dim my room, so room light is not an issue here - don't have to compensate for that. :) )
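
Roughly what that mismatch does to the code values, as a sketch (assumed 8-bit numbers; real TVs may clip or handle the range differently):

```python
# Full-range signal fed into an input treated as limited range: the display
# maps code 16 -> black and 235 -> white, so full-range codes at or below 16
# collapse to black and the rest gets stretched (more contrast, more "pop").

def limited_decode(code):
    """Interpret an 8-bit code as limited range (16-235), expand to 0-255."""
    out = (code - 16) * 255.0 / 219.0
    return max(0.0, min(255.0, out))

for code in (5, 16, 20, 64, 128, 235, 255):
    print(f"full-range code {code:3d} is displayed as {limited_decode(code):5.1f}")
# Everything from 0 to 16 lands on 0 (crushed shadows); above 235 clips to white.
```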

Calibrating for Rec. 601 is not something I need to do either, as the OSSC does the color conversion for you (I think). It has an 'input is Rec. 709' option (which is off by default), which implies that it does a Rec. 709 conversion internally before outputting the signal.
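
For the curious, the 601/709 difference is just the luma coefficients in the decode matrix. A minimal sketch, decoding the same YCbCr value with both sets of coefficients (normalised values, no range handling):

```python
# Same YCbCr triplet, two decode matrices. SD material is BT.601, HD assumes
# BT.709 - decode with the wrong one and colours shift (greens/reds mostly).
# Values are normalised: Y in [0,1], Cb/Cr in [-0.5, 0.5].

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    kg = 1.0 - kr - kb
    r = y + 2.0 * (1.0 - kr) * cr
    b = y + 2.0 * (1.0 - kb) * cb
    g = (y - kr * r - kb * b) / kg
    return r, g, b

BT601 = dict(kr=0.299, kb=0.114)
BT709 = dict(kr=0.2126, kb=0.0722)

sample = (0.5, -0.2, 0.3)  # arbitrary test colour
print("decoded as BT.601:", tuple(round(c, 3) for c in ycbcr_to_rgb(*sample, **BT601)))
print("decoded as BT.709:", tuple(round(c, 3) for c in ycbcr_to_rgb(*sample, **BT709)))
```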

When I measure my CRT, the average is about 2.2. And yes, it's true that it's not a flat 2.2 - a CRT's gamma curve is different from 2.2 - but the overall average that calibration software reports is 2.2.
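
(For context, one common way software arrives at such an "average gamma" number - a sketch, not DisplayCAL's actual code; packages differ in whether they fit a curve or subtract the black level first:)

```python
# Compute a "point gamma" at each measured grey level, then average them.
import math

def point_gamma(stimulus, luminance, white_luminance):
    """stimulus is the normalised video level (0..1)."""
    return math.log(luminance / white_luminance) / math.log(stimulus)

# made-up example sweep: (stimulus, measured cd/m2), white at 100 cd/m2
sweep = [(0.10, 1.0), (0.25, 3.6), (0.50, 17.0), (0.75, 49.0)]
white = 100.0

gammas = [point_gamma(s, y, white) for s, y in sweep]
print("point gammas :", [round(g, 2) for g in gammas])
print("average gamma:", round(sum(gammas) / len(gammas), 2))
# Note how a brighter-than-expected near-black point (lower point gamma)
# pulls the average below where the bulk of the curve sits.
```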


You also have to compensate for the fact that LCDs have weaker black levels, and also compensate for the consoles' output. When I create a 3D LUT with a 2.2 curve it looks washed out compared to an actual CRT, so I need to hit a higher gamma. Classic video game consoles are not like movie cameras, so the 2.2 curve doesn't really apply. Games seem to target something different.


White drift compensation and patch size would be good for large patch-size measurements on OLED. But modern displays are decently calibrated out of the factory, so you may only need slight adjustments to compensate for panel variance, without a need for a 3D LUT. The amount of accuracy a 3D LUT gives is not really needed for the everyday person as long as the display is well behaved, and most people won't care about slight inaccuracy as long as it's good enough. Only pros need that amount of accuracy, and they sometimes calibrate their displays every week if they have bad drift.

I also hit the same targets you do. I have a newer LG OLED and it supports internal 3D LUTs without a need for external boxes. So I target 2.4 for movies, and for web browsing, modern games and YouTube, a 2.2 gamma. I noticed that 2.4 crushes too much detail in modern games and makes it harder to see, so I stick with 2.2.


But I meant 100 nits is only a recommendation for the consumer. In broadcast it's a requirement. But it also depended on CRT size: at 32 inches the target was 80 nits, smaller sizes were 100 nits. Sometimes service manuals will have you calibrate to 120 nits. But that doesn't really matter for everyday home use for a non-pro. You can set it to 50 nits if you want to. Whatever is comfortable to the person's eyes.


Yes, CRTs didn't flicker at all with 240p sources, because the way they draw an image is different from BFI. It's drawn by rolling scan, and the flicker is less bad on a CRT. There are some modern displays that incorporate rolling scan as an alternative to BFI to eliminate blur, along with other solutions.


I have a CRT that can output 1080i, so I have clarity and resolution on my side :D.
 
Last edited by SG854,

notimp

Well-Known Member
OP
Member
Joined
Sep 18, 2007
Messages
5,779
Trophies
1
XP
4,419
Country
Laos
You also have to compensate for the fact that LCDs have weaker black levels, and also compensate for the consoles' output. When I create a 3D LUT with a 2.2 curve it looks washed out compared to an actual CRT, so I need to hit a higher gamma. Classic video game consoles are not like movie cameras, so the 2.2 curve doesn't really apply. Games seem to target something different.
2.2 is the sRGB standard (so what was usually used in desktop publishing / PC work; movies usually used 2.4).
Also, a CRT that averages to 2.2 is actually almost odd. But then again I don't have much experience profiling or calibrating CRTs, and the black compensation part of the gamma curve (usually looking something like this ( https://forum.doom9.org/showthread.php?t=173532 ), left part, first graph) can move the average quite low, I presume.

Raising the black level on an OLED (increasing the brightness of black) isn't recommended, nor usually possible. (Although I tweaked mine in the service menu (upped it a little, but it still measures as infinite contrast on an ANSI checkerboard... ;) )).

But 'out of black' (so near-black) colors usually get brighter faster on CRTs than on a linear 2.2 or 2.4 gamma line. (That's the bend on the left side that I referred to as 'the black compensation part'.)

I would be very interested in your CRT's actual gamma curve and black level, if you happen to have them on hand (brand and model also, but mostly the other two). As I always assumed that the black level on a CRT was 'close to perfect black' (certainly not higher than 0.01 cd/m2 (nits)).

But that's just a side note... :)

Yes, games don't compensate for the inverse of camera gamma, but that's not the point. :) The point is that the TVs at the time had that curve baked in, and therefore looked like we expected them to look back then. Also, that's just a nice anecdote. For reference, just remember that on consumer CRTs you didn't change gamma (like, at all.. ;) ), and the most significant difference to the mastering environment might have been a game studio mastering on PC monitors targeting 2.2 gamma, while consumer screens would have averaged around 2.4 (roughly).

Also, I know my B6 OLED and its issues very well (and no, its factory calibration was crap ;) - but yes, on modern OLEDs it isn't ;) ). The issue actually was the factory LUT, which was too small back in the day. So normal calibrators wouldn't catch it in normal measurements, but if you ran a saturation sweep at 15% luminance (normally you only run it at 100% for calibration), you suddenly saw the saturation of your primaries and secondaries explode, which is why I bought the LUT box in the first place. :) Also, 'normal calibration' isn't an issue (you can use normal window patterns); luminance usually won't shift much - it's just that after 150 patterns or so in a row it does. Which also means it does on real-world material, but that's a whole different story.. ;)
But I meant 100 nits is only a recommendation for the consumer.
I actually read the ITU (I believe?) spec (the one for HD stuff, probably Rec. 709) way back when, and it's in the spec, so it's what all calibration is based on. Either the ITU (or the BBC ;) don't remember) then specifies that this is the target for dark-room viewing, and that you have to adjust viewing conditions accordingly.

But. I'm fairly certain that they arrived at 100 nits by pulling a number out of a hat. ;)

In general it's the brightness of white in central Europe on a clear-sky day in the hours shortly after noon. :) Why? Because that's when most camera crews shot their 'sunny day' scenes, and they usually used a sheet of paper for white balancing.. ;)

Movie theaters back in the day never reached 100 nits on screen, btw. Usually only about 60 to 80 nits. So yeah... there's that as well... :)

Calibrators usually will tell you that 100 nits is just a suggestion, because they can tell you to 'adjust room conditions' - but that's just a side note really, because in practice, room light dictates everything else in terms of the brightness (luminance) target, and that's it.

Nowadays most calibrators even calibrate with 120 cd/m2 as an SDR target, for which I could never find the reasoning. Probably because the customer likes it brighter (on average.. ;) ).

That all said, once more - color luminance (so 'brightness') is the parameter that least impacts color perception. People usually say that brighter looks more saturated, perception-wise - but meh. Once you get into color perception science and experiments (read a few papers), everything becomes almost esoteric.. ;)
 
Last edited by notimp,

SG854

Hail Mary
Member
Joined
Feb 17, 2017
Messages
5,215
Trophies
1
Location
N/A
XP
8,104
Country
Congo, Republic of the
My CRT is a broadcast-grade monitor, model Sony BVM-D20F1U, the reference for color. This Sony monitor was the most popular one used around the world for broadcast. I can tell you that the monitors used back then for color grading movies have an average gamma of 2.2, since I have one right next to me.



The black level does indeed measure 0.01 nits. But it's not perfect black. The black level on consumer sets was deeper, since they had a darker tinted tube, which does better in a bright room. Really high-end broadcast-grade CRTs had lighter tinted tubes, so black levels weren't as deep, and they were meant for a dark room, where black levels perform best.


It's not at all strange that the average gamma of my CRT is 2.2. Most calibration software targets 2.2 on modern displays to mimic the average behavior of a CRT. It's under 'Why has a default gamma of 2.2 been chosen for some presets?'. I do not profile my CRT; that is its native response.

https://displaycal.net/#presets


Games were made on an RGB monitor, be it broadcast quality or a computer monitor. Both have an average gamma of 2.2. And they'd have a consumer set around, hooked up over composite, to see what their work would look like on the average TV.


All I know is that 2.2 on an LCD doesn't match a CRT for classic game consoles. So I choose a different gamma on my LCD to try to match it more closely.
 
Last edited by SG854,

notimp

Well-Known Member
OP
Member
Joined
Sep 18, 2007
Messages
5,779
Trophies
1
XP
4,419
Country
Laos
Thank you very much. :) Learned something today. :)
 

SG854

Hail Mary
Member
Joined
Feb 17, 2017
Messages
5,215
Trophies
1
Location
N/A
XP
8,104
Country
Congo, Republic of the
Consumer sets were something like 2.35-2.5; that's what people are reporting. I don't have a consumer set to measure. And gamma seems to vary by brand. Broadcast and PC monitors averaged 2.2. I don't know if this is done by the electronics, but all I know is what I'm measuring.


But gamma wasn't really standardized on displays until fairly recently. White point and color space were standardized, but gamma wasn't. They didn't really need to standardize gamma when everyone had CRTs. But when people started to switch to LCDs there was a need to standardize it, and 2011 was the first attempt at standardizing it, with BT.1886.

Back then they used to target 2.2 for movies on LCD, but they changed it to 2.4 over time. Generally nowadays it's usually 2.4 for a dark room and 2.2 for a bright room. Gamma is one of those tricky things.


But anyway, back to 240p emulation. 2.5 on the LCD seems to give me fairly good results - very saturated colors for classic games. I'm switching between 2.4 and 2.5 to see what looks best.
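
One way to approximate that higher, consumer-CRT-style gamma on a display calibrated to 2.2, without touching the calibration, is a gamma "trim" on the signal - the kind of adjustment many retro-oriented shaders expose. A sketch of the idea (my own simplification; it ignores the near-black roll-off a real CRT has):

```python
# Pre-distort the signal by the ratio of the two exponents so a 2.2 display
# ends up tracing a flat 2.5 response.

def gamma_trim(v, display_gamma=2.2, target_gamma=2.5):
    """v is a normalised (0..1) video level."""
    return v ** (target_gamma / display_gamma)

for v in (0.10, 0.25, 0.50, 0.75, 1.00):
    shown = gamma_trim(v) ** 2.2   # what the 2.2-calibrated display puts out
    wanted = v ** 2.5              # what a flat gamma-2.5 display would put out
    print(f"level {v:.2f}: trimmed on 2.2 display {shown:.4f}  vs  flat 2.5 {wanted:.4f}")
```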
 
Last edited by SG854,

notimp

Well-Known Member
OP
Member
Joined
Sep 18, 2007
Messages
5,779
Trophies
1
XP
4,419
Country
Laos
And broadcast and PC monitors averaged 2.2
Yes, at least the later ones were targeting sRGB, which is defined to use gamma 2.2 (with a 'black compensation' curve characteristic near black). See: https://en.wikipedia.org/wiki/SRGB
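
For completeness, the sRGB decode itself (per IEC 61966-2-1, same as on that page): a short linear toe near black plus a 2.4-power segment, which together behave like roughly gamma 2.2 overall - that toe is the 'black compensation' part:

```python
# sRGB decoding: linear segment near black, 2.4-power segment above it.

def srgb_to_linear(v):
    """v is a normalised (0..1) sRGB-encoded value."""
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

for v in (0.01, 0.04045, 0.10, 0.50, 1.00):
    print(f"sRGB {v:.5f} -> linear {srgb_to_linear(v):.6f}  (pure 2.2 would be {v ** 2.2:.6f})")
```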

Also, btw - if you ever want to create another LUT that might move you closer to CRT characteristics, see if you can pick BT.1886 as a gamma target and are able to set the black level to a fixed 0.01 cd/m2 (important to get a shape of the BT.1886 curve that will probably resemble your CRT's gamma curve a little more). Also check where the curve lands at the high end - should be slightly above 2.4. You could also experiment with setting deeper black levels. Just know that at a black level of zero (infinite contrast), the BT.1886 curve becomes a plain 2.4 power law (PLG, i.e. flat gamma).
 
Last edited by notimp,
