Agree with everything that was posted.
To get a better image, you should actually try calibrating it.
Your monitor is 98% sRGB only - which in my book is actually a good thing. But it also means that it's not HDR capable and it doesn't have a wide color gamut.
So let's talk about calibrating it correctly.
Turn off HDR. (Sorry to tell you.) HDR on your monitor would actually "overdrive" medium-saturated colors and make them look more saturated, but, combined with the monitor not using the correct gamma curve by default (more on that later), it probably washes the entire image out.
rtings sadly only reviewed the 32 inch variant of your monitor (sadly, because the panels used can vary widely in characteristics between different size versions), but there are a few things we can do.
First - make sure full/limited color range is set correctly. To do that, open up your graphics card control panel (e.g. nVidia control panel) and navigate to the tab where you would change your resolution.
Normally it's recommended to set the signal source to RGB and FULL in there. And as high a bit depth as your cable (seriously) and/or graphics card output support - at that refresh rate. Meaning, if you opt for 240Hz (just an example, your monitor probably would not allow you to set that), bit depth might have to be lower to still get a signal and not a black screen. Same goes for chroma. 4:4:4 is best - but if you have to use 4:2:2 (reduces the required bandwidth), it's still not a major issue.
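To make that bandwidth trade-off concrete, here's a rough back-of-the-envelope sketch in Python (my own illustration, not official spec math - the 14.4 Gbit/s figure is the approximate usable video bandwidth of an HDMI 2.0-class link, and the 20% blanking overhead is an approximation):

```python
# Back-of-the-envelope: why higher refresh rate / bit depth can force a
# lower bit depth or 4:2:2 chroma on a given cable. Numbers approximate.

def bits_per_pixel(bit_depth, chroma):
    """Average bits per pixel for a given bit depth and chroma subsampling."""
    samples_per_pixel = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}
    return bit_depth * samples_per_pixel[chroma]

def link_rate_gbps(width, height, refresh_hz, bit_depth, chroma,
                   blanking_overhead=1.2):  # ~20% blanking, an approximation
    """Approximate required link rate in Gbit/s."""
    return (width * height * refresh_hz * blanking_overhead
            * bits_per_pixel(bit_depth, chroma) / 1e9)

HDMI20_USABLE_GBPS = 14.4  # approx. usable video bandwidth, HDMI 2.0-class link

for depth, chroma in [(8, "4:4:4"), (10, "4:4:4"), (10, "4:2:2")]:
    rate = link_rate_gbps(3840, 2160, 60, depth, chroma)
    verdict = "fits" if rate <= HDMI20_USABLE_GBPS else "does NOT fit"
    print(f"4K60 {depth}-bit {chroma}: ~{rate:.1f} Gbit/s -> {verdict}")
```

So 4K60 at 8 bit RGB just squeaks through an HDMI 2.0-class link, while 10 bit needs either 4:2:2 or a faster port/cable - and the same logic applies at higher refresh rates.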
Once you've set your signal path to RGB (full - as in, if there is a limited option, choose full instead), open your monitor's control panel and find the corresponding color range setting there.
You are looking for a setting (maybe named Black Level, or something entirely different) in your monitor's display options that has two options, full (or high) and limited (or low) (and maybe auto). Set it to full (/high) as well.
Have the Lagom black level test open at the same time.
http://www.lagom.nl/lcd-test/black.php
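(If the site is ever down: you can generate a similar near-black test pattern yourself. A minimal sketch using Pillow - the tile size and the 20 levels are my arbitrary choices, not lagom's exact values:)

```python
# Generates a crude near-black test pattern in the spirit of the lagom
# test: grey squares at levels 1..20 (out of 255) on a pure black
# background. View it fullscreen at 100% zoom. Needs: pip install pillow
from PIL import Image, ImageDraw

TILE, COLS, LEVELS = 120, 10, 20        # arbitrary layout choices
img = Image.new("RGB", (COLS * TILE, (LEVELS // COLS) * TILE), (0, 0, 0))
draw = ImageDraw.Draw(img)

for i in range(LEVELS):
    level = i + 1                        # grey value 1..20
    x, y = (i % COLS) * TILE, (i // COLS) * TILE
    # black border around each square so neighbours don't touch
    draw.rectangle([x + 10, y + 10, x + TILE - 10, y + TILE - 10],
                   fill=(level, level, level))

img.save("black_level_test.png")
```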
--
Insert: At that stage, also try setting your graphics card in the nVidia control panel on the resolution screen to output YCbCr and limited, and then set the monitor to limited (in the monitor settings). See if the black level changes significantly. (IMPORTANT: Only "judge" full/full (set both in the graphics card control panel and monitor settings) vs limited/limited (set both in the graphics card control panel and monitor settings), don't judge full/limited or limited/full.) If the black level doesn't change significantly (as in, black doesn't become "deeper" using one setting compared to the other), stay with (RGB) full/full.
I'm mentioning this here because on my LG monitor, for some reason, using a full/full signal path leads to elevated blacks, meaning for me it makes sense to stick to limited/limited. That's a bug btw (on the monitor side). Don't expect to see it. Just know that if you do, it's better to stay with the signal path that gives you the deeper black level.
--
You should have the lagom.nl black level test open at the same time to be able to judge this more easily.
Ideally it would allow you to differentiate all greys close to black, but don't worry if it clips out a few of them (a few of them are indistinguishable from black). This is influenced by display brightness, gamma, and your room lighting conditions, and two of those you are setting later. But first we need to make sure that you don't have a limited/full or full/limited mismatch between your graphics card's settings panel and your monitor settings, which would result in either _many_ of the close-to-black fields in the lagom test crushing (being not distinguishable from black), or in a _very_ washed out image (elevated black level).
So your job is to make sure that you have either set RGB full (nVidia control panel) and the monitor on full/high (setting probably named black level, or similar) -- or YCbCr limited (nVidia control panel) and the monitor on limited/low. But no mix of limited/full or full/limited.
Also, if blacks don't become "deeper" using either full/full or limited/limited, use full/full preferably.
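If you want to see in numbers why a mismatch is so bad: limited ("video") range puts black at code 16 and white at code 235 (for 8 bit), while full ("PC") range uses 0-255. A quick sketch of both mismatch cases:

```python
# 8-bit levels: full ("PC") range is 0..255, limited ("video") range
# puts black at 16 and white at 235. A mismatch bites in one of two ways.

def full_to_limited(v):
    """Re-encode a full-range code value into limited range."""
    return round(16 + v * (235 - 16) / 255)

def limited_to_full(v):
    """Expand a limited-range code value to full range."""
    return round((v - 16) * 255 / (235 - 16))

def clamp(v):
    return max(0, min(255, v))

# Mismatch A: GPU sends limited, monitor treats it as full ->
# black arrives as code 16 and is shown as dark grey: washed-out image.
print([full_to_limited(v) for v in (0, 128, 255)])        # [16, 126, 235]

# Mismatch B: GPU sends full, monitor treats it as limited ->
# everything at or below code 16 maps to black: crushed shadows,
# and everything above 235 clips to white.
print([clamp(limited_to_full(v)) for v in (0, 8, 16, 24, 235, 255)])
# -> [0, 0, 0, 9, 255, 255]
```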
--
The next step is dialing in some additional settings, following the rtings test of the 32 inch model:
https://www.rtings.com/monitor/reviews/lg/32ul500-w
On the Monitor:
Set picture mode to custom.
Set gamma to Gamma 2.
Set Luminance (probably named brightness) to 18 (default probably was 55)
Set contrast to 68 (default probably was 70)
Set sharpness to 50 (== default; lower would blur the image, higher would sharpen it.)
Then the only things left to set are the RGB values (whitepoint). For that you can use the test images on
https://harlekin.me/burosch -
look at them and see if raising green from 50 to 60 makes them look more natural. This is the only setting where you likely have to guess or go by feel, because rtings only tested the 32 inch model of that monitor, and whitepoint is the setting most likely to change between panels (different manufacturing runs) - so we can't say for sure if this benefits your calibration or not.
(The other settings are more likely to be driven by the chipset used in those monitors and would likely be the same on different screen size versions of the same monitor model.)
rtings' settings for the 32 inch version of the monitor are 45-60-48 (red-green-blue) (default: 50-50-50). You'd probably notice -5 red the least, you'd definitely notice +10 green, and you'd notice -2 blue as well. So use the test images and try to judge if raising green by 10 ticks makes them look more natural.
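If you're curious why +10 green is the most visible change: the per-channel gains shift the white point. Here's a very simplified model (it treats the OSD numbers as simple gains relative to 50 applied in the signal domain and uses the standard sRGB-to-XYZ matrix - real monitors apply these ahead of their internal LUT, and on the actual 32 inch panel the rtings values counteract a factory tint rather than pushing white toward green):

```python
# Naive model of what the RGB gain controls do to the white point.
# Assumption: OSD value acts as a gain relative to 50 in the signal domain.

def srgb_to_linear(c):
    """Standard sRGB transfer function (signal -> linear light)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def white_xy(gain_r, gain_g, gain_b):
    """CIE chromaticity (x, y) of full white after applying channel gains."""
    r, g, b = (srgb_to_linear(gain / 50) for gain in (gain_r, gain_g, gain_b))
    # linear sRGB -> XYZ (D65), standard matrix
    X = 0.4124 * r + 0.3576 * g + 0.1805 * b
    Y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    Z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    total = X + Y + Z
    return round(X / total, 4), round(Y / total, 4)

print("default 50-50-50:", white_xy(50, 50, 50))  # ~(0.3127, 0.3290) = D65
print("rtings 45-60-48: ", white_xy(45, 60, 48))  # a clearly visible shift
```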
That's it.
Turn off any "black level enhancers" (== dynamic contrast) if your monitor has them (off might not be 0 but rather 45 or 50 (*sigh*)), and the "super resolution+" mode (it smudges the image, then adds edge sharpening - might be nice for some movies, but not for PC work, and it adds lag).
You can (and should) then turn brightness up/down as you want, but notice that 18 is the "standard" setting for the sRGB standard (100 nits) on your monitor. What brightness setting you need ultimately depends on your room lighting. But don't feel the need to turn it to 50; 18 is closer to the reference than any other setting. Manufacturers just like to pump it up in the default settings, because it sells monitors ("look - this one is brighter!") in showrooms under halogen lighting.
--
If you want to be more thorough, you need a meter (preferably an xrite i1d3 (== X-Rite i1Display Pro) - you can buy them used (used is fine) for less than 200 USD). With that you'd be able to calibrate RGB (whitepoint) correctly - but all other settings probably would not change.
Also - and that's new - you could use that to generate a 3D LUT profile for Windows ( https://hub.displaycal.net/forums/topic/i-made-a-tool-for-applying-3d-luts-to-the-windows-desktop/ ). Upside: perfect colors. Downside: about 10 percentage points more GPU usage in games.
With a 3D LUT you could also use any gamma curve you'd want. (That's just the resolution of the "more on that later" comment made earlier.)
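For the curious, that's all a 3D LUT is - a lattice of corrected RGB triples that gets looked up (and interpolated) per pixel. A minimal sketch that loads a .cube file and applies it to an image with nearest-neighbour lookup (the file names are made up; real tools, including the linked one, interpolate trilinearly and apply the LUT to the whole desktop):

```python
# Load a .cube 3D LUT and apply it to an image, nearest-neighbour only.
import numpy as np
from PIL import Image

def load_cube(path):
    size, rows = None, []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#"):
                continue
            if line.startswith("LUT_3D_SIZE"):
                size = int(line.split()[1])
            elif line[0].isdigit() or line[0] in "-.":
                rows.append([float(v) for v in line.split()])
    # .cube lists entries with red varying fastest -> axes are [b][g][r]
    return np.asarray(rows, dtype=np.float32).reshape(size, size, size, 3), size

def apply_lut(img, lut, size):
    rgb = np.asarray(img.convert("RGB"), dtype=np.float32) / 255.0
    idx = np.clip(np.rint(rgb * (size - 1)).astype(int), 0, size - 1)
    out = lut[idx[..., 2], idx[..., 1], idx[..., 0]]  # [b][g][r] order
    return Image.fromarray((out * 255).round().astype(np.uint8))

lut, size = load_cube("correction.cube")    # hypothetical file names
apply_lut(Image.open("photo.png"), lut, size).save("photo_corrected.png")
```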
That's mostly useful if your monitor is wide color gamut and doesn't allow you to adjust the sRGB profile. (99% of all internet and even gaming content is still sRGB (everything that's not authored for HDR is), which is basically the same as rec709, the standard for Blu-rays (just not HDR Blu-rays). Gamma for Blu-rays is "darker", so when watching a movie you might want to switch through your monitor's gamma options and see if you prefer a setting that makes the movie "darker". For PC work (and internet content), Gamma 2 on your monitor is correct.) But yours is not wide color gamut (which - aside from high brightness - is what you'd need for an "HDR compatible" monitor), and it does allow you to adjust it, so it's not as necessary for you.
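To put numbers on the "darker gamma" point - the same mid-grey signal simply comes out darker under a 2.4 power curve (roughly what Blu-rays are mastered toward) than under 2.2:

```python
# Same mid-grey signal, different gamma targets: a 2.4 power curve
# renders it darker than a 2.2 one - deeper shadows for movie viewing.
signal = 0.5  # normalized mid-grey input

for gamma in (2.2, 2.4):
    print(f"gamma {gamma}: mid-grey -> {signal ** gamma:.3f} of peak white")
# gamma 2.2 -> 0.218, gamma 2.4 -> 0.189
```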
If you want to go that route - buy an i1d3 and then ping me again. It's not hard, but you need an additional howto.
--
That's it.