Question about 4k TV

Noctosphere

Nova's Guardian
OP
Member
Joined
Dec 30, 2013
Messages
6,756
Trophies
3
Age
30
Location
Biblically accurate Hell
XP
18,709
Country
Canada
Hello
So, I'm thinking of buying a 4k TV.
However, I've been told I need some kind of extra technology, HDR or something like that...
But if I don't want to game on it, only watch video and such, do I need this tech?
I mean, that HDR or whatever - is it only for console and PC gaming?
Or does it also affect a Roku TV and such?

thanks
 

retKHAAAN

Well-Known Member
Member
Joined
Mar 14, 2009
Messages
3,840
Trophies
1
XP
1,599
Country
United States
HDR is a technology that can potentially improve the picture via color/contrast in a more noticeable way than 4K does (depending on screen size and how far you sit from it). After owning a TV with HDR10 and Dolby Vision, I personally won't buy anything less.

The good news is that HDR is starting to show up in even budget TVs so it shouldn’t be hard to find a sufficient set.

TCL has some pretty solid options for cheap.
 

Noctosphere

Nova's Guardian
OP
Member
Joined
Dec 30, 2013
Messages
6,756
Trophies
3
Age
30
Location
Biblically accurate Hell
XP
18,709
Country
Canada
alright, thanks
so it does affect video streaming, right?
HDR isn't only for gaming, right?
 

retKHAAAN

Well-Known Member
Member
Joined
Mar 14, 2009
Messages
3,840
Trophies
1
XP
1,599
Country
United States
Content has to specifically take advantage of it. Netflix has a lot of Dolby Vision content, and most 4K content from digital marketplaces (iTunes, Vudu, Movies Anywhere integrated services) includes some form of HDR.
 

Noctosphere

Nova's Guardian
OP
Member
Joined
Dec 30, 2013
Messages
6,756
Trophies
3
Age
30
Location
Biblically accurate Hell
XP
18,709
Country
Canada
alright, so better to get it just in case?
thanks
 

notimp

Well-Known Member
Member
Joined
Sep 18, 2007
Messages
5,779
Trophies
1
XP
4,420
Country
Laos
Here is the rundown.

You will not see 4k at normal viewing distances, unless the TV is 65", more likely 70", or bigger.

(You will never see 8k, even in a cinema - when sitting in any row other than the first three.)
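That viewing-distance claim comes down to angular resolution: roughly 60 pixels per degree is the usual figure for 20/20 acuity, and beyond that extra pixels are basically invisible. A minimal back-of-the-envelope sketch in Python - the screen sizes and distances are just example values I picked, not anything from this thread:

```python
import math

def pixels_per_degree(diagonal_in, horiz_res, distance_in, aspect=16/9):
    """Pixels per degree of visual angle at the centre of the screen."""
    # Screen width derived from the diagonal and the aspect ratio.
    width_in = diagonal_in * aspect / math.sqrt(1 + aspect**2)
    pixel_pitch = width_in / horiz_res                       # inches per pixel
    # Visual angle subtended by a single pixel, in degrees.
    deg_per_pixel = math.degrees(2 * math.atan(pixel_pitch / (2 * distance_in)))
    return 1 / deg_per_pixel

# ~60 px/deg is the usual 20/20 acuity figure; once 1080p already exceeds that
# at your seating distance, stepping up to 4k adds detail you cannot resolve.
for diag, res, label in [(55, 1920, "55\" 1080p"), (55, 3840, "55\" 4k"), (75, 3840, "75\" 4k")]:
    for dist_ft in (6, 9):
        ppd = pixels_per_degree(diag, res, dist_ft * 12)
        print(f"{label} at {dist_ft} ft: {ppd:.0f} px/deg")
```

Run it and you'll see a 55" 1080p panel already lands above the acuity limit at typical couch distances, which is the whole point being made here.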

That's why HDR was chosen as the next 'thing' to sell people TVs with.

It's 'more contrasty images'.

It means nothing.

OLED makers lobbied to get the premium HDR label at around 300 nits peak - when arguably you'd need 2000-3000 nits to be somewhat color correct (with negligible error compared to the mastering standard). Fun fact - most content today isn't even mastered at 1500 nits.

Also, the most common standards are broken.

Also, the standards that aren't broken (dynamic metadata) are often misimplemented. They also cost licensing fees, so no one wants to pay for them. So it isn't mass market.

As a result, the 'best' HDR today actually comes from an approximation algorithm scanning a broken signal (HDR10), trying to glean what the HECK a scene could be about, and then dynamically remapping colors.

Only the more expensive TVs can do that.

So for 98% of the market, HDR is an effing stupid thing at the moment - if you like your image to be somewhat color correct.
--

If you want the technical explanation:


HDR was specified as Rec.2020 with PQ gamma at 10,000 nits peak brightness.

Prior gamma curves (2.2, 2.4, ...) were relative to display brightness; PQ gamma is absolute. 10,000 nits is fixed in the standard. No TV can do 10,000 nits.

From a mastering perspective, everything above 1500 nits is "highlights only", so not that relevant for "scene color" - which is why, with TVs that can output 1500 nits (real nits, RGB), we'd slowly come into the region where the current HDR standard actually makes sense.
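To make the "PQ is absolute, not relative" point concrete, here is a minimal Python sketch of the PQ (SMPTE ST 2084) EOTF using its published constants. A code value maps straight to an absolute luminance in nits, regardless of what the display can actually output - which is exactly why every real TV ends up tone mapping:

```python
# PQ (SMPTE ST 2084) EOTF: normalized code value -> absolute nits.
# Unlike gamma 2.2/2.4, this is not relative to the display's brightness.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code: float) -> float:
    """Absolute luminance (cd/m², i.e. nits) for a PQ code value in [0, 1]."""
    p = code ** (1 / m2)
    return 10_000 * (max(p - c1, 0.0) / (c2 - c3 * p)) ** (1 / m1)

for code in (0.0, 0.5, 0.75, 1.0):
    print(f"PQ code {code:.2f} -> {pq_to_nits(code):8.1f} nits")
# A code value of 1.0 means 10,000 nits no matter which TV decodes it.
```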

We aren't there yet.

We are still watching HDR movies on screens with 300-1000 nits of brightness (fake nits in the case of OLED, because white-subpixel brightness is not an RGB color) - movies that most commonly were mastered on maybe 1200-nit mastering monitors.

This means the following: TVs today can't display the color space that most HDR movies get mastered in.

It doesn't end there. HDR movies today get mastered on screens that aren't standard compliant, because the standard says 10,000 nits - and those things don't exist yet.

Which makes HDR a moving target, which is stupid.

And it doesn't end there.

Because normal TVs today can only reproduce 200-500 nits, the resulting color space is vastly reduced compared to the mastering screen. Meaning, colors are all wrong.

The most popular format (HDR10) only has one indicator of max mastering brightness encoded in the movie - for its entire length. So the entire color space gets compressed down to your TV's capabilities based on that. Meaning wrong colors.

Which is where the "approximation algorithm scanning a broken signal (HDR10), trying to glean what the HECK this scene could be about" comes in: it interprets what kind of scene this is and does dynamic adjustments based on the max brightness in the scene - is this a daylight scene, is it an outdoors scene, ... In dark scenes more of the "guessed as intended" color palette is shown (because maybe there isn't 1500-nit content on screen); in bright scenes, more of the brighter colors are clipped.

So a good algorithm there (LG, Sony, Panasonic, not sure about Samsung - high-cost TVs) is currently your best bet.
--

There would be a saner approach to all this, and that would be dynamic-metadata-based HDR formats (Dolby Vision, HDR10+), but those have been wrongly encoded on discs in the past as well - so forget it. Dynamic metadata would be able to tell the TV what the brightest mastered color on screen is in each scene, allowing the TV to display more of the intended color space in darker scenes.
--

That said, the point where color differences in mastering only concern 'highlights' and not scene-painting colors is at 1500-2000 nits.

So currently all TVs are displaying HDR movies wrong, at least in some higher-brightness color patches.

This is just comparing color spaces, not yet looking at accuracy within a color space (what you'd call calibration). So it's "can the TV even reproduce the stuff it's fed" - and the answer in all HDR cases currently is - well, no.
--

The next stupidest thing is that mastering within the HDR standard has now reduced the color space to P3 primaries (a more limited gamut compared to Rec.2020, while the brightness race still continues) - without this even being a standard; it's just that everyone is doing it.

AND - mastering is supposed to move to brighter screens (10,000 nits being the ultimate goal) in the future - but still within the same HDR standard.
--


HDR IS WHAT HAPPENS WHEN YOU LET INDUSTRY CONSORTIUMS DECIDE ON THE NEXT STANDARD FORMAT - when everyone has a vested interest in slapping their "premium" logo onto as many devices as possible.

It's an outright joke. It's so bad. People had to resort to fixing the mess with algorithms - the standard is that broken.

People who don't care about color accuracy - go ahead, buy it - it's brighter.
--

In addition, half the mastering studios out there don't know what they are doing either. It's the wild west.

Also, many people think that the "average brightness level" (how bright the image is) of PQ gamma (which is now fixed and not relative) is too dark. It is also optimized for dark-room viewing. All HDR viewing is.

Consumers don't like that, so manufacturers started implementing "daytime HDR modes" - the issue is that within the standard you can't. And if you do, you are just messing up colors again.
 
Last edited by notimp,

IncredulousP

GBAtemp's Resident Bastard
Member
Joined
Aug 21, 2012
Messages
679
Trophies
2
Location
Penguin Village
XP
3,036
Country
United States
Excellent breakdown. OK, so for a customer who has no idea what most of that means, what would you, in layman's terms, recommend looking for in a TV? Let's assume the resolution I'm looking for is 1080p. Should I just go into a store IRL and find the picture I think is most pleasing to my eyes?
 

notimp

Well-Known Member
Member
Joined
Sep 18, 2007
Messages
5,779
Trophies
1
XP
4,420
Country
Laos
The answer is no - because in stores they use "store modes" that push blue in the image by 30+%, they display under halogen light, and there's light scatter from all the other TVs around, so you'll never see what you'd see at home.

Should you go with reviews and buy what they recommend? The answer is also partly no - because at the high-end level the weighting is largely subjective.

In the end it would be a mixture of both.

(Buy an LG OLED, or a TCL 55S517, or a Vizio P-Series F1 -- or the Sony 'that model' with zone-based backlight boosting in flickering mode ("Backlight Master Drive™" should be the marketing term there - there should be a new one coming that was shown at CES.))

The next thing around the corner is "better motion resolution", which is actually dearly needed, but the jury is still out on how that's going (this year's fall TVs).

Also, you almost can't buy 1080p sets these days. Think of 4k as something you get almost 'for free', because it was important for everyone to have it due to the marketing race.
--

If you don't care about HDR and don't care about the inkiest blacks, you could actually buy VERY cheap, depending on panel characteristics (how good the Rec.709 color mapping ("calibration") is). Rec.709 (non-HDR Blu-ray stuff, 90% of the HD content out there) has matured and everyone can do it.

Then you would get into - is viewing angle important? And maybe buy a Samsung non-QLED.

But TCL and Vizio were aggressive with full-array local dimming pricing on the two models listed, so you get better blacks at LCD prices with them. The question is, do you need it?

If you are really into HDR these days, you'd probably go with a Sony LCD or a Samsung QLED - but you shouldn't (see the issues with HDR above).

Everyone and their mother is recommending OLEDs for their "perfect black", which, in fact, you might not need so much (full-array local dimming black on LCDs is good enough). They also have the best viewing angles currently, but suck at HDR compared to other high-end TVs.
--

Also, the next battleground will be motion resolution, which is overdue - because all TVs' motion performance nowadays sucks.

(When the image moves fast, it smears on screen or in your eyes, compared to the plasmas or CRTs of old.)

How that pans out, we'll see at the end of the year.
--

Short summary:

Buy what the wirecutter tells you:
https://thewirecutter.com/guides/buying-a-tv/

Or an LG OLED. Or a Panasonic OLED (better color, much higher price, no Dolby Vision (?)). Or a high-priced Sony TV with "Backlight Master Drive™".

Don't buy Samsung in the high-price segment yet (maybe they bring out better products at the end of the year).

If you don't care about the best black level, the best motion performance (which still sucks) or HDR - buy anything that has a decent review on out-of-the-box color accuracy and is cheap.

If you spend more money, make sure you get a 120Hz panel (or at least a 60Hz panel that can do 48Hz for judder-free 24p (there usually aren't many of them out there)). OLEDs have 120Hz panels. Don't go by manufacturer Hz figures, they fake them (480+Hz! (= made up)).

And before you decide on which one - go into the store and check the haptics: how you like the remote, how you like the menus, and so on.

Never buy a TV because of its SMART capabilities. Buy an Apple TV box (HDR10, DV best implemented (?)), or a Fire TV or a Roku afterwards.

Yes - it's really that murky and chaotic.. ;)

Which is also why everyone just says buy OLED - but that's really not the gist of it. They are good - but they aren't necessarily that good for the price (Chinese manufacturers have cheap prices) - and 'what is best' depends on how you weigh certain aspects.
 
Last edited by notimp,

IncredulousP

GBAtemp's Resident Bastard
Member
Joined
Aug 21, 2012
Messages
679
Trophies
2
Location
Penguin Village
XP
3,036
Country
United States
Amazing, thanks for your detailed posts. One more question: what would you recommend for good motion resolution and low input latency? These are the two qualities that matter most to me. Preferably something cheaper, but if it can't be helped, then ignore price.
 

notimp

Well-Known Member
Member
Joined
Sep 18, 2007
Messages
5,779
Trophies
1
XP
4,420
Country
Laos
Don't know about input latency off the top of my head. A few manufacturers reduced it quite significantly in the past year, so check https://www.rtings.com/tv for that stuff (their reviews suck, their graphs don't ;) ). The best TV reviews out there are from Vincent Teoh (imho) - https://www.hdtvtest.co.uk/ and https://www.youtube.com/HDTVtest (he doesn't test cheap US brands, as he is UK based).

Motion resolution is bad on all TVs - except for the Sony(s) with Backlight Master Drive. Which is also part of the solution a few manufacturers are looking into for fall releases.

The issue here is that without some form of flicker or strobing (black then color, black then color) we always see afterimages on today's tech (we didn't on CRT or plasma). The reason is either image retention at the pixel level (the time to fully switch between two colors), or, e.g. in the OLED case, image retention on our retinas.

Plasmas were pulsing (with black frames in between); CRTs created images line by line out of a black screen (flicker).

So the new attempts to get better motion resolution all try to insert black frames into the image content (they have ample time to do so, because the signal usually is only 24p or 30p and they can push frames at 120Hz). The issue here is that in the past, when they did that, it reduced overall screen brightness by so much that no one was using those modes - and the flicker was also "severe".

Sony was the first that tried to tackle that, by selectively raising brightness on high-brightness image elements while, or shortly after (don't know), a black frame was pushed. (That's their "Backlight Master Drive™" stuff.) Now others are looking at similar solutions. We'll see how it goes.


In the early days, everyone blamed bad motion resolution on LCDs on pixel response (switching) time, but that turned out not to be the - sole - cause, as OLEDs have very low numbers there on paper, yet their motion resolution still isn't any better. Then people came up with "it's actually image retention on your retina" - and you need some sort of flicker or strobing. The first implementations of flicker for more motion resolution on LCDs were useless (image too dark); with the next TV generation it might slowly get better.

The Sony TVs with Backlight Master Drive already have some form of the 'better solution' implemented (1 or 2 models in total).
--

If you are talking about "motion resolution" as in "interpolation"/motion smoothing (making up in-between frames that don't exist): that's the whole 120Hz panel thing (one part of it). You need 120Hz panels for that. That has also been sold to you as the solution for the past 5-7 years, but it never really was. ;)

Sony also has the best implementation there. (For 24p movies, certainly.)

But everyone else's is close. (Kind of an "everyone is cooking with water" thing.)

The thing with making up intermediate frames that don't exist is that it still sucks, because it changes the "feel" of movies.
Motion resolution in the end is great - but up to +200% (two thirds ;) ) of the frames in the movie you see are entirely made up (> soap opera effect).

Tom Cruise and another Scientology buddy of his will tell you all about this:


Don't ask me why they were doing that. ;)
 
Last edited by notimp,

notimp

Well-Known Member
Member
Joined
Sep 18, 2007
Messages
5,779
Trophies
1
XP
4,420
Country
Laos
Motion resolution, btw, means "the image detail you are able to see when e.g. a camera is panning fast, or something in the shot is moving fast". The issue here is that with all current display tech - and movies being delivered at 24p or 30p - visible resolution is fine (4k!) if the images are fairly still, but as soon as there is quick movement on screen, visible detail resolution 'drops down to 480p' (that's an analogy, but very descriptive of what's happening).

Now - this could be solved by shooting movies at 60Hz (p) or 120Hz (p), but that's never going to happen. (This is also why this isn't an issue for 60+Hz gaming on today's screens.)

This could also be solved by interpolation (making up intermediate frames for up to 2/3 of the frames in a movie you see), but the issue then is that you'll see a largely "made up" movie - and not what's on the disc, or what was shot. ;) (> soap opera effect)

In the future more people might try to solve it with some sort of black frame insertion. But 'better' than past attempts.

Your current best-bet approach is to use a very slight interpolation setting (motion settings menu on your TV set to custom) - or actually turn it off when you are watching 24p content. Which is most movie content.

But then you still have bad motion resolution, even with a 120Hz panel.

(You still want a 120Hz panel for 'it's a multiple of 24(p)' reasons. If you get a 60Hz panel that can't be refreshed at 48Hz, the TV has to repeat some frames more often than others (3:2 pulldown) to bring 24p up to the 60Hz refresh rate (not a multiple of 24). That's called judder, and it's also bad - although most people might not notice it if you don't tell them what to look for. On bigger screens it becomes more noticeable. But then you don't need big screens either, do you.. ;) )
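The cadence difference is easy to see with a few lines of Python. This is just an illustrative sketch of how 24p maps onto different panel refresh rates - 60Hz forces an uneven 3:2 hold pattern (judder), while 48Hz and 120Hz divide evenly:

```python
import math

def repeat_pattern(source_fps: int, panel_hz: int, n_frames: int = 8):
    """How many panel refreshes each source frame is held for."""
    counts, shown = [], 0
    for i in range(1, n_frames + 1):
        # Total refreshes that should have elapsed after i source frames.
        target = math.floor(i * panel_hz / source_fps + 0.5)
        counts.append(target - shown)
        shown = target
    return counts

print("24p on 60Hz :", repeat_pattern(24, 60))    # [3, 2, 3, 2, ...] -> uneven hold times = judder
print("24p on 120Hz:", repeat_pattern(24, 120))   # [5, 5, 5, 5, ...] -> even cadence, no judder
print("24p on 48Hz :", repeat_pattern(24, 48))    # [2, 2, 2, 2, ...] -> also even
```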
--

A more detailed explanation of what the "approximation algorithm scanning a broken signal (HDR10) trying to glean what the HECK this scene could be about" actually does on HDR10 content (good luck finding which TV setting enables it on your brand's TV.. ;) ), on more expensive current TVs.

HDR10 has only one max brightness value encoded for the whole movie (the max brightness seen in mastering). That means that with the conventional approach you are compressing the entire color space defined on the disc (or in the digital stream) - likely from 1500 nits, maybe even from 2000 nits - down to, let's say, 300 nits on your "premium" HDR TV. This means you are shrinking the entire color space - linearly.

Meaning every color that gets displayed is displayed wrongly (wrong brightness level, usually also wrong saturation) - even those that could be displayed correctly (the less bright ones), because the entire color space is shrunk linearly. There is then some additional stuff around peak-brightness rolloff - but you can look into that on your own. ;)

The "approximation algorithm scanning a broken signal (HDR10) trying to glean what the HECK this scene could be about" approach now is to "identify" what scene you are watching (is the high-nits content (nits, btw, is a measure of brightness) in the scene because it's a daylight scene, a shot where you see the sun and the sky, or simply because of some bright highlights in the scene) and to do different color mapping on your TV depending on what the scene is showing.

So in dark scenes, show more of the intended original colors the TV can display. In bright scenes, clip more of the highlight colors, so you still get somewhat closer-to-accurate colors plus an HDR wow effect - but by no means actually do what the standard (HDR10) would tell you to and compress the color space linearly. (Which would result in every color being displayed maximally wrong.. ;) )
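A toy Python sketch of that contrast - this is NOT LG's or anyone's actual algorithm, just an illustration of "one linear squeeze for the whole title" vs. "adapt per scene". The panel/title peak numbers and the simple knee rolloff are assumptions for the example:

```python
PANEL_PEAK = 300.0      # what the TV can actually show, in nits (assumed example)
TITLE_PEAK = 1500.0     # the single mastering-peak value an HDR10 title carries (assumed example)

def static_map(pixel_nits: float) -> float:
    """Naive HDR10 handling: squeeze the whole 0..TITLE_PEAK range linearly."""
    return pixel_nits * PANEL_PEAK / TITLE_PEAK        # every color gets dimmed, always

def scene_adaptive_map(pixel_nits: float, scene_peak: float) -> float:
    """Toy 'dynamic tone mapping' idea: only compress when the scene needs it,
    against the scene's own peak, rolling off just the highlights."""
    if scene_peak <= PANEL_PEAK:
        return pixel_nits                              # dark scene: show colors as mastered
    knee = 0.75 * PANEL_PEAK                           # below the knee: leave colors alone
    if pixel_nits <= knee:
        return pixel_nits
    # Roll the range knee..scene_peak off into knee..PANEL_PEAK (highlights get squeezed).
    return knee + (pixel_nits - knee) * (PANEL_PEAK - knee) / (scene_peak - knee)

# A 150-nit face, in a dark scene (peak 200 nits) vs. a bright scene (peak 1200 nits):
print(static_map(150.0))                  # 30 nits - wrong, even though the TV could show 150
print(scene_adaptive_map(150.0, 200.0))   # 150 nits - untouched
print(scene_adaptive_map(150.0, 1200.0))  # 150 nits - still below the knee, untouched
print(scene_adaptive_map(900.0, 1200.0))  # rolled off to fit under the 300-nit panel peak
```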

Because HDR10 is the most common HDR standard today, that's actually the most viable approach.

And it's still not displaying HDR colors more accurately than your TV physically can (what - at 300 nits max brightness, when the movie was mastered at 1500 nits?). It's just reducing the HDR10 standard issues that make it look even more wrong.

It's so painfully stupid...

LG calls that algorithm stuff 'dynamic tone mapping', btw:


So that you have a reference. Also, out of all high-end TVs, OLEDs need it the most (because of the lowest peak brightness).

And it might be less needed the closer your TVs get to a real 1500-nit capability (and we are getting there). But then mastering is supposed to chase the 10,000-nit goal of the actual standard, and then the gap widens again (at least then it poses less of an issue, because it mostly impacts highlight colors at that point) and... It's so stupid... ;)
 
Last edited by notimp,

notimp

Well-Known Member
Member
Joined
Sep 18, 2007
Messages
5,779
Trophies
1
XP
4,420
Country
Laos
Watching the following two videos (split screen portion):

h**ps://www.youtube.com/watch?v=ZhSyxjnwLOA
h**ps://www.youtube.com/watch?v=8_rUfVBhM8U

is actually a good way to understand how HDR is used to "paint" scenery.

Both (?) video games are already able to output 4000 nits peak brightness, so using a color-coded heatmap, those videos actually show you which parts of the image you would "lose" color accuracy on - once your TVs are capable of displaying 1500 nits.

The very short summary on this is: HDR starts to make more sense once TVs can output 1500+ nits. Then the color accuracy that's lost almost entirely sits in highlights. HDR10 also becomes less problematic then.

For the purpose of actually adhering to the 10,000-nit standard (which no one currently is), 2000-3000-nit TVs would be even better. (And a 10k-nit TV would be best.. ;) )

It's also a good way to "imagine" how much of an image would not be color correct (even after dynamic tone mapping) on a 300-500-nit OLED (and those are fake nits (white subpixel pushed) - meaning that slightly below max nits, colors still aren't correct on OLEDs).

It's a good visualization, basically.
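The same kind of heatmap can be approximated in a few lines: decode a PQ-encoded frame to absolute nits and flag every pixel above a given panel's real peak. A minimal NumPy sketch - the random array is just a stand-in for a real decoded video frame:

```python
import numpy as np

# Same published PQ (ST 2084) constants as in the EOTF sketch further up the thread.
m1, m2 = 2610 / 16384, 2523 / 4096 * 128
c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code):
    """Vectorized PQ code value -> absolute nits."""
    p = np.power(code, 1 / m2)
    return 10_000 * np.power(np.clip(p - c1, 0, None) / (c2 - c3 * p), 1 / m1)

# Stand-in for one PQ-encoded luma frame; real code would decode an actual video frame here.
frame = np.random.default_rng(0).uniform(0.0, 0.75, size=(2160, 3840))

nits = pq_to_nits(frame)
for panel_peak in (300, 700, 1500):
    over = np.mean(nits > panel_peak) * 100
    print(f"{panel_peak:5d}-nit panel: {over:5.1f}% of pixels exceed what it can show")
```

The share of flagged pixels shrinks as the panel's real peak goes up, which is the whole "HDR starts making sense at 1500+ nits" argument in one number.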
 
Last edited by notimp,

notimp

Well-Known Member
Member
Joined
Sep 18, 2007
Messages
5,779
Trophies
1
XP
4,420
Country
Laos
Gotta love analytical minds. :) Send them into a room with the premise that they can't take any notes or pictures - and they simply reconstruct what's been shown from memory. ;) Vincent Teoh ftw. ;)

This video basically shows what's in the pipeline for 2019 from Samsung.



Vincent also states that so far they haven't used dynamic tone mapping, but that starting with 2019 they probably will.
-

My assessment that more manufacturers will try to improve motion resolution is based on there being related demos at CES, which might not translate into mass-production models that quickly. Just a fair word of warning. :)

Edit: LG OLEDs are currently (500-)700 nits calibrated - please correct that number in your mind in the postings above.. ;)
 
Last edited by notimp,

fiis

Well-Known Member
Member
Joined
Mar 27, 2015
Messages
106
Trophies
0
Age
35
XP
97
Country
United States
It will look beautiful for about one movie; then your eyes adjust, you won't notice it any longer, and you'll forget you have it.
 
