TV/Monitor advice needed

Yiffna

Active Member
OP
Newcomer
Joined
Mar 7, 2017
Messages
35
Trophies
0
Location
UK
XP
1,450
Country
Hi Guys,

I'm after some advice regarding purchasing a new TV/monitor for the next-gen games consoles.
So I've pre-ordered an Xbox Series-X. I normally sit at a desk and play my games consoles so I'm after a screen no bigger than 27".
My questions are:
Monitor or TV?
What specifications should I be looking for so the picture looks brilliant and responds well? Hz/ms?
What added features should I try to get?

Any advice appreciated
 

FAST6191

Techromancer
Editorial Team
Joined
Nov 21, 2005
Messages
36,798
Trophies
3
XP
28,285
Country
United Kingdom
Monitor vs TV.
For the most part small TVs are expensive monitors while large TVs are cheap monitors. Many TVs these days also come with a junk-tier Android board or similar built in and call themselves a smart TV -- they might just about play YouTube with adverts, possibly also Netflix for a few years until the protocols change, but I have never found one I would want to use, and I could almost just about deal with a tablet as a computing device if I had to.

Once upon a time there was a fairly notable difference in resolution and inputs, but today a TV is mostly a monitor with a TV tuner. That tuner is not free, so you pay for it in other ways if the prices reflect the input materials (debatable). Speakers will be on 99% of TVs as well (for a hot moment about 10 years ago speakers were a separate deal to what was otherwise a TV), but with the move to thinner and thinner screens the speaker quality dropped off a cliff, so headphones or a soundbar might be in order. Be aware of this if you are looking at consoles -- there are devices to pull audio from HDMI if your console won't do it itself. A minor thing (and solved with an HDMI switch for some), but a lot of monitors will make your fingers do a dance to swap inputs, where a TV expects to do it a lot and has one button for it.

For some setups there is a debate over the colour quality and effective resolution of TVs (the way they work means they might be a lesser offering). A TV may or may not also come with a bunch of inputs other than plain old HDMI if you want to connect old game consoles, though even if the ports are there they might not support it. There are adapters of varying quality, and price, for most things to HDMI. Gaming tends to want the higher priced stuff, though frankly if you are on this site then emulators are probably a thing you ought to be looking at there.
Not all graphics cards revolve around HDMI either (see DisplayPort, and there are still reasons to be using DVI in some cases), so that might be a factor.


Specifications.
4 main fields most look at
Simple resolution.
Refresh rate.
Latency (this will be the milliseconds/ms rating some talk about but more on that later)
Colour quality and additional effects (see HDR), though for a lot of gamer types this can fall by the wayside.


Resolution then. How many pixels on the screen. More pixels means more information can be displayed, but you also come up against a limit on how many you can physically see.
There are some minor variations (things like 16:10 monitors, ultrawide 21:9-and-beyond aspect ratios, and film 4K) and some mid-tier stuff, but for the most part the debate is between
1080p aka 1920x1080 and 4K aka 3840x2160 (four 1080p monitors squished together).
The jump to 4K means 4 times the number of pixels your graphics card needs to push. Also at 27 inches and under it becomes tricky to see the difference, even more so if you are sitting on the other side of the room. For games, 4K tends to be most useful in long range sniper games (4 pixels in the distance is now 16) and real time strategy if they allow it (more map visible at once, which is obviously a big advantage). It is very nice for looking at schematics, 3D models and spreadsheets, and for video editing if you want full resolution 1080p video visible on screen all at once (it then only takes up a quarter, leaving room for all the information around it).
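
As a quick sanity check on those numbers (a throwaway Python sketch of my own, nothing more):

Code:
# Pixel counts for the two common resolutions.
res_1080p = 1920 * 1080          # 2,073,600 pixels
res_4k    = 3840 * 2160          # 8,294,400 pixels
print(res_4k / res_1080p)        # 4.0 -- four 1080p screens' worth

# The sniper example: a 2x2 pixel blob of distant enemy at 1080p
# becomes 4x4 at 4K, as both width and height double.
print(2 * 2, "->", (2 * 2) * 4)  # 4 -> 16 pixels of detail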

Refresh rate.
How many times per second a new frame is drawn on screen.
Many years ago we had CRT monitors. The way these worked, drawing one line at a time (ever seen a camera pointed at an old screen, with a line going down it? This is that), meant a higher refresh rate (which back then you could trade off against resolution) gave a clearer image. The shift to LCD, which draws the whole screen at once, made the perks of higher rates less obvious, and 60Hz (the favoured rate of the US and Japan at the time) became the default choice. The more frames, the smoother the motion, especially as most computer graphics don't have anything like proper motion blur; there is also something in it for input, but put that aside for now.
It does also mean that the nice FPS counter in the corner of the screen saying your PC is spitting out 300 frames per second is nice and all, but if your screen is only ever going to output 60 of them a second you are not getting much for your trouble. Likewise if you are playing a system grinder of a game (or using system grinder options) and only getting 60fps, then your 240Hz screen is not going to do you any good. It also means your nice 60fps DVD that was filmed on a 24fps film camera is not going to get better magically; there are some very clever tools (see things like mvtools) that can fake something, but it is still producing data from nothing and hoping it works.
We will skip the console refresh rate discussion, interlacing and NTSC vs PAL vs FILM vs old school stuff as that will get us more off topic than we already are. Sadly it is not a dead discussion but might as well be for most modern uses.
A few years back though 2 things happened, or maybe 3.
1) HDMI at the time hit the limits of what it was designed for, and 4K might only update at 30Hz over it. 4K is still 4K, so some screens were made that only updated 30 times a second at that resolution. Good enough for static images and diagrams, and still plenty workable, but many would notice it in the mouse cursor or when scrolling fast. You probably won't find this unless you are buying second hand, but there are still some cheapo screens that do it, so be aware of it (the sketch after this list puts rough numbers on the limit).
2) Gamers decided it was good stuff. I am less sold on this one, but the theory runs that you can see more happen (if something is moving fast near you, then one frame to the next is potentially the difference between hitting and not), and if inputs are tied to framerate (for many years this was the case; PC games have largely decoupled input from frame updates for a while now, but old consoles did tie them) then you get more chances to refine your inputs. Also a dip of 10 frames per second from 30fps to 20, vs 60fps to 50 (or maybe 40 if you are being proportional), is the difference between slideshow and barely noticeable, so there is that.
The third (or 2i maybe) was that VR goggles make fewer people sick if you pump up the refresh rate, but that is a slightly different discussion.
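
To put rough numbers on point 1 (a back-of-envelope Python sketch; the 4400x2250 total raster is the standard 4K timing including blanking, and 340MHz is the TMDS clock ceiling for HDMI 1.4 and earlier):

Code:
def pixel_clock_hz(total_w, total_h, refresh_hz):
    # Pixel clock = total raster size (including blanking) x refresh rate.
    return total_w * total_h * refresh_hz

clk_4k30 = pixel_clock_hz(4400, 2250, 30)  # 297 MHz
clk_4k60 = pixel_clock_hz(4400, 2250, 60)  # 594 MHz

HDMI_1_4_MAX = 340e6  # max TMDS clock for HDMI 1.4 and earlier

print(clk_4k30 <= HDMI_1_4_MAX)  # True  -> 4K30 squeaks through
print(clk_4k60 <= HDMI_1_4_MAX)  # False -> 4K60 needs HDMI 2.0+
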
More frames per second means more frames per second your computer needs to generate.
This means going from 60Hz (and potentially 60fps) to 120Hz (a popular first step, though some seek things in the 200 range) means twice the power needed. Couple that with the 4 times needed for 4K and you are at 8 times the power for something you might not even notice or feel any great benefit from. 8 times is theoretical: it can be more if more detailed textures get loaded in, and less because one frame is similar to the next and you can predict things fairly reliably, though that was also true before the leap to these standards and was largely already exploited.
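
Counting pixels pushed per second gives the crude multipliers mentioned above (my own quick sketch, ignoring everything a real GPU benchmark would care about):

Code:
def pixels_per_second(w, h, fps):
    return w * h * fps

base = pixels_per_second(1920, 1080, 60)          # 1080p60 baseline
print(pixels_per_second(1920, 1080, 120) / base)  # 2.0 for 120Hz
print(pixels_per_second(3840, 2160, 60) / base)   # 4.0 for 4K60
print(pixels_per_second(3840, 2160, 120) / base)  # 8.0 for 4K120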

Latency.
This is the time taken between the graphics being made on the computer and being displayed on the screen.
Back in the day CRT was about as fast as electricity, really. You changed colour balance, screen position and more by changing physical things. To this end CRT tends to be the baseline everything else gets compared to (indeed, if you don't want to invest in high end tools, most will stick a CRT next to whatever they are testing, point a high speed camera at both, and compare how long the monitor takes to display what the CRT displays earlier).
Some modern emulators are doing things like input prediction and making things faster than the hardware it originally ran on but that is a different story.
Anyway, the move to LCD meant most corrections were done via electronics that take a signal in, do something with it and pass it on. This takes time, and do it a few thousand times and it adds up. Not so bad for watching TV or playing a DVD -- if your DVD pauses half a second after you press the button, who really cares? Half a second (500ms) of ping in an FPS, though, and you are done, a spectator... basically pointless. 500ms is an extreme example, but there is a marked difference between 20ms (some TVs and types of screens) and 2ms (some of the best). It is rare, as the market for it is tiny, but you can have a high refresh rate screen with awful latency.
Sometimes you will see this measurement given as g2g, which is grey to grey, aka how fast the panel can cycle a pixel between shades of grey. This is not pointless, as a good g2g tends to be an indicator of things, but it is not a measurement of full input-to-display latency and is mostly the concern of marketing material designed to part you from your funds.
TVs will often have a PC mode, game mode or something similar that drops the processing they might otherwise be doing, trading image processing for lower latency (and sometimes better text legibility). You might have to read the manual to figure out how to activate it as well.
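
To see why 20ms vs 2ms matters, express the latency in frames (quick arithmetic sketch, assuming a 60Hz game):

Code:
frame_ms = 1000 / 60  # ~16.7 ms per frame at 60Hz
for latency_ms in (2, 20, 500):
    print(f"{latency_ms}ms = {latency_ms / frame_ms:.2f} frames behind at 60Hz")
# 2ms   -> 0.12 frames (imperceptible)
# 20ms  -> 1.20 frames (over a full frame late)
# 500ms -> 30.00 frames (the spectator case above)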


Game companies have also sought to lessen both latency concerns and aspects of framerate by changing framerate from a fixed concept to more of a "when it is done we will send it" model. AMD/ATI calls their take on this tech FreeSync while Nvidia calls theirs G-Sync; the generic term is variable refresh rate. FreeSync is released without royalties and the like, so it tends to be a bit more available. This tends not to be available on TVs, but there are exceptions. The Xbox One X and S (not the original Xbox One, however) do support FreeSync.
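
A toy model of the difference (my own sketch with made-up render times; a real driver stalls the renderer on vsync, which this ignores):

Code:
render_ms = [14, 18, 25, 15, 17]  # hypothetical per-frame render times

# Fixed 60Hz vsync: a finished frame waits for the next 16.67ms tick.
TICK = 1000 / 60
t, fixed = 0.0, []
for r in render_ms:
    t += r
    fixed.append(-(-t // TICK) * TICK)  # -(-x // y) is ceiling division

# Variable refresh: the display updates the moment a frame is done.
t, vrr = 0.0, []
for r in render_ms:
    t += r
    vrr.append(t)

print([round(x, 1) for x in fixed])  # snaps to 16.7/33.3ms boundaries
print([round(x, 1) for x in vrr])    # tracks the actual render times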

Colours.
Often overlooked in gaming circles. If however you edit images, video or the like, then it pays to be able to send an image to someone else and be sure they are looking at the same colours, and printing the same. Do be careful if you head down this path though, as not everybody will -- I have nice monitors here for this and might do a website for someone; in one case it was a curry restaurant, so brown it was, and a nice brown on my screen was a decidedly less pleasant brown on someone else's, certainly one you don't want associated with food.
Anyway, Apple have their own thing that some play to, and others will use other means.
You will tend to see this expressed as a gamut and how close the screen comes to covering the ideal there:
https://www.photoreview.com.au/tips/editing/is-your-monitor-up-to-scratch/
https://www.tomshardware.com/uk/reference/how-we-test-pc-monitors-benchmarking/3
Anyway, photography, printing and videography have their own monitor lines (Dell do some quite nice takes here for reasonable prices compared to some), with prices that make the usual trick of slapping on a coloured shell and a colour-changing LED, calling it gamer and doubling the price look like pocket change. They also might not go in for all the same things as gamers -- refresh rate and latency are less of a concern.

HDR is a thing people look at nowadays. It is short for high dynamic range. Originally mostly seen as a camera trick, it allows you to have really detailed dark parts of an image while a bright light source sits in another part (say you are taking a sunrise picture with some clouds: you take two shots, one exposed so the detail in the clouds is visible, the other so the bright areas are, even if the clouds get washed out; stitch the two together and you have a nice image). Games and the monitors for them have recently taken to looking at it.
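
The camera-trick version is easy to sketch: weight each exposure by how far its pixels sit from clipping, then blend. This is a minimal illustration of the idea, not any real HDR pipeline (those also linearise and tone-map):

Code:
import numpy as np

def fuse(dark, bright):
    # 'Well-exposedness' weight: peaks at mid-grey, falls off near 0 and 1,
    # so each exposure contributes where it actually holds detail.
    w_d = np.exp(-((dark - 0.5) ** 2) / 0.08)
    w_b = np.exp(-((bright - 0.5) ** 2) / 0.08)
    return (w_d * dark + w_b * bright) / (w_d + w_b + 1e-6)

# Fake 4-pixel scene: shadows crushed in one shot, clouds blown out in the other.
dark   = np.array([0.02, 0.10, 0.45, 0.70])
bright = np.array([0.30, 0.55, 0.95, 1.00])
print(fuse(dark, bright).round(2))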


Related to this is brightness. Fairly obvious. More brighter is more better for some, if the screen is to illuminate their basement or fight sunlight coming in. You will tend to see this rated in candela per square metre (cd/m², aka nits). Ratings vary, but 200 is low and above 350 is high, though higher exists (if you see digital signage as an option in your monitor searches this will be a big thing there, as a screen bright enough to sit in a window and still be seen in sunlight is desirable to those playing that game). A high brightness, high colour quality screen is a trickier thing, as brightness might well wash things out.
That said while I am quite happy to risk arc eye and have a screen as though someone is welding in front of me it is a psychological thing after a while, much like loudness in headphones.
Contrast though is something you might want to consider (this will tend to be expressed as a ratio, [some high number]:1). More contrast means the blacks will be nice and black while the light stuff will be nice and bright; simple brightness can mean a tradeoff. Contrast is also quite noticeable if you sit two screens next to each other and compare.
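
Contrast is just white luminance over black luminance, which is why raw brightness can work against it (hypothetical panel numbers below):

Code:
white, black = 350.0, 0.35        # cd/m2, a plausible LCD measurement
print(f"{white / black:.0f}:1")   # 1000:1
# Crank the backlight and the blacks often lift too:
white, black = 500.0, 0.70
print(f"{white / black:.0f}:1")   # ~714:1 -- brighter, but lower contrast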

Be aware that brightness, colours, contrast and everything will usually be cranked to the max in shops that have displays of the things as it tricks people that don't know into thinking they are looking at something good (louder is better, sweeter is better, noisier engine is faster... same idea).


Anyway more functionality tends to cost and the various screen technologies out there come with tradeoffs for all of it.
CRT is no longer made, and big ones are heavy as anything, not to mention gobble power rather nicely. Still, they do have nice colours and high framerates if you want; they tend not to make it to 4K and might not even be widescreen, but they had resolutions around or above 1080 vertical back in the 90s. To that end there are certain models/families of CRT that are still sought out and used by gamers who care to deal with the downsides, or maybe want the perks ( http://bogost.com/games/a_television_simulator/ ).

Plasma is a thing you can still find in odd corners (most places sold the last of them off cheap about 5 years ago). They gobble a lot of power, usually weigh a lot, tend to die after a few years, can have fairly high latency, have various downsides as far as screen burn but their colours are usually great and can probably be had very cheap these days.

LCD then has many families with a few more set to join in the near future.
https://www.tomshardware.com/reviews/lcd-led-led-oled-panel-difference,5394.html
For many years the big debate was between IPS (in-plane switching) and TN (twisted nematic) screens. VA is also a thing, but mostly sits as a middle ground.
IPS tended to have better colours and viewing angles (if you have ever had the brightness and colours change radically when standing off to the side, this is that) but higher latency, or more expense if you wanted a low one, where TN tended to be low latency but colours would appear washed out unless you got a good one. To that end gamers on a budget, or those seeking the fastest, would opt for TN even at the cost of some colour quality -- and if you are the only one looking at it from a chair directly in front, then viewing angle matters little.

OLED is coming in, which is a different style entirely: each pixel makes its own light rather than filtering a backlight. Large OLED TVs do exist but remain expensive, and below TV sizes it is mostly phones and incredibly expensive tablets. Watch the branding here though -- an "LED" TV or monitor is usually just a normal LCD where the backlight is LED rather than the even older cold cathode, not OLED. They are nice lights though.

If you want to mount your screen on an arm attached to a wall/table, you will want to make sure the screen has mounts for it. These are typically known as VESA mounts (4 bolts in the back of the screen; larger screens may use a wider bolt spacing, which can make mounting harder or mean you need to buy/make an adapter plate). At one point this was nearly ubiquitous, but I am increasingly seeing monitors without it.

This features lark could go on for a while (thin bezel if you want to stick 3 monitors together and wrap around) and this is looking like one of my walls of text as it is.

Short version

For most gamers then it becomes a matter of

Resolution. 1080p vs 4K is the big decision, though some might go for serious widescreen in lieu of 4K ( https://www.wsgf.org/ being a choice link for serious widescreen on new and old games). There are higher resolutions still, but unless you are looking at your screen with a magnifying glass they are pointless. What benefit you get from 4K is debatable, though see if you can judge for yourself.

Refresh rate. 60Hz is the baseline; don't go below that. Some opt for 120, 144 or even into the 200s. What benefit you get from this may vary, but again go see some side by side if you can. There is also the FreeSync thing, which does something for some people.

Latency. The lower the better. 10ms is the max for most, but 2 to 4 is what most will seek if they can. If you don't play many twitchy action games that are all about reacting quickly to events on screen, then you can skimp here -- not like it matters for a turn-based move-the-pieces game.

If you care to sacrifice some colour quality and viewing angle then you can go a bit cheaper with a TN panel; if not, then you get to pay a bit more for IPS (and fancy backlights while you are at it), but maybe see the latency bumped up more. Prices vary, and offers, bankruptcy sales and more happen all the time.
Make sure your screen has the inputs you need, though you can do things like buy an adapter box, and amplifiers these days may have inputs you can select between.

4K and high refresh rates also come at a cost to the machine needed to pump out the graphics in the first place. While the current gen of consoles nominally claims some support, they don't really deliver it. Newer ones might manage something more, but likely will not be doing 4K120 in any meaningful capacity either.
For myself, give me a 1080p60 monitor of decent size and latency (I like colours too but can compromise there) and I don't think I would be missing out on anything (not like sniper games are popular these days) for several years yet, with the added bonus that a more modest machine will be able to keep up for longer, or play with the higher settings. If I revisit things in 5-7 years this might change, but so too will prices, and that is also about how long most hang onto a monitor before it breaks. VR goggles might also be doing something worthwhile by then.

https://www.digitaltrends.com/computing/computer-monitor-buying-guide/ has some good info too.
 
  • Like
Reactions: 9x0 and Yiffna

Tom Bombadildo

Dick, With Balls
Member
Joined
Jul 11, 2009
Messages
14,575
Trophies
2
Age
29
Location
I forgot
Website
POCKET.LIKEITS
XP
19,210
Country
United States
What's your budget? You can only get so far with so much money, so it'd be useful to know what you have to spend so we could recommend something. As for "TV or monitor?", your only option at those sizes is basically a monitor: they don't make 4K TVs that size, and these days nobody makes 1080p TVs under 32", so you'd need to look at something older for a 1080p TV at 27" or smaller.

As for what you should look for: for next-gen consoles you basically have the choice between 4K@60Hz (even though they claim they can do 4K at 120fps, like FAST I'm going to go with "doubt it", or at the least you'll have to choose between good-looking settings at 4K60 and not-so-good-looking settings at 4K120) or 1080-1440p@120+Hz. IMO, high refresh rate is infinitely better than higher resolution every single time, and the "sweet spot" for quality and refresh rate is 1440p@120Hz.

For color quality, your choices will be between IPS, TN, and VA panels (there's also OLED, but those are still very expensive so don't bother IMO). In terms of color accuracy and best looks it goes IPS, VA, TN, so if you're looking for the best color quality you'll likely want something that uses at least a VA panel, and IPS if you have the money.

As for comparing monitors you may want to buy, the best place for this is Rtings ( https://www.rtings.com/ ). They do in-depth monitor and TV reviews, and it's the best place to go looking for display recommendations. I would suggest you look at their recommendations to find the right monitor for your requirements.
 
  • Like
Reactions: Yiffna

FAST6191

Techromancer
Editorial Team
Joined
Nov 21, 2005
Messages
36,798
Trophies
3
XP
28,285
Country
United Kingdom
and these days nobody makes 1080p TVs under 32"
There are still a few fly-by-night companies using panels from other vendors*, though they mostly do stuff at the low end.

Now, finding a 1080p monitor under 19 inches that is not a portable display -- that is a bit more of a trick, it seems.


*Oh, and for others playing along: there are very few companies actually making large screens, aka panels (just the screen, without the power supply, case or input handling). Unlike some other things in tech (see flash memory) this matters rather less here, and few pay much attention to it outside of repair shops, where one device might be harvested for panels to repair more expensive ones.
 

tech3475

Well-Known Member
Member
Joined
Jun 12, 2009
Messages
3,651
Trophies
2
XP
6,030
Country
If you do go for a monitor, you may need to buy separate speakers. Every monitor I've had with integrated speakers was naff.

I'd make sure to get a monitor with an integrated audio output to make things easier.
 
Last edited by tech3475,

tech3475

Well-Known Member
Member
Joined
Jun 12, 2009
Messages
3,651
Trophies
2
XP
6,030
Country
Have you had a TV since the CRT days with inbuilt speakers that were good?

Decent enough that I don't need a soundbar, and way, way better than the monitors.

The monitors' speakers are like my last pair of earbuds from Poundland, with the TV's being more like something in the £10-20 range.
 

Yiffna

Active Member
OP
Newcomer
Joined
Mar 7, 2017
Messages
35
Trophies
0
Location
UK
XP
1,450
Country
Thanks for this. I've actually learned quite a bit. My budget is around £500.
I want a tv/monitor that will give me a few good years and handle pretty much anything I throw at it.
You’ve given me a lot to consider

thanks
 

VGrift

Member
Newcomer
Joined
Oct 4, 2020
Messages
8
Trophies
0
Age
30
XP
66
Country
United States
I've used a Dell S27 IPS monitor before upgrading to a 34-inch. It was a good monitor, but on the pricey side: minimal glare and a 144Hz refresh rate. I believe the newest model goes even higher at 165Hz, but I'm not sure what refresh rate the next gens will be running at. The colors were also good out of the box, though I still tweaked them a bit.
 
  • Like
Reactions: Yiffna

fvig2001

Well-Known Member
Member
Joined
Aug 21, 2006
Messages
928
Trophies
1
XP
2,916
Country
Philippines
Well, since the Xbox Series X is supposedly getting 8K and 120Hz support, you are better off looking for a monitor that supports 120-144Hz. Then just look for one that does 4K (though if you play at 1080p the games may well run better), has minimal latency and good viewing angles (probably get an IPS screen). You'll probably have to buy separate speakers, since most monitors don't come with them (or the built-in ones are bad).
 
