Monitor vs TV.
For the most part, small TVs are expensive monitors while large TVs are cheap monitors. Many TVs these days also come with a junk-tier Android board or similar built in and call themselves a smart TV -- they might just about play YouTube with adverts, possibly also Netflix for a few years until the protocols change, but I have never found one I would want to use, and I could almost just about deal with a tablet as a computing device if I had to. Once upon a time there was a fairly notable difference in resolution and inputs, but today a TV is mostly a monitor with a TV tuner. That tuner is not free, so you do pay for it in other ways if the prices reflect the input materials (debatable). Speakers will be on 99% of TVs as well (for a hot moment about 10 years ago speakers were a separate deal to what was otherwise a TV), but with the move to thinner and thinner screens the speaker quality dropped off a cliff, so headphones or a soundbar might be in order. Be aware of this if you are looking at consoles -- there are devices to pull audio from HDMI if your base console won't. A minor thing (and solved with an HDMI switch for some), but a lot of monitors will make your fingers do a dance to swap inputs, where a TV expects to do it a lot and has one button for it.
For some setups there is a debate over the colour quality and effective resolution of TVs (the way they work means they might be a lesser offering). A TV may or may not also come with a bunch of inputs other than plain old HDMI if you want to connect old game consoles, though even if the ports are there they might not support everything. There are adapters of varying quality, and price, for most things to HDMI. Gaming tends to want the higher priced stuff, though frankly if you are on this site then emulators are probably a thing you ought to be looking at there.
Not all graphics cards revolve around HDMI either (see DisplayPort, and there are also reasons to be using DVI in some cases), so that might be a factor.
Specifications.
4 main fields most look at
Simple resolution.
Refresh rate.
Latency (this will be the milliseconds/ms rating some talk about, but more on that later).
Colour quality and additional effects (see HDR), though for a lot of gamer types this can fall by the wayside.
Resolution then. How many pixels are on the screen. More pixels means more information can be displayed, but you also come up against a limit on how many you can physically see.
There are some minor variations (16:10 monitors, ultrawide 21:9 and beyond aspect ratios, film 4K) and some mid tier stuff, but for the most part the debate is between
1080p aka 1920x1080 and 4k aka 3840x2160 (4 1080p monitors squished together).
The jump to 4K means 4 times the number of pixels your graphics card needs to push. Also at 27 inches and under it becomes tricky to see the difference, even more so if you are sitting on the other side of the room. For games, 4K tends to be most useful in long range sniper games (4 pixels in the distance is now 16) and real time strategy if they allow it (more map visible at once, obviously then a big cheat). It is very nice for looking at schematics, 3d models and spreadsheets, and for video editing if you want to have full resolution 1080p video visible on the screen all at once (it only takes up say one quarter, allowing you to have all the information around it).
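The arithmetic behind that "4 times" figure is easy to check for yourself; a quick sketch:

```python
# Pixel counts behind the 1080p vs 4K comparison.
pixels_1080p = 1920 * 1080   # 2,073,600 pixels
pixels_4k = 3840 * 2160      # 8,294,400 pixels

# 4K really is four 1080p screens squished together.
print(pixels_4k / pixels_1080p)   # 4.0
```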
Refresh rate.
How many times per second a new frame is drawn on screen.
Many years ago we had CRT monitors. The way these worked, drawing one line at a time (ever seen a camera pointed at an old screen, with a line crawling down it? This is that), meant a higher refresh rate (which back then you could trade off against resolution) meant a clearer image. The shift to LCD, which draws the whole screen at once, meant the perks of higher rates became less obvious, and 60Hz (the favoured rate of the US and Japan at the time) became the default choice. More frames means smoother motion, especially as most computer graphics don't have anything like proper motion blur, and there is also something for input, but put that aside for now. It does also mean that the nice FPS counter in the corner of the screen saying your PC is spitting out 300 frames per second is nice and all, but if your screen is only ever going to output 60 of them a second you are not getting much for your trouble. Likewise if you are playing a system grinder of a game (or using system grinder options) and only getting 60fps, then your 240Hz screen is not going to do you any good. This also means your nice 60fps DVD that was filmed on a 24fps film camera is not going to get better magically; there are some very clever tools (see things like mvtools) that can fake something, but it is still producing data from nothing and hoping it works.
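The "whichever is lower wins" point above can be boiled down to one line; `visible_fps` here is a hypothetical helper for illustration, not a library call:

```python
# The frames you actually see are capped by both the GPU and the screen.
def visible_fps(gpu_fps, monitor_hz):
    return min(gpu_fps, monitor_hz)

print(visible_fps(300, 60))   # 60 -- 300fps into a 60Hz screen is wasted effort
print(visible_fps(60, 240))   # 60 -- a 240Hz screen can't help a 60fps game
```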
We will skip the console refresh rate discussion, interlacing and NTSC vs PAL vs FILM vs old school stuff as that will get us more off topic than we already are. Sadly it is not a dead discussion but might as well be for most modern uses.
A few years back though 2 things happened, or maybe 3.
1) HDMI at the time hit the limits of what it was designed for, and 4K resolution might only update at 30Hz. 4K is still 4K, so some screens were made that only updated 30 times a second at that resolution. Good enough for static images and diagrams, and still plenty workable, but many would notice something in the mouse cursor or in scrolling around fast. You probably won't run into this unless you are doing second hand stuff, but there are still some cheapo screens that do it, so be aware of it.
2) Gamers decided it was good stuff. I am less sold on this one, but the theory runs that you can see more data happen (if something is moving fast near you, then one frame to the next is potentially the difference between hitting and not), and if the inputs are tied to framerate (for many years this was the case; PC games have largely decoupled input from frame updates for a while now, old consoles however did tie them together) then you get more chances to refine inputs on controls. Also a dip of 10 frames per second from 30fps to 20, vs 60fps to 50 (or maybe 40 if you are being proportional), is the difference between slideshow and barely noticeable, so there is that.
The third (or 2.5 maybe) was that VR goggles make fewer people sick if they pump up the refresh rate, but that is a slightly different discussion.
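The bandwidth squeeze behind point 1 can be sketched on the back of an envelope. This assumes 24 bits per pixel and roughly 8.16 Gbps of usable video bandwidth for early (1.x era) HDMI; real links also carry blanking intervals, so true requirements run somewhat higher than this raw figure:

```python
# Raw pixel data rate in Gbps, ignoring blanking and protocol overhead.
def raw_gbps(width, height, hz, bits_per_pixel=24):
    return width * height * hz * bits_per_pixel / 1e9

HDMI_1_4_VIDEO_GBPS = 8.16   # approximate usable video bandwidth

print(raw_gbps(3840, 2160, 30))   # ~5.97 Gbps -- fits
print(raw_gbps(3840, 2160, 60))   # ~11.94 Gbps -- does not, hence 4K30 screens
```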
More frames per second means more frames per second your computer needs to generate.
This means going from 60Hz (and potentially 60fps) to 120Hz (a popular first step, though some seek things in the 200 range) means twice the power needed. Couple that with the 4 times needed for 4K and you are at 8 times the power, for something you might not even notice or feel any great benefit from. 8 times is more the theoretical figure: it can be more if more detailed textures need to load in, and it can be less because one frame is similar to the next and you can predict things fairly reliably, though that was also true before the leap to these standards and was largely already being done.
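The multipliers stack straightforwardly; a rough sketch taking 1080p60 as the baseline of 1.0 (this ignores the caveats above about texture detail and frame-to-frame prediction):

```python
# Rough relative GPU load: pixels per second compared to 1080p60.
def relative_load(width, height, hz):
    return (width * height * hz) / (1920 * 1080 * 60)

print(relative_load(3840, 2160, 60))    # 4.0 -- 4K alone
print(relative_load(1920, 1080, 120))   # 2.0 -- 120Hz alone
print(relative_load(3840, 2160, 120))   # 8.0 -- both together
```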
Latency.
This is the time taken between the graphics being made on the computer and being displayed on the screen.
Back in the day, CRT was about as fast as electricity, really. You changed colour balance, screen position and more by changing physical things. To this end CRT tends to be the baseline everything else gets compared to (indeed if you don't want to invest in high end tools, most will stick a CRT next to whatever they are testing, point a high speed camera at the pair, and compare how long the monitor takes to display the data the CRT displays earlier).
Some modern emulators are doing things like input prediction and making things faster than the hardware it originally ran on but that is a different story.
Anyway the move to LCD meant most corrections were done via electronics that take a signal in, do something with it and fire it on. This takes time, and do it a few thousand times and it adds up. Not so bad for watching TV or playing a DVD -- if your DVD pauses half a second after you pressed the button, who really cares; half a second (500ms) of lag in an FPS and you are done, a spectator, basically pointless. 500ms is an extreme example, but there is a marked difference between 20ms (some TVs and types of screens) and 2ms (some of the best). It is rare, as the market for it is tiny, but you can have a high refresh rate screen with awful latency.
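Those milliseconds make more sense translated into frames; a small sketch, assuming a 60Hz screen (one frame every ~16.7ms):

```python
# Display lag expressed in frames at a given refresh rate.
def frames_of_lag(latency_ms, hz=60):
    return latency_ms / (1000 / hz)

print(frames_of_lag(2))     # ~0.12 -- effectively nothing
print(frames_of_lag(20))    # ~1.2  -- over a frame behind
print(frames_of_lag(500))   # ~30   -- half a second, spectator territory
```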
Sometimes you will see this measurement given as g2g, which is grey to grey, aka how fast the screen can cycle round a pattern. This is not pointless, as a good g2g tends to be an indicator of things, but it is still not a real-world measurement and is mostly the concern of marketing people whose numbers are designed to get you to part with your funds.
TVs will then have a PC mode, gaming mode or something similar that drops the processing they might be doing, improving either the legibility of text or the latency in that mode. You might have to read the manual to figure out how to activate it as well.
Game companies have also sought to lessen both latency concerns and aspects of framerate by moving framerate from a fixed concept to more of a "when it is done we will send it". AMD/ATI calls their take on this tech FreeSync while Nvidia calls theirs G-Sync; more classically this is known as variable refresh rate. FreeSync is released without royalties and the like, so it tends to be a bit more available. This tends not to be available on TVs, but there are exceptions. The xbox one X and S (not the original xbone however) do support FreeSync though.
Colours.
Often overlooked in gaming circles. If however you edit images, video or the like, then it pays to be able to send an image to somewhere else and have them be sure they are looking at the same colours, and printing the same. Though do be careful if you head down this path, as not everybody will have done the same -- I have nice monitors here for this and might do a website for someone; in one case it was a curry restaurant, so brown it was. Nice brown on my screen was a decidedly less pleasant brown on someone else's, certainly one you don't want associated with food.
Anyway, Apple have their own thing that some play to, and others will use other means.
You will tend to see colour quality expressed as a gamut and how close the screen comes to covering the ideal there:
https://www.photoreview.com.au/tips/editing/is-your-monitor-up-to-scratch/
https://www.tomshardware.com/uk/reference/how-we-test-pc-monitors-benchmarking/3
Anyway photography, printing and videography have their own monitor lines (Dell do some quite nice takes here for reasonable prices compared to some) that will make the markup of most "slap on a coloured shell and a colour changing LED, call it gamer and double the price" efforts look like pocket change. They also might not go in for all the same things as gamers -- refresh rate and latency are less of a concern.
HDR is a thing people look at nowadays. It is short for high dynamic range. Originally mostly seen as a camera trick, it allows you to have really detailed dark parts of an image while a bright light source might be in another part (say you are taking a sunrise picture with some clouds: you take two shots, one exposed so the detail in the clouds is visible, the other so the bright light aspects are while the clouds might be washed out, stitch the two together and you have a nice image). Games and monitors have recently taken to looking at it.
Related to this is brightness. Fairly obvious. More brighter is more better for some, if the screen is to illuminate their basement or deal with sunlight coming in. You will tend to see this rated in candela per unit area (usually square metre, so cd/m²). Ratings vary, but 200 is low and above 350 is high, though higher exists (if you see digital signage as an option in your monitor searches this will be a big thing, as a nice bright screen able to shine through a window and still be seen in sunlight is desirable to many in that game). A high brightness, high colour quality screen is a trickier thing, as brightness might well wash things out.
That said while I am quite happy to risk arc eye and have a screen as though someone is welding in front of me it is a psychological thing after a while, much like loudness in headphones.
Contrast though is something you might want to consider (this will tend to be expressed as a ratio of [some high number]:1). More contrast means the black will be nice and black while the light stuff will be nice and bright. Simple brightness can mean a tradeoff. Contrast is also quite noticeable if you sit two screens next to each other and compare.
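That ratio is just peak white luminance over deepest black; the 350 cd/m² figure here is an assumed typical peak, picked to line up with the brightness numbers above:

```python
# Contrast ratio: peak white luminance over deepest black, both in cd/m2.
def contrast_ratio(white, black):
    return white / black

print(contrast_ratio(350, 0.35))   # ~1000 -- a typical "1000:1" LCD rating
```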
Be aware that brightness, colours, contrast and everything will usually be cranked to the max in shops that have displays of the things as it tricks people that don't know into thinking they are looking at something good (louder is better, sweeter is better, noisier engine is faster... same idea).
Anyway more functionality tends to cost and the various screen technologies out there come with tradeoffs for all of it.
CRT is no longer made, and big ones are heavy as anything, not to mention they gobble power rather nicely. Still, they do have nice colours if you want and high framerates; they tend not to make it to 4K and might not even be widescreen, but they had resolutions around or above 1080 vertical back in the 90s. To that end there are certain models/families of CRTs that are still sought out and used by gamers who care to deal with the downsides, or maybe want the perks (
http://bogost.com/games/a_television_simulator/ ).
Plasma is a thing you can still find in odd corners (most places sold the last of them off cheap about 5 years ago). They gobble a lot of power, usually weigh a lot, tend to die after a few years, can have fairly high latency, and have various downsides as far as screen burn goes, but their colours are usually great and they can probably be had very cheap these days.
LCD then has many families with a few more set to join in the near future.
https://www.tomshardware.com/reviews/lcd-led-led-oled-panel-difference,5394.html
For many years the big debate was between IPS (in plane switching) and TN screens (twisted nematic). VA is also a thing but mostly was a middle ground.
IPS tended to have better colours and viewing angles (if you have ever had the brightness and colours change radically when standing off to the side this is that) but a higher latency, or more expense if you wanted a low one, where TN tended to be low latency but colours would appear washed out unless you got a good one. To that end gamers on a budget or seeking the fastest would opt for TN even at the cost of some colour quality, and if you are the only one looking at it from a chair in front of it then viewing angle is pointless.
OLED is coming in which is a different style entirely but not really in big screens (making them larger than phone screens is tricky aka expensive). If you see an OLED screen for something larger than an incredibly expensive tablet then it is a marketing term as the light that lights the screen is OLED rather than more conventional LED or the even older cold cathode. One day it might be real but today meh. They are nice lights though.
If you want to mount your screen on an arm attached to a wall/table you will want to make sure the screen has mounts for it. These are typically known as VESA mounts (4 bolts in the back of the screen; larger screens may have a larger bolt spacing, which can make things hard or need you to buy/make an adapter plate). At one point this was nearly ubiquitous, but I am increasingly seeing more monitors without it.
This features lark could go on for a while (thin bezel if you want to stick 3 monitors together and wrap around) and this is looking like one of my walls of text as it is.
Short version
For most gamers then it becomes a matter of
Resolution. 1080p vs 4k being the big decision, though some might go for serious widescreen in lieu of 4k (
https://www.wsgf.org/ being a choice link for serious widescreen on new and old games). There are higher resolutions still, but unless you are looking at your screen with a magnifying glass they are pointless. What benefits you get from 4K might be debatable, so see for yourself if you can.
Refresh rate. 60Hz is the baseline, don't go below that. Some opt for 120, 144 or even into the 200s. What benefit you get from this may vary, but again go see some side by side if you can. There is also the FreeSync thing that does some things for some people.
Latency. Lower the better. 10ms is max for most but 2 to 4 is what most will seek if they can. If you don't play many twitchy action games that are all about reacting quickly to events on screen then you can also skimp here -- not like it matters for a turn based move the pieces game.
If you care to sacrifice some colour quality and viewing angle then you can get a bit cheaper with a TN panel, if not then you get to pay a bit more for IPS and fancy backlights while you are at it but maybe see the latency bumped up more. Prices vary and offers, bankruptcy sales and more happen all the time.
Make sure your screen has the inputs you need, though you can do things like buy an adapter box, and AV amplifiers these days will often have inputs you can select between.
4k and high refresh rates also come at a cost to the machine needed to pump out the graphics in the first place. While the current gen of consoles nominally claims some support they really don't. Newer ones might have something more but likely will not be doing 4K120 in any meaningful capacity either.
For myself, give me a 1080p60 monitor of decent size and latency (I like colours too but can compromise there) and I don't think I would be missing out on anything (not like sniper games are popular these days) for several years yet, with the added bonus that a more modest machine will be able to make use of things for longer, or play with the higher settings. If I return to things in 5-7 years this might change, but so too will prices, and that is also about the length of time most will hang onto a monitor before it breaks. VR goggles might also be doing something worthwhile at such a point in time.
https://www.digitaltrends.com/computing/computer-monitor-buying-guide/ has some good info too.