What Originality said. I would also pay attention to latency, and I do have to mention that current HDMI (1.4) only manages 30Hz if you are going to 4K*. HDMI 2.0 will change this, and various DVI and DisplayPort options will do 60Hz at 4K now.
*Technically 4K is a minefield as we decide what is consumer 4K, what is film 4K, and what is something else but still greater than 1080p (4K in the consumer world is four times the real estate of a 1080p monitor). You can poke around
http://www.wsgf.org/mgl to see what games do what here. I am not sure how useful it would be for most games, aside from pixel-hunting shooters. For CAD, video editing and some diagrams, though, it is lovely.
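To put rough numbers on both of the above (my own figures, not from any spec sheet in this thread): the "four times the real estate" claim checks out exactly for consumer 4K, and a back-of-the-envelope bandwidth sum shows why HDMI 1.4 tops out at 30Hz. The 8.16 Gbit/s figure and the 20% blanking pad are my assumptions for the sketch.

```python
# The "4x real estate" claim: consumer 4K (UHD) vs 1080p, plus film (DCI) 4K.
full_hd = 1920 * 1080
uhd = 3840 * 2160       # consumer "4K" (UHD)
dci_4k = 4096 * 2160    # film (DCI) 4K: slightly wider, not an exact 4x
print(uhd // full_hd)               # -> 4
print(round(dci_4k / full_hd, 2))   # -> 4.27

# Why HDMI 1.4 caps out at 30Hz for 4K: HDMI 1.4 carries roughly
# 8.16 Gbit/s of video data after encoding overhead (my approximation).
def data_rate_gbps(pixels, hz, bits_per_pixel=24, blanking=1.2):
    """Approximate video data rate, padding ~20% for blanking intervals."""
    return pixels * hz * bits_per_pixel * blanking / 1e9

HDMI_1_4 = 8.16  # Gbit/s, approximate usable video bandwidth
for hz in (30, 60):
    rate = data_rate_gbps(uhd, hz)
    verdict = "fits" if rate <= HDMI_1_4 else "needs HDMI 2.0 or DisplayPort"
    print(f"UHD @ {hz}Hz ~ {rate:.1f} Gbit/s -> {verdict}")
```

Run it and 4K at 30Hz squeaks under the HDMI 1.4 limit while 60Hz blows well past it, which is the whole story in two lines of arithmetic.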
Latency is how long it takes between the picture being sent down the cable and it displaying on the screen. In the good old days of CRT it was crazy quick*; modern TVs especially tend to add some processing to make things look better, or just take longer. This is not a problem when you are doing something passive like watching TV or a DVD, but games and active computer use suffer here, which is why many TVs have a game/computer mode. The TN stuff will tend to have lower latency for less money than some of the IPS stuff, though IPS is mature enough that the gap is not as big as it once was. For most non-gaming purposes it tends not to matter much until it gets really high (in which case you would probably be on a TV and not in game/computer mode), and even a lot of normal people playing games will not be too troubled (though they could probably tell the difference if you sat them side by side).
*The cheap way of testing for lesser screen reviews is to output the same frame (usually a running timer) to a CRT and an LCD side by side, photograph both, and work out the difference from the picture.
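The arithmetic behind that photo test is trivial, but worth spelling out. This is a toy sketch under my own assumptions: both screens display the same millisecond timer, the CRT is treated as effectively zero-lag, and you read both values off a single photo.

```python
# Toy version of the CRT-vs-LCD photo test. The CRT is assumed to be
# effectively instant, so whatever the LCD is behind by is its display lag.
def display_lag_ms(crt_timer_ms, lcd_timer_ms):
    """LCD shows an older (smaller) timer value; the gap is its lag."""
    return crt_timer_ms - lcd_timer_ms

print(display_lag_ms(1000, 968))  # -> 32, i.e. roughly two frames at 60Hz
```

Serious reviews use photodiodes and signal generators, but the camera trick gets you within a frame or so for free.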
I could get into colour reproduction, however I am not sure how useful it will be if you are not inclined to edit video or photos at a professional level, and even then there is the old audio mastering adage of "make it work well on shit" -- if you get some perfect-clarity speakers and it sounds amazing then great, however if the same mix sounds crap on my office radio/building site radio/general consumer gear then you have failed. Such things are probably fighting words to some but I will stand by them. Similarly it is not like the days of old when there could well be a massive difference that even the plebs could see.
On backlights, some also like brightness discussions (usually given in cd/m² and typically ranging from about 250 to some 400 now, I think). I like bright screens, but unless I am installing one in a trade show display/window display then the better solution is to close the bloody curtains if it is bothering you.
Contrast ratio does mean something if you dig hard enough; alas, the marketing idiots took the number over and now it means nothing for most purposes, unless you are comparing two models of similar vintage from the same company.
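For the record, the honest (static) version of the number is just a ratio of two luminance measurements. A minimal sketch, with the 300 and 0.3 cd/m² figures being my own typical-LCD example values:

```python
# Static contrast ratio: full-white luminance over full-black luminance,
# both measured in cd/m2 with the same backlight setting.
def contrast_ratio(white_cd_m2, black_cd_m2):
    return white_cd_m2 / black_cd_m2

# A typical honest LCD figure: 300 cd/m2 white, 0.3 cd/m2 black.
print(f"{contrast_ratio(300, 0.3):.0f}:1")  # -> 1000:1
```

The "dynamic" marketing numbers compare white with the backlight cranked up against black with the backlight dimmed right down, which is how you get million:1 claims that tell you nothing.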
"I've also heard about monitors that can sync with my GPU and thus outputing the best picture possible."
That could mean a whole bunch of things, from basic EDID, to just being a digital signal (VGA is not; HDMI, DisplayPort and most of the better types of DVI are digital), to something actually quite fancy (and probably pointless for most people). Unless you are going for shutter glasses or some other kind of powered-glasses 3D then I would say it is not worth thinking about.
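The basic EDID end of that scale is simple enough to sketch: the monitor hands the GPU a 128-byte block describing itself, and bytes 8-9 pack the manufacturer ID as three 5-bit letters. This decoder is my own illustration of that layout; the sample bytes happen to be Dell's registered ID.

```python
# Decode the 3-letter manufacturer ID from EDID bytes 8 and 9.
# Each letter is 5 bits, big-endian, with 1 = 'A' ... 26 = 'Z'.
def decode_manufacturer(byte8, byte9):
    word = (byte8 << 8) | byte9
    letters = [(word >> shift) & 0x1F for shift in (10, 5, 0)]
    return "".join(chr(ord("A") - 1 + v) for v in letters)

print(decode_manufacturer(0x10, 0xAC))  # -> DEL (Dell's registered ID)
```

That handshake is what lets the GPU pick a sensible native resolution automatically; anything beyond it is in the "fancy and probably pointless" bucket I mentioned.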