This could take a while
Video originally came about when people noticed that if you played back photos at around 17fps you got something resembling motion. This is why old war footage often has people acting like they were ripped out of an especially sombre black and white edition of the Benny Hill show -- mainly when the video makers do not convert it properly and just speed up the footage.
17 works, but 24 -- aka FILM speed -- is where most people stop caring. Most old films and high quality footage were, and in some cases still are, shot like this.
PAL is 25 frames per second, or more commonly 50 fields per second (a field being every other line of a frame, in a process known as interlacing -- a horrible legacy thing that stuck around far too long; if you have ever seen jagged horizontal artefacts (known as combing) on moving objects in a video then you have seen part of its legacy). FILM to PAL usually involved a speed up of both motion and audio; most people in PAL countries do not notice it but some NTSC types might. Modern FILM to PAL conversions can mitigate the audio distortion with some fairly creative audio engineering (elongating silences is the start of it).
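To put some numbers on that speed up, here is a quick Python sketch (the figures are the point, nothing here is from any real video library):

```python
import math

# PAL speed-up: 24fps film simply played back at 25fps
speedup = 25 / 24                      # ~1.042, about 4.2% faster
runtime_120min = 120 / speedup         # a 2 hour film shrinks to ~115.2 minutes

# Unless corrected, the audio pitch rises by the same ratio; in semitones:
pitch_shift = 12 * math.log2(speedup)  # ~0.71 semitones sharper

print(round(runtime_120min, 1), round(pitch_shift, 2))
```

That 0.7 of a semitone is the slightly-too-fast, slightly-too-high sound PAL viewers grew up not noticing.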
NTSC is 30 frames per second, or more commonly 60 fields per second. Actually I say that, but 29.97 frames (59.94 fields) is what really happens, and film content ends up slowed to 23.976; the reasons for this are legacy colour issues with CRT broadcasts, but we will return to such things later.
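Those awkward decimals are exact rationals, which a few lines of Python make plain (just arithmetic, no video code involved):

```python
from fractions import Fraction

# NTSC rates are exact rationals, not the rounded 29.97/59.94 you usually see
ntsc_frame = Fraction(30000, 1001)   # ~29.970 frames per second
ntsc_field = Fraction(60000, 1001)   # ~59.940 fields per second (interlaced)

# 24fps film headed for NTSC is slowed by the same 1000/1001 factor
film_to_ntsc = Fraction(24) * Fraction(1000, 1001)  # ~23.976 fps

print(float(ntsc_frame), float(film_to_ntsc))
```

The 1000/1001 factor is the legacy colour compatibility fudge mentioned above.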
24 to 30 is very noticeable if you just speed it up, so instead there is the so called 3:2 pulldown: fields are repeated in a fixed pattern to make up the difference. To PAL users this may make some motion seem jerky. Exceptions are sports, news broadcasts and soaps, which were often captured directly at 30fps; the smooth motion of actual 30fps is what makes people think high frame rate footage is one of these exceptions, most commonly a soap.
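The pattern itself is simple enough to sketch in a few lines of Python (a toy model, not any real telecine code):

```python
# 3:2 pulldown sketch: each film frame contributes alternately 3 then 2
# fields, so 4 film frames become 10 fields and 24 frames/s become 60
# fields/s (30 interlaced frames).
def three_two_pulldown(frames):
    fields = []
    counts = [3, 2]
    for i, frame in enumerate(frames):
        fields.extend([frame] * counts[i % 2])
    return fields

print(three_two_pulldown(["A", "B", "C", "D"]))
# → ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

Frames A and C get shown half again as long as B and D, which is exactly the slight judder pulldown is known for.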
One other notable exception is DVDs and CGI. CGI is often done at 30fps but we will come back to that later. If you film your actors running around on a set at 24 but have some CGI at 30 then you have fun, as variable frame rate is not really a thing. To make matters even more fun there are so called soft pulldowns, where the decode/display device does its own pulldown at playback time. If you have ever watched a DVD ripper/transcoder report a value like "percentage FILM", this is why.
Onto CRT tubes and the fun they bring. I already mentioned the colour induced frame rate oddity; the short version is it was to help colour (RGB) broadcasts coexist with older black and white sets, but let us not go there.
CRTs scan phosphors with an electron beam. If you have ever pointed a camera at a CRT and seen a black line racing up and down on it then you are seeing this; it is also why your camera might have a "TV standard - 50Hz or 60Hz" option somewhere in it. Such refresh rates are fine for most, but you can see the difference on a CRT between 60 and something like 120Hz. This is not quite related to the current issues or the legacy ones we will cover later.
CRT is also an awful method for making an accurate display, but it does have its perks. Devs knew this and designed things to take advantage of it; some of those tricks are quite nicely summarised in
http://bogost.com/games/a_television_simulator/
It is also why some argue that emulation is never going to be quite like the real thing.
Another perk of CRT is very low latency, plus some minor tweaking abilities that were easy by virtue of being a voltage difference rather than a complex chain of signal processing. Latency in this case is the time between a signal leaving a device and being displayed on the screen. This is why your LCD might have a game/PC mode that looks worse when turned on: with it on, the TV stops doing all its post processing of your image, and you gain back the 30 milliseconds or so that processing might have taken -- nothing for something passive like TV or DVD watching, but everything in a game. Effective latency is why some are seeing a push for 60fps or even 120fps (though this can also be to do with 3d). Another reason is that nobody cares about a drop from 80fps to 60, but everybody cares about one from 30fps to 10.
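The frame-time arithmetic behind that is worth seeing; a trivial Python sketch (the 30ms post-processing figure above is roughly a whole frame at 30fps):

```python
# Frame time is just the inverse of the frame rate, in milliseconds.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 120):
    print(fps, "fps ->", round(frame_time_ms(fps), 1), "ms per frame")
# 30fps -> 33.3 ms, 60fps -> 16.7 ms, 120fps -> 8.3 ms
```

So a TV that adds 30ms of processing has effectively eaten an entire frame of your 30fps game before you see it.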
If your screen's refresh rate is 30Hz, like if you use a 4K display over current HDMI, then it is all well and good your graphics card pumping out 4000 new frames per second, but the screen will never update in time to display them. Locking the framerate to the screen refresh rate is known as vsync; it was more important in CRT days, when a frame update landing halfway through a redraw would leave you with a screen tear.
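One less obvious consequence of vsync is that missing the refresh deadline does not cost you a little, it drops you to the next divisor of the refresh rate. A simplified Python sketch, assuming double buffering and a constant render time (real pipelines are messier than this):

```python
import math

# With vsync, a finished frame can only be shown on a refresh tick, so a
# render that overruns one tick waits for the next: 60 -> 30 -> 20 -> 15...
def vsynced_fps(render_ms, refresh_hz):
    tick_ms = 1000.0 / refresh_hz
    ticks_needed = max(1, math.ceil(render_ms / tick_ms))
    return refresh_hz / ticks_needed

print(vsynced_fps(10, 60))  # renders inside the 16.7ms budget: 60fps
print(vsynced_fps(20, 60))  # just misses the budget: straight down to 30fps
```

That cliff edge at the frame budget is why a game hovering around 16-17ms per frame can feel like it halves in smoothness.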
Of course we are now back to games. There are two things to consider here
1) Games were not always linked to TVs; you may have heard the older types speak of arcades in hushed tones. If you are buying custom CRTs in from the factory then there is no need for them to be glorified TVs, and indeed many were not.
2) Motion blur.
1)
http://retrogaming.hazard-city.de/ because I really can not be bothered to type up all the arcade and console oddities at 4am. Suffice it to say, as handhelds tend to use their own screens they are not so troubled.
2) Motion blur then. Your eyes do not see in photographs; rather they see some kind of quasi fluid animation. The effective framerate of your eyes seems to vary, but most would agree it is above 30fps. Video cameras, by virtue of being optical sensors, capture a certain amount of motion blur, which works for people. Computers tend to render one crisp frame at a time, and this is obvious, at least until you crank the frame rate up, and it looks odd to people.
A fast pan on a video camera can also trigger this.
You can not just blend frames into each other either, as motion blur, as the name implies, deals with motion, which can be relative: proper motion blur means smearing each object along its own motion vector. Those few devs that did anything opted for the far easier but far inferior frame blending; nobody that I have seen in games has done per-object motion blur properly. This is why motion blur is not highly regarded by game players that are sort of in the know.
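The easy-but-inferior approach is trivial to demonstrate. A toy Python sketch using one-dimensional "frames" of greyscale pixels (nothing here is real graphics code):

```python
# Naive frame blending: average each pixel with the previous frame's pixel.
# This ghosts the whole image rather than smearing along the motion vector.
def blend(prev, cur, weight=0.5):
    return [round(weight * p + (1 - weight) * c) for p, c in zip(prev, cur)]

frame1 = [0, 255, 0, 0]   # a bright object at position 1
frame2 = [0, 0, 255, 0]   # the object has moved to position 2
print(blend(frame1, frame2))
# → [0, 128, 128, 0]
```

Instead of a directional smear between the two positions you get two half-bright copies -- ghosting, which is exactly what blending gives you and why it reads as "cheap motion blur".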
I would also hold that a variation on this theme is why people were split on the 48 frames per second of the Hobbit films, which had no small amount of CGI: some found it eye pleasing while others found it to "look like a computer/video game".
Some have also pushed the idea, accurately enough in theory even if I do not hold it to be very practical, that input lag at 60fps is half that at 30fps, so you could react quicker. This is not wrong, but how many people exist in this world that can take advantage of it I do not know; I imagine it is similar to the number of people with very high level sports skills.
TVs though, or possibly bad ideas from console makers.
PAL is higher resolution than NTSC (PAL being 576 interlaced lines, though overscan can change this, and NTSC being 480). Most PAL TV sets after the late 80's, basically by law, could also decode NTSC, but the same was not true in reverse.
Anyway, there is theoretically no reason why your platform character can not run at 34 pixels a second on either a PAL or NTSC refresh rate TV set, give or take minor fractional differences. However, as game logic was tied to vblanks and other timing, this is a pig to change.
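The fix, obvious in hindsight, is to express speed per second rather than per frame. A minimal Python sketch of the idea (the 34 pixels a second figure is just the example from above):

```python
# To keep world speed constant across refresh rates, derive the per-frame
# step from a pixels-per-second speed instead of hard-coding it per vblank.
def per_frame_step(pixels_per_second, fps):
    return pixels_per_second / fps

print(per_frame_step(34, 50))  # PAL:  0.68 px per frame
print(per_frame_step(34, 60))  # NTSC: ~0.567 px per frame
```

Old engines instead baked a fixed per-vblank step into everything, which is why retiming a game for the other region meant touching far more than one constant.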
Equally, there is no reason the extra resolution of PAL could not have been used to show a bit more of the world on screen. However, that could have taken more video memory (and devs often used every last drop of it), and beyond that the visible screen is often a gameplay concept in its own right, so you would possibly also have had to reprogram enemies, game events and more besides.
To this end, PAL conversion often meant slapping some black bars on the top and bottom and slowing the video and audio down to match the signal. Indeed, games designed solely for the European market often ran at the rate the devs wanted with the screen displaying everything they could, and NTSC then got the sped up and cropped versions.
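How bad was the slowdown? The arithmetic is short enough to show (plain Python, no real figures beyond the two refresh rates):

```python
# A lazy NTSC-to-PAL port just runs the 60Hz game logic at 50Hz.
slowdown = 50 / 60                        # game runs at ~83.3% speed
percent_slower = (1 - slowdown) * 100

print(round(percent_slower, 1))           # → 16.7
```

Roughly 17% slower across the board, and since music drivers were usually ticked off the same vblank, the soundtrack dragged by the same ratio too.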
Historically there were also issues with translations, censorship and cut content (all those game scripts in 5 different languages take precious space). What got cut varies somewhat, but it was bad often enough that, combined with the shoddy PAL conversions, either PAL became a bad word or Europe got itself viewed as a third world type setup for games -- doubly so when not every game appeared at all (Chrono Trigger was technically never released in Europe until the 2008 DS release; Earthbound, Europe first saw that in 2013 on the Wii U). This persisted for a few more years as well, with things not getting progressive display support or HD support (the original Xbox never got it in Europe), or even being lumbered with RF rather than composite for years.
It was certainly not all bad -- on the censorship front there is a reason Americans were viewed as a bit puritanical. Likewise the situation saw things like the Amiga flourish, and Sega was about level pegging in various European territories before and during Nintendo's arrival; I have often seen US types consider the Megadrive/Genesis an also-ran, and while some of this may be historical revisionism, in the UK at least the Megadrive was common, possibly even the majority choice. Sega also did not have the whole censorship thing going on quite as hard.
On the matter of colours it is not all nice in the modern world, and yes, NTSC has traditionally seen poorer colour reproduction than PAL (so much so that NTSC has jokingly been called Never The Same Colour). For most practical purposes though, most people do not calibrate their TV.... the amount of places I have been where people have the contrast wound right up and the colours so saturated it makes an albino person look sunburnt....
Modern TVs have theoretically done away with most of this; they have their problems (really, interlacing should have died long ago, yet 1080i does exist it seems) but they have more or less done away with it. Now we are back to regional laws, lack of releases, different translations, lack of DLC, various cut content, lack of support and whatever else like that -- something that applies to all regions in various ways. There are still colour issues at various levels (which should also not have been a thing, though again most people do not set things up properly in the first place), the whole 4K but only at 30Hz thing with current HDMI (various DVI and displayport setups sort this), overscan, and probably something else I am not thinking of, but most of those are solvable by paying attention to what you are buying.
And thus we have the reasons why I often find myself shaking my head when computer game types that want to talk video start sounding like audiophools who think buying crazy deoxygenated cable, to transmit a digital signal no less, is the right idea.