Not the literal terms; those are a piece of cake.
What I mean is display and capture. NTSC is 525 lines vertically (about 480 of them visible), and half of those lines, alternating between odd and even, are displayed every 1/59.94 of a second. That's fine and dandy; I get deinterlacing for 480i just fine.
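To make my mental model of 480i concrete, here's a quick sketch of weave deinterlacing: two half-height fields, each holding alternate scanlines, combined back into one full frame. The line labels and names are made up; this is just the operation as I understand it, not any real capture API.

```python
# Toy model of 480i weave deinterlacing. "Frames" here are lists of
# line labels, not real pixel data.

def weave(even_field, odd_field):
    """Interleave two half-height fields into one full-height frame."""
    frame = []
    for even_line, odd_line in zip(even_field, odd_field):
        frame.append(even_line)   # lines 0, 2, 4, ...
        frame.append(odd_line)    # lines 1, 3, 5, ...
    return frame

# A tiny 8-line "frame" split into its two fields:
even = ["line0", "line2", "line4", "line6"]  # first field (even lines)
odd  = ["line1", "line3", "line5", "line7"]  # second field (odd lines)

print(weave(even, odd))
# → ['line0', 'line1', 'line2', 'line3', 'line4', 'line5', 'line6', 'line7']
```

If the two fields were drawn at the same instant, weaving them loses nothing; the trouble with real 480i is that they're drawn 1/59.94 s apart.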
Now, stop me if I'm wrong here.
What's tripping me up is 240p. The TV is still updated at 59.94 fields per second, except the lines skipped by each pass are never drawn and stay black. This creates scanlines. Capture each updated field 59.94 times per second and you have the original image.
However, when capturing 240p material on a PC, you capture at 29.97 frames per second. I'm also told that you can deinterlace to get the lossless source. I simply don't see how that can be true. Aren't half the lines black in each captured frame? How can you recover all the original data? Or is it just not lossless?
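Here's the operation I'm imagining the capture performs, sketched the same toy way: each 29.97 fps captured frame interleaves two successive fields, and splitting them back apart would recover the two field images. The row labels are hypothetical; `t0`/`t1` just mark which field a row supposedly came from.

```python
# Toy model of pulling the two fields back out of a woven capture frame.
# Even rows are assumed to come from one field (time t0), odd rows from
# the next field (time t1).

def split_fields(frame):
    """Separate a woven frame into its even and odd fields."""
    even_field = frame[0::2]  # rows 0, 2, 4, ...
    odd_field  = frame[1::2]  # rows 1, 3, 5, ...
    return even_field, odd_field

captured = ["t0_row0", "t1_row0", "t0_row1", "t1_row1"]  # hypothetical 4-row frame
even, odd = split_fields(captured)
print(even)  # → ['t0_row0', 't0_row1']  (the field drawn at t0)
print(odd)   # → ['t1_row0', 't1_row1']  (the field drawn at t1)
```

If this model is right, then for 240p content each captured frame would hold two successive full pictures, one per field, which is where my "half black" worry comes from.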