My guess was that the TV has 1080p native output, but even if it were higher (4K, 8K) that's no problem, since "upscaling" from 1080p to 4K can simply mean showing each pixel 4 times. No real scaling algorithm needed. Same for going from 4K to 8K.
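To make "showing each pixel 4 times" concrete, here is a minimal sketch in Python/NumPy (my own illustration, not from the original post): doubling both dimensions by pixel replication turns each source pixel into a 2x2 block of identical pixels. The frame_1080p test pattern below is made up purely for demonstration.

```python
import numpy as np

def integer_upscale(frame: np.ndarray, factor: int = 2) -> np.ndarray:
    """Nearest-neighbor integer upscale: replicate each pixel factor x factor times.
    1080p -> 4K and 4K -> 8K are both clean factor-of-2 cases of this."""
    # Repeat rows, then columns; no interpolation, so no new pixel values are invented.
    return np.repeat(np.repeat(frame, factor, axis=0), factor, axis=1)

# Hypothetical 1080p frame (height x width x RGB) filled with a simple test pattern.
frame_1080p = np.zeros((1080, 1920, 3), dtype=np.uint8)
frame_1080p[::2, ::2] = 255

frame_4k = integer_upscale(frame_1080p, 2)  # 2160 x 3840
frame_8k = integer_upscale(frame_4k, 2)     # 4320 x 7680
print(frame_4k.shape, frame_8k.shape)
```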
Correct me if I'm wrong, but don't CRTs accept almost any resolution you feed them (unless it's too high)? So any modern (HD) CRT should handle it, no?
Proof? From Reddit:
The large majority of TVs do not use nearest neighbor to upscale 1080p content to 4K, or 4K to 8K. They use an interpolated method, as that looks better for video-based content, which is what TVs are designed for.
Again, most CRTs out there are SD, maxing out at 480i, not HD.
Here is a direct comparison between the Wii in 480p over component versus the Wii U in 480p over component. The first image is the Wii, the second is the Wii U. Note the blurriness of the Wii U capture, the cut-off pixels at the top and bottom, and the chroma errors (red and green fringing around other colors) visible on the top icons.
Additionally, here is a comparison between the Wii U’s vWii in 480p versus 1080p. Note the extreme blur in 480p mode (first image) versus the clearer checkerboard in 1080p (second image).
All this to say, the vWii mode on the Wii U has compromised image quality, which is most noticeable when the Wii U is set to 480p output.
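To illustrate the difference the Reddit comment is pointing at, here is a minimal sketch of my own (not from the quote) using NumPy: one scanline of a hard-edged pattern, upscaled 2x once by pixel replication and once by linear interpolation. The interpolated version introduces in-between values, which is roughly why interpolating scalers look softer on sharp content like the checkerboard mentioned above.

```python
import numpy as np

# One scanline of a sharp checkerboard-like edge: alternating black/white pixels.
scanline = np.array([0, 255, 0, 255, 0, 255], dtype=float)

# Nearest neighbor (pixel replication): values stay 0 or 255, edges stay hard.
nearest = np.repeat(scanline, 2)

# Linear interpolation (roughly what a typical TV scaler does): new samples
# fall between neighbors, so hard edges become intermediate grey values.
src_pos = np.arange(len(scanline))
dst_pos = np.linspace(0, len(scanline) - 1, len(scanline) * 2)
linear = np.interp(dst_pos, src_pos, scanline)

print(nearest)           # only 0s and 255s
print(np.round(linear))  # contains in-between values -> softer, "blurrier" edges
```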