Just keep in mind that not all HDTVs are created equal on their VGA input. Many of them do scale the VGA signal to their native resolution, and many also apply post-processing to VGA (albeit not as much as to HDMI, for example).
Even if an HDTV doesn't scale the signal or apply post-processing, it will still have some lag simply because the display is digital (yep, even with an analog source, since it has to convert the signal to digital first). I've seen some ridiculously high input-lag reports on the VGA inputs of some HDTVs, so it's rarely a "sure thing". And most current TVs are doing away with VGA altogether, almost as fast as they did with S-Video.
And the SLG 3000 will give you a "faux" sense of reduced ghosting for the reason you mentioned: it adds scanlines, which reduces the total number of visible pixels and thereby "covers" the pixels that would "ghost" or "blur" in motion, so you simply can't see them anymore. Still, it's a fake effect and doesn't actually reduce the TV's response time at all. (It's just as fake as frameskipping, in the sense that frameskipping deludes you into thinking the game is running faster when it actually isn't; it's really just a "fast-forward" function.) If a person is using scanlines to cover up a lot of motion blur, then at that point he might as well keep adding more and more scanlines until they cover the whole darn screen and get rid of the blur altogether.....
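To illustrate what I mean, the effect boils down to darkening every other line of the picture. Here's a minimal sketch (in Python, with a made-up flat-grey test frame, not anything the SLG 3000 actually runs): the blur in the darkened rows is still physically there on the panel, it's just hidden.

```python
# Hypothetical sketch of a scanline overlay: a frame as rows of
# brightness values (0-255). Darkening every other row masks whatever
# motion blur those rows carried; pixel response time is unchanged.
frame = [[200] * 8 for _ in range(6)]  # flat grey test frame

scanlined = [
    row[:] if y % 2 == 0 else [v // 5 for v in row]  # odd rows go near-black
    for y, row in enumerate(frame)
]
# Even rows keep the image; odd rows become dark "scanlines".
```

The point is that nothing about the darkened rows changes how fast the panel's pixels transition; the blur is merely less visible.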
It's all garbage to me. We never had ghosting or input lag (or pixelation) with the older technology, and if this "so-called" better technology can't at the very least match the old in all areas, then I don't want it. Sure, old CRTs are bulky, but so is your couch, furniture, refrigerator, bed, recliner, and toilet. Until something better comes along, the only way to perfectly emulate a CRT is to get one. I do believe, though, that maybe 5-10 years from now we will have HDTVs with high enough contrast, DPI, and resolution to show a convincing CRT shader.