Classic consoles were quite different from PCs. For one, games were actually programmed to the metal back then - everything about the hardware was ingrained in the development process, whereas now it's all done through APIs and OSes. You no longer get the bang-for-your-buck you used to. Game controllers were hardly standardized on PCs back then, either: you had to install a 15-pin game port card and use the manufacturer's custom drivers, rather than just plugging in a USB controller like today.
I don't see how that changes anything - many games of that era were also programmed for specific hardware, because APIs like OpenGL or DirectX didn't exist yet; standardization in graphics was still generations away. Games that looked one way on one PC looked vastly different on another depending on the hardware (Hercules versus EGA versus CGA versus VGA, etc. - although admittedly that was more of a worry in the NES/SMS days, as by the time the SNES/Genesis rolled in we had mostly settled on VGA). The disparity was especially prominent in audio - no MIDI sounded "the same" on vastly different setups (the battle between Sound Blaster, Roland, Gravis Ultrasound and AdLib, among others, raged on well into that generation).
Now that we have APIs that allow more leeway and deliver relatively similar results on vastly different hardware, we use them across the board to provide a seamless experience. That doesn't negate the advantage of consoles: their hardware is standardized, so their API layers can be much thinner than on PC. Every PS4 is identical to every other PS4, and likewise for the Xbox One, so you can cut out a lot of peripheral nonsense and gain a lot in performance.
And those defining features you listed for the twins... well, touchpads have been around on PCs since long before the PS4 - standard on any remotely modern laptop, actually. Perhaps not on a controller, but once the Steam Controller drops, that'll change.
The only other controller I know of with a touchpad is the OUYA's - hardly a standard.
PSEye and Kinect? Neither is integrated into the console, so there's no guarantee users have them. PCs can have a Kinect too, should developers choose to use it. And Sixaxis is just a hasty Wii rip-off that never took off.
They're not integrated, but they're available. As for SIXAXIS, it was unveiled at the same E3 as the Wii and released before the Wii, so I can't quite see how it'd be a rip-off. There are certain gameplay elements where it's useful (steering wheels in racing games, or balancing in platformers), but to be fair it's mostly gimmicky and I pay little attention to it.
None of that delivers an experience you can't get on a PC, except maybe the Sixaxis - and I don't see anyone clamouring for that. In terms of games that couldn't be made for PC without fundamentally changing the experience... I just don't see it. If the Xbone had kept the Kinect as an integrated part of the system, it would have a legitimate argument, but Microsoft cut and ran from that strategy, and it no longer holds much value: developers have no guarantee that a potential customer owns one, which puts it in the same boat as the multitude of optional peripherals available for PC.
The fundamental problem in your reasoning is that you want a system that offers an experience different from the PC's, instead of just wanting a good system. I'll define "good" as a system that supports gaming first and foremost and is a good marriage of relevant hardware, strong software support and a sensible price point.
On a PC you can attach any peripheral you want out of the gate - you can play on 15 monitors if you see fit - so the dual-screen point is somewhat moot.
The uniqueness of a control scheme only goes so far - if Nintendo releases a console that's controlled with a buttplug, you won't see gamers bursting in through windows to get one on launch day. The innovation you crave so much has to be practical, and the second screen turned out to be impractical: it jacked up the price point by $100, which made improving the console's specs to an acceptable degree unfeasible even if Nintendo had wanted to do so (which they didn't). Not all innovation turns out to be good innovation.
The Wii U is an attempt at making a DS that's a home console, and it just didn't work out. I like the concept of having a screen on my controller and I quite enjoy the Wii U gamepad - I don't think adding a screen to a controller is a bad idea, and I don't think that's why the Wii U is failing. The Wii U is failing because it's "different" to its own detriment: it runs on obsolete hardware that's vastly different from the competition's machines, and developing games for it requires watering down software assets that are otherwise almost cross-platform.
That being said, I think the Wii has a lot to do with why the Wii U isn't doing so well. I think that 100-million-dollar dust collector let down many so-called "core gamers" who bought it expecting all-around coverage; they became disappointed and moved on to the PS3, Xbox or PC camps. By the time the Wii U came along, nobody really expected it to deliver a well-rounded gaming experience, so only Nintendo's hardcore fans pulled out their wallets to buy one - an intuitive reaction, but also detrimental to the platform, as third-party support is hard to obtain for a console that doesn't sell well. I really do think the Wii did more harm to the Wii U than the Wii U did to itself.