Wow this generation lark seems even more pointless than the time I tried scoring game genres.
I'd imagine four, but that all depends on how well optimized they are. That's also going to determine the minimum spec needed to emulate at full speed. i5s and i7s are always a safe bet, and as the emulators improve, those CPUs will almost certainly handle 360/PS3/Wii U games without much issue.
> future proofing PCs is a waste of money IMO.

I fail to see how. I'd rather spend $250 on a good CPU than $100 on something outdated and another $250 to replace it shortly after.
What you're talking about isn't future proofing. The simple fact about future proofing is that you can't know what advances will come in the next few years, so it's pointless to try to preempt them. It's best to just buy a PC that plays current games well instead of spending a lot extra trying to make it play games well for the next 2-4 years. That's usually more cost effective because it keeps your PC more up to date.
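The cost argument on both sides can be put into rough numbers. A minimal sketch; the lifespans are assumptions made purely for illustration (a $250 CPU staying adequate for 4 years, a $100 CPU needing a $250 replacement after 2), and the real answer depends entirely on them:

```python
def cost_per_year(total_spend: float, years: float) -> float:
    """Annualized hardware cost over its useful life."""
    return total_spend / years

# Assumed lifespans, for illustration only:
buy_good_now = cost_per_year(250, 4)         # $250 CPU adequate for 4 years
upgrade_later = cost_per_year(100 + 250, 4)  # $100 CPU plus a $250 replacement over the same 4 years

print(f"${buy_good_now:.2f}/yr vs ${upgrade_later:.2f}/yr")
```

Of course, the staggered path also means you get the newer part two years later, which is exactly the crux of the disagreement here.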
This is a 360 emulator running Eragon (and Catherine) at full speed on an Intel Core i7-3770K @ 4.20 GHz (4 cores, overclocked, $300). More playable 360 games will run at around full speed with that CPU, going by his channel. From what I'm seeing here, I'm guessing we won't need anything more than this CPU to run 360 games, at least as Xenia progresses.
Eragon (360, full speed)
Catherine (360, full speed)
He recently uploaded a video featuring RPCS3:
Silent Hill 3 HD (PS3, around 2/3 full speed; it looks like he upgraded to an Intel Core i7-6700K @ 4.60 GHz, 4 cores, overclocked, $430)
If I had to guess, I'd say RPCS3 isn't as optimized as it could be. Obviously PS3 games will run at full speed on any of these CPUs as the emulator gets more optimized.
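"Around 2/3 full speed" can be translated into a rough frame rate. A small sketch; the 30 fps target is an assumption (a common cap for PS3 titles), not something stated in the video:

```python
def effective_fps(target_fps: float, speed_fraction: float) -> float:
    """Frame rate actually seen when an emulator runs at a fraction of full speed."""
    return target_fps * speed_fraction

# Assuming a 30 fps target, 2/3 speed works out to roughly 20 fps.
print(round(effective_fps(30, 2 / 3), 1))  # → 20.0
```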
Now what I want to know is: how many cores do Xenia, RPCS3, and Cemu make use of?
I know it's hyperthreading; it's still easier to call it a quad-core.
I'll make up a new word for hyperthreading then... Shitty-core hehe
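The quad-core vs. hyperthreading distinction above is just physical cores times hardware threads per core. A quick sketch (the i7-3770K figure comes from the earlier post; note that `os.cpu_count()` reports logical cores, not physical ones):

```python
import os

def logical_cores(physical_cores: int, threads_per_core: int = 2) -> int:
    """Hardware-thread count the OS sees on a hyperthreaded CPU."""
    return physical_cores * threads_per_core

# An i7-3770K is a physical quad-core; with Hyper-Threading the OS sees 8 threads.
print(logical_cores(4))  # → 8

# Whatever this machine reports (the logical, i.e. hyperthreaded, count):
print(os.cpu_count())
```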
> How are there both Xbox 360 and PS3 emulators, yet there is still no original Xbox emulator that works like that?

Probably a combination of lack of interest from developers and how inherently difficult emulating different x86 instruction sets can be. I wouldn't really know about the rest of the hardware; I haven't a clue how hard it would be to pull off emulating the video hardware.
Here's the thing: PS3 and Xbox 360 emulation will most likely not be a thing for years, and common CPUs you can get NOW probably will not cut it when they finally are. Cell and Xenon, which is really just a modified Cell (don't bother with a snarky "the Xbox is a different thing", just look up the 360's Xenon on Wikipedia), are hard to emulate. They just will not be an outright thing for years.
At the moment, progress is slow, if not at a grinding halt. Want proof? Look at where Xbox emulation is. Not 360 emulation. Xbox emulation. Thus far there has been only one Xbox emulator that could boot a commercial game, that being Halo, and it couldn't even play that at full speed. That's the original Xbox. Cell is something that will take years to be emulated well enough. The Wii U, too.
Unless you just want to be silly, don't shop for a computer for future emulation NOW. Buy that darn machine when you actually can emulate. There's no reason to dump loads of money into something that will still take years to happen; meanwhile you've spent money on hardware you're not fully using, and you'll end up needing to replace it anyway once those emulators hit.
If you want to play those games so badly, dump some money on a modded PS3. It saves you TONS of money, trust me. A modded 360 too, if you feel adventurous. Beyond that, the world is your oyster.
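One reason chips like Cell and Xenon are so expensive to emulate shows up even in a toy interpreter: every single guest instruction costs a fetch, a decode, and a chain of host-side branches. The machine below is entirely made up for illustration; real PowerPC emulation adds translation, timing, and multi-core synchronization on top of this basic overhead.

```python
# A toy fetch-decode-execute loop for a made-up two-register machine.
# Each guest instruction burns many host operations, which is why
# interpreting a 3.2 GHz console CPU in software is so costly.

def run(program):
    regs = {"r0": 0, "r1": 0}
    pc = 0
    while pc < len(program):
        op, *args = program[pc]      # fetch + decode
        if op == "li":               # load immediate into a register
            regs[args[0]] = args[1]
        elif op == "add":            # args[0] += args[1]
            regs[args[0]] += regs[args[1]]
        elif op == "halt":
            break
        pc += 1                      # advance the guest program counter
    return regs

# Computing 2 + 3 on the toy machine:
state = run([("li", "r0", 2), ("li", "r1", 3), ("add", "r0", "r1"), ("halt",)])
print(state["r0"])  # → 5
```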
That's usually, in essence, the idea of future proofing, only taking it a step further and making current games run great or stupendously. Meaning future games will be more likely to run at least well or acceptably for most, or at least more forgiving, PC gamers. I can't run all new games at 1080p max quality or anything, but my 7770 GHz Edition reference card from around 2011-2012 still runs most games I want quite acceptably. Future proofing is not so much a myth as it is proper research and proper investment.
And it's likely you would've gotten more bang for your buck if you hadn't gone with top-of-the-line hardware.
It depends on what you do, what you plan to do, and how long you want it to last. Having unrealistic expectations, like 7+ or (IMO) even 5+ years on hardware without needing to upgrade, is a whole different matter altogether. I know I need to upgrade, but that doesn't mean I didn't choose hardware that's benefited me for a long time versus an upgrade every year or two. I'll save myself the extra round of purchases, thanks.
Most people who talk about future proofing aren't talking about programming, running servers, or anything like that. They're just normal users, and yes, they can future proof for most of what they usually do. Advances like OpenGL and DirectX are the only ones you need to watch for, and even those can be made compatible, as seen with DX11 and DX12, as well as my AMD card now supporting the Mantle API.
I don't have top-of-the-line hardware. Lol.
Then you didn't future proof. You just bought a PC.
I never said I future proofed, jackass. All I said was that I chose hardware that has benefited me for a long time. Quit trolling, grow up, and learn something about technology instead of being a snotty brat. I explained what future proofing is to you; I can't help it if you refuse to learn. No one in their right mind who knows anything about building a computer should expect their components to magically upgrade their internal chips, diodes, resistors, die sizes, speeds, technology, etc. on their own.