As a newbie to the whole hardware side of computers, I've found myself drawn in by the history of hardware. While reading about it I tried to make heads or tails of it and found it a bit confusing. My question to all you hardware fanatics out there is: over the years, as games were being invented, what limitations stopped developers from making the best games? And where would I find information on this subject, as I am very interested?
The games were the best they could be; they were limited by the hardware available at the time. For example, when I was a kid the best hardware around was the Atari 2600. As hardware improved, the games (software) also improved. If you want to find an answer to a specific question, try Google or Wikipedia.
But... didn't anyone notice that as time goes on, if we compare the speed of older computers (running their original old OS) against the newest systems, performance has actually dropped? For example, a high-end computer from 2000 felt faster running Win98/Win2k than the newest high-end PC running Windows Vista. I mean, software demands have outpaced hardware nowadays. Micro$oft does everything it can to make people buy a new PC and convince them why they need such a powerful one. M$ marketing goes on...
:agree: :agree: *nix is also in this hardware race, although it doesn't usually eat resources like Vista does. I think my next major hardware upgrade will be in about 3 years. BTW: personally, I don't like M$.
A few things limit it: it's a combination of hardware, software development platforms (which let developers create more powerful software in less time), and the skill and quality of the industry. Software may run slower these days (especially Vista), but sometimes that's because it's much higher in quality. You don't see computers crashing as often as they did running Windows in the 90s; pre-2000 Windows was god-awful.