--- brad [email protected] wrote:
On Tue, 2005-08-09 at 13:58 -0700, Jack wrote:
How realistic does a scene need to be before it doesn't make a difference any more?
Very. And as all gamers know, we are not there yet. And here I swore I would never get dragged into one of these debates.
While I can understand gamers with big budgets whining that they aren't getting every little bit of their new RADEON 100,000 used in their new games, it is somewhat annoying to discover that your fairly new, say, nVidia video card is completely inadequate for every single new game out there.
Thief I and Thief II sold well, sticking you into an immersive environment that *downgraded gracefully*. You got to play the game, realized you should get a better card, but could still have fun while you waited to be able to afford one.
Then came Thief III, a game that demanded an $80+ video card you were pretty much guaranteed not to already own. You couldn't even see what the game was like without it. This is the kind of thing I'm annoyed about: the lack of graceful downgrading of the gaming experience.
I just don't understand an industry whose approach is remarkably like requiring people to upgrade their cars just to be able to play a new music CD in the car CD player.
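
To make "graceful downgrading" concrete, here is a minimal sketch of the idea (the GpuCaps/RenderSettings types and the capability tiers below are made up for illustration, not how the Thief engines actually work): instead of refusing to start, the game checks what the card can do and quietly picks a lower-quality preset.

#include <iostream>
#include <string>

// Hypothetical summary of what the installed video card can do.
// Real engines would query this through the D3D/OpenGL caps instead.
struct GpuCaps {
    int shaderModel;    // 0 = fixed-function only
    int videoMemoryMB;
};

// Hypothetical settings the game can scale up or down.
struct RenderSettings {
    std::string preset;
    bool dynamicShadows;
    bool perPixelLighting;
    int  textureDetail;  // 0 = low ... 2 = high
};

// Pick the best settings the hardware can handle rather than
// refusing to run: this is the "graceful downgrade".
RenderSettings chooseSettings(const GpuCaps& caps) {
    if (caps.shaderModel >= 2 && caps.videoMemoryMB >= 128)
        return {"high", true, true, 2};
    if (caps.shaderModel >= 1 && caps.videoMemoryMB >= 64)
        return {"medium", false, true, 1};
    // Old or fixed-function card: still playable, just plainer.
    return {"low", false, false, 0};
}

int main() {
    GpuCaps oldCard{0, 32};  // e.g. an aging card with no pixel shaders
    RenderSettings s = chooseSettings(oldCard);
    std::cout << "Running with the '" << s.preset << "' preset\n";
}

The point is only that the fallback path exists at all; the older games shipped something like the last branch, the newer one effectively didn't.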