As long as you've got nvidia, you should be fine. I've only heard mediocre things about ATi. It can be a bit tricky to get nvidia working if your distro upgrades kernels faster than they provide precompiled interfaces. On the other hand, their configuration tools are quickly approaching Windows' point-and-click interface.
There is a company that is supposedly catering to Linux, but it appears they're totally half-assing 3D. From what I gather, they're taking Mesa's GL implementation and slowly moving the fragment generator over to an FPGA. FPGAs are appreciated for letting engineers quickly create a test version of their circuits, but they're simultaneously cursed for being slow. And you know they're not planning to bring anything to the table from the last five years, because the card will initially be PCI. Not PCI Express, mind you: plain old 133 Mbyte/sec PCI. I'm not entirely sure what the goal here is, aside from beginning a legacy of an open source driver. The group has made no concrete plans to open the hardware implementations, but clearly this card needs to be the first in a line of iterative improvements if it's to be of any value to the community.
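To put that 133 Mbyte/sec figure in perspective, here's a back-of-envelope sketch in Python. The frame rate and the AGP comparison number are illustrative assumptions on my part, not measurements:

```python
# Back-of-envelope: how much data can each bus move per rendered frame?
# Figures are theoretical peaks; real throughput is lower.

PCI_BANDWIDTH_MB_S = 133    # classic 32-bit / 33 MHz PCI
AGP_8X_MB_S = 2133          # contemporary AGP 8x, for comparison
FPS = 60                    # assumed target frame rate

def max_traffic_per_frame_mb(bus_mb_s, fps=FPS):
    """Upper bound on textures/vertices pushed across the bus per frame."""
    return bus_mb_s / fps

print(round(max_traffic_per_frame_mb(PCI_BANDWIDTH_MB_S), 1))  # ~2.2 MB/frame
print(round(max_traffic_per_frame_mb(AGP_8X_MB_S), 1))         # ~35.6 MB/frame
```

A couple of megabytes per frame is barely enough to stream a handful of uncompressed textures, which is why a PCI-only card looks dated before it ships.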
Of course, I'd appreciate being proven wrong, but 3D math and hardware acceleration aren't simple tasks. If you were expecting something more useful than the already open sourced Intel Extreme Graphics hardware and drivers, you'll be sorely disappointed. At best, these will be great for providing an old computer with enough 3D acceleration to handle the upcoming 3D accelerated desktops in OS X, GNOME and Longhorn. Remember that these projects aim to use the GPU because it's there and it's not being used. Apple plans to continue offloading more to the GPU, so the requirements here are only going up.
If you're still interested in what the group's doing, they're called the Open Graphics Project. I think their site just got hacked, though, which is a bit unprofessional. I hope it doesn't put them at serious risk of missing their November deadline...
Justin
On 8/5/05, Josh Charles [email protected] wrote:
One problem that has kept me from playing good games on Linux is the continuing lack of decent 3D acceleration support. I've had 3D cards that work on some distros, but not others, even with the same configuration. This isn't so much a Linux problem as a hardware vendor problem, though. I hear that there is a video card vendor that is catering directly to Linux now, so that's a step in the right direction.
On 8/5/05, Justin Dugger [email protected] wrote:
There's a lot more to open source 3D gaming than just first-person shooters. Most of them also have a Linux version, provided the developers didn't choose DirectX for their game.
There's crack-attack, a Tetris Attack clone that plays well. There's also Glest, which looks something like Warcraft 3 from the screenshots. Kenta Cho has some great 2D shooters that use 3D graphics. There's also Armagetron, a neat Tron lightcycles game with decent multiplayer aspects. GL-117 is an okayish air force game.
If you know where to look, you can find a lot of good open source games. I think one of the remaining barriers to Linux gaming is a decent website that focuses on open source games. Happypenguin is neat, but the web design needs... a makeover. The whole site looks like it was designed using placeholder art, and then they decided to go live with what they had once they heard what artists charge for that stuff. The ratings system is a bit strange, not because it uses 5 stars, but because the ratings aren't tied to any specific version. If a game doesn't run, it gets one star, even if the bug is fixed the next day. And hardly anybody posts or reviews.
So yeah, the games you mentioned are okay, but they kinda suck for reasons similar to the ones that plague Happypenguin.
Justin Dugger
On 8/5/05, Josh Charles [email protected] wrote:
I haven't been much of a gamer, but I recently purchased Allied Assault and have become quite addicted to it. I was amazed to find that there are completely open source FPSes out there, though from what I understand, the quality isn't up to current proprietary standards.
From what I can see of the movies, though, the gameplay isn't too bad.
Here are some links to check out if you are interested:
http://www.nexuiz.com/ - an FPS
http://www.planeshift.it/ - more of an RPG than an FPS, but looks neat.
Enjoy! Josh _______________________________________________ Kclug mailing list [email protected] http://kclug.org/mailman/listinfo/kclug
On Sat, 6 Aug 2005, Justin Dugger wrote:
As long as you've got nvidia, you should be fine. I've only heard mediocre things about ATi. It can be a bit tricky to get nvidia working if your distro upgrades kernels faster than they provide precompiled interfaces. On the other hand, their configuration tools are quickly approaching Windows' point-and-click interface.
I disagree with your comments about ATI. I think they are solid, reliable cards and the r200 DRI driver that ships with X.org gives good quality 3D graphics. I have found that most people that have trouble with "getting these cards to work" really don't have clue one about how all the parts of the system interact together. I am not saying that this is a requirement for running Linux, but it might be for running certain distros. I can say that RedHat/Fedora will pretty much configure it for you out of the box.
You can get a pretty interface to tweak the DRI driver(s) here:
http://dri.freedesktop.org/wiki/DriConf
The worst problem with the ATI set of drivers is the slow support of new chipsets. All of the new cards shipping from ATI now are from a series of chips they call the r300 series. An r300 series driver is in the works and is fairly decent now, but still in heavy development. I expect to see it put out to the general public in the next X release.
I can't say anything about ATI's closed source drivers. I try to stick with open source as much as I can for my systems.
Do the open sourced drivers perform well under intensive games? It seems a bit useless to have open drivers that perform only marginally better than CPU processing of GL commands. Don't get me wrong, I don't mean to disparage ATi, its open source efforts, or its users; I'm just heavily skeptical of a third party competing with a manufacturer. There's also the truth of deprecation: the r200 is getting pretty old, and the r300 will probably begin to lose support from the most aggressive games within the next year or two (Battlefield 2 cuts off anything below a GF 5700 and Radeon 8500).
In particular, I was speaking of ATi's closed source drivers, which service the Radeon 8500 and above. I've seen plenty of ATi horror stories on Slashdot and Linux gaming sites, probably slightly more than I've seen for nVidia. Given the fast pace of 3D graphics cards, I'm not sure a volunteer open source effort can accomplish anything significant. One thing that appears obvious is that OSS consistently trails behind industry leaders of various software components, with a few possible core exceptions like Firefox and... well, mostly just Mozilla. The fact that it's much easier to imitate than to innovate appears to enable the Open Source community to play catch up with a moving target with only a few dozen contributors, and only when the list grows into the hundreds can a project really move past incumbents. Perhaps Open Source is a living counterexample to the Mythical Man Month argument?
Justin Dugger
On 8/9/05, D. Hageman [email protected] wrote:
On Sat, 6 Aug 2005, Justin Dugger wrote:
As long as you've got nvidia, you should be fine. I've only heard mediocre things about ATi. It can be a bit tricky to get nvidia working if your distro upgrades kernels faster than they provide precompiled interfaces. On the other hand, their configuration tools are quickly approaching Windows' point-and-click interface.
I disagree with your comments about ATI. I think they are solid, reliable cards and the r200 DRI driver that ships with X.org gives good quality 3D graphics. I have found that most people that have trouble with "getting these cards to work" really don't have clue one about how all the parts of the system interact together. I am not saying that this is a requirement for running Linux, but it might be for running certain distros. I can say that RedHat/Fedora will pretty much configure it for you out of the box.
You can get a pretty interface to tweak the DRI driver(s) here:
http://dri.freedesktop.org/wiki/DriConf
The worst problem with the ATI set of drivers is the slow support of new chipsets. All of the new cards shipping from ATI now are from a series of chips they call the r300 series. An r300 series driver is in the works and is fairly decent now, but still in heavy development. I expect to see it put out to the general public in the next X release.
I can't say anything about ATI's closed source drivers. I try to stick with open source as much as I can for my systems.
-- //========================================================\ || D. Hageman [email protected] || \========================================================//
<Warning lengthy reply>
So is the Linux kernel playing catch up with some other kernel/OS? I thought that Linux was pretty much a frontrunner. Also, I've noticed that certain GUI aspects of certain OSS window managers were finding their way into other places, like the Windows desktop. Also, if games need to stop supporting such big and powerful cards as Radeon 8500s and 9700s, then you might want to take a look at what is wrong with the game software and not the hardware or OSS drivers. Sloppy coding is, after all, sloppy coding. GIGO. But hey, what do I know. I don't play 3D games; they give me motion sickness. However, I do like to do the occasional 3D graphic design, and would like to get into 3D movie making, which is in essence a major part of 3D games. But the day I need to go out and buy a $400 video card to play a $50 game is the day I stop buying games. Oh wait, I've already done that. ;)
Seriously though, most of the "3rd party" OSS writers are professional coders. I would say they are at least as capable of writing drivers as the manufacturers, and maybe a bit better, since they don't have to deal with a corporate boss telling them it needs to ship yesterday. Also, most commercial coding projects I've ever seen have groups in the low teens, if that many, so an OSS project with a few dozen coders is likely to out-perform *any* commercial vendor. It's not a matter of playing catch up; it's the need to reverse engineer the hardware that is so time consuming. All in all, there are places where Linux is trailing, but not due to the quality or ability of the coders; it's due to the head-in-the-sand attitude of certain corporations. On that note, expect the SuSE distro to get more and more sophisticated and have better and better support for hardware. If I am reading the currents right, SuSE is heading in the direction of getting more and more big industry names to port code to SuSE. I may be switching back to SuSE someday soon.
Just my 2 cents, Brian J.D.
--- Justin Dugger [email protected] wrote:
Do the open sourced drivers perform well under intensive games? It seems a bit useless to have open drivers that perform only marginally better than CPU processing of GL commands. Don't get me wrong, I don't mean to disparage ATi, its open source efforts, or its users; I'm just heavily skeptical of a third party competing with a manufacturer.
...
I've seen plenty of ATi horror stories on Slashdot and Linux gaming sites, probably slightly more than I've seen for nVidia. Given the fast pace of 3D graphics cards, I'm not sure a volunteer open source effort can accomplish anything significant. One thing that appears obvious is that OSS consistently trails behind industry leaders of various software components,
...
with only a few dozen contributors, and only when the list grows into the hundreds can a project really move past incumbents. Perhaps Open Source is a living counterexample to the Mythical Man Month argument?
__________________________________________________ Do You Yahoo!? Tired of spam? Yahoo! Mail has the best spam protection around http://mail.yahoo.com
On Tuesday 09 August 2005 13:30, Jack wrote:
So is the Linux kernel playing catch up with some other kernel/OS? I thought that Linux was pretty much a frontrunner. Also, I've noticed that certain GUI aspects of certain OSS window managers were finding their way into other places, like the Windows desktop. Also, if games need to stop supporting such big and powerful cards as Radeon 8500s and 9700s, then you might want to take a look at what is wrong with the game software and not the hardware or OSS drivers. Sloppy coding is, after all, sloppy coding. GIGO. But hey, what do I know. I don't play 3D games; they give me
It's not a matter of sloppy code but merely advances in hardware capabilities. The newer video cards feature pixel and vertex shaders and hardware transform and lighting -- all of these capabilities make for more realistic scenes; attaining the same result in software would require nothing short of a small supercomputing cluster. For all intents and purposes, the OSS drivers for ATI and NV cannot be used to play anything newer than Quake 3 Arena (a game that is now around 6 years old). That's a serious limitation. The proprietary drivers from ATI are half-assed, token implementations of their Windows equivalents. They have long-standing bugs which ATI seems rather uninterested in fixing.
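To give a feel for why a software fallback isn't viable, here's a rough Python sketch of the arithmetic cost of doing transform and lighting on the CPU. The per-vertex op count and scene size are purely illustrative assumptions, not benchmarks:

```python
# Rough cost of doing transform & lighting (T&L) on the CPU instead of
# the GPU. All numbers below are loose assumptions for illustration.

def software_tnl_flops(vertices_per_frame, fps, flops_per_vertex=100):
    """Floating-point ops per second needed for CPU-side T&L.

    flops_per_vertex (~100) is a rough guess covering a 4x4 matrix
    transform plus simple per-vertex lighting.
    """
    return vertices_per_frame * fps * flops_per_vertex

# A modest mid-2000s scene: 500,000 vertices per frame at 60 fps
needed = software_tnl_flops(500_000, 60)
print(needed / 1e9, "GFLOPS just for T&L")  # 3.0 GFLOPS
```

Three GFLOPS is already more than a desktop CPU of the era could sustain while also running game logic, and that's before rasterization or any pixel shading, which is the far more expensive part.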
Anyone doing any serious gaming or 3D modeling and design in Linux is using NVidia's high-end consumer cards or their Quadro workstation cards (very expensive).
All is not bad on ATI's side though: the OSS driver writers will probably have the best Composite/Xgl implementation first, when it's ready.
I have an NVidia 5700 that's starting to show its age but can still play some damn sweet games from its era.
On Tue, 9 Aug 2005, Jason Clinton wrote:
It's not a matter of sloppy code but merely advances in hardware capabilities. The newer video cards feature pixel and vertex shaders and hardware transform and lighting -- all of these capabilities make for more realistic scenes; attaining the same result in software would require nothing short of a small supercomputing cluster. For all intents and purposes, the OSS drivers for ATI and NV cannot be used to play anything newer than Quake 3 Arena (a game that is now around 6 years old). That's a serious limitation. The proprietary drivers from ATI are half-assed, token implementations of their Windows equivalents. They have long-standing bugs which ATI seems rather uninterested in fixing.
Your comments about the OSS driver for ATI not being able to run anything newer than Quake 3 Arena are incorrect. (I can't say the same about the OSS nVidia driver.) At any rate, you can Google the mailing lists for more information.
Again, the r300 has made significant progress in the past two months. I follow the development very closely and give feedback/help when I can - which isn't much these days due to time constraints. :-(
The biggest enemy of being able to produce quality OSS drivers is all the proprietary enhancement technology that is going into the new cards. Luckily not everyone lives in the United States, so there are ways around such things.
I guess I should add the disclaimer that I helped create the initial specification for the driconf tool, so I have ties to the project. I also maintain the RPM packages for the tool. I guess I might be wearing rose colored glasses.
--- Jason Clinton wrote:
On Tuesday 09 August 2005 13:30, Jack wrote:
... if games need to stop supporting such big and powerful cards as Radeon 8500s and 9700s, then you might want to take a look at what is wrong with the game software and not the hardware or OSS drivers. Sloppy coding is, after all, sloppy coding. GIGO. But ...
It's not a matter of sloppy code but merely advances in hardware capabilities. The newer video cards feature pixel and vertex shaders and hardware transform and lighting -- all of these capabilities make for more realistic scenes; ...
Anyone doing any serious gaming or 3D modeling and design in Linux is using NVidia's high-end consumer cards or their Quadro workstation cards (very expensive).
As I said, if games require video cards with vertex shading, then you might have an overcoded (aka sloppy) game. How realistic does a scene need to be before it doesn't make a difference any more? We are talking about games here, not holodecks, right? If and when they come out with a holodeck game, then yes, by all means throw in the vertex shading. However, I suspect the vertex shading is being done by firmware and not hardware, and hence the difference in performance between Windows and Linux. Were it truly hardware, then simply probing the card would locate the commands necessary to call the vertex shading and hence fully support that feature. I could be wrong, I have been known to be wrong before, but I don't think I am now. Not every serious designer is using those high-end cards. Note, however, that getting an $800 professional card for 3D CADD is only worthwhile if you're doing some really intense stuff professionally and can afford to pay it off with profits. For the serious home robotics/rocketry/modeling/small-shop user, an ATi 8700 will be quite sufficient. I do, however, recommend a gig of RAM with that. And if you happen to have a dual processor, all the better, but a decent modern desktop and an ATi 8700 work quite well. Even with that dual processor and an $800 professional video card you'll still be waiting for the design to render. My reasoning is that you can upgrade RAM more easily and cheaply than a video card; with the money you save on the video card you can upgrade the CPU and memory, and wind up, I think, better off.
IMNSHO, Brian J.D.
As I said, if games require video cards with vertex shading, then you might have an overcoded (aka sloppy) game. How realistic does a scene need to be before it doesn't make a difference any more? We are talking about games here, not holodecks, right?
Well, this is one of those situations where I'll just have to say that that's a nice opinion that you have and leave it at that.
On Tue, 2005-08-09 at 13:58 -0700, Jack wrote:
How realistic does a scene need to be before it doesn't make a difference any more?
Very. And as all gamers know, we are not there yet. And here I swore I would never get dragged into one of these debates.
Brad
--- brad [email protected] wrote:
On Tue, 2005-08-09 at 13:58 -0700, Jack wrote:
How realistic does a scene need to be before it doesn't make a difference any more?
Very. And as all gamers know, we are not there yet. And here I swore I would never get dragged into one of these debates.
While I can understand the gamers with big budgets whining that they aren't getting every little bit of their new RADEON 100,000 used in their new games, it is somewhat annoying when you discover that your fairly new, say, nVidia video card, is completely inadequate for every single new game out there.
Thief I and Thief II sold well, sticking you into an immersive environment that *downgraded gracefully*. You got to play the game, you realized that you should get a better card, but you could still have fun while you waited to be able to afford the new video card.
Then came Thief III, a game which demanded an $80+ video card which you were pretty much guaranteed not to have already. You couldn't even see what it was like without the $80+ video card. This is the kind of thing I'm annoyed about: the lack of graceful downgrading of the gaming experience.
I just don't understand an industry whose approach is remarkably similar to requiring people to upgrade their cars just to be able to play a new music CD in their car CD player.
____________________________________________________ Start your day with Yahoo! - make it your home page http://www.yahoo.com/r/hs
I just don't understand an industry whose approach is remarkably similar to requiring people to upgrade their cars just to be able to play a new music CD in their car CD player.
Oh, I very much agree that PC gaming is in a serious decline. The Battlefield 2 syndrome (you could probably attribute that to a more popular game, if I could figure out which one) is very detrimental to sales. So much so that when Microsoft presented the Xbox as a PC-developer-friendly platform, many left and never looked back, and others just decided to half-ass the PC platform for a fistful of dollars more. Thief 3 would be an excellent example of such shenanigans.
But I've not seen many open source games of high quality, even by the standards of five years ago. Most of the ones that are good come from a single guy working hard to clone a game he liked (crack-attack, Wesnoth, Armagetron). Partly, game authors on the PC need to start looking towards smaller, simpler games than the massive 'partake in a joint operations military strike, fighting from base to base in a set of vehicles in the dusty dunes of Iraq, working your way up from soldier grunt to squad leader to divisional commander.' Those are massive undertakings that rarely win back their investment, and they're beginning to all sound alike.
--- Justin Dugger [email protected] wrote:
I just don't understand an industry whose approach is remarkably similar to requiring people to upgrade their cars just to be able to play a new music CD in their car CD player.
Oh, I very much agree that PC gaming is in a serious decline. The Battlefield 2 syndrome (you could probably attribute that to a more popular game, if I could figure out which one) is very detrimental to sales. So much so that when Microsoft presented the Xbox as a PC-developer-friendly platform, many left and never looked back, and others just decided to half-ass the PC platform for a fistful of dollars more. Thief 3 would be an excellent example of such shenanigans.
Yes, I'd have to agree that Gaming Boxes have really killed the PC Game market in a serious way. It seems like the game developers now tell us to either buy a Game Box or make our PCs into the equivalent of Game Boxes.
I miss the good old days, when the PC Platform was the middle ground: a game could be played on only ONE type of Game Box, but also on the PC. Or that you could expect a game which was only available on ONE Game Box to eventually make it onto the PC Platform (such as the Final Fantasy series).
But I've not seen many open source games of high quality, even by the standards of five years ago. Most of the ones that are good come from a single guy working hard to clone a game he liked (crack-attack, Wesnoth, Armagetron). Partly, game authors on the PC need to start looking towards smaller, simpler games than the massive 'partake in a joint operations military strike, fighting from base to base in a set of vehicles in the dusty dunes of Iraq, working your way up from soldier grunt to squad leader to divisional commander.' Those are massive undertakings that rarely win back their investment, and they're beginning to all sound alike.
PopCap Games and all the PopCap Clone Companies that have sprung up are doing just that, creating smaller, simpler games and using that old-time system of crippled (but not too much) shareware to sell their product.
They aren't open source (though I've seen a freeware OSS clone of PopCap's original Bejeweled, called "Jools"), but their shareware model seems to be the way to go. Or perhaps a Transgaming Cedega model to make money: source code is free, binary builds require a subscription fee to download.
As far as I know, there is no such thing as the Cedega model in the wild.
jldugger
On 8/10/05, Leo Mauler [email protected] wrote:
--- Justin Dugger [email protected] wrote:
I just don't understand an industry whose approach is remarkably similar to requiring people to upgrade their cars just to be able to play a new music CD in their car CD player.
Oh, I very much agree that PC gaming is in a serious decline. The Battlefield 2 syndrome (you could probably attribute that to a more popular game, if I could figure out which one) is very detrimental to sales. So much so that when Microsoft presented the Xbox as a PC-developer-friendly platform, many left and never looked back, and others just decided to half-ass the PC platform for a fistful of dollars more. Thief 3 would be an excellent example of such shenanigans.
Yes, I'd have to agree that Gaming Boxes have really killed the PC Game market in a serious way. It seems like the game developers now tell us to either buy a Game Box or make our PCs into the equivalent of Game Boxes.
I miss the good old days, when the PC Platform was the middle ground: a game could be played on only ONE type of Game Box, but also on the PC. Or that you could expect a game which was only available on ONE Game Box to eventually make it onto the PC Platform (such as the Final Fantasy series).
But I've not seen many open source games of high quality, even by the standards of five years ago. Most of the ones that are good come from a single guy working hard to clone a game he liked (crack-attack, Wesnoth, Armagetron). Partly, game authors on the PC need to start looking towards smaller, simpler games than the massive 'partake in a joint operations military strike, fighting from base to base in a set of vehicles in the dusty dunes of Iraq, working your way up from soldier grunt to squad leader to divisional commander.' Those are massive undertakings that rarely win back their investment, and they're beginning to all sound alike.
PopCap Games and all the PopCap Clone Companies that have sprung up are doing just that, creating smaller, simpler games and using that old-time system of crippled (but not too much) shareware to sell their product.
They aren't open source (though I've seen a freeware OSS clone of PopCap's original Bejeweled, called "Jools"), but their shareware model seems to be the way to go. Or perhaps a Transgaming Cedega model to make money: source code is free, binary builds require a subscription fee to download.
Sorry for missing the bulk of the conversation I apparently triggered, but I just got finished with a ten-hour work session cleaning computer labs. It's the one week a semester we actually do something, and the list brings up something I care about. So, onwards to the reply.
So is the linux kernel playing catch up with some other kernel/os? I thought that Linux was pretty much a frontrunner.
The Linux kernel is a front runner, but the top is crowded. Linux is best likened to an experimental kernel, willing to forsake stability in many cases, and generally not subjected to strong, intensive testing (some distros may do such work, however). The SPARC port is generally considered faster on the same Sun hardware because the Sun kernel does a lot of asinine checks and redundancy. At least that's what I got out of the flamewar between the SPARC maintainer and a Solaris engineer. Anyway, the Linux kernel is one of those things I mentioned earlier that gets 100+ developers looking at it. If you had that many people working on a regular kernel endeavor, I don't think any management team on the planet could handle it without resorting to the same system in place now. Half of what makes OSS work is the internet-centric systems they utilize.
Also, I've noticed that certain GUI aspects of certain OSS window managers were finding their way into other places like the Windows desktop.
As best I can tell, you're referring to virtual desktops and pagers. This would be an example of the imitation rule being a two-way street. It's easier to copy than to innovate. Even so, virtual desktops are a kludge; I rarely use more than one. I would rather see something akin to Exposé, which uses computing power to manage and display these things effectively. As a tie-in to the major theme of the thread, rendering windows to images via GL will require some advanced 3D hardware, although probably not anything more than you already own.
Also, if games need to stop supporting such big and powerful cards as Radeon 8500s and 9700s then you might want to take a look at what is wrong with the game software and not the hardware or OSS drivers.
That is a terribly convenient viewpoint, but you might want to examine where it puts you. You're against more video RAM, which means we may never see expansive landscapes of high quality without a draw distance and fog. I agree that Battlefield 2 is overly aggressive, and might even suffer from poor coding, but there are still other games that don't run at a reasonable framerate on today's affordable video cards. You've effectively taken a stance against progress. Also, I'd love to know why you think Carmack's Doom 3 engine is sloppy code. If anything, I'd argue that the games that STARTED this thread might be the "sloppy code:"
System Requirements:
- 200 MB of hard drive space
- High Quality (realtime lights and shadows on, bloom on, high detailed maps, 1024x768 or higher res)
- A 1.5-gigahertz Intel Pentium 4 chip or AMD Athlon 1500
- 9600 ATi or 5700 FX
- 256 MB of RAM
Given what their "High Quality" results look like, you might be easily persuaded to that conclusion. (The truth is, the models and level design are 90 percent of how a level looks; the other 10 percent is taking the latest tech and helping the artists manipulate it appropriately.)
Seriously though, most of the "3rd party" OSS writers are professional coders.
If I were a lawyer, I don't think I'd concede that point. I'll allow it, if only so I can make up statistics too.
I would say they are at least as capable of writing drivers as the manufacturers, and maybe a bit better, since they don't have to deal with a corporate boss telling them it needs to ship yesterday.
And yet the mantra of Open Source code is "release early, release often." At least for the successful projects. I have yet to figure out a cute one-liner to summarize RMS's software. The best I have so far is "brag early, release eventually." But I'm being overly bitter, probably because I'm up past my bedtime.
I guess I'm misunderstanding something here. If these professional developers are capable of writing such software, then surely by their professional nature, they could lend their services to the manufacturers, much in the same way one dude wound up writing the Atheros driver I'm currently using (http://cvs.sourceforge.net/viewcvs.py/madwifi/madwifi/README?view=markup). Another guy got a Google Summer of Code project to work on writing an open source version of that driver's hardware access layer, a binary only piece of code that limits, in part, which frequencies can be transmitted on, and how much power can be used to accomplish the task.
Also, when I say a few dozen contributors, I mean people submitting patches, not just core developers who actively work on features they think people want and generally move things forward. And even the core contributors may be putting in only a few hours a week towards the task, especially when their real job starts getting in the way. I'd wager plenty of one- or two-man projects stagnate after a couple weeks of crunch time at a real job pre-empts personal projects. I know it happened to gnocatan. Games in particular seem to suffer from half-done syndrome. I guess we gamers are a lazy breed =). This is another time when the Open Source system comes in handy: someone can easily fork a stagnant project and begin progress anew.
so an OSS project with a few dozen coders is likely to out-perform *any* commercial vendor.
A few dozen coders is already a step above the average beginning project. A few dozen core coders brings you into the realm of the top twenty most active OSS projects. Browsing CVS commits, Inkscape, a fairly popular and significant undertaking to create something similar to Adobe Illustrator, has about five core coders. Judging from the numbers in the Inkscape project status report (http://www.inkscape.org/status/status_20050801.php), I'd say that's about accurate. In the past two months they've collectively added about ten thousand lines of code. I'm not a graphic artist by nature, but it sounds like you might be qualified to examine the differences in quality between Inkscape and Illustrator.
Any theory on why OSS can succeed needs to account for the fact that the biggest successes are (or started life as) mere imitators. A Photoshop clone. A Quake clone. A Microsoft Office clone. A Netscape clone. A UNIX clone. My best guess is that people with truly innovative ideas don't often feel that the GPL (or similar licenses) provides adequate reward, or that the Open Source volunteers in the community can adequately improve something built on even more unheard-of ideas (that improvement being itself a reward of the GPL). But I'm hardly a qualified economist.
On that note expect the SuSE distro to get more and more sophisticated and have better and better support for hardware. If I am reading the currents right, SuSE is heading in the direction to get more and more big industry names to port code to SuSE. I may be switching back to SuSE someday soon.
The beauty of the GPL is that it's very difficult for SuSE to maintain a strong advantage in hardware support. But I get the feeling you might be referring to Adobe. This is where I own up to one of the big disadvantages of using closed code. By taking the easy way out with the nvidia drivers and the half-closed madwifi driver, I'm taking a lot of the steam out of open source alternatives. It's a difficult decision, but the situation appears to be solving the Atheros problem on its own, and I don't think nvidia cards will ever see worthwhile open source 3d drivers. As far as I know, nobody's even tried to start such a project, which is partly why I'm amazed the ati open source 3d drivers fare so well. I've heard on Slashdot that ati might have contributed that code, which would explain much, but believing it means attributing authority to Slashdot comments. Is this correct, or is the ati driver situation even more amazing than I realize?
Anyways, I've probably written enough to persuade all involved to simply ignore the thread, when the simple goal of the thread was to discuss 3d games. I think everyone here should give crack-attack a shot, even if all they've got is a TNT2 (it ran fine on mine last year). Even if you suffer from motion sickness, this game shouldn't affect you.
The End
Justin Dugger
--- Justin Dugger [email protected] wrote:
Anyways, I've probably written enough to persuade all involved to simply ignore the thread, when the simple goal of the thread was to discuss 3d games. I think everyone here should give crack-attack a shot, even if all they've got is a TNT2 (it ran fine on mine last year). Even if you suffer from motion sickness, this game shouldn't affect you.
I play crack-attack on a dual PII-450Mhz with a RADEON 7500 with 32MB RAM. The GPU is just fine but the CPU prevents the game from being played in its primary format. So here's another reason why crack-attack is good: crack-attack downgrades gracefully with the "--low" option. This makes the blocks a little less 3D but improves game performance on lower-end machines.
On 8/10/05, Leo Mauler [email protected] wrote:
I play crack-attack on a dual PII-450Mhz with a RADEON 7500 with 32MB RAM. The GPU is just fine but the CPU prevents the game from being played in its primary format. So here's another reason why crack-attack is good: crack-attack downgrades gracefully with the "--low" option. This makes the blocks a little less 3D but improves game performance on lower-end machines.
On that note, I have a question. I'm not a gamer, and I haven't played any FPS except Wolf3D, Quake, and Halo one time, so I have next-to-no experience configuring newer games. Have the developers ever thought of putting a frame rate throttle on the engines? Really, if you are cranking out frames faster than you can see them, why not throttle the system back and use the CPU power to handle better effects? Setting a max of 30-60 FPS on a system capable of 120+ would give a lot of extra horsepower back to the system to make all the goodies that much better.
Jon Pruente wrote:
On that note, I have a question. I'm not a gamer, and I haven't played any FPS except Wolf3D, Quake, and Halo one time, so I have next-to-no experience configuring newer games. Have the developers ever thought of putting a frame rate throttle on the engines? Really, if you are cranking out frames faster than you can see them, why not throttle the system back and use the CPU power to handle better effects? Setting a max of 30-60 FPS on a system capable of 120+ would give a lot of extra horsepower back to the system to make all the goodies that much better.
Most of the serious FPS gamers turn off all but essential graphics to get a faster framerate. The faster it is, the smoother and more accurate you can be.
Chris
Does having a frame rate faster than you can see really help the gameplay? I'd be inclined to think that by freeing up some CPU time the system would be more responsive to input and thus feel faster. How different is 100fps vs. 60fps? I've read the fps speed claims before, but not having personal experience, I dunno if it is really any better if your brain can't comprehend all those extra frames. I've been thinking lately that gamers are approaching the drag race effect: They get going so fast that they go by muscle memory and luck more than critical evaluation. I know there is some strategy to the actual game, but how many times do I watch people play and it's point the gun and spray a dozen shots instead of squeeze off one or two.
On 8/13/05, Chris Bier [email protected] wrote:
Most of the serious FPS gamers turn off all but essential graphics to get a faster framerate. The faster it is, the smoother and more accurate you can be.
Chris
A couple of notes here regarding FPS. A higher average FPS usually means a higher minimum FPS. But most people would far rather control for themselves the options that trade quality for render speed. As an example, in Battlefield 2 (BF2), one can set a draw distance. Lowering it cuts down the number of polys the engine needs to handle, but it also cuts visibility. The difference between 60 and 100 FPS is generally the difference between waiting for vblank or not. The extra oomph increases slack time, so you miss fewer deadlines to draw to the screen. Otherwise you get "tearing", where the upper half-ish region of the screen is drawn from one point in time and the rest of the screen is drawn from a newer one. This is easily noticeable when you look left or right, or up or down.
I shouldn't need to mention that the difference between 30 and 60 is great. I get BF2 to run at about 43 FPS average and it's noticeable, especially during dust-particle-intensive artillery strikes. But some games do cap the performance at 60. I think the original Half-Life did so with a variable called maxfps.
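A cap like that is simple enough to sketch. This is just a toy illustration of the idea (nothing engine-specific; the names `run_loop` and `max_fps` are made up): measure how long the frame's work took, then sleep away whatever is left of the frame's time budget.

```python
import time

def run_loop(max_fps, frames):
    """Toy game loop: do a frame's work, then sleep away the leftover budget."""
    frame_budget = 1.0 / max_fps   # seconds available per frame
    rendered = 0
    for _ in range(frames):
        start = time.monotonic()
        # ... simulate and render the frame here ...
        rendered += 1
        elapsed = time.monotonic() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)  # hand the CPU back instead of spinning
    return rendered
```

The sleep is the point: a busy-wait would hit the cap too, but sleeping is what actually frees the CPU for other work, which was the original question.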
Developers have attempted a few scaling mechanisms, but it generally doesn't work out well. I can't remember the game they were considering it for, but one of the id Software developers (or former id developers) was considering a dynamic poly culling system that would reduce the number of polys in a model based on performance and size on screen. I don't think it went anywhere.
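I don't know how that system actually worked, but the common, static version of the idea (level-of-detail selection) is easy to sketch: keep several versions of each model and pick a coarser one the farther away it is. Everything here is hypothetical, just to show the shape of it:

```python
def pick_lod(distance, cutoffs):
    """Pick a level of detail: index 0 is the full model, higher is coarser.

    cutoffs is an ascending list of switch-over distances, e.g. [10, 30, 80]
    meaning: full model under 10 units, next model under 30, and so on.
    """
    for level, cutoff in enumerate(cutoffs):
        if distance < cutoff:
            return level
    return len(cutoffs)  # beyond the last cutoff: coarsest model
```

The dynamic variant would presumably also feed the current frame rate into the decision, which is where it gets hairy: the model you're aiming at can suddenly drop polys because something else on screen got expensive.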
Justin Dugger
On 8/14/05, Jon Pruente [email protected] wrote:
Does having a frame rate faster than you can see really help the gameplay? I'd be inclined to think that by freeing up some CPU time the system would be more responsive to input and thus feel faster. How different is 100fps vs. 60fps? I've read the fps speed claims before, but not having personal experience, I dunno if it is really any better if your brain can't comprehend all those extra frames. I've been thinking lately that gamers are approaching the drag race effect: They get going so fast that they go by muscle memory and luck more than critical evaluation. I know there is some strategy to the actual game, but how many times do I watch people play and it's point the gun and spray a dozen shots instead of squeeze off one or two.