<article>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#article10_03_08_1636259</id>
	<title>Game Devs Only Use PhysX For the Money, Says AMD</title>
	<author>Soulskill</author>
	<datestamp>1268072100000</datestamp>
	<htmltext>arcticstoat writes <i>"AMD has just aimed a shot at Nvidia's PhysX technology, saying that most game developers <a href="http://www.thinq.co.uk/news/2010/3/8/amd-game-devs-only-use-physx-for-the-cash/">only implement GPU-accelerated PhysX for the money</a>. AMD's Richard Huddy explained that 'Nvidia creates a marketing deal with a title, and then as part of that marketing deal, they have the right to go in and implement PhysX in the game.'  However, he adds that 'the problem with that is obviously that the game developer doesn't actually want it. They're not doing it because they want it; they're doing it because they're paid to do it. So we have a rather artificial situation at the moment where you see PhysX in games, but it isn't because the game developer wants it in there.'  AMD is pushing open standards such as OpenCL and DirectCompute as alternatives to PhysX, as these APIs can run on both AMD and Nvidia GPUs. AMD also announced today that it will be <a href="http://www.thinq.co.uk/features/2010/3/8/amd-gives-away-gpu-physics-tools/">giving away free versions of Pixelux's DMM2 physics engine</a>, which now includes Bullet Physics, to some game developers."</i></htmltext>
<tokentext>arcticstoat writes " AMD has just aimed a shot at Nvidia 's PhysX technology , saying that most game developers only implement GPU-accelerated PhysX for the money .
AMD 's Richard Huddy explained that 'Nvidia creates a marketing deal with a title , and then as part of that marketing deal , they have the right to go in and implement PhysX in the game .
' However , he adds that 'the problem with that is obviously that the game developer does n't actually want it .
They 're not doing it because they want it ; they 're doing it because they 're paid to do it .
So we have a rather artificial situation at the moment where you see PhysX in games , but it is n't because the game developer wants it in there .
' AMD is pushing open standards such as OpenCL and DirectCompute as alternatives to PhysX , as these APIs can run on both AMD and Nvidia GPUs .
AMD also announced today that it will be giving away free versions of Pixelux 's DMM2 physics engine , which now includes Bullet Physics , to some game developers .
"</tokentext>
<sentencetext>arcticstoat writes "AMD has just aimed a shot at Nvidia's PhysX technology, saying that most game developers only implement GPU-accelerated PhysX for the money.
AMD's Richard Huddy explained that 'Nvidia creates a marketing deal with a title, and then as part of that marketing deal, they have the right to go in and implement PhysX in the game.
'  However, he adds that 'the problem with that is obviously that the game developer doesn't actually want it.
They're not doing it because they want it; they're doing it because they're paid to do it.
So we have a rather artificial situation at the moment where you see PhysX in games, but it isn't because the game developer wants it in there.
'  AMD is pushing open standards such as OpenCL and DirectCompute as alternatives to PhysX, as these APIs can run on both AMD and Nvidia GPUs.
AMD also announced today that it will be giving away free versions of Pixelux's DMM2 physics engine, which now includes Bullet Physics, to some game developers.
"</sentencetext>
</article>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31411454</id>
	<title>Re:Maybe</title>
	<author>Anonymous</author>
	<datestamp>1268133600000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
<htmltext><p>No....not 66\% of the PC gaming market.  66\% of the <b>Steam</b> market.  I, for one, refuse to install Steam on my computer.  If the game needs Steam, I don't need the game.  Having a downloadable version of the game is great and I'm glad that the option is available to me should I need it, but it's not the option I choose (at least not in the form Steam games are presented; feel like giving me an ISO and I have no problem buying your downloadable game).  My point being that perhaps the type of person who purchases an NVidia card (or an OEM computer with an NVidia card) is the same type of person who would get their games from Steam.  When you can point me to a study from an independent group showing statistics (and making the raw data and study methodology available, anything less is just posturing), then we'll talk.  Until then, your statistics are about as useful as my metric that shows that 100\% of the direct parents of this post are morons, NVidia owners, and put way too much faith in statistics.</p></htmltext>
<tokentext>No....not 66 \ % of the PC gaming market .
66 \ % of the Steam market .
I , for one , refuse to install Steam on my computer .
If the game needs Steam , I do n't need the game .
Having a downloadable version of the game is great and I 'm glad that the option is available to me should I need it , but it 's not the option I choose ( at least not in the form Steam games are presented ; feel like giving me an ISO and I have no problem buying your downloadable game ) .
My point being that perhaps the type of person who purchases an NVidia card ( or an OEM computer with an NVidia card ) is the same type of person who would get their games from Steam .
When you can point me to a study from an independent group showing statistics ( and making the raw data and study methodology available , anything less is just posturing ) , then we 'll talk .
Until then , your statistics are about as useful as my metric that shows that 100 \ % of the direct parents of this post are morons , NVidia owners , and put way too much faith in statistics .</tokentext>
<sentencetext>No....not 66\% of the PC gaming market.
66\% of the Steam market.
I, for one, refuse to install Steam on my computer.
If the game needs Steam, I don't need the game.
Having a downloadable version of the game is great and I'm glad that the option is available to me should I need it, but it's not the option I choose (at least not in the form Steam games are presented; feel like giving me an ISO and I have no problem buying your downloadable game).
My point being that perhaps the type of person who purchases an NVidia card (or an OEM computer with an NVidia card) is the same type of person who would get their games from Steam.
When you can point me to a study from an independent group showing statistics (and making the raw data and study methodology available, anything less is just posturing), then we'll talk.
Until then, your statistics are about as useful as my metric that shows that 100\% of the direct parents of this post are morons, NVidia owners, and put way too much faith in statistics.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31407414</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403054</id>
	<title>Is anyone</title>
	<author>Anonymous</author>
	<datestamp>1268076000000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>surprised?</p></htmltext>
<tokentext>surprised ?</tokentext>
<sentencetext>surprised?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403502</id>
	<title>Re:They wish they'd thought of it first</title>
	<author>BatGnat</author>
	<datestamp>1268077920000</datestamp>
<modclass>Interesting</modclass>
	<modscore>2</modscore>
<htmltext>Just because some companies use PhysX for pretty effects only, does not mean that someone else won't come along and use it for something cool that <i>will</i> add something to gameplay...</htmltext>
<tokentext>Just because some companies use PhysX for pretty effects only , does not mean that someone else wo n't come along and use it for something cool that will add something to gameplay.. .</tokentext>
<sentencetext>Just because some companies use PhysX for pretty effects only, does not mean that someone else won't come along and use it for something cool that will add something to gameplay...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403172</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31411294</id>
	<title>Re:Maybe</title>
	<author>Anonymous</author>
	<datestamp>1268130600000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
<htmltext><p>You forgot one important point - bullet physics is shit.  One of Erwin Coumans' benchmarks is a pyramid of blocks which PhysX can actually simulate - the blocks fall then stay in place.  In bullet physics they slide off each other, totally contrary to the laws of physics.  Havok is moderately better, but the friction models in both systems are just wrong.  Also, PhysX handles up to 128 contact points between each object while Havok and bullet manage only 4.  Just stacking a pair of cubes needs 8 contact points in the general case, so you can see the problem.</p><p>Bullet was written for speed, for simple physical reactions (not real "physics" as such), but since upping the ante, PhysX have gone and optimized their solution a great deal more, and it's now as fast as Havok and almost as fast as Bullet for physical reaction stuff, while being accurate and stable for more complex uses.  (This is on the CPU and Cell SPU; you're saying PhysX is faster anyway on GPU, which is very interesting, although not many triple-A games have spare GPU to spend on physics.)</p><p>I don't know what AMD are bringing to the table.  Bullet physics is free anyway, isn't it?  As is PhysX, if you only want libs and not full source code.</p></htmltext>
<tokentext>You forgot one important point - bullet physics is shit .
One of Erwin Coumans ' benchmarks is a pyramid of blocks which PhysX can actually simulate - the blocks fall then stay in place .
In bullet physics they slide off each other , totally contrary to the laws of physics .
Havok is moderately better , but the friction models in both systems are just wrong .
Also , PhysX handles up to 128 contact points between each object while Havok and bullet manage only 4 .
Just stacking a pair of cubes needs 8 contact points in the general case , so you can see the problem .
Bullet was written for speed , for simple physical reactions ( not real " physics " as such ) , but since upping the ante , PhysX have gone and optimized their solution a great deal more , and it 's now as fast as Havok and almost as fast as Bullet for physical reaction stuff , while being accurate and stable for more complex uses .
( This is on the CPU and Cell SPU ; you 're saying PhysX is faster anyway on GPU , which is very interesting , although not many triple-A games have spare GPU to spend on physics . )
I do n't know what AMD are bringing to the table .
Bullet physics is free anyway , is n't it ?
As is PhysX , if you only want libs and not full source code .</tokentext>
<sentencetext>You forgot one important point - bullet physics is shit.
One of Erwin Coumans' benchmarks is a pyramid of blocks which PhysX can actually simulate - the blocks fall then stay in place.
In bullet physics they slide off each other, totally contrary to the laws of physics.
Havok is moderately better, but the friction models in both systems are just wrong.
Also, PhysX handles up to 128 contact points between each object while Havok and bullet manage only 4.
Just stacking a pair of cubes needs 8 contact points in the general case, so you can see the problem.
Bullet was written for speed, for simple physical reactions (not real "physics" as such), but since upping the ante, PhysX have gone and optimized their solution a great deal more, and it's now as fast as Havok and almost as fast as Bullet for physical reaction stuff, while being accurate and stable for more complex uses.
(This is on the CPU and Cell SPU; you're saying PhysX is faster anyway on GPU, which is very interesting, although not many triple-A games have spare GPU to spend on physics.)
I don't know what AMD are bringing to the table.
Bullet physics is free anyway, isn't it?
As is PhysX, if you only want libs and not full source code.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403946</parent>
</comment>
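One number in the comment above is easy to pin down against public source: Bullet's btPersistentManifold caches at most four contact points per colliding pair, so a settled stack of two cubes resting on the ground (two pair interfaces) carries eight points in total. A minimal sketch against the Bullet 2.x C++ API; the scene and constants are illustrative, not taken from any benchmark:

#include <btBulletDynamicsCommon.h>
#include <cstdio>

// Minimal Bullet 2.x scene: a static ground plane with two 1 m cubes stacked
// on it. Each colliding pair is represented by one btPersistentManifold that
// caches at most 4 contact points, so the two interfaces (cube-on-cube,
// cube-on-ground) yield 8 points once the stack settles.
int main() {
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -10, 0));

    // Static ground plane (mass 0 = static body).
    btStaticPlaneShape groundShape(btVector3(0, 1, 0), 0);
    btDefaultMotionState groundState(btTransform(btQuaternion::getIdentity(), btVector3(0, 0, 0)));
    btRigidBody ground(btRigidBody::btRigidBodyConstructionInfo(0, &groundState, &groundShape));
    world.addRigidBody(&ground);

    // Two dynamic 1 m cubes, one directly on top of the other.
    btBoxShape boxShape(btVector3(0.5, 0.5, 0.5));
    btVector3 inertia;
    boxShape.calculateLocalInertia(1.0, inertia);

    btDefaultMotionState lowerState(btTransform(btQuaternion::getIdentity(), btVector3(0, 0.5, 0)));
    btRigidBody lower(btRigidBody::btRigidBodyConstructionInfo(1.0, &lowerState, &boxShape, inertia));
    world.addRigidBody(&lower);

    btDefaultMotionState upperState(btTransform(btQuaternion::getIdentity(), btVector3(0, 1.5, 0)));
    btRigidBody upper(btRigidBody::btRigidBodyConstructionInfo(1.0, &upperState, &boxShape, inertia));
    world.addRigidBody(&upper);

    for (int i = 0; i < 120; ++i)        // let the stack settle for ~2 s
        world.stepSimulation(1.0f / 60.0f);

    int total = 0;
    for (int i = 0; i < dispatcher.getNumManifolds(); ++i) {
        btPersistentManifold* m = dispatcher.getManifoldByIndexInternal(i);
        printf("pair %d: %d contact points\n", i, m->getNumContacts());
        total += m->getNumContacts();
    }
    printf("total: %d\n", total);        // expect 8 for a settled stack
    return 0;
}

The 128-point figure for PhysX is the commenter's claim and isn't checked here; only the Bullet 4-points-per-manifold cap is verifiable from its headers.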
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403296</id>
	<title>Re:What does PhysX do anyways?</title>
	<author>Pojut</author>
	<datestamp>1268076900000</datestamp>
<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext><p>Ask, and ye shall receive: <a href="http://en.wikipedia.org/wiki/PhysX#PPU" title="wikipedia.org">http://en.wikipedia.org/wiki/PhysX#PPU</a> [wikipedia.org]</p></htmltext>
<tokentext>Ask , and ye shall receive : http : //en.wikipedia.org/wiki/PhysX # PPU [ wikipedia.org ]</tokentext>
<sentencetext>Ask, and ye shall receive: http://en.wikipedia.org/wiki/PhysX#PPU [wikipedia.org]</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403156</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31405456</id>
	<title>For the money?</title>
	<author>Darinbob</author>
	<datestamp>1268043420000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Um, don't game developers only develop \_games\_ for the money?</htmltext>
<tokentext>Um , do n't game developers only develop \ _games \ _ for the money ?</tokentext>
<sentencetext>Um, don't game developers only develop \_games\_ for the money?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404042</id>
	<title>Re:clutching at straws</title>
	<author>LWATCDR</author>
	<datestamp>1268080380000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>"GPGPU is useless except in scientific computing: we already have more x86 cores than the devs know how to use, let alone use a different computing paradigm"<br>Well maybe for games but GPGPU will mean a lot for transcoding.<br>Home HD video is going to be big soon and it takes forever to transcode. However you can do that with an ARM. The TegaII and the OMAP line have enough GPU power to use it for transcoding.</p></htmltext>
<tokenext>" GPGPU is useless except in scientific computing : we already have more x86 cores than the devs know how to use , let alone use a different computing paradigm " Well maybe for games but GPGPU will mean a lot for transcoding.Home HD video is going to be big soon and it takes forever to transcode .
However you can do that with an ARM .
The TegaII and the OMAP line have enough GPU power to use it for transcoding .</tokentext>
<sentencetext>"GPGPU is useless except in scientific computing: we already have more x86 cores than the devs know how to use, let alone use a different computing paradigm"Well maybe for games but GPGPU will mean a lot for transcoding.Home HD video is going to be big soon and it takes forever to transcode.
However you can do that with an ARM.
The TegaII and the OMAP line have enough GPU power to use it for transcoding.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403488</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403156</id>
	<title>What does PhysX do anyways?</title>
	<author>Anonymous</author>
	<datestamp>1268076420000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I've never figured out what PhysX is supposed to do. More realistic physics I suppose? Well I can't say I've ever noticed any difference between a game that uses it and a game that doesn't. So, what, the corpses flop differently?</p></htmltext>
<tokentext>I 've never figured out what PhysX is supposed to do .
More realistic physics I suppose ?
Well I ca n't say I 've ever noticed any difference between a game that uses it and a game that does n't .
So , what , the corpses flop differently ?</tokentext>
<sentencetext>I've never figured out what PhysX is supposed to do.
More realistic physics I suppose?
Well I can't say I've ever noticed any difference between a game that uses it and a game that doesn't.
So, what, the corpses flop differently?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406250</id>
	<title>Re:It's a new riff on the old joke</title>
	<author>Endo13</author>
	<datestamp>1268046420000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I've seen it before. As a result, I read over it so quickly that not only did I get the gist of it, I didn't even notice he'd butchered it until I read his next post. I'm not sure which is worse, his typing or my reading.</p></htmltext>
<tokentext>I 've seen it before .
As a result , I read over it so quickly that not only did I get the gist of it , I did n't even notice he 'd butchered it until I read his next post .
I 'm not sure which is worse , his typing or my reading .</tokentext>
<sentencetext>I've seen it before.
As a result, I read over it so quickly that not only did I get the gist of it, I didn't even notice he'd butchered it until I read his next post.
I'm not sure which is worse, his typing or my reading.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31405300</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31412222</id>
	<title>Re:clutching at straws</title>
	<author>Jarik\_Tentsu</author>
	<datestamp>1268142840000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>It has nothing to do with what is 'enough'.</p><p>People always like what is out there. A $20,000 family car is more than enough to transport you from point A to point B. Yet hundreds of thousands of people buy cars worth over $150k boasting amazing lap times around Nürburgring, fast 0-100 time and all that...and the funny thing is, probably a tiny, tiny, tiny portion actually race their cars. Most just drive around normally.</p><p>Marketing means that when people go to buy something, the mere existence of something better lures them to it. So while your 5 year old graphics card renders your web pages and email fine, when you go to buy a new graphics card, customers would be drawn away from the dull, cheap rigs to the more expensive ones. Then they'll remember some lagginess, some slowness in certain instances and justify the expense.</p><p>Hell, I'm a prime example of this. Unlike my mates, I rarely game. I used to video edit a lot, but not really anymore. But I spent huge money on my current computer, buying massive heatsinks, overclocking it to the max, buying top end parts, etc. Certainly didn't 'need' it. But I did 'want' it.</p><p>And as long as people and society are like this - GPUs are not in a bind. They just have to keep releasing something better and people will want it.</p></htmltext>
<tokentext>It has nothing to do with what is 'enough ' .
People always like what is out there .
A $ 20,000 family car is more than enough to transport you from point A to point B. Yet hundreds of thousands of people buy cars worth over $ 150k boasting amazing lap times around Nürburgring , fast 0-100 time and all that...and the funny thing is , probably a tiny , tiny , tiny portion actually race their cars .
Most just drive around normally .
Marketing means that when people go to buy something , the mere existence of something better lures them to it .
So while your 5 year old graphics card renders your web pages and email fine , when you go to buy a new graphics card , customers would be drawn away from the dull , cheap rigs to the more expensive ones .
Then they 'll remember some lagginess , some slowness in certain instances and justify the expense .
Hell , I 'm a prime example of this .
Unlike my mates , I rarely game .
I used to video edit a lot , but not really anymore .
But I spent huge money on my current computer , buying massive heatsinks , overclocking it to the max , buying top end parts , etc .
Certainly did n't 'need ' it .
But I did 'want ' it .
And as long as people and society are like this - GPUs are not in a bind .
They just have to keep releasing something better and people will want it .</tokentext>
<sentencetext>It has nothing to do with what is 'enough'.
People always like what is out there.
A $20,000 family car is more than enough to transport you from point A to point B. Yet hundreds of thousands of people buy cars worth over $150k boasting amazing lap times around Nürburgring, fast 0-100 time and all that...and the funny thing is, probably a tiny, tiny, tiny portion actually race their cars.
Most just drive around normally.
Marketing means that when people go to buy something, the mere existence of something better lures them to it.
So while your 5 year old graphics card renders your web pages and email fine, when you go to buy a new graphics card, customers would be drawn away from the dull, cheap rigs to the more expensive ones.
Then they'll remember some lagginess, some slowness in certain instances and justify the expense.
Hell, I'm a prime example of this.
Unlike my mates, I rarely game.
I used to video edit a lot, but not really anymore.
But I spent huge money on my current computer, buying massive heatsinks, overclocking it to the max, buying top end parts, etc.
Certainly didn't 'need' it.
But I did 'want' it.
And as long as people and society are like this - GPUs are not in a bind.
They just have to keep releasing something better and people will want it.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403488</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31409308</id>
	<title>Re:They wish they'd thought of it first</title>
	<author>Hurricane78</author>
	<datestamp>1268063640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>Did you even read the summary to the end? It says so right there, that this is what they will start doing now:</p><div class="quote"><p>AMD also announced today that it will be giving away free versions of Pixelux's DMM2 physics engine, which now includes Bullet Physics, to some game developers.</p></div><p>You were more right with your comment than you could imagine.<nobr> <wbr></nobr>:)</p>
	</htmltext>
<tokentext>Did you even read the summary to the end ?
It says so right there , that this is what they will start doing now : AMD also announced today that it will be giving away free versions of Pixelux 's DMM2 physics engine , which now includes Bullet Physics , to some game developers .
You were more right with your comment than you could imagine .
: )</tokentext>
<sentencetext>Did you even read the summary to the end?
It says so right there, that this is what they will start doing now: AMD also announced today that it will be giving away free versions of Pixelux's DMM2 physics engine, which now includes Bullet Physics, to some game developers.
You were more right with your comment than you could imagine.
:)
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404038</id>
	<title>Best example with the MMORPG UTOPIA</title>
	<author>SharpFang</author>
	<datestamp>1268080380000</datestamp>
<modclass>Interesting</modclass>
	<modscore>4</modscore>
<htmltext><p>A friend told me about his experience with Utopia. It implemented GPU-accelerated physics in one of its recent patches. But try as he might, he failed to notice any difference for weeks of gameplay. Until he entered the central city. With flags by the entrance fluttering smoothly in the wind, instead of the old static animation.</p><p>Yep, that's it. Many megabytes of a patch, a game of hundreds of miles of terrain, hundreds of locations, battles, vehicles, all that stuff... and physics acceleration is used to flutter flags by the entrance.</p></htmltext>
<tokentext>A friend told me about his experience with Utopia .
It implemented GPU-accelerated physics in one of its recent patches .
But try as he might , he failed to notice any difference for weeks of gameplay .
Until he entered the central city .
With flags by the entrance fluttering smoothly in the wind , instead of the old static animation .
Yep , that 's it .
Many megabytes of a patch , a game of hundreds of miles of terrain , hundreds of locations , battles , vehicles , all that stuff... and physics acceleration is used to flutter flags by the entrance .</tokentext>
<sentencetext>A friend told me about his experience with Utopia.
It implemented GPU-accelerated physics in one of its recent patches.
But try as he might, he failed to notice any difference for weeks of gameplay.
Until he entered the central city.
With flags by the entrance fluttering smoothly in the wind, instead of the old static animation.
Yep, that's it.
Many megabytes of a patch, a game of hundreds of miles of terrain, hundreds of locations, battles, vehicles, all that stuff... and physics acceleration is used to flutter flags by the entrance.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403062</id>
	<title>It's a new riff on the old joke</title>
	<author>Anonymous</author>
	<datestamp>1268076060000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>"You're so ugly the only way to get the dog to play with you is to tie a steak around your neck."</p><p>Says the kid the dog without a dog to play with.</p></htmltext>
<tokenext>" You 're so ugly the only way to get the dog to play with you is to tie a steak around your neck .
" Says the kid the dog without a dog to play with .</tokentext>
<sentencetext>"You're so ugly the only way to get the dog to play with you is to tie a steak around your neck.
"Says the kid the dog without a dog to play with.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31405718</id>
	<title>Re:Maybe</title>
	<author>Anonymous</author>
	<datestamp>1268044560000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p><div class="quote"><p>Everyone seems to be glossing over a nice little fact:</p><p>Physx works on -all- modern Windows computers, whether they have a graphics accelerator or not.  So yes, only have the market can use the hardware accelerated Physx, but the other half isn't barred from the game.  They get to play, too.</p></div><p>The issue is that when the PhysX API requires hardware (as in GPU) acceleration, it simply will not work on an ATI/AMD video card.  Only nVidia GeForce 8xxx series and higher cards support PhysX Hardware acceleration.</p><p>Any game that has come out recently (Batman Arkham Asylum comes to mind) that touts PhysX as a feature requires GPU acceleration, and this locks that specific feature to nVidia only users.</p><p>AMD/ATI's approach is more likely to gain support, as OpenCL is, by nature, an open API like OpenGL is.  Makes far more sense for a developer to support that as they can ensure that their features are hitting the widest demographic.</p></div>
	</htmltext>
<tokentext>Everyone seems to be glossing over a nice little fact :
Physx works on -all- modern Windows computers , whether they have a graphics accelerator or not .
So yes , only half the market can use the hardware accelerated Physx , but the other half is n't barred from the game .
They get to play , too .
The issue is that when the PhysX API requires hardware ( as in GPU ) acceleration , it simply will not work on an ATI/AMD video card .
Only nVidia GeForce 8xxx series and higher cards support PhysX Hardware acceleration .
Any game that has come out recently ( Batman Arkham Asylum comes to mind ) that touts PhysX as a feature requires GPU acceleration , and this locks that specific feature to nVidia-only users .
AMD/ATI 's approach is more likely to gain support , as OpenCL is , by nature , an open API like OpenGL is .
Makes far more sense for a developer to support that as they can ensure that their features are hitting the widest demographic .</tokentext>
<sentencetext>Everyone seems to be glossing over a nice little fact:
Physx works on -all- modern Windows computers, whether they have a graphics accelerator or not.
So yes, only half the market can use the hardware accelerated Physx, but the other half isn't barred from the game.
They get to play, too.
The issue is that when the PhysX API requires hardware (as in GPU) acceleration, it simply will not work on an ATI/AMD video card.
Only nVidia GeForce 8xxx series and higher cards support PhysX Hardware acceleration.
Any game that has come out recently (Batman Arkham Asylum comes to mind) that touts PhysX as a feature requires GPU acceleration, and this locks that specific feature to nVidia-only users.
AMD/ATI's approach is more likely to gain support, as OpenCL is, by nature, an open API like OpenGL is.
Makes far more sense for a developer to support that as they can ensure that their features are hitting the widest demographic.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403498</parent>
</comment>
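The vendor-agnostic point in the comment above is visible right at the API level: an OpenCL host program discovers whatever platforms and GPUs are installed, AMD and nVidia alike, through the same entry points. A minimal sketch against the standard OpenCL C API; nothing in it is tied to either vendor's SDK:

#include <CL/cl.h>
#include <cstdio>
#include <vector>

// Enumerate every OpenCL platform on the machine and list its GPU devices.
// The same binary reports a Radeon or a GeForce without recompilation,
// which is the practical difference from a single-vendor path like PhysX.
int main() {
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(0, nullptr, &numPlatforms);
    std::vector<cl_platform_id> platforms(numPlatforms);
    clGetPlatformIDs(numPlatforms, platforms.data(), nullptr);

    for (cl_platform_id p : platforms) {
        char name[256] = {0};
        clGetPlatformInfo(p, CL_PLATFORM_NAME, sizeof(name), name, nullptr);
        printf("Platform: %s\n", name);

        cl_uint numDevices = 0;
        if (clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, 0, nullptr, &numDevices) != CL_SUCCESS)
            continue;  // this platform exposes no GPU devices
        std::vector<cl_device_id> devices(numDevices);
        clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, numDevices, devices.data(), nullptr);

        for (cl_device_id d : devices) {
            char dev[256] = {0};
            clGetDeviceInfo(d, CL_DEVICE_NAME, sizeof(dev), dev, nullptr);
            printf("  GPU: %s\n", dev);
        }
    }
    return 0;
}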
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403766</id>
	<title>Re:clutching at straws</title>
	<author>Ironhandx</author>
	<datestamp>1268079060000</datestamp>
<modclass>Informative</modclass>
	<modscore>5</modscore>
<htmltext><tt>Tell that to AMD who have sold 2 million DirectX 11 GPUs since release. (http://www.dailytech.com/ATI+Sells+Over+2+Million+DirectX+11+GPUs+Celebrates+With+Radeon+Cake/article17349.htm)<br><br>IGP are sufficient for 90\% of users... but that hasn't changed since back in the Pentium 1 days. Many PCs were equipped with IGP or something that amounted to the same thing but in card form even then.<br><br>Also: GPGPU is NOT meant for gfx processing on the fly at all, so it has absolutely nothing to do with devs having to target the lowest common denominator. You even state that it's useless except for scientific purposes in your own comment. The entire purpose of the GPGPU move is towards scientific purposes where vast quantities of repeated calcs have to be done. Something that GPUs excel at.<br><br>At least get SOME of your facts straight before spouting FUD.</tt></htmltext>
<tokentext>Tell that to AMD who have sold 2 million DirectX 11 GPUs since release .
( http : //www.dailytech.com/ATI + Sells + Over + 2 + Million + DirectX + 11 + GPUs + Celebrates + With + Radeon + Cake/article17349.htm )
IGP are sufficient for 90 \ % of users... but that has n't changed since back in the Pentium 1 days .
Many PCs were equipped with IGP or something that amounted to the same thing but in card form even then .
Also : GPGPU is NOT meant for gfx processing on the fly at all , so it has absolutely nothing to do with devs having to target the lowest common denominator .
You even state that it 's useless except for scientific purposes in your own comment .
The entire purpose of the GPGPU move is towards scientific purposes where vast quantities of repeated calcs have to be done .
Something that GPUs excel at .
At least get SOME of your facts straight before spouting FUD .</tokentext>
<sentencetext>Tell that to AMD who have sold 2 million DirectX 11 GPUs since release.
(http://www.dailytech.com/ATI+Sells+Over+2+Million+DirectX+11+GPUs+Celebrates+With+Radeon+Cake/article17349.htm)
IGP are sufficient for 90\% of users... but that hasn't changed since back in the Pentium 1 days.
Many PCs were equipped with IGP or something that amounted to the same thing but in card form even then.
Also: GPGPU is NOT meant for gfx processing on the fly at all, so it has absolutely nothing to do with devs having to target the lowest common denominator.
You even state that it's useless except for scientific purposes in your own comment.
The entire purpose of the GPGPU move is towards scientific purposes where vast quantities of repeated calcs have to be done.
Something that GPUs excel at.
At least get SOME of your facts straight before spouting FUD.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403488</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31407908</id>
	<title>Re:Maybe</title>
	<author>JernejL</author>
	<datestamp>1268053500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext>Except that PAL (the Physics Abstraction Layer) contains flawed and buggy support for half the physics engines it supports.

And even those bindings only support ancient versions; the PAL benchmarks lean pretty heavily towards Bullet, and you can see for yourself which engine's support is given the most work there.

If you can't see that PAL favors Bullet on purpose, you are blind.</htmltext>
<tokentext>Except that PAL ( the Physics Abstraction Layer ) contains flawed and buggy support for half the physics engines it supports .
And even those bindings only support ancient versions ; the PAL benchmarks lean pretty heavily towards Bullet , and you can see for yourself which engine 's support is given the most work there .
If you ca n't see that PAL favors Bullet on purpose , you are blind .</tokentext>
<sentencetext>Except that PAL (the Physics Abstraction Layer) contains flawed and buggy support for half the physics engines it supports.
And even those bindings only support ancient versions; the PAL benchmarks lean pretty heavily towards Bullet, and you can see for yourself which engine's support is given the most work there.
If you can't see that PAL favors Bullet on purpose, you are blind.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403946</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403178</id>
	<title>Re:They wish they'd thought of it first</title>
	<author>Anonymous</author>
	<datestamp>1268076480000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Great, so we'll have two graphics card manufacturers bribing games developers with money to implement visual features that only work on their graphics cards. Although the cynical part of me thinks 'they might not be able to afford it', the more benevolent part thinks 'because the concept is absolutely not in the best interest of customers'. In an economic analysis, at the very least, the costs spent by NVIDIA and ATi will be absorbed through their graphics card prices (so consumers pay in any case), with a net productivity loss from the developers implementing graphics features twice that could have been based on a single model.</p></htmltext>
<tokentext>Great , so we 'll have two graphics card manufacturers bribing games developers with money to implement visual features that only work on their graphics cards .
Although the cynical part of me thinks 'they might not be able to afford it ' , the more benevolent part thinks 'because the concept is absolutely not in the best interest of customers' .
In an economic analysis , at the very least , the costs spent by NVIDIA and ATi will be absorbed through their graphics card prices ( so consumers pay in any case ) , with a net productivity loss from the developers implementing graphics features twice that could have been based on a single model .</tokentext>
<sentencetext>Great, so we'll have two graphics card manufacturers bribing games developers with money to implement visual features that only work on their graphics cards.
Although the cynical part of me thinks 'they might not be able to afford it', the more benevolent part thinks 'because the concept is absolutely not in the best interest of customers'.
In an economic analysis, at the very least, the costs spent by NVIDIA and ATi will be absorbed through their graphics card prices (so consumers pay in any case), with a net productivity loss from the developers implementing graphics features twice that could have been based on a single model.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403962</id>
	<title>Re:They wish they'd thought of it first</title>
	<author>Anonymous</author>
	<datestamp>1268079900000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
<htmltext><p>Yeah, but in truth, Nvidia didn't either. They bought Ageia to get this, but it was a smart buy.</p><p>FYI, at first you had to pay developers to use 3D too.</p><p>Oh and I'll give people another piece to chew on: AMD (and ATI before them) has the worst developer relations group. They utterly suck at anything useful other than throwing money around. How many millions of dollars did ATI spend to gain a couple FPS (less than 10) in Source engine again?</p><p>Hypocrites shouldn't throw stones. Then again, it's never stopped Nvidia either...</p></htmltext>
<tokentext>Yeah , but in truth , Nvidia did n't either .
They bought Ageia to get this , but it was a smart buy .
FYI , at first you had to pay developers to use 3D too .
Oh and I 'll give people another piece to chew on : AMD ( and ATI before them ) has the worst developer relations group .
They utterly suck at anything useful other than throwing money around .
How many millions of dollars did ATI spend to gain a couple FPS ( less than 10 ) in Source engine again ?
Hypocrites should n't throw stones .
Then again , it 's never stopped Nvidia either.. .</tokentext>
<sentencetext>Yeah, but in truth, Nvidia didn't either.
They bought Ageia to get this, but it was a smart buy.
FYI, at first you had to pay developers to use 3D too.
Oh and I'll give people another piece to chew on: AMD (and ATI before them) has the worst developer relations group.
They utterly suck at anything useful other than throwing money around.
How many millions of dollars did ATI spend to gain a couple FPS (less than 10) in Source engine again?
Hypocrites shouldn't throw stones.
Then again, it's never stopped Nvidia either...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403610</id>
	<title>Re:Maybe</title>
	<author>jellomizer</author>
	<datestamp>1268078340000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Well most devs would want to use their version because of pride in their work.</p></htmltext>
<tokentext>Well most devs would want to use their version because of pride in their work .</tokentext>
<sentencetext>Well most devs would want to use their version because of pride in their work.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406622</id>
	<title>Re:Maybe</title>
	<author>Anonymous</author>
	<datestamp>1268047560000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
<htmltext><p>Technically, PhysX was designed to run on the PhysX standalone card (somewhat similar to modern graphic cards in design), which maybe even isn't supported anymore. Though the graphics card is a more logical place to have it - because it cuts down on the back-and-forth copying of geometry data from CPU to PPU, back to CPU and then to GPU.</p></htmltext>
<tokentext>Technically , PhysX was designed to run on the PhysX standalone card ( somewhat similar to modern graphic cards in design ) , which maybe even is n't supported anymore .
Though the graphics card is a more logical place to have it - because it cuts down on the back-and-forth copying of geometry data from CPU to PPU , back to CPU and then to GPU .</tokentext>
<sentencetext>Technically, PhysX was designed to run on the PhysX standalone card (somewhat similar to modern graphic cards in design), which maybe even isn't supported anymore.
Though the graphics card is a more logical place to have it - because it cuts down on the back-and-forth copying of geometry data from CPU to PPU, back to CPU and then to GPU.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403946</parent>
</comment>
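A back-of-envelope sketch of the copy overhead the comment above describes: with a standalone PPU, per-frame results cross the bus on the way back to the CPU and again on the way to the GPU, while GPU-resident physics produces them where the renderer consumes them. Every figure below is an illustrative assumption, not a measurement:

#include <cstdio>

// Illustrative assumptions only: 50,000 active bodies, 64 bytes of
// transform + velocity state per body, 60 frames per second.
int main() {
    const double bodies       = 50000;
    const double bytesPerBody = 64;
    const double fps          = 60;
    const double tripMB       = bodies * bytesPerBody / 1e6;  // MB per bus trip

    printf("state per trip      : %.1f MB/frame\n", tripMB);
    printf("standalone PPU path : 3 trips/frame = %.0f MB/s\n", 3 * tripMB * fps);
    printf("GPU-resident path   : results stay on the GPU, 0 extra trips\n");
    return 0;
}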
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404072</id>
	<title>My complaint</title>
	<author>w0mprat</author>
	<datestamp>1268080500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext>Nvidia has failed to engage the coding community in the right way. Any hardware-accelerated physics API needed to be openly available at the DirectX/OpenCL level from the beginning. AMD has kind of seen the light here.
<br> <br>
The original intention of Ageia and their PhysX setup seemed to be just to sell the company, rather than try to make a viable business model of selling hardware. Ageia would have been more open with API and code right from the start if they intended to make a business selling hardware.</htmltext>
<tokentext>Nvidia has failed to engage the coding community in the right way .
Any hardware-accelerated physics API needed to be openly available at the DirectX/OpenCL level from the beginning .
AMD has kind of seen the light here .
The original intention of Ageia and their PhysX setup seemed to be just to sell the company , rather than try to make a viable business model of selling hardware .
Ageia would have been more open with API and code right from the start if they intended to make a business selling hardware .</tokentext>
<sentencetext>Nvidia has failed to engage the coding community in the right way.
Any hardware-accelerated physics API needed to be openly available at the DirectX/OpenCL level from the beginning.
AMD has kind of seen the light here.
The original intention of Ageia and their PhysX setup seemed to be just to sell the company, rather than try to make a viable business model of selling hardware.
Ageia would have been more open with API and code right from the start if they intended to make a business selling hardware.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406006</id>
	<title>Re:Some truth to that.</title>
	<author>Anonymous</author>
	<datestamp>1268045640000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p><div class="quote"><p>While I don't think it's super dire, it's certainly a concern. I can add another point. <a href="http://www.wired.com.nyud.net/gamelife/2010/03/steam-mac/" title="nyud.net" rel="nofollow">Steam confirmed for Mac</a> [nyud.net].</p><p>Problem? Macs don't take the latest and greatest off-the-shelf graphics cards, and generally are a fair bit behind the curve, way back in 'casual land'.</p><p>On the other hand, maybe if Apple open up a bit this is a way to sell more and better cards rather than another spike in the coffin.</p></div><p>There's a larger issue with that; nVidia's PhysX only works in Windows.</p><p>Another reason why OpenCL should be the way to go as it's cross platform, and also vendor agnostic.</p></div>
	</htmltext>
<tokentext>While I do n't think it 's super dire , it 's certainly a concern .
I can add another point .
Steam confirmed for Mac [ nyud.net ] .
Problem ?
Macs do n't take the latest and greatest off-the-shelf graphics cards , and generally are a fair bit behind the curve , way back in 'casual land ' .
On the other hand , maybe if Apple open up a bit this is a way to sell more and better cards rather than another nail in the coffin .
There 's a larger issue with that ; nVidia 's PhysX only works in Windows .
Another reason why OpenCL should be the way to go as it 's cross platform , and also vendor agnostic .</tokentext>
<sentencetext>While I don't think it's super dire, it's certainly a concern.
I can add another point.
Steam confirmed for Mac [nyud.net].
Problem?
Macs don't take the latest and greatest off-the-shelf graphics cards, and generally are a fair bit behind the curve, way back in 'casual land'.
On the other hand, maybe if Apple open up a bit this is a way to sell more and better cards rather than another nail in the coffin.
There's a larger issue with that; nVidia's PhysX only works in Windows.
Another reason why OpenCL should be the way to go as it's cross platform, and also vendor agnostic.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403934</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124</id>
	<title>Re:Maybe</title>
	<author>hedwards</author>
	<datestamp>1268076300000</datestamp>
<modclass>Informative</modclass>
	<modscore>4</modscore>
<htmltext>If you noticed in the summary, AMD is advocating for a similar technology that works on their hardware as well as on nVidia's; it seems like developers would prefer that for practical reasons.</htmltext>
<tokentext>If you noticed in the summary , AMD is advocating for a similar technology that works on their hardware as well as on nVidia 's ; it seems like developers would prefer that for practical reasons .</tokentext>
<sentencetext>If you noticed in the summary, AMD is advocating for a similar technology that works on their hardware as well as on nVidia's; it seems like developers would prefer that for practical reasons.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403176</id>
	<title>Re:They wish they'd thought of it first</title>
	<author>InsaneProcessor</author>
	<datestamp>1268076480000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>It reads like sour grapes to me.  If there wasn't enough money in doing it (either promotional or customer desires), then they wouldn't be doing it.</htmltext>
<tokentext>It reads like sour grapes to me .
If there was n't enough money in doing it ( either promotional or customer desires ) , then they would n't be doing it .</tokentext>
<sentencetext>It reads like sour grapes to me.
If there wasn't enough money in doing it (either promotional or customer desires), then they wouldn't be doing it.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403934</id>
	<title>Some truth to that.</title>
	<author>eddy</author>
	<datestamp>1268079780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>While I don't think it's super dire, it's certainly a concern. I can add another point. <a href="http://www.wired.com.nyud.net/gamelife/2010/03/steam-mac/" title="nyud.net">Steam confirmed for Mac</a> [nyud.net].</p><p>Problem? Macs don't take the latest and greatest off-the-shelf graphics cards, and generally are a fair bit behind the curve, way back in 'casual land'.</p><p>On the other hand, maybe if Apple open up a bit this is a way to sell more and better cards rather than another nail in the coffin.</p></htmltext>
<tokentext>While I do n't think it 's super dire , it 's certainly a concern .
I can add another point .
Steam confirmed for Mac [ nyud.net ] .
Problem ?
Macs do n't take the latest and greatest off-the-shelf graphics cards , and generally are a fair bit behind the curve , way back in 'casual land ' .
On the other hand , maybe if Apple open up a bit this is a way to sell more and better cards rather than another nail in the coffin .</tokentext>
<sentencetext>While I don't think it's super dire, it's certainly a concern.
I can add another point.
Steam confirmed for Mac [nyud.net].
Problem?
Macs don't take the latest and greatest off-the-shelf graphics cards, and generally are a fair bit behind the curve, way back in 'casual land'.
On the other hand, maybe if Apple open up a bit this is a way to sell more and better cards rather than another nail in the coffin.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403488</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404328</id>
	<title>Wha?</title>
	<author>rgviza</author>
	<datestamp>1268081580000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>"They're not doing it because they want it; they're doing it because they're paid to do it."</p><p>Doesn't this describe just about any paid project? Just sayin'</p></htmltext>
<tokenext>" They 're not doing it because they want it ; they 're doing it because they 're paid to do it .
" Does n't this describe just about any paid project ?
Just sayin'</tokentext>
<sentencetext>"They're not doing it because they want it; they're doing it because they're paid to do it.
"Doesn't this describe just about any paid project?
Just sayin'</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403656</id>
	<title>Re:Maybe</title>
	<author>Anonymous</author>
	<datestamp>1268078520000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Nvidia's stuff deliberately doesn't work on non-Nvidia hardware.</p><p>Swapping it so that it's something cross-platform that works equally for both will show people just how much Nvidia's Physx was a bunch of crap. Basically, you will see no Nvidia performance advantage in physx games as you see right now. This is not a new issue, but AMD's approach is.</p><p>Now, Nvidia will have no excuse as to why people should support PhysX.</p><p>However, Nvidia has moved on already. They're doing that bullshit 3d stuff - OMG 3d TV! Half the performance due to twice the refresh rate! must buy!</p><p>etc.</p></htmltext>
<tokentext>Nvidia 's stuff deliberately does n't work on non-Nvidia hardware .
Swapping it so that it 's something cross-platform that works equally for both will show people just how much Nvidia 's Physx was a bunch of crap .
Basically , you will see no Nvidia performance advantage in physx games as you see right now .
This is not a new issue , but AMD 's approach is .
Now , Nvidia will have no excuse as to why people should support PhysX .
However , Nvidia has moved on already .
They 're doing that bullshit 3d stuff - OMG 3d TV !
Half the performance due to twice the refresh rate !
must buy !
etc .</tokentext>
<sentencetext>Nvidia's stuff deliberately doesn't work on non-Nvidia hardware.
Swapping it so that it's something cross-platform that works equally for both will show people just how much Nvidia's Physx was a bunch of crap.
Basically, you will see no Nvidia performance advantage in physx games as you see right now.
This is not a new issue, but AMD's approach is.
Now, Nvidia will have no excuse as to why people should support PhysX.
However, Nvidia has moved on already.
They're doing that bullshit 3d stuff - OMG 3d TV!
Half the performance due to twice the refresh rate!
must buy!
etc.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31405300</id>
	<title>Re:It's a new riff on the old joke</title>
	<author>dbcad7</author>
	<datestamp>1268042820000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext>I must be defective. Even with his mistake, I got the gist of what he was trying to say... Life may make more sense to you at this web site: <a href="http://forums.about.com/n/pfx/forum.aspx?webtag=ab-grammar" title="about.com">http://forums.about.com/n/pfx/forum.aspx?webtag=ab-grammar</a> [about.com]</htmltext>
<tokentext>I must be defective .
Even with his mistake , I got the gist of what he was trying to say.. .
Life may make more sense to you at this web site : http : //forums.about.com/n/pfx/forum.aspx ? webtag = ab-grammar [ about.com ]</tokentext>
<sentencetext>I must be defective.
Even with his mistake, I got the gist of what he was trying to say...
Life may make more sense to you at this web site: http://forums.about.com/n/pfx/forum.aspx?webtag=ab-grammar [about.com]</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403208</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31407312</id>
	<title>In my experience, PhysX has only been a hindrance!</title>
	<author>Roger Wilcox</author>
	<datestamp>1268050320000</datestamp>
<modclass>Interesting</modclass>
	<modscore>5</modscore>
	<htmltext>I have to disable PhysX in the nVidia control panel to get HL2 or any of the Source engine games to run properly!  I had no idea what was causing these games to crash.  After disabling PhysX they work right every time!
<br> <br>
Apparently it doesn't do anything crucial or even noticeable as my games run just fine with it turned off.  And now I'm told the game devs don't even want to use it?
<br> <br>
This "feature" has caused me nothing but grief!</htmltext>
<tokentext>I have to disable PhysX in the nVidia control panel to get HL2 or any of the Source engine games to run properly !
I had no idea what was causing these games to crash .
After disabling PhysX they work right every time !
Apparently it does n't do anything crucial or even noticeable as my games run just fine with it turned off .
And now I 'm told the game devs do n't even want to use it ?
This " feature " has caused me nothing but grief !</tokentext>
<sentencetext>I have to disable PhysX in the nVidia control panel to get HL2 or any of the Source engine games to run properly!
I had no idea what was causing these games to crash.
After disabling PhysX they work right every time!
Apparently it doesn't do anything crucial or even noticeable as my games run just fine with it turned off.
And now I'm told the game devs don't even want to use it?
This "feature" has caused me nothing but grief!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404758</id>
	<title>Re:They wish they'd thought of it first</title>
	<author>Draek</author>
	<datestamp>1268040240000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>But they shouldn't have to, that's the thing. Microsoft and Intel have already gotten in trouble for offering similar deals to OEMs to favor their products (the latter also to the detriment of AMD), so it's not inconceivable that NVidia could be hit with a similar problem as well, as they also hold a significant share of the market in question and this could very well be interpreted as a move designed to keep a hold of it by unfair means.</p></htmltext>
<tokentext>But they should n't have to , that 's the thing .
Microsoft and Intel have already gotten in trouble for offering similar deals to OEMs to favor their products ( the latter also to the detriment of AMD ) , so it 's not inconceivable that NVidia could be hit with a similar problem as well , as they also hold a significant share of the market in question and this could very well be interpreted as a move designed to keep a hold of it by unfair means .</tokentext>
<sentencetext>But they shouldn't have to, that's the thing.
Microsoft and Intel have already gotten in trouble for offering similar deals to OEMs to favor their products (the latter also to the detriment of AMD), so it's not inconceivable that NVidia could be hit with a similar problem as well, as they also hold a significant share of the market in question and this could very well be interpreted as a move designed to keep a hold of it by unfair means.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31405246</id>
	<title>Duhhh!!!</title>
	<author>frank_adrian314159</author>
	<datestamp>1268042640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><i>They're not doing it because they want it; they're doing it because they're paid to do it.</i> </p><p>I can say the same thing of just about everyone who is employed, even the folks at AMD.  Though it's only in the "creative" arts that there's always this odd shiny coating of "fidelity" that seems to be desired and added on as a last step.  In reality, this coating is as faux as the images and sounds that these arts provide.  The bottom line: it's a business, and any art is just an afterthought.  If they can make more money with a feature plus marketing kickbacks than by leaving the feature out, they'll do it.  I guess AMD is getting its ass kicked and that's why they're whining.  Please, spare us...</p></htmltext>
<tokenext>They 're not doing it because they want it ; they 're doing it because they 're paid to do it .
I can say the same thing of just about everyone who is employed , even the folks at AMD .
Though , it 's only in the " creative " arts where there 's always this odd shiny coating of " fidelity " that seems to be desired and added on as a last step .
In reality , this coating is as faux as the images and sounds that these arts provide .
The bottom line - it 's a business , any art is just an afterthought .
If they can make more money with a feature plus marketing kickbacks than by leaving the feature out , they 'll do it .
I guess AMD is getting its ass kicked and that 's why they 're whining .
Please , spare us.. .</tokentext>
<sentencetext>They're not doing it because they want it; they're doing it because they're paid to do it.
I can say the same thing of just about everyone who is employed, even the folks at AMD.
Though it's only in the "creative" arts that there's always this odd shiny coating of "fidelity" that seems to be desired and added on as a last step.
In reality, this coating is as faux as the images and sounds that these arts provide.
The bottom line - it's a business, any art is just an afterthought.
If they can make more money with a feature plus marketing kickbacks than by leaving the feature out, they'll do it.
I guess AMD is getting its ass kicked and that's why they're whining.
Please, spare us...</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404188</id>
	<title>Re:What does PhysX do anyways?</title>
	<author>Spatial</author>
	<datestamp>1268081100000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>It can be hardware accelerated on the GPU.  That's it.<br> <br>

The benefit: Physics is one of those easily parallelised problems, so a very large increase in complexity is possible.<br> <br>

The drawbacks: There's less GPU time available for drawing stuff, so your framerate suffers.  And of course, it's limited to Nvidia hardware.<br> <br>

The latter leads to a drawback of its own: the technology can't be used to its full potential, because many people who buy a game won't have the necessary hardware.  So it can't be used in ways that would affect gameplay.</htmltext>
<tokenext>It can be hardware accelerated on the GPU .
That 's it .
The benefit : Physics is one of those easily parallelised problems so a very large increase in complexity is possible .
The drawbacks : There 's less GPU time available for drawing stuff so your framerate suffers .
And of course , it 's limited to Nvidia hardware only .
The latter leads to a drawback of its own : the technology ca n't be used to its full potential because many people who buy a game wo n't have the necessary hardware .
So it ca n't be used in ways that would affect gameplay .</tokentext>
<sentencetext>It can be hardware accelerated on the GPU.
That's it.
The benefit: Physics is one of those easily parallelised problems so a very large increase in complexity is possible.
The drawbacks: There's less GPU time available for drawing stuff so your framerate suffers.
And of course, it's limited to Nvidia hardware only.
The latter leads to a drawback of its own: the technology can't be used to its full potential because many people who buy a game won't have the necessary hardware.
So it can't be used in ways that would affect gameplay.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403156</parent>
</comment>
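The parallelism and framerate points above can be made concrete. Below is a minimal C++17 sketch (not PhysX or any real engine API; Particle and step() are invented names): every particle steps independently, which is exactly the data-parallel shape that maps well onto a GPU -- and exactly the work that competes with rendering for GPU time when it runs there.

    // Minimal sketch: effects physics as an embarrassingly parallel loop.
    // No ordering and no shared writes, so the algorithm library may run
    // the per-particle updates across all available cores.
    #include <algorithm>
    #include <execution>
    #include <vector>

    struct Particle { float x = 0, y = 0, z = 0, vx = 0, vy = 0, vz = 0; };

    void step(std::vector<Particle>& ps, float dt) {
        std::for_each(std::execution::par_unseq, ps.begin(), ps.end(),
                      [dt](Particle& p) {
                          p.vy -= 9.81f * dt;   // gravity
                          p.x  += p.vx * dt;
                          p.y  += p.vy * dt;
                          p.z  += p.vz * dt;
                      });
    }

    int main() {
        std::vector<Particle> ps(1'000'000);   // a million debris particles
        for (int frame = 0; frame < 60; ++frame) step(ps, 1.0f / 60.0f);
    }

The same loop body is what a GPU implementation would run per thread; the trade-off the poster names is simply whether those cycles come out of the CPU's budget or the renderer's.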
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403488</id>
	<title>clutching at straws</title>
	<author>Anonymous</author>
	<datestamp>1268077800000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>5</modscore>
	<htmltext><p>GPU makers are in a bind:<br>- IGPs are now enough for 90% of users: office work (even w/ Aero), video, light gaming, dual-screen... all work fine with IGPs<br>- the remaining 10% (gamers, graphic artists) are dwindling for lack of outstanding games: game publishers are turned off by rampant piracy, and mainly online games bring in big money nowadays<br>- GPGPU is useless except in scientific computing: we already have more x86 cores than the devs know how to use, let alone a different computing paradigm<br>- devs have to target the lowest common denominator, which means no GPGPU for games</p><p>I'm actually thinking of moving my home PC to one of the upcoming ARM-based smarttops. They look good enough for torrenting + video watching + web browsing, and consume 10 watts instead of 150...</p></htmltext>
<tokenext>GPU makers are in a bind :
- IGPs are now enough for 90 % of users : office work ( even w/ Aero ) , video , light gaming , dual-screen... all work fine with IGPs .
- the remaining 10 % ( gamers , graphic artists ) are dwindling for lack of outstanding games : game publishers are turned off by rampant piracy , mainly online games bring in big money nowadays .
- GPGPU is useless except in scientific computing : we already have more x86 cores than the devs know how to use , let alone use a different computing paradigm .
- devs have to target the lowest common denominator , which means no GPGPU for games .
I 'm actually thinking of moving my home PC to one of the upcoming ARM-based smarttops .
They look good enough for torrenting + video watching + web browsing , consume 10 watts instead of 150.. .</tokentext>
<sentencetext>GPU makers are in a bind:
- IGPs are now enough for 90% of users: office work (even w/ Aero), video, light gaming, dual-screen... all work fine with IGPs
- the remaining 10% (gamers, graphic artists) are dwindling for lack of outstanding games: game publishers are turned off by rampant piracy, mainly online games bring in big money nowadays
- GPGPU is useless except in scientific computing: we already have more x86 cores than the devs know how to use, let alone use a different computing paradigm
- devs have to target the lowest common denominator, which means no GPGPU for games
I'm actually thinking of moving my home PC to one of the upcoming ARM-based smarttops.
They look good enough for torrenting + video watching + web browsing, consume 10 watts instead of 150...</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034</id>
	<title>They wish they'd thought of it first</title>
	<author>EvolutionsPeak</author>
	<datestamp>1268075880000</datestamp>
	<modclass>Informative</modclass>
	<modscore>4</modscore>
	<htmltext><p>Sounds to me like AMD just wishes they'd thought of it first.  There's no reason AMD couldn't offer similar deals.</p></htmltext>
<tokenext>Sounds to me like AMD just wishes they 'd thought of it first .
There 's no reason AMD could n't offer similar deals .</tokentext>
<sentencetext>Sounds to me like AMD just wishes they'd thought of it first.
There's no reason AMD couldn't offer similar deals.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403428</id>
	<title>Re:What does PhysX do anyways?</title>
	<author>postmortem</author>
	<datestamp>1268077500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The ones with PhysX are more likely to crash because Nvidia's drivers aren't perfect.</p><p>Not a trolling attempt: I have played Mirror's Edge, where the flag waves realistically with PhysX, but it always crashes at the same spot. Just before the crash, the flag looks weird.</p><p>On a Radeon, the flag doesn't wave as naturally, but the game doesn't crash either.</p></htmltext>
<tokenext>The ones with PhysX are more likely to crash because nvidia drivers are n't perfect .
Not a trolling attempt : I have played Mirror 's Edge game where flag waves realistically on PhysX , but it crashes on same spot always .
Just before crash flag looks weird .
On Radeon , flag does n't wave so naturally , but game does not crash either .</tokentext>
<sentencetext>The ones with PhysX are more likely to crash because nvidia drivers aren't perfect.
Not a trolling attempt: I have played Mirror's Edge game where flag waves realistically on PhysX, but it crashes on same spot always.
Just before crash flag looks weird.
On Radeon, flag doesn't wave so naturally, but game does not crash either.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403156</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406468</id>
	<title>For the money ?</title>
	<author>morcego</author>
	<datestamp>1268047080000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>They're not doing it because they want it; they're doing it because they're paid to do it.</p></div></blockquote><p>So we are to believe that, working for AMD, we can do whatever we want, and not be ordered to do stuff we don't want?</p><p>And what? Developer companies doing things for the money? No, I can't believe that. I mean, all their games are free today, right?</p><p>This simply can't be true. I never heard of any company doing their stuff for the money. They all do it for the sheer pleasure of whatever it is they do.</p></htmltext>
<tokenext>They 're not doing it because they want it ; they 're doing it because they 're paid to do it .
So we are to believe that working for AMD , we can do whatever we want , and not be ordered to do stuff we do n't want ?
And what ?
Developer companies doing things for the money ?
No , I ca n't believe that .
I mean , all their games are free today , right ?
This simply ca n't be true .
I never heard of any company doing their stuff for the money .
They all do it for the sheer pleasure of whatever it is they do .</tokentext>
<sentencetext>They're not doing it because they want it; they're doing it because they're paid to do it.
So we are to believe that working for AMD, we can do whatever we want, and not be ordered to do stuff we don't want?
And what?
Developer companies doing things for the money ?
No, I can't believe that.
I mean, all their games are free today, right?
This simply can't be true.
I never heard of any company doing their stuff for the money.
They all do it for the sheer pleasure of whatever it is they do.
	</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403384</id>
	<title>Re:What does PhysX do anyways?</title>
	<author>Anonymous</author>
	<datestamp>1268077320000</datestamp>
	<modclass>Troll</modclass>
	<modscore>0</modscore>
	<htmltext><a href="http://lmgtfy.com/?q=PhysX" title="lmgtfy.com">http://lmgtfy.com/?q=PhysX</a> [lmgtfy.com] <br> <br>
I [heart] this site... makes me happy every time I provide a link :)</htmltext>
<tokenext>http : //lmgtfy.com/ ? q = PhysX [ lmgtfy.com ] I [ heart ] this site... makes me happy every time I provide a link : )</tokentext>
<sentencetext>http://lmgtfy.com/?q=PhysX [lmgtfy.com]  
I [heart] this site... makes me happy every time I provide a link :)</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403156</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403946</id>
	<title>Re:Maybe</title>
	<author>ASBands</author>
	<datestamp>1268079840000</datestamp>
	<modclass>Informative</modclass>
	<modscore>5</modscore>
	<htmltext><p>I've done some work with both PhysX and the things that AMD is pushing for.  I try to stick with the <a href="http://www.adrianboeing.com/pal/index.html" title="adrianboeing.com">Physics Abstraction Layer</a> [adrianboeing.com], which lets me plug in whatever physics engine I like as the backend, giving a pretty damn good apples-to-apples performance comparison.  Personally, my ultimate choice of physics engine is the one which exhibits the best performance.  My experience may differ from others', but I generally get the best performance from PhysX with an nVidia GPU and BulletPhysics with an AMD GPU.  Sometimes the software version of PhysX outstrips the competition, but I have never seen anything beat PhysX in performance with GPU acceleration turned on.  And with PAL, it is easy to check if there is GPU support on the machine and swap in the physics engine with the best performance (PAL is awesome).</p><p>Here's the thing: GPU-accelerated physics is just plain faster.  Why?  Because collision detection is a highly parallelizable problem.  Guess what hardware we have that can help?  The GPU.  Another great part of using the GPU is that it frees the CPU to do more random crap (like AI or parsing the horribly slow scripting language).</p><p>AMD is working on both BulletPhysics and Havok so they can do GPU acceleration.  But I have a feeling that PhysX performance will remain faster for a while: PhysX was designed to natively run on the GPU (technically, a GPU-like device), while these other libraries were not.  Furthermore, nVidia has quite a head start in performance tuning, optimization, and sheer experience.  In five years that shouldn't matter, but I'm just saying it will take a while.</p><p>So here is my message to AMD: if you want people to use your stuff, make something that works and let me test it out in my applications.  You've released a <i>demo</i> of Havok with GPU acceleration.  PhysX has worked and continues to work with GPU acceleration on nVidia GPUs, and will frequently outperform the software implementation.  I'm all for open alternatives, but in this case the open alternatives aren't good enough.</p></htmltext>
<tokenext>I 've done some work with both PhysX and the things that AMD is pushing for .
I try to keep with the Physics Abstraction Layer [ adrianboeing.com ] , which lets me plug in whatever physics engine as the backend , which gives a pretty damn good apples-to-apples performance metric .
Personally , my ultimate choice of physics engine is the one which exhibits the best performance .
My experience may differ from others , but I generally get the best performance from PhysX with an nVidia GPU and BulletPhysics with an AMD GPU .
Sometimes , the software version of PhysX outstrips the competition , but I have never seen anything beat PhysX in performance with GPU acceleration turned on .
And with PAL , it is easy to check if there is GPU support on the machine and swap in the physics engine with the best performance ( PAL is awesome ) .
Here 's the thing : GPU-accelerated physics are just plain faster .
Why ? Because collision detection is a highly parallelizable problem .
Guess what hardware we have that can help ?
The GPU .
Another great part of using the GPU is that it frees the CPU to do more random crap ( like AI or parsing the horribly slow scripting language ) .
AMD is working on both BulletPhysics and Havok so they can do GPU acceleration .
But I have a feeling that PhysX performance will remain faster for a while : PhysX was designed to natively run on the GPU ( technically , a GPU-like device ) , while these other libraries are not .
Furthermore , nVidia has quite a head start in performance tuning , optimization and simple experience .
In five years , that should n't matter , but I 'm just saying that it will take a while .
So here is my message to AMD : If you want people to use your stuff , make something that works and let me test it out in my applications .
You 've released a demo of Havok with GPU acceleration .
PhysX has been and continues to work with GPU acceleration on nVidia GPUs and will frequently outperform the software implementation .
I 'm all for open alternatives , but in this case , the open alternatives are n't good enough .</tokentext>
<sentencetext>I've done some work with both PhysX and the things that AMD is pushing for.
I try to keep with the Physics Abstraction Layer [adrianboeing.com], which lets me plug in whatever physics engine as the backend, which gives a pretty damn good apples-to-apples performance metric.
Personally, my ultimate choice of physics engine is the one which exhibits the best performance.
My experience may differ from others, but I generally get the best performance from PhysX with an nVidia GPU and BulletPhysics with an AMD GPU.
Sometimes, the software version of PhysX outstrips the competition, but I have never seen anything beat PhysX in performance with GPU acceleration turned on.
And with PAL, it is easy to check if there is GPU support on the machine and swap in the physics engine with the best performance (PAL is awesome).
Here's the thing: GPU-accelerated physics are just plain faster.
Why?  Because collision detection is a highly parallelizable problem.
Guess what hardware we have that can help?
The GPU.
Another great part of using the GPU is that it frees the CPU to do more random crap (like AI or parsing the horribly slow scripting language).
AMD is working on both BulletPhysics and Havok so they can do GPU acceleration.
But I have a feeling that PhysX performance will remain faster for a while: PhysX was designed to natively run on the GPU (technically, a GPU-like device), while these other libraries are not.
Furthermore, nVidia has quite a head start in performance tuning, optimization and simple experience.
In five years, that shouldn't matter, but I'm just saying that it will take a while.
So here is my message to AMD: If you want people to use your stuff, make something that works and let me test it out in my applications.
You've released a demo of Havok with GPU acceleration.
PhysX has been and continues to work with GPU acceleration on nVidia GPUs and will frequently outperform the software implementation.
I'm all for open alternatives, but in this case, the open alternatives aren't good enough.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124</parent>
</comment>
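The swap-in-the-fastest-backend workflow described above is worth sketching. This is not PAL's real API; the class and function names below are invented to show the shape of the technique: one abstract interface, several engines behind it, and a runtime probe that picks the backend.

    // Hedged sketch of a physics abstraction layer (invented names, not PAL).
    #include <iostream>
    #include <memory>
    #include <string>

    struct PhysicsBackend {                       // common engine interface
        virtual ~PhysicsBackend() = default;
        virtual std::string name() const = 0;
        virtual void stepWorld(float dt) = 0;
    };

    struct SoftwareBackend : PhysicsBackend {     // stand-in for a CPU engine
        std::string name() const override { return "cpu-fallback"; }
        void stepWorld(float) override { /* integrate on the CPU */ }
    };

    struct GpuBackend : PhysicsBackend {          // stand-in for a GPU engine
        std::string name() const override { return "gpu-accelerated"; }
        void stepWorld(float) override { /* dispatch to the GPU */ }
    };

    bool gpuPhysicsAvailable() { return false; }  // real code would probe the driver

    std::unique_ptr<PhysicsBackend> pickBackend() {
        if (gpuPhysicsAvailable()) return std::make_unique<GpuBackend>();
        return std::make_unique<SoftwareBackend>();
    }

    int main() {
        auto physics = pickBackend();
        std::cout << "using " << physics->name() << "\n";
        physics->stepWorld(1.0f / 60.0f);
    }

Because every backend satisfies the same interface, benchmarking them against each other (the apples-to-apples comparison the poster mentions) is a matter of swapping the object behind the pointer.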
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31410788</id>
	<title>Is that "Both country and Western"?</title>
	<author>jonaskoelker</author>
	<datestamp>1268165880000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>AMD is advocating for a similar technology that works on their hardware as well as on nVidia's</p></div><p>That sounds a bit like "both country <em>and</em> western" to me.</p><p>Ah well, I guess it isn't AMD's fault that when they make open standards there are only two (maybe three) implementers...</p></htmltext>
<tokenext>AMD is advocating for a similar technology that works on their hardware as well as on nVidia 's .
That sounds a bit like " both country and western " to me .
Ah well , I guess it is n't AMD 's fault that when they make open standards there are only two ( maybe three ) implementers.. .</tokentext>
<sentencetext>AMD is advocating for a similar technology that works on their hardware as well as on nVidia's.
That sounds a bit like "both country and western" to me.
Ah well, I guess it isn't AMD's fault that when they make open standards there are only two (maybe three) implementers...
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31407414</id>
	<title>Re:Maybe</title>
	<author>BikeHelmet</author>
	<datestamp>1268050800000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext><div class="quote"><p>Only half the market is going to be able to take advantage of it after all.</p></div><p>1) PhysX runs on the CPU if no nVidia GPU is present. A $100 quad-core CPU easily handles it for most games.<br>2) According to the Steam Survey, nVidia is approximately 66% of the PC gaming market. Two thirds.</p></htmltext>
<tokenext>Only half the market is going to be able to take advantage of it after all .
1 ) PhysX runs on the CPU if no nVidia GPU is present .
A $ 100 quad-core CPU easily handles it for most games .
2 ) According to the Steam Survey , nVidia is approximately 66 % of the PC gaming market .
Two thirds .</tokentext>
<sentencetext>Only half the market is going to be able to take advantage of it after all.
1) PhysX runs on the CPU if no nVidia GPU is present.
A $100 quad-core CPU easily handles it for most games.
2) According to the Steam Survey, nVidia is approximately 66% of the PC gaming market.
Two thirds.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403208</id>
	<title>Re:It's a new riff on the old joke</title>
	<author>Anonymous</author>
	<datestamp>1268076540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>"You're so ugly the only way to get the dog to play with you is to tie a steak around your neck."</p><p> <b>Says the kid the dog without a dog to play with.</b> </p></div><p>Try again please. That statement is a grammatical failure. I'm not even sure what you were trying to say.</p></htmltext>
<tokenext>" You 're so ugly the only way to get the dog to play with you is to tie a steak around your neck .
" Says the kid the dog without a dog to play with .
Try again please .
That statement is a grammatical failure .
I 'm not even sure what you were trying to say .</tokentext>
<sentencetext>"You're so ugly the only way to get the dog to play with you is to tie a steak around your neck.
" Says the kid the dog without a dog to play with.
Try again please.
That statement is a grammatical failure.
I'm not even sure what you were trying to say.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403062</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404470</id>
	<title>Re:clutching at straws</title>
	<author>Anonymous</author>
	<datestamp>1268038980000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>- the remaining 10% (gamers, graphic artists) are dwindling for lack of outstanding games: game publishers are turned off by rampant piracy, mainly online games bring in big money nowadays</p></div><p>Piracy, or bad games?  Many games these days are just pretty graphics pretending to be quality gameplay -- I think I can count on one hand the number of games I've felt have had excellent gameplay in the last 5 years, enough to be worth paying full retail price for.</p><p>Also, as far as the move to consoles, there are the standardization issues on PCs to be dealt with given any engine -- Borderlands couldn't run on certain machines (it had two separate hardware incompatibilities, one with the CPU, one with the graphics card) even if they met min specs.  On consoles, that's not an issue.  Also, the console market seems to be on the whole less demanding of quality products before they'll throw gobs of money at companies.</p><p>I think you're being too quick to blame pirates.</p></htmltext>
<tokenext>- the remaining 10 % ( gamers , graphic artists ) are dwindling for lack of outstanding games : game publishers are turned off by rampant piracy , mainly online games bring in big money nowadays .
Piracy , or bad games ?
Many games these days are just pretty graphics pretending to be quality gameplay -- I think I can count on one hand the number of games I 've felt have had excellent gameplay in the last 5 years , enough to be worth paying full retail price for .
Also as far as the move to consoles , there 's the standardization issues on PCs to be dealt with given any engine -- Borderlands could n't run on certain machines ( It had two separate hardware incompatibilities , one with the cpu one with graphics card ) even if they met min specs .
On consoles , that 's not an issue .
Also , the console market seems to be on the whole less demanding of quality products before they 'll throw gobs of money at companies .
I think you 're being too quick to blame pirates .</tokentext>
<sentencetext>- the remaining 10% (gamers, graphic artists) are dwindling for lack of outstanding games: game publishers are turned off by rampant piracy, mainly online games bring in big money nowadays
Piracy, or bad games?
Many games these days are just pretty graphics pretending to be quality gameplay -- I think I can count on one hand the number of games I've felt have had excellent gameplay in the last 5 years, enough to be worth paying full retail price for.
Also as far as the move to consoles, there's the standardization issues on PCs to be dealt with given any engine -- Borderlands couldn't run on certain machines (It had two separate hardware incompatibilities, one with the cpu one with graphics card) even if they met min specs.
On consoles, that's not an issue.
Also, the console market seems to be on the whole less demanding of quality products before they'll throw gobs of money at companies.
I think you're being too quick to blame pirates.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403488</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406222</id>
	<title>Re:Maybe</title>
	<author>makomk</author>
	<datestamp>1268046300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Yeah, just so long as you don't mind either the FPS slowing down to single digits while it runs the physics using badly-optimised single core code, or disabling most of it.</p></htmltext>
<tokenext>Yeah , just so long as you do n't mind either the FPS slowing down to single digits while it runs the physics using badly-optimised single core code , or disabling most of it .</tokentext>
<sentencetext>Yeah, just so long as you don't mind either the FPS slowing down to single digits while it runs the physics using badly-optimised single core code, or disabling most of it.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403498</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31405408</id>
	<title>At least they aren't directly sabotaging ATI cards</title>
	<author>mahsah</author>
	<datestamp>1268043300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Like they did with Crysis.</p></htmltext>
<tokenext>Like they did with Crysis .</tokentext>
<sentencetext>Like they did with Crysis.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406736</id>
	<title>Re:clutching at straws</title>
	<author>mxs</author>
	<datestamp>1268047860000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext><p>I wonder why you attribute the lack of outstanding games to piracy being rampant -- the industry has been bitching and moaning about that for over 20 years now. That can't be the reason, or we would not have a videogame industry at all.</p><p>Few game developers are willing to do risky things though, and countless remakes of the same games just don't really appeal to all that many gamers -- add to that that gaming itself is being transformed (or rather, the marketplace is changing, with mobile games becoming a pastime of millions, there actually being a LOT of games out there from the years prior, etc.), and I can see some reasons for that. Add to that the fact that asset development (ugh) for modern games can be magnitudes more expensive than for old "outstanding" games, and you see that the financials have changed quite a bit as well -- you can no longer just produce an AAA title in a team of 2-4 people in a basement -- the tools simply have not caught up yet.</p><p>I would not count out GPGPU as a niche product just yet -- it's true, paradigms change, but the mere fact that x86 cores are becoming more plentiful is leading to more tool support, more heads thinking about the problems and solutions involved, and more people getting used to concurrent programming. Once you know how to use 6 or 8 cores well, it will not be too much of a jump to the 300-500 threads a GPU will handle. I hope for great things in this area, though admittedly I personally like it for the sciency stuff :)</p><p>Lowest common denominator, not. Degrading gracefully to it, yes. Games traditionally push the envelope, and will continue to do so -- unless the console model and mobile gaming overtake the entire market.</p><p>ARM-based devices have their uses, but to be quite honest -- I like a spiffy desktop.</p></htmltext>
<tokenext>I wonder why you attribute the lack of outstanding games to piracy being rampant -- the industry has been bitching and moaning about that for over 20 years now .
That ca n't be the reason or we would not have a videogame-industry at all .
Few game developers are willing to do risky things though , and countless remakes of the same games just do n't really appeal to all that many gamers -- add to that that gaming itself is being transformed ( or rather , the marketplace is changing with mobile games becoming a pastime of millions , there actually being a LOT of games out there from the years prior , etc .
) , and I can see some reasons for that .
Add to that the fact that asset-development ( ugh ) for modern games can be magnitudes more expensive than for old " outstanding " games and you see that the financials have changed quite a bit as well -- you can no longer just produce an AAA title in a team of 2-4 people in a basement -- the tools simply have not caught up yet .
I would not count out GPGPU as a niche product just yet -- it 's true , paradigms change , but the mere fact that x86 cores are becoming more plentiful is leading to more tool support , more heads thinking about the problems and solutions involved , and more people getting used to concurrent programming .
Once you know how to use 6 or 8 cores well , it will not be too much of a jump to the 300-500 threads a GPU will handle .
I hope for great things in this area though admittedly I personally like it for the sciency stuff : )
Lowest common denominator , not .
Degrading gracefully to it , yes .
Games traditionally push the envelope , and will continue to do so -- unless the console-model and mobile gaming overtakes the entire market .
ARM-based devices have their uses , but to be quite honest -- I like a spiffy desktop .</tokentext>
<sentencetext>I wonder why you attribute the lack of outstanding games to piracy being rampant -- the industry has been bitching and moaning about that for over 20 years now.
That can't be the reason or we would not have a videogame-industry at all.
Few game developers are willing to do risky things though, and countless remakes of the same games just don't really appeal to all that many gamers -- add to that that gaming itself is being transformed (or rather, the marketplace is changing with mobile games becoming a pastime of millions, there actually being a LOT of games out there from the years prior, etc.
), and I can see some reasons for that.
Add to that the fact that asset-development (ugh) for modern games can be magnitudes more expensive than for old "outstanding" games and you see that the financials have changed quite a bit as well -- you can no longer just produce an AAA title in a team of 2-4 people in a basement -- the tools simply have not caught up yet.
I would not count out GPGPU as a niche product just yet -- it's true, paradigms change, but the mere fact that x86 cores are becoming more plentiful is leading to more tool support, more heads thinking about the problems and solutions involved, and more people getting used to concurrent programming.
Once you know how to use 6 or 8 cores well, it will not be too much of a jump to the 300-500 threads a GPU will handle.
I hope for great things in this area though admittedly I personally like it for the sciency stuff :)
Lowest common denominator, not.
Degrading gracefully to it, yes.
Games traditionally push the envelope, and will continue to do so -- unless the console-model and mobile gaming overtakes the entire market.
ARM-based devices have their uses, but to be quite honest -- I like a spiffy desktop.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403488</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403528</id>
	<title>Re:Maybe</title>
	<author>MobyDisk</author>
	<datestamp>1268078040000</datestamp>
	<modclass>Funny</modclass>
	<modscore>4</modscore>
	<htmltext><p>Open standards always win out over closed standards.  Like OpenGL -vs- DirectX.... oh... wait... :-P</p></htmltext>
<tokenext>Open standards always win out over closed standards .
Like OpenGL -vs- DirectX.... oh... wait... : -P</tokentext>
<sentencetext>Open standards always win out over closed standards.
Like OpenGL -vs- DirectX.... oh... wait... :-P</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31405168</id>
	<title>The physics business</title>
	<author>Animats</author>
	<datestamp>1268042220000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>
Ageia's innovation wasn't their technology.  It was their business model.  Havok gets a fixed fee per title.  Ageia's "physics chip" got revenue for each graphics card. Both Havok and Mathengine had serious revenue problems as standalone companies.
The original investors did not do well.  Both were eventually acquired.  The basic problem is that game middleware isn't a good business.
</p><p>
Physics on the GPU is mostly useful for visual effects like water, snow, fire, explosions, etc., where the motion doesn't feed back into the game engine.  Ragdolls and vehicles are usually still done on the main CPUs.</p></htmltext>
<tokenext>Ageia 's innovation was n't their technology .
It was their business model .
Havok gets a fixed fee per title .
Ageia 's " physics chip " got revenue for each graphics card .
Both Havok and Mathengine had serious revenue problems as standalone companies .
The original investors did not do well .
Both were eventually acquired .
The basic problem is that game middleware is n't a good business .
Physics in the GPU is mostly useful for visual effects like water , snow , fire , explosions , etc. , where the motion does n't feed back into the game engine .
Ragdolls and vehicles are usually still done in the main CPUs .</tokentext>
<sentencetext>
Ageia's innovation wasn't their technology.
It was their business model.
Havok gets a fixed fee per title.
Ageia's "physics chip" got revenue for each graphics card.
Both Havok and Mathengine had serious revenue problems as standalone companies.
The original investors did not do well.
Both were eventually acquired.
The basic problem is that game middleware isn't a good business.
Physics in the GPU is mostly useful for visual effects like water, snow, fire, explosions, etc., where the motion doesn't feed back into the game engine.
Ragdolls and vehicles are usually still done in the main CPUs.</sentencetext>
</comment>
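The distinction Animats draws comes down to data flow, and a short sketch makes it visible. All names here are invented for illustration: effects output is consumed by the renderer alone, while gameplay physics must hand a result back to game logic every frame, which is why the latter stays on the CPU.

    // Sketch: one-way effects physics vs. feedback-coupled gameplay physics.
    #include <vector>

    struct Debris  { float y = 0, vy = 0; };                      // eye candy only
    struct Vehicle { float y = 10, vy = 0; bool grounded = false; };

    void stepEffects(std::vector<Debris>& d, float dt) {
        for (auto& p : d) { p.vy -= 9.81f * dt; p.y += p.vy * dt; }
        // Output goes to the renderer; nothing downstream reads it back.
    }

    bool stepVehicle(Vehicle& v, float dt) {
        v.vy -= 9.81f * dt; v.y += v.vy * dt;
        if (v.y <= 0.0f) { v.y = 0.0f; v.vy = 0.0f; v.grounded = true; }
        return v.grounded;   // fed back into game logic: did we land? can we jump?
    }

    int main() {
        std::vector<Debris> debris(1000);
        Vehicle v;
        for (int frame = 0; frame < 120; ++frame) {
            stepEffects(debris, 1.0f / 60.0f);   // could run anywhere, even the GPU
            stepVehicle(v, 1.0f / 60.0f);        // game logic must wait for this
        }
    }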
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31405258</id>
	<title>Open standards, and nothing else</title>
	<author>unity100</author>
	<datestamp>1268042700000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext><p>I wouldn't even care if PhysX was the biggest software innovation of the century - in gaming, especially in regard to graphics, we have suffered a lot because of proprietary shit in the last 2 decades. I don't want to see that again. Even if it's coarse and inadequate at the start, everyone should push for open standards so that we won't get in deep trouble later.</p></htmltext>
<tokenext>I would n't even care if PhysX was the biggest software innovation of the century - in gaming , especially in regard to graphics , we have suffered a lot because of proprietary shit in the last 2 decades .
I do n't want to see that again .
Even if it 's coarse , inadequate at the start , everyone should push for open standards so that we wo n't get in deep trouble later .</tokentext>
<sentencetext>I wouldn't even care if PhysX was the biggest software innovation of the century - in gaming, especially in regard to graphics, we have suffered a lot because of proprietary shit in the last 2 decades.
I don't want to see that again.
Even if it's coarse, inadequate at the start, everyone should push for open standards so that we won't get in deep trouble later.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31417356</id>
	<title>Re:Maybe</title>
	<author>Anonymous</author>
	<datestamp>1268165040000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>Except nVidia artificially limits the number of CPU cores that they allow to run PhysX.  Got a quad-core CPU?  Too bad, PhysX won't use it.  Often, while the main gameplay will run on both software and hardware, they add extra eye candy that only works on nVidia hardware, even though it would run just fine if you had enough fast CPU cores.</htmltext>
<tokenext>Except nVidia artificially limits the number of CPU cores that they allow to run PhysX .
Got a Quad Core CPU ?
Too bad , PhysX wo n't use it .
Often , while the main gameplay will run on both software and hardware , they add extra eyecandy that only works on nVidia hardware even if it would run just fine if you had enough fast CPU cores .</tokentext>
<sentencetext>Except nVidia artificially limits the number of CPU cores that they allow to run PhysX.
Got a Quad Core CPU?
Too bad, PhysX won't use it.
Often, while the main gameplay will run on both software and hardware, they add extra eyecandy that only works on nVidia hardware even if it would run just fine if you had enough fast CPU cores.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403498</parent>
</comment>
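The cap described above is the poster's claim, not something verified here, but the mechanics are easy to sketch in standard C++: the machine reports its core count, and the engine spreads the physics work over no more than a fixed limit regardless.

    // Sketch of an artificial thread cap (kMaxThreads and the workload are
    // invented; the cap itself is the poster's allegation).
    #include <algorithm>
    #include <cstdio>
    #include <thread>
    #include <vector>

    int main() {
        const unsigned hw = std::thread::hardware_concurrency(); // e.g. 4 on a quad core
        const unsigned kMaxThreads = 1;            // the alleged engine-imposed limit
        const unsigned used = std::min(hw ? hw : 1u, kMaxThreads);
        std::printf("cores reported: %u, threads used for physics: %u\n", hw, used);

        std::vector<std::thread> pool;
        for (unsigned i = 0; i < used; ++i)
            pool.emplace_back([] { /* one slice of the physics workload */ });
        for (auto& t : pool) t.join();
    }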
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403172</id>
	<title>Re:They wish they'd thought of it first</title>
	<author>Anonymous</author>
	<datestamp>1268076420000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>1</modscore>
	<htmltext><p>PhysX adds nothing to the gameplay. It's just stupid clutter on the ground... at least in any games I've played that use it. As an Nvidia user, PhysX is no longer a reason to stick with the brand... at least now that I have used it.</p><p>That said, it's no surprise to me that game developers wouldn't support it without incentive.</p></htmltext>
<tokenext>PhysX adds nothing to the game play .
It 's just stupid clutter on the ground... At least in any games I 've played that use it .
As an Nvidia user , PhysX is no longer a reason to keep with the brand... at least now that I have used it .
That said , it 's no surprise to me that game developers would n't support it without incentive .</tokentext>
<sentencetext>PhysX adds nothing to the game play.
It's just stupid clutter on the ground... At least in any games I've played that use it.
As an Nvidia user, PhysX is no longer a reason to keep with the brand... at least now that I have used it.
That said, it's no surprise to me that game developers wouldn't support it without incentive.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31410796</id>
	<title>Re:Maybe</title>
	<author>Anonymous</author>
	<datestamp>1268166120000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I blame the ARB and Khronos for that. OpenGL used to be cutting edge and DirectX was the one playing catch up.</p><p>DirectX used to really suck until around DX6 where the 2D acceleration interfaces got good enough to be programmer friendly (Using OGL for 2D is possible but not everyone had 3D accelerators in those days which would have made performance suck for most people). DX7 had reasonable 3D support, people ported their 2D engines to use Direct3D as an addon but were generally still 2D at the core [simpler migration path]. The problem arrived with DX8 -&gt; MS added hardware accelerated vertex and pixel shaders, that came out of left field and allowed DX to overtake OpenGL on features; OGL has been playing catch up ever since.</p><p>OpenGL3 was supposed to be a complete redesign to make OGL object oriented like DirectX (but better since it isn't as in-your-face low level) but that got kicked in the nuts at some point and so we're stuck with the retarded 90s era State Engine system.</p><p>It also doesn't help that ATi, the kings of shitty drivers, have worse OpenGL performance than DirectX on Windows (A program that can use both DX and OGL will run slower using the OGL API). Intel's (as though it is worth mentioning...) is flaky but their chips suck on both APIs so it doesn't make much difference. Only nVidia's OGL support is about as good as their DirectX support.</p><p>[Of course, there was also the usual fun anti-trust stuff from Microsoft where they deliberately redesigned the OpenGL bindings on Windows so that OpenGL drivers were more complex and ran slower but that isn't new or surprising]</p></htmltext>
<tokenext>I blame the ARB and Khronos for that .
OpenGL used to be cutting edge and DirectX was the one playing catch up .
DirectX used to really suck until around DX6 where the 2D acceleration interfaces got good enough to be programmer friendly ( Using OGL for 2D is possible but not everyone had 3D accelerators in those days which would have made performance suck for most people ) .
DX7 had reasonable 3D support , people ported their 2D engines to use Direct3D as an addon but were generally still 2D at the core [ simpler migration path ] .
The problem arrived with DX8 - &gt; MS added hardware accelerated vertex and pixel shaders , that came out of left field and allowed DX to overtake OpenGL on features ; OGL has been playing catch up ever since .
OpenGL3 was supposed to be a complete redesign to make OGL object oriented like DirectX ( but better since it is n't as in-your-face low level ) but that got kicked in the nuts at some point and so we 're stuck with the retarded 90s era State Engine system .
It also does n't help that ATi , the kings of shitty drivers , have worse OpenGL performance than DirectX on Windows ( A program that can use both DX and OGL will run slower using the OGL API ) .
Intel 's ( as though it is worth mentioning... ) is flaky but their chips suck on both APIs so it does n't make much difference .
Only nVidia 's OGL support is about as good as their DirectX support .
[ Of course , there was also the usual fun anti-trust stuff from Microsoft where they deliberately redesigned the OpenGL bindings on Windows so that OpenGL drivers were more complex and ran slower but that is n't new or surprising ]</tokentext>
<sentencetext>I blame the ARB and Khronos for that.
OpenGL used to be cutting edge and DirectX was the one playing catch up.
DirectX used to really suck until around DX6 where the 2D acceleration interfaces got good enough to be programmer friendly (Using OGL for 2D is possible but not everyone had 3D accelerators in those days which would have made performance suck for most people).
DX7 had reasonable 3D support, people ported their 2D engines to use Direct3D as an addon but were generally still 2D at the core [simpler migration path].
The problem arrived with DX8 -&gt; MS added hardware accelerated vertex and pixel shaders, that came out of left field and allowed DX to overtake OpenGL on features; OGL has been playing catch up ever since.
OpenGL3 was supposed to be a complete redesign to make OGL object oriented like DirectX (but better since it isn't as in-your-face low level) but that got kicked in the nuts at some point and so we're stuck with the retarded 90s era State Engine system.
It also doesn't help that ATi, the kings of shitty drivers, have worse OpenGL performance than DirectX on Windows (A program that can use both DX and OGL will run slower using the OGL API).
Intel's (as though it is worth mentioning...) is flaky but their chips suck on both APIs so it doesn't make much difference.
Only nVidia's OGL support is about as good as their DirectX support.
[Of course, there was also the usual fun anti-trust stuff from Microsoft where they deliberately redesigned the OpenGL bindings on Windows so that OpenGL drivers were more complex and ran slower but that isn't new or surprising]</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403528</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036</id>
	<title>Maybe</title>
	<author>Anonymous</author>
	<datestamp>1268075880000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>3</modscore>
	<htmltext><p>I wouldn't be surprised if most game devs wouldn't implement PhysX if not for a subsidy.  Only half the market is going to be able to take advantage of it after all.  It may not be that they don't want it, just that it's not an economical use of their time otherwise.</p></htmltext>
<tokenext>I would n't be surprised if most game devs would n't implement PhysX if not for a subsidy .
Only half the market is going to be able to take advantage of it after all .
It may not be that they do n't want it , just that it 's not an economical use of their time otherwise .</tokentext>
<sentencetext>I wouldn't be surprised if most game devs wouldn't implement PhysX if not for a subsidy.
Only half the market is going to be able to take advantage of it after all.
It may not be that they don't want it, just that it's not an economical use of their time otherwise.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403498</id>
	<title>Re:Maybe</title>
	<author>Aladrin</author>
	<datestamp>1268077920000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Everyone seems to be glossing over a nice little fact:</p><p>PhysX works on -all- modern Windows computers, whether they have a graphics accelerator or not.  So yes, only half the market can use the hardware-accelerated PhysX, but the other half isn't barred from the game.  They get to play, too.</p></htmltext>
<tokenext>Everyone seems to be glossing over a nice little fact :
Physx works on -all- modern Windows computers , whether they have a graphics accelerator or not .
So yes , only half the market can use the hardware accelerated Physx , but the other half is n't barred from the game .
They get to play , too .</tokentext>
<sentencetext>Everyone seems to be glossing over a nice little fact:
Physx works on -all- modern Windows computers, whether they have a graphics accelerator or not.
So yes, only half the market can use the hardware accelerated Physx, but the other half isn't barred from the game.
They get to play, too.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124</parent>
</comment>
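The degrade-gracefully behaviour described above reduces to a simple capability gate: the core simulation always runs, and the extra effects switch on only when acceleration is detected. A hedged sketch in standard C++ follows; hardwareAccelerated() is a stand-in, since real code would query the driver or the physics runtime.

    // Sketch: everyone gets the game; only accelerated machines get the extras.
    #include <cstdio>

    bool hardwareAccelerated() { return false; }  // stand-in for a driver query

    void simulateGameplay()     { /* always runs; the CPU is enough */ }
    void simulateExtraEffects() { /* cloth, debris, fog: optional eye candy */ }

    int main() {
        simulateGameplay();                  // every customer gets the game itself
        if (hardwareAccelerated())
            simulateExtraEffects();          // only the accelerated half gets extras
        else
            std::puts("running without hardware physics effects");
    }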
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404684</id>
	<title>Optional Features are expensive</title>
	<author>happy_place</author>
	<datestamp>1268039820000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>The game development cycle is already airtight: competition is fierce, and every new feature is old news in a few months, when your competitors' games catch up. They hardly have time to test games these days. Every day the game isn't on the market is money lost. And it's hard enough to debug a game across the standard set of PCs; now add specific hardware configurations with specific feature sets, and you've got a testing nightmare. And what if there's a bug? What sort of support comes, if at all? It's more likely that project management will instruct the testers/devs to turn off the feature and go gold.

--Ray</htmltext>
<tokenext>The game development cycle is already airtight , because competition is fierce , and every new feature is old news in a few months , when your competitor games catch up .
They hardly have time to test games , these days .
Every day the game is n't on the market is money lost .
And it 's hard enough to debug a game with all the standard set of PC 's , now add to that specific hardware configurations with specific feature sets , and you 've got a testing nightmare .
And what if there 's a bug ?
what sort of support comes , if at all ?
it 's more likely the game project management will instruct the testers/devs to turn off the feature and go gold .
--Ray</tokentext>
<sentencetext>The game development cycle is already airtight, because competition is fierce, and every new feature is old news in a few months, when your competitor games catch up.
They hardly have time to test games, these days.
Every day the game isn't on the market is money lost.
And it's hard enough to debug a game with all the standard set of PC's, now add to that specific hardware configurations with specific feature sets, and you've got a testing nightmare.
And what if there's a bug?
what sort of support comes, if at all?
it's more likely the game project management will instruct the testers/devs to turn off the feature and go gold.
--Ray</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404030</id>
	<title>Pure conjecture, but--</title>
	<author>Anonymous</author>
	<datestamp>1268080320000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>What strikes me as the real issue here is that game devs don't want to invest time in a proprietary graphics API that only a portion of their targeted demographic will have access to.</p><p>E.g., they don't want to use PhysX when some non-trivial percentage of their customers will have AMD/ATI video.</p><p>It doesn't matter if PhysX would allow them to do real-time particle simulations of shrapnel, and thus create a more immersive FPS -- if some non-trivial percentage of their target audience cannot make use of that technology, then they need to be compensated for that loss.</p><p>To me, it isn't that PhysX is bad; it's that it is nVidia-only.</p><p>Take a short history lesson from the 1990s, when DOS games were the rage.  Before the VESA people created industry standards for high-resolution video modes from DOS, certain games would only work on certain hardware; same situation with audio capabilities.  You wanted wavetable audio?  Too damn bad, unless you had a genuine MT32 plugged into your soundcard, or an actual AWE soundcard.</p><p>All that changed when Microsoft proposed DirectX for Windows gaming.</p><p>Early versions of DirectX were, indeed, total shit.  Now, however, it is a mature API, and it is the primary target for game developers because of its uniformity and ubiquity.  It doesn't matter what random POS hardware is in there; as long as it has a DirectX driver, the game will at least start and display a picture.</p><p>What needs to happen, with "processing on the GPU" taking off, is that a standardized, hardware-agnostic implementation gets drafted and approved.</p><p>Otherwise, it's just a case of yet another patent-trolling, market-playing pissing match between two or more squabbling children.</p><p>"He has to BUY PEOPLE OFF! His stuff is OBVIOUSLY crap! (Use my free license!)"</p><p>What we actually need is a platform- and hardware-agnostic API for doing GPU processing tasks, not vendor kickbacks like paid incentives or free licenses.</p><p>I find both players equally culpable in this debacle.</p></htmltext>
<tokenext>What strikes me as the real issue here , is that game devs dont want to invest time on a proprietary graphics API that only a portion of their potentially targeted demographic will have access to .
EG , They do n't want to use PhysX , when some non-trivial percentage of their customers will have AMD/ATI video .
It does n't matter if PhysX would allow them to do real-time particle simulations of shrapnel , and thus create a more immersive FPS -- If some non-trivial percentage of their target audience can not make use of that technology , then they need to be compensated for that loss .
To me , it is n't that PhysX is bad ; it is that it is nVidia Only .
Take a short history lesson from the 1990s , when DOS games were the rage .
Prior to the VESA standards people creating industry standards for high resolution video modes from DOS , certain games would only work on certain hardware ; Same situation with audio capabilities .
You wanted wavetable audio ?
Too damn bad unless you have a genuine MT32 plugged into your soundcard , or you have an actual AWE soundcard .
All that changed when Microsoft proposed DirectX for windows gaming .
Early versions of DirectX were indeed ; total shit .
Now , however , it is a well mature API , and is the primary target for game developers , because of its uniformity and ubiquity .
It doesnt matter what random POS hardware is in there , as long as it has a directX driver ; the game will at least start , and display a picture .
What needs to happen with " Processing on the GPU " taking off , is that a standardized implementation that is hardware agnostic needs to be drafted and approved .
Otherwise , it 's just a case of yet another patent trolling , market playing pissing match between two or more squabbling children .
" He has to BUY PEOPLE OFF !
His stuff is OBVIOUSLY crap !
( Use my free license !
) " What we actually need is a platform and hardware agnostic API for doing GPU processing tasks .
Not vendor kickbacks , like paid incentives or free licenses .
I find both players equally culpable in this debacle .</tokentext>
<sentencetext>What strikes me as the real issue here, is that game devs dont want to invest time on a proprietary graphics API that only a portion of their potentially targeted demographic will have access to.
EG, They don't want to use PhysX, when some non-trivial percentage of their customers will have AMD/ATI video.
It doesn't matter if PhysX would allow them to do real-time particle simulations of shrapnel, and thus create a more immersive FPS -- If some non-trivial percentage of their target audience cannot make use of that technology, then they need to be compensated for that loss.
To me, it isn't that PhysX is bad; it is that it is nVidia Only.
Take a short history lesson from the 1990s, when DOS games were the rage.
Prior to the VESA standards people creating industry standards for high resolution video modes from DOS, certain games would only work on certain hardware;  Same situation with audio capabilities.
You wanted wavetable audio?
Too damn bad unless you have a genuine MT32 plugged into your soundcard, or you have an actual AWE soundcard.All that changed when Microsoft proposed DirectX for windows gaming.Early versions of DirectX were indeed; total shit.
Now, however, it is a well mature API, and is the primary target for game developers, because of its uniformity and ubiquity.
It doesnt matter what random POS hardware is in there, as long as it has a directX driver; the game will at least start, and display a picture.What needs to happen with "Processing on the GPU" taking off, is that a standardized implementation that is hardware agnostic needs to be drafted and approved.Otherwise, it's just a case of yet another patent trolling, market playing pissing match between two or more squabbling children.
"He has to BUY PEOPLE OFF!
His stuff is OBVIOUSLY crap!
(Use my free license!
)"What we actually need is a platform and hardware agnostic API for doing GPU processing tasks.
Not vendor kickbacks, like paid incentives or free licenses.I find both players equally culpable in this debacle.</sentencetext>
</comment>
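<!-- For context on the platform- and hardware-agnostic GPU API the preceding comment calls for: a minimal sketch in C using OpenCL, the vendor-neutral standard AMD promotes in the article. This is an illustrative example, not part of any product discussed; it assumes an OpenCL runtime and headers are installed, and simply enumerates GPUs from every vendor's platform through one uniform API.

/* Enumerate all GPUs visible through OpenCL, regardless of vendor.
 * Build with: cc opencl_list.c -lOpenCL */
#include <stdio.h>
#include <CL/cl.h>

int main(void) {
    cl_platform_id platforms[8];
    cl_uint num_platforms = 0;
    /* Each installed driver (AMD, Nvidia, ...) exposes itself as a platform. */
    clGetPlatformIDs(8, platforms, &num_platforms);

    for (cl_uint p = 0; p < num_platforms; p++) {
        cl_device_id devices[8];
        cl_uint num_devices = 0;
        /* CL_DEVICE_TYPE_GPU matches any vendor's GPU: no per-vendor lock-in. */
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8,
                           devices, &num_devices) != CL_SUCCESS)
            continue;
        for (cl_uint d = 0; d < num_devices; d++) {
            char name[256];
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                            sizeof(name), name, NULL);
            printf("GPU: %s\n", name);
        }
    }
    return 0;
}

The same compiled binary finds an AMD or an Nvidia card without change, which is the uniformity argument the commenter makes about DirectX drivers. -->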
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403480</id>
	<title>Re:They wish they'd thought of it first</title>
	<author>Anonymous</author>
	<datestamp>1268077800000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I heard game devs are only incorporating the "third dimension" into games because of the money, not because they want to.</htmltext>
<tokenext>I heard game devs are only incorporating the " third dimension " into games because of the money , not because they want to .</tokentext>
<sentencetext>I heard game devs are only incorporating the "third dimension" into games because of the money, not because they want to.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403680</id>
	<title>Is 'Incentivizing' Anti-Competitive?</title>
	<author>mpapet</author>
	<datestamp>1268078640000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>This kind of incentive is anti-competitive.</p><p>1. It eliminates competition by feature/functionality.<br>2. It meaningfully constrains innovation. A novel product without the capital to participate is shut out. (That's the goal, anyway.)</p><p>That said, this kind of incentivizing is everywhere (game consoles, mega-retailers, mobile phones). No one seems to care about the increased costs consumers assume or the constraint on innovation.</p><p>I have my bias; what is yours?</p></htmltext>
<tokenext>This kind of incentive is anti-competitive .
1 .
It eliminates competition by feature/functionality .
2 .
It meaningfully constrains innovation .
A novel product without the capital to participate is shut out .
( That 's the goal , anyway . )
That said , this kind of incentivizing is everywhere ( game consoles , mega-retailers , mobile phones ) .
No one seems to care about the increased costs consumers assume or the constraint on innovation .
I have my bias ; what is yours ?</tokentext>
<sentencetext>This kind of incentive is anti-competitive.
1. It eliminates competition by feature/functionality.
2. It meaningfully constrains innovation.
A novel product without the capital to participate is shut out.
(That's the goal, anyway.)
That said, this kind of incentivizing is everywhere (game consoles, mega-retailers, mobile phones).
No one seems to care about the increased costs consumers assume or the constraint on innovation.
I have my bias; what is yours?</sentencetext>
</comment>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_28</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406006
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403934
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403488
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_23</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31412222
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403488
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_27</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403656
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406736
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403488
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_15</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406250
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31405300
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403208
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403062
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_25</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403766
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403488
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31409308
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_21</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31407908
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403946
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403178
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_19</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404042
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403488
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_16</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406622
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403946
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403610
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31405718
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403498
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31410788
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404470
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403488
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_17</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404188
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403176
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403296
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403480
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406222
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403498
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403384
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_18</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31411294
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403946
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31417356
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403498
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404758
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_22</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403428
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_26</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403502
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403172
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31411454
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31407414
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_20</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31410796
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403528
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_08_1636259_24</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403962
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034
</commentlist>
</thread>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_08_1636259.2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404038
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_08_1636259.0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31405408
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_08_1636259.3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403034
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403176
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403480
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31409308
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403962
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403178
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403172
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403502
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404758
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_08_1636259.6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403680
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_08_1636259.1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403062
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403208
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31405300
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406250
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_08_1636259.4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403036
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31407414
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31411454
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403124
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403946
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31411294
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31407908
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406622
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403656
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403498
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406222
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31405718
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31417356
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403528
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31410796
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31410788
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403610
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_08_1636259.7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403488
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406736
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404042
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31412222
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404470
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403934
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31406006
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403766
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_08_1636259.5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403156
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403384
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403296
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31404188
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_08_1636259.31403428
</commentlist>
</conversation>
