<article>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#article10_01_16_2226235</id>
	<title>AMD Delivers DX11 Graphics Solution For Under $100</title>
	<author>timothy</author>
	<datestamp>1263637800000</datestamp>
	<htmltext>Vigile points out yesterday's launch of <i>"the new AMD Radeon HD 5670, the <a href="http://www.pcper.com/article.php?aid=857&amp;type=overview">first graphics card to bring DirectX 11 support to the sub-$100 market</a> and offer next-generation features to almost any budget. The Redwood part (as it was codenamed) is nearly 3.5x smaller in die size than <a href="http://games.slashdot.org/story/09/09/23/1530251/AMD-Radeon-HD-5870-Adds-DX11-Multi-Monitor-Gaming">the first DX11 GPUs from AMD</a> while still offering support for DirectCompute 5.0, Eyefinity multi-monitor gaming and of course DX11 features (like tessellation) in upcoming Windows gaming titles. Unfortunately, <a href="http://www.pcper.com/article.php?aid=857&amp;type=expert&amp;pid=4">performance on the card is not revolutionary</a> even for the $99 graphics market, though power consumption has been noticeably lowered while keeping the card well cooled in a single-slot design."</i></htmltext>
<tokentext>Vigile points out yesterday 's launch of " the new AMD Radeon HD 5670 , the first graphics card to bring DirectX 11 support to the sub- $ 100 market and offer next-generation features to almost any budget .
The Redwood part ( as it was codenamed ) is nearly 3.5x smaller in die size than the first DX11 GPUs from AMD while still offering support for DirectCompute 5.0 , Eyefinity multi-monitor gaming and of course DX11 features ( like tessellation ) in upcoming Windows gaming titles .
Unfortunately , performance on the card is not revolutionary even for the $ 99 graphics market , though power consumption has been noticeably lowered while keeping the card well cooled in a single-slot design . "</tokentext>
<sentencetext>Vigile points out yesterday's launch of "the new AMD Radeon HD 5670, the first graphics card to bring DirectX 11 support to the sub-$100 market and offer next-generation features to almost any budget.
The Redwood part (as it was codenamed) is nearly 3.5x smaller in die size than the first DX11 GPUs from AMD while still offering support for DirectCompute 5.0, Eyefinity multi-monitor gaming and of course DX11 features (like tessellation) in upcoming Windows gaming titles.
Unfortunately, performance on the card is not revolutionary even for the $99 graphics market, though power consumption has been noticeably lowered while keeping the card well cooled in a single-slot design."</sentencetext>
</article>
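<!--
The datestamp fields in this dump are Unix epoch timestamps in milliseconds. A minimal Python sketch (standard library only, nothing assumed beyond the field format) decodes the article's datestamp above; the value checks out against the "10_01_16" date embedded in the article id:

from datetime import datetime, timezone

datestamp = 1263637800000  # <datestamp> of the article above, epoch milliseconds
dt = datetime.fromtimestamp(datestamp / 1000, tz=timezone.utc)
print(dt.isoformat())  # 2010-01-16T10:30:00+00:00
-->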
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795614</id>
	<title>Re:Why?</title>
	<author>iamhassi</author>
	<datestamp>1263657660000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><i>"Seriously, good for AMD, but I just don't see the point."</i>
<br> <br>
Not only that, but <a href="http://www.bit-tech.net/hardware/graphics/2010/01/14/ati-radeon-hd-5670-review/4" title="bit-tech.net">it's slower than the 8 month old $99 ATI Radeon HD 4770</a> [bit-tech.net]
<br> <br>
so if I bought the $99 ATI Radeon HD 4770 8 months ago, why would I spend $99 on a slower card now?</htmltext>
<tokenext>" Seriously , good for AMD , but I just do n't see the point .
" Not only that , but it 's slower than the 8 month old $ 99 ATI Radeon HD 4770 [ bit-tech.net ] so if I bought the $ 99 ATI Radeon HD 4770 8 months ago , why would I spend $ 99 on a slower card now ?</tokentext>
<sentencetext>"Seriously, good for AMD, but I just don't see the point.
"
 
Not only that, but it's slower than the 8 month old $99 ATI Radeon HD 4770 [bit-tech.net]
 
so if I bought the $99 ATI Radeon HD 4770 8 months ago, why would I spend $99 on a slower card now?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795468</id>
	<title>Re:State of AMD for HTPC Use?</title>
	<author>Jah-Wren Ryel</author>
	<datestamp>1263656160000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>I am an HTPC user, and ATI has always been a non-factor in that realm.</p> </div><p>Not in Windows.  MPC-HC's hardware acceleration has worked better with ATI chips than with Nvidia until just recently.  The biggest sticking point was that VC1 bitstreaming (where you hand the entire bitstream to the gpu for decoding and display, rather than accelerating just parts of it like iDCT) didn't work on any nvidia gpus except the embedded ones - that did change with their most recent hardware release a couple of months ago, but ATI's had support for bitstreaming VC1 in their regular cards for at least a year.</p></div>
	</htmltext>
<tokentext>I am an HTPC user , and ATI has always been a non-factor in that realm .
Not in Windows .
MPC-HC 's hardware acceleration has worked better with ATI chips than with Nvidia until just recently .
The biggest sticking point was that VC1 bitstreaming ( where you hand the entire bitstream to the gpu for decoding and display , rather than accelerating just parts of it like iDCT ) did n't work on any nvidia gpus except the embedded ones - that did change with their most recent hardware release a couple of months ago , but ATI 's had support for bitstreaming VC1 in their regular cards for at least a year .</tokentext>
<sentencetext>I am an HTPC user, and ATI has always been a non-factor in that realm.
Not in Windows.
MPC-HC's hardware acceleration has worked better with ATI chips than with Nvidia until just recently.
The biggest sticking point was that VC1 bitstreaming (where you hand the entire bitstream to the gpu for decoding and display, rather than accelerating just parts of it like iDCT) didn't work on any nvidia gpus except the embedded ones - that did change with their most recent hardware release a couple of months ago, but ATI's had support for bitstreaming VC1 in their regular cards for at least a year.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794160</parent>
</comment>
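<!--
For readers unfamiliar with the distinction drawn in the comment above, here is a purely illustrative Python sketch of the idea; the stage names are hypothetical and do not correspond to any real driver or DXVA API. "Bitstreaming" hands the GPU the whole pipeline starting at entropy decoding, while partial acceleration such as iDCT offload leaves the early stages on the CPU:

# Hypothetical decode pipeline stages, in order; not a real API.
STAGES = ["entropy_decode", "inverse_quant", "idct", "motion_comp", "deblock"]

def split_pipeline(first_gpu_stage):
    """Return (cpu_stages, gpu_stages) given where the GPU takes over."""
    i = STAGES.index(first_gpu_stage)
    return STAGES[:i], STAGES[i:]

# Partial acceleration (e.g. iDCT offload): the CPU still parses the bitstream.
print(split_pipeline("idct"))
# Full bitstream decoding: the entire compressed stream goes to the GPU.
print(split_pipeline("entropy_decode"))
-->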
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30799428</id>
	<title>Re:Whats the point?</title>
	<author>Anonymous</author>
	<datestamp>1263753000000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>&gt; barely keeping 30 FPS at 1680x1050</p><p>Watch it!  Those of us who used CGA graphics back in the dark ages are likely to start telling stories about walking uphill 10 miles to school  in 3' of snow, both ways...</p></htmltext>
<tokentext>&gt; barely keeping 30 FPS at 1680x1050
Watch it !
Those of us who used CGA graphics back in the dark ages are likely to start telling stories about walking uphill 10 miles to school in 3 ' of snow , both ways ...</tokentext>
<sentencetext>&gt; barely keeping 30 FPS at 1680x1050
Watch it!
Those of us who used CGA graphics back in the dark ages are likely to start telling stories about walking uphill 10 miles to school  in 3' of snow, both ways...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794336</id>
	<title>Re:Whats the point?</title>
	<author>Dragoniz3r</author>
	<datestamp>1263645000000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>HardOCP agrees <a href="http://hardocp.com/article/2010/01/14/amds_ati_radeon_hd_5670_review/8" title="hardocp.com" rel="nofollow">http://hardocp.com/article/2010/01/14/amds_ati_radeon_hd_5670_review/8</a> [hardocp.com]</htmltext>
<tokentext>HardOCP agrees http : //hardocp.com/article/2010/01/14/amds_ati_radeon_hd_5670_review/8 [ hardocp.com ]</tokentext>
<sentencetext>HardOCP agrees http://hardocp.com/article/2010/01/14/amds_ati_radeon_hd_5670_review/8 [hardocp.com]</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794148</id>
	<title>Re:Compiz is all I need.</title>
	<author>Anonymous</author>
	<datestamp>1263643740000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>For someone who doesn't care about 3d games you seem to have quite strong emotions about direct x and gamers.</p></htmltext>
<tokentext>For someone who does n't care about 3d games you seem to have quite strong emotions about direct x and gamers .</tokentext>
<sentencetext>For someone who doesn't care about 3d games you seem to have quite strong emotions about direct x and gamers.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793986</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794194</id>
	<title>Re:Why?</title>
	<author>Anonymous</author>
	<datestamp>1263644040000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Same thing was said about DX10. And about HD4670.</p></htmltext>
<tokentext>Same thing was said about DX10 .
And about HD4670 .</tokentext>
<sentencetext>Same thing was said about DX10.
And about HD4670.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794754</id>
	<title>Re:State of AMD for HTPC Use?</title>
	<author>Anonymous</author>
	<datestamp>1263648120000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>No, they still don't. They suck.</htmltext>
<tokentext>No , they still do n't .
They suck .</tokentext>
<sentencetext>No, they still don't.
They suck.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794160</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793986</id>
	<title>Compiz is all I need.</title>
	<author>GNUALMAFUERTE</author>
	<datestamp>1263642600000</datestamp>
	<modclass>Flamebait</modclass>
	<modscore>0</modscore>
	<htmltext><p>And even my cheap integrated Intel 945 can run it in full glory at 1920x1080.</p><p>About games ... Chess doesn't require OpenGL.</p><p>The thing I'm most worried about is how in the last two years everyone has accepted DirectShit. It's micro$hit technology! It's not open, not cross-platform, and you all know it's meant to screw you up. This is IE all over again. We had a beautiful standard, called HTML. Micro$hit convinced people to use their stupid proprietary extensions, and in a few years we had destroyed the web. It took us YEARS to get back on track, destroy explorer, and get the web to be standards compliant again. Now people are doing the same all over again, displacing OpenGL because it's "Obsolete" and letting micro$hit rule hardware production with DirectShit-compatible devices.</p><p>I hate Gamers, and I hate the kind of people that talk about video cards all day. For fuck's sake, if you want to play games get a Famicom or that shitty new alternative, I believe it's called playstation or something.</p></htmltext>
<tokentext>And even my cheap integrated Intel 945 can run it in full glory at 1920x1080 .
About games ... Chess does n't require OpenGL .
The thing I 'm most worried about is how in the last two years everyone has accepted DirectShit .
It 's micro $ hit technology !
It 's not open , not cross-platform , and you all know it 's meant to screw you up .
This is IE all over again .
We had a beautiful standard , called HTML .
Micro $ hit convinced people to use their stupid proprietary extensions , and in a few years we had destroyed the web .
It took us YEARS to get back on track , destroy explorer , and get the web to be standards compliant again .
Now people are doing the same all over again , displacing OpenGL because it 's " Obsolete " and letting micro $ hit rule hardware production with DirectShit-compatible devices .
I hate Gamers , and I hate the kind of people that talk about video cards all day .
For fuck 's sake , if you want to play games get a Famicom or that shitty new alternative , I believe it 's called playstation or something .</tokentext>
<sentencetext>And even my cheap integrated Intel 945 can run it in full glory at 1920x1080.
About games ... Chess doesn't require OpenGL.
The thing I'm most worried about is how in the last two years everyone has accepted DirectShit.
It's micro$hit technology!
It's not open, not cross-platform, and you all know it's meant to screw you up.
This is IE all over again.
We had a beautiful standard, called HTML.
Micro$hit convinced people to use their stupid proprietary extensions, and in a few years we had destroyed the web.
It took us YEARS to get back on track, destroy explorer, and get the web to be standards compliant again.
Now people are doing the same all over again, displacing OpenGL because it's "Obsolete" and letting micro$hit rule hardware production with DirectShit-compatible devices.
I hate Gamers, and I hate the kind of people that talk about video cards all day.
For fuck's sake, if you want to play games get a Famicom or that shitty new alternative, I believe it's called playstation or something.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794124</id>
	<title>Re:Whats the point?</title>
	<author>Anonymous</author>
	<datestamp>1263643560000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Tom's Hardware ceased to be informative years ago; nowadays they are just an nVidia/Intel advertiser.</p></htmltext>
<tokentext>Tom 's Hardware ceased to be informative years ago ; nowadays they are just an nVidia/Intel advertiser .</tokentext>
<sentencetext>Tom's Hardware ceased to be informative years ago; nowadays they are just an nVidia/Intel advertiser.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30800230</id>
	<title>Re:I don't really keep up with games...</title>
	<author>Lonewolf666</author>
	<datestamp>1263759180000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>then try rotating and moving them in three dimensions instead of two</p></div><p>Rotation and movement in three dimensions is nothing new for computer graphics. Even if you just walk up and down some stairs, and look around and up/down in an game, the graphics engine has to do complete calculations for three dimensions.</p><p>A well known game which actually limited things (mostly) to two dimensions was Doom:<br>In Doom, you cannot look up/down and there is no perspective adjustment for vertical parts of the environment - those always appear as parallel on the screen. That was one of the tricks which allowed Doom to run on a 80386 with acceptable frame rates. But those days are over in game development. Half-Life 1 had a full 3D engine ten years ago.</p></div>
	</htmltext>
<tokentext>then try rotating and moving them in three dimensions instead of two
Rotation and movement in three dimensions is nothing new for computer graphics .
Even if you just walk up and down some stairs , and look around and up/down in a game , the graphics engine has to do complete calculations for three dimensions .
A well known game which actually limited things ( mostly ) to two dimensions was Doom : In Doom , you can not look up/down and there is no perspective adjustment for vertical parts of the environment - those always appear as parallel on the screen .
That was one of the tricks which allowed Doom to run on an 80386 with acceptable frame rates .
But those days are over in game development .
Half-Life 1 had a full 3D engine ten years ago .</tokentext>
<sentencetext>then try rotating and moving them in three dimensions instead of two
Rotation and movement in three dimensions is nothing new for computer graphics.
Even if you just walk up and down some stairs, and look around and up/down in a game, the graphics engine has to do complete calculations for three dimensions.
A well known game which actually limited things (mostly) to two dimensions was Doom: In Doom, you cannot look up/down and there is no perspective adjustment for vertical parts of the environment - those always appear as parallel on the screen.
That was one of the tricks which allowed Doom to run on an 80386 with acceptable frame rates.
But those days are over in game development.
Half-Life 1 had a full 3D engine ten years ago.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794560</parent>
</comment>
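<!--
A minimal Python sketch of the rendering trick described in the comment above (an illustrative toy projector, not Doom's actual code): with no look up/down, a wall slice projects to a vertical screen column whose height depends only on 1/distance, so vertical edges always stay parallel on screen:

import math

SCREEN_H = 200
FOV = math.pi / 2  # assumed 90 degree horizontal field of view

def wall_column(dist, wall_height=1.0, eye_height=0.5):
    """Project a wall slice at perpendicular distance dist to a screen column.
    There is no vertical perspective term, only a 1/dist height scale, so
    every vertical wall edge maps to a perfectly vertical screen segment."""
    scale = (SCREEN_H / 2) / math.tan(FOV / 2)
    top = SCREEN_H / 2 - (wall_height - eye_height) * scale / dist
    bottom = SCREEN_H / 2 + eye_height * scale / dist
    return int(top), int(bottom)

for d in (1.0, 2.0, 4.0):
    print(d, wall_column(d))  # columns shrink with distance but never tilt
-->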
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794160</id>
	<title>State of AMD for HTPC Use?</title>
	<author>tji</author>
	<datestamp>1263643860000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>4</modscore>
	<htmltext><p>I'm not a gamer, so the 3D features are not important to me.  I am an HTPC user, and ATI has always been a non-factor in that realm.  So, I haven't paid any attention to their releases for the last few years.</p><p>Has there been any change in video acceleration in Linux with AMD?  Do they have any support for XvMC, VDPAU, or anything else usable in Linux?</p></htmltext>
<tokentext>I 'm not a gamer , so the 3D features are not important to me .
I am an HTPC user , and ATI has always been a non-factor in that realm .
So , I have n't paid any attention to their releases for the last few years .
Has there been any change in video acceleration in Linux with AMD ?
Do they have any support for XvMC , VDPAU , or anything else usable in Linux ?</tokentext>
<sentencetext>I'm not a gamer, so the 3D features are not important to me.
I am an HTPC user, and ATI has always been a non-factor in that realm.
So, I haven't paid any attention to their releases for the last few years.
Has there been any change in video acceleration in Linux with AMD?
Do they have any support for XvMC, VDPAU, or anything else usable in Linux?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30799294</id>
	<title>Monkey business</title>
	<author>Anonymous</author>
	<datestamp>1263751800000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Take a gander at this <a href="http://www.youtube.com/watch?v=1DPQW0e9ufM" title="youtube.com" rel="nofollow">video card upgrade tutorial</a> [youtube.com] provided by AMD for some insight into the target market for this card.</p></htmltext>
<tokentext>Take a gander at this video card upgrade tutorial [ youtube.com ] provided by AMD for some insight into the target market for this card .</tokentext>
<sentencetext>Take a gander at this video card upgrade tutorial [youtube.com] provided by AMD for some insight into the target market for this card.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796916</id>
	<title>Re:Why?</title>
	<author>Hadlock</author>
	<datestamp>1263723660000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The real reason DX10 never took off is that nobody could tell the difference between DX9 and DX10 screenshots.</p></htmltext>
<tokentext>The real reason DX10 never took off is that nobody could tell the difference between DX9 and DX10 screenshots .</tokentext>
<sentencetext>The real reason DX10 never took off is that nobody could tell the difference between DX9 and DX10 screenshots.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794926</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795748</id>
	<title>Re:Whats the point?</title>
	<author>Shanrak</author>
	<datestamp>1263659220000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Yet the majority of their monthly "best video card for the money" picks are ATI; I fail to see how they are nvidia advertisers.</htmltext>
<tokentext>Yet the majority of their monthly " best video card for the money " picks are ATI ; I fail to see how they are nvidia advertisers .</tokentext>
<sentencetext>Yet the majority of their monthly "best video card for the money" picks are ATI; I fail to see how they are nvidia advertisers.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794124</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793872</id>
	<title>I don't really keep up with games...</title>
	<author>h4rm0ny</author>
	<datestamp>1263641820000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>... so somebody tell me if we actually have any that can really take advantage of the latest, greatest graphics cards yet? Seems like the hardware is outpacing the software, doesn't it?</htmltext>
<tokentext>... so somebody tell me if we actually have any that can really take advantage of the latest , greatest graphics cards yet ?
Seems like the hardware is outpacing the software , does n't it ?</tokentext>
<sentencetext>... so somebody tell me if we actually have any that can really take advantage of the latest, greatest graphics cards yet?
Seems like the hardware is outpacing the software, doesn't it?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30797864</id>
	<title>Re:ATI is awful</title>
	<author>Lonewolf666</author>
	<datestamp>1263739740000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>AMD is now supporting the development of Open Source drivers, and has released a lot of specifications to make this possible. On the other hand, it is true that they have dropped support for older cards in their proprietary drivers. It seems they want to switch their Linux drivers from proprietary to Open Source.</p><p>Such Open Source Linux drivers are available by now for many ATI cards. For Ubuntu, see this list:<br><a href="https://help.ubuntu.com/community/RadeonDriver" title="ubuntu.com">https://help.ubuntu.com/community/RadeonDriver</a> [ubuntu.com]<br>The older cards are well supported while the new ones still don't have 3D acceleration in the Open Source drivers.</p></htmltext>
<tokentext>AMD is now supporting the development of Open Source drivers , and has released a lot of specifications to make this possible .
On the other hand , it is true that they have dropped support for older cards in their proprietary drivers .
It seems they want to switch their Linux drivers from proprietary to Open Source .
Such Open Source Linux drivers are available by now for many ATI cards .
For Ubuntu , see this list : https : //help.ubuntu.com/community/RadeonDriver [ ubuntu.com ]
The older cards are well supported while the new ones still do n't have 3D acceleration in the Open Source drivers .</tokentext>
<sentencetext>AMD is now supporting the development of Open Source drivers, and has released a lot of specifications to make this possible.
On the other hand, it is true that they have dropped support for older cards in their proprietary drivers.
It seems they want to switch their Linux drivers from proprietary to Open Source.
Such Open Source Linux drivers are available by now for many ATI cards.
For Ubuntu, see this list: https://help.ubuntu.com/community/RadeonDriver [ubuntu.com]
The older cards are well supported while the new ones still don't have 3D acceleration in the Open Source drivers.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796854</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30799328</id>
	<title>Re:Whats the point?</title>
	<author>hkmwbz</author>
	<datestamp>1263751980000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>barely keeping 30 FPS at 1680x1050</p></div></blockquote><p>
30 fps at 1680x1050 sounds fucking amazing to me. I would probably just run it at 1200x1024 or something anyway. But when did 30 fps become bad?</p>
	</htmltext>
<tokentext>barely keeping 30 FPS at 1680x1050
30 fps at 1680x1050 sounds fucking amazing to me .
I would probably just run it at 1200x1024 or something anyway .
But when did 30 fps become bad ?</tokentext>
<sentencetext>barely keeping 30 FPS at 1680x1050
30 fps at 1680x1050 sounds fucking amazing to me.
I would probably just run it at 1200x1024 or something anyway.
But when did 30 fps become bad?
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795434</id>
	<title>Re:Why?</title>
	<author>TJamieson</author>
	<datestamp>1263655980000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>... hardware tessellation and compute shaders ...</p></div><p>Compute Shader for Shader Model 5.0, yes. However, starting with Catalyst 9.12 (December 2009) the HD48xx cards have driver support for CS for SM4.0. Regardless, afaik, no one is using either presently. Would be interesting to see a new physics engine that ran on this; PhysX for Compute Shaders I guess.</p>
	</htmltext>
<tokentext>... hardware tessellation and compute shaders ...
Compute Shader for Shader Model 5.0 , yes .
However , starting with Catalyst 9.12 ( December 2009 ) the HD48xx cards have driver support for CS for SM4.0 .
Regardless , afaik , no one is using either presently .
Would be interesting to see a new physics engine that ran on this ; PhysX for Compute Shaders I guess .</tokentext>
<sentencetext>... hardware tessellation and compute shaders ...
Compute Shader for Shader Model 5.0, yes.
However, starting with Catalyst 9.12 (December 2009) the HD48xx cards have driver support for CS for SM4.0.
Regardless, afaik, no one is using either presently.
Would be interesting to see a new physics engine that ran on this; PhysX for Compute Shaders I guess.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30797056</id>
	<title>Re:Who needs performance</title>
	<author>Hadlock</author>
	<datestamp>1263726180000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>anything above 720P <b>at distances greater than 10'</b> is useless. most people sit 18-24" away from their displays. you can most definitely tell the difference between a 1440x900, 1680x1050 and 1920x1200 pixel 24" diagonal display at 24" distance. you're correct that a 40", 1080p display for sports (i.e. general TV, not video games) in the living room is a waste of money, but for video games you will appreciate the 1080p (gui, etc). high resolutions for 22-27" displays on the desktop are very much wanted and very much useful.</p></htmltext>
<tokentext>anything above 720P at distances greater than 10 ' is useless .
most people sit 18-24 " away from their displays .
you can most definitely tell the difference between a 1440x900 , 1680x1050 and 1920x1200 pixel 24 " diagonal display at 24 " distance .
you 're correct that a 40 " , 1080p display for sports ( i.e .
general TV , not video games ) in the living room is a waste of money , but for video games you will appreciate the 1080p ( gui , etc ) .
high resolutions for 22-27 " displays on the desktop are very much wanted and very much useful .</tokentext>
<sentencetext>anything above 720P at distances greater than 10' is useless.
most people sit 18-24" away from their displays.
you can most definitely tell the difference between a 1440x900, 1680x1050 and 1920x1200 pixel 24" diagonal display at 24" distance.
you're correct that a 40", 1080p display for sports (i.e.
general TV, not video games) in the living room is a waste of money, but for video games you will appreciate the 1080p (gui, etc).
high resolutions for 22-27" displays on the desktop are very much wanted and very much useful.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795740</parent>
</comment>
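<!--
The resolution-versus-distance claims in the comment above can be sanity checked in Python with the common rule of thumb that 20/20 vision resolves about one arcminute of detail (an assumption; real acuity varies per person):

import math

def max_useful_horizontal_pixels(screen_width_in, distance_in, arcmin=1.0):
    """Pixels beyond this count sit closer together than one arcminute of
    visual angle at the given distance, so a typical eye cannot separate them."""
    pixel_pitch = distance_in * math.tan(math.radians(arcmin / 60.0))
    return screen_width_in / pixel_pitch

# A 24" 16:10 monitor is about 20.4" wide; at 24" viewing distance the eye can
# separate roughly 2900 horizontal pixels, so 1920x1200-class desktop
# resolutions are not wasted up close.
print(round(max_useful_horizontal_pixels(20.4, 24)))
# A 40" 16:9 TV is about 34.9" wide; at 10 feet it resolves only about 1000
# pixels across, consistent with "above 720p is useless past 10 feet".
print(round(max_useful_horizontal_pixels(34.9, 120)))
-->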
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794392</id>
	<title>AMD -=- ATI</title>
	<author>Anonymous</author>
	<datestamp>1263645420000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Anyone else still :%s/AMD/ATI/g when coming up on these stories?</p></htmltext>
<tokentext>Anyone else still :%s/AMD/ATI/g when coming up on these stories ?</tokentext>
<sentencetext>Anyone else still :%s/AMD/ATI/g when coming up on these stories?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796754</id>
	<title>Re:I don't really keep up with games...</title>
	<author>Anonymous</author>
	<datestamp>1263720540000</datestamp>
	<modclass>Informative</modclass>
	<modscore>1</modscore>
	<htmltext><p>Captain Obvious told me that while the 3870 cost $200+ 2+ years ago, this so-called "about as powerful as a 3870" card costs around $100, uses less power than the 3870, AND is listed on the x6xx line.</p><p>Truth be told, you're comparing apples and oranges here. 2+ years ago the 3870 series might have been the fastest card out there, and it should now be compared to a 5870.</p></htmltext>
<tokentext>Captain Obvious told me that while the 3870 cost $ 200 + 2 + years ago , this so-called " about as powerful as a 3870 " card costs around $ 100 , uses less power than the 3870 , AND is listed on the x6xx line .
Truth be told , you 're comparing apples and oranges here .
2 + years ago the 3870 series might have been the fastest card out there , and it should now be compared to a 5870 .</tokentext>
<sentencetext>Captain Obvious told me that while the 3870 cost $200+ 2+ years ago, this so-called "about as powerful as a 3870" card costs around $100, uses less power than the 3870, AND is listed on the x6xx line.
Truth be told, you're comparing apples and oranges here.
2+ years ago the 3870 series might have been the fastest card out there, and it should now be compared to a 5870.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793950</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796208</id>
	<title>Re:Why?</title>
	<author>voidphoenix</author>
	<datestamp>1263666540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Several (not-so-compelling) reasons: Eyefinity, DX11, lower power consumption. The 5670 would be good for budget HTPCs and for people with low-demand multi-monitor setups (like some of the people at work who run multiple spreadsheet-like apps concurrently). As well, the 4770 is scarce and it seems that many OEMs are discontinuing them. I'm thinking of grabbing another one for Crossfire before stocks run out at my supplier.</htmltext>
<tokentext>Several ( not-so-compelling ) reasons : Eyefinity , DX11 , lower power consumption .
The 5670 would be good for budget HTPCs and for people with low-demand multi-monitor setups ( like some of the people at work who run multiple spreadsheet-like apps concurrently ) .
As well , the 4770 is scarce and it seems that many OEMs are discontinuing them .
I 'm thinking of grabbing another one for Crossfire before stocks run out at my supplier .</tokentext>
<sentencetext>Several (not-so-compelling) reasons: Eyefinity, DX11, lower power consumption.
The 5670 would be good for budget HTPCs and for people with low-demand multi-monitor setups (like some of the people at work who run multiple spreadsheet-like apps concurrently).
As well, the 4770 is scarce and it seems that many OEMs are discontinuing them.
I'm thinking of grabbing another one for Crossfire before stocks run out at my supplier.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795614</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793950</id>
	<title>Re:I don't really keep up with games...</title>
	<author>Anonymous</author>
	<datestamp>1263642240000</datestamp>
	<modclass>Informative</modclass>
	<modscore>3</modscore>
	<htmltext><p>A lot of games will struggle on this card significantly.  It's about as powerful as a 3870 from 2+ years ago.</p></htmltext>
<tokentext>A lot of games will struggle on this card significantly .
It 's about as powerful as a 3870 from 2 + years ago .</tokentext>
<sentencetext>A lot of games will struggle on this card significantly.
It's about as powerful as a 3870 from 2+ years ago.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793872</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793844</id>
	<title>Anonymous Coward</title>
	<author>Anonymous</author>
	<datestamp>1263641700000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Great to see that Moore's Law still has some steam left for the GPU industry.</p></htmltext>
<tokentext>Great to see that Moore 's Law still has some steam left for the GPU industry .</tokentext>
<sentencetext>Great to see that Moore's Law still has some steam left for the GPU industry.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794780</id>
	<title>Re:Compiz is all I need.</title>
	<author>ClosedSource</author>
	<datestamp>1263648300000</datestamp>
	<modclass>Funny</modclass>
	<modscore>3</modscore>
	<htmltext><p>"We had a beautiful standard, called HTML. Micro$hit convinced people to use their stupid proprietary extensions, and in a few years we had destroyed the web."</p><p>Yes, XMLHttpRequest that MS came up with which made AJAX possible is just another stupid extension. We should use only "beautiful" HTML.</p></htmltext>
<tokenext>" We had a beautiful standard , called HTML .
Micro $ hit convinced people to use their stupid proprietary extensions , and in a few years we had destroyed the web . "
Yes , XMLHttpRequest that MS came up with which made AJAX possible is just another stupid extension .
We should use only " beautiful " HTML .</tokentext>
<sentencetext>"We had a beautiful standard, called HTML.
Micro$hit convinced people to use their stupid proprietary extensions, and in a few years we had destroyed the web."
Yes, XMLHttpRequest that MS came up with which made AJAX possible is just another stupid extension.
We should use only "beautiful" HTML.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793986</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795818</id>
	<title>Re:Whats the point?</title>
	<author>YojimboJango</author>
	<datestamp>1263660060000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>4</modscore>
	<htmltext><p>I'd like to point out something in that review. The only benchmarks where this card ever goes below a 30fps minimum are Crysis and Far Cry 2 at 1920x1200 running in DX9 mode (instead of DX10, where the card is more likely to shine). Also, they list the GeForce 9600 as getting a 40.7fps average while playing DIRT in DX11. The GeForce 9600 does not support DX11.</p><div class="quote"><p>In DirectX 9 mode, the Radeon HD 5670 is once again keeping pace with the GeForce 9800 GT, delivering playable performance all the way to 1920x1200. However, once DirectX 11 features are enabled, the latest Radeon slows to a crawl. Even the powerful Radeon HD 5750 has difficulty keeping the minimum frame rate above 30 fps at 1680x1050.</p></div><p>They pretty much tell us that they're testing these cards using higher settings for the ATI parts. Also, on the review's front page it tells us that they've under-clocked all the cards before testing. Why would anyone take their reviews seriously after actually reading that?</p><p>Not like I'm an ATI fanboy here either; my current and last 3 video cards were all Nvidia (I was close to getting a 4850 about a year ago, but newegg had a sweet sale on the GTX260). It's just that this level of sleaze really pisses me off.</p>
	</htmltext>
<tokentext>I 'd like to point out something in that review .
The only benchmarks where this card ever goes below a 30fps minimum are Crysis and Far Cry 2 at 1920x1200 running in DX9 mode ( instead of DX10 , where the card is more likely to shine ) .
Also , they list the GeForce 9600 as getting a 40.7fps average while playing DIRT in DX11 .
The GeForce 9600 does not support DX11 .
In DirectX 9 mode , the Radeon HD 5670 is once again keeping pace with the GeForce 9800 GT , delivering playable performance all the way to 1920x1200 .
However , once DirectX 11 features are enabled , the latest Radeon slows to a crawl .
Even the powerful Radeon HD 5750 has difficulty keeping the minimum frame rate above 30 fps at 1680x1050 .
They pretty much tell us that they 're testing these cards using higher settings for the ATI parts .
Also , on the review 's front page it tells us that they 've under-clocked all the cards before testing .
Why would anyone take their reviews seriously after actually reading that ?
Not like I 'm an ATI fanboy here either ; my current and last 3 video cards were all Nvidia ( I was close to getting a 4850 about a year ago , but newegg had a sweet sale on the GTX260 ) .
It 's just that this level of sleaze really pisses me off .</tokentext>
<sentencetext>I'd like to point out something in that review.
The only benchmarks where this card ever goes below a 30fps minimum are Crysis and Far Cry 2 at 1920x1200 running in DX9 mode (instead of DX10, where the card is more likely to shine).
Also, they list the GeForce 9600 as getting a 40.7fps average while playing DIRT in DX11.
The GeForce 9600 does not support DX11.
In DirectX 9 mode, the Radeon HD 5670 is once again keeping pace with the GeForce 9800 GT, delivering playable performance all the way to 1920x1200.
However, once DirectX 11 features are enabled, the latest Radeon slows to a crawl.
Even the powerful Radeon HD 5750 has difficulty keeping the minimum frame rate above 30 fps at 1680x1050.
They pretty much tell us that they're testing these cards using higher settings for the ATI parts.
Also, on the review's front page it tells us that they've under-clocked all the cards before testing.
Why would anyone take their reviews seriously after actually reading that?
Not like I'm an ATI fanboy here either; my current and last 3 video cards were all Nvidia (I was close to getting a 4850 about a year ago, but newegg had a sweet sale on the GTX260).
It's just that this level of sleaze really pisses me off.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795740</id>
	<title>Who needs performance</title>
	<author>rsilvergun</author>
	<datestamp>1263659100000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>What's the point of anything above 1280x720 when most people game on 22" or 24" monitors? Studies show that on a display that size, anything above 720p is pointless. A bigger concern: you just don't need much to play what's coming out. I just got a 4760 and it'll play anything on the market at 720p full details. I can turn on FSAA without a big hit to frame rate, but honestly on my 22" Acer I can't tell when it's on anyway. <br> <br>

PC gaming is kinda dead right now. Practically everything's an Xbox port. The last big title that needed major hardware was Crysis, and that came out 2 years ago. Makes me wonder why AMD/Nvidia don't go the Rockstar games route and commission a game that requires their hardware.</htmltext>
<tokentext>What 's the point of anything above 1280x720 when most people game on 22 " or 24 " monitors ?
Studies show that on a display that size , anything above 720p is pointless .
A bigger concern : you just do n't need much to play what 's coming out .
I just got a 4760 and it 'll play anything on the market at 720p full details .
I can turn on FSAA without a big hit to frame rate , but honestly on my 22 " Acer I ca n't tell when it 's on anyway .
PC gaming is kinda dead right now .
Practically everything 's an Xbox port .
The last big title that needed major hardware was Crysis , and that came out 2 years ago .
Makes me wonder why AMD/Nvidia do n't go the Rockstar games route and commission a game that requires their hardware .</tokentext>
<sentencetext>What's the point of anything above 1200x720 when most people game on 22" or 24" monitors?
Studies show that on a display that size, anything above 720p is pointless.
A bigger concern: you just don't need much to play what's coming out.
I just got a 4760 and it'll play anything on the market at 720p full details.
I can turn on FSAA without a big hit to frame rate, but honestly on my 22" Acer I can't tell when it's on anyway.
PC gaming is kinda dead right now.
Practically everything's an Xbox port.
The last big title that needed major hardware was Crysis, and that came out 2 years ago.
Makes me wonder why AMD/Nvidia don't go the Rockstar games route and commission a game that requires their hardware.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794874</id>
	<title>Re:State of AMD for HTPC Use?</title>
	<author>bfree</author>
	<datestamp>1263649440000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I've had no problem displaying BBC-HD (1080i h264) with a 780G and an X2 5050e (low power dual core) with the Free drivers from x.org (but non-free firmware is required for video acceleration and 3d). I wouldn't touch the closed source drivers from Ati or NVidia with yours, but I'd now regard the modern Intel or Ati solutions as just fine for undemanding users.</htmltext>
<tokentext>I 've had no problem displaying BBC-HD ( 1080i h264 ) with a 780G and an X2 5050e ( low power dual core ) with the Free drivers from x.org ( but non-free firmware is required for video acceleration and 3d ) .
I would n't touch the closed source drivers from Ati or NVidia with yours , but I 'd now regard the modern Intel or Ati solutions as just fine for undemanding users .</tokentext>
<sentencetext>I've had no problem displaying BBC-HD (1080i h264) with a 780G and an X2 5050e (low power dual core) with the Free drivers from x.org (but non-free firmware is required for video acceleration and 3d).
I wouldn't touch the closed source drivers from Ati or NVidia with yours, but I'd now regard the modern Intel or Ati solutions as just fine for undemanding users.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794160</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794426</id>
	<title>Meanwhile, NVidia is renaming cards</title>
	<author>Eukariote</author>
	<datestamp>1263645720000</datestamp>
	<modclass>Informative</modclass>
	<modscore>3</modscore>
	<htmltext><p>With NVidia unable to release something competitive and therefore conjuring a "new" 3xx series into being by <a href="http://www.semiaccurate.com/2009/12/08/nvidia-geforce-300-series-grows/" title="semiaccurate.com">renaming 2xx series cards</a> [semiaccurate.com], the <a href="http://www.semiaccurate.com/2009/12/29/nvidia-gts360m-renamed-gt240/" title="semiaccurate.com">gts360m as well</a> [semiaccurate.com], those with a clue will be buying ATI for the time being.</p><p>Sadly, the average consumer will only look at the higher number and is likely to be conned.</p></htmltext>
<tokentext>With NVidia unable to release something competitive and therefore conjuring a " new " 3xx series into being by renaming 2xx series cards [ semiaccurate.com ] , the gts360m as well [ semiaccurate.com ] , those with a clue will be buying ATI for the time being .
Sadly , the average consumer will only look at the higher number and is likely to be conned .</tokentext>
<sentencetext>With NVidia unable to release something competitive and therefore conjuring a "new" 3xx series into being by renaming 2xx series cards [semiaccurate.com], the gts360m as well [semiaccurate.com], those with a clue will be buying ATI for the time being.
Sadly, the average consumer will only look at the higher number and is likely to be conned.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794054</id>
	<title>Re:Why?</title>
	<author>Anonymous</author>
	<datestamp>1263643080000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>I don't get it.</p></div><p>
Of course you don't. This card is for people who have lower resolution monitor (under 1440 X 900), since at lower resolutions it can run all modern games comfortably. About 50\% people still run at 1280 x 1024 or below, and for them this is a great graphics card. It gives good performance at reasonable price, and has the latest features.</p></div>
	</htmltext>
<tokentext>I do n't get it .
Of course you do n't .
This card is for people who have a lower-resolution monitor ( under 1440x900 ) , since at lower resolutions it can run all modern games comfortably .
About 50 % of people still run at 1280 x 1024 or below , and for them this is a great graphics card .
It gives good performance at a reasonable price , and has the latest features .</tokentext>
<sentencetext>I don't get it.
Of course you don't.
This card is for people who have a lower-resolution monitor (under 1440x900), since at lower resolutions it can run all modern games comfortably.
About 50% of people still run at 1280 x 1024 or below, and for them this is a great graphics card.
It gives good performance at a reasonable price, and has the latest features.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30801458</id>
	<title>Re:Meanwhile, NVidia is renaming cards</title>
	<author>Jthon</author>
	<datestamp>1263724680000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Charlie doesn't know what he's talking about. Most of the 2xx cards at this point should be GT21x chips which are based on the GT200 high end chip. The 9000/8000 series cards were all g9x or g8x architectures. Though I think at the low end they might have reused g92 as a G210 in the desktop segment.</p></htmltext>
<tokentext>Charlie does n't know what he 's talking about .
Most of the 2xx cards at this point should be GT21x chips which are based on the GT200 high end chip .
The 9000/8000 series cards were all g9x or g8x architectures .
Though I think at the low end they might have reused g92 as a G210 in the desktop segment .</tokentext>
<sentencetext>Charlie doesn't know what he's talking about.
Most of the 2xx cards at this point should be GT21x chips which are based on the GT200 high end chip.
The 9000/8000 series cards were all g9x or g8x architectures.
Though I think at the low end they might have reused g92 as a G210 in the desktop segment.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796594</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794560</id>
	<title>Re:I don't really keep up with games...</title>
	<author>Anonymous</author>
	<datestamp>1263646560000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Actually, a game called Shattered Horizon can fully use the latest graphics cards and takes advantage of it. It is an FPS based in space, meaning far more movement and mobility, and requiring much higher graphical power than most games because of this. Of course, this is on top of some very nice graphics to begin with; then try rotating and moving them in three dimensions instead of two. Generally recommended requirements are in the 3000 series for ATI cards and 8000 series for nVidia. This game recommends at least a GTX 260 or ATI 4870 and you must have DX10 or higher on top of a quad core processor. The 5870 handles this game well but has trouble maxing it out. Sure it might only be one game but at least some games are moving in the direction of pushing the new hardware to higher limits.</p></htmltext>
<tokentext>Actually , a game called Shattered Horizon can fully use the latest graphics cards and takes advantage of it .
It is an FPS based in space , meaning far more movement and mobility , and requiring much higher graphical power than most games because of this .
Of course , this is on top of some very nice graphics to begin with ; then try rotating and moving them in three dimensions instead of two .
Generally recommended requirements are in the 3000 series for ATI cards and 8000 series for nVidia .
This game recommends at least a GTX 260 or ATI 4870 and you must have DX10 or higher on top of a quad core processor .
The 5870 handles this game well but has trouble maxing it out .
Sure it might only be one game but at least some games are moving in the direction of pushing the new hardware to higher limits .</tokentext>
<sentencetext>Actually, a game called Shattered Horizon can fully use the latest graphics cards and takes advantage of it.
It is an FPS based in space, meaning far more movement and mobility, and requiring much higher graphical power than most games because of this.
Of course, this is on top of some very nice graphics to begin with; then try rotating and moving them in three dimensions instead of two.
Generally recommended requirements are in the 3000  series for ATI cards and 8000 series for nVidia.
This game recommends at least a GTX 260 or ATI 4870 and you must have DX10 or higher on top of a quad core processor.
The 5870 handles this game well but has trouble maxing it out.
Sure it might only be one game but at least some games are moving in the direction of pushing the new hardware to higher limits.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793872</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30799642</id>
	<title>Other Reviews</title>
	<author>rubeng</author>
	<datestamp>1263754680000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>There's a partial roundup of 5670 reviews <a href="http://0x6877.com/item/3574/" title="0x6877.com" rel="nofollow">here</a> [0x6877.com]; generally they seem pretty positive.</htmltext>
<tokentext>There 's a partial roundup of 5670 reviews here [ 0x6877.com ] ; generally they seem pretty positive .</tokentext>
<sentencetext>There's a partial roundup of 5670 reviews here [0x6877.com]; generally they seem pretty positive.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794124</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795826</id>
	<title>Re:Look I don't mean to be a cynical bastard but,.</title>
	<author>starfire83</author>
	<datestamp>1263660120000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p><div class="quote"><p>
FWIW I have a DX10 ATI 4890 card, it's summer here in Australia and it's underclocked and still runs 99.9\% of games flawlessly, I pretty much intend to completely skip the ATI 5xxx series and wait for the next ones, performance bumps aren't what they used to be.</p></div><p>Except that at the 5850 and 5870 cards are literally twice as powerful as the 4850 and 4870. Two 4850/4870 cards in CrossfireX are equal in performance as one 5850/5870 card. That doesn't seem to be the case on the lower end models, from what reviews are showing. I don't know why ATI decided to make the top end cards be twice as powerful as the previous generation but not the mid and low end cards. The low end cards seem to be slower, equal or just barely better. It certainly shoots their efforts of appealing to that market segment right in the foot. People interested in that market will snap up the cheaper but equally as powerful previous generation cards until they're out of circulation and unavailable.</p></div>
	</htmltext>
<tokentext>FWIW I have a DX10 ATI 4890 card , it 's summer here in Australia and it 's underclocked and still runs 99.9 % of games flawlessly , I pretty much intend to completely skip the ATI 5xxx series and wait for the next ones , performance bumps are n't what they used to be .
Except that the 5850 and 5870 cards are literally twice as powerful as the 4850 and 4870 .
Two 4850/4870 cards in CrossfireX are equal in performance to one 5850/5870 card .
That does n't seem to be the case on the lower end models , from what reviews are showing .
I do n't know why ATI decided to make the top end cards twice as powerful as the previous generation but not the mid and low end cards .
The low end cards seem to be slower , equal or just barely better .
It certainly shoots their effort to appeal to that market segment in the foot .
People interested in that market will snap up the cheaper but equally powerful previous generation cards until they 're out of circulation and unavailable .</tokentext>
<sentencetext>
FWIW I have a DX10 ATI 4890 card, it's summer here in Australia and it's underclocked and still runs 99.9% of games flawlessly, I pretty much intend to completely skip the ATI 5xxx series and wait for the next ones, performance bumps aren't what they used to be.
Except that the 5850 and 5870 cards are literally twice as powerful as the 4850 and 4870.
Two 4850/4870 cards in CrossfireX are equal in performance to one 5850/5870 card.
That doesn't seem to be the case on the lower end models, from what reviews are showing.
I don't know why ATI decided to make the top end cards twice as powerful as the previous generation but not the mid and low end cards.
The low end cards seem to be slower, equal or just barely better.
It certainly shoots their effort to appeal to that market segment in the foot.
People interested in that market will snap up the cheaper but equally powerful previous generation cards until they're out of circulation and unavailable.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795460</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020</id>
	<title>Whats the point?</title>
	<author>Anonymous</author>
	<datestamp>1263642840000</datestamp>
	<modclass>Informative</modclass>
	<modscore>4</modscore>
	<htmltext>Tom's Hardware's review here: <a href="http://www.tomshardware.com/reviews/radeon-hd-5670,2533.html" title="tomshardware.com" rel="nofollow">http://www.tomshardware.com/reviews/radeon-hd-5670,2533.html</a> [tomshardware.com]

TLDR: While it does support DX11, it's not powerful enough to really do much with it, barely keeping 30 FPS at 1680x1050.</htmltext>
<tokenext>Toms Hardware 's review here : http : //www.tomshardware.com/reviews/radeon-hd-5670,2533.html [ tomshardware.com ] TLDR : While it does support DX11 , its not powerful enough to really do much with it , barely keeping 30 FPS at 1680x1050 .</tokentext>
<sentencetext>Toms Hardware's review here: http://www.tomshardware.com/reviews/radeon-hd-5670,2533.html [tomshardware.com]

TLDR: While it does support DX11, its not powerful enough to really do much with it, barely keeping 30 FPS at 1680x1050.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796530</id>
	<title>Re:Why?</title>
	<author>hairyfeet</author>
	<datestamp>1263759060000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>I am also an AMD fan, switched last year, and I have to agree. The kicker for me is that the 4870, which they used in the tests to whip this new card by around 30%, is only $20 more. So why exactly would I buy this card, when for $20 more I can get 30% more speed?</p><p>

 I rarely spend more than $100 on a GPU (currently using a 4650 1GB), but I just can't see the point of this card when it is so close in price to a much better card. Maybe if they priced it in the $55-$75 range, then yeah, it would be a good buy. For that price I would be happy to get rid of my 4650 for a little speed boost. But at $100 there are just too many cards that, for a little bit more, give a whole lot more bang for the buck. So sorry, AMD, I just don't get it.</p></htmltext>
<tokenext>I am also an AMD fan , switched last year , and I have to agree .
The kicker for me is the 4870 , which they used in the tests to whip this new card by around 30 \ % , is only $ 20 more .
So why exactly would I buy this card , when for $ 20 I can get 30 \ % faster ?
I rarely spend more than $ 100 on a GPU ( currently using a 4650 1Gb ) but I just ca n't see the point of this card with it being so close in price to a much better card .
Maybe if they priced it in the $ 55- $ 75 range , then yeah it would be a good buy .
For that price I would be happy to get rid of my 4650 for a little speed boost .
But at $ 100 there are just too many card that for a little bit more give a whole lot more bang for the buck .
So sorry AMD , I just do n't get it .</tokentext>
<sentencetext>I am also an AMD fan, switched last year, and I have to agree.
The kicker for me is the 4870, which they used in the tests to whip this new card by around 30\%, is only $20 more.
So why exactly would I buy this card, when for $20 I can get 30\% faster?
I rarely spend more than $100 on a GPU (currently using a 4650 1Gb)  but I just can't see the point of this card with it being so close in price to a much better card.
Maybe if they priced it in the $55-$75 range, then yeah it would be a good buy.
For that price I would be happy to get rid of my 4650 for a little speed boost.
But at $100 there are just too many card that for a little bit more give a whole lot more bang for the buck.
So sorry AMD, I just don't get it.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794550</id>
	<title>Of course</title>
	<author>Sycraft-fu</author>
	<datestamp>1263646560000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>How could hardware not outpace software? I mean, it is really hard to develop a game for what does not yet exist. The hardware has to come out, and in particular the API has to come out, before developers can develop for it. They do get engineering samples a little early, but still.</p><p>In terms of DX11 support: yes, there are a couple of games that will use it. No, it isn't really very useful yet. Said games run and look fine in DX9 mode.</p><p>Really, you don't buy new cards because you need their features right away. There are two major reasons to get a card with new features:</p><p>1) You want the highest end performance. As always, the newer stuff is faster than the older stuff. So if you are a performance junkie, you buy a high end DX11 card not because it is DX11, but because it is fast.</p><p>2) You need new hardware anyhow (maybe you are building a new system), so you might as well get current tech. That way, in 2 years, when things ARE using DX11, your card supports it and you don't have to upgrade unless you need better performance.</p><p>However, if you expect software to fully support new hardware on day one, well, you are dreaming. The only way that would be possible is for the graphics card makers to deliberately hold their cards back from the public. They won't do that. Also, software companies often won't start supporting something until there is enough of a market for it. So it needs to be launched and get into the hands of the public; then it is worthwhile to develop for.</p></htmltext>
<tokenext>How could hardware not outpace software ?
I mean it is really hard to develop a game for what does not yet exist .
The hardware has to come out , and in particular the API has to come out , then developers can develop for it .
They do get engineering samples a little early but still.In terms of DX11 support .
Yes , there are a couple games that will use it .
No , it is n't really very useful .
Said games run and look fine in DX9 mode.Really , you do n't buy new cards because you need their features right away .
There are two major reasons to get a card with new features : 1 ) You want the highest end performance .
As always , the newer stuff is faster than the older stuff .
So if you are a performance junky , you buy a high end DX11 card not because it is DX11 , but because it is fast.2 ) You need new hardware anyhow ( maybe you are building a new system ) so you might as well get current tech .
That way , in 2 years , when things ARE using DX11 , you card supports it and you do n't have to upgrade unless you need better performance.However if you expect software to fully support new hardware , well you are dreaming .
The only way that would be possible is for the graphics card makers to deliberately hold their cards back from the public .
They wo n't do that .
Also , software companies often wo n't start supporting it until there is enough of a market for it .
So it needs to be launched , get in to the hands of the public , then it is worth while to develop for .</tokentext>
<sentencetext>How could hardware not outpace software?
I mean it is really hard to develop a game for what does not yet exist.
The hardware has to come out, and in particular the API has to come out, then developers can develop for it.
They do get engineering samples a little early but still.In terms of DX11 support.
Yes, there are a couple games that will use it.
No, it isn't really very useful.
Said games run and look fine in DX9 mode.Really, you don't buy new cards because you need their features right away.
There are two major reasons to get a card with new features:1) You want the highest end performance.
As always, the newer stuff is faster than the older stuff.
So if you are a performance junky, you buy a high end DX11 card not because it is DX11, but because it is fast.2) You need new hardware anyhow (maybe you are building a new system) so you might as well get current tech.
That way, in 2 years, when things ARE using DX11, you card supports it and you don't have to upgrade unless you need better performance.However if you expect software to fully support new hardware, well you are dreaming.
The only way that would be possible is for the graphics card makers to deliberately hold their cards back from the public.
They won't do that.
Also, software companies often won't start supporting it until there is enough of a market for it.
So it needs to be launched, get in to the hands of the public, then it is worth while to develop for.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793872</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794676</id>
	<title>Re:Why?</title>
	<author>mdwh2</author>
	<datestamp>1263647520000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Helps with standardisation? I might be writing a game/application that doesn't need tonnes of graphics processing power to run, but it's still easier if I can simply write one DirectX 11 renderer, instead of having to write multiple renderers for people with low end cards that only support older APIs.</p></htmltext>
<tokenext>Helps with standardisation ?
I might be writing a game/application that does n't need tonnes of graphics processing power to run , but it 's still easier if I can simply write one DirectX 11 renderer , instead of having to write multiple renderers for people with low end cards that only support older APIs .</tokentext>
<sentencetext>Helps with standardisation?
I might be writing a game/application that doesn't need tonnes of graphics processing power to run, but it's still easier if I can simply write one DirectX 11 renderer, instead of having to write multiple renderers for people with low end cards that only support older APIs.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830</parent>
</comment>
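On the standardisation point above: in practice, Direct3D 11 lets a single renderer target older DX10/DX9-class cards through feature levels, so one code path can scale capabilities down instead of shipping multiple renderers. A minimal sketch in C++ of how that device creation looks (assumes the Windows SDK headers and d3d11.lib; illustrative only, not anything from the article or comments):

// Minimal sketch: create a D3D11 device that falls back through feature
// levels, so one DX11 renderer can also run on older DX10/DX9-class cards.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,  // full DX11 (e.g. the new Radeon HD 5000 parts)
        D3D_FEATURE_LEVEL_10_1,
        D3D_FEATURE_LEVEL_10_0,
        D3D_FEATURE_LEVEL_9_3,   // older budget cards
        D3D_FEATURE_LEVEL_9_1,
    };

    ID3D11Device* device = nullptr;
    ID3D11DeviceContext* context = nullptr;
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;

    HRESULT hr = D3D11CreateDevice(
        nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION, &device, &got, &context);

    if (SUCCEEDED(hr)) {
        // One renderer, one API: the app checks 'got' and scales optional
        // effects (tessellation, compute, etc.) up or down accordingly.
        std::printf("Device created at feature level 0x%x\n", (unsigned)got);
        context->Release();
        device->Release();
    }
    return 0;
}

The same binary then runs on anything from a DX9-class part up to an HD 5670 or 5870, with the fancier DX11 paths enabled only where the reported feature level allows.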
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30797604</id>
	<title>Re:Why?</title>
	<author>PopeRatzo</author>
	<datestamp>1263736440000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>I don't have any good examples specific to compute shaders but...</p></div></blockquote><p>That's OK, nobody does for home computing yet.  This article was just a marketing press release to move some video cards that will  be obsolete by Valentine's Day.</p></div>
	</htmltext>
<tokenext>I do n't have any good examples specific to compute shaders but...That 's OK , nobody does for home computing yet .
This article was just a marketing press release to move some video cards that will be obsolete by Valentine 's Day .</tokentext>
<sentencetext>I don't have any good examples specific to compute shaders but...That's OK, nobody does for home computing yet.
This article was just a marketing press release to move some video cards that will  be obsolete by Valentine's Day.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794188</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796240</id>
	<title>Re:Why?</title>
	<author>Mal-2</author>
	<datestamp>1263667080000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><blockquote><div><p>Rather than maintaining separate designs for separate lines, unify everything. Their low end DX11 parts are the same thing as their high end DX11 parts, just less of it. Fewer shaders, fewer ROPs, smaller memory controllers, etc.</p></div></blockquote><p>This also allows them to pop a couple of fuses and repurpose marginal would-have-been high end parts by blocking out the broken sections. They did this back in the 9500/9700 days; I don't see why they wouldn't want to do it now.</p><p>Mal-2</p></div>
	</htmltext>
<tokenext>Rather than maintaining separate designs for separate lines , unify everything .
Their low end DX11 parts are the same thing as their high end DX11 parts , just less of it .
Less shaders , less ROPs , smaller memory controllers , etc.This also allows them to pop a couple fuses and re-purpose marginal would-have-been high end parts by blocking out the broken parts .
They did this back in the 9500/9700 days , I do n't see why they would n't want to do it now.Mal-2</tokentext>
<sentencetext>Rather than maintaining separate designs for separate lines, unify everything.
Their low end DX11 parts are the same thing as their high end DX11 parts, just less of it.
Less shaders, less ROPs, smaller memory controllers, etc.This also allows them to pop a couple fuses and re-purpose marginal would-have-been high end parts by blocking out the broken parts.
They did this back in the 9500/9700 days, I don't see why they wouldn't want to do it now.Mal-2
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794188</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796854</id>
	<title>ATI is awful</title>
	<author>Anonymous</author>
	<datestamp>1263722400000</datestamp>
	<modclass>Offtopic</modclass>
	<modscore>-1</modscore>
<htmltext><p>ATI refuses to update the proprietary Linux drivers for my video card for the newer kernels because my card, which is only a few years old, is supposedly "legacy" according to them. Yet they still support it on Windows, albeit in a very limited (essentials-only) scope despite it being legacy, so I don't care what AMD/ATI does; I am having no part in it. I don't know if AMD will lead to any change in this regard, but I'd sooner give up computers altogether than support anything connected to ATI at this point.</p><p>Boycotted.</p></htmltext>
<tokenext>ATI refuses to update the linux proprietary drivers for my video card for the newer kernels because supposedly my card which is only a few years old is according to them " legacy , " but they still support windows albeit in a very limited ( essential ) scope despite it being legacy so i do n't care what AMD/ATI does i am having no part in it .
I do n't know if AMD will lead to any change in this regard , but Id sooner give up computers altogether than support anything connected to ATI at this point.Boycotted .</tokentext>
<sentencetext>ATI refuses to update the linux proprietary drivers for my video card for the newer kernels because supposedly my card which is only a few years old is according to them "legacy," but they still support windows albeit in a very limited (essential) scope despite it being legacy so i don't care what AMD/ATI does i am having no part in it.
I don't know if AMD will lead to any change in this regard, but Id sooner give up computers altogether than support anything connected to ATI at this point.Boycotted.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794296</id>
	<title>Yeah, I can provide you the same thing for FREE!</title>
	<author>Hurricane78</author>
	<datestamp>1263644700000</datestamp>
	<modclass>Funny</modclass>
	<modscore>2</modscore>
<htmltext><p>It&rsquo;s called a &ldquo;software renderer&rdquo;.<nobr> <wbr></nobr>;)</p><p>Just like AMD, I did not say that it would actually render anything in <em>real time</em>, did I?<nobr> <wbr></nobr>:P</p></htmltext>
<tokenext>It 's called a " software renderer " .
; ) Just as AMD , I did not say that it would actually render anything in real time , did I ?
: P</tokentext>
<sentencetext>It’s called a “software renderer”.
;)Just as AMD, I did not say that it would actually render anything in real time, did I?
:P</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796942</id>
	<title>Re:I don't really keep up with games...</title>
	<author>Hadlock</author>
	<datestamp>1263724140000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>The biggest problem for game makers is that people went from 17-19" 1280x1024 displays (about 1.3 megapixels) to 21-24" displays at 1680x1050 (about 1.8 megapixels). The old standard used to be 1024x768. For a long time it was 1280x1024 (a small step up). Now the standard (1680x1050) increased the pixel count by about 35% seemingly overnight. A card (8600GT 512MB) that could push Valve's TF2 (a two-year-old game at this point) at 45-60 fps on 1280x1024 with no problem, with most of the settings at medium-high, now struggles to push the game at 30 fps at low settings at 1680x1050. So while cards bumped up in capability, people are buying these cards to play their current games at their old speed/visual quality. We're going to have to wait another year or two for video cards with the capability to push "tomorrow's" games at a modern resolution... for less than $150. ATI is heading in that direction more quickly than nVidia, but video card makers have yet to meet their market with a proper product.</p></htmltext>
<tokenext>The biggest problem for game makers is that people went from 17-19 " 1280x1024 displays ( 1.6 megapixels , i think , not going to do the math this late ) to 21-24 " displays at 1680x1050 ( 2.3 megapixels ) .
The old standard used to be 1024x768 .
For a long time it was 1280x1024 ( small step up ) .
Now the standard ( 1680x1050 ) increased by about 50 \ % seemingly overnight .
A card ( 8600GT 512MB ) that could push Valve 's TF2 ( two year old game at this point ) at 45-60fps on 1280x1024 no problem with most of the settings at medium-high , now struggles to push the game at 30fps at low settings at 1680x1050 .
So while cards bumped up in capability , people are buying these cards to play their current games at their old speed/visual quality .
We 're going to have to wait another year or two before video cards with the capability to push " tomorrow 's " games at a modern resolution... for less than $ 150 .
ATI is heading in that direction more quickly than nVidia , but video card makers have yet to meet their market with a proper product .</tokentext>
<sentencetext>The biggest problem for game makers is that people went from 17-19" 1280x1024 displays (1.6 megapixels, i think, not going to do the math this late) to 21-24" displays at 1680x1050 (2.3 megapixels).
The old standard used to be 1024x768.
For a long time it was 1280x1024 (small step up).
Now the standard (1680x1050) increased by about 50\% seemingly overnight.
A card (8600GT 512MB) that could push Valve's TF2 (two year old game at this point) at 45-60fps on 1280x1024 no problem with most of the settings at medium-high, now struggles to push the game at 30fps at low settings at 1680x1050.
So while cards bumped up in capability, people are buying these cards to play their current games at their old speed/visual quality.
We're going to have to wait another year or two before video cards with the capability to push "tomorrow's" games at a modern resolution... for less than $150.
ATI is heading in that direction more quickly than nVidia, but video card makers have yet to meet their market with a proper product.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793872</parent>
</comment>
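For what it's worth, the pixel-count arithmetic behind the comment above works out as follows (a trivial C++ sketch, numbers only, nothing assumed beyond the resolutions already mentioned):

// Pixel counts for the resolutions mentioned above.
#include <cstdio>

int main() {
    const long r1024x768  = 1024L * 768;    //   786,432  (~0.8 megapixels)
    const long r1280x1024 = 1280L * 1024;   // 1,310,720  (~1.3 megapixels)
    const long r1680x1050 = 1680L * 1050;   // 1,764,000  (~1.8 megapixels)

    std::printf("1280x1024 -> 1680x1050: %.0f%% more pixels\n",
                100.0 * (r1680x1050 - r1280x1024) / r1280x1024);  // ~35%
    std::printf("1024x768  -> 1680x1050: %.0f%% more pixels\n",
                100.0 * (r1680x1050 - r1024x768) / r1024x768);    // ~124%
    return 0;
}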
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796306</id>
	<title>Solution to what?</title>
	<author>seyyah</author>
	<datestamp>1263668220000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>"Solution"...</p><p>Can't we get beyond this word?</p></htmltext>
<tokenext>" Solution " ...Ca n't we get beyond this word ?</tokentext>
<sentencetext>"Solution"...Can't we get beyond this word?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794926</id>
	<title>Re:Why?</title>
	<author>Anonymous</author>
	<datestamp>1263649980000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext>Part of the reason DX10 never really took off was that only the highest end graphics cards supported it for years, so software developers who used DX (far beyond just game writers) had to choose between supporting just hardware DX9 or supporting both, and the answer is pretty obvious. Given the limited benefit you get from one DX version to the next, that is a very bad trend. So by saturating the whole market with DX11 capable cards, hopefully in a few years more apps will support DX versions beyond just 9, or even 8.</htmltext>
<tokenext>Part of the reason DX10 never really took off was that only the highest end graphics cards supported it for years , and so software developers who used DX ( far beyond just game writers ) had to focus on supporting either just HW DX9 or both , to which the answer is pretty obvious .
Because of the limited benefit you get from one version to the next on something like DX , this is a very bad trend .
So by saturating the whole market with DX11 capable cards , hopefully this means that in a few years more apps will support DX beyond just 9 , or even 8 .</tokentext>
<sentencetext>Part of the reason DX10 never really took off was that only the highest end graphics cards supported it for years, and so software developers who used DX(far beyond just game writers) had to focus on supporting either just HW DX9 or both, to which the answer is pretty obvious.
Because of the limited benefit you get from one version to the next on something like DX, this is a very bad trend.
So by saturating the whole market with DX11 capable cards, hopefully this means that in a few years more apps will support DX beyond just 9, or even 8.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30798760</id>
	<title>Re:What's the point?</title>
	<author>ultranova</author>
	<datestamp>1263747060000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>While it does support DX11, its not powerful enough to really do much with it, barely keeping 30 FPS at 1680x1050.</p></div> </blockquote><p>When did 30 FPS become bad?</p></div>
	</htmltext>
<tokenext>While it does support DX11 , its not powerful enough to really do much with it , barely keeping 30 FPS at 1680x1050 .
When did 30 FPS become bad ?</tokentext>
<sentencetext>While it does support DX11, its not powerful enough to really do much with it, barely keeping 30 FPS at 1680x1050.
When did 30 FPS become bad?
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795542</id>
	<title>Re:Why?</title>
	<author>Anonymous</author>
	<datestamp>1263656940000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>1</modscore>
<htmltext><blockquote><div><p> <i><br>1) Compute shaders. Those actually work on any card DX10 or higher using DX11 APIs (just lower versions of the shaders). The reason these are useful, even on lower end cards, is that some things run drastically faster on a GPU so even a low end one is better than the CPU. <b>I don't have any good examples specific to compute shaders</b> but an older non-compute shader example would be HD video. You can do HD H.264 on a lower end CPU so long as you have a GPU that can handle acceleration. Doesn't have to be a high end one either.<br></i></p></div> </blockquote><p>One of those things is none other than solving large systems of linear equations, which covers a very wide range of specific problems: not just 3D visualization (i.e., what the GPU was designed to do) but also solving partial differential equations through techniques such as the <a href="http://en.wikipedia.org/wiki/Finite_element_method" title="wikipedia.org" rel="nofollow">finite-element method</a> [wikipedia.org], which in turn covers structural analysis, thermal and fluid dynamics, electromagnetics, and even weather prediction and economic models. So you pretty much can do everything better with a GPGPU.</p><blockquote><div><p> <i><br>2) 64-bit precision. Former versions of DX required only 32-bit FP max, since that is the most you generally need for graphics (32-bit per channel that is). However there are other math functions that need higher precision. DX11 adds 64-bit FP support. In the case of the 5000 series it works well too: 64-bit FP runs at half the speed of 32-bit FP, so slower, but still quick enough to be useful.<br></i></p></div> </blockquote><p><nobr> <wbr></nobr>...which is extremely helpful for the applications stated above.</p><blockquote><div><p> <i><br>3) Multithreaded rendering/GPU multitasking. DX11 offers much, much better support for having multiple programs talk to the GPU at the same time. The idea is to have it fully preemptively multi-task, just like the CPU. Have the thing be a general purpose resource that can be addressed by multiple programs with no impact.<br></i></p></div> </blockquote><p><nobr> <wbr></nobr>...which is what makes this whole technology relevant to begin with.</p></div>
	</htmltext>
<tokenext>1 ) Compute shaders .
Those actually work on any card DX10 or higher using DX11 APIs ( just lower versions of the shaders ) .
The reason these are useful , even on lower end cards , is that some things run drastically faster on a GPU so even a low end one is better than the CPU .
I do n't have any good examples specific to compute shaders but an older non-computer shader example would be HD video .
You can do HD H.264 on a lower end CPU so long as you have a GPU that can handle acceleration .
Does n't have to be a high end one either .
One of those problems is none other than solving large systems of linear equations , which encompasses a very wide range of specific problems such as 3D visualization ( i.e. , what the GPU was designed to do ) but also solving partial differential equations through techniques such as the finite-element method [ wikipedia.org ] , which encompasses problems such as structural analysis , thermal , fluid dynamics , electromagnetics and even weather prediction and economic models .
So , you pretty much can do everything better with a GPGPU .
2 ) 64-bit precision .
Former versions of DX required only 32-bit FP max , since that is the most you generally need for graphics ( 32-bit per channel that is ) .
However there are other math functions that need higher precision .
DX11 mandates 64-bit FP support .
In the case of the 5000 series , it works well too , 64-bit FP is half the speed of 32-bit FP so slower , but still plenty quick as to be useful .
...which is extremely helpful for the applications stated above .
3 ) Multithreaded rendering/GPU multitasking .
DX11 offers much , much better support for having multiple programs talk to the GPU at the same time .
The idea is to have it fully preemptively multi-task , just like the CPU .
Have the thing be a general purpose resource that can be addressed by multiple programs with no impact .
...which is what makes this whole technology relevant to begin with .</tokentext>
<sentencetext> 1) Compute shaders.
Those actually work on any card DX10 or higher using DX11 APIs (just lower versions of the shaders).
The reason these are useful, even on lower end cards, is that some things run drastically faster on a GPU so even a low end one is better than the CPU.
I don't have any good examples specific to compute shaders but an older non-computer shader example would be HD video.
You can do HD H.264 on a lower end CPU so long as you have a GPU that can handle acceleration.
Doesn't have to be a high end one either.
One of those problems is none other than solving large systems of linear equations, which encompasses a very wide range of specific problems such as 3D visualization (i.e., what the GPU was designed to do) but also solving partial differential equations through techniques such as the finite-element method [wikipedia.org], which encompasses problems such as structural analysis, thermal, fluid dynamics, electromagnetics and even weather prediction and economic models.
So, you pretty much can do everything better with a GPGPU.
2) 64-bit precision.
Former versions of DX required only 32-bit FP max, since that is the most you generally need for graphics (32-bit per channel that is).
However there are other math functions that need higher precision.
DX11 mandates 64-bit FP support.
In the case of the 5000 series, it works well too, 64-bit FP is half the speed of 32-bit FP so slower, but still plenty quick as to be useful.
...which is extremely helpful for the applications stated above.
3) Multithreaded rendering/GPU multitasking.
DX11 offers much, much better support for having multiple programs talk to the GPU at the same time.
The idea is to have it fully preemptively multi-task, just like the CPU.
Have the thing be a general purpose resource that can be addressed by multiple programs with no impact.
...which is what makes this whole technology relevant to begin with.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794188</parent>
</comment>
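To make the "large systems of linear equations" point above concrete: iterative solvers such as Jacobi update every unknown independently from the previous iterate, which is exactly the per-element parallelism a compute shader (or any GPGPU API) exploits, one thread per row. A small CPU-side sketch of that structure (plain C++, illustrative only; not meant as a serious solver):

// Illustrative only: Jacobi iteration for A*x = b on a small dense system.
// Each entry of x_next depends only on the previous iterate x, so every row
// could be computed in parallel; a compute shader would assign one thread
// per row and do the same update.
#include <vector>
#include <cstdio>

void jacobi_sweep(const std::vector<std::vector<double>>& A,
                  const std::vector<double>& b,
                  const std::vector<double>& x,
                  std::vector<double>& x_next) {
    const size_t n = b.size();
    for (size_t i = 0; i < n; ++i) {           // on a GPU: one thread per i
        double sum = 0.0;
        for (size_t j = 0; j < n; ++j)
            if (j != i) sum += A[i][j] * x[j];
        x_next[i] = (b[i] - sum) / A[i][i];
    }
}

int main() {
    // Tiny diagonally dominant system, just to exercise the sweep.
    std::vector<std::vector<double>> A = {{4.0, 1.0}, {2.0, 5.0}};
    std::vector<double> b = {9.0, 12.0}, x = {0.0, 0.0}, x_next(2);
    for (int it = 0; it < 25; ++it) {
        jacobi_sweep(A, b, x, x_next);
        x.swap(x_next);
    }
    std::printf("x = (%.4f, %.4f)\n", x[0], x[1]);  // converges toward (1.8333, 1.6667)
    return 0;
}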
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830</id>
	<title>Why?</title>
	<author>Nemyst</author>
	<datestamp>1263641640000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
<htmltext>I'm sorry, I've seen this news go all around tech sites and... I don't get it. Yay, DX11. The biggest new features I could see in it were hardware tessellation and compute shaders. What, this requires a powerful GPU in the first place to be of any use? Something much, much better than this card? Oh...<br>
<br>
Seriously, good for AMD, but I just don't see the point. Say it's a good card, say it has very low power consumption, but hyping DX11 when it has no particular benefit - especially at this price point - is absolutely useless.<br>
<br>
And before anyone says I'm just bashing AMD, my computer has a 5850.</htmltext>
<tokenext>I 'm sorry , I 've seen this news go all around tech sites and... I do n't get it .
Yay , DX11 .
The biggest new features I could see about it were hardware tessellation and compute shaders .
What , this requires a powerful GPU in the first place to be of any use ?
Something much , much better than this card ?
Oh.. . Seriously , good for AMD , but I just do n't see the point .
Say it 's a good card , say it has very low power consumption , but hyping DX11 when it has no particular benefit - especially at this price point - is absolutely useless .
And before anyone says I 'm just bashing AMD , my computer has a 5850 .</tokentext>
<sentencetext>I'm sorry, I've seen this news go all around tech sites and... I don't get it.
Yay, DX11.
The biggest new features I could see about it were hardware tessellation and compute shaders.
What, this requires a powerful GPU in the first place to be of any use?
Something much, much better than this card?
Oh...

Seriously, good for AMD, but I just don't see the point.
Say it's a good card, say it has very low power consumption, but hyping DX11 when it has no particular benefit - especially at this price point - is absolutely useless.
And before anyone says I'm just bashing AMD, my computer has a 5850.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795460</id>
	<title>Look I don't mean to be a cynical bastard but,...</title>
	<author>AbRASiON</author>
	<datestamp>1263656160000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>We consistently see new hardware like this pitched at people: "DX10 cards now as low as $150", or in this case DX11 cards at the $100 price point.<br>Time and time again the game developers couldn't give a damn, and I don't blame them - they target the biggest possible audience.<br>I'll never forget the GeForce 3 announcement at one of the Apple Expos of all things; Carmack was there and showed off some early Doom 3, and it was an absolute hype extravaganza. "Doom 3 will need a pixel shader card like the GF3!" So many people purchased one; the problem is that by the time Doom 3 came out, the GF3 was basically dead, and while it could do the graphics required, it wasn't very quick.</p><p>My point is, any new tech like DX11, while great for all of us, is never fast enough in the first implementations. You'll see in 18 months' time, though: the DX12 cards will be bloody fantastic at DX11 features. This is just how it is.<br>FWIW I have a DX10 ATI 4890 card; it's summer here in Australia and, even underclocked, it still runs 99.9% of games flawlessly. I pretty much intend to skip the ATI 5xxx series completely and wait for the next ones; performance bumps aren't what they used to be.</p></htmltext>
<tokenext>We consistently see new hardware like this for people " DX10 cards now as low as 150 $ " or in this case DX11 cards at the 100 $ price point.Time and time again the game developers could n't give a damn and I do n't blame them - they target the biggest possible audience.I 'll never forget the Geforce 3 announcement at one of the Apple Expos of all things , Carmack was there and showed off some early Doom 3 , it was absolute hype extravaganza .
" Doom 3 will need a pixel shader card like the GF3 !
" So many people purchased one , problem is by the time Doom 3 came out , the GF3 was basically dead and while it could do the graphics required , it was n't too quick.My point is , any new tech like DX11 , while great for all of us is never fast enough in the first implimentations , you 'll see in 18 months time though , the DX12 cards will be bloody fantastic at DX11 features though , this is just how it is.FWIW I have a DX10 ATI 4890 card , it 's summer here in Australia and it 's underclocked and still runs 99.9 \ % of games flawlessly , I pretty much intend to completely skip the ATI 5xxx series and wait for the next ones , performance bumps are n't what they used to be .</tokentext>
<sentencetext>We consistently see new hardware like this for people "DX10 cards now as low as 150$" or in this case DX11 cards at the 100$  price point.Time and time again the game developers couldn't give a damn and I don't blame them - they target the biggest possible audience.I'll never forget the Geforce 3 announcement at one of the Apple Expos of all things, Carmack was there and showed off some early Doom 3, it was absolute hype extravaganza.
"Doom 3 will need a pixel shader card like the GF3!
" So many people purchased one, problem is by the time Doom 3 came out, the GF3 was basically dead and while it could do the graphics required, it wasn't too quick.My point is, any new tech like DX11, while great for all of us is never fast enough in the first implimentations, you'll see in 18 months time though, the DX12 cards will be bloody fantastic at DX11 features though, this is just how it is.FWIW I have a DX10 ATI 4890 card, it's summer here in Australia and it's underclocked and still runs 99.9\% of games flawlessly, I pretty much intend to completely skip the ATI 5xxx series and wait for the next ones, performance bumps aren't what they used to be.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794188</id>
	<title>Re:Why?</title>
	<author>Sycraft-fu</author>
	<datestamp>1263643980000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>5</modscore>
<htmltext><p>Well, the things that may make DX11 interesting in general, not just for high end graphics:</p><p>1) Compute shaders. Those actually work on any card DX10 or higher using DX11 APIs (just lower versions of the shaders). The reason these are useful, even on lower end cards, is that some things run drastically faster on a GPU so even a low end one is better than the CPU. I don't have any good examples specific to compute shaders but an older non-compute shader example would be HD video. You can do HD H.264 on a lower end CPU so long as you have a GPU that can handle acceleration. Doesn't have to be a high end one either.</p><p>2) 64-bit precision. Former versions of DX required only 32-bit FP max, since that is the most you generally need for graphics (32-bit per channel that is). However there are other math functions that need higher precision. DX11 adds 64-bit FP support. In the case of the 5000 series it works well too: 64-bit FP runs at half the speed of 32-bit FP, so slower, but still quick enough to be useful.</p><p>3) Multithreaded rendering/GPU multitasking. DX11 offers much, much better support for having multiple programs talk to the GPU at the same time. The idea is to have it fully preemptively multi-task, just like the CPU. Have the thing be a general purpose resource that can be addressed by multiple programs with no impact.</p><p>It's a worthwhile new API. Now, I'm not saying "Oh, everyone needs a DX11 card!" If you have an older card and it works fine for you, great, stick with it. However, there is a point to wanting DX11 in all segments of the market. Hopefully we can start having GPUs used for more than just games on the average system.</p><p>Also, it makes sense from ATi's point of view. Rather than maintaining separate designs for separate lines, unify everything. Their low end DX11 parts are the same thing as their high end DX11 parts, just less of it. Fewer shaders, fewer ROPs, smaller memory controllers, etc. It makes sense to do that for a low end part, rather than a totally new design. It keeps costs down, since most of the development cost was paid for by the high end parts.</p><p>In terms of hyping it? Well, that's called marketing.</p></htmltext>
<tokenext>Well the things that may make DX11 interesting in general , not just to high end graphics : 1 ) Compute shaders .
Those actually work on any card DX10 or higher using DX11 APIs ( just lower versions of the shaders ) .
The reason these are useful , even on lower end cards , is that some things run drastically faster on a GPU so even a low end one is better than the CPU .
I do n't have any good examples specific to compute shaders but an older non-computer shader example would be HD video .
You can do HD H.264 on a lower end CPU so long as you have a GPU that can handle acceleration .
Does n't have to be a high end one either.2 ) 64-bit precision .
Former versions of DX required only 32-bit FP max , since that is the most you generally need for graphics ( 32-bit per channel that is ) .
However there are other math functions that need higher precision .
DX11 mandates 64-bit FP support .
In the case of the 5000 series , it works well too , 64-bit FP is half the speed of 32-bit FP so slower , but still plenty quick as to be useful.3 ) Multithreaded rendering/GPU multitasking .
DX11 offers much , much better support for having multiple programs talk to the GPU at the same time .
The idea is to have it fully preemptively multi-task , just like the CPU .
Have the thing be a general purpose resource that can be addressed by multiple programs with no impact.It 's a worthwhile new API .
Now I 'm not saying " Oh everyone needs a DX11 card !
" If you have an older card and it works fine for you , great stick with it .
However there is a point to wanting to have DX11 in all the segments of the market .
Hopefully we can start having GPUs be used for more than just games on the average system.Also , it makes sense from ATi 's point of view .
Rather than maintaining separate designs for separate lines , unify everything .
Their low end DX11 parts are the same thing as their high end DX11 parts , just less of it .
Less shaders , less ROPs , smaller memory controllers , etc .
Makes sense to do that for a low end part , rather than a totally new design .
Keeps your costs down , since most of the development cost was paid for by the high end parts.In terms of hyping it ?
Well that 's called marketing .</tokentext>
<sentencetext>Well the things that may make DX11 interesting in general, not just to high end graphics:1) Compute shaders.
Those actually work on any card DX10 or higher using DX11 APIs (just lower versions of the shaders).
The reason these are useful, even on lower end cards, is that some things run drastically faster on a GPU so even a low end one is better than the CPU.
I don't have any good examples specific to compute shaders but an older non-computer shader example would be HD video.
You can do HD H.264 on a lower end CPU so long as you have a GPU that can handle acceleration.
Doesn't have to be a high end one either.2) 64-bit precision.
Former versions of DX required only 32-bit FP max, since that is the most you generally need for graphics (32-bit per channel that is).
However there are other math functions that need higher precision.
DX11 mandates 64-bit FP support.
In the case of the 5000 series, it works well too, 64-bit FP is half the speed of 32-bit FP so slower, but still plenty quick as to be useful.3) Multithreaded rendering/GPU multitasking.
DX11 offers much, much better support for having multiple programs talk to the GPU at the same time.
The idea is to have it fully preemptively multi-task, just like the CPU.
Have the thing be a general purpose resource that can be addressed by multiple programs with no impact.It's a worthwhile new API.
Now I'm not saying "Oh everyone needs a DX11 card!
" If you have an older card and it works fine for you, great stick with it.
However there is a point to wanting to have DX11 in all the segments of the market.
Hopefully we can start having GPUs be used for more than just games on the average system.Also, it makes sense from ATi's point of view.
Rather than maintaining separate designs for separate lines, unify everything.
Their low end DX11 parts are the same thing as their high end DX11 parts, just less of it.
Less shaders, less ROPs, smaller memory controllers, etc.
Makes sense to do that for a low end part, rather than a totally new design.
Keeps your costs down, since most of the development cost was paid for by the high end parts.In terms of hyping it?
Well that's called marketing.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830</parent>
</comment>
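For anyone curious how an application would actually probe for the three capabilities described above at run time, D3D11 exposes them through ID3D11Device::CheckFeatureSupport. A minimal C++ sketch (Windows SDK headers and d3d11.lib assumed; error handling trimmed, illustrative only):

// Sketch: query a D3D11 device for the three capabilities discussed above
// (compute shaders on DX10-class hardware, double precision, multithreading).
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    ID3D11Device* dev = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION, &dev, nullptr, &ctx)))
        return 1;

    // 1) Compute shaders via the DX11 API on downlevel (shader model 4.x) hardware.
    D3D11_FEATURE_DATA_D3D10_X_HARDWARE_OPTIONS cs = {};
    dev->CheckFeatureSupport(D3D11_FEATURE_D3D10_X_HARDWARE_OPTIONS, &cs, sizeof(cs));

    // 2) Double-precision (64-bit FP) shader operations.
    D3D11_FEATURE_DATA_DOUBLES dbl = {};
    dev->CheckFeatureSupport(D3D11_FEATURE_DOUBLES, &dbl, sizeof(dbl));

    // 3) Multithreaded rendering: concurrent resource creation and command lists.
    D3D11_FEATURE_DATA_THREADING thr = {};
    dev->CheckFeatureSupport(D3D11_FEATURE_THREADING, &thr, sizeof(thr));

    std::printf("cs_4_x compute: %d  doubles: %d  concurrent creates: %d  command lists: %d\n",
                cs.ComputeShaders_Plus_RawAndStructuredBuffers_Via_Shader_4_x,
                dbl.DoublePrecisionFloatShaderOps,
                thr.DriverConcurrentCreates, thr.DriverCommandLists);

    ctx->Release();
    dev->Release();
    return 0;
}

Note that double precision in particular is reported per device, which is why the comment's point about the 5000 series supporting it at half rate matters in practice.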
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796212</id>
	<title>Re:What's the point?</title>
	<author>Anonymous</author>
	<datestamp>1263666660000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>MW2 is definitely not one of the more intensive games. The opposite if anything.</p></htmltext>
<tokenext>MW2 is definitely not one of the more intensive games .
The opposite if anything .</tokentext>
<sentencetext>MW2 is definitely not one of the more intensive games.
The opposite if anything.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794104</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794510</id>
	<title>Re:Why?</title>
	<author>BikeHelmet</author>
	<datestamp>1263646320000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>4</modscore>
	<htmltext><p>Google Earth across 6 monitors from a single $100 card? Seems like technology is heading in the right direction!</p></htmltext>
<tokenext>Google Earth across 6 monitors from a single $ 100 card ?
Seems like technology is heading in the right direction !</tokentext>
<sentencetext>Google Earth across 6 monitors from a single $100 card?
Seems like technology is heading in the right direction!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796594</id>
	<title>Re:Meanwhile, NVidia is renaming cards</title>
	<author>i.of.the.storm</author>
	<datestamp>1263760320000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext>The funniest part is that many of the 2xx series cards are just renamed 9000 series cards, and many of those are renamed or die-shrunk 8000 series cards. That said, Charlie Demerjian is hardly an unbiased source of reporting on nVidia, although I think he does have good reasons for his "grudge."</htmltext>
<tokenext>The funniest part is that much of the 2xx series cards are just renamed 9000 series cards , and much of those are renamed or die-shrunk 8000 series cards .
That said , Charlie Demerjian is hardly an unbiased source of reporting on nVidia , although I think he does have good reasons for his " grudge .
"</tokentext>
<sentencetext>The funniest part is that much of the 2xx series cards are just renamed 9000 series cards, and much of those are renamed or die-shrunk 8000 series cards.
That said, Charlie Demerjian is hardly an unbiased source of reporting on nVidia, although I think he does have good reasons for his "grudge.
"</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794426</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794558</id>
	<title>Re:Why?</title>
	<author>Anonymous</author>
	<datestamp>1263646560000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext><p><div class="quote"><p>I'm sorry, I've seen this news go all around tech sites and... I don't get it. Yay, DX11. The biggest new features I could see about it were hardware tessellation and compute shaders. What, this requires a powerful GPU in the first place to be of any use? Something much, much better than this card? Oh....</p></div><p>Sounds like AMD wants to pull the "NVidia GeforceFX 5200 card" in the market to see what happens. The FX5200 was on a huge fail scale for being hyped of DX9 Pixelshader 2 features, it does at a grand 1-3fps. Don't get me started on it's unbelievably poor GLSL support either... But hey, it IS "The way it's meant to be played", so can YOU even complain!?</p></div>
	</htmltext>
<tokenext>I 'm sorry , I 've seen this news go all around tech sites and... I do n't get it .
Yay , DX11 .
The biggest new features I could see about it were hardware tessellation and compute shaders .
What , this requires a powerful GPU in the first place to be of any use ?
Something much , much better than this card ?
Oh....Sounds like AMD wants to pull the " NVidia GeforceFX 5200 card " in the market to see what happens .
The FX5200 was on a huge fail scale for being hyped of DX9 Pixelshader 2 features , it does at a grand 1-3fps .
Do n't get me started on it 's unbelievably poor GLSL support either... But hey , it IS " The way it 's meant to be played " , so can YOU even complain !
?</tokentext>
<sentencetext>I'm sorry, I've seen this news go all around tech sites and... I don't get it.
Yay, DX11.
The biggest new features I could see about it were hardware tessellation and compute shaders.
What, this requires a powerful GPU in the first place to be of any use?
Something much, much better than this card?
Oh....Sounds like AMD wants to pull the "NVidia GeforceFX 5200 card" in the market to see what happens.
The FX5200 was on a huge fail scale for being hyped of DX9 Pixelshader 2 features, it does at a grand 1-3fps.
Don't get me started on it's unbelievably poor GLSL support either... But hey, it IS "The way it's meant to be played", so can YOU even complain!
?
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794548</id>
	<title>Re:Why?</title>
	<author>Kjella</author>
	<datestamp>1263646560000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>The reason these are useful, even on lower end cards, is that some things run drastically faster on a GPU so even a low end one is better than the CPU. I don't have any good examples specific to compute shaders but an older non-computer shader example would be HD video.</p></div><p>Except that for all intents and purposes, it has nothing to do with the GPU. It could just as well have been on a separate chip, like the Broadcom chip for the new Intel Atoms. It could have been on the CPU too for that matter. Right now there's an awful lot of hype, the question is how much is practical reality. Some things are better solved, in fact generally best solved by dedicated hardware like a HD decoder. How much falls between general purpose and dedicated hardware? Very good question.</p></div>
	</htmltext>
<tokenext>The reason these are useful , even on lower end cards , is that some things run drastically faster on a GPU so even a low end one is better than the CPU .
I do n't have any good examples specific to compute shaders but an older non-computer shader example would be HD video.Except that for all intents and purposes , it has nothing to do with the GPU .
It could just as well have been on a separate chip , like the Broadcom chip for the new Intel Atoms .
It could have been on the CPU too for that matter .
Right now there 's an awful lot of hype , the question is how much is practical reality .
Some things are better solved , in fact generally best solved by dedicated hardware like a HD decoder .
How much falls between general purpose and dedicated hardware ?
Very good question .</tokentext>
<sentencetext>The reason these are useful, even on lower end cards, is that some things run drastically faster on a GPU so even a low end one is better than the CPU.
I don't have any good examples specific to compute shaders but an older non-computer shader example would be HD video.Except that for all intents and purposes, it has nothing to do with the GPU.
It could just as well have been on a separate chip, like the Broadcom chip for the new Intel Atoms.
It could have been on the CPU too for that matter.
Right now there's an awful lot of hype, the question is how much is practical reality.
Some things are better solved, in fact generally best solved by dedicated hardware like a HD decoder.
How much falls between general purpose and dedicated hardware?
Very good question.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794188</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30797812</id>
	<title>Re:Of course</title>
	<author>TheRaven64</author>
	<datestamp>1263739200000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>How could hardware not outpace software? I mean it is really hard to develop a game for what does not yet exist. The hardware has to come out, and in particular the API has to come out, then developers can develop for it. They do get engineering samples a little early but still.</p></div><p>Not really true.  Most games have variable quality settings.  You can test the playability at the low quality settings on a top of the line card during development and test the graphics output from the higher settings with a driver that emulates the missing features.  You may only be able to get one frame every few seconds, but that's enough to check that the rendering looks right.  Then you release the game and when the hardware improves people can just turn up the quality a bit.  That way it keeps looking good compared to newer releases and so has a longer shelf life.  A lot of games, and in particular a lot of game engines that are licensed to third parties, work in this way.</p></div>
	</htmltext>
<tokenext>How could hardware not outpace software ?
I mean it is really hard to develop a game for what does not yet exist .
The hardware has to come out , and in particular the API has to come out , then developers can develop for it .
They do get engineering samples a little early but still.Not really true .
Most games have variable quality settings .
You can test the playability at the low quality settings on a top of the line card during development and test the graphics output from the higher settings with a driver that emulates the missing features .
You may only be able to get one frame every few seconds , but that 's enough to check that the rendering looks right .
Then you release the game and when the hardware improves people can just turn up the quality a bit .
That way it keeps looking good compared to newer releases and so has a longer shelf life .
A lot of games , and in particular a lot of game engines that are licensed to third parties , work in this way .</tokentext>
<sentencetext>How could hardware not outpace software?
I mean it is really hard to develop a game for what does not yet exist.
The hardware has to come out, and in particular the API has to come out, then developers can develop for it.
They do get engineering samples a little early but still.Not really true.
Most games have variable quality settings.
You can test the playability at the low quality settings on a top of the line card during development and test the graphics output from the higher settings with a driver that emulates the missing features.
You may only be able to get one frame every few seconds, but that's enough to check that the rendering looks right.
Then you release the game and when the hardware improves people can just turn up the quality a bit.
That way it keeps looking good compared to newer releases and so has a longer shelf life.
A lot of games, and in particular a lot of game engines that are licensed to third parties, work in this way.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794550</parent>
</comment>
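The "driver that emulates the missing features" mentioned above maps onto Direct3D's software device types: WARP and the reference rasterizer can stand in for hardware the developer doesn't have. A sketch of the usual fallback chain (C++, Windows SDK assumed; the reference device additionally needs the developer runtime installed; illustrative only):

// Sketch of the fallback idea: try real hardware first, then the WARP
// software rasterizer, then the (very slow) reference rasterizer, which
// emulates feature levels the installed GPU may lack.
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    const D3D_DRIVER_TYPE drivers[] = {
        D3D_DRIVER_TYPE_HARDWARE,   // normal case
        D3D_DRIVER_TYPE_WARP,       // reasonably fast software fallback
        D3D_DRIVER_TYPE_REFERENCE,  // exact but roughly one frame every few seconds
    };

    ID3D11Device* dev = nullptr;
    ID3D11DeviceContext* ctx = nullptr;
    for (D3D_DRIVER_TYPE type : drivers) {
        if (SUCCEEDED(D3D11CreateDevice(nullptr, type, nullptr, 0, nullptr, 0,
                                        D3D11_SDK_VERSION, &dev, nullptr, &ctx))) {
            std::printf("Created device with driver type %d\n", (int)type);
            break;
        }
    }
    if (!dev) return 1;
    // ... render test frames here and compare output across driver types ...
    ctx->Release();
    dev->Release();
    return 0;
}

That is what makes the parent's testing scenario workable: playability is checked on real hardware at low settings, while image correctness at the higher settings can be checked, slowly, on the software devices.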
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794104</id>
	<title>Re:Whats the point?</title>
	<author>Anonymous</author>
	<datestamp>1263643380000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>4</modscore>
	<htmltext>I think your post is misleading. According to that article, the card gets 46FPS average on Call of Duty: Modern Warfare 2, on 1920x1200, highest settings -- and that's one of the more intensive games. I have no idea what numbers you're quoting.</htmltext>
<tokenext>I think your post is misleading .
According to that article , the card gets 46FPS average on Call of Duty : Modern Warfare 2 , on 1920x1200 , highest settings -- and that 's one of the more intensive games .
I have no idea what numbers you 're quoting .</tokentext>
<sentencetext>I think your post is misleading.
According to that article, the card gets 46FPS average on Call of Duty: Modern Warfare 2, on 1920x1200, highest settings -- and that's one of the more intensive games.
I have no idea what numbers you're quoting.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796644</id>
	<title>Re:Who needs performance</title>
	<author>Anonymous</author>
	<datestamp>1263761160000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>WTF?<br>
Seriously, WTF?<br>
The difference between native 1920x1080 and upscaled 1280x720 is fucking dead obvious.<br>
Hell, even on a 10 year old 21" CRT 1920x1440 and 1280x960 are worlds apart.</htmltext>
<tokenext>WTF ?
Seriously , WTF ?
The difference between native 1920x1080 and upscaled 1280x720 is fucking dead obvious .
Hell , even on a 10 year old 21 " CRT 1920x1440 and 1280x960 are worlds apart .</tokentext>
<sentencetext>WTF?
Seriously, WTF?
The difference between native 1920x1080 and upscaled 1280x720 is fucking dead obvious.
Hell, even on a 10 year old 21" CRT 1920x1440 and 1280x960 are worlds apart.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795740</parent>
</comment>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_33</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30799642
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794124
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_24</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795434
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795468
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794160
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_25</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796530
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_16</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30799328
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_32</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795748
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794124
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_15</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796644
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795740
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794780
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793986
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_22</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794548
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794188
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30800230
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794560
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793872
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794874
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794160
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_28</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795818
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_29</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30797864
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796854
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_19</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794754
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794160
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794510
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_26</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794676
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796942
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793872
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_31</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794194
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796240
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794188
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_18</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30801458
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796594
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794426
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_23</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796916
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794926
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30799428
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796754
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793950
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793872
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_30</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794336
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794054
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_20</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30797604
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794188
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795542
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794188
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_21</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30798760
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796212
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794104
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796208
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795614
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795826
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795460
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_27</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794558
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_34</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30797056
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795740
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_17</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794148
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793986
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_16_2226235_10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30797812
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794550
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793872
</commentlist>
</thread>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_16_2226235.9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794160
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794874
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794754
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795468
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_16_2226235.10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794426
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796594
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30801458
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_16_2226235.3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793844
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_16_2226235.1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793830
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794676
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795434
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795614
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796208
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794510
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794926
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796916
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794054
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794194
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794558
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794188
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795542
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30797604
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796240
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794548
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796530
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_16_2226235.7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795460
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795826
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_16_2226235.5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796854
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30797864
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_16_2226235.8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795740
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796644
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30797056
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_16_2226235.2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794392
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_16_2226235.0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793872
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796942
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794550
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30797812
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794560
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30800230
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793950
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796754
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_16_2226235.6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30793986
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794780
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794148
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_16_2226235.4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794020
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795818
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30799328
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30798760
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794124
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30799642
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30795748
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794336
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30799428
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30794104
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_16_2226235.30796212
</commentlist>
</conversation>
