<article>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#article10_01_07_1951244</id>
	<title>AMD Launches World's First Mobile DirectX 11 GPUs</title>
	<author>kdawson</author>
	<datestamp>1262852400000</datestamp>
	<htmltext>J. Dzhugashvili writes <i>"Less than 4 months after <a href="//games.slashdot.org/story/09/09/23/1530251/AMD-Radeon-HD-5870-Adds-DX11-Multi-Monitor-Gaming">releasing the first DX11 desktop graphics card</a>, AMD has followed up with a whole lineup of mobile graphics processors based on the same architecture. The new <a href="http://techreport.com/articles.x/18238">Mobility Radeon HD 5000 lineup</a> includes four different series of GPUs designed to serve everything from high-end gaming notebooks to mainstream thin-and-light systems. AMD has based these processors on the same silicon chips as its desktop Radeon HD 5000-series graphics cards, so performance shouldn't disappoint. The company also intends to follow Nvidia's lead by offering notebook graphics drivers directly from its website, as opposed to relying on laptop vendors to provide updates."</i></htmltext>
<tokentext>J. Dzhugashvili writes " Less than 4 months after releasing the first DX11 desktop graphics card , AMD has followed up with a whole lineup of mobile graphics processors based on the same architecture .
The new Mobility Radeon HD 5000 lineup includes four different series of GPUs designed to serve everything from high-end gaming notebooks to mainstream thin-and-light systems .
AMD has based these processors on the same silicon chips as its desktop Radeon HD 5000-series graphics cards , so performance should n't disappoint .
The company also intends to follow Nvidia 's lead by offering notebook graphics drivers directly from its website , as opposed to relying on laptop vendors to provide updates .
"</tokentext>
<sentencetext>J. Dzhugashvili writes "Less than 4 months after releasing the first DX11 desktop graphics card, AMD has followed up with a whole lineup of mobile graphics processors based on the same architecture.
The new Mobility Radeon HD 5000 lineup includes four different series of GPUs designed to serve everything from high-end gaming notebooks to mainstream thin-and-light systems.
AMD has based these processors on the same silicon chips as its desktop Radeon HD 5000-series graphics cards, so performance shouldn't disappoint.
The company also intends to follow Nvidia's lead by offering notebook graphics drivers directly from its website, as opposed to relying on laptop vendors to provide updates.
"</sentencetext>
</article>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687284</id>
	<title>Re:Driver Quality?</title>
	<author>Anonymous</author>
	<datestamp>1262857560000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><div class="quote"><p>Perhaps this will increase the actual quality of the Drivers which have been historically so bad?</p><p>--jeffk++</p></div><p>Came here to say the same thing that you said so I'm out!</p>
	</htmltext>
<tokentext>Perhaps this will increase the actual quality of the Drivers which have been historically so bad ? --jeffk + + Came here to say the same thing that you said so I 'm out !</tokentext>
<sentencetext>Perhaps this will increase the actual quality of the Drivers which have been historically so bad?--jeffk++Came here to say the same thing that you said so I'm out!
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30691322</id>
	<title>Re:ATI at it again...</title>
	<author>Anonymous</author>
	<datestamp>1262892120000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Nvidia drops support  for older cards as well.</p></htmltext>
<tokentext>Nvidia drops support for older cards as well .</tokentext>
<sentencetext>Nvidia drops support  for older cards as well.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688724</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30689214</id>
	<title>Re:People Still Use DirectX???</title>
	<author>shutdown -p now</author>
	<datestamp>1262868000000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext><div class="quote"><p>Who the hell other than the poor sods still doing x86 Windows only game/graphics development still uses that turd of an API DirectX?</p></div><p>I know you've specifically excluded Carmack here, but nonetheless, I think his opinion is not exactly irrelevant:</p><p>"DX9 is really quite a good API level. Even with the D3D side of things, where I know I have a long history of people thinking I'm antagonistic against it. Microsoft has done a very, very good job of sensibly evolving it at each step&mdash;they're not worried about breaking backwards compatibility&mdash;and it's a pretty clean API. I especially like the work I'm doing on the 360, and it's probably the best graphics API as far as a sensibly designed thing that I've worked with."</p><p>(the original interview that contained that quote seems to be offline, sadly, so I cannot give you the primary source, but googling for that phrase should give plenty of secondary sources)</p>
	</htmltext>
<tokentext>Who the hell other than the poor sods still doing x86 Windows only game/graphics development still uses that turd of an API DirectX ? I know you 've specifically excluded Carmack here , but nonetheless , I think his opinion is not exactly irrelevant : " DX9 is really quite a good API level .
Even with the D3D side of things , where I know I have a long history of people thinking I 'm antagonistic against it .
Microsoft has done a very , very good job of sensibly evolving it at each step    they 're not worried about breaking backwards compatibility    and it 's a pretty clean API .
I especially like the work I 'm doing on the 360 , and it 's probably the best graphics API as far as a sensibly designed thing that I 've worked with .
" ( the original interview that contained that quote seems to be offline , sadly , so I can not give you the primary source , but googling for that phrase should give plenty of secondary sources )</tokentext>
<sentencetext>Who the hell other than the poor sods still doing x86 Windows only game/graphics development still uses that turd of an API DirectX?I know you've specifically excluded Carmack here, but nonetheless, I think his opinion is not exactly irrelevant:"DX9 is really quite a good API level.
Even with the D3D side of things, where I know I have a long history of people thinking I'm antagonistic against it.
Microsoft has done a very, very good job of sensibly evolving it at each step—they're not worried about breaking backwards compatibility—and it's a pretty clean API.
I especially like the work I'm doing on the 360, and it's probably the best graphics API as far as a sensibly designed thing that I've worked with.
"(the original interview that contained that quote seems to be offline, sadly, so I cannot give you the primary source, but googling for that phrase should give plenty of secondary sources)
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687814</id>
	<title>X [] O</title>
	<author>tepples</author>
	<datestamp>1262860200000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>
Xbox 360 graphics development - DirectX<br>
XNA (Xbox 360 indie games) graphics development - a managed API based on DirectX
</p></htmltext>
<tokentext>Xbox 360 graphics development - DirectX XNA ( Xbox 360 indie games ) graphics development - a managed API based on DirectX</tokentext>
<sentencetext>
Xbox 360 graphics development - DirectX
XNA (Xbox 360 indie games) graphics development - a managed API based on DirectX
</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686964</id>
	<title>And?</title>
	<author>Anonymous</author>
	<datestamp>1262856120000</datestamp>
	<modclass>Redundant</modclass>
	<modscore>-1</modscore>
	<htmltext><p>Better rush out to get this so you can play the whopping <b>3</b> games that support DirectX 11.</p></htmltext>
<tokentext>Better rush out to get this so you can play the whopping 3 games that support DirectX 11 .</tokentext>
<sentencetext>Better rush out to get this so you can play the whopping 3 games that support DirectX 11.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30699428</id>
	<title>Re:People Still Use DirectX???</title>
	<author>servognome</author>
	<datestamp>1262942820000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>You ignore that it also takes more time and resources.<br>There's a marketing window for most projects, whether it's a tie-in to a sport/movie/show or making sure your game isn't considered outdated upon release.<br>Software as a business is about releasing a product that is "good enough."  Spending more time costs money, both in actual cost and potential costs (you could have the same dev team working on another project).<br>Tweaking may get more eyes, but it doesn't necessarily mean more money.</htmltext>
<tokentext>You ignore that it also takes more time and resources.There 's a marketing window for most projects , whether it 's a tie-in to a sport/movie/show or making sure your game is n't considered outdated upon release.Software as a business is about releasing a product that is " good enough .
" Spending more time costs money , both in actual cost and potential costs ( you could have the same dev team working on another project ) .Tweaking may get more eyes , but it does n't necessarily mean more money .</tokentext>
<sentencetext>You ignore that it also takes more time and resources.There's a marketing window for most projects, whether it's a tie-in to a sport/movie/show or making sure your game isn't considered outdated upon release.Software as a business is about releasing a product that is "good enough.
"  Spending more time costs money, both in actual cost and potential costs (you could have the same dev team working on another project).Tweaking may get more eyes, but it doesn't necessarily mean more money.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688888</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044</id>
	<title>People Still Use DirectX???</title>
	<author>Anonymous</author>
	<datestamp>1262856480000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>Who the hell other than the poor sods still doing x86 Windows only game/graphics development still uses that turd of an API DirectX?</p><p>Let's just go over the platforms I work on:</p><p>PC graphics development - OpenGL<br>Linux graphics development - OpenGL<br>Mac graphics development - OpenGL<br>Android graphics development - OpenGL ES<br>iPhone graphics development - OpenGL ES<br>Embedded ARM based system development - OpenGL ES</p><p>even some OpenGL for console development.</p></htmltext>
<tokentext>Who the hell other than the poor sods still doing x86 Windows only game/graphics development still uses that turd of an API DirectX ? Let 's just go over the platforms I work on : PC graphics development - OpenGLLinux graphics development - OpenGLMac graphics development - OpenGLAndroid graphics development - OpenGL ESiPhone graphics development - OpenGL ESEmbedded ARM based system development - OpenGL ESeven some OpenGL for console development .</tokentext>
<sentencetext>Who the hell other than the poor sods still doing x86 Windows only game/graphics development still uses that turd of an API DirectX?Let's just go over the platforms I work on:PC graphics development - OpenGLLinux graphics development - OpenGLMac graphics development - OpenGLAndroid graphics development - OpenGL ESiPhone graphics development - OpenGL ESEmbedded ARM based system development - OpenGL ESeven some OpenGL for console development.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687572</id>
	<title>Linux support is coming, we promise!</title>
	<author>MostAwesomeDude</author>
	<datestamp>1262859000000</datestamp>
	<modclass>Informative</modclass>
	<modscore>4</modscore>
	<htmltext><p>Support in the open-source drivers is being written as fast as ATI can verify and declassify docs. Also the r600/r700 3D code should be mostly reusable for these GPUs.</p></htmltext>
<tokentext>Support in the open-source drivers is being written as fast as ATI can verify and declassify docs .
Also the r600/r700 3D code should be mostly reusable for these GPUs .</tokentext>
<sentencetext>Support in the open-source drivers is being written as fast as ATI can verify and declassify docs.
Also the r600/r700 3D code should be mostly reusable for these GPUs.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688044</id>
	<title>Re:I would rather eat a bag of computer screws...</title>
	<author>nxtw</author>
	<datestamp>1262861460000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><blockquote><div><p>Than buy another graphics card powered by ATI. Seriously, my X1300xt was only supported for like 3 years. Now I can't even use it on a current distro whilst retaining the full functionality/performance of the closed driver.</p></div></blockquote><p>This sounds more like a Linux/X.org problem than an ATI problem.  Users of old graphics cards in Windows can keep using the old drivers, even in newer operating systems.  Even Windows 2000/XP drivers continue to work in Windows 7/2008 R2, although without the features made possible with newer drivers.</p>
	</htmltext>
<tokentext>Than buy another graphics card powered by ATI .
Seriously , my X1300xt was only supported for like 3 years .
Now I ca n't even use it on a current distro whilst retaining the full functionality/performance of the closed driver.This sounds more like a Linux/X.org problem than an ATI problem .
Users of old graphics cards in Windows can keep using the old drivers , even in newer operating systems .
Even Windows 2000/XP drivers continue to work in Windows 7/2008 R2 , although without the features made possible with newer drivers .</tokentext>
<sentencetext>Than buy another graphics card powered by ATI.
Seriously, my X1300xt was only supported for like 3 years.
Now I can't even use it on a current distro whilst retaining the full functionality/performance of the closed driver.This sounds more like a Linux/X.org problem than an ATI problem.
Users of old graphics cards in Windows can keep using the old drivers, even in newer operating systems.
Even Windows 2000/XP drivers continue to work in Windows 7/2008 R2, although without the features made possible with newer drivers.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687402</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687184</id>
	<title>Starry Starry Night</title>
	<author>Anonymous</author>
	<datestamp>1262857140000</datestamp>
	<modclass>Offtopic</modclass>
	<modscore>-1</modscore>
	<htmltext><p>If there was a big tech shocker this Christmas, it was the fact that the Kindle was the top selling gift of all time, according to multiple news sources. Really? If so, did the entire industry miss the mark, leaving the space open to a guy whose company specializes in online books sales? A guy who has never really entered the hardware space in any meaningful way before, except as an investor?</p><p>How did this happen?</p><p>The kicker here, is the fact that numerous failed attempts at a pad machine have been made, beginning with the imaginary Dynabook in the 1970s and including various WinPads and other tablets right up to the Microsoft announcement of a tablet platform a few years ago. You remember that, right? This was the platform that was set to dominate all computing by 2000 or 2004 or whatever. In the meantime, a slew of "convertible" laptops evolved and subsequently ended up in the trash heap of innovation.</p><p>Apparently all anyone actually wanted was a device they could read books on. Of course, this may only be the beginning, should Apple come out with its iSlate or iPad--what I prefer to just call a Giant iPod Touch.</p><p>So, what changed? Why are we making this sort of platform shift, all of the sudden?</p><p>How's this for an idea: This would have happened a long time ago, were it not for Microsoft. The same holds true for the smartphone. It would have become an object of desire a lot sooner, had Microsoft laid off.</p><p>The software giant has been a hindrance to progress ever since it transformed from a subversive company to a kind of IBM-clone to a lackey for big business. There's no connection between the company and the end-user anymore.</p><p>To understand this, we need to travel back to the introduction of Excel. The dominant spreadsheet app used to be Lotus 1-2-3. The program put Lotus at the top of the software heap. The company was bigger than Microsoft--by a lot. 
Microsoft had a crummy spreadsheet program called Multiplan at the time, but it discovered a guy who was working on a Lotus killer. His app became Excel and Microsoft ran a series of humorous ads featuring people skulking around an office, showing each other Excel in a manner that suggested their fictitious company wouldn't approve of the use of the unauthorized product. All of the workers continued to use it on the sly, because they could get so much more work done.</p><p>The ad was totally subversive. It fit right in with the mid-1970s PC revolution, where attendees at those early computer conferences would boo IBM's name. The entire "micro-computer" scene was very subversive in that way. But Bill Gates said that he'd like Microsoft to become the software version of IBM. It's since managed that feat, and now Microsoft is as likely to get booed as IBM was back in 1976</p><p>And you know what? Judging from the company's recent history, it deserves it. Microsoft is the post-modern IBM, serving as the same sort of hindrance to the scene that IBM was 50 years ago. Microsoft is hindering the industry with its vision, stumbling into places it doesn't belong. The company is like the big, dumb rich kid you don't want at your party. He comes anyway and knocks over the punchbowl--again. The company ruins markets just by showing up.</p><p>Microsoft entered the smartphone space early on and slowed things down in the process because nobody wanted to compete with its money and fickle approach. It's not worth risking that the company would take your good idea, re-brand your idea, and ruin the whole thing for everyone. It's a scorched earth policy. Examples are everywhere. Look at FrontPage, a very functional HTML editor bought and branded as a Microsoft product. The company kept mucking with the app until it was useless. The product line was eventually shuttered. HTML and page editing gravitated to the Mac in order to avoid Microsoft (the aforementioned big, dumb rich kid). 
That Mac software was eventually re-coded for Windows.</p><p>Microsoft's tablet was doomed from the beginning--actually, it's been doomed numerous times over the years. The c</p></htmltext>
<tokentext>If there was a big tech shocker this Christmas , it was the fact that the Kindle was the top selling gift of all time , according to multiple news sources .
Really ? If so , did the entire industry miss the mark , leaving the space open to a guy whose company specializes in online books sales ?
A guy who has never really entered the hardware space in any meaningful way before , except as an investor ? How did this happen ? The kicker here , is the fact that numerous failed attempts at a pad machine have been made , beginning with the imaginary Dynabook in the 1970s and including various WinPads and other tablets right up to the Microsoft announcement of a tablet platform a few years ago .
You remember that , right ?
This was the platform that was set to dominate all computing by 2000 or 2004 or whatever .
In the meantime , a slew of " convertible " laptops evolved and subsequently ended up in the trash heap of innovation.Apparently all anyone actually wanted was a device they could read books on .
Of course , this may only be the beginning , should Apple come out with its iSlate or iPad--what I prefer to just call a Giant iPod Touch.So , what changed ?
Why are we making this sort of platform shift , all of the sudden ? How 's this for an idea : This would have happened a long time ago , were it not for Microsoft .
The same holds true for the smartphone .
It would have become an object of desire a lot sooner , had Microsoft laid off.The software giant has been a hindrance to progress ever since it transformed from a subversive company to a kind of IBM-clone to a lackey for big business .
There 's no connection between the company and the end-user anymore.To understand this , we need to travel back to the introduction of Excel .
The dominant spreadsheet app used to be Lotus 1-2-3 .
The program put Lotus at the top of the software heap .
The company was bigger than Microsoft--by a lot .
Microsoft had a crummy spreadsheet program called Multiplan at the time , but it discovered a guy who was working on a Lotus killer .
His app became Excel and Microsoft ran a series of humorous ads featuring people skulking around an office , showing each other Excel in a manner that suggested their fictitious company would n't approve of the use of the unauthorized product .
All of the workers continued to use it on the sly , because they could get so much more work done.The ad was totally subversive .
It fit right in with the mid-1970s PC revolution , where attendees at those early computer conferences would boo IBM 's name .
The entire " micro-computer " scene was very subversive in that way .
But Bill Gates said that he 'd like Microsoft to become the software version of IBM .
It 's since managed that feat , and now Microsoft is as likely to get booed as IBM was back in 1976And you know what ?
Judging from the company 's recent history , it deserves it .
Microsoft is the post-modern IBM , serving as the same sort of hindrance to the scene that IBM was 50 years ago .
Microsoft is hindering the industry with its vision , stumbling into places it does n't belong .
The company is like the big , dumb rich kid you do n't want at your party .
He comes anyway and knocks over the punchbowl--again .
The company ruins markets just by showing up.Microsoft entered the smartphone space early on and slowed things down in the process because nobody wanted to compete with its money and fickle approach .
It 's not worth risking that the company would take your good idea , re-brand your idea , and ruin the whole thing for everyone .
It 's a scorched earth policy .
Examples are everywhere .
Look at FrontPage , a very functional HTML editor bought and branded as a Microsoft product .
The company kept mucking with the app until it was useless .
The product line was eventually shuttered .
HTML and page editing gravitated to the Mac in order to avoid Microsoft ( the aforementioned big , dumb rich kid ) .
That Mac software was eventually re-coded for Windows.Microsoft 's tablet was doomed from the beginning--actually , it 's been doomed numerous times over the years .
The c</tokentext>
<sentencetext>If there was a big tech shocker this Christmas, it was the fact that the Kindle was the top selling gift of all time, according to multiple news sources.
Really? If so, did the entire industry miss the mark, leaving the space open to a guy whose company specializes in online books sales?
A guy who has never really entered the hardware space in any meaningful way before, except as an investor?How did this happen?The kicker here, is the fact that numerous failed attempts at a pad machine have been made, beginning with the imaginary Dynabook in the 1970s and including various WinPads and other tablets right up to the Microsoft announcement of a tablet platform a few years ago.
You remember that, right?
This was the platform that was set to dominate all computing by 2000 or 2004 or whatever.
In the meantime, a slew of "convertible" laptops evolved and subsequently ended up in the trash heap of innovation.Apparently all anyone actually wanted was a device they could read books on.
Of course, this may only be the beginning, should Apple come out with its iSlate or iPad--what I prefer to just call a Giant iPod Touch.So, what changed?
Why are we making this sort of platform shift, all of the sudden?How's this for an idea: This would have happened a long time ago, were it not for Microsoft.
The same holds true for the smartphone.
It would have become an object of desire a lot sooner, had Microsoft laid off.The software giant has been a hindrance to progress ever since it transformed from a subversive company to a kind of IBM-clone to a lackey for big business.
There's no connection between the company and the end-user anymore.To understand this, we need to travel back to the introduction of Excel.
The dominant spreadsheet app used to be Lotus 1-2-3.
The program put Lotus at the top of the software heap.
The company was bigger than Microsoft--by a lot.
Microsoft had a crummy spreadsheet program called Multiplan at the time, but it discovered a guy who was working on a Lotus killer.
His app became Excel and Microsoft ran a series of humorous ads featuring people skulking around an office, showing each other Excel in a manner that suggested their fictitious company wouldn't approve of the use of the unauthorized product.
All of the workers continued to use it on the sly, because they could get so much more work done.The ad was totally subversive.
It fit right in with the mid-1970s PC revolution, where attendees at those early computer conferences would boo IBM's name.
The entire "micro-computer" scene was very subversive in that way.
But Bill Gates said that he'd like Microsoft to become the software version of IBM.
It's since managed that feat, and now Microsoft is as likely to get booed as IBM was back in 1976And you know what?
Judging from the company's recent history, it deserves it.
Microsoft is the post-modern IBM, serving as the same sort of hindrance to the scene that IBM was 50 years ago.
Microsoft is hindering the industry with its vision, stumbling into places it doesn't belong.
The company is like the big, dumb rich kid you don't want at your party.
He comes anyway and knocks over the punchbowl--again.
The company ruins markets just by showing up.Microsoft entered the smartphone space early on and slowed things down in the process because nobody wanted to compete with its money and fickle approach.
It's not worth risking that the company would take your good idea, re-brand your idea, and ruin the whole thing for everyone.
It's a scorched earth policy.
Examples are everywhere.
Look at FrontPage, a very functional HTML editor bought and branded as a Microsoft product.
The company kept mucking with the app until it was useless.
The product line was eventually shuttered.
HTML and page editing gravitated to the Mac in order to avoid Microsoft (the aforementioned big, dumb rich kid).
That Mac software was eventually re-coded for Windows.Microsoft's tablet was doomed from the beginning--actually, it's been doomed numerous times over the years.
The c</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30695398</id>
	<title>Re:Driver Quality?</title>
	<author>Anonymous</author>
	<datestamp>1262969880000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><div class="quote"><p>Obviously you have never tried running Linux on a system with a ATI graphics card.</p></div><p>. One update and suddenly your graphics drivers won't work and X won't start. Then it's back down to the CLI to figure out why the fully supported drivers with full 6600GT support don't work with your 6600GT.</p><p>P.S. I've been jaded by automatic updates.</p></div><p>Wrong, dkms takes care of automatically (re)compiling the nvidia module if needed. This happens on boot, before X starts. All good.</p>
	</htmltext>
<tokentext>Obviously you have never tried running Linux on a system with a ATI graphics card.. One update and suddenly your graphics drivers wo n't work and X wo n't start .
Then it 's back down to the CLI to figure out why the fully supported drivers with full 6600GT support do n't work with your 6600GT.P.S .
I 've been jaded by automatic updates.Wrong , dkms takes care of automatically ( re ) compiling the nvidia module if needed .
This happens on boot , before X starts .
All good .</tokentext>
<sentencetext>Obviously you have never tried running Linux on a system with a ATI graphics card.. One update and suddenly your graphics drivers won't work and X won't start.
Then it's back down to the CLI to figure out why the fully supported drivers with full 6600GT support don't work with your 6600GT.P.S.
I've been jaded by automatic updates.Wrong, dkms takes care of automatically (re)compiling the nvidia module if needed.
This happens on boot, before X starts.
All good.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688586</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296</id>
	<title>Re:Driver Quality?</title>
	<author>Anonymous</author>
	<datestamp>1262857620000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext>1995 called and wants their "ATI drivers are crap" comment back.</htmltext>
<tokentext>1995 called and wants their " ATI drivers are crap " comment back .</tokentext>
<sentencetext>1995 called and wants their "ATI drivers are crap" comment back.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688402</id>
	<title>Re:And?</title>
	<author>dunkelfalke</author>
	<datestamp>1262862960000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>So, in your opinion, all technical progress should stop at once?</p></htmltext>
<tokentext>So , in your opinion , all technical progress should stop at once ?</tokentext>
<sentencetext>So, in your opinion, all technical progress should stop at once?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686964</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688458</id>
	<title>Re:Driver Quality?</title>
	<author>cynyr</author>
	<datestamp>1262863260000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext>hmm xrandr support, new kernel support? can I run vs. git sources? or just 1-2 releases back? does it support the 57xx and 58xx cards yet? how about TVout? Also can I use the card "hard"(WoW raids) for 4+ hours? and maintain uptimes of weeks? how about the current release of xorg? All of the above only applies to linux.<br> <br>

Anyway, until then I'll be sticking with Nvidia cards.</htmltext>
<tokenext>hmm xrandr support , new kernel support ?
can I run vs. git sources ?
or just 1-2 releases back ?
does it support the 57xx and 58xx cards yet ?
how about TVout ?
Also can I use the card " hard " ( WoW raids ) for 4 + hours ?
and maintain uptimes of weeks ?
how about the current release of xorg ?
All of the above only applies to linux .
Anyway , until then I 'll be sticking with Nvidia cards .</tokenext>
<sentencetext>hmm xrandr support, new kernel support?
can I run vs. git sources?
or just 1-2 releases back?
does it support the 57xx and 58xx cards yet?
how about TVout?
Also can I use the card "hard"(WoW raids) for 4+ hours?
and maintain uptimes of weeks?
how about the current release of xorg?
All of the above only applies to linux.
Anyway, until then I'll be sticking with Nvidia cards.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687360</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687360</id>
	<title>Re:Driver Quality?</title>
	<author>Anonymous</author>
	<datestamp>1262857980000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Agreed. It's been some time since ATI/AMD has produced bad graphics drivers.</p></htmltext>
<tokenext>Agreed .
It 's been some time since ATI/AMD has produced bad graphics drivers .</tokentext>
<sentencetext>Agreed.
It's been some time since ATI/AMD has produced bad graphics drivers.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687500</id>
	<title>Innovationz!!!!</title>
	<author>NotBorg</author>
	<datestamp>1262858640000</datestamp>
	<modclass>Funny</modclass>
	<modscore>3</modscore>
	<htmltext><p>DirectX 11 in a mobile device?  So the device doubles as a hairdryer?</p></htmltext>
<tokenext>DirectX 11 in a mobile device ?
So the device doubles as a hairdryer ?</tokentext>
<sentencetext>DirectX 11 in a mobile device?
So the device doubles as a hairdryer?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30691342</id>
	<title>Re:People Still Use DirectX???</title>
	<author>RAMMS+EIN</author>
	<datestamp>1262892360000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The question is, with all your arguments favoring OpenGL over Direct3D, why are developers still using the latter?</p><p>And why do the developers of many portable 3D libraries and game development libraries (Crystal Space, OGRE, etc.) support Direct3D, even though they already have OpenGL support, which would supposedly work just fine?</p></htmltext>
<tokenext>The question is , with all your arguments favoring OpenGL over Direct3D , why are developers still using the latter ?
And why do the developers of many portable 3D libraries and game development libraries ( Crystal Space , OGRE , etc. ) support Direct3D , even though they already have OpenGL support , which would supposedly work just fine ?</tokenext>
<sentencetext>The question is, with all your arguments favoring OpenGL over Direct3D, why are developers still using the latter?
And why do the developers of many portable 3D libraries and game development libraries (Crystal Space, OGRE, etc.) support Direct3D, even though they already have OpenGL support, which would supposedly work just fine?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688888</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30689650</id>
	<title>Re:Driver Quality?</title>
	<author>Arterion</author>
	<datestamp>1262871600000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Actually, if you rely on your laptop manufacturer to provide you updated drivers, they DO suck.  The decision to offer the drivers from their website for mobile cards is an amazing decision they should've made years ago.</p></htmltext>
<tokenext>Actually , if you rely on your laptop manufacturer to provide you updated drivers , they DO suck .
The decision to offer the drivers from their website for mobile cards is an amazing decision they should 've made years ago .</tokenext>
<sentencetext>Actually, if you rely on your laptop manufacturer to provide you updated drivers, they DO suck.
The decision to offer the drivers from their website for mobile cards is an amazing decision they should've made years ago.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687402</id>
	<title>I would rather eat a bag of computer screws...</title>
	<author>Anonymous</author>
	<datestamp>1262858160000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Than buy another graphics card powered by ATI. Seriously, my X1300xt was only supported for like 3 years. Now I can't even use it on a current distro whilst retaining the full functionality/performance of the closed driver.</p><p>To ATI: Support your products for at least 5 years (like NVidia does), or I will *never* buy from you again!</p></htmltext>
<tokenext>Than buy another graphics card powered by ATI .
Seriously , my X1300xt was only supported for like 3 years .
Now I ca n't even use it on a current distro whilst retaining the full functionality/performance of the closed driver .
To ATI : Support your products for at least 5 years ( like NVidia does ) , or I will * never * buy from you again !</tokenext>
<sentencetext>Than buy another graphics card powered by ATI.
Seriously, my X1300xt was only supported for like 3 years.
Now I can't even use it on a current distro whilst retaining the full functionality/performance of the closed driver.
To ATI: Support your products for at least 5 years (like NVidia does), or I will *never* buy from you again!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687810</id>
	<title>Most of the game world</title>
	<author>Sycraft-fu</author>
	<datestamp>1262860140000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>3</modscore>
	<htmltext><p>As well as a good deal of other Windows graphic programs. You can stick your head in the sand and pretend that Microsoft Windows isn't a major player, but you are fooling only yourself. Windows development matters a whole lot, and DX is the native API and thus many use it.</p><p>However, in this case the reference is to features of the card. See OpenGL is really bad about staying up to date with hardware. They are always playing catchup and often their "support" is just to have the vendors implement their own extensions. So when a new card comes out, talking about it in terms of OpenGL features isn't useful.</p><p>Well, new versions of DirectX neatly map to new hardware features. Reason is MS works with the card vendors. They tell the vendors what they'd like to see, the vendors tell them what they are working on for their next gen chips and so on. So a "DX11" card means "A card that supports the full DirectX 11 feature set." This implies many things, like 64-bit FP support, support for new shader models, and so on. It can be conveniently summed up as DX11. This sets it apart from a DX10 card like the 8800. While that can run with DX11 APIs, it doesn't support the features. Calling it DX10 means it supports the full DX10 feature set.</p><p>So that's the reason. If you want to yell and scream how OpenGL should rule the world, you can go right ahead, however the simple fact of the matter is DirectX is a major, major player in the graphics market.</p></htmltext>
<tokenext>As well as a good deal of other Windows graphic programs .
You can stick your head in the sand and pretend that Microsoft Windows is n't a major player , but you are fooling only yourself .
Windows development matters a whole lot , and DX is the native API and thus many use it .
However , in this case the reference is to features of the card .
See OpenGL is really bad about staying up to date with hardware .
They are always playing catchup and often their " support " is just to have the vendors implement their own extensions .
So when a new card comes out , talking about it in terms of OpenGL features is n't useful .
Well , new versions of DirectX neatly map to new hardware features .
Reason is MS works with the card vendors .
They tell the vendors what they 'd like to see , the vendors tell them what they are working on for their next gen chips and so on .
So a " DX11 " card means " A card that supports the full DirectX 11 feature set .
" This implies many things , like 64-bit FP support , support for new shader models , and so on .
It can be conveniently summed up as DX11 .
This sets it apart from a DX10 card like the 8800 .
While that can run with DX11 APIs , it does n't support the features .
Calling it DX10 means it supports the full DX10 feature set .
So that 's the reason .
If you want to yell and scream how OpenGL should rule the world , you can go right ahead , however the simple fact of the matter is DirectX is a major , major player in the graphics market .</tokentext>
<sentencetext>As well as a good deal of other Windows graphic programs.
You can stick your head in the sand and pretend that Microsoft Windows isn't a major player, but you are fooling only yourself.
Windows development matters a whole lot, and DX is the native API and thus many use it.
However, in this case the reference is to features of the card.
See OpenGL is really bad about staying up to date with hardware.
They are always playing catchup and often their "support" is just to have the vendors implement their own extensions.
So when a new card comes out, talking about it in terms of OpenGL features isn't useful.
Well, new versions of DirectX neatly map to new hardware features.
Reason is MS works with the card vendors.
They tell the vendors what they'd like to see, the vendors tell them what they are working on for their next gen chips and so on.
So a "DX11" card means "A card that supports the full DirectX 11 feature set.
" This implies many things, like 64-bit FP support, support for new shader models, and so on.
It can be conveniently summed up as DX11.
This sets it apart from a DX10 card like the 8800.
While that can run with DX11 APIs, it doesn't support the features.
Calling it DX10 means it supports the full DX10 feature set.
So that's the reason.
If you want to yell and scream how OpenGL should rule the world, you can go right ahead, however the simple fact of the matter is DirectX is a major, major player in the graphics market.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687892</id>
	<title>Re:Driver Quality?</title>
	<author>perrin</author>
	<datestamp>1262860680000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>It was only 3 years ago when I gave up on ATI and switched to NVidia because ATI's drivers could not handle bad inputs, and would crash the entire system. So I had to write my own abstraction layer to ensure that no bad point coordinates and so on could be sent to the driver. I also filed kernel crash bugs with ATI that took forever to get fixed. After I switched to NVidia, I have yet to see a single kernel failure due to programming mistakes. Their drivers are just rock solid. So much better to develop on that it would take a lot to go back. I also had much the same bad experience with the open source Intel drivers.</p></htmltext>
<tokenext>It was only 3 years ago when I gave up on ATI and switched to NVidia because ATI 's drivers could not handle bad inputs , and would crash the entire system .
So I had to write my own abstraction layer to ensure that no bad point coordinates and so on could be sent to the driver .
I also filed kernel crash bugs with ATI that took forever to get fixed .
After I switched to NVidia , I have yet to see a single kernel failure due to programming mistakes .
Their drivers are just rock solid .
So much better to develop on that it would take a lot to go back .
I also had much the same bad experience with the open source Intel drivers .</tokentext>
<sentencetext>It was only 3 years ago when I gave up on ATI and switched to NVidia because ATI's drivers could not handle bad inputs, and would crash the entire system.
So I had to write my own abstraction layer to ensure that no bad point coordinates and so on could be sent to the driver.
I also filed kernel crash bugs with ATI that took forever to get fixed.
After I switched to NVidia, I have yet to see a single kernel failure due to programming mistakes.
Their drivers are just rock solid.
So much better to develop on that it would take a lot to go back.
I also had much the same bad experience with the open source Intel drivers.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688050</id>
	<title>Re:People Still Use DirectX???</title>
	<author>Anonymous</author>
	<datestamp>1262861520000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I still remember Half Life 1 / Counter Strike beta through 1.6, and getting much better results on OpenGL than on the DirectX crap at the time (was it 5 or 6?).  Since then, it seems most modern games require DirectX, and no longer offer the OpenGL option, which is a true shame.</p></htmltext>
<tokenext>I still remember Half Life 1 / Counter Strike beta through 1.6 , and getting much better results on OpenGL than on the DirectX crap at the time ( was it 5 or 6 ? ) .
Since then , it seems most modern games require DirectX , and no longer offer the OpenGL option , which is a true shame .</tokenext>
<sentencetext>I still remember Half Life 1 / Counter Strike beta through 1.6, and getting much better results on OpenGL than on the DirectX crap at the time (was it 5 or 6?).
Since then, it seems most modern games require DirectX, and no longer offer the OpenGL option, which is a true shame.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30691648</id>
	<title>Re:And?</title>
	<author>AHuxley</author>
	<datestamp>1262983020000</datestamp>
	<modclass>Troll</modclass>
	<modscore>0</modscore>
	<htmltext>All you need for the game of monopoly that is the windows 7 desktop.</htmltext>
<tokenext>All you need for the game of monopoly that is the windows 7 desktop .</tokentext>
<sentencetext>All you need for the game of monopoly that is the windows 7 desktop.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687454</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687754</id>
	<title>Re:People Still Use DirectX???</title>
	<author>Anonymous</author>
	<datestamp>1262859900000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext><p>Yeah those "poor sods" making multi-million dollar grossing titles.  Seriously, I'm all for OpenGL.  I like it because it does make ports easier and I'd like to see more games available on Linux and Mac.</p><p>The snide "are people STILL using technology X?" comments when technology X is the clear market leader are just annoying though.</p></htmltext>
<tokenext>Yeah those " poor sods " making multi-million dollar grossing titles .
Seriously , I 'm all for OpenGL .
I like it because it does make ports easier and I 'd like to see more games available on Linux and Mac.The snide " are people STILL using technology X ?
" comments when technology X is the clear market leader are just annoying though .</tokentext>
<sentencetext>Yeah those "poor sods" making multi-million dollar grossing titles.
Seriously, I'm all for OpenGL.
I like it because it does make ports easier and I'd like to see more games available on Linux and Mac.The snide "are people STILL using technology X?
" comments when technology X is the clear market leader are just annoying though.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30691284</id>
	<title>Re:Driver Quality?</title>
	<author>Anonymous</author>
	<datestamp>1262891760000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I just got off the phone with 1996 and they say they want their comebacks.</p></htmltext>
<tokenext>I just got off the phone with 1996 and they say they want their comebacks .</tokentext>
<sentencetext>I just got off the phone with 1996 and they say they want their comebacks.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687644</id>
	<title>Re:Driver Quality?</title>
	<author>armanox</author>
	<datestamp>1262859360000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Recently I've not seen an issue - HD4550 in my desktop, HD3200 in my lappy, XPress200m in old lappy (which did have major driver issues in 2006)</htmltext>
<tokenext>Recently I 've not seen an issue - HD4550 in my desktop , HD3200 in my lappy , XPress200m in old lappy ( which did have major driver issues in 2006 )</tokentext>
<sentencetext>Recently I've not seen an issue - HD4550 in my desktop, HD3200 in my lappy, XPress200m in old lappy (which did have major driver issues in 2006)</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687352</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688586</id>
	<title>Re:Driver Quality?</title>
	<author>BikeHelmet</author>
	<datestamp>1262863860000</datestamp>
	<modclass>Informative</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>Obviously you have never tried running Linux on a system with a ATI graphics card.</p></div><p>Obviously you have never tried running Linux on a system with a nVidia graphics card.</p><p>It's seriously a PITA to get new drivers working on a new kernel with an old card. Anything pre-GeForce 8 may have annoying issues. Not a problem for desktop linux with a new videocard - but if you were setting up a Myth box on that old Athlon XP w/ 6600GT, you may be in for a headache.</p><p>Avoid distros like Ubuntu with automatic kernel updates. One update and suddenly your graphics drivers won't work and X won't start. Then it's back down to the CLI to figure out why the fully supported drivers with full 6600GT support don't work with your 6600GT.</p><p>P.S. I've been jaded by automatic updates.</p></htmltext>
<tokenext>Obviously you have never tried running Linux on a system with a ATI graphics card .
Obviously you have never tried running Linux on a system with a nVidia graphics card .
It 's seriously a PITA to get new drivers working on a new kernel with an old card .
Anything pre-GeForce 8 may have annoying issues .
Not a problem for desktop linux with a new videocard - but if you were setting up a Myth box on that old Athlon XP w/ 6600GT , you may be in for a headache .
Avoid distros like Ubuntu with automatic kernel updates .
One update and suddenly your graphics drivers wo n't work and X wo n't start .
Then it 's back down to the CLI to figure out why the fully supported drivers with full 6600GT support do n't work with your 6600GT .
P.S. I 've been jaded by automatic updates .</tokenext>
<sentencetext>Obviously you have never tried running Linux on a system with a ATI graphics card.
Obviously you have never tried running Linux on a system with a nVidia graphics card.
It's seriously a PITA to get new drivers working on a new kernel with an old card.
Anything pre-GeForce 8 may have annoying issues.
Not a problem for desktop linux with a new videocard - but if you were setting up a Myth box on that old Athlon XP w/ 6600GT, you may be in for a headache.
Avoid distros like Ubuntu with automatic kernel updates.
One update and suddenly your graphics drivers won't work and X won't start.
Then it's back down to the CLI to figure out why the fully supported drivers with full 6600GT support don't work with your 6600GT.
P.S. I've been jaded by automatic updates.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687352</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30690018</id>
	<title>Re:And?</title>
	<author>Anonymous</author>
	<datestamp>1262874960000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>Direct X 11? Is that anything like Direct X 10? Last story I heard said I should rush out to get a Direct X 10 video card so I can play the whopping 3 games that support Direct X 10. I haven't heard any other stories updating that number.</htmltext>
<tokenext>Direct X 11 ?
Is that anything like Direct X 10 ?
Last story I heard said I should rush out to get a Direct X 10 video card so I can play the whopping 3 games that support Direct X 10 .
I have n't heard any other stories updating that number .</tokentext>
<sentencetext>Direct X 11?
Is that anything like Direct X 10?
Last story I heard said I should rush out to get a Direct X 10 video card so I can play the whopping 3 games that support Direct X 10.
I haven't heard any other stories updating that number.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686964</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687518</id>
	<title>Re:People Still Use DirectX???</title>
	<author>Anonymous</author>
	<datestamp>1262858700000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>You don't play many Windows games, do you?  Most of them use DirectX, IIRC.</p></htmltext>
<tokenext>You do n't play many Windows games , do you ?
Most of them use DirectX , IIRC .</tokentext>
<sentencetext>You don't play many Windows games, do you?
Most of them use DirectX, IIRC.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30690282</id>
	<title>Re:Driver Quality?</title>
	<author>hairyfeet</author>
	<datestamp>1262877900000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>What exactly is wrong with ATI drivers? Exactly? Because I was actually worried when I bought my first ATI a few years back because of all the horror stories I had heard, but frankly I ain't had a bit of trouble out of ANY of my AMD/ATI gear. </p><p>I started with an ATI X1950 IIRC, because my 6200 was getting long in the tooth and Nvidia wanted crazy money for anything AGP (BTW you can <a href="http://www.newegg.com/Product/Product.aspx?Item=N82E16814131329" title="newegg.com">still get</a> [newegg.com] decent AGP cards from ATI) and when I installed it (after using Drivercleaner of course) behold! It all just worked. And it is still working as a matter of fact, with my youngest boy using it with a 3.06Ghz Celeron to play Lunia, Aruarose, Perfect World, and a few other MMORPGs.</p><p>When I passed it and the 3.6Ghz P4 on down to the boys I decided to take the plunge and support competition and go all AMD. First I gamed for nearly 3 months on the IGP! of my 780v board (it played Bioshock! It didn't suck!) and then upgraded my dual to a quad core and my GPU to a 4650HD. To even push my luck I got an ATI USB TV Tuner off of Woot! so I could watch cable on my monitor. To my complete surprise, even the TV Tuner, which anyone who has ever had one can tell you can be seriously flaky driver wise, just worked beautifully. i have pushed my luck by upgrading the drivers a couple of times, even changed OSes from XP X64 to Windows 7 HP x64, and it all "just works" day in and day out, nary a glitch or skip, and it all runs cool &amp; quiet without a bit of troubles.</p><p>

So what <em>exactly is wrong</em> with ATI drivers? Because surely with 3 different boxes, running 3 different OSes (XP32, XP64, 7 HP x64) I would have run into something, wouldn't I? Surely I just can't be the luckiest ATI customer on the planet? And since the "bang for the buck" is squarely in the AMD/ATI camp I have been selling a lot of lower end AMD duals and quads on ATI boards and have yet to have a customer complaint there either. So what am I missing?</p></htmltext>
<tokenext>What exactly is wrong with ATI drivers ?
Exactly ? Because I was actually worried when I bought my first ATI a few years back because of all the horror stories I had heard , but frankly I ai n't had a bit of trouble out of ANY of my AMD/ATI gear .
I started with an ATI X1950 IIRC , because my 6200 was getting long in the tooth and Nvidia wanted crazy money for anything AGP ( BTW you can still get [ newegg.com ] decent AGP cards from ATI ) and when I installed it ( after using Drivercleaner of course ) behold !
It all just worked .
And it is still working as a matter of fact , with my youngest boy using it with a 3.06Ghz Celeron to play Lunia , Aruarose , Perfect World , and a few other MMORPGs .
When I passed it and the 3.6Ghz P4 on down to the boys I decided to take the plunge and support competition and go all AMD .
First I gamed for nearly 3 months on the IGP !
of my 780v board ( it played Bioshock !
It did n't suck !
) and then upgraded my dual to a quad core and my GPU to a 4650HD .
To even push my luck I got an ATI USB TV Tuner off of Woot !
so I could watch cable on my monitor .
To my complete surprise , even the TV Tuner , which anyone who has ever had one can tell you can be seriously flaky driver wise , just worked beautifully .
i have pushed my luck by upgrading the drivers a couple of times , even changed OSes from XP X64 to Windows 7 HP x64 , and it all " just works " day in and day out , nary a glitch or skip , and it all runs cool &amp; quiet without a bit of troubles .
So what exactly is wrong with ATI drivers ?
Because surely with 3 different boxes , running 3 different OSes ( XP32 , XP64 , 7 HP x64 ) I would have run into something , would n't I ?
Surely I just ca n't be the luckiest ATI customer on the planet ?
And since the " bang for the buck " is squarely in the AMD/ATI camp I have been selling a lot of lower end AMD duals and quads on ATI boards and have yet to have a customer complaint there either .
So what am I missing ?</tokentext>
<sentencetext>What exactly is wrong with ATI drivers?
Exactly? Because I was actually worried when I bought my first ATI a few years back because of all the horror stories I had heard, but frankly I ain't had a bit of trouble out of ANY of my AMD/ATI gear.
I started with an ATI X1950 IIRC, because my 6200 was getting long in the tooth and Nvidia wanted crazy money for anything AGP (BTW you can still get [newegg.com] decent AGP cards from ATI) and when I installed it (after using Drivercleaner of course) behold!
It all just worked.
And it is still working as a matter of fact, with my youngest boy using it with a 3.06Ghz Celeron to play Lunia, Aruarose, Perfect World, and a few other MMORPGs.
When I passed it and the 3.6Ghz P4 on down to the boys I decided to take the plunge and support competition and go all AMD.
First I gamed for nearly 3 months on the IGP!
of my 780v board (it played Bioshock!
It didn't suck!
) and then upgraded my dual to a quad core and my GPU to a 4650HD.
To even push my luck I got an ATI USB TV Tuner off of Woot!
so I could watch cable on my monitor.
To my complete surprise, even the TV Tuner, which anyone who has ever had one can tell you can be seriously flaky driver wise, just worked beautifully.
i have pushed my luck by upgrading the drivers a couple of times, even changed OSes from XP X64 to Windows 7 HP x64, and it all "just works" day in and day out, nary a glitch or skip, and it all runs cool &amp; quiet without a bit of troubles.
So what exactly is wrong with ATI drivers?
Because surely with 3 different boxes, running 3 different OSes (XP32, XP64, 7 HP x64) I would have run into something, wouldn't I?
Surely I just can't be the luckiest ATI customer on the planet?
And since the "bang for the buck" is squarely in the AMD/ATI camp I have been selling a lot of lower end AMD duals and quads on ATI boards and have yet to have a customer complaint there either.
So what am I missing?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688898</id>
	<title>Re:I would rather eat a bag of computer screws...</title>
	<author>Anonymous</author>
	<datestamp>1262865600000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>The architecture changed significantly. Not to mention that it actually has more functionality under the open-source driver than it ever did under the closed-source one. What in the hell are you bitching about?</htmltext>
<tokenext>The architecture changed significantly .
Not to mention that it actually has more functionality under the open-source driver than it ever did under the closed-source one .
What in the hell are you bitching about ?</tokentext>
<sentencetext>The architecture changed significantly.
Not to mention that it actually has more functionality under the open-source driver than it ever did under the closed-source one.
What in the hell are you bitching about?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687402</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687330</id>
	<title>Re:People Still Use DirectX???</title>
	<author>Anonymous</author>
	<datestamp>1262857860000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext>And that makes perfect sense if you're targeting all those different platforms.  There may even be perfectly reasonable reasons to use OpenGL over DirectX based on your coding requirements and the APIs.  However, if your target audience is Windows and Windows Embedded only, and there are no requirements that are better served by OpenGL, there's no reason not to use DirectX.<br> <br>
It's just a tool.</htmltext>
<tokenext>And that makes perfect sense if you 're targeting all those different platforms .
There may even be perfectly reasonable reasons to use OpenGL over DirectX based on your coding requirements and the APIs .
However , if your target audience is Windows and Windows Embedded only , and there are no requirements that are better served by OpenGL , there 's no reason not to use DirectX .
It 's just a tool .</tokentext>
<sentencetext>And that makes perfect sense if you're targeting all those different platforms.
There may even be perfectly reasonable reasons to use OpenGL over DirectX based on your coding requirements and the APIs.
However, if your target audience is Windows and Windows Embedded only, and there are no requirements that are better served by OpenGL, there's no reason not to use DirectX.
It's just a tool.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687888</id>
	<title>Re:People Still Use DirectX???</title>
	<author>Anonymous</author>
	<datestamp>1262860620000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Ever hear of XBOX? What the hell underlying code do you think they use? It's called XBOX, with the X meaning DirectX. Now, it is not entirely the same as on the PC, but it shares some of the same underpinnings.</p></htmltext>
<tokenext>Ever hear of XBOX what the hell underlying code do you think they use , it 's called XBOX with the X meaning DirectX Now it is not entirely the same as on the PC but shares some of the same underpinnings .</tokentext>
<sentencetext>Ever hear of XBOX? What the hell underlying code do you think they use? It's called XBOX, with the X meaning DirectX. Now, it is not entirely the same as on the PC, but it shares some of the same underpinnings.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688888</id>
	<title>Re:People Still Use DirectX???</title>
	<author>Lodragandraoidh</author>
	<datestamp>1262865540000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>3</modscore>
	<htmltext><p>I read your post and it occurred to me that it illustrates perfectly a key problem with software development today: short-sightedness.</p><p>In an age of fast multiprocessing, it only makes sense to do everything you can to create abstraction layers that will ensure:</p><p>1.  My software will have the widest possible audience regardless of platform.  $$$</p><p>2.  I will be able to extend the application, or create a new one with minimal effort, by reusing modules I've already created to do hard things well/fast.  $$$ (in the form of turn-around time/effort)</p><p>3.  If a vendor decides to break something in their firmware/hardware - I only have to fix one module that drives the given hardware - *NOT* the application itself.  $$$ (ditto)</p><p>Flexibility, resiliency, more cash in your pocket... I don't see a downside to taking this approach.  On modern gaming rigs in particular, there is no reason NOT to use OpenGL - for all its perceived limitations compared to a tweaked-out DirectX x86 app.</p><p>As a gamer myself, I look at it from another angle: I have Linux and Mac machines as well as a high-end Windows game rig. To host games cost-efficiently (I like to create and share my own maps/scenarios in some games) I prefer to use the Linux server, and play on my Windows box... using and tweaking WINE in order to run the game (I'm not made of money and can't cost-justify a full complement of Windows servers - which would also waste resources, since I am a *nix developer too).  Getting WINE to work with some of the niche games I play is a royal pain.   If the developers of said games took my advice, I would be running their games natively under Linux with minimal headaches.</p><p>Flexibility and choice are good for the widest audience.  Vendor lock-in is bad - and only serves a few types of people (the corporation$$$ and simple gamer-$$$).  
The funny thing is, these companies stand to make more money than they would under their lock-in strategy if they would think long term and build flexible, extensible applications that benefit the largest audience.  Lucky for me, most of the titles I currently enjoy have taken this approach; I will continue to gravitate to those that do, and deny $$$ to those that won't.</p></htmltext>
<tokenext>I read your post and it occurred to me that it illustrates perfectly a key problem with software development today : short sightedness.In an age of fast multiprocessing , it only makes sense to do everything you can to create abstraction layers that will ensure : 1 .
My software will have the widest possible audience regardless of platform .
$ $ $ 2. I will be able to extend the application , or create a new one with minimal effort by reusing modules I 've already created to do hard things well/fast .
$ $ $ ( in form of turn-around time/effort ) 3 .
If a vendor decides to break something in their firmware/hardware - I only have to fix one module that drives the given hardware - * NOT * the application itself .
$ $ $ ( ditto ) Flexibility , resiliency , more cash in your pocket...I do n't see a down side to taking this approach .
On modern gaming rigs in particular , there is no reason NOT to use OpenGL - for all it 's perceived limitations compared to a tweaked out directX X86 app.As a gamer myself , I look at it from another angle : I have Linux , Mac machines as well as a high-end Windows game rig - to host games ( I like to create and share my own maps/scenarios in some games ) cost efficiently I prefer to use the Linux server , and play on my Windows box....using and tweaking WINE in order to run the game ( I 'm not made of money and ca n't cost-justify a full compliment of windows servers - which also would waste resources since I am a * nix developer too ) .
Getting WINE to work with some of the niche games I play is a royal pain .
If the developers of said games took my advice , I would be running their games natively under linux with minimal headaches.Flexibility and choice is good for the widest audience .
Vendor lock-in is bad - and only serves a few types of people ( the corporation $ $ $ and simple gamer- $ $ $ ) .
The funny thing is , these companies stand to make more money than they would under their lock-n strategy if they would think long term and build flexible extensible applications that benefit the largest audience .
Lucky for me most of the titles I currently enjoy have taken this approach ; I will continue to gravitate to those that do , and deny $ $ $ to those that wo n't .</tokentext>
<sentencetext>I read your post and it occurred to me that it illustrates perfectly a key problem with software development today: short-sightedness.
In an age of fast multiprocessing, it only makes sense to do everything you can to create abstraction layers that will ensure:
1. My software will have the widest possible audience regardless of platform. $$$
2. I will be able to extend the application, or create a new one with minimal effort, by reusing modules I've already created to do hard things well/fast. $$$ (in the form of turn-around time/effort)
3. If a vendor decides to break something in their firmware/hardware - I only have to fix one module that drives the given hardware - *NOT* the application itself. $$$ (ditto)
Flexibility, resiliency, more cash in your pocket... I don't see a downside to taking this approach.
On modern gaming rigs in particular, there is no reason NOT to use OpenGL - for all its perceived limitations compared to a tweaked-out DirectX x86 app.
As a gamer myself, I look at it from another angle: I have Linux and Mac machines as well as a high-end Windows game rig. To host games cost-efficiently (I like to create and share my own maps/scenarios in some games) I prefer to use the Linux server, and play on my Windows box... using and tweaking WINE in order to run the game (I'm not made of money and can't cost-justify a full complement of Windows servers - which would also waste resources, since I am a *nix developer too).
Getting WINE to work with some of the niche games I play is a royal pain.
If the developers of said games took my advice, I would be running their games natively under Linux with minimal headaches.
Flexibility and choice are good for the widest audience.
Vendor lock-in is bad - and only serves a few types of people (the corporation$$$ and simple gamer-$$$).
The funny thing is, these companies stand to make more money than they would under their lock-in strategy if they would think long term and build flexible, extensible applications that benefit the largest audience.
Lucky for me, most of the titles I currently enjoy have taken this approach; I will continue to gravitate to those that do, and deny $$$ to those that won't.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687330</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687840</id>
	<title>Re:Linux support is coming, we promise!</title>
	<author>Sir\_Lewk</author>
	<datestamp>1262860320000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext><p>At first glance, from the subject line, I thought your post was a snide comment about the state of official ATI drivers on linux.  I must say though, you guys are doing an excellent job at picking up ATI's slack.</p></htmltext>
<tokenext>At first glance , from the subject line , I thought your post was a snide comment about the state of official ATI drivers on linux .
I must say though , you guys are doing an excellent job at picking up ATI 's slack .</tokentext>
<sentencetext>At first glance, from the subject line, I thought your post was a snide comment about the state of official ATI drivers on linux.
I must say though, you guys are doing an excellent job at picking up ATI's slack.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687572</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687632</id>
	<title>Re:Driver Quality?</title>
	<author>Anonymous</author>
	<datestamp>1262859240000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>you're joking right? not that nvidia's are wonderful, but ati's are still loaded with unhandled exceptions (read bsod) for applications that are NOT the top 3 latest games.  try running some 3d workstation apps or even demoscene stuff.  ati has a long way to go.</p></htmltext>
<tokenext>you 're joking right ?
not that nvidia 's are wonderful , but ati 's are still loaded with unhandled exceptions ( read bsod ) for applications that are NOT the top 3 latest games .
try running some 3d workstation apps or even demoscene stuff .
ati has a long way to go .</tokentext>
<sentencetext>you're joking right?
not that nvidia's are wonderful, but ati's are still loaded with unhandled exceptions (read bsod) for applications that are NOT the top 3 latest games.
try running some 3d workstation apps or even demoscene stuff.
ati has a long way to go.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687360</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688256</id>
	<title>Re:Driver Quality?</title>
	<author>cpicon92</author>
	<datestamp>1262862180000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>Obviously you have never tried running Linux on a system with an ATI graphics card.</p></div><p>
It works fine for me... I've never built an Nvidia system and ATI graphics drivers have always come through for me.</p>
	</htmltext>
<tokenext>Obviously you have never tried running Linux on a system with a ATI graphics card .
It works fine for me... I 've never built an nvidea system and ati graphics drivers have always come through for me .</tokentext>
<sentencetext>Obviously you have never tried running Linux on a system with an ATI graphics card.
It works fine for me... I've never built an Nvidia system and ATI graphics drivers have always come through for me.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687352</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30707472</id>
	<title>Re:Driver Quality?</title>
	<author>toddestan</author>
	<datestamp>1263054600000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Have you seen the interface they use for the Windows drivers?  It's like a giant bloated turd cooked up in .NET, which makes even the simplest tasks a chore.  Why they moved away from the interface they used back around 2004 I'll never figure out.  At least they are stable now.</p></htmltext>
<tokenext>Have you seen the interface they use for the Windows drivers ?
It 's like a giant bloated turd cooked up in .NET , which makes even the simplest tasks a chore .
Why they moved away from the interface they used back around 2004 I 'll never figure out .
At least they are stable now .</tokentext>
<sentencetext>Have you seen the interface they use for the Windows drivers?
It's like a giant bloated turd cooked up in .NET, which makes even the simplest tasks a chore.
Why they moved away from the interface they used back around 2004 I'll never figure out.
At least they are stable now.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688232</id>
	<title>Re:People Still Use DirectX???</title>
	<author>Kjella</author>
	<datestamp>1262862120000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>4</modscore>
	<htmltext><p>Only all the AAA games on Windows, but clearly you are far more important than them.</p></htmltext>
<tokenext>Only all the AAA games on Windows , but clearly you are far more important than them .</tokentext>
<sentencetext>Only all the AAA games on Windows, but clearly you are far more important than them.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687352</id>
	<title>Re:Driver Quality?</title>
	<author>Anonymous</author>
	<datestamp>1262857920000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>5</modscore>
	<htmltext><div class="quote"><p>1995 called and wants their "ATI drivers are crap" comment back.</p></div><p>Obviously you have never tried running Linux on a system with an ATI graphics card.</p>
	</htmltext>
<tokenext>1995 called and wants their " ATI drivers are crap " comment back.Obviously you have never tried running Linux on a system with a ATI graphics card .</tokentext>
<sentencetext>1995 called and wants their "ATI drivers are crap" comment back.
Obviously you have never tried running Linux on a system with an ATI graphics card.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688724</id>
	<title>ATI at it again...</title>
	<author>GooberToo</author>
	<datestamp>1262864460000</datestamp>
	<modclass>Informative</modclass>
	<modscore>1</modscore>
	<htmltext><p>Just upgraded my brother's laptop over the holiday. Seems ATI dropped support for his GPU in their proprietary driver, so now he has a choice. Option one, use the open source drivers, which provide no 3D acceleration. Basically 3D is completely unusable. Option two, use an older distribution which has the required version of X, kernel support, and all dependent software. And with the second option comes all the associated security issues of running an old and unsupported distro. He chose to run a current distro and be stuck with 2D-only acceleration. All of the 3D games he had on his laptop are now completely unplayable; measured in fractions of frames per second.</p><p>It turns out ATI decided they would simply stop supporting his GPU and AFAIK, they have not released any 3D documentation on it. This is exactly the reason I've gone out of my way to never buy ATI. They drop support of cards like crazy, leaving users completely stuck. And in something like laptops, which is exactly what this article is about, that means your entire laptop is now obsolete.</p><p>I don't care how many ATI fanboys there are that want to bash NVIDIA for providing binary blobs - the fact is, their stuff works and works well, and best of all, they don't leave their users high and dry. The only problems I've had with NVIDIA were years ago, when they first started providing 64-bit Linux drivers. So say what you will to support ATI; at the end of the day, they are still doing the same old thing and hurting their customers. Case in point: I have an NVIDIA video card which is older than my brother's laptop and is still supported by NVIDIA's drivers.</p><p>So what do you want as a user? Stuff that works year after year, or a company (ATI) telling you when your equipment is obsolete and that you need to replace the entire computer?</p><p>For Linux there is still only one 3D option - NVIDIA. Period.</p></htmltext>
<tokenext>Just upgraded my brother 's laptop over the holiday .
Seems ATI dropped support for his GPU in their proprietary driver so now he has a choice .
Option one , use the open source drivers which provide no 3d acceleration .
Basically 3D is completely unusable .
Option two , use an older distribution which has the required version of X , kernel support , and all dependent software .
And with the second option comes all the associated security issues of running an old and unsupported distro .
He chose to run a current distro and be stuck with 2d-only acceleration .
All of the 3d games he had on his laptop are now completely unplayable ; measured in fractions of frames per second.It turns out ATI decided they would simply stop supporting his GPU and AFAIK , they have not released any 3D documentation on it .
This is exactly the reason I 've gone out of my way to never buy ATI .
They drop support of cards like crazy leaving users completely stuck .
And in something like laptops , which is exactly what this article is about , that means your entire laptop is now obsolete.I do n't care how many ATI fanboys there are that want to bash NVIDIA for providing binary blobs - the fact is , their stuff works and works well and best of all , they do n't leave their users high and dry .
The only problems I 've had with NVIDIA was years ago when their first started providing 64-bit Linux drivers .
So say what you will to support ATI , at the end of the day , they are still doing the same old thing and hurting their customers .
Case in point , I have an nvidia video card which is older than my brothers laptop which is still supported by NVIDIA 's drivers.So what do you want as a user ?
Stuff that works year after year or a company ( ATI ) telling you when your equipment is obsolete and that you need to replace the entire computer ? For Linux there is still only one 3D option - NVIDIA .
Period .</tokentext>
<sentencetext>Just upgraded my brother's laptop over the holiday.
Seems ATI dropped support for his GPU in their proprietary driver so now he has a choice.
Option one, use the open source drivers which provide no 3d acceleration.
Basically 3D is completely unusable.
Option two, use an older distribution which has the required version of X, kernel support, and all dependent software.
And with the second option comes all the associated security issues of running an old and unsupported distro.
He chose to run a current distro and be stuck with 2d-only acceleration.
All of the 3D games he had on his laptop are now completely unplayable; measured in fractions of frames per second.
It turns out ATI decided they would simply stop supporting his GPU and AFAIK, they have not released any 3D documentation on it.
This is exactly the reason I've gone out of my way to never buy ATI.
They drop support of cards like crazy leaving users completely stuck.
And in something like laptops, which is exactly what this article is about, that means your entire laptop is now obsolete.
I don't care how many ATI fanboys there are that want to bash NVIDIA for providing binary blobs - the fact is, their stuff works and works well, and best of all, they don't leave their users high and dry.
The only problems I've had with NVIDIA were years ago, when they first started providing 64-bit Linux drivers.
So say what you will to support ATI, at the end of the day, they are still doing the same old thing and hurting their customers.
Case in point: I have an NVIDIA video card which is older than my brother's laptop and is still supported by NVIDIA's drivers.
So what do you want as a user?
Stuff that works year after year, or a company (ATI) telling you when your equipment is obsolete and that you need to replace the entire computer?
For Linux there is still only one 3D option - NVIDIA.
Period.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687572</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970</id>
	<title>Driver Quality?</title>
	<author>Anonymous</author>
	<datestamp>1262856120000</datestamp>
	<modclass>Troll</modclass>
	<modscore>0</modscore>
	<htmltext><p>Perhaps this will increase the actual quality of the drivers, which have been historically so bad?</p><p>--jeffk++</p></htmltext>
<tokenext>Perhaps this will increase the actual quality of the Drivers which have been historically so bad ? --jeffk + +</tokentext>
<sentencetext>Perhaps this will increase the actual quality of the drivers, which have been historically so bad?
--jeffk++</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30689642</id>
	<title>Re:Driver Quality?</title>
	<author>Hurricane78</author>
	<datestamp>1262871480000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext><p>Well, perhaps it&rsquo;s BECAUSE THEY STILL ARE!</p><p>I have written many lengthy comments about it. When they did still use APIs that were so old that, after being deprecated for a long time, they were taken completely out of the kernel. Rendering the drivers useless.</p><p>The same thing now happened with Xorg 1.7.</p><p>And how long ago did neither compositing, nor xrandr work? One or two months?</p><p>Hell, video still does not work. (Oh, it renders it. But unless you want to see huge black and white blots of over- and underexposure at the same time, while having huge blocking in that tiny color space in-between, you can not call it &ldquo;working&rdquo;.)<br>Also, acceleration is NIL.</p><p>And let&rsquo;s not forget that I can reproducibly crash the driver by compiling the kernel or a big program in a terminal. Or switch a monitor off when in console mode. Basically everything where that crutch called &ldquo;atieventsd&rdquo; does not receive an event.</p><p>And don&rsquo;t even dare to ask about proper OpenGL 3.0 + GLSL support.</p><p>And for the Linux driver being the piece of shit that the Windows driver is, with an emergency layer wrapped around by a one-man team (seriously: ATi Linux driver development is <em>one</em> poor guy), that&rsquo;s still impressive!</p><p>I will never again buy an ATi card, unless they open-source EVERYTHING! No exceptions. And then I wait a year on top of that, for the Xorg team to catch up.</p><p>You can say what you want about nVidia&rsquo;s binary blob. But when I could not use my brand-new HD 4850 <em>at all</em>, a year ago, I was very happy that the onboard nVidia chip &ldquo;just worked&rdquo;. No hassle. emerge nvidia-drivers, and <em>DONE</em>.</p></htmltext>
<tokenext>Well , perhaps it    b BECAUSE THEY STILL ARE ! I have written many lengthy comments about it .
When they did still use APIs that were so old , that after being deprecated for a long time , they were taken completely out of the kernel .
Rendering the drivers useless.The same thing now happened with Xorg 1.7.And how long ago did neither compositing , nor xrandr work ?
One or two months ? Hell , video still does not work .
( Oh , it renders it .
But unless you want to see huge black and white blots of over and underexposure at the same time , while having huge blocking in that tiny color space in-between , you can not call it    working    .
) Also , acceleration is NIL.And let    s not forget that I can reproducibly crash the driver , by compiling the kernel or a big program in a terminal .
Or swich a monitor off when in console mode .
Basically everything where that crutch called    atieventsd    does not receive an event.And don    t even dare to ask about proper OpenGL 3.0 + GLSL support.And for the Linux driver being a the piece of shit that the Windows driver is , with a emergency layer wrapped around by a one-man team ( seriously : ATi Linux driver development is one poor guy ) , that    s still impressive ! I will never again buy an ATi card , unless they open-source EVERYTHING !
No exceptions .
And then I wait a year on top of that , for the Xorg team to catch up.You can say what you want about nVidia    s binary blob .
But when I could not use my brand-new HD 4850 at all , a year ago , I was very happy that the onboard nVidia chip    just worked    .
No hassle .
emerge nvidia-drivers , and DONE .</tokentext>
<sentencetext>Well, perhaps it’s BECAUSE THEY STILL ARE!
I have written many lengthy comments about it.
When they did still use APIs that were so old that, after being deprecated for a long time, they were taken completely out of the kernel.
Rendering the drivers useless.
The same thing now happened with Xorg 1.7.
And how long ago did neither compositing, nor xrandr work?
One or two months?
Hell, video still does not work.
(Oh, it renders it.
But unless you want to see huge black and white blots of over- and underexposure at the same time, while having huge blocking in that tiny color space in-between, you can not call it “working”.)
Also, acceleration is NIL.
And let’s not forget that I can reproducibly crash the driver by compiling the kernel or a big program in a terminal.
Or switch a monitor off when in console mode.
Basically everything where that crutch called “atieventsd” does not receive an event.
And don’t even dare to ask about proper OpenGL 3.0 + GLSL support.
And for the Linux driver being the piece of shit that the Windows driver is, with an emergency layer wrapped around by a one-man team (seriously: ATi Linux driver development is one poor guy), that’s still impressive!
I will never again buy an ATi card, unless they open-source EVERYTHING!
No exceptions.
And then I wait a year on top of that, for the Xorg team to catch up.
You can say what you want about nVidia’s binary blob.
But when I could not use my brand-new HD 4850 at all, a year ago, I was very happy that the onboard nVidia chip “just worked”.
No hassle.
emerge nvidia-drivers, and DONE.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30690606</id>
	<title>Re:Driver Quality?</title>
	<author>JDeane</author>
	<datestamp>1262881800000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I have been using ATI cards since the Radeon 7000 PCI model (AGP was still new and most games didn't need that much bandwidth lol). To be honest I have like 3 issues. One is current (might be Windows 7): StarCraft gets all trippy-looking on Bnet. It plays ok, but after like 10 minutes I feel like I licked the wrong kind of paper.</p><p>The other 2 issues are with installing the newest drivers; this happened a long time ago under XP and just last week on 7 64-bit edition.</p><p>I had to use WinRAR to decompress the installer for the newest driver, 9.12, and use the driver installer in device manager to get it to install.  Usually I just double click and go go go lol.   Having to do it that way brought back memories of Windows 95 (it was always best to use the device manager to do drivers back then).</p><p>At least I was not the only one; it's a known issue with the current driver. The weird thing is not everyone is having it, and a fresh install of Windows does not have it happen. Look for a 9.12b or something soon.</p><p>Other than those ultra-rare issues I am a happy ATI customer.</p></htmltext>
<tokenext>I have been using ATI 's since the Radion 7000 PCI model ( AGP was still new and most games did n't need that much bandwidth lol ) to be honest I have like 3 issues .
One is current ( might be Windows 7 ) Star Craft gets all trippy looking on Bnet .
It plays ok but after like 10 minutes I feel like I licked the wrong kind of paper.The other 2 issues are with installing the newest drivers , this happened a long time ago under XP and just last week on 7 64 bit edition.I had to use winrar to decompress the installer for the newest driver 9.12 and use the driver installer in device manager to get it to install .
Usually I just double click and go go go lol Having to do it that way brought back memories of Windows 95 ( it was always best to use the device manager to do drivers back then ) At least I was not the only one its a known issue with the current driver , the weird thing is not everyone is having it and a fresh install of windows does not have it happen .
Look for a 9.12b or something soon.Other then those ultra rare issues I am happy ATI customer .</tokentext>
<sentencetext>I have been using ATI cards since the Radeon 7000 PCI model (AGP was still new and most games didn't need that much bandwidth lol); to be honest I have like 3 issues.
One is current (might be Windows 7): StarCraft gets all trippy-looking on Bnet.
It plays ok, but after like 10 minutes I feel like I licked the wrong kind of paper.
The other 2 issues are with installing the newest drivers; this happened a long time ago under XP and just last week on 7 64-bit edition.
I had to use WinRAR to decompress the installer for the newest driver, 9.12, and use the driver installer in device manager to get it to install.
Usually I just double click and go go go lol. Having to do it that way brought back memories of Windows 95 (it was always best to use the device manager to do drivers back then).
At least I was not the only one; it's a known issue with the current driver. The weird thing is not everyone is having it, and a fresh install of Windows does not have it happen.
Look for a 9.12b or something soon.
Other than those ultra-rare issues I am a happy ATI customer.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30690282</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688162</id>
	<title>Re:Linux support is coming, we promise!</title>
	<author>Ash-Fox</author>
	<datestamp>1262861880000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>Support in the open-source drivers is being written as fast as ATI can verify and declassify docs.</p></div></blockquote><p>Personally, I've not been impressed with the 'correct' opensource effort when it comes to 3D acceleration support; see <a href="http://ash-fox.livejournal.com/34854.html" title="livejournal.com">my blog entry</a> [livejournal.com] for more details.</p>
	</htmltext>
<tokenext>Support in the open-source drivers is being written as fast as ATI can verify and declassify docs .
Personally , I 've not been impressed with the 'correct ' open-source effort when it comes to 3D acceleration support ; see my blog entry [ livejournal.com ] for more details .</tokentext>
<sentencetext>Support in the open-source drivers is being written as fast as ATI can verify and declassify docs.
Personally, I've not been impressed with the 'correct' open-source effort when it comes to 3D acceleration support; see my blog entry [livejournal.com] for more details.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687572</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687454</id>
	<title>Re:And?</title>
	<author>Jeremy Erwin</author>
	<datestamp>1262858400000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Are these cards fast enough to run the games in DX11 mode?</p></htmltext>
<tokenext>Are these cards fast enough to run the games in DX11 mode ?</tokentext>
<sentencetext>Are these cards fast enough to run the games in DX11 mode?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686964</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30690012</id>
	<title>Re:Driver Quality?</title>
	<author>apoc.famine</author>
	<datestamp>1262874960000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The $300 card I bought around 2002 had enough driver issues that I've never bought another ATI card since. So you can add at least 7 years onto that.</p></htmltext>
<tokenext>The $ 300 card I bought around 2002 had enough driver issues that I 've never bought another ATI card since .
So you can add at least 7 years onto that .</tokentext>
<sentencetext>The $300 card I bought around 2002 had enough driver issues that I've never bought another ATI card since.
So you can add at least 7 years onto that.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30693284</id>
	<title>heh dx10</title>
	<author>Ilgaz</author>
	<datestamp>1262959920000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Funny thing is, MS never learned their lesson. DirectX 10 was Vista-exclusive (!!!) technology and all gamers were running XP! So, except the usual MS ass kisser companies, nobody was stupid enough to release a DirectX 10 game.</p><p>Guess what? DirectX 11 is a Windows 7-exclusive technology!</p><p>I pity the idiots coding in DirectX only in this age, especially after the iPhone and Intel OS X revolution. How many years must pass for them to understand?</p></htmltext>
<tokenext>Funny thing is , MS never learned their lesson .
DirectX 10 was Vista-exclusive ( ! ! ! ) technology and all gamers were running XP !
So , except the usual MS ass kisser companies , nobody was stupid enough to release a DirectX 10 game .
Guess what ?
DirectX 11 is a Windows 7-exclusive technology !
I pity the idiots coding in DirectX only in this age , especially after the iPhone and Intel OS X revolution .
How many years must pass for them to understand ?</tokentext>
<sentencetext>Funny thing is, MS never learned their lesson.
DirectX 10 was Vista-exclusive (!!!) technology and all gamers were running XP!
So, except the usual MS ass kisser companies, nobody was stupid enough to release a DirectX 10 game.
Guess what?
DirectX 11 is a Windows 7-exclusive technology!
I pity the idiots coding in DirectX only in this age, especially after the iPhone and Intel OS X revolution.
How many years must pass for them to understand?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30690018</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687992</id>
	<title>Re:Linux support is coming, we promise!</title>
	<author>MemoryDragon</author>
	<datestamp>1262861220000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Not fast enough: I dumped my perfectly fine Radeon 4850 in favor of a somewhat slower NVidia; the reason was that X support was hit-and-miss: half the 3D functions crashed X, others worked. I then dropped in my NVidia card and everything worked out of the box.<br>I do not care how many years we've gotten promises; the linux drivers suck donkey balls, and probably will forever.<br>Wake me up when the stability is up to NVidia's offerings, or, shock, the Intel open-source drivers.</p></htmltext>
<tokenext>Not fast enough : I dumped my perfectly fine Radeon 4850 in favor of a somewhat slower NVidia ; the reason was that X support was hit-and-miss : half the 3D functions crashed X , others worked .
I then dropped in my NVidia card and everything worked out of the box .
I do not care how many years we 've gotten promises ; the linux drivers suck donkey balls , and probably will forever .
Wake me up when the stability is up to NVidia 's offerings , or , shock , the Intel open-source drivers .</tokentext>
<sentencetext>Not fast enough: I dumped my perfectly fine Radeon 4850 in favor of a somewhat slower NVidia; the reason was that X support was hit-and-miss: half the 3D functions crashed X, others worked.
I then dropped in my NVidia card and everything worked out of the box.
I do not care how many years we've gotten promises; the linux drivers suck donkey balls, and probably will forever.
Wake me up when the stability is up to NVidia's offerings, or, shock, the Intel open-source drivers.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687572</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30689926</id>
	<title>Re:People Still Use DirectX???</title>
	<author>Anonymous</author>
	<datestamp>1262874300000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Exactly!  I am looking for OpenGL 3.2 support on video cards.  Don't tell me about locked-down, proprietary one-offs that aren't compatible with anything else, and lock me into expensive "we'll fix it when you pay us more" graphics.  OpenGL means I get a choice.  If I'm not happy, I can go somewhere else.  With DirectX, I'm stuck.  I like to keep my options open.  I like to be able to change vendors/suppliers as I decide.  I don't want expensive hits because of the lock-in.  I don't have expensive hits if I use OpenGL.  I have expensive hits if I use DirectX.  OpenGL has always gone places DirectX never did.</p></htmltext>
<tokenext>Exactly !
I am looking for OpenGL 3.2 support on video cards .
Do n't tell me about locked-down , proprietary one-offs that are n't compatible with anything else , and lock me into expensive " we 'll fix it when you pay us more " graphics .
OpenGL means I get a choice .
If I 'm not happy , I can go somewhere else .
With DirectX , I 'm stuck .
I like to keep my options open .
I like to be able to change vendors/suppliers as I decide .
I do n't want expensive hits because of the lock-in .
I do n't have expensive hits if I use OpenGL .
I have expensive hits if I use DirectX .
OpenGL has always gone places DirectX never did .</tokentext>
<sentencetext>Exactly!
I am looking for OpenGL 3.2 support on video cards.
Don't tell me about locked-down, proprietary one-offs that aren't compatible with anything else, and lock me into expensive "we'll fix it when you pay us more" graphics.
OpenGL means I get a choice.
If I'm not happy, I can go somewhere else.
With DirectX, I'm stuck.
I like to keep my options open.
I like to be able to change vendors/suppliers as I decide.
I don't want expensive hits because of the lock-in.
I don't have expensive hits if I use OpenGL.
I have expensive hits if I use DirectX.
OpenGL has always gone places DirectX never did.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688536</id>
	<title>Re:People Still Use DirectX???</title>
	<author>sexconker</author>
	<datestamp>1262863620000</datestamp>
	<modclass>Troll</modclass>
	<modscore>0</modscore>
	<htmltext><div class="quote"><p>The snide "are people STILL using technology X?" comments when technology X is the clear market leader are just annoying though.</p></div><p>Are people STILL being snide on the internet?</p>
	</htmltext>
<tokenext>The snide " are people STILL using technology X ? " comments when technology X is the clear market leader are just annoying though .
Are people STILL being snide on the internet ?</tokentext>
<sentencetext>The snide "are people STILL using technology X?" comments when technology X is the clear market leader are just annoying though.
Are people STILL being snide on the internet?
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687754</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687564</id>
	<title>Re:Driver Quality?</title>
	<author>Anonymous</author>
	<datestamp>1262858940000</datestamp>
	<modclass>Interestin</modclass>
	<modscore>4</modscore>
	<htmltext><p>I have three in my system.<nobr> <wbr></nobr>:3</p></htmltext>
<tokenext>I have three in my system .
: 3</tokentext>
<sentencetext>I have three in my system.
:3</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687352</parent>
</comment>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_28</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688044
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687402
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30690012
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_29</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30689926
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_19</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688162
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687572
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_30</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687814
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_26</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30691284
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_31</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30690606
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30690282
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688898
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687402
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_18</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687892
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_23</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687888
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687840
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687572
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30695398
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688586
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687352
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_20</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30707472
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_21</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30691648
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687454
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686964
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687518
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687632
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687360
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687992
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687572
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687810
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_27</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30699428
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688888
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687330
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_17</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30689214
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30691322
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688724
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687572
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688402
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686964
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688458
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687360
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_33</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30693284
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30690018
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686964
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_24</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30691342
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688888
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687330
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687284
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687564
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687352
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_25</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688232
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_16</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688050
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_32</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688256
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687352
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_15</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687644
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687352
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30689642
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_22</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688536
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687754
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_01_07_1951244_14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30689650
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
</commentlist>
</thread>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_07_1951244.2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687184
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_07_1951244.0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687044
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687888
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687330
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688888
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30691342
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30699428
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687754
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688536
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687814
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30689926
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688232
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688050
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687810
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30689214
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687518
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_07_1951244.5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686964
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687454
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30691648
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688402
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30690018
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30693284
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_07_1951244.6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30686970
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687284
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30690282
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30690606
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687296
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30691284
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687360
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688458
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687632
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30690012
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687892
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30689642
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687352
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687564
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687644
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688586
----http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30695398
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688256
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30689650
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30707472
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_07_1951244.3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687402
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688898
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688044
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_07_1951244.1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687500
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_01_07_1951244.4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687572
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687840
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688724
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30691322
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30688162
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_01_07_1951244.30687992
</commentlist>
</conversation>
