<article>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#article09_11_17_2035200</id>
	<title>NVIDIA Ships Decent DX10 Graphics Card For Under $100</title>
	<author>kdawson</author>
	<datestamp>1258450020000</datestamp>
	<htmltext>MojoKid writes <i>"NVIDIA is launching a new mainstream graphics card today, aimed at consumers in the market for a relatively low-cost upgrade from an integrated graphics solution or older entry-level GPU. The new <a href="http://hothardware.com/printarticle.aspx?articleid=1419">GeForce GT 240 features a GPU with 96 processor cores</a>, 8 ROP units, and 32 texture filtering units. The GPU is manufactured using a 40nm process, features a GDDR5 memory controller (that's also compatible with GDDR3), and unlike NVIDIA's current high-end GPUs, the GT 240 is DirectX 10.1 compatible. For $100 or less, what's perhaps most interesting is that this graphics card actually puts up respectable frame rates with AA turned on and no external power needed beyond what a standard PCIe slot provides."</i></htmltext>
<tokentext>MojoKid writes " NVIDIA is launching a new mainstream graphics card today , aimed at consumers in the market for a relatively low-cost upgrade from an integrated graphics solution or older entry-level GPU .
The new GeForce GT 240 features a GPU with 96 processor cores , 8 ROP units , and 32 texture filtering units .
The GPU is manufactured using a 40nm process , features a GDDR5 memory controller ( that 's also compatible with GDDR3 ) , and unlike NVIDIA 's current high-end GPUs , the GT 240 is DirectX 10.1 compatible .
For $ 100 or less , what 's perhaps most interesting is that this graphics card actually puts up respectable frame rates with AA turned on and no external power needed beyond what a standard PCIe slot provides .
"</tokentext>
<sentencetext>MojoKid writes "NVIDIA is launching a new mainstream graphics card today, aimed at consumers in the market for a relatively low-cost upgrade from an integrated graphics solution or older entry-level GPU.
The new GeForce GT 240 features a GPU with 96 processor cores, 8 ROP units, and 32 texture filtering units.
The GPU is manufactured using a 40nm process, features a GDDR5 memory controller (that's also compatible with GDDR3), and unlike NVIDIA's current high-end GPUs, the GT 240 is DirectX 10.1 compatible.
For $100 or less, what's perhaps most interesting is that this graphics card actually puts up respectable frame rates with AA turned on and no external power needed beyond what a standard PCIe slot provides.
"</sentencetext>
</article>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135550</id>
	<title>Re:So, I have a question...</title>
	<author>dagamer34</author>
	<datestamp>1258454640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Iono, maybe because your eyes are closer to that "smaller screen"?</htmltext>
<tokentext>Iono , maybe because your eyes are closer to that " smaller screen " ?</tokentext>
<sentencetext>Iono, maybe because your eyes are closer to that "smaller screen"?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135462</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138984</id>
	<title>Re:Yay! Re-badged 9800GT FTW!</title>
	<author>Anonymous</author>
	<datestamp>1258474140000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
<htmltext><p>This doesn't look like a "re-badged" 9800GT at all.  Let's REALLY compare the 9800GT to that of the new GT 240, to see if what you're saying is even remotely correct.  I own both a 9800GT and a 9600GT.</p><p>Fab process: 9800GT = 65 or 55nm, GT 240 = 40nm<br>Clock (core): 9800GT = 600MHz, GT 240 = 550MHz<br>Clock (shader): 9800GT = 1500MHz, GT 240 = 1340MHz<br>Clock (memory): 9800GT = 1800MHz, GT 240 = 3400MHz<br>Pixels per sec: 9800GT = 9.6b, GT 240 = 8.8b<br>Memory bandwidth: 9800GT = 57.6GB/sec, GT 240 = 54.4GB/sec<br>Bus width: 9800GT = 256 bit, GT 240 = 128 bit<br>Stream processors: 9800GT = 112, GT 240 = not directly comparable, but we'll say 96 just to keep it simple<br>Memory type: 9800GT = GDDR3, GT 240 = GDDR5<br>Temperatures: 9800GT = mid-to-high 70s (Celsius) during load, GT 240 = mid-to-high 50s (Celsius) during load<br>Power consumption: 9800GT = ~105-120W, GT 240 = ~62-75W<br>External PCIe power connector required: 9800GT = yes (sans a few BFG cards), GT 240 = no<br>Cost: 9800GT = US$130-140, GT 240 = sub-$100</p><p>The card is physically smaller than an 8800GTS as well, since you had to go and bring up the famous "the 9800GT is just a re-badged 8800GT" argument (pretty sure you meant 8800GT and not 8800GTS, since the 8800GTS is nowhere near identical to the 8800GT).</p><p>The only thing "hurting" the GT 240 is the 128-bit bus width.  I'm sure nVidia will be rolling out a different card that offers 256-bit in the future.</p><p>Anyway, yup, definitely looks like a "re-badged" product to me.  Totally the same thing.  Definitely.  No doubt about it.  100% certain.  Lower power usage, lower noise, newer technology... for a lower cost.  Looks like a definite loser to me -- but only if you're a kr4d-l33t g4m3r d00dzz!@!1@!!!!1!</p><p>Gamers... just completely out of their league as far as technological advancement goes, yet we keep catering to them because they're a cash cow.</p></htmltext>
<tokentext>This does n't look like a " re-badged " 9800GT at all .
Let 's REALLY compare the 9800GT to that of the new GT 240 , to see if what you 're saying is even remotely correct .
I own both a 9800GT and a 9600GT .
Fab process : 9800GT = 65 or 55nm , GT 240 = 40nm
Clock ( core ) : 9800GT = 600MHz , GT 240 = 550MHz
Clock ( shader ) : 9800GT = 1500MHz , GT 240 = 1340MHz
Clock ( memory ) : 9800GT = 1800MHz , GT 240 = 3400MHz
Pixels per sec : 9800GT = 9.6b , GT 240 = 8.8b
Memory bandwidth : 9800GT = 57.6GB/sec , GT 240 = 54.4GB/sec
Bus width : 9800GT = 256 bit , GT 240 = 128 bit
Stream processors : 9800GT = 112 , GT 240 = not directly comparable , but we 'll say 96 just to keep it simple
Memory type : 9800GT = GDDR3 , GT 240 = GDDR5
Temperatures : 9800GT = mid-to-high 70s ( Celsius ) during load , GT 240 = mid-to-high 50s ( Celsius ) during load
Power consumption : 9800GT = ~ 105-120W , GT 240 = ~ 62-75W
External PCIe power connector required : 9800GT = yes ( sans a few BFG cards ) , GT 240 = no
Cost : 9800GT = US $ 130-140 , GT 240 = sub- $ 100
The card is physically smaller than an 8800GTS as well , since you had to go and bring up the famous " the 9800GT is just a re-badged 8800GT " argument ( pretty sure you meant 8800GT and not 8800GTS , since the 8800GTS is nowhere near identical to the 8800GT ) .
The only thing " hurting " the GT 240 is the 128-bit bus width .
I 'm sure nVidia will be rolling out a different card that offers 256-bit in the future .
Anyway , yup , definitely looks like a " re-badged " product to me .
Totally the same thing .
Definitely. No doubt about it .
100 % certain .
Lower power usage , lower noise , newer technology... for a lower cost .
Looks like a definite loser to me -- but only if you 're a kr4d-l33t g4m3r d00dzz ! @ ! 1 @ ! ! ! ! 1 !
Gamers ... just completely out of their league as far as technological advancement goes , yet we keep catering to them because they 're a cash cow .</tokentext>
<sentencetext>This doesn't look like a "re-badged" 9800GT at all.
Let's REALLY compare the 9800GT to that of the new GT 240, to see if what you're saying is even remotely correct.
I own both a 9800GT and a 9600GT.
Fab process: 9800GT = 65 or 55nm, GT 240 = 40nm
Clock (core): 9800GT = 600MHz, GT 240 = 550MHz
Clock (shader): 9800GT = 1500MHz, GT 240 = 1340MHz
Clock (memory): 9800GT = 1800MHz, GT 240 = 3400MHz
Pixels per sec: 9800GT = 9.6b, GT 240 = 8.8b
Memory bandwidth: 9800GT = 57.6GB/sec, GT 240 = 54.4GB/sec
Bus width: 9800GT = 256 bit, GT 240 = 128 bit
Stream processors: 9800GT = 112, GT 240 = not directly comparable, but we'll say 96 just to keep it simple
Memory type: 9800GT = GDDR3, GT 240 = GDDR5
Temperatures: 9800GT = mid-to-high 70s (Celsius) during load, GT 240 = mid-to-high 50s (Celsius) during load
Power consumption: 9800GT = ~105-120W, GT 240 = ~62-75W
External PCIe power connector required: 9800GT = yes (sans a few BFG cards), GT 240 = no
Cost: 9800GT = US$130-140, GT 240 = sub-$100
The card is physically smaller than an 8800GTS as well, since you had to go and bring up the famous "the 9800GT is just a re-badged 8800GT" argument (pretty sure you meant 8800GT and not 8800GTS, since the 8800GTS is nowhere near identical to the 8800GT).
The only thing "hurting" the GT 240 is the 128-bit bus width.
I'm sure nVidia will be rolling out a different card that offers 256-bit in the future.
Anyway, yup, definitely looks like a "re-badged" product to me.
Totally the same thing.
Definitely.  No doubt about it.
100% certain.
Lower power usage, lower noise, newer technology... for a lower cost.
Looks like a definite loser to me -- but only if you're a kr4d-l33t g4m3r d00dzz!@!1@!!!!1!
Gamers... just completely out of their league as far as technological advancement goes, yet we keep catering to them because they're a cash cow.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414</parent>
</comment>
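<!--
	Illustrative aside (not part of the original thread): the memory-bandwidth figures
	quoted in the comment above follow directly from effective memory clock times bus
	width, which is why GDDR5 on a 128-bit bus nearly matches GDDR3 on a 256-bit bus.
	A minimal Python check, using the commenter's own numbers:

	def mem_bandwidth_gb_s(effective_clock_mhz, bus_width_bits):
	    # bytes/sec = transfers/sec * bits per transfer / 8 bits per byte
	    return effective_clock_mhz * 1e6 * bus_width_bits / 8 / 1e9

	print(mem_bandwidth_gb_s(1800, 256))  # 9800GT: 57.6 GB/s
	print(mem_bandwidth_gb_s(3400, 128))  # GT 240: 54.4 GB/s
-->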
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138320</id>
	<title>DirectX?</title>
	<author>akabigbro</author>
	<datestamp>1258468500000</datestamp>
	<modclass>Redundant</modclass>
	<modscore>-1</modscore>
<htmltext><p>DirectX is only good for Microsoft platforms. How is that a good thing? I see that it is OpenGL 3.2 compliant; now that is something to be proud of (an actual multi-platform API).</p></htmltext>
<tokentext>DirectX is only good for Microsoft platforms .
How is that a good thing ?
I see that it is OpenGL 3.2 compliant ; now that is something to be proud of ( an actual multi-platform API ) .</tokentext>
<sentencetext>DirectX is only good for Microsoft platforms.
How is that a good thing?
I see that it is OpenGL 3.2 compliant; now that is something to be proud of (an actual multi-platform API).</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136210</id>
	<title>Vs. GTS 250?</title>
	<author>Anonymous</author>
	<datestamp>1258457340000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext>Anyone know how this compares to a <a href="http://www.nvidia.com/object/product_geforce_gts_250_us.html" title="nvidia.com">GTS 250-based card</a> [nvidia.com]? As those are also DX10-compliant and <a href="http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&amp;N=2010380048+106793024&amp;QksAutoSuggestion=&amp;ShowDeactivatedMark=False&amp;Configurator=&amp;Subcategory=48&amp;description=&amp;Ntk=&amp;CFG=&amp;SpeTabStoreType=&amp;srchInDesc=" title="newegg.com">can be easily found for around $120</a> [newegg.com], I'm not sure what the value of this new model is... beyond the psychological impact of hitting the magic $99 price point, of course.</htmltext>
<tokentext>Anyone know how this compares to a GTS 250-based card [ nvidia.com ] ?
As those are also DX10-compliant and can be easily found for around $ 120 [ newegg.com ] , I 'm not sure what the value of this new model is... beyond the psychological impact of hitting the magic $ 99 price point , of course .</tokentext>
<sentencetext>Anyone know how this compares to a GTS 250-based card [nvidia.com]?
As those are also DX10-compliant and can be easily found for around $120 [newegg.com], I'm not sure what the value of this new model is... beyond the psychological impact of hitting the magic $99 price point, of course.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135958</id>
	<title>Re:Um, so?</title>
	<author>Jeng</author>
	<datestamp>1258456260000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The performance is increasing per dollar, but the manufacturing of the video cards is an almost set price.</p><p>Much like with hard drives, yes there are 2 terabyte hard drives for around $200, but that does not mean that you can find a (recently manufactured) 200 gig hard drive for $20.  The cost of all the sub-systems sets the base price.</p></htmltext>
<tokentext>The performance is increasing per dollar , but the manufacturing of the video cards is an almost set price .
Much like with hard drives , yes there are 2 terabyte hard drives for around $ 200 , but that does not mean that you can find a ( recently manufactured ) 200 gig hard drive for $ 20 .
The cost of all the sub-systems sets the base price .</tokentext>
<sentencetext>The performance is increasing per dollar, but the manufacturing of the video cards is an almost set price.
Much like with hard drives, yes there are 2 terabyte hard drives for around $200, but that does not mean that you can find a (recently manufactured) 200 gig hard drive for $20.
The cost of all the sub-systems sets the base price.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135322</parent>
</comment>
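<!--
	Illustrative aside (not part of the original thread): the comment above argues that
	unit price is roughly a fixed base cost plus a capacity term, so price does not
	scale down linearly with capacity. A toy Python model; base_cost and cost_per_gb
	are made-up illustration values, not real figures:

	def drive_price(capacity_gb, base_cost=35.0, cost_per_gb=0.08):
	    # fixed sub-systems (PCB, controller, casing) plus the capacity-dependent part
	    return base_cost + capacity_gb * cost_per_gb

	print(drive_price(2000))  # 195.0, near the ~$200 cited above
	print(drive_price(200))   # 51.0, nowhere near a "linear" $20
-->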
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136946</id>
	<title>I care</title>
	<author>westlake</author>
	<datestamp>1258460340000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p><i>Does it come with a free software driver, or at least include specs so you can write your own? If not, why does it deserve a Slashdot front page headline? There are plenty of Windows gaming sites for those who want that kind of thing.</i> </p><p>There are gamers and home video enthusiasts more than willing to download and install the fully functional proprietary driver. The binary blob. Particularly between now and December 25th. Not so many equipped to write the open source driver, even if they had the time and the specs.</p></htmltext>
<tokentext>Does it come with a free software driver , or at least include specs so you can write your own ?
If not , why does it deserve a Slashdot front page headline ?
There are plenty of Windows gaming sites for those who want that kind of thing .
There are gamers and home video enthusiasts more than willing to download and install the fully functional proprietary driver .
The binary blob .
Particularly between now and December 25th .
Not so many equipped to write the open source driver , even if they had the time and the specs .</tokentext>
<sentencetext>Does it come with a free software driver, or at least include specs so you can write your own?
If not, why does it deserve a Slashdot front page headline?
There are plenty of Windows gaming sites for those who want that kind of thing.
There are gamers and home video enthusiasts more than willing to download and install the fully functional proprietary driver.
The binary blob.
Particularly between now and December 25th.
Not so many equipped to write the open source driver, even if they had the time and the specs.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136130</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135320</id>
	<title>Great..</title>
	<author>sc0ob5</author>
	<datestamp>1258453920000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Now can we have it in low profile please?</htmltext>
<tokentext>Now can we have it in low profile please ?</tokentext>
<sentencetext>Now can we have it in low profile please?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138038</id>
	<title>Re:So, I have a question...</title>
	<author>obarthelemy</author>
	<datestamp>1258466100000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>I'm playing WoW (which is far from cutting-edge as far as graphics are concerned) on a 1900x1200 monitor with an ATI 4750. I have to put all details at the lowest setting when raiding, otherwise the game becomes unplayable.</p><p>I do agree that very high frame rates are useless. In games though, frame rates tend to vary wildly... in WoW for example, I can go from 160+ when alone in the wilderness to 4 in a boss fight with lots of AOE spells. So as a gamer I need a large FPS margin from my video card.</p><p>The other issue is monitor size... I wanted a big one, so I got an Asus 26" 1900x1200 (excellent monitor, BTW). Moving all those pixels around requires much more power than for my old 1280x1024 CRT.</p><p>I'm kinda hoping resolutions and screen sizes will stabilize (frankly, my 26" is too big for computer work, I'd have been better off with 2x22" for the same price), and that IGPs will offer reasonable performance for MMOs and casual games. They already do for video.</p></htmltext>
<tokentext>I 'm playing WoW ( which is far from cutting-edge as far as graphics are concerned ) on a 1900x1200 monitor with an ATI 4750 .
I have to put all details at the lowest setting when raiding , otherwise the game becomes unplayable .
I do agree that very high frame rates are useless .
In games though , frame rates tend to vary wildly... in WoW for example , I can go from 160 + when alone in the wilderness to 4 in a boss fight with lots of AOE spells .
So as a gamer I need a large FPS margin from my video card .
The other issue is monitor size... I wanted a big one , so I got an Asus 26 " 1900x1200 ( excellent monitor , BTW ) .
Moving all those pixels around requires much more power than for my old 1280x1024 CRT .
I 'm kinda hoping resolutions and screen sizes will stabilize ( frankly , my 26 " is too big for computer work , I 'd have been better off with 2x22 " for the same price ) , and that IGPs will offer reasonable performance for MMOs and casual games .
They already do for video .</tokentext>
<sentencetext>I'm playing WoW (which is far from cutting-edge as far as graphics are concerned) on a 1900x1200 monitor with an ATI 4750.
I have to put all details at the lowest setting when raiding, otherwise the game becomes unplayable.
I do agree that very high frame rates are useless.
In games though, frame rates tend to vary wildly... in WoW for example, I can go from 160+ when alone in the wilderness to 4 in a boss fight with lots of AOE spells.
So as a gamer I need a large FPS margin from my video card.
The other issue is monitor size... I wanted a big one, so I got an Asus 26" 1900x1200 (excellent monitor, BTW).
Moving all those pixels around requires much more power than for my old 1280x1024 CRT.
I'm kinda hoping resolutions and screen sizes will stabilize (frankly, my 26" is too big for computer work, I'd have been better off with 2x22" for the same price), and that IGPs will offer reasonable performance for MMOs and casual games.
They already do for video.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135462</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137148</id>
	<title>Re:Um, so?</title>
	<author>sznupi</author>
	<datestamp>1258461420000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>With such cards you're also buying low power draw (now, if only that was actually seriously utilised with passive cooling as standard...)</p></htmltext>
<tokentext>With such cards you 're also buying low power draw ( now , if only that was actually seriously utilised with passive cooling as standard... )</tokentext>
<sentencetext>With such cards you're also buying low power draw (now, if only that was actually seriously utilised with passive cooling as standard...)</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135322</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136328</id>
	<title>HD 1080p?</title>
	<author>Anonymous</author>
	<datestamp>1258457820000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Can this card render HD 1080p@30FPS? What's the puniest Pentium that can deliver that HD data to it fast enough from a SATA drive?</p><p>And is there a Linux driver?</p></htmltext>
<tokentext>Can this card render HD 1080p @ 30FPS ?
What 's the puniest Pentium that can deliver that HD data to it fast enough from a SATA drive ?
And is there a Linux driver ?</tokentext>
<sentencetext>Can this card render HD 1080p@30FPS?
What's the puniest Pentium that can deliver that HD data to it fast enough from a SATA drive?
And is there a Linux driver?</sentencetext>
</comment>
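<!--
	Illustrative aside (not part of the original thread): the disk half of the question
	is easy to bound. Uncompressed 1080p30 is the worst case; a compressed stream such
	as a ~40 Mbit/s Blu-ray is far below what any SATA drive sustains. In Python:

	width, height, bytes_per_pixel, fps = 1920, 1080, 3, 30
	raw_mb_s = width * height * bytes_per_pixel * fps / 1e6
	print(raw_mb_s)        # ~186.6 MB/s for uncompressed 1080p30
	print(40e6 / 8 / 1e6)  # 5.0 MB/s for a 40 Mbit/s compressed stream
-->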
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137530</id>
	<title>Re:DirectX 10.whatever? Who cares?</title>
	<author>Anonymous</author>
	<datestamp>1258463160000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Dear Ed,</p><p>It's news for nerds, not news for Ed.</p><p>Regards,</p><p>Nerds</p></htmltext>
<tokentext>Dear Ed ,
It 's news for nerds , not news for Ed .
Regards ,
Nerds</tokentext>
<sentencetext>Dear Ed,
It's news for nerds, not news for Ed.
Regards,
Nerds</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136130</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30142226</id>
	<title>Re:DirectX 10.whatever? Who cares?</title>
	<author>drinkypoo</author>
	<datestamp>1257083700000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><div class="quote"><p>Does it come with a free software driver,</p></div><p>Yes. It doesn't come with a Free software driver, though. If you're going to be a tiresome pedant, at least be accurate.</p><div class="quote"><p>If not, why does it deserve a Slashdot front page headline?</p></div><p>Are you going to say something relevant? If not, why do you deserve a slashdot account?</p></htmltext>
<tokentext>Does it come with a free software driver ,
Yes .
It does n't come with a Free software driver , though .
If you 're going to be a tiresome pedant , at least be accurate .
If not , why does it deserve a Slashdot front page headline ?
Are you going to say something relevant ?
If not , why do you deserve a slashdot account ?</tokentext>
<sentencetext>Does it come with a free software driver,
Yes.
It doesn't come with a Free software driver, though.
If you're going to be a tiresome pedant, at least be accurate.
If not, why does it deserve a Slashdot front page headline?
Are you going to say something relevant?
If not, why do you deserve a slashdot account?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136130</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136746</id>
	<title>Re:nVidia 9400M</title>
	<author>WaroDaBeast</author>
	<datestamp>1258459440000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext>The review over at <a href="http://www.guru3d.com/article/msi-geforce-gt-240-review-test/1" title="guru3d.com" rel="nofollow">Guru3D</a> [guru3d.com] shows that it fits between an HD4670 and a 9600GT performance-wise. As for video acceleration, you're looking at a <a href="http://en.wikipedia.org/wiki/Comparison_of_Nvidia_graphics_processing_units#GeForce_200_Series" title="wikipedia.org" rel="nofollow">VP4 engine</a> [wikipedia.org].</htmltext>
<tokentext>The review over at Guru3D [ guru3d.com ] shows that it fits between an HD4670 and a 9600GT performance-wise .
As for video acceleration , you 're looking at a VP4 engine [ wikipedia.org ] .</tokentext>
<sentencetext>The review over at Guru3D [guru3d.com] shows that it fits between an HD4670 and a 9600GT performance-wise.
As for video acceleration, you're looking at a VP4 engine [wikipedia.org].</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135268</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138916</id>
	<title>Where can I buy this?</title>
	<author>Akira Kogami</author>
	<datestamp>1258473360000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>That Zotac card looks excellent, but I can't for the life of me find anywhere to buy it. Is it not out yet or something?</htmltext>
<tokentext>That Zotac card looks excellent , but I ca n't for the life of me find anywhere to buy it .
Is it not out yet or something ?</tokentext>
<sentencetext>That Zotac card looks excellent, but I can't for the life of me find anywhere to buy it.
Is it not out yet or something?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138224</id>
	<title>Re:Dear NVidia,</title>
	<author>afidel</author>
	<datestamp>1258467780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><a href="http://news.softpedia.com/news/PowerColor-Intros-Passively-Cooled-Radeon-HD-5750-126159.shtml" title="softpedia.com">HD5750</a> [softpedia.com] silent launches next week</htmltext>
<tokentext>HD5750 [ softpedia.com ] silent launches next week</tokentext>
<sentencetext>HD5750 [softpedia.com] silent launches next week</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136008</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135268</id>
	<title>nVidia 9400M</title>
	<author>Anonymous</author>
	<datestamp>1258453800000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>3</modscore>
	<htmltext><p>How does the GT240 compare to a 9400M?</p></htmltext>
<tokentext>How does the GT240 compare to a 9400M ?</tokentext>
<sentencetext>How does the GT240 compare to a 9400M?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135880</id>
	<title>Re:GPU Card Size</title>
	<author>Applekid</author>
	<datestamp>1258455840000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>If you really miss the slot that modern video cards eat up, you could always <a href="http://www.criticalcables.com/items.asp?Cc=PCIXP_FLEX&amp;iTpStatus=0&amp;Tp=&amp;Bc=" title="criticalcables.com">get an extender</a> [criticalcables.com]. Too much money, IMHO, and some case assembly may be required, but, then, how much is it to you?</p></htmltext>
<tokentext>If you really miss the slot that modern video cards eat up , you could always get an extender [ criticalcables.com ] .
Too much money , IMHO , and some case assembly may be required , but , then , how much is it to you ?</tokentext>
<sentencetext>If you really miss the slot that modern video cards eat up, you could always get an extender [criticalcables.com].
Too much money, IMHO, and some case assembly may be required, but, then, how much is it to you?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135392</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136662</id>
	<title>What's with that headline</title>
	<author>Anonymous</author>
	<datestamp>1258459140000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
<htmltext><p>I paid 76 dollars for my 9600 GT, fanless, and it is DirectX 10 compatible.</p></htmltext>
<tokentext>I paid 76 dollars for my 9600 GT , fanless , and it is DirectX 10 compatible .</tokentext>
<sentencetext>I paid 76 dollars for my 9600 GT, fanless, and it is DirectX 10 compatible.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30139690</id>
	<title>Power/heat</title>
	<author>Sycraft-fu</author>
	<datestamp>1258479360000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>This is one of the very few reasonably performing cards that doesn't require an additional power connector. There are plenty of systems out there, especially cheaper ones, that don't have much in the way of a PSU. Adding in a graphics card that needs more power can be a problem. This one? No problem, it is entirely motherboard powered.</p></htmltext>
<tokentext>This is one of the very few reasonably performing cards that does n't require an additional power connector .
There are plenty of systems out there , especially cheaper ones , that do n't have much in the way of a PSU .
Adding in a graphics card that needs more power can be a problem .
This one ?
No problem , it is entirely motherboard powered .</tokentext>
<sentencetext>This is one of the very few reasonably performing cards that doesn't require an additional power connector.
There are plenty of systems out there, especially cheaper ones, that don't have much in the way of a PSU.
Adding in a graphics card that needs more power can be a problem.
This one?
No problem, it is entirely motherboard powered.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136210</parent>
</comment>
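<!--
	Illustrative aside (not part of the original thread): a PCIe x16 slot alone is
	specified to deliver at most 75W, which is why a card with no external connector
	is safe on a weak PSU. A rough Python budget; the CPU and "rest of system" numbers
	are hypothetical, and ~69W board power is the figure reviews cite for the GT 240:

	SLOT_LIMIT_W = 75   # PCIe x16 slot ceiling without a 6-pin connector
	psu_w, cpu_w, rest_w, card_w = 300, 95, 60, 69

	assert card_w <= SLOT_LIMIT_W             # no external power needed
	print(psu_w - (cpu_w + rest_w + card_w))  # 76W of headroom left
-->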
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137930</id>
	<title>Re:nVidia 9400M</title>
	<author>arth1</author>
	<datestamp>1258465500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Well, unlike the 275, this one may actually be small enough to fit into full size tower cases without using a Dremel on your HD bay.<br>And run without upgrading your power supply to a triple rail 900W monster.</p><p>I'm seriously considering downgrading my 8800GTS to one of these -- I'll quickly save the price of it on the saved electricity alone.  And presumably less noise too.  The only drawback is that it's single DVI.  Make a 24x with dual DVI-D for ten bucks more, and I'll switch.</p></htmltext>
<tokentext>Well , unlike the 275 , this one may actually be small enough to fit into full size tower cases without using a Dremel on your HD bay .
And run without upgrading your power supply to a triple rail 900W monster .
I 'm seriously considering downgrading my 8800GTS to one of these -- I 'll quickly save the price of it on the saved electricity alone .
And presumably less noise too .
The only drawback is that it 's single DVI .
Make a 24x with dual DVI-D for ten bucks more , and I 'll switch .</tokentext>
<sentencetext>Well, unlike the 275, this one may actually be small enough to fit into full size tower cases without using a Dremel on your HD bay.
And run without upgrading your power supply to a triple rail 900W monster.
I'm seriously considering downgrading my 8800GTS to one of these -- I'll quickly save the price of it on the saved electricity alone.
And presumably less noise too.
The only drawback is that it's single DVI.
Make a 24x with dual DVI-D for ten bucks more, and I'll switch.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135326</parent>
</comment>
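<!--
	Illustrative aside (not part of the original thread): whether a downgrade pays for
	itself depends on the power delta, usage hours, and electricity price. A small
	Python payback estimate; the ~60W delta and $0.12/kWh are assumed inputs, not
	measurements:

	def payback_years(delta_w, hours_per_day, usd_per_kwh, card_price_usd):
	    kwh_per_year = delta_w / 1000.0 * hours_per_day * 365
	    return card_price_usd / (kwh_per_year * usd_per_kwh)

	print(payback_years(60, 24, 0.12, 100))  # always-on box: ~1.6 years
	print(payback_years(60, 4, 0.12, 100))   # 4 h/day of use: ~9.5 years
-->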
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137224</id>
	<title>Re:Yay! Re-badged 9800GT FTW!</title>
	<author>citizenr</author>
	<datestamp>1258461720000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><div class="quote"><p>Come on, nVidia...  Stop with the re-branding already.</p><p>This is just a die-shrunk 9800 GT</p></div><p>and somehow it's slower than my 8800GS 384MB</p></htmltext>
<tokentext>Come on , nVidia... Stop with the re-branding already .
This is just a die-shrunk 9800 GT
and somehow it 's slower than my 8800GS 384MB</tokentext>
<sentencetext>Come on, nVidia...  Stop with the re-branding already.
This is just a die-shrunk 9800 GT
and somehow it's slower than my 8800GS 384MB</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30143272</id>
	<title>Re:Vs. GTS 250?</title>
	<author>mdm-adph</author>
	<datestamp>1257090120000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>It's because this card does it all while using a <i>fraction</i> of the GTS 250's power draw.</p></htmltext>
<tokentext>It 's because this card does it all while using a fraction of the GTS 250 's power draw .</tokentext>
<sentencetext>It's because this card does it all while using a fraction of the GTS 250's power draw.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136210</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30139590</id>
	<title>Ummmm</title>
	<author>Sycraft-fu</author>
	<datestamp>1258478460000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>Guess that depends on what you mean by "just a die-shrink". For one, it isn't as though you "just" take the same mask and shrink it down on a new process. There is a good deal of work to be done.</p><p>However that aside, you are incorrect about the tech. This is not the same tech as an 8800. Main thing that shows that is DX 10.1 support, which you'll note the 8800 doesn't have (nor do the 260/280s). It also has a GDDR5 memory controller, again something the other cards listed lack (they are all GDDR3 only).</p><p>While I'm not a big fan of their recent name changes, you need to stop pretending like there's nothing new they are doing, or that there is some requirement with regards to versions. In the past major versions were a new thousand, however major version doesn't mean totally new tech. Often it was a sort of tick-tock system reminiscent of Intel's. For example the 6000 series was new Shader Model 3.0 stuff, nothing prior supported that. However the 7000 series was the same thing. The cards got faster, smaller, more parallel and so on, but they were still DX 9.0c cards like the 6000s. The 8000s were then DX10 cards, almost a completely new architecture.</p><p>This card here actually offers some features their high end cards don't, it just isn't near as fast. However saying it is the "same tech" as the 8800 is rather silly.</p></htmltext>
<tokentext>Guess that depends on what you mean by " just a die-shrink " .
For one , it is n't as though you " just " take the same mask and shrink it down on a new process .
There is a good deal of work to be done .
However that aside , you are incorrect about the tech .
This is not the same tech as an 8800 .
Main thing that shows that is DX 10.1 support , which you 'll note the 8800 does n't have ( nor do the 260/280s ) .
It also has a GDDR5 memory controller , again something the other cards listed lack ( they are all GDDR3 only ) .
While I 'm not a big fan of their recent name changes , you need to stop pretending like there 's nothing new they are doing , or that there is some requirement with regards to versions .
In the past major versions were a new thousand , however major version does n't mean totally new tech .
Often it was a sort of tick-tock system reminiscent of Intel 's .
For example the 6000 series was new Shader Model 3.0 stuff , nothing prior supported that .
However the 7000 series was the same thing .
The cards got faster , smaller , more parallel and so on , but they were still DX 9.0c cards like the 6000s .
The 8000s were then DX10 cards , almost a completely new architecture .
This card here actually offers some features their high end cards do n't , it just is n't near as fast .
However saying it is the " same tech " as the 8800 is rather silly .</tokentext>
<sentencetext>Guess that depends on what you mean by "just a die-shrink".
For one, it isn't as though you "just" take the same mask and shrink it down on a new process.
There is a good deal of work to be done.
However that aside, you are incorrect about the tech.
This is not the same tech as an 8800.
Main thing that shows that is DX 10.1 support, which you'll note the 8800 doesn't have (nor do the 260/280s).
It also has a GDDR5 memory controller, again something the other cards listed lack (they are all GDDR3 only).
While I'm not a big fan of their recent name changes, you need to stop pretending like there's nothing new they are doing, or that there is some requirement with regards to versions.
In the past major versions were a new thousand, however major version doesn't mean totally new tech.
Often it was a sort of tick-tock system reminiscent of Intel's.
For example the 6000 series was new Shader Model 3.0 stuff, nothing prior supported that.
However the 7000 series was the same thing.
The cards got faster, smaller, more parallel and so on, but they were still DX 9.0c cards like the 6000s.
The 8000s were then DX10 cards, almost a completely new architecture.
This card here actually offers some features their high end cards don't, it just isn't near as fast.
However saying it is the "same tech" as the 8800 is rather silly.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135556</id>
	<title>But will it run Crysis?</title>
	<author>Anonymous</author>
	<datestamp>1258454700000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Surprisingly, <a href="http://www.tomshardware.com/reviews/geforce-gt-240,2475-9.html" title="tomshardware.com">it almost can</a> [tomshardware.com]</htmltext>
<tokentext>Surprisingly , it almost can [ tomshardware.com ]</tokentext>
<sentencetext>Surprisingly, it almost can [tomshardware.com]</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138714</id>
	<title>Re:Yay! Re-badged 9800GT FTW!</title>
	<author>Akira Kogami</author>
	<datestamp>1258471620000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Sure, but I have a crappy computer with a paltry 300 watt power supply I can't be assed to upgrade so this card still appeals to me.</htmltext>
<tokentext>Sure , but I have a crappy computer with a paltry 300 watt power supply I ca n't be assed to upgrade so this card still appeals to me .</tokentext>
<sentencetext>Sure, but I have a crappy computer with a paltry 300 watt power supply I can't be assed to upgrade so this card still appeals to me.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414</id>
	<title>Yay!  Re-badged 9800GT FTW!</title>
	<author>Anonymous Freak</author>
	<datestamp>1258458120000</datestamp>
	<modclass>Informative</modclass>
	<modscore>5</modscore>
	<htmltext><p>Come on, nVidia...  Stop with the re-branding already.</p><p>This is just a die-shrunk 9800 GT, which was just a die-shrunk 8800 GT.</p><p>Yes, it's a great card for $100.  But stop misleading people into thinking it's the same tech as the GTX 260-285.</p><p>(They did the same with the "GTS 250", which is just a re-badged 9800 GTX, which was just a re-badged 8800 GTS.)</p></htmltext>
<tokentext>Come on , nVidia... Stop with the re-branding already .
This is just a die-shrunk 9800 GT , which was just a die-shrunk 8800 GT .
Yes , it 's a great card for $ 100 .
But stop misleading people into thinking it 's the same tech as the GTX 260-285 .
( They did the same with the " GTS 250 " , which is just a re-badged 9800 GTX , which was just a re-badged 8800 GTS . )</tokentext>
<sentencetext>Come on, nVidia...  Stop with the re-branding already.
This is just a die-shrunk 9800 GT, which was just a die-shrunk 8800 GT.
Yes, it's a great card for $100.
But stop misleading people into thinking it's the same tech as the GTX 260-285.
(They did the same with the "GTS 250", which is just a re-badged 9800 GTX, which was just a re-badged 8800 GTS.)</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135302</id>
	<title>Sweet.</title>
	<author>Anonymous</author>
	<datestamp>1258453860000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>1</modscore>
	<htmltext><p>Finally time for a standard PCI-E graphics solution? Death to integrated graphics!</p></htmltext>
<tokentext>Finally time for a standard PCI-E graphics solution ?
Death to integrated graphics !</tokentext>
<sentencetext>Finally time for a standard PCI-E graphics solution?
Death to integrated graphics!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138870</id>
	<title>Many minor Nvidia announcements the last few days</title>
	<author>mykos</author>
	<datestamp>1258473060000</datestamp>
	<modclass>Offtopic</modclass>
	<modscore>0</modscore>
	<htmltext>Some announcements have even been repeated.  Is slashdot becoming the mouthpiece of Nvidia?</htmltext>
<tokentext>Some announcements have even been repeated .
Is slashdot becoming the mouthpiece of Nvidia ?</tokentext>
<sentencetext>Some announcements have even been repeated.
Is slashdot becoming the mouthpiece of Nvidia?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135546</id>
	<title>how do ati cards at the same price do next to this</title>
	<author>Anonymous</author>
	<datestamp>1258454640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>how do ati cards at the same price do next to this?</p></htmltext>
<tokentext>how do ati cards at the same price do next to this ?</tokentext>
<sentencetext>how do ati cards at the same price do next to this?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30155022</id>
	<title>Re:Yay! Re-badged 9800GT FTW!</title>
	<author>Anonymous</author>
	<datestamp>1258639680000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
<htmltext>and in the meanwhile, an informed consumer could get the same features packaged as an 8800 GT for much less</htmltext>
<tokentext>and in the meanwhile , an informed consumer could get the same features packaged as an 8800 GT for much less</tokentext>
<sentencetext>and in the meanwhile, an informed consumer could get the same features packaged as an 8800 GT for much less</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135924</id>
	<title>GPU</title>
	<author>Anonymous</author>
	<datestamp>1258456080000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>these graphics cards, in addition to today's or the future's implementation of Grand Central Dispatch, are really going to be powerful for processing arbitrary data.</p></htmltext>
<tokentext>these graphics cards , in addition to today 's or the future 's implementation of Grand Central Dispatch , are really going to be powerful for processing arbitrary data .</tokentext>
<sentencetext>these graphics cards, in addition to today's or the future's implementation of Grand Central Dispatch, are really going to be powerful for processing arbitrary data.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136154</id>
	<title>too little too late</title>
	<author>Anonymous</author>
	<datestamp>1258457160000</datestamp>
	<modclass>Informative</modclass>
	<modscore>1</modscore>
	<htmltext><p>The 4850 has been at the $100 mark (on sale) for months now.  The 4770 would qualify too but it never really was pushed out in huge supply.  Both equal or best the 9800GTX in sheer power.  So what's the big deal?</p></htmltext>
<tokentext>The 4850 has been at the $ 100 mark ( on sale ) for months now .
The 4770 would qualify too but it never really was pushed out in huge supply .
Both equal or best the 9800GTX in sheer power .
So what 's the big deal ?</tokentext>
<sentencetext>The 4850 has been at the $100 mark (on sale) for months now.
The 4770 would qualify too but it never really was pushed out in huge supply.
Both equal or best the 9800GTX in sheer power.
So what's the big deal?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135462</id>
	<title>So, I have a question...</title>
	<author>Anonymous</author>
	<datestamp>1258454340000</datestamp>
	<modclass>Troll</modclass>
	<modscore>-1</modscore>
<htmltext><p>If a device can display video at 1080p 24+ frames per second, what's the point of more?</p><p>It shouldn't require specialized hardware or dedicated or expensive equipment for High Definition (HD) video. And, if 1080p is great for a 50" screen, what's the point of higher resolution on a far smaller screen?</p></htmltext>
<tokentext>If a device can display video at 1080p 24 + frames per second , what 's the point of more ?
It should n't require specialized hardware or dedicated or expensive equipment for High Definition ( HD ) video .
And , if 1080p is great for a 50 " screen , what 's the point of higher resolution on a far smaller screen ?</tokentext>
<sentencetext>If a device can display video at 1080p 24+ frames per second, what's the point of more?
It shouldn't require specialized hardware or dedicated or expensive equipment for High Definition (HD) video.
And, if 1080p is great for a 50" screen, what's the point of higher resolution on a far smaller screen?</sentencetext>
</comment>
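<!--
	Illustrative aside (not part of the original thread): the replies about viewing
	distance can be made concrete. What the eye sees is pixels per degree of visual
	angle, which rises with distance and falls with screen size, so a small screen held
	close benefits from more pixels. Screen widths and distances below are hypothetical:

	import math

	def pixels_per_degree(h_pixels, screen_width_m, distance_m):
	    # degrees of visual angle subtended by the screen width
	    angle = 2 * math.degrees(math.atan(screen_width_m / (2 * distance_m)))
	    return h_pixels / angle

	print(pixels_per_degree(1920, 1.11, 3.0))  # 50" TV at 3 m: ~92 px/deg
	print(pixels_per_degree(1920, 0.53, 0.6))  # 24" monitor at 60 cm: ~40 px/deg
-->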
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30147690</id>
	<title>Re:Yay! Re-badged 9800GT FTW!</title>
	<author>Anonymous</author>
	<datestamp>1257108300000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Very incorrect.</p><p>9600 GT: compute model 1.1, 8192 registers per multiprocessor</p><p>240 GT: compute model 1.2, 16384 registers per multiprocessor</p><p>complete redesign, memory GDDR5 interface.</p><p>Call that a die shrink once more, and I'll bitchslap you.</p></htmltext>
<tokentext>Very incorrect .
9600 GT : compute model 1.1 , 8192 registers per multiprocessor
240 GT : compute model 1.2 , 16384 registers per multiprocessor
complete redesign , memory GDDR5 interface .
Call that a die shrink once more , and I 'll bitchslap you .</tokentext>
<sentencetext>Very incorrect.
9600 GT: compute model 1.1, 8192 registers per multiprocessor
240 GT: compute model 1.2, 16384 registers per multiprocessor
complete redesign, memory GDDR5 interface.
Call that a die shrink once more, and I'll bitchslap you.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136130</id>
	<title>DirectX 10.whatever?  Who cares?</title>
	<author>Ed Avis</author>
	<datestamp>1258457100000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>1</modscore>
	<htmltext>Does it come with a free software driver, or at least include specs so you can write your own?

If not, why does it deserve a Slashdot front page headline?  There are plenty of Windows gaming sites for those who want that kind of thing.</htmltext>
<tokentext>Does it come with a free software driver , or at least include specs so you can write your own ?
If not , why does it deserve a Slashdot front page headline ?
There are plenty of Windows gaming sites for those who want that kind of thing .</tokentext>
<sentencetext>Does it come with a free software driver, or at least include specs so you can write your own?
If not, why does it deserve a Slashdot front page headline?
There are plenty of Windows gaming sites for those who want that kind of thing.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135694</id>
	<title>Re:So, I have a question...</title>
	<author>Anonymous</author>
	<datestamp>1258455120000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>It's not about displaying video, it's about rendering 3D scenes.  Any old suck card can do 1080p24 (or 1080p60 for that matter).  It takes a lot of horsepower to handle realtime rendering at high resolution.</p><p>As for why the higher resolution, it's because you're sitting closer and the more details the better.  Even in the video domain 3840x2160, if shot natively, would look better on a 60"+ TV than 1080p.  Not amazingly better, and of course there is a point of diminishing returns, but...</p></htmltext>
<tokentext>It 's not about displaying video , it 's about rendering 3D scenes .
Any old suck card can do 1080p24 ( or 1080p60 for that matter ) .
It takes a lot of horsepower to handle realtime rendering at high resolution .
As for why the higher resolution , it 's because you 're sitting closer and the more details the better .
Even in the video domain 3840x2160 , if shot natively , would look better on a 60 " + TV than 1080p .
Not amazingly better , and of course there is a point of diminishing returns , but ...</tokentext>
<sentencetext>It's not about displaying video, it's about rendering 3D scenes.
Any old suck card can do 1080p24 (or 1080p60 for that matter).
It takes a lot of horsepower to handle realtime rendering at high resolution.
As for why the higher resolution, it's because you're sitting closer and the more details the better.
Even in the video domain 3840x2160, if shot natively, would look better on a 60"+ TV than 1080p.
Not amazingly better, and of course there is a point of diminishing returns, but...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135462</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137120</id>
	<title>Only if standard with passive cooling...</title>
	<author>sznupi</author>
	<datestamp>1258461240000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>For me, the main <i>potential</i> benefit for such "low power" GFX chips is their low power draw, which might give total silence with passive cooling or near silence with a large, slow &amp; quiet fan.</p><p>But practically all cheap cards come with small and whining cooling fans nowadays... (and no, finding an aftermarket solution for such a card if no passive ones are readily available (nvm that they are often... a bit more expensive) is not exactly a viable option due to the comparatively large additional cost)</p><p>Integrated GFX at least comes with passive cooling as a standard feature... (c'mon, they can do it with almost microscopic heatsinks on integrated GFX, they can't with such cards?...)</p></htmltext>
<tokentext>For me , the main potential benefit for such " low power " GFX chips is their low power draw , which might give total silence with passive cooling or near silence with a large , slow &amp; quiet fan .
But practically all cheap cards come with small and whining cooling fans nowadays... ( and no , finding an aftermarket solution for such a card if no passive ones are readily available ( nvm that they are often... a bit more expensive ) is not exactly a viable option due to the comparatively large additional cost )
Integrated GFX at least comes with passive cooling as a standard feature... ( c'mon , they can do it with almost microscopic heatsinks on integrated GFX , they ca n't with such cards ? ... )</tokentext>
<sentencetext>For me, the main potential benefit for such "low power" GFX chips is their low power draw, which might give total silence with passive cooling or near silence with a large, slow &amp; quiet fan.
But practically all cheap cards come with small and whining cooling fans nowadays... (and no, finding an aftermarket solution for such a card if no passive ones are readily available (nvm that they are often... a bit more expensive) is not exactly a viable option due to the comparatively large additional cost)
Integrated GFX at least comes with passive cooling as a standard feature... (c'mon, they can do it with almost microscopic heatsinks on integrated GFX, they can't with such cards?...)</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135302</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30142822</id>
	<title>Re:Dear NVidia,</title>
	<author>Hadlock</author>
	<datestamp>1257088020000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><blockquote><div><p>TDP must be no more than approximately 100W</p></div></blockquote><p>Seconding this. ATI, nVidia? do you hear me? I don't need/want a fsking 800w power supply to play video games on my computer. And I sure as hell am not going to upgrade my power supply along with my video card.</p></htmltext>
<tokentext>TDP must be no more than approximately 100W
Seconding this .
ATI , nVidia ?
do you hear me ?
I do n't need/want a fsking 800w power supply to play video games on my computer .
And I sure as hell am not going to upgrade my power supply along with my video card .</tokentext>
<sentencetext>TDP must be no more than approximately 100W
Seconding this.
ATI, nVidia?
do you hear me?
I don't need/want a fsking 800w power supply to play video games on my computer.
And I sure as hell am not going to upgrade my power supply along with my video card.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136008</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138628</id>
	<title>Wow. Huge news!</title>
	<author>scumdamn</author>
	<datestamp>1258470960000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Nvidia just released a slower card at the same price point than a card that ATI has had out for months. This is huge and amazing and stuff.</htmltext>
<tokentext>Nvidia just released a slower card at the same price point than a card that ATI has had out for months .
This is huge and amazing and stuff .</tokentext>
<sentencetext>Nvidia just released a slower card at the same price point than a card that ATI has had out for months.
This is huge and amazing and stuff.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135326</id>
	<title>Re:nVidia 9400M</title>
	<author>Anonymous</author>
	<datestamp>1258453920000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>good question.  i mean is this card going to be worth it?</p><p>will it bring high-end PC game performance to the casual gamers?</p></htmltext>
<tokentext>good question .
i mean is this card going to be worth it ?
will it bring high-end PC game performance to the casual gamers ?</tokentext>
<sentencetext>good question.
i mean is this card going to be worth it?
will it bring high-end PC game performance to the casual gamers?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135268</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135578</id>
	<title>Re:So, I have a question...</title>
	<author>Tristor</author>
	<datestamp>1258454760000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Screen Real Estate</htmltext>
<tokentext>Screen Real Estate</tokentext>
<sentencetext>Screen Real Estate</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135462</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135310</id>
	<title>Tom's Hardware Link</title>
	<author>Anonymous</author>
	<datestamp>1258453860000</datestamp>
	<modclass>Informative</modclass>
	<modscore>4</modscore>
	<htmltext>I prefer the performance graphs/comparisons at <a href="http://www.tomshardware.com/reviews/geforce-gt-240,2475.html" title="tomshardware.com">Tom's Hardware</a> [tomshardware.com].</htmltext>
<tokenext>I prefer the performance graphs/comparisons at Tom 's Hardware [ tomshardware.com ] .</tokentext>
<sentencetext>I prefer the performance graphs/comparisons at Tom's Hardware [tomshardware.com].</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135754</id>
	<title>Re:nVidia 9400M</title>
	<author>Anonymous</author>
	<datestamp>1258455300000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext>Um, in the realm of great video cards, ATI currently holds it with the Radeon HD 5870 series of cards, which are already DX11-ready and blow the socks off of anything Nvidia has, especially in CrossFire configs. What I don't understand is why Nvidia drops this to market now, when it's still chewing on whether it'll do anything with DX11. By that time, ATI will be on its 2nd gen of their great HD cards, and Nvidia "might" be just rolling theirs out? Don't get me wrong, but onboard graphics are eons from the capacity of these cards, especially in dual or triple SLI configs, and when you see the difference (which few do) I would guess that most folks would not be buying Dell, HP, eMachines crap online but building their own or looking at Cyberpower, Poly, MicroExpress, Falcon and others more regularly. Decent card only for what it does: do not expect to play ANY DX10 or DX11 game at a decent FPS - it just can't do it!</htmltext>
<tokenext>Um , in the realm of great video cards , ATI currently holds it with the Radeon HD 5870 series of cards , which are already DX11-ready and blow the socks off of anything Nvidia has , especially in CrossFire configs .
What I do n't understand is why Nvidia drops this to market now , when it 's still chewing on whether it 'll do anything with DX11 .
By that time , ATI will be on its 2nd gen of their great HD cards , and Nvidia " might " be just rolling theirs out ?
Do n't get me wrong , but onboard graphics are eons from the capacity of these cards , especially in dual or triple SLI configs , and when you see the difference ( which few do ) I would guess that most folks would not be buying Dell , HP , eMachines crap online but building their own or looking at Cyberpower , Poly , MicroExpress , Falcon and others more regularly .
Decent card only for what it does : do not expect to play ANY DX10 or DX11 game at a decent FPS - it just ca n't do it !</tokentext>
<sentencetext>Um, in the realm of great video cards, ATI currently holds it with the Radeon HD 5870 series of cards, which are already DX11-ready and blow the socks off of anything Nvidia has, especially in CrossFire configs.
What I don't understand is why Nvidia drops this to market now, when it's still chewing on whether it'll do anything with DX11.
By that time, ATI will be on its 2nd gen of their great HD cards, and Nvidia "might" be just rolling theirs out?
Don't get me wrong, but onboard graphics are eons from the capacity of these cards, especially in dual or triple SLI configs, and when you see the difference (which few do) I would guess that most folks would not be buying Dell, HP, eMachines crap online but building their own or looking at Cyberpower, Poly, MicroExpress, Falcon and others more regularly.
Decent card only for what it does: do not expect to play ANY DX10 or DX11 game at a decent FPS - it just can't do it!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135268</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135568</id>
	<title>meanwhile ATI announces 4.6TFLOPS Radeon 5970</title>
	<author>Anonymous</author>
	<datestamp>1258454700000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>1</modscore>
	<htmltext><p>This is the best NVidia can do to try to answer the Radeon 5970 announcement tomorrow?</p></htmltext>
<tokenext>This is the best NVidia can do to try to answer the Radeon 5970 announcement tomorrow ?</tokentext>
<sentencetext>This is the best NVidia can do to try to answer the Radeon 5970 announcement tomorrow?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138416</id>
	<title>Re:nVidia 9400M</title>
	<author>smash</author>
	<datestamp>1258469220000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>It should destroy a 9400M</htmltext>
<tokenext>It should destroy a 9400M</tokentext>
<sentencetext>It should destroy a 9400M</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135268</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136008</id>
	<title>Dear NVidia,</title>
	<author>Anonymous</author>
	<datestamp>1258456500000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>Nice chip.  I'm waiting until you make a 40nm GPU that beats the 9800GT.  40nm is required because heat and noise are crucial to me.  All of your fast 2xx series stuff is hot and power hungry, so I haven't moved.</p><p>Listen carefully: My magic price point is $200 or less.  TDP must be no more than approximately 100W, à la the 9800GT.  I want 1GB (but I'll settle for 768MB) because 512MB is too small now.  I have never cared about SLI and I won't start anytime soon.  I *DO* care about heat and noise, so make these damn card builders use good cooling, which I define as "can tolerate less than perfect airflow (because fan-filled holes = noise) using 1 large, quiet fan, at FULL load."</p><p>Do that and I'll upgrade.  Don't and I'll look very hard at Larrabee...</p><p>
&nbsp; - Loyal NVidia buyer</p></htmltext>
<tokenext>Nice chip .
I 'm waiting until you make a 40nm GPU that beats the 9800GT .
40nm is required because heat and noise are crucial to me .
All of your fast 2xx series stuff is hot and power hungry , so I have n't moved .
Listen carefully : My magic price point is $ 200 or less .
TDP must be no more than approximately 100W , à la the 9800GT .
I want 1GB ( but I 'll settle for 768MB ) because 512MB is too small now .
I have never cared about SLI and I wo n't start anytime soon .
I * DO * care about heat and noise , so make these damn card builders use good cooling , which I define as " can tolerate less than perfect airflow ( because fan-filled holes = noise ) using 1 large , quiet fan , at FULL load . "
Do that and I 'll upgrade .
Do n't and I 'll look very hard at Larrabee ...
- Loyal NVidia buyer</tokentext>
<sentencetext>Nice chip.
I'm waiting until you make a 40nm GPU that beats the 9800GT.
40nm is required because heat and noise are crucial to me.
All of your fast 2xx series stuff is hot and power hungry, so I haven't moved.
Listen carefully: My magic price point is $200 or less.
TDP must be no more than approximately 100W, à la the 9800GT.
I want 1GB (but I'll settle for 768MB) because 512MB is too small now.
I have never cared about SLI and I won't start anytime soon.
I *DO* care about heat and noise, so make these damn card builders use good cooling, which I define as "can tolerate less than perfect airflow (because fan-filled holes = noise) using 1 large, quiet fan, at FULL load."
Do that and I'll upgrade.
Don't and I'll look very hard at Larrabee...
  - Loyal NVidia buyer</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30140184</id>
	<title>Re:What's with that hedline</title>
	<author>AniVisual</author>
	<datestamp>1258483440000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p> &lt;insert technology here&gt; compatible does not mean that it optimizes programs that use that technology well, only that it supports it. </p></htmltext>
<tokenext>&lt; insert technology here &gt; compatible does not mean that it optimizes programs that use that technology well , only that it supports it .</tokentext>
<sentencetext>&lt;insert technology here&gt; compatible does not mean that it optimizes programs that use that technology well, only that it supports it.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136662</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135688</id>
	<title>Re:Um, so?</title>
	<author>RyuuzakiTetsuya</author>
	<datestamp>1258455120000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Feature set.</p><p>I'm out of the PC gaming scene (in fact, my computer is <a href="http://www.penny-arcade.com/comic/1999/04/28/" title="penny-arcade.com">Grape.</a> [penny-arcade.com]).</p><p>But I do understand the idea of building a sub-$500 PC that supports Windows 7 and nearly any game you want to throw at it, though.</p></htmltext>
<tokenext>Feature set .
I 'm out of the PC gaming scene ( in fact , my computer is Grape . [ penny-arcade.com ] ) .
But I do understand the idea of building a sub-$ 500 PC that supports Windows 7 and nearly any game you want to throw at it , though .</tokentext>
<sentencetext>Feature set.
I'm out of the PC gaming scene (in fact, my computer is Grape. [penny-arcade.com]).
But I do understand the idea of building a sub-$500 PC that supports Windows 7 and nearly any game you want to throw at it, though.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135322</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30140160</id>
	<title>Sure got told...</title>
	<author>Suiggy</author>
	<datestamp>1258483260000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>
1.7\% yields of Fermi GPUs in first batch.<br>
Wooden screws used in the non-working Fermi prototype card which Nvidia claimed was working.<br>
Q2 2010 release date now for consumer Fermi GPUs instead of the promised Q4 2009 release.<br>
20\% clock miss on Fermi architecture.</p><p>
And now they're releasing re-badged crap yet again.
</p><p>
When will it end?</p></htmltext>
<tokenext>1.7 \ % yields of Fermi GPUs in first batch .
Wooden screws used in the non-working Fermi prototype card which Nvidia claimed was working .
Q2 2010 release date now for consumer Fermi GPUs instead of the promised Q4 2009 release .
20 \ % clock miss on Fermi architecture .
And now they 're releasing re-badged crap yet again .
When will it end ?</tokentext>
<sentencetext>
1.7\% yields of Fermi GPUs in first batch.
Wooden screws used in the non-working Fermi prototype card which Nvidia claimed was working.
Q2 2010 release date now for consumer Fermi GPUs instead of the promised Q4 2009 release.
20\% clock miss on Fermi architecture.
And now they're releasing re-badged crap yet again.
When will it end?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138522</id>
	<title>Re:nVidia 9400M</title>
	<author>Anonymous</author>
	<datestamp>1258470120000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Probably extremely well. The 9400M is a piece of shit.</p></htmltext>
<tokenext>Probably extremely well .
The 9400M is a piece of shit .</tokentext>
<sentencetext>Probably extremely well.
The 9400M is a piece of shit.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135268</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135356</id>
	<title>Re:nVidia 9400M</title>
	<author>Anonymous</author>
	<datestamp>1258454040000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext>It's a little better: <a href="http://www.tomshardware.com/reviews/geforce-gt-240,2475-8.html" title="tomshardware.com" rel="nofollow">http://www.tomshardware.com/reviews/geforce-gt-240,2475-8.html</a> [tomshardware.com]</htmltext>
<tokenext>It 's a little better : http : //www.tomshardware.com/reviews/geforce-gt-240,2475-8.html [ tomshardware.com ]</tokentext>
<sentencetext>It's a little better: http://www.tomshardware.com/reviews/geforce-gt-240,2475-8.html [tomshardware.com]</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135268</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30139562</id>
	<title>Re:Yay! Re-badged 9800GT FTW!</title>
	<author>mczak</author>
	<datestamp>1258478280000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>You're quite wrong here. I think you're confusing the GT240 with the GTS240, which is an OEM-only deal and indeed pretty much a rebadged 9800GT (I won't disagree, though, that the naming is silly). The GT240 is based on an entirely new chip (GT215, built on a 40nm process) instead of the old and trusted G92(b) (65/55nm) which was used in the 8800GT/9800GTX/GTS250 (and more). It can also do DX10.1 - something neither G92-based cards nor GT200-based ones (GTX260 and friends) can do.</htmltext>
<tokenext>You 're quite wrong here .
I think you 're confusing the GT240 with the GTS240 , which is an OEM-only deal and indeed pretty much a rebadged 9800GT ( I wo n't disagree , though , that the naming is silly ) .
The GT240 is based on an entirely new chip ( GT215 , built on a 40nm process ) instead of the old and trusted G92 ( b ) ( 65/55nm ) which was used in the 8800GT/9800GTX/GTS250 ( and more ) .
It can also do DX10.1 - something neither G92-based cards nor GT200-based ones ( GTX260 and friends ) can do .</tokentext>
<sentencetext>You're quite wrong here.
I think you're confusing the GT240 with the GTS240, which is an OEM-only deal and indeed pretty much a rebadged 9800GT (I won't disagree, though, that the naming is silly).
The GT240 is based on an entirely new chip (GT215, built on a 40nm process) instead of the old and trusted G92(b) (65/55nm) which was used in the 8800GT/9800GTX/GTS250 (and more).
It can also do DX10.1 - something neither G92-based cards nor GT200-based ones (GTX260 and friends) can do.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135392</id>
	<title>GPU Card Size</title>
	<author>Anonymous</author>
	<datestamp>1258454160000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Will it fit on the motherboard?  Does it take up two slots?  The trend with graphics boards is to make them way too big and bulky.  It's ridiculous.</p><p>I use Linux, but I tried both an AMD and an Nvidia card from the recent generation, both of which were pains to install and still barely fit.  Both eat a regular PCI slot when installed.  And both take up two slots on the back of the tower.</p><p>It's just not worth it for the meager performance you get on Linux with a decent card (in the case of AMD, <em>if</em> the drivers work).  I would think that more lower-end, lower-power and smaller cards would be coming around.  It looks like Nvidia is giving that a shot.</p></htmltext>
<tokenext>Will it fit on the motherboard ?
Does it take up two slots ?
The trend with graphics boards is to make them way too big and bulky .
It 's ridiculous .
I use Linux , but I tried both an AMD and an Nvidia card from the recent generation , both of which were pains to install and still barely fit .
Both eat a regular PCI slot when installed .
And both take up two slots on the back of the tower .
It 's just not worth it for the meager performance you get on Linux with a decent card ( in the case of AMD , if the drivers work ) .
I would think that more lower-end , lower-power and smaller cards would be coming around .
It looks like Nvidia is giving that a shot .</tokentext>
<sentencetext>Will it fit on the motherboard?
Does it take up two slots?
The trend with graphics boards is to make them way too big and bulky.
It's ridiculous.
I use Linux, but I tried both an AMD and an Nvidia card from the recent generation, both of which were pains to install and still barely fit.
Both eat a regular PCI slot when installed.
And both take up two slots on the back of the tower.
It's just not worth it for the meager performance you get on Linux with a decent card (in the case of AMD, if the drivers work).
I would think that more lower-end, lower-power and smaller cards would be coming around.
It looks like Nvidia is giving that a shot.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30141012</id>
	<title>Physx</title>
	<author>Krneki</author>
	<datestamp>1257067560000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Right now Nvidia sucks, but I'm thinking about getting one for PhysX support.<br><br>But since Nvidia is acting all emo and disabling this feature, I don't think I'll buy any of their products for the time being.<br><br>But we need a strong Nvidia, or ATI will stop delivering cheap and good video cards.</htmltext>
<tokenext>Right now Nvidia sucks , but I 'm thinking about getting one for PhysX support .
But since Nvidia is acting all emo and disabling this feature , I do n't think I 'll buy any of their products for the time being .
But we need a strong Nvidia , or ATI will stop delivering cheap and good video cards .</tokentext>
<sentencetext>Right now Nvidia sucks, but I'm thinking about getting one for PhysX support.
But since Nvidia is acting all emo and disabling this feature, I don't think I'll buy any of their products for the time being.
But we need a strong Nvidia, or ATI will stop delivering cheap and good video cards.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135972</id>
	<title>ATI's Radeon HD4770 beats it</title>
	<author>Manfesto</author>
	<datestamp>1258456320000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>ATI's Radeon HD4770 would be this card's analogue (mainstream DX10 hardware around $100) - it's widely available at around $110 on Newegg, and according to this review:
<br> <br>
<a href="http://www.pcgameshardware.com/aid,699611/Geforce-GT-240-Nvidias-fastest-DirectX-101-graphics-card-reviewed/Reviews/" title="pcgameshardware.com" rel="nofollow">http://www.pcgameshardware.com/aid,699611/Geforce-GT-240-Nvidias-fastest-DirectX-101-graphics-card-reviewed/Reviews/</a> [pcgameshardware.com]
<br> <br>
it handily beats this GT240 across the board.  I'd say it's worth more than the extra $10.
<br> <br>
Moreover, I think it's a shame that this is so far the only review I've found comparing the GT240 to the HD4770.  The above review pits it against the HD5750 and HD5770, which are in a completely different league, being DX11 hardware.</htmltext>
<tokenext>ATI 's Radeon HD4770 would be this card 's analogue ( mainstream DX10 hardware around $ 100 ) - it 's widely available at around $ 110 on Newegg , and according to this review : http://www.pcgameshardware.com/aid,699611/Geforce-GT-240-Nvidias-fastest-DirectX-101-graphics-card-reviewed/Reviews/ [ pcgameshardware.com ] it handily beats this GT240 across the board .
I 'd say it 's worth more than the extra $ 10 .
Moreover , I think it 's a shame that this is so far the only review I 've found comparing the GT240 to the HD4770 .
The above review pits it against the HD5750 and HD5770 , which are in a completely different league , being DX11 hardware .</tokentext>
<sentencetext>ATI's Radeon HD4770 would be this card's analogue (mainstream DX10 hardware around $100) - it's widely available at around $110 on Newegg, and according to this review:
http://www.pcgameshardware.com/aid,699611/Geforce-GT-240-Nvidias-fastest-DirectX-101-graphics-card-reviewed/Reviews/ [pcgameshardware.com]
it handily beats this GT240 across the board.
I'd say it's worth more than the extra $10.
Moreover, I think it's a shame that this is so far the only review I've found comparing the GT240 to the HD4770.
The above review pits it against the HD5750 and HD5770, which are in a completely different league, being DX11 hardware.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135546</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135322</id>
	<title>Um, so?</title>
	<author>Anonymous</author>
	<datestamp>1258453920000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>3</modscore>
	<htmltext>While I understand that there is a psychological influence of the whole "under $100" mark, is it really that much different than the standard price reductions and increasing power of graphics cards over time?</htmltext>
<tokenext>While I understand that there is a psychological influence of the whole " under $ 100 " mark , is it really that much different than the standard price reductions and increasing power of graphics cards over time ?</tokentext>
<sentencetext>While I understand that there is a psychological influence of the whole "under $100" mark, is it really that much different than the standard price reductions and increasing power of graphics cards over time?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137500</id>
	<title>Re:DirectX 10.whatever? Who cares?</title>
	<author>Anonymous</author>
	<datestamp>1258463100000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I couldn't care less about DirectWhatever.  But the fact that it's a graphics card for those of us who <i>aren't</i> GPU-melting gameheads makes it relevant for Slashdot, not just a gaming site.  The fact that its ability to run without an external power supply  is <i>noteworthy</i>... is itself noteworthy.</p></htmltext>
<tokenext>I could n't care less about DirectWhatever .
But the fact that it 's a graphics card for those of us who are n't GPU-melting gameheads makes it relevant for Slashdot , not just a gaming site .
The fact that its ability to run without an external power supply is noteworthy... is itself noteworthy .</tokentext>
<sentencetext>I couldn't care less about DirectWhatever.
But the fact that it's a graphics card for those of us who aren't GPU-melting gameheads makes it relevant for Slashdot, not just a gaming site.
The fact that its ability to run without an external power supply  is noteworthy... is itself noteworthy.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136130</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30140132</id>
	<title>Re:Yay! Re-badged 9800GT FTW!</title>
	<author>Macman408</author>
	<datestamp>1258482960000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>It's not a die shrink - the GT 240 supports DirectX 10.1, while the GTS 250 only supports 10.0. Not to mention the GT 240 can also do GDDR5. And the GT240 is about half the power. And despite the GT240 having a 25\% slower graphics clock, 27\% slower processor clock, AND 25\% fewer cores, it manages about 25\% less performance. If it were the same chip, those effects should roughly stack, so it'd be about .75*.75=.5625, or 44\% less performance. Looks to me like it's something new and different, and not just a re-brand.</p><p>Now where's the -1 Wrong mod when you need one? Or at least a -0 Wrong mod - you can keep the points and the visibility, but there oughtta be a tag for those cases where a +5 comment just ain't right.</p></htmltext>
<tokenext>It 's not a die shrink - the GT 240 supports DirectX 10.1 , while the GTS 250 only supports 10.0 .
Not to mention the GT 240 can also do GDDR5 .
And the GT240 is about half the power .
And despite the GT240 having a 25 \ % slower graphics clock , 27 \ % slower processor clock , AND 25 \ % fewer cores , it manages about 25 \ % less performance .
If it were the same chip , those effects should roughly stack , so it 'd be about .75 * .75 = .5625 , or 44 \ % less performance .
Looks to me like it 's something new and different , and not just a re-brand .
Now where 's the -1 Wrong mod when you need one ?
Or at least a -0 Wrong mod - you can keep the points and the visibility , but there oughtta be a tag for those cases where a + 5 comment just ai n't right .</tokentext>
<sentencetext>It's not a die shrink - the GT 240 supports DirectX 10.1, while the GTS 250 only supports 10.0.
Not to mention the GT 240 can also do GDDR5.
And the GT240 is about half the power.
And despite the GT240 having a 25\% slower graphics clock, 27\% slower processor clock, AND 25\% fewer cores, it manages about 25\% less performance.
If it were the same chip, those effects should roughly stack, so it'd be about .75*.75=.5625, or 44\% less performance.
Looks to me like it's something new and different, and not just a re-brand.
Now where's the -1 Wrong mod when you need one?
Or at least a -0 Wrong mod - you can keep the points and the visibility, but there oughtta be a tag for those cases where a +5 comment just ain't right.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414</parent>
</comment>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_18</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137500
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136130
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30140132
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_23</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138714
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137148
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135322
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135688
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135322
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136946
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136130
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_29</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138522
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135268
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_28</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137930
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135326
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135268
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138416
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135268
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_27</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135694
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135462
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135958
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135322
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_32</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30143272
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136210
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_17</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30147690
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_19</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135880
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135392
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_22</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138984
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30142822
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136008
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30155022
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135972
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135546
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_30</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30142226
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136130
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_21</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135754
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135268
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_20</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135550
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135462
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30140184
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136662
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135356
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135268
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30139562
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_26</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137530
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136130
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138038
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135462
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_25</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138224
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136008
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_24</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30139590
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_15</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135578
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135462
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_31</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136746
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135268
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137120
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135302
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137224
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_17_2035200_16</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30139690
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136210
</commentlist>
</thread>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136210
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30143272
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30139690
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136662
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30140184
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135556
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.15</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135392
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135880
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135302
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137120
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136328
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.16</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135924
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136130
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137500
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136946
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30142226
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137530
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.17</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135462
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135550
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138038
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135694
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135578
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136008
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30142822
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138224
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30140160
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136414
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30139562
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30140132
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137224
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138984
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30147690
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30139590
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138714
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30155022
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135310
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135320
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135322
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135958
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135688
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137148
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135546
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135972
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135268
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135326
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30137930
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138522
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30136746
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135754
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30138416
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135356
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_17_2035200.14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_17_2035200.30135568
</commentlist>
</conversation>
