<article>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#article10_03_05_0739241</id>
	<title>NVIDIA Driver Update Causing Video Cards To Overheat In Games</title>
	<author>Soulskill</author>
	<datestamp>1267783980000</datestamp>
	<htmltext>After a group of <em>StarCraft II</em> beta testers reported technical difficulties following the installation of NVIDIA driver update 196.75, Blizzard tech support found that the update introduced fan control problems that were <a href="http://www.incgamers.com/News/21293/nvidia-19675-kills-video-cards">causing video cards to overheat in 3D applications</a>. "This means every single 3D application (i.e. games) running these drivers is going to be exposed to overheating and in some extreme cases it will cause video card, motherboard and/or processor damage. If said motherboard, processor or graphic card is not under warranty, some gamers are in serious trouble playing intensive games such as <em>Prototype</em>, <em>World of Warcraft</em>, <em>Farcry 3</em>, <em>Crysis</em> and many other games with realistic graphics." NVIDIA said they were investigating the problem, took down links to the new drivers, and advised users to revert to 196.21 until the problem could be fixed.</htmltext>
<tokentext>After a group of StarCraft II beta testers reported technical difficulties following the installation of NVIDIA driver update 196.75 , Blizzard tech support found that the update introduced fan control problems that were causing video cards to overheat in 3D applications .
" This means every single 3D application ( i.e .
games ) running these drivers is going to be exposed to overheating and in some extreme cases it will cause video card , motherboard and/or processor damage .
If said motherboard , processor or graphic card is not under warranty , some gamers are in serious trouble playing intensive games such as Prototype , World of Warcraft , Farcry 3 , Crysis and many other games with realistic graphics .
" NVIDIA said they were investigating the problem , took down links to the new drivers , and advised users to revert to 196.21 until the problem can be fixed .</tokentext>
<sentencetext>After a group of StarCraft II beta testers reported technical difficulties following the installation of NVIDIA driver update 196.75, Blizzard tech support found that the update introduced fan control problems that were causing video cards to overheat in 3D applications.
"This means every single 3D application (i.e.
games) running these drivers is going to be exposed to overheating and in some extreme cases it will cause video card, motherboard and/or processor damage.
If said motherboard, processor or graphic card is not under warranty, some gamers are in serious trouble playing intensive games such as Prototype, World of Warcraft, Farcry 3, Crysis and many other games with realistic graphics.
" NVIDIA said they were investigating the problem, took down links to the new drivers, and advised users to revert to 196.21 until the problem can be fixed.</sentencetext>
</article>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369390</id>
	<title>Re:Nvidia driver causing overheating? Oh really.</title>
	<author>Anonymous</author>
	<datestamp>1267792500000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>The best way to solve this is to turn v-sync on and triple buffering off in your NVIDIA control panel, as that'll cap your framerate at 60 (or whatever your monitor's refresh rate is).</p><p>I've had a few 8800 Ultras and GTX 280s, and anything that's 'high performance' generally uses a lot of power and renders at high framerates.</p><p>(Also, laptop cards use hardly any power, so they generate hardly any heat, assuming you have a decent cooler. My laptop with an 8800GTX in it never goes above 55°C in games like the STALKER series.)</p></htmltext>
<tokentext>The best way to solve this , Is to turn v-sync on , And triple buffering off in your nvidia panel .
As that 'll cap your framerate at 60 ( Or whatever your monitor would be .
) I 've had a few 8800ultras and gtx 280s , And anything that 's 'high performance ' generally uses alot of power , and renders at high framerates .
( Also , Laptop cards use hardly any power , so they generate hardly any heat , assuming you have a decent cooler , My laptop with a 8800GTX in it never goes above 55c in games like the STALKER series )</tokentext>
<sentencetext>The best way to solve this, Is to turn v-sync on, And triple buffering off in your nvidia panel.
As that'll cap your framerate at 60 (Or whatever your monitor would be.
)I've had a few 8800ultras and gtx 280s, And anything that's 'high performance' generally uses alot of power, and renders at high framerates.
(Also, Laptop cards use hardly any power, so they generate hardly any heat, assuming you have a decent cooler, My laptop with a 8800GTX in it never goes above 55c in games like the STALKER series)</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369282</parent>
</comment>
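The comment above argues that capping the framerate at the refresh rate bounds how much work the GPU does per second. As an illustration only (v-sync does this in hardware against the monitor's refresh; this is just a software sketch of the same idea), a minimal 60 FPS frame limiter looks like:

```python
import time

def sleep_budget(frame_start: float, now: float, fps: float = 60.0) -> float:
    """Return how long to sleep so each frame takes at least 1/fps seconds."""
    return max(0.0, (1.0 / fps) - (now - frame_start))

def run_frames(n_frames: int, fps: float = 60.0) -> float:
    """Render n_frames dummy frames, capped at fps; return elapsed seconds."""
    t0 = time.perf_counter()
    for _ in range(n_frames):
        frame_start = time.perf_counter()
        # ... render the frame here; a fast GPU finishes early ...
        time.sleep(sleep_budget(frame_start, time.perf_counter(), fps))
    return time.perf_counter() - t0
```

With the cap in place the GPU idles for the rest of each frame interval instead of rendering hundreds of frames per second, which is why the trick reduces heat.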
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31374630</id>
	<title>Re:A little more info from the story</title>
	<author>Anonymous</author>
	<datestamp>1267821900000</datestamp>
	<modclass>Informative</modclass>
	<modscore>1</modscore>
	<htmltext><p>A lot of laptops have their cooling fan controlled by the CPU, not the GPU.</p><p>For example, the ASUS G1S has no end of overheating problems thanks to incredibly bad internal thermal design (and this is from ASUS, so others will have it too). One tiny heat pipe connected to a GeForce 8600M GT which:<br>
&nbsp; &nbsp; a) Is made by nVidia - which means it will run hot.<br>
&nbsp; &nbsp; b) Is a known problem, with a history of overheating.</p><p>So it is quite easy to end up in a situation where the GPU is stressed to the max while the CPU is idling - so the fan drops to quiet and the GPU temperature starts to rise.</p><p>I've seen 110°C on a G1S which was well ventilated (raised off the table) and was dust free inside - it had just come back from ASUS after being RMA'ed because the previous GPU had overheated and killed the laptop. It took them a month to repair.</p><p>There is no tool, that I am aware of, that allows you to successfully modify the BIOS of the 8600M GT, i.e.:<br>
&nbsp; &nbsp; a) Undervolt it (I'm sure nVidia purposely overstate the voltage requirements on their GPUs so that they can be overclocked successfully).<br>
&nbsp; &nbsp; b) Change the default clock speeds for full and idle loads.</p><p>MMTools and NiBiTor come closest but do not work (you can change the values and re-flash the BIOS, but they have no effect).<br>No desktop fan speed controller application will work either.</p><p>The best it gets is a cooling pad - which creates even more noise.</p></htmltext>
<tokentext>A lot of Laptops have their cooling fan controlled by the CPU not the GPU.For example the ASUS G1S which has no end of over heating problems thanks to incredibly bad internal thermal design ( this is from ASUS !
So others will have have it too ) .
One tiny heat pipe connected to a GeForce 8600m GT which :     a ) Is made by nVidia - which means it will run hot .
    b ) Is a know problem , it has a history of over heating.So it is quite easy to end up in a situation where the GPU is stressed out to the max and the CPU is idling - so the fan drops to quite and the GPU temperature starts to rises.I 've seen 110 * C on a G1S which was well ventilated ( raised off the table ) and was dust free inside - it had just come back from ASUS after being RMA'ed because the previous GPU had overheated and killed the laptop .
It took them a month to repair.There is no tool , that I am aware of , that allows you to modify the BIOS of the GPU 8600m GT successfully , ie :     a ) Undervolt it ( I 'm sure nVidia purposely overstate the voltage requirement on their GPUs so that they can be overclocked successfully ) .
    b ) Change the default clock speeds for Full and Idle loads.MMTools and NiBiTor comes closes but does not work ( you can change the values re-flash the bios but they have no effect ) .No desktop application fan speed controller will work either.The best it gets is a cooling pad - which creates even more noise .</tokentext>
<sentencetext>A lot of Laptops have their cooling fan controlled by the CPU not the GPU.For example the ASUS G1S which has no end of over heating problems thanks to incredibly bad internal thermal design (this is from ASUS!
So others will have have it too).
One tiny heat pipe connected to a GeForce 8600m GT which:
    a) Is made by nVidia - which means it will run hot.
    b) Is a know problem, it has a history of over heating.So it is quite easy to end up in a situation where the GPU is stressed out to the max and the CPU is idling - so the fan drops to quite and the GPU temperature starts to rises.I've seen 110*C on a G1S which was well ventilated (raised off the table) and was dust free inside - it had just come back from ASUS after being RMA'ed because the previous GPU had overheated and killed the laptop.
It took them a month to repair.There is no tool, that I am aware of, that allows you to modify the BIOS of the GPU 8600m GT successfully, ie:
    a) Undervolt it (I'm sure nVidia purposely overstate the voltage requirement on their GPUs so that they can be overclocked successfully).
    b) Change the default clock speeds for Full and Idle loads.MMTools and NiBiTor comes closes but does not work (you can change the values re-flash the bios but they have no effect).No desktop application fan speed controller will work either.The best it gets is a cooling pad - which creates even more noise.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369304</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458</id>
	<title>Terrible design</title>
	<author>QuoteMstr</author>
	<datestamp>1267793280000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext><p>Software should not be able to destroy hardware, <i>period</i>. The GPU's cooling system should be designed to safely operate for sustained periods at peak load; anything less artificially cripples the hardware and leads to both security and reliability problems.</p><p>Great job, NVIDIA: now malware can not only destroy your files, but destroy your expensive graphics card as well.</p></htmltext>
<tokentext>Software should not be able to destroy hardware , period .
The GPU 's cooling system should be designed to safety operate for sustained periods at peak load --- anything less is artificially crippling the hardware and leads to both security and reliability problems.Great job , NVIDIA : now , malware can not only destroy your files , but destroy your expensive graphics card as well .</tokentext>
<sentencetext>Software should not be able to destroy hardware, period.
The GPU's cooling system should be designed to safety operate for sustained periods at peak load --- anything less is artificially crippling the hardware and leads to both security and reliability problems.Great job, NVIDIA: now, malware can not only destroy your files, but destroy your expensive graphics card as well.</sentencetext>
</comment>
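The protection this comment asks for amounts to a last-resort thermal policy that sits below the driver and can override it but never be overridden. A sketch of that idea, with made-up thresholds (these are illustrative numbers, not NVIDIA's actual firmware limits):

```python
# Hypothetical last-resort thermal policy of the kind the comment argues
# should live in hardware/firmware, below any driver's control.
THROTTLE_C = 95.0   # assumed throttle threshold, not a real NVIDIA spec
SHUTDOWN_C = 105.0  # assumed emergency cutoff, not a real NVIDIA spec

def thermal_action(temp_c: float, requested_fan_pct: float) -> tuple:
    """Return (action, fan_pct). The fan can be raised above the driver's
    request, but the request is ignored entirely once temps are critical."""
    if temp_c >= SHUTDOWN_C:
        return ("shutdown", 100.0)
    if temp_c >= THROTTLE_C:
        # Override whatever the (possibly buggy) driver asked for.
        return ("throttle", max(requested_fan_pct, 100.0))
    return ("ok", requested_fan_pct)
```

The key property is that a broken driver request (say, fan pinned at 30%) cannot push the card past the cutoff: the policy only ever raises the fan or shuts down.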
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31377368</id>
	<title>Re:Glad it didn't fry mine.</title>
	<author>BikeHelmet</author>
	<datestamp>1267795920000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I had an 8800GS that overheated with all recent drivers. I had to underclock its memory with RivaTuner by about 10%.</p><p>I recently picked up a GTS 250, which also overheats. I had to underclock its memory by about 30%, bringing it in line with the 8800GS's memory speed.</p><p>The culprit seems to be inadequate GDDR3 cooling. Frankly, I'm glad their new drivers are more efficient and stress the card more. I just wish they didn't half-ass the heatsinks.</p></htmltext>
<tokentext>I had an 8800GS that overheated with all recent drivers .
I had to underclock its memory with Rivatuner by about 10 \ % .I recently picked up a GTS 250 , which also overheats .
I had to underclock its memory by about 30 \ % , bringing it in line with the 8800GS 's memory speed.The culprit seems to be inadequate GDDR3 cooling .
Frankly , I 'm glad their new drivers are more efficient , and stress the card more .
I just wish they did n't half-ass the heatsinks .</tokentext>
<sentencetext>I had an 8800GS that overheated with all recent drivers.
I had to underclock its memory with Rivatuner by about 10\%.I recently picked up a GTS 250, which also overheats.
I had to underclock its memory by about 30\%, bringing it in line with the 8800GS's memory speed.The culprit seems to be inadequate GDDR3 cooling.
Frankly, I'm glad their new drivers are more efficient, and stress the card more.
I just wish they didn't half-ass the heatsinks.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369078</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369038</id>
	<title>Wow</title>
	<author>Anonymous</author>
	<datestamp>1267787760000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>That's hot.</htmltext>
<tokenext>That 's hot .</tokentext>
<sentencetext>That's hot.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31372198</id>
	<title>Linux Users Remain Largely Unaffected</title>
	<author>RobDude</author>
	<datestamp>1267810260000</datestamp>
	<modclass>Funny</modclass>
	<modscore>4</modscore>
	<htmltext><p>Just sayin...</p></htmltext>
<tokentext>Just sayin.. .</tokentext>
<sentencetext>Just sayin...</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369182</id>
	<title>Planned obsolescence...</title>
	<author>Anonymous</author>
	<datestamp>1267789500000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>... programmed obsolescence, literally :-).</p></htmltext>
<tokentext>... programmed obsolescence , literally : - ) .</tokentext>
<sentencetext>... programmed obsolescence, literally :-).</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371452</id>
	<title>Re:Terrible design</title>
	<author>PPalmgren</author>
	<datestamp>1267806900000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I think it has a lot to do with enthusiast cards.  That market segment is incredibly picky and extremely informed.  They tend to push the hardware to the limits, and beyond if at all possible.  A lot of these guys run fans at 100% for a year with the card pushed to the max, so it has a lifespan less than 30% of a stock card's.  As such, they buy more cards per year than their mainstream counterparts.  They also carry the highest profit margins and recoup R&amp;D costs for NVidia and ATI.</p><p>A good way to kill your enthusiast appeal is telling them that you hard-coded OC prevention at dangerous voltages and temps into your cards.  "Just works" doesn't cut it in that arena.</p></htmltext>
<tokentext>I think it has a lot to do with enthusiast cards .
That market segment is incredibly picky and extremely informed .
They tend to push the hardware to the limits , and beyond if it all possible .
A lot of these guys run fans at 100 \ % for a year with the card pushed to the max , so it has a lifespan less than 30 \ % of a stock card .
As such , they buy more cards per year than their mainstream counterpart .
They are also the highest profit margin and recoup R&amp;D costs for NVidia and ATI.A good way to kill your enthusiast appeal is telling them that you hard-coded in OC prevention to dangerous voltages and temps into your cards .
" Just works " does n't cut it in that arena .</tokentext>
<sentencetext>I think it has a lot to do with enthusiast cards.
That market segment is incredibly picky and extremely informed.
They tend to push the hardware to the limits, and beyond if it all possible.
A lot of these guys run fans at 100\% for a year with the card pushed to the max, so it has a lifespan less than 30\% of a stock card.
As such, they buy more cards per year than their mainstream counterpart.
They are also the highest profit margin and recoup R&amp;D costs for NVidia and ATI.A good way to kill your enthusiast appeal is telling them that you hard-coded in OC prevention to dangerous voltages and temps into your cards.
"Just works" doesn't cut it in that arena.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369664</id>
	<title>The real question is....</title>
	<author>TheFakeMcCoy</author>
	<datestamp>1267795620000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>How the hell did those guys get into the Starcraft II beta? I've been waiting for months!</htmltext>
<tokentext>How the hell did those guys get into the Starcraft II beta , I 've been waiting for months !</tokentext>
<sentencetext>How the hell did those guys get into the Starcraft II beta, I've been waiting for months!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370218</id>
	<title>Re:If it ain't broke..</title>
	<author>Anonymous</author>
	<datestamp>1267800540000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>That seems a bit contradictory. You state that you have never had an issue resolved by upgrading a driver, yet state that an upgraded driver has caused issues for you? So let me guess: you never updated beyond that point either, since it's quite possible a newer release of the driver fixes the issues the previous one caused? To offer the counter view, Mass Effect 2 didn't even get to the menu screen on my machine, an issue that was fixed by updating the driver. If yours works, then good for you.</p><p>What about added features to offload graphical processing? The introduction of PhysX makes a very noticeable difference in games which support it.</p><p>The "if it ain't broke" mentality may work for office types, but there are a lot of people who push their cards to the limit, and the differences between drivers can have quite a significant impact.</p></htmltext>
<tokentext>That seems a bit contradictory .
You state that you have never had an issue resolved by upgrading a driver , yet state an upgraded driver has caused issues for you ?
So let me guess you never updated beyond that point either , since it 's quite possible a newer release of the driver fixes the issues the previous one caused ?
To offer the counter view Mass Effect 2 did n't get into the menu screen on my machine , a issue that was fixed by updating the driver .
Yours works , then good for you.What about added features to offload graphical processing ?
The introduction of PhysX makes a very noticeable difference to games which support it.The if it ai n't broke mentality may work for office types but there 's a lot of people who push their cards to the limit and the differences in drivers can have quite a significant impact .</tokentext>
<sentencetext>That seems a bit contradictory.
You state that you have never had an issue resolved by upgrading a driver, yet state an upgraded driver has caused issues for you?
So let me guess you never updated beyond that point either, since it's quite possible a newer release of the driver fixes the issues the previous one caused?
To offer the counter view Mass Effect 2 didn't get into the menu screen on my machine, a issue that was fixed by updating the driver.
Yours works, then good for you.What about added features to offload graphical processing?
The introduction of PhysX makes a very noticeable difference to games which support it.The if it ain't broke mentality may work for office types but there's a lot of people who push their cards to the limit and the differences in drivers can have quite a significant impact.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369198</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369550</id>
	<title>Re:Crappy Nvidia driver has multiple issues</title>
	<author>Anonymous</author>
	<datestamp>1267794360000</datestamp>
	<modclass>Informative</modclass>
	<modscore>4</modscore>
	<htmltext>This issue is related to automatic fan control not working due to improper registry keys, so GPUs that run warm (the 9800 series, for instance) can quickly overheat and potentially suffer damage.  I'm having no issues with mine, but I set fan profiles manually, as I'm using a machine that has a very hot MCH &amp; FB-DIMMs (2008 Xeon) and don't want the GPU contributing more.  However, for anyone interested (and using a GT200, or at least G80/G92 on up), here's the fix: <a href="http://forums.nvidia.com/index.php?showtopic=161767" title="nvidia.com" rel="nofollow">http://forums.nvidia.com/index.php?showtopic=161767</a> [nvidia.com]</htmltext>
<tokentext>This issue is related to automatic fan control not working due to improper registry keys , and so GPU 's that run warm ( 9800 series for instance ) can quickly overheat and potentially suffer damage .
I 'm having no issues with mine , but I set fan profiles manually as I 'm using a machine that has a very hot MCH &amp; fb-dimms ( 2008 Xeon ) and do n't want the gpu contributing more .
However for anyone interested ( and using a GT200 or at least G80/G92 on up ) here 's the fix : http : //forums.nvidia.com/index.php ? showtopic = 161767 [ nvidia.com ]</tokentext>
<sentencetext>This issue is related to automatic fan control not working due to improper registry keys, and so GPU's that run warm (9800 series for instance) can quickly overheat and potentially suffer damage.
I'm having no issues with mine, but I set fan profiles manually as I'm using a machine that has a very hot MCH &amp; fb-dimms (2008 Xeon) and don't want the gpu contributing more.
However for anyone interested (and using a GT200 or at least G80/G92 on up) here's the fix: http://forums.nvidia.com/index.php?showtopic=161767 [nvidia.com]</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369104</parent>
</comment>
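Setting a manual fan profile, as this commenter describes, means replacing the broken automatic control with a fixed temperature-to-fan-speed curve. A sketch of how such a curve is evaluated (the temperature/fan points here are illustrative, not recommended settings for any particular card):

```python
# Sketch of a manual fan curve of the kind tools like RivaTuner or the
# NVIDIA control panel fix let you define. Points are (temp °C, fan %).
CURVE = [(40, 30), (60, 50), (80, 80), (90, 100)]

def fan_percent(temp_c: float, curve=CURVE) -> float:
    """Linearly interpolate fan speed from a (temp, fan%) curve,
    clamping below the first point and above the last."""
    if temp_c <= curve[0][0]:
        return float(curve[0][1])
    if temp_c >= curve[-1][0]:
        return float(curve[-1][1])
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            frac = (temp_c - t0) / (t1 - t0)
            return f0 + frac * (f1 - f0)
```

Because the curve is fixed, it keeps working even when the driver's automatic control is misreading its registry keys, which is why manual profiles sidestepped the 196.75 bug.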
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370792</id>
	<title>Re:Processor damage, really?</title>
	<author>idontgno</author>
	<datestamp>1267803780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>You're forgetting laptops. Everything integrated in close proximity on one motherboard, with shared fixed-capacity cooling. If the driver update pushes a heat-constrained laptop GPU harder, you could easily exceed whole-system thermal limits leading to CPU or MB damage.</p><p>The less obvious case is if a desktop system is ventilated just well enough to handle normal heat from its components, and the GPU goes into thermal overdrive because of this driver. In that case, intra-case temps will go up and, if not noticed, overheat other components. Probably not damage the CPU at that point, but thermal shutdown is likely.</p><p> <i>I'm not saying there isn't an issue, but it sounds like the issue is just a bit over-hyped... or someone has an agenda and just wants to bash NVIDIA.</i> </p><p>NVIDIA is doing a damn fine job of bashing themselves. They lost my trust with their piss-poor chip engineering and deceptive PR and warranty practices in the Bumpgate fiasco. This driver screwup doesn't help that. After years of devoted NForce/GeForce fanboism, I'm now thoroughly on the AMD/ATI bandwagon. We'll give NVIDIA another look when they appear to have gotten their crap together.</p></htmltext>
<tokentext>You 're forgetting laptops .
Everything integrated in close proximity on one motherboard , with shared fixed-capacity cooling .
If the driver update pushes a heat-constrained laptop GPU harder , you could easily exceed whole-system thermal limits leading to CPU or MB damage.The less obvious case is if a desktop system is ventilated just well enough to handle normal heat from its components , and the GPU goes into thermal overdrive because of this driver .
In that case , intra-case temps will go up and , if not noticed , overheat other components .
Probably not damage the CPU at that point , but thermal shutdown is likely .
I 'm not saying there is n't an issue , but it sounds like the issue is just a bit over-hyped... or someone has an agenda and just wants to bash NVIDIA .
NVIDIA is doing a damn fine job of bashing themselves .
They lost my trust with their piss-poor chip engineering and deceptive PR and warranty practices in the Bumpgate fiasco .
This driver screwup does n't help that .
After years of devoted NForce/GeForce fanboism , I 'm now thoroughly on the AMD/ATI bandwagon .
We 'll give NVIDIA another look when they appear to have gotten their crap together .</tokentext>
<sentencetext>You're forgetting laptops.
Everything integrated in close proximity on one motherboard, with shared fixed-capacity cooling.
If the driver update pushes a heat-constrained laptop GPU harder, you could easily exceed whole-system thermal limits leading to CPU or MB damage.The less obvious case is if a desktop system is ventilated just well enough to handle normal heat from its components, and the GPU goes into thermal overdrive because of this driver.
In that case, intra-case temps will go up and, if not noticed, overheat other components.
Probably not damage the CPU at that point, but thermal shutdown is likely.
I'm not saying there isn't an issue, but it sounds like the issue is just a bit over-hyped... or someone has an agenda and just wants to bash NVIDIA.
NVIDIA is doing a damn fine job of bashing themselves.
They lost my trust with their piss-poor chip engineering and deceptive PR and warranty practices in the Bumpgate fiasco.
This driver screwup doesn't help that.
After years of devoted NForce/GeForce fanboism, I'm now thoroughly on the AMD/ATI bandwagon.
We'll give NVIDIA another look when they appear to have gotten their crap together.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370670</id>
	<title>SUE THEM!!!</title>
	<author>Anonymous</author>
	<datestamp>1267803180000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Anyone whose hardware is damaged should SUE THESE BASTARDS!!!!</p></htmltext>
<tokentext>Anyone whose hardware is damaged should SUE THESE BASTARDS ! ! !
!</tokentext>
<sentencetext>Anyone whose hardware is damaged should SUE THESE BASTARDS!!!
!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370198</id>
	<title>WoW has realistic graphics?</title>
	<author>Drethon</author>
	<datestamp>1267800420000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Since when?</htmltext>
<tokentext>Since when ?</tokentext>
<sentencetext>Since when?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369308</id>
	<title>Re:Processor damage, really?</title>
	<author>Manip</author>
	<datestamp>1267791360000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext><p>The slot can be damaged by an overheating card, and if it is your only x16 slot then you could wind up throwing away the entire motherboard. Typically, though, this is seen when a card overheats multiple times, causing the material to expand and contract until it eventually fails (as opposed to this case, where cards just die).</p><p>My only guess about CPU damage is unregulated power spikes, but that is just conjecture. Besides, if anything were going to get damaged by power spikes it wouldn't be the CPU, it would be the RAM.</p></htmltext>
<tokentext>The slot can be damaged by overheating cards , and if it is your only 16x slot then you could wind up throwing away the entire motherboard .
Although typically this is more often seen when a card overheats multiple times causing the material to expand and contract until it eventually fails ( as opposed to this case when cards just die ) .My only guess about CPU damage is unregulated power spikes but that is just conjecture .
Plus if anything was going to get damaged by power spikes it would n't be the CPU it would be the RAM .</tokentext>
<sentencetext>The slot can be damaged by overheating cards, and if it is your only 16x slot then you could wind up throwing away the entire motherboard.
Although typically this is more often seen when a card overheats multiple times causing the material to expand and contract until it eventually fails (as opposed to this case when cards just die).My only guess about CPU damage is unregulated power spikes but that is just conjecture.
Plus if anything was going to get damaged by power spikes it wouldn't be the CPU it would be the RAM.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370106</id>
	<title>Re:Terrible design</title>
	<author>Anonymous</author>
	<datestamp>1267799700000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>1</modscore>
	<htmltext><p>Software (read: applications) isn't destroying hardware in this case.  The hardware itself is now "faulty", as the drivers have a pretty bad bug.</p><p>In my mind, this is no different than taking the heatsink/fan off a CPU.  That's a hardware issue.  It doesn't matter what games, etc., you run; you risk killing that CPU because the CPU is under an abnormal operating condition.</p><p>While drivers are in control in the case we have here with nVidia, I see the drivers as part of the hardware, since they were released by the manufacturer.</p></htmltext>
<tokentext>Software ( read : applications ) is n't destroying hardware in this case .
The hardware itself is now " faulty " as the drivers have a pretty bad bug.In my mind , this is no different than taking the the heatsink/fan off a CPU .
That 's a hardware issue .
Does n't matter what games , etc , you run , you risk killing that CPU because the CPU is under an abnormal operating condition.While drivers are in control in the case we have here with nVidia , I see the drivers as part of the hardware since they were released by the manufacturer .</tokentext>
<sentencetext>Software (read: applications) isn't destroying hardware in this case.
The hardware itself is now "faulty" as the drivers have a pretty bad bug.In my mind, this is no different than taking the the heatsink/fan off a CPU.
That's a hardware issue.
Doesn't matter what games, etc, you run, you risk killing that CPU because the CPU is under an abnormal operating condition.While drivers are in control in the case we have here with nVidia, I see the drivers as part of the hardware since they were released by the manufacturer.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371368</id>
	<title>Time to switch to ATI?</title>
	<author>js3</author>
	<datestamp>1267806480000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>Last week my Win7 bluescreened 3 times with weird hardware errors while playing WoW. I knew something was off, but never figured it would be crappy nvidia cards. I've always been a fan and always bought their cards, but yeah, wtf is up with that? Maybe it's time to try some ATI.</p></htmltext>
<tokenext>Last week my win7 bluescreened 3 times with weird hardware errors while playing WoW .
I knew something was off but never figured it would be crappy nvidia cards .
I 've always been a fan and always bought their cards but yea wtf is up with that .
Maybe time to try some ATI</tokentext>
<sentencetext>Last week my win7 bluescreened 3 times with weird hardware errors while playing WoW.
I knew something was off but never figured it would be crappy nvidia cards.
I've always been a fan and always bought their cards but yea wtf is up with that.
Maybe time to try some ATI</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31382456</id>
	<title>How to Find Current NVIDIA Driver Version</title>
	<author>progliberty</author>
	<datestamp>1267906740000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>How to Find Current NVIDIA Driver Version (on your computer) in Windows XP:

START menu &gt;&gt; Control Panel &gt;&gt; NVIDIA Control Panel &gt;&gt; Help Menu &gt;&gt; System Information</htmltext>
<tokenext>How to Find Current NVIDIA Driver Version ( on your computer ) in Windows XP : START menu &gt; &gt; Control Panel &gt; &gt; NVIDIA Control Panel &gt; &gt; Help Menu &gt; &gt; System Information</tokentext>
<sentencetext>How to Find Current NVIDIA Driver Version (on your computer) in Windows XP:

START menu &gt;&gt; Control Panel &gt;&gt; NVIDIA Control Panel &gt;&gt; Help Menu &gt;&gt; System Information</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369038</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371830</id>
	<title>Has any GPU damage *actually* occurred?</title>
	<author>3.1415926535</author>
	<datestamp>1267808520000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Unlike the (what appears to be purely speculative) complaining here, modern graphics boards have thermal and voltage protection circuitry that operates independently of the software to protect the GPU from exactly this sort of situation.  That's why the Blizzard report talks about a lot of "my game slowed down" complaints rather than "my GPU blew up" complaints.</p></htmltext>
<tokenext>Unlike the ( what appears to be purely speculative ) complaining here , modern graphics boards have thermal and voltage protection circuitry that operates independently of the software to protect the GPU from exactly this sort of situation .
That 's why the Blizzard report talks about a lot of " my game slowed down " complaints rather than " my GPU blew up " complaints .</tokentext>
<sentencetext>Unlike the (what appears to be purely speculative) complaining here, modern graphics boards have thermal and voltage protection circuitry that operates independently of the software to protect the GPU from exactly this sort of situation.
That's why the Blizzard report talks about a lot of "my game slowed down" complaints rather than "my GPU blew up" complaints.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369232</id>
	<title>Re:Processor damage, really?</title>
	<author>someone1234</author>
	<datestamp>1267790100000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The melting GPU burns the CPU. It could even damage the carpet!</p></htmltext>
<tokenext>The melting GPU burns the CPU .
It could even damage the carpet !</tokentext>
<sentencetext>The melting GPU burns the CPU.
It could even damage the carpet!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371182</id>
	<title>evil...</title>
	<author>GNUPublicLicense</author>
	<datestamp>1267805640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext>nvidia is evil since they don't publish their hardware programming manuals the way AMD(ATI)/Intel do.
Buy AMD(ATI) or Intel. Avoid nvidia like hell till they release their manuals.</htmltext>
<tokenext>nvidia is evil since they do n't publish their hardware programming manual like AMD ( ATI ) /Intel .
Buy AMD ( ATI ) or Intel .
Avoid like hell nvidia till they release their manuals .</tokentext>
<sentencetext>nvidia is evil since they don't publish their hardware programming manual like AMD(ATI)/Intel.
Buy AMD(ATI) or Intel.
Avoid like hell nvidia till they release their manuals.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370002</id>
	<title>Re:Terrible design</title>
	<author>Kleppy</author>
	<datestamp>1267798920000</datestamp>
	<modclass>Funny</modclass>
	<modscore>2</modscore>
<htmltext><blockquote><div><p>"Software should not be able to destroy hardware, period."</p></div></blockquote><p>
Tell that to Toyota.....</p>
	</htmltext>
<tokenext>" Software should not be able to destroy hardware , period .
" Tell that to Toyota.... .</tokentext>
<sentencetext>"Software should not be able to destroy hardware, period.
"
Tell that to Toyota.....
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370138</id>
	<title>Re:Terrible design</title>
	<author>maxwell demon</author>
	<datestamp>1267799940000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>Old monitors could be killed by software as well (by just selecting a too-high sync frequency). Later monitors added protection against that.<br>Also, don't some motherboards allow you to set the CPU voltage in the BIOS? I guess that means you could fry your CPU from software as well.</p></htmltext>
<tokenext>Old monitors could be killed by software as well ( by just selecting a too high sync frequency ) .
Later monitors added a protection against that.Also , do n't some motherboards allow to set the CPU voltage in the BIOS ?
I guess that means you could fry your CPU from software as well .</tokentext>
<sentencetext>Old monitors could be killed by software as well (by just selecting a too high sync frequency).
Later monitors added a protection against that.Also, don't some motherboards allow to set the CPU voltage in the BIOS?
I guess that means you could fry your CPU from software as well.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371538</id>
	<title>Re:Processor damage, really?</title>
	<author>Anonymous</author>
	<datestamp>1267807320000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
<htmltext><p>Unfortunately for your theory, laptops use the mobile nvidia series and the newest driver release is still 195.62, so they are not affected by this.</p></htmltext>
<tokenext>Unfortunately for you theory laptops use the mobile nvidia series and the newest driver release is still 195.62 , therefore not affected by this .</tokentext>
<sentencetext>Unfortunately for you theory laptops use the mobile nvidia series and the newest driver release is still 195.62, therefore not affected by this.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369238</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369518</id>
	<title>Er....</title>
	<author>Anonymous</author>
	<datestamp>1267793940000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Farcry 3?  Really?</p></htmltext>
<tokenext>Farcry 3 ?
Really ?</tokentext>
<sentencetext>Farcry 3?
Really?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371136</id>
	<title>GPUs not just for gaming anymore</title>
	<author>dcraid</author>
	<datestamp>1267805340000</datestamp>
	<modclass>Offtopic</modclass>
	<modscore>0</modscore>
	<htmltext>I use my GPU as a 715+ GFLOP SETI cruncher.</htmltext>
<tokenext>I use my GPU as a 715 + GFLOP SETI cruncher .</tokentext>
<sentencetext>I use my GPU as a 715+ GFLOP SETI cruncher.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369198</id>
	<title>If it ain't broke..</title>
	<author>Anonymous</author>
	<datestamp>1267789680000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext><p>WoW seems an odd companion to those other games, I've always felt the CPU was the primary bottleneck in that beast, but be that as it may..</p><p>For me, I can't recall ever solving an issue or getting noticeable performance improvements from upgrading graphics drivers. I have, however, had several issues introduced by it.</p><p>Nowadays I stick to the old "if it works don't try to fix it" mantra, with a few exceptions. For example, I kept up-to-date for a bit after Win7 release, assuming there would be teething issues for a few revisions. If buying a bleeding edge recently released card I would also stay on top of drivers for a month or two. But other than that, just leave them be I say.</p></htmltext>
<tokenext>WoW seems an odd companion to those other games , I 've always felt the CPU was the primary bottleneck in that beast , but be that as it may..For me , I ca n't recall ever solving an issue or getting noticeable performance improvements from upgrading graphics drivers .
I have , however , had several issues introduced by it.Nowadays I stick to the old " if it works do n't try to fix it " mantra , with a few exceptions .
For example , I kept up-to-date for a bit after Win7 release , assuming there would be teething issues for a few revisions .
If buying a bleeding edge recently released card I would also stay on top of drivers for a month or two .
But other than that , just leave them be I say .</tokentext>
<sentencetext>WoW seems an odd companion to those other games, I've always felt the CPU was the primary bottleneck in that beast, but be that as it may..For me, I can't recall ever solving an issue or getting noticeable performance improvements from upgrading graphics drivers.
I have, however, had several issues introduced by it.Nowadays I stick to the old "if it works don't try to fix it" mantra, with a few exceptions.
For example, I kept up-to-date for a bit after Win7 release, assuming there would be teething issues for a few revisions.
If buying a bleeding edge recently released card I would also stay on top of drivers for a month or two.
But other than that, just leave them be I say.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371276</id>
	<title>Re:Terrible design</title>
	<author>null8</author>
	<datestamp>1267806000000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext>That is not realistic. If you want to give people the possibility of a BIOS update to fix some hardware bugs, you can also overwrite your BIOS with garbage that applies incorrect voltages, which will physically destroy your mainboard; it once happened to me. If you know how, you can even load new microcode, which can kill a CPU. One can theoretically open multiple tristate gates and cause some kind of short circuit. I mean, you can say "no one should kill another person, period", and everyone will agree with it, but it's also not realistic.</htmltext>
<tokenext>That is not realistic .
If you want to provide people with a possibility of BIOS update to fix some hardware bugs , you can overwrite you bios for example with some garbage that can apply incorrect voltages , which will physically destroy your mainboard , it once happened to me .
If you know how you even can load new microcode , which can kill a CPU .
One can theoretically open multiple tristate gates and cause some kind of short circuit .
I mean you can say " noone should kill another person , period " , everyone will agree with it , but it 's also not realistic .</tokentext>
<sentencetext>That is not realistic.
If you want to provide people with a possibility of BIOS update to fix some hardware bugs, you can overwrite you bios for example with some garbage that can apply incorrect voltages, which will physically destroy your mainboard, it once happened to me.
If you know how you even can load new microcode, which can kill a CPU.
One can theoretically open multiple tristate gates and cause some kind of  short circuit.
I mean you can say "noone should kill another person, period", everyone will agree with it, but it's also not realistic.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31376746</id>
	<title>It's a warning label.</title>
	<author>Narcocide</author>
	<datestamp>1267790520000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Seriously tell me you didn't already know.</p></htmltext>
<tokenext>Seriously tell me you did n't already know .</tokentext>
<sentencetext>Seriously tell me you didn't already know.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371548</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369142</id>
	<title>As an ATI user...</title>
	<author>Anonymous</author>
	<datestamp>1267789140000</datestamp>
	<modclass>None</modclass>
	<modscore>-1</modscore>
	<htmltext><p>... all I have to say is "neeener neeener nah nah!"</p></htmltext>
<tokenext>... all I have to say is " neeener neeener nah nah !
"</tokentext>
<sentencetext>... all I have to say is "neeener neeener nah nah!
"</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369702</id>
	<title>Re:If it ain't broke..</title>
	<author>Anonymous</author>
	<datestamp>1267795980000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
<htmltext><p>I used to moonlight as a hardware reviewer specializing in graphics cards, and I can assure you, performance and quality DOES differ between driver versions.</p><p>I cannot give any recent examples since I'm no longer in the business, but I've seen drivers that gave up to 33 FPS more in newly released games in comparison with previous driver versions. That said, those gains were sometimes fueled by quality drops, but not always.</p></htmltext>
<tokenext>I used to moonlight as a hardware reviewer specialized in graphics cards , and I can assure you , performance and quality DOES differ between driver versions.I can not give any recent examples since I 'm not longer in the business , but i 've seen drivers that gave up to 33 FPS more in newly released games in comparison with preview driver versions .
That said , those gains were sometimes fueled by quality drops , but not always .</tokentext>
<sentencetext>I used to moonlight as a hardware reviewer specialized in graphics cards, and I can assure you, performance and quality DOES differ between driver versions.I cannot give any recent examples since I'm not longer in the business, but i've seen drivers that gave up to 33 FPS more in newly released games in comparison with preview driver versions.
That said, those gains were sometimes fueled by quality drops, but not always.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369198</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370258</id>
	<title>WoW and realistic in the same sentence!</title>
	<author>carlhaagen</author>
	<datestamp>1267800840000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>That's very odd. Also odd is that from the article it seems that the overheating has to do with how realistic the game looks; as if the card just KNOWS the content looks realistic, and suffers a spell of worry, feeling stressed about performing, and thus not managing to cope. Oh, the poor GPUs, they deserve better. Spread the love.</htmltext>
<tokenext>That 's very odd .
Also odd is that from the article it seems that the overheating has to do with how realistic the game looks ; as if the card just KNOWS the content looks realistic , and suffers a spell of worry , feeling stressed about performing , and thus not managing to cope .
Oh , the poor GPUs , they deserve better .
Spread the love .</tokentext>
<sentencetext>That's very odd.
Also odd is that from the article it seems that the overheating has to do with how realistic the game looks; as if the card just KNOWS the content looks realistic, and suffers a spell of worry, feeling stressed about performing, and thus not managing to cope.
Oh, the poor GPUs, they deserve better.
Spread the love.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369842</id>
	<title>196.21 had issues as well</title>
	<author>Anonymous</author>
	<datestamp>1267797360000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>I had to revert back to the 195.62 driver because 196.21 was causing my system to randomly lock up, even more so when I was playing games such as Star Trek Online. Boy, am I glad I didn't see the newer one. I will tell you this, however: these last two driver revs from Nvidia are sure starting to make me look more closely at ATI again.</p></htmltext>
<tokenext>I had to revert back to 195.62 driver because the 196.21 was causing my system to randomly lock up , even more so when i was playing games such as Star Trek Online .
Boy am I Glad I did n't see the newer one .
I will tell you this however , these last to driver revs from Nvidia are sure starting to make look more closely at ATI again .</tokentext>
<sentencetext>I had to revert back to 195.62 driver because the 196.21 was causing my system to randomly lock up, even more so when i was playing games such as Star Trek Online.
Boy am I Glad I didn't see the newer one.
I will tell you this however, these last to driver revs from Nvidia are sure starting to make look more closely at ATI again.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371682</id>
	<title>Re:Terrible design</title>
	<author>Nemyst</author>
	<datestamp>1267807920000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext>This is why you don't buy directly from Nvidia (or ATI, for that matter). If you get a BFG or XFX or EVGA card (which often run for about the same price as vanilla cards), you also get a lifetime warranty to protect against exactly this kind of trouble. A friend of mine got his EVGA GPU fried (8800GTX) and they replaced it within 5 days with a GTX 260, for free. No, it's not normal that the drivers can allow that, but shit happens. It'll get fixed quickly, I hope, but you should always give yourself some protection on top of that.</htmltext>
<tokenext>This is why you do n't buy directly from Nvidia ( or ATI for that matters ) .
If you get a BFG or XFX or EVGA card ( which often run for about the same price as vanilla cards ) , you also get lifetime warranty to protect against exactly this kind of trouble .
A friend of mine got his EVGA GPU fried ( 8800GTX ) and they replaced it within 5 days with a GTX 260 , for free .
No , it 's not normal that the drivers can allow that , but shit happens .
It 'll get fixed quickly I hope , but you should always give yourself some protection on top of that .</tokentext>
<sentencetext>This is why you don't buy directly from Nvidia (or ATI for that matters).
If you get a BFG or XFX or EVGA card (which often run for about the same price as vanilla cards), you also get lifetime warranty to protect against exactly this kind of trouble.
A friend of mine got his EVGA GPU fried (8800GTX) and they replaced it within 5 days with a GTX 260, for free.
No, it's not normal that the drivers can allow that, but shit happens.
It'll get fixed quickly I hope, but you should always give yourself some protection on top of that.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369260</id>
	<title>You Can't See Inside!</title>
	<author>Anonymous</author>
	<datestamp>1267790700000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>that's what you get for using proprietary software!</p></htmltext>
<tokenext>that 's what you get for using proprietary software !</tokentext>
<sentencetext>that's what you get for using proprietary software!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370082</id>
	<title>Re:Processor damage, really?</title>
	<author>wisnoskij</author>
	<datestamp>1267799520000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I have had a video card overheat and break my motherboard.</p><p>I am not sure about the technical side but I imagine that the motherboard was not designed to run at extreme temperatures.</p></htmltext>
<tokenext>I have had a video card overheat and break my motherboard.I am not sure about the technical side but I imagine that the motherboard was not designed to run at extreme temperatures .</tokentext>
<sentencetext>I have had a video card overheat and break my motherboard.I am not sure about the technical side but I imagine that the motherboard was not designed to run at extreme temperatures.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31390026</id>
	<title>9600 GT goes bye-bye</title>
	<author>Derpnooner</author>
	<datestamp>1267978500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>This driver update is probably why my 9600 GT popped 2 caps a couple of weeks ago.  How lame is this?  My card is no longer under warranty and it popped due to badly written/tested drivers.  Oh well, I went out and bought an ATI 5 series.  DirectX 11 and AvP is a pretty good combo.  Thanks Nvidia!</htmltext>
<tokenext>This driver update is probably why my 9600 GT popped 2 caps a couple of weeks ago .
How lame is this ?
My card is no longer under warranty and it popped due to badly written/tested drivers .
Oh well , I went out and bought an ATI 5 series .
DirectX 11 and AvP is a pretty good combo .
Thanks Nvidia !</tokentext>
<sentencetext>This driver update is probably why my 9600 GT popped 2 caps a couple of weeks ago.
How lame is this?
My card is no longer under warranty and it popped due to badly written/tested drivers.
Oh well, I went out and bought an ATI 5 series.
DirectX 11 and AvP is a pretty good combo.
Thanks Nvidia!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370420</id>
	<title>Re:Terrible design</title>
	<author>syousef</author>
	<datestamp>1267801620000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><div class="quote"><p>Software should not be able to destroy hardware, <i>period</i></p></div><p>Good luck with that, since software controls the hardware. Whether it's in the BIOS or in drivers, software that operates hardware is going to be able to fry it if written poorly.</p><div class="quote"><p>The GPU's cooling system should be designed to safely operate for sustained periods at peak load --- anything less is artificially crippling the hardware and leads to both security and reliability problems.</p></div><p>Yes, that's why they built a fan or heat sink into the graphics card.</p><div class="quote"><p>Great job, NVIDIA: now, malware can not only destroy your files, but destroy your expensive graphics card as well.</p></div><p>You must be new to computing, because the ability for a virus to destroy hardware is not new. The only reason it's not done more often is that there's no money or glory to be made in such asshole behaviour. So instead viruses focus on stealing bank account details.</p>
	</htmltext>
<tokenext>Software should not be able to destroy hardware , period Good luck with that since software controls the hardware .
Whether it 's in bios or drivers , software that operates hardware is going to be able to fry it if written poorly.The GPU 's cooling system should be designed to safety operate for sustained periods at peak load --- anything less is artificially crippling the hardware and leads to both security and reliability problems.Yes , that 's why they built a fan or heat sync into the graphics card.Great job , NVIDIA : now , malware can not only destroy your files , but destroy your expensive graphics card as well.You must be new to computing because the ability for a virus to destroy hardware is not new .
The only reason it 's not done more often is that there 's no money or glory to be made in such asshole behaviour .
So instead viruses focus on stealing bank account details .</tokentext>
<sentencetext>Software should not be able to destroy hardware, period Good luck with that since software controls the hardware.
Whether it's in bios or drivers, software that operates hardware is going to be able to fry it if written poorly.The GPU's cooling system should be designed to safety operate for sustained periods at peak load --- anything less is artificially crippling the hardware and leads to both security and reliability problems.Yes, that's why they built a fan or heat sync into the graphics card.Great job, NVIDIA: now, malware can not only destroy your files, but destroy your expensive graphics card as well.You must be new to computing because the ability for a virus to destroy hardware is not new.
The only reason it's not done more often is that there's no money or glory to be made in such asshole behaviour.
So instead viruses focus on stealing bank account details.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369628</id>
<title>nVidia and the dreaded nv4_disp.dll bug</title>
	<author>CuteSteveJobs</author>
	<datestamp>1267795380000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext><p>Don't expect it fixed... ever! In 2005 I bought a "top end" Nvidia card that worked fine most of the time, but occasionally it would go through fits where it threw up a BSOD announcing an infinite loop was detected in the display driver nv4_disp.dll.</p><p>Many reported it to nVidia - me included - but they ignored everyone through every avenue. The bug stayed there through releases of new generations of nVidia cards, and Google shows people still finding the bug and trying to "fix" it to this day.</p><p>I can only presume nVidia knew about it, but the problem would have required a card recall. So they just ignored it and kept selling the buggy cards. Many solutions were suggested by users, posted and tried, but none worked. No solutions ever came from nVidia, who wouldn't say a word on the issue. Their FAQ fobbed you off to the OEM, who of course had no clue. Last time I checked you couldn't even submit a bug report through their site. They may be successful, but they have the worst tech support ever. Don't expect a fix. In the end I tossed the card.</p><p><a href="http://www.google.com/search?q=nvdisp+4+nvidia+bsod" title="google.com">http://www.google.com/search?q=nvdisp+4+nvidia+bsod</a> [google.com]</p></htmltext>
<tokenext>Do n't expect it fixed... ever ! In 2005 bought a " top end " Nvidia card that worked fine most of the time , but occasionally it would go through fits where it threw up a BSOD announcing an infinite loop was detected in the display driver nv4 \ _disp.dll.Many reported it to nVidia - me included - but they ignored everyone through every avenue .
The bug stayed there through releases of new generation nVidia cards , and Google shows people still finding the bug and trying to " fix " it to this day.I can only presume nVidia knew about it , but the problem would have required a card recall .
So they just ignored it and kept selling the buggy cards .
Many solutions were suggested by users , posted and tried , but none worked .
No solutions ever came from nVidia , who would n't say a word on the issue .
Their FAQ fobbed you off to the OEM who of course had no clue .
Last time I checked you could n't even submit a bug report through their site .
They may be successful , but they have the worst tech support ever .
Do n't expect a fix .
In the end I tossed the card.http : //www.google.com/search ? q = nvdisp + 4 + nvidia + bsod [ google.com ]</tokentext>
<sentencetext>Don't expect it fixed... ever! In 2005 bought a "top end" Nvidia card that worked fine most of the time, but occasionally it would go through fits where it threw up a BSOD  announcing an infinite loop was detected in the display driver nv4\_disp.dll.Many reported it to nVidia - me included - but they ignored everyone through every avenue.
The bug stayed there through releases of new generation nVidia cards, and Google shows people still finding the bug and trying to "fix" it to this day.I can only presume nVidia knew about it, but the problem would have required a card recall.
So they just ignored it and kept selling the buggy cards.
Many solutions were suggested by users, posted and tried, but none worked.
No solutions ever came from nVidia, who wouldn't say a word on the issue.
Their FAQ fobbed you off to the OEM who of course had no clue.
Last time I checked you couldn't even submit a bug report through their site.
They may be successful, but they have the worst tech support ever.
Don't expect a fix.
In the end I tossed the card.http://www.google.com/search?q=nvdisp+4+nvidia+bsod [google.com]</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369104</id>
	<title>Crappy Nvidia driver has multiple issues</title>
	<author>Anonymous</author>
	<datestamp>1267788780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Apart from the fan problem, is this version more stable?  The last version causes my laptop to crash every few minutes, making it unusable, so I have to run the VESA driver.</htmltext>
<tokenext>Apart from the fan problem , is this version more stable ?
The last version causes my laptop to crash every few minutes , making it unusable , so I have to run the VESA driver .</tokentext>
<sentencetext>Apart from the fan problem, is this version more stable?
The last version causes my laptop to crash every few minutes, making it unusable, so I have to run the VESA driver.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369304</id>
	<title>A little more info from the story</title>
	<author>L4t3r4lu5</author>
	<datestamp>1267791300000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>4</modscore>
	<htmltext>The EVGA tool has been used to manually set fan speed to 77% to compensate. I see no reason for other low-level customisation tools (RivaTuner etc) to not behave in the same way.<br> <br>If you get a performance boost from this new driver, download RivaTuner or a similar tool and manually set the fan speed for gaming.</htmltext>
<tokenext>The EVGA tool has been used to manually set fan speed to 77 % to compensate .
I see no reason for other low-level customisation tools ( RivaTuner etc ) to not behave in the same way .
If you get a performance boost from this new driver , download RivaTuner or a similar tool and manually set the fan speed for gaming .</tokentext>
<sentencetext>The EVGA tool has been used to manually set fan speed to 77% to compensate.
I see no reason for other low-level customisation tools (RivaTuner etc) to not behave in the same way.
If you get a performance boost from this new driver, download RivaTuner or a similar tool and manually set the fan speed for gaming.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369404</id>
	<title>Re:If it ain't broke..</title>
	<author>L4t3r4lu5</author>
	<datestamp>1267792620000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext>The shadows implemented with v3 crippled WoW graphics performance. I have a C2Q Q6600@2.8GHz, 4GB DDRII RAM, 8800gtx running everything at max settings except shadows (blob only), 1920x1200 with min 60fps. If I turn shadows up one level I get 40 fps; full shadows bring the thing to a crawl even in open areas like The Shimmering Flats.<br> <br>I can easily see the gfx being a bottleneck with the shadows up, but other than that I agree. Loading the other players in Dala is horrid.</htmltext>
<tokenext>The shadows implemented with v3 crippled WoW graphics performance .
I have an C2Q Q6600 @ 2.8GHz , 4GB DDRII RAM , 8800gtx running everything at max settings except shadows ( blob only ) , 1920x1200 with min 60fps .
If I turn shadows up one level I get 40 fps , full shadows bring the thing to a crawl even in open areas like The Shimmering Flats .
I can easily see the gfx being a bottleneck with the shadows up , but other than that I agree .
Loading the other players in Dala is horrid .</tokentext>
<sentencetext>The shadows implemented with v3 crippled WoW graphics performance.
I have a C2Q Q6600@2.8GHz, 4GB DDRII RAM, 8800gtx running everything at max settings except shadows (blob only), 1920x1200 with min 60fps.
If I turn shadows up one level I get 40 fps; full shadows bring the thing to a crawl even in open areas like The Shimmering Flats.
I can easily see the gfx being a bottleneck with the shadows up, but other than that I agree.
Loading the other players in Dala is horrid.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369198</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31372800</id>
	<title>So...</title>
	<author>Moheeheeko</author>
	<datestamp>1267813380000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>...business as usual at NVIDIA?</htmltext>
<tokenext>...business as usual at NVIDIA ?</tokentext>
<sentencetext>...business as usual at NVIDIA?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31378806</id>
	<title>Re:nVidia and the dreaded nv4_disp.dll bug</title>
	<author>kalirion</author>
	<datestamp>1267811400000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>My 8800GT was getting that, but mainly in older games (Descent 3, Gothic, Recoil, Neverwinter Nights) and just a few of the newer ones (Titan Quest: IT).  The integrated Nvidia 7100 ran those perfectly with the exact same drivers.</p></htmltext>
<tokenext>My 8800GT was getting that , but mainly in older games ( Descent 3 , Gothic , Recoil , Neverwinter Nights ) and just a few of the newer ones ( Titan Quest : IT ) .
The integrated Nvidia 7100 ran those perfect with the exact same drivers .</tokentext>
<sentencetext>My 8800GT was getting that, but mainly in older games (Descent 3, Gothic, Recoil, Neverwinter Nights) and just a few of the newer ones (Titan Quest: IT).
The integrated Nvidia 7100 ran those perfectly with the exact same drivers.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369628</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31389278</id>
	<title>nVIDIA Overheating</title>
	<author>Chili-71</author>
	<datestamp>1267974120000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I have had a problem for years with nVIDIA cards overheating. So, I have the profile set up to run the fans at max speed. Makes the system a little noisy, but then again, when a grenade goes off in your face, the fan noise is not an issue.</htmltext>
<tokenext>I have had a problem for years with nVIDIA cards overheating .
So , I have the profile set up to run the fans at max speed .
Makes the system a little noisy , but then again , when a grenade goes off in your face , the fan noise is not an issue .</tokentext>
<sentencetext>I have had a problem for years with nVIDIA cards overheating.
So, I have the profile set up to run the fans at max speed.
Makes the system a little noisy, but then again, when a grenade goes off in your face, the fan noise is not an issue.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371998</id>
	<title>Re:Terrible design</title>
	<author>bjwest</author>
	<datestamp>1267809420000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>  While drivers are in control in the case we have here with nVidia, I see the drivers as part of the hardware since they were released by the manufacturer.</p> </div><p>Hardware consists of PROMs, logic gates and discrete components on the cards; the drivers, however, are software that runs on the computer and/or changes data on EEPROMs on the card.  If it can be changed by non-physical means, it's software.  Using CDs as an example, data on CD-Rs, once written, cannot be changed (hardware), whereas data on CD-RWs can be (software).</p></div>
	</htmltext>
<tokenext>While drivers are in control in the case we have here with nVidia , I see the drivers as part of the hardware since they were released by the manufacturer .
Hardware are PROMs , logic gates and discrete components on the cards , the drivers however are software that runs on the computer and/or changes data on EEPROMs on the card .
If it can be changed in a non physical means , it 's software .
Using CD 's as an example , Data on CD-R 's , once written , can not be changed ( hardware ) where as data on CD-RW 's can be ( software ) .</tokentext>
<sentencetext>  While drivers are in control in the case we have here with nVidia, I see the drivers as part of the hardware since they were released by the manufacturer.
Hardware consists of PROMs, logic gates and discrete components on the cards; the drivers, however, are software that runs on the computer and/or changes data on EEPROMs on the card.
If it can be changed by non-physical means, it's software.
Using CDs as an example, data on CD-Rs, once written, cannot be changed (hardware), whereas data on CD-RWs can be (software).
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370106</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369282</id>
	<title>Nvidia driver causing overheating? Oh really.</title>
	<author>Anonymous</author>
	<datestamp>1267791060000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I'm no fan of Nvidia or ATI, but I have to question Blizzard and their programmers and beta testers on this. Leaving WoW sitting at the title or login screen has caused overheating in my 8800GTs for over a year now, no matter what drivers I've used. I've noticed temperatures reaching 105-110 Celsius in under 2 minutes flat as recently as last week when I made the mistake of letting WoW sit at the login screen. This only ever occurs in said game and accompanying areas. Similarly, my laptop's ATI 3650 tends to jump to 75+ Celsius in said areas of WoW. Pardon me, but I'm a little skeptical about Nvidia's drivers being the ultimate source of the problem.</p></htmltext>
<tokenext>I 'm no fan of Nvidia or ATI , but I have to question Blizzard and their programmers and beta testers on this .
Leaving WoW sitting at the title or login screen has caused overheating in my 8800GT 's for over a year now , no matter what drivers I 've used .
I 've noticed temperatures reaching 105-110 Celsius in under 2 minutes flat as recently as last week when I made the mistake of letting WoW sit at the login screen .
This only ever occurs in said game and accompanying areas .
Similarly , my laptop 's ATI 3650 tends to jump to 75 + Celsius in said areas of WoW .
Pardon me , but I 'm a little skeptical about Nvidia 's drivers being the ultimate source of the problem .</tokentext>
<sentencetext>I'm no fan of Nvidia or ATI, but I have to question Blizzard and their programmers and beta testers on this.
Leaving WoW sitting at the title or login screen has caused overheating in my 8800GTs for over a year now, no matter what drivers I've used.
I've noticed temperatures reaching 105-110 Celsius in under 2 minutes flat as recently as last week when I made the mistake of letting WoW sit at the login screen.
This only ever occurs in said game and accompanying areas.
Similarly, my laptop's ATI 3650 tends to jump to 75+ Celsius in said areas of WoW.
Pardon me, but I'm a little skeptical about Nvidia's drivers being the ultimate source of the problem.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369078</id>
	<title>Glad it didn't fry mine.</title>
	<author>Anonymous</author>
	<datestamp>1267788480000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>4</modscore>
	<htmltext>Oddly enough, I played World of Warcraft and Fallout 3 quite a bit since upgrading to these drivers, and my performance has been much better than the previous Win7 x64 driver.  I hear the fan ramping up like it should, and the card hasn't gotten close to overheating.   Maybe it's only affecting certain models.   I have an 8800ultra.</htmltext>
<tokenext>Oddly enough , I played World of Warcraft and Fallout 3 quite a bit since upgrading to these drivers , and my performance has been much better than the previous win7 64x driver .
I hear the fan ramping up like it should , and the card has n't gotten close to overheating .
Maybe it 's only affecting certain models .
I have an 8800ultra .</tokentext>
<sentencetext>Oddly enough, I played World of Warcraft and Fallout 3 quite a bit since upgrading to these drivers, and my performance has been much better than the previous Win7 x64 driver.
I hear the fan ramping up like it should, and the card hasn't gotten close to overheating.
Maybe it's only affecting certain models.
I have an 8800ultra.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31376676</id>
	<title>Re:Terrible design</title>
	<author>Brianwa</author>
	<datestamp>1267790040000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Reminds me of my laptop.  The fan speed is software controlled.  There's some sort of fail safe built in -- if the speed control software doesn't initialize or stops responding completely for around 10 seconds, the fan will spin up to maximum speed and stay there.  Once I was having some software problems (a nasty rootkit broke a lot of stuff) and the fan speed control process was crashing every few hours.  One time, the fail safe didn't kick in.  Unfortunately the CPU load was light, so it sat there running at incredibly high temperatures for half an hour rather than shutting down.  Luckily the laptop still works, but that probably took a chunk out of its lifespan. :/</htmltext>
<tokenext>Reminds me of my laptop .
The fan speed is software controlled .
There 's some sort of fail safe built in -- if the speed control software does n't initialize or stops responding completely for around 10 seconds , the fan will spin up to maximum speed and stay there .
Once I was having some software problems ( a nasty rootkit broke a lot of stuff ) and the fan speed control process was crashing every few hours .
One time , the fail safe did n't kick in .
Unfortunately the CPU load was light , so it sat there running at incredibly high temperatures for half an hour rather than shutting down .
Luckily the laptop still works , but that probably took a chunk out of its lifespan .
: /</tokentext>
<sentencetext>Reminds me of my laptop.
The fan speed is software controlled.
There's some sort of fail safe built in -- if the speed control software doesn't initialize or stops responding completely for around 10 seconds, the fan will spin up to maximum speed and stay there.
Once I was having some software problems (a nasty rootkit broke a lot of stuff) and the fan speed control process was crashing every few hours.
One time, the fail safe didn't kick in.
Unfortunately the CPU load was light, so it sat there running at incredibly high temperatures for half an hour rather than shutting down.
Luckily the laptop still works, but that probably took a chunk out of its lifespan.
:/</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369808</id>
	<title>my 8800 ultra died a few days ago</title>
	<author>Anonymous</author>
	<datestamp>1267797120000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>And I installed the new drivers a week before that. Coincidence possibly... but funnily enough the gfx died after playing a long stint of Aion and the PC locked up (like a heat related crash); rebooted, and blue lines were going across the screen in BIOS. :(</p><p>Oh well, I guess NVIDIA could think maybe they could up their sales on cards because people needed to replace their old ones... well guess what, NVIDIA! I bought ATI and I'm not looking back. /hugs new 5870</p></htmltext>
<tokenext>And I installed the new drivers a week before that .
Coincidence possibly.. but funny enough the gfx died after playing a long stint of Aion and PC locked up ( like a heat related crash ) rebooted and blue lines were going across the screen in bios .
: ( Oh well I guess NVIDIA could think maybe they could up their sales on cards because people needed to replace their old ones... well guess what NVIDIA !
I bought ATI and I 'm not looking back .
/hugs new 5870</tokentext>
<sentencetext>And I installed the new drivers a week before that.
Coincidence possibly... but funnily enough the gfx died after playing a long stint of Aion and the PC locked up (like a heat related crash); rebooted, and blue lines were going across the screen in BIOS.
:( Oh well, I guess NVIDIA could think maybe they could up their sales on cards because people needed to replace their old ones... well guess what, NVIDIA!
I bought ATI and I'm not looking back.
/hugs new 5870</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369312</id>
	<title>Re:Processor damage, really?</title>
	<author>Anonymous</author>
	<datestamp>1267791360000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Strange that an EE does not understand something every overclocker does. I guess education can't make up for experience.</p></htmltext>
<tokenext>Strange an EE does not understand something every overclocker does .
I guess education ca n't make up for experience .</tokentext>
<sentencetext>Strange an EE does not understand something every overclocker does.
I guess education can't make up for experience.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369820</id>
	<title>Far Cry 3</title>
	<author>Karem Lore</author>
	<datestamp>1267797180000</datestamp>
	<modclass>Informative</modclass>
	<modscore>4</modscore>
	<htmltext><p>Hi,</p><p>Please do tell where I can get Far Cry 3... Unless bittorrent has seriously moved into time travel of course...</p></htmltext>
<tokenext>Hi , Please do tell where I can get Far Cry 3 ... Unless bittorrent has seriously moved into time travel of course ...</tokentext>
<sentencetext>Hi, please do tell where I can get Far Cry 3... Unless bittorrent has seriously moved into time travel of course...</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31372768</id>
	<title>Re:Terrible design</title>
	<author>SwabTheDeck</author>
	<datestamp>1267813260000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>Software should not be able to destroy hardware, <i>period</i>. The GPU's cooling system should be designed to safely operate for sustained periods at peak load --- anything less is artificially crippling the hardware and leads to both security and reliability problems.</p><p>Great job, NVIDIA: now, malware can not only destroy your files, but destroy your expensive graphics card as well.</p></div><p>This shouldn't be surprising to anyone.  Software (or firmware, if you want to make the distinction) has been used to control fans on GPUs, CPUs, northbridges and plenty of other components for many, many years.  I think people don't think about the alternative: putting hardware exclusively in charge of fan control.  If you choose the hardware method, there is just as much chance of it becoming fucked up due to lack of testing, poor design choices, etc.  However, if you ship a million units with faulty hardware, that means you have a million broken units and there is no choice but to recall/replace them.  If you use software and you fuck it up, you can patch it, saving your customers the time and hassle of having to return their product or being stuck without a solution for an extended period of time, and at the same time saving your company from potentially devastating financial losses.  Frankly, I think all parties benefit with a software solution.</p></div>
	</htmltext>
<tokenext>Software should not be able to destroy hardware , period .
The GPU 's cooling system should be designed to safety operate for sustained periods at peak load --- anything less is artificially crippling the hardware and leads to both security and reliability problems.Great job , NVIDIA : now , malware can not only destroy your files , but destroy your expensive graphics card as well.This should n't be surprising to anyone .
Software ( or firmware , if you want to make the distinction ) has been used to control fans on GPUs , CPUs , northbridges and plenty of other components for many , many years .
I think people do n't think about the alternative : putting hardware exclusively in charge of fan control .
If you choose the hardware method , there is just as much chance of it becoming fucked up due to lack of testing , poor design choices , etc .
However , if you ship a million units with faulty hardware , that means you have a million broken units and there is no choice but to recall/replace them .
If you use software and you fuck it up , you can patch it , saving your customers the time and hassle of having to return their product or being stuck without a solution for an extended period of time , and at the same time saving your company from potentially devastating financial losses .
Frankly , I think all parties benefit with a software solution .</tokentext>
<sentencetext>Software should not be able to destroy hardware, period.
The GPU's cooling system should be designed to safely operate for sustained periods at peak load --- anything less is artificially crippling the hardware and leads to both security and reliability problems. Great job, NVIDIA: now, malware can not only destroy your files, but destroy your expensive graphics card as well. This shouldn't be surprising to anyone.
Software (or firmware, if you want to make the distinction) has been used to control fans on GPUs, CPUs, northbridges and plenty of other components for many, many years.
I think people don't think about the alternative: putting hardware exclusively in charge of fan control.
If you choose the hardware method, there is just as much chance of it becoming fucked up due to lack of testing, poor design choices, etc.
However, if you ship a million units with faulty hardware, that means you have a million broken units and there is no choice but to recall/replace them.
If you use software and you fuck it up, you can patch it, saving your customers the time and hassle of having to return their product or being stuck without a solution for an extended period of time, and at the same time saving your company from potentially devastating financial losses.
Frankly, I think all parties benefit with a software solution.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370554</id>
	<title>Silence</title>
	<author>DrYak</author>
	<datestamp>1267802340000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>Software should not be able to destroy hardware, <i>period</i>. The GPU's cooling system should be designed to safely operate for sustained periods at peak load</p></div><p>And that's certainly the strategy in the corporate* world (for servers, for example).</p><p>On the other hand, some other people, the kind who only occasionally play games and use their computer most of the time for office-type work (i.e.: non graphically intensive tasks), would appreciate not having to endure the sound of an Airbus A380's takeoff coming out of their computer case every single moment during which the computer is on.</p><p>Thus the fans aren't working at constant speed, but are varying their speed to constantly find the perfect balance between silence and avoiding the card catching fire under the load.<br>Thus you have a small chip controlling the fan. Of course to simplify QA, in-field bug fixing, etc. this small chip has a small firmware. (Just imagine a non-programmable chip controlling the fans, and the same bug. Every single card produced with the bug has to be called back and replaced - a logistic nightmare.)<br>This firmware is set up by the drivers.<br>Buggy drivers *could* damage the hardware by setting the fan too low. And all that because the end user *wants* a fan that slows down when it's not necessary, to have some silence.</p><p>The only thing that could have been done is adding a safeguard which fires a software alarm and either shuts down or massively underclocks the 3D core in case a temperature threshold is crossed. (That's how it's done to protect the CPU in case of a faulty fan.)</p><p>*: And then, there's the question of wear of mechanical parts like fans - in server land, it's more a question of balance between mechanical wear of the fans and the server surviving a /. effect (and the wear of the computer due to thermal expansion).</p></div>
	</htmltext>
<tokenext>Software should not be able to destroy hardware , period .
The GPU 's cooling system should be designed to safety operate for sustained periods at peak loadAnd that 's certainly the strategy in the corporate * world ( for servers , for example ) .On the other hand , some other people , the kind who only occasionally play games and use their computer most of the time for office-type work ( ie .
: non graphically intesive tasks ) , would appreciate not having to endure the sound of an Airbus A380 's takeoff coming out of their computer case every single moment during which the computer is on.Thus the fan are n't working at constant speed , but are varying their speed to constantly find the perfect balance between silence and avoiding the card catching fire under the load.Thus you have a small chip controlling the fan .
Of course to simplify Q/A , in field bug fixing , etc .
this small chip has a small firmware .
( Just imagine a non programmable chip controlling the fans , and the same bug .
Every single card produced with the bug has to be called backed and replaced - a logistic night mare ) .This firmware is setup by the drivers.A buggy drivers * could * damage the hardware by setting the fan too low .
And all that because the end user * wants * a fan that slows down when it 's not necessary to have some silence.The only thing that could have been done , is adding a safe guard which fires a software alarm and either shuts down or massively underclocks the 3D core in case a temperature threshold is crossed .
( That 's how it 's done to protect CPU in case of faulty fan ) .
* : And then , there 's the question of wear of mechanical parts like fans - in server land , it 's more a quesiton of balance between mechanical wear of the fans and the server surviving a / .
effect ( and the wear of the computer due to thermal expansion ) .</tokentext>
<sentencetext>Software should not be able to destroy hardware, period.
The GPU's cooling system should be designed to safely operate for sustained periods at peak load.
And that's certainly the strategy in the corporate* world (for servers, for example).
On the other hand, some other people, the kind who only occasionally play games and use their computer most of the time for office-type work (i.e.: non graphically intensive tasks), would appreciate not having to endure the sound of an Airbus A380's takeoff coming out of their computer case every single moment during which the computer is on.
Thus the fans aren't working at constant speed, but are varying their speed to constantly find the perfect balance between silence and avoiding the card catching fire under the load.
Thus you have a small chip controlling the fan.
Of course to simplify QA, in-field bug fixing, etc. this small chip has a small firmware.
(Just imagine a non-programmable chip controlling the fans, and the same bug.
Every single card produced with the bug has to be called back and replaced - a logistic nightmare.)
This firmware is set up by the drivers.
Buggy drivers *could* damage the hardware by setting the fan too low.
And all that because the end user *wants* a fan that slows down when it's not necessary, to have some silence.
The only thing that could have been done is adding a safeguard which fires a software alarm and either shuts down or massively underclocks the 3D core in case a temperature threshold is crossed.
(That's how it's done to protect the CPU in case of a faulty fan.)
*: And then, there's the question of wear of mechanical parts like fans - in server land, it's more a question of balance between mechanical wear of the fans and the server surviving a /. effect (and the wear of the computer due to thermal expansion).
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369888</id>
	<title>Re:Terrible design</title>
	<author>Anonymous</author>
	<datestamp>1267797780000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>What's the difference between hardware and software?  I'm not being ridiculous, it's a genuine if somewhat rhetorical question.  Graphics cards have firmware and microcode; clock frequencies can be changed via software; etc.  When you get down to the bare bones of the system, the distinction between hardware and software becomes blurred, and the clear separation between the two is not so obvious.</p></htmltext>
<tokenext>What 's the difference between hardware and software ?
I 'm not being ridiculous , it 's a genuine if somewhat rhetorical question .
Graphics cards have firmware and microcode , clock frequencies can be changed via software , etc.. When you get down to the bare bones of the system , the distinction between hardware and software becomes blurred , and the clear separation between the two is not so obvious .</tokentext>
<sentencetext>What's the difference between hardware and software?
I'm not being ridiculous, it's a genuine if somewhat rhetorical question.
Graphics cards have firmware and microcode; clock frequencies can be changed via software; etc.  When you get down to the bare bones of the system, the distinction between hardware and software becomes blurred, and the clear separation between the two is not so obvious.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369694</id>
	<title>Re:Processor damage, really?</title>
	<author>databyss</author>
	<datestamp>1267795860000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The excessive heat can overwhelm the standard cooling system on a PC.</p><p>As an EE, I'm sure you're well aware that heat has negative effects on CPUs and other electrical components.</p></htmltext>
<tokenext>The excessive heat can overwhelm the standard cooling system on a PC.As an EE , I 'm sure you 're well aware that heat has negative effects on CPU 's and other electrical components .</tokentext>
<sentencetext>The excessive heat can overwhelm the standard cooling system on a PC. As an EE, I'm sure you're well aware that heat has negative effects on CPUs and other electrical components.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369370</id>
	<title>Re:Processor damage, really?</title>
	<author>yacc143</author>
	<datestamp>1267792080000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Well, I'm not an EE, but I seem to remember from university that changing temperatures can lead to changes in voltage/current. Then you've got the extreme case of a short circuit.</p><p>So I think it's quite possible to have motherboard damage, e.g. the GPU takes more power than is good for the MB, and the MB dies. Slowly or quickly, depending upon how extreme the effect is.</p><p>As an official example, see GPUs that have a separate power connection, where the documentation explicitly states that the GPU and/or the motherboard can die if it's left unconnected, because the GPU will overuse the PCIe-provided power.<br>So obviously, something that the GPU does can affect the MB and other parts of the system, phrased nicely by the manufacturers.</p></htmltext>
<tokenext>Well , I 'm not an EE , but I seem to remember from university , that changing temperatures can lead to changes in voltage/current .
Then you 've got the extreme case of a short circuit.So I think it 's quite possible to have motherboard damage , e.g .
GPU takes more power than is good for the MB , MB dies .
Slowly or quickly , depending upon how extreme the effect is.As an official example , see GPU that have a seperate power connection , where the documentation explicitly states that the GPU and/or the motherboard can die if it 's left unconnected , because the GPU will overuse the PCIe provided power.So obviously , something that the GPU does can effect the MB and other parts of the system , phrased nicely by the manufacturers .</tokentext>
<sentencetext>Well, I'm not an EE, but I seem to remember from university, that changing temperatures can lead to changes in voltage/current.
Then you've got the extreme case of a short circuit.So I think it's quite possible to have motherboard damage, e.g.
GPU takes more power than is good for the MB, MB dies.
Slowly or quickly, depending upon how extreme the effect is.As an official example, see GPU that have a seperate power connection, where the documentation explicitly states that the GPU and/or the motherboard can die if it's left unconnected, because the GPU will overuse the PCIe provided power.So obviously, something that the GPU does can effect the MB and other parts of the system, phrased nicely by the manufacturers.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369238</id>
	<title>Re:Processor damage, really?</title>
	<author>Anonymous</author>
	<datestamp>1267790160000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>5</modscore>
	<htmltext>Laptops for example generally have the same heat pipe connected to the CPU and GPU. If one overheats, so can the other.</htmltext>
<tokenext>Laptops for example generally have the same heat pipe connected to the CPU and GPU .
If one overheats , so can the other .</tokentext>
<sentencetext>Laptops for example generally have the same heat pipe connected to the CPU and GPU.
If one overheats, so can the other.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31372602</id>
	<title>My NVIDIA Observation</title>
	<author>dlfretz</author>
	<datestamp>1267812480000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I do know that in my computer setup the 2 EVGA 275 FTW cards get cooled down unevenly at idle.  I am running the NVIDIA 196.75 driver under Vista 64 Ultimate with SLI and PhysX turned on.  The first card runs at 163 degrees Fahrenheit and the other card at 123 degrees Fahrenheit with both fans set at 100% speed via the NVIDIA control panel at idle.</p><p>I have caused my computer to lock up playing Mass Effect with the fans in automatic mode and running with the NVIDIA Real 3D feature.  The temperature was 192 degrees Fahrenheit on the hotter card before the lock up with Mass Effect.  However, World of Warcraft doesn't cause as much heat as other games.  I normally see both cards range around 163 degrees Fahrenheit with no problems at all.</p></htmltext>
<tokenext>I do know that in my computer setup the 2 EVGA 275 FTW cards get cooled down unevenly at idle .
I am running the NVIDIA 196.75 driver under Vista 64 Ultimate with SLI and PhysX turned on .
The first card runs at 163 degrees Fahrenheit and the other card at 123 degrees Fahrenheit with both fans set at 100 % speed via the NVIDIA control panel at idle . I have caused my computer to lock up playing Mass Effect with the fans in automatic mode and running with the NVIDIA Real 3D feature .
The temperature was 192 degrees Fahrenheit on the hotter card before the lock up with Mass Effect .
However , World of Warcraft does n't cause as much heat as other games .
I normally see both cards range around 163 degrees Fahrenheit with no problems at all .</tokentext>
<sentencetext>I do know that in my computer setup the 2 EVGA 275 FTW cards get cooled down unevenly at idle.
I am running the NVIDIA 196.75 driver under Vista 64 Ultimate with SLI and PhysX turned on.
The first card runs at 163 degrees Fahrenheit and the other card at 123 degrees Fahrenheit with both fans set at 100% speed via the NVIDIA control panel at idle. I have caused my computer to lock up playing Mass Effect with the fans in automatic mode and running with the NVIDIA Real 3D feature.
The temperature was 192 degrees Fahrenheit on the hotter card before the lock up with Mass Effect.
However, World of Warcraft doesn't cause as much heat as other games.
I normally see both cards range around 163 degrees Fahrenheit with no problems at all.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369228</id>
	<title>Re:Processor damage, really?</title>
	<author>theeddie55</author>
	<datestamp>1267790100000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>1</modscore>
	<htmltext>if it causes a short circuit, the feedback could easily blow the CPU, though in practice, any half decent power supply should cut out before that can happen.</htmltext>
<tokenext>if it causes a short circuit , the feedback could easily blow the CPU , though in practice , any half decent power supply should cut out before that can happen .</tokentext>
<sentencetext>if it causes a short circuit, the feedback could easily blow the CPU, though in practice, any half decent power supply should cut out before that can happen.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371970</id>
	<title>I doubt they've isolated the bug.</title>
	<author>MarkvW</author>
	<datestamp>1267809240000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I finally sidelined my (very expensive) NVIDIA card because it kept bsodding.  Damn nvlddmkm driver.  This is a long term problem for NVIDIA.  Check the web.</p><p>Don't know whether it's a software or hardware problem.  Card used to work.</p><p>Won't buy NVIDIA for a long time.</p></htmltext>
<tokenext>I finally sidelined my ( very expensive ) NVIDIA card because it kept bsodding .
Damn nvlddmkm driver .
This is a long term problem for NVIDIA .
Check the web . Do n't know whether it 's a software or hardware problem .
Card used to work . Wo n't buy NVIDIA for a long time .</tokentext>
<sentencetext>I finally sidelined my (very expensive) NVIDIA card because it kept bsodding.
Damn nvlddmkm driver.
This is a long term problem for NVIDIA.
Check the web. Don't know whether it's a software or hardware problem.
Card used to work. Won't buy NVIDIA for a long time.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370240</id>
	<title>Re:Terrible design</title>
	<author>maxwell demon</author>
	<datestamp>1267800660000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>I see the drivers as part of the hardware since they were released by the manufacturer.</p></div></blockquote><p>So Mac OS X is hardware, too, because it's released by the hardware manufacturer, i.e. Apple?</p></htmltext>
<tokenext>I see the drivers as part of the hardware since they were released by the manufacturer . So Mac OS X is hardware , too , because it 's released by the hardware manufacturer , i.e. Apple ?</tokentext>
<sentencetext>I see the drivers as part of the hardware since they were released by the manufacturer. So Mac OS X is hardware, too, because it's released by the hardware manufacturer, i.e. Apple?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370106</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31377526</id>
	<title>Re:Silence</title>
	<author>petermgreen</author>
	<datestamp>1267797900000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><i>The only thing that could have been done, is adding a safe guard which fires a software alarm and either shuts down or massively underclocks the 3D core in case a temperature threshold is crossed. (That's how it's done to protect CPU in case of faulty fan).</i><br>Back in the P4/Athlon XP era Tom's Hardware demonstrated ( <a href="http://www.youtube.com/watch?v=BSGcnRanYMM" title="youtube.com">http://www.youtube.com/watch?v=BSGcnRanYMM</a> [youtube.com] ) removing the heatsink from various chips while gaming; the P4 kept going, albeit at slideshow framerates, and recovered fine when the heatsink was reattached. Unfortunately I'm not aware of any similar tests since.</p><p>NVIDIA needs to realise (like AMD did after that video) that their overheat protection systems need failsafes implemented at as low a level as possible, so that even if the fan system fails the chip can't cook itself.</p></htmltext>
<tokenext>The only thing that could have been done , is adding a safe guard which fires a software alarm and either shuts down or massively underclocks the 3D core in case a temperature threshold is crossed .
( That 's how it 's done to protect CPU in case of faulty fan ) . Back in the P4/Athlon XP era Tom 's Hardware demonstrated ( http://www.youtube.com/watch?v=BSGcnRanYMM [ youtube.com ] ) removing the heatsink from various chips while gaming ; the P4 kept going , albeit at slideshow framerates , and recovered fine when the heatsink was reattached .
Unfortunately I 'm not aware of any similar tests since . NVIDIA needs to realise ( like AMD did after that video ) that their overheat protection systems need failsafes implemented at as low a level as possible , so that even if the fan system fails the chip ca n't cook itself .</tokentext>
<sentencetext>The only thing that could have been done, is adding a safe guard which fires a software alarm and either shuts down or massively underclocks the 3D core in case a temperature threshold is crossed.
(That's how it's done to protect CPU in case of faulty fan). Back in the P4/Athlon XP era Tom's Hardware demonstrated ( http://www.youtube.com/watch?v=BSGcnRanYMM [youtube.com] ) removing the heatsink from various chips while gaming; the P4 kept going, albeit at slideshow framerates, and recovered fine when the heatsink was reattached.
Unfortunately I'm not aware of any similar tests since. NVIDIA needs to realise (like AMD did after that video) that their overheat protection systems need failsafes implemented at as low a level as possible, so that even if the fan system fails the chip can't cook itself.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370554</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371974</id>
	<title>Re:nVidia and the dreaded nv4_disp.dll bug</title>
	<author>Anonymous</author>
	<datestamp>1267809240000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>it's 2010... if you're still getting BSODs you're doin it wrong. I haven't seen a BSOD in at least 5 years and I use computers pretty extensively.</p></htmltext>
<tokenext>it 's 2010... if you 're still getting BSODs you 're doin it wrong .
I have n't seen a BSOD in at least 5 years and I use computers pretty extensively .</tokentext>
<sentencetext>it's 2010... if you're still getting BSODs you're doin it wrong.
I haven't seen a BSOD in at least 5 years and I use computers pretty extensively.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369628</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31373632</id>
	<title>Re:Terrible design</title>
	<author>Anonymous</author>
	<datestamp>1267817280000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><div class="quote"><p>I see the drivers as part of the hardware since they were released by the manufacturer.</p></div><p>I see you as still being part of your mom because you were released by her.</p></htmltext>
<tokenext>I see the drivers as part of the hardware since they were released by the manufacturer . I see you as still being part of your mom because you were released by her .</tokentext>
<sentencetext>I see the drivers as part of the hardware since they were released by the manufacturer. I see you as still being part of your mom because you were released by her.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370106</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371118</id>
	<title>Re:Glad it didn't fry mine.</title>
	<author>gad_zuki!</author>
	<datestamp>1267805280000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>&gt;Oddly enough, I played World of Warcraft and Fallout 3 quite a bit since upgrading to these drivers, and my performance has been much better than the previous win7 64x driver.</p><p>If you read the release notes you'll see big performance gains on a lot of games from this driver. This is something I've never seen from Nvidia. Anyone have the details on what happened?  Maybe they found some new way to be efficient or found some long standing bug.</p></htmltext>
<tokenext>&gt; Oddly enough , I played World of Warcraft and Fallout 3 quite a bit since upgrading to these drivers , and my performance has been much better than the previous win7 64x driver . If you read the release notes you 'll see big performance gains on a lot of games from this driver .
This is something I 've never seen from Nvidia .
Anyone have the details on what happened ?
Maybe they found some new way to be efficient or found some long standing bug .</tokentext>
<sentencetext>&gt;Oddly enough, I played World of Warcraft and Fallout 3 quite a bit since upgrading to these drivers, and my performance has been much better than the previous win7 64x driver. If you read the release notes you'll see big performance gains on a lot of games from this driver.
This is something I've never seen from Nvidia.
Anyone have the details on what happened?
Maybe they found some new way to be efficient or found some long standing bug.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369078</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369110</id>
	<title>Convenience</title>
	<author>Anonymous</author>
	<datestamp>1267788780000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Playing wow while making eggs and bacon without leaving the PC?</p></htmltext>
<tokenext>Playing wow while making eggs and bacon without leaving the PC ?</tokentext>
<sentencetext>Playing wow while making eggs and bacon without leaving the PC?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370454</id>
	<title>Re:If it ain't broke..</title>
	<author>Blakey Rat</author>
	<datestamp>1267801860000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>More to the point, World of Warcraft has "realistic graphics?" Even if you ignore the art style, which is as far from realistic as you can get, the engine is something like 5-10 years older than all the other games listed there and quite frankly looks like ass.</p><p>I wish people would proofread before they publish an article that thousands will read.</p></htmltext>
<tokenext>More to the point , World of Warcraft has " realistic graphics ?
" Even if you ignore the art style , which is as far from realistic as you can get , the engine is something like 5-10 years older than all the other games listed there and quite frankly looks like ass . I wish people would proofread before they publish an article that thousands will read .</tokentext>
<sentencetext>More to the point, World of Warcraft has "realistic graphics?
" Even if you ignore the art style, which is as far from realistic as you can get, the engine is something like 5-10 years older than all the other games listed there and quite frankly looks like ass. I wish people would proofread before they publish an article that thousands will read.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369198</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369392</id>
	<title>Re:Processor damage, really?</title>
	<author>asdf7890</author>
	<datestamp>1267792500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>Wait a minute... just how is an overheating graphics card causing damage to a CPU?</p></div><p>Depends on the airflow in your case, but many are not well laid out in terms of airflow. If the card is pumping out more heat than usual and this isn't being drawn out correctly, it may build up in the case generally, reducing the ability of the CPU's HS+F to cool it properly. Similarly, if the heat build-up is sufficient for an appreciable amount of time (say, over the course of a long gaming session) you may also find drives and other components start failing due to overheating, though the CPU is the item most at risk from this collateral warming and would most likely be the first to go (rescuing other parts by failing first, hopefully stopping the heat build-up as no more heat-generating tasks will be run by it or given to the GPU by it) if the situation became extreme enough.</p><p>Another GPU-killing-CPU-by-heat scenario exists in liquid-cooled systems. If the GPU and CPU share coolant and the GPU heats up so much that it overwhelms the liquid cooling arrangement, there may not be a sufficient temperature gradient between the coolant and the CPU for the CPU to be usefully cooled.</p><p>I'd file both of these situations under "quite unlikely, but far from impossible".</p></htmltext>
<tokenext>Wait a minute... just how is an overheating graphics card causing damage to a CPU ? Depends on the airflow in your case , but many are not well laid out in terms of airflow .
If the card is pumping out more heat than usual and this is n't being drawn out correctly , it may build up in the case generally , reducing the ability of the CPU 's HS + F to cool it properly .
Similarly , if the heat build-up is sufficient for an appreciable amount of time ( say , over the course of a long gaming session ) you may also find drives and other components start failing due to overheating , though the CPU is the item most at risk from this collateral warming and would most likely be the first to go ( rescuing other parts by failing first , hopefully stopping the heat build-up as no more heat-generating tasks will be run by it or given to the GPU by it ) if the situation became extreme enough . Another GPU-killing-CPU-by-heat scenario exists in liquid-cooled systems .
If the GPU and CPU share coolant and the GPU heats up so much that it overwhelms the liquid cooling arrangement , there may not be a sufficient temperature gradient between the coolant and the CPU for the CPU to be usefully cooled . I 'd file both of these situations under " quite unlikely , but far from impossible " .</tokentext>
<sentencetext>Wait a minute... just how is an overheating graphics card causing damage to a CPU? Depends on the airflow in your case, but many are not well laid out in terms of airflow.
If the card is pumping out more heat than usual and this isn't being drawn out correctly, it may build up in the case generally, reducing the ability of the CPU's HS+F to cool it properly.
Similarly, if the heat build-up is sufficient for an appreciable amount of time (say, over the course of a long gaming session) you may also find drives and other components start failing due to overheating, though the CPU is the item most at risk from this collateral warming and would most likely be the first to go (rescuing other parts by failing first, hopefully stopping the heat build-up as no more heat-generating tasks will be run by it or given to the GPU by it) if the situation became extreme enough. Another GPU-killing-CPU-by-heat scenario exists in liquid-cooled systems.
If the GPU and CPU share coolant and the GPU heats up so much that it overwhelms the liquid cooling arrangement, there may not be a sufficient temperature gradient between the coolant and the CPU for the CPU to be usefully cooled. I'd file both of these situations under "quite unlikely, but far from impossible".</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371548</id>
	<title>These drivers were WHQL certified</title>
	<author>Anonymous</author>
	<datestamp>1267807320000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext>According to <a href="http://www.microsoft.com/whdc/winlogo/default.mspx" title="microsoft.com">Microsoft</a> [microsoft.com], <i>The Windows logo signifies the compatibility and reliability of systems and devices with Windows operating system. It gives customers confidence that your product is thoroughly tested with Microsoft-provided tools and ensures a good user experience.</i><br><br>Doesn't say much about their testing, does it?</htmltext>
<tokenext>According to Microsoft [ microsoft.com ] , The Windows logo signifies the compatibility and reliability of systems and devices with Windows operating system .
It gives customers confidence that your product is thoroughly tested with Microsoft-provided tools and ensures a good user experience .
Does n't say much about their testing , does it ?</tokentext>
<sentencetext>According to Microsoft [microsoft.com], The Windows logo signifies the compatibility and reliability of systems and devices with Windows operating system.
It gives customers confidence that your product is thoroughly tested with Microsoft-provided tools and ensures a good user experience.
Doesn't say much about their testing, does it?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369072</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31372260</id>
	<title>November update.</title>
	<author>Anonymous</author>
	<datestamp>1267810620000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>The November Nvidia update caused me to start getting serious artifacts in Batman after just a few minutes of play that would not go away until I shut down the machine and waited a few minutes. I wasn't alone with that.</p><p>This is the 3rd catastrophic Nvidia driver fail since November. Good job guys!</p></htmltext>
<tokenext>The November Nvidia update caused me to start getting serious artifacts in Batman after just a few minutes of play that would not go away until I shut down the machine and waited a few minutes .
I was n't alone with that . This is the 3rd catastrophic Nvidia driver fail since November .
Good job guys !</tokentext>
<sentencetext>The November Nvidia update caused me to start getting serious artifacts in Batman after just a few minutes of play that would not go away until I shut down the machine and waited a few minutes.
I wasn't alone with that. This is the 3rd catastrophic Nvidia driver fail since November.
Good job guys!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369072</id>
	<title>Wow realistic?</title>
	<author>Anonymous</author>
	<datestamp>1267788360000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>WoW realistic? psssssssssshawwwww</htmltext>
<tokenext>WoW realistic ?
psssssssssshawwwww</tokentext>
<sentencetext>WoW realistic?
psssssssssshawwwww</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31372712</id>
	<title>It's the late 80's and early 90's again.</title>
	<author>Mashiki</author>
	<datestamp>1267813020000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>And people called me crazy when I said there was a possibility of software ruining hardware again.  Those old enough should remember the ANSI/ASCII malware that ran around for a while popping people's monitors before there was sync locking.  And they should also remember the number of viruses that were floating around that would crash drive heads into the spindle.</p></htmltext>
<tokenext>And people called me crazy when I said there was a possibility of software ruining hardware again .
Those old enough should remember the ANSI/ASCII malware that ran around for a while popping people 's monitors before there was sync locking .
And they should also remember the number of viruses that were floating around that would crash drive heads into the spindle .</tokentext>
<sentencetext>And people called me crazy when I said there was a possibility of software ruining hardware again.
Those old enough should remember the ANSI/ASCII malware that ran around for a while popping people's monitors before there was sync locking.
And they should also remember the number of viruses that were floating around that would crash drive heads into the spindle.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370628</id>
	<title>Re:Processor damage, really?</title>
	<author>mcgrew</author>
	<datestamp>1267802760000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><i>Or the graphics card gets hot enough to re-flow solder, which then drips onto the PCIe slot or motherboard components. Not to mention most cases are vertically oriented these days. Not a chance in hell, I'd say.<br></i><br>In hell you wouldn't even need to turn the PC on for all the solder to melt!</p></htmltext>
<tokenext>Or the graphics card gets hot enough to re-flow solder , which then drips onto the PCIe slot or motherboard components .
Not to mention most cases are vertically oriented these days .
Not a chance in hell , I 'd say . In hell you would n't even need to turn the PC on for all the solder to melt !</tokentext>
<sentencetext>Or the graphics card gets hot enough to re-flow solder, which then drips onto the PCIe slot or motherboard components.
Not to mention most cases are vertically oriented these days.
Not a chance in hell, I'd say. In hell you wouldn't even need to turn the PC on for all the solder to melt!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371040</id>
	<title>Re:Terrible design</title>
	<author>VGPowerlord</author>
	<datestamp>1267804920000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>Software (read: applications) isn't destroying hardware in this case. The hardware itself is now "faulty" as the drivers have a pretty bad bug.</p><p>In my mind, this is no different than taking the heatsink/fan off a CPU. That's a hardware issue. Doesn't matter what games, etc, you run, you risk killing that CPU because the CPU is under an abnormal operating condition.</p></div></blockquote><p>Er, no, because the hardware clearly still works fine with older drivers.</p><blockquote><div><p>While drivers are in control in the case we have here with nVidia, I see the drivers as part of the hardware since they were released by the manufacturer.</p></div></blockquote><p>If it was an issue with the firmware, <b>then</b> it'd be a hardware issue, as firmware is part of the card itself.  Drivers are a piece of software on the computer that knows how to talk to the hardware device and can be changed out by the user as needed.</p><p>The <i>real</i> issue here is that the firmware has no override if it thinks the driver is giving it the wrong fan control values.</p></htmltext>
<tokenext>Software ( read : applications ) is n't destroying hardware in this case .
The hardware itself is now " faulty " as the drivers have a pretty bad bug . In my mind , this is no different than taking the heatsink/fan off a CPU .
That 's a hardware issue .
Does n't matter what games , etc , you run , you risk killing that CPU because the CPU is under an abnormal operating condition . Er , no , because the hardware clearly still works fine with older drivers . While drivers are in control in the case we have here with nVidia , I see the drivers as part of the hardware since they were released by the manufacturer . If it was an issue with the firmware , then it 'd be a hardware issue , as firmware is part of the card itself .
Drivers are a piece of software on the computer that knows how to talk to the hardware device and can be changed out by the user as needed . The real issue here is that the firmware has no override if it thinks the driver is giving it the wrong fan control values .</tokentext>
<sentencetext>Software (read: applications) isn't destroying hardware in this case.
The hardware itself is now "faulty" as the drivers have a pretty bad bug. In my mind, this is no different than taking the heatsink/fan off a CPU.
That's a hardware issue.
Doesn't matter what games, etc, you run, you risk killing that CPU because the CPU is under an abnormal operating condition. Er, no, because the hardware clearly still works fine with older drivers. While drivers are in control in the case we have here with nVidia, I see the drivers as part of the hardware since they were released by the manufacturer. If it was an issue with the firmware, then it'd be a hardware issue, as firmware is part of the card itself.
Drivers are a piece of software on the computer that knows how to talk to the hardware device and can be changed out by the user as needed. The real issue here is that the firmware has no override if it thinks the driver is giving it the wrong fan control values.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370106</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156</id>
	<title>Processor damage, really?</title>
	<author>Anonymous</author>
	<datestamp>1267789260000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext><p>Wait a minute... just how is an overheating graphics card causing damage to a CPU? As an EE, I'd love to hear the basis for that. Even motherboard damage is extremely unlikely, unless the card bursts into flames and torches the PCIe slot. Or the graphics card gets hot enough to re-flow solder, which then drips onto the PCIe slot or motherboard components. Not to mention most cases are vertically oriented these days. Not a chance in hell, I'd say.</p><p>I'm not saying there isn't an issue, but it sounds like the issue is just a bit over-hyped... or someone has an agenda and just wants to bash NVIDIA.</p></htmltext>
<tokenext>Wait a minute... just how is an overheating graphics card causing damage to a CPU ?
As an EE , I 'd love to hear the basis for that .
Even motherboard damage is extremely unlikely , unless the card bursts into flames and torches the PCIe slot .
Or the graphics card gets hot enough to re-flow solder , which then drips onto the PCIe slot or motherboard components .
Not to mention most cases are vertically oriented these days .
Not a chance in hell , I 'd say .
I 'm not saying there is n't an issue , but it sounds like the issue is just a bit over-hyped... or someone has an agenda and just wants to bash NVIDIA .</tokentext>
<sentencetext>Wait a minute... just how is an overheating graphics card causing damage to a CPU?
As an EE, I'd love to hear the basis for that.
Even motherboard damage is extremely unlikely, unless the card bursts into flames and torches the PCIe slot.
Or the graphics card gets hot enough to re-flow solder, which then drips onto the PCIe slot or motherboard components.
Not to mention most cases are vertically oriented these days.
Not a chance in hell, I'd say.
I'm not saying there isn't an issue, but it sounds like the issue is just a bit over-hyped... or someone has an agenda and just wants to bash NVIDIA.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369804</id>
	<title>Re:Terrible design</title>
	<author>Anonymous</author>
	<datestamp>1267797060000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>This is more likely due to a defective card. Should have RMA'd it. Haven't gotten any crashes on any of my NVIDIA cards in years.</p></htmltext>
<tokenext>This is more likely due to a defective card .
Should have RMA 'd it .
Have n't gotten any crashes on any of my nvidia cards in years .</tokentext>
<sentencetext>This is more likely due to a defective card.
Should have RMA'd it.
Haven't gotten any crashes on any of my nvidia cards in years.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31374618</id>
	<title>Will you let us know when it's safe to go to newer</title>
	<author>Anonymous</author>
	<datestamp>1267821780000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Will you let us know when it's safe to go back to the pulled or newer driver?</p></htmltext>
<tokenext>Will you let us know when it 's safe to go back to the pulled or newer driver ?</tokentext>
<sentencetext>Will you let us know when it's safe to go back to the pulled or newer driver?</sentencetext>
</comment>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369370
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371538
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369238
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_37</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31372712
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369392
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_29</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31372768
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31376676
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_34</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370240
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370106
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_19</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369804
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369888
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_24</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31378806
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369628
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31377526
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370554
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_31</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371682
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370138
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_16</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370002
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369404
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369198
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_32</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31376746
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371548
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369072
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_23</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371974
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369628
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_22</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369550
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369104
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369232
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371118
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369078
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_38</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371040
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370106
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31374630
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369304
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_28</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370628
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_21</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371452
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_35</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369308
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370792
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370082
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_36</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31373632
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370106
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_27</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31377368
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369078
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_26</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371276
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_17</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369228
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_33</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369702
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369198
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369694
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_18</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31382456
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369038
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_25</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370420
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369312
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_39</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370454
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369198
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_30</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370218
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369198
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_15</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369390
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369282
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_03_05_0739241_20</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371998
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370106
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
</commentlist>
</thread>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369458
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370106
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371998
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371040
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370240
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31373632
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370420
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371276
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371452
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371682
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369804
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369888
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31372768
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370138
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31372712
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370554
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31377526
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31376676
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370002
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369628
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31378806
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371974
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369198
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369404
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370454
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369702
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370218
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369078
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371118
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31377368
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369304
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31374630
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369842
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31372198
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369156
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369312
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370082
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370792
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369392
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369238
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371538
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370628
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369308
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369232
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369370
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369694
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369228
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369072
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371548
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31376746
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369110
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369104
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369550
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31371830
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31370198
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369282
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369390
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_03_05_0739241.4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31369038
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_03_05_0739241.31382456
</commentlist>
</conversation>
