<article>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#article10_02_09_1627231</id>
	<title>NVIDIA Shows Off "Optimus" Switchable Graphics For Notebooks</title>
	<author>Soulskill</author>
	<datestamp>1265737740000</datestamp>
	<htmltext>Vigile writes <i>"Transformers jokes aside, NVIDIA's newest technology offering hopes to radically change the way notebook computers are built and how customers use them.  The promise of both extended battery life and high performance mobile computing has seemed like a pipe dream, and even the most recent updates to 'switchable graphics' left much to be desired in terms of the user experience.  Having both an integrated and discrete graphics chip in your notebook does little good if you never switch between the two.  Optimus allows the system to <a href="http://www.pcper.com/article.php?aid=868">seamlessly and instantly change between IGP and discrete NVIDIA GPUs</a> based on the task being run, including games, GPU encoding or Flash video playback.  Using new software and hardware technology, notebooks using Optimus can <a href="http://hothardware.com/Articles/NVIDIA-Optimus-Mobile-Technology-Preview/">power on and pass control to the GPU in a matter of 300ms</a> and power both the GPU and PCIe lanes completely off when not in use.  This can be done without being forced to reboot or even close out your applications, making it a hands-free solution for the customer."</i></htmltext>
<tokentext>Vigile writes " Transformers jokes aside , NVIDIA 's newest technology offering hopes to radically change the way notebook computers are built and how customers use them .
The promise of both extended battery life and high performance mobile computing has seemed like a pipe dream , and even the most recent updates to 'switchable graphics ' left much to be desired in terms of the user experience .
Having both an integrated and discrete graphics chip in your notebook does little good if you never switch between the two .
Optimus allows the system to seamlessly and instantly change between IGP and discrete NVIDIA GPUs based on the task being run , including games , GPU encoding or Flash video playback .
Using new software and hardware technology , notebooks using Optimus can power on and pass control to the GPU in a matter of 300ms and power both the GPU and PCIe lanes completely off when not in use .
This can be done without being forced to reboot or even close out your applications , making it a hands-free solution for the customer .
"</tokentext>
<sentencetext>Vigile writes "Transformers jokes aside, NVIDIA's newest technology offering hopes to radically change the way notebook computers are built and how customers use them.
The promise of both extended battery life and high performance mobile computing has seemed like a pipe dream, and even the most recent updates to 'switchable graphics' left much to be desired in terms of the user experience.
Having both an integrated and discrete graphics chip in your notebook does little good if you never switch between the two.
Optimus allows the system to seamlessly and instantly change between IGP and discrete NVIDIA GPUs based on the task being run, including games, GPU encoding or Flash video playback.
Using new software and hardware technology, notebooks using Optimus can power on and pass control to the GPU in a matter of 300ms and power both the GPU and PCIe lanes completely off when not in use.
This can be done without being forced to reboot or even close out your applications, making it a hands-free solution for the customer.
"</sentencetext>
</article>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31095028</id>
	<title>Coming to your Linux box...</title>
	<author>DaVince21</author>
	<datestamp>1265040360000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>...in 2014!</p></htmltext>
<tokentext>...in 2014 !</tokentext>
<sentencetext>...in 2014!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31083720</id>
	<title>Re:VOODOO</title>
	<author>Anonymous</author>
	<datestamp>1265024160000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>BAH, 3dfx people.</p><p>I'm sure Rendition will come back and prove to you all that vQuake is better than gl-quake.<br>I will just keep holding to my Verite V1000 a tiny bit longer.</p></htmltext>
<tokentext>BAH , 3dfx people .
I 'm sure Rendition will come back and prove to you all that vQuake is better than gl-quake .
I will just keep holding to my Verite V1000 a tiny bit longer .</tokentext>
<sentencetext>BAH, 3dfx people.
I'm sure Rendition will come back and prove to you all that vQuake is better than gl-quake.
I will just keep holding to my Verite V1000 a tiny bit longer.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074738</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31081896</id>
	<title>A Michael Bay-associated brand? Must get.</title>
	<author>UrduBlake</author>
	<datestamp>1265734800000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>This is a total no brainer. A definite buy. What else can a transfan ask for? Kudos, Nvidia.

50 cents per post. $)</htmltext>
<tokentext>This is a total no brainer .
A definite buy .
What else can a transfan ask for ?
Kudos , Nvidia .
50 cents per post .
$ )</tokentext>
<sentencetext>This is a total no brainer.
A definite buy.
What else can a transfan ask for?
Kudos, Nvidia.
50 cents per post.
$)</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074738</id>
	<title>VOODOO</title>
	<author>Anonymous</author>
	<datestamp>1265741580000</datestamp>
	<modclass>Funny</modclass>
	<modscore>5</modscore>
	<htmltext><p>I knew if I just held off upgrading my Orchid Righteous 3d (Voodoo 1) card long enough discrete 3d cards would become relevant again.  You guys with your fancy Banshee cards can suck it.</p></htmltext>
<tokentext>I knew if I just held off upgrading my Orchid Righteous 3d ( Voodoo 1 ) card long enough discrete 3d cards would become relevant again .
You guys with your fancy Banshee cards can suck it .</tokentext>
<sentencetext>I knew if I just held off upgrading my Orchid Righteous 3d (Voodoo 1) card long enough discrete 3d cards would become relevant again.
You guys with your fancy Banshee cards can suck it.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31082370</id>
	<title>5870s drop to 27-35W when not gaming already</title>
	<author>mykos</author>
	<datestamp>1265740680000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>ATI already has high-powered GPUs like the 5870 that drop to 27-35W when not gaming (which is probably not too far off the power consumption of these GPUs) without having to switch to another GPU.  I guess this switching thing is probably designed to compete power-wise with ATI.</htmltext>
<tokentext>ATI already has high-powered GPUs like the 5870 that drop to 27-35W when not gaming ( which is probably not too far off the power consumption of these GPUs ) without having to switch to another GPU .
I guess this switching thing is probably designed to compete power-wise with ATI .</tokentext>
<sentencetext>ATI already has high-powered GPUs like the 5870 that drop to 27-35W when not gaming (which is probably not too far off the power consumption of these GPUs) without having to switch to another GPU.
I guess this switching thing is probably designed to compete power-wise with ATI.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31079424</id>
	<title>Re:MacBook Pros</title>
	<author>Belisar</author>
	<datestamp>1265716200000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>The new thing seems to be that you can actually switch between the onboard and 'real' GPU on the fly, and fast, while everything is running.</p><p>The previous laptops with switchable graphics, such as my Sony Vaio which had a GeForce and an Intel chip, did have to at least reboot the graphics system (on OS X) or reboot the whole computer (Windows) in order to go to the power-saving mode.</p><p>In my experience, I usually was too lazy / didn't want to close my work and kept using the good GPU all the time. The only times I'd work up the enthusiasm to actually switch over was before a flight or something where I'd know I'd not need the power.</p></htmltext>
<tokentext>The new thing seems to be that you can actually switch between the onboard and 'real ' GPU on the fly , and fast , while everything is running .
The previous laptops with switchable graphics , such as my Sony Vaio which had a GeForce and an Intel chip , did have to at least reboot the graphics system ( on OS X ) or reboot the whole computer ( Windows ) in order to go to the power-saving mode .
In my experience , I usually was too lazy / did n't want to close my work and kept using the good GPU all the time .
The only times I 'd work up the enthusiasm to actually switch over was before a flight or something where I 'd know I 'd not need the power .</tokentext>
<sentencetext>The new thing seems to be that you can actually switch between the onboard and 'real' GPU on the fly, and fast, while everything is running.
The previous laptops with switchable graphics, such as my Sony Vaio which had a GeForce and an Intel chip, did have to at least reboot the graphics system (on OS X) or reboot the whole computer (Windows) in order to go to the power-saving mode.
In my experience, I usually was too lazy / didn't want to close my work and kept using the good GPU all the time.
The only times I'd work up the enthusiasm to actually switch over was before a flight or something where I'd know I'd not need the power.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074886</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074886</id>
	<title>MacBook Pros</title>
	<author>Stele</author>
	<datestamp>1265742120000</datestamp>
	<modclass>Informative</modclass>
	<modscore>1</modscore>
	<htmltext><p>I believe the latest model MacBook Pros have been doing this for at least a year.</p></htmltext>
<tokentext>I believe the latest model MacBook Pros have been doing this for at least a year .</tokentext>
<sentencetext>I believe the latest model MacBook Pros have been doing this for at least a year.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074862</id>
	<title>What a relief</title>
	<author>s2theg</author>
	<datestamp>1265741940000</datestamp>
	<modclass>Funny</modclass>
	<modscore>5</modscore>
	<htmltext>"Optimus can power on and pass control to the GPU in a matter of 300ms"<br> <br>

That's good. I'm tired of finishing before my video player can render the first frame.</htmltext>
<tokentext>" Optimus can power on and pass control to the GPU in a matter of 300ms "
That 's good .
I 'm tired of finishing before my video player can render the first frame .</tokentext>
<sentencetext>"Optimus can power on and pass control to the GPU in a matter of 300ms" 

That's good.
I'm tired of finishing before my video player can render the first frame.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075426</id>
	<title>How fitting</title>
	<author>Wiarumas</author>
	<datestamp>1265743860000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>So, on one hand you have a powerful graphics notebook when it's primed (aka Optimus Prime).  And on the other hand, you can turn it off and it becomes a cab over semi truck.</htmltext>
<tokentext>So , on one hand you have a powerful graphics notebook when it 's primed ( aka Optimus Prime ) .
And on the other hand , you can turn it off and it becomes a cab over semi truck .</tokentext>
<sentencetext>So, on one hand you have a powerful graphics notebook when it's primed (aka Optimus Prime).
And on the other hand, you can turn it off and it becomes a cab over semi truck.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075370</id>
	<title>About time, if it works as advertised.</title>
	<author>Happy Nuclear Death</author>
	<datestamp>1265743620000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext><p>I have suffered from one of the multiple-display-device solutions, in the form of an Alienware M15X, so Optimus sounds like a huge step forward.</p><p>While in theory it was nice to have both a battery-friendly Intel GMA and a reasonably powerful Nvidia GeForce card in one (relatively) portable package, in reality it was lousy.  As suggested by TFA, you had to reboot to switch between them, whether running Windows XP or Vista.  That would have been bad enough, but wait, there's more!</p><p>This effectively meant that I could never switch, because we mere users were not permitted to authorize UAC prompts or do "admin" things under XP.  Yes, you needed administrator-level access to switch between display devices.  I don't know why, maybe because it involved changes to startup files.  Huge software limitation there, as well as a shortcoming of our boneheaded IT rules.</p><p>But you really shouldn't have to reboot to switch devices.</p></htmltext>
<tokentext>I have suffered from one of the multiple-display-device solutions , in the form of an Alienware M15X , so Optimus sounds like a huge step forward .
While in theory it was nice to have both a battery-friendly Intel GMA and a reasonably powerful Nvidia GeForce card in one ( relatively ) portable package , in reality it was lousy .
As suggested by TFA , you had to reboot to switch between them , whether running Windows XP or Vista .
That would have been bad enough , but wait , there 's more !
This effectively meant that I could never switch , because we mere users were not permitted to authorize UAC prompts or do " admin " things under XP .
Yes , you needed administrator-level access to switch between display devices .
I do n't know why , maybe because it involved changes to startup files .
Huge software limitation there , as well as a shortcoming of our boneheaded IT rules .
But you really should n't have to reboot to switch devices .</tokentext>
<sentencetext>I have suffered from one of the multiple-display-device solutions, in the form of an Alienware M15X, so Optimus sounds like a huge step forward.
While in theory it was nice to have both a battery-friendly Intel GMA and a reasonably powerful Nvidia GeForce card in one (relatively) portable package, in reality it was lousy.
As suggested by TFA, you had to reboot to switch between them, whether running Windows XP or Vista.
That would have been bad enough, but wait, there's more!
This effectively meant that I could never switch, because we mere users were not permitted to authorize UAC prompts or do "admin" things under XP.
Yes, you needed administrator-level access to switch between display devices.
I don't know why, maybe because it involved changes to startup files.
Huge software limitation there, as well as a shortcoming of our boneheaded IT rules.
But you really shouldn't have to reboot to switch devices.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075238</id>
	<title>correct me if im wrong..</title>
	<author>Moheeheeko</author>
	<datestamp>1265743200000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>..but don't ATI cards ALREADY do this if you set up the CCC right?</htmltext>
<tokentext>..but do n't ATI cards ALREADY do this if you set up the CCC right ?</tokentext>
<sentencetext>..but don't ATI cards ALREADY do this if you set up the CCC right?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074992</id>
	<title>Re:I like my desktop.</title>
	<author>Jeremy Erwin</author>
	<datestamp>1265742480000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>A twelve pound notebook? Sounds like a niche product.</p></htmltext>
<tokentext>A twelve pound notebook ?
Sounds like a niche product .</tokentext>
<sentencetext>A twelve pound notebook?
Sounds like a niche product.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075962</id>
	<title>YOU FAIL ixT</title>
	<author>Anonymous</author>
	<datestamp>1265745660000</datestamp>
	<modclass>Offtopic</modclass>
	<modscore>-1</modscore>
	<htmltext><A HREF="http://goat.cx/" title="goat.cx" rel="nofollow">more grandiose users all over the posts on Usenet ar3 Fact: *BSD IS A most. LLok at the become obsessed empire in decline,</a> [goat.cx]</htmltext>
<tokentext>more grandiose users all over the posts on Usenet ar3 Fact : * BSD IS A most .
LLok at the become obsessed empire in decline , [ goat.cx ]</tokentext>
<sentencetext>more grandiose users all over the posts on Usenet ar3 Fact: *BSD IS A most.
LLok at the become obsessed empire in decline, [goat.cx]</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075178</id>
	<title>Re:MacBook Pros</title>
	<author>Vigile</author>
	<datestamp>1265743080000</datestamp>
	<modclass>Informative</modclass>
	<modscore>3</modscore>
	<htmltext><p>Nope, not really.  I have one of those and the video on the PCPer article shows the process on a MacBook Pro.  You have to change a setting in the control panel and then log out of the system to change GPU modes.</p></htmltext>
<tokentext>Nope , not really .
I have one of those and the video on the PCPer article shows the process on a MacBook Pro .
You have to change a setting in the control panel and then log out of the system to change GPU modes .</tokentext>
<sentencetext>Nope, not really.
I have one of those and the video on the PCPer article shows the process on a MacBook Pro.
You have to change a setting in the control panel and then log out of the system to change GPU modes.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074886</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31077014</id>
	<title>Re:MacBook Pros</title>
	<author>Khyber</author>
	<datestamp>1265706300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Nope, they cannot, just an effect of OSX's security permissions and driver model. Rebooting is required.</p><p>Besides, what's the point of having dual GPUs if you can't use both simultaneously for really heavy data processing?</p><p>Oh, that's right - Intel IGP, nVidia Discrete - you couldn't SLI it anyways without some serious hardware and software workarounds.</p></htmltext>
<tokentext>Nope , they can not , just an effect of OSX 's security permissions and driver model .
Rebooting is required .
Besides , what 's the point of having dual GPUs if you ca n't use both simultaneously for really heavy data processing ?
Oh , that 's right - Intel IGP , nVidia Discrete - you could n't SLI it anyways without some serious hardware and software workarounds .</tokentext>
<sentencetext>Nope, they cannot, just an effect of OSX's security permissions and driver model.
Rebooting is required.
Besides, what's the point of having dual GPUs if you can't use both simultaneously for really heavy data processing?
Oh, that's right - Intel IGP, nVidia Discrete - you couldn't SLI it anyways without some serious hardware and software workarounds.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074886</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31081208</id>
	<title>Re:What a relief</title>
	<author>Bluesman</author>
	<datestamp>1265727540000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext><p>Getting older will help your stamina too.</p></htmltext>
<tokentext>Getting older will help your stamina too .</tokentext>
<sentencetext>Getting older will help your stamina too.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074862</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075952</id>
	<title>Re:no transformers jokes?</title>
	<author>twentynine</author>
	<datestamp>1265745540000</datestamp>
	<modclass>Funny</modclass>
	<modscore>4</modscore>
	<htmltext>You might wanna Jazz up that joke a little bit...</htmltext>
<tokentext>You might wan na Jazz up that joke a little bit ...</tokentext>
<sentencetext>You might wanna Jazz up that joke a little bit...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074858</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31080364</id>
	<title>I read articles about e-pcie years ago.</title>
	<author>AbRASiON</author>
	<datestamp>1265721660000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Where is it? External PCI-Express slot on the laptop - some kind of high-end, many-many-pin plug to go to an external, powered '3D brick'. Nothing eventuated :/</p></htmltext>
<tokentext>Where is it ?
External PCI-Express slot on the laptop - some kind of high-end , many-many-pin plug to go to an external , powered '3D brick ' .
Nothing eventuated : /</tokentext>
<sentencetext>Where is it?
External PCI-Express slot on the laptop - some kind of high-end, many-many-pin plug to go to an external, powered '3D brick'.
Nothing eventuated :/</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806</id>
	<title>I like my desktop.</title>
	<author>tjstork</author>
	<datestamp>1265741760000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I guess when they have dual CPU notebooks with full size keyboards and 21" displays, I might be more interested in them.  But I'd also want solid state hard drives and HDMI cables to wire them to the TV...</p><p>these guys are close...</p><p><a href="http://hothardware.com/News/Eurocom\_launches\_QuadCore\_XEON\_Based\_Notebook\_/" title="hothardware.com">http://hothardware.com/News/Eurocom\_launches\_QuadCore\_XEON\_Based\_Notebook\_/</a> [hothardware.com]</p><p>But oddly, I would like to have an SSI EEB desktop case, that lies flat, like old PCs used to...</p></htmltext>
<tokentext>I guess when they have dual CPU notebooks with full size keyboards and 21 " displays , I might be more interested in them .
But I 'd also want solid state hard drives and HDMI cables to wire them to the TV ...
these guys are close ...
http://hothardware.com/News/Eurocom\_launches\_QuadCore\_XEON\_Based\_Notebook\_/ [ hothardware.com ]
But oddly , I would like to have an SSI EEB desktop case , that lies flat , like old PCs used to ...</tokentext>
<sentencetext>I guess when they have dual CPU notebooks with full size keyboards and 21" displays, I might be more interested in them.
But I'd also want solid state hard drives and HDMI cables to wire them to the TV...
these guys are close...
http://hothardware.com/News/Eurocom\_launches\_QuadCore\_XEON\_Based\_Notebook\_/ [hothardware.com]
But oddly, I would like to have an SSI EEB desktop case, that lies flat, like old PCs used to...</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075022</id>
	<title>Great</title>
	<author>Joshua Fan</author>
	<datestamp>1265742600000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Now give this to me in a 10" notebook.</htmltext>
<tokentext>Now give this to me in a 10 " notebook .</tokentext>
<sentencetext>Now give this to me in a 10" notebook.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31078812</id>
	<title>Re:Hey if it extends battery life...</title>
	<author>oKtosiTe</author>
	<datestamp>1265713560000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>...I'm all for it. But by how much will it extend the battery life? And when they say it will "Drastically" change the notebook market I doubt that; netbooks folks won't care about 3D and Desktop Replacement folks don't care if their machine is plugged in. Maybe in a smaller segment of mobile gamers this will make a difference.</p></div><p>I'm one of the "netbooks folks", and the prospect of being able to play video, or even basic accelerated games without running out of juice in less than half the regular time sounds great to me.</p></htmltext>
<tokentext>...I 'm all for it .
But by how much will it extend the battery life ?
And when they say it will " Drastically " change the notebook market I doubt that ; netbooks folks wo n't care about 3D and Desktop Replacement folks do n't care if their machine is plugged in .
Maybe in a smaller segment of mobile gamers this will make a difference .
I 'm one of the " netbooks folks " , and the prospect of being able to play video , or even basic accelerated games without running out of juice in less than half the regular time sounds great to me .</tokentext>
<sentencetext> ...I'm all for it.
But by how much will it extend the battery life?
And when they say it will "Drastically" change the notebook market I doubt that; netbooks folks won't care about 3D and Desktop Replacement folks don't care if their machine is plugged in.
Maybe in a smaller segment of mobile gamers this will make a difference.
I'm one of the "netbooks folks", and the prospect of being able to play video, or even basic accelerated games without running out of juice in less than half the regular time sounds great to me.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074896</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076890</id>
	<title>Re:Can't they make a 'smarter' GPU?</title>
	<author>billcopc</author>
	<datestamp>1265749140000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>The problem with GPU throttling is that it's far more visible (pun intended).  If your CPU is rapidly switching between 3.0GHz and, say, 1.2GHz, you probably won't notice at all, but if your game or video app has uneven framerates or the dreaded micro-stutter, you will feel the overwhelming urge to smash your laptop against the nearest brick wall.</p><p>GPUs typically have two power modes: power-saving (idle) and full-blast (gaming).  Your device drivers kick it into high-power mode whenever you launch a 3D app, so the stutter of switching speeds happens before any animation takes place, and it stays that way until you exit the game.  This is representative of typical GPU usage: you're either using it to the max, or not at all.  I don't know anyone who runs their games at lower quality settings just to "save power on the GPU"; you'll push the flashiest pixels your hardware can handle.</p><p>What would be quite appreciated is if the high-end GPUs had a true low-power mode that shuts off all the excess pwnage, but that's just my bias.  I tend to buy the fastest GPU I can afford and stick it out for a few years until it starts bothering me.  My latest acquisition, the GTX 295, is a power hog.  Even when sitting idle at the desktop, my PC chugs a hearty 400 watts to do nothing, roughly 300W to the two GPUs and the remainder for the CPU and motherboard.  While gaming, this number swells to around 800W; again, 3/4 of that goes to the GPUs.  I'm fine with the 800W active consumption; it's the idle power draw that bothers me, because I only game for an hour or two a night, 3-4 nights a week.  If I replace those two GPUs with a low-end card, my 2D performance is unaffected yet power usage drops to a much cozier 100W.  Why the big GPUs need 200 more watts to do absolutely nothing defies even the most usurious logic.
Now given the greater number of high-end desktop vs laptop GPUs, I think they should figure out how to shut down parts of the desktop GPU when not in use, rather than investing in some never-gonna-sell IGP+GPU trickery.  The $25 drop on my monthly hydro bill would more than justify the expense of a higher-efficiency device.  Hell, that's enough to buy the latest GPU every year!</p></htmltext>
<tokentext>The problem with GPU throttling is that it 's far more visible ( pun intended ) .
If your CPU is rapidly switching between 3.0GHz and , say , 1.2GHz , you probably wo n't notice at all , but if your game or video app has uneven framerates or the dreaded micro-stutter , you will feel the overwhelming urge to smash your laptop against the nearest brick wall .
GPUs typically have two power modes : power-saving ( idle ) and full-blast ( gaming ) .
Your device drivers kick it into high-power mode whenever you launch a 3D app , so the stutter of switching speeds happens before any animation takes place , and it stays that way until you exit the game .
This is representative of typical GPU usage : you 're either using it to the max , or not at all .
I do n't know anyone who runs their games at lower quality settings just to " save power on the GPU " ; you 'll push the flashiest pixels your hardware can handle .
What would be quite appreciated is if the high-end GPUs had a true low-power mode that shuts off all the excess pwnage , but that 's just my bias .
I tend to buy the fastest GPU I can afford and stick it out for a few years until it starts bothering me .
My latest acquisition , the GTX 295 , is a power hog .
Even when sitting idle at the desktop , my PC chugs a hearty 400 watts to do nothing , roughly 300W to the two GPUs and the remainder for the CPU and motherboard .
While gaming , this number swells to around 800W ; again , 3/4 of that goes to the GPUs .
I 'm fine with the 800W active consumption ; it 's the idle power draw that bothers me , because I only game for an hour or two a night , 3-4 nights a week .
If I replace those two GPUs with a low-end card , my 2D performance is unaffected yet power usage drops to a much cozier 100W .
Why the big GPUs need 200 more watts to do absolutely nothing defies even the most usurious logic .
Now given the greater number of high-end desktop vs laptop GPUs , I think they should figure out how to shut down parts of the desktop GPU when not in use , rather than investing in some never-gonna-sell IGP + GPU trickery .
The $ 25 drop on my monthly hydro bill would more than justify the expense of a higher-efficiency device .
Hell , that 's enough to buy the latest GPU every year !</tokentext>
<sentencetext>The problem with GPU throttling is it's far more visible (pun intended).
If your CPU is rapidly switching between 3.0ghz and, say, 1.2ghz, you probably won't notice at all, but if your game or video app has uneven framerates or the dreaded micro-stutter, you will feel the overwhelming urge to smash your laptop against the nearest brick wall.GPUs typically have two power modes: power-saving (idle), and full-blast (gaming).
Your device drivers kick it into high-power mode whenever you launch a 3D app, so the stutter of switching speeds happens before any animation takes place, and it stays that way until you exit the game.
This is representative of typical GPU usage: you're either using it to the max, or not at all.
I don't know anyone who runs their games at lower quality settings just to "save power on the GPU"; you'll push the flashiest pixels your hardware can handle. What would be quite appreciated is if the high-end GPUs had a true low-power mode that shuts off all the excess pwnage, but that's just my bias.
I tend to buy the fastest GPU I can afford, and stick it out for a few years until it starts bothering me.
My latest acquisition, the GTX 295, is a power hog.
Even when sitting idle at the desktop, my PC chugs a hearty 400 watts to do nothing, roughly 300w to the two GPUs and the remainder for the CPU and motherboard.
While gaming, this number swells to around 800w, again 3/4 of that goes to the GPUs.
I'm fine with the 800w active consumption, it's the idle power draw that bothers me, because I only game for an hour or two a night, 3-4 nights a week.
If I replace those two GPUs with a low-end card, my 2D performance is unaffected yet power usage drops to a much cozier 100w.
Why the big GPUs need 200 more watts to do absolutely nothing, that defies even the most usurious logic.
Now given the greater number of high-end desktop vs laptop GPUs, I think they should figure out how to shut down parts of the desktop GPU when not in use, rather than investing in some never-gonna-sell IGP+GPU trickery.
The $25 drop on my monthly hydro bill would more than justify the expense of a higher-efficiency device.
Hell, that's enough to buy the latest GPU every year!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074936</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075984</id>
	<title>Re:MacBook Pros</title>
	<author>Stele</author>
	<datestamp>1265745660000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Ah, well the MBP solution is not nearly as cool as I thought it was then.</p></htmltext>
<tokenext>Ah , well the MBP solution is not nearly as cool as I thought it was then .</tokentext>
<sentencetext>Ah, well the MBP solution is not nearly as cool as I thought it was then.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075178</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31077246</id>
	<title>Re:MacBook Pros</title>
	<author>moonbender</author>
	<datestamp>1265707260000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>That's such an awkward solution that I was always amazed it was allowed to appear in an Apple product. Or any other product for that matter.</p></htmltext>
<tokenext>That 's such an awkward solution that I was always amazed it was allowed to appear in an Apple product .
Or any other product for that matter .</tokentext>
<sentencetext>That's such an awkward solution that I was always amazed it was allowed to appear in an Apple product.
Or any other product for that matter.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075178</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075274</id>
	<title>Re:I like my desktop.</title>
	<author>vlm</author>
	<datestamp>1265743320000</datestamp>
	<modclass>Funny</modclass>
	<modscore>5</modscore>
	<htmltext><div class="quote"><p>I guess when they have dual CPU notebooks with full size keyboards and 21" displays, I might be more interested in them. But I'd also want solid state hard drives and hdmi cables to wire them to the TV...</p></div><p>But... But... But... Marketing told me you guys wanted postage stamp size touch sensitive screens, batteries that last two hours, and 3 second e-ink refresh rates.  And it's gotta use a cloud, whatever that is.  And an app store, gotta have an app store.  I guess you must be wrong.</p>
	</htmltext>
<tokenext>I guess when they have dual CPU notebooks with full size keyboards and 21 " displays , I might be more interested in them .
But I 'd also want solid state hard drives and hdmi cables to wire them to the TV ... But... But ... But ... Marketing told me you guys wanted postage stamp size touch sensitive screens , batteries that last two hours , and 3 second e-ink refresh rates .
And its got ta use a cloud , whatever that is .
And an app store , got ta have an app store .
I guess you must be wrong .</tokentext>
<sentencetext>I guess when they have dual CPU notebooks with full size keyboards and 21" displays, I might be more interested in them.
But I'd also want solid state hard drives and hdmi cables to wire them to the TV... But... But... But... Marketing told me you guys wanted postage stamp size touch sensitive screens, batteries that last two hours, and 3 second e-ink refresh rates.
And it's gotta use a cloud, whatever that is.
And an app store, gotta have an app store.
I guess you must be wrong.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074926</id>
	<title>HybridSLI?</title>
	<author>Anonymous</author>
	<datestamp>1265742300000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Doesn't Nvidia already have something like this in their HybridSLI technology?  I remember reading about it in the manual for my last motherboard, but haven't ever used it since I don't have a discrete card in that machine.  Is this the same thing, just applied to laptops?</p><p>Or a rebranding to create buzz?</p></htmltext>
<tokenext>Does n't Nvidia already have something like this in their HybridSLI technology ?
I remember reading about it in the manual for my last motherboard , but have n't ever used it since I do n't have a discrete card in that machine .
Is this the same thing , just applied to laptops ? Or a rebranding to create buzz ?</tokentext>
<sentencetext>Doesn't Nvidia already have something like this in their HybridSLI technology?
I remember reading about it in the manual for my last motherboard, but haven't ever used it since I don't have a discrete card in that machine.
Is this the same thing, just applied to laptops? Or a rebranding to create buzz?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076168</id>
	<title>Re:I like my desktop.</title>
	<author>datapharmer</author>
	<datestamp>1265746260000</datestamp>
	<modclass>Funny</modclass>
	<modscore>2</modscore>
	<htmltext>NO NO NO! You can't do that!  You'll fry the thing with static electricity, coffee, and dust. Every true nerd knows you've got to brown bag it.  That's right, a quick trip to the feed supply for a burlap sack and you are working with some serious overclocking potential.  It is breathable, protects from dust and light spills, and you can hatch chicks on the heat sink...</htmltext>
<tokenext>NO NO NO !
You ca n't do that !
You 'll fry the thing with static electricity , coffee , and dust .
Everyone true nerd knows you 've got to brown bag it .
That 's right , a quick trip to the feed supply for a burlap sack and you are working with some serious overclocking potential .
It is breathable , protects from dust and light spills , and you can hatch chicks on the heat sink ...</tokentext>
<sentencetext>NO NO NO!
You can't do that!
You'll fry the thing with static electricity, coffee, and dust.
Every true nerd knows you've got to brown bag it.
That's right, a quick trip to the feed supply for a burlap sack and you are working with some serious overclocking potential.
It is breathable, protects from dust and light spills, and you can hatch chicks on the heat sink...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074900</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076312</id>
	<title>Re:I like my desktop.</title>
	<author>mhajicek</author>
	<datestamp>1265746800000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Here ya go: <a href="http://www.computermuseum.li/Testpage/IBM-PortablePC.htm" title="computermuseum.li">http://www.computermuseum.li/Testpage/IBM-PortablePC.htm</a> [computermuseum.li]</htmltext>
<tokenext>Here ya go : http : //www.computermuseum.li/Testpage/IBM-PortablePC.htm [ computermuseum.li ]</tokentext>
<sentencetext>Here ya go: http://www.computermuseum.li/Testpage/IBM-PortablePC.htm [computermuseum.li]</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074992</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31099938</id>
	<title>this is good in principle</title>
	<author>oxide7</author>
	<datestamp>1265905500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>This is good in principle but I doubt NVIDIA can actually execute it successfully. Even in their chips now they have something called "PowerMizer" that is supposed to throttle the GPU up and down depending on load... the problem is when it is set to "low" it runs like garbage and it takes too long to change to high -- most likely after you needed it -- like to open a giant window, or drag windows. It's garbage really.</htmltext>
<tokenext>This is good in principle but i doubt nvidia can actually execute it successfully .
Even in their chips now they have something called " powermizer " that is suppose to throttle the GPU up and down depending on load... the problem is when it is set to " low " it runs like garbage and it takes too long to change to high -- most likely after you needed it -- like to open a giant window , or drag windows .
its garbage really .</tokentext>
<sentencetext>This is good in principle but I doubt NVIDIA can actually execute it successfully.
Even in their chips now they have something called "PowerMizer" that is supposed to throttle the GPU up and down depending on load... the problem is when it is set to "low" it runs like garbage and it takes too long to change to high -- most likely after you needed it -- like to open a giant window, or drag windows.
It's garbage really.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075634</id>
	<title>Re:Can't they make a 'smarter' GPU?</title>
	<author>Anonymous</author>
	<datestamp>1265744640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Not entirely. While you can do that, the chip in a laptop still has some real limits. You can't put off more than X watts of heat, because the laptop just can't dissipate it.
</p><p>But if the GPU used for high intensity activities (such as games)  is external to the laptop, you can have it give off 150 watts of heat because it can provide the necessary cooling capacity.
</p><p>I'd love something like this. I have my MacBook Pro which I really like, but don't do much in the way of 3D. I'd love to be able to plug in a good external card once in a while to use for gaming sessions, and let the internal GPU be lower powered. There are others with enthusiast/desktop replacement laptops this would be very good for as well.</p></htmltext>
<tokenext>Not entirely .
While you can do that , the chip in a laptop still has some real limits .
You ca n't put off more than X watts of heat , because the laptop just ca n't dissipate it .
But if the GPU used for high intensity activities ( such as games ) is external to the laptop , you can have it give off 150 watts of heat because it can provide the necessary cooling capacity .
I 'd love something like this .
I have my MacBook Pro which I really like , but do n't do much in the way of 3D .
I 'd love to be able to plug in a good external card once in a while to use for gaming sessions , and let the internal GPU be lower powered .
There are others with enthusiast/desktop replacement laptops this would be very good for as well .</tokentext>
<sentencetext>Not entirely.
While you can do that, the chip in a laptop still has some real limits.
You can't put off more than X watts of heat, because the laptop just can't dissipate it.
But if the GPU used for high intensity activities (such as games)  is external to the laptop, you can have it give off 150 watts of heat because it can provide the necessary cooling capacity.
I'd love something like this.
I have my MacBook Pro which I really like, but don't do much in the way of 3D.
I'd love to be able to plug in a good external card once in a while to use for gaming sessions, and let the internal GPU be lower powered.
There are others with enthusiast/desktop replacement laptops this would be very good for as well.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074936</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074896</id>
	<title>Hey if it extends battery life...</title>
	<author>planckscale</author>
	<datestamp>1265742120000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext>...I'm all for it. But by how much will it extend the battery life? And when they say it will "Drastically" change the notebook market I doubt that; netbook folks won't care about 3D and Desktop Replacement folks don't care if their machine is plugged in. Maybe in a smaller segment of mobile gamers this will make a difference.</htmltext>
<tokenext>...I 'm all for it .
But by how much will it extend the battery life ?
And when they say it will " Drastically " change the notebook market I doubt that ; netbooks folks wo n't care about 3D and Desktop Replacement folks do n't care if their machine is plugged in .
Mabye in a smaller segment of mobile gamers this will make a difference .</tokentext>
<sentencetext>...I'm all for it.
But by how much will it extend the battery life?
And when they say it will "Drastically" change the notebook market I doubt that; netbook folks won't care about 3D and Desktop Replacement folks don't care if their machine is plugged in.
Maybe in a smaller segment of mobile gamers this will make a difference.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074872</id>
	<title>Re:VOODOO</title>
	<author>Pojut</author>
	<datestamp>1265742000000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Pfft.  Everyone knows the <a href="http://en.wikipedia.org/wiki/Diamond\_Multimedia#Monster3D" title="wikipedia.org">Monster3D was where it was at!</a> [wikipedia.org]</p></htmltext>
<tokenext>Pfft .
Everyone knows the Monster3D was where it was at !
[ wikipedia.org ]</tokentext>
<sentencetext>Pfft.
Everyone knows the Monster3D was where it was at!
[wikipedia.org]</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074738</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075624</id>
	<title>Boring</title>
	<author>toastar</author>
	<datestamp>1265744580000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Please wake me when a company has brought the GPU on die.</p></htmltext>
<tokenext>Please wake me when a company has brought the GPU on die .</tokentext>
<sentencetext>Please wake me when a company has brought the GPU on die.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074738</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076518</id>
	<title>Re:Hey if it extends battery life...</title>
	<author>Monkeedude1212</author>
	<datestamp>1265747640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Ah but you can use hardware acceleration in your desktop environment, but you might not always want it on. Playing video, running something like Photoshop - there's a bunch of stuff that uses the video card that isn't a video game. Just FYI. So if you are sitting there browsing slashdot for an hour, it can switch to the Integrated low power one, but as soon as you boot up Media Player or something, it can switch to your full blown power monster.</p></htmltext>
<tokenext>Ah but you can use hardware acceleration in your desktop environment , but you might not always want it on .
Playing video , running something like photoshop - theres a bunch of stuff that uses the video card that is n't a video game .
Just FYI .
So if you are sitting there browsing slashdot for an hour , it can switch to the Integrated low power one , but as soon as you boot up Media Player or something , it can switch to your full blown power monster .</tokentext>
<sentencetext>Ah but you can use hardware acceleration in your desktop environment, but you might not always want it on.
Playing video, running something like Photoshop - there's a bunch of stuff that uses the video card that isn't a video game.
Just FYI.
So if you are sitting there browsing slashdot for an hour, it can switch to the Integrated low power one, but as soon as you boot up Media Player or something, it can switch to your full blown power monster.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074896</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075584</id>
	<title>Re:I like my desktop.</title>
	<author>PitaBred</author>
	<datestamp>1265744460000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I like my desktop too. But I can't carry it on the plane with me, it's a pain in the ass to haul to a friend's house when we want to do some LAN play, and I can't bring it other places I go so I still have a place to offload photos and such.</p><p>Desktops are great as long as you never leave your house, or never need or want a computer when you do so.</p></htmltext>
<tokenext>I like my desktop too .
But I ca n't carry it on the plane with me , it 's a pain in the ass to haul to a friend 's house when we want to do some LAN play , and I ca n't bring it other places I go so I still have a place to offload photos and such . Desktops are great as long as you never leave your house , or never need or want a computer when you do so .</tokentext>
<sentencetext>I like my desktop too.
But I can't carry it on the plane with me, it's a pain in the ass to haul to a friend's house when we want to do some LAN play, and I can't bring it other places I go so I still have a place to offload photos and such. Desktops are great as long as you never leave your house, or never need or want a computer when you do so.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075516</id>
	<title>what happened to good hardware design?</title>
	<author>robmv</author>
	<datestamp>1265744160000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I am a Thinkpad T500 owner with switchable Intel/ATI graphics, and it is a nice feature even though on Linux I need to reboot and change the mode in the BIOS to use one chipset or the other (I have not tried the recent X server restart experiments). I use the Intel IGP more than 95\% of the time, but I still consider this software switching a horrible hack. Why not design efficient chips (ATI/NVIDIA) able to power down parts of themselves when their advanced features are not in use?</p><p>This is like pairing the most power-hungry Intel chip with an Intel Atom and building software to switch between them when needed. No: you add power management features to the processor and use only one.</p></htmltext>
<tokenext>I am a Thinkpad T500 owner with switchable Intel/ATI , and it is a nice feature even that I need to reboot and change the mode at the BIOS to use one or the other chipset on Linux ( I have not tried the recent X server restart experiments ) , I use more than 95 \ % of the time the Intel IGP , but I still consider this software switching a horrible hack .
Why do not design efficient chips ( ATI/NVIDIA ) able to power down parts of it when not using advanced features ? This is like putting two processor like the most power hungry Intel chip and an Intel Atom , and build software to switch from them when needed .
no , you add power management features the the processor and only use one</tokentext>
<sentencetext>I am a Thinkpad T500 owner with switchable Intel/ATI graphics, and it is a nice feature even though on Linux I need to reboot and change the mode in the BIOS to use one chipset or the other (I have not tried the recent X server restart experiments); I use the Intel IGP more than 95\% of the time, but I still consider this software switching a horrible hack.
Why not design efficient chips (ATI/NVIDIA) able to power down parts of themselves when their advanced features are not in use? This is like pairing the most power-hungry Intel chip with an Intel Atom and building software to switch between them when needed.
No: you add power management features to the processor and use only one</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074900</id>
	<title>Re:I like my desktop.</title>
	<author>Monkeedude1212</author>
	<datestamp>1265742180000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext><p>What all the cool kids are doing is dropping cases altogether. That's right, nothing looks more badass than your motherboard lying on the desk with silicon chips sticking up in the air, with a giant fan overhead to help keep things cool and circulated. Your friends will be so jealous of all the blinking lights.</p><p>As for the Optimus, I think it's a great idea. This change can come for desktops as much as it has for notebooks, if there is enough demand for such a product.</p><p>Think: you had to factor in the power supply when you bought that new graphics card. So imagine how much power it's actually eating up. Imagine if your desktop didn't have to draw that much power when it didn't need to.</p></htmltext>
<tokenext>What all the cool kids are doing is dropping cases altogether .
Thats right , nothing looks more badass than your motherboard laying on the desk with silicon chips sticking up in the air , with a giant fan overhead to help keep things cool and circulated .
Your friends will be so jealous at all the blinking lights . As for the Optimus , I think its a great idea .
This change can come for desktops as much as it has for notebooks , if there is enough demand for such a product . Think , you had to factor in the power supply when you bought that new Graphics card .
So imagine how much power its actually eating up .
Imagine if your desktop did n't have to use that much power when it did n't have to ?</tokentext>
<sentencetext>What all the cool kids are doing is dropping cases altogether.
That's right, nothing looks more badass than your motherboard lying on the desk with silicon chips sticking up in the air, with a giant fan overhead to help keep things cool and circulated.
Your friends will be so jealous of all the blinking lights. As for the Optimus, I think it's a great idea.
This change can come for desktops as much as it has for notebooks, if there is enough demand for such a product. Think: you had to factor in the power supply when you bought that new graphics card.
So imagine how much power it's actually eating up.
Imagine if your desktop didn't have to draw that much power when it didn't need to.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076946</id>
	<title>Re:About time, if it works as advertised.</title>
	<author>billcopc</author>
	<datestamp>1265706120000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The real solution to this problem is to reduce the base power consumption of the GeForce.  Dual-GPU switching is a kludge, nothing more.  A crutch for an inefficient GPU.</p></htmltext>
<tokenext>The real solution to this problem is to reduce the base power consumption of the GeForce .
Dual-GPU switching is a kludge , nothing more .
A crutch for an inefficient GPU .</tokentext>
<sentencetext>The real solution to this problem is to reduce the base power consumption of the GeForce.
Dual-GPU switching is a kludge, nothing more.
A crutch for an inefficient GPU.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075370</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076728</id>
	<title>Re:no transformers jokes?  Bah!</title>
	<author>Tetsujin</author>
	<datestamp>1265748540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Yes, we can talk about hardware without making a bunch of stupid jokes about its name*.</p><p>One of the great features of the Optimus chipset is its pipelining architecture, called the "Convoy".  With this system a number of pending GPU tasks can be stored into containers, and the GPU hardware will process them quickly, moving the data to its destination, transforming it as necessary, etc.  But the hardware apparently kept dying on them during the demonstration: they were able to get it up and running again each time, but it happened at least three times.</p><p>Rumors are already spreading about the planned successor technology, known simply as "Ultra".  It will basically be a beefed-up version of "Optimus"...  Though there are rumors it won't be quite as flexible.</p><p>AMD is working on their own competing product, called "Hot Rod" - it really hasn't gained much of a following so far, though.  I've also heard about something called "Ironhide" - apparently it's designed to provide a GPU for processing functions on headless systems...</p><p>(* Doesn't necessarily mean we will...)</p></htmltext>
<tokenext>Yes , we can talk about hardware without making a bunch of stupid jokes about its name * . One of the great features of the Optimus chipset is its pipelining architecture , called the " Convoy " .
With this system a number of pending GPU tasks can be stored into containers , and the GPU hardware will process them quickly , moving the data to its destination , transforming it as necessary , etc .
But the hardware apparently kept dying on them during the demonstration : they were able to get it up and running again each time , but it happened at least three times . Rumors are already spreading about the planned successor technology , known simply as " Ultra " .
It will basically be a beefed-up version of " Optimus " ... Though there are rumors it wo n't be quite as flexible . AMD is working on their own competing product , called " Hot Rod " - it really has n't gained much of a following so far , though .
I 've also heard about something called " Ironhide " - apparently it 's designed to provide a GPU for processing functions on headless systems... ( * Does n't necessarily mean we will... )</tokentext>
<sentencetext>Yes, we can talk about hardware without making a bunch of stupid jokes about its name*. One of the great features of the Optimus chipset is its pipelining architecture, called the "Convoy".
With this system a number of pending GPU tasks can be stored into containers, and the GPU hardware will process them quickly, moving the data to its destination, transforming it as necessary, etc.
But the hardware apparently kept dying on them during the demonstration: they were able to get it up and running again each time, but it happened at least three times. Rumors are already spreading about the planned successor technology, known simply as "Ultra".
It will basically be a beefed-up version of "Optimus"... Though there are rumors it won't be quite as flexible. AMD is working on their own competing product, called "Hot Rod" - it really hasn't gained much of a following so far, though.
I've also heard about something called "Ironhide" - apparently it's designed to provide a GPU for processing functions on headless systems... (* Doesn't necessarily mean we will...)</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074858</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075232</id>
	<title>Linux support</title>
	<author>Ltap</author>
	<datestamp>1265743200000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I see a lot of hacking that will be necessary to make something like this work. It doesn't seem like something that would automate easily unless it used some kind of profiles system.</htmltext>
<tokenext>I see a lot of hacking that will be necessary to make something like this work .
It does n't seem like something that would automate easily unless it used some kind of profiles system .</tokentext>
<sentencetext>I see a lot of hacking that will be necessary to make something like this work.
It doesn't seem like something that would automate easily unless it used some kind of profiles system.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31079522</id>
	<title>YUO FAiL IT</title>
	<author>Anonymous</author>
	<datestamp>1265716560000</datestamp>
	<modclass>Troll</modclass>
	<modscore>-1</modscore>
	<htmltext>practical purposes to th3 crowd in To yOu by Penisbird</htmltext>
<tokenext>practical purposes to th3 crowd in To yOu by Penisbird</tokentext>
<sentencetext>practical purposes to th3 crowd in To yOu by Penisbird</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075572</id>
	<title>Re:I like my desktop.</title>
	<author>maxume</author>
	<datestamp>1265744400000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>You can plug keyboards and displays into a notebook.</p><p>Hdmi is currently only somewhat available, and SSDs are a tough trade off if you are concerned about the amount of live space (without an external drive).</p><p>Dual CPUs no, but multiple cores yes.</p><p>And they cost more.</p><p>Still, the number of people with needs that are not met by an $800 laptop is shrinking pretty fast.</p></htmltext>
<tokenext>You can plug keyboards and displays into a notebook . Hdmi is currently only somewhat available , and SSDs are a tough trade off if you are concerned about the amount of live space ( without an external drive ) . Dual CPUs no , but multiple cores yes . And they cost more . Still , the number of people with needs that are not met by an $ 800 laptop is shrinking pretty fast .</tokentext>
<sentencetext>You can plug keyboards and displays into a notebook. HDMI is currently only somewhat available, and SSDs are a tough trade off if you are concerned about the amount of live space (without an external drive). Dual CPUs no, but multiple cores yes. And they cost more. Still, the number of people with needs that are not met by an $800 laptop is shrinking pretty fast.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31080382</id>
	<title>Re:Hey if it extends battery life...</title>
	<author>VoltageX</author>
	<datestamp>1265721840000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Actually, with my newest desktop using 350W under full load I could use this.</htmltext>
<tokenext>Actually , with my newest desktop using 350W under full load I could use this .</tokentext>
<sentencetext>Actually, with my newest desktop using 350W under full load I could use this.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074896</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074858</id>
	<title>no transformers jokes?</title>
	<author>DJCouchyCouch</author>
	<datestamp>1265741940000</datestamp>
	<modclass>Funny</modclass>
	<modscore>4</modscore>
	<htmltext>But that's my Prime form of entertainment!</htmltext>
<tokenext>But that 's my Prime form of entertainment !</tokentext>
<sentencetext>But that's my Prime form of entertainment!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074780</id>
	<title>Crap</title>
	<author>Anonymous</author>
	<datestamp>1265741700000</datestamp>
	<modclass>Funny</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>Transformers jokes aside</p></div><p>This article is ruined for me :(</p></htmltext>
<tokenext>Transformers jokes aside
This article is ruined for me : (</tokentext>
<sentencetext>Transformers jokes aside
This article is ruined for me :(
	</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31078854</id>
	<title>Re:VOODOO</title>
	<author>spire3661</author>
	<datestamp>1265713740000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Voodoo2 had pass through too.

Voodoo 2 SLI + S3 Virge VGA w/ passthroughs = wiring nightmare in the back of the case.</htmltext>
<tokenext>Voodoo2 had pass through too .
Voodoo 2 SLI + S3 Virge VGA w/ passthroughs = wiring nightmare in the back of the case .</tokentext>
<sentencetext>Voodoo2 had pass through too.
Voodoo 2 SLI + S3 Virge VGA w/ passthroughs = wiring nightmare in the back of the case.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074738</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31089134</id>
	<title>Re:I like my desktop.</title>
	<author>MrNemesis</author>
	<datestamp>1265054100000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>What with nVidia's unceremonious exit from the IGP market thanks to Intel's licensing, and the introduction of every-Intel-chip-comes-with-a-GPU, tech like this is a shrewd, and pretty essential, move by nVidia in order to remain relevant in the middle tiers. If people, whether on a laptop or desktop, can get the power savings of an Intel IGP with the ability to fall back on to a decent GPU, they'll claw back a good deal of marketshare from "prosumers" and the like. Conversely, ATI has made incredible improvements in the idle power savings of their GPUs, so it remains to be seen if process technology makes the software complexity worth it.</p><p>The only question is whether they'll stick at it in terms of driver support. IIRC their hybrid power initiative lasted only for a couple of card revs a year or two ago. Didn't RTFA but suspect this'll only work on Win7 where the WDDM allows for two different GPU HALs whereas Vista did not.</p></htmltext>
<tokenext>What with nVidia 's unceremonious exit from the IGP market thanks to Intel 's licensing , and the introduction of every-Intel-chip-comes-with-a-GPU , tech like this is a shrewd , and pretty essential , move by nVidia in order to remain relevant in the middle tiers .
If people , whether on a laptop or desktop , can get the power savings of an Intel IGP with the ability to fall back on to a decent GPU , they 'll claw back a good deal of marketshare from " prosumers " and the like .
Conversely , ATI has made incredible improvements in the idle power savings of their GPUs , so it remains to be seen if process technology makes the software complexity worth it .
The only question is whether they 'll stick at it in terms of driver support .
IIRC their hybrid power initiative lasted only for a couple of card revs a year or two ago .
Did n't RTFA but suspect this 'll only work on Win7 where the WDDM allows for two different GPU HALs whereas Vista did not .</tokentext>
<sentencetext>What with nVidia's unceremonious exit from the IGP market thanks to Intel's licensing, and the introduction of every-Intel-chip-comes-with-a-GPU, tech like this is a shrewd, and pretty essential, move by nVidia in order to remain relevant in the middle tiers.
If people, whether on a laptop or desktop, can get the power savings of an Intel IGP with the ability to fall back on to a decent GPU, they'll claw back a good deal of marketshare from "prosumers" and the like.
Conversely, ATI has made incredible improvements in the idle power savings of their GPUs, so it remains to be seen if process technology makes the software complexity worth it.
The only question is whether they'll stick at it in terms of driver support.
IIRC their hybrid power initiative lasted only for a couple of card revs a year or two ago.
Didn't RTFA but suspect this'll only work on Win7 where the WDDM allows for two different GPU HALs whereas Vista did not.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074900</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076090</id>
	<title>MAC</title>
	<author>Anonymous</author>
	<datestamp>1265745960000</datestamp>
	<modclass>Troll</modclass>
	<modscore>-1</modscore>
	<htmltext><p>has.been.doing.this.since.there.latest.mac-book.pro.WHY,is,NVIDIA.being.praised...my.space-bar.is.broken</p></htmltext>
<tokenext>has.been.doing.this.since.there.latest.mac-book.pro.WHY,is,NVIDIA.being.praised...my.space-bar.is.broken</tokentext>
<sentencetext>has.been.doing.this.since.there.latest.mac-book.pro.WHY,is,NVIDIA.being.praised...my.space-bar.is.broken</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075168</id>
	<title>Re:VOODOO</title>
	<author>Anonymous</author>
	<datestamp>1265743020000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I'm glad I held out for the Pure3D.  The extra 2 MB of ram will really help with larger texture sizes for years to come.</p></htmltext>
<tokenext>I 'm glad I held out for the Pure3D .
The extra 2 MB of ram will really help with larger texture sizes for years to come .</tokentext>
<sentencetext>I'm glad I held out for the Pure3D.
The extra 2 MB of ram will really help with larger texture sizes for years to come.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074872</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074936</id>
	<title>Can't they make a 'smarter' GPU?</title>
	<author>JSBiff</author>
	<datestamp>1265742300000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>I would have thought that, instead of switching between a 'low power' video chip, and a 'high power' GPU, they would have concentrated on just making the Nvidia graphics cards use lower power when not doing things like rendering 3D graphics, or decoding video? I mean, mobile CPU's have some smarts built into them to allow them to vary how much power they consume, can't they do that with GPUs?</p></htmltext>
<tokenext>I would have thought that , instead of switching between a 'low power ' video chip , and a 'high power ' GPU , they would have concentrated on just making the Nvidia graphics cards use lower power when not doing things like rendering 3D graphics , or decoding video ?
I mean , mobile CPU 's have some smarts built into them to allow them to vary how much power they consume , ca n't they do that with GPUs ?</tokentext>
<sentencetext>I would have thought that, instead of switching between a 'low power' video chip, and a 'high power' GPU, they would have concentrated on just making the Nvidia graphics cards use lower power when not doing things like rendering 3D graphics, or decoding video?
I mean, mobile CPU's have some smarts built into them to allow them to vary how much power they consume, can't they do that with GPUs?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075216</id>
	<title>Linux hybrid graphics</title>
	<author>Anonymous</author>
	<datestamp>1265743140000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext><p><a href="http://linux-hybrid-graphics.blogspot.com/" title="blogspot.com" rel="nofollow">The current progress of Linux hybrid graphics.</a> [blogspot.com]</p><p>There has been a lot of progress in this area the past few weeks.  Wonder if this will let NVIDIA switch gpu's without restarting X.</p></htmltext>
<tokenext>The current progress of Linux hybrid graphics .
[ blogspot.com ] There has been a lot of progress in this area the past few weeks .
Wonder if this will let NVIDIA switch gpu 's without restarting X .</tokentext>
<sentencetext>The current progress of Linux hybrid graphics.
[blogspot.com] There has been a lot of progress in this area the past few weeks.
Wonder if this will let NVIDIA switch gpu's without restarting X.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076432</id>
	<title>first 4ost</title>
	<author>Anonymous</author>
	<datestamp>1265747340000</datestamp>
	<modclass>Offtopic</modclass>
	<modscore>-1</modscore>
	<htmltext>nned your help! future. Even</htmltext>
<tokenext>nned your help !
future. Even</tokentext>
<sentencetext>nned your help!
future. Even</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31077396</id>
	<title>Re:I like my desktop.</title>
	<author>Anonymous</author>
	<datestamp>1265707980000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>21"? Lol I like my 27" IPS desktop. 21" is practically laptop sized. Gimme a break.</p><p>And with quadcore i5 laptops out shortly?</p></htmltext>
<tokenext>21 " ?
Lol I like my 27 " IPS desktop .
21 " is practically laptop sized .
Gimme a break .
And with quadcore i5 laptops out shortly ?</tokentext>
<sentencetext>21"?
Lol I like my 27" IPS desktop.
21" is practically laptop sized.
Gimme a break.
And with quadcore i5 laptops out shortly?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806</parent>
</comment>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_20</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31080382
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074896
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_15</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076518
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074896
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_19</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31083720
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074738
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075168
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074872
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074738
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_18</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075572
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_22</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31078812
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074896
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_21</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31078854
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074738
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_16</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076312
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074992
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075274
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075984
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075178
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074886
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_23</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075634
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074936
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076890
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074936
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076168
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074900
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075584
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31089134
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074900
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075952
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074858
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075624
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074738
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076728
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074858
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31077014
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074886
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31081208
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074862
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_17</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31077396
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31077246
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075178
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074886
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076946
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075370
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_10_02_09_1627231_7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31079424
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074886
</commentlist>
</thread>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074886
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31077014
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31079424
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075178
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31077246
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075984
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074738
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075624
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074872
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075168
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31083720
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31078854
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075238
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074936
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076890
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075634
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075232
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31082370
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074806
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074992
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076312
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31077396
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075572
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074900
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31089134
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076168
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075274
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075584
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075516
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074896
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31080382
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076518
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31078812
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074858
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076728
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075952
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074926
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074862
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31081208
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31074780
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075370
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31076946
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation10_02_09_1627231.10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment10_02_09_1627231.31075022
</commentlist>
</conversation>
