<article>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#article09_06_14_1845222</id>
	<title>Intel Eyes Smartphone Chip Market</title>
	<author>kdawson</author>
	<datestamp>1244969220000</datestamp>
	<htmltext><a href="http://hothardware.com/" rel="nofollow">MojoKid</a> writes <i>"Intel has been rather successful at carving out a large percentage of the netbook market with their low power Atom processor. Moving forward, Intel's executives believe there's a good potential to increase Atom's traction in adjacent markets by targeting its low-cost, energy-efficient chips at various multifunctional consumer gadgets including smartphones and other portable devices that access the Internet. <a href="http://hothardware.com/News/Intel-Hopes-To-Enter-Smartphone-Chip-Market/">Code-named Moorestown, a new version of the chip will offer a 50x power reduction at idle</a> and reportedly will deliver enough horsepower to handle 720p video recording and 1080p quality playback. It is with this upcoming chip that <a href="http://www.mercurynews.com/breakingnews/ci_12571912">Intel will begin targeting the smartphone market in 2011</a>. Intel also plans to introduce an even smaller, less power-hungry version of the chip known as Medfield, which will be built on a 32nm process with a full solution comprising a PCB area of about half the size of a credit card."</i></htmltext>
<tokentext>MojoKid writes " Intel has been rather successful at carving out a large percentage of the netbook market with their low power Atom processor .
Moving forward , Intel 's executives believe there 's a good potential to increase Atom 's traction in adjacent markets by targeting its low-cost , energy-efficient chips at various multifunctional consumer gadgets including smartphones and other portable devices that access the Internet .
Code-named Moorestown , a new version of the chip will offer a 50x power reduction at idle and reportedly will deliver enough horsepower to handle 720p video recording and 1080p quality playback .
It is with this upcoming chip that Intel will begin targeting the smartphone market in 2011 .
Intel also plans to introduce an even smaller , less power-hungry version of the chip known as Medfield , which will be built on a 32nm process with a full solution comprising a PCB area of about half the size of a credit card .
"</tokentext>
<sentencetext>MojoKid writes "Intel has been rather successful at carving out a large percentage of the netbook market with their low power Atom processor.
Moving forward, Intel's executives believe there's a good potential to increase Atom's traction in adjacent markets by targeting its low-cost, energy-efficient chips at various multifunctional consumer gadgets including smartphones and other portable devices that access the Internet.
Code-named Moorestown, a new version of the chip will offer a 50x power reduction at idle and reportedly will deliver enough horsepower to handle 720p video recording and 1080p quality playback.
It is with this upcoming chip that Intel will begin targeting the smartphone market in 2011.
Intel also plans to introduce an even smaller, less power-hungry version of the chip known as Medfield, which will be built on a 32nm process with a full solution comprising a PCB area of about half the size of a credit card.
"</sentencetext>
</article>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28334321</id>
	<title>Re:Can't wait to</title>
	<author>jones_supa</author>
	<datestamp>1245076140000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>The problem with Atom, as you say, is all of the other hardware to make it work. Its current chipset is incredibly power hungry, but they're working on that (integrating more and doing even deeper clock gating).</p></div></blockquote><p>The ratio can already be seen getting better. Older designs' 945GC used max 22W while the newer 945GSE tops at 6W.</p>
	</htmltext>
<tokentext>The problem with Atom , as you say , is all of the other hardware to make it work .
Its current chipset is incredibly power hungry , but they 're working on that ( integrating more and doing even deeper clock gating ) .
The ratio can already be seen getting better .
Older designs ' 945GC used max 22W while the newer 945GSE tops at 6W .</tokentext>
<sentencetext>The problem with Atom, as you say, is all of the other hardware to make it work.
Its current chipset is incredibly power hungry, but they're working on that (integrating more and doing even deeper clock gating).
The ratio can already be seen getting better.
Older designs' 945GC used max 22W while the newer 945GSE tops at 6W.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329519</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329215</id>
	<title>You BET</title>
	<author>Anonymous</author>
	<datestamp>1244973780000</datestamp>
	<modclass>Troll</modclass>
	<modscore>-1</modscore>
	<htmltext>It would be terrible if a la carte cable TV meant that networks like BET would go away due to lack of funding.
<br> <br>
Do you remember when, overnight, every black person in the USA started saying "are-uh" instead of the correct, monosyllabic pronunciation of the letter "r"?  How about when that silly "raise the roof" gesture became trendy?  What about yellow t-shirts?  Speaking of who is or isn't in "the house" in a loud annoying voice?  Who could forget shizzle and nizzle and other black contributions to our culture?  I used to wonder how they managed to coordinate these trends because one day none of them would exhibit such behaviors and then the next day all of them do it as though they had been doing so all of their lives like some kind of long-established tradition.  I mean, that sounds like it would be quite the logistics problem and would take a lot of work.  Then it dawned on me!  BET is how they do it.  Ah well, as they say - monkey see, monkey do!
<br> <br>
I wonder if any of the group-identity type of blacks would be surprised or shocked to learn that most of the commercial trends belonging to "their people", especially music but also designer clothes and the like, were actually the products of market research performed by some very white people wearing business suits and doing other things that are quite non-thuggish.  Amazing how they hate "acting white" when it comes to getting an education and bettering yourself but they love following whitey when it comes to market trends.</htmltext>
<tokentext>It would be terrible if a la carte cable TV meant that networks like BET would go away due to lack of funding .
Do you remember when , overnight , every black person in the USA started saying " are-uh " instead of the correct , monosyllabic pronunciation of the letter " r " ?
How about when that silly " raise the roof " gesture became trendy ?
What about yellow t-shirts ?
Speaking of who is or is n't in " the house " in a loud annoying voice ?
Who could forget shizzle and nizzle and other black contributions to our culture ?
I used to wonder how they managed to coordinate these trends because one day none of them would exhibit such behaviors and then the next day all of them do it as though they had been doing so all of their lives like some kind of long-established tradition .
I mean , that sounds like it would be quite the logistics problem and would take a lot of work .
Then it dawned on me !
BET is how they do it .
Ah well , as they say - monkey see , monkey do !
I wonder if any of the group-identity type of blacks would be surprised or shocked to learn that most of the commercial trends belonging to " their people " , especially music but also designer clothes and the like , were actually the products of market research performed by some very white people wearing business suits and doing other things that are quite non-thuggish .
Amazing how they hate " acting white " when it comes to getting an education and bettering yourself but they love following whitey when it comes to market trends .</tokentext>
<sentencetext>It would be terrible if a la carte cable TV meant that networks like BET would go away due to lack of funding.
Do you remember when, overnight, every black person in the USA started saying "are-uh" instead of the correct, monosyllabic pronunciation of the letter "r"?
How about when that silly "raise the roof" gesture became trendy?
What about yellow t-shirts?
Speaking of who is or isn't in "the house" in a loud annoying voice?
Who could forget shizzle and nizzle and other black contributions to our culture?
I used to wonder how they managed to coordinate these trends because one day none of them would exhibit such behaviors and then the next day all of them do it as though they had been doing so all of their lives like some kind of long-established tradition.
I mean, that sounds like it would be quite the logistics problem and would take a lot of work.
Then it dawned on me!
BET is how they do it.
Ah well, as they say - monkey see, monkey do!
I wonder if any of the group-identity type of blacks would be surprised or shocked to learn that most of the commercial trends belonging to "their people", especially music but also designer clothes and the like, were actually the products of market research performed by some very white people wearing business suits and doing other things that are quite non-thuggish.
Amazing how they hate "acting white" when it comes to getting an education and bettering yourself but they love following whitey when it comes to market trends.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28334949</id>
	<title>Re:Can't wait to</title>
	<author>Svartalf</author>
	<datestamp>1245079740000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Even then, you're still consuming 2-3 times the juice of the current comparable ARM parts already shipping.</p><p>"Vastly Faster" is a relative concept, mind.  Clock-for-clock, they're showing to be rather close in performance right at the moment.  Most of the Cortex-A8 parts are clocked down to 500-600MHz to save further on juice.</p><p>Don't get me wrong, Atom's VERY nice (I've got one machine right now, getting more...) but as a smartphone platform, it's not as compelling as ARM is.  The only real reason you'd really EVER need x86 is if you're trying to wedge Windows XP or Windows 7 onto the device (Seriously...) as most of the mobile/embedded space uses ARM or MIPS in their designs.  To be sure, there are configurations in the embedded industry that call for that, but the handheld/wearables tend not to hold up as well or be as prevalent as the ARM or MIPS variants.</p><p>There's a reason for this...</p></htmltext>
<tokentext>Even then , you 're still consuming 2-3 times the juice of the current comparable ARM parts already shipping .
" Vastly Faster " is a relative concept , mind .
Clock-for-clock , they 're showing to be rather close in performance right at the moment .
Most of the Cortex-A8 parts are clocked down to 500-600MHz to save further on juice .
Do n't get me wrong , Atom 's VERY nice ( I 've got one machine right now , getting more... ) but as a smartphone platform , it 's not as compelling as ARM is .
The only real reason you 'd really EVER need x86 is if you 're trying to wedge Windows XP or Windows 7 onto the device ( Seriously... ) as most of the mobile/embedded space uses ARM or MIPS in their designs .
To be sure , there are configurations in the embedded industry that call for that , but the handheld/wearables tend not to hold up as well or be as prevalent as the ARM or MIPS variants .
There 's a reason for this ...</tokentext>
<sentencetext>Even then, you're still consuming 2-3 times the juice of the current comparable ARM parts already shipping.
"Vastly Faster" is a relative concept, mind.
Clock-for-clock, they're showing to be rather close in performance right at the moment.
Most of the Cortex-A8 parts are clocked down to 500-600MHz to save further on juice.
Don't get me wrong, Atom's VERY nice (I've got one machine right now, getting more...) but as a smartphone platform, it's not as compelling as ARM is.
The only real reason you'd really EVER need x86 is if you're trying to wedge Windows XP or Windows 7 onto the device (Seriously...) as most of the mobile/embedded space uses ARM or MIPS in their designs.
To be sure, there are configurations in the embedded industry that call for that, but the handheld/wearables tend not to hold up as well or be as prevalent as the ARM or MIPS variants.
There's a reason for this...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329519</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329437</id>
	<title>What do you know, CS student?</title>
	<author>Sybert42</author>
	<datestamp>1244975640000</datestamp>
	<modclass>None</modclass>
	<modscore>-1</modscore>
	<htmltext><p>Stick to Linux.</p></htmltext>
<tokentext>Stick to Linux .</tokentext>
<sentencetext>Stick to Linux.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329085</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330095</id>
	<title>Re:Can't wait to</title>
	<author>Taxman415a</author>
	<datestamp>1244982480000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>Nobody's stopping you from making an Atom device with those components (though it will take more power right now, it'll be vastly faster than the Cortex A8, and you won't have to recompile or use highly specialized toolkits, which is a huge Intel advantage).</p></div><p>Actually, go check the benchmarks and power draws on the chips and chipsets again. The Atom is most certainly not vastly faster than the Cortex A8 (particularly for equivalent clock and number of cores), and while Intel may be able to work the power draw down from the tens of watts that the chip + chipset + graphics require right now, that's a much harder task than what ARM has to do putting together the quad core Cortex A9 package that already has extremely low power graphics, etc. Though you do have it that the Atom doesn't require a recompile/port, which is an Intel advantage.</p>
	</htmltext>
<tokentext>Nobody 's stopping you from making an Atom device with those components ( though it will take more power right now , it 'll be vastly faster than the Cortex A8 , and you wo n't have to recompile or use highly specialized toolkits , which is a huge Intel advantage ) .
Actually , go check the benchmarks and power draws on the chips and chipsets again .
The Atom is most certainly not vastly faster than the Cortex A8 ( particularly for equivalent clock and number of cores ) , and while Intel may be able to work the power draw down from the tens of watts that the chip + chipset + graphics require right now , that 's a much harder task than what ARM has to do putting together the quad core Cortex A9 package that already has extremely low power graphics , etc .
Though you do have it that the Atom does n't require a recompile/port which is an Intel advantage .</tokentext>
<sentencetext>Nobody's stopping you from making an Atom device with those components (though it will take more power right now, it'll be vastly faster than the Cortex A8, and you won't have to recompile or use highly specialized toolkits, which is a huge Intel advantage).
Actually, go check the benchmarks and power draws on the chips and chipsets again.
The Atom is most certainly not vastly faster than the Cortex A8 (particularly for equivalent clock and number of cores), and while Intel may be able to work the power draw down from the tens of watts that the chip + chipset + graphics require right now, that's a much harder task than what ARM has to do putting together the quad core Cortex A9 package that already has extremely low power graphics, etc.
Though you do have it that the Atom doesn't require a recompile/port which is an Intel advantage.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329519</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329161</id>
	<title>wrong info</title>
	<author>Anonymous</author>
	<datestamp>1244973420000</datestamp>
	<modclass>Informative</modclass>
	<modscore>3</modscore>
	<htmltext><p>Intel talked at the press release about a 50% reduction, not 50 times...</p></htmltext>
<tokentext>Intel talked at the press release about a 50 % reduction , not 50 times ...</tokentext>
<sentencetext>Intel talked at the press release about a 50% reduction, not 50 times...</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329259</id>
	<title>Time Trax</title>
	<author>Ostracus</author>
	<datestamp>1244974080000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>"Intel also plans to introduce an even smaller, less power hungry version of the chip known as Medfield, which will be built on a 32nm process with a full solution comprising a PCB area of about half the size of a credit card."</p><p><a href="http://www.tvacres.com/computers_beings_selma.htm" title="tvacres.com">Selma</a> [tvacres.com], is that you?</p></htmltext>
<tokentext>" Intel also plans to introduce an even smaller , less power hungry version of the chip known as Medfield , which will be built on a 32nm process with a full solution comprising a PCB area of about half the size of a credit card .
" Selma [ tvacres.com ] , is that you ?</tokentext>
<sentencetext>"Intel also plans to introduce an even smaller, less power hungry version of the chip known as Medfield, which will be built on a 32nm process with a full solution comprising a PCB area of about half the size of a credit card.
"Selma [tvacres.com], is that you?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329985</id>
	<title>Performance per mW</title>
	<author>Anonymous</author>
	<datestamp>1244981400000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Finally a race to the lowest mW with high performance!</p></htmltext>
<tokentext>Finally a race to the lowest mW with high performance !</tokentext>
<sentencetext>Finally a race to the lowest mW with high performance!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329325</id>
	<title>Intel vs. ARM</title>
	<author>moon3</author>
	<datestamp>1244974680000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>4</modscore>
	<htmltext>It would be interesting to see what Intel will pitch against ARM's current superior offering. ARM is cheap and already has a PowerVR OpenGL accelerator and other stuff integrated, while being very power efficient. A bundled GPU and power efficiency are deal breakers in the mobile arena. Intel doesn't have an integrated GPU nor a track record of being very power efficient.</htmltext>
<tokentext>It would be interesting to see what Intel will pitch against ARM 's current superior offering .
ARM is cheap and already has a PowerVR OpenGL accelerator and other stuff integrated , while being very power efficient .
A bundled GPU and power efficiency are deal breakers in the mobile arena .
Intel does n't have an integrated GPU nor a track record of being very power efficient .</tokentext>
<sentencetext>It would be interesting to see what Intel will pitch against ARM's current superior offering.
ARM is cheap and already has a PowerVR OpenGL accelerator and other stuff integrated, while being very power efficient.
A bundled GPU and power efficiency are deal breakers in the mobile arena.
Intel doesn't have an integrated GPU nor a track record of being very power efficient.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329407</id>
	<title>Stay Focused</title>
	<author>Anonymous</author>
	<datestamp>1244975400000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I hope the Intel employees don't get too distracted by random visits from USB co-inventor Ajay Bhatt.</p></htmltext>
<tokentext>I hope the Intel employees do n't get too distracted by random visits from USB co-inventor Ajay Bhatt .</tokentext>
<sentencetext>I hope the Intel employees don't get too distracted by random visits from USB co-inventor Ajay Bhatt.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330361</id>
	<title>Re:Xscale?</title>
	<author>hattig</author>
	<datestamp>1244985180000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext><tt>It's in the SheevaPlug device from Marvell - that's a 1.2GHz ARMv5 device (1.2GHz StrongARM / XScale effectively).</tt></htmltext>
<tokentext>It 's in the SheevaPlug device from Marvell - that 's a 1.2GHz ARMv5 device ( 1.2GHz StrongARM / XScale effectively ) .</tokentext>
<sentencetext>It's in the SheevaPlug device from Marvell - that's a 1.2GHz ARMv5 device (1.2GHz StrongARM / XScale effectively).</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329357</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28333361</id>
	<title>Re:Intel vs. ARM</title>
	<author>Anonymous</author>
	<datestamp>1245065880000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>The GMA500 has a PowerVR-based GPU, and they're planning on integrating that mess into the CPU. So now both x86 and ARM will have the big-ball-of-mud driver that is buggy and has no 3D support. Yay. The company behind PowerVR just doesn't want to play with Linux.</p><p>I'm just hoping that when ARM has some of its Mali GPUs built up, we'll get an actual functional driver to go with it.</p></htmltext>
<tokentext>The GMA500 has a PowerVR-based GPU , and they 're planning on integrating that mess into the CPU .
So now both x86 and ARM will have the big-ball-of-mud driver that is buggy and has no 3D support .
Yay .
The company behind PowerVR just does n't want to play with Linux .
I 'm just hoping that when ARM has some of its Mali GPUs built up , we 'll get an actual functional driver to go with it .</tokentext>
<sentencetext>The GMA500 has a PowerVR-based GPU, and they're planning on integrating that mess into the CPU.
So now both x86 and ARM will have the big-ball-of-mud driver that is buggy and has no 3D support.
Yay.
The company behind PowerVR just doesn't want to play with Linux.
I'm just hoping that when ARM has some of its Mali GPUs built up, we'll get an actual functional driver to go with it.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329325</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329357</id>
	<title>Xscale?</title>
	<author>Anonymous</author>
	<datestamp>1244974860000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>4</modscore>
	<htmltext>Seriously, what happened to Intel's XScale processor? After they sold it to Marvell, it went into the abyss of forgotten tech. That ARM processor had the entire Palm and Pocket PC market by the balls a couple of years ago since every device worth its weight was using it! They left that market and now want to reenter it? Last I checked, every smartphone still uses ARM.</htmltext>
<tokentext>Seriously , what happened to Intel 's XScale processor ?
After they sold it to Marvell , it went into the abyss of forgotten tech .
That ARM processor had the entire Palm and Pocket PC market by the balls a couple of years ago since every device worth its weight was using it !
They left that market and now want to reenter it ?
Last I checked , every smartphone still uses ARM .</tokentext>
<sentencetext>Seriously, what happened to Intel's XScale processor?
After they sold it to Marvell, it went into the abyss of forgotten tech.
That ARM processor had the entire Palm and Pocket PC market by the balls a couple of years ago since every device worth its weight was using it!
They left that market and now want to reenter it?
Last I checked, every smartphone still uses ARM.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329235</id>
	<title>Lets be honest</title>
	<author>Anonymous</author>
	<datestamp>1244973900000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Let's be honest: there wasn't a netbook market until Atom.  The first version of the Eee predates it, but every other successful netbook is a post-Atom product.  Intel didn't carve out anything; there wasn't a market before Atom.</p></htmltext>
<tokentext>Let 's be honest : there was n't a netbook market until Atom .
The first version of the Eee predates it , but every other successful netbook is a post-Atom product .
Intel did n't carve out anything ; there was n't a market before Atom .</tokentext>
<sentencetext>Let's be honest: there wasn't a netbook market until Atom.
The first version of the Eee predates it, but every other successful netbook is a post-Atom product.
Intel didn't carve out anything; there wasn't a market before Atom.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28331289</id>
	<title>Re:Can't wait to</title>
	<author>mgblst</author>
	<datestamp>1244994960000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Apple have done some research into this area, and concluded that the best power saving technique is to ramp the CPU up for complex tasks, then hit idle as soon as you can, rather than dragging out the process.  This sounds like what Intel is going for, with their 50x reduction in idle draw.</p></htmltext>
<tokentext>Apple have done some research into this area , and concluded that the best power saving technique is to ramp the CPU up for complex tasks , then hit idle as soon as you can , rather than dragging out the process .
This sounds like what Intel is going for , with their 50x reduction in idle draw .</tokentext>
<sentencetext>Apple have done some research into this area, and concluded that the best power saving technique is to ramp the CPU up for complex tasks, then hit idle as soon as you can, rather than dragging out the process.
This sounds like what Intel is going for, with their 50x reduction in idle draw.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330333</id>
	<title>Re:Can't wait to</title>
	<author>hattig</author>
	<datestamp>1244984940000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><tt>Damn right. Every few months this story about Intel driving their products down into the smartphone arena comes along.<br><br>Last year it was laughable, with a CPU + Chipset that needed more PCB space than your average brickphone's footprint, never mind other components. They've reduced that for 2011 to something that's still 10x the footprint of an ARM SoC. I can't see Intel getting anything competitive until 2013, but it's not as if ARM is standing still.<br><br>Cortex A9 brings out-of-order capability and multi-core (up to four way) in 2010, and will probably rule until 2012. It'll start on 45nm and drop to 28nm (via GlobalFoundries) very quickly. NVIDIA are using it in their next generation Tegra for 2010, never mind the other several dozen generic Smartphone/STB ARM SoC designers/manufacturers.<br><br>As Intel shrinks its cores down by 'pentium-ising' them, the x86 decoder overhead grows in size relative to the rest of the core. This hurts them with multicore. I think it is hurting them with Larrabee in terms of die size.<br><br>It's also following the 'good enough' development path, and the path of adding specific hardware to accelerate tasks (in phone SoCs, DSPs, Security, and soon OpenCL) whereas Intel has always tried to get the CPUs to do that work.</tt></htmltext>
<tokentext>Damn right .
Every few months this story about Intel driving their products down into the smartphone arena comes along .
Last year it was laughable , with a CPU + Chipset that needed more PCB space than your average brickphone 's footprint , never mind other components .
They 've reduced that for 2011 to something that 's still 10x the footprint of an ARM SoC .
I ca n't see Intel getting anything competitive until 2013 , but it 's not as if ARM is standing still .
Cortex A9 brings out-of-order capability and multi-core ( up to four way ) in 2010 , and will probably rule until 2012 .
It 'll start on 45nm and drop to 28nm ( via GlobalFoundries ) very quickly .
NVIDIA are using it in their next generation Tegra for 2010 , never mind the other several dozen generic Smartphone/STB ARM SoC designers/manufacturers .
As Intel shrinks its cores down by 'pentium-ising ' them , the x86 decoder overhead grows in size relative to the rest of the core .
This hurts them with multicore .
I think it is hurting them with Larrabee in terms of die size .
It 's also following the 'good enough ' development path , and the path of adding specific hardware to accelerate tasks ( in phone SoCs , DSPs , Security , and soon OpenCL ) whereas Intel has always tried to get the CPUs to do that work .</tokentext>
<sentencetext>Damn right.
Every few months this story about Intel driving their products down into the smartphone arena comes along.
Last year it was laughable, with a CPU + Chipset that needed more PCB space than your average brickphone's footprint, never mind other components.
They've reduced that for 2011 to something that's still 10x the footprint of an ARM SoC.
I can't see Intel getting anything competitive until 2013, but it's not as if ARM is standing still.
Cortex A9 brings out-of-order capability and multi-core (up to four way) in 2010, and will probably rule until 2012.
It'll start on 45nm and drop to 28nm (via GlobalFoundries) very quickly.
NVIDIA are using it in their next generation Tegra for 2010, never mind the other several dozen generic Smartphone/STB ARM SoC designers/manufacturers.
As Intel shrinks its cores down by 'pentium-ising' them, the x86 decoder overhead grows in size relative to the rest of the core.
This hurts them with multicore.
I think it is hurting them with Larrabee in terms of die size.
It's also following the 'good enough' development path, and the path of adding specific hardware to accelerate tasks (in phone SoCs, DSPs, Security, and soon OpenCL) whereas Intel has always tried to get the CPUs to do that work.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329733</id>
	<title>Re:processor for the old and poor</title>
	<author>zippthorne</author>
	<datestamp>1244978760000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The iPhone costs $2800.  Of course the owners are going to tend to be more affluent.  In fact, I'd suggest that if you earn less than $70k and you bought an iPhone for personal use, you're a dumb-ass.</p></htmltext>
<tokenext>The iPhone costs $ 2800 .
Of course the owners are going to tend to be more affluent .
In fact , I 'd suggest that if you earn less than $ 70k and you bought an iPhone for personal use , you 're a dumb-ass .</tokentext>
<sentencetext>The iPhone costs $2800.
Of course the owners are going to tend to be more affluent.
In fact, I'd suggest that if you earn less than $70k and you bought an iPhone for personal use, you're a dumb-ass.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329147</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329085</id>
	<title>Really....</title>
	<author>Darkness404</author>
	<datestamp>1244972940000</datestamp>
	<modclass>Interestin</modclass>
	<modscore>2</modscore>
	<htmltext>Really, Intel could excel in the smartphone chip market where they can't in the netbook market, because of MS and their speed/power restrictions on netbooks. The problem I see with the smartphone market is that x86 is terribly hard to make power-efficient enough and still be fast. Could Intel do it? Sure, but unlike desktop CPUs they can't just increase the clock speed and get faster CPUs; they have to work at it.</htmltext>
<tokenext>Really Intel could excel in the smartphone chip market where they ca n't in the netbook market because of MS and their speed/power restrictions on netbooks .
The problem I see with the smartphone market is that x86 is terribly hard to make power-efficient enough and still be fast .
Could Intel do it , sure , but unlike desktop CPUs they ca n't just increase the clock speed and get faster CPUs , they have to work at it .</tokentext>
<sentencetext>Really Intel could excel in the smartphone chip market where they can't in the netbook market because of MS and their speed/power restrictions on netbooks.
The problem I see with the smartphone market is that x86 is terribly hard to make power-efficient enough and still be fast.
Could Intel do it, sure, but unlike desktop CPUs they can't just increase the clock speed and get faster CPUs, they have to work at it.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28349245</id>
	<title>More evidence Intel can't get low enough power</title>
	<author>Taxman415a</author>
	<datestamp>1245174060000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Instead of replying to myself, I thought I'd add it here. Here's a <a href="http://www.linuxdevices.com/news/NS2545908561.html" title="linuxdevices.com">Linuxdevices article</a> [linuxdevices.com] on Intel's <i>upcoming</i> lower-power Atoms. This is reducing the ridiculous power draw of the chipsets by combining the package into two chips. Quoting: <div class="quote"><p>"Even more important, the Pine Trail platform will have a seven-Watt TDP and require an average of just two Watts"</p></div><p>That's after the improvements on an upcoming chip release. The article goes on to say the setup will cost more for Intel to produce. Good luck to them though; I'm still rooting for the race to the greatest performance out of milliwatts.</p>
	</htmltext>
<tokenext>Instead of replying to myself , I thought I 'd add it here .
Here 's a Linuxdevices article [ linuxdevices.com ] on Intel 's upcoming lower power atoms .
This is reducing the ridiculous power draw of the chipsets by combining the package into two chips .
Quoting " Even more important , the Pine Trail platform will have a seven-Watt TDP and require an average of just two Watts " That 's after the improvements on an upcoming chip release .
The article goes on the say the setup will cost more for Intel to produce .
Good luck to them though , I 'm still rooting for the race to the greatest performance out of milliwatts .</tokentext>
<sentencetext>Instead of replying to myself, I thought I'd add it here.
Here's a Linuxdevices article [linuxdevices.com] on Intel's upcoming lower-power Atoms.
This is reducing the ridiculous power draw of the chipsets by combining the package into two chips.
Quoting: "Even more important, the Pine Trail platform will have a seven-Watt TDP and require an average of just two Watts"
That's after the improvements on an upcoming chip release.
The article goes on to say the setup will cost more for Intel to produce.
Good luck to them though, I'm still rooting for the race to the greatest performance out of milliwatts.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330813</id>
	<title>Hooray!</title>
	<author>Akir</author>
	<datestamp>1244989440000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>This is a day of rejoicing! Now even our most simple embedded devices can have decades of backward-compatibility baggage and buggy code!</htmltext>
<tokenext>This is a day of rejoicing !
Now even our most simple embedded devices can have decades of backward-compatibility baggage and buggy code !</tokentext>
<sentencetext>This is a day of rejoicing!
Now even our most simple embedded devices can have decades of backward-compatibility baggage and buggy code!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125</id>
	<title>Can't wait to</title>
	<author>Anonymous</author>
	<datestamp>1244973240000</datestamp>
	<modclass>Interestin</modclass>
	<modscore>4</modscore>
	<htmltext><p>watch those 1080p movies on my smart phone screen.</p><p>But on a more serious note, Intel will always be able to leverage their advanced fabrication processes to reduce power consumption.  Most ARM chips I've seen use older (in terms of desktop CPU) process technology but the good architecture still gives you excellent power consumption.</p></htmltext>
<tokenext>watch those 1080p movies on my smart phone screen.But on a more serious note , Intel will always be able to leverage their advanced fabrication processes to reduce power consumption .
Most ARM chips I 've seen use older ( in terms of desktop CPU ) process technology but the good architecture still gives you excellent power consumption .</tokentext>
<sentencetext>watch those 1080p movies on my smart phone screen.
But on a more serious note, Intel will always be able to leverage their advanced fabrication processes to reduce power consumption.
Most ARM chips I've seen use older (in terms of desktop CPU) process technology but the good architecture still gives you excellent power consumption.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28332487</id>
	<title>Re:Can't wait to</title>
	<author>GordonCopestake</author>
	<datestamp>1245008940000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Perhaps. But what if your phone had a USB socket (mine already does) and an HDMI socket? Carry your "laptop" around with you everywhere and use it as it should be - a communication device... BUT when you want a bigger screen, find the nearest 1080p panel and bam, big screen. Plug in a keyboard and mouse when you want to use it for office / school work.</p><p>In my mind a device the size of a cell phone but as powerful as today's netbooks is something of a holy grail. Make a decent universal cradle so everyone has one and you can "hotdesk" as much as you want. Your whole workstation fits in your pocket and of course you can still use it without the bolt-ons.</p></htmltext>
<tokenext>Perhaps .
But what if your phone had a USB socket ( mine already does ) and a HDMI socket ?
Carry your " laptop " around with you everywhere and use it as it should be - a communication device... BUT when you want a bigger screen , find the nearest 1080p panel and bam , big screen .
Plug in a keyboard and mouse when you want to use it for office / school work.In my mind a device the size of a cell phone but as powerful as todays netbooks is something of a holy grail .
Make a decent universal cradle so everyone has one and you can " hotdesk " as much as you want .
Your whole workstation fits in your pocket and of course you can still use it without the boltons .</tokentext>
<sentencetext>Perhaps.
But what if your phone had a USB socket (mine already does) and a HDMI socket?
Carry your "laptop" around with you everywhere and use it as it should be - a communication device... BUT when you want a bigger screen, find the nearest 1080p panel and bam, big screen.
Plug in a keyboard and mouse when you want to use it for office / school work.
In my mind a device the size of a cell phone but as powerful as today's netbooks is something of a holy grail.
Make a decent universal cradle so everyone has one and you can "hotdesk" as much as you want.
Your whole workstation fits in your pocket and of course you can still use it without the bolt-ons.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329369</id>
	<title>Re:Can't wait to</title>
	<author>Anonymous</author>
	<datestamp>1244974980000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>4</modscore>
	<htmltext>Intel already had an ARM processor for smartphones -- the XScale PXA family. They decided to sell it off to Marvell a few years ago as part of their cost-cutting strategy. We'll see if that was a wise thing to do.</htmltext>
<tokenext>Intel already had an ARM processor for smartphones -- the XScale PXA family .
They decided to sell it off to Marvell a few years ago as part of their cost-cutting strategy .
We 'll see if that was a wise thing to do .</tokentext>
<sentencetext>Intel already had an ARM processor for smartphones -- the XScale PXA family.
They decided to sell it off to Marvell a few years ago as part of their cost-cutting strategy.
We'll see if that was a wise thing to do.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28332717</id>
	<title>everybody loves intel</title>
	<author>Anonymous</author>
	<datestamp>1245098700000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Because x86 is the bestest evar.</p></htmltext>
<tokenext>Because x86 is the bestest evar .</tokentext>
<sentencetext>Because x86 is the bestest evar.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330957</id>
	<title>Hmm...</title>
	<author>Zero\_DgZ</author>
	<datestamp>1244991060000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Push for (potentially) standardized low-power/decent-performance mobile platform that might actually result in a handheld general purpose computer that isn't an iPhone? Yes, please.</p><p>(Yes, I know all about the Palm Pre, Blackberries, and others. Quiet, you in the peanut gallery.)</p><p>If it doesn't work, a competitive push for other makers (ARM, etc.) to do better? Yes please to that, as well.</p><p>If this thing is supposed to be based on x86-ish architecture, though, I wonder how (or if) they've licked the bus and chipset power consumption problems still plaguing Atom based machines. The Atom is nifty and all and can run on 2 watts or whatever, but unless I've missed some big news somewhere you still need a 15-20 watt chipset/bus/BIOS/etc. hooked up to make it work.</p><p>That said, if this comes to fruition I'd very much like to see it used not only in phones, but in standalone PDA style devices as well. I know I'm in the vast minority these days but I like having the flexibility of a powerful PDA that's not tied to a service provider.</p></htmltext>
<tokenext>Push for ( potentially ) standardized low-power/decent-performance mobile platform that might actually result in a handheld general purpose computer that is n't an iPhone ?
Yes , please .
( Yes , I know all about the Palm Pre , Blackberries , and others .
Quiet , you in the peanut gallery .
) If it does n't work , a competitive push for other makers ( ARM , etc .
) to do better ?
Yes please to that , as well.If this thing is supposed to be based on x86-ish architecture , though , I wonder how ( or if ) they 've licked the bus and chipset power consumption problems still plaguing Atom based machines .
The Atom is nifty and all and can run on 2 watts or whatever , but unless I 've missed some big news somewhere you still need a 15-20 watt chipset/bus/BIOS/etc .
hooked up to make it work.That said , if this comes to fruition I 'd very much like to see it used not only in phones , but in standalone PDA style devices as well .
I know I 'm in the vast minority these days but I like having the flexibility of a powerful PDA that 's not tied to a service provider .</tokentext>
<sentencetext>Push for (potentially) standardized low-power/decent-performance mobile platform that might actually result in a handheld general purpose computer that isn't an iPhone?
Yes, please.
(Yes, I know all about the Palm Pre, Blackberries, and others.
Quiet, you in the peanut gallery.)
If it doesn't work, a competitive push for other makers (ARM, etc.) to do better?
Yes please to that, as well.
If this thing is supposed to be based on x86-ish architecture, though, I wonder how (or if) they've licked the bus and chipset power consumption problems still plaguing Atom based machines.
The Atom is nifty and all and can run on 2 watts or whatever, but unless I've missed some big news somewhere you still need a 15-20 watt chipset/bus/BIOS/etc. hooked up to make it work.
That said, if this comes to fruition I'd very much like to see it used not only in phones, but in standalone PDA style devices as well.
I know I'm in the vast minority these days but I like having the flexibility of a powerful PDA that's not tied to a service provider.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329461</id>
	<title>Good luck with that</title>
	<author>Taxman415a</author>
	<datestamp>1244975880000</datestamp>
	<modclass>Interestin</modclass>
	<modscore>4</modscore>
	<htmltext>The current Atoms run at about 2 watts, way too much for a smartphone even if they are able to cut that in half, and that's not even counting the power-hog chipsets needed for the Atom that require 5-12+ watts. By comparison, the current Cortex A8 packages with video etc. that are able to do 1080p are able to make it under the 300 milliwatt line smartphone manufacturers are looking for. <br> <br>

And even better, if you're talking about Intel's chips two generations out, then consider the Cortex A9 quad core chips that are claiming to be ready to go and at reasonable power consumption in the same time frame if not sooner than Intel's offering. That article is actually claiming dual core Cortex A9 phones within a year that use about the same power as current chips with much better performance.<br> <br>

So as noted it looks like ARM is going to have a much easier time scaling up performance at the smartphone power draw level than Intel is going to have getting anywhere near it. And the Cortex A9 will probably spank the Atom. The race should benefit everyone though. Maybe we'll actually get some decent performing netbook, laptop, and desktop chips out of it that run on extremely low power.</htmltext>
<tokenext>The current atoms run about 2 watts , way too much for a smartphone even if they are able to cut that in half , and that 's not even counting the power hog chipsets needed for the atom that require 5-12 + watts .
By comparison the current cortex A8 packages with video etc that are able to do 1080p are able to make it under the 300 milliwatt line smartphone manufacturers are looking for .
And even better , if you 're talking about Intel 's chips two generations out , then consider the Cortex A9 quad core chips that are claiming to be ready to go and at reasonable power consumption in the same time frame if not sooner than Intel 's offering .
That article is actually claiming dual core Cortex A9 phones within a year that use about the same power as current chips with much better performance .
So as noted it looks like ARM is going to have a much easier time scaling up performance at the smartphone power draw level than Intel is going to have getting anywhere near it .
And the Cortex A9 will probably spank the Atom .
The race should benefit everyone though .
Maybe we 'll actually get some decent performing netbook , laptop , and desktop chips out of it that run on extremely low power .</tokentext>
<sentencetext>The current atoms run about 2 watts, way too much for a smartphone even if they are able to cut that in half, and that's not even counting the power hog chipsets needed for the atom that require 5-12+ watts.
By comparison the current cortex A8 packages with video etc that are able to do 1080p are able to make it under the 300 milliwatt line smartphone manufacturers are looking for.
And even better, if you're talking about Intel's chips two generations out, then consider the Cortex A9 quad core chips that are claiming to be ready to go and at reasonable power consumption in the same time frame if not sooner than Intel's offering.
That article is actually claiming dual core Cortex A9 phones within a year that use about the same power as current chips with much better performance.
So as noted it looks like ARM is going to have a much easier time scaling up performance at the smartphone power draw level than Intel is going to have getting anywhere near it.
And the Cortex A9 will probably spank the Atom.
The race should benefit everyone though.
Maybe we'll actually get some decent performing netbook, laptop, and desktop chips out of it that run on extremely low power.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329147</id>
	<title>processor for the old and poor</title>
	<author>Anonymous</author>
	<datestamp>1244973360000</datestamp>
	<modclass>Redundant</modclass>
	<modscore>0</modscore>
	<htmltext><p><a href="http://forums.appleinsider.com/showthread.php?s=&amp;threadid=99152" title="appleinsider.com" rel="nofollow">recent studies</a> [appleinsider.com] demonstrate that only iPhone users are young, hip, cool, earn more than 70k a year and in general are more educated and productive; so who cares about a processor for smartphones that won't be ready for the iPhone?</p></htmltext>
<tokenext>recent studies [ appleinsider.com ] demonstrate that only iphone users are young , hip , cool , earn more than 70k a year and in general are more educated and productive , who cares about a processor for smartphones that wo n't be ready for the iphone ?</tokentext>
<sentencetext>recent studies [appleinsider.com] demonstrate that only iPhone users are young, hip, cool, earn more than 70k a year and in general are more educated and productive; so who cares about a processor for smartphones that won't be ready for the iPhone?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330933</id>
	<title>Three words come to mind</title>
	<author>tyrione</author>
	<datestamp>1244990760000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>BFD.</htmltext>
<tokenext>BFD .</tokentext>
<sentencetext>BFD.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28332691</id>
	<title>ARM netbooks</title>
	<author>savuporo</author>
	<datestamp>1245098400000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Somehow the flurry of upcoming ARM-Cortex based netbook and MID launches this summer has escaped the Slashdot crowd's attention:<br><a href="http://www.engadget.com/tag/arm" title="engadget.com">http://www.engadget.com/tag/arm</a> [engadget.com]<br>Intel is gonna be so dead in this segment.</p></htmltext>
<tokenext>Somehow the flurry of upcoming ARM-Cortex based netbook and MID launches this summer has escaped Slashdot crowds attentionhttp : //www.engadget.com/tag/arm [ engadget.com ] Intel is gon na be so dead in this segment .</tokentext>
<sentencetext>Somehow the flurry of upcoming ARM-Cortex based netbook and MID launches this summer has escaped the Slashdot crowd's attention: http://www.engadget.com/tag/arm [engadget.com]
Intel is gonna be so dead in this segment.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329867</id>
	<title>Re:Can't wait to</title>
	<author>Anonymous</author>
	<datestamp>1244980140000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>They sold it because it competed with their aspiration to have x86 enter the smartphone market. I don't think profitability had anything to do with it.</htmltext>
<tokenext>They sold it because it competed with their aspiration to have x86 enter the smartphone market .
I do n't think profitability had anything to do with it .</tokentext>
<sentencetext>They sold it because it competed with their aspiration to have x86 enter the smartphone market.
I don't think profitability had anything to do with it.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329369</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28351137</id>
	<title>Re:Really....</title>
	<author>Julie188</author>
	<datestamp>1245180300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>So, are you saying that all the chips that Intel makes, like the IXP4XX family (for network processors), are all based on the x86? Seems as if they already work at it. And if they are smart (and they are smart), they would see that in the next few years the PC becomes the netbook and the netbook merges with the smartphone. Some company or another is going to make a lot of money from those new devices.</htmltext>
<tokenext>So , are you saying that all the chips that Intel makes , like IXP4XX family ( for network processors ) , are all based on the X86 ?
Seems as if they already work at it .
And if they are smart ( and they are smart ) , they would see that in the next few years the PC becomes the netbook and the netbook merges with the smartphone .
Some company or another is going to make a lot of money from those new devices .</tokentext>
<sentencetext>So, are you saying that all the chips that Intel makes, like IXP4XX family (for network processors), are all based on the X86?
Seems as if they already work at it.
And if they are smart (and they are smart), they would see that in the next few years the PC becomes the netbook and the netbook merges with the smartphone.
Some company or another is going to make a lot of money from those new devices.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329085</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329253</id>
	<title>Innovation in all market segments</title>
	<author>symbolset</author>
	<datestamp>1244974020000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext><div class="quote"><p>"It's a loser mentality to not develop one segment because you're worried about the other," he said. "I think we have several years ahead of us where we can innovate the heck out of any of these categories without getting defensive about the other one. You just need to unleash innovation in all of the segments and see what happens." - <a href="http://news.cnet.com/8301-13924\_3-10254514-64.html" title="cnet.com" rel="nofollow">Sean Maloney</a> [cnet.com]</p></div><p>It's interesting to see Intel expanding out of their traditional markets and unleashing innovation in every direction.  Since they're also staying pretty open about interfaces, people are going to do some pretty amazing stuff with their new products.</p>
	</htmltext>
<tokenext>" It 's a loser mentality to not develop one segment because you 're worried about the other , " he said .
" I think we have several years ahead of us where we can innovate the heck out of any of these categories without getting defensive about the other one .
You just need to unleash innovation in all of the segments and see what happens .
" - Sean Maloney [ cnet.com ] It 's interesting to see Intel expanding out of their traditional markets and unleashing innovation in every direction .
Since they 're also staying pretty open about interfaces , people are going to do some pretty amazing stuff with their new products .</tokentext>
<sentencetext>"It's a loser mentality to not develop one segment because you're worried about the other," he said.
"I think we have several years ahead of us where we can innovate the heck out of any of these categories without getting defensive about the other one.
You just need to unleash innovation in all of the segments and see what happens." - Sean Maloney [cnet.com]
It's interesting to see Intel expanding out of their traditional markets and unleashing innovation in every direction.
Since they're also staying pretty open about interfaces, people are going to do some pretty amazing stuff with their new products.
	</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330777</id>
	<title>Re:Can't wait to</title>
	<author>Anonymous</author>
	<datestamp>1244989140000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p><i>(Hard to get exact measurements due to the nature of how components interact. <b>Anything loading the CPU probably loads up the memory as well.</b> Anything hitting the GPU will hit the CPU, and DSP load varies greatly depending on the codec and video being decoded.)</i></p><p>Um, no.</p></htmltext>
<tokenext>( Hard to get exact measurements due to the nature of how components interact .
Anything loading the CPU probably loads up the memory as well .
Anything hitting the GPU will hit the CPU , and DSP load varies greatly depending on the codec and video being decoded .
) Um , no .</tokentext>
<sentencetext>(Hard to get exact measurements due to the nature of how components interact.
Anything loading the CPU probably loads up the memory as well.
Anything hitting the GPU will hit the CPU, and DSP load varies greatly depending on the codec and video being decoded.
)Um, no.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329501</id>
	<title>Re:Can't wait to</title>
	<author>msgmonkey</author>
	<datestamp>1244976360000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Yes, I agree Intel won't be able to compete with the ARM because, as you rightly point out, the ARM is just too well designed an architecture.</p><p>If you read between the lines, though, I expect that the GPU and video decoding/encoding will be competitive if not better due to the fabrication process. And if you notice, it says "50X less power consumption at idle", so what I suspect is that as long as you're not doing anything that pushes the CPU you will get OK power consumption overall. But I guess we will have to wait and see if Intel can pull it off.</p></htmltext>
<tokenext>Yes I agree Intel wo n't beable to to compete with the ARM because as you rightly point out the ARM is just too well designed an architecture.If you read between the lines though I expect that the GPU and video decoding/encoding will be competitive if not better due fabrication process and if you notice it says " 50X less power consumption at idle " so what I suspect is that as long as you 're not doing anything that pushes the CPU you will get OK power consumption overall , but I guess we will have to wait and see if Intel can pull it off .</tokentext>
<sentencetext>Yes, I agree Intel won't be able to compete with the ARM because, as you rightly point out, the ARM is just too well designed an architecture.
If you read between the lines, though, I expect that the GPU and video decoding/encoding will be competitive if not better due to the fabrication process. And if you notice, it says "50X less power consumption at idle", so what I suspect is that as long as you're not doing anything that pushes the CPU you will get OK power consumption overall. But I guess we will have to wait and see if Intel can pull it off.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330609</id>
	<title>Re:Can't wait to</title>
	<author>PhireN</author>
	<datestamp>1244987640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>The LCD on my EEEPC 1000he uses way less than 6-15w.
<br>
With linux, a stripped down window manager (awesome), screen at ~30\% brightness and bluetooth, wifi, sd slot and webcam powered down, it idles at 8.1-8.3w. (I think the hdd is spun down at this point)<br>
Turning the backlight off only gets me down to 7.9w, and if I put the screen at full brightness the entire power load only increases to ~9.2w.
<br>
Clearly the screen isn't using the bulk of the power.</htmltext>
<tokenext>The LCD on my EEEPC 1000he uses way less than 6-15w .
With linux , a stripped down window manager ( awsome ) , screen at ~ 30 \ % brightness and bluetooth , wifi , sd slot and webcam powered down , it idles at 8.1-8.3w .
( I think the hdd is spun down at this point ) Turning the backlight off only gets me down to 7.9w , and If I put the screen at full brightness the entire power load only increases to ~ 9.2w .
Clearly the Screen is n't using the bulk of the power .</tokentext>
<sentencetext>The LCD on my EEEPC 1000he uses way less than 6-15w.
With linux, a stripped down window manager (awesome), screen at ~30\% brightness and bluetooth, wifi, sd slot and webcam powered down, it idles at 8.1-8.3w.
(I think the hdd is spun down at this point)
Turning the backlight off only gets me down to 7.9w, and if I put the screen at full brightness the entire power load only increases to ~9.2w.
Clearly the screen isn't using the bulk of the power.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329519</id>
	<title>Re:Can't wait to</title>
	<author>ciroknight</author>
	<datestamp>1244976660000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext><p><div class="quote"><p><div class="quote"><p>good architecture</p></div><p>Don't you mean <i>ludicrously</i> good architecture?</p><p>I'm thinking Cortex A8's, which have been out for over a year. Stuff like the OMAP 3530(present in the <a href="http://beagleboard.org/" title="beagleboard.org">Beagleboard</a> [beagleboard.org], upcoming <a href="http://openpandora.org/" title="openpandora.org">Pandora Handheld</a> [openpandora.org], and <a href="http://mobile.slashdot.org/story/09/06/07/0152215/Palm-Pre-Is-Out-Time-For-Discussion?from=rss" title="slashdot.org">Palm Pre</a> [slashdot.org]) consumes remarkably small amounts of power.</p><p>The Pandora developers said their device consumes around or just over 1 watt. Most of that is from the LCD. They did experiments completely shutting off certain hardware, to measure power consumption, and concluded...</p><p>CPU - about 20-40mw
DSP - about 30-60mw
SGX GPU - about 30-60mw</p><p>(Hard to get exact measurements due to the nature of how components interact. Anything loading the CPU probably loads up the memory as well. Anything hitting the GPU will hit the CPU, and DSP load varies greatly depending on the codec and video being decoded.)</p><p>The entire SoC uses a ludicrously small amount of power; something like 0.2-0.4w. Then add another 0.6w for the LCD, and a bunch more for wireless.</p><p>Now, compare that to the current Atoms, with 6+ watts just for the CPU/chipset, another 2+ for the HDD/SSD, at least 6-15w for the LCD, etc...</p><p>If any company can drive down their power consumption, Intel can, but that doesn't mean it'll be easy to catch ARM!</p><p>I just can't wait for Cortex A9's. Quad-core ARM in the exact same power envelope!</p></div><p>To be fair, the Atom runs at 6 Watts <i>max</i>, where average TDP can go down to as little as 0.4W.
The problem with Atom, as you say, is all of the other hardware to make it work. Its current chipset is incredibly power hungry, but they're working on that (integrating more and doing even deeper clock gating). Future Atoms will likely use even less power, with Intel already shipping chips with a max 2.4W threshold.
<br> <br>
And yes, you are being unfair comparing a device which has a hard drive with hundreds of gigabytes of space and a WSVGA screen to a handheld device with a couple of gigs of flash memory and an HVGA screen. Nobody's stopping you from making an Atom device with those components (though it will take more power right now, it'll be vastly faster than the Cortex A8, and you won't have to recompile or use highly specialized toolkits, which is a huge Intel advantage).</p></div>
	</htmltext>
<tokenext>good architectureDo n't you mean ludicrously good architecture ? I 'm thinking Cortex A8 's , which have been out for over a year .
Stuff like the OMAP 3530 ( present in the Beagleboard [ beagleboard.org ] , upcoming Pandora Handheld [ openpandora.org ] , and Palm Pre [ slashdot.org ] ) consumes remarkably small amounts of power.The Pandora developers said their device consumes around or just over 1 watt .
Most of that is from the LCD .
They did experiments completely shutting off certain hardware , to measure power consumption , and concluded...CPU - about 20-40mw DSP - about 30-60mw SGX GPU - about 30-60mw ( Hard to get exact measurements due to the nature of how components interact .
Anything loading the CPU probably loads up the memory as well .
Anything hitting the GPU will hit the CPU , and DSP load varies greatly depending on the codec and video being decoded .
) The entire SoC uses a ludicrously small amount of power ; something like 0.2-0.4w .
Then add another 0.6w for the LCD , and a bunch more for wireless.Now , compare that to the current Atoms , with 6 + watts just for the CPU/chipset , another 2 + for the HDD/SSD , at least 6-15w for the LCD , etc...If any company can drive down their power consumption , Intel can , but that does n't mean it 'll be easy to catch ARM ! I just ca n't wait for Cortex A9 's .
Quad-core ARM in the exact same power envelope ! To be fair , the Atom runs at 6 Watts max , where average TDP can go down to as little as 0.4W .
The problem with Atom , as you say , is all of the other hardware to make it work .
Its current chipset is incredibly power hungry , but they 're working on that ( integrating more and doing even deeper clock gating ) .
Future Atoms will likely use even less power , with Intel already shipping chips with a max 2.4W threshold .
And yes , you are being unfair comparing a device which has a hard drive with hundreds of gigabytes of space and a WSVGA screen to a handheld device with a couple of gigs of flash memory and an HVGA screen .
Nobody 's stopping you from making an Atom device with those components ( though it will take more power right now , it 'll be vastly faster than the Cortex A8 , and you wo n't have to recompile or use highly specialized toolkits , which is a huge Intel advantage ) .</tokentext>
<sentencetext>good architectureDon't you mean ludicrously good architecture?I'm thinking Cortex A8's, which have been out for over a year.
Stuff like the OMAP 3530(present in the Beagleboard [beagleboard.org], upcoming Pandora Handheld [openpandora.org], and Palm Pre [slashdot.org]) consumes remarkably small amounts of power.The Pandora developers said their device consumes around or just over 1 watt.
Most of that is from the LCD.
They did experiments completely shutting off certain hardware, to measure power consumption, and concluded...CPU - about 20-40mw
DSP - about 30-60mw
SGX GPU - about 30-60mw(Hard to get exact measurements due to the nature of how components interact.
Anything loading the CPU probably loads up the memory as well.
Anything hitting the GPU will hit the CPU, and DSP load varies greatly depending on the codec and video being decoded.
)The entire SoC uses a ludicrously small amount of power; something like 0.2-0.4w.
Then add another 0.6w for the LCD, and a bunch more for wireless.Now, compare that to the current Atoms, with 6+ watts just for the CPU/chipset, another 2+ for the HDD/SSD, at least 6-15w for the LCD, etc...If any company can drive down their power consumption, Intel can, but that doesn't mean it'll be easy to catch ARM!I just can't wait for Cortex A9's.
Quad-core ARM in the exact same power envelope!To be fair, the Atom runs at 6 Watts max, where average TDP can go down to as little as 0.4W.
The problem with Atom, as you say, is all of the other hardware to make it work.
Its current chipset is incredibly power hungry, but they're working on that (integrating more and doing even deeper clock gating).
Future Atoms will likely use even less power, with Intel already shipping chips with a max 2.4W threshold.
And yes, you are being unfair comparing a device which has a hard drive with hundreds of gigabytes of space and a WSVGA screen to a handheld device with a couple of gigs of flash memory and an HVGA screen.
Nobody's stopping you from making an Atom device with those components (though it will take more power right now, it'll be vastly faster than the Cortex A8, and you won't have to recompile or use highly specialized toolkits, which is a huge Intel advantage).
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329371</id>
	<title>Re:Can't wait to</title>
	<author>Karganeth</author>
	<datestamp>1244974980000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>[Can't wait to] watch those 1080p movies on my smart phone screen.</p></div><p>There are already phones which can play 720p (and record too). Why the sarcasm? Would you rather watch a lower quality movie?</p></div>
	</htmltext>
<tokenext>[ Ca n't wait to ] watch those 1080p movies on my smart phone screen.There are already phones which can play 720p ( and record too ) .
Why the sarcasm ?
Would you rather watch a lower quality movie ?</tokentext>
<sentencetext>[Can't wait to] watch those 1080p movies on my smart phone screen.There are already phones which can play 720p (and record too).
Why the sarcasm?
Would you rather watch a lower quality movie?
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329449</id>
	<title>The Two Ways Race</title>
	<author>edivad</author>
	<datestamp>1244975700000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Intel trying to cut down power, and ARM trying to enter the multicore, superscalar field.
So far, ARM is way ahead in the smartphone/mobile market. In that market, battery life is king, and Intel lags behind.</htmltext>
<tokenext>Intel trying to cut down power , and ARM trying to enter the multicore , superscalar field .
So far , ARM is way ahead in the smartphone/mobile market .
In that market , battery life is king , and Intel lags behind .</tokentext>
<sentencetext>Intel trying to cut down power, and ARM trying to enter the multicore, superscalar field.
So far, ARM is way ahead in the smartphone/mobile market.
In that market, battery life is king, and Intel lags behind.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439</id>
	<title>Re:Can't wait to</title>
	<author>BikeHelmet</author>
	<datestamp>1244975640000</datestamp>
	<modclass>Informative</modclass>
	<modscore>4</modscore>
	<htmltext><p><div class="quote"><p>good architecture</p></div><p>Don't you mean <i>ludicrously</i> good architecture?</p><p>I'm thinking Cortex A8's, which have been out for over a year. Stuff like the OMAP 3530(present in the <a href="http://beagleboard.org/" title="beagleboard.org">Beagleboard</a> [beagleboard.org], upcoming <a href="http://openpandora.org/" title="openpandora.org">Pandora Handheld</a> [openpandora.org], and <a href="http://mobile.slashdot.org/story/09/06/07/0152215/Palm-Pre-Is-Out-Time-For-Discussion?from=rss" title="slashdot.org">Palm Pre</a> [slashdot.org]) consumes remarkably small amounts of power.</p><p>The Pandora developers said their device consumes around or just over 1 watt. Most of that is from the LCD. They did experiments completely shutting off certain hardware, to measure power consumption, and concluded...</p><p>CPU - about 20-40mw<br>DSP - about 30-60mw<br>SGX GPU - about 30-60mw</p><p>(Hard to get exact measurements due to the nature of how components interact. Anything loading the CPU probably loads up the memory as well. Anything hitting the GPU will hit the CPU, and DSP load varies greatly depending on the codec and video being decoded.)</p><p>The entire SoC uses a ludicrously small amount of power; something like 0.2-0.4w. Then add another 0.6w for the LCD, and a bunch more for wireless.</p><p>Now, compare that to the current Atoms, with 6+ watts just for the CPU/chipset, another 2+ for the HDD/SSD, at least 6-15w for the LCD, etc...</p><p>If any company can drive down their power consumption, Intel can, but that doesn't mean it'll be easy to catch ARM!</p><p>I just can't wait for Cortex A9's. Quad-core ARM in the exact same power envelope!</p></div>
	</htmltext>
<tokenext>good architectureDo n't you mean ludicrously good architecture ? I 'm thinking Cortex A8 's , which have been out for over a year .
Stuff like the OMAP 3530 ( present in the Beagleboard [ beagleboard.org ] , upcoming Pandora Handheld [ openpandora.org ] , and Palm Pre [ slashdot.org ] ) consumes remarkably small amounts of power.The Pandora developers said their device consumes around or just over 1 watt .
Most of that is from the LCD .
They did experiments completely shutting off certain hardware , to measure power consumption , and concluded...CPU - about 20-40mwDSP - about 30-60mwSGX GPU - about 30-60mw ( Hard to get exact measurements due to the nature of how components interact .
Anything loading the CPU probably loads up the memory as well .
Anything hitting the GPU will hit the CPU , and DSP load varies greatly depending on the codec and video being decoded .
) The entire SoC uses a ludicrously small amount of power ; something like 0.2-0.4w .
Then add another 0.6w for the LCD , and a bunch more for wireless.Now , compare that to the current Atoms , with 6 + watts just for the CPU/chipset , another 2 + for the HDD/SSD , at least 6-15w for the LCD , etc...If any company can drive down their power consumption , Intel can , but that does n't mean it 'll be easy to catch ARM ! I just ca n't wait for Cortex A9 's .
Quad-core ARM in the exact same power envelope !</tokentext>
<sentencetext>good architectureDon't you mean ludicrously good architecture?I'm thinking Cortex A8's, which have been out for over a year.
Stuff like the OMAP 3530(present in the Beagleboard [beagleboard.org], upcoming Pandora Handheld [openpandora.org], and Palm Pre [slashdot.org]) consumes remarkably small amounts of power.The Pandora developers said their device consumes around or just over 1 watt.
Most of that is from the LCD.
They did experiments completely shutting off certain hardware, to measure power consumption, and concluded...CPU - about 20-40mwDSP - about 30-60mwSGX GPU - about 30-60mw(Hard to get exact measurements due to the nature of how components interact.
Anything loading the CPU probably loads up the memory as well.
Anything hitting the GPU will hit the CPU, and DSP load varies greatly depending on the codec and video being decoded.
)The entire SoC uses a ludicrously small amount of power; something like 0.2-0.4w.
Then add another 0.6w for the LCD, and a bunch more for wireless.Now, compare that to the current Atoms, with 6+ watts just for the CPU/chipset, another 2+ for the HDD/SSD, at least 6-15w for the LCD, etc...If any company can drive down their power consumption, Intel can, but that doesn't mean it'll be easy to catch ARM!I just can't wait for Cortex A9's.
Quad-core ARM in the exact same power envelope!
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28332575</id>
	<title>x86 please</title>
	<author>A12m0v</author>
	<datestamp>1245096720000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I'd love to have an x86 processor powering my smartphone, this way I can run all the amazing x86-only apps and be in synergy with the x86 world, I'll dump my iPhone for one in a heartbeat.</p><p>x86 shall prevail! die ARM die!</p></htmltext>
<tokenext>I 'd love to have an x86 processor powering my smartphone , this way I can run all the amazing x86-only apps and be in synergy with the x86 world , I 'll dump my iPhone for one in a heartbeat.x86 shall prevail !
die ARM die !</tokentext>
<sentencetext>I'd love to have an x86 processor powering my smartphone, this way I can run all the amazing x86-only apps and be in synergy with the x86 world, I'll dump my iPhone for one in a heartbeat.x86 shall prevail!
die ARM die!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28331129</id>
	<title>Re:Can't wait to</title>
	<author>Macman408</author>
	<datestamp>1244993400000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>But on a more serious note, Intel will always be able to leverage their advanced fabrication processes to reduce power consumption.  Most ARM chips I've seen use older (in terms of desktop CPU) process technology but the good architecture still gives you excellent power consumption.</p></div><p>I think that may actually be on purpose. Moving to a smaller process offers many benefits, such as increased speed and circuit density. However, it also tends to increase the leakage power of a chip. Leakage used to be almost nonexistent; these days, somewhere on the order of 50\% of the power dissipation in a chip is just leakage.</p><p>For those not familiar with (semiconductor) leakage, here's a quick explanation: Transistors, as they are used in digital logic, have two states; 'on' and 'off'. When you make those transistors as small as they are currently (45 nanometers from the source to the drain, and maybe 4 to 7 atomic layers insulating the gate from the silicon substrate), transistors also have two states; 'on' and 'mostly on'. It turns out that when you only have a few atoms of silicon dioxide insulating the gate of the transistor, electrons can sometimes just hop straight through your insulation.</p><p>Now there are lots of things that help this - using something other than silicon dioxide to insulate the gate, or using a thicker gate oxide (which means slower transistors), or power gating to disable transistors that aren't in use. However, this is more of a problem in smaller process technology nodes. As a process matures, fabs develop ways to combat this, and low-power chips come out. But the leading-edge chips that pay for this new technology don't need to have the same ultra-low leakage that you need in an always-on device like a phone, so it takes a few years before the same technology moves into that market. It doesn't make a difference if you're Intel, or if you're fabless and buy wafers from TSMC.</p></div>
	</htmltext>
<tokenext>But on a more serious note , Intel will always be able to leverage their advanced fabrication processes to reduce power consumption .
Most ARM chips I 've seen use older ( in terms of desktop CPU ) process technology but the good architecture still gives you excellent power consumption.I think that may actually be on purpose .
Moving to a smaller process offers many benefits , such as increased speed and circuit density .
However , it also tends to increase the leakage power of a chip .
Leakage used to be almost nonexistent ; these days , somewhere on the order of 50 \ % of the power dissipation in a chip is just leakage.For those not familiar with ( semiconductor ) leakage , here 's a quick explanation : Transistors , as they are used in digital logic , have two states ; 'on ' and 'off' .
When you make those transistors as small as they are currently ( 45 nanometers from the source to the drain , and maybe 4 to 7 atomic layers insulating the gate from the silicon substrate ) , transistors also have two states ; 'on ' and 'mostly on' .
It turns out that when you only have a few atoms of silicon dioxide insulating the gate of the transistor , electrons can sometimes just hop straight through your insulation.Now there are lots of things that help this - using something other than silicon dioxide to insulate the gate , or using a thicker gate oxide ( which means slower transistors ) , or power gating to disable transistors that are n't in use .
However , this is more of a problem in smaller process technology nodes .
As a process matures , fabs develop ways to combat this , and low-power chips come out .
But the leading-edge chips that pay for this new technology do n't need to have the same ultra-low leakage that you need in an always-on device like a phone , so it takes a few years before the same technology moves into that market .
It does n't make a difference if you 're Intel , or if you 're fabless and buy wafers from TSMC .</tokentext>
<sentencetext>But on a more serious note, Intel will always be able to leverage their advanced fabrication processes to reduce power consumption.
Most ARM chips I've seen use older (in terms of desktop CPU) process technology but the good architecture still gives you excellent power consumption.I think that may actually be on purpose.
Moving to a smaller process offers many benefits, such as increased speed and circuit density.
However, it also tends to increase the leakage power of a chip.
Leakage used to be almost nonexistent; these days, somewhere on the order of 50\% of the power dissipation in a chip is just leakage.For those not familiar with (semiconductor) leakage, here's a quick explanation: Transistors, as they are used in digital logic, have two states; 'on' and 'off'.
When you make those transistors as small as they are currently (45 nanometers from the source to the drain, and maybe 4 to 7 atomic layers insulating the gate from the silicon substrate), transistors also have two states; 'on' and 'mostly on'.
It turns out that when you only have a few atoms of silicon dioxide insulating the gate of the transistor, electrons can sometimes just hop straight through your insulation.Now there are lots of things that help this - using something other than silicon dioxide to insulate the gate, or using a thicker gate oxide (which means slower transistors), or power gating to disable transistors that aren't in use.
However, this is more of a problem in smaller process technology nodes.
As a process matures, fabs develop ways to combat this, and low-power chips come out.
But the leading-edge chips that pay for this new technology don't need to have the same ultra-low leakage that you need in an always-on device like a phone, so it takes a few years before the same technology moves into that market.
It doesn't make a difference if you're Intel, or if you're fabless and buy wafers from TSMC.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125</parent>
</comment>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_17</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329867
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329369
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28351137
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329085
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330095
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329519
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329733
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329147
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329437
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329085
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_15</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28331289
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330777
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28333361
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329325
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330361
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329357
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330333
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330609
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_16</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28331129
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28334321
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329519
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329371
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329501
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28332487
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28349245
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_14_1845222_1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28334949
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329519
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125
</commentlist>
</thread>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_14_1845222.0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329407
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_14_1845222.9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329235
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_14_1845222.7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329085
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329437
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28351137
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_14_1845222.11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329215
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_14_1845222.1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329253
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_14_1845222.12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329161
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_14_1845222.10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329125
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329369
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329867
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28332487
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329371
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28331129
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329439
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330333
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329501
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330609
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28349245
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330777
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329519
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330095
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28334321
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28334949
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28331289
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_14_1845222.6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329461
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_14_1845222.4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329357
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330361
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_14_1845222.8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329325
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28333361
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_14_1845222.5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329147
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28329733
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_14_1845222.3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28332691
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_14_1845222.2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_14_1845222.28330933
</commentlist>
</conversation>
