<article>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#article09_11_24_1428245</id>
	<title>A Skeptical Reaction To IBM's Cat Brain Simulation Claims</title>
	<author>timothy</author>
	<datestamp>1259074140000</datestamp>
	<htmltext>kreyszig writes <i>"The recent story of a <a href="http://science.slashdot.org/story/09/11/18/1423238/-IBM-Takes-a-Feline-Step-Toward-Thinking-Machines?from=rss">cat brain simulation from IBM</a> had me wondering if this was really possible as described. Now a senior researcher in the same field has <a href="http://spectrum.ieee.org/blog/semiconductors/devices/tech-talk/blue-brain-project-leader-angry-about-cat-brain">publicly denounced IBM's claims</a>."</i>

More optimistically, dontmakemethink points out an "astounding article about <a href="http://discovermagazine.com/2009/oct/06-brain-like-chip-may-solve-computers-big-problem-energy/article_view?b_start:int=0&amp;-C=">new 'Neurogrid' computer chips</a> which offer brain-like computing with extremely low power consumption. In a simulation of 55 million neurons on a traditional supercomputer, 320,000 watts of power was required, while a 1-million neuron Neurogrid chip array is expected to consume less than one watt."</htmltext>
<tokentext>kreyszig writes " The recent story of a cat brain simulation from IBM had me wondering if this was really possible as described .
Now a senior researcher in the same field has publicly denounced IBM 's claims .
" More optimistically , dontmakemethink points out an " astounding article about new 'Neurogrid ' computer chips which offer brain-like computing with extremely low power consumption .
In a simulation of 55 million neurons on a traditional supercomputer , 320,000 watts of power was required , while a 1-million neuron Neurogrid chip array is expected to consume less than one watt .
"</tokentext>
<sentencetext>kreyszig writes "The recent story of a cat brain simulation from IBM had me wondering if this was really possible as described.
Now a senior researcher in the same field has publicly denounced IBM's claims.
"

More optimistically, dontmakemethink points out an "astounding article about new 'Neurogrid' computer chips which offer brain-like computing with extremely low power consumption.
In a simulation of 55 million neurons on a traditional supercomputer, 320,000 watts of power was required, while a 1-million neuron Neurogrid chip array is expected to consume less than one watt.
"</sentencetext>
</article>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213734</id>
	<title>The power of custom silicon</title>
	<author>jabuzz</author>
	<datestamp>1259078280000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>3</modscore>
	<htmltext><p>If you have custom silicon to do each neuron then you are going to be hugely more power efficient than a general purpose processor simulating a neuron in software. There is nothing new there and anyone who thinks otherwise is just clueless. Given IBM have the facilities and resources to fabricate some custom silicon I fail to see the issue.</p></htmltext>
<tokentext>If you have custom silicon to do each neuron then you are going to be hugely more power efficient than a general purpose processor simulating a neuron in software .
There is nothing new there and anyone who thinks otherwise is just clueless .
Given IBM have the facilities and resources to fabricate some custom silicon I fail to see the issue .</tokentext>
<sentencetext>If you have custom silicon to do each neuron then you are going to be hugely more power efficient than a general purpose processor simulating a neuron in software.
There is nothing new there and anyone who thinks otherwise is just clueless.
Given IBM have the facilities and resources to fabricate some custom silicon I fail to see the issue.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30217262</id>
	<title>Re:I CAN HAZ THINKINESS, THERE4 I IZ?</title>
	<author>Blakey Rat</author>
	<datestamp>1259094000000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>FAIL!</p><p>LOLCAT font is IMPACT. Fool! Every cat should know that.</p></htmltext>
<tokentext>FAIL ! LOLCAT font is IMPACT .
Fool ! Every cat should know that .</tokentext>
<sentencetext>FAIL!
LOLCAT font is IMPACT.
Fool! Every cat should know that.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214114</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214016</id>
	<title>Adult Children Seeking Attention</title>
	<author>Anonymous</author>
	<datestamp>1259079660000</datestamp>
	<modclass>Offtopic</modclass>
	<modscore>-1</modscore>
	<htmltext><p>Nothing generates traffic and interest in a topic like a fight.</p><p>Recent events now confirm in my mind that we have raised an entire generation of adult children who have now degenerated science into the equivalent of a school yard chanting:</p><p>"FIGHT! FIGHT! FIGHT!"</p><p>In order to get attention.</p><p>Since we have lost our marbles overall, is this issue and reaction any more a surprise than ClimateGate?</p><p>We have been victims of a generation or two that has been raised on advertisements and entertainment, instant gratification, and agenda-driven science. We validate a theory and it is instantly a widely advertised solution or product to lock in investment and grant money rather than taking a thorough approach. Instant gratification and marketing have polluted science. Shame on all parties. Science and technology, like most parts of society, have fallen victim to the Idiocracy. This is the 'intellectual' version of the trial in the movie Idiocracy, regardless of side.</p><p>It seems, in every aspect of life, we have become polarized and narrow-minded because we are only interested in self-gratification of our own conventions. We demand tolerance yet are intolerant of others. We demand acceptance but only for 'our side'. The behavior of IBM AND its current critics on the issue seems to have a root in this adult-child crisis. Our mentality has degraded into the mentality of grade school children. I fear there is no solution, as the reward structure we have created rewards the worst and punishes the best.</p><p>WHO IS JOHN GALT!?</p></htmltext>
<tokentext>Nothing generates traffic and interest in a topic like a fight .
Recent events now confirm in my mind that we have raised an entire generation of adult children who have now degenerated science into the equivalent of a school yard chanting : " FIGHT !
FIGHT ! FIGHT !
" In order to get attention .
Since we have lost our marbles overall , is this issue and reaction any more a surprise than ClimateGate ?
We have been victims of a generation or two that has been raised on advertisements and entertainment , instant gratification , and agenda-driven science .
We validate a theory and it is instantly a widely advertised solution or product to lock in investment and grant money rather than taking a thorough approach .
Instant gratification and marketing have polluted science .
Shame on all parties .
Science and technology , like most parts of society , have fallen victim to the Idiocracy .
This is the 'intellectual ' version of the trial in the movie Idiocracy , regardless of side .
It seems , in every aspect of life , we have become polarized and narrow-minded because we are only interested in self-gratification of our own conventions .
We demand tolerance yet are intolerant of others .
We demand acceptance but only for 'our side' .
The behavior of IBM AND its current critics on the issue seems to have a root in this adult-child crisis .
Our mentality has degraded into the mentality of grade school children .
I fear there is no solution , as the reward structure we have created rewards the worst and punishes the best .
WHO IS JOHN GALT ! ?</tokentext>
<sentencetext>Nothing generates traffic and interest in a topic like a fight.
Recent events now confirm in my mind that we have raised an entire generation of adult children who have now degenerated science into the equivalent of a school yard chanting: "FIGHT! FIGHT! FIGHT!"
In order to get attention.
Since we have lost our marbles overall, is this issue and reaction any more a surprise than ClimateGate?
We have been victims of a generation or two that has been raised on advertisements and entertainment, instant gratification, and agenda-driven science.
We validate a theory and it is instantly a widely advertised solution or product to lock in investment and grant money rather than taking a thorough approach.
Instant gratification and marketing have polluted science.
Shame on all parties.
Science and technology, like most parts of society, have fallen victim to the Idiocracy.
This is the 'intellectual' version of the trial in the movie Idiocracy, regardless of side.
It seems, in every aspect of life, we have become polarized and narrow-minded because we are only interested in self-gratification of our own conventions.
We demand tolerance yet are intolerant of others.
We demand acceptance but only for 'our side'.
The behavior of IBM AND its current critics on the issue seems to have a root in this adult-child crisis.
Our mentality has degraded into the mentality of grade school children.
I fear there is no solution, as the reward structure we have created rewards the worst and punishes the best.
WHO IS JOHN GALT!?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214052</id>
	<title>Almaden's Dharmendra Modha: You got pwned!</title>
	<author>gozu</author>
	<datestamp>1259079840000</datestamp>
	<modclass>Funny</modclass>
	<modscore>3</modscore>
	<htmltext><p>I saw that story earlier and dismissed it for the crap that it was. I'd like to thank Henry Markram for vindicating my snap judgment with his flame email.</p></htmltext>
<tokentext>I saw that story earlier and dismissed it for the crap that it was .
I 'd like to thank Henry Markram for vindicating my snap judgment with his flame email .</tokentext>
<sentencetext>I saw that story earlier and dismissed it for the crap that it was.
I'd like to thank Henry Markram for vindicating my snap judgment with his flame email.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215206</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>Junior J. Junior III</author>
	<datestamp>1259084220000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I can haz brain simulation?</p></htmltext>
<tokentext>I can haz brain simulation ?</tokentext>
<sentencetext>I can haz brain simulation?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215134</id>
	<title>Re:Adult Children Seeking Attention</title>
	<author>geekoid</author>
	<datestamp>1259083920000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>It's not nearly as bad as it seems.</p><p>The media doesn't understand what fair and balanced is. They assume every opinion is equal and as valid as facts. They are not.<br>Media generates controversy and then displays it for all to see. Hence, the perception is that it's all a fight and confusion. This is generally incorrect.</p><p>Science marches on and continues to deliver the goods.</p></htmltext>
<tokentext>It 's not nearly as bad as it seems .
The media does n't understand what fair and balanced is .
They assume every opinion is equal and as valid as facts .
They are not .
Media generates controversy and then displays it for all to see .
Hence , the perception is that it 's all a fight and confusion .
This is generally incorrect .
Science marches on and continues to deliver the goods .</tokentext>
<sentencetext>It's not nearly as bad as it seems.
The media doesn't understand what fair and balanced is.
They assume every opinion is equal and as valid as facts.
They are not.
Media generates controversy and then displays it for all to see.
Hence, the perception is that it's all a fight and confusion.
This is generally incorrect.
Science marches on and continues to deliver the goods.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214016</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214096</id>
	<title>Re:Brain Power</title>
	<author>Xest</author>
	<datestamp>1259080020000</datestamp>
	<modclass>Funny</modclass>
	<modscore>2</modscore>
	<htmltext><p>Damn, if only we could find such a great source of power!</p></htmltext>
<tokentext>Damn , if only we could find such a great source of power !</tokentext>
<sentencetext>Damn, if only we could find such a great source of power!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214100</id>
	<title>Re:All those neurons using less than 1 watt?</title>
	<author>Anonymous</author>
	<datestamp>1259080020000</datestamp>
	<modclass>Funny</modclass>
	<modscore>3</modscore>
	<htmltext>I'm being environmentally friendly, you insensitive clod!</htmltext>
<tokentext>I 'm being environmentally friendly , you insensitive clod !</tokentext>
<sentencetext>I'm being environmentally friendly, you insensitive clod!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213700</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492</id>
	<title>Re:long ways to go yet</title>
	<author>Anonymous</author>
	<datestamp>1259081460000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext><p>It basically just seems to be a case of the same old AI arguments we've always heard ever since Turing's day.</p><p>The problem is, we don't actually know what the limits of ANNs are; there is no proof that suggests that they can't, given ever greater amounts of computing power, allow for the emergence of (at least seemingly) truly intelligent response to an event.</p><p>So on one hand we have the IBM guys overstating what they've achieved, and on the other we have a guy spouting out a view of the limits of ANNs without actually putting any effort into providing evidence for their limitations.</p><p>I don't know why, but the AI field has always been horrifically polarised; the kind of arguments you get in that field are just so immature it's beyond belief. You have people in the AI field following their viewpoint religiously, completely unwilling to consider the other viewpoint. To see what I mean, just look up some of the discussions on Searle's Chinese room argument.</p><p>If AI scientists spent as much time on research as they do bitching at each other's experiments and theories, we'd have a walking, talking robo-jesus by now that could build worlds.</p></htmltext>
<tokentext>It basically just seems to be a case of the same old AI arguments we 've always heard ever since Turing 's day .
The problem is , we do n't actually know what the limits of ANNs are ; there is no proof that suggests that they ca n't , given ever greater amounts of computing power , allow for the emergence of ( at least seemingly ) truly intelligent response to an event .
So on one hand we have the IBM guys overstating what they 've achieved , and on the other we have a guy spouting out a view of the limits of ANNs without actually putting any effort into providing evidence for their limitations .
I do n't know why , but the AI field has always been horrifically polarised ; the kind of arguments you get in that field are just so immature it 's beyond belief .
You have people in the AI field following their viewpoint religiously , completely unwilling to consider the other viewpoint .
To see what I mean , just look up some of the discussions on Searle 's Chinese room argument .
If AI scientists spent as much time on research as they do bitching at each other 's experiments and theories , we 'd have a walking , talking robo-jesus by now that could build worlds .</tokentext>
<sentencetext>It basically just seems to be a case of the same old AI arguments we've always heard ever since Turing's day.
The problem is, we don't actually know what the limits of ANNs are; there is no proof that suggests that they can't, given ever greater amounts of computing power, allow for the emergence of (at least seemingly) truly intelligent response to an event.
So on one hand we have the IBM guys overstating what they've achieved, and on the other we have a guy spouting out a view of the limits of ANNs without actually putting any effort into providing evidence for their limitations.
I don't know why, but the AI field has always been horrifically polarised; the kind of arguments you get in that field are just so immature it's beyond belief.
You have people in the AI field following their viewpoint religiously, completely unwilling to consider the other viewpoint.
To see what I mean, just look up some of the discussions on Searle's Chinese room argument.
If AI scientists spent as much time on research as they do bitching at each other's experiments and theories, we'd have a walking, talking robo-jesus by now that could build worlds.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213974</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214862</id>
	<title>Quantum computing vs digital</title>
	<author>Anonymous</author>
	<datestamp>1259083020000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Brains use quantum computing plus new physics.  You can't simulate that efficiently in classical digital computing.  The computational power of a quantum computer scales exponentially, whereas the computational power of digital computers scales linearly.  The advantages of quantum are somewhat offset by increased error-handling problems, but that doesn't make it much easier to simulate.</p><p>Having said that, with our digital computing power increasing exponentially with time, they could make a digital analog to a brain, and thereby prove that the brain is more than electrical circuits, because that digital version doesn't function properly.</p></htmltext>
<tokentext>Brains use quantum computing plus new physics .
You ca n't simulate that efficiently in classical digital computing .
The computational power of a quantum computer scales exponentially , whereas the computational power of digital computers scales linearly .
The advantages of quantum are somewhat offset by increased error-handling problems , but that does n't make it much easier to simulate .
Having said that , with our digital computing power increasing exponentially with time , they could make a digital analog to a brain , and thereby prove that the brain is more than electrical circuits , because that digital version does n't function properly .</tokentext>
<sentencetext>Brains use quantum computing plus new physics.
You can't simulate that efficiently in classical digital computing.
The computational power of a quantum computer scales exponentially, whereas the computational power of digital computers scales linearly.
The advantages of quantum are somewhat offset by increased error-handling problems, but that doesn't make it much easier to simulate.
Having said that, with our digital computing power increasing exponentially with time, they could make a digital analog to a brain, and thereby prove that the brain is more than electrical circuits, because that digital version doesn't function properly.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213878</id>
	<title>Re:long ways to go yet</title>
	<author>Anonymous</author>
	<datestamp>1259079000000</datestamp>
	<modclass>Informative</modclass>
	<modscore>5</modscore>
	<htmltext><p>More than this, their simulated neurons aren't anywhere close to the real thing.  A real neuron, an individual cell, has tremendous computing power due to the distribution of a bunch of different ion channel types (active conductances) in a highly complex dendritic tree.  Simulating a few seconds of just ONE neuron accurately can take several minutes to several hours of supercomputer time.  I know this because I do it for a living.</p></htmltext>
<tokentext>More than this , their simulated neurons are n't anywhere close to the real thing .
A real neuron , an individual cell , has tremendous computing power due to the distribution of a bunch of different ion channel types ( active conductances ) in a highly complex dendritic tree .
Simulating a few seconds of just ONE neuron accurately can take several minutes to several hours of supercomputer time .
I know this because I do it for a living .</tokentext>
<sentencetext>More than this, their simulated neurons aren't anywhere close to the real thing.
A real neuron, an individual cell, has tremendous computing power due to the distribution of a bunch of different ion channel types (active conductances) in a highly complex dendritic tree.
Simulating a few seconds of just ONE neuron accurately can take several minutes to several hours of supercomputer time.
I know this because I do it for a living.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214222</id>
	<title>Saturday Morning Breakfast Cereal Knows</title>
	<author>eldavojohn</author>
	<datestamp>1259080500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>Think about it. Think about it like a cat.</p></div><p>SMBC explained why cat <a href="http://www.smbc-comics.com/index.php?db=comics&amp;id=1705#comic" title="smbc-comics.com">translation products fail</a> [smbc-comics.com].  Although there are <a href="http://en.wikipedia.org/wiki/BowLingual" title="wikipedia.org">financial endeavors to decode dog</a> [wikipedia.org].</p></htmltext>
<tokentext>Think about it .
Think about it like a cat .
SMBC explained why cat translation products fail [ smbc-comics.com ] .
Although there are financial endeavors to decode dog [ wikipedia.org ] .</tokentext>
<sentencetext>Think about it.
Think about it like a cat.
SMBC explained why cat translation products fail [smbc-comics.com].
Although there are financial endeavors to decode dog [wikipedia.org].</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30218460</id>
	<title>Re:The power of custom silicon</title>
	<author>mcrbids</author>
	<datestamp>1259056320000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Sure, dedicated hardware will work faster than emulation in software. But what about the "middle of the road", e.g. an FPGA? How well could an FPGA allow for actual neuronal simulation with quasi-dedicated hardware, and at what cost?</p><p>Sure, directly fabricated silicon will outperform software emulation on GP hardware. But it's not just a question of silicon vs software...</p></htmltext>
<tokentext>Sure , dedicated hardware will work faster than emulation in software .
But what about the " middle of the road " , e.g. an FPGA ?
How well could an FPGA allow for actual neuronal simulation with quasi-dedicated hardware , and at what cost ?
Sure , directly fabricated silicon will outperform software emulation on GP hardware .
But it 's not just a question of silicon vs software ...</tokentext>
<sentencetext>Sure, dedicated hardware will work faster than emulation in software.
But what about the "middle of the road", e.g. an FPGA?
How well could an FPGA allow for actual neuronal simulation with quasi-dedicated hardware, and at what cost?
Sure, directly fabricated silicon will outperform software emulation on GP hardware.
But it's not just a question of silicon vs software...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213734</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215626</id>
	<title>Think of the possibilities, though!</title>
	<author>Tetsujin</author>
	<datestamp>1259085900000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I think people are missing the obvious potential here.  I mean, if you could engineer a computer to accurately simulate a cat's brain, then you could implant that computer in a sexy gynoid body, and have a robot-girl with the mind of a cat!</p></htmltext>
<tokentext>I think people are missing the obvious potential here .
I mean , if you could engineer a computer to accurately simulate a cat 's brain , then you could implant that computer in a sexy gynoid body , and have a robot-girl with the mind of a cat !</tokentext>
<sentencetext>I think people are missing the obvious potential here.
I mean, if you could engineer a computer to accurately simulate a cat's brain, then you could implant that computer in a sexy gynoid body, and have a robot-girl with the mind of a cat!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30220330</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>Thaddeaus</author>
	<datestamp>1259064720000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I don't care.</htmltext>
<tokentext>I do n't care .</tokentext>
<sentencetext>I don't care.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214436</id>
	<title>a binary simulation</title>
	<author>Sterculius</author>
	<datestamp>1259081280000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I have simulated a Conservative Republican's brain with only two points:

1. Listen to Rush Limbaugh.
2. Mindlessly repeat what Rush says.</htmltext>
<tokentext>I have simulated a Conservative Republican 's brain with only two points :
1 . Listen to Rush Limbaugh .
2 . Mindlessly repeat what Rush says .</tokentext>
<sentencetext>I have simulated a Conservative Republican's brain with only two points:
1. Listen to Rush Limbaugh.
2. Mindlessly repeat what Rush says.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214048</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>Anonymous</author>
	<datestamp>1259079840000</datestamp>
	<modclass>Funny</modclass>
	<modscore>4</modscore>
	<htmltext>Okay.<br>
<br>
Give me food. <em>Now.</em></htmltext>
<tokentext>Okay .
Give me food .
Now .</tokentext>
<sentencetext>Okay.
Give me food.
Now.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30222364</id>
	<title>Re:All those neurons using less than 1 watt?</title>
	<author>electrons_are_brave</author>
	<datestamp>1259079240000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>When you said "humans only use 1-15% of their brain" I know you meant "at a time". But just in case someone thinks this is the old "unlock your potential - use the other 90% of your brain" myth, here's a simple explanation:

<a href="http://health.howstuffworks.com/10-brain-myths10.htm" title="howstuffworks.com">http://health.howstuffworks.com/10-brain-myths10.htm</a> [howstuffworks.com]</htmltext>
<tokentext>When you said " humans only use 1-15 % of their brain " I know you meant " at a time " .
But just in case someone thinks this is the old " unlock your potential - use the other 90 % of your brain " myth , here 's a simple explanation : http://health.howstuffworks.com/10-brain-myths10.htm [ howstuffworks.com ]</tokentext>
<sentencetext>When you said "humans only use 1-15% of their brain" I know you meant "at a time".
But just in case someone thinks this is the old "unlock your potential - use the other 90% of your brain" myth, here's a simple explanation:
http://health.howstuffworks.com/10-brain-myths10.htm [howstuffworks.com]</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30218230</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214516</id>
	<title>One question remains!</title>
	<author>Anonymous</author>
	<datestamp>1259081520000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Can one of those cats run linux?</htmltext>
<tokentext>Can one of those cats run linux ?</tokentext>
<sentencetext>Can one of those cats run linux?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213994</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>Anonymous</author>
	<datestamp>1259079540000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>1. Where's the food?<br>2. Hey, gimme fresh water too!<br>3. Empty my litter box you two-legged fool!<br>4. Don't bug me, I need to sleep.</p><p>Repeat four times a day.</p></htmltext>
<tokentext>1 . Where 's the food ?
2 . Hey , gimme fresh water too !
3 . Empty my litter box you two-legged fool !
4 . Do n't bug me , I need to sleep .
Repeat four times a day .</tokentext>
<sentencetext>1. Where's the food?
2. Hey, gimme fresh water too!
3. Empty my litter box you two-legged fool!
4. Don't bug me, I need to sleep.
Repeat four times a day.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214566</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>Quiet_Desperation</author>
	<datestamp>1259081700000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>Think about it like a cat.</p></div><p>I tried, but all it did was make me crave a cheeseburger.</p><p>Oh, and some vision about a cat up in the ceiling or something.</p></htmltext>
<tokentext>Think about it like a cat .
I tried , but all it did was make me crave a cheeseburger .
Oh , and some vision about a cat up in the ceiling or something .</tokentext>
<sentencetext>Think about it like a cat.
I tried, but all it did was make me crave a cheeseburger.
Oh, and some vision about a cat up in the ceiling or something.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30222748</id>
	<title>Re:Brain Power</title>
	<author>beguyld</author>
	<datestamp>1259083320000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>Damn, if only we could find such a great source of power!</p></div><p>Yeah, we could put all the humans in pods, tapping their power to create a huge virtual reality network. Knock knock, Neo...</p></htmltext>
<tokenext>Damn , if only we could find such a great source of power ! Yeah , we could put all the humans in pods , tapping their power to create a huge virtual reality network .
Knock knock , Neo.. .</tokentext>
<sentencetext>Damn, if only we could find such a great source of power!Yeah, we could put all the humans in pods, tapping their power to create a huge virtual reality network.
Knock knock, Neo...
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214096</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30218230</id>
	<title>Re:All those neurons using less than 1 watt?</title>
	<author>dontmakemethink</author>
	<datestamp>1259055360000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>4</modscore>
	<htmltext><p>Actually if you read TFA, the long-pondered question of why humans only use 1-15\% of their brain is largely a matter of power consumption, and the reason for the abundance of dormant neurons is for greater potential diversity of thought.</p><p> <i>"While accounting for just 2 percent of our body weight, the human brain devours 20 percent of the calories that we eat."</i> </p><p> <i>"The brain achieves optimal energy efficiency by firing no more than 1 to 15 percent&mdash;and often just 1 percent&mdash;of its neurons at a time."</i> </p><p>That seems to indicate that a human brain would burn more calories than the rest of the body if it were "always on".</p><p>Being a hypoglycemia sufferer, I can attest to the severe limitations of brain activity when deprived of sugar.  Before being diagnosed I underwent tunnel vision and black-outs, not to mention the typical mood swings, shakiness, cold sensations, etc.</p><p>Never has my nickname been more appropriate...</p></htmltext>
<tokenext>Actually if you read TFA , the long-pondered question of why humans only use 1-15 \ % of their brain is largely a matter of power consumption , and the reason for the abundance of dormant neurons is for greater potential diversity of thought .
" While accounting for just 2 percent of our body weight , the human brain devours 20 percent of the calories that we eat .
" " The brain achieves optimal energy efficiency by firing no more than 1 to 15 percent    and often just 1 percent    of its neurons at a time .
" That seems to indicate that a human brain would burn more calories than the rest of the body if it were " always on " .Being a hypoglycemia sufferer , I can attest to the severe limitations of brain activity when deprived of sugar .
Before being diagnosed I underwent tunnel vision and black-outs , not to mention the typical mood swings , shakiness , cold sensations , etc.Never has my nickname been more appropriate.. .</tokentext>
<sentencetext>Actually if you read TFA, the long-pondered question of why humans only use 1-15\% of their brain is largely a matter of power consumption, and the reason for the abundance of dormant neurons is for greater potential diversity of thought.
"While accounting for just 2 percent of our body weight, the human brain devours 20 percent of the calories that we eat.
"  "The brain achieves optimal energy efficiency by firing no more than 1 to 15 percent—and often just 1 percent—of its neurons at a time.
" That seems to indicate that a human brain would burn more calories than the rest of the body if it were "always on".Being a hypoglycemia sufferer, I can attest to the severe limitations of brain activity when deprived of sugar.
Before being diagnosed I underwent tunnel vision and black-outs, not to mention the typical mood swings, shakiness, cold sensations, etc.Never has my nickname been more appropriate...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213700</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215678</id>
	<title>Re:Markram's for real</title>
	<author>Rogerborg</author>
	<datestamp>1259086140000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>You make a compelling argument.  In fact, I won't even bother asking for citations, I'll just ask how I can send money to this scientific demigod.  Is cash OK?</htmltext>
<tokenext>You make a compelling argument .
In fact , I wo n't even bother asking for citations , I 'll just ask how I can send money to this scientific demigod .
Is cash OK ?</tokentext>
<sentencetext>You make a compelling argument.
In fact, I won't even bother asking for citations, I'll just ask how I can send money to this scientific demigod.
Is cash OK?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214088</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30221636</id>
	<title>Re:long ways to go yet</title>
	<author>Anonymous</author>
	<datestamp>1259073120000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>don't call it slow, that's bad manners, the politically correct term is mentally challenged cat simulation</p></htmltext>
<tokenext>do n't call it slow , that 's bad manners , the politically correct term is mentally challenged cat simulation</tokentext>
<sentencetext>don't call it slow, that's bad manners, the politically correct term is mentally challenged cat simulation</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215390</id>
	<title>Re:long ways to go yet</title>
	<author>Anonymous</author>
	<datestamp>1259085000000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I was there at the talk - and with a smattering of neuroscience in my background, I can say it was pretty disappointing.</p><p>It's true we know very little about emergent properties of large ANNs (or large anything, really), so can we learn something from this?  Perhaps.  But calling it a cat's brain simulation is like me pointing to a copper mine and saying I've built the Statue of Liberty.  Raw materials are needed, but so is structure.  Plus, you have to use the right raw materials - how important are ion channels in the simulations?  I don't know, but neither does Mohda.  The guys at the Blue Brain project are doing this correctly - study a physiological simulation, then work your work backwards through levels of abstraction.</p><p>Sadly, it's stuff like this that makes it difficult for wetlab neuroscientists to take the computational guys seriously.</p></htmltext>
<tokenext>I was there at the talk - and with a smattering of neuroscience in my background , I can say it was pretty disappointing.It 's true we know very little about emergent properties of large ANNs ( or large anything , really ) , so can we learn something from this ?
Perhaps. But calling it a cat 's brain simulation is like me pointing to a copper mine and saying I 've built the Statue of Liberty .
Raw materials are needed , but so is structure .
Plus , you have to use the right raw materials - how important are ion channels in the simulations ?
I do n't know , but neither does Mohda .
The guys at the Blue Brain project are doing this correctly - study a physiological simulation , then work your work backwards through levels of abstraction.Sadly , it 's stuff like this that makes it difficult for wetlab neuroscientists to take the computational guys seriously .</tokentext>
<sentencetext>I was there at the talk - and with a smattering of neuroscience in my background, I can say it was pretty disappointing.It's true we know very little about emergent properties of large ANNs (or large anything, really), so can we learn something from this?
Perhaps.  But calling it a cat's brain simulation is like me pointing to a copper mine and saying I've built the Statue of Liberty.
Raw materials are needed, but so is structure.
Plus, you have to use the right raw materials - how important are ion channels in the simulations?
I don't know, but neither does Mohda.
The guys at the Blue Brain project are doing this correctly - study a physiological simulation, then work your work backwards through levels of abstraction.Sadly, it's stuff like this that makes it difficult for wetlab neuroscientists to take the computational guys seriously.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213974</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30219490</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>Evil Pete</author>
	<datestamp>1259060820000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The delicious irony is that most people wont realise how true this is and why.</p></htmltext>
<tokenext>The delicious irony is that most people wont realise how true this is and why .</tokentext>
<sentencetext>The delicious irony is that most people wont realise how true this is and why.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213746</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214208</id>
	<title>Re:long ways to go yet</title>
	<author>Neil Hodges</author>
	<datestamp>1259080440000</datestamp>
	<modclass>Funny</modclass>
	<modscore>4</modscore>
	<htmltext><div class="quote"><p>...I know this because I do it for a living.</p></div><p>Don't each of our brains do this for a living, too?</p></htmltext>
<tokenext>...I know this because I do it for a living.Do n't each of our brains do this for a living , too ?</tokentext>
<sentencetext>...I know this because I do it for a living.Don't each of our brains do this for a living, too?
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213878</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213886</id>
	<title>argument from personal ignorance, but....</title>
	<author>debatem1</author>
	<datestamp>1259079060000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I don't really see how they would have verified that they were able to simulate a cat's brain. AFAIK, we don't have single-neuron level imaging, and the resolution on FMRI and EEG put those right out. Looking at macro level behavior would be pretty absurd- I too, can write a program that will decide to play with yarn. Unless there's something I'm missing, IBM seems to have made a claim it can't support.</htmltext>
<tokenext>I do n't really see how they would have verified that they were able to simulate a cat 's brain .
AFAIK , we do n't have single-neuron level imaging , and the resolution on FMRI and EEG put those right out .
Looking at macro level behavior would be pretty absurd- I too , can write a program that will decide to play with yarn .
Unless there 's something I 'm missing , IBM seems to have made a claim it ca n't support .</tokentext>
<sentencetext>I don't really see how they would have verified that they were able to simulate a cat's brain.
AFAIK, we don't have single-neuron level imaging, and the resolution on FMRI and EEG put those right out.
Looking at macro level behavior would be pretty absurd- I too, can write a program that will decide to play with yarn.
Unless there's something I'm missing, IBM seems to have made a claim it can't support.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214860</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>Critical Facilities</author>
	<datestamp>1259083020000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext>Insightful??<br> <br>Hmmmph!  My cat Phydeaux must have mod points again.</htmltext>
<tokenext>Insightful ? ?
Hmmmph ! My cat Phydeaux must have mod points again .</tokentext>
<sentencetext>Insightful??
Hmmmph!  My cat Phydeaux must have mod points again.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214164</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213936</id>
	<title>Skeptical?</title>
	<author>golden age villain</author>
	<datestamp>1259079300000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>5</modscore>
	<htmltext>This IBM announcement was just ridiculous. To cite only one argument, the brain does not consist only of neurons. It contains at least as many other cells which are also involved in signal processing. Mohda would be laughed at in any neuroscience conference and he certainly doesn't help the cause of theoreticians in the neuroscience field by making such stupid announcements. Eugene Izhikevich who designed the neuron model being used for these simulations had a PNAS paper not too long ago modeling the entire human brain and he did not claim that he successfully modeled the human brain. Plus no one has any clue how the brain computes really so making a claim about the formation of thoughts is just nonsense.</htmltext>
<tokenext>This IBM announcement was just ridiculous .
To cite only one argument , the brain does not consist only of neurons .
It contains at least as many other cells which are also involved in signal processing .
Mohda would be laughed at in any neuroscience conference and he certainly does n't help the cause of theoreticians in the neuroscience field by making such stupid announcements .
Eugene Izhikevich who designed the neuron model being used for these simulations had a PNAS paper not too long ago modeling the entire human brain and he did not claim that he successfully modeled the human brain .
Plus no one has any clue how the brain computes really so making a claim about the formation of thoughts is just nonsense .</tokentext>
<sentencetext>This IBM announcement was just ridiculous.
To cite only one argument, the brain does not consist only of neurons.
It contains at least as many other cells which are also involved in signal processing.
Mohda would be laughed at in any neuroscience conference and he certainly doesn't help the cause of theoreticians in the neuroscience field by making such stupid announcements.
Eugene Izhikevich who designed the neuron model being used for these simulations had a PNAS paper not too long ago modeling the entire human brain and he did not claim that he successfully modeled the human brain.
Plus no one has any clue how the brain computes really so making a claim about the formation of thoughts is just nonsense.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214434</id>
	<title>Re:argument from personal ignorance, but....</title>
	<author>vertinox</author>
	<datestamp>1259081280000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><i>AFAIK, we don't have single-neuron level imaging, and the resolution on FMRI and EEG put those right out.</i></p><p>Just so that you know... We can get higher resolutions on brains neurons by invasive means such as cutting the brain apart and looking at live cells slice by slice under a powerful microscope.</p><p>It is rather tedious and gruesome but it is a viable way to look at the neurons directly.</p><p>Its even been to done to humans after they have passed away, but animals you can sort of get away with doing it while the subject is still "hot". (Oh I am making this sound worse than it is)</p><p>As far as non-invasive resolutions, yes, so far we don't have individual neuron levels but doesn't mean we have other means.</p></htmltext>
<tokenext>AFAIK , we do n't have single-neuron level imaging , and the resolution on FMRI and EEG put those right out.Just so that you know... We can get higher resolutions on brains neurons by invasive means such as cutting the brain apart and looking at live cells slice by slice under a powerful microscope.It is rather tedious and gruesome but it is a viable way to look at the neurons directly.Its even been to done to humans after they have passed away , but animals you can sort of get away with doing it while the subject is still " hot " .
( Oh I am making this sound worse than it is ) As far as non-invasive resolutions , yes , so far we do n't have individual neuron levels but does n't mean we have other means .</tokentext>
<sentencetext>AFAIK, we don't have single-neuron level imaging, and the resolution on FMRI and EEG put those right out.Just so that you know... We can get higher resolutions on brains neurons by invasive means such as cutting the brain apart and looking at live cells slice by slice under a powerful microscope.It is rather tedious and gruesome but it is a viable way to look at the neurons directly.Its even been to done to humans after they have passed away, but animals you can sort of get away with doing it while the subject is still "hot".
(Oh I am making this sound worse than it is)As far as non-invasive resolutions, yes, so far we don't have individual neuron levels but doesn't mean we have other means.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213886</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214234</id>
	<title>Re:nonlinear</title>
	<author>Anonymous</author>
	<datestamp>1259080560000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>considering that I can't even find the quote for the second article linked, I'll remain skeptical of the whole thing. The article on that "low power" version doesn't say anything about low power, in fact it talks about wattage woes and concerns due to the requirements to make a "neural" processor equivalent.</p><p>Also of note is that they're doing the same idea as intel, just at a horrendously lower capability.  Basically a lack of information and whole lot of hype.</p></htmltext>
<tokenext>considering that I ca n't even find the quote for the second article linked , I 'll remain skeptical of the whole thing .
The article on that " low power " version does n't say anything about low power , in fact it talks about wattage woes and concerns due to the requirements to make a " neural " processor equivalent.Also of note is that they 're doing the same idea as intel , just at a horrendously lower capability .
Basically a lack of information and whole lot of hype .</tokentext>
<sentencetext>considering that I can't even find the quote for the second article linked, I'll remain skeptical of the whole thing.
The article on that "low power" version doesn't say anything about low power, in fact it talks about wattage woes and concerns due to the requirements to make a "neural" processor equivalent.Also of note is that they're doing the same idea as intel, just at a horrendously lower capability.
Basically a lack of information and whole lot of hype.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213696</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214046</id>
	<title>Not surprised, remember Deep Blue?</title>
	<author>NapalmScatterBrain</author>
	<datestamp>1259079780000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext>IBM has a known history of making overblown claims.  This is what happens when you let your PR mesh with your technical research.  Deep Blue was a giant PR stunt, and they had humans retooling the code in between matches.

What a crock.  When they get a robot that catches mice, purrs, and jumps on the table to eat my burger when I leave the room for 2 seconds, maybe then I'll believe it.</htmltext>
<tokenext>IBM has a known history of making overblown claims .
This is what happens when you let your PR mesh with your technical research .
Deep Blue was a giant PR stunt , and they had humans retooling the code in between matches .
What a crock .
When they get a robot that catches mice , purrs , and jumps on the table to eat my burger when I leave the room for 2 seconds , maybe then I 'll believe it .</tokentext>
<sentencetext>IBM has a known history of making overblown claims.
This is what happens when you let your PR mesh with your technical research.
Deep Blue was a giant PR stunt, and they had humans retooling the code in between matches.
What a crock.
When they get a robot that catches mice, purrs, and jumps on the table to eat my burger when I leave the room for 2 seconds, maybe then I'll believe it.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30218472</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>Anonymous</author>
	<datestamp>1259056380000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I will take over the world! Bwahahaha, my plan is coming to fruition. Just need to go over here and, and... oooh. Oooh, warm. So warm, getting sleepy. Now what was I thin....zzzz</p></htmltext>
<tokenext>I will take over the world !
Bwahahaha , my plan is coming to fruition .
Just need to go over here and , and... oooh. Oooh , warm .
So warm , getting sleepy .
Now what was I thin....zzzz</tokentext>
<sentencetext>I will take over the world!
Bwahahaha, my plan is coming to fruition.
Just need to go over here and, and... oooh. Oooh, warm.
So warm, getting sleepy.
Now what was I thin....zzzz</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214210</id>
	<title>Re:Brain Power</title>
	<author>Sponge Bath</author>
	<datestamp>1259080440000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Might want to start with simulating a dog brain to save power. That's what, maybe 5 neurons, 1000 synapses, and half a dog biscuit?</p></htmltext>
<tokenext>Might want to start with simulating a dog brain to save power .
That 's what , maybe 5 neurons , 1000 synapses , and half a dog biscuit ?</tokentext>
<sentencetext>Might want to start with simulating a dog brain to save power.
That's what, maybe 5 neurons, 1000 synapses, and half a dog biscuit?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215000</id>
	<title>Cat brain.</title>
	<author>fahrbot-bot</author>
	<datestamp>1259083440000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>It's hard to verify anything cause the machine just sits there and ignores everyone.</htmltext>
<tokenext>It 's hard to verify anything cause the machine just sits there and ignores everyone .</tokentext>
<sentencetext>It's hard to verify anything cause the machine just sits there and ignores everyone.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30225786</id>
	<title>Re:Brain Power</title>
	<author>holmstar</author>
	<datestamp>1257176640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>So less than my microwave?  That's pretty good in my opinion.</htmltext>
<tokenext>So less than my microwave ?
That 's pretty good in my opinion .</tokentext>
<sentencetext>So less than my microwave?
That's pretty good in my opinion.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214152</id>
	<title>Another project</title>
	<author>sznupi</author>
	<datestamp>1259080260000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><a href="http://intranet.cs.man.ac.uk/apt/projects/SpiNNaker/" title="man.ac.uk">http://intranet.cs.man.ac.uk/apt/projects/SpiNNaker/</a> [man.ac.uk]</p><p>It seems that for quite a lot of folks toying with topology and interconnects is a promising approach.</p></htmltext>
<tokenext>http : //intranet.cs.man.ac.uk/apt/projects/SpiNNaker/ [ man.ac.uk ] It seems that for quite a lot of folks toying with topology and interconnects is a promising approach .</tokentext>
<sentencetext>http://intranet.cs.man.ac.uk/apt/projects/SpiNNaker/ [man.ac.uk]It seems that for quite a lot of folks toying with topology and interconnects is a promising approach.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215320</id>
	<title>and you're a sockpuppet</title>
	<author>SuperBanana</author>
	<datestamp>1259084640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Seriously, you haven't posted in 4-5 years, and you jump out to post now?  Let me guess, you work in his lab...</htmltext>
<tokenext>Seriously , you have n't posted in 4-5 years , and you jump out to post now ?
Let me guess , you work in his lab.. .</tokentext>
<sentencetext>Seriously, you haven't posted in 4-5 years, and you jump out to post now?
Let me guess, you work in his lab...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214088</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30221336</id>
	<title>Re:long ways to go yet</title>
	<author>TapeCutter</author>
	<datestamp>1259070540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><i>"a true attempt to create such a simulation would need to factor in the stochasticity of ion channels, branchings in neurons and various other biological phenomena that have a tremendous impact on how our brains work."</i>
<br> <br>
If this research is connected to IBM's blue brain project then that is exactly what they have done with their simulation of a mouse neocortex. It is not simply an abstract ANN which any first year CS student should be able to knock up, it is a true simulation based on the brains physical and chemical properties.</htmltext>
<tokenext>" a true attempt to create such a simulation would need to factor in the stochasticity of ion channels , branchings in neurons and various other biological phenomena that have a tremendous impact on how our brains work .
" If this research is connected to IBM 's blue brain project then that is exactly what they have done with their simulation of a mouse neocortex .
It is not simply an abstract ANN which any first year CS student should be able to knock up , it is a true simulation based on the brains physical and chemical properties .</tokentext>
<sentencetext>"a true attempt to create such a simulation would need to factor in the stochasticity of ion channels, branchings in neurons and various other biological phenomena that have a tremendous impact on how our brains work.
"
 
If this research is connected to IBM's blue brain project then that is exactly what they have done with their simulation of a mouse neocortex.
It is not simply an abstract ANN which any first year CS student should be able to knock up, it is a true simulation based on the brains physical and chemical properties.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213974</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214166</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>Abstrackt</author>
	<datestamp>1259080260000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>All right, here goes...</p><p> <i>o hai</i><br>
<i>im in ur brain thinkin ur thots</i></p><p>No wonder my cats sleep all day...</p></htmltext>
<tokenext>All right , here goes... o hai im in ur brain thinkin ur thotsNo wonder my cats sleep all day.. .</tokentext>
<sentencetext>All right, here goes... o hai
im in ur brain thinkin ur thotsNo wonder my cats sleep all day...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213674</id>
	<title>I can haz</title>
	<author>Anonymous</author>
	<datestamp>1259078040000</datestamp>
	<modclass>None</modclass>
	<modscore>-1</modscore>
	<htmltext>simulated cheezburger?</htmltext>
<tokenext>simulated cheezburger ?</tokentext>
<sentencetext>simulated cheezburger?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30217156</id>
	<title>Re:Brain Power</title>
	<author>mhajicek</author>
	<datestamp>1259093340000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Imagine a Beowolf Cluster of Cat Brainz!!!</htmltext>
<tokenext>Imagine a Beowolf Cluster of Cat Brainz ! !
!</tokentext>
<sentencetext>Imagine a Beowolf Cluster of Cat Brainz!!
!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214242</id>
	<title>Has anybody claimed.....</title>
	<author>Anonymous</author>
	<datestamp>1259080560000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Has anybody claimed/verified that the cat in question is alive?</p></htmltext>
<tokenext>Has anybody claimed/verified that the cat in question is alive ?</tokentext>
<sentencetext>Has anybody claimed/verified that the cat in question is alive?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30218126</id>
	<title>Re:long ways to go yet</title>
	<author>Anonymous</author>
	<datestamp>1259054700000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><div class="quote"><p>So on one hand we have the IBM guys overstating what they've achieved, and on the other we have a <b>guy spouting out a view of the limits of ANNs without actually putting any effort into providing evidence for their limitations.</b> </p></div><p>Maybe you should read up a bit on the <a href="http://en.wikipedia.org/wiki/Blue\_Brain\_Project" title="wikipedia.org" rel="nofollow">Blue Brain Project and Henry Markram</a> [wikipedia.org] a little bit.</p></htmltext>
<tokenext>So on one hand we have the IBM guys overstating what they 've achieved , and on the other we have a guy spouting out a view of the limits of ANNs without actually putting any effort into providing evidence for their limitations .
Maybe you should read up a bit on the Blue Brain Project and Henry Markram [ wikipedia.org ] a little bit .</tokentext>
<sentencetext>So on one hand we have the IBM guys overstating what they've achieved, and on the other we have a guy spouting out a view of the limits of ANNs without actually putting any effort into providing evidence for their limitations.
Maybe you should read up a bit on the Blue Brain Project and Henry Markram [wikipedia.org] a little bit.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213696</id>
	<title>nonlinear</title>
	<author>Garble Snarky</author>
	<datestamp>1259078160000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>5</modscore>
	<htmltext>Wouldn't power consumption grow more than linearly with neuron count? I would think the number of connections is the dominant factor - so the comparison of two data points of power consumption vs neuron count is meaningless.</htmltext>
<tokenext>Would n't power consumption grow more than linearly with neuron count ?
I would think the number of connections is the dominant factor - so the comparison of two data points of power consumption vs neuron count is meaningless .</tokentext>
<sentencetext>Wouldn't power consumption grow more than linearly with neuron count?
I would think the number of connections is the dominant factor - so the comparison of two data points of power consumption vs neuron count is meaningless.</sentencetext>
</comment>
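Garble Snarky's objection, that two data points cannot pin down a scaling law, can be sketched numerically. The figures below are the ones quoted in the story summary; the two models are purely illustrative:

```python
# Two (neuron_count, watts) data points are fit exactly by a linear model
# AND by a quadratic one, so they cannot distinguish the two scaling laws.
# Figures are those quoted in the story summary; models are illustrative.

n1, p1 = 1_000_000, 1.0          # Neurogrid: ~1M neurons, under a watt
n2, p2 = 55_000_000, 320_000.0   # supercomputer simulation of 55M neurons

# Linear fit p = a*n + b through both points
a = (p2 - p1) / (n2 - n1)
b = p1 - a * n1

# Quadratic fit p = c*n**2 + d through the same two points
c = (p2 - p1) / (n2**2 - n1**2)
d = p1 - c * n1**2

for n, p in ((n1, p1), (n2, p2)):
    assert abs((a * n + b) - p) < 1e-3       # linear model matches both points
    assert abs((c * n**2 + d) - p) < 1e-3    # so does the quadratic one
```

Both models pass through both points exactly, which is the commenter's point: with only two measurements (on different hardware, no less), any scaling exponent fits.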
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30223084</id>
	<title>Re:long ways to go yet</title>
	<author>shentino</author>
	<datestamp>1259087640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Trying to simulate an analog system in a digital machine.</p></htmltext>
<tokenext>Trying to simulate an analog system in a digital machine .</tokentext>
<sentencetext>Trying to simulate an analog system in a digital machine.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30222698</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>Enigmafan</author>
	<datestamp>1259082780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p> <i>"If a lion could talk, we could not understand him."</i></p> </div><p>Really? Because most of the communication would be variations of "let's get that one!". Any other communication might be of the scented variant, which most cat owners already understand.</p>
	</htmltext>
<tokenext>" If a lion could talk , we could not understand him .
" Really ?
Because most of the communication would be variations of " let 's get that one ! " .
Any other communication might be of the scented variant , which most cat owners already understand .</tokentext>
<sentencetext> "If a lion could talk, we could not understand him.
" Really?
Because most of the communication would be variations of "let's get that one!".
Any other communication might be of the scented variant, which most cat owners already understand.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213746</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214872</id>
	<title>Skeptics!  Buuuuuurn them!</title>
	<author>Anonymous</author>
	<datestamp>1259083020000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>These cat-brain deniers are making me SICK!  idiots!</p><p>The debate about cat brain simulation is OVER!</p></htmltext>
<tokenext>These cat-brain deniers are making me SICK !
idiots ! The debate about cat brain simulation is OVER !</tokentext>
<sentencetext>These cat-brain deniers are making me SICK!
idiots!The debate about cat brain simulation is OVER!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214044</id>
	<title>RE: Ponze Scheme and Astrology</title>
	<author>Anonymous</author>
	<datestamp>1259079780000</datestamp>
	<modclass>Offtopic</modclass>
	<modscore>-1</modscore>
	<htmltext><p>Climate [sic] science is a new age astrology and Global Warming is its Ponze Scheme.</p></htmltext>
<tokenext>Climate [ sic ] science is a new age astrology and Global Warming is its Ponze Scheme .</tokentext>
<sentencetext>Climate [sic] science is a new age astrology and Global Warming is its Ponze Scheme.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30223658</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>PingPongBoy</author>
	<datestamp>1257153060000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><em>Give me food. Now.</em></p><p>Ok. You've got me going.</p><p>I recall a Garfield strip where he's standing on a table, holding Jon's shirt, and has his face right up against Jon's, and he says "Gimme food. Lots of it. And right now." This was in a classic era when Garfield was typically quite funny.</p><p>On a more serious note, given a machine that has the intelligence level of a cat (in consideration of problem-solving, memory, attention span, choice of action, conceptualization, etc.), one may easily ask: what if the machine is scaled up 10 or 100 times? Could it become able to understand some advanced abstractions like baseball or electromagnetism? Perhaps scaling will help, but scaling some parts up by 1000 or 100000 while scaling other parts by only 5 or 10 may be all that's required. Such partial scaling could be within reach.</p><p>So will we suddenly see a lot of competing brain simulations???</p></htmltext>
<tokenext>Give me food .
Now.Ok. You 've got me going.I recall Garfield strip where he 's standing on a table , holding Jon 's shirt , and has his face right up against Jon 's , and he says " Gim me food .
Lots of it .
And right now .
" This was in an classic era when Garfield was typically quite funny.On a more serious note , a machine that has the intelligence level of a cat ( in consideration of problem-solving , memory , attention span , choice of action , conceptualization , etc .
) , one may easily ask What if the machine is scaled up 10 or 100 times ?
Could it become able to understand some advanced abstractions like baseball or electromagnetism ?
Perhaps scaling will help , but scaling some parts up by 1000 or 100000 while other parts only by 5 or 10 may be all that 's required .
Such partial scaling could be within reach.So will we suddenly see a lot of competing brain simulations ? ?
?</tokentext>
<sentencetext>Give me food.
Now.Ok. You've got me going.I recall Garfield strip where he's standing on a table, holding Jon's shirt, and has his face right up against Jon's, and he says "Gimme food.
Lots of it.
And right now.
" This was in an classic era when Garfield was typically quite funny.On a more serious note, a machine that has the intelligence level of a cat (in consideration of problem-solving, memory, attention span, choice of action, conceptualization, etc.
), one may easily ask What if the machine is scaled up 10 or 100 times?
Could it become able to understand some advanced abstractions like baseball or electromagnetism?
Perhaps scaling will help, but scaling some parts up by 1000 or 100000 while other parts only by 5 or 10 may be all that's required.
Such partial scaling could be within reach.So will we suddenly see a lot of competing brain simulations??
?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214048</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213700</id>
	<title>All those neurons using less than 1 watt?</title>
	<author>drainbramage</author>
	<datestamp>1259078160000</datestamp>
	<modclass>Funny</modclass>
	<modscore>5</modscore>
	<htmltext><p>All those neurons using less than 1 watt?<br>I know some people like that.</p></htmltext>
<tokenext>All those neurons using less than 1 watt ? I know some people like that .</tokentext>
<sentencetext>All those neurons using less than 1 watt?I know some people like that.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214170</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>Anonymous</author>
	<datestamp>1259080320000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Spatulas. http://samandfuzzy.com/archive.php?id=32</p></htmltext>
<tokenext>Spatulas .
http : //samandfuzzy.com/archive.php ? id = 32</tokentext>
<sentencetext>Spatulas.
http://samandfuzzy.com/archive.php?id=32</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216748</id>
	<title>Free will</title>
	<author>AlpineR</author>
	<datestamp>1259091360000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>Digital computers are deterministic: Throw the same equation at them a thousand times and they will always spit out the same answer. Throw a question at the brain and it can produce a thousand different answers, canvassed from a chorus of quirky neurons. "The evidence is overwhelming that the brain computes with probability," Sejnowski says. Wishy-washy responses may make life easier in an uncertain world where we do not know which way an errant football will bounce, or whether a growling dog will lunge. Unpredictable neurons might cause us to take a wrong turn while walking home and discover a shortcut, or to spill acid on a pewter plate and during the cleanup to discover the process of etching.</p></div></blockquote><p>So God <em>does</em> play dice. And we call it "free will".</p>
	</htmltext>
<tokenext>Digital computers are deterministic : Throw the same equation at them a thousand times and they will always spit out the same answer .
Throw a question at the brain and it can produce a thousand different answers , canvassed from a chorus of quirky neurons .
" The evidence is overwhelming that the brain computes with probability , " Sejnowski says .
Wishy-washy responses may make life easier in an uncertain world where we do not know which way an errant football will bounce , or whether a growling dog will lunge .
Unpredictable neurons might cause us to take a wrong turn while walking home and discover a shortcut , or to spill acid on a pewter plate and during the cleanup to discover the process of etching.So God does play dice .
And we call it " free will " .</tokentext>
<sentencetext>Digital computers are deterministic: Throw the same equation at them a thousand times and they will always spit out the same answer.
Throw a question at the brain and it can produce a thousand different answers, canvassed from a chorus of quirky neurons.
"The evidence is overwhelming that the brain computes with probability," Sejnowski says.
Wishy-washy responses may make life easier in an uncertain world where we do not know which way an errant football will bounce, or whether a growling dog will lunge.
Unpredictable neurons might cause us to take a wrong turn while walking home and discover a shortcut, or to spill acid on a pewter plate and during the cleanup to discover the process of etching.So God does play dice.
And we call it "free will".
	</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215102</id>
	<title>Re:nonlinear</title>
	<author>pz</author>
	<datestamp>1259083800000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>Wouldn't power consumption grow more than linearly with neuron count? I would think the number of connections is the dominant factor - so the comparison of two data points of power consumption vs neuron count is meaningless.</p></div><p>Neurons are not typically fully connected in K-star-like networks; each is more usually connected to a fixed number of other neurons that varies by type from a small handful to 10,000. The latter number (10,000) is the figure typically used when researchers want to estimate the total number of connections in the cortex, especially when talking about simulations or writing grant proposals, where bigger numbers are more impressive.</p><p>So, power consumption should grow linearly with neuron count, if the simulation is following this particular lead from biology and the simulation writers didn't do something stupid to create an O(n^2) dependency.</p>
	</htmltext>
<tokenext>Would n't power consumption grow more than linearly with neuron count ?
I would think the number of connections is the dominant factor - so the comparison of two data points of power consumption vs neuron count is meaningless.Neurons are not typically fully connected in K-star like networks , they are more usually connected to a fixed number of other neurons that varies by type from a small handful to 10,000 .
The latter number ( 10,000 ) is used as when researchers and scientists want to estimate the total number of connections in the cortex , especially when talking about simulations or writing grant proposals where bigger numbers are more impressive.So , power consumption should grow linearly with neuron count , if the simulation is following this particular lead from biology , and the simulation writers did n't do something stupid to create an O ( n ^ 2 ) dependency .</tokentext>
<sentencetext>Wouldn't power consumption grow more than linearly with neuron count?
I would think the number of connections is the dominant factor - so the comparison of two data points of power consumption vs neuron count is meaningless.Neurons are not typically fully connected in K-star like networks, they are more usually connected to a fixed number of other neurons that varies by type from a small handful to 10,000.
The latter number (10,000) is used as when researchers and scientists want to estimate the total number of connections in the cortex, especially when talking about simulations or writing grant proposals where bigger numbers are more impressive.So, power consumption should grow linearly with neuron count, if the simulation is following this particular lead from biology, and the simulation writers didn't do something stupid to create an O(n^2) dependency.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213696</parent>
</comment>
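pz's fixed fan-out argument is easy to check back-of-the-envelope. The 10,000-connections figure is the one quoted in the comment; everything else below is illustrative:

```python
# With a fixed fan-out, total connection count -- the presumed power driver --
# grows linearly in neuron count; with all-to-all wiring it grows ~n^2.
# The 10,000 fan-out figure is the one quoted above; the rest is illustrative.

FANOUT = 10_000

def connections_fixed_fanout(n: int) -> int:
    """Each neuron connects to a fixed number of other neurons."""
    return n * FANOUT

def connections_all_to_all(n: int) -> int:
    """Every neuron connects to every other neuron (directed)."""
    return n * (n - 1)

# Doubling the network doubles connections under fixed fan-out...
assert connections_fixed_fanout(2_000_000) == 2 * connections_fixed_fanout(1_000_000)
# ...but roughly quadruples them under all-to-all wiring.
assert connections_all_to_all(2_000_000) > 3.9 * connections_all_to_all(1_000_000)
```

Under the fixed fan-out assumption, a simulation whose cost tracks connection count should indeed scale linearly with neuron count, as the comment argues.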
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213990</id>
	<title>Re:All those neurons using less than 1 watt?</title>
	<author>protodevilin</author>
	<datestamp>1259079540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I know some cats like that.</htmltext>
<tokenext>I know some cats like that .</tokentext>
<sentencetext>I know some cats like that.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213700</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215552</id>
	<title>Accuracy of the simulation can't be confirmed</title>
	<author>Locke2005</author>
	<datestamp>1259085600000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Until it can piss on my briefcase because it thinks I've been ignoring it, we have no way of confirming that it is actually simulating a real cat's brain.</htmltext>
<tokenext>Until it can piss on my briefcase because it thinks I 've been ignoring it we have no way of confirming that it is actually simulating a real cat 's brain .</tokentext>
<sentencetext>Until it can piss on my briefcase because it thinks I've been ignoring it we have no way of confirming that it is actually simulating a real cat's brain.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214106</id>
	<title>Re:long ways to go yet</title>
	<author>L4t3r4lu5</author>
	<datestamp>1259080080000</datestamp>
	<modclass>Funny</modclass>
	<modscore>3</modscore>
	<htmltext><div class="quote"><p>The simulation, which runs 100 times slower than an actual cat's brain, is more about watching how thoughts are formed in the brain...</p></div><p>What? I can already tell them that!<br> <br> <tt>IF $stomach_contents = 0 THEN ConsumeFood;<br>IF $claw_count &gt; 0 THEN ScratchShitOutOfFurniture;<br>IF $Sphincter_Tension &gt; 0 THEN PoopAnywhereYouWant;<br>IF $TimeSinceSleep &lt; 1800 THEN $TimeSinceSleep = $TimeSinceSleep + 1 ELSE YawnFishBreathInOwnersFaceAndFallAsleepOnComputerChair;</tt></p>
	</htmltext>
<tokenext>The simulation , which runs 100 times slower than an actual cat 's brain , is more about watching how thoughts are formed in the brain...What ?
I can already tell them that !
IF $ stomach \ _contents = 0 THEN ConsumeFood ; IF $ claw \ _count &gt; 0 THEN ScratchShitOutOfFurniture ; IF $ Sphincter \ _Tension &gt; 0 THEN PoopAnywhereYouWant ; IF $ TimeSinceSleep</tokentext>
<sentencetext>The simulation, which runs 100 times slower than an actual cat's brain, is more about watching how thoughts are formed in the brain...What?
I can already tell them that!
IF $stomach\_contents = 0 THEN ConsumeFood;IF $claw\_count &gt; 0 THEN ScratchShitOutOfFurniture;IF $Sphincter\_Tension &gt; 0 THEN PoopAnywhereYouWant;IF $TimeSinceSleep 
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214732</id>
	<title>Re:long ways to go yet</title>
	<author>RevWaldo</author>
	<datestamp>1259082540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>This could still be an accurate representation. Cats work in batch mode. They sleep 23 hours a day, during which they think about how they'll spend the hour they are awake. So if they're solely comparing the simulation's processing speed to how cats function in awake mode, it may actually be around four times slower in aggregate. Not too shabby.</htmltext>
<tokenext>This could still be an accurate representation .
Cats work in batch mode .
They sleep 23 hours a day , during which they think about how they 'll spend the hour they are awake .
So if they 're solely comparing the simulation 's processing speed to how cats function in awake mode , it may actually be around four times slower in aggregate .
Not too shabby .</tokentext>
<sentencetext>This could still be an accurate representation.
Cats work in batch mode.
They sleep 23 hours a day, during which they think about how they'll spend the hour they are awake.
So if they're solely comparing the simulation's processing speed to how cats function in awake mode, it may actually be around four times slower in aggregate.
Not too shabby.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214188</id>
	<title>Re:Brain Power</title>
	<author>hattig</author>
	<datestamp>1259080380000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>Their chip uses 340 transistors to model a neuron, and has 65536 neurons.</p><p>That means it has ~22m transistors for neurons, although there are certainly more transistors managing non-neuron aspects.</p><p>It looks like it was made on a 130nm - 250nm process for the die size.</p><p>Shrink that to 45nm once the technology is proven, and you'll have 8 to 32 times as many neurons in a single chip. That's 512Ki to 2Mi neurons per chip.</p><p>A chip makes up a neural cluster, and you use multiple chips to simulate multiple neural clusters, like a brain. They're using 16 chips at the moment for 1Mi neurons. They'll get to 64Mi neurons easily, and with more clusters, 1Bi doesn't seem out of the question in a few years.</p></htmltext>
<tokenext>Their chip uses 340 transistors to model a neuron , and has 65536 neurons.That means it has ~ 22m transistors for neurons , although there are certainly more transistors managing non-neuron aspects.It looks like it was made on a 130nm - 250nm process for the die size.Shrink that to 45nm once the technology is proven , and you 'll have 8 to 32 times as many neurons in a single chip .
That 's 512Ki to 2Mi neurons per chip.A chip makes up a neural cluster , and you use multiple chips to simulate multiple neural clusters , like a brain .
They 're using 16 chips at the moment for 1Mi neurons .
They 'll get to 64Mi neurons easily , and with more clusters , 1Bi does n't seem out of the question in a few years .</tokentext>
<sentencetext>Their chip uses 340 transistors to model a neuron, and has 65536 neurons.That means it has ~22m transistors for neurons, although there are certainly more transistors managing non-neuron aspects.It looks like it was made on a 130nm - 250nm process for the die size.Shrink that to 45nm once the technology is proven, and you'll have 8 to 32 times as many neurons in a single chip.
That's 512Ki to 2Mi neurons per chip.A chip makes up a neural cluster, and you use multiple chips to simulate multiple neural clusters, like a brain.
They're using 16 chips at the moment for 1Mi neurons.
They'll get to 64Mi neurons easily, and with more clusters, 1Bi doesn't seem out of the question in a few years.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710</parent>
</comment>
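hattig's transistor arithmetic can be verified in a few lines; every input figure below is taken from the comment itself:

```python
# Sanity check of the transistor-per-neuron arithmetic in the comment above;
# all input figures come from the comment itself.

transistors_per_neuron = 340
neurons_per_chip = 65_536

neuron_transistors = transistors_per_neuron * neurons_per_chip
assert neuron_transistors == 22_282_240        # the "~22m transistors" figure

# An 8x-32x density gain from a 45nm shrink would give:
assert neurons_per_chip * 8 == 512 * 1024      # 512Ki neurons per chip
assert neurons_per_chip * 32 == 2 * 1024**2    # 2Mi neurons per chip

# 16 chips at 65536 neurons each is the current array:
assert 16 * neurons_per_chip == 1024**2        # 1Mi neurons
```

All the quoted round numbers (512Ki, 2Mi, 1Mi) come out exact, which suggests the comment's figures were derived the same way.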
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748</id>
	<title>long ways to go yet</title>
	<author>Anonymous</author>
	<datestamp>1259078340000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>5</modscore>
	<htmltext><p>From the original FA: "The simulation, which runs 100 times slower than an actual cat's brain, is more about watching how thoughts are formed in the brain and how the roughly 1 billion neurons and 10 trillion synapses in a cat's brain work together."</p><p>So the most bad-ass computer simulation, assuming it worked, which this guy is saying it probably didn't, was still 100 times slower than a real cat's brain.  A real cat's brain also fits inside a tiny furry space the size of a baseball... and it runs on a once-daily small bowl of cat food.  We have a long ways to go.</p></htmltext>
<tokenext>From the original FA : " The simulation , which runs 100 times slower than an actual cat 's brain , is more about watching how thoughts are formed in the brain and how the roughly 1 billion neurons and 10 trillion synapses in a cat 's brain work together .
" So the most bad-ass computer simulation , assuming it worked , which this guy is saying it probably did n't , was still 100 times slower than a real cat 's brain .
A real cat 's brain also fits inside a tiny furry space the size of a baseball... and it runs on a once-daily small bowl of cat food .
We have a long ways to go .</tokentext>
<sentencetext>From the original FA: "The simulation, which runs 100 times slower than an actual cat's brain, is more about watching how thoughts are formed in the brain and how the roughly 1 billion neurons and 10 trillion synapses in a cat's brain work together.
"So the most bad-ass computer simulation, assuming it worked, which this guy is saying it probably didn't, was still 100 times slower than a real cat's brain.
A real cat's brain also fits inside a tiny furry space the size of a baseball... and it runs on a once-daily small bowl of cat food.
We have a long ways to go.</sentencetext>
</comment>
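The figures quoted from the article imply an average fan-out that matches the ~10,000-connections-per-neuron estimate mentioned elsewhere in the thread; a one-line consistency check:

```python
# Consistency check on the cat-brain figures quoted from the article:
neurons = 1_000_000_000            # "roughly 1 billion neurons"
synapses = 10_000_000_000_000      # "10 trillion synapses"

# Average synapses per neuron implied by those two figures:
assert synapses // neurons == 10_000
```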
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214164</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>jhoegl</author>
	<datestamp>1259080260000</datestamp>
	<modclass>Funny</modclass>
	<modscore>5</modscore>
	<htmltext>Hey... what's that moving dot on the wall? Why is it there? I must have it!
Great! I captured it! Wait, what's this? It escaped me, inconceivable!!! What luck, it stopped right by my paw, I'll capture it again!  NNNNOOOOOOO!!!!

Look, look there, it's something moving under my feet. I must pounce it to figure out what it is!  Weird, I pounced it and it's still moving.  I'll pounce it again!  Ah, there it stopped moving, I'll sniff it now. Wait, it's moving again... Curse you!</htmltext>
<tokenext>Hey... whats that moving dot on the wall ?
Why is it there ?
I must have it !
Great ! I captured it !
Wait , whats this ?
It escaped me , inconceivable ! ! !
What luck , it stopped right by my paw , Ill will capture it again !
NNNNOOOOOOO ! ! ! ! Look , look there , its something moving under my feet .
I must pounce it to figure out what it is !
Weird , I pounced it and its still moving .
Ill pounce it again !
Ah , there it stopped moving , Ill sniff it now .
Wait , its moving again... Curse you !</tokentext>
<sentencetext>Hey... whats that moving dot on the wall?
Why is it there?
I must have it!
Great! I captured it!
Wait, whats this?
It escaped me, inconceivable!!!
What luck, it stopped right by my paw, Ill will capture it again!
NNNNOOOOOOO!!!!

Look, look there, its something moving under my feet.
I must pounce it to figure out what it is!
Weird, I pounced it and its still moving.
Ill pounce it again!
Ah, there it stopped moving, Ill sniff it now.
Wait, its moving again... Curse you!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214140</id>
	<title>Re:long ways to go yet</title>
	<author>fran6gagne</author>
	<datestamp>1259080200000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>From the original FA: "The simulation, which runs 100 times slower than an actual cat's brain</p></div><p>Well, it is a retarded cat brain and you should know that people love retarded animals. On top of that, it doesn't need a litter! I expect to see this thing commercialised soon.</p>
	</htmltext>
<tokenext>From the original FA : " The simulation , which runs 100 times slower than an actual cat 's brainWell , it is a retarded cat brain and you should know that people love retarded animals .
On top of that , it does n't need a litter !
I expect to see this thing commercialised soon .</tokentext>
<sentencetext>From the original FA: "The simulation, which runs 100 times slower than an actual cat's brainWell, it is a retarded cat brain and you should know that people love retarded animals.
On top of that, it doesn't need a litter!
I expect to see this thing commercialised soon.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30221482</id>
	<title>How do you model something you don't understand?</title>
	<author>kklein</author>
	<datestamp>1259071620000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>Mohda would be laughed at in any neuroscience conference and he certainly doesn't help the cause of theoreticians in the neuroscience field by making such stupid announcements.</p></div><p>Yes. I am not a neuroscientist, but applied linguistics, at the technical end (where I live) does borrow from that field from time to time, and even with that relatively small understanding of the way the brain works, I immediately called "bullshit" on this crazy claim from IBM. We just don't really know how a brain works yet; how could we model it?

</p><p>In my own field, the same nonsense continually pops up from computer scientists blathering about machine translation, which, despite all the rosy claims, does not actually work. The reason is that the computer loads up on rules and objects (words and phrases), and then just links them up with another set of them from a different language. The computer doesn't "speak" either of these languages, and doesn't know what the words mean, and doesn't know what they connote (emotional response), and can't tell when it's spitting out nonsense. The problem is always approached as though languages were rule systems, when those rules are made up after the fact, and most psycholinguists believe that basically there is very little processing involved in language production--just a lot of spitting out crap we've heard before that is tied to specific feelings and concepts that we are trying to recreate in the other. Basically, we don't know how it "works," because it is an innately human activity, which is the product of the human brain and the evolutionary pressures of 2 million years plus. The computer can't do it right because it's not human.

</p><p>I'm not saying "never," because I believe, and hope, that we'll be able to crack it one day--at least to the point of being useful for something. But computer scientists <i>always</i> overstate their successes, because they don't know enough about the end point. No one does.

</p><p>I was almost-audibly cheering as I read TFA. Spot on.</p>
	</htmltext>
<tokenext>Mohda would be laughed at in any neuroscience conference and he certainly does n't help the cause of theoreticians in the neuroscience field by making such stupid announcements.Yes .
I am not a neuroscientist , but applied linguistics , at the technical end ( where I live ) does borrow from that field from time to time , and even with that relatively small understanding of the way the brain works , I immediately called " bullshit " on this crazy claim from IBM .
We just do n't really know how a brain works yet ; how could we model it ?
In my own field , the same nonsense continually pops up from computer scientists blathering about machine translation , which , despite all the rosy claims , does not actually work .
The reason is that the computer loads up on rules and objects ( words and phrases ) , and then just links them up with another set of them from a different language .
The computer does n't " speak " either of these languages , and does n't know what the words mean , and does n't know what they connote ( emotional response ) , and cant tell when it 's spitting out nonsense .
The problem is always approached as though languages were rule systems , when those rules are made up after the fact , and most psycholinguists believe that basically there is very little processing involved in language production--just a lot of spitting out crap we 've heard before that is tied to specific feelings and concepts that we are trying to recreate in the other .
Basically , we do n't know how it " works , " because it is an innately human activity , which is the product of the human brain and the evolutionary pressures of 2 million years plus .
The computer ca n't do it right because it 's not human .
I 'm not saying " never , " because I believe , and hope , that we 'll be able to crack it one day--at least to the point of being useful for something .
But computer scientists always overstate their successes , because they do n't know enough about the end point .
No one does .
I was almost-audibly cheering as I read TFA .
Spot on .</tokentext>
<sentencetext>Modha would be laughed at in any neuroscience conference and he certainly doesn't help the cause of theoreticians in the neuroscience field by making such stupid announcements.
Yes.
I am not a neuroscientist, but applied linguistics, at the technical end (where I live) does borrow from that field from time to time, and even with that relatively small understanding of the way the brain works, I immediately called "bullshit" on this crazy claim from IBM.
We just don't really know how a brain works yet; how could we model it?
In my own field, the same nonsense continually pops up from computer scientists blathering about machine translation, which, despite all the rosy claims, does not actually work.
The reason is that the computer loads up on rules and objects (words and phrases), and then just links them up with another set of them from a different language.
The computer doesn't "speak" either of these languages, and doesn't know what the words mean, and doesn't know what they connote (emotional response), and can't tell when it's spitting out nonsense.
The problem is always approached as though languages were rule systems, when those rules are made up after the fact, and most psycholinguists believe that basically there is very little processing involved in language production--just a lot of spitting out crap we've heard before that is tied to specific feelings and concepts that we are trying to recreate in the other.
Basically, we don't know how it "works," because it is an innately human activity, which is the product of the human brain and the evolutionary pressures of 2 million years plus.
The computer can't do it right because it's not human.
I'm not saying "never," because I believe, and hope, that we'll be able to crack it one day--at least to the point of being useful for something.
But computer scientists always overstate their successes, because they don't know enough about the end point.
No one does.
I was almost-audibly cheering as I read TFA.
Spot on.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213936</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215098</id>
	<title>Re:long ways to go yet</title>
	<author>ErikZ</author>
	<datestamp>1259083800000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>It sounds very interesting. Do you know of a good reference for those of us who don't have Masters in Biology or Comp-Sci?</p></htmltext>
<tokenext>It sounds very interesting .
Do you know of a good reference for those of us who do n't have Masters in Biology or Comp-Sci ?</tokentext>
<sentencetext>It sounds very interesting.
Do you know of a good reference for those of us who don't have Masters in Biology or Comp-Sci?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213878</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216488</id>
	<title>Re:The power of custom silicon</title>
	<author>John Whitley</author>
	<datestamp>1259089860000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>On that theme, it's easy to calculate some reasonable bounds, based off of actual cat metabolism.  Small cats, around 7 lbs., will require ~125 kcal/day to maintain body weight.  We can use that kcal/day value as a rough bound, which results in a mighty 6W.  For the whole cat.  Granted, that includes a lot of nap time, but it also includes all other metabolic functions.</p><p>Obviously, I have no trouble whatsoever believing that it's possible to do better than 320,000 W in simulating a cat brain.  Even padding for sleep, we've clearly got a long, long way to go.</p></htmltext>
<tokenext>On that theme , it 's easy to calculate some reasonable bounds , based off of actual cat metabolism .
Small cats , around 7 lbs. , will require ~ 125 kcal/day to maintain body weight .
We can use that kcal/day value as a rough bound , which results in a mighty 6W .
For the whole cat .
Granted , that includes a lot of nap time , but it also includes all other metabolic functions .
Obviously , I have no trouble whatsoever believing that it 's possible to do better than 320,000 W in simulating a cat brain .
Even padding for sleep , we 've clearly got a long , long way to go .</tokentext>
<sentencetext>On that theme, it's easy to calculate some reasonable bounds, based off of actual cat metabolism.
Small cats, around 7 lbs., will require ~125 kcal/day to maintain body weight.
We can use that kcal/day value as a rough bound, which results in a mighty 6W.
For the whole cat.
Granted, that includes a lot of nap time, but it also includes all other metabolic functions.
Obviously, I have no trouble whatsoever believing that it's possible to do better than 320,000 W in simulating a cat brain.
Even padding for sleep, we've clearly got a long, long way to go.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213734</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215232</id>
	<title>Re:long ways to go yet</title>
	<author>Rod Frey</author>
	<datestamp>1259084340000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>4</modscore>
	<htmltext><p>Isn't there value in moving to a higher level of abstraction than a single neuron though?  Or simplifying the basic elements for the sake of a tractable broader model?</p><p>Simulating a single atom, for example, is reasonably complex: it would be impossible with current computational resources to simulate the electromagnetic properties of a metal if we required accurate simulations of individual atoms.  Yet despite ignoring what we know about the atomic models, the higher-level models are very predictive.</p><p>Not that we have such predictive, higher-level models for the brain.  That's what some researchers are searching for: I'm just suggesting that those models hopefully won't require accurate simulation of individual neurons.  That seems to be the pattern in other domains.</p></htmltext>
<tokenext>Is n't there value in moving to a higher level of abstraction than a single neuron though ?
Or simplifying the basic elements for the sake of a tractable broader model ?
Simulating a single atom , for example , is reasonably complex : it would be impossible with current computational resources to simulate the electromagnetic properties of a metal if we required accurate simulations of individual atoms .
Yet despite ignoring what we know about the atomic models , the higher-level models are very predictive .
Not that we have such predictive , higher-level models for the brain .
That 's what some researchers are searching for : I 'm just suggesting that those models hopefully wo n't require accurate simulation of individual neurons .
That seems to be the pattern in other domains .</tokentext>
<sentencetext>Isn't there value in moving to a higher level of abstraction than a single neuron though?
Or simplifying the basic elements for the sake of a tractable broader model?
Simulating a single atom, for example, is reasonably complex: it would be impossible with current computational resources to simulate the electromagnetic properties of a metal if we required accurate simulations of individual atoms.
Yet despite ignoring what we know about the atomic models, the higher-level models are very predictive.
Not that we have such predictive, higher-level models for the brain.
That's what some researchers are searching for: I'm just suggesting that those models hopefully won't require accurate simulation of individual neurons.
That seems to be the pattern in other domains.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213878</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214114</id>
	<title>I CAN HAZ THINKINESS, THERE4 I IZ?</title>
	<author>itsdapead</author>
	<datestamp>1259080080000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>Think about it. Think about it like a cat.</p></div><p>In block-capital Papyrus on top of a humorous cat photo.</p></div>
	</htmltext>
<tokenext>Think about it .
Think about it like a cat .
In block-capital Papyrus on top of a humorous cat photo .
<sentencetext>Think about it.
Think about it like a cat.
In block-capital Papyrus on top of a humorous cat photo.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216314</id>
	<title>Re:long ways to go yet</title>
	<author>Anonymous</author>
	<datestamp>1259089200000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Good point. I'm sure Markram only spent 10 minutes of his life working, the length of time it probably took him to write that e-mail.</p></htmltext>
<tokenext>Good point .
I 'm sure Markram only spent 10 minutes of his life working , the length of time it probably took him to write that e-mail .</tokentext>
<sentencetext>Good point.
I'm sure Markram only spent 10 minutes of his life working, the length of time it probably took him to write that e-mail.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215246</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>Junior J. Junior III</author>
	<datestamp>1259084340000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>Think about it. Think about it like a cat.</p></div><p>I can haz brain simulation?</p></div>
	</htmltext>
<tokenext>Think about it .
Think about it like a cat .
I can haz brain simulation ?
<sentencetext>Think about it.
Think about it like a cat.
I can haz brain simulation?
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30217536</id>
	<title>Re:long ways to go yet</title>
	<author>IICV</author>
	<datestamp>1259095200000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I'm a zombie, you insensitive clod!</htmltext>
<tokenext>I 'm a zombie , you insensitive clod !</tokentext>
<sentencetext>I'm a zombie, you insensitive clod!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214208</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30221986</id>
	<title>Re:long ways to go yet</title>
	<author>4D6963</author>
	<datestamp>1259075820000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p> <i>we have a guy spouting out a view of the limits of ANNs without actually putting any effort into providing evidence for their limitations.</i> </p><p>The fact that in 50 years we got little better out of ANNs than unreliable methods of pattern matching should be reason enough. We have practically no reason to think that by putting much more than billions of those together something will magically pop out.

</p><p>Actually, it's always the same excuse, "we can't simulate enough yet". When will we have enough? Let me guess, one billion isn't enough for even an ant-like intelligence, so one trillion won't be good enough either? Yeah, time to realise what's obvious, the key isn't in how many you can stick together, it's all in figuring out how it's even supposed to work.</p></htmltext>
<tokenext>we have a guy spouting out a view of the limits of ANNs without actually putting any effort into providing evidence for their limitations .
The fact that in 50 years we got little better out of ANNs than unreliable methods of pattern matching should be reason enough .
We have practically no reason to think that by putting much more than billions of those together something will magically pop out .
Actually , it 's always the same excuse , " we ca n't simulate enough yet " .
When will we have enough ?
Let me guess , one billion is n't enough for even an ant-like intelligence , so one trillion wo n't be good enough either ?
Yeah , time to realise what 's obvious , the key is n't in how many you can stick together , it 's all in figuring out how it 's even supposed to work .</tokentext>
<sentencetext> we have a guy spouting out a view of the limits of ANNs without actually putting any effort into providing evidence for their limitations.
The fact that in 50 years we got little better out of ANNs than unreliable methods of pattern matching should be reason enough.
We have practically no reason to think that by putting much more than billions of those together something will magically pop out.
Actually, it's always the same excuse, "we can't simulate enough yet".
When will we have enough?
Let me guess, one billion isn't enough for even an ant-like intelligence, so one trillion won't be good enough either?
Yeah, time to realise what's obvious, the key isn't in how many you can stick together, it's all in figuring out how it's even supposed to work.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214136</id>
	<title>Re:long ways to go yet</title>
	<author>Anonymous</author>
	<datestamp>1259080200000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>There's definitely a difference between simulation and modeling.</p></htmltext>
<tokenext>There 's definitely a difference between simulation and modeling .</tokentext>
<sentencetext>There's definitely a difference between simulation and modeling.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213974</id>
	<title>Re:long ways to go yet</title>
	<author>Anonymous</author>
	<datestamp>1259079480000</datestamp>
	<modclass>Informative</modclass>
	<modscore>5</modscore>
	<htmltext>He's not arguing that it didn't work, he's arguing that they essentially ran a simulation of a large Artificial Neural Network, a relatively trivial task as long as you have a big enough computer behind it. ANNs are essentially points that connect to each other and learn by assigning weights to these various connections- this is essentially the simplest possible way to simulate the behavior of a neuron. The argument is being made that to claim an ANN, regardless of its size, approaches the capabilities of any mammalian brain is simply wrong, and that a true attempt to create such a simulation would need to factor in the stochasticity of ion channels, branchings in neurons and various other biological phenomena that have a tremendous impact on how our brains work.<br> <br>Without reading more details on the original work, I'm inclined to say that he has a very valid point if they were indeed only running a large ANN model.</htmltext>
<tokenext>He 's not arguing that it did n't work , he 's arguing that they essentially ran a simulation of a large Artificial Neural Network , a relatively trivial task as long as you have a big enough computer behind it .
ANNs are essentially points that connect to each other and learn by assigning weights to these various connections- this is essentially the simplest possible way to simulate the behavior of a neuron .
The argument is being made that to claim an ANN , regardless of its size , approaches the capabilities of any mammalian brain is simply wrong , and that a true attempt to create such a simulation would need to factor in the stochasticity of ion channels , branchings in neurons and various other biological phenomena that have a tremendous impact on how our brains work .
Without reading more details on the original work , I 'm inclined to say that he has a very valid point if they were indeed only running a large ANN model .</tokentext>
<sentencetext>He's not arguing that it didn't work, he's arguing that they essentially ran a simulation of a large Artificial Neural Network, a relatively trivial task as long as you have a big enough computer behind it.
ANNs are essentially points that connect to each other and learn by assigning weights to these various connections- this is essentially the simplest possible way to simulate the behavior of a neuron.
The argument is being made that to claim an ANN, regardless of its size, approaches the capabilities of any mammalian brain is simply wrong, and that a true attempt to create such a simulation would need to factor in the stochasticity of ion channels, branchings in neurons and various other biological phenomena that have a tremendous impact on how our brains work.
Without reading more details on the original work, I'm inclined to say that he has a very valid point if they were indeed only running a large ANN model.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30217330</id>
	<title>Re:and you're a sockpuppet</title>
	<author>Bell Would?</author>
	<datestamp>1259094240000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>Seriously, you haven't posted in 4-5 years, and you jump out to post now?  Let me guess, you work in his lab...</p></div><p>Thanks for the comment. In defense of my posting-latency, I prefer to know what I know before I post comments and quite frankly, just about every comment I plan to write on Slashdot is already written!  Today was a rare event.</p><p>I'm a physicist and author, writing a sci-fi novel that throws rocks at the concept of ever simulating a brain using a Von Neumann machine. Markram would not likely want me in his lab -- but he has my greatest respect.  He may end up proving my own theories, which, without funding I cannot pursue (few institutions fund *dis-provers*).</p><p>BTW I've had only one Slashdot account - ever<nobr> <wbr></nobr>:-)</p></div>
	</htmltext>
<tokenext>Seriously , you have n't posted in 4-5 years , and you jump out to post now ?
Let me guess , you work in his lab ...
Thanks for the comment .
In defense of my posting-latency , I prefer to know what I know before I post comments and quite frankly , just about every comment I plan to write on Slashdot is already written !
Today was a rare event .
I 'm a physicist and author , writing a sci-fi novel that throws rocks at the concept of ever simulating a brain using a Von Neumann machine .
Markram would not likely want me in his lab -- but he has my greatest respect .
He may end up proving my own theories , which , without funding I can not pursue ( few institutions fund * dis-provers * ) .
BTW I 've had only one Slashdot account - ever : - )</tokentext>
<sentencetext>Seriously, you haven't posted in 4-5 years, and you jump out to post now?
Let me guess, you work in his lab...
Thanks for the comment.
In defense of my posting-latency, I prefer to know what I know before I post comments and quite frankly, just about every comment I plan to write on Slashdot is already written!
Today was a rare event.
I'm a physicist and author, writing a sci-fi novel that throws rocks at the concept of ever simulating a brain using a Von Neumann machine.
Markram would not likely want me in his lab -- but he has my greatest respect.
He may end up proving my own theories, which, without funding I cannot pursue (few institutions fund *dis-provers*).
BTW I've had only one Slashdot account - ever :-)
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215320</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213746</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>marqs</author>
	<datestamp>1259078340000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext><i>"If a lion could talk, we could not understand him."</i>
<br>
Ludwig Wittgenstein - Tractatus Logico-Philosophicus</htmltext>
<tokenext>" If a lion could talk , we could not understand him .
" Ludwig Wittgenstein - Tractatus Logico-Philosophicus</tokentext>
<sentencetext>"If a lion could talk, we could not understand him.
"

Ludwig Wittgenstein - Tractatus Logico-Philosophicus</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214636</id>
	<title>Emo Philips</title>
	<author>Temujin\_12</author>
	<datestamp>1259082060000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>3</modscore>
	<htmltext><p>"I used to think that the brain was the most wonderful organ in my body. Then I realized who was telling me this."</p></htmltext>
<tokenext>" I used to think that the brain was the most wonderful organ in my body .
Then I realized who was telling me this .
"</tokentext>
<sentencetext>"I used to think that the brain was the most wonderful organ in my body.
Then I realized who was telling me this.
"</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216462</id>
	<title>IBM forgot a critical adjective</title>
	<author>russotto</author>
	<datestamp>1259089800000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>They did manage to simulate a cat brain... but they failed to mention it was a dead cat.</htmltext>
<tokenext>They did manage to simulate a cat brain... but they failed to mention it was a dead cat .</tokentext>
<sentencetext>They did manage to simulate a cat brain... but they failed to mention it was a dead cat.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216402</id>
	<title>Re:long ways to go yet</title>
	<author>Lundse</author>
	<datestamp>1259089560000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>...and on the other we have a guy spouting out a view of the limits of ANNs without actually putting any effort into providing evidence for their limitations.</p><p>I didn't see him railing against ANNs being capable of performing as good as, or beyond, a cat. Or a human. Or even performing in basically the same manner. What the FA said was that no simple ANN can be a simulation of a cats brain, because a cats brain is not a simple ANN.</p></div></div>
	</htmltext>
<tokenext>...and on the other we have a guy spouting out a view of the limits of ANNs without actually putting any effort into providing evidence for their limitations.I did n't see him railing against ANNs being capable of performing as good as , or beyond , a cat .
Or a human .
Or even performing in basically the same manner .
What the FA said was that no simple ANN can be a simulation of a cats brain , because a cats brain is not a simple ANN .</tokentext>
<sentencetext>...and on the other we have a guy spouting out a view of the limits of ANNs without actually putting any effort into providing evidence for their limitations.I didn't see him railing against ANNs being capable of performing as good as, or beyond, a cat.
Or a human.
Or even performing in basically the same manner.
What the FA said was that no simple ANN can be a simulation of a cats brain, because a cats brain is not a simple ANN.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213722</id>
	<title>Now we know the state of Schrodinger cat...</title>
	<author>KiwiCanuck</author>
	<datestamp>1259078220000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>so tuck your head between your legs and wait for the Universe to explode!   ~:-)</htmltext>
<tokenext>so tuck your head between your legs and wait for the Universe to explode !
~ : - )</tokentext>
<sentencetext>so tuck your head between your legs and wait for the Universe to explode!
~:-)</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214576</id>
	<title>Re:The power of custom silicon</title>
	<author>Anonymous</author>
	<datestamp>1259081760000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Except that they're using the properties of low-power silicon transistors (they misfire frequently) to model the properties of mushy neurons whilst saving energy. Read the article, it's actually quite interesting.</p></htmltext>
<tokenext>Except that they 're using the properties of low-power silicon transistors ( they misfire frequently ) to model the properties of mushy neurons whilst saving energy .
Read the article , it 's actually quite interesting .</tokentext>
<sentencetext>Except that they're using the properties of low-power silicon transistors (they misfire frequently) to model the properties of mushy neurons whilst saving energy.
Read the article, it's actually quite interesting.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213734</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214888</id>
	<title>Re:Skeptical?</title>
	<author>agnosticnixie</author>
	<datestamp>1259083080000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>There's also the number of assemblies we don't know about that have been disregarded early to be brought back on the table (name forgotten) as something that's a) important and b) much much much more powerful than we thought (Kurzweil likened them to RAM, which they are ostensibly not). Oh, and c) we have no real clue how they work because neuroscience isn't even there yet.</p></htmltext>
<tokenext>There 's also the number of assemblies we do n't know about that have been disregarded early to be brought back on the table ( name forgotten ) as something that 's a ) important and b ) much much much more powerful than we thought ( Kurzweil likened them to RAM , which they are ostensibly not ) .
Oh , and c ) we have no real clue how they work because neuroscience is n't even there yet .</tokentext>
<sentencetext>There's also the number of assemblies we don't know about that have been disregarded early to be brought back on the table (name forgotten) as something that's a) important and b) much much much more powerful than we thought (Kurzweil likened them to RAM, which they are ostensibly not).
Oh, and c) we have no real clue how they work because neuroscience isn't even there yet.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213936</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216370</id>
	<title>Re:long ways to go yet</title>
	<author>R2.0</author>
	<datestamp>1259089380000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>"I don't know why but the AI field has always been horrifically polarised, the kind of arguments you get in that field are just so immature it's beyond belief. You have people in the AI field following their viewpoint religiously, completely unwilling to consider the other viewpoint."</p><p>Replace "AI" with any field of human endeavor involving an irrational component and it's still true.  Anything where the answer to a fundamental question should be "I don't know" will compel some people to proclaim that they DO "know".</p><p>When we can answer the question "What is consciousness?", we'll make faster progress.</p></htmltext>
<tokenext>" I do n't know why but the AI field has always been horifically polarised , the kind of arguments you get in that field are just so immature it 's beyond belief .
You have people in the AI field following their viewpoint religiously , completely unwilling to consider the other viewpoint .
" Replace " AI " with any field of human endeavor involving an irrational component and it 's still true .
Anything where the answer to a fundamental question should be " I do n't know " will compel some people to proclaim that they DO " know " .
When we can answer the question " What is consciousness ? " , we 'll make faster progress .</tokentext>
<sentencetext>"I don't know why but the AI field has always been horifically polarised, the kind of arguments you get in that field are just so immature it's beyond belief.
You have people in the AI field following their viewpoint religiously, completely unwilling to consider the other viewpoint.
"Replace "AI" with any field of human endeavor involving an irrational component and it's still true.
Anything where the answer to a fundamental question should be "I don't know" will compel some people to proclaim that they DO "know".
When we can answer the question "What is consciousness?", we'll make faster progress.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216614</id>
	<title>Re:long ways to go yet</title>
	<author>Anonymous</author>
	<datestamp>1259090700000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>You must have missed the point of the article.</p><p>Our brains transmit crap randomly and with a high degree of noise.</p><p>This computer transmits crap randomly and with a high degree of noise.</p><p>Therefore, this computer is like our brain.</p></htmltext>
<tokenext>You must have missed the point of the article .
Our brains transmit crap randomly and with a high degree of noise .
This computer transmits crap randomly and with a high degree of noise .
Therefore , this computer is like our brain .</tokentext>
<sentencetext>You must have missed the point of the article.
Our brains transmit crap randomly and with a high degree of noise.
This computer transmits crap randomly and with a high degree of noise.
Therefore, this computer is like our brain.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213878</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</id>
	<title>Does anyone really know what a cat thinks?</title>
	<author>Anonymous</author>
	<datestamp>1259077920000</datestamp>
	<modclass>Funny</modclass>
	<modscore>5</modscore>
	<htmltext>Think about it. Think about it like a cat.</htmltext>
<tokenext>Think about it .
Think about it like a cat .</tokentext>
<sentencetext>Think about it.
Think about it like a cat.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216696</id>
	<title>Re:long ways to go yet</title>
	<author>LockeOnLogic</author>
	<datestamp>1259091120000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Suppose you were an idiot and a member of congress. But I repeat myself.</htmltext>
<tokenext>Suppose you were an idiot and a member of congress .
But I repeat myself .</tokentext>
<sentencetext>Suppose you were an idiot and a member of congress.
But I repeat myself.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214208</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213998</id>
	<title>Re:Brain Power</title>
	<author>L4t3r4lu5</author>
	<datestamp>1259079540000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext><div class="quote"><p>So with the Neurogrid chips, it will require at least a kilowatt to simulate.</p></div><p>So, a reduction of 319kW, then? That's pretty good.</p></htmltext>
<tokenext>So with the Neurogrid chips , it will require at least a kilowatt to simulate . So , a reduction of 319kW , then ?
That 's pretty good .</tokentext>
<sentencetext>So with the Neurogrid chips, it will require at least a kilowatt to simulate. So, a reduction of 319kW, then?
That's pretty good.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214972</id>
	<title>Ah, one of the greatest mysteries of Slashdot!</title>
	<author>Anonymous</author>
	<datestamp>1259083320000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>We now know what AC does for a living. Or at least that it has something to do with supercomputers and brain simulation, possibly biologics too... This certainly explains why he seems to be able to claim expertise in nearly any given subject!</p></htmltext>
<tokenext>We now know what AC does for a living .
Or at least that it has something to do with supercomputers and brain simulation , possibly biologics too... This certainly explains why he seems to be able to claim expertise in nearly any given subject !</tokentext>
<sentencetext>We now know what AC does for a living.
Or at least that it has something to do with supercomputers and brain simulation, possibly biologics too... This certainly explains why he seems to be able to claim expertise in nearly any given subject!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213878</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216286</id>
	<title>Ass-like Computing</title>
	<author>DynaSoar</author>
	<datestamp>1259089140000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>If pressed, Dr. Boahen himself would contradict the Discover article and say the chips were not "brain like" at all. He's working from the same place Karl Pribram worked from 50 years ago, and Karl still can't say he knows how the brain works. Simulating a process that's assumed to be a part of brain function because it can produce results more effectively and/or efficiently than brute-force digital computing does not make it "brain like". The comparison/contrast done on power consumption doesn't make a case for similarity to brain function either. It's even more misleading because the sole source of power for the brain is metabolism of glucose, with no consideration of Ohm's law and such to be taken into account.</p><p>Some of the most primitive neural networking devices were capable of learning to a depth and at a speed far outstripping the brain when the number of neurons/switches is taken into account. To make a chip more "brain like" is to negate that benefit. There are 7 billion human brains running around loose, and way too many cat brains if you ask the SPCA. We don't need to build more. At least not the hard way. What we need are devices that can outperform the brain in specialized functions. Mimicking a process in order to mimic a result does not produce an efficient process or even an efficient mimicking. Building in "noise" when one doesn't even understand what the noise is for (here taken in the signal processing sense: anything other than a defined signal is noise) doesn't improve the situation. Even more salient, building a digital simulation of an analog process introduces error that compounds the longer it runs. No, not even neurons "firing" are digital in nature. They require a voltage curve specific to the neuron type that in general follows a specific pattern of hyperpolarization and depolarization which is itself analog.
Furthermore, the voltage measured is a change over time either inside or outside the cell membrane; taken together the internal and external voltages balance as to the respective changes. Digital signals are one voltage throughout the channel.</p><p>The continual insistence on trying to satisfy the descriptor "brain like" when one doesn't have an accurate description of what the brain itself is like only makes Edsger Dijkstra's quote more relevant: "The question of whether a computer can think is no more interesting than the question of whether a submarine can swim."</p>
	</htmltext>
<tokenext>If pressed , Dr. Boahen himself would contradict the Discover article and say the chips were not " brain like " at all .
He 's working from the same place Karl Pribram worked from 50 years ago , and Karl still ca n't say he knows how the brain works .
Simulating a process that 's assumed to be a part of brain function because it can produce results more effectively and/or efficiently than brute-force digital computing does not make it " brain like " .
The comparison/contrast done on power consumption does n't make a case for similarity to brain function either .
It 's even more misleading because the sole source of power for the brain is metabolism of glucose , with no consideration of Ohm 's law and such to be taken into account . Some of the most primitive neural networking devices were capable of learning to a depth and at a speed far outstripping the brain when the number of neurons/switches is taken into account .
To make a chip more " brain like " is to negate that benefit .
There are 7 billion human brains running around loose , and way too many cat brains if you ask the SPCA .
We do n't need to build more .
At least not the hard way .
What we need are devices that can outperform the brain in specialized functions .
Mimicking a process in order to mimic a result does not produce an efficient process or even an efficient mimicking .
Building in " noise " when one does n't even understand what the noise is for ( here taken in the signal processing sense : anything other than a defined signal is noise ) does n't improve the situation .
Even more salient , building a digital simulation of an analog process introduces error that compounds the longer it runs .
No , not even neurons " firing " are digital in nature .
They require a voltage curve specific to the neuron type that in general follows a specific pattern of hyperpolarization and depolarization which is itself analog .
Furthermore , the voltage measured is a change over time either inside or outside the cell membrane ; taken together the internal and external voltages balance as to the respective changes .
Digital signals are one voltage throughout the channel . The continual insistence on trying to satisfy the descriptor " brain like " when one does n't have an accurate description of what the brain itself is like only makes Edsger Dijkstra 's quote more relevant : " The question of whether a computer can think is no more interesting than the question of whether a submarine can swim .
"</tokentext>
<sentencetext>If pressed, Dr. Boahen himself would contradict the Discover article and say the chips were not "brain like" at all.
He's working from the same place Karl Pribram worked from 50 years ago, and Karl still can't say he knows how the brain works.
Simulating a process that's assumed to be a part of brain function because it can produce results more effectively and/or efficiently than brute-force digital computing does not make it "brain like".
The comparison/contrast done on power consumption doesn't make a case for similarity to brain function either.
It's even more misleading because the sole source of power for the brain is metabolism of glucose, with no consideration of Ohm's law and such to be taken into account. Some of the most primitive neural networking devices were capable of learning to a depth and at a speed far outstripping the brain when the number of neurons/switches is taken into account.
To make a chip more "brain like" is to negate that benefit.
There are 7 billion human brains running around loose, and way too many cat brains if you ask the SPCA.
We don't need to build more.
At least not the hard way.
What we need are devices that can outperform the brain in specialized functions.
Mimicking a process in order to mimic a result does not produce an efficient process or even an efficient mimicking.
Building in "noise" when one doesn't even understand what the noise is for (here taken in the signal processing sense: anything other than a defined signal is noise) doesn't improve the situation.
Even more salient, building a digital simulation of an analog process introduces error that compounds the longer it runs.
No, not even neurons "firing" are digital in nature.
They require a voltage curve specific to the neuron type that in general follows a specific pattern of hyperpolarization and depolarization which is itself analog.
Furthermore, the voltage measured is a change over time either inside or outside the cell membrane; taken together the internal and external voltages balance as to the respective changes.
Digital signals are one voltage throughout the channel. The continual insistence on trying to satisfy the descriptor "brain like" when one doesn't have an accurate description of what the brain itself is like only makes Edsger Dijkstra's quote more relevant: "The question of whether a computer can think is no more interesting than the question of whether a submarine can swim.
"
	</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216730</id>
	<title>Re:long ways to go yet</title>
	<author>LockeOnLogic</author>
	<datestamp>1259091300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>There is no proof that suggests that it can either.</htmltext>
<tokenext>There is no proof that suggests that it can either .</tokentext>
<sentencetext>There is no proof that suggests that it can either.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710</id>
	<title>Brain Power</title>
	<author>Trevin</author>
	<datestamp>1259078220000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>2</modscore>
	<htmltext><p>The cat's brain is made up of 1 BILLION neurons and 10 trillion synapses.  So with the Neurogrid chips, it will require at least a kilowatt to simulate.</p></htmltext>
<tokenext>The cat 's brain is made up of 1 BILLION neurons and 10 trillion synapses .
So with the Neurogrid chips , it will require at least a kilowatt to simulate .</tokentext>
<sentencetext>The cat's brain is made up of 1 BILLION neurons and 10 trillion synapses.
So with the Neurogrid chips, it will require at least a kilowatt to simulate.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30217498</id>
	<title>Random topology?</title>
	<author>nokiator</author>
	<datestamp>1259095020000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Matching the neuron count and connection count of a cat brain is clearly not sufficient to simulate the functionality. Neurons in a mammal brain are not randomly connected. A great level of organization happens during the growth of the brain cells and connections starting from the embryonic stage. Much of the functionality is "hardwired" as a result of this organized growth process which has evolved over hundreds of millions of years, and for a higher-level mammal like a cat a lot of the functionality is wired (learned) during early "kittenhood". Without reverse engineering some of this "schematic diagram", I am not sure how useful it is to simulate a random set of neurons that are wired randomly unless the object is to create a high-grade white noise generator.</htmltext>
<tokenext>Matching the neuron count and connection count of a cat brain is clearly not sufficient to simulate the functionality .
Neurons in a mammal brain are not randomly connected .
A great level of organization happens during the growth of the brain cells and connections starting from the embryonic stage .
Much of the functionality is " hardwired " as a result of this organized growth process which has evolved over hundreds of millions of years , and for a higher-level mammal like a cat a lot of the functionality is wired ( learned ) during early " kittenhood " .
Without reverse engineering some of this " schematic diagram " , I am not sure how useful it is to simulate a random set of neurons that are wired randomly unless the object is to create a high-grade white noise generator .</tokentext>
<sentencetext>Matching the neuron count and connection count of a cat brain is clearly not sufficient to simulate the functionality.
Neurons in a mammal brain are not randomly connected.
A great level of organization happens during the growth of the brain cells and connections starting from the embryonic stage.
Much of the functionality is "hardwired" as a result of this organized growth process which has evolved over hundreds of millions of years, and for a higher-level mammal like a cat a lot of the functionality is wired (learned) during early "kittenhood".
Without reverse engineering some of this "schematic diagram", I am not sure how useful it is to simulate a random set of neurons that are wired randomly unless the object is to create a high-grade white noise generator.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214812</id>
	<title>Re:Brain Power</title>
	<author>tmosley</author>
	<datestamp>1259082900000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Or just one can of cat food.  Better get the good stuff, though, she's a bit finicky.</htmltext>
<tokenext>Or just one can of cat food .
Better get the good stuff , though , she 's a bit finicky .</tokentext>
<sentencetext>Or just one can of cat food.
Better get the good stuff, though, she's a bit finicky.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215894</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>Nathrael</author>
	<datestamp>1259087220000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I can haz cheezburger?</htmltext>
<tokenext>I can haz cheezburger ?</tokentext>
<sentencetext>I can haz cheezburger?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214102</id>
	<title>Re:Does anyone really know what a cat thinks?</title>
	<author>Sulphur</author>
	<datestamp>1259080020000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Your computer is chasing my mouse.</p></htmltext>
<tokenext>Your computer is chasing my mouse .</tokentext>
<sentencetext>Your computer is chasing my mouse.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214088</id>
	<title>Markram's for real</title>
	<author>Anonymous</author>
	<datestamp>1259079960000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>5</modscore>
	<htmltext><p>My research recently took me to some of Markram's work - the guy is brilliant and REALISTIC.  His research goals are simple and attainable and any claims of success he has are *well* within the real world.  He's incrementally worked his way up from a few neurons - the way a *real* scientist works; and to him, the simplest "brain simulation" of any sort is definitely possible, but far off in the future.</p></htmltext>
<tokenext>My research recently took me to some of Markram 's work - the guy is brilliant and REALISTIC .
His research goals are simple and attainable and any claims of success he has are * well * within the real world .
He 's incrementally worked his way up from a few neurons - the way a * real * scientist works ; and to him , the simplest " brain simulation " of any sort is definitely possible , but far off in the future .</tokentext>
<sentencetext>My research recently took me to some of Markram's work - the guy is brilliant and REALISTIC.
His research goals are simple and attainable and any claims of success he has are *well* within the real world.
He's incrementally worked his way up from a few neurons - the way a *real* scientist works; and to him, the simplest "brain simulation" of any sort is definitely possible, but far off in the future.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213916</id>
	<title>Brute force neurons...</title>
	<author>Anonymous</author>
	<datestamp>1259079180000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>4</modscore>
	<htmltext><p>So according to this guy's rant letter, the "cat-brain simulation" was nothing more than the simulation of an ANN with X neurons, with X equal to the average number of neurons in a cat.</p><p>However, it seems the /complexity/ of the simulated neurons is not remotely similar to that of the neurons of a real cat.</p><p>With that view, yes, it seems less of a breakthrough. The experiment reminds me of AI researchers who thought that we could get intelligent machines using a brute-force kind of approach; this by adding /enough/ knowledge-rules, /enough/ processing power, etc...</p></htmltext>
<tokenext>So according to this guy 's rant letter , the " cat-brain simulation " was nothing more than the simulation of an ANN with X neurons , with X equal to the average number of neurons in a cat . However , it seems the /complexity/ of the simulated neurons is not remotely similar to that of the neurons of a real cat . With that view , yes , it seems less of a breakthrough .
The experiment reminds me of AI researchers who thought that we could get intelligent machines using a brute-force kind of approach ; this by adding /enough/ knowledge-rules , /enough/ processing power , etc.. .</tokentext>
<sentencetext>So according to this guy's rant letter, the "cat-brain simulation" was nothing more than the simulation of an ANN with X neurons, with X equal to the average number of neurons in a cat. However, it seems the /complexity/ of the simulated neurons is not remotely similar to that of the neurons of a real cat. With that view, yes, it seems less of a breakthrough.
The experiment reminds me of AI researchers who thought that we could get intelligent machines using a brute-force kind of approach; this by adding /enough/ knowledge-rules, /enough/ processing power, etc...</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213856</id>
	<title>Simon's Cat</title>
	<author>Rik Sweeney</author>
	<datestamp>1259078880000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I bet they just based their simulation on <a href="http://www.youtube.com/simonscat#p/u/5/w0ffwDYo00Q" title="youtube.com">Simon's Cat</a> [youtube.com] which, to be honest, is a pretty accurate representation.</p></htmltext>
<tokenext>I bet they just based their simulation on Simon 's Cat [ youtube.com ] which , to be honest , is a pretty accurate representation .</tokentext>
<sentencetext>I bet they just based their simulation on Simon's Cat [youtube.com] which, to be honest, is a pretty accurate representation.</sentencetext>
</comment>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_42</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214166
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_33</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216696
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214208
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213878
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214140
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_26</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214210
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215098
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213878
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_34</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30217156
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_48</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30225786
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_18</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216314
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213974
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_53</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214732
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_55</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30218472
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_25</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30219490
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213746
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_31</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30217330
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215320
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214088
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216488
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213734
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_24</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215246
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_15</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214222
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_56</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215206
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30222748
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214096
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30222364
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30218230
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213700
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30221482
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213936
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_47</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30218126
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213974
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_50</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215894
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214102
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_46</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30218460
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213734
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_37</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216614
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213878
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_16</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30221986
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213974
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_40</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214566
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_23</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214234
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213696
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30220330
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_38</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216730
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213974
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215678
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214088
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_45</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216370
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213974
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_29</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30223658
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214048
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_35</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216402
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213974
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_28</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213998
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_19</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214812
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214972
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213878
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_32</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214888
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213936
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_27</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215390
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213974
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_51</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214188
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_17</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214170
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214576
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213734
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_22</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215232
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213878
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_49</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30221336
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213974
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_52</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213990
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213700
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_54</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214106
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_39</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30222698
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213746
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_30</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213994
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214136
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_44</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214434
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213886
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214100
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213700
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_21</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30223084
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213974
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30217262
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214114
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_20</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30217536
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214208
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213878
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215134
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214016
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_36</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214860
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214164
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_41</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215102
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213696
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_11_24_1428245_43</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30221636
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
</commentlist>
</thread>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214436
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213886
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214434
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213700
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213990
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30218230
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30222364
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214100
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.15</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214636
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213722
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213674
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214016
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215134
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213748
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214106
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213974
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30221336
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215390
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214492
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216370
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216730
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30221986
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216402
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216314
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30223084
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30218126
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30221636
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213878
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215098
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214208
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216696
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30217536
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216614
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214972
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215232
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214136
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214140
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214732
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214516
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213696
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215102
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214234
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214052
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213936
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30221482
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214888
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214088
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215678
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215320
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30217330
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.16</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213710
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214812
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214096
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30222748
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214210
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213998
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30217156
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214188
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30225786
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213916
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213660
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214102
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215206
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213994
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30220330
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214164
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214860
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213746
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30219490
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30222698
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214166
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214048
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30223658
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214566
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215246
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214170
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30215894
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214114
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30217262
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30218472
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214222
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_11_24_1428245.10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30213734
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30218460
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30214576
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_11_24_1428245.30216488
</commentlist>
</conversation>
