<article>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#article09_06_11_1228209</id>
	<title>Should Undergraduates Be Taught Fortran?</title>
	<author>CmdrTaco</author>
	<datestamp>1244724420000</datestamp>
	<htmltext><a href="http://www.walkingrandomly.com/" rel="nofollow">Mike Croucher</a> writes <i>"Despite the fact that it is over 40 years old, Fortran is still taught at many Universities to students of Physics, Chemistry, Engineering and more as their first ever formal introduction to programming. According to this article  <a href="http://www.walkingrandomly.com/?p=1397">that shouldn't be happening anymore</a>, since there are much better alternatives, such as Python, that would serve a physical science undergraduate much better. There may come a time in some researchers' lives where they need Fortran, but this time isn't in 'programming for chemists 101.'  What do people in the Slashdot community think?"</i></htmltext>
<tokentext>Mike Croucher writes " Despite the fact that it is over 40 years old , Fortran is still taught at many Universities to students of Physics , Chemistry , Engineering and more as their first ever formal introduction to programming .
According to this article that should n't be happening anymore , since there are much better alternatives , such as Python , that would serve a physical science undergraduate much better .
There may come a time in some researchers ' lives where they need Fortran , but this time is n't in 'programming for chemists 101 .
' What do people in the Slashdot community think ?
"</tokentext>
<sentencetext>Mike Croucher writes "Despite the fact that it is over 40 years old, Fortran is still taught at many Universities to students of Physics, Chemistry, Engineering and more as their first ever formal introduction to programming.
According to this article  that shouldn't be happening anymore, since there are much better alternatives, such as Python, that would serve a physical science undergraduate much better.
There may come a time in some researchers' lives where they need Fortran, but this time isn't in 'programming for chemists 101.
'  What do people in the Slashdot community think?
"</sentencetext>
</article>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292885</id>
	<title>Scared Straight!</title>
	<author>ewg</author>
	<datestamp>1244731860000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>They should be taught Fortran <a href="http://en.wikipedia.org/wiki/Scared_Straight!" title="wikipedia.org">by screaming lifers</a> [wikipedia.org] to save them from a life of static compilation.</htmltext>
<tokentext>They should be taught Fortran by screaming lifers [ wikipedia.org ] to save them from a life of static compilation .</tokentext>
<sentencetext>They should be taught Fortran by screaming lifers [wikipedia.org] to save them from a life of static compilation.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293833</id>
	<title>Re:PYTHON????</title>
	<author>m_maximus</author>
	<datestamp>1244735340000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>They relearn the lessons of CS the hard way over 10-20-30-40(?) years of experience.</p></div><p>Some of them still don't learn how to write decent code after 40 years.</p>
<p>I've said it before, but the danger posed by a programmer with a set of tools is only surpassed by an engineer who thinks they can write code.</p>
	</htmltext>
<tokentext>They relearn the lessons of CS the hard way over 10-20-30-40 ( ?
) years of experience . Some of them still do n't learn how to write decent code after 40 years .
I 've said it before , but the danger posed by a programmer with a set of tools is only surpassed by an engineer who thinks they can write code .</tokentext>
<sentencetext>They relearn the lessons of CS the hard way over 10-20-30-40(?
) years of experience. Some of them still don't learn how to write decent code after 40 years.
I've said it before, but the danger posed by a programmer with a set of tools is only surpassed by an engineer who thinks they can write code.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293329</id>
	<title>Re:PYTHON????</title>
	<author>TheRaven64</author>
	<datestamp>1244733360000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
<htmltext>I'm definitely not a Python fanboy - in my experience the only language in which people consistently write worse code than Python is VB - but the premise of the article is not too far wrong.  Most of the Fortran code these people write will be using external libraries for things like linear algebra, differential calculus, arbitrary-precision arithmetic, and so on.  Profiling the code, I'd be surprised if it spends even 10% of its time in their code, with the remaining 90% being spent in the library code.  If you go to a language that is 1/30th the speed of Fortran, you are only making the total 1/3rd slower.  <p>
Take a look, for example, at something like GNU Radio for how Python can be used where high performance is needed.  This project uses Python as a glue for joining together DSP modules written in a lower-level language.  Most users of the system only need to use the high-level interfaces.  Even though the Python code is very slow, the overall code is very fast because it spends most of its time in the DSP modules.</p></htmltext>
<tokentext>I 'm definitely not a Python fanboy - in my experience the only language where people consistently write worse code than Python is VB - but the premise of the article is not too far wrong .
Most of the Fortran code these people write will be using external libraries for things like linear algebra , differential calculus , arbitrary-precision arithmetic , and so on .
Profiling the code , I 'd be surprised if it spends even 10 % of its time in their code , with the remaining 90 % being spent in the library code .
If you go to a language that is 1/30th the speed of Fortran , you are only making the total 1/3rd slower .
Take a look , for example , at something like GNU Radio for how Python can be used where high performance is needed .
This project uses Python as a glue for joining together DSP modules written in a lower-level language .
Most users of the system only need to use the high-level interfaces .
Even though the Python is very slow , the overall code is very fast because it spends most of its time in the DSP modules .</tokentext>
<sentencetext>I'm definitely not a Python fanboy - in my experience the only language in which people consistently write worse code than Python is VB - but the premise of the article is not too far wrong.
Most of the Fortran code these people write will be using external libraries for things like linear algebra, differential calculus, arbitrary-precision arithmetic, and so on.
Profiling the code, I'd be surprised if it spends even 10% of its time in their code, with the remaining 90% being spent in the library code.
If you go to a language that is 1/30th the speed of Fortran, you are only making the total 1/3rd slower.
Take a look, for example, at something like GNU Radio for how Python can be used where high performance is needed.
This project uses Python as a glue for joining together DSP modules written in a lower-level language.
Most users of the system only need to use the high-level interfaces.
Even though the Python is very slow, the overall code is very fast because it spends most of its time in the DSP modules.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127</parent>
</comment>
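The speedup claim in the comment above is Amdahl's-law-style arithmetic; here is a small sketch of how the fractions combine. The 10%/90% split and the 30x factor are the commenter's illustrative figures, not measurements:

```python
def total_runtime(own_frac, slowdown):
    """Relative total runtime when only the user-written fraction of a
    program slows down and the library fraction keeps native speed."""
    return (1 - own_frac) + own_frac * slowdown

# 10% of time in user code, interpreter 30x slower than Fortran:
print(round(total_runtime(0.10, 30), 3))  # about 4x the original runtime
# 1% of time in user code:
print(round(total_runtime(0.01, 30), 3))  # roughly "a third slower"
```

The overall slowdown is very sensitive to the assumed fraction of time spent outside the libraries, which is why profiling before choosing a language matters.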
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292125</id>
	<title>Re:While there may be "newer" languages</title>
	<author>Falstius</author>
	<datestamp>1244729280000</datestamp>
	<modclass>Informative</modclass>
	<modscore>4</modscore>
	<htmltext>Fortran has tons of libraries specialized to whatever scientific field you are working in, and is unavoidable in high energy physics especially.  Of course, most of these can be wrapped in C and then used in whatever high level language you like.</htmltext>
<tokentext>Fortran has tons of libraries specialized to whatever scientific field you are working in , and is unavoidable in high energy physics especially .
Of course , most of these can be wrapped in C and then used in whatever high level language you like .</tokentext>
<sentencetext>Fortran has tons of libraries specialized to whatever scientific field you are working in, and is unavoidable in high energy physics especially.
Of course, most of these can be wrapped in C and then used in whatever high level language you like.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025</parent>
</comment>
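The "wrap it and call it from a high-level language" pattern this comment describes can be sketched with Python's ctypes. Here the C math library stands in for a Fortran routine compiled into a shared object; the library name and lookup are illustrative, platform-dependent details, not from the post:

```python
import ctypes
import ctypes.util

# Load a compiled shared library; a Fortran routine built with
# `gfortran -shared` would be loaded the same way.
libm = ctypes.CDLL(ctypes.util.find_library("m") or "libm.so.6")

# Declare the C signature so ctypes converts arguments correctly.
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]

print(libm.cos(0.0))  # 1.0
```

For Fortran specifically, tools such as NumPy's f2py generate wrappers like this automatically, so the numerical kernels stay in Fortran while the driver code lives in Python.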
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297325</id>
	<title>Re:It's okay to teach them FORTRAN</title>
	<author>SuperTechnoNerd</author>
	<datestamp>1244747580000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Learning Fortran and Basic, I believe, makes it harder to learn object oriented languages like C++ and Java.
I have also seen first hand what a 'real' programmer does when he sees GOTOs in code. Remember the Exorcist? Head twisting, pea soup, the works... :)
However, what is more interesting: I grepped through the Linux kernel sources, and wow - just count the GOTOs.</htmltext>
<tokentext>Learning Fortran and Basic , I believe , makes it harder to learn object oriented languages like C + + and Java .
I have also seen first hand what a 'real ' programmer does when he sees GOTOs in code . Remember the Exorcist ?
Head twisting , pea soup , the works... : ) However , what is more interesting , I grepped through the Linux kernel sources , wow - just count the GOTOs .</tokentext>
<sentencetext>Learning Fortran and Basic, I believe, makes it harder to learn object oriented languages like C++ and Java.
I have also seen first hand what a 'real' programmer does when he sees GOTOs in code. Remember the Exorcist?
Head twisting, pea soup, the works... :)
However, what is more interesting: I grepped through the Linux kernel sources, and wow - just count the GOTOs.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294815</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292331</id>
	<title>Re:Not so easy</title>
	<author>teg</author>
	<datestamp>1244730060000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>4</modscore>
	<htmltext><p>
Fortran hasn't had those limitations for decades - Fortran 90 and later are ideal languages for expressing mathematical algorithms and crunching numbers. The handling of arrays and matrices is just what it should be.</p><p>
I wouldn't use Fortran as a general purpose language - having used Python for more than 10 years I shudder at using Fortran for string handling, databases, user interfaces and more - but as a tool for expressing math it's the best, and also the most widely used. The alternative would be Matlab (much of the syntax isn't that different).</p></htmltext>
<tokentext>Fortran has n't had those limitations for decades - Fortran 90 and later are ideal languages for expressing mathematical algorithms and crunching numbers .
The handling of arrays and matrices is just what it should be .
I would n't use Fortran as a general purpose language - having used Python for more than 10 years I shudder at using Fortran for string handling , databases , user interfaces and more - but as a tool for expressing math it 's the best , and also the most widely used .
The alternative would be Matlab ( much of the syntax is n't that different ) .</tokentext>
<sentencetext>Fortran hasn't had those limitations for decades - Fortran 90 and later are ideal languages for expressing mathematical algorithms and crunching numbers.
The handling of arrays and matrices is just what it should be.
I wouldn't use Fortran as a general purpose language - having used Python for more than 10 years I shudder at using Fortran for string handling, databases, user interfaces and more - but as a tool for expressing math it's the best, and also the most widely used.
The alternative would be Matlab (much of the syntax isn't that different).</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292045</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28307977</id>
	<title>Re:While there may be "newer" languages</title>
	<author>Anonymous</author>
	<datestamp>1244820660000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p><i>To make a long story short; solving Ax=b by calculating x=inv(A)*b is a terrible idea because calculating inv(A) is an inherently difficult thing. While it would be extremely useful to have inv(A), it's not strictly necessary to obtain it in order to solve Ax=b.</i></p><p>And who cares?  What difference does it make whether it's "difficult"?  Your computer wastes lots of time on lots of different things already.  If it's fast enough and works, just use it.</p><p>In any case, Python has multiple ways of solving linear systems directly; it just doesn't have the "\" syntax (yet).</p><p>(Furthermore, just because the syntax is inv(A)*b doesn't mean that that's the actual calculation.  Good matrix packages need to bundle up a whole bunch of expressions and then rearrange them for the best way of executing them.)</p></htmltext>
<tokentext>To make a long story short ; solving Ax = b by calculating x = inv ( A ) * b is a terrible idea because calculating inv ( A ) is an inherently difficult thing .
While it would be extremely useful to have inv ( A ) , it 's not strictly necessary to obtain it in order to solve Ax = b . And who cares ?
What difference does it make whether it 's " difficult " ?
Your computer wastes lots of time on lots of different things already .
If it 's fast enough and works , just use it . In any case , Python has multiple ways of solving linear systems directly ; it just does n't have the " \ " syntax ( yet ) .
( Furthermore , just because the syntax is inv ( A ) * b does n't mean that that 's the actual calculation .
Good matrix packages need to bundle up a whole bunch of expressions and then rearrange them for the best way of executing them .
)</tokentext>
<sentencetext>To make a long story short; solving Ax=b by calculating x=inv(A)*b is a terrible idea because calculating inv(A) is an inherently difficult thing.
While it would be extremely useful to have inv(A), it's not strictly necessary to obtain it in order to solve Ax=b. And who cares?
What difference does it make whether it's "difficult"?
Your computer wastes lots of time on lots of different things already.
If it's fast enough and works, just use it. In any case, Python has multiple ways of solving linear systems directly; it just doesn't have the "\" syntax (yet).
(Furthermore, just because the syntax is inv(A)*b doesn't mean that that's the actual calculation.
Good matrix packages need to bundle up a whole bunch of expressions and then rearrange them for the best way of executing them.
)</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293759</parent>
</comment>
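The point that inv(A) is never needed to solve Ax=b can be illustrated with a toy direct solver; real code would call a library routine such as numpy.linalg.solve. The 2x2 system below is made up for illustration:

```python
def solve2(a, b, c, d, e, f):
    """Solve [[a, b], [c, d]] @ (x, y) = (e, f) by Gaussian
    elimination with partial pivoting -- no inverse is ever formed."""
    if abs(a) < abs(c):          # pivot: keep the larger leading entry
        a, b, e, c, d, f = c, d, f, a, b, e
    m = c / a                    # multiplier that zeroes the (2,1) entry
    d2, f2 = d - m * b, f - m * e
    y = f2 / d2                  # back-substitute for y, then x
    x = (e - b * y) / a
    return x, y

# 2x + y = 3 and x + 3y = 4  =>  x = 1, y = 1
print(solve2(2.0, 1.0, 1.0, 3.0, 3.0, 4.0))  # (1.0, 1.0)
```

Elimination costs about a third as many floating-point operations as forming the inverse and is typically better behaved numerically, which is the standard reason library solvers avoid inv(A).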
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296619</id>
	<title>Re:While there may be "newer" languages</title>
	<author>drinkypoo</author>
	<datestamp>1244745240000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>great, maybe next they can learn perl so they can analyze their output, and shell scripting to handle their files, and then they can learn java to make applets to put their results on the web, and then...</p><p>WTF, are they getting a programming degree? Teach them Fortran so that they can get their math done. After that they can probably learn python on the job.</p></htmltext>
<tokentext>great , maybe next they can learn perl so they can analyze their output , and shell scripting to handle their files , and then they can learn java to make applets to put their results on the web , and then... WTF , are they getting a programming degree ?
Teach them Fortran so that they can get their math done .
After that they can probably learn python on the job .</tokentext>
<sentencetext>great, maybe next they can learn perl so they can analyze their output, and shell scripting to handle their files, and then they can learn java to make applets to put their results on the web, and then...WTF, are they getting a programming degree?
Teach them Fortran so that they can get their math done.
After that they can probably learn python on the job.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292251</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28307355</id>
	<title>Fortran is the right choice.</title>
	<author>Anonymous</author>
	<datestamp>1244818020000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Fortran is entirely reasonable here.  For numerical algorithms Fortran is still king.  C is the alternative, not Python.  You don't write numerical code in Python.  Either C or Fortran is the sensible choice.  Fortran has an edge in general because of the sheer size of the existing code base.  However, it is also true that Fortran is easier to optimize than C (strict aliasing is the default there).  Fortran has lots of optimization constructs that make it a decent choice here.</p><p>If you've ever read "real" numerical code... it's awful.  If it were in Python it would still look awful.  It would just be slower and differently ugly.</p><p>Optimizing C and Fortran is easy (even if it can take longer).. optimizing most HLLs gets hard fast.</p><p>So, no.... Fortran or C is 100% right here.</p><p>If they were CS majors (and not taking a course on optimization / computational science)... Fortran would be the wrong choice, of course.</p></htmltext>
<tokentext>Fortran is entirely reasonable here .
For numerical algorithms Fortran is still king .
C is the alternative , not Python .
You do n't write numerical code in Python .
Either C or Fortran is the sensible choice .
Fortran has an edge in general because of the sheer size of the existing code base .
However , it is also true that Fortran is easier to optimize than C ( strict aliasing is the default there ) .
Fortran has lots of optimization constructs that make it a decent choice here . If you 've ever read " real " numerical code ... it 's awful .
If it were in Python it would still look awful .
It would just be slower and differently ugly . Optimizing C and Fortran is easy ( even if it can take longer ) .. optimizing most HLLs gets hard fast . So , no .... Fortran or C is 100 % right here . If they were CS majors ( and not taking a course on optimization / computational science ) ... Fortran would be the wrong choice , of course .</tokentext>
<sentencetext>Fortran is entirely reasonable here.
For numerical algorithms Fortran is still king.
C is the alternative, not Python.
You don't write numerical code in Python.
Either C or Fortran is the sensible choice.
Fortran has an edge in general because of the sheer size of the existing code base.
However, it is also true that Fortran is easier to optimize than C (strict aliasing is the default there).
Fortran has lots of optimization constructs that make it a decent choice here. If you've ever read "real" numerical code... it's awful.
If it were in Python it would still look awful.
It would just be slower and differently ugly. Optimizing C and Fortran is easy (even if it can take longer).. optimizing most HLLs gets hard fast. So, no.... Fortran or C is 100% right here. If they were CS majors (and not taking a course on optimization / computational science)... Fortran would be the wrong choice, of course.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296543</id>
	<title>FORTRAN for Undergrads</title>
	<author>Anonymous</author>
	<datestamp>1244745000000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I vote NO.</p><p>Sure, there's a heck of a lot of working legacy code out there (e.g. NAG, et al.).  There are also a lot of clay tablets lying around with pre-written cuneiform on them.  There's a difference between "re-inventing the wheel" and switching from a wooden-axle-through-a-roundish-stone to a truck with independent suspension and well balanced wheel bearings.</p><p>I spent several years working as my research group's primary computer "guru" -- including having to port old code that had been written to run on an outdated UNIX box (in FORTRAN) to run on a newer computer system.  First, I ported from the old FORTRAN 77 to FORTRAN 95.  Worked fine -- only took me a couple of hours.  Later, as requests for improved functionality started coming in, I compiled the old FORTRAN code into a .dll, and wrote the calls in C++.  Still fast, accurate, and all -- from a user's perspective -- but UGLY from a programming/maintenance perspective.  Eventually, I spent the better part of an evening re-writing the core functionality in C++ (with - I'll admit - some embedded ASM), and got a result that ran on a 486 box just as fast, and just as reliably, as the original FORTRAN code on its dedicated workstation...  and it was a *heck* of a lot easier to maintain.  After that, I just left the old FORTRAN compiler alone. [For the curious, the code in question modeled the physical response of a visco-elastic microstructural composite material to applied stress/strain -- not an entirely trivial application -- and involved a fair amount of tensor calculus]</p><p>I know that FORTRAN has come a long way since the old days -- but it's still a beast.  Anything that you can do in FORTRAN, you can also do in C/C++ -- but the same is not true the other way around.  Pointers/dynamic memory access/allocation.  Graphics.  Hardware interfaces/communication.  Cross-platform compatibility.</p></htmltext>
<tokentext>I vote NO . Sure , there 's a heck of a lot of working legacy code out there ( e.g .
NAG , et al. ) .
There are also a lot of clay tablets lying around with pre-written cuneiform on them .
There 's a difference between " re-inventing the wheel " and switching from a wooden-axle-through-a-roundish-stone to a truck with independent suspension , and well balanced wheel bearings . I spent several years working as my research group 's primary computer " guru " -- including having to port old code that had been written to run on an outdated UNIX box ( in FORTRAN ) to run on a newer computer system .
1st , I ported from the old FORTRAN 77 to FORTRAN 95 .
Worked fine -- only took me a couple of hours .
Later , as requests for improved functionality started coming in , I compiled the old FORTRAN code into a .dll , and wrote the calls in C + + .
Still fast , accurate , and all -- from a user 's perspective -- but UGLY from a programming/maintenance perspective .
Eventually , I spent the better part of an evening re-writing the core functionality in C + + ( with - I 'll admit - some embedded ASM ) , and got a result that ran fine on a 486 box just as fast , and just as reliably as the original FORTRAN code on its dedicated workstation... and it was a * heck * of a lot easier to maintain .
After that , I just left the old FORTRAN compiler alone .
[ For the curious , the code in question modeled the physical response of a visco-elastic microstructural composite material to applied stress/strain -- not an entirely trivial application -- and involved a fair amount of tensor calculus ] I know that FORTRAN has come a long way since the old days -- but it 's still a beast .
Anything that you can do in FORTRAN , you can also do in C/C + + -- but the same is not true the other way around .
Pointers/dynamic memory access/allocation .
Graphics. Hardware interfaces/communication .
Cross-platform compatibility .</tokentext>
<sentencetext>I vote NO. Sure, there's a heck of a lot of working legacy code out there (e.g.
NAG, et al.).
There are also a lot of clay tablets lying around with pre-written cuneiform on them.
There's a difference between "re-inventing the wheel" and switching from a wooden-axle-through-a-roundish-stone to a truck with independent suspension, and well balanced wheel bearings. I spent several years working as my research group's primary computer "guru" -- including having to port old code that had been written to run on an outdated UNIX box (in FORTRAN) to run on a newer computer system.
1st, I ported from the old FORTRAN 77 to FORTRAN 95.
Worked fine -- only took me a couple of hours.
Later, as requests for improved functionality started coming in, I compiled the old FORTRAN code into a .dll, and wrote the calls in C++.
Still fast, accurate, and all -- from a user's perspective -- but UGLY from a programming/maintenance perspective.
Eventually, I spent the better part of an evening re-writing the core functionality in C++ (with - I'll admit - some embedded ASM), and got a result that ran fine on a 486 box just as fast, and just as reliably as the original FORTRAN code on its dedicated workstation...  and it was a *heck* of a lot easier to maintain.
After that, I just left the old FORTRAN compiler alone.
[For the curious, the code in question modeled the physical response of a visco-elastic microstructural composite material to applied stress/strain -- not an entirely trivial application -- and involved a fair amount of tensor calculus] I know that FORTRAN has come a long way since the old days -- but it's still a beast.
Anything that you can do in FORTRAN, you can also do in C/C++ -- but the same is not true the other way around.
Pointers/dynamic memory access/allocation.
Graphics.  Hardware interfaces/communication.
Cross-platform compatibility.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295763</id>
	<title>Re:While there may be "newer" languages</title>
	<author>gknoy</author>
	<datestamp>1244742120000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I am glad you pointed out legacy code. In some engineering disciplines, there is a LOT of existing code written in Fortran. Understanding it, bug-fixing it, and improving it generally requires both engineering savvy (to know when answers make sense) and knowledge of Fortran.  At least being able to READ Fortran helps, too, if you're looking at a reference implementation of something and want to transcode it to something else in a non-verbatim manner.</p></htmltext>
<tokentext>I am glad you pointed out legacy code .
In some engineering disciplines , there is a LOT of existing code written in Fortran .
Understanding it , bug-fixing it , and improving it generally requires both engineering savvy ( to know when answers make sense ) and knowledge of Fortran .
At least being able to READ Fortran helps , too , if you 're looking at a reference implementation of something and want to transcode it to something else in a non-verbatim manner .</tokentext>
<sentencetext>I am glad you pointed out legacy code.
In some engineering disciplines, there is a LOT of existing code written in Fortran.
Understanding it, bug-fixing it, and improving it generally requires both engineering savvy (to know when answers make sense) and knowledge of Fortran.
At least being able to READ Fortran helps, too, if you're looking at a reference implementation of something and want to transcode it to something else in a non-verbatim manner.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296707</id>
	<title>Re:Are You Serious?</title>
	<author>Yunzil</author>
	<datestamp>1244745600000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><i>How did we ever make progress with people like you around?</i></p><p>Yes, let's do it your way and rewrite rock solid, decades old, high performance math libraries every time a new language hits the streets.  Just think of how much progress we'd make!</p></htmltext>
<tokentext>How did we ever make progress with people like you around ? Yes , let 's do it your way and rewrite rock solid , decades old , high performance math libraries every time a new language hits the streets .
Just think of how much progress we 'd make !</tokentext>
<sentencetext>How did we ever make progress with people like you around? Yes, let's do it your way and rewrite rock solid, decades old, high performance math libraries every time a new language hits the streets.
Just think of how much progress we'd make!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292165</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295149</id>
	<title>Legacy code vs new rewrite</title>
	<author>cwills</author>
	<datestamp>1244740140000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext>The fact is, there is an enormous base of existing <b>tested</b> FORTRAN code that is still in use and still being developed within the scientific community. The issue is not simply writing new code in a newer language, the problem is setting up a new tested base in the new language.
<p>
Let's say that you are working on a project to evaluate the effects of theoretical gravity waves through a nebula.  You have a choice:
</p><ol>
<li>Use 3 college interns to modify some code that you have been working on, using a library of subroutines that you and your prior researchers have been building up and using over the last 40 years</li><li>-or- Use 3 college interns to write new code from scratch in a new language whose compiler/interpreter was released just 2 years ago.</li></ol><p>
And oh, you have to publish results in 2 months in order to get your next NSF grant.
</p><p>
Yes, the new code might be all object orienty, and you can use the latest IDE to develop in, and you can hire a bunch of fresh young (cheap) grad students that are familiar with the latest python, perl, C#, etc. development and they can bang out thousands of lines of code a day.  But are you really really sure that that freshly written eigenvalue routine produces the correct result?  Has that new compiler been tested on the super-computer you have limited access to, can it even take advantage of all the power of that system?
</p><p>
I'm not saying that FORTRAN compilers are not bug free, but I suspect that the chances of finding a basic compiler or runtime library bug is lower in FORTRAN then in say Perl 6.
</p><p>
A couple of years ago my son spent some time doing some intern coding work for a private atmospheric research group.  The group was/is doing bleeding edge research.  My son was helping out one of the researchers in updating code that handled 2D models to 3D models.  All the code was in FORTRAN and there was no desire to move away from it.</p></htmltext>
<tokentext>The fact is , there is an enormous base of existing tested FORTRAN code that is still in use and still being developed within the scientific community .
The issue is not simply writing new code in a newer language , the problem is setting up a new tested base in the new language .
Let 's say that you are working on a project to evaluate the effects of theoretical gravity waves through a nebula .
You have a choice Use 3 college interns to modify some code that you have been working on using a library of subroutines that you and your prior researchers have been building up and using over the last 40 years -or- Use 3 college interns to write new code from scratch in a new language whose compiler/interpreter was released just 2 years ago .
And oh , you have to publish results in 2 months in order to get your next NSF grant .
Yes , the new code might be all object orienty , and you can use the latest IDE to develop in , and you can hire a bunch of fresh young ( cheap ) grad students that are familiar with the latest python , perl , C # , etc .
development and they can bang out thousands of lines of code a day .
But are you really really sure that that freshly written eigenvalue routine produces the correct result ?
Has that new compiler been tested on the super-computer you have limited access to , can it even take advantage of all the power of that system ?
I 'm not saying that FORTRAN compilers are bug-free , but I suspect that the chances of finding a basic compiler or runtime library bug are lower in FORTRAN than in , say , Perl 6 .
A couple of years ago my son spent some time doing some intern coding work for a private atmospheric research group .
The group was/is doing bleeding edge research .
My son was helping out one of the researchers in updating code that handled 2D models to 3D models .
All the code was in FORTRAN and there was no desire to move away from it .</tokentext>
<sentencetext>The fact is, there is an enormous base of existing tested FORTRAN code that is still in use and still being developed within the scientific community.
The issue is not simply writing new code in a newer language, the problem is setting up a new tested base in the new language.
Let's say that you are working on a project to evaluate the effects of theoretical gravity waves through a nebula.
You have a choice

Use 3 college interns to modify some code that you have been working on, using a library of subroutines that you and your prior researchers have been building up and using over the last 40 years -or- Use 3 college interns to write new code from scratch in a new language whose compiler/interpreter was released just 2 years ago.
And oh, you have to publish results in 2 months in order to get your next NSF grant.
Yes, the new code might be all object orienty, and you can use the latest IDE to develop in, and you can hire a bunch of fresh young (cheap) grad students that are familiar with the latest python, perl, C#, etc.
development and they can bang out thousands of lines of code a day.
But are you really really sure that that freshly written eigenvalue routine produces the correct result?
Has that new compiler been tested on the super-computer you have limited access to, can it even take advantage of all the power of that system?
I'm not saying that FORTRAN compilers are bug-free, but I suspect that the chances of finding a basic compiler or runtime library bug are lower in FORTRAN than in, say, Perl 6.
A couple of years ago my son spent some time doing some intern coding work for a private atmospheric research group.
The group was/is doing bleeding edge research.
My son was helping out one of the researchers in updating code that handled 2D models to 3D models.
All the code was in FORTRAN and there was no desire to move away from it.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293411</id>
	<title>Re:While there may be "newer" languages</title>
	<author>Anonymous</author>
	<datestamp>1244733660000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>But that's why god invented MATLAB...</p></htmltext>
<tokentext>But that 's why god invented MATLAB.. .</tokentext>
<sentencetext>But that's why god invented MATLAB...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292695</id>
	<title>should be taught more often</title>
	<author>Anonymous</author>
	<datestamp>1244731320000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I'm a 4th year math student, soon to be a grad student,<br>and I was not taught Fortran; I had to learn it on my own on the fly.<br>I really wish I had been taught it, though.</p><p>There is a huge code base out there for various types of models,<br>and recoding all that is just pointless, not to mention stupid, when Fortran is the fastest for it.</p><p>I mean, some of these simulations already run for weeks or days at a time on numerous computers.<br>Do you really want to slow that down or take the overhead hit from other languages?</p></htmltext>
<tokentext>I 'm a 4th year math student , soon to be a grad student , and I was not taught Fortran ; I had to learn it on my own on the fly . I really wish I had been taught it , though . There is a huge code base out there for various types of models , and recoding all that is just pointless , not to mention stupid , when Fortran is the fastest for it . I mean , some of these simulations already run for weeks or days at a time on numerous computers . Do you really want to slow that down or take the overhead hit from other languages ?</tokentext>
<sentencetext>I'm a 4th year math student, soon to be a grad student, and I was not taught Fortran; I had to learn it on my own on the fly. I really wish I had been taught it, though. There is a huge code base out there for various types of models, and recoding all that is just pointless, not to mention stupid, when Fortran is the fastest for it. I mean, some of these simulations already run for weeks or days at a time on numerous computers. Do you really want to slow that down or take the overhead hit from other languages?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293109</id>
	<title>c?</title>
	<author>buddyglass</author>
	<datestamp>1244732640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I'd imagine C would be more useful.  I went through a graduate program in CS and took several classes cross-listed in the Computation and Applied Mathematics program, as well as some numerical methods classes in the CS dept.  We had to link with Fortran occasionally (BLAS), but we always did it from within C code.  Just have to make sure your two-dimensional arrays are laid out correctly (i.e. column major).</p></htmltext>
<tokentext>I 'd imagine C would be more useful .
I went through a graduate program in CS and took several classes cross-listed in the Computation and Applied Mathematics program , as well as some numerical methods classes in the CS dept .
We had to link with Fortran occasionally ( BLAS ) , but we always did it from within C code .
Just have to make sure your two-dimensional arrays are laid out correctly ( i.e. column major ) .</tokentext>
<sentencetext>I'd imagine C would be more useful.
I went through a graduate program in CS and took several classes cross-listed in the Computation and Applied Mathematics program, as well as some numerical methods classes in the CS dept.
We had to link with Fortran occasionally (BLAS), but we always did it from within C code.
Just have to make sure your two-dimensional arrays are laid out correctly (i.e. column major).</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295617</id>
	<title>FORTRAN, no... LISP, yes...</title>
	<author>neo</author>
	<datestamp>1244741700000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Your computer programming ability will only marginally change should you learn FORTRAN.  Your time is better spent on its children, which copied and improved on its functional properties.</p><p>However, LISP will expand your programming experience and should be taught at some point after C++ or PHP and before Python.  This would have a profound impact on the way you program.</p></htmltext>
<tokentext>Your computer programming ability will only marginally change should you learn FORTRAN .
Your time is better spent on its children , which copied and improved on its functional properties . However , LISP will expand your programming experience and should be taught at some point after C + + or PHP and before Python .
This would have a profound impact on the way you program .</tokentext>
<sentencetext>Your computer programming ability will only marginally change should you learn FORTRAN.
Your time is better spent on its children, which copied and improved on its functional properties. However, LISP will expand your programming experience and should be taught at some point after C++ or PHP and before Python.
This would have a profound impact on the way you program.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293303</id>
	<title>Of course</title>
	<author>No2Gates</author>
	<datestamp>1244733240000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>They should also learn Latin and Sanskrit</p></htmltext>
<tokentext>They should also learn Latin and Sanskrit</tokentext>
<sentencetext>They should also learn Latin and Sanskrit</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28324573</id>
	<title>No.</title>
	<author>grahagre</author>
	<datestamp>1244913480000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Undergraduates should be taught programming principles that carry over easily to different language paradigms. Directly learning Fortran in an undergraduate curriculum is unnecessary.</p></htmltext>
<tokentext>Undergraduates should be taught programming principles that carry over easily to different language paradigms .
Directly learning Fortran in an undergraduate curriculum is unnecessary .</tokentext>
<sentencetext>Undergraduates should be taught programming principles that carry over easily to different language paradigms.
Directly learning Fortran in an undergraduate curriculum is unnecessary.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293445</id>
	<title>Re:While there may be "newer" languages</title>
	<author>fitten</author>
	<datestamp>1244733780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Well, just one example... the aliasing rules for pointers passed to subroutines allow for very aggressive optimization by the compiler.</p></htmltext>
<tokentext>Well , just one example... the aliasing rules for pointers passed to subroutines allow for very aggressive optimization by the compiler .</tokentext>
<sentencetext>Well, just one example... the aliasing rules for pointers passed to subroutines allow for very aggressive optimization by the compiler.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296831</id>
	<title>I guess it all depends</title>
	<author>sheph</author>
	<datestamp>1244746020000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>It really depends on what you are planning to do.  Fortran is certainly still being used in real world applications.  We use it here to write custom functions for software that controls the electrical grid. I would imagine that it's still in use in other environments that require engineering modeling as well.  I didn't learn it in school, but I did learn C++, Java, Perl, and PHP.  When I started here, I had enough general knowledge that it was really just a matter of learning the syntax.  So I think it's plausible to expect a student to be able to pick it up as long as they are taught the basics of programming in some respectable language.</htmltext>
<tokentext>It really depends on what you are planning to do .
Fortran is certainly still being used in real world applications .
We use it here to write custom functions for software that controls the electrical grid .
I would imagine that it 's still in use in other environments that require engineered modeling as well .
I did n't learn it in school , but I did learn C + + , Java , Perl , and PHP .
When I started here I had enough general knowledge that it was really just a matter of learning the syntax .
So I think it 's plausible to expect a student to be able to pick it up as long as they are taught the basics of programming in some respectable language .</tokentext>
<sentencetext>It really depends on what you are planning to do.
Fortran is certainly still being used in real world applications.
We use it here to write custom functions for software that controls the electrical grid.
I would imagine that it's still in use in other environments that require engineered modeling as well.
I didn't learn it in school, but I did learn C++, Java, Perl, and PHP.
When I started here I had enough general knowledge that it was really just a matter of learning the syntax.
So I think it's plausible to expect a student to be able to pick it up as long as they are taught the basics of programming in some respectable language.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295217</id>
	<title>Re:It could be worse. In fact, it was...</title>
	<author>oliderid</author>
	<datestamp>1244740320000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>I think the reasoning was that a student should learn a language with extreme, formal structure, and then later they can learn ones that aren't quite as strict. Maybe that's the same reasoning behind teaching students Fortran? At least it's a little more useful than Pascal.</p></div><p>Pascal... I had to learn it in the early 90's. On the other hand, what's the point of boring students to death with languages lacking all the nice stuff of modern languages? I remember that I learned a lot more on my own, playing with a graphics library, because it was fun and extremely visual. And once you want to make some animation, you need to understand the logic behind buffering and other basic concepts. </p><p>I would rather take a high level language and then gradually move to the core. It's like being in front of a car... first you want to drive it. Once you master driving... you wonder how this thing works. I always had the feeling that CS takes the opposite road: first you have to know how an engine works before driving... What you learn has no practical use (read: fun for a teenager) until very late in your study.</p>
	</htmltext>
<tokentext>I think the reasoning was that a student should learn a language with extreme , formal structure , and then later they can learn ones that are n't quite as strict .
Maybe that 's the same reasoning behind teaching students Fortran ?
At least it 's a little more useful than Pascal.Pascal...I had to learn it early 90 's.... On the other hand , what 's the point to bore students to death with such languages lacking all the nice stuffs of modern languages ?
I remember that I learn a lot more by my own playing with graphic library because it was funny , extremely visual .
And once you want to make some animation , you need to understand the logic behind buffering , and other basic concepts .
I would rather take a high level language and then I would gradually move to the core .
It 's like being in front of a car...first you want to drive it .
Once you master driving...You wonder how this thing works ?
I always had the feeling that CS takes the opposite road , first you have to know how an engine works before driving...What you learn has no practical use ( read funny for a teenager ) until very late in your study .</tokentext>
<sentencetext>I think the reasoning was that a student should learn a language with extreme, formal structure, and then later they can learn ones that aren't quite as strict.
Maybe that's the same reasoning behind teaching students Fortran?
At least it's a little more useful than Pascal. Pascal... I had to learn it in the early 90's. On the other hand, what's the point of boring students to death with languages lacking all the nice stuff of modern languages?
I remember that I learned a lot more on my own, playing with a graphics library, because it was fun and extremely visual.
And once you want to make some animation, you need to understand the logic behind buffering and other basic concepts.
I would rather take a high level language and then gradually move to the core.
It's like being in front of a car... first you want to drive it.
Once you master driving... you wonder how this thing works.
I always had the feeling that CS takes the opposite road: first you have to know how an engine works before driving... What you learn has no practical use (read: fun for a teenager) until very late in your study.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292075</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292925</id>
	<title>Only if...</title>
	<author>Anonymous</author>
	<datestamp>1244732040000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Only if they misbehave...</p></htmltext>
<tokentext>Only if they misbehave.. .</tokentext>
<sentencetext>Only if they misbehave...</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292783</id>
	<title>Re:While there may be "newer" languages</title>
	<author>mdwh2</author>
	<datestamp>1244731560000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><i>Even if not Python, what does Fortran have over modern compiled languages, for example?<br></i></p><p>Crumbs, looks like I upset one Fortran fan with mod points today. I have nothing against Fortran. I merely asked the OP for evidence for his claim that Fortran was better than all other languages. I also pointed out that Python wasn't the only option.</p><p>But hey, let's ignore debate and pretend Fortran is now the best language around.</p></htmltext>
<tokentext>Even if not Python , what does Fortran have over modern compiled languages , for example ? Crumbs , looks like I upset one Fortran fan with mod points today .
I have nothing against Fortran .
I merely asked the OP for evidence for his claim that Fortran was better than all other languages .
I also pointed out that Python was n't the only option . But hey , let 's ignore debate and pretend Fortran is now the best language around .</tokentext>
<sentencetext>Even if not Python, what does Fortran have over modern compiled languages, for example? Crumbs, looks like I upset one Fortran fan with mod points today.
I have nothing against Fortran.
I merely asked the OP for evidence for his claim that Fortran was better than all other languages.
I also pointed out that Python wasn't the only option. But hey, let's ignore debate and pretend Fortran is now the best language around.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292653</id>
	<title>FORTRAN is still widely used by Nuclear Engineers</title>
	<author>sandwall</author>
	<datestamp>1244731200000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>There was a point in my graduate career where all the students in the class were asked if anyone knew Fortran; only one student raised his hand.
Not that the other students couldn't learn it, but the student who already knew it had a great advantage. Python and other newer languages may not always be in vogue and may eventually be defunct, while Fortran will always be a rock solid foundation.</htmltext>
<tokentext>There was a point in my graduate career where all the students in the class were asked if anyone knew Fortran ; only one student raised his hand .
Not that the other students could n't learn it , but the student who already knew it had a great advantage .
Python and other newer languages may not always be in vogue and may eventually be defunct , while Fortran will always be a rock solid foundation .</tokentext>
<sentencetext>There was a point in my graduate career where all the students in the class were asked if anyone knew Fortran; only one student raised his hand.
Not that the other students couldn't learn it, but the student who already knew it had a great advantage.
Python and other newer languages may not always be in vogue and may eventually be defunct, while Fortran will always be a rock solid foundation.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292089</id>
	<title>Re:"Introductory"</title>
	<author>Anonymous</author>
	<datestamp>1244729160000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Beginner's All-purpose Symbolic Instruction Code perhaps?<br>Of course, because BASIC was designed in "before structured programming" days, it lacks useful and modern abstractions.<br>Back in the mid 70s, UCSD used Pascal as the educational language... It provided the necessary constructs and such to learn good practices, and the instructors made no claim that you'd ever use it for anything else (since the dominant language in engineering at the time was FORTRAN, although various Algol variants were popular at UCSD). They felt once you learned in Pascal how to structure things, you'd be able to write in anything.</p><p>At UCLA in the same time frame (when they had no CS department, per se), MIXAL and PL/I (or PL/C) were the languages of choice. PL/I wasn't bad (and was implemented better than Calgol, another language used... yet another custom variant of Algol) but it never got traction in the industry. Could it be because you had to PAY for PL/I, and C compilers were (almost) free?  Clever folks, those marketeers at Bell Labs.</p></htmltext>
<tokentext>Beginner 's All-purpose Symbolic Instruction Code perhaps ? Of course , because BASIC was designed in " before structured programming " days , it lacks useful and modern abstractions . Back in the mid 70s , UCSD used Pascal as the educational language... It provided the necessary constructs and such to learn good practices , and the instructors made no claim that you 'd ever use it for anything else ( since the dominant language in engineering at the time was FORTRAN , although various Algol variants were popular at UCSD ) . They felt once you learned in Pascal how to structure things , you 'd be able to write in anything . At UCLA in the same time frame ( when they had no CS department , per se ) , MIXAL and PL/I ( or PL/C ) were the languages of choice .
PL/I was n't bad ( and was implemented better than Calgol , another language used.. yet another custom variant of Algol ) but it never got traction in the industry .
Could it be because you had to PAY for PL/I , and C compilers were ( almost ) free ?
Clever folks those marketeers at Bell Labs. .</tokentext>
<sentencetext>Beginner's All-purpose Symbolic Instruction Code perhaps? Of course, because BASIC was designed in "before structured programming" days, it lacks useful and modern abstractions. Back in the mid 70s, UCSD used Pascal as the educational language... It provided the necessary constructs and such to learn good practices, and the instructors made no claim that you'd ever use it for anything else (since the dominant language in engineering at the time was FORTRAN, although various Algol variants were popular at UCSD). They felt once you learned in Pascal how to structure things, you'd be able to write in anything. At UCLA in the same time frame (when they had no CS department, per se), MIXAL and PL/I (or PL/C) were the languages of choice.
PL/I wasn't bad (and was implemented better than Calgol, another language used.. yet another custom variant of Algol) but it never got traction in the industry.
Could it be because you had to PAY for PL/I, and C compilers were (almost) free?
Clever folks those marketeers at Bell Labs..</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291939</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293167</id>
	<title>Pick the tool suited to the task...</title>
	<author>bradley13</author>
	<datestamp>1244732880000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>In principle, you can solve any programming problem in any complete language. But why not take the right tool for the job? Fortran is suited to numerical analysis and is highly optimized for it. Python isn't, nor is Java, nor is C++. In particular, if you are doing numerical number crunching, you do not want object oriented crapola getting in your way. OO is great for the vast majority of standard programming problems, but it is a hindrance in a lot of scientific and numerical analysis tasks.

</p><p>Here's an example: you are analyzing radar signals. These come in as a continuous stream of numbers, which are packed into arrays and analyzed. There are no objects yet, just lots of numbers to crunch as quickly and efficiently as you can - and that's what Fortran is good at. Once you have identified airplanes and figured out their distance, speed, etc - then maybe you want to pass these on to an OO module written in another language.</p></htmltext>
<tokentext>In principle , you can solve any programming problem in any complete language .
But why not take the right tool for the job ?
Fortran is suited to numerical analysis and is highly optimized for it .
Python is n't , nor is Java , nor is C + + .
In particular , if you are doing numerical number crunching , you do not want object oriented crapola getting in your way .
OO is great for the vast majority of standard programming problems , but it is a hindrance in a lot of scientific and numerical analysis tasks .
Here 's an example : you are analyzing radar signals .
These come in as a continuous stream of numbers , which are packed into arrays and analyzed .
There are no objects yet , just lots of numbers to crunch as quickly and efficiently as you can - and that 's what Fortran is good at .
Once you have identified airplanes and figured out their distance , speed , etc - then maybe you want to pass these on to an OO module written in another language .</tokentext>
<sentencetext>In principle, you can solve any programming problem in any complete language.
But why not take the right tool for the job?
Fortran is suited to numerical analysis and is highly optimized for it.
Python isn't, nor is Java, nor is C++.
In particular, if you are doing numerical number crunching, you do not want object oriented crapola getting in your way.
OO is great for the vast majority of standard programming problems, but it is a hindrance in a lot of scientific and numerical analysis tasks.
Here's an example: you are analyzing radar signals.
These come in as a continuous stream of numbers, which are packed into arrays and analyzed.
There are no objects yet, just lots of numbers to crunch as quickly and efficiently as you can - and that's what Fortran is good at.
Once you have identified airplanes and figured out their distance, speed, etc - then maybe you want to pass these on to an OO module written in another language.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292893</id>
	<title>Re:University != Trade school</title>
	<author>Anonymous</author>
	<datestamp>1244731920000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Well for starters, jackass, large-scale computational codes are not written in JavaScript or "Microsoft Clippy", whatever the hell that is. And speaking as a practicing engineer not long out of school who is writing such software, I would have appreciated instruction in a language that I would actually use in practice (Fortran 90), instead of some CS wanker jerking me off with yet another sorting implementation in Scheme.</p></htmltext>
<tokentext>Well for starters , jackass , large-scale computational codes are not written in JavaScript or " Microsoft Clippy " , whatever the hell that is .
And speaking as a practicing engineer not long out of school who is writing such software , I would have appreciated instruction in a language that I would actually use in practice ( Fortran 90 ) , instead of some CS wanker jerking me off with yet another sorting implementation in Scheme .</tokentext>
<sentencetext>Well for starters, jackass, large-scale computational codes are not written in JavaScript or "Microsoft Clippy", whatever the hell that is.
And speaking as a practicing engineer not long out of school who is writing such software, I would have appreciated instruction in a language that I would actually use in practice (Fortran 90), instead of some CS wanker jerking me off with yet another sorting implementation in Scheme.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293037</id>
	<title>Re:Are You Serious?</title>
	<author>robbyjo</author>
	<datestamp>1244732400000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I think an ODE solver is going to be one of the simplest cases of a numerical library. I can simply copy algorithms from Numerical Recipes and get away with it. But there is a LOT more written exclusively in Fortran that you don't want to touch with a 10-foot pole.</p><p>For example, you should know R and why R contains some Fortran libraries, especially BLAS and LAPACK. I think BLAS is the easiest one to translate, albeit tedious: it's simply matrix operations (add, subtract, multiply), but there are lots of them, with multiple special cases that make it really fast. LAPACK (and associated libraries like LINPACK, etc.) requires more intimate matrix theory. I have the necessary background to do, say, Singular Value Decomposition (SVD). But to date, the only fast SVD routines I know are in Fortran or derived from Fortran, ALONG with the quirks and limitations of Fortran 77. Don't believe me? Check Jama's SVD routine and how it can't handle matrices with more columns than rows in a single pass. You can get around that by invoking the routine twice (and believe me, that happens in a lot of places, although one pass is possible), but that's partly due to Fortran 77's inability to dynamically allocate arrays. The routine is so tight and fast that translating it to another language depends on the luck of machine translation (and making sense of the result, and trying to clean up the mess, yada yada).</p><p>You have no idea how many of the libraries from the '80s are still in Fortran 77 and left untouched. Try Netlib (http://www.netlib.org/liblist.html), pick one Fortran library, and translate it to another language. See if you don't cry a river. Believe it or not, many of them are still in wide use, usually as part of newer algorithms. I myself have done that. For example, a B-spline shrunk smoothing library using GCV (downloadable from Netlib) from 1985 is known to give very good results.
It's not your usual (and cheap) B-spline smoothing that you can find off the net; the difference is almost heaven and earth. This GCV is used in the Q-value routine of 2003 to determine false discovery rates by smoothing over the P-values of thousands of genes. Nobody had tried to translate that out of Fortran. I did the daring job and spent 200 hours translating that one library to Java, with success. If I'd had the option not to, I'd rather have spent those 200 hours elsewhere and used whatever Fortran-Java glue to get around it. Seriously.</p><p>So, if you've never been into serious scientific library development, please don't make such an arrogant and ignorant assertion. Although you can assert that an operating system is complex, the principle behind it is simple. Much simpler than a scientific formula, which requires far more math than just Calc 1-3 and DE 1-2. It's not simply translating a differential or integral or what have you; that's the easiest part if somebody is handing it to you. The hard part is reading the scientific paper behind that Fortran code in order to make sense of what the code is doing. Many of the algorithms contain some hack that makes the formula work. For example, certain algorithms define limits, magic constants, assumptions, etc. that are NOT explained anywhere in the paper AT ALL. Some can be found in the papers cited by that paper, with reasons that might be unclear to you. Now, if you translate the mathematical formula off the paper without reading it, wouldn't that be a recipe for disaster?</p><p>Many people may be gifted at coding, but very, very few have sufficient skill to translate highly numerical algorithms. Seriously. Most coders know absolutely nothing about higher-level math. Netlib is just a start: it only contains algorithms from the '80s and early '90s (which are still widely used). Even NumPy uses plenty of untouched Fortran code in its backend.</p><p>This ignorance of gigantic proportions of yours needs to stop. Now.
If you still cling to your assertion, start an open source project that translates gigabytes of Fortran numerical libraries into a more modern language. See if you can even get some contributors. Good luck.</p></htmltext>
<tokenext>I think ODE is going to be one of the simplest case of numerical library .
I can simply copy algorithms from Numerical Recipes and get away with it .
But , there are a LOT more written exclusively in Fortran that you do n't want to touch with 10 foot pole.For example , you should know R and why R does contain some Fortran libraries , especially BLAS and LAPACK .
I think BLAS is the easiest one to translate over albeit tedious .
It 's simply matrix operations ( add , subtract , multiply ) , but there are lots of them with multiple cases making it really fast .
LAPACK ( and other associated libraries like LINPACK , etc ) requires a more intimate matrix theory .
I have the necessary background to do , say , Singular Value Decomposition ( SVD ) .
But to date , the only fast SVD routines I know are in Fortran or derived from Fortran , ALONG with quirks and limitation of Fortran 77 .
Do n't believe me ?
Check Jama 's SVD routine and how it does n't handle matrices with number of columns greater than number of rows with only one pass .
You can get around that by invoking the routine twice ( and believe me , that happens in a lot of places although 1 pass is possible ) , but that 's partly due to the limitation of Fortran 77 of not being able to dynamically allocate arrays .
The routine is so tight and fast that the option of translating it to other languages depend on the luck of machine translation ( and making sense of it and try to clean up the mess , yada yada ) .You have no idea that many of the libraries in the 80 's are still in Fortran 77 and left untouched .
Try Netlib ( http : //www.netlib.org/liblist.html ) and try to pick one Fortran library and translate it to another language .
See if you do n't cry river .
Believe it or not , many of them are still in wide use , usually as a part of other newer algorithms .
I myself have done that .
For example , a B-spline shrunk smoothing library of GCV ( downloadable from Netlib ) from 1985 is known to have a very good result .
It 's not your usual ( and cheap ) B-spline smoothing that you can find off the net , it 's almost heaven-and-earth differences .
This GCV is used in Q-value routine of 2003 to determine false discovery rates by smoothing over the P-values of thousands of genes .
Nobody has tried to translate that off of Fortran .
I did a daring job and spent 200 hours to translate that one library to Java with success .
If I had the option not to do that , I 'd rather spent 200 hours somewhere else and use whatever Fortran-Java glue to get around it .
Seriously.So , if you 've never been into serious scientific library development , please do n't make such arrogant and ignorant assertion .
Although you can assert that Operating System is complex , the principle behind it is simple .
Much simpler than a scientific formula , which requires much more math skills than just Calc1-3 and DE1-2 .
It 's not simply translating differential or integral or what have you , that 's the easiest part if somebody is giving it to you .
The hard part is to read the scientific paper behind that Fortran code in order to try to make sense what the code is doing .
Many of the algorithms contain some hack that makes the formula work .
For example , certain algorithm define limits , magic constants , assumptions , etc that are NOT explained anywhere in the paper AT ALL .
Some can be found from the papers cited by that paper , with reasons that might be unclear to you .
Now , if you translate the mathematical formula off of the paper without reading it , would n't it be a recipe of disaster ? Many people may be gifted in coding , but very very rare have sufficient skills to translate highly numerical algorithms .
Seriously. Most coders know absolutely nothing about higher-level maths .
Netlib is just a start .
They only contain algorithms of the 80 's and early 90 's ( which are still widely used ) .
Even Numpy uses plenty of untouched Fortran codes for its backend.This ignorance of gigantic proportion of yours needs to stop .
Now. If you still cling on your assertion , start an open source project that translates gigabytes of Fortran numerical library into a more modern language .
See if you can even get some contributors .
Good luck .</tokentext>
<sentencetext>I think ODE is going to be one of the simplest case of numerical library.
I can simply copy algorithms from Numerical Recipes and get away with it.
But, there are a LOT more written exclusively in Fortran that you don't want to touch with 10 foot pole.For example, you should know R and why R does contain some Fortran libraries, especially BLAS and LAPACK.
I think BLAS is the easiest one to translate over albeit tedious.
It's simply matrix operations (add, subtract, multiply), but there are lots of them with multiple cases making it really fast.
LAPACK (and other associated libraries like LINPACK, etc) requires a more intimate matrix theory.
I have the necessary background to do, say, Singular Value Decomposition (SVD).
But to date, the only fast SVD routines I know are in Fortran or derived from Fortran, ALONG with quirks and limitation of Fortran 77.
Don't believe me?
Check Jama's SVD routine and how it doesn't handle matrices with number of columns greater than number of rows with only one pass.
You can get around that by invoking the routine twice (and believe me, that happens in a lot of places although 1 pass is possible), but that's partly due to the limitation of Fortran 77 of not being able to dynamically allocate arrays.
The routine is so tight and fast that the option of translating it to other languages depend on the luck of machine translation (and making sense of it and try to clean up the mess, yada yada).You have no idea that many of the libraries in the 80's are still in Fortran 77 and left untouched.
Try Netlib (http://www.netlib.org/liblist.html) and try to pick one Fortran library and translate it to another language.
See if you don't cry river.
Believe it or not, many of them are still in wide use, usually as a part of other newer algorithms.
I myself have done that.
For example, a B-spline shrunk smoothing library of GCV (downloadable from Netlib) from 1985 is known to have a very good result.
It's not your usual (and cheap) B-spline smoothing that you can find off the net, it's almost heaven-and-earth differences.
This GCV is used in Q-value routine of 2003 to determine false discovery rates by smoothing over the P-values of thousands of genes.
Nobody has tried to translate that off of Fortran.
I did a daring job and spent 200 hours to translate that one library to Java with success.
If I had the option not to do that, I'd rather spent 200 hours somewhere else and use whatever Fortran-Java glue to get around it.
Seriously.So, if you've never been into serious scientific library development, please don't make such arrogant and ignorant assertion.
Although you can assert that Operating System is complex, the principle behind it is simple.
Much simpler than a scientific formula, which requires much more math skills than just Calc1-3 and DE1-2.
It's not simply translating differential or integral or what have you, that's the easiest part if somebody is giving it to you.
The hard part is to read the scientific paper behind that Fortran code in order to try to make sense what the code is doing.
Many of the algorithms contain some hack that makes the formula work.
For example, certain algorithm define limits, magic constants, assumptions, etc that are NOT explained anywhere in the paper AT ALL.
Some can be found from the papers cited by that paper, with reasons that might be unclear to you.
Now, if you translate the mathematical formula off of the paper without reading it, wouldn't it be a recipe of disaster?Many people may be gifted in coding, but very very rare have sufficient skills to translate highly numerical algorithms.
Seriously. Most coders know absolutely nothing about higher-level maths.
Netlib is just a start.
They only contain algorithms of the 80's and early 90's (which are still widely used).
Even Numpy uses plenty of untouched Fortran codes for its backend.This ignorance of gigantic proportion of yours needs to stop.
Now. If you still cling on your assertion, start an open source project that translates gigabytes of Fortran numerical library into a more modern language.
See if you can even get some contributors.
Good luck.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292165</parent>
</comment>
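[Editor's note: the LAPACK-backed SVD routines the comment above describes are exactly what NumPy exposes today, which is one way to use the fast Fortran code without translating it. A minimal sketch (the matrix values are illustrative); NumPy's `linalg.svd` delegates to LAPACK's Fortran SVD drivers:

```python
import numpy as np

# A "wide" matrix: more columns than rows, the shape the comment
# says Jama's single-pass SVD routine cannot handle directly.
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# NumPy hands this to a compiled LAPACK routine, which handles
# either orientation in one call.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Reconstructing A from the factors verifies the decomposition.
A_rebuilt = U @ np.diag(s) @ Vt
assert np.allclose(A, A_rebuilt)
```

This is the "Fortran glue" route the commenter wishes had been available: call the battle-tested Fortran-derived routine through a wrapper rather than re-deriving 200 hours of numerical subtlety by hand.]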
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28302379</id>
	<title>Re:PYTHON????</title>
	<author>MtHuurne</author>
	<datestamp>1244724300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>It all depends on the granularity of the operations you perform in Python. Multiplying two giant matrices by iterating over all items in Python is going to be very slow indeed. Multiplying them by writing "C = A * B" in Python and having the actual multiplication done by a native library can be very fast.</htmltext>
<tokenext>It all depends on the granularity of the operations you perform in Python .
Multiplying two giant matrices by iterating over all items in Python is going to be very slow indeed .
Multiplying them by writing " C = A * B " in Python and having the actual multiplication done by a native library can be very fast .</tokentext>
<sentencetext>It all depends on the granularity of the operations you perform in Python.
Multiplying two giant matrices by iterating over all items in Python is going to be very slow indeed.
Multiplying them by writing "C = A * B" in Python and having the actual multiplication done by a native library can be very fast.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127</parent>
</comment>
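[Editor's note: MtHuurne's granularity point can be made concrete. A short sketch, assuming NumPy as the native library (sizes are illustrative): the loop version pays interpreter overhead on every scalar multiply, while `A @ B` makes one Python call and runs the multiplication in compiled code.

```python
import numpy as np

n = 60
A = np.random.rand(n, n)
B = np.random.rand(n, n)

# Fine-grained: a pure-Python triple loop, where every scalar
# multiply-add goes through the interpreter. This is the slow case.
def matmul_loops(A, B):
    rows, inner, cols = A.shape[0], A.shape[1], B.shape[1]
    C = [[sum(A[i][k] * B[k][j] for k in range(inner))
          for j in range(cols)]
         for i in range(rows)]
    return np.array(C)

# Coarse-grained: one Python-level operation; the actual work
# happens in a native BLAS routine.
C_fast = A @ B
C_slow = matmul_loops(A, B)
assert np.allclose(C_fast, C_slow)
```

Both compute the same product; the difference is only where the inner loops run, which is the granularity being described.]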
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297059</id>
	<title>LHC languages.</title>
	<author>Bill, Shooter of Bul</author>
	<datestamp>1244746740000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>2</modscore>
	<htmltext><p>My friend at CERN working on programs for the LHC has to do everything in C++ or Python. So Python might be a good one to learn. I love Fortran 90/95; it's a good language that has only a hint of the unpleasantness that was Fortran 66. As I'm sure others have posted, there are some very high-performance libraries/compilers for Fortran number crunching. But it's not that hard to pick up if you know C/Python.</p><p>But it was pretty funny taking that FORTRAN class with some CS students who only knew Ada. They were sort of freaked out by its "modern" features and "easy" string handling. It was right then and there that I realized I had dodged a major bullet by not becoming a CS major at the school.</p></htmltext>
<tokenext>My friend at cern working on programs for the LHC has to do everything in c + + or python .
So python might be a good one to learn .
I love fortran 90/95 its a good language that only has a hint of the unpleasantness that was fortran 66 .
As , I 'm sure others have posted there are some very high performance libraries/compilers for fortran number crunching .
But , its not that hard to pick up if you know c/python.But it was pretty funny taking that FORTRAN class with some cs students who only knew ADA .
They were sort of freaked out by its " modern " features and " easy " string handling .
It was right then and there that I realized I had dodged a major bullet by not becoming a cs major at the school .</tokentext>
<sentencetext>My friend at cern working on programs for the LHC has to do everything in c++ or python.
So python might be a good one to learn.
I love fortran 90/95 its a good language that only has a hint of the unpleasantness that was fortran 66.
As, I'm sure others have posted there are some very high performance libraries/compilers for fortran number crunching.
But, its not that hard to pick up if you know c/python.But it was pretty funny taking that FORTRAN class with some cs students who only knew ADA.
They were sort of freaked out by its "modern" features and "easy" string handling.
It was right then and there that I realized I had dodged a major bullet by not becoming a cs major at the school.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28308871</id>
	<title>Re:It's okay to teach them FORTRAN</title>
	<author>Anonymous</author>
	<datestamp>1244824200000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>REAL programmers use GOTOs freely, because /they/ like it when "real" programmers rant about the evils of GOTO, just to give them a Hannibal Lecture on the benefits afterwards.</htmltext>
<tokenext>REAL programmers use GOTOs freely , because /they/ like it when " real " programmers rant about the evils of GOTO , just to give them a Hannibal Lecture of the benefits afterwards .</tokentext>
<sentencetext>REAL programmers use GOTOs freely, because /they/ like it when "real" programmers rant about the evils of GOTO, just to give them a Hannibal Lecture of the benefits afterwards.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294815</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292047</id>
	<title>Don't fear modern languages</title>
	<author>ZenGeek</author>
	<datestamp>1244729040000</datestamp>
	<modclass>Interestin</modclass>
	<modscore>3</modscore>
	<htmltext><p>I work at a university research lab and Fortran is still very much present. If nothing else, students need to be able to work with legacy code. I agree, however, that new projects should make use of more modern languages. Special consideration should be given to functional programming which naturally fits many science problems and is easily parallelizable due to its "no side effects" philosophy.</p></htmltext>
<tokenext>I work at a university research lab and Fortran is still very much present .
If nothing else , students need to be able to work with legacy code .
I agree , however , that new projects should make use of more modern languages .
Special consideration should be given to functional programming which naturally fits many science problems and is easily parallelizable due to its " no side effects " philosophy .</tokentext>
<sentencetext>I work at a university research lab and Fortran is still very much present.
If nothing else, students need to be able to work with legacy code.
I agree, however, that new projects should make use of more modern languages.
Special consideration should be given to functional programming which naturally fits many science problems and is easily parallelizable due to its "no side effects" philosophy.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28307685</id>
	<title>Re:While there may be "newer" languages</title>
	<author>speedtux</author>
	<datestamp>1244819700000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><i>Python still lacks many of Matlab's features, its only advantage is being Free Software.</i></p><p>Python has plenty of advantages over Matlab, and plenty of libraries that Matlab doesn't have.  I stopped using Matlab not because it cost money, but because it was a PITA and didn't get the job done.</p><p><i>Ask any numerical analyst to know why it is a terrible idea to solve a linear system with inv(A)*b. But make sure you have at least half an hour free.</i></p><p>Like many specialists, numerical analysts are often too close to the problem.  inv(A)*b is fine if it gets the job done.</p></htmltext>
<tokenext>Python still lacks many of Matlab 's features , its only advantage is being Free Software.Python has plenty of advantages over Matlab , and plenty of libraries that Matlab does n't have .
I stopped using Matlab not because it cost money , but because it was a PITA and did n't get the job done.Ask any numerical analyst to know why it is a terrible idea to solve a linear system with inv ( A ) * b. But make sure you have at least half an hour free.Like many specialists , numerical analysts are often too close to the problem .
inv ( A ) * b is fine if it gets the job done .</tokentext>
<sentencetext>Python still lacks many of Matlab's features, its only advantage is being Free Software.Python has plenty of advantages over Matlab, and plenty of libraries that Matlab doesn't have.
I stopped using Matlab not because it cost money, but because it was a PITA and didn't get the job done.Ask any numerical analyst to know why it is a terrible idea to solve a linear system with inv(A)*b. But make sure you have at least half an hour free.Like many specialists, numerical analysts are often too close to the problem.
inv(A)*b is fine if it gets the job done.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401</parent>
</comment>
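[Editor's note: the `inv(A)*b` dispute above is easy to demonstrate. A short sketch of the two approaches, using NumPy (sizes and seed are illustrative): the direct solver factors the matrix and back-substitutes, which is cheaper and numerically better-behaved than forming the inverse, though for a well-conditioned system both give essentially the same answer, which is speedtux's point.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((100, 100))  # almost surely well-conditioned
b = rng.standard_normal(100)

# The textbook-looking way: explicitly form the inverse, then multiply.
x_inv = np.linalg.inv(A) @ b

# The numerical analyst's way: factor A and solve directly.
x_solve = np.linalg.solve(A, b)

# For a well-conditioned A the two agree; the gap the analysts warn
# about grows as the condition number of A worsens.
assert np.allclose(x_inv, x_solve)
assert np.allclose(A @ x_solve, b)
```

So both sides are right in their own regime: `solve` is the safe default, and `inv(A)*b` "gets the job done" until conditioning bites.]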
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28302479</id>
	<title>Well...</title>
	<author>Anonymous</author>
	<datestamp>1244724900000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Fortran 2003 is a very different language from ancient Fortran.</p><p>Teach them modern Fortran at least.</p><p>But no, don't teach them Python. Python is a toy scripting language.<br>If you're going to teach a bunch of mathematically inclined people (engineers!) a higher-level language than Fortran, there are way, way better choices than Python, such as Common Lisp or Haskell.</p></htmltext>
<tokenext>Fortran 2003 is a very different language to ancient fortran.Teach them modern fortran at least.But no , do n't teach them python .
Python is a toy scripting language.If you 're going to teach a bunch of mathematically inclined people ( engineers !
) a higher level language than fortran , they are way , way better choices than python , such as Common Lisp or Haskell .</tokentext>
<sentencetext>Fortran 2003 is a very different language to ancient fortran.Teach them modern fortran at least.But no, don't teach them python.
Python is a toy scripting language.If you're going to teach a bunch of mathematically inclined people (engineers!
) a higher level language than fortran, they are way, way better choices than python, such as Common Lisp or Haskell.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293159</id>
	<title>Computer Science</title>
	<author>Anonymous</author>
	<datestamp>1244732820000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>They should have it in the computer science curriculum too. I was upset when I found out that it is no longer taught by my university's Computer Science department.</p></htmltext>
<tokenext>They should have it in the computer science curriculum too .
I was upset when I found out that it is no longer taught by my university 's Computer Science department .</tokentext>
<sentencetext>They should have it in the computer science curriculum too.
I was upset when I found out that it is no longer taught by my university's Computer Science department.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292031</id>
	<title>Re:Oh come on.</title>
	<author>Anonymous</author>
	<datestamp>1244728920000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I agree. There are so many better first languages. I was fortunate to learn C++. Python seems to be growing in popularity as a first language.
OK, TFA mentions that.</htmltext>
<tokenext>I agree .
There are so many better first languages .
I was fortunate to learn C + + .
Python seems to be growing in popularity as first language that can be learned .
OK. TFA mentions that .</tokentext>
<sentencetext>I agree.
There are so many better first languages.
I was fortunate to learn C++.
Python seems to be growing in popularity as first language that can be learned.
OK. TFA mentions that.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291849</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303437</id>
	<title>CS folks: Pls stop trying to impose your will...</title>
	<author>Glasswire</author>
	<datestamp>1244732580000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>In the real sciences (chemistry or physics), where they need to get real things done, the elegance of the language is a very minor consideration.  Major ones are: are there libraries that contain the key specialized functions and subroutines that apply in my discipline; does it avoid obscure object paradigms that throw a steep learning curve in front of the procedural code I really want to write; and does it manage variable memory for me in ways that will not create unstable code from poor pointer management?<br>For most scientists, FORTRAN fits this requirement perfectly, and I doubt Python will ever have 10% of the scientific libraries FORTRAN has acquired over the years.<br>Even in commercial technical codes FORTRAN is widely used.  80% of the computer-aided-engineering cluster cycles at automakers (and in many other industries) are consumed by materials deformation codes like LS-DYNA, which has always been and is still written in FORTRAN.  (Lots of the key engineering codes were written in the 60s for NASA, and they survive today.)<br>People use languages that work best for them, not what some meta-discipline like computer science says is the right thing.</p></htmltext>
<tokenext>In the real sciences ( chemistry or physics ) where they need to get real things done , the elegance of the language is a very minor consideration .
Major ones are : are there libraries that contain key specialised functions and subroutines that apply in my discipline and does it avoid any obscure object paradigms that though a steep learning curve into the procedural code I really want to write and does it manage variable memory for me in ways that will not create unstable code from poor pointer management ? For most scientists , FORTRAN fits this requirement perfectly and I doubt Python will ever have 10 \ % of the scientific libraries FORTRAN has acquired over the years.Even in commercial technical codes FORTRAN is widely used .
80 \ % or the computational-assisted-engineering cluster cycles at automakers ( and at many other industries ) are consumed by materials deformation codes like LS-DYNA which has always been and is still written in FORTRAN .
( Lots of the key engineering codes were written in the 60s for NASA and they survive today ) People use languages that work best for them , not what some meta-discipline like computer science says is the right thing .</tokentext>
<sentencetext>In the real sciences (chemistry or physics) where they need to get real things done, the elegance of the language is a very minor consideration.
Major ones are: are there libraries that contain key specialised functions and subroutines that apply in my discipline and does it avoid any obscure object paradigms that though a steep learning curve into the procedural code I really want to write and does it manage variable memory for me in ways that will not create unstable code from poor pointer management?For most scientists, FORTRAN fits this requirement perfectly and I doubt Python will ever have 10\% of the scientific libraries FORTRAN has acquired over the years.Even in commercial technical codes FORTRAN is widely used.
80\% or the computational-assisted-engineering cluster cycles at automakers (and at many other industries) are consumed by materials deformation codes like LS-DYNA which has always been and is still written in FORTRAN.
(Lots of the key engineering codes were written in the 60s for NASA and they survive today)People use languages that work best for them, not what some meta-discipline like computer science says is the right thing.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293859</id>
	<title>Since I'm on both sides of the fence</title>
	<author>jinxed\_one</author>
	<datestamp>1244735400000</datestamp>
	<modclass>Interestin</modclass>
	<modscore>2</modscore>
	<htmltext><p>Ok here goes:<br>
&nbsp; &nbsp; Should science undergraduates be taught Fortran?  Yes<br>
&nbsp; &nbsp; Should it be the FIRST language?  NO, not any more</p><p>So much of science, especially physics, is done on computers now. As both a software engineer and someone transitioning into physics, I ran into many people who had severe problems learning FORTRAN and applying it to problems.  I really feel science students should have a couple of general courses in programming in C before moving on to other languages or even programming classes specific to their science.  Here's the reasoning:<br>A) Science students need to learn programming basics away from the pressure of also learning their science field at the same time - if you're learning the science at the same time, the actual basic programming concepts get lost and muddied with the science being learned.</p><p>B) It allows a science major to learn the concepts of programming in a general-purpose language without muddying it with a lot of OS-specific, library-specific, attitude-specific usage (aside from the compiler use).</p><p>C) There is a C compiler on almost every system you will ever use in your lifetime as a scientist.</p><p>D) C has enough structure to be "readable", but doesn't have so many constraints that it has problems being fast.</p><p>E) C syntax is the basis for many other programming languages, including Python and Java (both of which are heavily used in science as well).</p><p>And finally, if a science major has a good understanding of programming concepts, they know what to look for when learning a new language (whatever it might be) - they will know they have to learn the syntax for control structures in the new language (for, while, if, etc.) as well as more esoteric language-specific concepts like: how do I create functions and libraries? How do I use them?</p><p>ALL THAT being said, yes, FORTRAN is a critical language to know in the sciences, because of the availability of libraries.
HOWEVER, many of those libraries are now available in other languages and/or can be called from a different language via an abstraction (a concept that would be taught in a more general computing course).</p></htmltext>
<tokenext>Ok here goes :     Should science undergraduates be taught Fortran ?
Yes     Should it be the FIRST language , NO , not any moreSo much of science , especially physics , is done on computers now - as both a software engineer and someone transitioning into Physics I ran into many people that had severe problems learning FORTRAN and applying it to problems .
I really feel science students should have a couple of general courses in programming in C before moving on to other languages or even programming classes specific to their science .
Here 's the reasoning : A ) Science students need to learn programming basics away from the pressure of also learning within their science field at the same time - if your learning the science at the same time , the actual basic programming concepts get lost and muddied with the science being learned.B ) It can allow a science major to learn the concepts of programming in a general purpose language without muddying it with a lot of OS specific , library specific , attitude specific usage ( aside from the compiler use ) C ) There is a C compiler on almost every system you will most likely use in your lifetime as a scientistD ) C has enough structure to be " readable " , but does n't have so many constraints that it has problems being fastE ) C syntax is the basis for many other programming languages including Python and Java ( both of which are heavily used in science as well ) and finally if a science major has a good understanding of programming concepts they can know what to look for when they 're learning a new language ( whatever it might be ) - they will know that they have to learn the syntax for control structures in the new language ( for , while , if , etc ) as well know they 'll have to find out more esoteric language specific concepts like how do I create functions and libraries ?
How do I use them ? ALL THAT being said , yes FORTRAN is a critical language to know with the sciences , because of the availability of libraries .
HOWEVER , many of those libraries are now available in other languages and/or can be called from a different language via an abstraction ( a concept that would be taught in a more general computing course )</tokentext>
<sentencetext>Ok here goes:
    Should science undergraduates be taught Fortran?
Yes
    Should it be the FIRST language, NO, not any moreSo much of science, especially physics, is done on computers now - as both a software engineer and someone transitioning into Physics I ran into many people that had severe problems learning FORTRAN and applying it to problems.
I really feel science students should have a couple of general courses in programming in C before moving on to other languages or even programming classes specific to their science.
Here's the reasoning: A) Science students need to learn programming basics away from the pressure of also learning within their science field at the same time - if you're learning the science at the same time, the actual basic programming concepts get lost and muddied with the science being learned. B) It can allow a science major to learn the concepts of programming in a general purpose language without muddying it with a lot of OS specific, library specific, attitude specific usage (aside from the compiler use). C) There is a C compiler on almost every system you will most likely use in your lifetime as a scientist. D) C has enough structure to be "readable", but doesn't have so many constraints that it has problems being fast. E) C syntax is the basis for many other programming languages including Python and Java (both of which are heavily used in science as well). And finally, if a science major has a good understanding of programming concepts they can know what to look for when they're learning a new language (whatever it might be) - they will know that they have to learn the syntax for control structures in the new language (for, while, if, etc) as well as know they'll have to find out more esoteric language specific concepts like how do I create functions and libraries?
How do I use them? ALL THAT being said, yes, FORTRAN is a critical language to know in the sciences, because of the availability of libraries.
HOWEVER, many of those libraries are now available in other languages and/or can be called from a different language via an abstraction (a concept that would be taught in a more general computing course)</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292729</id>
	<title>Yes.</title>
	<author>Anonymous</author>
	<datestamp>1244731380000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I had to suffer with Hollerith so why shouldn't they.</p></htmltext>
<tokenext>I had to suffer with Hollerith so why should n't they .</tokentext>
<sentencetext>I had to suffer with Hollerith so why shouldn't they.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28317201</id>
	<title>Re:PYTHON????</title>
	<author>sciurus0</author>
	<datestamp>1244825460000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>There's <a href="http://www.scipy.org/PerformancePython" title="scipy.org">a  lot of alternatives</a> [scipy.org] in between pure python and pure C or fortran that can be speedy to write and run.</p></htmltext>
<tokenext>There 's a lot of alternatives [ scipy.org ] in between pure python and pure C or fortran that can be speedy to write and run .</tokentext>
<sentencetext>There's a  lot of alternatives [scipy.org] in between pure python and pure C or fortran that can be speedy to write and run.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293311</id>
	<title>Bring back Basic and FortranII</title>
	<author>cvtan</author>
	<datestamp>1244733300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>When I started working at Kodak in 1983 I had to use a Tektronix Basic computer with 32k memory and 8-in floppy drives.  The language would only allow variable names that were two characters so you would have names like V9 or A2.  Of course after a while you ending up using I1 and O0 so on a dot matrix printout the code was unreadable.

On a PDP-11 I used FortranII.  If the program was too large I would get "Fortran Start Fail" error messages and would remove lines of code until it ran.  Occasionally I would have syntax error messages like "Error 142 on line 629" and find that there was no line 629...  (Insert Twilight Zone theme here.)</htmltext>
<tokenext>When I started working at Kodak in 1983 I had to use a Tektronix Basic computer with 32k memory and 8-in floppy drives .
The language would only allow variable names that were two characters so you would have names like V9 or A2 .
Of course after a while you ended up using I1 and O0 so on a dot matrix printout the code was unreadable .
On a PDP-11 I used FortranII .
If the program was too large I would get " Fortran Start Fail " error messages and would remove lines of code until it ran .
Occasionally I would have syntax error messages like " Error 142 on line 629 " and find that there was no line 629... ( Insert Twilight Zone theme here .
)</tokentext>
<sentencetext>When I started working at Kodak in 1983 I had to use a Tektronix Basic computer with 32k memory and 8-in floppy drives.
The language would only allow variable names that were two characters so you would have names like V9 or A2.
Of course after a while you ended up using I1 and O0 so on a dot matrix printout the code was unreadable.
On a PDP-11 I used FortranII.
If the program was too large I would get "Fortran Start Fail" error messages and would remove lines of code until it ran.
Occasionally I would have syntax error messages like "Error 142 on line 629" and find that there was no line 629...  (Insert Twilight Zone theme here.
)</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303057</id>
	<title>Re:While there may be "newer" languages</title>
	<author>jrumney</author>
	<datestamp>1244729220000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I'd have thought a higher level specialised language like GNU Octave or its commercial equivalent would be a good first introduction to computer languages for maths and science majors.</htmltext>
<tokenext>I 'd have thought a higher level specialised language like GNU Octave or its commercial equivalent would be a good first introduction to computer languages for maths and science majors .</tokentext>
<sentencetext>I'd have thought a higher level specialised language like GNU Octave or its commercial equivalent would be a good first introduction to computer languages for maths and science majors.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303769</id>
	<title>Re:Python is not programming.</title>
	<author>ceoyoyo</author>
	<datestamp>1244735520000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>If you're talking Objective-C, try out distributed objects and you'll spit every time you see the word MPI.</p></htmltext>
<tokenext>If you 're talking Objective-C , try out distributed objects and you 'll spit every time you see the word MPI .</tokentext>
<sentencetext>If you're talking Objective-C, try out distributed objects and you'll spit every time you see the word MPI.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292033</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294541</id>
	<title>I RTFA.</title>
	<author>DaveV1.0</author>
	<datestamp>1244737980000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>What I got from the article is:</p><ol><li>The author was a physics student.</li><li>The author thinks FORTRAN is a great tool for engineering and science.</li><li>The author and some of his friends had a hard time learning FORTRAN.</li><li>The author thinks FORTRAN should be dropped in favor of something else, like Python</li></ol><p>The most telling points in the article are:</p><blockquote><div><p>Did you learn FORTRAN at University?  Are you still using it?  If you answer yes to both of those questions then there is a high probability that you are still involved in research or that advanced numerical analysis is the mainstay of your job.  I know a lot of people in the (non-academic) IT industry - many of them were undergraduates in Physics or Chemistry and so they learned FORTRAN.  They don't use it anymore.</p></div> </blockquote><p>In other words, he knows a lot of people who are now working out of their degree field and aren't using the programming language they learned specifically for said degree field. Apparently, this is his justification for suggesting the removal of FORTRAN programming from undergraduate science and engineering courses.</p><p>Here is a neat idea: If you are going to work in IT, get an IT degree so you will know IT things. Don't get a physics or chemistry degree where you will learn things specific to physics or chemistry then complain that those degrees didn't prepare you for your career.</p><p>I will give him one point. It may be worthwhile to have an intro to programming course that uses something a bit simpler than FORTRAN, but if one is going to be doing serious science and engineering (you know, what the degree is about) one should be required to learn FORTRAN.</p>
	</htmltext>
<tokenext>What I got from the article is : The author was a physics student.The author thinks FORTRAN is a great tool for engineering and science.The author and some of his friends had a hard time learning FORTRAN.The author thinks FORTRAN should be dropped in favor of something else , like PythonThe most telling points in the article are : Did you learn FORTRAN at University ?
Are you still using it ?
If you answer yes to both of those questions then there is a high probability that you are still involved in research or that advanced numerical analysis is the mainstay of your job .
I know a lot of people in the ( non-academic ) IT industry - many of them were undergraduates in Physics or Chemistry and so they learned FORTRAN .
They do n't use it anymore .
In other words , he knows a lot of people who are now working out of their degree field and are n't using the programming language they learned specifically for said degree field .
Apparently , this is his justification for suggesting the removal of FORTRAN programming from undergraduate science and engineering courses . Here is a neat idea : If you are going to work in IT , get an IT degree so you will know IT things .
Do n't get a physics or chemistry degree where you will learn things specific to physics or chemistry then complain that those degrees did n't prepare you for your career . I will give him one point .
It may be worthwhile to have an intro to programming course that uses something a bit simpler than FORTRAN , but if one is going to be doing serious science and engineering ( you know , what the degree is about ) one should be required to learn FORTRAN .</tokentext>
<sentencetext>What I got from the article is: The author was a physics student. The author thinks FORTRAN is a great tool for engineering and science. The author and some of his friends had a hard time learning FORTRAN. The author thinks FORTRAN should be dropped in favor of something else, like Python. The most telling points in the article are: Did you learn FORTRAN at University?
Are you still using it?
If you answer yes to both of those questions then there is a high probability that you are still involved in research or that advanced numerical analysis is the mainstay of your job.
I know a lot of people in the (non-academic) IT industry - many of them were undergraduates in Physics or Chemistry and so they learned FORTRAN.
They don't use it anymore.
In other words, he knows a lot of people who are now working out of their degree field and aren't using the programming language they learned specifically for said degree field.
Apparently, this is his justification for suggesting the removal of FORTRAN programming from undergraduate science and engineering courses. Here is a neat idea: If you are going to work in IT, get an IT degree so you will know IT things.
Don't get a physics or chemistry degree where you will learn things specific to physics or chemistry then complain that those degrees didn't prepare you for your career. I will give him one point.
It may be worthwhile to have an intro to programming course that uses something a bit simpler than FORTRAN, but if one is going to be doing serious science and engineering (you know, what the degree is about) one should be required to learn FORTRAN.
	</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28299461</id>
	<title>Wrong</title>
	<author>Anonymous</author>
	<datestamp>1244711940000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Teach them C or assembly first. They will thank you later when they have switched to Python because of experience, not because you forced it down their throats.</p></htmltext>
<tokenext>Teach them C or assembly first .
They will thank you later when they have switched to Python because of experience , not because you forced it down their throats .</tokentext>
<sentencetext>Teach them C or assembly first.
They will thank you later when they have switched to Python because of experience, not because you forced it down their throats.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292309</id>
	<title>FI^RST</title>
	<author>Anonymous</author>
	<datestamp>1244730000000</datestamp>
	<modclass>Troll</modclass>
	<modscore>-1</modscore>
	<htmltext>briilIant plan Dying' crowd -</htmltext>
<tokenext>briilIant plan Dying ' crowd -</tokentext>
<sentencetext>briilIant plan Dying' crowd -</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298443</id>
	<title>assembly language</title>
	<author>Khashishi</author>
	<datestamp>1244751480000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Assembly language is almost always totally impractical to actually use, but gives tremendous insight into what a computer <i>is</i> and what the limitations are.</p></htmltext>
<tokenext>Assembly language is almost always totally impractical to actually use , but gives tremendous insight into what a computer is and what the limitations are .</tokentext>
<sentencetext>Assembly language is almost always totally impractical to actually use, but gives tremendous insight into what a computer is and what the limitations are.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292059</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293389</id>
	<title>Fortran: from a TA's perspective</title>
	<author>Anubis IV</author>
	<datestamp>1244733540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>As someone who has actually helped to teach Fortran at the university level (I was TA for the course, just this last semester), here are two reasons I think Fortran is still useful to teach:
<br>
<br>
1) It's still relevant.  As others have pointed out, libraries have been highly optimized over decades of use and reuse, and it's hard to throw that legacy code out for a newer language, especially when there is no benefit to doing so.  In many cases, you would actually be trading better performance for worse, even, and when you're dealing with things like weather models or geological simulations, raw performance means extra/better information.  So, much as I may personally not find Fortran useful and wish it would go away, the fact of the matter is that there is plenty of legacy code around that has been kept around for a good reason.  Before I ever worked as a TA at my university, I interned for two summers at Lockheed Martin Space Operations working on, guess what, Fortran code that was being used for meteorological purposes (the code had edits that were initialed by my office mates from before my birth, which was an odd experience, to say the least).
<br>
<br>
2) It's a good place to start.  It may not take up the mantle of OOP like Java or even C++, but Fortran is a great starter language since it is syntactically very simple, which lets the students really dive right in and not worry about getting hung up on things like semicolons or curly braces.  Having TAd C and C++ in previous semesters, I cannot tell you how many concepts students simply misunderstood or never used because of the fact that they were still struggling with syntax.  While all of these things are painfully simple to you and me, remember that these students are coming from a place where an if statement will take a day to explain, and arrays are pushing them pretty hard at the time.
<br>
<br>
As a very quick example, I probably had dozens of students out of the 130 in my last C/C++ class use the phrase "if loop" at some point in the lab or in office hours, since if and while statements look very similar syntactically.  In contrast, I never once had that issue with the 50 students in my last Fortran class.  To go along with that little fact, in the C/C++ class, it was not at all uncommon, despite frequent correction, for students to persist in trying to misuse an if statement as a loop, but I think I only saw that attempted once in the Fortran class, simply because the syntax between a loop and a conditional is so drastically different and dead simple.
<br>
<br>
When the language is simple syntactically, it means that they can focus more on the concepts and less on the language, which is really the ideal situation.  I had more students who took Fortran come to me at the end of the semester, express an interest in learning more languages, and ask what additional or more advanced courses they could take, than I ever did when TAing C and C++.  And for an intro-level course, the concepts are really the important part, not the syntax, since you want them to be able to take what they learn and apply it practically, whether that means Excel macros or just the ability to communicate an idea logically to a developer.
<br>
<br>
---
<br>
<br>
Fortran and I have a love-hate relationship.  Compared to more modern languages, it's not as pure, it's not as easy to program (for someone from a CS background), it's not as convenient, and it's not as concise.  All of that said, it still has raw practicality going for it, as well as a low learning curve and continued widespread use, which makes it, in my opinion, great for teaching to students in related fields.  It's hard to make a decent case against Fortran.  Even so, my university has cancelled the class for upcoming semesters, so, after decades of being taught here, this last semester that I TAd really IS the last semester of Fortran at my university, period.  Maybe they know something I don't.</htmltext>
<tokenext>As someone who has actually helped to teach Fortran at the university level ( I was TA for the course , just this last semester ) , here are two reasons I think Fortran is still useful to teach : 1 ) It 's still relevant .
As others have pointed out , libraries have been highly optimized over decades of use and reuse , and it 's hard to throw that legacy code out for a newer language , especially when there is no benefit to doing so .
In many cases , you would actually be trading better performance for worse , even , and when you 're dealing with things like weather models or geological simulations , raw performance means extra/better information .
So , much as I may personally not find Fortran useful and wish it would go away , the fact of the matter is that there is plenty of legacy code around that has been kept around for a good reason .
Before I ever worked as a TA at my university , I interned for two summers at Lockheed Martin Space Operations working on , guess what , Fortran code that was being used for meteorological purposes ( the code had edits that were initialed by my office mates from before my birth , which was an odd experience , to say the least ) .
2 ) It 's a good place to start .
It may not take up the mantle of OOP like Java or even C + + , but Fortran is a great starter language since it is syntactically very simple , which lets the students really dive right in and not worry about getting hung up on things like semicolons or curly braces .
Having TAd C and C + + in previous semesters , I can not tell you how many concepts students simply misunderstood or never used because of the fact that they were still struggling with syntax .
While all of these things are painfully simple to you and me , remember that these students are coming from a place where an if statement will take a day to explain , and arrays are pushing them pretty hard at the time .
As a very quick example , I probably had dozens of students out of the 130 in my last C/C + + class use the phrase " if loop " at some point in the lab or in office hours , since if and while statements look very similar syntactically .
In contrast , I never once had that issue with the 50 students in my last Fortran class .
To go along with that little fact , in the C/C + + class , it was not at all uncommon , despite frequent correction , for students to persist in trying to misuse an if statement as a loop , but I think I only saw that attempted once in the Fortran class , simply because the syntax between a loop and a conditional is so drastically different and dead simple .
When the language is simple syntactically , it means that they can focus more on the concepts and less on the language , which is really the ideal situation .
I had more students who took Fortran come to me at the end of the semester , express an interest in learning more languages , and ask what additional or more advanced courses they could take , than I ever did when TAing C and C + + .
And for an intro-level course , the concepts are really the important part , not the syntax , since you want them to be able to take what they learn and apply it practically , whether that means Excel macros or just the ability to communicate an idea logically to a developer .
--- Fortran and I have a love-hate relationship .
Compared to more modern languages , it 's not as pure , it 's not as easy to program ( for someone from a CS background ) , it 's not as convenient , and it 's not as concise .
All of that said , it still has raw practicality going for it , as well as a low learning curve and continued widespread use , which makes it , in my opinion , great for teaching to students in related fields .
It 's hard to make a decent case against Fortran .
Even so , my university has cancelled the class for upcoming semesters , so , after decades of being taught here , this last semester that I TAd really IS the last semester of Fortran at my university , period .
Maybe they know something I do n't .</tokentext>
<sentencetext>As someone who has actually helped to teach Fortran at the university level (I was TA for the course, just this last semester), here are two reasons I think Fortran is still useful to teach:


1) It's still relevant.
As others have pointed out, libraries have been highly optimized over decades of use and reuse, and it's hard to throw that legacy code out for a newer language, especially when there is no benefit to doing so.
In many cases, you would actually be trading better performance for worse, even, and when you're dealing with things like weather models or geological simulations, raw performance means extra/better information.
So, much as I may personally not find Fortran useful and wish it would go away, the fact of the matter is that there is plenty of legacy code around that has been kept around for a good reason.
Before I ever worked as a TA at my university, I interned for two summers at Lockheed Martin Space Operations working on, guess what, Fortran code that was being used for meteorological purposes (the code had edits that were initialed by my office mates from before my birth, which was an odd experience, to say the least).
2) It's a good place to start.
It may not take up the mantle of OOP like Java or even C++, but Fortran is a great starter language since it is syntactically very simple, which lets the students really dive right in and not worry about getting hung up on things like semicolons or curly braces.
Having TAd C and C++ in previous semesters, I cannot tell you how many concepts students simply misunderstood or never used because of the fact that they were still struggling with syntax.
While all of these things are painfully simple to you and me, remember that these students are coming from a place where an if statement will take a day to explain, and arrays are pushing them pretty hard at the time.
As a very quick example, I probably had dozens of students out of the 130 in my last C/C++ class use the phrase "if loop" at some point in the lab or in office hours, since if and while statements look very similar syntactically.
In contrast, I never once had that issue with the 50 students in my last Fortran class.
To go along with that little fact, in the C/C++ class, it was not at all uncommon, despite frequent correction, for students to persist in trying to misuse an if statement as a loop, but I think I only saw that attempted once in the Fortran class, simply because the syntax between a loop and a conditional is so drastically different and dead simple.
When the language is simple syntactically, it means that they can focus more on the concepts and less on the language, which is really the ideal situation.
I had more students who took Fortran come to me at the end of the semester, express an interest in learning more languages, and ask what additional or more advanced courses they could take, than I ever did when TAing C and C++.
And for an intro-level course, the concepts are really the important part, not the syntax, since you want them to be able to take what they learn and apply it practically, whether that means Excel macros or just the ability to communicate an idea logically to a developer.
---


Fortran and I have a love-hate relationship.
Compared to more modern languages, it's not as pure, it's not as easy to program (for someone from a CS background), it's not as convenient, and it's not as concise.
All of that said, it still has raw practicality going for it, as well as a low learning curve and continued widespread use, which makes it, in my opinion, great for teaching to students in related fields.
It's hard to make a decent case against Fortran.
Even so, my university has cancelled the class for upcoming semesters, so, after decades of being taught here, this last semester that I TAd really IS the last semester of Fortran at my university, period.
Maybe they know something I don't.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292359</id>
	<title>Re:While there may be "newer" languages</title>
	<author>wireloose</author>
	<datestamp>1244730180000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I believe I stated that it was still "one of the best, fastest, most optimized" languages.  I did not say in any way that it was "the best" but I did imply that it was a good one.  I offered that as a response to the OP OP.<br> <br>

If you want citations, find your own.  I only offer this one:  <a href="http://en.wikipedia.org/wiki/Python_(programming_language)" title="wikipedia.org">http://en.wikipedia.org/wiki/Python_(programming_language)</a> [wikipedia.org]  - note that Python is considered primarily a scripting language even by today's wiki authors.  If you have an issue with that, correct the wiki.<br> <br>

There are plenty of newer languages with better features.  C and C++ are both in the same category, although you can argue that it's a bit more complicated to write your first C program than your first Fortran program if you're new to programming.  C++ offers a few more challenges.  Remember that we're talking about engineers here and not computer scientists.  They don't all arrive at college with programming backgrounds.</htmltext>
<tokenext>I believe I stated that it was still " one of the best , fastest , most optimized " languages .
I did not say in any way that it was " the best " but I did imply that it was a good one .
I offered that as a response to the OP OP .
If you want citations , find your own .
I only offer this one : http : //en.wikipedia.org/wiki/Python \ _ ( programming \ _language ) [ wikipedia.org ] - note that Python is considered primarily a scripting language even by today 's wiki authors .
If you have an issue with that , correct wiki .
There are plenty of newer languages with better features .
C and C + + are both in the same category , although you can argue that it 's a bit more complicated to write your first C program than your first Fortran program if you 're new to programming .
C + + offers a few more challenges .
Remember that we 're talking about engineers here and not computer scientists .
They do n't all arrive at college with programming backgrounds .</tokentext>
<sentencetext>I believe I stated that it was still "one of the best, fastest, most optimized" languages.
I did not say in any way that it was "the best" but I did imply that it was a good one.
I offered that as a response to the OP OP.
If you want citations, find your own.
I only offer this one:  http://en.wikipedia.org/wiki/Python_(programming_language) [wikipedia.org]  - note that Python is considered primarily a scripting language even by today's wiki authors.
If you have an issue with that, correct the wiki.
There are plenty of newer languages with better features.
C and C++ are both in the same category, although you can argue that it's a bit more complicated to write your first C program than your first Fortran program if you're new to programming.
C++ offers a few more challenges.
Remember that we're talking about engineers here and not computer scientists.
They don't all arrive at college with programming backgrounds.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303623</id>
	<title>Re:Not so easy</title>
	<author>ceoyoyo</author>
	<datestamp>1244734560000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I've never done much more in Fortran than translate it to something else, but I've seen a LOT of awful MatLab code.  With the addition of a few extra modules to Python you can make it look very much like MatLab.</p></htmltext>
<tokenext>I 've never done much more in Fortran than translate it to something else , but I 've seen a LOT of awful MatLab code .
With the addition of a few extra modules to Python you can make it look very much like MatLab .</tokentext>
<sentencetext>I've never done much more in Fortran than translate it to something else, but I've seen a LOT of awful MatLab code.
With the addition of a few extra modules to Python you can make it look very much like MatLab.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292331</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294755</id>
	<title>Fortran is far too high level.</title>
	<author>argent</author>
	<datestamp>1244738760000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Everyone should have at least one semester in PDP-11 assembly language.</p></htmltext>
<tokenext>Everyone should have at least one semester in PDP-11 assembly language .</tokentext>
<sentencetext>Everyone should have at least one semester in PDP-11 assembly language.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292005</id>
	<title>Can't change overnight</title>
	<author>fermion</author>
	<datestamp>1244728800000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I have been reading about the advances in Python for the physical sciences.  I think at some point it will be a major player, and it would probably be a good idea for the undergraduate to learn it.
<p>
OTOH, we tend to teach what we know and what has worked well.  This has been Fortran.  It is stable, which means it is easy to get and easy to teach.  Most universities have extensive mathematical and scientific libraries, which means that unless one is teaching programming, all that is really necessary is some glue code to put the functions together.
</p><p>
As an aside, it is also moderately difficult to debug.  While this is annoying to the undergraduate, debugging a Fortran program does teach important investigative skills and encourages a level of precision that is not so necessary in some more modern languages.</p></htmltext>
<tokenext>I have been reading about the advances in Python for the physical sciences .
I think at some point it will be a major player , and it would probably be a good idea for the undergraduate to learn it .
OTOH , we tend to teach what we know and what has worked well .
This has been Fortran .
It is stable , which means it is easy to get and easy to teach .
Most universities has extensive mathematical and scientific libraries , which means that unless one is teaching programming , all that is really neccesary is some glue code to put the functions together .
As an aside , it is also moderately difficult to debug .
While this is annoying to the undergraduate , debugging a fortran program does teach important investigative skills and encourages a level of precision that is not so necessary in some more modern languages .</tokentext>
<sentencetext>I have been reading about the advances in Python for the physical sciences.
I think at some point it will be a major player, and it would probably be a good idea for the undergraduate to learn it.
OTOH, we tend to teach what we know and what has worked well.
This has been Fortran.
It is stable, which means it is easy to get and easy to teach.
Most universities has extensive mathematical and scientific libraries, which means that unless one is teaching programming, all that is really neccesary is some glue code to put the functions together.
As an aside, it is also moderately difficult to debug.
While this is annoying to the undergraduate, debugging a fortran program does teach important investigative skills and encourages a level of precision that is not so necessary in some more modern languages.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292433</id>
	<title>Re:libraries. gigabytes of libraries</title>
	<author>Anonymous</author>
	<datestamp>1244730420000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>This is why they should be teaching the science undergrads Python. When they get to the point that they need to do some serious number crunching, F2PY allows them to interface to any Fortran library.</p><p>http://www.scipy.org/F2py  F2PY is part of the SciPy project to create a science-oriented distro of Python.</p></htmltext>
<tokenext>This is why they should be teaching the science undergrads Python .
When they get to the point that they need to do some serious numbercrunching , F2PY allows them to interface to any Fortran library.http : //www.scipy.org/F2py F2PY is part of the SCIPY project to create a science-oriented distro of Python .</tokentext>
<sentencetext>This is why they should be teaching the science undergrads Python.
When they get to the point that they need to do some serious numbercrunching, F2PY allows them to interface to any Fortran library.http://www.scipy.org/F2py  F2PY is part of the SCIPY project to create a science-oriented distro of Python.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919</parent>
</comment>
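The comment above describes calling Fortran libraries from Python via F2PY. A minimal sketch of that idea follows; the F2PY build step needs a Fortran compiler, so the module and file names in the comments are hypothetical, and the runnable part uses NumPy's `linalg.solve`, which already dispatches to the Fortran LAPACK routines under the hood:

```python
# Sketch of the F2PY workflow described above (hypothetical names;
# building a real wrapper requires a Fortran compiler):
#
#   f2py -c -m flib mysub.f90     # wrap mysub.f90 into a Python module 'flib'
#   import flib; flib.mysub(...)  # call the Fortran routine from Python
#
# The same "Fortran under the hood" idea ships with NumPy itself:
# numpy.linalg.solve hands the work to Fortran LAPACK.
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)  # LAPACK (Fortran) does the actual solving
print(x)
```

On this small system the solution is x = [2, 3]; the point is only that Python code reaches compiled Fortran with one call.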
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292507</id>
	<title>how to program is most important</title>
	<author>cats-paw</author>
	<datestamp>1244730660000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>As many on these hallowed pages have pointed out, the focus needs to be on how to program, not necessarily the language.</p><p>As for those who think that languages like Python are too slow, they are being too simplistic.</p><p>Writing an LU factorization routine in Python for real world use is silly.</p><p>Writing such a routine in Python for educational reasons is perfectly fine.</p><p>And using such a routine for real work will involve calling a LAPACK routine _from_ Python, and will<br>work great until the problem size gets too large.</p><p>So numerical work in Python or similar is perfectly fine, just as long as you use the right tool for the job.</p><p>So back to that FORTRAN thing... Why not?  Ultimately it depends.  I'd say that other languages _are_ more suitable unless the student is going on to a life of hard-core number crunching.  Dedicated simulation tools and programs like Mathematica and MATLAB are really what people use for everyday work.</p></htmltext>
<tokenext>As many on these hallowed pages have pointed out , the focus needs to be on how to program , not necessarily the language.As for those who think that languages like Python are too slow , they are being too simplistic.Writing an LU factorization routine in Python for real world use is silly.Writing such a routine in Python for educational reasons is perfectly fine.And using such a routine for real work will involve calling a LAPACK routine \ _from \ _ Python , and willwork great until the problem size is n't does n't get too large.So numerical work in Python or similar is perfectly fine , just as long as you use the right tool for the job.So back to that FORTRAN thing... Why not ?
Ultimately it depends .
I 'd say that other languages \ _are \ _ more suitable unless the student is going on to a life of hard-core number crunching .
Dedicated simulation tools and programs like Mathematica and MATLAB are really what people use for everyday work .</tokentext>
<sentencetext>As many on these hallowed pages have pointed out, the focus needs to be on how to program, not necessarily the language.As for those who think that languages like Python are too slow, they are being too simplistic.Writing an LU factorization routine in Python for real world use is silly.Writing such a routine in Python for educational reasons is perfectly fine.And using such a routine for real work will involve calling a LAPACK routine \_from\_ Python, and willwork great until the problem size isn't doesn't get too large.So numerical work in Python or similar is perfectly fine, just as long as you use the right tool for the job.So back to that FORTRAN thing... Why not ?
Ultimately it depends.
I'd say that other languages \_are\_ more suitable unless the student is going on to a life of hard-core number crunching.
Dedicated simulation tools and programs like Mathematica and MATLAB are really what people use for everyday work.</sentencetext>
</comment>
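In the spirit of the comment above, which says an LU factorization in Python is fine for teaching but not for production, here is a minimal educational sketch (Doolittle's method, no pivoting, so it assumes no zero pivots arise; `lu_decompose` is an illustrative name, not a library function):

```python
# Educational LU factorization (Doolittle, no pivoting) in pure Python.
# Fine for learning; real work should call LAPACK instead.
def lu_decompose(A):
    """Return (L, U) with A = L*U, L unit lower triangular, U upper triangular."""
    n = len(A)
    L = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    U = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i, n):       # fill row i of U
            U[i][j] = A[i][j] - sum(L[i][k] * U[k][j] for k in range(i))
        for j in range(i + 1, n):   # fill column i of L
            L[j][i] = (A[j][i] - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    return L, U

A = [[4.0, 3.0],
     [6.0, 3.0]]
L, U = lu_decompose(A)
```

For this A the factors come out as L = [[1, 0], [1.5, 1]] and U = [[4, 3], [0, -1.5]], and multiplying them reproduces A.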
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294097</id>
	<title>Re:I still use Fortran for sciantific calculations</title>
	<author>UID30</author>
	<datestamp>1244736300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Lesson learned?  C code written by people who have programmed Fortran for 20+ years runs slower than Fortran code written by people who have programmed Fortran for 20+ years.
<br> <br>
It is incredibly easy to write inefficient C code, and only years of exposure will give you the understanding necessary to write truly efficient C code.
<br> <br>
Unlike your typical /. post, I present evidence, <a href="http://en.wikipedia.org/wiki/Duff's_device" title="wikipedia.org">Duff's Device</a> [wikipedia.org], of what an efficient C coder can do.</htmltext>
<tokenext>Lesson learned ?
C code written by people who have programmed Fortran for 20 + years runs slower than Fortran code written by people who have programmed Fortran for 20 + years .
It is incredibly easy to write inefficient C code , and only years of exposure will give you the understanding necessary to write truly efficient C code .
Unlike your typical / .
post , I present evidence , Duff 's Device [ wikipedia.org ] , of what an efficient C coder can do .</tokentext>
<sentencetext>Lesson learned?
C code written by people who have programmed Fortran for 20+ years runs slower than Fortran code written by people who have programmed Fortran for 20+ years.
It is incredibly easy to write inefficient C code, and only years of exposure will give you the understanding necessary to write truly efficient C code.
Unlike your typical /.
post, I present evidence,  Duff's Device [wikipedia.org], of what an efficient C coder can do.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291925</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292483</id>
	<title>Re:University != Trade school</title>
	<author>Anonymous</author>
	<datestamp>1244730600000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>4</modscore>
	<htmltext><p>We're not talking CS here, we're talking Engineering. Teaching them a specific language used in their field /is/ teaching them core principles and methods. Think of it like a basic diffeq class, giving them the tools to be able to learn their field, as opposed to more advanced math classes that underlie diffeq.</p></htmltext>
<tokenext>We 're not talking CS here , we 're talking Engineering .
Teaching them a specific language used in their field /is/ teaching them core principles and methods .
Think of it like a basic diffeq class , giving them the tools to be able to learn their field , as opposed to more advanced math classes that underly diffeq .</tokentext>
<sentencetext>We're not talking CS here, we're talking Engineering.
Teaching them a specific language used in their field /is/ teaching them core principles and methods.
Think of it like a basic diffeq class, giving them the tools to be able to learn their field, as opposed to more advanced math classes that underly diffeq.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297461</id>
	<title>Ab-solute-ly!</title>
	<author>BruceSchaller</author>
	<datestamp>1244748000000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I really wish that I had experience in programming in FORTRAN before I started my current project.  It's very fast, simply because it is so lightweight.  We're doing reservoir simulations, i.e., how much CO2 one can put in a rock formation.  The extensive programs already written (and often free or inexpensive) for FORTRAN make it a great language.  The only problem is that thought must be put into threading FORTRAN applications.  MATLAB and Octave already handle threading for many programs.  This gives a great advantage to the user, who can easily scale to cluster-size computing without a lot of thought.</htmltext>
<tokenext>I really wish that I had experience in programming in FORTRAN before I started my current project .
It 's very fast , simply because it is so lightweight .
We 're doing resevoir simulations...ie....how much CO2 can one put in a rock formation .
The extensive programs already written ( and often free or inexpensive ) for FORTRAN make it a great language .
The only problem is....thought must be put into threading FORTRAN applications .
MATLAB and Octave already handle threading many programs .
This gives great advantage to the user , who can easily scale to cluster-size computing without a lot of thought .</tokentext>
<sentencetext>I really wish that I had experience in programming in FORTRAN before I started my current project.
It's very fast, simply because it is so lightweight.
We're doing resevoir simulations...ie....how much CO2 can one put in a rock formation.
The extensive programs already written (and often free or inexpensive) for FORTRAN make it a great language.
The only problem is....thought must be put into threading FORTRAN applications.
MATLAB and Octave already handle threading many programs.
This gives great advantage to the user, who can easily scale to cluster-size computing without a lot of thought.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292777</id>
	<title>Newsflash - science is real</title>
	<author>SuperKendall</author>
	<datestamp>1244731500000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext><p><i>IMO universities should be teaching core principles and methods, not attempting to impart up-to-date job skills.</i></p><p>IMO, Fortran is not about "imparting up-to-date job skills" as much as showing students a powerful tool to accomplish a high-level task that they'd otherwise have to learn more programming to do - and that takes from time spent on the science they are trying to learn.</p><p>Just because something is real does not make it a "trade skill" with all of the scorn you heaped upon it so bountifully.</p></htmltext>
<tokenext>IMO universities should be teaching core principles and methods , not attempting to impart up-to-date job skills.IMIO , Fortran is not about " imparting up to the date job skills " as much as showing students a powerful tool to accomplish a high-level task that they 'd otherwise have to learn more programming to do - and that takes from time spent with the science they are trying to learn.Just because something is real does not make it a " trade skill " with al of the scorn you heaped upon it bountifully .</tokentext>
<sentencetext>IMO universities should be teaching core principles and methods, not attempting to impart up-to-date job skills.IMIO, Fortran is not about "imparting up to the date job skills" as much as showing students a powerful tool to accomplish a high-level task that they'd otherwise have to learn more programming to do - and that takes from time spent with the science they are trying to learn.Just because something is real does not make it a "trade skill" with al of the scorn you heaped upon it bountifully.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292325</id>
	<title>python?</title>
	<author>Anonymous</author>
	<datestamp>1244730060000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Yeah--let me know when python is even 50% as fast as optimized fortran out of the box.  Look--I've used numpy, scipy and all that--python for distributed computing.  And it's *great* stuff.  But when it comes right down to it--while you can do numeric computation in python--it just isn't as snappy.  C, Fortran--they're still faster, and they probably will be for some time.</p><p>It isn't the byte code--today's CPU caching and branch prediction seem to handle all that...it's the freakin' memory structure and GC as best I can tell.  Maybe if that gets better...problem solved.</p></htmltext>
<tokenext>Yeah--let me know when python is even 50 \ % as fast as optimized fortran out of the box .
Look--I 've used numpy , scipy and all that--python for distributed computing .
And it 's * great * stuff .
But when it comes right down to it--while you can do numeric computation in python--it just is n't as snappy .
C , Fortran--they 're still faster , and they probably will be for some time.It is n't the byte code--todays CPU caching and branch prediction seems to handle all that...it 's the freakin ' memory structure and GC as best I can tell .
Maybe if that gets better...problem solved .</tokentext>
<sentencetext>Yeah--let me know when python is even 50\% as fast as optimized fortran out of the box.
Look--I've used numpy, scipy and all that--python for distributed computing.
And it's *great* stuff.
But when it comes right down to it--while you can do numeric computation in python--it just isn't as snappy.
C, Fortran--they're still faster, and they probably will be for some time.It isn't the byte code--todays CPU caching and branch prediction seems to handle all that...it's the freakin' memory structure and GC as best I can tell.
Maybe if that gets better...problem solved.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297009</id>
	<title>Re:No fortran - just Python</title>
	<author>dodobh</author>
	<datestamp>1244746620000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext><p>Until you realise that Fortran has well tested and proven libraries which you would need to mostly reimplement in Python. See <a href="http://ask.slashdot.org/comments.pl?sid=1265269&amp;cid=28291919" title="slashdot.org">this comment</a> [slashdot.org] for example.</p></htmltext>
<tokenext>Until you realise that Fortran has well tested and proven libraries which you would need to mostly reimplement in Python .
See this comment [ slashdot.org ] for example .</tokentext>
<sentencetext>Until you realise that Fortran has well tested and proven libraries which you would need to mostly reimplement in Python.
See this comment [slashdot.org] for example.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292117</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292521</id>
	<title>"Objective" comparison</title>
	<author>Novus</author>
	<datestamp>1244730720000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Using the criteria specified by Mannila &amp; de Raadt [1], the answer is: NO! As far as I can tell, Fortran is just as bad as C by their criteria.</p><p>[1] L. Mannila &amp; M. de Raadt: An objective comparison of languages for teaching introductory programming, Proceedings of the 6th Baltic Sea conference on Computing education research: Koli Calling 2006</p></htmltext>
<tokenext>Using the criteria specified by Mannila &amp; de Raadt [ 1 ] , the answer is : NO !
As far as I can tell , Fortran is just as bad as C by their criteria .
[ 1 ] L. Mannila &amp; M. de Raadt : An objective comparison of languages for teaching introductory programming , Proceedings of the 6th Baltic Sea conference on Computing education research : Koli Calling 2006</tokentext>
<sentencetext>Using the criteria specified by Mannila &amp; de Raadt [1], the answer is: NO!
As far as I can tell, Fortran is just as bad as C by their criteria.
[1] L. Mannila &amp; M. de Raadt: An objective comparison of languages for teaching introductory programming, Proceedings of the 6th Baltic Sea conference on Computing education research: Koli Calling 2006</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127</id>
	<title>PYTHON????</title>
	<author>Fantom42</author>
	<datestamp>1244729280000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>5</modscore>
	<htmltext><p>Are you serious?  Python?</p><p>I am somewhat a Python fanboy.  I love it.  It's freaking wonderful for prototyping and really has a great, natural flow that reminds me a lot of pseudocode I might just invent on a napkin.  Great language.  But it's also a factor of 30 times slower than a compiled language like C.</p><p>(http://www.osnews.com/story/5602/Nine_Language_Performance_Round-up_Benchmarking_Math_File_I_O/page3/)*</p><p>And Fortran is able to do optimizations (due to differences in the language for evaluation of expressions) that C is unable to do.  This has to do with guarantees of ordering that C gives but Fortran does not.  My point is that Fortran is even faster than C.  Why do you think it's still around?</p><p>The physical sciences aren't using a fast language because they are bored, or obsessed with speed for the hell of it.  They use it because the problems they solve are typically deep into polynomial space, like O(n^3) or O(n^4).  Having something 30 times faster means they can run 30 simulations instead of just 1.  It makes a big difference to them.</p><p>I think the author of this article has lost some of this perspective.</p><p>That said, what this article should have tackled is: what do we want to teach engineering students about computer science?  Right now, they take a class that teaches them C++, Java, Python, or whatever.  They get some procedural programming skills with maybe a tiny bit of object-oriented stuff (without really covering OO fundamentals IMHO, which are a more advanced topic) and they are thrown into a world where they are writing code in C for embedded controllers or Fortran for computational codes.  As a result, there is a huge body of code out there written by people who know how to get the job done, but don't exactly write code that is very maintainable.  They relearn the lessons of CS the hard way over 10-20-30-40(?) years of experience.
Are we really giving these young students (who are not CS majors) what they need?  What kind of curriculum would be ideal for someone who is going to end up writing code for something like a robot control system in C?</p><p>* I didn't really look too closely at this particular source, but I've seen numerous benchmarks all saying the same thing.  If you want a surprise, go look at how LISP stacks up compared to C.  It is better than you think.</p></htmltext>
<tokenext>Are you serious ?
Python ? I am somewhat a Python fan boy .
I love it .
Its freaking wonderful for prototyping and really has a great , natural flow that reminds me a lot of pseudocode I might just invent on a napkin .
Great language .
But its also a factor of 30 times slower than a compiled language like C. ( http : //www.osnews.com/story/5602/Nine \ _Language \ _Performance \ _Round-up \ _Benchmarking \ _Math \ _File \ _I \ _O/page3/ ) * And Fortran is able to do optimizations ( due to differences in the language for evaluation of expressions ) that C is unable to do .
This has to do with guarantees of ordering that Fortran does not give that C does .
My point is that Fortran is even faster than C. Why do you think its still around ? The physical sciences are n't using a fast language because they are bored , or obsessed with speed for the hell of it .
They use them because the problems they solve are typically deep into polynomial space , like O ( n ^ 3 ) or O ( n ^ 4 ) .
Having something 30 times faster means they can run 30 simulations instead of just 1 .
It makes a big difference to them.I think the author of this article has lost some of this perspective.That said , what this article should have tackled is , what do we want to teach engineering students about computer science ?
Right now , they take a class that teaches them C + + , Java , Python , or whatever .
They get some procedural programming skills with maybe a little tiny bit of object-oriented stuff ( without really covering OO fundamentals IMHO , which are a more advanced topic ) and they are thrown into a world where they are writing code in C for embedded controllers or Fortran for computational codes .
As a result , there is a huge body of code out there written by people who know how to get the job done , but do n't exactly write code that is very maintainable .
They relearn the lessons of CS he hard way over 10-20-30-40 ( ?
) years of experience .
Are we really giving these young students ( who are not CS majors ) what they need ?
What kind of curriculum would be ideal for someone who is going to end up writing code for something like a robot control system in C ?
* I did n't really look too closely at this particular source , but I 've seen numerous benchmarks all saying the same thing .
If you want a surprise , go look at how LISP stacks up compared to C. It is better than you think .</tokentext>
<sentencetext>Are you serious?
Python?I am somewhat a Python fan boy.
I love it.
Its freaking wonderful for prototyping and really has a great, natural flow that reminds me a lot of pseudocode I might just invent on a napkin.
Great language.
But its also a factor of 30 times slower than a compiled language like C.(http://www.osnews.com/story/5602/Nine\_Language\_Performance\_Round-up\_Benchmarking\_Math\_File\_I\_O/page3/)*And Fortran is able to do optimizations (due to differences in the language for evaluation of expressions) that C is unable to do.
This has to do with guarantees of ordering that Fortran does not give that C does.
My point is that Fortran is even faster than C.  Why do you think its still around?The physical sciences aren't using a fast language because they are bored, or obsessed with speed for the hell of it.
They use them because the problems they solve are typically deep into polynomial space, like O(n^3) or O(n^4).
Having something 30 times faster means they can run 30 simulations instead of just 1.
It makes a big difference to them.I think the author of this article has lost some of this perspective.That said, what this article should have tackled is, what do we want to teach engineering students about computer science?
Right now, they take a class that teaches them C++, Java, Python, or whatever.
They get some procedural programming skills with maybe a little tiny bit of object-oriented stuff (without really covering OO fundamentals IMHO, which are a more advanced topic) and they are thrown into a world where they are writing code in C for embedded controllers or Fortran for computational codes.
As a result, there is a huge body of code out there written by people who know how to get the job done, but don't exactly write code that is very maintainable.
They relearn the lessons of CS he hard way over 10-20-30-40(?
) years of experience.
Are we really giving these young students (who are not CS majors) what they need?
What kind of curriculum would be ideal for someone who is going to end up writing code for something like a robot control system in C?
* I didn't really look too closely at this particular source, but I've seen numerous benchmarks all saying the same thing.
If you want a surprise, go look at how LISP stacks up compared to C.  It is better than you think.</sentencetext>
</comment>
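The "factor of 30" interpreter overhead the comment above cites can be felt with nothing but the standard library: time the same reduction done in a Python-level loop versus the C-implemented builtin `sum()`. Absolute timings and the exact ratio depend entirely on the machine and interpreter, so treat this as a rough demonstration, not a benchmark:

```python
# Rough, machine-dependent illustration of Python's interpreter overhead:
# the identical reduction in a Python-level loop vs. the C-backed builtin.
import timeit

data = list(range(100_000))

def loop_sum(xs):
    total = 0
    for x in xs:        # each iteration pays bytecode-dispatch overhead
        total += x
    return total

t_loop = timeit.timeit(lambda: loop_sum(data), number=20)
t_builtin = timeit.timeit(lambda: sum(data), number=20)
print(f"python loop: {t_loop:.4f}s, builtin sum: {t_builtin:.4f}s")
```

Both compute the same value; only the time differs, which is exactly the gap the commenter says matters when the workload is O(n^3) simulation code.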
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28302965</id>
	<title>There should be a survey of languages class</title>
	<author>Anonymous</author>
	<datestamp>1244728320000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Where you spend one week on each important language and actually write a program in each.  Fortran, Pascal, Basic, Algol 68, Lisp, Scheme, C, C++, Java, and a few others.  A different survey of scripting languages would cover sh, ksh, bash, Tcl, Perl, and others.</p></htmltext>
<tokenext>Where you spend one week on each important languages and actually write a program in each language .
Fortran , Pascal , Basic , Algol 68 , Lisp , Scheme , C , C + + , java , and a few others .
A different survey of scripting languages would cover sh , ksh , bash , tcl , perl , and others .</tokentext>
<sentencetext>Where you spend one week on each important languages and actually write a program in each language.
Fortran, Pascal, Basic, Algol 68, Lisp, Scheme, C, C++ , java, and a few others.
A different survey of scripting languages would cover sh, ksh, bash, tcl, perl, and others.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294013</id>
	<title>What do I think ?</title>
	<author>mbone</author>
	<datestamp>1244736000000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><i> What do people in the Slashdot community think?"</i></p><p>That almost all of the physicists I know program in Fortran. People trying to do better than "double precision" tend to be fairly conservative about such matters.</p></htmltext>
<tokenext>What do people in the Slashdot community think ?
" That almost all of the physicists I know program in Fortran .
People trying to do better than " double precision " tend to be fairly conservative about such matters .</tokentext>
<sentencetext> What do people in the Slashdot community think?
"That almost all of the physicists I know program in Fortran.
People trying to do better than "double precision" tend to be fairly conservative about such matters.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293759</id>
	<title>Re:While there may be "newer" languages</title>
	<author>ObsessiveMathsFreak</author>
	<datestamp>1244735040000</datestamp>
	<modclass>Informative</modclass>
	<modscore>5</modscore>
	<htmltext><blockquote><div><p>BTW, a very ill-advised design choice of Python: <a href="http://www.python.org/dev/peps/pep-0211/" title="python.org">http://www.python.org/dev/peps/pep-0211/</a> [python.org] Ask any numerical analyst to know why it is a terrible idea to solve a linear system with inv(A)*b. But make sure you have at least half an hour free.</p></div></blockquote><p>To make a long story short: solving Ax=b by calculating x=inv(A)*b is a terrible idea because calculating inv(A) is an inherently difficult thing. While it would be extremely useful to have inv(A), it's not strictly necessary to obtain it in order to solve Ax=b.</p><p>At the most basic level, the technique most would be aware of to solve Ax=b is basic Gaussian elimination, with an augmented matrix and back substitution. In fact, this is often the very first thing people learn how to do in a linear algebra course. It isn't much better than finding the inverse, but it saves a lot of computation in the long run.</p><p>Of course there are many other techniques. Happily, however, most packages can now automatically make the best choice of which technique to use, depending on the properties of A. In Matlab and Octave, it all boils down to using the left division operator like so<br><tt>x=A\b</tt><br>instead of the inverse-calculating<br><tt>x=inv(A)*b</tt></p><p>Using the first command, Matlab and Octave will choose a technique that best suits the matrix A. <a href="http://www.mathworks.com/access/helpdesk/help/techdoc/ref/mldivide.html" title="mathworks.com">This page</a> [mathworks.com] has a list of all the techniques that Matlab can use to solve the linear system. To my knowledge, Octave has a number of techniques as well, but I'm not sure if it's as comprehensive as Matlab's. 
Also, Octave's left division operator has been known to have bugs.</p><p>And to return to the main topic, Octave and Matlab both use <a href="http://www.netlib.org/lapack/" title="netlib.org">LAPACK</a> [netlib.org] extensively, which is written completely in Fortran (and based on BLAS). There's really no other language for linear algebra.</p>
	</htmltext>
<tokenext>BTW , a very ill-advised design choice of Python : http : //www.python.org/dev/peps/pep-0211/ [ python.org ] Ask any numerical analyst to know why it is a terrible idea to solve a linear system with inv ( A ) * b. But make sure you have at least half an hour free.To make a long story short ; solving Ax = b by calculating x = inv ( A ) * b is a terrible idea because calculating inv ( A ) is an inherently difficult thing .
While it would be extremely useful to have inv ( A ) , it 's not strictly neccessary to obtain in in order to solve Ax = b.At the most basic level , the technique which most would be aware of to solve Ax = b is basic Gauss Elimination , with an augmented matrix and back substitution .
In fact , this is often the very first thing people learn how to do in a linear algebra course .
It is n't much better than finding the inverse , but it saves a lot of computation in the long run.Of course there are many other techniques .
Happily however , most packages can now automatically make the best choice on which technique to use , depending on the properties of A. In Matlab and Octave , it all boils down to using the left division operator like sox = A \ binstead of the inverse calculatingx = inv ( A ) * bUsing the first command , Matlab and Octave will choose a technique that best suits the matrix A. This page [ mathworks.com ] has a list of all the techniques that Matlab can use to solve the linear system .
To my knowledge , Octave has a number of techniques as well , but I 'm not sure if it 's as comprehensive as Matlab .
Also , Octave 's left division operator has been known to have bugs.And to return to the main topic , Octave and Matlab both use LAPACK [ netlib.org ] extensively , which is written completely in Fortran ( and based on BLAS ) .
There 's really no other language for linear algebra .</tokentext>
<sentencetext>BTW, a very ill-advised design choice of Python: http://www.python.org/dev/peps/pep-0211/ [python.org] Ask any numerical analyst to know why it is a terrible idea to solve a linear system with inv(A)*b. But make sure you have at least half an hour free.To make a long story short; solving Ax=b by calculating x=inv(A)*b is a terrible idea because calculating inv(A) is an inherently difficult thing.
While it would be extremely useful to have inv(A), it's not strictly neccessary to obtain in in order to solve Ax=b.At the most basic level, the technique which most would be aware of to solve Ax=b is basic Gauss Elimination, with an augmented matrix and back substitution.
In fact, this is often the very first thing people learn how to do in a linear algebra course.
It isn't much better than finding the inverse, but it saves a lot of computation in the long run.Of course there are many other techniques.
Happily however, most packages can now automatically make the best choice on which technique to use, depending on the properties of A. In Matlab and Octave, it all boils down to using the left division operator like sox=A\binstead of the inverse calculatingx=inv(A)*bUsing the first command, Matlab and Octave will choose a technique that best suits the matrix A. This page [mathworks.com] has a list of all the techniques that Matlab can use to solve the linear system.
To my knowledge, Octave has a number of techniques as well, but I'm not sure if it's as comprehensive as Matlab.
Also, Octave's left division operator has been known to have bugs.And to return to the main topic, Octave and Matlab both use LAPACK [netlib.org] extensively, which is written completely in Fortran(and based on BLAS).
There's really no other language for linear algebra.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401</parent>
</comment>
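The advice above has an analogue outside Matlab/Octave as well. Below is a minimal Python/NumPy sketch (NumPy used here as an illustrative stand-in for the `x=A\b` idiom): `np.linalg.solve` factorizes A (LU with partial pivoting, via LAPACK) instead of forming `inv(A)` explicitly.

```python
import numpy as np

# Small well-conditioned system A x = b with known solution x = [2, 3].
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

# Preferred: LU factorization with partial pivoting -- what Matlab/Octave's
# x = A\b does under the hood (both call into LAPACK).
x_solve = np.linalg.solve(A, b)

# Discouraged: forming the explicit inverse first, as in x = inv(A)*b.
# It costs roughly 2-3x the flops and has worse rounding behavior.
x_inv = np.linalg.inv(A) @ b

print(x_solve, x_inv)
```

On this tiny system both answers agree, but on large or ill-conditioned matrices the factorization route is both cheaper and more accurate.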
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293455</id>
	<title>Re:PYTHON????</title>
	<author>Hatta</author>
	<datestamp>1244733780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><i>They get some procedural programming skills with maybe a little tiny bit of object-oriented stuff (without really covering OO fundamentals IMHO, which are a more advanced topic)</i></p><p>It seems kind of backwards when the fundamentals of a subject are considered an advanced topic.</p></htmltext>
<tokenext>They get some procedural programming skills with maybe a little tiny bit of object-oriented stuff ( without really covering OO fundamentals IMHO , which are a more advanced topic ) It seems kind of backwards when the fundamentals of a subject is considered an advanced topic .</tokentext>
<sentencetext>They get some procedural programming skills with maybe a little tiny bit of object-oriented stuff (without really covering OO fundamentals IMHO, which are a more advanced topic)It seems kind of backwards when the fundamentals of a subject is considered an advanced topic.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292879</id>
	<title>Why?</title>
	<author>AtomicDevice</author>
	<datestamp>1244731860000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>It seems like fortran would be an awful choice: sure, it's fast and lots of big-deal libraries use it, but who cares?
<br> <br>
Firstly, it's not as if you can't call fortran libraries from another language (matlab, anyone?); secondly, if you learn to program and become a professional, you can always pick up another language.  I don't think there are many people out there who honestly find it easier to program in fortran than ${ANY\_OTHER\_LANGUAGE}
<br> <br>
I've always felt C and C++ to be ideal intro languages: they are fast on their own, and they very closely represent what is happening on the CPU (pointers; easy enough to optimize by hand), so as a student you get some good insight into how the computer actually works.  They are also much more similar to modern languages (matlab, python, php, etc.) than fortran is.
<br> <br>
Furthermore, the speed of the language isn't the only thing that matters.  If I'm writing stuff for in-house processing, and it takes me a month to write it in fortran but a day in python, I don't really care if the python is two (or 3, 5, 10) times slower, because my calculation will still be done sooner.  Consider also the time it takes to debug matlab/python vs. a compiled language like C or fortran: matlab and python (or java, etc.) tell me where and what the problem was, as opposed to "SEG FAULT" 5 hours into processing.
<br> <br>
And what happens when I decide I need to change my code? Attempt to decipher fortran or C, rebuild it, test it, sic it on my huge dataset and hope all goes well? Or just edit a text file and have things work?
<br> <br>
Not to say C and other truly compiled languages don't have their place: I still use them for my biggest of big data, and I know that in a lot of fields (take particle physics, for example) the data is just too big to process in anything slower than C.
<br> <br>
But for teaching first-year students, what's really important is a language that they can understand, that will be useful to them, and that will give them a taste of what they can do with a text editor and a compiler/interpreter.  Teaching fortran like that seems like premature optimization, and we all know where that leads.</htmltext>
<tokenext>It seems like fortran would be an awful choice , sure it 's fast and lots of big-deal libraries use it , but who cares ?
Firstly , it 's not as if you ca n't call fortran libraries from another language ( matlab anyone ?
) , secondly , if you learn to program , and become a professional , then you can learn another language .
I do n't think there 's too many people out there who honestly find it easier to program in fortran than $ { ANY \ _OTHER \ _LANGUAGE } I 've always felt C and C + + to be ideal intro languages , they are fast on their own , they very closely represent what is happening on the CPU ( pointers , they 're easy enough to optimize by hand ) so as a student you get some good insight into how the computer actually works , and they are much more similar to modern languages ( matlab python php etc etc ) than fortran .
Furthermore , the speed of the language is n't the only thing that matters , if I 'm writing stuff for in-house processing , and it takes me a month to write it in fortran , and a day in python , I do n't really care if the python is twice ( 3 5 10 ) times slower , because my calculation will still be done faster .
Furthermore consider the time it takes to debug matlab/python vs a compiled language like C or fortran , matlab and python ( or java , etc ) tell me where and what the problem was , as opposed to " SEG FAULT " 5 hours into processing .
And what happens when I decide I need to change my code ?
Attempt to decipher fortran or C , rebuild it , test it , sic it on my huge dataset and hope all goes well ?
Or just edit a text file and have things work .
Not to say C and other truly compiled languages do n't have their place , I still use them for my biggest of big data , and I know in a lot of fields ( let 's take particle physics for example ) the data is just too big to do in anything slower than C . But for teaching first year students , a language that they can understand , will be useful to them , and will give them a taste of what they can do with a text editor and compiler/interpreter is what 's really important .
Teaching fortran like that seems like early optimization , and we all know where that leads .</tokentext>
<sentencetext>It seems like fortran would be an awful choice, sure it's fast and lots of big-deal libraries use it, but who cares?
Firstly, it's not as if you can't call fortran libraries from another language (matlab anyone?
), secondly, if you learn to program, and become a professional, then you can learn another language.
I don't think there's too many people out there who honestly find it easier to program in fortran than ${ANY\_OTHER\_LANGUAGE}
 
I've always felt C and C++ to be ideal intro languages, they are fast on their own, they very closely represent what is happening on the CPU (pointers, they're easy enough to optimize by hand) so as a student you get some good insight into how the computer actually works, and they are much more similar to modern languages (matlab python php etc etc) than fortran.
Furthermore, the speed of the language isn't the only thing that matters, if I'm writing stuff for in-house processing, and it takes me a month to write it in fortran, and a day in python, I don't really care if the python is twice (3 5 10) times slower, because my calculation will still be done faster.
Furthermore consider the time it takes to debug matlab/python vs a compiled language like C or fortran, matlab and python (or java, etc) tell me where and what the problem was, as opposed to "SEG FAULT" 5 hours into processing.
And what happens when I decide I need to change my code?
Attempt to decipher fortran or C, rebuild it, test it, sic it on my huge dataset and hope all goes well?
Or just edit a text file and have things work.
Not to say C and other truly compiled languages don't have their place, I still use them for my biggest of big data, and I know in a lot of fields (let's take particle physics for example) the data is just too big to do in anything slower than C.
 
But for teaching first year students, a language that they can understand, will be useful to them, and will give them a taste of what they can do with a text editor and compiler/interpreter is what's really important.
Teaching fortran like that seems like early optimization, and we all know where that leads.</sentencetext>
</comment>
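On the point above about calling fortran libraries from another language: NumPy is a concrete example. Its dense linear algebra dispatches to LAPACK/BLAS (the Fortran libraries discussed elsewhere in this thread), so a Python script gets compiled-Fortran speed without writing any Fortran. A minimal sketch (NumPy assumed installed):

```python
import numpy as np

# np.linalg.solve is a thin wrapper over LAPACK's gesv routine: the actual
# LU factorization runs in compiled Fortran/BLAS, not in the interpreter.
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200))
b = rng.standard_normal(200)

x = np.linalg.solve(A, b)
residual = np.linalg.norm(A @ x - b)  # should be near machine precision
print(residual)
```

The Python code here is just glue; essentially all the floating-point work happens inside the compiled library.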
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298321</id>
	<title>Re:While there may be "newer" languages</title>
	<author>chthonicdaemon</author>
	<datestamp>1244751000000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>For numeric work, it has a really well thought-out syntax that is part of the language itself.  For instance, you can assign parts of arrays to one another with slicing, like in Matlab or Python, but with some additional good bits that automatically parallelise.  I have looked far and wide, but I have not found anything in the c++/java world with as much expressive power for matrix- and array-based problems.
<br> <br>
Additionally, most of the choices in the language were made based on whether the features can be implemented efficiently.  The fact that (for instance) fortran doesn't allow aliasing without notification makes it easier for compilers to optimize certain constructs.  Keywords like pure (which guarantees that a function is side-effect free) allow even more optimization.  You can also declare a function elemental, which allows it to be mapped over the elements of an array automatically.
<br> <br>
For me, Fortran is more of a domain-specific language.  Modern fortran code resembles Matlab in many ways, but outperforms c/c++ with very little trouble.  You can usually get similar (or even a bit better) performance from c/c++, but not without jumping through some pretty interesting hoops.</htmltext>
<tokenext>For numeric work , it has a really well thought-out syntax , that is part of the language .
For instance , you can assign parts of arrays to one another with slicing like in Matlab or Python , but with some additional good bits that automatically parallelise .
I have looked far and wide , but I have not found anything in the c + + /java world with as much expressive power for matrix and array based problems .
Additionally , most of the choices in the language were made based on whether the features can be implemented efficiently .
The fact that ( for instance ) fortran does n't allow aliasing without notification makes it easier for compilers to optimize certain constructs .
Keywords like pure ( guarantees that a function is side-effect free ) allow even more optimizion .
You can also specify a function as elemental , which allows it to be mapped over the elements of an array automatically .
For me , Fortran is more of a domain specific language .
Modern fortran code resembles Matlab in many ways , but outperforms c/c + + with very little trouble .
You can usually get similar ( or even a bit better ) performance from c/c + + but not without jumping through some pretty interesting hoops .</tokentext>
<sentencetext>For numeric work, it has a really well thought-out syntax, that is part of the language.
For instance, you can assign parts of arrays to one another with slicing like in Matlab or Python, but with some additional good bits that automatically parallelise.
I have looked far and wide, but I have not found anything in the c++/java world with as much expressive power for matrix and array based problems.
Additionally, most of the choices in the language were made based on whether the features can be implemented efficiently.
The fact that (for instance) fortran doesn't allow aliasing without notification makes it easier for compilers to optimize certain constructs.
Keywords like pure (guarantees that a function is side-effect free) allow even more optimizion.
You can also specify a function as elemental, which allows it to be mapped over the elements of an array automatically.
For me, Fortran is more of a domain specific language.
Modern fortran code resembles Matlab in many ways, but outperforms c/c++ with very little trouble.
You can usually get similar (or even a bit better) performance from c/c++ but not without jumping through some pretty interesting hoops.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025</parent>
</comment>
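The array-slicing style that comment describes (e.g. Fortran's `a(2:9) = b(1:8) + c`, a single statement the compiler is free to parallelise) has a direct NumPy analogue, shown here as an illustrative sketch:

```python
import numpy as np

# Whole-slice assignment: one vectorized statement, no explicit loop.
# This mirrors the Fortran array syntax  a(2:9) = b(1:8) + c
# (1-based and inclusive there, 0-based and half-open here).
a = np.zeros(10)
b = np.arange(10, dtype=float)
c = 1.0

a[1:9] = b[0:8] + c
print(a)
```

In both languages the statement expresses the *what* (elementwise add into a slice) rather than the *how*, which is exactly what lets the implementation vectorize or parallelise it.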
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293347</id>
	<title>Re:While there may be "newer" languages</title>
	<author>fitten</author>
	<datestamp>1244733420000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>This is what I was thinking... the fact that the OP suggests Python over Fortran shows that he/she has no clue as to why Fortran is still used in preference to numerous other languages.  When you have simulations with runtimes on the order of *weeks* when written in Fortran... how long do you think they'd take when written in Python? Months?  It's all about time to results.  And the reason Fortran beats machine code/assembler is that it's portable, higher level, and easier to debug.</p></htmltext>
<tokenext>This is what I was thinking... the fact that the OP suggests Python over Fortran shows that he/she has no clue as to why Fortran is still used in preference over numerous other languages .
When you have simulations that have runtimes on the order of * weeks * when written in Fortran... how long do you think they 'd take when written in Python ?
Months ? It 's all about time to results .
And why Fortran is better than machine/assembler is that it 's portable and higher level and easier to debug .</tokentext>
<sentencetext>This is what I was thinking... the fact that the OP suggests Python over Fortran shows that he/she has no clue as to why Fortran is still used in preference over numerous other languages.
When you have simulations that have runtimes on the order of *weeks* when written in Fortran... how long do you think they'd take when written in Python?
Months?  It's all about time to results.
And why Fortran is better than machine/assembler is that it's portable and higher level and easier to debug.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293071</id>
	<title>MathLab</title>
	<author>Anonymous</author>
	<datestamp>1244732580000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Forget Fortran, forget Python.</p><p>At my university I teach Matlab (or Octave, the GNU version) to students of Physics, Chemistry, and Engineering. If you are not from computer science you probably hate programming, so it is better to use a more user-friendly environment.</p><p>Matlab has a large set of libraries. It is easy to plot charts, do statistics, and do matrix operations, and the language is very simple (C-like, without types).<br>A simulation can be up and running in no time. Other languages require too much knowledge of CS to be useful to these people.</p><p>For more experienced users, it has C, Java and Python bindings.</p></htmltext>
<tokenext>Forget Fortran , forget PythonIn my university I teach MathLab ( or Octave the GNU version ) to students of Physics , Chemistry , Engineering .
If you are not from computer science you probably hate programming , so it is better to use a more user friendly environment .
With MathLab/OctaveMathLab has a large set of libraries .
It is easy to plot charts , do statistics,do matrix operations and the language is very simple ( C like without types ) .A simulation can be up and running in no time .
Other languages require too much knowledge in CS to be useful to these people.For the more experienced users , it has C , java and python bidings .</tokentext>
<sentencetext>Forget Fortran, forget PythonIn my university I teach MathLab ( or Octave the GNU version)  to students of Physics, Chemistry, Engineering.
If you are not from computer science you probably hate programming, so it is better to use a more user friendly environment.
With MathLab/OctaveMathLab has a large set of libraries.
It is easy to plot charts, do statistics,do matrix operations and the language is very simple (C like without types).A simulation can be up and running in no time.
Other languages require too much knowledge in CS to be useful to these people.For the more experienced users, it has C, java and python bidings.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296307</id>
	<title>Re:While there may be "newer" languages</title>
	<author>StillNeedMoreCoffee</author>
	<datestamp>1244744160000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>For number crunching, Fortran is superior. C++, for example, can be as quick, but it takes careful programming and deep knowledge of the language, and of the implications of using structures, to avoid poisoning the compiler's ability to optimize some calculation. If you are going for heads-down number crunching then Fortran is the language. The language has enough restrictions that the compiler can optimize in most cases. In fact, doing your own hand optimizations can be detrimental.</p><p>As to advanced algorithms: advanced algorithms that revolve around data structures, yes, but advanced algorithms revolving around numerical analysis, probably no. Most of us don't work in that numerical, matrix-algebra world. It is specialized and advanced. We tend not to appreciate that body of algorithms as much.</p><p>I would vote for at least 3 fundamental languages, with at least 2 not OOP, as a good base for being a programmer: at least 3 languages learned well enough to do problem solving, plus an introduction to several others.</p><p>The important concepts are the value of specialization and the value of appropriateness.  The right tool for the job. There are different problem domains with different types of problems. If you use the wrong language to program one, you will probably end up emulating the language-provided features of the appropriate language to get the job done anyway. Familiarity with the different problem domains gives you more conceptual tools to frame the solution of a problem and maybe, just maybe, the guts to use a different tool or language from time to time when appropriate.</p></htmltext>
<tokenext>For number crunching Fortran is superior .
C + + for example can be as quick but takes careful programming and deep knowledge of the language and the implications of the use of structures to not poison the compliers ability to optimize some calculation .
If you are going for heads down number crunching then Fortran is the language .
The language has enough restrictions that the compiler can optimize in most cases .
In fact doing your own hand optimizaitions can be detrimental.As to advanced algorithms , advanced algorithms that revolve around data structures yes , but advanced algorithms revolving around numerical analysis probably no .
Most of us do n't work in that numerical , matrix algebra world .
It is specialized and advanced .
We tend to not appreciate that body of algorithms as much.I would vote for at least 3 fundemental languages with at least 2 not OOP for having a good base for being a Programmer .
At least 3 language well enough to do problem solving and an introduction to several others.The important concepts are the value of specialization and the value of appropriateness .
The right tool for the job .
There are different problem domains with different types of problems .
If you use the wrong language to program one , you probably will end up emulating language provided features of the appropriate language to get the job done anyway .
The familiarity with the different problem domains gives you more conceptual tools to frame the solution of a problem and maybe , just maybe the guts to use a different tool or language from time to time when appropriate .</tokentext>
<sentencetext>For number crunching Fortran is superior.
C++ for example can be as quick but takes careful programming and deep knowledge of the language and the implications of the use of structures to not poison the compliers ability to optimize some calculation.
If you are going for heads down number crunching then Fortran is the language.
The language has enough restrictions that the compiler can optimize in most cases.
In fact doing your own hand optimizaitions can be detrimental.As to advanced algorithms, advanced algorithms that revolve around data structures yes, but advanced algorithms revolving around numerical analysis probably no.
Most of us don't work in that numerical, matrix algebra world.
It is specialized and advanced.
We tend to not appreciate that body of algorithms as much.I would vote for at least 3 fundemental languages with at least 2 not OOP for having a good base for being a Programmer.
At least 3 language well enough to do problem solving and an introduction to several others.The important concepts are the value of specialization and the value of appropriateness.
The right tool for the job.
There are different problem domains with different types of problems.
If you use the wrong language to program one, you probably will end up emulating language provided features of the appropriate language to get the job done anyway.
The familiarity with the different problem domains gives you more conceptual tools to frame the solution of a problem and maybe, just maybe the guts to use a different tool or language from time to time when appropriate.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292147</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291925</id>
	<title>I still use Fortran for scientific calculations</title>
	<author>sigxcpu</author>
	<datestamp>1244728500000</datestamp>
	<modclass>Informative</modclass>
	<modscore>5</modscore>
	<htmltext><p>If all you need is to crunch numbers, Fortran is a good choice even today.<br>It might not be the best language to introduce someone to computer science, but it is very powerful for anything that has to do with matrix operations.</p><p>A few years ago in a physics graduate course we had a simulation project which left the choice of language to the student.<br>We compared performance between implementations in C, C++ and Fortran.<br>Fortran was consistently faster by a big margin.<br>It's also very easy to learn.</p><p>That said, I do most of my coding in C.</p></htmltext>
<tokenext>If all you need is to crunch numbers , Fortran is a good choice even today.It might not be the best language to introduce someone to computer science , but it is very powerful for anything that has to do with matrix operations.A few years ago in a physics graduate course we had a simulation project which left the choice of language to the student.We compared performance between implementations in C C + + and Fortran.Fortran was consistently faster by a big margin.It 's also very easy to learn.That said , I do most of my coding in C .</tokentext>
<sentencetext>If all you need is to crunch numbers, Fortran is a good choice even today.It might not be the best language to introduce someone to computer science, but it is very powerful for anything that has to do with matrix operations.A few years ago in a physics graduate course we had a simulation project which left the choice of language to the student.We compared performance between implementations in C C++ and Fortran.Fortran was consistently faster by a big margin.It's also very easy to learn.That said, I do most of my coding in C.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28301825</id>
	<title>Re:While there may be "newer" languages</title>
	<author>Kidbro</author>
	<datestamp>1244720940000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>BTW, a very ill-advised design choice of Python: <a href="http://www.python.org/dev/peps/pep-0211/" title="python.org">http://www.python.org/dev/peps/pep-0211/</a> [python.org]</p></div> </blockquote><p>Only... it never happened. So your point (if there was any) is moot. I'm not arguing against the rest of your rant - but bashing a language for a poor design choice that never happened seems a bit silly.</p>
	</htmltext>
<tokenext>BTW , a very ill-advised design choice of Python : http : //www.python.org/dev/peps/pep-0211/ [ python.org ] Only... it never happened .
So your point ( if there was any ) is moot .
Not arguing against the rest of your rant - but bashing a language for a poor design choice that was never happened seems a bit silly .</tokentext>
<sentencetext>BTW, a very ill-advised design choice of Python: http://www.python.org/dev/peps/pep-0211/ [python.org] Only... it never happened.
So your point (if there was any) is moot.
Not arguing against the rest of your rant - but bashing a language for a poor design choice that was never happened seems a bit silly.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294979</id>
	<title>what is it they say about crap code?</title>
	<author>grep\_rocks</author>
	<datestamp>1244739600000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>you can write FORTRAN in any computer language...</htmltext>
<tokenext>you can write FORTRAN in any computer language.. .</tokentext>
<sentencetext>you can write FORTRAN in any computer language...</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293225</id>
	<title>Re:libraries. gigabytes of libraries</title>
	<author>xZgf6xHx2uhoAj9D</author>
	<datestamp>1244733060000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>The question was very specifically aimed at physics, chemistry and engineering students. I can't speak for engineering students, but I know that in physics and chemistry virtually <i>all</i> useful software is written in Fortran. All the code works perfectly, so why rewrite it? Not only does it work, but it's <i>fast</i> (Fortran compilers almost always outperform the best optimizing C compilers on this kind of code). The difference between a 3-day simulation run of Fortran code and a 4-day run of C code (plus whatever development time the rewrite would take) can come in handy sometimes.</htmltext>
<tokenext>The question was very specifically aimed at physics , chemistry and engineering students .
I ca n't speak for engineering students , but I know that in physics and chemistry virtually all useful software is written in Fortran .
All the code works perfectly , so why rewrite it ?
Not only does it work , but it 's fast ( Fortran code almost always outperforms the best optimizing C compilers ) .
The difference between running a 3-day simulation of Fortran code vs. a 4-day simulation of C code ( plus whatever development time there would be ) can come in handy sometimes .</tokentext>
<sentencetext>The question was very specifically aimed at physics, chemistry and engineering students.
I can't speak for engineering students, but I know that in physics and chemistry virtually all useful software is written in Fortran.
All the code works perfectly, so why rewrite it?
Not only does it work, but it's fast (Fortran code almost always outperforms the best optimizing C compilers).
The difference between running a 3-day simulation of Fortran code vs. a 4-day simulation of C code (plus whatever development time there would be) can come in handy sometimes.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292329</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295749</id>
	<title>what else should we use</title>
	<author>Anonymous</author>
	<datestamp>1244742120000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I was taught fortran 77 this past year as an undergraduate to do atomistic modeling. People tried to use C, C++, matlab, etc., but they were too slow.</p></htmltext>
<tokenext>I was taught fortran 77 this past year as an undergraduate to do atomistic modeling .
People tried to use C , C + + , matlab , etc but they were too slow</tokentext>
<sentencetext>I was taught fortran 77 this past year as an undergraduate to do atomistic modeling.
People tried to use C, C++, matlab, etc but they were too slow</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294177</id>
	<title>Inferior Text Book Clearing House</title>
	<author>Anonymous</author>
	<datestamp>1244736600000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I started Engineering School in 1988 and we were required to take Fortran.  The required text was a lousy, out of date and cheaply published (mine fell apart before the end of the semester) book that was written by the chair of the Computer Science department in the late seventies.  The professor barely used the book, relying mostly on handouts.</p><p>I think the book was used to keep the prof in royalties.</p></htmltext>
<tokenext>I started Engineering School in 1988 and we were required to take Fortran .
The required text was a lousy , out of date and cheaply published ( mine fell apart before the end of the semester ) book that was written by the chair of the Computer Science department in the late seventies .
The professor barely used the book , relying mostly on handouts.I think the book was used to keep the prof in royalties .</tokentext>
<sentencetext>I started Engineering School in 1988 and we were required to take Fortran.
The required text was a lousy, out of date and cheaply published (mine fell apart before the end of the semester) book that was written by the chair of the Computer Science department in the late seventies.
The professor barely used the book, relying mostly on handouts.I think the book was used to keep the prof in royalties.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296343</id>
	<title>The language is not the issue</title>
	<author>wolfguru</author>
	<datestamp>1244744280000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Several posters here have already made the point I came to make - it is not the language per se, but the concepts of organizing and structuring data, laying out logical flow and execution, and maintaining a standard of documentation and coding that others can interpret and verify, that are the essence of the teaching.  There are any number of "quick and dirty" languages, and the argument that there is excess computing capacity available to make up for their inefficiency is merely a cover to excuse the lack of clarity in those solutions. Analysis and interpretation of hard data requires the same rigor as gathering and validating that data, and none of the easy shortcuts of the "for dummies" languages will meet the test of that exacting and essential process.

Teach Python as a quick tool for making approximations; but first teach, in whatever language meets the standards that science must hold itself to, the process of managing ideas and logic in code.

Computers are fast idiots; they do what we tell them to do at fantastic speed, not what we want them to do. Failing to understand that will produce results that never stand up to the level of accuracy needed.</htmltext>
<tokenext>Several posters here have made the point I answered to espouse - It is not the language per se , but the concepts of organizing and structuring data , preparing logical flow and execution , and maintaining a standard of documentation and coding which others can interpret and verify that is the essence of the teaching .
There are any number of " quick and dirty " languages , and the argument that there is excess capacity available to make up for their inefficiency is merely a cover to excuse the lack of clarity in these solutions .
Analysis and interpretation of hard data requires the same rigor as gathering and validating that data , and none of the easy ways of manipulating the " for dummies " languages will meet the test of that exacting and essential process .
Teach python as a quick tool for making approximations ; but first teach , through whatever language that meets the standards that science must hold itself to , the process of managing the idea and logic in code .
Computers are fast idiots ; they do what we tell them to do at fantastic speed ; not what we want them to do .
Failing to understand that will produce results that never stand up to the level of accuracy needed .</tokentext>
<sentencetext>Several posters here have made the point I answered to espouse - It is not the language per se, but the concepts of organizing and structuring data, preparing logical flow and execution, and maintaining a standard of documentation and coding which others can interpret and verify that is the essence of the teaching.
There are any number of "quick and dirty" languages, and the argument that there is excess capacity available to make up for their inefficiency is merely a cover to excuse the lack of clarity in these solutions.
Analysis and interpretation of hard data requires the same rigor as gathering and validating that data, and none of the easy ways of manipulating the "for dummies" languages will meet the test of that exacting and essential process.
Teach python as a quick tool for making approximations; but first teach, through whatever language that meets the standards that science must hold itself to, the process of managing the idea and logic in code.
Computers are fast idiots; they do what we tell them to do at fantastic speed; not what we want them to do.
Failing to understand that will produce results that never stand up to the level of accuracy needed.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296441</id>
	<title>Re:It's okay to teach them FORTRAN</title>
	<author>Cor-cor</author>
	<datestamp>1244744580000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext><p>I realize you're probably joking, but our intro class actually teaches VBA, and even though it's a general overview for engineers of all shapes and sizes, they really do a poor job of teaching the fundamentals of programming, choosing instead to focus mainly on the syntax and the language itself.  As far as I know, no subsequent class ever uses VBA, so we struggle with any future programming almost as much as we would have without an intro class.  They may be looking to fix that, but for now it functions primarily as a weedout class and nothing else.</p><p>I didn't go into computer/software engineering, but I did TA the intro class for a semester and have worked with its graduates (my engineering peers) on group projects and the like.  In my opinion, they really ought to teach the logic of programming (flowcharts and the like) much more heavily than focusing on any one language.  That way, you don't have people sitting and memorizing the way a certain program is written but lacking the common sense to so much as use a loop rather than writing the same calculation over and over again.</p><p>So teach FORTRAN, teach VB6, teach them LOLCODE or whatever the hell you want but please make sure you're teaching them <i>why</i> the code is written the way it is and that computers don't necessarily think the way you do.</p></htmltext>
<tokenext>I realize you 're probably joking , but our intro class actually teaches VBA , and even though it 's a general overview for engineers of all shapes and sizes , they really do a poor job of teaching the fundamentals of programming , choosing instead to focus mainly on the syntax and the language itself .
As far as I know , no subsequent class ever uses VBA , so we struggle with any future programming almost as much as we would have without an intro class .
They may be looking to fix that , but for now it functions primarily as a weedout class and nothing else.I did n't go into computer/software engineering , but I did TA the intro class for a semester and have worked with its graduates ( my engineering peers ) on group projects and the like .
In my opinion , they really ought to teach the logic of programming ( flowcharts and the like ) much more heavily than focusing on any one language .
That way , you do n't have people sitting and memorizing the way a certain program is written but lacking the common sense to so much as use a loop rather than writing the same calculation over and over again.So teach FORTRAN , teach VB6 , teach them LOLCODE or whatever the hell you want but please make sure you 're teaching them why the code is written the way it is and that computers do n't necessarily think the way you do .</tokentext>
<sentencetext>I realize you're probably joking, but our intro class actually teaches VBA, and even though it's a general overview for engineers of all shapes and sizes, they really do a poor job of teaching the fundamentals of programming, choosing instead to focus mainly on the syntax and the language itself.
As far as I know, no subsequent class ever uses VBA, so we struggle with any future programming almost as much as we would have without an intro class.
They may be looking to fix that, but for now it functions primarily as a weedout class and nothing else.I didn't go into computer/software engineering, but I did TA the intro class for a semester and have worked with its graduates (my engineering peers) on group projects and the like.
In my opinion, they really ought to teach the logic of programming (flowcharts and the like) much more heavily than focusing on any one language.
That way, you don't have people sitting and memorizing the way a certain program is written but lacking the common sense to so much as use a loop rather than writing the same calculation over and over again.So teach FORTRAN, teach VB6, teach them LOLCODE or whatever the hell you want but please make sure you're teaching them why the code is written the way it is and that computers don't necessarily think the way you do.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294815</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295219</id>
	<title>Don't overlook the performance</title>
	<author>OrangeTide</author>
	<datestamp>1244740320000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>FORTRAN is one of the fastest compiled languages around, at least for numerical problems. Using one of the slower interpreted languages is no substitute. The commercial FORTRAN compilers out there produce very efficient code suitable for commercial, industrial, medical, research and military use. When, and if, Python gets anywhere near that point we can consider it.</p><p>Having easy and pretty syntax is not the only metric we should go by. If after a decade or two you can't squeeze out some performance where it counts in a general-purpose language, then I believe your language's design is flawed or there is a lack of interest in your implementation. Python really is supposed to be general purpose (as is Ruby), and neither seems to compare favorably with Java, C, FORTRAN or a number of other languages.</p><p>You can argue that performance isn't the only metric either, and I will agree. What's the point of a language if it's too hard to use for anything? But swinging the opposite way is not a solution either; ideally we want both speed and ease. And I am not aware of any specific law or property of compilers that prevents us from having both.</p><p>C - advantages: fast, moderately easy to use by ex-assembly programmers, easy to implement a compiler (a single-target compiler, that is; multi-architecture is hard to get right)</p><p>C++ - similar advantages to C, but the compiler is harder to implement. The rest of the bits are neither a clear advantage nor a disadvantage, just a matter of preference if you like the modular and OO style it has built in.</p><p>Java - advantages: still fairly fast, abstraction makes it easy to improve the performance of compiled code, syntax is easier than C (but harder than Python). The rest of the bits I don't see as an advantage or disadvantage (like the built-in OO support or large library).</p><p>I won't go over my thoughts in too much detail on Python, Ruby, Io, JavaScript, PHP, Lua, etc. because it could easily be interpreted as a flame. At least some of those languages are special-purpose languages, and I think therefore the criteria we use to judge whether they are good should also be special. If Python were a special-purpose language, instead of a general-purpose one, it would be a lot easier to dismiss all of its performance flaws. There are benefits to using Python; I'm just not willing to explore them too deeply until it can meet some basic needs we have for a general-purpose language.</p></htmltext>
<tokenext>FORTRAN is one of the fastest compiled languages around , at least for numerical problems .
Using one of the slower interpreted languages around is no substitute .
The commercial FORTRAN compilers out there produce very efficient code suitable for commercial , industrial , medical , research and military use .
When , and if , Python gets anywhere near that point we can consider it.Having easy and pretty syntax is not the only metric we should go by .
If after a decade or two you ca n't squeeze out some performance where it counts in a general purpose language , then I believe your language 's design is flawed or there is a lack of interest in your implementation .
Python really is supposed to be general purpose ( as is Ruby ) , and neither seem to compare favorable against Java , C , FORTRAN or a number of other languages.You can argue about how performance is n't the only metric either , and I will agree .
What 's the point of a language if it 's too hard to use for anything .
But swinging the opposite way is not a solution either , ideally we want both speed and ease .
And I am not aware of any specific law or property of compilers that prevents us from having both.C - advantages : fast , moderately easy to use by ex-assembly programmers , easy to implement a compiler ( a single target compiler that is , multi-architecture is hard to get right ) C + + - similar advantages to C , but the compiler is harder to implement .
The rest of the bits are neither a clear advantage or disadvantage , just a matter of preference if you like the modular and OO style it has built-in.Java - advantages : still fairly fast , abstraction makes it easy to improve the performance of compiled code , syntax is easier than C ( but harder than Python ) .
The rest of the bits I do n't see as an advantage or disadvantage ( like the built-in OO support or large library ) .
.. I wo n't go over my thoughts in too much detail on Python , Ruby , Io , JavaScript , PHP , Lua , etc because it could be easily interpreted as a flame .
Although at least some of those languages are special purpose languages , and I think therefor the criteria we use to judge if they are good or not should also be special .
If Python was a special purpose language , instead of a general purpose one , it would be a lot easier to dismiss all of its performance flaws .
There are benefits to using Python , I 'm just not willing to explore them too deeply them until it can meet some basic needs we have for a general purpose language .</tokentext>
<sentencetext>FORTRAN is one of the fastest compiled languages around, at least for numerical problems.
Using one of the slower interpreted languages around is no substitute.
The commercial FORTRAN compilers out there produce very efficient code suitable for commercial, industrial, medical, research and military use.
When, and if, Python gets anywhere near that point we can consider it.Having easy and pretty syntax is not the only metric we should go by.
If after a decade or two you can't squeeze out some performance where it counts in a general purpose language, then I believe your language's design is flawed or there is a lack of interest in your implementation.
Python really is supposed to be general purpose (as is Ruby), and neither seem to compare favorable against Java, C, FORTRAN or a number of other languages.You can argue about how performance isn't the only metric either, and I will agree.
What's the point of a language if it's too hard to use for anything.
But swinging the opposite way is not a solution either, ideally we want both speed and ease.
And I am not aware of any specific law or property of compilers that prevents us from having both.C - advantages: fast, moderately easy to use by ex-assembly programmers, easy to implement a compiler(a single target compiler that is, multi-architecture is hard to get right)C++ - similar advantages to C, but the compiler is harder to implement.
The rest of the bits are neither a clear advantage or disadvantage, just a matter of preference if you like the modular and OO style it has built-in.Java - advantages: still fairly fast, abstraction makes it easy to improve the performance of compiled code, syntax is easier than C (but harder than Python).
The rest of the bits I don't see as an advantage or disadvantage (like the built-in OO support or large library).
.. I won't go over my thoughts in too much detail on Python, Ruby, Io, JavaScript, PHP, Lua, etc because it could be easily interpreted as a flame.
Although at least some of those languages are special purpose languages, and I think therefor the criteria we use to judge if they are good or not should also be special.
If Python was a special purpose language, instead of a general purpose one, it would be a lot easier to dismiss all of its performance flaws.
There are benefits to using Python, I'm just not willing to explore them too deeply them until it can meet some basic needs we have for a general purpose language.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303789</id>
	<title>Re:PYTHON????</title>
	<author>ceoyoyo</author>
	<datestamp>1244735640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>"Great language. But its also a factor of 30 times slower than a compiled language like C."</p><p>Not if you use it properly.  Python is an interpreted language.  There's an art to using it properly.  Here's a hint - if you're writing big for loops, you're doing it wrong.</p><p>Incidentally, the solution to that problem often involves calling a function written in Fortran.</p></htmltext>
<tokenext>" Great language .
But its also a factor of 30 times slower than a compiled language like C. " Not if you use it properly .
Python is an interpreted language .
There 's an art to using it properly .
Here 's a hint - if you 're writing big for loops , you 're doing it wrong.Incidentally , the solution to that problem often involves calling a function written in Fortran .</tokentext>
<sentencetext>"Great language.
But its also a factor of 30 times slower than a compiled language like C."Not if you use it properly.
Python is an interpreted language.
There's an art to using it properly.
Here's a hint - if you're writing big for loops, you're doing it wrong.Incidentally, the solution to that problem often involves calling a function written in Fortran.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127</parent>
</comment>
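The hint above about avoiding big for loops can be made concrete. A minimal sketch (assuming NumPy is installed; the array size and variable names are illustrative, not from the original post):

```python
# Sketch of the "big for loops" hint: the same sum of squares computed
# with an interpreted Python loop and with NumPy's vectorized dot
# product, which runs the loop inside compiled library code.
# Assumes NumPy is available; all names here are illustrative.
import numpy as np

x = np.linspace(0.0, 1.0, 100_000)

# Interpreted: every iteration goes through the Python bytecode loop.
total_loop = 0.0
for v in x:
    total_loop += v * v

# Vectorized: one call; the iteration happens in compiled code.
total_vec = float(np.dot(x, x))

# Same answer, up to floating-point rounding.
assert abs(total_loop - total_vec) <= 1e-8 * total_vec
```

The two results agree to rounding error, but the vectorized call is typically orders of magnitude faster - and, as the comment notes, the compiled routines it dispatches to are often themselves written in Fortran.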
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292655</id>
	<title>Starting with Python, Refining with Fortran</title>
	<author>Anonymous</author>
	<datestamp>1244731200000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I have seen several arguments above regarding the speed of Fortran.  Sure, it's faster.  But speed is not the point.  The majority of these undergraduates aren't going to be in a position to benefit from the speed.  They are doing homework and projects, not optimizing a serious computational problem.</p><p>Simply put, Python is much better at handling all of the non-numerical work and good enough at performing the numerical work.  Where it is deemed too slow, a method could be reimplemented in Fortran and almost trivially integrated into the existing Python program.  Heck, f2py is even integrated into NumPy at this point:<br>http://www.scipy.org/F2py</p><p>So all the students should learn Python, and your serious thinkers will learn Fortran as well and only occasionally be called on to use it.</p><p>Another argument one should consider is that Python enjoys broad adoption across many scientific disciplines.  There are active projects for practically every discipline.  The Fortran material seems to be less accessible or isn't shared as broadly.  Having a helpful community is useful.</p><p>A benefit of Python's adoption is that it has interfaces to many other programming languages, for example GNU R and MATLAB.  If Fortran has any integration with these, I would bet that it's not as extensive or as actively maintained as the Python interfaces.</p></htmltext>
<tokenext>I have seen several arguments above regarding the speed of Fortran .
Sure , its faster .
But speed is not the point .
The majority of these undergraduates are n't going to be a position to benefit from the speed .
They are doing homework and projects , not optimizing a serious computational problem.Simply put , python is much better at handling all of the non-numerical work and good enough at performing the numerical work .
Where it is deemed too slow a method could be reimplemented in fortran and almost trivially integrated into the existing python program .
Heck , f2py is even integrated into Numpy at this point : http : //www.scipy.org/F2pySo all the students should learn python and your serious thinkers will learn fortran as well and only occasionally be called on to use it.Another argument one should consider is that python enjoys broad adoption across many scientific disciplines .
There are active projects for practically every discipline .
The Fortran material seems to be less accessible or isnt shared as broadly .
Having a helpful community is useful.A benefit of python 's adoption is that it has interfaces to many other programming languages .
For example Gnu R and MATLAB .
If fortran has any integration with these , I would bet that its not as extensive and actively maintained as the python interfaces .</tokentext>
<sentencetext>I have seen several arguments above regarding the speed of Fortran.
Sure, its faster.
But speed is not the point.
The majority of these undergraduates aren't going to be a position to benefit from the speed.
They are doing homework and projects, not optimizing a serious computational problem.Simply put, python is much better at handling all of the non-numerical work and good enough at performing the numerical work.
Where it is deemed too slow a method could be reimplemented in fortran and almost trivially integrated into the existing python program.
Heck, f2py is even integrated into Numpy at this point:http://www.scipy.org/F2pySo all the students should learn python and your serious thinkers will learn fortran as well and only occasionally be called on to use it.Another argument one should consider is that python enjoys broad adoption across many scientific disciplines.
There are active projects for practically every discipline.
The Fortran material seems to be less accessible or isnt shared as broadly.
Having a helpful community is useful.A benefit of python's adoption is that it has interfaces to many other programming languages.
For example Gnu R and MATLAB.
If fortran has any integration with these, I would bet that its not as extensive and actively maintained as the python interfaces.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292847</id>
	<title>FORTRAN is fine</title>
	<author>Corson</author>
	<datestamp>1244731740000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Python is a scripting language. FORTRAN is a language for high performance computing. Engineering, Physics, Chemistry need high performance computational tools. Has anyone ever tried performing Molecular Dynamics simulations in Python? I didn't think so.</htmltext>
<tokenext>Python is a scripting language .
FORTRAN is a language for high performance computing .
Engineering , Physics , Chemistry need high performance computational tools .
Has anyone ever tried performing Molecular Dynamics simulations in Python ?
I did n't think so .</tokentext>
<sentencetext>Python is a scripting language.
FORTRAN is a language for high performance computing.
Engineering, Physics, Chemistry need high performance computational tools.
Has anyone ever tried performing Molecular Dynamics simulations in Python?
I didn't think so.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293925</id>
	<title>Sometimes we need the *right* answer!</title>
	<author>Anonymous</author>
	<datestamp>1244735700000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>About 10 years ago, an industrial firm asked me to review converting from its "obsolete" FORTRAN code to C. This company used many-dimensional matrix and linear algebra to solve for the correct solution in its physical processes. And, yes, I wrote much of that code. I re-wrote the core calculations in C, using the libraries available at that time, and compared the results. The C code was slower, but the solution results were also significantly different. When I plugged those results into the process simulator, it caused a (virtual) violent reaction. That review ended with "... and people could die." The firm decided that FORTRAN wasn't so bad after all, and they would continue investing in scientific computing folks rather than general CS majors (as it turns out, saving money on staffing was the real motivator). OTOH, they were very happy when I suggested moving to more modern languages for user interfaces.</p></htmltext>
<tokenext>About 10 years ago , an industrial firm asked me to review converting from its " obsolete " FORTRAN code to C. This company used many-dimensional matrix and linear algebra to solve for the correct solution in its physical processes .
And , yes , I wrote much of that code .
I re-wrote the core calculations in C , using the libraries available at that time , and compared the results .
The C code was slower , but the solution results were also significantly different .
When I plugged those results into the process simulator , it caused a ( virtual ) violent reaction .
That review ended with " ... and people could die .
" The firm decided that FORTRAN was n't so bad after all , and they would continue investing in scientific computing folks rather than general CS majors ( as it turns out , saving money on staffing was the real motivator ) .
OTOH , they were very happy when I suggested moving to more modern languages for user interfaces .</tokentext>
<sentencetext>About 10 years ago, an industrial firm asked me to review converting from its "obsolete" FORTRAN code to C. This company used many-dimensional matrix and linear algebra to solve for the correct solution in its physical processes.
And, yes, I wrote much of that code.
I re-wrote the core calculations in C, using the libraries available at that time, and compared the results.
The C code was slower, but the solution results were also significantly different.
When I plugged those results into the process simulator, it caused a (virtual) violent reaction.
That review ended with "... and people could die.
" The firm decided that FORTRAN wasn't so bad after all, and they would continue investing in scientific computing folks rather than general CS majors (as it turns out, saving money on staffing was the real motivator).
OTOH, they were very happy when I suggested moving to more modern languages for user interfaces.</sentencetext>
</comment>
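The divergence described above is consistent with how floating-point arithmetic behaves: it is not associative, so two mathematically equivalent code paths (say, a C library and a Fortran library accumulating in different orders) can legitimately produce different results. A stdlib-only sketch (the values are contrived for illustration, not taken from the firm's code):

```python
# Floating-point addition is not associative: grouping the same three
# terms differently changes the result, because the 1.0 is absorbed
# when added directly to a value as large as 1e16.
import math

a, b, c = 1e16, -1e16, 1.0
left = (a + b) + c    # cancellation first, then add c
right = a + (b + c)   # c is swallowed by b's magnitude first
assert left != right

# Summation order matters at scale too: naive left-to-right
# accumulation rounds at every step, while math.fsum tracks exact
# partial sums.
vals = [1e16, 1.0, -1e16, 1.0]
naive = 0.0
for v in vals:
    naive += v
assert naive != math.fsum(vals)
```

Different compilers and libraries reorder exactly these kinds of accumulations, which is one plausible mechanism for the "significantly different" results the commenter saw.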
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293535</id>
	<title>Pascal</title>
	<author>sulfur</author>
	<datestamp>1244734200000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I learned <a href="http://en.wikipedia.org/wiki/Pascal_(programming_language)" title="wikipedia.org">Pascal</a> [wikipedia.org] as my first programming language. As far as I know, Wirth specifically designed it for teaching. Sure, it doesn't provide low-level hardware access as C does, but it enforces good programming practices by not giving a programmer as much freedom. However, nowadays it is hardly used anywhere, so I would probably recommend C++ or Java as the first language to learn. My school chose Java and I think it was a good decision, mostly because for a beginner Java code is much easier to debug in a good IDE (as opposed to C++).</htmltext>
<tokenext>I learned Pascal [ wikipedia.org ] as my first programming language .
As far as I know , Wirth specifically designed it for teaching .
Sure , it does n't provide low-level hardware access as C does , but it enforces good programming practices by not giving a programmer as much freedom .
However , nowadays it is hardly used anywhere , so I would probably recommend C + + or Java as the first language to learn .
My school chose Java and I think it is was a good decision , mostly because for a beginner Java code is much easier to debug in a good IDE ( as opposed to C + + ) .</tokentext>
<sentencetext>I learned Pascal [wikipedia.org] as my first programming language.
As far as I know, Wirth specifically designed it for teaching.
Sure, it doesn't provide low-level hardware access as C does, but it enforces good programming practices by not giving a programmer as much freedom.
However, nowadays it is hardly used anywhere, so I would probably recommend C++ or Java as the first language to learn.
My school chose Java and I think it is was a good decision, mostly because for a beginner Java code is much easier to debug in a good IDE (as opposed to C++).</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291939</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295579</id>
	<title>Pushing Python was his agenda</title>
	<author>synthespian</author>
	<datestamp>1244741580000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I totally agree that Fortran is inadequate as a first language. In fact, at my university, we had to learn C (now some are getting around to Scheme) or Pascal (cleaner syntax, etc.) as a first language. Then we get around to learning pointers in the context of data structures and algorithms. This use of pointers is just something you have to go through in life, just like learning your ABCs. A little bit of Maude (yes! In the context of specifications) was taught. Applied Math students learn Fortran later, but they also learn Matlab before or simultaneously. R is learned in statistics. Maple is also used in ODE classes, etc.</p><p>In terms of the article, Python is a very weak choice. You need to learn about pointers. Then, if you need a simple high-level language for math/physics you should do what everybody does: use MATLAB (sadly, the excellent Scilab is neglected - probably due to it being European and not having the market penetration the MATLAB firm has), or use Maple. IIRC, even the Python numerics packages are not on par with their proprietary counterparts, lacking many things. I'm not sure, but it seems to me that in terms of data visualization, Java might be a good choice (C++ being an obvious one). So Python doesn't seem to fit anywhere, in my mind. It has this niche as a scripting language (and some would argue it's not well designed) and it should stay there. As a "glue" language for scientific computing it doesn't achieve much, it seems. At least, it seems there are other options that deliver more speed, more features, more power (want abstraction + a foreign function interface? I'd do Common Lisp) and, thus, more productivity. In fact, were it not so, Python wouldn't be such a minority choice. Now, if you're talking web sites... that's another issue altogether.</p><p>There's no way around Fortran for numerics. Numerics code is the domain of experts, and they work in Fortran (Fortran being faster than C for this task). Let's hope Sun Microsystems gets Fortress into a usable state - because it has a lot of good ideas in it (ideas from functional programming).</p></htmltext>
<tokenext>I totally agree that Fortran is inadequate for a first language .
In fact , at my university , we had to learn C ( now some are getting around to Scheme ) , Pascal ( cleaner syntax , etc .
) as a first language .
Then we get around to learning pointers in the context of data structures and algorithms .
This use of pointers is just something you have to go through in life , just like learning your ABC .
A little bit of Maude ( yes !
In the context of specifications ) was taught .
Applied Math students learn Fortran later , but they also learn Matlab prior or simultaneously .
R is learned in statistics .
Maple is also used in ODE , etc. , classes.In terms of the article , Python is a very weak choice .
You need to learn about pointers .
Then , if you need a simple language high-level language for math/physics you should do what everybody does : use MATLAB ( sadly , the excelent Scilab is neglected - probably due to it being European and not having the marketing penetrating capacity the MATLAB firm has ) , use Maple .
IIRC , even the Python numerics packages are not on par with its proprietary counterparts , lacking many things .
I 'm not sure , but it seems to me that in terms of data visualization , Java might be a good choice ( C + + being an obvious one ) .
So Python does n't seem to fit anywhere , in my mind .
It has this niche as a scripting language ( and some would argue it 's not well designed ) and it should stay there .
As a " glue " language for scientific computing it does n't achieve much , it seems .
At least , it seems there are other options that deliver more speed , more features , more power ( want abstraction + foreign functions interface ?
I 'd do Common Lisp ) and , thus , more productivity .
In fact , were it not so , Python would n't be such a minority choice .
Now , if your talking web sites...That 's another issue altogether.There 's no way around Fortran for numerics .
Numerics code is the domain of experts , and they work in Fortran ( Fortran being faster than C for this task ) , Let 's hope Sun Microsystems gets Fortress in a usable state - because that has a lot of good ideas in it ( ideas from functional programming ) .</tokentext>
<sentencetext>I totally agree that Fortran is inadequate for a first language.
In fact, at my university, we had to learn C (now some are getting around to Scheme), Pascal (cleaner syntax, etc.
) as a first language.
Then we get around to learning pointers in the context of data structures and algorithms.
This use of pointers is just something you have to go through in life, just like learning your ABC.
A little bit of Maude (yes!
In the context of specifications) was taught.
Applied Math students learn Fortran later, but they also learn Matlab prior or simultaneously.
R is learned in statistics.
Maple is also used in ODE, etc., classes.In terms of the article, Python is a very weak choice.
You need to learn about pointers.
Then, if you need a simple language high-level language for math/physics you should do what everybody does: use MATLAB (sadly, the excelent Scilab is neglected - probably due to it being European and not having the marketing penetrating capacity the MATLAB firm has), use Maple.
IIRC, even the Python numerics packages are not on par with its proprietary counterparts, lacking many things.
I'm not sure, but it seems to me that in terms of data visualization, Java might be a good choice (C++ being an obvious one).
So Python doesn't seem to fit anywhere, in my mind.
It has this niche as a scripting language (and some would argue it's not well designed) and it should stay there.
As a "glue" language for scientific computing it doesn't achieve much, it seems.
At least, it seems there are other options that deliver more speed, more features, more power (want abstraction + foreign functions interface?
I'd do Common Lisp) and, thus, more productivity.
In fact, were it not so, Python wouldn't be such a minority choice.
Now, if your talking web sites...That's another issue altogether.There's no way around Fortran for numerics.
Numerics code is the domain of experts, and they work in Fortran (Fortran being faster than C for this task), Let's hope Sun Microsystems gets Fortress in a usable state - because that has a lot of good ideas in it (ideas from functional programming).</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294031</id>
	<title>The answer is: no</title>
	<author>Spazmania</author>
	<datestamp>1244736060000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I'd only recommend Python to my enemies, but I don't hate anyone enough to suggest that they learn Fortran.</p><p>Pure sciences folks would be better served by something like Mathcad or Mathematica, advanced mathematics packages which include reasonably rich scripting languages. No form of "real" software development will serve you well in your core discipline.</p></htmltext>
<tokenext>I 'd only recommend Python to my enemies , but I do n't hate anyone enough to suggest that they learn Fortran.Pure sciences folks would be better served by something like Mathcad or Mathematica , advanced mathematics packages which include reasonably rich scripting languages .
No form of " real " software development will serve you well in your core discipline .</tokentext>
<sentencetext>I'd only recommend Python to my enemies, but I don't hate anyone enough to suggest that they learn Fortran.Pure sciences folks would be better served by something like Mathcad or Mathematica, advanced mathematics packages which include reasonably rich scripting languages.
No form of "real" software development will serve you well in your core discipline.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291941</id>
	<title>Give it a break</title>
	<author>pjt33</author>
	<datestamp>1244728560000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I think it's a shame Slashdot can't go a month without someone submitting a "Which programming language is best?" flame war firestarter. This has been done to death.</p></htmltext>
<tokenext>I think it 's a shame Slashdot ca n't go a month without someone submitting a " Which programming language is best ?
" flame war firestarter .
This has been done to death .</tokentext>
<sentencetext>I think it's a shame Slashdot can't go a month without someone submitting a "Which programming language is best?
" flame war firestarter.
This has been done to death.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292075</id>
	<title>It could be worse.  In fact, it was...</title>
	<author>Anonymous</author>
	<datestamp>1244729160000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>1</modscore>
	<htmltext><p>When I was an undergrad, the CS requirement forced all students to take classes in Pascal.</p><p>I think the reasoning was that a student should first learn a language with extreme, formal structure, and later they can learn ones that aren't quite as strict.  Maybe that's the same reasoning behind teaching students Fortran?  At least it's a little more useful than Pascal.</p></htmltext>
<tokenext>When I was an undergrad , the CS requirement forced all students to take classes in Pascal.I think the reasoning was that a student should learn a language with extreme , formal structure , and then later they can learn ones that are n't quite as strict .
Maybe that the same reasoning behind teaching students Fortran ?
At least it 's a little more useful than Pascal .</tokentext>
<sentencetext>When I was an undergrad, the CS requirement forced all students to take classes in Pascal.I think the reasoning was that a student should learn a language with extreme, formal structure, and then later they can learn ones that aren't quite as strict.
Maybe that the same reasoning behind teaching students Fortran?
At least it's a little more useful than Pascal.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292211</id>
	<title>Wrong question being asked</title>
	<author>GreatBunzinni</author>
	<datestamp>1244729700000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I'm a civil engineering student who had Fortran in the curriculum. Although I don't use Fortran very often, I believe the article poses the wrong question. Instead of "why learn Fortran," the question the article's author should ask himself is "why am I forcing Python as a replacement for a tool that benefits from a battle-hardened technology with an extremely serious, decades-long record in real-world production environments?" I mean, "Fortran is old" is not a technical argument. So its first inception was decades ago. Is that even relevant?<br><br>More to the point, mentioning Python as a tool for numerical analysis is laughable, not because it has any blatant weakness but because there are a whole lot of tools that fit that job even better than Python, both in terms of speed and available APIs. If we were forced to abandon Fortran, then why exactly would we adopt Python when there are other tools far better suited to the task? (Read C, C++, Maxima, Mathematica, Matlab, Octave, R, etc...)</htmltext>
<tokenext>I 'm a civil engineering student who had Fortran in the curriculum .
Although I do n't use Fortran very often , I believe the article poses the wrong question .
Instead of " why learn fortran " the question that the article 's author should ask himself was " why am I forcing Python as a replacement of a tool that benefits from a battle-hardened technology that has the benefit of having a extremely serious , decades-long test on production environments in the real world ?
" I mean , " fortran is old " is not a technical argument .
So it 's first inception was decades ago .
Is that even relevant ? More to the point , mentioning Python as a tool for numerical analysis is laughable , not because of having any blatant weakness but because there are a whole lot of tools that fit that job even better than Python , both in terms of speed and available APIs .
If we were forced to abandon Fortran then why exactly would we adopt Python when there are other tools that are far better suited for the task ?
( Read C , C + + , Maxima , Mathematica , Matlab , Octave , R , etc... )</tokentext>
<sentencetext>I'm a civil engineering student who had Fortran in the curriculum.
Although I don't use Fortran very often, I believe the article poses the wrong question.
Instead of "why learn fortran" the question that the article's author should ask himself was "why am I forcing Python as a replacement of a tool that benefits from a battle-hardened technology that has the benefit of having a extremely serious, decades-long test on production environments in the real world?
" I mean, "fortran is old" is not a technical argument.
So it's first inception was decades ago.
Is that even relevant?More to the point, mentioning Python as a tool for numerical analysis is laughable, not because of having any blatant weakness but because there are a whole lot of tools that fit that job even better than Python, both in terms of speed and available APIs.
If we were forced to abandon Fortran then why exactly would we adopt Python when there are other tools that are far better suited for the task?
(Read C, C++, Maxima, Mathematica, Matlab, Octave, R, etc...)</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295131</id>
	<title>Re:Newer doesn't always mean better.</title>
	<author>twrake</author>
	<datestamp>1244740080000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I think this comment profoundly restates this entire thread's argument. The construction industry suffers from what the software industry would call level 1 companies. Most for-profit construction companies adopted nail guns over two decades ago. Nail guns are clearly more productive, and level 1 companies understand this when they see it. But that isn't really the point: most construction delays and costs come from managing the job as a whole, getting the right pieces from the suppliers (and checking and reordering them if incorrect), and discovering that a change in the specs for that cost-saving material has forced you to redo work "completed" a month ago.</p><p>Companies have legacy code in FORTRAN and COBOL and few new programmers fluent in either; this makes the legacy costs even higher. Sure, rewriting it all in the newest language is the programmer's dream, but it will never happen. The newest nail gun makes little difference to the companies with these legacy problems, yet programmers keep calling for these new languages. "Let 'em starve; it was those guys who got us these legacy problems in the first place" is the corporate attitude.</p><p>FORTRAN IV was old when I learned it in 1975, a semester early so I could breeze through the spring term. I bought the textbook, went down to the computer center, punched the cards, and typed on the terminals like any good hacker-to-be. It was easy for me; I had studied machine language in 4th grade from a book I purchased. Computers are just dumb machines doing step-by-step stuff; these languages just translate one set of instructions into another...</p><p>FORTRAN is worth teaching, as were Pascal, C, Java, Lisp, and Perl.</p><p>Language wars are a waste of time.</p></htmltext>
<tokenext>I think this comment profoundly restates this entire threads argument .
The construction industry suffers from what would be called in the software industry level 1 companies .
Most for profit construction companies have adopted nail guns over two decades ago .
Nail guns are clearly more productive and level 1 companies understand this when they see this .
But this is n't really the point most construction delays and costs are associated with management of the job as a whole and getting the right pieces from the suppliers ( and checking an reordering them if incorrect ) and knowing that the specs will chance of that cost saving material has required you to redo work " completed " a month ago.Companies have legacy code in FORTRAN and COBOL and few new programmers fluent in either , this makes the legacy costs even higher .
Sure rewrite it all in the newest language is the programmers dream but it will never happen .
The newest nailgun makes little difference to the companies with these legacy problems and programmer call for call these new languages .
Let em starve it was those guys who got us these legacy problems in the first place is the corporate attitude.FORTRAN IV was old when I learned it in 1975 a semester early so I could breeze through the spring term , I bought the textbook and went down to the computer center and punched the cards , and typed on the terminals like any good hacker to be .
It was easy for me I had studied machine language in 4th grade from a book I purchased , computer are just dumb machines doing step by step stuff these languages just translate one set of instructions to another...FORTRAN is worth teaching as was pascal and C and java and lisp and PERL.Language wars are a waste of time .</tokentext>
<sentencetext>I think this comment profoundly restates this entire threads argument.
The construction industry suffers from what would be called in the software industry level 1 companies.
Most for profit construction companies have adopted nail guns over two decades ago.
Nail guns are clearly more productive and level 1 companies understand this when they see this.
But this isn't really the point most construction delays and costs are associated with management of the job as a whole and getting the right pieces from the suppliers ( and checking an reordering them if incorrect) and knowing that the specs will chance of that cost saving material has required you to redo work "completed" a month ago.Companies have legacy code in FORTRAN and COBOL and few new programmers fluent in either, this makes the legacy costs even higher.
Sure rewrite it all in the newest language is the programmers dream but it will never happen.
The newest nailgun makes little difference to the companies with these legacy problems and programmer call for call these new languages.
Let em starve it was those guys who got us these legacy problems in the first place is the corporate attitude.FORTRAN IV was old when I learned it in 1975 a semester early so I could breeze through the spring term, I bought the textbook and went down to the computer center and punched the cards, and typed on the terminals like any good hacker to be.
It was easy for me I had studied machine language in 4th grade from a book I purchased, computer are just dumb machines doing step by step stuff these languages just translate one set of instructions to another...FORTRAN is worth teaching as was pascal and C and java and lisp and PERL.Language wars are a waste of time.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292293</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292969</id>
	<title>act well THY part</title>
	<author>Anonymous</author>
	<datestamp>1244732220000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I feel that they should only be taught one or two languages that help accomplish their projects and goals as physical scientists.  Only computer scientists and the like need spend so much time with languages that enforce low-level concepts like memory allocation.  If a physicist wants a program to be more efficient, he should hire a computer scientist!</p></htmltext>
<tokenext>I feel that they should only be taught one or two languages that help accomplish their projects and goals as physical scientists .
Only computer scientists and the like need spend so much time with languages that enforce low-level concepts like memory allocation .
If a physicist wants a program to be more efficient , he should hire a computer scientist !</tokentext>
<sentencetext>I feel that they should only be taught one or two languages that help accomplish their projects and goals as physical scientists.
Only computer scientists and the like need spend so much time with languages that enforce low-level concepts like memory allocation.
If a physicist wants a program to be more efficient, he should hire a computer scientist!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292745</id>
	<title>Re:University != Trade school</title>
	<author>Anonymous</author>
	<datestamp>1244731440000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>They are using it as a tool to accomplish science and engineering goals; not computer science goals.</p><p>And, do you not think MBAs spend time using Excel AND being taught how to write macros?  There are things to be done in this world which require something other than a liberal arts major or computer science major.</p></htmltext>
<tokenext>They are using it as a tool to accomplish science and engineering goals ; not computer science goals.And , do you not think MBAs spend time using Excel AND being taught how to write macros ?
They are things to be done in this world which require other than a liberal arts major or computer science major .</tokentext>
<sentencetext>They are using it as a tool to accomplish science and engineering goals; not computer science goals.And, do you not think MBAs spend time using Excel AND being taught how to write macros?
They are things to be done in this world which require other than a liberal arts major or computer science major.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292839</id>
	<title>Re:While there may be "newer" languages</title>
	<author>Lord Ender</author>
	<datestamp>1244731740000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>There may be better languages for specialized sets of problems. But for CS and Software Engineering students, schools would serve students best by teaching:</p><ul><li>Python for "basics of programming"</li><li>Ruby for OOP classes</li><li>Assembly and C for Operating Systems and other more hardware-focused stuff.</li></ul></htmltext>
<tokenext>There may be better languages for specialized sets of problems .
But for CS and Software Engineering students , schools would serve students best by teaching : Python for " basics of programming " Ruby for OOP classesAssembly and C for Operating Systems and other more hardware-focused stuff .</tokentext>
<sentencetext>There may be better languages for specialized sets of problems.
But for CS and Software Engineering students, schools would serve students best by teaching:Python for "basics of programming"Ruby for OOP classesAssembly and C for Operating Systems and other more hardware-focused stuff.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295995</id>
	<title>maths oriented language for physics: YES</title>
	<author>Anonymous</author>
	<datestamp>1244743020000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>since you're talking about physics - it sounds like yes, it should still be taught. I would have thought either matlab or fortran is perfect: a language for calculating shit, as opposed to a general-purpose language.</p><p>languages like python can be particularly cumbersome to do maths in (unless python has got some sort of maths module which I don't know about).</p><p>I don't know much about fortran vs matlab though, so perhaps matlab is a decent alternative.</p><p>anyway, fortran is still used in all sorts of areas in code, hardware, or systems etc. where you need actual number crunching - for example radar systems and that sort of thing.</p><p>I'm not a fan of fortran myself, but the fact is it's a tried and tested language for numeric computations.</p><p>however, having said that, the question was not 'is it any use' but 'should it still be taught'.</p><p>the answer to that really depends on the course. the course I learnt it in was called 'programming paradigms' and the intent was to show different types of languages. fortran is a good example of a different paradigm from your bog-standard boring procedural C-style languages.</p><p>so, yes, it should be taught. at least there's no reason it shouldn't be used as an example of that type of language.</p><p>also, I have to say, learning how shit some of these old languages are provides appreciation of the features of modern languages.</p></htmltext>
<tokenext>since you 're talking about physics - it sounds like yes it should still be taught.I would have thought either matlab or fortran is perfect .
a language for calculatingshit .
as opposed to a general purpose langage.languages like python can be particularly cumbersome to do maths in .
( unless python has got some sort of maths module which I do n't know about ) .I do n't know much about fortran vs matlab tho so perhaps matlab is a decent alternative.anyway fortran is still used in all sorts of areas in code hardware or systems etc where you needactual number crunching for example radar systems and that sort of thing.I 'm not a fan of fortran myself but the fact is it 's a tried and tested language fornumeric computations.however having said that , the question was not 'is it any use ' but should it still be taught.the answer to that really depends on the course .
the course I learnt it in was called'programming paradigms ' and the intent was to show different types of languages .
fortranis a good example of a different paradigm from your bog standard boring procedural Cstyle languages.so , yes it should be taught .
at least there 's no reason it should n't be used as an exampleof that type of language.also , I have to say , learning how shit some of these old languages are provides appreciationof features of modern languages .</tokentext>
<sentencetext>since you're talking about physics  - it sounds like yes it should  still be taught.I would have thought either matlab or fortran is perfect.
a language for calculatingshit.
as opposed to a general purpose langage.languages like python can be particularly cumbersome to do maths in.
(unless python has got some sort of maths module which I don't know about).I don't know much about fortran vs matlab tho so perhaps matlab is a decent alternative.anyway fortran is still used in all sorts of areas in code hardware or systems etc where you needactual number crunching for example radar systems and that sort of thing.I'm not a fan of fortran myself but the fact is it's a tried and tested language fornumeric computations.however having said that, the question was not 'is it any use' but should it still be taught.the answer to that really depends on the course.
the course I learnt it in was called'programming paradigms' and the intent was to show different types of languages.
fortranis a good example of a different paradigm from your bog standard boring procedural Cstyle languages.so, yes it should be taught.
at least there's no reason it shouldn't be used as an exampleof that type of language.also, I have to say, learning how shit some of these old languages are provides appreciationof features of modern languages.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293421</id>
	<title>faculty often only know excel, fortran &amp;| matl</title>
	<author>goatbar</author>
	<datestamp>1244733660000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>This often comes purely from what the teaching faculty knows... which causes a lot of teaching of excel (+ maybe visual basic) too.  If you don't know it, you can't teach it.</htmltext>
<tokenext>This often comes purely from what the teaching faculty knows... which causes a lot of teaching of excel ( + maybe visual basic ) too .
If you do n't know it , you ca n't teach it .</tokentext>
<sentencetext>This often comes purely from what the teaching faculty knows... which causes a lot of teaching of excel (+ maybe visual basic) too.
If you don't know it, you can't teach it.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295865</id>
	<title>Re:Are You Serious?</title>
	<author>synthespian</author>
	<datestamp>1244742540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>You are so clueless, you are beyond help. Your TI-89? Throw it in the garbage (you should have bought an HP, anyways).<br>What a joke education has become.<br>Now go program your PHP site.</p></htmltext>
<tokenext>You are so clueless , you are beyond help .
Your TI-89 ?
Throw in the garbage ( you should have bought an HP , anyways ) .What a joke education has become.Now go program your PHP site .</tokentext>
<sentencetext>You are so clueless, you are beyond help.
Your TI-89?
Throw in the garbage (you should have bought an HP, anyways).What a joke education has become.Now go program your PHP site.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292165</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292901</id>
	<title>Dude...</title>
	<author>Balinares</author>
	<datestamp>1244731920000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>... You're WAY behind the times.</p><p>I got a buddy who is an astrophysicist and worked at NASA, and he tells me his department ditched FORTRAN years ago in favor of Python+Numeric.</p><p>I hear you about the need for badass number crunching tools. It's your assumption that only FORTRAN fits that particular bill which is erroneous.</p><p>Not to say that FORTRAN doesn't have its use. It's just that other tools have since become better at some of those.</p><p><a href="http://numpy.scipy.org/" title="scipy.org">Python Numeric homepage</a> [scipy.org]. Check it out.</p></htmltext>
<tokenext>... You 're WAY behind the times.I got a buddy who is an astrophysicist and worked at NASA , and he tells me his department ditched FORTRAN years ago in favor of Python + Numeric.I hear you about the need for badass number crunching tools .
It 's your assumption that only FORTRAN fits that particular bill which is erroneous.Not to say that FORTRAN does n't have its use .
It 's just that other tools have since become better at some of those.Python Numeric homepage [ scipy.org ] .
Check it out .</tokentext>
<sentencetext>... You're WAY behind the times.I got a buddy who is an astrophysicist and worked at NASA, and he tells me his department ditched FORTRAN years ago in favor of Python+Numeric.I hear you about the need for badass number crunching tools.
It's your assumption that only FORTRAN fits that particular bill which is erroneous.Not to say that FORTRAN doesn't have its use.
It's just that other tools have since become better at some of those.Python Numeric homepage [scipy.org].
Check it out.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294857</id>
	<title>Re:It's okay to teach them FORTRAN</title>
	<author>maj\_id10t</author>
	<datestamp>1244739120000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext>I too had to learn FORTRAN (it was v77 then) as part of my undergraduate degree, as did my now-wife.  We were both studying Environmental Engineering.  There are many US Federal government agencies (e.g. EPA) who have mathematical models programmed in FORTRAN, as federal regulations mandate.  For those students who will be studying to work in this field, it makes perfect sense to study FORTRAN.  There are no plans I am aware of to update these environmental modeling programs from FORTRAN to a 'better' language.  For the non-computer-science students / environmental engineers, it is a good place to start with computer programming and will add to the skills required for the workplace.  If they have the desire to enhance their programming skills, they should be encouraged to minor in CS, etc.

For anyone else, dear God NO!  Do not subject them to this language.

My $0.02.</htmltext>
<tokenext>I too had to learn FORTRAN ( it was v77 then ) as part of my undergraduate degree as did my now wife .
We were both studying Environmental Engineering .
There are many US Federal government agencies ( e.g .
EPA ) who have mathematical models that are programmed in FORTRAN as per the federal regulations mandate .
For those students who will be studying to work in this field it makes perfect sense for them to study FORTRAN .
There are no plans I am aware of to update these environmental modeling programs from FORTRAN to a 'better ' language .
For the non-computer science students / environmental engineers it is a good place to start with computer programming and will add to their skills required for the work place .
If they have the desire to enhance their computer programming skills they should be encouraged to minor in CS etc .
For anyone else , dear God NO !
Do not subject them to this language .
My $ 0.02 .</tokentext>
<sentencetext>I too had to learn FORTRAN (it was v77 then) as part of my undergraduate degree as did my now wife.
We were both studying Environmental Engineering.
There are many US Federal government agencies (e.g.
EPA) who have mathematical models that are programmed in FORTRAN as per the federal regulations mandate.
For those students who will be studying to work in this field it makes perfect sense for them to study FORTRAN.
There are no plans I am aware of to update these environmental modeling programs from FORTRAN to a 'better' language.
For the non-computer science students / environmental engineers it is a good place to start with computer programming and will add to their skills required for the work place.
If they have the desire to enhance their computer programming skills they should be encouraged to minor in CS etc.
For anyone else, dear God NO!
Do not subject them to this language.
My $0.02.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294685</id>
	<title>Re:Not punched cards</title>
	<author>Sanat</author>
	<datestamp>1244738460000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Among other equipment (CDC 3200 systems) I maintained a room full of 026's, and I would love to have a nickel for every star-wheel I had to reinstall after the coding cylinder was ripped out without first releasing the star-wheel pressure.</p></htmltext>
<tokentext>Among other equipment ( CDC 3200 systems ) I maintained a room full of 026 's and I would love to have a nickel for every star-wheel I had to reinstall after the coding cylinder was ripped out without first releasing the star-wheels pressure .</tokentext>
<sentencetext>Among other equipment (CDC 3200 systems) I maintained a room full of 026's and I would love to have a nickel for every star-wheel I had to reinstall after the coding cylinder was ripped out without first releasing the star-wheels pressure.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291953</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294233</id>
	<title>Ask ACES what they think</title>
	<author>EmagGeek</author>
	<datestamp>1244736840000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>There is a reason FORTRAN is still used heavily in numerical computing. I still develop software I started nearly 10 years ago as part of my graduate program. I looked into rewriting it in other languages and none seemed as well-suited for numerical computation as FORTRAN.</p></htmltext>
<tokentext>There is a reason FORTRAN is still used heavily in numerical computing .
I still develop software I started nearly 10 years ago as part of my graduate program .
I looked into rewriting it in other languages and none seemed as well-suited for numerical computation as FORTRAN .</tokentext>
<sentencetext>There is a reason FORTRAN is still used heavily in numerical computing.
I still develop software I started nearly 10 years ago as part of my graduate program.
I looked into rewriting it in other languages and none seemed as well-suited for numerical computation as FORTRAN.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294007</id>
	<title>Hell yea</title>
	<author>sucitivel</author>
	<datestamp>1244736000000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Absolutely... I'd rather see the children learning FORTRAN than something awful like .NET or, to a lesser degree, Java -- I'd actually like to see the whole .NET thing go away permanently...

but in all reality I think C/C++ is the best language to start with, because it is in at least some ways syntactically similar to many languages and is a good introduction to object-oriented design. ...or as of late I've been finding Objective-C to be a worthwhile endeavor.</htmltext>
<tokentext>Absolutely... I 'd rather see the children learning FORTRAN than something awful like .NET or to a lesser degree Java -- I 'd actually like to see the whole .NET thing go away permanently.. . but in all reality I think c/c + + is the best language to start in because it is in at least some ways syntactically similar to many languages , and is a good introduction to object oriented design .
...or as of late I 've been finding objective c to be a worthwhile endeavor .</tokentext>
<sentencetext>Absolutely... I'd rather see the children learning FORTRAN than something awful like .NET or to a lesser degree Java -- I'd actually like to see the whole .NET thing go away permanently...

but in all reality I think c/c++ is the best language to start in because it is in at least some ways syntactically similar to many languages, and is a good introduction to object oriented design.
...or as of late I've been finding objective c to be a worthwhile endeavor.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28301327</id>
	<title>Re:Not punched cards</title>
	<author>Nefarious Wheel</author>
	<datestamp>1244718660000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>these odd, pale blue, rounded wedge-shaped TV thingies whose screens glowed blue (ADM3A's, for the uninitiated).</p></div><p>Ahh, the ADM3A.  For years I used an ADM3A "Dumb Terminal" clone that I built from the Heathkit catalog (the H-19, I think).  Great little terminal.  </p><p>I remember seeing an ADM3A in the Hobart (Tasmania) Department of Health building, smack in the middle of one of those ergonomic workstations that had a raised centre and a separate space for the keyboard.  To use it that way you would have to be part orangutan. I learned everything I needed to know about bureaucracy from that single gestalt.</p><p>The building lost funds for branding before they completed the outdoor signage.  All you could see from the street was "Depart".  Say what you like about Taswegians, but they do have a finely tuned sense of irony.</p>
	</htmltext>
<tokentext>these odd , pale blue , rounded wedge-shaped TV thingies whose screens glowed blue ( ADM3A 's , for the uninitiated ) . Ahh , the ADM3A .
For years I used an ADM3A " Dumb Terminal " clone that I built from the Heathkit catalog ( H-19 I think it was ) .
Great little terminal .
I remember seeing an ADM3A in the Hobart ( Tasmania ) Department of Health building , smack in the middle of one of those ergonomic workstations that had a raised centre and separate space for the keyboard .
To use it that way you would have to be part orangutan .
I learned everything I needed to know about bureaucracy from that single gestalt.The building lost funds for branding before they completed the outdoor signage .
All you could see from the street was " Depart " .
Say what you like about Taswiegens , but they do have a finely tuned sense of irony .</tokentext>
<sentencetext>these odd, pale blue, rounded wedge-shaped TV thingies whose screens glowed blue (ADM3A's, for the uninitiated). Ahh, the ADM3A.
For years I used an ADM3A "Dumb Terminal" clone that I built from the Heathkit catalog (H-19 I think it was).
Great little terminal.
I remember seeing an ADM3A in the Hobart (Tasmania) Department of Health building, smack in the middle of one of those ergonomic workstations that had a raised centre and separate space for the keyboard.
To use it that way you would have to be part orangutan.
I learned everything I needed to know about bureaucracy from that single gestalt. The building lost funds for branding before they completed the outdoor signage.
All you could see from the street was "Depart".
Say what you like about Taswiegens, but they do have a finely tuned sense of irony.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296129</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153</id>
	<title>University != Trade school</title>
	<author>SpinyNorman</author>
	<datestamp>1244729400000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>5</modscore>
	<htmltext><p>IMO universities should be teaching core principles and methods, not attempting to impart up-to-date job skills.</p><p>If you are going to teach FORTRAN because it's of use in the real world, then why stop there? Why not also (god forbid) teach .NET, JavaScript, C#, etc.? May as well teach them Excel macros and how to interact with Microsoft Clippy while you're at it.</p><p>No!</p><p>Teaching programming should be done in a language that imparts the principles easily and teaches good habits. You could do a lot worse than Pascal, which was often used in this role, or maybe today just C++. I'd argue against Java and scripting languages as the core language, since they are too high-level to learn all the basics with. You could throw in Perl, Python or any modern scripting language as a secondary, and for Computer Science (vs. Physics, Engineering, etc.) it's appropriate to teach a couple of other styles of programming - e.g. assembler and functional programming.</p></htmltext>
<tokentext>IMO universities should be teaching core principles and methods , not attempting to impart up-to-date job skills . If you are going to teach FORTRAN because it 's of use in the real world , then why stop there ?
Why not also ( god forbid ) teach .NET .
JavaScript , C # , etc .
May as well teach them Excel macros and how to interact with Microsoft Clippy while you 're at it . No ! Teaching programming should be done in a langauge that imparts the principles easily and teaches good habits .
You could do a lot worse than Pascal which was often used in this role , or maybe today just C + + .
I 'd argue against Java and scripting languages as the core language since they are too high level to learn all the basics .
You could throw in Perl , Python or any modern scripting langauge as a secondary , and for a Computer Science ( vs. Physics , Engineering , etc ) it 's appropriate to teach a couple of other styles of programming - e.g .
assembler , and functional programming .</tokentext>
<sentencetext>IMO universities should be teaching core principles and methods, not attempting to impart up-to-date job skills. If you are going to teach FORTRAN because it's of use in the real world, then why stop there?
Why not also (god forbid) teach .NET.
JavaScript, C#, etc.
May as well teach them Excel macros and how to interact with Microsoft Clippy while you're at it. No! Teaching programming should be done in a langauge that imparts the principles easily and teaches good habits.
You could do a lot worse than Pascal which was often used in this role, or maybe today just C++.
I'd argue against Java and scripting languages as the core language since they are too high level to learn all the basics.
You could throw in Perl, Python or any modern scripting langauge as a secondary, and for a Computer Science (vs. Physics, Engineering, etc) it's appropriate to teach a couple of other styles of programming - e.g.
assembler, and functional programming.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292599</id>
	<title>Anonymous Coward</title>
	<author>Anonymous</author>
	<datestamp>1244731020000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>All our calculation kernels are still written in Fortran 95. They are parallelized very easily at loop level using OpenMP (admittedly they are quite easily parallelized problems). In fact this was painless to do.</p><p>We do financial optimization (risk, credit, futures and portfolio analysis), electrical optimizations (hydropower, gas power, plant portfolios) and cross-border energy trading. Given the amount of number crunching I've done using Fortran (IMSL), C/C++ (Boost, COIN-OR CLP) and Java (with COLT from CERN), I'd say that Fortran was the easiest, most concise, manageable and complete solution.</p><p>So yes, teach good fundamentals in Fortran. Great to learn, interesting projects to work on, fast and future-safe (50 years and counting)!</p></htmltext>
<tokentext>All our calculation kernels are still written in FORTRAN 95 .
They are parallelized using Open MP very easily on a loop level ( admitted they are quite easily parallelized problems ) .
This was painless to do in fact.We do financial optimization ( risk , credit , futures and portfolio analysis ) , electrical optimizations ( hydropower , gas power , plant portfolios ) and cross-border energy trading .
With the amount of number crunching I 've done using Fortran ( IMSL ) , C/C + + ( Boost , COIN-OR CLP ) and Java ( with COLT from Cern ) .
I 'd say that Fortran was the easiest , most concise , manageable and complete solution.So yes , teach good fundamentals in Fortran .
Great to learn , interesting projects to work on , fast and future safe ( 50 years and counting ) !</tokentext>
<sentencetext>All our calculation kernels are still written in FORTRAN 95.
They are parallelized using Open MP very easily on a loop level (admitted they are quite easily parallelized problems).
This was painless to do in fact. We do financial optimization (risk, credit, futures and portfolio analysis), electrical optimizations (hydropower, gas power, plant portfolios) and cross-border energy trading.
With the amount of number crunching I've done using Fortran (IMSL), C/C++ (Boost, COIN-OR CLP) and Java (with COLT from Cern).
I'd say that Fortran was the easiest, most concise, manageable and complete solution. So yes, teach good fundamentals in Fortran.
Great to learn, interesting projects to work on, fast and future safe (50 years and counting)!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303863</id>
	<title>Re:Sillyness</title>
	<author>ceoyoyo</author>
	<datestamp>1244736060000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Realtime aerospace simulation is not a set that encompasses all of "Scientific Computing."</p><p>I also do scientific computing and I've never written much Fortran code.  I do use a lot of excellent Fortran (and C) libraries though, by calling them from Python.</p><p>A LOT, perhaps the majority, of scientific computing these days is done using MATLAB, which is an awful "language."  Of course, it also depends heavily on the venerable old Fortran libraries.</p></htmltext>
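A hedged sketch of what that workflow typically looks like: NumPy is one common way Python front-ends compiled LAPACK/BLAS kernels (this is a generic illustration, not the specific libraries the commenter uses).

```python
# Sketch: NumPy hands the heavy lifting to compiled LAPACK/BLAS
# kernels (historically Fortran), so the Python layer is just glue.
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((200, 200))
b = rng.standard_normal(200)

# np.linalg.solve dispatches to LAPACK's gesv-family routines.
x = np.linalg.solve(A, b)
residual = np.linalg.norm(A @ x - b)
```

The point is the division of labour: Python for orchestration, the battle-tested Fortran/C numerical core for speed.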
<tokentext>Realtime aerospace simulations is not a set that encompasses all of " Scientific Computing .
" I also do scientific computing and I 've never written much Fortran code .
I do use a lot of excellent Fortran ( and C ) libraries though , by calling them from Python.A LOT , perhaps the majority , of scientific computing these days is done using MatLab , which is an awful " language .
" Of course , it also depends heavily on the venerable old Fortran libraries .</tokentext>
<sentencetext>Realtime aerospace simulations is not a set that encompasses all of "Scientific Computing.
"I also do scientific computing and I've never written much Fortran code.
I do use a lot of excellent Fortran (and C) libraries though, by calling them from Python. A LOT, perhaps the majority, of scientific computing these days is done using MatLab, which is an awful "language.
"  Of course, it also depends heavily on the venerable old Fortran libraries.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292565</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294815</id>
	<title>Re:It's okay to teach them FORTRAN</title>
	<author>Anonymous</author>
	<datestamp>1244738940000</datestamp>
	<modclass>Funny</modclass>
	<modscore>4</modscore>
	<htmltext>Nah, make them learn VB6, if for no other reason than enjoying the screams of horror from 'real' programmers when they come across some VB6 app. Take that, real programmers! For extra evil, teach them GOTO. I've found watching the facial tics and the foam building up around the mouths of real programmers when they encounter a GOTO to be quite entertaining!</htmltext>
<tokentext>Nah , make them learn VB6 , if for no other reason for enjoying the screams of horror from 'real " programmers when they come across some VB6 app .
Take that , real programmers !
For extra evil teach them GOTO .
I 've found watching the facial ticks and foam build up around the mouth from real programmers when they encounter a GOTO to be quite entertaining !</tokentext>
<sentencetext>Nah, make them learn VB6, if for no other reason for enjoying the screams of horror from 'real" programmers when they come across some VB6 app.
Take that, real programmers!
For extra evil teach them GOTO.
I've found watching the facial ticks and foam build up around the mouth from real programmers when they encounter a GOTO to be quite entertaining!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294321</id>
	<title>Medleys</title>
	<author>itomato</author>
	<datestamp>1244737200000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Bands play 'medleys' when the times have progressed but the tunes are still a vital part of their lineup, why their fans came in the first place, etc.</p><p>If anything it should be taught as part of an "Exploring Legacy Computing 102" course.  A little FORTRAN, a little Pascal, some LISP (no threats, please), BASIC, COBOL, as well as magnetic and paper media technologies.</p><p>1. Toggle an Integer BASIC interpreter of your own design into the Altair 8800 you wire-wrapped in 101, using the front-panel switches.<br>2. Punch a master card with a FORTRAN program to perform a bubble sort on the following data set:<br>
&nbsp; &nbsp; &nbsp; &nbsp; 2a.: 12, 14, 66, 31, 988, 292, 747, 90922, 1...</p><p>
&nbsp; &nbsp; &nbsp; &nbsp; 2b.: Error-check and reproduce the Master until 45 copies can be created with the IBM 514 reproducing punch.</p><p>3. Restore a PDP-11 with 2.11BSD from tapes.<br>
&nbsp; &nbsp; &nbsp; &nbsp; 3a. How many tracks are available on your tape?<br>
&nbsp; &nbsp; &nbsp; &nbsp; 3b. How many tracks would be present if you circumambulated the PDP-11 and, beginning at the rearmost right corner, affixed the tape to the exterior of the machine, with no overlap, with each course at approximately 22 degrees?</p></htmltext>
<tokentext>Bands play 'medleys ' it when the times have progressed , but the tunes are still a vital part of their lineup , why their fans came in the first place , etc . If anything it should be taught as part of an " Exploring Legacy Computing 102 " course .
A little FORTRAN , a little Pascal , some LISP ( no threats , please ) , BASIC , COBOL , as well as magnetic and paper media technologies.1 .
Toggle an Integer BASIC interpreter of your own design into the Altair 8800 you wire-wrapped in 101 , using the front-panel switches.2 .
Punch a master card with a FORTRAN program to perform a bubble sort on the following data set :         2a .
: 12 , 14 , 66 , 31 , 988 , 292 , 747 , 90922 , 1.. .         2b .
: Error-check and reproduce Master until 45 copies can be created with the IBM 514 reproducing punch.3 .
Restore PDP/11 from with 2.11BSD from tapes .
        3a .
How many tracks are available on your tape ?
        3b .
How tracks would be present if you circumambulate the PDP/11 , and beginning at the rearmost right corner , affix the tape to the exterior of the machine , with no overlap , with each course at approximately 22 Degrees ?</tokentext>
<sentencetext>Bands play 'medleys' it when the times have progressed, but the tunes are still a vital part of their lineup, why their fans came in the first place, etc.
If anything it should be taught as part of an "Exploring Legacy Computing 102" course.
A little FORTRAN, a little Pascal, some LISP (no threats, please), BASIC, COBOL, as well as magnetic and paper media technologies.
1. Toggle an Integer BASIC interpreter of your own design into the Altair 8800 you wire-wrapped in 101, using the front-panel switches.
2. Punch a master card with a FORTRAN program to perform a bubble sort on the following data set:
        2a.: 12, 14, 66, 31, 988, 292, 747, 90922, 1...
        2b.: Error-check and reproduce Master until 45 copies can be created with the IBM 514 reproducing punch.
3. Restore PDP/11 from with 2.11BSD from tapes.
        3a. How many tracks are available on your tape?
        3b. How tracks would be present if you circumambulate the PDP/11, and beginning at the rearmost right corner, affix the tape to the exterior of the machine, with no overlap, with each course at approximately 22 Degrees?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28316495</id>
	<title>Forth codes</title>
	<author>Anonymous</author>
	<datestamp>1244817660000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>No, you should learn Forth instead. Forth is a good programming language that you should learn how to use.</p></htmltext>
<tokentext>No , you should learn Forth codes .
Forth is good program language that you should learn how to do it</tokentext>
<sentencetext>No, you should learn Forth codes.
Forth is good program language that you should learn how to do it</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401</id>
	<title>Re:While there may be "newer" languages</title>
	<author>fph il quozientatore</author>
	<datestamp>1244730300000</datestamp>
	<modclass>Informative</modclass>
	<modscore>5</modscore>
	<htmltext><blockquote><div><p>Citation needed.

Even if not Python, what does Fortran have over modern compiled languages, for example?</p></div>
</blockquote><p>

0) A lot of legacy code people still have to work with is written in FORTRAN. Sad but true.<br>
1) Many highly optimized libraries are available. Check whether your language du jour has an implementation of a routine for solving a linear system using <a href="http://en.wikipedia.org/wiki/BLAS" title="wikipedia.org" rel="nofollow">BLAS</a> [wikipedia.org]. That provides a huge improvement.<br>
2) Many libraries are in fact only available for FORTRAN. For calculating the eigenvalues of a sparse matrix, there is only ARPACK (for Fortran), Arpack++ (a kludgy C++ interface to the very same FORTRAN library), and Matlab's "eigs" (a Visual Basic-style interface to the very same FORTRAN library).<br>
3) Very expressive. For instance, you can reverse the entries of a vector of complex numbers in a single statement. This is a toy example, but for more complicated stuff this expressiveness pays: the compiler has an easier job understanding what code can be safely optimized and what cannot. More complicated stuff involving e.g. C++ method calls suffers from pointer-aliasing problems and the like. Of course you could write the very same thing in C or machine code, but for 99% of computations you would use the "standard" interface to the vectors/arrays of your language and forget about this sort of micro-optimization. A good commercial FORTRAN compiler (forget about gfortran; sorry GNU, but sadly it's true) does this automatically.<br>
4) FORTRAN 95 is not a punch-card language anymore; it has most of the fancy modern stuff if you wish to use it. While "bad programmers can write FORTRAN in every language", good programmers can write well-factored and perfectly readable FORTRAN code.<br>
<br>
Nevertheless, I do matrix computations, and I still try to avoid it as much as I can. Most people in our field use MATLAB (which is essentially a Visual Basic-style interface to most of the awesome number-crunching FORTRAN libraries) even though its performance sucks for tight "for" loops. If performance is mission-critical, you may write FORTRAN subroutines and call them from MATLAB, which is very convenient. Python still lacks many of Matlab's features; its only advantage is being Free Software.<br>
<br>
BTW, a very ill-advised design choice of Python:
<a href="http://www.python.org/dev/peps/pep-0211/" title="python.org" rel="nofollow">http://www.python.org/dev/peps/pep-0211/</a> [python.org]
Ask any numerical analyst why it is a terrible idea to solve a linear system with inv(A)*b. But make sure you have at least half an hour free.</p>
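To make the inv(A)*b point concrete, a small sketch (NumPy here rather than the MATLAB/Fortran stack discussed above; the Hilbert matrix is just a standard ill-conditioned example):

```python
# Sketch: direct solve vs. explicit inverse on an ill-conditioned system.
import numpy as np

n = 12
i = np.arange(n)
H = 1.0 / (i[:, None] + i[None, :] + 1)  # Hilbert matrix, cond(H) ~ 1e16
b = H @ np.ones(n)                       # exact solution: all ones

x_solve = np.linalg.solve(H, b)          # backward-stable factorization
x_inv = np.linalg.inv(H) @ b             # forms inv(H) explicitly: avoid this

res_solve = np.linalg.norm(H @ x_solve - b)
res_inv = np.linalg.norm(H @ x_inv - b)
# res_solve stays near machine precision; res_inv is typically
# orders of magnitude larger, on top of costing more flops.
```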
	</htmltext>
<tokentext>Citation needed .
Even if not phython , what does Fortran have over modern compiled languages , for example ?
0 ) A lot of legacy code people still have to work with is written in FORTRAN .
Sad but true .
1 ) Many very optimized libraries available .
Check if your language du jour has an implementation of a routine for solving a linear system using BLAS [ wikipedia.org ] .
That provides a huge improvement .
2 ) Many libraries are in fact only available for FORTRAN .
For calculating the eigenvalues of a sparse matrix , there is only ARPACK ( for Fortran ) , Arpack + + ( a kludgy C + + interface to the very same FORTRAN library ) , and Matlab 's " eigs " ( a Visual Basic-style interface to the very same FORTRAN library ) .
3 ) Very expressive .
For instance , you can reverse the entries of a vector of complex numbers in a single compiler instruction .
This is a toy example , but for more complicate stuff this expressiveness pays : the compiler has an easier job in understanding what code can be safely optimized and what can not .
More complicate stuff involving e.g .
C + + method calls suffers in terms of pointer aliasing problems and similar stuff .
Of course you may write the very same thing in C or machine code , but for 99 \ % of the computations you would use the " standard " interface to vectors/arrays of your languages and forget about this sort of micro-optimizations .
A good commercial FORTRAN compiler ( forget about gfortran , sorry GNU but sadly it 's true ) does this automatically .
4 ) FORTRAN 95 is not a punch-card language anymore , it has most of the fancy modern stuff if you wish to use it .
While " bad programmers can write FORTRAN in every language " , good programmers can write well-factored and perfectly readable FORTRAN code .
Nevertheless , I do matrix computations , and still I try to avoid it as much as I can .
Most people in our field use MATLAB ( which is essentially a Visual Basic-style interface to most of the awesome number-crunching FORTRAN libraries ) even though for tight " for " loops its performance sucks .
If performance is mission-critical , you may write FORTRAN subroutines and call them from MATLAB , and that 's very convenient .
Python still lacks many of Matlab 's features , its only advantage is being Free Software .
BTW , a very ill-advised design choice of Python : http : //www.python.org/dev/peps/pep-0211/ [ python.org ] Ask any numerical analyst to know why it is a terrible idea to solve a linear system with inv ( A ) * b. But make sure you have at least half an hour free .</tokentext>
<sentencetext>Citation needed.
Even if not phython, what does Fortran have over modern compiled languages, for example?
0) A lot of legacy code people still have to work with is written in FORTRAN.
Sad but true.
1) Many very optimized libraries available.
Check if your language du jour has an implementation of a routine for solving a linear system using BLAS [wikipedia.org].
That provides a huge improvement.
2) Many libraries are in fact only available for FORTRAN.
For calculating the eigenvalues of a sparse matrix, there is only ARPACK (for Fortran), Arpack++ (a kludgy C++ interface to the very same FORTRAN library), and Matlab's "eigs" (a Visual Basic-style interface to the very same FORTRAN library).
3) Very expressive.
For instance, you can reverse the entries of a vector of complex numbers in a single compiler instruction.
This is a toy example, but for more complicate stuff this expressiveness pays: the compiler has an easier job in understanding what code can be safely optimized and what cannot.
More complicate stuff involving e.g.
C++ method calls suffers in terms of pointer aliasing problems and similar stuff.
Of course you may write the very same thing in C or machine code, but for 99\% of the computations you would use the "standard" interface to vectors/arrays of your languages and forget about this sort of micro-optimizations.
A good commercial FORTRAN compiler (forget about gfortran, sorry GNU but sadly it's true) does this automatically.
4) FORTRAN 95 is not a punch-card language anymore, it has most of the fancy modern stuff if you wish to use it.
While "bad programmers can write FORTRAN in every language", good programmers can write well-factored and perfectly readable FORTRAN code.
Nevertheless, I do matrix computations, and still I try to avoid it as much as I can.
Most people in our field use MATLAB (which is essentially a Visual Basic-style interface to most of the awesome number-crunching FORTRAN libraries) even though for tight "for" loops its performance sucks.
If performance is mission-critical, you may write FORTRAN subroutines and call them from MATLAB, and that's very convenient.
Python still lacks many of Matlab's features, its only advantage is being Free Software.
BTW, a very ill-advised design choice of Python:
http://www.python.org/dev/peps/pep-0211/ [python.org]
Ask any numerical analyst to know why it is a terrible idea to solve a linear system with inv(A)*b. But make sure you have at least half an hour free.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303257</id>
	<title>Perl</title>
	<author>Anonymous</author>
	<datestamp>1244730840000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>but an interesting one. Fortran was my first language. It's good, but not modern enough, and not multi-purpose enough.<br>C/C++? Not sure about that either. Most students are messy and don't clean up after themselves, so writing clean code,<br>where the students have to de-allocate memory etc., may be a way to turn students off.<br>Java? (the circumcised language) Maybe?<br>Perl? Why not? Perl!</p></htmltext>
<tokentext>but interesting one .
Fortran was my first language .
It 's good but not modern enough , and not multi-purpose enough.C/C + + ?
Not sure about that either .
Most students are dirty and do n't clean up after themselves .
So writing clean code,where the students have to de-allocate memory etc .
may be a way to turn off students.Java ?
( the circumcised language ) Maybe ? Perl ?
Why not ?
Perl !</tokentext>
<sentencetext>but interesting one.
Fortran was my first language.
It's good but not modern enough, and not multi-purpose enough.C/C++?
Not sure about that either.
Most students are dirty and don't clean up after themselves.
So writing clean code,where the students have to de-allocate memory etc.
may be a way to turn off students.Java?
(the circumcised language) Maybe?Perl?
Why not?
Perl!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293483</id>
	<title>Re:While there may be "newer" languages</title>
	<author>codegen</author>
	<datestamp>1244733900000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>Citation needed.</p><p>Even if not phython, what does Fortran have over modern compiled languages, for example?</p></div><p>The main thing that Fortran (and Ada too) has is a standard floating point model. There are some
minor differences between compilers. This is unlike C, Java, and Python, which rely on the
underlying FP hardware. Then there are the loopholes in the IEEE FP standard for what NaN means.
For citations, look at the Numerical Recipes series of books for a discussion of the differences.
Also check out "What every computer scientist should know about floating point" <br> <br>

<a href="http://docs.sun.com/source/806-3568/ncg\_goldberg.html" title="sun.com">http://docs.sun.com/source/806-3568/ncg\_goldberg.html</a> [sun.com] <br> <br>

IBM has released a set of libraries for C and Java that provide the Fortran semantics
of floating point, and there is a wrapper for them that uses operator overloading in C++
to make them easier to use. But they are still a pain to use compared to the </p></div>
	</htmltext>
<tokenext>Citation needed.Even if not phython , what does Fortran have over modern compiled languages , for example ? The main thing that Fortran ( and Ada too ) has is a standard floating point model .
There are some minor differences between compilers .
This is unlike C , Java , Python which rely on the underlying FP hardware .
Then to look at the loopholes in the IEEE FP standard for what NaN means .
For citations , look at the Numerical Recipes series of books for a discussion of the differences .
Also check out " What every computer scientist should know about floating point " http : //docs.sun.com/source/806-3568/ncg \ _goldberg.html [ sun.com ] IBM has released a set of libraries for C and Java that provide the Fortran semantics of floating point , and there is a wrapper for them that uses operator overloading in C + + to make them easier to use .
But they are still a pain to use compared to the</tokentext>
<sentencetext>Citation needed.Even if not phython, what does Fortran have over modern compiled languages, for example?The main thing that Fortran (and Ada too) has is a standard floating point model.
There are some
minor differences between compilers.
This is unlike C, Java, Python which rely on the
underlying FP hardware.
Then to look at the loopholes in the IEEE FP standard for what NaN means.
For citations, look at the Numerical Recipes series of books for a discussion of the differences.
Also check out "What every computer scientist should know about floating point"  

http://docs.sun.com/source/806-3568/ncg\_goldberg.html [sun.com]  

IBM has released a set of libraries for C and Java that provide the Fortran semantics
of floating point, and there is a wrapper for them that uses operator overloading in C++
to make them easier to use.
But they are still a pain to use compared to the 
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025</parent>
</comment>
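The NaN "loopholes" mentioned above are easy to see from any language that inherits IEEE 754 semantics from the hardware; a quick Python illustration (not from the comment, just a sketch of the standard's corner cases):

```python
import math

# IEEE 754 NaN semantics as exposed by Python's floats.
nan = float("nan")

print(nan == nan)        # False: NaN compares unequal to everything, itself included
print(nan != nan)        # True
print(math.isnan(nan))   # True: the only reliable way to test for NaN
print(0.1 + 0.2 == 0.3)  # False: binary rounding, not a language bug
```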
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292813</id>
	<title>Re:"Introductory"</title>
	<author>idontgno</author>
	<datestamp>1244731680000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>It's called <a href="http://en.wikipedia.org/wiki/BASIC" title="wikipedia.org">BASIC</a> [wikipedia.org]. I recommend the original Dartmouth dialect. It's got a nice Ivy League blue blood feel to it.</p></htmltext>
<tokenext>It 's called BASIC [ wikipedia.org ] .
I recommend the original Dartmouth dialect .
It 's got a nice Ivy League blue blood feel to it .</tokentext>
<sentencetext>It's called BASIC [wikipedia.org].
I recommend the original Dartmouth dialect.
It's got a nice Ivy League blue blood feel to it.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291939</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293055</id>
	<title>Teach em everything</title>
	<author>SoulRider</author>
	<datestamp>1244732520000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Sure, make undergraduates learn to implement theory in as many languages as possible.  Teach them the theory and make them learn that the language is irrelevant.</p></htmltext>
<tokenext>Sure , make undergraduates learn to implement theory in as many languages as possible .
Teach them the theory make them learn that the language is irrelevant .</tokentext>
<sentencetext>Sure, make undergraduates learn to implement theory in as many languages as possible.
Teach them the theory make them learn that the language is irrelevant.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292565</id>
	<title>Sillyness</title>
	<author>Anonymous</author>
	<datestamp>1244730900000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>4</modscore>
	<htmltext><p>This was clearly written by someone who doesn't actually <em>do</em> any scientific computing.

</p><p>As hard as it may be for some CS-types (myself included) to believe, Fortran is still <strong>the</strong> language for scientific computing.

I've worked in flight simulation at two different companies (and 5 different groups) for the last 15 years. The math required to simulate a flying aircraft in realtime is ungodly hairy. It also has to get done <em>fast</em>. We typically have 50 or so different simulation models (plus all the I/O) that have to run to completion 60 times a second. That's about 17ms, or 8ms if we want 50% spare. In addition, a realtime app like a simulator needs to take the same time to execute every time (no runtime dynamic allocations, GC, etc.) or things "jitter".
</p><p>Everywhere I've worked, with the exception of Ada-mandated jobs, had this code done in Fortran. Yes, that includes today. We are writing new Fortran today, and we are not alone. When we request models from the aircraft manufacturers, they come in Fortran (or occasionally Ada). Fortran is <em>still</em>, and quite possibly always will be, the language for Scientific Computing.
</p><p>Suggesting non-CS math and science students learn some other programming language instead is just wrong. Further suggesting that it should be the author's favorite hip new interpreted language is just laughable.</p></htmltext>
<tokenext>This was clearly written by someone who does n't actually do any scientific computing .
As hard as it may be for some CS-types ( myself included ) to believe , Fortran is still the language for scientific computing .
I 've worked at flight simulation companies for two different companies ( and 5 different groups ) for the last 15 years .
The math required to simulate a flying aircraft in realtime is ungodly hairy .
It also has to get done fast .
We typically have 50 or so different simulation models ( plus all the I/O ) that have to run to completion 60 times a second .
That 's about 17ms , or 8ms if we want \ % 50 spare .
In addition , for a realtime app like a simulatior it needs to take the same time to execute every time ( no runtime dynamic allocations , GC , etc .
) or things " jitter " .
Everywhere I 've worked , with the exception of Ada mandated jobs , had this code done in Fortran .
Yes that includes today .
We are today writing new Fortran , and we are not alone .
When we request models from the aircract manufacturers , they come in Fortran ( or occasionally Ada ) .
Fortran is still , and quite possibly always will be , the language for Scientific Computing .
Suggesting non-CS math and science students learn some other programming language instead is just wrong .
Further suggesting that it should be the author 's favorite hip new interpreted languge is just laughable .</tokentext>
<sentencetext>This was clearly written by someone who doesn't actually do any scientific computing.
As hard as it may be for some CS-types (myself included) to believe, Fortran is still the language for scientific computing.
I've worked at flight simulation companies for two different companies (and 5 different groups) for the last 15 years.
The math required to simulate a flying aircraft in realtime is ungodly hairy.
It also has to get done fast.
We typically have 50 or so different simulation models (plus all the I/O) that have to run to completion 60 times a second.
That's about 17ms, or 8ms if we want 50% spare.
In addition, a realtime app like a simulator needs to take the same time to execute every time (no runtime dynamic allocations, GC, etc.
) or things "jitter".
Everywhere I've worked, with the exception of Ada mandated jobs, had this code done in Fortran.
Yes that includes today.
We are today writing new Fortran, and we are not alone.
When we request models from the aircraft manufacturers, they come in Fortran (or occasionally Ada).
Fortran is still, and quite possibly always will be, the language for Scientific Computing.
Suggesting non-CS math and science students learn some other programming language instead is just wrong.
Further suggesting that it should be the author's favorite hip new interpreted language is just laughable.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293639</id>
	<title>Yes, Python -- to MANAGE the calculation.</title>
	<author>Anonymous</author>
	<datestamp>1244734620000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>You cannot (well, actually should not) do it in pure Python, but even today, there are very few people working in pure Python systems (PyPy is the only project I know of).  Python is a language that allows easy use of functions written in other languages, including C, C++, Java, C#, and FORTRAN.  Yes, more easily than in native C.  See the NumPy and SciPy libraries for access to massive piles of FORTRAN libraries.<br>I've run 4-day programs "with Python" where total execution time of the Python interpreter (if measured properly) would<br>have been less than ten minutes of execution.  Python manages the calculations well, but the actual array processing was in a blend of C and FORTRAN (and, no, I neither know nor care what the mix was there).</p></htmltext>
<tokenext>You can not ( well , actually should not ) do it in pure Python , but even today , there are very few people working in pure Python systems ( PyPy is the only project I know of ) .
Python is a language that allows easy use of functions written in other languages , including C , C + + , Java , C # , and FORTRAN .
Yes , more easily than in native C. See The NumPy and SciPy and libraries for access to massive piles of FORTRAN libraries.I 've run 4-day programs " with Python " where total execution time of the Python interpreter ( if measured properly ) wouldhave been less than ten minutes of execution .
Python manages the calculations well , but the actual array processing was in a blend of C and FORTRAN ( and , no , I neither know nor care what the mix was there ) .</tokentext>
<sentencetext>You cannot (well, actually should not) do it in pure Python, but even today, there are very few people working in pure Python systems (PyPy is the only project I know of).
Python is a language that allows easy use of functions written in other languages, including C, C++, Java, C#, and FORTRAN.
Yes, more easily than in native C.  See The NumPy and SciPy and libraries for access to massive piles of FORTRAN libraries.I've run 4-day programs "with Python" where total execution time of the Python interpreter (if measured properly) wouldhave been less than ten minutes of execution.
Python manages the calculations well, but the actual array processing was in a blend of C and FORTRAN (and, no, I neither know nor care what the mix was there).</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127</parent>
</comment>
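The division of labor described above ("Python manages the calculations, the array processing runs elsewhere") can be sketched in a few lines. This is not the commenter's program; it just shows the same computation once with a pure-Python inner loop and once vectorized, where the loop runs inside NumPy's compiled C/Fortran kernels:

```python
import numpy as np

# Evaluate a polynomial over 100k points two ways.
x = np.linspace(0.0, 1.0, 100_000)

# Pure Python inner loop: the interpreter executes every iteration.
y_loop = np.array([3.0 * v * v + 2.0 * v + 1.0 for v in x])

# Vectorized: one line of Python, the actual loop runs in compiled code.
y_vec = 3.0 * x**2 + 2.0 * x + 1.0

print(np.allclose(y_loop, y_vec))
```

Same answer either way; the vectorized form is the one that hands the heavy lifting to the compiled libraries.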
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292485</id>
	<title>Certainly not!</title>
	<author>thorsen</author>
	<datestamp>1244730600000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>There are problems where fortran is a better (or at least probably a faster-running) solution. But teaching fortran to every undergraduate would be a big mistake.</p><p>They should be taught how to program, not how to do it in fortran. (And if you don't understand the difference here, you don't understand the problem.) Use any kind of language that is easy to teach and learn, or something that is used regularly out there.</p><p>Anyone programming fortran or cobol (same issue, just with banking instead of physics) will tell you that it takes about three weeks to teach a decent programmer this language as well. But if you start out by teaching them the old-school stuff, there is almost no way to get them up to speed on today's programming styles.</p><p>So teach them how to do proper programming, and make specialized courses for those very few who need the legacy languages.</p><p>Bo Thorsen.</p></htmltext>
<tokenext>There are problems , where fortran is a better ( or at least probably a faster runtime ) solution .
But teaching fortran to every undergraduate would be a big mistake.They should be taught how to program , not how to do it in fortran .
( And if you do n't understand the difference here , you do n't understand the problem .
) Use any kind of language that is easy to teach and learn , or something that is used regularly out there.Anyone programming fortran or cobol ( same issue , just with banking instead of physics ) will tell you that it takes about three weeks to teach a decent programmer how to do this language as well .
But if you start out by teaching them the old school stuff , there is almost no way to get them up to speed on todays programming styles.So teach them how to do proper programming , and make specialized courses for those very few who needs the legacy languages.Bo Thorsen .</tokentext>
<sentencetext>There are problems, where fortran is a better (or at least probably a faster runtime) solution.
But teaching fortran to every undergraduate would be a big mistake.They should be taught how to program, not how to do it in fortran.
(And if you don't understand the difference here, you don't understand the problem.
) Use any kind of language that is easy to teach and learn, or something that is used regularly out there.Anyone programming fortran or cobol (same issue, just with banking instead of physics) will tell you that it takes about three weeks to teach a decent programmer how to do this language as well.
But if you start out by teaching them the old school stuff, there is almost no way to get them up to speed on todays programming styles.So teach them how to do proper programming, and make specialized courses for those very few who needs the legacy languages.Bo Thorsen.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292381</id>
	<title>Power tool without any safeguards.</title>
	<author>Anonymous</author>
	<datestamp>1244730240000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>Though the newfangled Fortran 90 has introduced dynamic memory allocation, type testing and all the goodies found in C++ or C, the original Fortran up to Fortran 77 is essentially a power tool without any safety equipment.<p>

Static memory allocation, all arguments passed by reference all the time, no type checking anywhere, etc. lead to the function call overhead being a simple jmp instruction. Imagine a clueless physicist passing a large std::vector by value in C++: a simple oversight of missing an &amp; in the function declaration, and at every call a copy of the vector gets passed in and discarded at exit... There is no way for a higher-level language with all kinds of dynamic type testing, dynamic memory allocation and dynamic binding to be as fast as Fortran.</p><p>

I am <b>not</b> arguing Fortran is <b>better</b> in any way. It is very easy to create very-hard-to-debug bugs in Fortran, and maintenance is a nightmare. But once the bugs have been ironed out and the code has been trusted, it is hard to beat Fortran using modern languages. Heck, Fortran is nothing but one small step above assembly language. It comes with all the benefits and disadvantages of assembly language. </p></htmltext>
<tokenext>Though the newfangled Fortran 90 has introduced dynamic memory allocation , type testing and all the goodies found in C + + or c , the original fortran up to Fortran77 is essentially a powertool without any safety equipment .
Static memory allocation , all arguments pass by reference all the time , no type checkin anywhere etc lead to a function call overhead being a simple jmp instruction .
Imagine a clueless physicist passing a large std : : vector by value in C + + .
A simple oversight of missing an &amp; in the function declaration .
At every call a copy of the vector gets passed in and discarded at exit... There is no way a higher level language with all kinds of dynamic type testing and dynamic memory allocation and dynamic binding to be as fast as Fortran .
I am not arguing Fortran is better in anyway .
It is very easy to create very hard to debug bugs in fortran and maintenance is a nightmare .
But once the bugs have been ironed out and the code has been trusted , it is hard to beat fortran using modern languages .
Heck , fortran is nothing but one small step above assembly language .
It comes with all the benefits and disadvantages of assembly language .</tokentext>
<sentencetext>Though the newfangled Fortran 90 has introduced dynamic memory allocation, type testing and all the goodies found in C++ or c, the original fortran up to Fortran77 is essentially a powertool without any safety equipment.
Static memory allocation, all arguments pass by reference all the time, no type checkin anywhere etc lead to a function call overhead being a simple jmp instruction.
Imagine a clueless physicist passing a large std::vector by value in C++.
A simple oversight of missing an &amp; in the function declaration.
At every call a copy of the vector gets passed in and discarded at exit... There is no way a higher level language with all kinds of dynamic type testing and dynamic memory allocation and dynamic binding to be as fast as Fortran.
I am not arguing Fortran is better in anyway.
It is very easy to create very hard to debug bugs in fortran and maintenance is a nightmare.
But once the bugs have been ironed out and the code has been trusted, it is hard to beat fortran using modern languages.
Heck, fortran is nothing but one small step above assembly language.
It comes with all the benefits and disadvantages of assembly language. </sentencetext>
</comment>
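The silent-copy-at-every-call overhead described above is a C++ example, but it has a loose analogue worth seeing in NumPy, the thread's other recurring tool: slicing returns a view that shares the caller's buffer, while `.copy()` duplicates it the way pass-by-value would. A small sketch (mine, not the commenter's):

```python
import numpy as np

# View vs. copy of a 1M-element array.
a = np.zeros(1_000_000)

view = a[:]       # no data copied; writes are visible to the caller
copy = a.copy()   # full duplicate, like C++ pass-by-value

view[0] = 1.0
copy[1] = 2.0

print(a[0])  # 1.0 -- the view aliased the original buffer
print(a[1])  # 0.0 -- the copy did not touch it
```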
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28302227</id>
	<title>Re:Perl or Python</title>
	<author>Anonymous</author>
	<datestamp>1244723340000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>In shell:<br>sed 's/,//g' filename &gt; temp<br>mv temp filename</p><p>In Fortran:<br>open(70,file='filename')<br>do i = 1, bound1<br>
&nbsp; read(70,*) array(1:bound2,i)<br>enddo<br>close(70)<br>write(format,'(A1,I5,A6)') '(',bound2,'F12.8)'<br>do i = 1, bound1<br>
&nbsp; write(*,format) array(1:bound2,i)<br>enddo</p><p>Hm.... 9 lines. And I even formatted the output! Okay, I skipped the variable declarations (but I could use implicit types... Hey, just kidding. Please, put down the gun!)</p><p>Yes, it gets nasty pretty fast for complicated stuff. You can't do easy splits, search-and-replace, and whatnot. Who cares? Fortran is the king of numbers. If I want to play fancy games with strings, I use perl.</p><p>And I would argue that someone who has mastered Fortran's IO has a very, very good grasp of how to format and read files so that there is never, ever something unexpected coming in - which is a good thing. The unlimited SQL injection vulnerabilities on the 'net stand witness to that.</p></htmltext>
<tokenext>In shell : sed 's/,//gc ' filename &gt; tempmv temp filenameIn Fortran : open ( 70,file = 'filename ' ) do i = 1 , bound1   read ( 70 , * ) array ( 1 : bound2,i ) enddoclose ( 70 ) write ( format, ' ( A1,I5,A6 ) ' ) ' ( ',bound2,'F12.8 ) 'do i = 1 , bound1   write ( * ,format ) array ( 1 : bound2,i ) enddoHm.... 9 lines .
And I even formatted the output !
Okay , I skipped variable declaration ( but I could use implicit types... Hey , just kidding .
Please , put down the gun !
) Yes , it gets nasty pretty fast for complicated stuff .
You ca n't do easy splits , search-and-replace , and whatnot .
Who cares ?
Fortran is the king of numbers .
If I want to play fancy games with strings , I use perl.And I would argue that someone who has mastered Fortrans IO has a very , very good grasp on how to format and read files so that there is never , ever something unexpected coming in - which is a good thing .
Unlimited SQL injection vulnerabilities on the 'net stand witness for that .</tokentext>
<sentencetext>In shell:sed 's/,//gc' filename &gt; tempmv temp filenameIn Fortran:open(70,file='filename')do i = 1, bound1
  read(70,*) array (1:bound2,i)enddoclose(70)write(format,'(A1,I5,A6)') '(',bound2,'F12.8)'do i = 1, bound1
  write(*,format) array(1:bound2,i)enddoHm.... 9 lines.
And I even formatted the output!
Okay, I skipped variable declaration (but I could use implicit types... Hey, just kidding.
Please, put down the gun!
)Yes, it gets nasty pretty fast for complicated stuff.
You can't do easy splits, search-and-replace, and whatnot.
Who cares?
Fortran is the king of numbers.
If I want to play fancy games with strings, I use perl.And I would argue that someone who has mastered Fortrans IO has a very, very good grasp on how to format and read files so that there is never, ever something unexpected coming in - which is a good thing.
Unlimited SQL injection vulnerabilities on the 'net stand witness for that.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28299473</parent>
</comment>
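For comparison with the shell one-liner above (delete every comma in a file, in place), the same job in Python. The file contents here are made-up sample data, not anything from the thread:

```python
import os
import tempfile

# Create a throwaway file standing in for 'filename'.
with tempfile.NamedTemporaryFile("w", suffix=".dat", delete=False) as f:
    f.write("1.0,2.0,3.0\n4.0,5.0,6.0\n")
    name = f.name

# Equivalent of: sed 's/,//g' filename > temp && mv temp filename
with open(name) as f:
    text = f.read().replace(",", "")

with open(name, "w") as f:
    f.write(text)

print(text)
os.unlink(name)
```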
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292523</id>
	<title>Re:FORTRAN is better than Python</title>
	<author>jgtg32a</author>
	<datestamp>1244730720000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>OOP hasn't failed; it has its uses. It works great if you want to create a user interface. The only problem is that people are convinced that everything needs to be an object.</htmltext>
<tokenext>OOP has n't failed it has its uses it works great if you want to create a user interface , the only problem is that people are convinced that everything needs to be an object</tokentext>
<sentencetext>OOP hasn't failed it has its uses it works great if you want to create a user interface, the only problem is that people are convinced that everything needs to be an object</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292063</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293277</id>
	<title>Re:libraries. gigabytes of libraries</title>
	<author>UID30</author>
	<datestamp>1244733180000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext>Awwww c'mon.  This is just plain silly.  Since the late 80s, "Fortran" on most major computing platforms has been nothing more than a front-end language parser for a multi-pass compiler system<nobr> <wbr></nobr>... just like "C" and "Pascal".  Whatever language you choose, they all pass their assembly output to the same back-end assembler, and the binary machine code generated is pretty generic.
<br> <br>
Back when I was in college, I maintained a Fortran77 program that was a custom built TCP/IP client-server system.  But wait!  F77 didn't know what a socket was!  right.  The network code was written in C and compiled into object code which was directly linked into the F77 project.
<br> <br>
Great.  So there are these massive libraries written in Fortran to do wonderful things.  Best case scenario is you can link them directly into your language of choice.  Worst case, call them from the scripted language of your choice with a wrapper<nobr> <wbr></nobr>... <a href="http://www.swig.org/" title="swig.org">Swig</a> [swig.org] anyone?
<br> <br>
Bottom line?  Program in what you are comfortable with.  Would your peers frown on your efforts if you learned anything but ALGOL?  Fine.  Use ALGOL.  There are valuable lessons to be learned in any language.  Strong vs. weak typing, functional vs. object-oriented, structure, best practices<nobr> <wbr></nobr>... hell, how to write "fast" code.  I've been a programmer for nearly 20 years and I'm still learning that lesson on a daily basis.<p><div class="quote"><p>The surest way to corrupt a youth is to instruct him to hold in higher
esteem those who think alike than those who think differently.
  - Nietzsche</p></div></div>
	</htmltext>
<tokenext>Awwww c'mon .
This is just plain silly .
Since the late 80s , " Fortran " on most major computing platforms has been nothing more than front end language parser for a multi-pass compiler system ... just like " C " and " Pascal " .
Whatever language you choose , they all pass their assembly output to the same back-end assembler , and binary machine code generated is pretty generic .
Back when I was in college , I maintained a Fortran77 program that was a custom built TCP/IP client-server system .
But wait !
F77 did n't know what a socket was !
right. The network code was written in C and compiled into object code which was directly linked into the F77 project .
Great. So there are these massive libraries written in Fortran to do wonderful things .
Best case scenario is you can link them directly into your language of choice .
Worst case , call them from the scripted language of your choice with a wrapper ... Swig [ swig.org ] anyone ?
Bottom line ?
Program in what you are comfortable with .
Would your peers would frown on your efforts if you learned anything but ALGOL ?
Fine. Use ALGOL .
There are valuable lessons to be learned in any language .
Strong vs weak typed , functional vs object oriented , structure , best practices ... hell , how to write " fast " code .
I 've been a programmer for near 20 years and I 'm still learning that lesson on a daily basis.The surest way to corrupt a youth is to instruct him to hold in higher esteem those who think alike than those who think differently .
- Nietzsche</tokentext>
<sentencetext>Awwww c'mon.
This is just plain silly.
Since the late 80s, "Fortran" on most major computing platforms has been nothing more than front end language parser for a multi-pass compiler system ... just like "C" and "Pascal".
Whatever language you choose, they all pass their assembly output to the same back-end assembler, and binary machine code generated is pretty generic.
Back when I was in college, I maintained a Fortran77 program that was a custom built TCP/IP client-server system.
But wait!
F77 didn't know what a socket was!
right.  The network code was written in C and compiled into object code which was directly linked into the F77 project.
Great.  So there are these massive libraries written in Fortran to do wonderful things.
Best case scenario is you can link them directly into your language of choice.
Worst case, call them from the scripted language of your choice with a wrapper ... Swig [swig.org] anyone?
Bottom line?
Program in what you are comfortable with.
Would your peers would frown on your efforts if you learned anything but ALGOL?
Fine.  Use ALGOL.
There are valuable lessons to be learned in any language.
Strong vs weak typed, functional vs object oriented, structure, best practices ... hell, how to write "fast" code.
I've been a programmer for near 20 years and I'm still learning that lesson on a daily basis.The surest way to corrupt a youth is to instruct him to hold in higher
esteem those who think alike than those who think differently.
- Nietzsche
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919</parent>
</comment>
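The "link the massive Fortran libraries into your language of choice" point above is exactly what NumPy does: its `linalg` routines are thin wrappers over LAPACK, the classic Fortran linear-algebra library. A minimal sketch (my example, not the commenter's):

```python
import numpy as np

# np.linalg.eigvalsh dispatches to LAPACK's symmetric eigensolver,
# so this one-line Python call is ultimately running battle-tested
# Fortran-lineage code.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

eigvals = np.linalg.eigvalsh(A)  # returned in ascending order
print(eigvals)
```

For this 2x2 symmetric matrix the eigenvalues are (7 ± √5)/2, which is an easy hand-check on what the library returns.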
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295547</id>
	<title>What's the purpose?</title>
	<author>JeffDeMello</author>
	<datestamp>1244741460000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>If the purpose is to show how a computer really works<nobr> <wbr></nobr>... teach an assembly language course<nobr> <wbr></nobr>... have the last project be to write a simple device driver.  That will teach a person to appreciate what a computer REALLY does!

If the goal is to teach Fortran for modern programming<nobr> <wbr></nobr>... not worth it  (I can say this after programming in Fortran from '78 - '90 !!!).</htmltext>
<tokenext>If the purpose is to show how a computer really works ... teach an assembly language course ... have the last project be to write a simple device driver .
That will teach a person to appreciate what a computer REALLY does !
If the goal is to teach Fortran for modern programming ... not worth it ( I can say this after programming in Fortran from '78 - '90 ! ! !
) .</tokentext>
<sentencetext>If the purpose is to show how a computer really works ... teach an assembly language course ... have the last project be to write a simple device driver.
That will teach a person to appreciate what a computer REALLY does!
If the goal is to teach Fortran for modern programming ... not worth it  (I can say this after programming in Fortran from '78 - '90 !!!
).</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28306297</id>
	<title>C is not so bad either</title>
	<author>Anonymous</author>
	<datestamp>1244811000000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I think one should take into account that someone who decides to study chemistry or physics has a specific "mind-set".  She wants to know how things work.  And for every little detail in the explanation of how things work, she wants to know how \_those\_ work.</p><p>If you use a programming language which is close to the machine, it's easier to explain why things are the way they are.</p><p>At my university, the introduction to computer programming was given by the CS department, and they used a special "educational language" where MAXINT was 10000.  This was deliberately chosen because the real MAXINT (4G) would look arbitrary and weird, and thus scare the students.</p><p>All the physicists, chemists, and mathematicians immediately fell over this number.  10000!  What a wonderful and weird coincidence that "the largest number" would be a nice power of ten!   All of them had to have it explained: "Well, that was just chosen as an artificial limit.  The real number is 2^32, since this is a 32 bit machine."  And they all went "Ahh, of course."</p><p>If you balance it all out, I still think C is not so bad a choice for a first programming language.  Disclaimer: I wrote an introductory computer programming book. (http://www.curly-brace.com)</p></htmltext>
<tokenext>I think one should take into account that someone who decides to study chemistry or physics has a specific " mind-set " .
She wants to know how things work .
And for every little detail in the explanation of how things work , she wants to know how \ _those \ _ work.If you use a programming which is close to the machine , it 's easier to explain why things are they way they are.At my university , the introduction to computer programming was given by the CS department , and they used a special " educational language " where MAXINT was 10000 .
This was deliberately chosen because the real MAXINT ( 4G ) would look arbitrary and weird , and thus scare the students.All the physics , chemists , and mathematicians immediately fell over this number .
10000 ! What a wonderful and weird coincidence that " the largest number " would be a nice power of ten !
All of them had to be explained " Well that was just chosen as an artificial limit .
The real number is 2 ^ 32 , since this is a 32 bit machine .
" And they all went " Ahh , of course .
" If you balance it all out , I still think C is not so bad a choice for a first programming language .
Disclaimer : I wrote an introductionary computer programming book .
( http : //www.curly-brace.com )</tokentext>
<sentencetext>I think one should take into account that someone who decides to study chemistry or physics has a specific "mind-set".
She wants to know how things work.
And for every little detail in the explanation of how things work, she wants to know how \_those\_ work.If you use a programming which is close to the machine, it's easier to explain why things are they way they are.At my university, the introduction to computer programming was given by the CS department, and they used a special "educational language" where MAXINT was 10000.
This was deliberately chosen because the real MAXINT (4G) would look arbitrary and weird, and thus scare the students.All the physics, chemists, and mathematicians immediately fell over this number.
10000!  What a wonderful and weird coincidence that "the largest number" would be a nice power of ten!
All of them had to be explained "Well that was just chosen as an artificial limit.
The real number is 2^32, since this is a 32 bit machine.
"  And they all went "Ahh, of course.
"If you balance it all out, I still think C is not so bad a choice for a first programming language.
Disclaimer: I wrote an introductionary computer programming book.
(http://www.curly-brace.com)</sentencetext>
</comment>
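An editorial aside on the figure in the comment above: "4G" and "2^32" describe the unsigned range of a 32-bit word; the signed maximum (C's INT_MAX on a 32-bit machine) is 2^31 - 1. A minimal Python sketch of the arithmetic (constant names are ours, for illustration):

```python
# Integer limits on a 32-bit machine: the "4G" figure in the comment
# is the unsigned range; a signed 32-bit int tops out at 2**31 - 1.
UNSIGNED_MAX_32 = 2**32 - 1   # 4294967295, ~4G distinct values
SIGNED_MAX_32 = 2**31 - 1     # 2147483647, C's INT_MAX on 32-bit
SIGNED_MIN_32 = -(2**31)      # -2147483648

print(UNSIGNED_MAX_32)  # 4294967295
print(SIGNED_MAX_32)    # 2147483647
```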
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292273</id>
	<title>Schools don't teach basics anymore</title>
	<author>gbr</author>
	<datestamp>1244729880000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>They should be taught SOMETHING.  I recently saw a case where a new programmer was asked to add CR/LF to the end of his text strings.  Well, when we saw the code, he had done exactly as asked.  Each string had "CR/LF" (the string literal) added to the end.</p></htmltext>
<tokenext>They should be taught SOMETHING .
I recently saw a case where a new programmer was asked to add CR/LF to the end of his text strings .
Well , when we saw the code , he had done exactly as asked .
Each string had " CR/LF " ( the string literal ) added to the end .</tokentext>
<sentencetext>They should be taught SOMETHING.
I recently saw a case where a new programmer was asked to add CR/LF to the end of his text strings.
Well, when we saw the code, he had done exactly as asked.
Each string had "CR/LF" (the string literal) added to the end.</sentencetext>
</comment>
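The mix-up described above is easy to reproduce in any language; a minimal Python sketch (function names invented for illustration):

```python
# The bug from the anecdote: appending the literal text "CR/LF"
# instead of the carriage-return/line-feed control characters.
def terminate_wrong(s: str) -> str:
    return s + "CR/LF"        # adds five visible characters

def terminate_right(s: str) -> str:
    return s + "\r\n"         # adds the two control characters

print(repr(terminate_wrong("hello")))   # 'helloCR/LF'
print(repr(terminate_right("hello")))   # 'hello\r\n'
```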
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296349</id>
	<title>If you're not a CS major</title>
	<author>Anonymous</author>
	<datestamp>1244744280000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>they aren't going to teach you PROGRAMMING.   They will teach you CODING, and probably do that pretty poorly.</p></htmltext>
<tokenext>they are n't going to teach you PROGRAMMING .
They will teach you CODING , and probably do that pretty poorly .</tokentext>
<sentencetext>they aren't going to teach you PROGRAMMING.
They will teach you CODING, and probably do that pretty poorly.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291983</id>
	<title>I learned C in engineering undergrad</title>
	<author>Anonymous</author>
	<datestamp>1244728680000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I was never formally taught to use Fortran, just C. I think this worked best for me since this seemed to be the vanilla of languages and it gave me what I needed, a solid understanding of how and why to program. I did not need to know the intricacies of a specific language, that comes later depending on:</p><p>A) Where you end up.<br>B) What you end up doing.</p><p>So really, the language isn't the point, but more so the techniques of programming. Early courses should be basic, but challenging in that you need to know how to work with the power of number crunching and programming is the tool.</p><p>As an aside, I've later taught myself Perl and Fortran as needed and knowing the fundamentals helped.</p></htmltext>
<tokenext>I was never formally taught to use Fortran , just C. I think this worked best for me since this seemed to be the vanilla of languages and it gave me what I needed , a solid understanding of how and why to program .
I did not need to know the intricacies of a specific language , that comes later depending on : A ) Where you end up.B ) What you end up doing.So really , the language is n't the point , but more so the techniques of programming .
Early courses should be basic , but challenging in that you need to know how to work with the power of number crunching and programming is the tool.As an aside , I 've later taught myself Perl and Fortran as needed and knowing the fundamentals helped .</tokentext>
<sentencetext>I was never formally taught to use Fortran, just C. I think this worked best for me since this seemed to be the vanilla of languages and it gave me what I needed, a solid understanding of how and why to program.
I did not need to know the intricacies of a specific language, that comes later depending on:A) Where you end up.B) What you end up doing.So really, the language isn't the point, but more so the techniques of programming.
Early courses should be basic, but challenging in that you need to know how to work with the power of number crunching and programming is the tool.As an aside, I've later taught myself Perl and Fortran as needed and knowing the fundamentals helped.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293047</id>
	<title>Re:While there may be "newer" languages</title>
	<author>mdwh2</author>
	<datestamp>1244732460000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I agree entirely. But mod parent up? Surely he was saying the very thing you were arguing against, by saying Fortran was one of the best, and then saying Python was bad at it?</p></htmltext>
<tokenext>I agree entirely .
But mod parent up ?
Surely he was saying the very thing you were arguing against , by saying Fortran was one of the best , and then saying Python was bad at it ?</tokentext>
<sentencetext>I agree entirely.
But mod parent up?
Surely he was saying the very thing you were arguing against, by saying Fortran was one of the best, and then saying Python was bad at it?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292357</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297603</id>
	<title>Re:PYTHON????</title>
	<author>dodobh</author>
	<datestamp>1244748540000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext><p>Think of Fortran as a specialist little language/DSL. Most people don't learn many DSLs in university (except UML and SQL).</p><p>Fortran is a DSL for number crunching and matrix algebra.</p></htmltext>
<tokenext>Think of Fortran as a specialist little language/DSL .
Most people do n't learn many DSLs in university ( except UML and SQL ) .Fortran is a DSL for number crunching and matrix algebra .</tokentext>
<sentencetext>Think of Fortran as a specialist little language/DSL.
Most people don't learn many DSLs in university (except UML and SQL).Fortran is a DSL for number crunching and matrix algebra.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293041</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292509</id>
	<title>Python?</title>
	<author>Anonymous</author>
	<datestamp>1244730660000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Who programs in Python anymore? ...Oh wait...<br>I forgot I had my time travel suit on. Sorry about that.<br>Oh, of course Python.  Good choice.</p></htmltext>
<tokenext>Who programs in Python anymore ?
...Oh wait...I forgot I had my time travel suit on .
Sorry about that.Oh , of course Python .
Good choice .</tokentext>
<sentencetext>Who programs in Python anymore?
...Oh wait...I forgot I had my time travel suit on.
Sorry about that.Oh, of course Python.
Good choice.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28299291</id>
	<title>Opinion of expert Edsger Wybe Dijkstra</title>
	<author>Device666</author>
	<datestamp>1244711340000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Apparently not, according to <a href="http://en.wikipedia.org/wiki/Edsger_Dijkstra" title="wikipedia.org">Edsger Wybe Dijkstra</a> [wikipedia.org] (a famous computer scientist and influential pioneer); these are <a href="http://en.wikiquote.org/wiki/Edsger_W._Dijkstra" title="wikiquote.org">his quotes about Fortran</a> [wikiquote.org]:<div class="quote"><ul>
<li>FORTRAN's tragic fate has been its wide acceptance, mentally chaining thousands and thousands of programmers to our past mistakes.</li><li>When FORTRAN has been called an infantile disorder, full PL/1, with its growth characteristics of a dangerous tumor, could turn out to be a fatal disease.</li><li>FORTRAN, 'the infantile disorder', by now nearly 20 years old, is hopelessly inadequate for whatever computer application you have in mind today: it is now too clumsy, too risky, and too expensive to use.</li><li>In the good old days physicists repeated each other's experiments, just to be sure. Today they stick to FORTRAN, so that they can share each other's programs, bugs included.</li></ul></div>
	</htmltext>
<tokenext>Apparently not according Edsger Wybe Dijkstra [ wikipedia.org ] ( to famous computer scientist and influential pioneer ) , these are his quotes about Fortan [ wikiquote.org ] : FORTRAN 's tragic fate has been its wide acceptance , mentally chaining thousands and thousands of programmers to our past mistakes.When FORTRAN has been called an infantile disorder , full PL/1 , with its growth characteristics of a dangerous tumor , could turn out to be a fatal disease.FORTRAN , 'the infantile disorder ' , by now nearly 20 years old , is hopelessly inadequate for whatever computer application you have in mind today : it is now too clumsy , too risky , and too expensive to use.In the good old days physicists repeated each other 's experiments , just to be sure .
Today they stick to FORTRAN , so that they can share each other 's programs , bugs included .</tokentext>
<sentencetext>Apparently not according Edsger Wybe Dijkstra [wikipedia.org] (to famous computer scientist and influential pioneer), these are his quotes about Fortan [wikiquote.org]:
FORTRAN's tragic fate has been its wide acceptance, mentally chaining thousands and thousands of programmers to our past mistakes.When FORTRAN has been called an infantile disorder, full PL/1, with its growth characteristics of a dangerous tumor, could turn out to be a fatal disease.FORTRAN, 'the infantile disorder', by now nearly 20 years old, is hopelessly inadequate for whatever computer application you have in mind today: it is now too clumsy, too risky, and too expensive to use.In the good old days physicists repeated each other's experiments, just to be sure.
Today they stick to FORTRAN, so that they can share each other's programs, bugs included.
	</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292111</id>
	<title>The real problem: programming literacy</title>
	<author>Anonymous</author>
	<datestamp>1244729220000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>The greatest misconception about Fortran is this: although yes, it comes with heavily optimized compilers for number crunching, the real problem lies beyond that. The problem, as I have constantly witnessed for decades, is that the people who write Fortran usually have no basic knowledge of computer science, and e.g. have never even heard of algorithmic complexity, balanced search trees, hash tables, nearest-neighbor problems, and the like. So they write, spaghetti style, very efficient code, which is then turned into very efficient object code by the compiler, implementing utterly inefficient numerical methods. I can remember blank stares when I told some of them that, despite very clever space partitioning, their nearest-neighbor search was still O(n^2), and that switching to a binary tree would obviously gain so much that they would be able to run the whole thing on a Linux box instead of a Cray.<br>I'm not making this up.</p></htmltext>
<tokenext>The greatest misconception about fortran is this : although yes , it goes with heavily optimized compilers for number crunching , the real problem is beyond .
The problem , as I have constantly witnessed for decades , is that the people who write fortran usually have no basic knowledge of computer science , and e.g .
have never even heard of algorithmic complexity , balanced search trees , hash tables , nearest-neighbor problems , and the like .
So they write , spaghetti style , very efficient code , which is then turned into very efficient object code by the compiler , implementing utterly inefficient numerical methods .
I can remember blank stares when I told some of them that , despite very clever space partitioning , their nearest-neighbor search was still O ( n ^ 2 ) , and that switching to a binary tree would obviously gain so much that they would be able to run the whole thing on a Linux box instead of a Cray.I 'm not making this up .</tokentext>
<sentencetext>The greatest misconception about fortran is this: although yes, it goes with heavily optimized compilers for number crunching, the real problem is beyond.
The problem, as I have constantly witnessed for decades, is that the people who write fortran usually have no basic knowledge of computer science, and e.g.
have never even heard of algorithmic complexity, balanced search trees, hash tables, nearest-neighbor problems, and the like.
So they write, spaghetti style, very efficient code, which is then turned into very efficient object code by the compiler, implementing utterly inefficient numerical methods.
I can remember blank stares when I told some of them that, despite very clever space partitioning, their nearest-neighbor search was still O(n^2), and that switching to a binary tree would obviously gain so much that they would be able to run the whole thing on a Linux box instead of a Cray.I'm not making this up.</sentencetext>
</comment>
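The speedup the commenter describes can be sketched in one dimension, where a sorted array queried by binary search stands in for the balanced tree (function names invented for illustration):

```python
# Brute-force nearest neighbour is O(n) per query, so O(n^2) for n
# queries; binary search over a sorted array (a stand-in for a
# balanced tree) is O(log n) per query.
import bisect

def nearest_brute(points, q):
    # scan every point: O(n) per query
    return min(points, key=lambda p: abs(p - q))

def nearest_sorted(sorted_points, q):
    # binary search, then compare the two bracketing points: O(log n)
    i = bisect.bisect_left(sorted_points, q)
    candidates = sorted_points[max(0, i - 1):i + 1]
    return min(candidates, key=lambda p: abs(p - q))

pts = [7.0, 1.5, 3.2, 9.8, 4.4]
spts = sorted(pts)
# both strategies agree on every query point
assert all(nearest_brute(pts, q) == nearest_sorted(spts, q)
           for q in [0.0, 3.0, 5.1, 12.0])
```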
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293261</id>
	<title>Re:PYTHON????</title>
	<author>Anonymous</author>
	<datestamp>1244733180000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p><i>If you want a surprise, go look at how LISP stacks up compared to C. It is better than you think</i></p><p>This many times has to do with the alg you are using.  Then *KNOWING* what your compiler will do.  I have seen a few of these comparisons over the years.  Many times the example they start with was optimized for a particular language.  Be careful of these comparisons; many times people are just trying to prove a point that 'X is better than Y language'.</p><p>What do I mean by that?  You can write horrible code in all languages.  But it takes the thing holding your ears apart to make it 'good' code.  Many programmers have gotten lazy and/or do not know how to do it.  Many times you ask them to optimize something and they give you a deer-in-the-headlights look.</p><p>It is all about O(N).  What algo is applied to do something.  Then how good is your compiler at recognizing that you are using something that it can apply its optimized pattern for.  Many people treat compilers as some sort of magic tool.  They are not.  They are just pattern interpreters.  Some compilers are bad at it and you have to force the issue.  Either with hand assembly or crazy-looking code.</p><p>With ideal virtual machines such as Python, Java, .NET you get an intermediate interpreter.  You are at the mercy of how good 2 things are: the interpreter (compiler) and the ideal virtual machine (JVM, CLR, etc.).</p><p>People like to claim that 'X language is better than Y language because it is faster'.  Which is hogwash.  The real crux of the problem is how good the compiler is.</p><p>To give an idea of what I am talking about, go look at the comparisons of IronPython vs Python.  Same language, different backends.  IronPython is beating Python in many cases; in others it is way slower.  For the SAME code.</p><p>I also find many times when people claim a language is 'faster/slower' they are really talking about the 'standard' library that goes along with many languages.  That the FORTRAN one is fastest is not too surprising.  It was built in a resource-constrained environment where speed was critical.  So its 'standard' libraries are optimized for machines that had a couple of meg of memory and a 2-4 MHz proc if you were lucky.  It is like trying to play an early-80s IBM PC game on today's hardware.  They are usually impossible to play as they are crazy hand-optimized to run on slow hardware.</p></htmltext>
<tokenext>If you want a surprise , go look at how LISP stacks up compared to C. It is better than you thinkThis many times has to do with the alg you are using .
Then * KNOWING * what your compiler will do .
I have seen a few of these comparisons over the years .
Many times the example they start with was optimized for a particular language .
Be careful of these comparisons many times people are just trying to prove a point that 'X is better than Y language'.What do I mean by that ?
You can write horrible code in all languages .
But it takes the thing holding your ears apart to make it 'good ' code .
Many programmers have gotten lazy and or do not know how to do it .
Many times you ask them to optimize something and they give you a deer in the headlights look.It is all about O ( N ) .
What algo is applied to do something .
Then how good is your compiler at recognizing that you are using something that it can apply its optimized pattern for .
Many people treat compilers as some sort of magic tool .
They are not .
They are just pattern interpreters .
Some compilers are bad at it and you have to force the issue .
Either with hand assembly or crazy looking code.With ideal virtual machines such as Python , Java , .NET you get a intermediate interpreter .
You are at the mercy of how good 2 things are , the interpreter ( compiler ) and the ideal virtual machine ( JVM,CLR,etc... ) .People like to claim that 'X language is better than Y language because it is faster' .
Which is hogwash .
The real crux of the problem is how good is the compiler.To give an idea of what I am talking about go look at the comparisons of IronPython vs Python .
Same language different backends .
IronPython is beating Python in many cases in others it is way slower .
For the SAME code.I also find many times when people claim a language is 'faster/slower ' they are really talking about the 'standard ' library that go along with many languages .
That the FORTRAN one is fastest is not too surprising .
It was built in a resource constrained environment where speed was critical .
So its 'standard ' libraries are optimized for machines that had a couple of meg of memory and a 2-4mhz proc if you were lucky .
It is like trying to play a early 80s IBMPC game on todays hardware .
They are usually impossible to play as they are crazy hand optimized to run on slow hardware .</tokentext>
<sentencetext>If you want a surprise, go look at how LISP stacks up compared to C. It is better than you thinkThis many times has to do with the alg you are using.
Then *KNOWING* what your compiler will do.
I have seen a few of these comparisons over the years.
Many times the example they start with was optimized for a particular language.
Be careful of these comparisons many times people are just trying to prove a point that 'X is better than Y language'.What do I mean by that?
You can write horrible code in all languages.
But it takes the thing holding your ears apart to make it 'good' code.
Many programmers have gotten lazy and or do not know how to do it.
Many times you ask them to optimize something and they give you a deer in the headlights look.It is all about O(N).
What algo is applied to do something.
Then how good is your compiler at recognizing that you are using something that it can apply its optimized pattern for.
Many people treat compilers as some sort of magic tool.
They are not.
They are just pattern interpreters.
Some compilers are bad at it and you have to force the issue.
Either with hand assembly or crazy looking code.With ideal virtual machines such as Python, Java, .NET you get a intermediate interpreter.
You are at the mercy of how good 2 things are, the interpreter (compiler) and the ideal virtual machine (JVM,CLR,etc...).People like to claim that 'X language is better than Y language because it is faster'.
Which is hogwash.
The real crux of the problem is how good is the compiler.To give an idea of what I am talking about go look at the comparisons of IronPython vs Python.
Same language different backends.
IronPython is beating Python in many cases in others it is way slower.
For the SAME code.I also find many times when people claim a language is 'faster/slower' they are really talking about the 'standard' library that go along with many languages.
That the FORTRAN one is fastest is not too surprising.
It was built in a resource constrained environment where speed was critical.
So its 'standard' libraries are optimized for machines that had a couple of meg of memory and a 2-4mhz proc if you were lucky.
It is like trying to play a early 80s IBMPC game on todays hardware.
They are usually impossible to play as they are crazy hand optimized to run on slow hardware.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292205</id>
	<title>no....it's all about the architecture, people</title>
	<author>NickBlack</author>
	<datestamp>1244729700000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>FORTRAN is still used so heavily because

a) the architectures used for HPC are some of the rarest, most singularly-purposed machines on earth -- your Crays, your J-Machines, your Cell BEs. These are not the commodity architectures for which gcc is designed; in many cases, gcc support might not even exist. In any case, it's certainly not going to have the intense and often, frankly, bizarre optimizations necessary to properly make use of these machines. Does Python even allow one to manage cache prefetching, SIMD, the floating point model, etc?

b) FORTRAN's easily parallelized by compilers -- even more so than C with OpenMPI, etc -- due to the simplicity of its loop constructs and a rich heritage of automatically parallelizing these constructs (look at Ken Kennedy's work, etc.; Google for DOALL and DOACROSS). For HPC, you're often working with massively parallel architectures, often SIMD-heavy. Check out the Banerjee inequality sometime if you want to see something cool.

c) Vector notation in FORTRAN is easily compiled into vector registers and operations. Python slices could probably be amenable to this, but Python isn't really...

d) Array-oriented. Everything in the microarchitecture world is designed around fast access to and operations on arrays. If you're not working in the array model, it's gonna be hard to compile it in a fashion that one can make use of architectural features designed for high performance.

Language research is ongoing that seeks to address these issues. Python doesn't get there.</htmltext>
<tokenext>FORTRAN is still used so heavily because a ) the architectures used for HPC are some of the rarest , most singularly-purposed machines on earth -- your Crays , your J-Machines , your Cell BE " s. These are not the commodity architectures for which gcc is designed ; in many cases , gcc support might not even exist .
In any case , it 's certainly not going to have the intense and often , frankly , bizarre optimizations necessary to properly make use of these machines .
Does python even allow one to manage cache prefetching , SIMD , the floating point model , etc ?
b ) FORTRAN 's easily parallelized by compilers -- even moreso than C with OpenMPI , etc -- due to the simplicity of its loop constructs and a rich heritage of automatically parallelizing these constructs ( look at Ken Kennedy 's work etc .
google for DOALL and DOACROSS ) .
For HPC , you 're often working with massively parallel architectures , often SMID-heavy .
Check out the Banerjee inequality sometime if you want to see something cool .
c ) Vector notation in FORTRAN is easily compiled into vector registers and operations .
Python slices could probably be amenable to this , but Python is n't really .. . d ) Array-oriented .
Everything in the microarchitecture world is designed around fast access to and operations on arrays .
If you 're not working in the array model , it 's gon na be hard to compile it in a fashion that one can make use of architectural features designed for high performance .
Language research is ongoing that seeks to address these issues .
Python does n't get there .</tokentext>
<sentencetext>FORTRAN is still used so heavily because

a) the architectures used for HPC are some of the rarest, most singularly-purposed machines on earth -- your Crays, your J-Machines, your Cell BE"s. These are not the commodity architectures for which gcc is designed; in many cases, gcc support might not even exist.
In any case, it's certainly not going to have the intense and often, frankly, bizarre optimizations necessary to properly make use of these machines.
Does python even allow one to manage cache prefetching, SIMD, the floating point model, etc?
b) FORTRAN's easily parallelized by compilers -- even moreso than C with OpenMPI, etc -- due to the simplicity of its loop constructs and a rich heritage of automatically parallelizing these constructs (look at Ken Kennedy's work etc.
google for DOALL and DOACROSS).
For HPC, you're often working with massively parallel architectures, often SMID-heavy.
Check out the Banerjee inequality sometime if you want to see something cool.
c) Vector notation in FORTRAN is easily compiled into vector registers and operations.
Python slices could probably be amenable to this, but Python isn't really ...

d) Array-oriented.
Everything in the microarchitecture world is designed around fast access to and operations on arrays.
If you're not working in the array model, it's gonna be hard to compile it in a fashion that one can make use of architectural features designed for high performance.
Language research is ongoing that seeks to address these issues.
Python doesn't get there.</sentencetext>
</comment>
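A small Python illustration of point c) above: Fortran's whole-array expression `a(:) = b(:) + c(:)` has no native list equivalent, which is part of why slices alone don't get you vectorization (variable names invented for illustration):

```python
# Elementwise addition of two arrays: in Fortran this is one
# whole-array expression a compiler can map onto vector hardware;
# in plain Python it is either a comprehension or an index loop.
b = [1.0, 2.0, 3.0]
c = [10.0, 20.0, 30.0]

# closest idiomatic Python to Fortran's  a(:) = b(:) + c(:)
a = [x + y for x, y in zip(b, c)]

# versus the explicit index loop a compiler would have to analyse
a2 = [0.0] * len(b)
for i in range(len(b)):
    a2[i] = b[i] + c[i]

assert a == a2 == [11.0, 22.0, 33.0]
```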
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294677</id>
	<title>Re:University != Trade school</title>
	<author>snl2587</author>
	<datestamp>1244738400000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>May as well teach them Excel macros and how to interact with Microsoft Clippy while you're at it.</p></div><p>Interesting that you mention that. When I took the undergraduate programming course for Chemical Engineers at my university, the class focused exclusively on Excel and VBA. That's it. And people in the class still had extensive trouble with it.</p><p>In an ideal sense I think you're right: an undergraduate in physics, chemistry, or engineering <em>should</em> be able to just take and extend concepts learned in any language to whatever their jobs require. In practice, though: have you met some of these undergraduates? While I firmly disagree with teaching to the lowest common denominator at any level, it's sometimes shocking to witness the difficulty they have even getting "Hello World" to run, let alone a simple N-R algorithm.</p>
	</htmltext>
<tokenext>May as well teach them Excel macros and how to interact with Microsoft Clippy while you 're at it.Interesting that you mention that .
When I took the undergraduate programming course for Chemical Engineers at my university , the class focused exclusively on Excel and VBA .
That 's it .
And people in the class still had extensive trouble with it.In an ideal sense I think you 're right : an undergraduate in physics , chemistry , or engineering should be able to just take and extend concepts learned in any language to whatever their jobs required .
In practice , though : have you met some of these undergraduates ?
While I firmly disagree with teaching to the lowest common denominator at any level , it 's sometimes shocking to witness the difficulty they have even getting " Hello World " to run , let alone a simple N-R algorithm .</tokentext>
<sentencetext>May as well teach them Excel macros and how to interact with Microsoft Clippy while you're at it.Interesting that you mention that.
When I took the undergraduate programming course for Chemical Engineers at my university, the class focused exclusively on Excel and VBA.
That's it.
And people in the class still had extensive trouble with it.In an ideal sense I think you're right: an undergraduate in physics, chemistry, or engineering should be able to just take and extend concepts learned in any language to whatever their jobs required.
In practice, though: have you met some of these undergraduates?
While I firmly disagree with teaching to the lowest common denominator at any level, it's sometimes shocking to witness the difficulty they have even getting "Hello World" to run, let alone a simple N-R algorithm.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293959</id>
	<title>Re:Python is not programming.</title>
	<author>EnglishTim</author>
	<datestamp>1244735820000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I'd be interested to know what the difference between 'scripting' and 'programming' is...</p></htmltext>
<tokentext>I 'd be interested to know what the difference between 'scripting ' and 'programming ' is.. .</tokentext>
<sentencetext>I'd be interested to know what the difference between 'scripting' and 'programming' is...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292033</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292239</id>
	<title>High Performance Computing</title>
	<author>mdmkolbe</author>
	<datestamp>1244729760000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>In the high performance computing and physical simulation world, Fortran and C are still king.  In large part this is because of the BLAS, LAPACK, MPI and OpenMP libraries which are the standard libraries to use for linear algebra and parallel programming and upon which most physical simulation software is built.  These libraries started out in Fortran but C ports are also widely available.  Thus a new engineering graduate is much more likely to need to work with Fortran and these libraries than with Python.</p><p>Also while most code written by scientists/engineers isn't written very efficiently (because they don't know algorithm design and analysis), it needs to be written in a language where someone (usually a CS person) can come in and clean up/tune the critical bottlenecks.  At that point the efficiency of the language does become important.  In this field, a 3% improvement in performance can make or break you.  (Yes, you can mix C and Python, but it's easier to stick to one language rather than having to keep marshaling back and forth.)</p></htmltext>
<tokentext>In the high performance computing and physical simulation world , Fortran and C are still king .
In large part this is because of the BLAS , LAPACK , MPI and OpenMP libraries which are the standard libraries to use for linear algebra and parallel programming and upon which most physical simulation software is built .
These libraries started out in Fortran but C ports are also widely available .
Thus a new engineering graduate is much more likely to need to work with Fortran and these libraries than with Python . Also while most code written by scientists/engineers is n't written very efficiently ( because they do n't know algorithm design and analysis ) , it needs to be written in a language where someone ( usually a CS person ) can come in and clean up/tune the critical bottlenecks .
At that point the efficiency of the language does become important .
In this field , a 3 % improvement in performance can make or break you .
( Yes , you can mix C and Python , but it 's easier to stick to one language rather than having to keep marshaling back and forth . )</tokentext>
<sentencetext>In the high performance computing and physical simulation world, Fortran and C are still king.
In large part this is because of the BLAS, LAPACK, MPI and OpenMP libraries which are the standard libraries to use for linear algebra and parallel programming and upon which most physical simulation software is built.
These libraries started out in Fortran but C ports are also widely available.
Thus a new engineering graduate is much more likely to need to work with Fortran and these libraries than with Python.
Also while most code written by scientists/engineers isn't written very efficiently (because they don't know algorithm design and analysis), it needs to be written in a language where someone (usually a CS person) can come in and clean up/tune the critical bottlenecks.
At that point the efficiency of the language does become important.
In this field, a 3% improvement in performance can make or break you.
(Yes, you can mix C and Python, but it's easier to stick to one language rather than having to keep marshaling back and forth.)</sentencetext>
</comment>
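The BLAS/LAPACK point above is also why the Python camp ends up riding the same Fortran libraries anyway: NumPy's dense solver dispatches to LAPACK's gesv routine under the hood. A minimal sketch (the 2x2 system is made up purely for illustration):

```python
# Sketch: numpy.linalg.solve calls LAPACK's gesv routine internally,
# so "Python" numerics still rest on the Fortran library stack described above.
import numpy as np

a = np.array([[3.0, 1.0],
              [1.0, 2.0]])      # toy coefficient matrix (made up)
b = np.array([9.0, 8.0])        # toy right-hand side

x = np.linalg.solve(a, b)       # LAPACK does the actual factorization

# the residual should be down at machine precision
residual = np.linalg.norm(a @ x - b)
```

The marshaling cost the parent mentions only bites when you cross the C/Python boundary per-element; calling a whole LAPACK solve in one shot amortizes it.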
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292943</id>
	<title>Re:Python is not programming.</title>
	<author>gestalt_n_pepper</author>
	<datestamp>1244732100000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Please explain the difference. Both are code interfaces that allow you to give instructions to a machine. One has a set of pre-built functions for math. The other uses libraries. Are you saying that I don't program because I don't manipulate memory directly or use "void virtual functions?" What's the story here?</htmltext>
<tokentext>Please explain the difference .
Both are code interfaces that allow you to give instructions to a machine .
One has a set of pre-built functions for math .
The other uses libraries .
Are you saying that I do n't program because I do n't manipulate memory directly or use " void virtual functions ? " What 's the story here ?</tokentext>
<sentencetext>Please explain the difference.
Both are code interfaces that allow you to give instructions to a machine.
One has a set of pre-built functions for math.
The other uses libraries.
Are you saying that I don't program because I don't manipulate memory directly or use "void virtual functions?"
What's the story here?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292033</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293473</id>
	<title>Re:</title>
	<author>Anonymous</author>
	<datestamp>1244733900000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Interesting. I just retired after 30+ years as a software engineer at a large R&amp;D lab. For many years now, the overwhelming majority of people we interviewed for jobs had learned Matlab in school, not Fortran or any other language. This applies at the undergraduate level, Masters, PhD, and post-docs. I can't think of a single resume that mentioned Fortran. I wonder what schools the author is thinking about?</p></htmltext>
<tokentext>Interesting .
I just retired after 30 + years as a software engineer at a large R&amp;D lab .
For many years now , the overwhelming majority of people we interviewed for jobs had learned Matlab in school , not Fortran or any other language .
This applies at the undergraduate level , Masters , PhD , and post-docs .
I ca n't think of a single resume that mentioned Fortran .
I wonder what schools the author is thinking about ?</tokentext>
<sentencetext>Interesting.
I just retired after 30+ years as a software engineer at a large R&amp;D lab.
For many years now, the overwhelming majority of people we interviewed for jobs had learned Matlab in school, not Fortran or any other language.
This applies at the undergraduate level, Masters, PhD, and post-docs.
I can't think of a single resume that mentioned Fortran.
I wonder what schools the author is thinking about?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28309371</id>
	<title>yaah right...</title>
	<author>neo110</author>
	<datestamp>1244826420000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>only if toddlers are taught Latin or Greek..</htmltext>
<tokentext>only if toddlers are taught Latin or Greek. .</tokentext>
<sentencetext>only if toddlers are taught Latin or Greek..</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292185</id>
	<title>Short answer, no.</title>
	<author>sizzzzlerz</author>
	<datestamp>1244729580000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext><p>While used extensively in a number of scientific research programs, it isn't used commercially in any great amount, if at all. Unless the student is planning to work in an area where the language is used, there aren't any great benefits to knowing it. College should be more about learning the discipline of software engineering, not learning a multitude of programming languages. C or Java serve that purpose perfectly well and have extensive use in the non-academic world. If it is needed at some point, learning it won't be terribly difficult if one is already conversant with other languages.</p><p>I used fortran 30 years ago, stopped using it 25 years ago, and, outside of a few PhDs who use it where I work, nobody uses it for anything.</p></htmltext>
<tokentext>While used extensively in a number of scientific research programs , it is n't used commercially in any great amount , if at all .
Unless the student is planning to work in an area where the language is used , there are n't any great benefits to knowing it .
College should be more about learning the discipline of software engineering , not learning a multitude of programming languages .
C or Java serve that purpose perfectly well and have extensive use in the non-academic world .
If it is needed at some point , learning it wo n't be terribly difficult if one is already conversant with other languages . I used fortran 30 years ago , stopped using it 25 years ago , and , outside of a few PhDs who use it where I work , nobody uses it for anything .</tokentext>
<sentencetext>While used extensively in a number of scientific research programs, it isn't used commercially in any great amount, if at all.
Unless the student is planning to work in an area where the language is used, there aren't any great benefits to knowing it.
College should be more about learning the discipline of software engineering, not learning a multitude of programming languages.
C or Java serve that purpose perfectly well and have extensive use in the non-academic world.
If it is needed at some point, learning it won't be terribly difficult if one is already conversant with other languages.
I used fortran 30 years ago, stopped using it 25 years ago, and, outside of a few PhDs who use it where I work, nobody uses it for anything.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297529</id>
	<title>Call me crazy</title>
	<author>Ohmaar</author>
	<datestamp>1244748300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Call me crazy, but shouldn't they be learning Chemistry, Physics and Engineering?</p></htmltext>
<tokentext>Call me crazy , but should n't they be learning Chemistry , Physics and Engineering ?</tokentext>
<sentencetext>Call me crazy, but shouldn't they be learning Chemistry, Physics and Engineering?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28315971</id>
	<title>Re:It's okay to teach them FORTRAN</title>
	<author>nobodie</author>
	<datestamp>1244813040000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>wrong direction. bring back some of those old ENIACs from Princeton where I learned machine code. That way they learn real programming/algorithm basics and none of this prissy drag and drop text programming. 1s and 0s will hone their little minds and give them the real foundation for a bright future. Think about the 40th anniversary of UNICS, the OS, a text editor and whatever the hell else he did in that month. that is what a machine code foundation can do for you. I mean, how long would it take to write UNICS in python?</htmltext>
<tokentext>wrong direction .
bring back some of those old ENIACs from Princeton where I learned machine code .
That way they learn real programming/algorithm basics and none of this prissy drag and drop text programming .
1s and 0s will hone their little minds and give them the real foundation for a bright future .
Think about the 40th anniversary of UNICS , the OS , a text editor and whatever the hell else he did in that month .
that is what a machine code foundation can do for you .
I mean , how long would it take to write UNICS in python ?</tokentext>
<sentencetext>wrong direction.
bring back some of those old ENIACs from Princeton where I learned machine code.
That way they learn real programming/algorithm basics and none of this prissy drag and drop text programming.
1s and 0s will hone their little minds and give them the real foundation for a bright future.
Think about the 40th anniversary of UNICS, the OS, a text editor and whatever the hell else he did in that month.
that is what a machine code foundation can do for you.
I mean, how long would it take to write UNICS in python?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294815</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292343</id>
	<title>I don't think it's too bad</title>
	<author>dublindan</author>
	<datestamp>1244730120000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>
While I agree that languages like Python may be better as an introductory language, Fortran is still heavily used in the scientific community, which these students will (theoretically) all eventually be part of. It makes, at least some, sense to teach the tools which they will be using. In any case, once you know one imperative language, it's not too difficult to pick up another. Syntax is easy to learn.
</p><p>
Personally, I think learning Fortran as a first language is better than learning Java as a first.
</p></htmltext>
<tokentext>While I agree that languages like Python may be better as an introductory language , Fortran is still heavily used in the scientific community , which these students will ( theoretically ) all eventually be part of .
It makes , at least some , sense to teach the tools which they will be using .
In any case , once you know one imperative language , it 's not too difficult to pick up another .
Syntax is easy to learn .
Personally , I think learning Fortran as a first language is better than learning Java as a first .</tokentext>
<sentencetext>
While I agree that languages like Python may be better as an introductory language, Fortran is still heavily used in the scientific community, which these students will (theoretically) all eventually be part of.
It makes, at least some, sense to teach the tools which they will be using.
In any case, once you know one imperative language, it's not too difficult to pick up another.
Syntax is easy to learn.
Personally, I think learning Fortran as a first language is better than learning Java as a first.
</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293323</id>
	<title>Re:I still use Fortran for scientific calculations</title>
	<author>NekSnappa</author>
	<datestamp>1244733360000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>My first engineering class after leaving the Marine Corps in '86 was a 3 credit hour class that met twice a week. The first class each week focused on engineering graphics (drafting), the second was Fortran 77 programming.
</p><p>The computers in the lab were, I believe, 286 based Epson machines with dual 5.25" drives, running MS-DOS. As an added bonus our "development environment" as you say now, was edlin!
</p><p>At that time I had no personal experience with computers. I didn't know the difference between the OS, the text editor, and the compiler. Yet I had no problems with picking up the concepts and learning how to use both the language and a general use computer. I guess it was unofficially a trial by fire computer literacy class too.
</p><p>Since then I've taught myself enough of a couple of different programming languages that I needed in the course of my work. Back in the day Lisp was the daddy for customizing and automating AutoCAD!</p></htmltext>
<tokentext>My first engineering class after leaving the Marine Corps in '86 was a 3 credit hour class that met twice a week .
The first class each week focused on engineering graphics ( drafting ) , the second was Fortran 77 programming .
The computers in the lab were , I believe , 286 based Epson machines with dual 5.25 " drives , running MS-DOS .
As an added bonus our " development environment " as you say now , was edlin !
At that time I had no personal experience with computers .
I did n't know the difference between the OS , the text editor , and the compiler .
Yet I had no problems with picking up the concepts and learning how to use both the language and a general use computer .
I guess it was unofficially a trial by fire computer literacy class too .
Since then I 've taught myself enough of a couple of different programming languages that I needed in the course of my work .
Back in the day Lisp was the daddy for customizing and automating AutoCAD !</tokentext>
<sentencetext>My first engineering class after leaving the Marine Corps in '86 was a 3 credit hour class that met twice a week.
The first class each week focused on engineering graphics (drafting), the second was Fortran 77 programming.
The computers in the lab were, I believe, 286 based Epson machines with dual 5.25" drives, running MS-DOS.
As an added bonus our "development environment" as you say now, was edlin!
At that time I had no personal experience with computers.
I didn't know the difference between the OS, the text editor, and the compiler.
Yet I had no problems with picking up the concepts and learning how to use both the language and a general use computer.
I guess it was unofficially a trial by fire computer literacy class too.
Since then I've taught myself enough of a couple of different programming languages that I needed in the course of my work.
Back in the day Lisp was the daddy for customizing and automating AutoCAD!</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291925</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292213</id>
	<title>What's the server programmed in?</title>
	<author>Bananenrepublik</author>
	<datestamp>1244729700000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Whatever the server's implemented in, it's definitely not fast enough.</p><p>Today's Fortran is not the Fortran of 40 years ago.  Ever since Fortran 90 (i.e. for almost twenty years) the language has had real dynamic memory allocation and real ways of sharing data between different parts of the program.  Fortran got its bad reputation because the lack of these features caused most old Fortran code to be hard to follow -- emulating memory allocation by using lots of large arrays, and tens of lines of repeated COMMON blocks scattered all through the code .EQ.ual very opaque code.  And let's not forget that those codes were written to be extremely efficient in terms of CPU time as opposed to developer time.</p><p>It's no surprise that Fortran got this bad reputation some 25 years ago when the art of language design had advanced significantly beyond what was in Fortran originally, but people who still think they need to pontificate on that point should catch up with the times -- or at least with 20 years ago.</p></htmltext>
<tokentext>Whatever the server 's implemented in , it 's definitely not fast enough . Today 's Fortran is not the Fortran of 40 years ago .
Ever since Fortran 90 ( i.e. for almost twenty years ) the language has had real dynamic memory allocation and real ways of sharing data between different parts of the program .
Fortran got its bad reputation because the lack of these features caused most old Fortran code to be hard to follow -- emulating memory allocation by using lots of large arrays , and tens of lines of repeated COMMON blocks scattered all through the code .EQ.ual very opaque code .
And let 's not forget that those codes were written to be extremely efficient in terms of CPU time as opposed to developer time . It 's no surprise that Fortran got this bad reputation some 25 years ago when the art of language design had advanced significantly beyond what was in Fortran originally , but people who still think they need to pontificate on that point should catch up with the times -- or at least with 20 years ago .</tokentext>
<sentencetext>Whatever the server's implemented in, it's definitely not fast enough.
Today's Fortran is not the Fortran of 40 years ago.
Ever since Fortran 90 (i.e. for almost twenty years) the language has had real dynamic memory allocation and real ways of sharing data between different parts of the program.
Fortran got its bad reputation because the lack of these features caused most old Fortran code to be hard to follow -- emulating memory allocation by using lots of large arrays, and tens of lines of repeated COMMON blocks scattered all through the code .EQ.ual very opaque code.
And let's not forget that those codes were written to be extremely efficient in terms of CPU time as opposed to developer time.
It's no surprise that Fortran got this bad reputation some 25 years ago when the art of language design had advanced significantly beyond what was in Fortran originally, but people who still think they need to pontificate on that point should catch up with the times -- or at least with 20 years ago.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847</id>
	<title>It's okay to teach them FORTRAN</title>
	<author>sharkette66</author>
	<datestamp>1244728200000</datestamp>
	<modclass>Funny</modclass>
	<modscore>5</modscore>
	<htmltext><p>But only if they have to do it on punch cards, like I did.  Give each student a can of WD40 to keep the machines working smoothly, too.</p></htmltext>
<tokentext>But only if they have to do it on punch cards , like I did .
Give each student a can of WD40 to keep the machines working smoothly , too .</tokentext>
<sentencetext>But only if they have to do it on punch cards, like I did.
Give each student a can of WD40 to keep the machines working smoothly, too.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294955</id>
	<title>Fortran is very important</title>
	<author>NotNormallyNormal</author>
	<datestamp>1244739420000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>Fortran is very important in the world of modelling and high speed computation. When I was an undergrad Fortran was taught for the physical sciences but the computer science dept refused to teach it so it was being taught by some geophysical modellers. I'm not sure that the university even offers Fortran anymore.</p><p>However, frustrated by that, the dept of physics and astronomy now has two courses in computational physics (both in Fortran) taught by modellers from the department. They deal with real world issues (well, real world modelling issues when applied to a spherical cow right?).  Only one course is mandatory but both courses are very popular.</p><p>For myself, I use several modelling programs that are purely Fortran that I've had problems dealing with. I'm glad I did take a bit of Fortran though I am much more fluent in other languages these days. In fact my wife, in the private sector, has proprietary software that they use for modelling digital elevations and gravity fluctuation that is written purely in Fortran as well - simply for speed. Until someone invents a real quantum computer, I don't think Fortran in the physical sciences is going anywhere.</p></htmltext>
<tokentext>Fortran is very important in the world of modelling and high speed computation .
When I was an undergrad Fortran was taught for the physical sciences but the computer science dept refused to teach it so it was being taught by some geophysical modellers .
I 'm not sure that the university even offers Fortran anymore . However , frustrated by that , the dept of physics and astronomy now has two courses in computational physics ( both in Fortran ) taught by modellers from the department .
They deal with real world issues ( well , real world modelling issues when applied to a spherical cow right ? ) .
Only one course is mandatory but both courses are very popular . For myself , I use several modelling programs that are purely Fortran that I 've had problems dealing with .
I 'm glad I did take a bit of Fortran though I am much more fluent in other languages these days .
In fact my wife , in the private sector , has proprietary software that they use for modelling digital elevations and gravity fluctuation that is written purely in Fortran as well - simply for speed .
Until someone invents a real quantum computer , I do n't think Fortran in the physical sciences is going anywhere .</tokentext>
<sentencetext>Fortran is very important in the world of modelling and high speed computation.
When I was an undergrad Fortran was taught for the physical sciences but the computer science dept refused to teach it so it was being taught by some geophysical modellers.
I'm not sure that the university even offers Fortran anymore.
However, frustrated by that, the dept of physics and astronomy now has two courses in computational physics (both in Fortran) taught by modellers from the department.
They deal with real world issues (well, real world modelling issues when applied to a spherical cow right?).
Only one course is mandatory but both courses are very popular.
For myself, I use several modelling programs that are purely Fortran that I've had problems dealing with.
I'm glad I did take a bit of Fortran though I am much more fluent in other languages these days.
In fact my wife, in the private sector, has proprietary software that they use for modelling digital elevations and gravity fluctuation that is written purely in Fortran as well - simply for speed.
Until someone invents a real quantum computer, I don't think Fortran in the physical sciences is going anywhere.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295325</id>
	<title>Re:While there may be "newer" languages</title>
	<author>Anonymous</author>
	<datestamp>1244740620000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>Actually python has nice libraries for numerical analysis. They are as feature complete as octave. Have a look at numpy and scipy.

But Matlab is better suited for education. A teacher should not spend time teaching the students why the vector indexes start at zero. Plotting graphs should be interactive and colorful. Chemistry students should never have to worry about the perils of object oriented programming. This is where Matlab has its big strength.</htmltext>
<tokentext>Actually python has nice libraries for numerical analysis .
They are as feature complete as octave .
Have a look at numpy and scipy .
But Matlab is better suited for education .
A teacher should not spend time teaching the students why the vector indexes start at zero .
Plotting graphs should be interactive and colorful .
Chemistry students should never have to worry about the perils of object oriented programming .
This is where Matlab has its big strength .</tokentext>
<sentencetext>Actually python has nice libraries for numerical analysis.
They are as feature complete as octave.
Have a look at numpy and scipy.
But Matlab is better suited for education.
A teacher should not spend time teaching the students why the vector indexes start at zero.
Plotting graphs should be interactive and colorful.
Chemistry students should never have to worry about the perils of object oriented programming.
This is where Matlab has its big strength.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293759</parent>
</comment>
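Both claims in the comment above (numpy/scipy cover Octave-style numerics, and a teacher has to explain zero-based indexing) fit in a few lines; here is a minimal sketch, with a made-up vector and a textbook integral rather than anything from the comment:

```python
# Minimal numpy sketch: zero-based indexing, plus a numerical task
# (a Riemann-sum integral of sin on [0, pi], analytically equal to 2).
import numpy as np

v = np.array([10.0, 20.0, 30.0])
first = v[0]                        # indexes start at zero, unlike Matlab's v(1)

xs = np.linspace(0.0, np.pi, 10001)
dx = xs[1] - xs[0]
integral = np.sum(np.sin(xs)) * dx  # crude quadrature, close to 2.0
```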
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296003</id>
	<title>Fortran is *not* 40 years old</title>
	<author>slashdotlurker</author>
	<datestamp>1244743020000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>F2003K is less than 5 years old in terms of implementations.</htmltext>
<tokentext>F2003K is less than 5 years old in terms of implementations .</tokentext>
<sentencetext>F2003K is less than 5 years old in terms of implementations.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28299473</id>
	<title>Perl or Python</title>
	<author>Average</author>
	<datestamp>1244711940000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The point is to learn how to use programming to accomplish a task on a set of numbers (most likely some big comma-separated text document) and output some other set of numbers.</p><p>I want the students to get to the point that they have the numbers read and writable on the first day.</p><p>Perl or Python.  Suck it in, split on commas, and it's in a 2D array. 3 lines. Bob's your uncle. Write it out in a preformatted template. Boom.</p><p>I coded some things with my undergrad science major friends (atmo sci roommate).  The FORTRAN students not only devoted pages of code to the trivia, they spent hours debugging it.  The code part was simple for any of them.  Our Perl scripts were seriously 10 lines long to their pages, and better looking output to boot.</p></htmltext>
<tokentext>The point is to learn how to use programming to accomplish a task on a set of numbers ( most likely some big comma-separated text document ) and output some other set of numbers . I want the students to get to the point that they have the numbers read and writable on the first day . Perl or Python .
Suck it in , split on commas , and it 's in a 2D array .
3 lines .
Bob 's your uncle .
Write it out in a preformatted template .
Boom . I coded some things with my undergrad science major friends ( atmo sci roommate ) .
The FORTRAN students not only devoted pages of code to the trivia , they spent hours debugging it .
The code part was simple for any of them .
Our Perl scripts were seriously 10 lines long to their pages , and better looking output to boot .</tokentext>
<sentencetext>The point is to learn how to use programming to accomplish a task on a set of numbers (most likely some big comma-separated text document) and output some other set of numbers.
I want the students to get to the point that they have the numbers read and writable on the first day.
Perl or Python.
Suck it in, split on commas, and it's in a 2D array.
3 lines.
Bob's your uncle.
Write it out in a preformatted template.
Boom.
I coded some things with my undergrad science major friends (atmo sci roommate).
The FORTRAN students not only devoted pages of code to the trivia, they spent hours debugging it.
The code part was simple for any of them.
Our Perl scripts were seriously 10 lines long to their pages, and better looking output to boot.</sentencetext>
</comment>
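The "suck it in, split on commas, write it out from a template" recipe above really is about three lines of Python. A sketch with made-up data (the file contents are faked as a string here, so the numbers and the `row sum` template are illustrative, not the commenter's actual script):

```python
# Sketch of the comment's recipe with fabricated data: the string below
# stands in for the big comma-separated file the students would read.
raw = "1.0,2.0,3.0\n4.0,5.0,6.0\n"

# "suck it in, split on commas, and it's in a 2D array"
table = [[float(cell) for cell in line.split(",")]
         for line in raw.splitlines()]

# "write it out in a preformatted template"
report = "\n".join("row sum = %.1f" % sum(row) for row in table)
```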
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294913</id>
	<title>Horses for courses</title>
	<author>Winter Lightning</author>
	<datestamp>1244739300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I started using FORTRAN (66 and 77) as a Physics grad. student in the early 80s; since then I've also used C and C++ heavily, dabbled in Perl, Java and Python.  As well as developing a lot of code, I've also had to troubleshoot other people's complex systems, particularly C++ and Java.  I now describe myself - somewhat tongue-in-cheek - as a "born-again FORTRAN programmer".</p><p>Java, C++ etc., are useful for complex applications that are event driven and require independent threads of execution performing many different tasks, but I prefer to avoid these for simple number-crunching tasks unless I need additional layers of abstraction or interaction.</p><p>When I write FORTRAN I know what's going on under the hood; I know that memory is not going to be allocated without me knowing - in fact, allocatable arrays are a relatively new addition to the original static allocation model.  I can concentrate on the implementation of an algorithm and achieve deterministic performance.</p><p>Flexibility comes at a price and C++ and Java applications frequently run into performance problems as they become bloated by hidden activity, particularly where the OS is called behind the scenes (e.g., hidden constructor methods).  This is a particular problem for real-time codes where an unplanned trip into the kernel may hurt determinism.</p><p>I think FORTRAN still has an important place; with extensions such as OpenMP it has tremendous value for parallel computing on multi-core systems.  With MPI it can exploit cluster/grid systems.  FORTRAN generally has an edge in performance for HPC codes.</p><p>If you're building something from scratch that requires numerical computation then consider FORTRAN, if only for computational kernels that are called by other languages such as C++ or Java (e.g., the latter for GUIs, external interaction/communication, with the heavy work in FORTRAN).</p></htmltext>
<tokenext>I started using FORTRAN ( 66 and 77 ) as a Physics grad .
student in the early 80s ; since then I 've also used C and C + + heavily , dabbled in Perl , Java and Python .
As well as developing a lot of code , I 've also had to troubleshoot other people 's complex systems , particularly C + + and Java .
I now describe myself - somewhat tongue-in-cheek - as a " born-again FORTRAN programmer " .Java , C + + etc. , are useful for complex applications that are event driven and require independent threads of excution performing many different tasks , but I prefer to avoid these for simple number-crunching tasks unless I need additional layers of abstraction or interaction.When I write FORTRAN I know what 's going on under the hood ; I know that memory is not going to be allocated without me knowing - in fact , allocatable arrays are a relatively new addition to the original static allocation model .
I can concentrate on the implementation of an algorithm and achieve deterministic performance.Flexibility comes at a price and C + + and Java applications frequently run into performance problems as they become bloated by hidden activity , particularly where the OS is called behind the scenes ( e.g. , hidden constructor methods ) .
This is a particular problem for real-time codes where an unplanned trip into the kernel may hurt determinism.I think FORTRAN still has an important place ; with extensions such as OpenMP it has tremendous value for parallel computing on multi-core systems .
With MPI it can exploit cluster/grid systems .
FORTRAN generally has an edge in performance for HPC codes.If you 're building something from scratch that contains requires numerical computation then consider FORTRAN , if only for computational kernels that are called by other languages such as C + + or Java ( e.g. , the latter for GUIs , external interaction/communication , with the heavy work in FORTRAN ) .</tokentext>
<sentencetext>I started using FORTRAN (66 and 77) as a Physics grad.
student in the early 80s; since then I've also used C and C++ heavily, dabbled in Perl, Java and Python.
As well as developing a lot of code, I've also had to troubleshoot other people's complex systems, particularly C++ and Java.
I now describe myself - somewhat tongue-in-cheek - as a "born-again FORTRAN programmer".Java, C++ etc., are useful for complex applications that are event driven and require independent threads of execution performing many different tasks, but I prefer to avoid these for simple number-crunching tasks unless I need additional layers of abstraction or interaction.When I write FORTRAN I know what's going on under the hood; I know that memory is not going to be allocated without me knowing - in fact, allocatable arrays are a relatively new addition to the original static allocation model.
I can concentrate on the implementation of an algorithm and achieve deterministic performance.Flexibility comes at a price and C++ and Java applications frequently run into performance problems as they become bloated by hidden activity, particularly where the OS is called behind the scenes (e.g., hidden constructor methods).
This is a particular problem for real-time codes where an unplanned trip into the kernel may hurt determinism.I think FORTRAN still has an important place; with extensions such as OpenMP it has tremendous value for parallel computing on multi-core systems.
With MPI it can exploit cluster/grid systems.
FORTRAN generally has an edge in performance for HPC codes.If you're building something from scratch that requires numerical computation, then consider FORTRAN, if only for computational kernels that are called by other languages such as C++ or Java (e.g., the latter for GUIs, external interaction/communication, with the heavy work in FORTRAN).</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294365</id>
	<title>My 2 cents...</title>
	<author>tekiegreg</author>
	<datestamp>1244737380000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>My thinking is that computer science programs need to focus on <i>Programming</i> rather than <i>Language</i>. I wish they'd teach just one language, regardless of what it is, and teach it well, rather than try to cram 40 different languages into 4 years.  Be that language Java, C#, Fortran, LOLCode, etc...</p><p>Within those guidelines, which language should it be?  That probably has more to do with the ethos of the Computer Science program in particular.  Do they pride themselves on training for math and engineering?  Fortran should be a consideration.  Business world?  Java or C#.  To work for LOLpeople... well, you get the idea...</p></htmltext>
<tokenext>My thinking is computer science programs need to focus on Programming rather than Language I wish they 'd teach just one language regardless of what it is , and teach it well .
Rather than try and cram 40 different languages in 4 years .
Be that language Java , C # , Fortran , LOLCode , etc...Within those guidelines , which languages should it be ?
Probably having to do more with the ethos of the Computer Science program in particular .
Do they pride themselves on training for math and engineering ?
Fortran should be a consideration .
Business world ?
Java or C # .
To work for LOLpeople...well you get the idea.. .</tokentext>
<sentencetext>My thinking is that computer science programs need to focus on Programming rather than Language. I wish they'd teach just one language regardless of what it is, and teach it well,
rather than try to cram 40 different languages into 4 years.
Be that language Java, C#, Fortran, LOLCode, etc...Within those guidelines, which languages should it be?
Probably having to do more with the ethos of the Computer Science program in particular.
Do they pride themselves on training for math and engineering?
Fortran should be a consideration.
Business world?
Java or C#.
To work for LOLpeople...well you get the idea...</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295359</id>
	<title>Teaching language</title>
	<author>RogerWilco</author>
	<datestamp>1244740740000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>FORTRAN has its uses, mostly in legacy applications, but I don't think it's a good language to learn programming in. I'd go with either (Object) Pascal or Python there. Certainly not Java.</p><p>Compare<br>-------<br>print "Hello World"<br>-----or------<br>begin<br>
&nbsp; &nbsp; writeln("Hello World");<br>end.<br>------<br>with<br>------<br>class myHelloWorld<br>{<br>
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; public static void main(String args[])<br>
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; {<br>
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; System.out.println("Hello World!");<br>
&nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; &nbsp; }<br>}</p><p>----------</p><p>A language to teach people to program should not have to bother students with class definitions, "public static void" or any of that. I'm no expert on FORTRAN, but from what I've seen, it isn't stellar in its ability to minimize syntactic overhead either.</p></htmltext>
<tokenext>FORTRAN has it 's uses , mostly in legacy applications , but I do n't think it 's a good language to learn programming in .
I 'd go with either ( object ) pascal or Python there .
Certainly not Java.Compare-------print " Hello World " -----or------begin     writeln ( " Hello World " ) ; end.------with------class myHelloWorld {                 public static void main ( String args [ ] )                 {                       System.out.println ( " Hello World !
" ) ;                 } } ----------A language to teach people to program should not have to bother students with class definitions , " public static void " or any of that .
I 'm no expert on FORTRAN , but what I 've seen it is n't stellar either in it 's ability to minimize syntactic sugar .</tokentext>
<sentencetext>FORTRAN has its uses, mostly in legacy applications, but I don't think it's a good language to learn programming in.
I'd go with either (object)pascal or Python there.
Certainly not Java.Compare-------print "Hello World"-----or------begin
    writeln("Hello World");end.------with------class myHelloWorld{
                public static void main(String args[])
                {
                      System.out.println("Hello World!
");
                }}----------A language to teach people to program should not have to bother students with class definitions, "public static void"  or any of that.
I'm no expert on FORTRAN, but from what I've seen, it isn't stellar in its ability to minimize syntactic overhead either.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919</id>
	<title>libraries.  gigabytes of libraries</title>
	<author>Anonymous</author>
	<datestamp>1244728500000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>5</modscore>
	<htmltext><p>I spoke to someone studying engineering in 1990 who was being taught Fortran.  They were using a mathematical library that would solve partial differential equations by presenting the user with the actual mathematical formulae.</p><p>These kinds of libraries are staggeringly complex to write, and they have been empirically proven over decades of use to actually work.</p><p>To start again from scratch with such libraries would require man-centuries or possibly man-millennia of development effort to reproduce and debug, regardless of the programming language.</p><p>So it doesn't matter what people in the slashdot community think: for engineers to use anything but these tried-and-tested engineering libraries, which happen to be written in Fortran, would just be genuinely stupid of them.</p></htmltext>
<tokenext>i spoke to someone studying engineering in 1990 who was being taught fortran .
they were using a mathematical library that would solve partial differential equations , by presenting the user with the actual mathematical formulae to them.these kinds of libraries are staggeringly complex to write , and they have been empirically proven over decades of use to actually work.to start again from scratch with such libraries would require man-centuries or possibly man-millenia of development effort to reproduce and debug , regardless of the programming language.so it does n't matter what people in the slashdot community think : for engineers to use anything but these tried-and-tested engineering libraries , that happen to be written in fortran , would just be genuinely stupid of them .</tokentext>
<sentencetext>i spoke to someone studying engineering in 1990 who was being taught fortran.
they were using a mathematical library that would solve partial differential equations, by presenting the user with the actual mathematical formulae to them.these kinds of libraries are staggeringly complex to write, and they have been empirically proven over decades of use to actually work.to start again from scratch with such libraries would require man-centuries or possibly man-millenia of development effort to reproduce and debug, regardless of the programming language.so it doesn't matter what people in the slashdot community think: for engineers to use anything but these tried-and-tested engineering libraries, that happen to be written in fortran, would just be genuinely stupid of them.</sentencetext>
</comment>
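The point above about tried-and-tested Fortran libraries still holds in modern toolchains: NumPy's linear-algebra routines, for instance, sit on top of LAPACK/BLAS, which originated as Fortran libraries. A minimal Python sketch (the matrix values are purely illustrative):

```python
# NumPy's linear algebra dispatches to LAPACK/BLAS -- the same lineage of
# tried-and-tested Fortran numerical libraries discussed above.
import numpy as np

# Solve the linear system A x = b. Under the hood, np.linalg.solve calls
# LAPACK's gesv driver routine.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
b = np.array([9.0, 8.0])

x = np.linalg.solve(A, b)
print(x)                       # -> [2. 3.]
print(np.allclose(A @ x, b))   # -> True
```

So undergraduates who learn Python still end up exercising those decades-old Fortran codebases, just through a friendlier interface.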
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025</id>
	<title>Re:While there may be "newer" languages</title>
	<author>mdwh2</author>
	<datestamp>1244728860000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Citation needed.</p><p>Even if not Python, what does Fortran have over modern compiled languages, for example?</p></htmltext>
<tokenext>Citation needed.Even if not phython , what does Fortran have over modern compiled languages , for example ?</tokentext>
<sentencetext>Citation needed. Even if not Python, what does Fortran have over modern compiled languages, for example?</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292051</id>
	<title>My Physics Degree + Fortran</title>
	<author>Anonymous</author>
	<datestamp>1244729040000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Back in the mid-90s I was taught Fortran (77?) during the 1st undergraduate year of my Physics Degree at Imperial College, London. We were told not to use the first few columns on any line, a leftover from the punched-card days. This was quite amusing to me and my friends, as most of us had been programming for years and had PDAs that could manage a faster baud-rate terminal interface to the minicomputer than the 9600-baud green-screen terminals we were told to use. 2nd year was Fortran 90 on Unix workstations, so things got a lot better. By the time I was doing my PhD, the computer labs had Windows PCs with Visual Studio and C++. The course had got a lot harder during the switch and was teaching more advanced numerical methods and object-oriented programming methods.</htmltext>
<tokenext>Back in the mid-90 ; s I was taught Fortran ( 77 ?
) during my 1st undergraduate year of my Physics Degree at Imperial College , London .
We were told not to use the first 4 ?
characters on any line as a left over form the punched cards day .
This was quite amusing to myself and my fiends as most of us had been programming for years , and had PDA 's that could manage a faster baud rate terminal interface to the mini computer than the 9600 baud green screen terminals we told to use .
2nd year was Fortran 90 on unix workstations so things got a lot better .
By the time I was doing my PhD the computer labs Windows PC , with visual studio and C + + .
The course had got a lot harder during the switch and was teaching more advanced numerical methods and object orientated programing methods .</tokentext>
<sentencetext>Back in the mid-90s I was taught Fortran (77?) during the 1st undergraduate year of my Physics Degree at Imperial College, London.
We were told not to use the first few columns on any line, a leftover from the punched-card days.
This was quite amusing to me and my friends, as most of us had been programming for years and had PDAs that could manage a faster baud-rate terminal interface to the minicomputer than the 9600-baud green-screen terminals we were told to use.
2nd year was Fortran 90 on Unix workstations, so things got a lot better.
By the time I was doing my PhD, the computer labs had Windows PCs with Visual Studio and C++.
The course had got a lot harder during the switch and was teaching more advanced numerical methods and object-oriented programming methods.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294593</id>
	<title>Might be OK... but please be careful!</title>
	<author>Anonymous</author>
	<datestamp>1244738160000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Should undergrads be taught FORTRAN as an intro language?  It depends.</p><p>If the purpose is an intro to programming, then the answer is a resounding "NO.  NOT EVER."  FORTRAN really has no place in a modern programmer's toolkit.</p><p>However, if the purpose is to give a certain class of physical science students a 'quick-'n'-dirty' tool to write computationally-intensive applications as research or scientific problem-solving aids, well... I'm not thrilled, but okay.  Just beware if any of those students go on to do any more serious programming, because the worst thing in the world is a software developer who thinks like a FORTRAN programmer.  And for God's sake, at least teach them some proper functional programming methodology with it!</p><p>I work for a company that has a very substantial base of FORTRAN-77 code that is actively maintained.  It's UGLY -- but that's only partially due to the language.  It makes heavy use of global data structures, and violates just about every rule of good programming -- even good FORTRAN programming.  I don't even know why it's written in FORTRAN, since it's a financial app &amp; therefore one would think COBOL would have been a better choice.</p><p>Bottom line -- it's okay to teach someone to use an outdated tool if it suits their needs.  Just don't neglect to teach proper tool safety just because "it's only a hammer".</p></htmltext>
<tokenext>Should undergrads be taught FORTRAN as an intro language ?
It depends.If the purpose is an intro to programming , then the answer is a resounding " NO .
NOT EVER .
" FORTRAN really has no place in a modern programmer 's toolkit.However , if the purpose is to give a certain class of physical science students a 'quick-'n'-dirty ' tool to write computationally-intensive applications as research or scientific problem-solving aids , well... I 'm not thrilled , but okay .
Just beware if any of those students go on to do any more serious programming , because the worst thing in the world is a software developer who thinks like a FORTRAN programmer .
And for God 's sake , at least teach them some proper functional programming methodology with it ! I work for a company that has a very substantial base of FORTRAN-77 code that is actively maintained .
It 's UGLY -- but that 's only partially due to the language .
It makes heavy use of global data structures , and violates just about every rule of good programming -- even good FORTRAN programming .
I do n't even know why it 's written in FORTRAN , since it 's a financial app &amp; therefore one would think COBOL would have been a better choice.Bottom line -- it 's okay to teach someone to use an outdated tool if it suits their needs .
Just do n't neglect to teach proper tool safety just because " it 's only a hammer " .</tokentext>
<sentencetext>Should undergrads be taught FORTRAN as an intro language?
It depends.If the purpose is an intro to programming, then the answer is a resounding "NO.
NOT EVER.
"  FORTRAN really has no place in a modern programmer's toolkit.However, if the purpose is to give a certain class of physical science students a 'quick-'n'-dirty' tool to write computationally-intensive applications as research or scientific problem-solving aids, well... I'm not thrilled, but okay.
Just beware if any of those students go on to do any more serious programming, because the worst thing in the world is a software developer who thinks like a FORTRAN programmer.
And for God's sake, at least teach them some proper functional programming methodology with it!I work for a company that has a very substantial base of FORTRAN-77 code that is actively maintained.
It's UGLY -- but that's only partially due to the language.
It makes heavy use of global data structures, and violates just about every rule of good programming -- even good FORTRAN programming.
I don't even know why it's written in FORTRAN, since it's a financial app &amp; therefore one would think COBOL would have been a better choice.Bottom line -- it's okay to teach someone to use an outdated tool if it suits their needs.
Just don't neglect to teach proper tool safety just because "it's only a hammer".</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292391</id>
	<title>Well my answer is...</title>
	<author>fran6gagne</author>
	<datestamp>1244730300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>No</htmltext>
<tokenext>No</tokentext>
<sentencetext>No</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28317545</id>
	<title>Ya Python Would've Been Cooler</title>
	<author>spaceWeepul</author>
	<datestamp>1244830320000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I learned to use Fortran 77 in college.  I would've been better off learning Java or C++.  I would've enjoyed a class that discussed collecting data, processing it, and producing control signals on the cheap.  Every place I worked used some kind of customized system to do that, and I almost never worked with anything I had seen in the classes I had taken.</htmltext>
<tokenext>I learned to use fortran77 in college .
I would 've been better off learning Java or C + + .
I would 've enjoyed a class that discussed collecting data , processing it and producing control signals on the cheap .
Every place I worked used some kind of customized system to do that and I almost never worked with anything I had seen in classes I had taken .</tokentext>
<sentencetext>I learned to use fortran77 in college.
I would've been better off learning Java or C++.
I would've enjoyed a class that discussed collecting data, processing it and producing control signals on the cheap.
Every place I worked used some kind of customized system to do that and I almost never worked with anything I had seen in classes I had taken.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292059</id>
	<title>Yes</title>
	<author>Anonymous</author>
	<datestamp>1244729040000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>4</modscore>
	<htmltext>It's called Scheme.</htmltext>
<tokenext>It 's called Scheme .</tokentext>
<sentencetext>It's called Scheme.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291939</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293057</id>
	<title>Yes! But not for CompSci majors.</title>
	<author>Lord Byron II</author>
	<datestamp>1244732520000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I'm a graduate student in physics and let me say that I feel that I've been able to go further and do more with my degree because of my knowledge of Fortran. There is a TON of code out there that is in Fortran and will always be in Fortran. My first undergraduate job was doing optimizations on 30yr old Fortran code. My current research uses about 30K of Fortran code.</p><p>I'm not a big fan of the language, but for the stuff we're doing, it works well and reliably and fast, so there's not much incentive to move to anything else.</p><p>Of course, it's completely dead from a CompSci perspective, so those students should be spared.</p></htmltext>
<tokenext>I 'm a graduate student in physics and let me say that I feel that I 've been able to go further and do more with my degree because of my knowledge of Fortran .
There is a TON of code out there that is in Fortran and will always be in Fortran .
My first undergraduate job was doing optimizations on 30yr old Fortran code .
My current research uses about 30K of Fortran code.I 'm not a big fan of the language , but for the stuff we 're doing , it works well and reliably and fast , so there 's not much incentive to move to anything else.Of course , it 's completely dead from a CompSci perspective , so those students should be spared .</tokentext>
<sentencetext>I'm a graduate student in physics and let me say that I feel that I've been able to go further and do more with my degree because of my knowledge of Fortran.
There is a TON of code out there that is in Fortran and will always be in Fortran.
My first undergraduate job was doing optimizations on 30yr old Fortran code.
My current research uses about 30K of Fortran code.I'm not a big fan of the language, but for the stuff we're doing, it works well and reliably and fast, so there's not much incentive to move to anything else.Of course, it's completely dead from a CompSci perspective, so those students should be spared.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28319305</id>
	<title>Some value to older languages</title>
	<author>stanjam</author>
	<datestamp>1244902740000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Not 100% sure about Fortran, but I do know there is good value to learning older languages.  Take COBOL for instance.  Anything you can do in COBOL can be done in more modern languages.  The problem is that there is a LOT of COBOL out there.  It works, and businesses have found that projects transferring all their old programs to newer languages ultimately fail more often than they succeed.  It is much better for them to find a COBOL programmer than translate everything to a newer language.</htmltext>
<tokenext>Not 100 \ % sure about Fortran , but I do know there is good value to learning older languages .
Take COBOL for instance .
Anything you can do in COBOL can be done in more modern languages .
The problem is that there is a LOT of COBOL out there .
It works , and businesses have found that projects transferring all their old programs to newer languages ultimately fail more often than they succeed .
It is much better for them to find a COBOL programmer than translate everything to a newer language .</tokentext>
<sentencetext>Not 100% sure about Fortran, but I do know there is good value to learning older languages.
Take COBOL for instance.
Anything you can do in COBOL can be done in more modern languages.
The problem is that there is a LOT of COBOL out there.
It works, and businesses have found that projects transferring all their old programs to newer languages ultimately fail more often than they succeed.
It is much better for them to find a COBOL programmer than translate everything to a newer language.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298375</id>
	<title>Re:While there may be "newer" languages</title>
	<author>cyberthanasis12</author>
	<datestamp>1244751240000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I agree. But I do the first implementation of a new idea (which I don't know whether it works) in Python. It is much faster and safer to program in Python. If the Python implementation works but turns out to be very slow, then I convert the code to Fortran 95.</htmltext>
<tokenext>I agree .
But I do the first implementation of a new idea ( which I do n't know if it works ) with Python .
It is much faster and safer to program in Python .
If the Python implementation works , and it is very slow , then I convert the code to Fortran95 .</tokentext>
<sentencetext>I agree.
But I do the first implementation of a new idea (which I don't know if it works) with Python.
It is much faster and safer to program in Python.
If the Python implementation works, and it is very slow, then I convert the code to Fortran95.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
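The prototype-first workflow described above (write it in Python, port it to Fortran only if it proves too slow) can be sketched as follows; the trapezoid integrator is a deliberately naive stand-in for whatever kernel would eventually be ported:

```python
import math

def trapezoid(f, a, b, n=100_000):
    """Plain-Python prototype of a numerical integrator.

    Fast enough to validate the idea; if profiling later shows this loop
    dominates, it is exactly the kind of kernel one would rewrite in
    Fortran 95, keeping this version as a reference implementation.
    """
    h = (b - a) / n
    total = 0.5 * (f(a) + f(b))
    for i in range(1, n):
        total += f(a + i * h)
    return total * h

# Validate the prototype against a known closed form before optimizing:
# the integral of sin(x) over [0, pi] is exactly 2.
approx = trapezoid(math.sin, 0.0, math.pi)
print(abs(approx - 2.0) < 1e-8)  # -> True
```

The slow pure-Python version is not wasted work: it becomes the correctness oracle that the later Fortran kernel is tested against.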
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296557</id>
	<title>Programming in Practice</title>
	<author>sitarlo</author>
	<datestamp>1244745060000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>
I think colleges should teach the fundamentals of programming using a variety of languages first, then offer specialization courses and research opportunities.  Teaching "programming" using one language is a seriously outdated idea.  Modern software developers need a solid understanding of fundamentals that they can apply to any number of programming languages.  It's like music, where one set of fundamental rules can be applied to a large number of existing and future instruments, each with its own interface.  I currently work in Objective-C, Java, C, C++, and a bunch of scripting languages like Perl, Python, etc.  To do this effectively I have to view the language as an interface to the machine or platform while applying techniques, concepts, algorithms, and data structures that transcend language alone.  Taking a FORTRAN class may be beneficial to one's completeness as a programmer, but I don't think it should be required for a CS, PHYS, or ENG program.</htmltext>
<tokenext>I think colleges should teach the fundamentals of programming using a variety of languages first , then offer specialization courses and research opportunities .
Teaching " programming " using one language is a seriously outdated idea .
Modern software developers need a solid understanding of fundamentals that they can apply to any number of programming languages .
It 's like music where one set of fundamental rules can be applied to a large number of existing and future instruments each with their own interface .
I currently work in Objective-C , Java , C , C + + , and a bunch of scripting languages like perl , python , etc .
For me to do this effectively I have to view the language as an interface to the machine or platform while applying techniques , concepts , algorithms , and data structures that transcend language alone .
Taking a FORTRAN class may be beneficial to one 's completeness as a programmer , but I do n't think it should be required for a CS , PHYS , or ENG program .</tokentext>
<sentencetext>
I think colleges should teach the fundamentals of programming using a variety of languages first, then offer specialization courses and research opportunities.
Teaching "programming" using one language is a seriously outdated idea.
Modern software developers need a solid understanding of fundamentals that they can apply to any number of programming languages.
It's like music where one set of fundamental rules can be applied to a large number of existing and future instruments each with their own interface.
I currently work in Objective-C, Java, C, C++, and a bunch of scripting languages like perl, python, etc.
For me to do this effectively I have to view the language as an interface to the machine or platform while applying techniques, concepts, algorithms, and data structures that transcend language alone.
Taking a FORTRAN class may be beneficial to one's completeness as a programmer, but I don't think it should be required for a CS, PHYS, or ENG program.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296235</id>
	<title>What about other languages?</title>
	<author>DutchUncle</author>
	<datestamp>1244743920000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Despite the fact that they are hundreds or thousands of years old, people are still being taught English, French, Spanish, German, and various other Romance and/or Germanic languages.  Not to mention the various non-Indo-European families.
<br> <br>
They still work.  They each have history and legacy.
<br> <br>
People are still using fire to cook, too, and that's got many thousands of years on the odometer.  It also still works.</htmltext>
<tokenext>Despite the fact that they are hundreds or thousands of years old , people are still being taught English , French , Spanish , German , and various other Romance and/or Germanic languages .
Not to mention the various non-Indo-European families .
They still work .
They each have history and legacy .
People are still using fire to cook , too , and that 's got many thousands of years on the odometer .
It also still works .</tokentext>
<sentencetext>Despite the fact that they are hundreds or thousands of years old, people are still being taught English, French, Spanish, German, and various other Romance and/or Germanic languages.
Not to mention the various non-Indo-European families.
They still work.
They each have history and legacy.
People are still using fire to cook, too, and that's got many thousands of years on the odometer.
It also still works.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292329</id>
	<title>Re:libraries. gigabytes of libraries</title>
	<author>dword</author>
	<datestamp>1244730060000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><div class="quote"><p>so it doesn't matter what people in the slashdot community think: for engineers to use anything but these tried-and-tested engineering libraries, that happen to be written in fortran, would just be genuinely stupid of them.</p></div><p>So, your idea is to teach everyone FORTRAN because if they become engineers, they might need it?</p><p>How many projects actually need that kind of computation? There are some cases, I agree, but I could also argue that everybody should be taught assembler, because there are tons of microchips in everyday devices such as mobile phones, televisions, or cars. NO! Leave FORTRAN where it belongs, with the mathematical experts; don't feed more useless stuff to undergrads.</p><p>Next question, please!?</p>
	</htmltext>
<tokenext>so it does n't matter what people in the slashdot community think : for engineers to use anything but these tried-and-tested engineering libraries , that happen to be written in fortran , would just be genuinely stupid of them.So , your idea is to teach everyone FORTRAN because if they become engineers , they might need it ? How many projects do need those kind of computations ?
There are some cases , I agree , but I could also argue that everybody should be taught assembler , because they have tons of microchips in every-day devices , such as mobile phones , televisions or cars .
NO ! Leave FORTRAN where it should be , with the mathematical experts , do n't feed more useless stuff to undergrads.Next question , please !
?</tokentext>
<sentencetext>so it doesn't matter what people in the slashdot community think: for engineers to use anything but these tried-and-tested engineering libraries, that happen to be written in fortran, would just be genuinely stupid of them.So, your idea is to teach everyone FORTRAN because if they become engineers, they might need it?How many projects do need those kind of computations?
There are some cases, I agree, but I could also argue that everybody should be taught assembler, because they have tons of microchips in every-day devices, such as mobile phones, televisions or cars.
NO! Leave FORTRAN where it should be, with the mathematical experts, don't feed more useless stuff to undergrads.Next question, please!
?
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292379</id>
	<title>What to learn first</title>
	<author>vlakkies</author>
	<datestamp>1244730240000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>For a scientist, FORTRAN is still a valuable skill, since there are so many libraries and applications, representing many years' worth of development, to draw from.  Plus it is efficient, etc.</p><p>The question here is what is the best way to give students their first exposure to programming.  FORTRAN has many quirks, so it is debatable whether it is really the best language to learn first.  You can write truly horrible code in FORTRAN, but that is true to some extent of all languages, although Python does make it harder.</p><p>On balance, if students are taught a modern dialect of FORTRAN and the instructor stresses good programming practices, it remains a good way to introduce students to solving a problem numerically and to getting the job done for people who may not code for a living.</p></htmltext>
<tokenext>For a scientist , FORTRAN is still a valuable skill since there are so many libraries and applications that represent many years worth of development to draw from .
Plus it is efficient , etc.The question here is what is the best way to introduce students as a first exposure to programming .
FORTRAN has many quirks so it is debatable whether it is really the best language to learn first .
You can write truly horrible code in FORTRAN , but that is true to some extent of all languages , although Python does make it harder.On balance , if students are taught a modern dialect of FORTRAN and the instructor stresses good programming practices , it remains a good way to introduce students to solving a problem numerically and of getting the job done for people who may not code for a living .</tokentext>
<sentencetext>For a scientist, FORTRAN is still a valuable skill since there are so many libraries and applications that represent many years worth of development to draw from.
Plus it is efficient, etc.The question here is what is the best way to introduce students as a first exposure to programming.
FORTRAN has many quirks so it is debatable whether it is really the best language to learn first.
You can write truly horrible code in FORTRAN, but that is true to some extent of all languages, although Python does make it harder.On balance, if students are taught a modern dialect of FORTRAN and the instructor stresses good programming practices, it remains a good way to introduce students to solving a problem numerically and of getting the job done for people who may not code for a living.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295549</id>
	<title>Re:While there may be "newer" languages</title>
	<author>PvtVoid</author>
	<datestamp>1244741460000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>Fortran has tons of libraries specialized to whatever scientific field you are working in, and is unavoidable in high energy physics especially.</p></div>
</blockquote><p>

Hopefully not forever, since CERN standardized on C++ for the Large Hadron Collider in the 1990s, although they have enormous amounts of <a href="http://cerncourier.com/cws/article/cern/30873" title="cerncourier.com" rel="nofollow">legacy FORTRAN code</a> [cerncourier.com].
<br> <br>

Personally, I think FORTRAN should be taken out and shot. Godawful unmaintainable code is the norm, and there is nothing you can do in FORTRAN that you can't do in a cleaner environment like C++. People only still use FORTRAN because they are used to it, and it will not go away until all the old fucks die.</p>
	</htmltext>
<tokenext>Fortran has tons of libraries specialized to whatever scientific field you are working in , and is unavoidable in high energy physics especially .
Hopefully not forever , since CERN standardized on C + + for the Large Hadron Collider in the 1990s , although they have enormous amounts of legacy FORTRAN code [ cerncourier.com ] .
Personally , I think FORTRAN should be taken out and shot .
Godawful unmaintainable code is the norm , and there is nothing you can do in FORTRAN that you ca n't do in a cleaner environment like C + + .
People only still use FORTRAN because they are used to it , and it will not go away until all the old fucks die .</tokentext>
<sentencetext>Fortran has tons of libraries specialized to whatever scientific field you are working in, and is unavoidable in high energy physics especially.
Hopefully not forever, since CERN standardized on C++ for the Large Hadron Collider in the 1990s, although they have enormous amounts of legacy FORTRAN code [cerncourier.com].
Personally, I think FORTRAN should be taken out and shot.
Godawful unmaintainable code is the norm , and there is nothing you can do in FORTRAN that you can't do in a cleaner environment like C++.
People only still use FORTRAN because they are used to it, and it will not go away until all the old fucks die.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292125</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293279</id>
	<title>It's about reliability</title>
	<author>thethibs</author>
	<datestamp>1244733180000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Since this is not the place to find actual engineers, it's not a surprise that no one has pointed out one of the primary rules of engineering: "Minimize Innovation."</p><p>FORTRAN and the vast mathematical and physical libraries that come with it have been tested and matured over a period of decades to create an open-source, highly dependable, essentially error-free framework for mathematics, science and engineering computations. We teach FORTRAN because it's the only language that meets that spec and because open-source is useless if you can't understand the source language.</p><p>The intent is not to develop programmers but to give mathematicians, scientists and real engineers a working knowledge of a vital tool.</p></htmltext>
<tokenext>Since this is not the place to find actual engineers , it 's not a surprise that no one has pointed out one of the primary rules of engineering : " Minimize Innovation .
" FORTRAN and the vast mathematical and physical libraries that come with it have been tested and matured over a period of decades to create an open-source , highly dependable , essentially error-free framework for mathematics , science and engineering computations .
We teach FORTRAN because it 's the only language that meets that spec and because open-source is useless if you ca n't understand the source language.The intent is not to develop programmers but to give mathematicians , scientists and real engineers a working knowledge of a vital tool .</tokentext>
<sentencetext>Since this is not the place to find actual engineers, it's not a surprise that no one has pointed out one of the primary rules of engineering: "Minimize Innovation.
"FORTRAN and the vast mathematical and physical libraries that come with it have been tested and matured over a period of decades to create an open-source, highly dependable, essentially error-free framework for mathematics, science and engineering computations.
We teach FORTRAN because it's the only language that meets that spec and because open-source is useless if you can't understand the source language.The intent is not to develop programmers but to give mathematicians, scientists and real engineers a working knowledge of a vital tool.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293179</id>
	<title>Force them to learn many languages</title>
	<author>Zarf</author>
	<datestamp>1244732880000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Seriously, force all undergraduates to learn at least 4 programming languages.</p></htmltext>
<tokenext>Seriously , force all undergraduates to learn at least 4 programming languages .</tokentext>
<sentencetext>Seriously, force all undergraduates to learn at least 4 programming languages.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294671</id>
	<title>FORTRAN alive and kicking in meteorological world</title>
	<author>klchoward</author>
	<datestamp>1244738400000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext>A resounding YES to undergrads being taught FORTRAN. I am a graduate student in Meteorology and FORTRAN is alive and kicking in the meteorological community. It is a vital part of many of our programs and models. Perl and Cshell are also very important but I wouldn't be able to do major parts of my thesis without the aid of FORTRAN. The undergrads at my graduate school are required to take FORTRAN (especially if they are in the met program) and they use it in their upper-level core classes. I wish I'd taken the FORTRAN class at my undergrad so I wouldn't have had to catch up during my thesis. Knowing FORTRAN definitely helped me grasp other languages faster.</htmltext>
<tokenext>A resounding YES to undergrads being taught FORTRAN .
I am a graduate student in Meteorology and FORTRAN is alive and kicking in the meteorological community .
It is a vital part of many of our programs and models .
Perl and Cshell are also very important but I would n't be able to do major parts of my thesis without the aid of FORTRAN .
The undergrads at my graduate school are required to take FORTRAN ( especially if they are in the met program ) and they use it in their upper-level core classes .
I wish I 'd taken the FORTRAN class at my undergrad so I would n't have had to catch up during my thesis .
Knowing FORTRAN definitely helped me grasp other languages faster .</tokentext>
<sentencetext>A resounding YES to undergrads being taught FORTRAN.
I am a graduate student in Meteorology and FORTRAN is alive and kicking in the meteorological community.
It is a vital part of many of our programs and models.
Perl and Cshell are also very important but I wouldn't be able to do major parts of my thesis without the aid of FORTRAN.
The undergrads at my graduate school are required to take FORTRAN (especially if they are in the met program) and they use it in their upper-level core classes.
I wish I'd taken the FORTRAN class at my undergrad so I wouldn't have had to catch up during my thesis.
Knowing FORTRAN definitely helped me grasp other languages faster.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291985</id>
	<title>Even NASA can't get it right</title>
	<author>Maximum Prophet</author>
	<datestamp>1244728680000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>One of the staff at Purdue (back in the '80s) was a Fortran guru.  He had never seen a true Fortran compiler, i.e. one that would pass all the standards tests.  I don't know if this is any different today, but I doubt there's a compiler that will compile any and all arbitrary Fortran programs.
<br> <br>
Besides, just write your code in a real language (one with a real grammar) and link it to the time tested Fortran libraries.  What value is there in actually writing in the language?</htmltext>
<tokenext>One of the staff at Purdue ( back in the 80 's ) was a Fortran guru .
He had never seen a true fortran complier , i.e .
one that would pass all the standards tests .
I do n't know if this is any different today , but I doubt there 's a compiler that will compile any and all arbitrary Fortran programs .
Besides , just write your code in a real language ( one with a real grammar ) and link it to the time tested Fortran libraries .
What value is there in actually writing in the language ?</tokentext>
<sentencetext>One of the staff at Purdue (back in the 80's) was a Fortran guru.
He had never seen a true fortran complier, i.e.
one that would pass all the standards tests.
I don't know if this is any different today, but I doubt there's a compiler that will compile any and all arbitrary Fortran programs.
Besides, just write your code in a real language (one with a real grammar) and link it to the time tested Fortran libraries.
What value is there in actually writing in the language?</sentencetext>
</comment>
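[Editor's note: the suggestion in the comment above, writing in another language and linking to the time-tested Fortran libraries, is essentially how scientific Python already works: scipy exposes LAPACK routines directly. A minimal sketch, assuming numpy and scipy are installed:]

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

# Solve A x = b with LAPACK's battle-tested dgetrf/dgetrs routines,
# reached through scipy's wrappers rather than hand-written Fortran.
A = np.array([[4.0, 3.0], [6.0, 3.0]])
b = np.array([10.0, 12.0])

lu, piv = lu_factor(A)        # LU factorization (LAPACK dgetrf)
x = lu_solve((lu, piv), b)    # triangular solves (LAPACK dgetrs)
```

[The numerics are the same ones a Fortran caller would get; only the calling language differs.]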
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296975</id>
	<title>Why is this being discussed here again?</title>
	<author>Anonymous</author>
	<datestamp>1244746500000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>There are no new facts to this whole thing. FORTRAN fans will come here and praise the advantages of using numerical routines developed over the last 40+ years. Lazy grad students and the like will say they have to use it because the old code they were handed is written in FORTRAN.</p><p>The truth is that even though FORTRAN is an efficient language with loads and loads of available numerical routines, it is old, and we should gradually change to something more modern. If only we weren't so lazy!</p><p>I, for one, am a graduate student in Physics, working in a group with several other students and post-docs. Even though our supervisors' legacy code was sure written in FORTRAN, we all write our own serious stuff in C or C++.  Most of what we do is Monte Carlo and Linear Algebra (BLAS, LAPACK, etc). We use MPI, OpenMP and the threaded version of Intel's MKL very heavily. I doubt FORTRAN programs would give us any performance gain over this setup, but it sure would require us to keep writing stone-age code, making sure to leave enough space on the right-hand side for the holes to be punched!</p><p>Give me a break! If you are a senior programmer, then you might be excused for sticking to your old programming language and codes. If you are new in this game and you don't want to take advantage of the best tool available, fine too! I just hope I won't have to work with you. Ever!</p></htmltext>
<tokenext>There are no new facts to this whole thing .
FORTRAN fans will come here and praise the advantages of using numerical routines developed over the last 40 + years .
Lazy grad students and alikes will say they have to use it because the old code they were handed is written in FORTRAN.The truth is that even though FORTRAN is an efficient language with loads and loads of available numerical routines , it is old , and we should gradually change to something more modern .
If only we were n't so lazy ! I for one , am a graduate student in Physics , working in a group with several other students and post-docs .
Even though our supervisors legacy code was sure written in FORTRAN , we all write our own serious stuff in C or C + + .
Most of what we do is Monte Carlo , and Linear Algebra ( BLAS , LAPACK , etc ) .
We use MPI , OpenMP and the threaded version of Intel 's MKL very heavily .
I doubt FORTRAN programs would give us any performance gain over this setup , but it sure would require us to keep writing stone-age code , making sure to leave enough space on the right-hand side for the holes to be punched ! Give me a break !
If you are a senior programer , than you might be excused for sticking to your old programming language and codes .
If you are new in this game , and you do n't want to take advantage of the best tool available , fine too !
I just hope I wo n't have to work with you .
Ever !</tokentext>
<sentencetext>There are no new facts to this whole thing.
FORTRAN fans will come here and praise the advantages of using numerical routines developed over the last 40+ years.
Lazy grad students and alikes will say they have to use it because the old code they were handed is written in FORTRAN.The truth is that even though FORTRAN is an efficient language with loads and loads of available numerical routines, it is old, and we should gradually change to something more modern.
If only we weren't so lazy!I for one, am a graduate student in Physics, working in a group with several other students and post-docs.
Even though our supervisors legacy code was sure written in FORTRAN, we all write our own serious stuff in C or C++.
Most of what we do is Monte Carlo, and Linear Algebra (BLAS, LAPACK, etc).
We use MPI, OpenMP and the threaded version of Intel's MKL very heavily.
I doubt FORTRAN programs would give us any performance gain over this setup, but it sure would require us to keep writing stone-age code, making sure to leave enough space on the right-hand side for the holes to be punched!Give me a break!
If you are a senior programer, than you might be excused for sticking to your old programming language and codes.
If you are new in this game, and you don't want to take advantage of the best tool available, fine too!
I just hope I won't have to work with you.
Ever!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292033</id>
	<title>Python is not programming.</title>
	<author>Weather</author>
	<datestamp>1244728980000</datestamp>
	<modclass>Informative</modclass>
	<modscore>2</modscore>
	<htmltext><p>Python is scripting.  FORTRAN is programming.  MPI is vastly better supported in FORTRAN than in any other language - grow MPI support for C++ or Objective-C, and then FORTRAN can go away.</p></htmltext>
<tokenext>Python is scripting .
FORTRAN is programming .
MPI is vastly more supported by FORTRAN than any other language - grow MPI support for C + + or Object C , and then FORTRAN can go away .</tokentext>
<sentencetext>Python is scripting.
FORTRAN is programming.
MPI is vastly more supported by FORTRAN than any other language - grow MPI support for C++ or Object C, and then FORTRAN can go away.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28300331</id>
	<title>Re:While there may be "newer" languages</title>
	<author>Anonymous</author>
	<datestamp>1244715000000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>You are right on almost all points.<br>Except about the GNU Compiler - I have recently done an extensive speed analysis of libraries and compilers (my planned calculations will take 100+ days, so 5% is kind of important). Our code is basically a huge, complex, non-sparse eigenvalue problem. A complete GNU toolchain (ATLAS -&gt; FLAME/LAPACK -&gt; our code) has about the same speed as hand-optimized assembly (Goto-BLAS) and ifort-compiled FLAME/LAPACK/our code on Athlons, and is only ~20% slower on Xeons. If I substitute Goto-BLAS for ATLAS, the difference is within the margin of error.</p><p>Give gfortran a try; it has come a long way and is really, really good by now.</p><p>Note for interested people: FLAME gave a performance boost of about 30%. Hats off to those guys (and to Goto for his SSE coding skills).</p></htmltext>
<tokenext>You are right on almos all points.Except about the GNU Compiler - I have recently done an extensive speed analysis of libraries and compilers ( my planned calculations will take 100 + days , so 5 \ % are kind of important ) .
Our Code is basically a huge , complex , non-sparse eigenvalue problem .
A complete GNU toolchain ( ATLAS - &gt; FLAME/LAPACK - &gt; Our Code ) has about the same speed as hand-optimizes assembly ( Goto-BLAS ) and ifort-compiled FLAME/LAPACK/our Code on Athlons and is only ~ 20 \ % slower on Xeons .
If I substituted the Goto-Blas for ATLAS , the difference is within the margin of error.Give gfortran a try , it has come a long way and is really , really good by now.Note for interested people : FLAME gave a performance boost of about 30 \ % .
Hats off to those guys ( and to Goto for his SSE coding skills ) .</tokentext>
<sentencetext>You are right on almos all points.Except about the GNU Compiler - I have recently done an extensive speed analysis of libraries and compilers (my planned calculations will take 100+ days, so 5\% are kind of important).
Our Code is basically a huge, complex, non-sparse eigenvalue problem.
A complete GNU toolchain (ATLAS -&gt; FLAME/LAPACK -&gt; Our Code) has about the same speed as hand-optimizes assembly (Goto-BLAS) and ifort-compiled FLAME/LAPACK/our Code on Athlons and is only ~20\% slower on Xeons.
If I substituted the Goto-Blas for ATLAS, the difference is within the margin of error.Give gfortran a try, it has come a long way and is really, really good by now.Note for interested people: FLAME gave a performance boost of about 30\%.
Hats off to those guys (and to Goto for his SSE coding skills).</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401</parent>
</comment>
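[Editor's note: a toy version of the dense symmetric eigenvalue problem benchmarked in the comment above can be run through the same LAPACK solver path from Python via scipy.linalg.eigh. A small sketch, assuming numpy and scipy are installed; which BLAS runs underneath (ATLAS, Goto-BLAS, or MKL) depends on how the libraries were built:]

```python
import numpy as np
from scipy.linalg import eigh  # symmetric/Hermitian eigensolver (LAPACK *syevd and friends)

# Build a small dense symmetric matrix; real workloads are just bigger.
rng = np.random.default_rng(42)
M = rng.standard_normal((100, 100))
A = (M + M.T) / 2.0            # symmetrize

w, V = eigh(A)                 # w: eigenvalues ascending; V: orthonormal eigenvectors

# Sanity-check the decomposition: A @ V == V @ diag(w)
assert np.allclose(A @ V, V * w)
```

[For the complex Hermitian case the same call applies unchanged; performance is set by the linked BLAS, not by the calling language.]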
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28301159</id>
	<title>Re:Not punched cards</title>
	<author>Nefarious Wheel</author>
	<datestamp>1244718000000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Yes, the 029 was definitely the way to go.  But you had to have your own programming drum for it, people were always nicking them.</htmltext>
<tokenext>Yes , the 029 was definitely the way to go .
But you had to have your own programming drum for it , people were always nicking them .</tokentext>
<sentencetext>Yes, the 029 was definitely the way to go.
But you had to have your own programming drum for it, people were always nicking them.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291953</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292797</id>
	<title>Definitely not!</title>
	<author>Market</author>
	<datestamp>1244731620000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Having taught CS (and non-CS) Undergraduates I have to say that you should teach Fortran...<i>or</i> Python.  They should be taught some data representation, basic algorithmic design and how that might be used to develop programs.  If you teach them a <i>language</i>, you're almost always starting from the wrong point.  At least, that's my experience.</p></htmltext>
<tokenext>Having taught CS ( and non-CS ) Undergraduates I have to say that you should teach Fortran...or Python .
They should be taught some data representation , basic algorithmic design and how that might be used to develop programs .
If you teach them a language , you 're almost always starting from the wrong point .
At least , that 's my experience .</tokentext>
<sentencetext>Having taught CS (and non-CS) Undergraduates I have to say that you should teach Fortran...or Python.
They should be taught some data representation, basic algorithmic design and how that might be used to develop programs.
If you teach them a language, you're almost always starting from the wrong point.
At least, that's my experience.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292809</id>
	<title>Wanted Fortran programmer with 5-10 years experien</title>
	<author>Anonymous</author>
	<datestamp>1244731680000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>
The only way Fortran should be taught to undergraduates is if they have a corporate sponsor that has agreed to hire some, if not most, of the graduates.  Why?  Because all the jobs I have seen posted for Fortran are to replace the guy who has been doing the job for the last 35 years, and HR thinks they need a replacement with 5-10 years of recent experience.  What is the point of learning Fortran if you cannot get a job using Fortran, and all the jobs that require Fortran tend to need multiple years of experience in Fortran?
</p><p>
This reminds me of when I applied for a job as a Java developer... long ago.  I did not even get an interview, so I phoned the HR department and asked why not.  They said that they needed someone with 10 years of Java experience.  At the time, the earliest published book on Java was only 7 years old.  I asked how the company was able to afford an inventor of Java.  The HR person seemed dumbfounded by my pointing out that if the person they hired claimed 10 years of Java experience, they were lying to them.
</p><p>
I have seen many stupid moves by bureaucratic HR personnel, and trying to learn Fortran in school and then find a job is backwards.  Either claim you are a Fortran expert and learn on the job, or join a company and, as a result of drawing the short straw, become the company's Fortran resource (learning on the company's dime).  I have found my programming skills translate over to any language - the underlying principles are basically the same, and only the syntax and structure change.  I have learned new languages in a week and was able to work efficiently on large teams with individuals with years of expert knowledge in the language.  Schools need to teach the underlying programming principles that can be adapted to any language - that way the company you work for can train you in their preferred language and expect that you will be somewhat skilled and productive.
</p></htmltext>
<tokenext>The only way Fortran be taught to undergraduates is if they have a corporate sponsor that have agreed to hirer some if not most of the graduates .
Why ? Because the all jobs I have seen posted for Fortran is to replace the guy that has been doing the job for the last 35 years and HR thinks they need a replacement with 5-10 years or recent experience .
What is the point to learning Fortran if you can not get a job using Fortran , and all the jobs that require Fortran tend to need mutli-years of experience in Fortran .
Reminds me when I applied for a job as a Java Developer... long ago .
I did not even get an interview so I phoned the HR department and ask why not .
They said that they needed someone with 10 years of Java experience .
At the time the earliest publish book on Java was only 7 years old .
I asked how the company was able to afford the an inventor of Java .
The HR seemed dumbfounded by my indications that if the person they hirer claimed 10 years of java experience they where lying to them .
I have seen many stupid moves by bureaucratic HR personnel and trying to learn Fortran in school then finding a job is backwards .
Either claim you are a Fortran expert and learn on the job or join a company and a result of drawing the short straw become the company 's Fortran resource ( learning on the company 's dime ) .
I have found my programming skill translate over to any language - the underlying principles are basically the same on only the syntax and structure changes .
I have learned new languages in a week and was able to efficiently work on large teams with individuals with years of expert knowledge in the language .
Schools need to teach the underlying programming principles that can be adapted to any language - that way the company you work for can training you in their preferred language and expect that you will be some what skilled and productive .</tokentext>
<sentencetext>
The only way Fortran should be taught to undergraduates is if they have a corporate sponsor that has agreed to hire some, if not most, of the graduates.
Why?  Because all the jobs I have seen posted for Fortran are to replace the guy who has been doing the job for the last 35 years, and HR thinks they need a replacement with 5-10 years of recent experience.
What is the point of learning Fortran if you cannot get a job using Fortran, when all the jobs that require Fortran tend to demand many years of experience in it?
Reminds me of when I applied for a job as a Java developer... long ago.
I did not even get an interview, so I phoned the HR department and asked why not.
They said that they needed someone with 10 years of Java experience.
At the time, the earliest published book on Java was only 7 years old.
I asked how the company was able to afford an inventor of Java.
HR seemed dumbfounded when I pointed out that if the person they hired claimed 10 years of Java experience, they were lying to them.
I have seen many stupid moves by bureaucratic HR personnel, and trying to learn Fortran in school and then finding a job is backwards.
Either claim you are a Fortran expert and learn on the job, or join a company and, having drawn the short straw, become the company's Fortran resource (learning on the company's dime).
I have found my programming skills translate over to any language - the underlying principles are basically the same; only the syntax and structure change.
I have learned new languages in a week and was able to work efficiently on large teams with individuals who had years of expert knowledge in the language.
Schools need to teach the underlying programming principles that can be adapted to any language - that way the company you work for can train you in its preferred language and expect that you will be somewhat skilled and productive.
</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292195</id>
	<title>Old Fortran Code</title>
	<author>Anonymous</author>
	<datestamp>1244729640000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>In my area of research, almost every program written has been in Fortran. Maybe Python would be better - I wouldn't know. But it is a lot easier to update a program written in an old language than to learn a new one and start from scratch. And the fact is, Fortran does work, and works quickly. Being 40 years old means people have spent that time optimizing the routines.</p></htmltext>
<tokenext>In my area of Research , almost every program written has been in Fortran .
Maybe Python would be better- i would n't know .
But it is a lot easier to update a program written in an old language than learn a new one and start from scratch .
And fact is Fortran does work and works quickly .
Being 40 years old means people have spent that time optimizing the routines .</tokentext>
<sentencetext>In my area of Research, almost every program written has been in Fortran.
Maybe Python would be better- i wouldn't know.
But it is a lot easier to update a program written in an old language than learn a new one and start from scratch.
And fact is Fortran does work and works quickly.
Being 40 years old means people have spent that time optimizing the routines.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292147</id>
	<title>Re:While there may be "newer" languages</title>
	<author>MathFox</author>
	<datestamp>1244729400000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext>First I wonder which Fortran you refer to; Fortran 66 is quite a different language from Fortran 95. I agree that all Fortran variants are pretty good languages for number crunching, but Fortran 77 and older lacked support for data structures, making it hard to teach students about them and about advanced algorithms in general. (Yes, I've tried.) Fortran 90 and 95 are much better in those respects. On the other hand, C and C++ are not so far behind in speed as to rule them out.
<p>
It is my opinion that learning two fundamentally different languages makes someone a better programmer. I see value in teaching both Fortran and (for example) Python, using Fortran for number crunching and Python for smarter algorithms.</p></htmltext>
<tokenext>First I wonder which Fortran you refer to ; Fortran 66 is quite a different language from Fortran 95 .
I agree that all Fortran variants are pretty good languages for number crunching , but Fortran 77 and older lacked support for data structures , making it hard to teach students about them and advanced algorithms in general .
( Yes , I 've tried .
) Fortran 90 and 95 are much better in those respects .
On the other hand : C and C + + are not so far behind in speed to rule them out .
It is my opinion that learning two fundamentally different languages makes someone a better programmer .
I see value in teaching both Fortran and ( for example ) Python , using Fortran for number crunching and Python for smarter algorithms .</tokentext>
<sentencetext>First I wonder which Fortran you refer to; Fortran 66 is quite a different language from Fortran 95.
I agree that all Fortran variants are pretty good languages for number crunching, but Fortran 77 and older lacked support for data structures, making it hard to teach students about them and advanced algorithms in general.
(Yes, I've tried.
) Fortran 90 and 95 are much better in those respects.
On the other hand: C and C++ are not so far behind in speed to rule them out.
It is my opinion that learning two fundamentally different languages makes someone a better programmer.
I see value in teaching both Fortran and (for example) Python, using Fortran for number crunching and Python for smarter algorithms.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296169</id>
	<title>This is quite backwards</title>
	<author>DragonTHC</author>
	<datestamp>1244743680000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Fortran is not the future.</p><p>My college has been pushing Perl as an engineering and scientific language, since 99.9\% of the work a scientist might have to do will be on Unix or Linux, whose administrators favor Perl over Fortran for such tasks.</p></htmltext>
<tokenext>Fortran is not the future.My college is pushing perl as an engineering and scientific language since 99.9 \ % of the work a scientist might have to do will be in Unix or Linux whose administrators favor perl over fortran for such tasks .</tokentext>
<sentencetext>Fortran is not the future.My college is pushing perl as an engineering and scientific language since 99.9\% of the work a scientist might have to do will be in Unix or Linux whose administrators favor perl over fortran for such tasks.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298161</id>
	<title>Not specifically</title>
	<author>Anonymous</author>
	<datestamp>1244750460000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>They should learn functional programming, for sure. Fortran specifically need only be taught to those who need extreme performance for scientific/mathematical calculations.</p></htmltext>
<tokenext>They should learn functional programming , for sure .
Fortran specifically need only be taught to those who need extreme performance for scientific/mathmatical calculations .</tokentext>
<sentencetext>They should learn functional programming, for sure.
Fortran specifically need only be taught to those who need extreme performance for scientific/mathmatical calculations.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292303</id>
	<title>Bindings</title>
	<author>Anonymous</author>
	<datestamp>1244730000000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>From what I understand, most scientific computing applications of Python involve using bindings to C/C++.  Efficient, fast code for crunching numbers is written in the lower-level language, and the program is organized at a higher level using Python.  This way the program is both easy to read and runs fairly quickly.</p></htmltext>
<tokenext>From what I understand , most of the scientific computing applications of python involves using bindings to C/C + + .
Efficient , fast code for crunching numbers is written in the lower-level language , and the program is organized at a higher level using Python .
This way the program is both easy to read and runs fairly quickly .</tokentext>
<sentencetext>From what I understand, most of the scientific computing applications of python involves using bindings to C/C++.
Efficient, fast code for crunching numbers is written in the lower-level language, and the program is organized at a higher level using Python.
This way the program is both easy to read and runs fairly quickly.</sentencetext>
</comment>
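The bindings pattern described in the comment above can be illustrated with Python's standard `ctypes` module, which loads a compiled C library and calls into it directly. A minimal sketch, assuming a Unix-like system where the C math library (libm) is installed:

```python
import ctypes
import ctypes.util

# Locate and load the C math library (libm on most Unix systems).
libm_name = ctypes.util.find_library("m") or "libm.so.6"
libm = ctypes.CDLL(libm_name)

# Declare the C signature double cos(double); without this, ctypes
# would assume int arguments and return values.
libm.cos.argtypes = [ctypes.c_double]
libm.cos.restype = ctypes.c_double

print(libm.cos(0.0))  # the compiled C routine does the arithmetic
```

The same mechanism scales up: the number-crunching kernels live in a compiled library, and Python supplies the high-level organization around them.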
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293253</id>
	<title>Python (and C, and Fortran)</title>
	<author>AaronParsons</author>
	<datestamp>1244733120000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>3</modscore>
	<htmltext>I'm a scientist who does the bulk of his programming in Python.  NumPy (the numerical package for Python) runs at only a 30\% overhead over C.  When that's not fast enough, I drop into C/C++ for bottlenecks and wrap that back into Python (using the Python C API more often than SWIG/Boost).  When there's a great Fortran library that's fast and battle-tested, I wrap that into Python using F2Py - and I don't even know that much Fortran.

<br> <br>
Just like it's good to know more than one spoken language, it's good to know more than one programming language.  It's a mistake to think one programming language fits all needs.  That said, it can also be helpful to know one really well, and others enough to convert them into your primary language.  For me, Python fills that role very adequately, and I would highly recommend it be a part (read: part) of the undergraduate programming curriculum.</htmltext>
<tokenext>I 'm a scientist who does the bulk of his programming in Python .
Numpy ( the numerical package for Python ) runs at only a 30 \ % overhead over C. When that 's not fast enough , I drop into C/C + + for bottlenecks and wrap that back into Python ( using the Python C API more often than swig/boost ) .
When there 's a great Fortran library that 's fast and battle tested , I wrap that into Python using F2Py--and I do n't even know that much Fortran .
Just like it 's good to know more than one spoken language , it 's good to know more than one programming language .
It 's a mistake to think one programming language fits all needs .
That said , it can also be helpful to know one really well , and others enough to convert them into your primary language .
For me , Python fills that role very adequately , and I would highly recommend it be a part ( read : part ) of the undergraduate programming curriculum .</tokentext>
<sentencetext>I'm a scientist who does the bulk of his programming in Python.
Numpy (the numerical package for Python) runs at only a 30\% overhead over C.  When that's not fast enough, I drop into C/C++ for bottlenecks and wrap that back into Python (using the Python C API more often than swig/boost).
When there's a great Fortran library that's fast and battle tested, I wrap that into Python using F2Py--and I don't even know that much Fortran.
Just like it's good to know more than one spoken language, it's good to know more than one programming language.
It's a mistake to think one programming language fits all needs.
That said, it can also be helpful to know one really well, and others enough to convert them into your primary language.
For me, Python fills that role very adequately, and I would highly recommend it be a part (read: part) of the undergraduate programming curriculum.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292353</id>
	<title>It depends</title>
	<author>Anonymous</author>
	<datestamp>1244730120000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I'm a post-doctoral researcher who never had any Fortran training in my undergraduate studies, and I've spent this week trying to learn Fortran, as there are several pieces of code I need that are written in it. Unless people start porting old code/models to a new language, it's necessary to learn it.</p></htmltext>
<tokenext>I 'm a post-doctoral researcher and never had any Fortran training in my undergraduate studies and I 've spent this week trying to learn Fortran as there are several pieces of code I need that are written in it .
Unless people start porting old code/models to a new language then it 's necessary to learn it .</tokentext>
<sentencetext>I'm a post-doctoral researcher and never had any Fortran training in my undergraduate studies and I've spent this week trying to learn Fortran as there are several pieces of code I need that are written in it.
Unless people start porting old code/models to a new language then it's necessary to learn it.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293229</id>
	<title>Fortran is analogous to hazing</title>
	<author>93 Escort Wagon</author>
	<datestamp>1244733060000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>3</modscore>
	<htmltext><p>The Elders feel that if they had to go through it, so do the young'uns, gol durn it!</p><p>Seriously, though - as far as I know, Fortran has always been the language of those humongous numerical models because of its optimizations with regard to array handling. I think it makes perfect sense as a first (or second) language for science majors. However, I imagine the person asking this question is likely one of the young'uns being forced to learn it, and that person doesn't really have the perspective as to *why* this is so. After all, he's been hacking around in C and Python for years - they're in his comfort zone and have been good enough for the sorts of things he's been dealing with.</p></htmltext>
<tokenext>The Elders feel that if they had to go through it , so do the young'uns gol durn it ! Seriously , though - as far as I know , Fortran has always been the language of those humonguous numerical models because of its optimizations with regard to array handling .
I think it makes perfect sense as a first ( or second ) language for science majors .
However I imagine the person asking this question is likely one of the young'uns being forced to learn it ; and that person does n't really have the perspective as to * why * this is so .
After all , he 's been hacking around in C and Python for years - they 're in his comfort zone and have been good enough for the sorts of things he 's been dealing with .</tokentext>
<sentencetext>The Elders feel that if they had to go through it, so do the young'uns gol durn it!Seriously, though - as far as I know, Fortran has always been the language of those humonguous numerical models because of its optimizations with regard to array handling.
I think it makes perfect sense as a first (or second) language for science majors.
However I imagine the person asking this question is likely one of the young'uns being forced to learn it; and that person doesn't really have the perspective as to *why* this is so.
After all, he's been hacking around in C and Python for years - they're in his comfort zone and have been good enough for the sorts of things he's been dealing with.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292105</id>
	<title>Still going to be around for a while</title>
	<author>gustgr</author>
	<datestamp>1244729220000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>In my opinion, yes. I am an undergrad Physics student (senior) and had my first contact with Fortran in my third semester, in a course called Computational Physics I. We learned the basics of Fortran 77/90 and how to solve some numerical problems using it. We also simulated some interesting problems that amaze undergrad students, such as chaotic oscillators, the Magnus effect in action, and a few other simple yet curious systems. I already had some programming experience, but most other students didn't. They got it quite quickly, and I think this is due to Fortran's simplicity.</p><p>Even if you are never going to use Fortran in your own projects, you will stumble on it now and then if you are going seriously into the applied and theoretical research fields. NASA, for example, has tons of production code written in Fortran, and even new codes are written in it. Many, many Physics and Chemistry groups around the world have their most important codes in Fortran, and sometimes they use clever hacks to make the code faster, so a minimum understanding of it is necessary. I work with a Computational Chemistry group, and much of the code they still develop, even for new applications, is Fortran. It is good and solid code, they are very experienced with it, and they are not willing to change to another technology so easily.</p><p>As a first language I don't know if Fortran is the best - maybe Python or Java would be my choice in this case - but it is definitely worth learning.</p></htmltext>
<tokenext>In my opinion , yes .
I am an undergrad Physics student ( senior ) and had my first contact with Fortran in my third semester , in a course called Computational Physics I. We learned the basics of Fortran 77/90 and how to solve some numerical problems using it .
We also simulated some interesting problems that amazes undergrad students such as chaotic oscillators , Magnus effect in action and a few other simple yet curious systems .
I had already some programming experience , but most other students did n't .
They got it quite quickly and I think this is due Fortran 's simplicity.Even if you are never going to use Fortran in your own projects , you will stumble on it now and then if you are going seriously into applied and theoretical research field .
NASA , for example , has tons of production code written in Fortran and even new codes are written on it .
Many many Physics and Chemistry groups around the world have their most important codes in Fortran , and sometimes they use clever hacks to make the code faster , so a minimum understanding of it is necessary .
I work with a Computational Chemistry group and much of the code they still develop , even for new applications , is Fortran .
It is good and solid code , they are very experienced on it , and they are not willing to change to another technology so easily.As a first language I do n't know if Fortran is the best , maybe Python or Java would be my choice in this case , but it is definitely worth learning .</tokentext>
<sentencetext>In my opinion, yes.
I am an undergrad Physics student (senior) and had my first contact with Fortran in my third semester, in a course called Computational Physics I. We learned the basics of Fortran 77/90 and how to solve some numerical problems using it.
We also simulated some interesting problems that amazes undergrad students such as chaotic oscillators, Magnus effect in action and a few other simple yet curious systems.
I had already some programming experience, but most other students didn't.
They got it quite quickly and I think this is due Fortran's simplicity.Even if you are never going to use Fortran in your own projects, you will stumble on it now and then if you are going seriously into applied and theoretical research field.
NASA, for example, has tons of production code written in Fortran and even new codes are written on it.
Many many Physics and Chemistry groups around the world have their most important codes in Fortran, and sometimes they use clever hacks to make the code faster, so a minimum understanding of it is necessary.
I work with a Computational Chemistry group and much of the code they still develop, even for new applications, is Fortran.
It is good and solid code, they are very experienced on it, and they are not willing to change to another technology so easily.As a first language I don't know if Fortran is the best, maybe Python or Java would be my choice in this case, but it is definitely worth learning.</sentencetext>
</comment>
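The kind of toy system such a "Computational Physics" course assigns fits in a few lines of any language. A hedged sketch in Python (standard library only; the parameter values are illustrative) of a damped harmonic oscillator integrated with the semi-implicit Euler method:

```python
def damped_oscillator(x0=1.0, v0=0.0, gamma=0.1, omega=2.0, dt=1e-4, t_end=10.0):
    """Integrate x'' = -omega**2 * x - 2*gamma*x' by semi-implicit Euler."""
    x, v = x0, v0
    for _ in range(int(t_end / dt)):
        v += (-omega**2 * x - 2.0 * gamma * v) * dt  # update velocity first...
        x += v * dt                                  # ...then position, using the new velocity
    return x

# After t_end = 10 the amplitude envelope has decayed roughly like exp(-gamma*t),
# so the displacement should be well inside the initial amplitude of 1.0.
print(damped_oscillator())
```

Whether this is written in Fortran, Python, or anything else, the numerical content of the exercise is identical - which is rather the point of the thread.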
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292223</id>
	<title>IMHO</title>
	<author>kenp2002</author>
	<datestamp>1244729700000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>IMHO I would have to say yes. With the rise of multi-core systems, I found that FORTRAN instills some programming habits that would carry over into multi-threaded and multi-core programming. While I don't think it has high relevance for finding employment, I do think that FORTRAN, along with at least a 3-credit-hour course in assembly, is crucial for fundamental programming skills.</p><p>I think we can drop PASCAL (if they haven't already) and swap it with either Ruby or PERL for undergraduates.</p></htmltext>
<tokenext>IMHO I would have to say yes .
With the rise of multi-core systems I found that FORTRAN has some programming habits that would carry into multi-threaded and multi-core programming .
While I do n't think it has a high relevancy for finding employment I do think that FORTRAN , along with at least a 3 credit hour course in assembly is crucial for fundamental programming skills.I think we can drop PASCAL ( If they have n't already ) and swap it with either Ruby or PERL for undergraduates .</tokentext>
<sentencetext>IMHO I would have to say yes.
With the rise of multi-core systems I found that FORTRAN has some programming habits that would carry into multi-threaded and multi-core programming.
While I don't think it has a high relevancy for finding employment I do think that FORTRAN, along with at least a 3 credit hour course in assembly is crucial for fundamental programming skills.I think we can drop PASCAL (If they haven't already) and swap it with either Ruby or PERL for undergraduates.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293791</id>
	<title>Fortran is also REALLY simple</title>
	<author>alexhmit01</author>
	<datestamp>1244735160000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>For a Freshman seminar (now 12 years ago), we used Fortran 77 because that was the last version that GNU had (at the time) a compiler for.  We were doing really simple modeling, and the limitations of a then 20-year-old language weren't a problem; we weren't building UIs, just crunching numbers.  Fortran 90 cleaned up most of the syntax and made it as friendly as Pascal, which is probably the cleanest teaching language for its simplicity.  Later versions supposedly added object orientation and other modern niceties.</p><p>Wrapping Fortran in Python seems simple enough; the languages are all fundamentally the same.  If you leave your logic in Fortran for the sciences, where you have 40 years of libraries, you can certainly use Python to build a simple enough UI.  But why NOT learn Fortran?  It's damned simple, works, and teaches the basics.  All the modern syntactic sugar pulls away from the basics of programming.</p></htmltext>
<tokenext>For a Freshman seminar ( now 12 years ago ) , we used Fortran 77 because that was the last version that GNU had ( at the time ) a compiler for .
We were doing really simple modeling , and the limitations of a then 20 year old language were n't a problem , we were n't building UIs , just crunching numbers .
Fortran 90 cleaned up most of the Syntax and made it as friendly as Pascal , which is probably the cleanest teaching language for it 's simplicity .
Later version supposedly added object oriented and other modern niceties.Wrapping Fortran in Python seems simple enough , the languages are all fundamentally the same .
But if you leave your logic in Fortran for sciences , where you have 40 years of libraries , you can certainly use Python to build a simple enough UI , but why NOT learn Fortran , it 's damned simple , works , and teaches the basics .
All the modern syntactic sugar pulls away from the basics of programming .</tokentext>
<sentencetext>For a Freshman seminar (now 12 years ago), we used Fortran 77 because that was the last version that GNU had (at the time) a compiler for.
We were doing really simple modeling, and the limitations of a then 20 year old language weren't a problem, we weren't building UIs, just crunching numbers.
Fortran 90 cleaned up most of the Syntax and made it as friendly as Pascal, which is probably the cleanest teaching language for it's simplicity.
Later version supposedly added object oriented and other modern niceties.Wrapping Fortran in Python seems simple enough, the languages are all fundamentally the same.
But if you leave your logic in Fortran for sciences, where you have 40 years of libraries, you can certainly use Python to build a simple enough UI, but why NOT learn Fortran, it's damned simple, works, and teaches the basics.
All the modern syntactic sugar pulls away from the basics of programming.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292331</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293475</id>
	<title>Re:While there may be "newer" languages</title>
	<author>Anonymous</author>
	<datestamp>1244733900000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext><p>Quad precision floating point.</p><p>It doesn't look (after a short Google session) like Python has this. It's still used in nuclear physics for this reason.</p></htmltext>
<tokenext>Quad precision floating point.Does n't look ( after a short google session ) that python has this .
It 's used in nuclear physics still for this reason .</tokentext>
<sentencetext>Quad precision floating point.Doesn't look (after a short google session) that python has this.
It's used in nuclear physics still for this reason.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025</parent>
</comment>
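Stock CPython floats are indeed C doubles (64-bit), with no built-in binary128 type. As a hedged illustration, the standard-library `decimal` module provides software arithmetic with configurable precision - much slower than hardware floats and not IEEE quad, but it shows the digit-count difference the comment is pointing at:

```python
from decimal import Decimal, getcontext

# A 64-bit double keeps ~15-16 significant decimal digits, so this tiny term is lost:
print(1.0 + 1e-17 == 1.0)             # True

# decimal keeps as many digits as the context allows (set here near quad's ~34 digits):
getcontext().prec = 34
print(Decimal(1) + Decimal("1e-17"))  # the small term survives
```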
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296129</id>
	<title>Re:Not punched cards</title>
	<author>Helen O'Boyle</author>
	<datestamp>1244743500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I hear ya on the IBM keypunch.</p><p>I lasted on that for about 7 days of a 4.5 week summer class before realizing that the 2-3 really smart kids in the class were never in the keypunch room and set about solving that mystery.</p><p>It turns out that the smart kids were hunched over these odd, pale blue, rounded wedge-shaped TV thingies whose screens glowed blue (ADM3A's, for the uninitiated).  I eventually learned that if one was willing to sign an oath of non-annoyance(1), one could get access to the HP 3000 system attached to those terminals, which could do RJE (remote job entry) to the IBM mainframe used to run our class' programs.</p><p>I signed the oath.  I wrote my user ID and password in the inside front of my PL/1 book (yes, I still have it) back in the days where it was nearly unthinkable that anyone would try to log in to an account that wasn't theirs, because it just "wasn't done".  I learned HP edit.  And my life got much easier in terms of the class, and much more complicated in terms of life, because I became one of those rare geek girls who hung out in the computer room for fun.</p><p>And yes, FORTRAN (totally all upper case) is one of the languages I eventually professionally coded in (scientific analysis for a chem prof, using a vector processing unit), along with C, a half dozen assembly languages, SAS, PL/1, COBOL (yes, I really did it for 6 months), etc.</p><p>(1) The oath of non-annoyance (just my name for it) was a promise that you knew where the manuals were and would not bother the other computer room users with "how do I?" questions, in exchange for the privilege of being given an HP account.  At least at my school, in those early days, you could either cut it on your own, or you couldn't cut it, and not much slack was given until you'd proven you had enough talent and work ethic to make it through the program.</p></htmltext>
<tokenext>I hear ya on the IBM keypunch.I lasted on that for about 7 days of a 4.5 week summer class before realizing that the 2-3 really smart kids in the class were never in the keypunch room and set about solving that mystery.It turns out that the smart kids were hunched over these odd , pale blue , rounded wedge-shaped TV thingies whose screens glowed blue ( ADM3A 's , for the uninitiated ) .
I eventually learned that if one was willing to sign an oath of non-annoyance ( 1 ) , one could get access to the HP 3000 system attached to those terminals , which could do RJE ( remote job entry ) to the IBM mainframe used to run our class ' programs.I signed the oath .
I wrote my user ID and password in the inside front of my PL/1 book ( yes , I still have it ) back in the days where it was nearly unthinkable that anyone would try to log in to an account that was n't theirs , because it just " was n't done " .
I learned HP edit .
And my life got much easier in terms of the class , and much more complicated in terms of life , because I became one of those rare geek girls who hung out in the computer room for fun.And yes , FORTRAN ( totally all upper case ) is one of the languages I eventually professionally coded in ( scientific analysis for a chem prof , using a vector processing unit ) , along with C , a half dozen assembly languages , SAS , PL/1 , COBOL ( yes , I really did it for 6 months ) , etc .
( 1 ) The oath of non-annoyance ( just my name for it ) was a promise that you knew where the manuals were and would not bother the other computer room users with " how do I ?
" questions , in exchange for the privilege of being given an HP account .
At least at my school , in those early days , you could either cut it on your own , or you could n't cut it , and not much slack was given until you 'd proven you had enough talent and work ethic to make it through the program .</tokentext>
<sentencetext>I hear ya on the IBM keypunch.I lasted on that for about 7 days of a 4.5 week summer class before realizing that the 2-3 really smart kids in the class were never in the keypunch room and set about solving that mystery.It turns out that the smart kids were hunched over these odd, pale blue, rounded wedge-shaped TV thingies whose screens glowed blue (ADM3A's, for the uninitiated).
I eventually learned that if one was willing to sign an oath of non-annoyance(1), one could get access to the HP 3000 system attached to those terminals, which could do RJE (remote job entry) to the IBM mainframe used to run our class' programs.I signed the oath.
I wrote my user ID and password in the inside front of my PL/1 book (yes, I still have it) back in the days where it was nearly unthinkable that anyone would try to log in to an account that wasn't theirs, because it just "wasn't done".
I learned HP edit.
And my life got much easier in terms of the class, and much more complicated in terms of life, because I became one of those rare geek girls who hung out in the computer room for fun.And yes, FORTRAN (totally all upper case) is one of the languages I eventually professionally coded in (scientific analysis for a chem prof, using a vector processing unit), along with C, a half dozen assembly languages, SAS, PL/1, COBOL (yes, I really did it for 6 months), etc.
(1) The oath of non-annoyance (just my name for it) was a promise that you knew where the manuals were and would not bother the other computer room users with "how do I?
" questions, in exchange for the privilege of being given an HP account.
At least at my school, in those early days, you could either cut it on your own, or you couldn't cut it, and not much slack was given until you'd proven you had enough talent and work ethic to make it through the program.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291953</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292789</id>
	<title>Fortran vs. Python for parallel computation</title>
	<author>SpaFF</author>
	<datestamp>1244731620000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Python may be great for serial processing, but when it comes to massively parallel computing, Fortran is still king.</p><p>Maybe when <a href="http://sourceforge.net/projects/pympi/" title="sourceforge.net">MPI-aware Python</a> [sourceforge.net] gets out of the alpha stage you can dump Fortran.  Until then I don't think you're going to see much Python running on the world's supercomputers.</p></htmltext>
<tokenext>Python may be great for serial processing , but when it comes to massively parallel computing , fortran is still king.Maybe when MPI-aware Python [ sourceforge.net ] gets out of Alpha stage you can dump fortran .
Until then I do n't think you 're going to see much python running on the world 's supercomputers .</tokentext>
<sentencetext>Python may be great for serial processing, but when it comes to massively parallel computing, fortran is still king.Maybe when MPI-aware Python [sourceforge.net] gets out of Alpha stage you can dump fortran.
Until then I don't think you're going to see much python running on the world's supercomputers.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292683</id>
	<title>Re:Python is not programming.</title>
	<author>Vanders</author>
	<datestamp>1244731260000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>MPI is vastly more supported by FORTRAN than any other language - grow MPI support for C++ or Object C</p></div></blockquote><p>
MPI for C is pretty well supported. It's not that complex. What sort of support would "MPI for C++" look like other than the C interface?</p>
	</htmltext>
<tokenext>MPI is vastly more supported by FORTRAN than any other language - grow MPI support for C + + or Object C MPI for C is pretty well supported .
It 's not that complex .
What sort of support would " MPI for C + + " look like other than the C interface ?</tokentext>
<sentencetext>MPI is vastly more supported by FORTRAN than any other language - grow MPI support for C++ or Object C
MPI for C is pretty well supported.
It's not that complex.
What sort of support would "MPI for C++" look like other than the C interface?
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292033</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292093</id>
	<title>I had to learn it...</title>
	<author>digitalhermit</author>
	<datestamp>1244729160000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>so, damnit, everyone else better go through the same thing.</p><p>Seriously though... though I learned it as part of my CS coursework, I used it more often in the math minor portion. Lots of the examples were in Fortran, and I remember having to write Newton-Raphson and trapezoid functions. It served the purpose of showing how numerical methods worked.</p><p>I'm not certain how useful it would be in a current CS course. Today you'd just link a library and make a call. I'm not saying that's in any way worse than how I learned it, but it may not be as useful as a learning tool.</p></htmltext>
<tokenext>so , damnit , everyone else better go through the same thing.Seriously though... though I learned it as part of my CS coursework , I used it more often in the math minor portion .
Lots of the examples were in Fortran , and I remember having to write Newton-Raphson and trapezoid functions .
It served the purpose of showing how numerical methods worked.I 'm not certain how useful it would be in a current CS course .
Today you 'd just link a library and make a call .
I 'm not saying that 's in any way worse than how I learned it , but may not be as useful as a learning tool .</tokentext>
<sentencetext>so, damnit, everyone else better go through the same thing.Seriously though... though I learned it as part of my CS coursework, I used it more often in the math minor portion.
Lots of the examples were in Fortran, and I remember having to write Newton-Raphson and trapezoid functions.
It served the purpose of showing how numerical methods worked.I'm not certain how useful it would be in a current CS course.
Today you'd just link a library and make a call.
I'm not saying that's in any way worse than how I learned it, but may not be as useful as a learning tool.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292501</id>
	<title>Re:libraries. gigabytes of libraries</title>
	<author>T Murphy</author>
	<datestamp>1244730660000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>There's always Maple.</htmltext>
<tokenext>There 's always Maple .</tokentext>
<sentencetext>There's always Maple.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293041</id>
	<title>Re:PYTHON????</title>
	<author>slim</author>
	<datestamp>1244732460000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext><p>You're advocating premature optimisation.</p><p>Now, I'm speaking from a position of ignorance about Fortran - but I'm guessing if it were as expressive as a modern scripting language (Python, Ruby, Groovy etc.) then it would be more generally popular.</p><p>The new scripting languages are *so* conducive to exploratory programming, it seems to me a no-brainer that undergrads would benefit from learning one. When speed becomes an issue, optimise whichever 1% of the routines are taking up the time.</p></htmltext>
<tokenext>You 're advocating premature optimisation.Now , I 'm speaking from a position of ignorance about Fortran - but I 'm guessing if it were as expressive as a modern scripting language ( Python , Ruby , Groovy etc .
) then it would be more generally popular.The new scripting languages are * so * conducive to exploratory programming , it seems to me a no-brainer that undergrads would benefit from learning one .
When speed becomes an issue , optimise whichever 1 % of the routines are taking up the time .</tokentext>
<sentencetext>You're advocating premature optimisation.Now, I'm speaking from a position of ignorance about Fortran - but I'm guessing if it were as expressive as a modern scripting language (Python, Ruby, Groovy etc.
) then it would be more generally popular.The new scripting languages are *so* conducive to exploratory programming, it seems to me a no-brainer that undergrads would benefit from learning one.
When speed becomes an issue, optimise whichever 1% of the routines are taking up the time.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296753</id>
	<title>Fortran is old?</title>
	<author>Anonymous</author>
	<datestamp>1244745780000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>You damned kids, get off my lawn!</p></htmltext>
<tokenext>You damned kids , get off my lawn !</tokentext>
<sentencetext>You damned kids, get off my lawn!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291993</id>
	<title>different languages for different purposes</title>
	<author>rotor</author>
	<datestamp>1244728800000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext><p>There's no problem for teaching Fortran if it's the right tool for the job.  It was 13 years ago that I took Fortran in College.  It went great with physics and modeling courses.  These days I write web-based database apps in Java/Perl/whatever language-du-jour is required of me, but I wouldn't want to use many of these languages for scientific purposes.  I'll leave that to Fortran and C.</p></htmltext>
<tokenext>There 's no problem for teaching Fortran if it 's the right tool for the job .
It was 13 years ago that I took Fortran in College .
It went great with physics and modeling courses .
These days I write web-based database apps in Java/Perl/whatever language-du-jour is required of me , but I would n't want to use many of these languages for scientific purposes .
I 'll leave that to Fortran and C .</tokentext>
<sentencetext>There's no problem for teaching Fortran if it's the right tool for the job.
It was 13 years ago that I took Fortran in College.
It went great with physics and modeling courses.
These days I write web-based database apps in Java/Perl/whatever language-du-jour is required of me, but I wouldn't want to use many of these languages for scientific purposes.
I'll leave that to Fortran and C.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292771</id>
	<title>keep it</title>
	<author>PalmKiller</author>
	<datestamp>1244731500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>FORTRAN is much better at very large and very small number calculations, and it seems to keep the significant digits and all in play, and is able to deliver output in a nice form without much work, so it's best they keep on teaching it.  I know we would all like to see students learn python as a requirement, but if they have to replace one to do it, this is certainly not the one to replace.</htmltext>
<tokenext>FORTRAN is much better at very large and very small number calculations , and it seems to keep the significant digits and all in play , and is able to deliver output in a nice form without much work , its best they keep on teaching it .
I know we would all like to see students learn python as a requirement , but if they have to replace one to do it , this is certainly not the one to replace .</tokentext>
<sentencetext>FORTRAN is much better at very large and very small number calculations, and it seems to keep the significant digits and all in play, and is able to deliver output in a nice form without much work, its best they keep on teaching it.
I know we would all like to see students learn python as a requirement, but if they have to replace one to do it, this is certainly not the one to replace.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28311397</id>
	<title>Alas, Yes.</title>
	<author>sjames</author>
	<datestamp>1244834040000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>There is a great deal of scientific code already in Fortran. I wouldn't be surprised to see the same code, with or without revisions, STILL being used 20 years from now. Some of the code being used now refers to the input file as a deck because, when it was written, the input and output happened on punched cards - if that gives you any idea of its longevity.</p><p>Unfortunately, much of that code accomplishes its tasks in spite of the source being a horrible jumble of spaghetti without comments.</p><p>Really, Fortran AND python would be a good choice. Fortran for long-running simulations that need to be as efficient as possible to complete in a reasonable time, and python for less performance-sensitive cases.</p></htmltext>
<tokenext>There is a great deal of scientific code already in Fortran .
I would n't be surprised to see the same code with or without revisions STILL being used 20 years from now .
Some of the code being used now refers to the input file as a deck because when it was written , the input and output happened on punched cards if that gives you any idea of its longevity.Unfortunately , much of that code accomplishes its tasks in spite of the source being a horrible jumble of spaghetti without comments.Really Fortran AND python would be a good choice .
Fortran for long running simulations that need to be as efficient as possible to complete in a reasonable time and python for less performance sensitive cases .</tokentext>
<sentencetext>There is a great deal of scientific code already in Fortran.
I wouldn't be surprised to see the same code with or without revisions STILL being used 20 years from now.
Some of the code being used now refers to the input file as a deck because when it was written, the input and output happened on punched cards if that gives you any idea of its longevity.Unfortunately, much of that code accomplishes its tasks in spite of the source being a horrible jumble of spaghetti without comments.Really Fortran AND python would be a good choice.
Fortran for long running simulations that need to be as efficient as possible to complete in a reasonable time and python for less performance sensitive cases.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293191</id>
	<title>No Python... Yes PHP</title>
	<author>Anonymous</author>
	<datestamp>1244732940000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>They should teach PHP because it is the strongest programming language right now.</p></htmltext>
<tokenext>They should teach PHP because it is the strongest programming language right now .</tokentext>
<sentencetext>They should teach PHP because it is the strongest programming language right now.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296803</id>
	<title>Re:University != Trade school</title>
	<author>westlake</author>
	<datestamp>1244745900000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><i>IMO universities should be teaching core principles and methods, not attempting to impart up-to-date job skills.</i> </p><p>But whose core principles and methods do you teach?</p><p> There are other majors besides CS.</p><p> <i><br>If you are going to teach FORTRAN because it's of use in the real world, then why stop there? Why not also (god forbid) teach<nobr> <wbr></nobr>.NET, JavaScript, C#, etc. May as well teach them Excel macros and how to interact with Microsoft Clippy while you're at it.</i> </p><p>The modern university is quite practical-minded. That decision was made when the classic Latin and Greek curricula were overthrown.</p><p>Mastering the Excel spreadsheet has become - in many settings - as essential a skill as a basic command of the English language.</p></htmltext>
<tokenext>IMO universities should be teaching core principles and methods , not attempting to impart up-to-date job skills .
But whose core principles and methods do you teach ?
There are other majors besides CS .
If you are going to teach FORTRAN because it 's of use in the real world , then why stop there ?
Why not also ( god forbid ) teach .NET .
JavaScript , C # , etc .
May as well teach them Excel macros and how to interact with Microsoft Clippy while you 're at it .
The modern university is quite practical-minded .
That decision was made when the classic Latin and Greek curricula was overthrown.Mastering the Excel spreadsheet has become - in many settings - as essential a skill as a basic command of the English language .
 </tokentext>
<sentencetext>IMO universities should be teaching core principles and methods, not attempting to impart up-to-date job skills.
But whose core principles and methods do you teach?
There are other majors besides CS.
If you are going to teach FORTRAN because it's of use in the real world, then why stop there?
Why not also (god forbid) teach .NET.
JavaScript, C#, etc.
May as well teach them Excel macros and how to interact with Microsoft Clippy while you're at it.
The modern university is quite practical-minded.
That decision was made when the classic Latin and Greek curricula was overthrown.Mastering the Excel spreadsheet has become - in many settings - as essential a skill as a basic command of the English language.
 </sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291849</id>
	<title>Oh come on.</title>
	<author>geminidomino</author>
	<datestamp>1244728200000</datestamp>
	<modclass>Funny</modclass>
	<modscore>5</modscore>
	<htmltext><p>--No one-- should be taught FORTRAN. Ever...</p><p>*sobs in fetal position*</p></htmltext>
<tokenext>--No one-- should be taught FORTRAN .
Ever... * sobs in fetal position *</tokentext>
<sentencetext>--No one-- should be taught FORTRAN.
Ever...*sobs in fetal position*</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28304337</id>
	<title>Actually, it's about 52 years old.</title>
	<author>G_of_the_J</author>
	<datestamp>1244740680000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>But it's still really efficient at matrix math, so for some science/math applications, it's a great choice.</htmltext>
<tokenext>But it 's still really efficient at matrix math , so for some science/math applications , it 's a great choice .</tokentext>
<sentencetext>But it's still really efficient at matrix math, so for some science/math applications, it's a great choice.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297941</id>
	<title>Re:Sillyness</title>
	<author>Anonymous</author>
	<datestamp>1244749800000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Numerical computation is one job that a chemist or physicist may need to do. They also may need to collect data from a number of test instruments. There are libraries for VB, Python, Labview,... that can do the job a lot better. Undergrad science students should be aware of these tools.</p></htmltext>
<tokenext>Numerical computation is one job that a chemist or physicist may need to do .
They also may need to collect data from a number of test instruments .
There are libraries of VB , Python , Labview,... that can do the job a lot better .
Undergrad science students should be aware of these tools .</tokentext>
<sentencetext>Numerical computation is one job that a chemist or physicist may need to do.
They also may need to collect data from a number of test instruments.
There are libraries of VB, Python, Labview,... that can do the job a lot better.
Undergrad science students should be aware of these tools.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292565</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295301</id>
	<title>from the teacher side...</title>
	<author>Anonymous</author>
	<datestamp>1244740560000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I recently graduated with a degree in aerospace engineering and, during my time in college, I was a TA teaching Fortran to freshmen and sophomore (primarily) engineering students.  During that time I realized that the objective of teaching Fortran to these students was not necessarily to have them learn the language, but to learn how to think in a procedural manner.  Say what you will, but engineering and the scientific process is full of procedures.  These kids come in to college not knowing how to analyze, formulate, or even how to follow procedures and learning programming, especially in a language such as Fortran where you don't have a choice, is one large step in that direction.  Therefore, when I would teach, I would focus on the program design process rather than the syntax because, as we all know, anybody can learn a syntax.</p><p>I realize that the previous argument was more programming-general rather than Fortran-specific, so here's my Fortran plug.  It all comes down to this: Almost Every Major Company in the Aerospace Field Still Uses Fortran!  Many of my fellow TA's moved on to become employees at NASA, Lockheed, Boeing, Harris, and others based primarily on the fact that they know Fortran.  I should also mention that I have a friend in a graduate physics program who uses Fortran for his modeling because of the extreme computer-intensive computations it requires.  It's still out there.  It's still strong.  It's still important.</p></htmltext>
<tokenext>I recently graduated with a degree in aerospace engineering and , during my time in college , I was a TA teaching Fortran to freshmen and sophomore ( primarily ) engineering students .
During that time I realized that the objective of teaching Fortran to these students was not necessarily to have them learn the language , but to learn how to think in a procedural manner .
Say what you will , but engineering and the scientific process is full of procedures .
These kids come in to college not knowing how to analyze , formulate , or even how to follow procedures and learning programming , especially in a language such as Fortran where you do n't have a choice , is one large step in that direction .
Therefore , when I would teach , I would focus on the program design process rather than the syntax because , as we all know , anybody can learn a syntax.I realize that the previous argument was more programming-general rather than Fortran-specific , so here 's my Fortran plug .
It all comes down to this : Almost Every Major Company in the Aerospace Field Still Uses Fortran !
Many of my fellow TA 's moved on to become employees at NASA , Lockheed , Boeing , Harris , and others based primarily on the fact that they know Fortran .
I should also mention that I have a friend in a graduate physics program who uses Fortran for his modeling because of the extreme computer-intensive computations it requires .
It 's still out there .
It 's still strong .
It 's still important .</tokentext>
<sentencetext>I recently graduated with a degree in aerospace engineering and, during my time in college, I was a TA teaching Fortran to freshmen and sophomore (primarily) engineering students.
During that time I realized that the objective of teaching Fortran to these students was not necessarily to have them learn the language, but to learn how to think in a procedural manner.
Say what you will, but engineering and the scientific process is full of procedures.
These kids come in to college not knowing how to analyze, formulate, or even how to follow procedures and learning programming, especially in a language such as Fortran where you don't have a choice, is one large step in that direction.
Therefore, when I would teach, I would focus on the program design process rather than the syntax because, as we all know, anybody can learn a syntax.I realize that the previous argument was more programming-general rather than Fortran-specific, so here's my Fortran plug.
It all comes down to this: Almost Every Major Company in the Aerospace Field Still Uses Fortran!
Many of my fellow TA's moved on to become employees at NASA, Lockheed, Boeing, Harris, and others based primarily on the fact that they know Fortran.
I should also mention that I have a friend in a graduate physics program who uses Fortran for his modeling because of the extreme computer-intensive computations it requires.
It's still out there.
It's still strong.
It's still important.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294155</id>
	<title>Engineers use Numerical Methods, not OOP</title>
	<author>Carbaholic</author>
	<datestamp>1244736480000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>2</modscore>
	<htmltext><p>It doesn't matter what language Engineers and physicists learn as undergrads; the language isn't what matters. It's the numerical methods that matter.</p><p>In fact, I think it's good that they learn FORTRAN because so much of the code they'll work with in industry is written in FORTRAN.</p><p>In my first programming class we spent the vast majority of the time learning numerical methods like Taylor expansions and how to write them. In the very last few lectures we talked about OOP and did one homework assignment where we wrote and used a class. This was the right way to go for an engineer, because the mathematics are far more important than the structure.</p><p>I've seen aerodynamic, structural, and acoustic calculations where the mathematics make your head spin, but it only takes two or three functions to write the numerical method to solve the equations. This is the kind of program engineers need to be good at.</p></htmltext>
<tokenext>It does n't matter what language Engineers and physicists learn as undergrads , the language is n't what matters .
It 's the numerical methods that matter.In fact , I think it 's good that they learn FORTRAN because so much of the code they 'll work with in industry is written in FORTRAN.In my first programming class we spent the vast majority of the time learning numerical methods like taylor expansions and how to write them .
In the very last few lectures we talked about what OOP did one homework assignment where we wrote and used a class .
This was the right way to go because for an engineer , because the mathematics are far more important than the structure.I 've seen aerodynamic , structural , and acoustic calculations where the mathematics make your head spin , but it only takes two or three functions to write the numerical method to solve the equations .
This is the kind of program engineers need to be good at .</tokentext>
<sentencetext>It doesn't matter what language Engineers and physicists learn as undergrads, the language isn't what matters.
It's the numerical methods that matter.In fact, I think it's good that they learn FORTRAN because so much of the code they'll work with in industry is written in FORTRAN.In my first programming class we spent the vast majority of the time learning numerical methods like taylor expansions and how to write them.
In the very last few lectures we talked about OOP and did one homework assignment where we wrote and used a class.
This was the right way to go for an engineer, because the mathematics are far more important than the structure.I've seen aerodynamic, structural, and acoustic calculations where the mathematics make your head spin, but it only takes two or three functions to write the numerical method to solve the equations.
This is the kind of program engineers need to be good at.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28305081</id>
	<title>Re:While there may be "newer" languages</title>
	<author>metaforest</author>
	<datestamp>1244839140000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>If I had mod points you'd get them, mono-spaced font notwithstanding.</p><p>Every computational language has its sweet spot.</p><p>Each one has a band of domains that it is well suited for.</p><p>As for TFA, I don't agree.  FORTRAN is not a good language for first year students that need to learn what a program is, and how to adapt ideas to code.</p><p>For that, I believe that ANSI C is a very good starting point.  It presents a terse language that is close enough to the metal to deal with a vast number of 'first principles' concepts.  It also exposes the student to a lot of the sharp edges that make software engineering such a challenge at times.</p><p>Easy is not a virtue when a student is in the process of trying to determine if CIS is right for them.   Throwing the students into a language that requires them to struggle with notation AND first principles becomes a weeder.   If you can hang with ANSI C you can hang with anything.</p><p>Computer languages, at the end of the day, facilitate and hinder the expression of computational concepts.   Would a golfer putt with a wood, or drive with a putter?   Would one pinch-hit usefully with a wooden dowel, or play tennis with a golfball?</p><p>The vast majority of our physical activities require that the practitioner use the correct tool for the task at hand.</p><p>When I was in my first year, BASIC was my starting point, and I quickly found that it was inadequate beyond a rather limiting subset of first principles.   My only other choice was 8-bit assembly language.  It was painful to take that step since I had to learn a huge number of peripheral concepts just to be able to execute a "Hello World" program.</p><p>While I am invalidating my argument a bit with my own personal journey, it is important to note that I didn't have access to a C compiler until I had 5 years of professional experience programming in BASIC and assembly.   C at that point was a pain.   Eventually I understood it, and found that I wished I had been able to learn it first.</p></htmltext>
<tokenext>If I had mod points you 'd get them , mono-spaced font not withstanding.Every computational language has it 's sweet spot.Each one has a band of domains that it is well suited for.As for TFA I do n't agree .
FORTRAN is not a good language for first year students that need to learn what a program is , and how to adapt ideas to code.For that , I believe that ANSI C is a very good starting point .
It presents a terse language that is close enough to the metal to deal with a vast number of 'first principles ' concepts .
It also exposes the student to a lot of the sharp edges that make software engineering such a challenge at times.Easy is not a virtue when a student is in the process of trying to determine if CIS is right for them .
Throwing the students into a language that requires them to struggle with notation AND first principles becomes a weeder .
If you can hang with ANSI C you can hang with anything.Computer languages , at the end of the day , facilitate and hinder the expression of computational concepts .
Would a golfer putt with a Wood , or drive with a putter ?
Would one pinch-hit usefully with a wooden dowel , or play tennis with a golfball ? The vast majority of our physical activities require that the practitioner use the correct tool for the task at hand.When I was in my first fur , BASIC was my starting point , and I quickly found that it was inadequate beyond a rather limiting subset of first principles .
My only other choice was 8-bit assembly language .
It was painful to take that step since I had to learn a huge number of peripheral concepts just to be able execute a " Hello World " program.While I am invalidating my argument a bit with my own personal journey , it is important to note that I did n't have access to a C compiler until I had 5 years of professional experience programming in BASIC and assembly .
C at that point was a pain .
Eventually I understood it , and found that I wished I had been able to learn it first .</tokentext>
<sentencetext>If I had mod points you'd get them, mono-spaced font not withstanding.Every computational language has it's sweet spot.Each one has a band of domains that it is well suited for.As for TFA I don't agree.
FORTRAN is not a good language for first year students that need to learn what a program is, and how to adapt ideas to code.For that, I believe that ANSI C is a very good starting point.
It presents a terse language that is close enough to the metal to deal with a vast number of 'first principles' concepts.
It also exposes the student to a lot of the sharp edges that make software engineering such a challenge at times.Easy is not a virtue when a student is in the process of trying to determine if CIS is right for them.
Throwing the students into a language that requires them to struggle with notation AND first principles becomes a weeder.
If you can hang with ANSI C you can hang with anything.Computer languages, at the end of the day, facilitate and hinder the expression of computational concepts.
Would a golfer putt with a Wood, or drive with a putter?
Would one pinch-hit usefully with a wooden dowel, or play tennis with a golfball?The vast majority of our physical activities require that the practitioner use the correct tool for the task at hand.When I was in my first fur, BASIC was my starting point, and I quickly found that it was inadequate beyond a rather limiting subset of first principles.
My only other choice was 8-bit assembly language.
It was painful to take that step since I had to learn a huge number of peripheral concepts just to be able to execute a "Hello World" program. While I am invalidating my argument a bit with my own personal journey, it is important to note that I didn't have access to a C compiler until I had 5 years of professional experience programming in BASIC and assembly.
C at that point was a pain.
Eventually I understood it, and found that I wished I had been able to learn it first.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292357</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295327</id>
	<title>Re:While there may be "newer" languages</title>
	<author>Anonymous</author>
	<datestamp>1244740620000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><div class="quote"><p><tt>...if somebody studies astronomy and will have to work with old legacy Forth code...</tt></p></div><p>Gah, don't remind me. Some of the engineering (i.e. low-level backup) interfaces for the Herschel 4.2m at La Palma are in FORTH, connected via vt220 terminals. No one turns these off in case they don't actually power on again. It's going to be "fun" when these eventually break.</p>
	</htmltext>
<tokenext>...if somebody studies astronomy and will have to work with old legacy Forth code... Gah , do n't remind me .
Some of the engineering ( i.e .
low-level backup ) interfaces for the Herschel 4.2m at La Palma are in FORTH , connected via vt220 terminals .
No one turns these off in case they do n't actually power on again .
It 's going to be " fun " when these eventually break .</tokentext>
<sentencetext> ...if somebody studies astronomy and will have to work with old legacy Forth code... Gah, don't remind me.
Some of the engineering (i.e.
low-level backup) interfaces for the Herschel 4.2m at La Palma are in FORTH, connected via vt220 terminals.
No one turns these off in case they don't actually power on again.
It's going to be "fun" when these eventually break.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292357</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297449</id>
	<title>Way over 40</title>
	<author>Nofsck Ingcloo</author>
	<datestamp>1244747940000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Just for the record, FORTRAN is at least 50 years old.  I wrote a program at GE Evendale for the IBM 704 using FORTRAN in the summer of 1959.  The language was very young at the time.  The documentation, such as it was, was typewritten, not printed.  Typical FORTRAN scenario: run it once, tweak it a little, run it again, all done.</p></htmltext>
<tokenext>Just for the record , FORTRAN is at least 50 years old .
I wrote a program at GE Evendale for the IBM 704 using FORTRAN in the summer of 1959 .
The language was very young at the time .
The documentation , such as it was , was typewritten , not printed .
Typical FORTRAN scenario : run it once , tweak it a little , run it again , all done .</tokentext>
<sentencetext>Just for the record, FORTRAN is at least 50 years old.
I wrote a program at GE Evendale for the IBM 704 using FORTRAN in the summer of 1959.
The language was very young at the time.
The documentation, such as it was, was typewritten, not printed.
Typical FORTRAN scenario: run it once, tweak it a little, run it again, all done.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291937</id>
	<title>Yes they should.</title>
	<author>Ed Bugg</author>
	<datestamp>1244728500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>As with any profession, in order to work effectively, you need to know the tools.  Just like the classes teach the notation/language of the specialty.  Would anyone even think not to teach mathematical symbols for Calculus?  The students will be dropped into jobs using and maintaining programs written 40 years ago.</p><p>If the majority of specially written software is Python, then yes, the initial programming should be Python.  Go with what they'll need to know.</p></htmltext>
<tokenext>As with any profession , in order to work effectively , you need to know the tools .
Just like the classes teach notation/language of the specialty .
Would anyone even think not to teach mathematical symbols for Calculus ?
The students will be dropped into jobs using and maintaining programs written 40 years ago . If the majority of specially written software is python , then yes , the initial programming should be python .
Go with what they 'll need to know .</tokentext>
<sentencetext>As with any profession, in order to work effectively, you need to know the tools.
Just like the classes teach notation/language of the specialty.
Would anyone even think not to teach mathematical symbols for Calculus?
The students will be dropped into jobs using and maintaining programs written 40 years ago. If the majority of specially written software is python, then yes, the initial programming should be python.
Go with what they'll need to know.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28299345</id>
	<title>Re:Sillyness</title>
	<author>McSnarf</author>
	<datestamp>1244711520000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>The problem with modern CS students is that they do not understand that, for some people, a computer is not much more than a BIG pocket calculator. A modern high energy physics experiment can create a staggering amount of VERY simple data (so serial file access might be all that's needed) - and we are not even talking Monte Carlo.

Who in physics would need most of the concepts of non-FORTRAN programming languages?

(Besides, people succeeding in physics will easily be intelligent enough to pick up some other language when there is a need. Remember that a lot of the INTERESTING developments in CS came from someone with a background in physics...)</htmltext>
<tokenext>The problem with modern CS students is that they do not understand that , for some people , a computer is not much more than a BIG pocket calculator .
A modern high energy physics experiment can create a staggering amount of VERY simple data ( so serial file access might be all that 's needed ) - and we are not even talking Monte Carlo .
Who in physics would need most of the concepts of non-FORTRAN programming languages ?
( Besides , people succeeding in physics will easily be intelligent enough to pick up some other language when there is a need .
Remember that a lot of the INTERESTING developments in CS came from someone with a background in physics... )</tokentext>
<sentencetext>The problem with modern CS students is that they do not understand that, for some people, a computer is not much more than a BIG pocket calculator.
A modern high energy physics experiment can create a staggering amount of VERY simple data (so serial file access might be all that's needed) - and we are not even talking Monte Carlo.
Who in physics would need most of the concepts of non-FORTRAN programming languages?
(Besides, people succeeding in physics will easily be intelligent enough to pick up some other language when there is a need.
Remember that a lot of the INTERESTING developments in CS came from someone with a background in physics...)</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292565</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28306665</id>
	<title>Re:Yes</title>
	<author>Anonymous</author>
	<datestamp>1244814240000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>No... Scheme is the devil spawn from hell... useless for any application that is not intensely recursive in nature and largely not used outside of academia for that very reason.  Impossible to use unless your brain is wired in a strange-loop anyway...  You can stuff 'car' and 'cdr' where the sun don't shine and I'd be perfectly okay with that.</p><p>Scientific language - I vote C.  It's a case of "good enough" and let's face it - engineers won't actually use structs or functions... they'll put everything in main() and call it a day - so what do we care about the elegance or structuring of a language?  They're just going to abuse the hell out of it anyway... so give them C (it's hard to make that language any uglier than it already is... even with abuse).  Throw in a course in MATLAB just for kicks since most of the useful I/O functions in MATLAB look just like C and you've got yourself an engineer ready to enter the workplace.</p><p>There is, by the way, a good case for FORTRAN.  I work with a firm that still makes heavy use of it.  Most legacy numerical libraries are written in FORTRAN and even though just finding a compiler for it is difficult these days - that doesn't seem to dissuade most companies from still using it.</p><p>Just my two cents...</p></htmltext>
<tokenext>No... Scheme is the devil spawn from hell... useless for any application that is not intensely recursive in nature and largely not used outside of academia for that very reason .
Impossible to use unless your brain is wired in a strange-loop anyway ... You can stuff 'car ' and 'cdr ' where the sun do n't shine and I 'd be perfectly okay with that . Scientific language - I vote C . It 's a case of " good enough " and let 's face it - engineers wo n't actually use structs or functions ... they 'll put everything in main ( ) and call it a day - so what do we care about the elegance or structuring of a language ?
They 're just going to abuse the hell out of it anyway... so give them C ( it 's hard to make that language any uglier than it already is... even with abuse ) .
Throw in a course in MATLAB just for kicks since most of the useful I/O functions in MATLAB look just like C and you 've got yourself an engineer ready to enter the workplace . There is , by the way , a good case for FORTRAN .
I work with a firm that still makes heavy use of it .
Most legacy numerical libraries are written in FORTRAN and even though just finding a compiler for it is difficult these days - that does n't seem to dissuade most companies from still using it . Just my two cents ...</tokentext>
<sentencetext>No... Scheme is the devil spawn from hell... useless for any application that is not intensely recursive in nature and largely not used outside of academia for that very reason.
Impossible to use unless your brain is wired in a strange-loop anyway...  You can stuff 'car' and 'cdr' where the sun don't shine and I'd be perfectly okay with that. Scientific language - I vote C.  It's a case of "good enough" and let's face it - engineers won't actually use structs or functions... they'll put everything in main() and call it a day - so what do we care about the elegance or structuring of a language?
They're just going to abuse the hell out of it anyway... so give them C (it's hard to make that language any uglier than it already is... even with abuse).
Throw in a course in MATLAB just for kicks since most of the useful I/O functions in MATLAB look just like C and you've got yourself an engineer ready to enter the workplace. There is, by the way, a good case for FORTRAN.
I work with a firm that still makes heavy use of it.
Most legacy numerical libraries are written in FORTRAN and even though just finding a compiler for it is difficult these days - that doesn't seem to dissuade most companies from still using it. Just my two cents...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292059</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28302873</id>
	<title>Such astounding ignorance</title>
	<author>Anonymous Cowpat</author>
	<datestamp>1244727540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>His two reasons why everyone hates Fortran can be made to apply to any language (let's get that out of the way), or indeed any course where some of the students already know the subject materials and others don't know it at all.</p><p>I don't know what 'foibles' of Fortran he's talking about, except perhaps having to type numbers in with a .0 on the end if they want them treated as floats. That's all that I can think of.</p><p>Fortran 77 was painful - I never learnt it, but as an F90 programmer was exposed to it and saw how ugly it was. F90, on the other hand, is not painful. Stuff works, the language is forgiving (with the exception of the thing about getting integers to behave as floats) and it does what you want with the minimum of fuss. Compared to C (I'm only going to compare to C - you don't do scientific programming in high-level languages); C may be (at a stretch) twice as powerful, but takes 4 times as long to write, and 10 times as long to debug. Most stuff that you want to do is built directly into the language - you don't have to monkey about linking math libraries in to get a square-root. Fortran is case-insensitive (this is GOOD - it shouldn't be possible to have two variables named <tt>power</tt> and <tt>Power</tt>).</p><blockquote><div><p>Don't believe me?  OK, write a program in pure Fortran that gives a plot of a Sin(n*x) for integer n and x ranging from -2*pi to 2*pi.  Now connect that plot up to a slider control which will control the value of n.</p></div></blockquote><p>Now, that's just stupid - that's pretty picture drawing and should be accomplished in matlab or similar. Fortran is not supposed to be used for writing programs with a GUI, and whining that it's hard to shoehorn it into that role is monumentally stupid. 
You can bosh that out with minimal training in some higher-level language like matlab - what you cannot do is write some efficient, high performance program after 'learning programming' in matlab, and being given a crash-course in Fortran. Granted, it doesn't give you as much of an idea about what is going on on the underlying metal as, say, C, but it gives you <i>enough</i>. If you don't initially tell your students that they can copy part of an array using colon-notation, they have to do it with a for-loop. Now they get the idea about how ugly such an action is to perform. That sort of thing, and having the number of clock-cycles that common operations take explained to you, is what gets you into a programming mindset where every other line you think 'how can I be doing this more efficiently?' How many python programmers do you know who, when needing to divide a number by 2, multiply it by 0.5 instead?</p><p>I still use Fortran; when I was an undergraduate I used it for my project, and as a masters student I chose to use it for my dissertation, despite it not having been on the syllabus. My Fortran skills also enabled me to help some of my colleagues who had to use codes provided by their supervisors which were written in fortran, as they were hopelessly out of their depth, not having been taught any fortran at all.</p><p>His rant about compilers shows a total lack of backbone. A small number of shouty people call for their own preference of compiler and he seriously considers buying them all. The choices are, of course, obvious: gfortran or G95. If you're using the supercomputer you can also have access to the Intel compiler. Choosing a compiler because it's very standards-compliant isn't very clever. 
Firstly because standards compliance is best achieved through good teaching and secondly because at least one part of the standard (invalidity of tab-characters) needs to be safely ignored, without having to teach the students about shutting the rest of the standards off with compiler switches. The idea of picking a compiler on the basis that it allows you to shoehorn fortran into being a GUI language boggles the mind - if you must have some sort of GUI, write it with a C-library and cross-compile.</p><p>What he doesn't seem to realise about first programming languages is that they're like a ratchet - it's very hard to later go to a lower level language than the one that you started with. If you start people on python or matlab, they'll be very hard to turn into decent fortran programmers later.</p>
	</htmltext>
<tokenext>His two reasons why everyone hates Fortran can be made to apply to any language ( let 's get that out of the way ) , or indeed any course where some of the students already know the subject materials and others do n't know it at all.I do n't know what 'foibles ' of Fortran he 's talking about , except perhaps having to type numbers in with a .0 on the end if they want them treated as floats .
That 's all that I can think of.Fortran 77 was painful - I never learnt it , but as an F90 programmer was exposed to it and saw how ugly it was .
F90 , on the other hand , is not painful .
Stuff works , the language is forgiving ( with the exception of the thing about getting integers to behave as floats ) and it does what you want with the minimum of fuss .
Compared to C ( I 'm only going to compare to C - you do n't do scientific programming in high-level languages ) ; C may be ( at a stretch ) twice as powerful , but takes 4 times as long to write , and 10 times as long to debug .
Most stuff that you want to do is built directly into the language - you do n't have to monkey about linking math libraries in to get a square-root .
Fortran is case-insensitive ( this is GOOD - it should n't be possible to have two variables named power and Power ) . Do n't believe me ?
OK , write a program in pure Fortran that gives a plot of a Sin ( n * x ) for integer n and x ranging from -2 * pi to 2 * pi .
Now connect that plot up to a slider control which will control the value of n . Now , that 's just stupid - that 's pretty picture drawing and should be accomplished in matlab or similar .
Fortran is not supposed to be used for writing programs with a GUI , and whining that it 's hard to shoehorn it into that role is monumentally stupid .
You can bosh that out with minimal training in some higher-level language like matlab - what you can not do is write some efficient , high performance program after 'learning programming ' in matlab , and being given a crash-course in Fortran .
Granted , it does n't give you as much of an idea about what is going on on the underlying metal as , say , C , but it gives you enough .
If you do n't initially tell your students that they can copy part of an array using colon-notation , they have to do it with a for-loop .
Now they get the idea about how ugly such an action is to perform .
That sort of thing , and having the number of clock-cycles that common operations take explained to you , is what gets you into a programming mindset where every other line you think 'how can I be doing this more efficiently ? '
How many python programmers do you know who , when needing to divide a number by 2 , multiply it by 0.5 instead ? I still use Fortran .
when I was an undergraduate I used it for my project , and as a masters student I chose to use it for my dissertation , despite it not having been on the syllabus .
My Fortran skills also enabled me to help some of my colleagues who had to use codes provided by their supervisors which were written in fortran , as they were hopelessly out of their depth , not having been taught any fortran at all . His rant about compilers shows a total lack of backbone .
A small number of shouty people call for their own preference of compiler and he seriously considers buying them all .
The choices are , of course , obvious .
gfortran or G95 .
If you 're using the supercomputer you can also have access to the Intel compiler .
Choosing a compiler because it 's very standards-compliant is n't very clever .
Firstly because standards compliance is best achieved through good teaching and secondly because at least one part of the standard ( invalidity of tab-characters ) needs to be safely ignored , without having to teach the students about shutting the rest of the standards off with compiler switches .
The idea of picking a compiler on the basis that it allows you to shoehorn fortran into being a GUI language boggles the mind - if you must have some sort of GUI , write it with a C-library and cross-compile . What he does n't seem to realise about first programming languages is that they 're like a ratchet - it 's very hard to later go to a lower level language than the one that you started with .
If you start people on python or matlab , they 'll be very hard to turn into decent fortran programmers later .</tokentext>
<sentencetext>His two reasons why everyone hates Fortran can be made to apply to any language (let's get that out of the way), or indeed any course where some of the students already know the subject materials and others don't know it at all.I don't know what 'foibles' of Fortran he's talking about, except perhaps having to type numbers in with a .0 on the end if they want them treated as floats.
That's all that I can think of.Fortran 77 was painful - I never learnt it, but as an F90 programmer was exposed to it and saw how ugly it was.
F90, on the other hand, is not painful.
Stuff works, the language is forgiving (with the exception of the thing about getting integers to behave as floats) and it does what you want with the minimum of fuss.
Compared to C (I'm only going to compare to C - you don't do scientific programming in high-level languages); C may be (at a stretch) twice as powerful, but takes 4 times as long to write, and 10 times as long to debug.
Most stuff that you want to do is built directly into the language - you don't have to monkey about linking math libraries in to get a square-root.
Fortran is case-insensitive (this is GOOD - it shouldn't be possible to have two variables named power and Power). Don't believe me?
OK, write a program in pure Fortran that gives a plot of a Sin(n*x) for integer n and x ranging from -2*pi to 2*pi.
Now connect that plot up to a slider control which will control the value of n. Now, that's just stupid - that's pretty picture drawing and should be accomplished in matlab or similar.
Fortran is not supposed to be used for writing programs with a GUI, and whining that it's hard to shoehorn it into that role is monumentally stupid.
You can bosh that out with minimal training in some higher-level language like matlab - what you cannot do is write some efficient, high performance program after 'learning programming' in matlab, and being given a crash-course in Fortran.
Granted, it doesn't give you as much of an idea about what is going on on the underlying metal as, say, C, but it gives you enough.
If you don't initially tell your students that they can copy part of an array using colon-notation, they have to do it with a for-loop.
Now they get the idea about how ugly such an action is to perform.
That sort of thing, and having the number of clock-cycles that common operations take explained to you, is what gets you into a programming mindset where every other line you think 'how can I be doing this more efficiently?'
How many python programmers do you know who, when needing to divide a number by 2, multiply it by 0.5 instead? I still use Fortran.
when I was an undergraduate I used it for my project, and as a masters student I chose to use it for my dissertation, despite it not having been on the syllabus.
My Fortran skills also enabled me to help some of my colleagues who had to use codes provided by their supervisors which were written in fortran, as they were hopelessly out of their depth, not having been taught any fortran at all. His rant about compilers shows a total lack of backbone.
A small number of shouty people call for their own preference of compiler and he seriously considers buying them all.
The choices are, of course, obvious.
gfortran or G95.
If you're using the supercomputer you can also have access to the Intel compiler.
Choosing a compiler because it's very standards-compliant isn't very clever.
Firstly because standards compliance is best achieved through good teaching and secondly because at least one part of the standard (invalidity of tab-characters) needs to be safely ignored, without having to teach the students about shutting the rest of the standards off with compiler switches.
The idea of picking a compiler on the basis that it allows you to shoehorn fortran into being a GUI language boggles the mind - if you must have some sort of GUI, write it with a C-library and cross-compile. What he doesn't seem to realise about first programming languages is that they're like a ratchet - it's very hard to later go to a lower level language than the one that you started with.
If you start people on python or matlab, they'll be very hard to turn into decent fortran programmers later.
	</sentencetext>
</comment>
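An editorial aside on the comment above: the commenter's contrast between copying part of an array with Fortran colon-notation versus an explicit for-loop has a direct analogue in Python slicing, one of the alternatives the story summary mentions. A minimal sketch (the names `a`, `b`, `c` are illustrative, not from the original comment):

```python
# Fortran's colon-notation copy, e.g.  b(1:5) = a(1:5),  corresponds to a
# one-line Python slice. Withholding slicing forces students through the
# element-by-element loop, which is the commenter's pedagogical point.

a = [float(i) for i in range(10)]

# Slice copy (colon-notation style): one line, intent is obvious.
b = a[:5]

# Equivalent element-by-element loop (what students write without slicing).
c = []
for i in range(5):
    c.append(a[i])

assert b == c == [0.0, 1.0, 2.0, 3.0, 4.0]
```

Both forms produce the same result; the slice version simply states the intent, which is why the commenter argues students should meet the loop first and appreciate the shorthand later.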
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295743</id>
	<title>Re:Are You Serious?</title>
	<author>blueturffan</author>
	<datestamp>1244742120000</datestamp>
	<modclass>Funny</modclass>
	<modscore>3</modscore>
	<htmltext><blockquote><div><p>An operating system is staggeringly complex to write yet a <b>single man</b> can write one.</p></div> </blockquote><p>

That's because a married man has to spend so much time trying to figure out how to keep his staggeringly complex wife happy.</p></div>
	</htmltext>
<tokenext>An operating system is staggeringly complex to write yet a single man can write one .
That 's because a married man has to spend so much time trying to figure out how to keep his staggeringly complex wife happy .</tokentext>
<sentencetext>An operating system is staggeringly complex to write yet a single man can write one.
That's because a married man has to spend so much time trying to figure out how to keep his staggeringly complex wife happy.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292165</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28300399</id>
	<title>Fortran for scientists?</title>
	<author>Punk CPA</author>
	<datestamp>1244715180000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Here is my qualified "yes": it should be required only for archaeologists.</htmltext>
<tokenext>Here is my qualified " yes " : it should be required only for archaeologists .</tokentext>
<sentencetext>Here is my qualified "yes": it should be required only for archaeologists.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291953</id>
	<title>Not punched cards</title>
	<author>davebarnes</author>
	<datestamp>1244728560000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I love Fortran (or FORTRAN as we old folks spell it).<br>But, no one should be forced to use an IBM 026 Keypunch (http://en.wikipedia.org/wiki/Keypunch#IBM_024.2C_026).</p></htmltext>
<tokenext>I love Fortran ( or FORTRAN as we old folks spell it ) .But , no one should be forced to use an IBM 026 Keypunch ( http : //en.wikipedia.org/wiki/Keypunch # IBM \ _024.2C \ _026 ) .</tokentext>
<sentencetext>I love Fortran (or FORTRAN as we old folks spell it). But, no one should be forced to use an IBM 026 Keypunch (http://en.wikipedia.org/wiki/Keypunch#IBM_024.2C_026).</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298809</id>
	<title>Re:University != Trade school</title>
	<author>Anonymous</author>
	<datestamp>1244752800000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>University may not be trade school, but engineering and "lab science" ARE trades. There's no sense teaching computer science in this case, just hand them the tool and tell them what is needed to do the actual work.</p></htmltext>
<tokenext>University may not be trade school , but engineering and " lab science " ARE trades .
There 's no sense teaching computer science in this case , just hand them the tool and tell them what is needed to do the actual work .</tokentext>
<sentencetext>University may not be trade school, but engineering and "lab science" ARE trades.
There's no sense teaching computer science in this case, just hand them the tool and tell them what is needed to do the actual work.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298215</id>
	<title>What year is it?</title>
	<author>Anonymous</author>
	<datestamp>1244750580000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>The answer is - NO!  There is no compelling reason to teach FORTRAN since it doesn't offer any concepts not available in almost any structured or OO programming language.  If you know how to program in almost anything, you could program in FORTRAN.</p><p>When I started Purdue University in 1984, they had long since dropped FORTRAN from the CS curriculum.</p></htmltext>
<tokenext>The answer is - NO !
There is no compelling reason to teach FORTRAN since it does n't offer any concepts not available in almost any structured or OO programming language .
If you know how to program in almost anything , you could program in FORTRAN . When I started Purdue University in 1984 , they had long since dropped FORTRAN from the CS curriculum .</tokentext>
<sentencetext>The answer is - NO!
There is no compelling reason to teach FORTRAN since it doesn't offer any concepts not available in almost any structured or OO programming language.
If you know how to program in almost anything, you could program in FORTRAN. When I started Purdue University in 1984, they had long since dropped FORTRAN from the CS curriculum.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294801</id>
	<title>This leaves out a very important point...</title>
	<author>that IT girl</author>
	<datestamp>1244738880000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>...and that is that even new folks coming into a modern job may need to support older programs and need to understand the languages they are written in. I am not in programming, but as another type of example, my clients run software from the '90s and some of their hardware is at least that old. Having the knowledge of those "obsolete" programs is a necessity.</htmltext>
<tokenext>...and that is that even new folks coming into a modern job may need to support older programs and need to understand the language it is written in .
I am not in programming , but as another type of example , my clients run software from the 90 's and some of their hardware is at least that old .
Having the knowledge of those " obsolete " programs is a necessity .</tokentext>
<sentencetext>...and that is that even new folks coming into a modern job may need to support older programs and need to understand the language it is written in.
I am not in programming, but as another type of example, my clients run software from the 90's and some of their hardware is at least that old.
Having the knowledge of those "obsolete" programs is a necessity.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295001</id>
	<title>Smart people should not program</title>
	<author>Animats</author>
	<datestamp>1244739660000</datestamp>
	<modclass>Offtopic</modclass>
	<modscore>0</modscore>
	<htmltext><p>
<a href="http://en.wikipedia.org/wiki/Lee_Smolin" title="wikipedia.org">Lee Smolin</a> [wikipedia.org], the physicist, writes that "Smart people should not program".  He used to program, and at one point insisted that his department continue to teach physics students programming.  But then he realized that the needed functionality was either available off the shelf or could be written by lower level people. So he now recommends against wasting students' time on programming.</p></htmltext>
<tokenext>Lee Smolin [ wikipedia.org ] , the physicist , writes that " Smart people should not program " .
He used to program , and at one point insisted that his department continue to teach physics students programming .
But then he realized that the needed functionality was either available off the shelf or could be written by lower level people .
So he now recommends against wasting students ' time on programming .</tokentext>
<sentencetext>
Lee Smolin [wikipedia.org], the physicist, writes that "Smart people should not program".
He used to program, and at one point insisted that his department continue to teach physics students programming.
But then he realized that the needed functionality was either available off the shelf or could be written by lower level people.
So he now recommends against wasting students' time on programming.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294965</id>
	<title>Re:While there may be "newer" languages</title>
	<author>drewm1980</author>
	<datestamp>1244739540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>If you read the PEP you link to more carefully, you'll note that the context is that the Octave developers were recommending adding an operator for matrix multiplication to Python, but against symbols for matrix inversion / linear system solving.  Also, you can get away with doing inv(A)*b if A is well enough conditioned.  There are better ways, but if you're just manipulating, say, a bunch of affine transformation matrices, an inversion or two won't hurt you.</p><p>Also, Python as a language has many advantages over MATLAB in addition to being free software.  Basically, Python is a great general-purpose language with number-crunching tacked on, and MATLAB is a great number-crunching language with everything else (i.e. GUI programming, object-oriented programming) tacked on.</p><p><a href="http://www.scipy.org/NumPy_for_Matlab_Users" title="scipy.org" rel="nofollow">http://www.scipy.org/NumPy_for_Matlab_Users</a> [scipy.org]</p></htmltext>
<tokenext>If you read the PEP you link to more carefully , you 'll note that the context is that the octave developers were recommending adding an operator for multiplication to python , but ~ against symbols for matrix inversion / linear system solving .
Also , you can get away with doing inv ( A ) * b if A is well enough conditioned .
There are better ways , but if you 're just manipulating , say , a bunch of affine transformation matrices , an inversion or two wo n't hurt you.Also , Python as a language has many advantages over MATLAB in addition to being free software .
Basically , Python is a great general purpose language with number-crunching tacked on , and MATLAB is a great number-crunching language with everything else ( i.e .
GUI Programming , Object Oriented Programming ) tacked on.http : //www.scipy.org/NumPy \ _for \ _Matlab \ _Users [ scipy.org ]</tokentext>
<sentencetext>If you read the PEP you link to more carefully, you'll note that the context is that the octave developers were recommending adding an operator for multiplication to python, but ~against symbols for matrix inversion / linear system solving.
Also, you can get away with doing inv(A)*b if A is well enough conditioned.
There are better ways, but if you're just manipulating, say, a bunch of affine transformation matrices, an inversion or two won't hurt you. Also, Python as a language has many advantages over MATLAB in addition to being free software.
Basically, Python is a great general purpose language with number-crunching tacked on, and MATLAB is a great number-crunching language with everything else (i.e.
GUI Programming, Object Oriented Programming) tacked on. http://www.scipy.org/NumPy_for_Matlab_Users [scipy.org]</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292293</id>
	<title>Newer doesn't always mean better.</title>
	<author>Anonymous</author>
	<datestamp>1244729940000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>3</modscore>
	<htmltext><p>Nail guns have been around for a while, but a lot of houses still get built with hammers.</p><p>If a simple tool does a job efficiently and effectively then why "change for the sake of change"?</p></htmltext>
<tokenext>Nail guns have been around for a while , but a lot of houses still get built with hammers .
If a simple tool does a job efficiently and effectively then why " change for the sake of change " ?</tokentext>
<sentencetext>Nail guns have been around for a while, but a lot of houses still get built with hammers. If a simple tool does a job efficiently and effectively then why "change for the sake of change"?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303611</id>
	<title>Re:While there may be "newer" languages</title>
	<author>ceoyoyo</author>
	<datestamp>1244734500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Why not?  Python has lots of free modules available that wrap some very powerful libraries... written in optimized Fortran.</p></htmltext>
<tokenext>Why not ?
Python has lots of free modules available that wrap some very powerful libraries... written in optimized Fortran .</tokentext>
<sentencetext>Why not?
Python has lots of free modules available that wrap some very powerful libraries... written in optimized Fortran.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293273</id>
	<title>Fortran rocks</title>
	<author>Bushcat</author>
	<datestamp>1244733180000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Many years ago I did a real-time military radar simulation using a flavor of Fortran. At the next job, I helped build a stratigraphic wellhole analysis package using Fortran. At the third job the team built a package that predicted thermal, radiation and other propagations through hardware, wetware and urbanware. At no time was there more than 4 people in the development team. Fortran has done serious work over the years, and people forget there's more to life than the GUI.</htmltext>
<tokenext>Many years ago I did a real-time military radar simulation using a flavor of Fortran .
At the next job , I helped build a stratigraphic wellhole analysis package using Fortran .
At the third job the team built a package that predicted thermal , radiation and other propagations through hardware , wetware and urbanware .
At no time was there more than 4 people in the development team .
Fortran has done serious work over the years , and people forget there 's more to life than the GUI .</tokentext>
<sentencetext>Many years ago I did a real-time military radar simulation using a flavor of Fortran.
At the next job, I helped build a stratigraphic wellhole analysis package using Fortran.
At the third job the team built a package that predicted thermal, radiation and other propagations through hardware, wetware and urbanware.
At no time was there more than 4 people in the development team.
Fortran has done serious work over the years, and people forget there's more to life than the GUI.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293121</id>
	<title>python 3.0 has ruined it for scientists</title>
	<author>Anonymous</author>
	<datestamp>1244732700000</datestamp>
	<modclass>Troll</modclass>
	<modscore>-1</modscore>
	<htmltext><p>they have absolutely no desire to go back twiddling fucking parentheses on hundreds of millions of lines of code just because some guy named Guido read his tea leaves one morning and decided parentheses are important in the 'print' function.</p><p>all of the tens of thousands of 'python example codes' now on the web will NOT FUCKING WORK OUT OF THE BOX.. even HELLO WORLD WILL NOT EVEN WORK ANYMORE.</p><p>do you think scientists are going to put up with this shit? they are trying to cure cancer, and you tell them they need to spend several hours, days, weeks, putting parentheses around print statements. what the fuck is wrong with you?</p><p>FORTRAN crap from 40 years ago will STILL FUCKING COMPILE AND RUN.</p><p>Python? in python 4.0, Guido will probably decide that fucking whitespace has to be tabs, commas have to be semicolons, and print should be called 'state'. fuck computer nerds to hell.</p></htmltext>
<tokenext>they have absolutely no desire to go back twiddling fucking parenthesis on hundreds of millions of lines of code just because some guy named guido read his tea leaves one morning and decided parenthesis are important in the 'print ' function.all of the tens of thousands of 'python example codes ' now on the web will NOT FUCKING WORK OUT OF THE BOX.. even HELLO WORLD WILL NOT EVEN WORK ANYMORE.do you think scientists are going to put up with this shit ?
they are trying to solve cancer , and you tell them they need to spend several hours , days , weeks , putting parenthesis around print statements .
what the fuck is wrong with you ? FORTRAN crap from 40 years ago will STILL FUCKING COMPILE AND RUN.Python ?
python 4.0 , guido will probably decide that fucking whitespace has to be tabs , commas have to be semi colons , and print should be called 'state' .
fuck computer nerds to hell .</tokentext>
<sentencetext>they have absolutely no desire to go back twiddling fucking parenthesis on hundreds of millions of  lines of code just because some guy named guido read his tea leaves one morning and decided parenthesis are important in the 'print' function.all of the tens of thousands of 'python example codes' now on the web will NOT FUCKING WORK OUT OF THE BOX.. even HELLO WORLD WILL NOT EVEN WORK ANYMORE.do you think scientists are going to put up with this shit?
they are trying to solve cancer, and you tell them they need to spend several hours, days, weeks, putting parenthesis around print statements.
what the fuck is wrong with you?FORTRAN crap from 40 years ago will STILL FUCKING COMPILE AND RUN.Python?
python 4.0, guido will probably decide that fucking whitespace has to be tabs, commas have to be semi colons, and print should be called 'state'.
fuck computer nerds to hell.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294115</id>
	<title>Master of the obvious here...</title>
	<author>Anonymous</author>
	<datestamp>1244736360000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>The languages taught to undergrads need to be driven by industry, period.  Educational institutions need to focus on what they want their students to get out of their program.  Are they being prepared for grad school, to perform IT tasks, to develop system software/operating systems, or to write some other kind of software with different needs?  Once you have that defined, the choice of languages is very obvious.  Some schools are already doing this, and the difference is very noticeable when it comes to interviewing candidates with specific skills.</p></htmltext>
<tokenext>The languages taught to undergrads need to be driven by industry , period .
Educational institutions need to focus on what they want their students to get out of their program .
Are they being prepared for grad school , to perform IT tasks , to develop system software/operating systems or to write some other kind of software with different needs .
Once you have that defined the choice of languages is very obvious .
Some schools are already doing this and the difference is very noticeable when it comes to interviewing candidates with specific skills .</tokentext>
<sentencetext>The languages taught to undergrads need to be driven by industry, period.
Educational institutions need to focus on what they want their students to get out of their program.
Are they being prepared for grad school, to perform IT tasks, to develop system software/operating systems, or to write some other kind of software with different needs?
Once you have that defined the choice of languages is very obvious.
Some schools are already doing this and the difference is very noticeable when it comes to interviewing candidates with specific skills.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292077</id>
	<title>History lesson</title>
	<author>kfractal</author>
	<datestamp>1244729160000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Yes, teach it to them for historical purposes.  So they see how awful it is and no one makes that same mistake again :P</p></htmltext>
<tokenext>Yes , teach it to them for historical purposes .
So they see how awful it is and no one makes that same mistake again : P</tokentext>
<sentencetext>Yes, teach it to them for historical purposes.
So they see how awful it is and no one makes that same mistake again :P</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291987</id>
	<title>I learned FORTRAN</title>
	<author>SailorSpork</author>
	<datestamp>1244728740000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I had a half-year "Intro to Engineering" class at Purdue my freshman year (1999) - frosh engineers weren't allowed to pick an engineering discipline until sophomore year, so we got all sorts of intro classes.  In it, we first learned FORTRAN, and then we learned C.</p><p><b>The reasons we learned FORTRAN:</b></p><p>* We had to compile code from one language and reference it from code in the other, to prove it all worked the same once compiled.<br>* We had to have it beaten into us that, if we ever wanted to bitch about how confusing C was, all they ever needed to say was "at least it's not FORTRAN."<br>* They had read an article saying that FORTRAN coders were the highest-paid coders out there, since lots of companies had legacy code and systems to maintain, and very few people cared to learn legacy languages anymore.</p><p>So yeah, it didn't kill us being part of an intro class where it wasn't the only thing we learned.  Quit bitching about it, and GET OFF MY LAWN!</p></htmltext>
<tokenext>I had a half-year " Intro to Engineering " class at Purdue my freshmen year ( 1999 ) - Frosh Engineers were n't allowed to pick an engineering discipline until sophomore year , so we got all sorts of intro classes .
In it , we first learned FORTRAN , and then we learned C . The reasons we learned FORTRAN : * We had to compile code from one language and reference it from code in the other , to prove it all worked the same once compiled .
* We had to have it beaten into us that , if we ever wanted to bitch about how confusing C was , all they ever needed to say was " at least it 's not FORTRAN .
" * They had read an article saying that FORTRAN coders were the highest paid coders out there , since lots of companies had legacy code and systems to maintain , and very few people cared to learn legacy languages anymore . So yeah , it did n't kill us being part of an intro class where it was n't the only thing we learned .
Quit bitching about it , and GET OFF MY LAWN !</tokentext>
<sentencetext>I had a half-year "Intro to Engineering" class at Purdue my freshmen year (1999) - Frosh Engineers weren't allowed to pick an engineering discipline until sophomore year, so we got all sorts of intro classes.
In it, we first learned FORTRAN, and then we learned C. The reasons we learned FORTRAN: * We had to compile code from one language and reference it from code in the other, to prove it all worked the same once compiled.
* We had to have it beaten into us that, if we ever wanted to bitch about how confusing C was, all they ever needed to say was "at least it's not FORTRAN."
* They had read an article saying that FORTRAN coders were the highest-paid coders out there, since lots of companies had legacy code and systems to maintain, and very few people cared to learn legacy languages anymore. So yeah, it didn't kill us being part of an intro class where it wasn't the only thing we learned.
Quit bitching about it, and GET OFF MY LAWN!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28332893</id>
	<title>My Experience</title>
	<author>Anonymous</author>
	<datestamp>1245057480000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>As someone currently doing a postgrad in Particle Physics, the answer is very much no. They should be taught C++, preferably on a Linux machine using the GNU compiler. Nothing I see is in Fortran; it still exists somewhere, but the old guys who actually like it are covering that for the time being. We do use a little bit of Python for some basic scripting for "jobs", but you can cut and paste those easily from examples.</p></htmltext>
<tokenext>As someone currently doing a postgrad in Particle Physics , the answer is very much no .
They should be taught C + + , preferably on a Linux machine using the GNU compiler .
Nothing I see is in Fortran , it still exists somewhere but the old guys who actually like it are covering that for the time being .
We do use a little bit of Python for some basic scripting for " jobs " , but you can cut and paste those easily from examples .</tokentext>
<sentencetext>As someone currently doing a postgrad in Particle Physics, the answer is very much no.
They should be taught C++, preferably on a Linux machine using the GNU compiler.
Nothing I see is in Fortran; it still exists somewhere, but the old guys who actually like it are covering that for the time being.
We do use a little bit of Python for some basic scripting for "jobs", but you can cut and paste those easily from examples.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298171</id>
	<title>PERL RULZ</title>
	<author>Anonymous</author>
	<datestamp>1244750460000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Perl is the FORTRAN of interpreted languages.</p></htmltext>
<tokenext>Perl is the FORTRAN of interpreted languages .</tokentext>
<sentencetext>Perl is the FORTRAN of interpreted languages.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292165</id>
	<title>Are You Serious?</title>
	<author>Anonymous</author>
	<datestamp>1244729460000</datestamp>
	<modclass>Troll</modclass>
	<modscore>-1</modscore>
	<htmltext><p><div class="quote"><p>they were using a mathematical library that would solve partial differential equations, by presenting the user with the actual mathematical formulae to them.</p></div><p>Uh, just a point of note, I believe my TI-89 from high school in 1999 solves partial differential equations.  Hell, I think Mathematica could do this when I used it in college in 2003.</p><p><div class="quote"><p>these kinds of libraries are staggeringly complex to write, and they have been empirically proven over decades of use to actually work.</p></div><p>An operating system is staggeringly complex to write yet a single man can write one.  A very special man but a man nonetheless.  Your mentality is fatalistic.</p><p><div class="quote"><p>to start again from scratch with such libraries would require man-centuries or possibly man-millenia of development effort to reproduce and debug, regardless of the programming language.</p></div><p>If that's the case, it would be my inkling to tackle this problem from a new angle and I think many people already have and continue to do that.</p><p><div class="quote"><p>so it doesn't matter what people in the slashdot community think: for engineers to use anything but these tried-and-tested engineering libraries, that happen to be written in fortran, <b>would just be genuinely stupid of them</b>.</p></div><p>How did we ever make progress with people like you around?</p></div>
	</htmltext>
<tokenext>they were using a mathematical library that would solve partial differential equations , by presenting the user with the actual mathematical formulae to them.Uh , just a point of note , I believe my TI-89 from high school in 1999 solves partial differential equations .
Hell , I think Mathematica could do this when I used it in college in 2003.these kinds of libraries are staggeringly complex to write , and they have been empirically proven over decades of use to actually work.An operating system is staggeringly complex to write yet a single man can write one .
A very special man but a man nonetheless .
Your mentality is fatalistic.to start again from scratch with such libraries would require man-centuries or possibly man-millenia of development effort to reproduce and debug , regardless of the programming language.If that 's the case , it would be my inkling to tackle this problem from a new angle and I think many people already have and continue to do that.so it does n't matter what people in the slashdot community think : for engineers to use anything but these tried-and-tested engineering libraries , that happen to be written in fortran , would just be genuinely stupid of them.How did we ever make progress with people like you around ?</tokentext>
<sentencetext>they were using a mathematical library that would solve partial differential equations, by presenting the user with the actual mathematical formulae to them.Uh, just a point of note, I believe my TI-89 from high school in 1999 solves partial differential equations.
Hell, I think Mathematica could do this when I used it in college in 2003.these kinds of libraries are staggeringly complex to write, and they have been empirically proven over decades of use to actually work.An operating system is staggeringly complex to write yet a single man can write one.
A very special man but a man nonetheless.
Your mentality is fatalistic.to start again from scratch with such libraries would require man-centuries or possibly man-millenia of development effort to reproduce and debug, regardless of the programming language.If that's the case, it would be my inkling to tackle this problem from a new angle and I think many people already have and continue to do that.so it doesn't matter what people in the slashdot community think: for engineers to use anything but these tried-and-tested engineering libraries, that happen to be written in fortran, would just be genuinely stupid of them.How did we ever make progress with people like you around?
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28299295</id>
	<title>Re:libraries. gigabytes of libraries</title>
	<author>Anonymous</author>
	<datestamp>1244711400000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I think we're seeing two completely different questions here. You're seeing "get rid of Fortran? BLASPHEMY!".</p><p>The actual issue is not about removing Fortran. The actual issue is that we should teach programming first, then teach Fortran. Currently we try to teach both programming and Fortran at the same time, often doing a poor job of both, and thereby often making it much harder to learn other languages later than it might otherwise be.</p><p>Why is it important to not cripple the ability to pick up other languages? Besides that students might switch to something else? It's important because while the libraries stay the same, the stuff that calls those libraries is in flux. (And it's also important to know *how* the libraries work, so that one can use them correctly.)</p></htmltext>
<tokenext>I think we 're seeing two completely different questions here .
You 're seeing " get rid of Fortran ?
BLASPHEMY ! " .The actual issue is not about removing Fortran .
The actual issue is that we should teach programming first , then teach fortran .
Currently we try to teach both programming and Fortran at the same time , therefore often doing a poor job at both , therefore additionally often making it much harder to learn other languages later than might otherwise be so.Why is it important to not cripple the ability to pick up other languages ?
Besides that students might switch to something else ?
It 's important because while the libraries stay the same , the stuff that calls those libraries is in flux .
( And it 's also important to know * how * the libraries work , so that one can use them correctly .
)</tokentext>
<sentencetext>I think we're seeing two completely different questions here.
You're seeing "get rid of Fortran?
BLASPHEMY!". The actual issue is not about removing Fortran.
The actual issue is that we should teach programming first, then teach Fortran.
Currently we try to teach both programming and Fortran at the same time, therefore often doing a poor job at both, therefore additionally often making it much harder to learn other languages later than might otherwise be so. Why is it important to not cripple the ability to pick up other languages?
Besides that students might switch to something else?
It's important because while the libraries stay the same, the stuff that calls those libraries is in flux.
(And it's also important to know *how* the libraries work, so that one can use them correctly.
)</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292209</id>
	<title>Depends on the version</title>
	<author>DoofusOfDeath</author>
	<datestamp>1244729700000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I'm currently porting a large Fortran 77 program to C++.  I can safely say that Fortran 77 is a dangerous, bad, damaging language to teach anyone, especially impressionable young programmers.</p><p>There are several idioms in F77 that must not be taught to future generations.  Probably most importantly, the use of "common blocks", "entry" points, and the hacks needed to get around the lack of dynamic memory allocation.</p><p>I'm sure that newer versions of Fortran are more reasonable, though.</p></htmltext>
<tokenext>I 'm currently porting a large Fortran 77 program to C + + .
I can safely say that Fortran 77 is a dangerous , bad , damaging language to teach anyone , especially impressionable young programmers.There are several idioms in F77 that must not be taught to future generations .
Probably most importantly , the use of " common blocks " , and " entry " points , and the hacks needed to get around the lack of dynamic memory allocation.I 'm sure that newer versions of Fortran are more reasonable , though .</tokentext>
<sentencetext>I'm currently porting a large Fortran 77 program to C++.
I can safely say that Fortran 77 is a dangerous, bad, damaging language to teach anyone, especially impressionable young programmers.There are several idioms in F77 that must not be taught to future generations.
Probably most importantly, the use of "common blocks", and "entry" points, and the hacks needed to get around the lack of dynamic memory allocation.I'm sure that newer versions of Fortran are more reasonable, though.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293113</id>
	<title>Teach them Fortran and ....</title>
	<author>Anonymous</author>
	<datestamp>1244732700000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I believe science undergrads should be introduced to a few languages. A quick list of the languages I would teach:<br>1. Fortran<br>2. Python or Visual Basic<br>3. Maxima or Mathematica</p><p>After introducing these languages, I would have the students do a few programs in each language. The programs would involve such tasks as data collection, data analysis, and numerical simulation. I would have them do each task in all languages.</p><p>The result is the students should discover a few things:<br>1. They can quickly write a program to take in data from various sources and do some analysis on that data using a language like Python or Visual Basic.  This is appropriate for tasks where computational speed does not matter. Having a computer collect all the data samples from various test equipment that runs one test every second does not need a super fast language; you just need to get it running.<br>2. When doing a large number of calculations, the Python and Visual Basic languages are a lot slower than Fortran.</p><p>Maxima and Mathematica are included to introduce languages that can do symbolic evaluation.</p></htmltext>
<tokenext>I believe science undergrads should be introduced to a few languages .
A quick list of the languages I would teach : 1 . Fortran 2 . Python or Visual Basic 3 . Maxima or Mathematica
After introducing these languages , I would have the students do a few programs in each language .
The programs would involve such tasks as data collection , data analysis , and numerical simulation .
I would have them do each task in all languages .
The result is the students should discover a few things : 1 . They can quickly write a program to take in data from various sources and do some analysis on that data using a language like Python or Visual Basic .
This is appropriate for tasks where computational speed does not matter .
Having a computer collect all the data samples from various test equipment that runs one test every second does not need a super fast language ; you just need to get it running .
2 . When doing a large number of calculations , the Python and Visual Basic languages are a lot slower than Fortran .
Maxima and Mathematica are included to introduce languages that can do symbolic evaluation .</tokentext>
<sentencetext>I believe science undergrads should be introduced to a few languages.
A quick list of the languages I would teach: 1. Fortran, 2. Python or Visual Basic, 3. Maxima or Mathematica.
After introducing these languages, I would have the students do a few programs in each language.
The programs would involve such tasks as data collection, data analysis, and numerical simulation.
I would have them do each task in all languages. The result is the students should discover a few things: 1. They can quickly write a program to take in data from various sources and do some analysis on that data using a language like Python or Visual Basic.
This is appropriate for tasks where computational speed does not matter.
Having a computer collect all the data samples from various test equipment that runs one test every second does not need a super fast language; you just need to get it running.
2. When doing a large number of calculations, the Python and Visual Basic languages are a lot slower than Fortran. Maxima and Mathematica are included to introduce languages that can do symbolic evaluation.</sentencetext>
</comment>
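The parent's first point, that quick data-collection scripts are fine in a slow language, can be sketched in plain stdlib Python. This is a minimal illustration; `read_sample` is a hypothetical stand-in for polling real test equipment once per second:

```python
import random
import statistics

random.seed(0)  # make the demo run deterministic

def read_sample():
    """Hypothetical stand-in for one reading from a piece of test equipment."""
    return 20.0 + random.gauss(0, 0.5)  # e.g. a temperature sensor in deg C

# Collect a batch of once-per-second samples, then do simple analysis.
samples = [read_sample() for _ in range(100)]
avg = statistics.mean(samples)
spread = statistics.pstdev(samples)
print(f"n={len(samples)}  mean={avg:.2f}  stdev={spread:.2f}")
```

The whole loop runs far faster than the instrument produces data, which is the commenter's point: for this kind of task, getting it written matters more than raw speed.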
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28310069</id>
	<title>APL, I have to plug APL as cryptic and techie</title>
	<author>jimcaruso</author>
	<datestamp>1244828880000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>APL - good for matrices too.  Plus, there's the fun of trying to write your entire program in a single line of impossible-to-decipher code.</htmltext>
<tokenext>APL - good for matrices too .
Plus , there 's the fun of trying to write your entire program in a single line of impossible-to-decipher code .</tokentext>
<sentencetext>APL - good for matrices too.
Plus, there's the fun of trying to write your entire program in a single line of impossible-to-decipher code.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297181</id>
	<title>What other old languages should be taught?</title>
	<author>sirgoran</author>
	<datestamp>1244747160000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Let's get real. If you're going to make the case for every CS student to learn FORTRAN, then why not COBOL? Or why not PASCAL? Perhaps it should be broken out into its own course structure, based on the field or major/minor of the student.
That way you don't bore the crap out of someone that will never use it.
<br> <br>
I still have the books and notes from the PASCAL I learned in High School, and the notes and printouts from the FORTRAN and COBOL courses. Never use them now, and never used them after I left College.
<br> <br>
- Goran</htmltext>
<tokenext>Let 's get real .
If you 're going to make the case for every CS student to learn FORTRAN , then why not COBOL ?
Or why not PASCAL ?
Perhaps it should be broken out into its own course structure , based on the field or major/minor of the student .
That way you do n't bore the crap out of someone that will never use it .
I still have the books and notes from the PASCAL I learned in High School , and the notes and printouts from the FORTRAN and COBOL courses .
Never use them now , and never used them after I left College .
- Goran</tokentext>
<sentencetext>Let's get real.
If you're going to make the case for every CS student to learn FORTRAN, then why not COBOL?
Or why not PASCAL?
Perhaps it should be broken out into its own course structure, based on the field or major/minor of the student.
That way you don't bore the crap out of someone that will never use it.
I still have the books and notes from the PASCAL I learned in High School, and the notes and printouts from the FORTRAN and COBOL courses.
Never use them now, and never used them after I left College.
- Goran</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28299849</id>
	<title>Undergraduates Should Be Taught As Many Languages</title>
	<author>J\_Pierpoint\_Finch</author>
	<datestamp>1244713320000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Old programming languages never die. If the code written in those languages still works, then it runs forever, which is why 50 years after its invention, FORTRAN (and Cobol) are still in use.</p><p>To become truly competent, a programmer must learn as many different programming languages as possible. My first programming class was split into 3 sections over the school year: FORTRAN, Cobol, and Assembler (IBM 360). During my CS education my professors exposed their students to as many types of programming languages as possible. We learned the design and functionality of these many programming language examples so we would be able to pick up any new languages in the future without problems. You want a hard language to learn? Try APL or LISP... I did, and got a paycheck for the work I did.</p><p>It is likely that as a new graduate, you will enter the programming world and encounter legacy code written in some ancient programming language like FORTRAN. If you tell your boss, "I can re-write this in X (Java, Python, ...) and it will only take me a month (likely 6...)", your boss will nod his head with a smile and tell you he will think about it. What he is really thinking is "Why would I pay to re-write code that was paid for 40 years ago and that already works?"</p></htmltext>
<tokenext>Old programming languages never die .
If the code written in those languages still works , then it runs forever , which is why 50 years after its invention , FORTRAN ( and Cobol ) are still in use.To become truly competent , a programmer must learn as many different programming languages as possible .
My first programming class was split into 3 sections over the school year ; FORTRAN , Cobol , and Assembler ( IBM 360 ) .
During my CS education my professors exposed their students to as many types of programming languages as possible .
We learned the design and functionality of these many programming language examples so we would be able to pick up any new languages in the future without problems .
You want a hard language to learn , try APL or LISP... I did , and got a paycheck for the work I did.It is likely that as a new graduate , you will enter the programming world and encounter legacy code written in some ancient programming language like FORTRAN .
If you tell your boss , " I can re-write this in X ( java , python , ) and it will only take me a month ( likely 6... ) " , your boss will nod his head with a smile and tell you he will think about it .
What he is really thinking is " Why would I pay to re-write code that was paid for 40 years ago , that already works " .</tokentext>
<sentencetext>Old programming languages never die.
If the code written in those languages still works, then it runs forever, which is why 50 years after its invention, FORTRAN (and Cobol) are still in use.To become truly competent, a programmer must learn as many different programming languages as possible.
My first programming class was split into 3 sections over the school year; FORTRAN, Cobol, and Assembler (IBM 360).
During my CS education my professors exposed their students to as many types of programming languages as possible.
We learned the design and functionality of these many programming language examples so we would be able to pick up any new languages in the future without problems.
You want a hard language to learn, try APL or LISP... I did, and got a paycheck for the work I did.It is likely that as a new graduate, you will enter the programming world and encounter legacy code written in some ancient programming language like FORTRAN.
If you tell your boss, "I can re-write this in X (java, python, ) and it will only take me a month (likely 6...)", your boss will nod his head with a smile and tell you he will think about it.
What he is really thinking is "Why would I pay to re-write code that was paid for 40 years ago, that already works".</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298351</id>
	<title>Chainsaw massacre</title>
	<author>Anonymous</author>
	<datestamp>1244751180000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Use the right tool for the right thing. Do you hammer with a chainsaw? Why would you crunch numbers with Python? Sure, a chainsaw is a lot more powerful and can chop trees, but what you need to get the job done is a hammer.</p></htmltext>
<tokenext>Use the right tool for the right thing .
Do you hammer with a chainsaw ?
Why would you crunch numbers with Python ?
Sure , a Chainsaw is a lot more powerful and can chop trees , but what you need to get the job done is a hammer .</tokentext>
<sentencetext>Use the right tool for the right thing.
Do you hammer with a chainsaw ?
Why would you crunch numbers with Python ?
Sure, a Chainsaw is a lot more powerful and can chop trees, but what you need to get the job done is a hammer.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28300237</id>
	<title>Precision</title>
	<author>Anonymous</author>
	<datestamp>1244714700000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Whatever you use, it had better have 128-bit precision support for when you have to do large matrix diagonalization (often needed in the sciences).</p></htmltext>
<tokenext>Whatever you use , it had better have 128-bit precision support for when you have to do large matrix diagonalization ( often needed in the sciences ) .</tokentext>
<sentencetext>Whatever you use, it had better have 128-bit precision support for when you have to do large matrix diagonalization (often needed in the sciences).</sentencetext>
</comment>
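For a sense of why extended precision matters, here is a minimal stdlib-Python illustration, using `decimal` at 34 significant digits as a rough stand-in for a quad/128-bit float format: 64-bit floats silently drop small terms that a wider format keeps.

```python
from decimal import Decimal, getcontext

# 64-bit binary floats carry only ~15-16 significant decimal digits,
# so adding 1 to 1e16 is lost entirely to rounding:
lost = (1e16 + 1.0) - 1e16
print(lost)  # 0.0 -- the +1 vanished

# Extended precision (34 digits here, roughly quad precision) keeps it:
getcontext().prec = 34
kept = (Decimal(10) ** 16 + 1) - Decimal(10) ** 16
print(kept)  # 1
```

In a large diagonalization, many such tiny losses can accumulate into visible error in the eigenvalues, which is why the poster wants the wider format available.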
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292997</id>
	<title>Yes and no</title>
	<author>baryluk</author>
	<datestamp>1244732280000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Yes for FORTRAN, but no as a first language.

Hi,

students of Physics and Chemistry here in Cracow, Poland also learn FORTRAN in their first or second year. I think it is a horrible language compared to modern ones, but it is really fast, and the libraries are really robust. So FORTRAN is needed, but I think most of the time students should write C code that talks to FORTRAN libraries; this is simpler.

And FORTRAN shouldn't be the first language taught. I opt for Python; it is so easy to write complex applications. I also personally like D.<nobr> <wbr></nobr>:)</htmltext>
<tokenext>Yes for FORTRAN , but no as a first language .
Hi , students of Physics and Chemistry here in Cracow , Poland also learn FORTRAN in their first or second year .
I think it is a horrible language compared to modern ones , but it is really fast , and the libraries are really robust .
So FORTRAN is needed , but I think most of the time students should write C code that talks to FORTRAN libraries ; this is simpler .
And FORTRAN should n't be the first language taught .
I opt for Python ; it is so easy to write complex applications .
I also personally like D. : )</tokentext>
<sentencetext>Yes for FORTRAN, but no as a first language.
Hi,

students of Physics and Chemistry here in Cracow, Poland also learn FORTRAN in their first or second year.
I think it is a horrible language compared to modern ones, but it is really fast, and the libraries are really robust.
So FORTRAN is needed, but I think most of the time students should write C code that talks to FORTRAN libraries; this is simpler.
And FORTRAN shouldn't be the first language taught.
I opt for Python; it is so easy to write complex applications.
I also personally like D. :)</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28304829</id>
	<title>it's actually OK</title>
	<author>jipn4</author>
	<datestamp>1244748120000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>As a language, Fortran supports pretty much what C++ supports (including overloading, templates, and objects), but is probably easier to use.</p><p>In terms of compilers, GNU Fortran supports Fortran 95 and many newer features; it's pretty much as good (or bad) as commercial Fortran implementations.</p><p>Python probably is a good first language.  But writing numerical code in Python is actually harder than in modern Fortran because a lot of "natural" code in Python (loops over arrays) will run very slowly; in Python, if you want to write fast numerical code, you need to figure out how to convert it to array expressions.  Fortran has both array expressions and loops, and they both run fast.</p><p>If people are going to learn just one language, with modern Fortran dialects and compilers, Fortran may be a reasonable choice again.  In many ways, for people writing numerical code, Python is not an easy language.   In fact, many people may opt for Matlab, because Matlab now (apparently) has a JIT that makes numerical loop code run fast as well, so it combines some of the advantages of Fortran and Python (but it's hugely expensive and not a very good language IMO).</p></htmltext>
<tokenext>As a language , Fortran supports pretty much what C + + supports ( including overloading , templates , and objects ) , but is probably easier to use.In terms of compilers , GNU Fortran supports Fortran 95 and many newer features ; it 's pretty much as good ( or bad ) as commercial Fortran implementations.Python probably is a good first language .
But writing numerical code in Python is actually harder than in modern Fortran because a lot of " natural " code in Python ( loops over arrays ) will run very slowly ; in Python , if you want to write fast numerical code , you need to figure out how to convert it to array expressions .
Fortran has both array expressions and loops , and they both run fast.If people are going to learn just one language , with modern Fortran dialects and compilers , Fortran may be a reasonable choice again .
In many ways , for people writing numerical code , Python is not an easy language .
In fact , many people may opt for Matlab , because Matlab now ( apparently ) has a JIT that makes numerical loop code run fast as well , so it combines some of the advantages of Fortran and Python ( but it 's hugely expensive and not a very good language IMO ) .</tokentext>
<sentencetext>As a language, Fortran supports pretty much what C++ supports (including overloading, templates, and objects), but is probably easier to use.In terms of compilers, GNU Fortran supports Fortran 95 and many newer features; it's pretty much as good (or bad) as commercial Fortran implementations.Python probably is a good first language.
But writing numerical code in Python is actually harder than in modern Fortran because a lot of "natural" code in Python (loops over arrays) will run very slowly; in Python, if you want to write fast numerical code, you need to figure out how to convert it to array expressions.
Fortran has both array expressions and loops, and they both run fast.If people are going to learn just one language, with modern Fortran dialects and compilers, Fortran may be a reasonable choice again.
In many ways, for people writing numerical code, Python is not an easy language.
In fact, many people may opt for Matlab, because Matlab now (apparently) has a JIT that makes numerical loop code run fast as well, so it combines some of the advantages of Fortran and Python (but it's hugely expensive and not a very good language IMO).</sentencetext>
</comment>
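The parent's point about Python-level loops versus array expressions comes down to where the loop actually runs. A rough stdlib-only sketch, using built-in `sum()` as a stand-in for a compiled array expression (NumPy or a Fortran loop would show the same effect, usually more dramatically):

```python
import timeit

data = list(range(100_000))

def loop_sum(xs):
    """Explicit Python-level loop: one interpreted step per element."""
    total = 0
    for x in xs:
        total += x
    return total

# Built-in sum() runs its loop in C, the way a NumPy array expression
# (or a Fortran DO loop) runs in compiled code.
t_loop = timeit.timeit(lambda: loop_sum(data), number=20)
t_builtin = timeit.timeit(lambda: sum(data), number=20)
print(f"python loop: {t_loop:.3f}s   builtin sum: {t_builtin:.3f}s")
```

Both compute the same result; the timings are machine-dependent, but the compiled loop is consistently faster, which is why "natural" element-by-element Python code has to be rewritten as array expressions to be fast.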
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292225</id>
	<title>Re:While there may be "newer" languages</title>
	<author>pzs</author>
	<datestamp>1244729700000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Is Fortran really that much quicker than Python+Numpy? I'm genuinely interested. I do large-data-set numerical stuff with Python and I find the performance is pretty good. As an example, I can z-score an array of 40 million data points in less than 2 seconds.</p><p>There is also decent support for linking in more high-performance stuff, using Cython or SWIG. I'm also not convinced that the execution speed benefits of a dated language like Fortran stack up against the convenience of programming in Python. I would not want to manage the kind of complexity and variety I need to deal with in a language without modern OO support.</p><p>I'd rather it took a few weeks to write and a few days to run than a few months to write and a few hours to run. I guess this depends on how many runs you expect to do.</p></htmltext>
<tokenext>Is Fortran really that much quicker than Python + Numpy ?
I 'm genuinely interested .
I do large data set numerical stuff with Python and I find the performance is pretty good .
As an example , I can z-score an array of 40 million data points in less than 2 seconds.There is also decent support for linking in more high-performance stuff , using Cython or SWIG .
I 'm also not convinced that the execution speed benefits of a dated language like Fortran stack up against the convenience of programming in Python .
I would not want to manage the kind of complexity and variety I need to deal with in a language without modern OO support.I 'd rather it took a few weeks to write and a few days to run than a few months to write and a few hours to run .
I guess this depends how many runs you expect to do .</tokentext>
<sentencetext>Is Fortran really that much quicker than Python+Numpy?
I'm genuinely interested.
I do large data set numerical stuff with Python and I find the performance is pretty good.
As an example, I can z-score an array of 40 million data points in less than 2 seconds.There is also decent support for linking in more high-performance stuff, using Cython or SWIG.
I'm also not convinced that the execution speed benefits of a dated language like Fortran stack up against the convenience of programming in Python.
I would not want to manage the kind of complexity and variety I need to deal with in a language without modern OO support.I'd rather it took a few weeks to write and a few days to run than a few months to write and a few hours to run.
I guess this depends how many runs you expect to do.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
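For reference, the z-score operation the parent mentions is just (x - mean) / stdev applied across the whole array. A stdlib-Python sketch of the computation (NumPy performs the same arithmetic in compiled loops, which is where the 40-million-points-in-2-seconds figure comes from):

```python
from statistics import mean, pstdev

def zscore(xs):
    """Standardize a dataset: subtract the mean, divide by the std deviation."""
    m = mean(xs)
    s = pstdev(xs)  # population standard deviation
    return [(x - m) / s for x in xs]

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
z = zscore(data)
print(z)  # [-1.5, -0.5, -0.5, -0.5, 0.0, 0.0, 1.0, 2.0]
```

After standardizing, the result has mean 0 and standard deviation 1 by construction, which is what makes z-scored data comparable across differently-scaled measurements.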
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292251</id>
	<title>Re:While there may be "newer" languages</title>
	<author>jstults</author>
	<datestamp>1244729820000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>5</modscore>
	<htmltext><p><div class="quote"><p>Fortran is still one of the best, fastest, most optimized tools for number crunching.</p></div><p>Agreed.</p><p><div class="quote"><p>It's also very easy to write simple programs in it.</p></div><p>This is a strength of Python too.</p><p><div class="quote"><p> No way I'd use Python for serious large data set numerical calculations.</p></div><p>It's not either/or, with F2Py you can put your inner loops in Fortran, and deal with the higher level abstractions with Python.  So you get fast number crunching and all the 'batteries included' too.</p></div>
	</htmltext>
<tokenext>Fortran is still one of the best , fastest , most optimized tools for number crunching.Agreed.It 's also very easy to write simple programs in it.This is a strength of Python too .
No way I 'd use Python for serious large data set numerical calculations.It 's not either/or , with F2Py you can put your inner loops in Fortran , and deal with the higher level abstractions with Python .
So you get fast number crunching and all the 'batteries included ' too .</tokentext>
<sentencetext>Fortran is still one of the best, fastest, most optimized tools for number crunching.Agreed.It's also very easy to write simple programs in it.This is a strength of Python too.
No way I'd use Python for serious large data set numerical calculations.It's not either/or, with F2Py you can put your inner loops in Fortran, and deal with the higher level abstractions with Python.
So you get fast number crunching and all the 'batteries included' too.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294123</id>
	<title>Re:While there may be "newer" languages</title>
	<author>Fnkmaster</author>
	<datestamp>1244736360000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>If you are using numpy you are using both Python and Fortran to do serious numerical calculations.  I generally don't work with particularly large data sets, so I can't speak to that, but for the fairly serious financial number crunching I do, Python is a fantastic tool, with the help of lots of highly optimized Fortran libraries.</p></htmltext>
<tokenext>If you are using numpy you are using both Python and Fortran to do serious numerical calculations .
I generally do n't work with particularly large data sets , so I ca n't speak to that , but for the fairly serious financial number crunching I do , Python is a fantastic tool , with the help of lots of highly optimized Fortran libraries .</tokentext>
<sentencetext>If you are using numpy you are using both Python and Fortran to do serious numerical calculations.
I generally don't work with particularly large data sets, so I can't speak to that, but for the fairly serious financial number crunching I do, Python is a fantastic tool, with the help of lots of highly optimized Fortran libraries.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</id>
	<title>While there may be "newer" languages</title>
	<author>wireloose</author>
	<datestamp>1244728260000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>5</modscore>
	<htmltext>Fortran is still one of the best, fastest, most optimized tools for number crunching.  It's also very easy to write simple programs in it.  No way I'd use Python for serious large data set numerical calculations.</htmltext>
<tokenext>Fortran is still one of the best , fastest , most optimized tools for number crunching .
It 's also very easy to write simple programs in it .
No way I 'd use Python for serious large data set numerical calculations .</tokentext>
<sentencetext>Fortran is still one of the best, fastest, most optimized tools for number crunching.
It's also very easy to write simple programs in it.
No way I'd use Python for serious large data set numerical calculations.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28302341</id>
	<title>my first project after graduation....</title>
	<author>motherpusbucket</author>
	<datestamp>1244724060000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>back in 1989 was an engineering analysis tester written in FORTRAN.  The language was chosen because my assignment was temporary and the guy that was permanently in the department only knew that language.  The tester code was a combo of FORTRAN and Micro$oft Assembler for all the bits FORTRAN couldn't handle.  Even in the late '80s, as an EE student, I thought FORTRAN was outdated when it was required in our studies.  However, in my case, it did prove useful.
<br>
Since graduating, I've learned many other languages on my own, but FORTRAN was the cornerstone.  Not the language, but the concept.  As long as the concept of programming is taught correctly, other languages should easily be picked up as needed and it shouldn't matter what course you took in college.  C is probably a bit much for the casual programmer, but FORTRAN is a simple alternative.  Chasing the latest 'flavor of the week language' really shouldn't be the goal.  Teaching 'programming as a concept' is the important thing.  If done correctly, a person should be able to easily apply the concept to whatever language their job demands.</htmltext>
<tokenext>back in 1989 was an engineering analysis tester written in FORTRAN .
The language was chosen because my assignment was temporary and the guy that was permanently in the department only knew that language .
The tester code was a combo of FORTRAN and Micro $ oft Assembler for all the bits FORTRAN could n't handle .
Even in the late '80s , as an EE student , I thought FORTRAN was outdated when it was required in our studies .
However in my case , it did prove useful .
Since graduating , I 've learned many other languages on my own , but FORTRAN was the cornerstone .
Not the language , but the concept .
As long as the concept of programming is taught correctly , other languages should easily be picked up as needed and it should n't matter what course you took in college .
C is probably a bit much for the casual programmer , but FORTRAN is a simple alternative .
Chasing the latest 'flavor of the week language ' really should n't be the goal .
Teaching 'programming as a concept ' is the important thing .
If done correctly , a person should be able to easily apply the concept to whatever language their job demands .</tokentext>
<sentencetext>back in 1989 was an engineering analysis tester written in FORTRAN.
The language was chosen because my assignment was temporary and the guy that was permanently in the department only knew that language.
The tester code was a combo of FORTRAN and Micro$oft Assembler for all the bits FORTRAN couldn't handle.
Even in the late '80s, as an EE student, I thought FORTRAN was outdated when it was required in our studies.
However in my case, it did prove useful.
Since graduating, I've learned many other languages on my own, but FORTRAN was the cornerstone.
Not the language, but the concept.
As long as the concept of programming is taught correctly, other languages should easily be picked up as needed and it shouldn't matter what course you took in college.
C is probably a bit much for the casual programmer, but FORTRAN is a simple alternative.
Chasing the latest 'flavor of the week language' really shouldn't be the goal.
Teaching 'programming as a concept' is the important thing.
If done correctly, a person should be able to easily apply the concept to whatever language their job demands.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292955</id>
	<title>Fortran is simple.</title>
	<author>iVasto</author>
	<datestamp>1244732160000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>The reason universities still teach Fortran is that it is very simple to learn. My roommates were taught it and they had trouble with the language. I couldn't imagine trying to teach them a more complex language. The most important thing about learning your first language is not the actual syntax; it is learning how to think.</htmltext>
<tokenext>The reason universities still teach Fortran is that it is very simple to learn .
My roommates were taught it and they had trouble with the language .
I could n't imagine trying to teach them a more complex language .
The most important thing about learning your first language is not the actual syntax ; it is learning how to think .</tokentext>
<sentencetext>The reason universities still teach Fortran is that it is very simple to learn.
My roommates were taught it and they had trouble with the language.
I couldn't imagine trying to teach them a more complex language.
The most important thing about learning your first language is not the actual syntax; it is learning how to think.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296525</id>
	<title>OO Fortran95</title>
	<author>plopez</author>
	<datestamp>1244744940000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Fortran 95 is a fully functioning OO programming language. Get rid of your biases and old misconceived ideas. All that, plus built-in parallel programming, and it's fast.</p><p>It's a natural choice if you need number crunching.</p><p>If you need string processing, use Perl or Ruby or some other scripting language.</p><p>Python is lower on the list.</p></htmltext>
<tokenext>Fortran 95 is a fully functioning OO programming language .
Get rid of your biases and old misconceived ideas .
All that , plus built-in parallel programming , and it 's fast.It 's a natural choice if you need number crunching.If you need string processing , use Perl or Ruby or some other scripting language.Python is lower on the list .</tokentext>
<sentencetext>Fortran 95 is a fully functioning OO programming language.
Get rid of your biases and old misconceived ideas.
All that, plus built-in parallel programming, and it's fast.It's a natural choice if you need number crunching.If you need string processing, use Perl or Ruby or some other scripting language.Python is lower on the list.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292915</id>
	<title>Re:University != Trade school</title>
	<author>dkf</author>
	<datestamp>1244731980000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><div class="quote"><p>IMO universities should be teaching core principles and methods, not attempting to impart up-to-date job skills.</p></div><p>So you advocate teaching full-on computer science to physics, chemistry and engineering students? Why? Shouldn't they be learning about the fundamentals of their own fields instead?</p></div>
	</htmltext>
<tokenext>IMO universities should be teaching core principles and methods , not attempting to impart up-to-date job skills.So you advocate teaching full-on computer science to physics , chemistry and engineering students ?
Why ? Should n't they be learning about the fundamentals of their own fields instead ?</tokentext>
<sentencetext>IMO universities should be teaching core principles and methods, not attempting to impart up-to-date job skills.So you advocate teaching full-on computer science to physics, chemistry and engineering students?
Why? Shouldn't they be learning about the fundamentals of their own fields instead?
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297005</id>
	<title>Re:While there may be "newer" languages</title>
	<author>ctrl-alt-canc</author>
	<datestamp>1244746620000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Totally second this: the Fortran legacy is wonderful (just think about BLAS, LINPACK, EISPACK, etc...), but we can use it from whatever environment we want.</htmltext>
<tokenext>Totally second this : the Fortran legacy is wonderful ( just think about BLAS , LINPACK , EISPACK , etc... ) , but we can use it from whatever environment we want .</tokentext>
<sentencetext>Totally second this: the Fortran legacy is wonderful (just think about BLAS, LINPACK, EISPACK, etc...), but we can use it from whatever environment we want.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292125</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293993</id>
	<title>Re:University != Trade school</title>
	<author>Peter La Casse</author>
	<datestamp>1244735940000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p>IMO universities should be teaching core principles and methods, not attempting to impart up-to-date job skills.</p></div></blockquote><p>I used to agree, but I changed my mind.  In addition to having learned core principles and methods, graduates from universities should be able to support themselves immediately after graduation.  That means that they should be able to bring enough value to an organization (possibly one they form themselves) to justify their employment, to provide themselves with food, shelter and the other essentials of life.  University is not Trade School: University is a superset of Trade School.
</p><p>Granted, most of those "job skills" should be learned before a student enters the university: every high school graduate should be able to make a budget, provide themselves with food and shelter, tie their own shoes without help and so on.  Obviously we're far from that.
</p><p>On the specific topic of this story: some scientists need to know Fortran to do their jobs.  It may well be good to teach some other language first, or to require a Programming Languages course in every undergraduate science program, but it's essential for undergraduates to learn the programming language(s) that dominate(s) their field.</p>
	</htmltext>
<tokentext>IMO universities should be teaching core principles and methods , not attempting to impart up-to-date job skills.I used to agree , but I changed my mind .
In addition to having learned core principles and methods , graduates from universities should be able to support themselves immediately after graduation .
That means that they should be able to bring enough value to an organization ( possibly one they form themselves ) to justify their employment , to provide themselves with food , shelter and the other essentials of life .
University is not Trade School : University is a superset of Trade School .
Granted , most of those " job skills " should be learned before a student enters the university : every high school graduate should be able to make a budget , provide themselves with food and shelter , tie their own shoes without help and so on .
Obviously we 're far from that .
On the specific topic of this story : some scientists need to know Fortran to do their jobs .
It may well be good to teach some other language first , or to require a Programming Languages course in every undergraduate science program , but it 's essential for undergraduates to learn the programming language ( s ) that dominate ( s ) their field .</tokentext>
<sentencetext>IMO universities should be teaching core principles and methods, not attempting to impart up-to-date job skills.I used to agree, but I changed my mind.
In addition to having learned core principles and methods, graduates from universities should be able to support themselves immediately after graduation.
That means that they should be able to bring enough value to an organization (possibly one they form themselves) to justify their employment, to provide themselves with food, shelter and the other essentials of life.
University is not Trade School: University is a superset of Trade School.
Granted, most of those "job skills" should be learned before a student enters the university: every high school graduate should be able to make a budget, provide themselves with food and shelter, tie their own shoes without help and so on.
Obviously we're far from that.
On the specific topic of this story: some scientists need to know Fortran to do their jobs.
It may well be good to teach some other language first, or to require a Programming Languages course in every undergraduate science program, but it's essential for undergraduates to learn the programming language(s) that dominate(s) their field.
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292321</id>
	<title>Re:While there may be "newer" languages</title>
	<author>Anonymous</author>
	<datestamp>1244730060000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Decades of well tested and well optimised libraries.</p></htmltext>
<tokentext>Decades of well tested and well optimised libraries .</tokentext>
<sentencetext>Decades of well tested and well optimised libraries.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303715</id>
	<title>Re:While there may be "newer" languages</title>
	<author>ceoyoyo</author>
	<datestamp>1244735160000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>2</modscore>
	<htmltext><p>Python includes a great deal of MatLab functionality if you go look for it, and has lots of other advantages over MatLab.  It's a modern, general purpose, object oriented language that scales well.  You can write multi-thousand line apps in Python that are elegant and easy to maintain and understand.  MatLab is designed to write very short programs in, but people are always trying to write big, full fledged apps anyway.</p><p>Plus Python is free and MatLab costs thousands.</p></htmltext>
<tokentext>Python includes a great deal of MatLab functionality if you go look for it , and has lots of other advantages over MatLab .
It 's a modern , general purpose , object oriented language that scales well .
You can write multi-thousand line apps in Python that are elegant and easy to maintain and understand .
MatLab is designed to write very short programs in , but people are always trying to write big , full fledged apps anyway.Plus Python is free and MatLab costs thousands .</tokentext>
<sentencetext>Python includes a great deal of MatLab functionality if you go look for it, and has lots of other advantages over MatLab.
It's a modern, general purpose, object oriented language that scales well.
You can write multi-thousand line apps in Python that are elegant and easy to maintain and understand.
MatLab is designed to write very short programs in, but people are always trying to write big, full fledged apps anyway.Plus Python is free and MatLab costs thousands.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401</parent>
</comment>
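The claim above, that Python covers much of the everyday MatLab workflow, can be sketched with standard NumPy (treating these few operations as representative of that workflow is my assumption):

```python
import numpy as np

# A few MatLab-style matrix operations, done with NumPy:
M = np.arange(1.0, 7.0).reshape(2, 3)  # 2x3 matrix [[1,2,3],[4,5,6]]
col_means = M.mean(axis=0)             # column means, like mean(M) in MatLab
Mt = M.T                               # transpose, like M'
print(col_means)   # [2.5 3.5 4.5]
print(Mt.shape)    # (3, 2)
```

And, as the comment notes, this runs on a free interpreter rather than a licensed one.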
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292347</id>
	<title>Not sure it's a big deal</title>
	<author>Kirby</author>
	<datestamp>1244730120000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I tend to think that the goal is not to learn specific instances of skills in college (ie, a particular language), but learn principles.  It's very much worth any science student's time to know a programming language.  It's a reasonable thing to learn programming well enough to pick up any language - but really, this is one path of many, and is probably best called 'minor in computer science'.  Yes, Fortran is not exactly cutting edge, but it's a fine example of one kind of language.</p><p>If you wanted to learn the language most commonly used in science labs, sad to say as a programmer type, probably spend some time with Visual Basic.</p><p>As an ecosystem, the best thing is probably variety.  It's good that some percentage of new graduates know Fortran, so it doesn't become burdensome to hire people to work at places with legacy codesets, but it's also good that it's possible to hire someone working in a language that might be more suited to what you're doing.  Or to a different type of brain.  Diversity is good in a larger sense - someone who might be a mediocre Java programmer might be a star Perl programmer, and vice versa.  The more diversity gives more individuals a chance at finding a niche they excel in.  I think this aspect is overlooked massively when people talk about 'the right tool for the job'.</p><p>If you're a scientist that likes programming, though, a minor in CS so you can pick up whatever language comes your way is an approach with a lot of practical potential.  (Or a double major, but CS is hard and most Science programs are hard, so I'm not sure it's worth it - even the people that hire programmers that like degrees would not find fault with a Physics degree and a CS minor.)</p></htmltext>
<tokentext>I tend to think that the goal is not to learn specific instances of skills in college ( ie , a particular language ) , but learn principles .
It 's very much worth any science student 's time to know a programming language .
It 's a reasonable thing to learn programming well enough to pick up any language - but really , this is one path of many , and is probably best called 'minor in computer science' .
Yes , Fortran is not exactly cutting edge , but it 's a fine example of one kind of language.If you wanted to learn the language most commonly used in science labs , sad to say as a programmer type , probably spend some time with Visual Basic.As an ecosystem , the best thing is probably variety .
It 's good that some percentage of new graduates know Fortran , so it does n't become burdensome to hire people to work at places with legacy codesets , but it 's also good that it 's possible to hire someone working in a language that might be more suited to what you 're doing .
Or to a different type of brain .
Diversity is good in a larger sense - someone who might be a mediocre Java programmer might be a star Perl programmer , and vice versa .
The more diversity gives more individuals a chance at finding a niche they excel in .
I think this aspect is overlooked massively when people talk about 'the right tool for the job'.If you 're a scientist that likes programming , though , a minor in CS so you can pick up whatever language comes your way is an approach with a lot of practical potential .
( Or a double major , but CS is hard and most Science programs are hard , so I 'm not sure it 's worth it - even the people that hire programmers that like degrees would not find fault with a Physics degree and a CS minor .
)</tokentext>
<sentencetext>I tend to think that the goal is not to learn specific instances of skills in college (ie, a particular language), but learn principles.
It's very much worth any science student's time to know a programming language.
It's a reasonable thing to learn programming well enough to pick up any language - but really, this is one path of many, and is probably best called 'minor in computer science'.
Yes, Fortran is not exactly cutting edge, but it's a fine example of one kind of language.If you wanted to learn the language most commonly used in science labs, sad to say as a programmer type, probably spend some time with Visual Basic.As an ecosystem, the best thing is probably variety.
It's good that some percentage of new graduates know Fortran, so it doesn't become burdensome to hire people to work at places with legacy codesets, but it's also good that it's possible to hire someone working in a language that might be more suited to what you're doing.
Or to a different type of brain.
Diversity is good in a larger sense - someone who might be a mediocre Java programmer might be a star Perl programmer, and vice versa.
The more diversity gives more individuals a chance at finding a niche they excel in.
I think this aspect is overlooked massively when people talk about 'the right tool for the job'.If you're a scientist that likes programming, though, a minor in CS so you can pick up whatever language comes your way is an approach with a lot of practical potential.
(Or a double major, but CS is hard and most Science programs are hard, so I'm not sure it's worth it - even the people that hire programmers that like degrees would not find fault with a Physics degree and a CS minor.
)</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292063</id>
	<title>FORTRAN is better than Python</title>
	<author>Anonymous</author>
	<datestamp>1244729100000</datestamp>
	<modclass>Troll</modclass>
	<modscore>0</modscore>
	<htmltext><p>because Python is interpreted and FORTRAN is compiled.  And honestly, let's admit it, the whole object oriented thing has pretty much failed and it's time to pack it up and get procedural again.</p></htmltext>
<tokentext>because Python is interpreted and FORTRAN is compiled .
And honestly , let 's admit it , the whole object oriented thing has pretty much failed and it 's time to pack it up and get procedural again .</tokentext>
<sentencetext>because Python is interpreted and FORTRAN is compiled.
And honestly, let's admit it, the whole object oriented thing has pretty much failed and it's time to pack it up and get procedural again.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28304339</id>
	<title>FORTRAN I/O is a killer</title>
	<author>nsaspook</author>
	<datestamp>1244740680000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I once had to copy Nixdorf data-entry files to 360K single sided 8" floppies for General Dynamics in San Diego. So I wrote a FORTRAN program to reprogram the floppy controller to write the data in raw mode to 128 byte sectors to emulate a card punch. The floppy drives were connected to the serial port on the I/O controller (Harris H-series). So to make it work we would drop the data-entry tape in the Harris tape reader and mount it, insert a blank floppy in the monster drive, start my program to begin reading the tape, and keep swapping in new floppies as they filled up with data. Why they needed the data in that archaic format I will never know.</p></htmltext>
<tokentext>I once had to copy Nixdorf data-entry files to 360K single sided 8 " floppies for General Dynamics in San Diego .
So I wrote a FORTRAN program to reprogram the floppy controller to write the data in raw mode to 128 byte sectors to emulate a card punch .
The floppy drives were connected to the serial port on the I/O controller ( Harris H-series ) .
So to make it work we would drop the data-entry tape in the Harris tape reader and mount it , insert a blank floppy in the monster drive , start my program to begin reading the tape , and keep swapping in new floppies as they filled up with data .
Why they needed the data in that archaic format I will never know .</tokentext>
<sentencetext>I once had to copy Nixdorf data-entry files to 360K single sided 8" floppies for General Dynamics in San Diego.
So I wrote a FORTRAN program to reprogram the floppy controller to write the data in raw mode to 128 byte sectors to emulate a card punch.
The floppy drives were connected to the serial port on the I/O controller (Harris H-series). So to make it work we would drop the data-entry tape in the Harris tape reader and mount it, insert a blank floppy in the monster drive, start my program to begin reading the tape, and keep swapping in new floppies as they filled up with data.
Why they needed the data in that archaic format I will never know.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293451</id>
	<title>Pascal! no - Javascript!</title>
	<author>ggpauly</author>
	<datestamp>1244733780000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>The Intro to Programming course I took at Indiana University (a few years ago) taught the basics of programming in Pascal followed by 2 weeks or so of FORTRAN.  Both languages are primitive compared to modern languages.  But I learned the basic idea of programming, which is still exciting to me, and the lesson that learning a new language is not so difficult and well worth the effort.</p><p>I used both languages as an undergraduate: FORTRAN in part time jobs in the chemistry department and Pascal for an undergraduate thesis project.</p><p>I'd like to give a plug for Javascript as an intro language.  The language itself (not the browser DOM) is extremely simple and surprisingly powerful, with OO, C syntax, type, functional, and imperative aspects.  Perhaps a version with an integer type could be used for practical considerations.</p><p>In my opinion the current practice of teaching Java to new students is a mistake, but then again I think any use of Java is a mistake.</p></htmltext>
<tokentext>The Intro to Programming course I took at Indiana University ( a few years ago ) taught the basics of programming in Pascal followed by 2 weeks or so of FORTRAN .
Both languages are primitive compared to modern languages .
But I learned the basic idea of programming , which is still exciting to me , and the lesson that learning a new language is not so difficult and well worth the effort.I used both languages as an undergraduate : FORTRAN in part time jobs in the chemistry department and Pascal for an undergraduate thesis project.I 'd like to give a plug for Javascript as an intro language .
The language itself ( not the browser DOM ) is extremely simple and surprisingly powerful , with OO , C syntax , type , functional , and imperative aspects .
Perhaps a version with an integer type could be used for practical considerations.In my opinion the current practice of teaching Java to new students is a mistake , but then again I think any use of Java is a mistake .</tokentext>
<sentencetext>The Intro to Programming course I took at Indiana University (a few years ago) taught the basics of programming in Pascal followed by 2 weeks or so of FORTRAN.
Both languages are primitive compared to modern languages.
But I learned the basic idea of programming, which is still exciting to me, and the lesson that learning a new language is not so difficult and well worth the effort.I used both languages as an undergraduate: FORTRAN in part time jobs in the chemistry department and Pascal for an undergraduate thesis project.I'd like to give a plug for Javascript as an intro language.
The language itself (not the browser DOM) is extremely simple and surprisingly powerful, with OO, C syntax, type, functional, and imperative aspects.
Perhaps a version with an integer type could be used for practical considerations.In my opinion the current practice of teaching Java to new students is a mistake, but then again I think any use of Java is a mistake.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294053</id>
	<title>FORTRAN for scientists and engineers</title>
	<author>Anonymous</author>
	<datestamp>1244736120000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I had trouble logging in so I'll do the Anonymous Coward.</p><p>A surprisingly large amount of the actual calculations in software, from car crash simulations, to aerodynamics, to stream flow and groundwater simulation, and all EPA air quality models and weather prediction models, use FORTRAN.  The user may see interfaces that are written in other languages, but in the guts of the thing, FORTRAN is what drives the model.  At some point in the student's Senior thesis or soon after they get out in a real job, they may run into the need to get into the model.  Certainly in the atmospheric sciences and air quality areas - you are really handicapped, and your employer is, if someone doesn't know FORTRAN.  If you look at the list of top supercomputers, a lot of them are spending most of their time running FORTRAN.  We had to start a Computations in Meteorology class for our upper level undergraduates and graduate students in atmospheric science, primarily to teach them FORTRAN and how to deal with the multitude of flavors of "standard formats" that data comes in.  Our word back from employers is that knowing FORTRAN and C is a definite plus for a B.S. and is virtually assumed for higher degrees.</p></htmltext>
<tokentext>I had trouble logging in so I 'll do the Anonymous Coward.A surprisingly large amount of the actual calculations in software , from car crash simulations , to aerodynamics , to stream flow and groundwater simulation , and all EPA air quality models and weather prediction models , use FORTRAN .
The user may see interfaces that are written in other languages , but in the guts of the thing , FORTRAN is what drives the model .
At some point in the student 's Senior thesis or soon after they get out in a real job , they may run into the need to get into the model .
Certainly in the atmospheric sciences and air quality areas - you are really handicapped , and your employer is , if someone does n't know FORTRAN .
If you look at the list of top supercomputers a lot of them are spending most of their time running FORTRAN .
We had to start a Computations in Meteorology class for our upper level undergraduates and graduate students in atmospheric science , primarily to teach them FORTRAN and how to deal with the multitude of flavors of " standard formats " that data comes in .
Our word back from employers is that knowing FORTRAN and C is a definite plus for a B.S. and is virtually assumed for higher degrees .</tokentext>
<sentencetext>I had trouble logging in so I'll do the Anonymous Coward.A surprisingly large amount of the actual calculations in software, from car crash simulations, to aerodynamics, to stream flow and groundwater simulation, and all EPA air quality models and weather prediction models, use FORTRAN.
The user may see interfaces that are written in other languages, but in the guts of the thing, FORTRAN is what drives the model.
At some point in the student's Senior thesis or soon after they get out in a real job, they may run into the need to get into the model.
Certainly in the atmospheric sciences and air quality areas  - you are really handicapped, and your employer is, if someone doesn't know FORTRAN.
If you look at the list of top supercomputers a lot of them are spending most of their time running FORTRAN.
We had to start a Computations in Meteorology class for our upper level undergraduates and graduate students in atmospheric science, primarily to teach them FORTRAN and how to deal with the multitude of flavors of "standard formats" that data comes in.
Our word back from employers is that knowing FORTRAN and C is a definite plus for a B.S. and is virtually assumed for higher degrees.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291939</id>
	<title>"Introductory"</title>
	<author>nine-times</author>
	<datestamp>1244728500000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Disclaimer: I am not a programmer.
</p><p>It seems to me like there should be a good "introductory" language that, whether or not knowing it turns out to be practical, would help teach programming concepts and good practices to new programmers.  That is, if the goal is really to teach programming.  Usually, when you're introducing someone to a topic, it's a good idea to start with something representative and likely to instill interest in the topic.
</p><p>I don't know which programming language fits the bill, though.</p></htmltext>
<tokentext>Disclaimer : I am not a programmer .
It seems to me like there should be a good " introductory " language that , whether or not knowing it turns out to be practical , would help teach programming concepts and good practices to new programmers .
That is , if the goal is really to teach programming .
Usually , when you 're introducing someone to a topic , it 's a good idea to start with something representative and likely to instill interest in the topic .
I do n't know which programming language fits the bill , though .</tokentext>
<sentencetext>Disclaimer: I am not a programmer.
It seems to me like there should be a good "introductory" language that, whether or not knowing it turns out to be practical, would help teach programming concepts and good practices to new programmers.
That is, if the goal is really to teach programming.
Usually, when you're introducing someone to a topic, it's a good idea to start with something representative and likely to instill interest in the topic.
I don't know which programming language fits the bill, though.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292801</id>
	<title>From an engineer's viewpoint</title>
	<author>dw\_g</author>
	<datestamp>1244731620000</datestamp>
	<modclass>Informativ</modclass>
	<modscore>4</modscore>
	<htmltext>I'm an engineer with a large aerospace firm.  All our major programs are in Fortran and have to be used, modified, and maintained.

I remember a few years ago we hired a new grad from MIT; she had studied Basic, Pascal, and C; so of course we had to teach her Fortran so she could do her work.

The engineering world is heavily dependent upon Fortran, and to not know it puts you at a huge disadvantage.</htmltext>
<tokentext>I 'm an engineer with a large aerospace firm .
All our major programs are in Fortran and have to be used , modified , and maintained .
I remember a few years ago we hired a new grad from MIT ; she had studied Basic , Pascal , and C ; so of course we had to teach her Fortran so she could do her work .
The engineering world is heavily dependent upon Fortran , and to not know it puts you at a huge disadvantage .</tokentext>
<sentencetext>I'm an engineer with a large aerospace firm.
All our major programs are in Fortran and have to be used, modified, and maintained.
I remember a few years ago we hired a new grad from MIT; she had studied Basic, Pascal, and C; so of course we had to teach her Fortran so she could do her work.
The engineering world is heavily dependent upon Fortran, and to not know it puts you at a huge disadvantage.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298573</id>
	<title>Modula 2</title>
	<author>mujadaddy</author>
	<datestamp>1244751960000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>...anyone else?</htmltext>
<tokentext>...anyone else ?</tokentext>
<sentencetext>...anyone else?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292733</id>
	<title>Re:I still use Fortran for scientific calculations</title>
	<author>Anonymous</author>
	<datestamp>1244731440000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Agreed. About fifteen years ago, a bunch of us (NASA physicists) decided we needed to learn C++, so we organized a self-taught course. After a couple months, we gave it up, because the language was ill-suited for high-performance computing.</p><p>It's also amusing to hear all those objections (Uppercase weirdness! Special columns! Short variable names! Antediluvian control structures!) that simply demonstrate that the complainer hasn't looked at anything newer than Fortran IV. Array syntax alone makes the modern language worthwhile.</p><p>That said, I don't think teaching Fortran as a first language makes much sense. When Fortran was new and shiny, scientific/engineering number crunching was a primary market. That's no longer true. While Fortran has continued to evolve as a specialized language supporting high-performance computing, using it to develop a GUI-based app is simply a particularly horrible form of self-mutilation. It makes the most sense to teach a first language that's more general-purpose, and if you find later that you need Fortran, it's easy to pick up as an nth language.</p></htmltext>
<tokentext>Agreed .
About fifteen years ago , a bunch of us ( NASA physicists ) decided we needed to learn C + + , so we organized a self-taught course .
After a couple months , we gave it up , because the language was ill-suited for high-performance computing.It 's also amusing to hear all those objections ( Uppercase weirdness !
Special columns !
Short variable names !
Antediluvian control structures !
) that simply demonstrate that the complainer has n't looked at anything newer than Fortran IV .
Array syntax alone makes the modern language worthwhile.That said , I do n't think teaching Fortran as a first language makes much sense .
When Fortran was new and shiny , scientific/engineering number crunching was a primary market .
That 's no longer true .
While Fortran has continued to evolve as a specialized language supporting high-performance computing , using it to develop a GUI-based app is simply a particularly horrible form of self-mutilation .
It makes the most sense to teach a first language that 's more general-purpose , and if you find later that you need Fortran , it 's easy to pick up as an nth language .</tokentext>
<sentencetext>Agreed.
About fifteen years ago, a bunch of us (NASA physicists) decided we needed to learn C++, so we organized a self-taught course.
After a couple months, we gave it up, because the language was ill-suited for high-performance computing.It's also amusing to hear all those objections (Uppercase weirdness!
Special columns!
Short variable names!
Antediluvian control structures!
) that simply demonstrate that the complainer hasn't looked at anything newer than Fortran IV.
Array syntax alone makes the modern language worthwhile.That said, I don't think teaching Fortran as a first language makes much sense.
When Fortran was new and shiny, scientific/engineering number crunching was a primary market.
That's no longer true.
While Fortran has continued to evolve as a specialized language supporting high-performance computing, using it to develop a GUI-based app is simply a particularly horrible form of self-mutilation.
It makes the most sense to teach a first language that's more general-purpose, and if you find later that you need Fortran, it's easy to pick up as an nth language.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291925</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292875</id>
	<title>Does FORTRAN not offer any value?</title>
	<author>Anonymous</author>
	<datestamp>1244731800000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>What does it mean to be "taught" FORTRAN?  Does this mean that students are merely exposed to the language from a historical perspective, maybe asked to write a few simple programs, and then offered notions about the benefits and drawbacks of using this language?  Or does it mean that they have to craft complex programs to solve difficult engineering problems, thus displaying their mastery of the language?  There's a big difference between those two objectives.</p><p>
&nbsp; </p><p>I think that every computer science student should at least KNOW about this language, what it offers, and the drawbacks of using it.  A language is a tool... and even ancient tools can frequently be applied to modern problems (everyone remember that whole fulcrum/lever thing?), so discounting FORTRAN, simply because it's an older language, is simply "linguistic ageism" and is not a conclusion based upon sound logic.</p><p>
&nbsp; </p><p>If anything, a university should be teaching about how to find the right tool to do the required job, regardless of the age of that tool, and how to apply sound reasoning in making that selection.  Any programmer can learn any language... the point is to optimize your learning, coding and debugging time in an effort to solve the problem that needs solving.  When we start discounting tools because they are "dangerous" or "too complex" or "lacking in strong typing", then we're presupposing that we know better than the talented artist that we hope each programmer will be.</p></htmltext>
<tokentext>What does it mean to be " taught " FORTRAN ?
Does this mean that students are merely exposed to the language from a historical perspective , maybe asked to write a few simple programs , and then offered notions about the benefits and drawbacks of using this language ?
Or does it mean that they have to craft complex programs to solve difficult engineering problems , thus displaying their mastery of the language ?
There 's a big difference between those two objectives .
  I think that every computer science student should at least KNOW about this language , what it offers , and the drawbacks of using it .
A language is a tool... and even ancient tools can frequently be applied to modern problems ( everyone remember that whole fulcrum/lever thing ? ) , so discounting FORTRAN , simply because it 's an older language , is simply " linguistic ageism " and is not a conclusion based upon sound logic .
  If anything , a university should be teaching about how to find the right tool to do the required job , regardless of the age of that tool , and how to apply sound reasoning in making that selection .
Any programmer can learn any language... the point is to optimize your learning , coding and debugging time in an effort to solve the problem that needs solving .
When we start discounting tools because they are " dangerous " or " too complex " or " lacking in strong typing " , then we 're presupposing that we know better than the talented artist that we hope each programmer will be .</tokentext>
<sentencetext>What does it mean to be "taught" FORTRAN?
Does this mean that students are merely exposed to the language from a historical perspective, maybe asked to write a few simple programs, and then offered notions about the benefits and drawbacks of using this language?
Or does it mean that they have to craft complex programs to solve difficult engineering problems, thus displaying their mastery of the language?
There's a big difference between those two objectives.
  I think that every computer science student should at least KNOW about this language, what it offers, and the drawbacks of using it.
A language is a tool... and even ancient tools can frequently be applied to modern problems (everyone remember that whole fulcrum/lever thing?), so discounting FORTRAN, simply because it's an older language, is simply "linguistic ageism" and is not a conclusion based upon sound logic.
  If anything, a university should be teaching about how to find the right tool to do the required job, regardless of the age of that tool, and how to apply sound reasoning in making that selection.
Any programmer can learn any language... the point is to optimize your learning, coding and debugging time in an effort to solve the problem that needs solving.
When we start discounting tools because they are "dangerous" or "too complex" or "lacking in strong typing", then we're presupposing that we know better than the talented artist that we hope each programmer will be.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292233</id>
	<title>Esperanto</title>
	<author>dkh2</author>
	<datestamp>1244729760000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>May as well require them to speak Esperanto too just to make sure nobody understands them.</p><p>I still work with Cobol geeks.  It's actually in regular use.  But I don't know of anybody who uses Fortran except for comp-sci departments.  And if you're using it to teach programming practices you might as well teach said practices in a language they stand a chance of using.</p></htmltext>
<tokenext>May as well require them to speak Esperanto too just to make sure nobody understands them .
I still work with Cobol geeks .
It 's actually in regular use .
But I do n't know of anybody who uses Fortran except for comp-sci departments .
And if you 're using it to teach programming practices you might as well teach said practices in a language they stand a chance of using .</tokentext>
<sentencetext>May as well require them to speak Esperanto too just to make sure nobody understands them.
I still work with Cobol geeks.
It's actually in regular use.
But I don't know of anybody who uses Fortran except for comp-sci departments.
And if you're using it to teach programming practices you might as well teach said practices in a language they stand a chance of using.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295043</id>
	<title>Fortran</title>
	<author>Anonymous</author>
	<datestamp>1244739780000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I graduated from college in 2002.  I've seen over 20 different software packages written in Fortran.  Many widely-used applications are written in Fortran or COBOL.  You're going to see it, so learn it.  Also, I disagree with those who want to learn .NET or some other flavor of the month.  It's useless to know too much about .NET and nothing about other languages, like Fortran.  You really box yourself in for life if you become dependent on a single language.  Many students today have disdain for anything but an drag and drop, plugin based system with 15,000 unnecessary libraries.  Therefore, they're unemployable to our organization.</p></htmltext>
<tokenext>I graduated from college in 2002 .
I 've seen over 20 different software packages written in Fortran .
Many widely-used applications are written in Fortran or COBOL .
You 're going to see it , so learn it .
Also , I disagree with those who want to learn .NET or some other flavor of the month .
It 's useless to know too much about .NET and nothing about other languages , like Fortran .
You really box yourself in for life if you become dependent on a single language .
Many students today have disdain for anything but an drag and drop , plugin based system with 15,000 unnecessary libraries .
Therefore , they 're unemployable to our organization .</tokentext>
<sentencetext>I graduated from college in 2002.
I've seen over 20 different software packages written in Fortran.
Many widely-used applications are written in Fortran or COBOL.
You're going to see it, so learn it.
Also, I disagree with those who want to learn .NET or some other flavor of the month.
It's useless to know too much about .NET and nothing about other languages, like Fortran.
You really box yourself in for life if you become dependent on a single language.
Many students today have disdain for anything but an drag and drop, plugin based system with 15,000 unnecessary libraries.
Therefore, they're unemployable to our organization.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293213</id>
	<title>Python?</title>
	<author>Anonymous</author>
	<datestamp>1244733000000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>1</modscore>
	<htmltext><p>OK, OK.</p><p>Fortran is old.</p><p>But Python? Are you serious?</p><p>pffht.</p></htmltext>
<tokenext>OK , OK .
Fortran is old .
But Python ?
Are you serious ?
pffht .</tokentext>
<sentencetext>OK, OK.
Fortran is old.
But Python?
Are you serious?
pffht.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292159</id>
	<title>Re:While there may be "newer" languages</title>
	<author>ChienAndalu</author>
	<datestamp>1244729400000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>4</modscore>
	<htmltext><p>Use both. I used Fortran to create some python modules at my last job, and it was dead easy. Take a look at <a href="http://cens.ioc.ee/projects/f2py2e/" title="cens.ioc.ee">this</a> [cens.ioc.ee].</p></htmltext>
<tokenext>Use both .
I used Fortran to create some python modules at my last job , and it was dead easy .
Take a look at this [ cens.ioc.ee ] .</tokentext>
<sentencetext>Use both.
I used Fortran to create some python modules at my last job, and it was dead easy.
Take a look at this [cens.ioc.ee].</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28305583</id>
	<title>Pfft</title>
	<author>Rumata</author>
	<datestamp>1244802900000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><blockquote><div><p> <i>I've found watching the facial ticks and foam build up around the mouth from real programmers when they encounter a GOTO to be quite entertaining!</i></p></div> </blockquote><p>Clearly you've never met a <a href="http://www.ccil.org/jargon/jargon\_49.html" title="ccil.org" rel="nofollow">Real Programmer</a> [ccil.org].</p><p>Cheers,<br>Michael</p>
	</htmltext>
<tokenext>I 've found watching the facial ticks and foam build up around the mouth from real programmers when they encounter a GOTO to be quite entertaining !
Clearly you 've never met a Real Programmer [ ccil.org ] .
Cheers , Michael</tokentext>
<sentencetext> I've found watching the facial ticks and foam build up around the mouth from real programmers when they encounter a GOTO to be quite entertaining!
Clearly you've never met a Real Programmer [ccil.org].
Cheers, Michael
	</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294815</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293641</id>
	<title>Re:While there may be "newer" languages</title>
	<author>Anonymous</author>
	<datestamp>1244734620000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>I almost spewed my cookies when I read "Python".</p></htmltext>
<tokenext>I almost spewed my cookies when I read " Python " .</tokentext>
<sentencetext>I almost spewed my cookies when I read "Python".</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293573</id>
	<title>Re:It's okay to teach them FORTRAN</title>
	<author>Anonymous</author>
	<datestamp>1244734260000</datestamp>
	<modclass>Troll</modclass>
	<modscore>-1</modscore>
	<htmltext><p>I have a suggestion.  Since FORTRAN programs are everywhere, how about the Python advocates form a large group of programmers to convert every little purpose specific program from FORTRAN (and BASIC and COBOL and Pascal) to Python.  Then the users that use FORTRAN programs because they are too unsophisticated to write and debug any program, will use the environment you want them to.</p><p>It would make my life easier as a program user. I am too old to write programs anymore.  I wrote dozens.</p></htmltext>
<tokenext>I have a suggestion .
Since FORTRAN programs are everywhere , how about the Python advocates form a large group of programmers to convert every little purpose specific program from FORTRAN ( and BASIC and COBOL and Pascal ) to Python .
Then the users that use FORTRAN programs because they are too unsophisticated to write and debug any program , will use the environment you want them to .
It would make my life easier as a program user .
I am too old to write programs anymore .
I wrote dozens .</tokentext>
<sentencetext>I have a suggestion.
Since FORTRAN programs are everywhere, how about the Python advocates form a large group of programmers to convert every little purpose specific program from FORTRAN (and BASIC and COBOL and Pascal) to Python.
Then the users that use FORTRAN programs because they are too unsophisticated to write and debug any program, will use the environment you want them to.
It would make my life easier as a program user.
I am too old to write programs anymore.
I wrote dozens.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295615</id>
	<title>Don't teach languages at all</title>
	<author>LennyP</author>
	<datestamp>1244741700000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>What should be taught is the ability to look at and analyze a problem to find the best available tools (in this case computer language) to accomplish the solution of the problems, not how to use a tool or tools directly.  It needs to be taught how to decide whether to use a screw or a nail, then which tool is better (a screwdriver for a screw and a hammer for a nail), and finally how to use that tool in a practical way.  Schools should be teaching computer languages as after thoughts and as a means to problem solve, not as an end in themselves.</p><p>Learning and using most all higher level programming languages is a no-brainer for someone that has been trained in problem analysis, compilers/interpretors are self-teaching tools.  Has C++ brought us better solutions (more efficient, easier to maintain, less complexity, ...?) over C?  over Ruby?  over Cobol?  over fortran? </p></htmltext>
<tokenext>What should be taught is the ability to look at and analyze a problem to find the best available tools ( in this case computer language ) to accomplish the solution of the problems , not how to use a tool or tools directly .
It needs to be taught how to decide whether to use a screw or a nail , then which tool is better ( a screwdriver for a screw and a hammer for a nail ) , and finally how to use that tool in a practical way .
Schools should be teaching computer languages as after thoughts and as a means to problem solve , not as an end in themselves .
Learning and using most all higher level programming languages is a no-brainer for someone that has been trained in problem analysis , compilers/interpretors are self-teaching tools .
Has C + + brought us better solutions ( more efficient , easier to maintain , less complexity , ... ? ) over C ?
over Ruby ?
over Cobol ?
over fortran ?</tokentext>
<sentencetext>What should be taught is the ability to look at and analyze a problem to find the best available tools (in this case computer language) to accomplish the solution of the problems, not how to use a tool or tools directly.
It needs to be taught how to decide whether to use a screw or a nail, then which tool is better (a screwdriver for a screw and a hammer for a nail), and finally how to use that tool in a practical way.
Schools should be teaching computer languages as after thoughts and as a means to problem solve, not as an end in themselves.
Learning and using most all higher level programming languages is a no-brainer for someone that has been trained in problem analysis, compilers/interpretors are self-teaching tools.
Has C++ brought us better solutions (more efficient, easier to maintain, less complexity, ...?) over C?
over Ruby?
over Cobol?
over fortran? </sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297127</id>
	<title>Re:While there may be "newer" languages</title>
	<author>joib</author>
	<datestamp>1244746980000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p><i><br>BTW, a very ill-advised design choice of Python: <a href="http://www.python.org/dev/peps/pep-0211/" title="python.org">http://www.python.org/dev/peps/pep-0211/</a> [python.org] [python.org] Ask any numerical analyst to know why it is a terrible idea to solve a linear system with inv(A)*b. But make sure you have at least half an hour free.<br></i></p><p>AFAIK PEP 211 and the related PEP 209 were never actually accepted into the language.  Python users that want multidimensional arrays use the <a href="http://numpy.scipy.org/" title="scipy.org">numpy</a> [scipy.org] package, along with the <a href="http://www.scipy.org/" title="scipy.org">scipy</a> [scipy.org] package that builds on top of numpy.</p><p>For solving a linear system Ax=b, you just use</p><p>x = numpy.linalg.solve(A, b)</p><p>which calls LAPACK behind the scenes. For a simple matrix multiplication (or M-V or V-V depending on the dimensionality of the arrays) A*B=C you do</p><p>C = numpy.dot(A, B)</p><p>which is implemented with a call to BLAS.</p></htmltext>
<tokenext>BTW , a very ill-advised design choice of Python : http : //www.python.org/dev/peps/pep-0211/ [ python.org ] Ask any numerical analyst to know why it is a terrible idea to solve a linear system with inv ( A ) * b. But make sure you have at least half an hour free .
AFAIK PEP 211 and the related PEP 209 were never actually accepted into the language .
Python users that want multidimensional arrays use the numpy [ scipy.org ] package , along with the scipy [ scipy.org ] package that builds on top of numpy .
For solving a linear system Ax = b , you just use x = numpy.linalg.solve ( A , b ) which calls LAPACK behind the scenes .
For a simple matrix multiplication ( or M-V or V-V depending on the dimensionality of the arrays ) A * B = C you do C = numpy.dot ( A , B ) which is implemented with a call to BLAS .</tokentext>
<sentencetext>BTW, a very ill-advised design choice of Python: http://www.python.org/dev/peps/pep-0211/ [python.org] Ask any numerical analyst to know why it is a terrible idea to solve a linear system with inv(A)*b. But make sure you have at least half an hour free.
AFAIK PEP 211 and the related PEP 209 were never actually accepted into the language.
Python users that want multidimensional arrays use the numpy [scipy.org] package, along with the scipy [scipy.org] package that builds on top of numpy.
For solving a linear system Ax=b, you just use x = numpy.linalg.solve(A, b) which calls LAPACK behind the scenes.
For a simple matrix multiplication (or M-V or V-V depending on the dimensionality of the arrays) A*B=C you do C = numpy.dot(A, B) which is implemented with a call to BLAS.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292819</id>
	<title>Re:While there may be "newer" languages</title>
	<author>navanee</author>
	<datestamp>1244731680000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><a href="http://www.mathworks.com/products/" title="mathworks.com" rel="nofollow"> Matlab </a> [mathworks.com] is now the de-facto language of choice for Engineers. It has a huge set of domain-specific libraries.

That should be the first thing non-comp sci Enggs are taught... They can learn Fortran or Python much later if required.</htmltext>
<tokenext>Matlab [ mathworks.com ] is now the de-facto language of choice for Engineers .
It has a huge set of domain-specific libraries .
That should be the first thing non-comp sci Enggs are taught... They can learn Fortran or Python much later if required .</tokentext>
<sentencetext> Matlab  [mathworks.com] is now the de-facto language of choice for Engineers.
It has a huge set of domain-specific libraries.
That should be the first thing non-comp sci Enggs are taught... They can learn Fortran or Python much later if required.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292891</id>
	<title>Fortran is still common in science</title>
	<author>damn\_registrars</author>
	<datestamp>1244731920000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I regularly find myself compiling scientific applications that were written in fortran.  Which means that plenty of scientific software development is still done in fortran as well.  Even if the students aren't going to write their own software, there is a good chance that they will at some point need to be able to install scientific software, and a little basic knowledge of fortran can go a long ways towards understanding the snafus that they will likely encounter along the way.<br> <br>
That is in no way something of insignificant value in addition to learning the basic logic structure of a language like fortran.</htmltext>
<tokenext>I regularly find myself compiling scientific applications that were written in fortran .
Which means that plenty of scientific software development is still done in fortran as well .
Even if the students are n't going to write their own software , there is a good chance that they will at some point need to be able to install scientific software , and a little basic knowledge of fortran can go a long ways towards understanding the snafus that they will likely encounter along the way .
That is in no way something of insignificant value in addition to learning the basic logic structure of a language like fortran .</tokentext>
<sentencetext>I regularly find myself compiling scientific applications that were written in fortran.
Which means that plenty of scientific software development is still done in fortran as well.
Even if the students aren't going to write their own software, there is a good chance that they will at some point need to be able to install scientific software, and a little basic knowledge of fortran can go a long ways towards understanding the snafus that they will likely encounter along the way.
That is in no way something of insignificant value in addition to learning the basic logic structure of a language like fortran.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294159</id>
	<title>Matlab</title>
	<author>decsnake</author>
	<datestamp>1244736480000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I work in space science and engineering, and the two most widely used computing tools used directly by scientists and engineers are Matlab and Excel. FORTRAN is still used, but it is used by \_programmers\_ to solve numeric science and engineering problems.</p><p>The only problem I have with Matlab is that it is proprietary. I tried Octave but it just wasn't quite good enough -- there were some things that worked in matlab that broke octave.</p><p>I wont say any more about excel, other than it is the most widely used computing tool by engineers, other than maybe powerpoint :)</p></htmltext>
<tokenext>I work in space science and engineering , and the two most widely used computing tools used directly by scientists and engineers are Matlab and Excel .
FORTRAN is still used , but it is used by \ _programmers \ _ to solve numeric science and engineering problems .
The only problem I have with Matlab is that it is proprietary .
I tried Octave but it just was n't quite good enough -- there were some things that worked in matlab that broke octave .
I wont say any more about excel , other than it is the most widely used computing tool by engineers , other than maybe powerpoint : )</tokentext>
<sentencetext>I work in space science and engineering, and the two most widely used computing tools used directly by scientists and engineers are Matlab and Excel.
FORTRAN is still used, but it is used by \_programmers\_ to solve numeric science and engineering problems.
The only problem I have with Matlab is that it is proprietary.
I tried Octave but it just wasn't quite good enough -- there were some things that worked in matlab that broke octave.
I wont say any more about excel, other than it is the most widely used computing tool by engineers, other than maybe powerpoint :)</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296641</id>
	<title>Re:I still use Fortran for sciantific calculations</title>
	<author>julian\_t</author>
	<datestamp>1244745300000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Although I have to use quiche-type programming languages a lot of the time, I've always reckoned that Fortran, C and Lisp would enable me to do most things...</htmltext>
<tokenext>Although I have to use quiche-type programming languages a lot of the time , I 've always reckoned that Fortran , C and Lisp would enable me to do most things ...</tokentext>
<sentencetext>Although I have to use quiche-type programming languages a lot of the time, I've always reckoned that Fortran, C and Lisp would enable me to do most things...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291925</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294121</id>
	<title>Re:C++ on Particle Accelerators</title>
	<author>mbone</author>
	<datestamp>1244736360000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>About 15 years ago I was involved in several efforts to take old scientific analysis software packages and rewrite them in C++. Every one of those efforts failed. Code was produced, but it took so long to get it working right that the old package was upgraded with new features, so the new code was "born obsolete," never used that much, and allowed to die on the vine.</p><p>Your milage, of course, may vary.</p></htmltext>
<tokenext>About 15 years ago I was involved in several efforts to take old scientific analysis software packages and rewrite them in C + + .
Every one of those efforts failed .
Code was produced , but it took so long to get it working right that the old package was upgraded with new features , so the new code was " born obsolete , " never used that much , and allowed to die on the vine .
Your milage , of course , may vary .</tokentext>
<sentencetext>About 15 years ago I was involved in several efforts to take old scientific analysis software packages and rewrite them in C++.
Every one of those efforts failed.
Code was produced, but it took so long to get it working right that the old package was upgraded with new features, so the new code was "born obsolete," never used that much, and allowed to die on the vine.
Your milage, of course, may vary.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292931</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292295</id>
	<title>ForTran = Formula Translation</title>
	<author>Bloody Peasant</author>
	<datestamp>1244729940000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Remember what the name means.  </p><p>Because the language has been around for decades (40 years?) it has matured to the point where the libraries and compilers are <em>highly</em> optimised and reliable, and performance on the most significant and "heavy" computational problems has been tweaked to go as fast as possible.</p><p>Maybe in another couple of decades, Python, Perl, and other interpreted languages will come close.  But not now (IMHO of course).</p></htmltext>
<tokenext>Remember what the name means .
Because the language has been around for decades ( 40 years ? ) it has matured to the point where the libraries and compilers are highly optimised and reliable , and performance on the most significant and " heavy " computational problems has been tweaked to go as fast as possible .
Maybe in another couple of decades , Python , Perl , and other interpreted languages will come close .
But not now ( IMHO of course ) .</tokentext>
<sentencetext>Remember what the name means.
Because the language has been around for decades (40 years?) it has matured to the point where the libraries and compilers are highly optimised and reliable, and performance on the most significant and "heavy" computational problems has been tweaked to go as fast as possible.
Maybe in another couple of decades, Python, Perl, and other interpreted languages will come close.
But not now (IMHO of course).</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292045</id>
	<title>Not so easy</title>
	<author>Anonymous</author>
	<datestamp>1244729040000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Fortran has all sorts of what we would now consider artificial limitations and restrictions on its syntax that make writing even simple programs a lot less simple and intuitive.  For example, line length limitations, special columns, short &amp; special variables names, etc.  I'd rather not spend time teaching/learning (especially non-standard) syntax and more time on the concepts while encouraging good stylistic conventions (use commenting, readable variable/function names, etc).  I found Pascal a much better first language.  Not to mention that many of these programs using FORTRAN are arcane, dated in their teaching concepts, and force FORTRAN despite students who often already know many other languages.</htmltext>
<tokenext>Fortran has all sorts of what we would now consider artificial limitations and restrictions on its syntax that make writing even simple programs a lot less simple and intuitive .
For example , line length limitations , special columns , short &amp; special variables names , etc .
I 'd rather not spend time teaching/learning ( especially non-standard ) syntax and more time on the concepts while encouraging good stylistic conventions ( use commenting , readable variable/function names , etc ) .
I found Pascal a much better first language .
Not to mention that many of these programs using FORTRAN are arcane , dated in their teaching concepts , and force FORTRAN despite students who often already know many other languages .</tokentext>
<sentencetext>Fortran has all sorts of what we would now consider artificial limitations and restrictions on its syntax that make writing even simple programs a lot less simple and intuitive.
For example, line length limitations, special columns, short &amp; special variables names, etc.
I'd rather not spend time teaching/learning (especially non-standard) syntax and more time on the concepts while encouraging good stylistic conventions (use commenting, readable variable/function names, etc).
I found Pascal a much better first language.
Not to mention that many of these programs using FORTRAN are arcane, dated in their teaching concepts, and force FORTRAN despite students who often already know many other languages.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294691</id>
	<title>Depends</title>
	<author>DarthVain</author>
	<datestamp>1244738460000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>...if it is instructive or not.</p><p>Disclaimer: I graduated in 2000.</p><p>People get too hung up on this or that language. A programmer is a programmer: either you know the fundamentals or you do not. Knowing the particular syntax of one over the other is pretty limited in the grand scheme of things, really, I think.</p><p>When I started University the standard was Pascal of all things. My second year they changed the standard to C. Go figure. We also did VB, and in my 3rd year did some Cobol and Assembly. I actually found Cobol a pretty simple and interesting language, though somewhat limited. Practice and theory changed between instructors as well. Some (most) were all horny about programming recursively for efficiency; on my final Cobol project I did just that, nearly gave the prof a heart attack, and he actually took marks off me for doing so... shrug. Assembly was horrible, as you can well imagine from it being such a low-level language, but it was sort of novel, as it is the closest I have come to typing ones and zeros in machine language. Stuff like that also helps tie into other courses like architecture and binary algebra etc.</p><p>Anyway, long story short, a new language comes out every couple of years. A few stick around in one form or another for a long time, but may have limited use. If you want to program you are pretty much committing to learning new languages for the rest of your life as demands change.</p><p>Having said that, I don't really program much anymore. Mostly I script in specialized languages to simply make my life easier by automating certain processes, and VB of all things, because that is what is commonly integrated in some off-the-shelf software. As for application design and development, I mostly just find out where something is going wrong and say "fix that" to a vendor. 9 times out of 10 it has nothing to do with syntax or the abilities of a language, but more a misunderstanding of the specifications or requirements, or perhaps the process. In any case I am pretty confident that I can take whatever modern language I want out there and in a pretty short span of time be able to use it functionally.</p><p>It's just like there are a number of specialty languages out there that will never be taught at any school, so you will either have to have your workplace bankroll the ridiculous training fees from a private training company, or learn it on the job (buying the software personally is usually out of reach because of the cost).</p><p>Hell, I had a friend (mind you, he was pretty bright) who bid on a job that required a language he had never used (Perl, I believe). He learned it over a weekend and did just fine. He now works pretty high up in a corporate IT structure. Anyway, so far as I am concerned, I would rather hire someone who has a broad base of knowledge than someone who knows one language really well. Unless I am getting someone for one contract, and that is the language it is to be written in; then and only then would I take the mono-language guy.</p></htmltext>
<tokenext>...if it is instructive or not.Disclaimer I graduated in 2000.People get too hung up on this or that language .
A programmer is a programmer , either you know the fundamentals or you do not .
Knowing the particular syntax of one over the other is pretty limited in the grand scheme of things really I think.When I started University the standard was Pascal of all things .
My second year they changed the Standard to C. Go figure .
We also did VB , and in my 3rd year did some Cobol and Assembly .
I actually found Cobol a pretty simple and interesting language , though somewhat limited .
Practice and theory changed between instructors as well .
Some ( most ) were all horny about programming recursively for efficiency , on my final Cobol project I did just that , and nearly gave the prof a heart attack and he actually took marks off me for doing so... shrug. Assembly was horrible as you can well imagine from it being such a low level language , but it was sort of novel as it is the closest I have come to typing ones and zeros in machine language .
Stuff like that also helps tie into other courses like architecture and binary algebra etc...Anyway long story short a new language comes out every couple of years .
A few stick around in one form or another for a long time , but may have limited use .
If you want to program you are pretty much committing to learning new languages for the rest of your life as demands change.Having said that , I do n't really program much anymore .
Mostly I script in specialized languages to simply make my life easier by automating certain processes , and VB of all things , because that is what is commonly integrated in some off the shelf software .
As for application design and development I mostly just find out where something is going wrong and say " fix that " to a vendor .
9 times out of 10 it has nothing to do with syntax or the abilities of a language , but more a misunderstanding of the specifications or requirements or perhaps the process .
In any case I am pretty confident that I can take whatever modern language I want out there and in a pretty short span of time be able to use it functionally.Its just like there are a number of specialty languages out there that will never be taught at any school , so you will either have to have your workplace bankroll the ridiculous training fees from a private training company , or learn it on the job ( to buy the software is usually out of the reach to do it personally because of the cost ) .Hell I had a friend ( mind you he was pretty bright ) who bid on a job that required a language he had never used ( Pearl I believe ) .
He learned it over a weekend and did just fine .
He now works pretty high up in a corporate IT structure .
Anyway so far as I am concerned I think I would rather hire someone who has a broad base of knowledge , than someone who knows one language really well .
Unless I am getting someone for one contract , and that is the language it is to be written in , then and only then would I take the mono language guy .</tokentext>
<sentencetext>...if it is instructive or not.Disclaimer I graduated in 2000.People get too hung up on this or that language.
A programmer is a programmer, either you know the fundamentals or you do not.
Knowing the particular syntax of one over the other is pretty limited in the grand scheme of things really I think.When I started University the standard was Pascal of all things.
My second year they changed the Standard to C. Go figure.
We also did VB, and in my 3rd year did some Cobol and Assembly.
I actually found Cobol a pretty simple and interesting language, though somewhat limited.
Practice and theory changed between instructors as well.
Some (most) were all horny about programming recursively for efficiency, on my final Cobol project I did just that, and nearly gave the prof a heart attack and he actually took marks off me for doing so... shrug. Assembly was horrible as you can well imagine from it being such a low level language, but it was sort of novel as it is the closest I have come to typing ones and zeros in machine language.
Stuff like that also helps tie into other courses like architecture and binary algebra etc...Anyway long story short a new language comes out every couple of years.
A few stick around in one form or another for a long time, but may have limited use.
If you want to program you are pretty much committing to learning new languages for the rest of your life as demands change.Having said that, I don't really program much anymore.
Mostly I script in specialized languages to simply make my life easier by automating certain processes, and VB of all things, because that is what is commonly integrated in some off the shelf software.
As for application design and development I mostly just find out where something is going wrong and say "fix that" to a vendor.
9 times out of 10 it has nothing to do with syntax or the abilities of a language, but more a misunderstanding of the specifications or requirements or perhaps the process.
In any case I am pretty confident that I can take whatever modern language I want out there and in a pretty short span of time be able to use it functionally.Its just like there are a number of specialty languages out there that will never be taught at any school, so you will either have to have your workplace bankroll the ridiculous training fees from a private training company, or learn it on the job (to buy the software is usually out of the reach to do it personally because of the cost).Hell I had a friend (mind you he was pretty bright) who bid on a job that required a language he had never used (Pearl I believe).
He learned it over a weekend and did just fine.
He now works pretty high up in a corporate IT structure.
Anyway so far as I am concerned I think I would rather hire someone who has a broad base of knowledge, than someone who knows one language really well.
Unless I am getting someone for one contract, and that is the language it is to be written in, then and only then would I take the mono language guy.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294213</id>
	<title>Computer Science w/o Fortran or Cobol...</title>
	<author>matt_kizerian</author>
	<datestamp>1244736720000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>is like birthday cake without catsup and mustard. <br>

Seriously, it's time for Fortran to die. I'm a Chemical Engineer and end up having to use Fortran for programming custom routines inside commercial chemical process modeling software. I was just talking to one of our support reps the other day, and he deigned to pull out the "C++ code doesn't run any faster than well-written Fortran code" argument. The problem isn't about speed or code execution efficiency anymore; it's about 1) ease and efficiency of programming/debugging, and 2) availability of people who are competent programming in the language. Fortran is an epic fail on both fronts; it had its place, now it's time to cede it to better tools. Python would fill its place nicely for many uses when relatively small programs are required (NumPy is a great numerical-methods module for arrays and such). More powerful languages like C++ are well-suited for larger projects.

It should be telling that the authors of the venerable book Numerical Recipes have decided to no longer update the Fortran version of the book and only update the C++ version.</htmltext>
<tokenext>is like birthday cake without catsup and mustard .
Seriously , it 's time for Fortran to die .
I 'm a Chemical Engineer and end up having to use Fortran for programming custom routines inside commercial chemical process modeling software .
I was just talking to one of our support reps the other day , and he deigned to pull out the " C + + code does n't run any faster than well-written Fortran code " argument .
The problem is n't about speed or code execution efficiency anymore , it 's about 1 ) ease and efficiency of programming/debugging , and 2 ) availability of people who are competent programming in the language .
Fortran is an epic fail on both fronts ; it had its place , now it 's time to cede it to better tools .
Python would fill its place nicely for many uses when relatively small programs are required ( NumPy is a great numerical-methods module for arrays and such .
) More powerful languages like C + + are well-suited for larger projects .
It should be telling that the authors of the venerable book Numerical Recipes have decided to no longer update the Fortran version of the book and only update the C + + version .</tokentext>
<sentencetext>is like birthday cake without catsup and mustard.
Seriously, it's time for Fortran to die.
I'm a Chemical Engineer and end up having to use Fortran for programming custom routines inside commercial chemical process modeling software.
I was just talking to one of our support reps the other day, and he deigned to pull out the "C++ code doesn't run any faster than well-written Fortran code" argument.
The problem isn't about speed or code execution efficiency anymore, it's about 1) ease and efficiency of programming/debugging, and 2) availability of people who are competent programming in the language.
Fortran is an epic fail on both fronts; it had its place, now it's time to cede it to better tools.
Python would fill its place nicely for many uses when relatively small programs are required (NumPy is a great numerical-methods module for arrays and such.
) More powerful languages like C++ are well-suited for larger projects.
It should be telling that the authors of the venerable book Numerical Recipes have decided to no longer update the Fortran version of the book and only update the C++ version.</sentencetext>
</comment>
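The NumPy claim in the comment above is easy to make concrete. Below is a minimal sketch of the kind of small scientific program being described, using only standard NumPy; the rate constant and time grid are made-up illustration values, not anything from the thread:

```python
import numpy as np

# First-order decay c(t) = c0 * exp(-k*t), evaluated over a whole time grid
# at once -- the array operations stand in for the explicit DO loops a
# Fortran version would typically spell out.
c0 = 1.0                          # initial concentration (arbitrary units)
k = 0.5                           # rate constant in 1/s (illustrative value)
t = np.linspace(0.0, 10.0, 101)   # 0 to 10 s in 0.1 s steps
c = c0 * np.exp(-k * t)

half_life = np.log(2.0) / k
print("half-life (s):", half_life)
```

Whether this is actually "better" than the Fortran equivalent is exactly what the thread is arguing about; the point here is only how short the working program is.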
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293899</id>
	<title>Yes, but no.</title>
	<author>Sobrique</author>
	<datestamp>1244735640000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>It's OK to teach people who are liable to be working with FORTRAN how to do FORTRAN. It's one of the better tools for certain tasks. <br>
However, I wouldn't include computer science people in on it. Computer Science shouldn't be about learning a language; it should be about learning a paradigm. FORTRAN is therefore a subject of curiosity, rather than a subject of study.</htmltext>
<tokenext>It 's ok to teach people who are liable to be working with FORTRAN , how to do FORTRAN .
It 's one of the better tools for certain tasks .
However I would n't include computer science people in on it .
Computer Science should n't be about learning a language , it should be about learning a paradigm .
FORTRAN is therefore a subject of curiosity , rather than a subject of study .</tokentext>
<sentencetext>It's ok to teach people who are liable to be working with FORTRAN, how to do FORTRAN.
It's one of the better tools for certain tasks.
However I wouldn't include computer science people in on it.
Computer Science shouldn't be about learning a language, it should be about learning a paradigm.
FORTRAN is therefore a subject of curiosity, rather than a subject of study.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292423</id>
	<title>Damn Right</title>
	<author>zoomshorts</author>
	<datestamp>1244730360000</datestamp>
	<modclass>None</modclass>
	<modscore>-1</modscore>
	<htmltext><p>And COBOL and FORTRAN on cards, learn REAL Programming !!!!!!!!!!</p><p>NIGGERS!!!! Wow, less Karma, who dives a shit?</p></htmltext>
<tokenext>And COBOL and FORTRAN on cards , learn REAL Programming ! ! ! ! ! ! ! ! ! ! NIGGERS ! ! ! !
Wow , less Karma , who dives a shit ?</tokentext>
<sentencetext>And COBOL and FORTRAN on cards, learn REAL Programming !!!!!!!!!!NIGGERS!!!!
Wow, less Karma, who dives a shit?</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298581</id>
	<title>The point of teaching computer programming</title>
	<author>juanergie</author>
	<datestamp>1244751960000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>Many posts I've read seem to be missing the whole point of computer programming.</p><p>It is not computer science: that is another course and I agree it should be taught in a language-independent manner; it is not scientific computing, as many posts have implied; it is not a job-skills workshop. The point, I believe, should be to make as many students as possible fall in love with computer programming and understand its usefulness; to the others it will only be an experience better forgotten.</p><p>I love FORTRAN (I even capitalize it, see? I'm young but old-fashioned) but let's agree that it is not the simplest language, and it has a steep learning curve: it takes longer to see your work done, and this might act as a deterrent for weak hearts. I have never taken a computer programming course, yet I am competent (at least not incompetent) in C, C++, FORTRAN, Perl, and Python. When you love computer programming, you don't need a teacher, you only need time to spare.</p><p>My belief is that the computer programming course should aim for making students fall in love. I would lean toward Python, and have students see their computers obeying them sooner rather than later, allowing them to imagine further ways to dominate the computer and just get shit done.</p><p>FORTRAN is faster? This is the same as saying Linux is free. Only if you don't take your time into account! Sure, inverting a matrix with your processor-optimized LAPACK works great in FORTRAN, but the kids are not designing rockets or predicting weather, they are learning how to program. If they learn how to do it and, most importantly, they learn to love it, they may very well end up designing rockets in assembly if you will. They will pick up girls with hexadecimal compliments. Maybe not the last one.</p><p>As with anything else in high school or college, the point is to "get it in the ball-park", learn to love knowledge, and learn how to learn.</p></htmltext>
<tokenext>Many posts I 've read seem to be missing the whole point of computer programming.It is not computer science : that is another course and I agree it should be taught in a language-independent manner ; it is not scientific computing , as many posts have implied ; it is not job-skills workshop .
The point , I believe , should be to make as many students as possible fall in love with computer programming and understand its usefulness , to the others it will only be an experience better forgotten.I love FORTRAN ( I even capitalize it , see ?
I 'm young but old-fashioned ) but let 's agree that it is not the simplest language , and it has a steep learning curve : it takes longer to see your work done , and this might act as a deterrent for weak hearts .
I have never taken a computer programming course , yet I am competent ( at least not incompetent ) in C , C + + , FORTRAN , and Perl and Python .
When you love computer programming , you do n't need a teacher , you only need time to spare.My belief is that the computer programming course should aim for making students fall in love .
I would lean toward Python , and have students see their computers obeying them sooner rather than later , allowing them to imagine further ways to dominate the computer and just get shit done.FORTRAN is faster ?
This is the same as saying Linux is free .
Only if you do n't take your time into account !
Sure , inverting a matrix with your processor-optimized LAPACK works great in FORTRAN , but the kids are not designing rockets or predicting weather , they are learning how to program .
If they learn how to do it and , most importantly , they learn to love it , they may very well end up designing rockets in assembly if you will .
They will pick up girls with hexadecimal compliments .
Maybe not the last one.As anything else in high-school or college , the point is to " get it in the ball-park " , learn to love knowledge , and learn how to learn .
<sentencetext>Many posts I've read seem to be missing the whole point of computer programming.It is not computer science: that is another course and I agree it should be taught in a language-independent manner; it is not scientific computing, as many posts have implied; it is not job-skills workshop.
The point, I believe, should be to make as many students as possible fall in love with computer programming and understand its usefulness, to the others it will only be an experience better forgotten.I love FORTRAN (I even capitalize it, see?
I'm young but old-fashioned) but let's agree that it is not the simplest language, and it has a steep learning curve: it takes longer to see your work done, and this might act as a deterrent for weak hearts.
I have never taken a computer programming course, yet I am competent (at least not incompetent) in C, C++, FORTRAN, and Perl and Python.
When you love computer programming, you don't need a teacher, you only need time to spare.My belief is that the computer programming course should aim for making students fall in love.
I would lean toward Python, and have students see their computers obeying them sooner rather than later, allowing them to imagine further ways to dominate the computer and just get shit done.FORTRAN is faster?
This is the same as saying Linux is free.
Only if you don't take your time into account!
Sure, inverting a matrix with your processor-optimized LAPACK works great in FORTRAN, but the kids are not designing rockets or predicting weather, they are learning how to program.
If they learn how to do it and, most importantly, they learn to love it, they may very well end up designing rockets in assembly if you will.
They will pick up girls with hexadecimal compliments.
Maybe not the last one.As anything else in high-school or college, the point is to "get it in the ball-park", learn to love knowledge, and learn how to learn.
</comment>
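The "see your computer obeying you sooner rather than later" argument above can be illustrated by how little ceremony a first Python exercise needs. The snippet below is a generic classroom-style example, not something from the original comment:

```python
# Estimate pi with the Leibniz series: 4 * (1 - 1/3 + 1/5 - 1/7 + ...).
# A complete first-lab program: a loop, an accumulator, and a printout.
terms = 100000
total = 0.0
for n in range(terms):
    total += (-1.0) ** n / (2 * n + 1)
pi_estimate = 4.0 * total
print(pi_estimate)  # slowly approaches 3.14159...
```

No compiler, no declarations, no format statements: the student types it in and sees a number.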
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28302393</id>
	<title>Re:Sillyness</title>
	<author>Anonymous</author>
	<datestamp>1244724360000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Also, a lot of models produced by government agencies such as the USGS and Army Corps of Engineers are written in Fortran. In engineering it often comes down to the fact that if you ever get dragged into court over model results you used in your work, the only models that are easily defensible in court are those written by the government.</p><p>And they're written in Fortran.</p></htmltext>
<tokenext>Also , a lot of models produced by government agencies such as the USGS and Army Corps of Engineers are written in Fortran .
In engineering often comes down to the fact that if you ever get drug into court over model results you used in your work the only models that are easily defensible in court are those written by the government.And they 're written in Fortran .</tokentext>
<sentencetext>Also, a lot of models produced by government agencies such as the USGS and Army Corps of Engineers are written in Fortran.
In engineering often comes down to the fact that if you ever get drug into court over model results you used in your work the only models that are easily defensible in court are those written by the government.And they're written in Fortran.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292565</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292315</id>
	<title>More FORTRAN please?!</title>
	<author>laxsu19</author>
	<datestamp>1244730000000</datestamp>
	<modclass>Interesting</modclass>
	<modscore>3</modscore>
	<htmltext>I am a manager in a highly technical organization that relies on computer codes to do our job.  In my experience, there isn't ENOUGH FORTRAN teaching at the college level.  Maybe it's location-based, but most of our new hires (we get most from the northeast, but still get a noticeable amount from as far away as University of Washington, Univ of Hawaii, and USC) actually are NOT taught FORTRAN and instead are taught something object-oriented, typically C++ or Java.  I know for a fact that Penn State suggests C++ for all undergrad engineers (FORTRAN is offered, though its enrollment is less than 50% of the C++ course's).

In my organization we also have a 'double-hump' age distribution: lots of people ready to retire (or could have retired 5 years ago...) and lots of people who are within 5 years of their first day on the job.  This creates a problem of knowledge management; our new guys need to know the details of the FORTRAN code they are using every day to the extent that our ready-to-retire guys know it, and fast.  If they are not taught FORTRAN, this creates an even larger learning curve for them which isn't desirable.
So one option would be to 'rewrite the code for the future generation'.
We definitely do not have the resources to rewrite our workhorse codes that have been in use and development since the 70s.  I don't know if an organization as large as Microsoft could rewrite Windows in a new language.   Also, we can't retire our old codes because they are still actively needed to respond to emergent issues (it is easier to maintain the codes than it is to make a new model to be inputted into a new code).

So, our hands are tied (mine specifically!) and my organization actually needs MORE FORTRAN programmers coming from the university just to maintain the status quo.</htmltext>
<tokenext>I am a manager in a highly technical organization that relies on computer codes to do our job .
In my experience , there is n't ENOUGH FORTRAN teaching at the college level .
Maybe its location based , but most of our new-hires ( we get most from the northeast , but still get a noticeable amount from as far away as University of Washington , Univ of Hawaii , and USC ) actually are NOT taught FORTRAN and instead are taught something object-oriented , typically C + + or Java .
I know for a fact that Penn State suggests C + + for all undergrad engineers ( FORTRAN is offered though - the classes hold less than 50 % total students than does the C + + course ) .
In my organization we also have a 'double-hump ' age distribution : lots of people ready to retire ( or could have retired 5 years ago... ) and lots of people who are within 5 years of their first day on the job .
This creates a problem of knowledge management ; our new guys need to know the details of the FORTRAN code they are using every day to the extent that our ready-to-retire guys know it , and fast .
If they are not taught FORTRAN , this creates an even larger learning curve for them which is n't desirable .
So one option would be to 'rewrite the code for the future generation'. . We definitely do not have the resources to rewrite our workhorse codes that have been in use and development since the 70s .
I do n't know if an organization as large as Microsoft could rewrite Windows in a new language .
Also , we ca n't retire our old codes because they are still actively needed to respond to emergent issues ( it is easier to maintain the codes than it is to make a new model to be inputted into a new code ) .
So , our hands are tied ( mine specifically !
) and my organization actually needs MORE FORTRAN programmers coming from the university just to maintain the status quo .</tokentext>
<sentencetext>I am a manager in a highly technical organization that relies on computer codes to do our job.
In my experience, there isn't ENOUGH FORTRAN teaching at the college level.
Maybe its location based, but most of our new-hires (we get most from the northeast, but still get a noticeable amount from as far away as University of Washington, Univ of Hawaii, and USC) actually are NOT taught FORTRAN and instead are taught something object-oriented, typically C++ or Java.
I know for a fact that Penn State suggests C++ for all undergrad engineers (FORTRAN is offered though - the classes hold less than 50% total students than does the C++ course).
In my organization we also have a 'double-hump' age distribution: lots of people ready to retire (or could have retired 5 years ago...) and lots of people who are within 5 years of their first day on the job.
This creates a problem of knowledge management; our new guys need to know the details of the FORTRAN code they are using every day to the extent that our ready-to-retire guys know it, and fast.
If they are not taught FORTRAN, this creates an even larger learning curve for them which isn't desirable.
So one option would be to 'rewrite the code for the future generation'..
We definitely do not have the resources to rewrite our workhorse codes that have been in use and development since the 70s.
I don't know if an organization as large as Microsoft could rewrite Windows in a new language.
Also, we can't retire our old codes because they are still actively needed to respond to emergent issues (it is easier to maintain the codes than it is to make a new model to be inputted into a new code).
So, our hands are tied (mine specifically!
) and my organization actually needs MORE FORTRAN programmers coming from the university just to maintain the status quo.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28313449</id>
	<title>Re:While there may be "newer" languages</title>
	<author>Anonymous</author>
	<datestamp>1244798280000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p><i>Python still lacks many of Matlab's features, its only advantage is being Free Software.</i></p><p>You sound like you have never tried out SciPy, which has many of Matlab's features, and is written in C for blazing speed.</p><p>Yes, Python is Free.  It's also easier to work with than Matlab.  I love Python and hate Matlab.</p><p><i>BTW, a very ill-advised design choice of Python: <a href="http://www.python.org/dev/peps/pep-0211/" title="python.org" rel="nofollow">http://www.python.org/dev/peps/pep-0211/</a> [python.org] Ask any numerical analyst to know why it is a terrible idea to solve a linear system with inv(A)*b. But make sure you have at least half an hour free.</i></p><p>Damn, man, what the hell is your problem?  You take a proposal from nine years ago, a proposal that was not accepted, and you call it out as "a very ill-advised design choice"?  Anyone can propose anything.  I could propose that Python add a "MAGIC" operator that would randomly screw up all your matrices.  That wouldn't be accepted either.  Would you then criticise Python because of my stupid insane proposal?  Get a grip.</p></htmltext>
<tokenext>Python still lacks many of Matlab 's features , its only advantage is being Free Software.You sound like you have never tried out SciPy , which has many of Matlab 's features , and is written in C for blazing speed.Yes , Python is Free .
It 's also easier to work with than Matlab .
I love Python and hate Matlab.BTW , a very ill-advised design choice of Python : http : //www.python.org/dev/peps/pep-0211/ [ python.org ] Ask any numerical analyst to know why it is a terrible idea to solve a linear system with inv ( A ) * b. But make sure you have at least half an hour free.Damn , man , what the hell is your problem ?
You take a proposal from nine years ago , a proposal that was not accepted , and you call it out as " a very ill-advised design choice " ?
Anyone can propose anything .
I could propose that Python add a " MAGIC " operator that would randomly screw up all your matrices .
That would n't be accepted either .
Would you then criticise Python because of my stupid insane proposal ?
Get a grip .</tokentext>
<sentencetext>Python still lacks many of Matlab's features, its only advantage is being Free Software.You sound like you have never tried out SciPy, which has many of Matlab's features, and is written in C for blazing speed.Yes, Python is Free.
It's also easier to work with than Matlab.
I love Python and hate Matlab.BTW, a very ill-advised design choice of Python: http://www.python.org/dev/peps/pep-0211/ [python.org] Ask any numerical analyst to know why it is a terrible idea to solve a linear system with inv(A)*b. But make sure you have at least half an hour free.Damn, man, what the hell is your problem?
You take a proposal from nine years ago, a proposal that was not accepted, and you call it out as "a very ill-advised design choice"?
Anyone can propose anything.
I could propose that Python add a "MAGIC" operator that would randomly screw up all your matrices.
That wouldn't be accepted either.
Would you then criticise Python because of my stupid insane proposal?
Get a grip.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401</parent>
</comment>
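The inv(A)*b point in the comment above is standard numerical advice: solve the linear system with a factorization rather than forming the inverse. A minimal NumPy sketch, where the 2x2 matrix is a made-up, well-conditioned example chosen only for illustration:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
b = np.array([1.0, 2.0])

# Preferred: a factorization-based solve (LAPACK's gesv under the hood).
x_solve = np.linalg.solve(A, b)

# Discouraged: explicitly forming the inverse, then multiplying.
x_inv = np.linalg.inv(A).dot(b)

# On a tiny well-conditioned system the two agree, but solve() does less
# work and loses less accuracy on large or ill-conditioned systems.
assert np.allclose(A.dot(x_solve), b)
assert np.allclose(x_solve, x_inv)
```

That is the half-hour lecture the comment alludes to, compressed into two lines of code and a rule of thumb: never compute an explicit inverse just to solve a system.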
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292457</id>
	<title>Re:While there may be "newer" languages</title>
	<author>MrMr</author>
	<datestamp>1244730540000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Depending on which languages you mean; at least a couple of the following:
<br> <br>
Speed<br>
Native complex numbers<br>
Optimized libraries<br>
Parallelized vector algebra<br>
Existing code base<br>
Portable code since 1959<br>
ANSI standard</htmltext>
<tokenext>Depending on which languages you mean ; at least a couple of the following : Speed Native complex numbers Optimized libraries Parallelized vector algebra Existing code base Portable code since 1959 Ansi standard</tokentext>
<sentencetext>Depending on which languages you mean; at least a couple of the following:
 
Speed
Native complex numbers
Optimized libraries
Parallelized vector algebra
Existing code base
Portable code since 1959
Ansi standard</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28302033</id>
	<title>It's a marketable skill...</title>
	<author>sitarlo</author>
	<datestamp>1244721900000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>A quick search on Dice nets around 80 jobs looking for FORTRAN skills.  In this economy that isn't bad for a language that is perceived as dead.  I also noticed the jobs look entirely more interesting than what the typical Java coder is going to get.  Personally, I'd rather write scientific code than web apps.  I noticed the pay was good too.  I say learn some FORTRAN and go to work doing something a little bit more specialized.  I haven't written a line of FORTRAN in 15 years so I won't be applying anytime soon!</htmltext>
<tokenext>A quick search on dice nets around 80 jobs looking for FORTRAN skills .
In this economy that is n't bad for a language that is perceived as dead .
I also noticed the jobs look entirely more interesting than the typical Java coder is going to get .
Personally , I 'd rather write scientific code than web apps .
I noticed the pay was good too .
I say learn some FORTRAN and go to work doing something a little bit more specialized .
I have n't written a line of FORTRAN in 15 years so I wo n't be applying anytime soon !</tokentext>
<sentencetext>A quick search on dice nets around 80 jobs looking for FORTRAN skills.
In this economy that isn't bad for a language that is perceived as dead.
I also noticed the jobs look entirely more interesting than the typical Java coder is going to get.
Personally, I'd rather write scientific code than web apps.
I noticed the pay was good too.
I say learn some FORTRAN and go to work doing something a little bit more specialized.
I haven't written a line of FORTRAN in 15 years so I won't be applying anytime soon!</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292727</id>
	<title>No</title>
	<author>po134</author>
	<datestamp>1244731380000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>No. No. No. and No. We already learn enough useless theoretical stuff at university; let us at least learn something useful to industry in the few practical courses...
<br> <br>
You have to teach, in those very few practical courses, the bits that will be useful right out of university. If you really do need Fortran (in case you go into a specific area of specialisation/research) you'll learn it on your own easily because of the basic understanding you'll have of other languages.
<br> <br>
Stop teaching stuff that is only useful in very narrow fields of research!</htmltext>
<tokenext>No .
No. No .
and No .
We already learn enough useless theoretical stuff at the university let us at least learn something useful in the industry in the few practical course.. . You have to teach in those very few practical course the bit that will be useful right out the university .
If you really do need fortran ( in the case you go on specific area of specialisation/research ) you 'll learn it on your own easily because of the basic understanding you 'll have of other language .
stop teaching stuff only useful in very narrow field or research !</tokentext>
<sentencetext>No.
No. No.
and No.
We already learn enough useless theoretical stuff at the university let us at least learn something useful in the industry in the few practical course...
 
You have to teach in those very few practical course the bit that will be useful right out the university.
If you really do need fortran (in the case you go on specific area of specialisation/research) you'll learn it on your own easily because of the basic understanding you'll have of other language.
stop teaching stuff only useful in very narrow field or research !</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292117</id>
	<title>No fortran - just Python</title>
	<author>Anonymous</author>
	<datestamp>1244729280000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>2</modscore>
	<htmltext><p>&gt; It's also very easy to write simple programs in it.</p><p>99% of programs are not simple any more...</p><p>&gt; No way I'd use Python for serious large data set numerical calculations.</p><p>No way I'd use Fortran for serious large data set numerical calculations...</p><p>First I use Python - it is fastest to write and debug and has advanced data structures that simplify algorithms...<br>What good is it that my program runs one day shorter if it took me 2 weeks more to write and debug...</p><p>If I need to optimize - I move the internal part to C - it optimizes almost as well as Fortran...</p><p>There is also NumPy<nobr> <wbr></nobr>...</p><p>The key thing is that it is algorithms that are more important - and these are easier to write/learn<br>in Python - so students should learn Python first and only some need to learn Fortran.</p></htmltext>
<tokenext>&gt; It 's also very easy to write simple programs in it.99 % of programs are not simple any more... &gt; No way I 'd use Python for serious large data set numerical calculations.No way I 'd use Fortran for serious large data set numerical calculations...First I use Python - it is fastest to write and debug and has advanced data structures that simplify algorithms...What good is that my program run one day shorter if it took me 2 weeks more to write and debug...If I need to optimize - I am moving the internal part to C - it optimizes almost as well as Fortran...There is also NumPy ...The key thing is that it is algorithms that are more important - and these are easier to write/learn in Python - so students should learn Python first and only some need to learn Fortran .</tokentext>
<sentencetext>&gt; It's also very easy to write simple programs in it.99% of programs are not simple any more...&gt; No way I'd use Python for serious large data set numerical calculations.No way I'd use Fortran for serious large data set numerical calculations...First I use Python - it is fastest to write and debug and has advanced data structures that simplify algorithms...What good is that my program run one day shorter if it took me 2 weeks more to write and debug...If I need to optimize - I am moving the internal part to C - it optimizes almost as well as Fortran...There is also NumPy ...The key thing is that it is algorithms that are more important - and these are easier to write/learn in Python - so students should learn Python first and only some need to learn Fortran.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
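The workflow the comment above describes (prototype everything in Python, then push only the hot inner loop down to C or NumPy) can be sketched with a toy kernel; the function name and numbers here are illustrative, not from the thread:

```python
# The kind of numerical kernel one might prototype in plain Python first.
# In a real code, this inner loop is the "hotspot" the commenter would
# later move to C, or replace with a vectorized call such as numpy.dot.
def dot(xs, ys):
    """Naive dot product of two equal-length sequences."""
    total = 0.0
    for x, y in zip(xs, ys):
        total += x * y
    return total

print(dot([1.0, 2.0, 3.0], [4.0, 5.0, 6.0]))  # 32.0
```

The point being made is that getting the algorithm right in readable Python first costs little, because only this one function ever needs to be rewritten for speed.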
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293115</id>
	<title>matlab</title>
	<author>Anonymous</author>
	<datestamp>1244732700000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Around my old research lab everyone used Matlab. Fortran is great and all, but most researchers are terrible programmers (think 1 method per 2000 lines of code). Matlab is just easier. Money for a license usually isn't a problem. Octave is available if money is a problem.</p></htmltext>
<tokenext>Around my old research lab everyone used Matlab .
Fortran is great and all , but most researchers are terrible programmers ( think 1 method per 2000 lines of code ) .
Matlab is just easier .
Money for a license usually is n't a problem .
Octave is available if money is a problem .</tokentext>
<sentencetext>Around my old research lab everyone used Matlab.
Fortran is great and all, but most researchers are terrible programmers (think 1 method per 2000 lines of code).
Matlab is just easier.
Money for a license usually isn't a problem.
Octave is available if money is a problem.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294607</id>
	<title>Wrap Fortran in Python -- best of both worlds</title>
	<author>lothario</author>
	<datestamp>1244738220000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext><p>I'm the developer in a Google Summer of Code project that integrates Fortran support in Cython.</p><p>Cython (http://cython.org/) is a project that takes a Python program with partial type information ('cdef int foo', 'cdef char *str') and generates a C file that is compiled to an extension module.  It is possible for Cython to generate pure C code for the hotspots, all while keeping the nice Python syntax.  It can also call external functions very easily, so you can integrate other libraries as long as the compiler can link it together.</p><p>That's where the GSoC project comes in -- it allows the programmer to take Fortran source code, wrap the subroutines/functions he wants, and generates a Fortran wrapper and a C header, which can then be called from any C code or Cython and, by extension, Python.  There's an older project, called 'f2py', that overlaps, but our GSoC project has some different aims, and is focusing on all versions of Fortran -- 77, 90, 95 with full array support (assumed shape, assumed size &amp; explicitly shaped) and support for derived types.</p><p>Lastly, there is an entire numerical/scientific community in Python, known as scipy (http://www.scipy.org/), that has all the functionality of Matlab/Octave, with the full power of Python. Scipy has optimized arrays with a very good array syntax.  Just like Matlab, Scipy hands off all numerically intensive work to external libraries, BLAS, ATLAS, etc.</p><p>So to the OP's question -- certainly teach Fortran, since that is the lingua franca of numerical programming and will be for a while.  But learn Python, too.  They play very nicely together, and it's getting better by the day.</p></htmltext>
<tokenext>I 'm the developer in a Google Summer of Code project that integrates Fortran support in Cython.Cython ( http : //cython.org/ ) is a project that takes a python program with partial type information ( 'cdef int foo ' , 'cdef char * str ' ) and generates a C file that is compiled to an extension module .
It is possible for Cython to generate pure C code for the hotspots , all while keeping the nice python syntax .
It can also call external functions very easily , so you can integrate other libraries as long as the compiler can link it together.That's where the GSoC project comes in -- it allows the programmer to take Fortran source code , wrap the subroutines/functions he wants , generates a Fortran wrapper and a C header , which can then be called from any C code or Cython and by extension , Python .
There 's an older project , called 'f2py ' that overlaps , but our GSoC project has some different aims , and is focusing on all versions of Fortran -- 77 , 90 , 95 with full array support ( assumed shape , assumed size &amp; explicitly shaped ) and support for derived types.Lastly , there is an entire numerical/scientific community in Python , known as scipy ( http : //www.scipy.org/ ) that has all the functionality of Matlab/Octave , with the full power of Python .
Scipy has optimized arrays with a very good array syntax .
Just like Matlab , Scipy hands off all numerically intensive work to external libraries , BLAS , ATLAS , etc.So to the OP 's question -- certainly teach Fortran , since that is the lingua franca of numerical programming and will be for a while .
But learn Python , too .
They play very nicely together , and it 's getting better by the day .</tokentext>
<sentencetext>I'm the developer in a Google Summer of Code project that integrates Fortran support in Cython.Cython (http://cython.org/) is a project that takes a python program with partial type information ('cdef int foo', 'cdef char *str') and generates a C file that is compiled to an extension module.
It is possible for Cython to generate pure C code for the hotspots, all while keeping the nice python syntax.
It can also call external functions very easily, so you can integrate other libraries as long as the compiler can link it together.That's where the GSoC project comes in -- it allows the programmer to take Fortran source code, wrap the subroutines/functions he wants, generates a Fortran wrapper and a C header, which can then be called from any C code or Cython and by extension, Python.
There's an older project, called 'f2py' that overlaps, but our GSoC project has some different aims, and is focusing on all versions of Fortran -- 77, 90, 95 with full array support (assumed shape, assumed size &amp; explicitly shaped) and support for derived types.Lastly, there is an entire numerical/scientific community in Python, known as scipy (http://www.scipy.org/) that has all the functionality of Matlab/Octave, with the full power of Python.
Scipy has optimized arrays with a very good array syntax.
Just like Matlab, Scipy hands off all numerically intensive work to external libraries, BLAS, ATLAS, etc.So to the OP's question -- certainly teach Fortran, since that is the lingua franca of numerical programming and will be for a while.
But learn Python, too.
They play very nicely together, and it's getting better by the day.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293525</id>
	<title>Re:While there may be "newer" languages</title>
	<author>Anonymous</author>
	<datestamp>1244734140000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>No kidding. I was a CS major that needed just one more math credit. I decided to do it with the fluff class (to a CS major) Computational Mathematics. The professor allowed us to use whatever programming language we wanted, but he admitted a soft spot for Fortran. I decided to try using it for a few assignments. It was immediately obvious that Fortran is a language well suited to the problems that I was solving. I'm surprised that somebody would suggest python instead of Fortran for number crunching, and it makes me wonder if the person has actually used both.</p><p>Incidentally, I did switch to Java later in the class. Once OO-style solutions started popping into my head as soon as the problem was presented, I decided it wasn't worth the time trying to find the Fortran solution. It was a fluff class after all.</p></htmltext>
<tokenext>No kidding .
I was a CS major that needed just one more math credit .
I decided to do it with the fluff class ( to a CS major ) Computational Mathematics .
The professor allowed us to use whatever programming language we wanted , but he admitted a soft spot for Fortran .
I decided to try using it for a few assignments .
It was immediately obvious that Fortran is a language well suited to the problems that I was solving .
I 'm surprised that somebody would suggest python instead of Fortran for number crunching , and it makes me wonder if the person has actually used both.Incidentally , I did switch to Java later in the class .
Once OO-style solutions started popping into my head as soon as the problem was presented , I decided it was n't worth the time trying to find the Fortran solution .
It was a fluff class after all .</tokentext>
<sentencetext>No kidding.
I was a CS major that needed just one more math credit.
I decided to do it with the fluff class (to a CS major) Computational Mathematics.
The professor allowed us to use whatever programming language we wanted, but he admitted a soft spot for Fortran.
I decided to try using it for a few assignments.
It was immediately obvious that Fortran is a language well suited to the problems that I was solving.
I'm surprised that somebody would suggest python instead of Fortran for number crunching, and it makes me wonder if the person has actually used both.Incidentally, I did switch to Java later in the class.
Once OO-style solutions started popping into my head as soon as the problem was presented, I decided it wasn't worth the time trying to find the Fortran solution.
It was a fluff class after all.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28314891</id>
	<title>Re:libraries. gigabytes of libraries</title>
	<author>seawall</author>
	<datestamp>1244805180000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>While I agree FORTRAN (especially the older dialects) is relatively easy to optimize for parallel operations
and clusters and the like, and the libraries are platinum, I do think they need rewrites from time to
time; those libraries can be hard to understand, and I want every generation
to have lots of people who understand, not just use, them.

<br> <br>
Also, every 10-20 years assumptions change. The early libraries did floating point entirely in
software and memory was tight; in 2009 we can almost assume any desktop (and most laptops) has floating point to IEEE standards available in single and double precision, plus virtual memory.

<br> <br>
Now we are seeing GPUs being used in non-graphics computation,
so algorithms that are stable in single precision and can actually use
64 pipelines are being written, and memory is tighter.</htmltext>
<tokenext>While I agree FORTRAN ( especially the older dialects ) is relatively easy to optimize for parallel operations and clusters and the like and the libraries are platinum , I do think they need rewrites from time to time ; those libraries can be hard to understand and I want every generation to have lots of people who understand , not just use , the things .
Also , every 10-20 years assumptions change .
The early libraries did floating point entirely in software and memory was tight , in 2009 we can almost assume any desktop ( and most laptops ) have floating point to IEEE standards available in single and double precision plus virtual memory .
Now we are seeing GPUs being used in non-graphics computation so algorithms that are stable in single precision and can actually use 64 pipelines are being written and memory is tighter .</tokentext>
<sentencetext>While I agree FORTRAN (especially the older dialects) is relatively easy to optimize for parallel operations
and clusters and the like and the libraries are platinum, I do think they need rewrites from time to
time;  those libraries can be hard to understand and I want every generation
to have lots of people who understand, not just use, the things.
Also, every 10-20 years assumptions change.
The early libraries did floating point entirely in
software and memory was tight, in 2009  we can almost  assume any desktop (and most laptops) have floating point to IEEE standards available in single and double precision plus virtual memory.
Now we are seeing GPUs being used in non-graphics computation
so algorithms that are stable in single precision and can actually use
64 pipelines are being written and memory is tighter.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919</parent>
</comment>
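The single- versus double-precision point in the comment above can be made concrete with a few lines of standard-library Python: round-tripping an IEEE double through 32-bit storage shows exactly where digits are lost (a toy illustration of my own, not from the thread):

```python
import struct

def to_single(x):
    """Round-trip an IEEE-754 double through 32-bit (single) storage."""
    return struct.unpack('<f', struct.pack('<f', x))[0]

print(to_single(0.5) == 0.5)   # True: 0.5 is exactly representable in both
print(to_single(0.1) == 0.1)   # False: 0.1 picks up rounding error at 32 bits
```

Algorithms that remain stable under that kind of loss are precisely the ones that can exploit single-precision GPU pipelines.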
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292357</id>
	<title>Re:While there may be "newer" languages</title>
	<author>aaaaaaargh!</author>
	<datestamp>1244730180000</datestamp>
	<modclass>Insightful</modclass>
	<modscore>4</modscore>
	<htmltext><tt>Mod parent up. Students should learn to choose the right tool for the right purpose and not be drawn into stupid "my language is best" discussions. Python is too slow for serious number crunching and Fortran is widely in use for exactly this purpose. Python is suitable for many other tasks, though. To give another example, if somebody studies astronomy and will have to work with old legacy Forth code, he had better be taught to program in Forth at university. And somebody who needs to squeeze maximum speed out of hardware or wants to implement compilers had better learn assembler. I've studied linguistics and learned to program in Prolog and Common Lisp at university, now what's wrong with that? Both are still widely in use in NLP and you need to know the basics of them when you're working in that domain even if you don't use them.<br><br>In my experience the majority of people who advertise one programming language above all others tend to have no clue about programming languages in general and what other languages exist apart from mainstream languages like C/C++ or Ruby, and I'm afraid this holds particularly for Python and Java enthusiasts---both of which are relatively mediocre and outdated languages in terms of their general features and usefulness, although they can of course be the right choice for many tasks.</tt></htmltext>
<tokenext>Mod parent up .
Students should learn to choose the right tool for the right purpose and not be drawn into stupid " my language is best " discussions .
Python is too slow for serious number crunching and Fortran is widely in use for exactly this purpose .
Python is suitable for many other tasks , though .
To give another example , if somebody studies astronomy and will have to work with old legacy Forth code , he had better be taught to program in Forth at university .
And somebody who needs to squeeze maximum speed out of hardware or wants to implement compilers had better learn assembler .
I 've studied linguistics and learned to program in Prolog and Common Lisp at University , now what 's wrong with that ?
Both are still widely in use in NLP and you need to know the basics of them when you 're working in that domain even if you do n't use them.In my experience the majority of people who advertise one programming language above all others tend to have no clue about programming languages in general and what other languages exist apart from mainstream languages like C/C + + or Ruby , and I 'm afraid this holds particularly for Python and Java enthusiasts---both of which are relatively mediocre and outdated languages in terms of their general features and usefulness , although they can of course be the right choice for many tasks .</tokentext>
<sentencetext>Mod parent up.
Students should learn to choose the right tool for the right purpose and not be drawn into stupid "my language is best" discussions.
Python is too slow for serious number crunching and Fortran is widely in use for exactly this purpose.
Python is suitable for many other tasks, though.
To give another example, if somebody studies astronomy and will have to work with old legacy Forth code, he had better be taught to program in Forth at university.
And somebody who needs to squeeze maximum speed out of hardware or wants to implement compilers had better learn assembler.
I've studied linguistics and learned to program in Prolog and Common Lisp at University, now what's wrong with that?
Both are still widely in use in NLP and you need to know the basics of them when you're working in that domain even if you don't use them.In my experience the majority of people who advertise one programming language above all others tend to have no clue about programming languages in general and what other languages exist apart from mainstream languages like C/C++ or Ruby, and I'm afraid this holds particularly for Python and Java enthusiasts---both of which are relatively mediocre and outdated languages in terms of their general features and usefulness, although they can of course be the right choice for many tasks.</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294429</id>
	<title>Python?</title>
	<author>Anonymous</author>
	<datestamp>1244737620000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext>I wish there were a repellent I could use to keep away all the Python fanboys.

I think a course in the history of programming languages would be a valuable addition to the curriculum and, in such a class, do some programming in Fortran, Smalltalk, Ada, Algol, and so on.  In a 101 class, teach something practical.  As ugly as it can be, something as simple as JavaScript can be used to illustrate a lot of concepts in programming and is useful to most people who will never write anything more than a Greasemonkey script.</htmltext>
<tokenext>I wish there was a repellent I could use to keep away all the python fan boys .
I think a course in the history of programming languages would be a valuable addition to the curriculum and , in such a class , do some programming in Fortran , Smalltalk , Ada , Algol , and so on .
In a 101 class , teach something practical .
As ugly as it can be , something as simple as javascript can be used to illustrate a lot of concepts in programming and is useful to most people who will never write anything more than a greasemonkey script .</tokentext>
<sentencetext>I wish there was a repellent I could use to keep away all the python fan boys.
I think a course in the history of programming languages would be a valuable addition to the curriculum and, in such a class, do some programming in Fortran, Smalltalk, Ada, Algol, and so on.
In a 101 class, teach something practical.
As ugly as it can be, something as simple as javascript can be used to illustrate a lot of concepts in programming and is useful to most people who will never write anything more than a greasemonkey script.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28305551</id>
	<title>Laughable Python advocacy</title>
	<author>dugeen</author>
	<datestamp>1244802420000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>The idea of Python as a modern replacement for FORTRAN is totally risible, given that Python is actually more dependent on significant whitespace than FORTRAN. In fact that's why Python is so called.</htmltext>
<tokenext>The idea of Python as a modern replacement for FORTRAN is totally risible , given that Python is actually more dependent on significant whitespace than FORTRAN .
In fact that 's why Python is so called .</tokentext>
<sentencetext>The idea of Python as a modern replacement for FORTRAN is totally risible, given that Python is actually more dependent on significant whitespace than FORTRAN.
In fact that's why Python is so called.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294623</id>
	<title>Teach a bunch of obscure languages</title>
	<author>LordNimon</author>
	<datestamp>1244738280000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>I think every undergraduate computer science student should have one semester where he learns a large number of different languages, and has to write a simple program in each one.  Every two weeks, you switch to another language.  This will give the student some exposure to different language types, and if a particular language appeals to him, he'll remember it.</htmltext>
<tokenext>I think every undergraduate computer science student should have one semester where he learns a large number of different languages , and has to write a simple program in each one .
Every two weeks , you switch to another language .
This will give the student some exposure to different language types , and if a particular language appeals to him , he 'll remember it .</tokentext>
<sentencetext>I think every undergraduate computer science student should have one semester where he learns a large number of different languages, and has to write a simple program in each one.
Every two weeks, you switch to another language.
This will give the student some exposure to different language types, and if a particular language appeals to him, he'll remember it.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291963</id>
	<title>Fortran is weak sauce buddy</title>
	<author>Anonymous</author>
	<datestamp>1244728620000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>Fortran sucks.</p><p>Get those 50+ year old programmers out of the industry if they are using such an old language.</p><p>Fortran needs to retire, just like every single programmer that thinks that Fortran is worth using.</p></htmltext>
<tokenext>Fortran sucks.Get those 50 + year old programmers out of the industry if they are using such an old language.Fortran needs to retire , just like every single programmer that thinks that Fortran is worth using .</tokentext>
<sentencetext>Fortran sucks.Get those 50+ year old programmers out of the industry if they are using such an old language.Fortran needs to retire, just like every single programmer that thinks that Fortran is worth using.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294125</id>
	<title>Difference between teaching and what I use</title>
	<author>Anonymous</author>
	<datestamp>1244736360000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>There is a difference between asking what should be taught and what I personally find useful in my work.<br>I don't know what should be taught...I am not a teacher.<br>But I can tell people what I find useful in my work.</p></htmltext>
<tokenext>There is a difference between asking what should be taught and what I personally find useful in my work.I do n't know what should be taught...I am not a teacher.But I can tell people what I find useful in my work .</tokentext>
<sentencetext>There is a difference between asking what should be taught and what I personally find useful in my work.I don't know what should be taught...I am not a teacher.But I can tell people what I find useful in my work.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292931</id>
	<title>C++ on Particle Accelerators</title>
	<author>tarlss</author>
	<datestamp>1244732040000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Most physicists I know have moved on to C++ code. At least I know they use this at the Brookhaven and Tsukuba particle accelerators in Long Island, NY and Japan respectively.</htmltext>
<tokenext>Most physicists I know have moved on to C + + code .
At least I know they use this at the Brookhaven and Tsukuba particle accelerators in Long Island , NY and Japan respectively .</tokentext>
<sentencetext>Most physicists I know have moved on to C++ code.
At least I know they use this at the Brookhaven and Tsukuba particle accelerators in Long Island, NY and Japan respectively.</sentencetext>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296953</id>
	<title>Re:While there may be "newer" languages</title>
	<author>ctrl-alt-canc</author>
	<datestamp>1244746440000</datestamp>
	<modclass>None</modclass>
	<modscore>1</modscore>
	<htmltext>Sorry, this is <a href="http://www.oonumerics.org/blitz/" title="oonumerics.org" rel="nofollow">no longer true</a> [oonumerics.org]. Furthermore, having personally developed several FDFT algorithms both in Fortran and in C, I can tell that rivaling Fortran's performance isn't that difficult. IMHO undergraduates should rather learn very well how to use data structures and algorithms: <i>rem tene, fortran sequentur</i>...</htmltext>
<tokenext>Sorry , this is no longer true [ oonumerics.org ] .
Furthermore , having personally developed several FDFT algorithms both in Fortran and in C , I can tell that rivaling Fortran 's performance is n't that difficult .
IMHO undergraduates should rather learn very well how to use data structures and algorithms : rem tene , fortran sequentur.. .</tokentext>
<sentencetext>Sorry, this is no longer true [oonumerics.org].
Furthermore, having personally developed several FDFT algorithms both in Fortran and in C, I can tell that rivaling Fortran's performance isn't that difficult.
IMHO undergraduates should rather learn very well how to use data structures and algorithms: rem tene, fortran sequentur...</sentencetext>
	<parent>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859</parent>
</comment>
<comment>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28333571</id>
	<title>FORTRAN is still prevalent</title>
	<author>Anonymous</author>
	<datestamp>1245068820000</datestamp>
	<modclass>None</modclass>
	<modscore>0</modscore>
	<htmltext><p>In my company, a HUGE number of our analysis codes are still in FORTRAN.  I never learned it in school and have had to teach myself enough to keep the codes working properly.  I've even suggested in the past that we upgrade our codes, but there was much howling from the graybeards who own them.</p></htmltext>
<tokenext>In my company , a HUGE number of our analysis codes are still in FORTRAN .
I never learned it in school and have had to teach myself enough to keep the codes working properly .
I 've even suggested in the past that we upgrade our codes , but there was much howling from the graybeards who own them .</tokentext>
<sentencetext>In my company, a HUGE number of our analysis codes are still in FORTRAN.
I never learned it in school and have had to teach myself enough to keep the codes working properly.
I've even suggested in the past that we upgrade our codes, but there was much howling from the graybeards who own them.</sentencetext>
</comment>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_34</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291939
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292813
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292839
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_59</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28313449
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_53</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292075
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295217
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_24</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292329
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293225
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_87</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292565
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28302393
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293455
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_17</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292159
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_92</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28314891
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_77</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28317201
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_54</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292357
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28305081
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_82</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294815
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297325
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_45</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293573
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_21</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291925
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292733
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_44</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293759
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295325
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292117
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297009
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_23</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292033
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292943
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_18</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293445
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_46</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292783
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_51</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292125
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297005
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294815
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296441
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_69</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292357
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293047
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_22</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293041
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297603
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303611
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_74</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291939
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293535
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_101</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297127
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_103</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292565
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297941
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_99</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296953
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28307685
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303789
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_15</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294677
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_90</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292901
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_81</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291925
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296641
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_64</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298809
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_38</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292165
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293037
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_80</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292165
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295865
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_43</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292745
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_71</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28300331
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_57</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292565
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303863
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_28</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292045
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292331
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303623
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_33</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28299473
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28302227
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_35</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292251
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296619
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_100</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293213
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_96</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292777
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292893
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_72</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291925
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293323
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_63</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292033
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303769
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_86</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293759
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28307977
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_62</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293253
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_93</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291939
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292059
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28306665
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_36</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292125
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295549
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_27</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291849
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292031
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_30</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291953
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294685
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_55</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293639
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_26</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292359
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_20</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292225
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_78</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292483
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296803
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_19</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293833
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_94</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298321
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_85</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292931
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294121
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292147
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296307
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_68</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293329
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_61</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293277
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_84</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294815
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28305583
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_75</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292033
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293959
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_58</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294965
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_49</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293475
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_91</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28301825
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_52</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292819
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_25</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298375
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_48</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292033
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292683
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_39</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292457
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_42</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292165
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295743
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293483
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_16</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293993
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_76</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292063
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292523
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_67</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293347
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_83</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292293
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295131
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_66</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293525
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_97</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294815
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28315971
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292321
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_73</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291953
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28301159
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_47</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303715
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_50</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291939
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292059
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298443
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_41</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292433
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_37</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28302379
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_89</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292165
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296707
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_102</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293411
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_40</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28299295
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_98</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291953
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296129
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28301327
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291939
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292089
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_31</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293641
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294123
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_104</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292045
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292331
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293791
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_65</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295763
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_88</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294815
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28308871
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_79</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292565
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28299345
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_70</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291925
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294097
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_95</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294857
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_56</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292915
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292501
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_29</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292357
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295327
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_32</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293261
</commentlist>
</thread>
<thread>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#thread_09_06_11_1228209_60</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303057
</commentlist>
</thread>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.19</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292209
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.1</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292315
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.7</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292033
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293959
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292943
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292683
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303769
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.17</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292931
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294121
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.8</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292075
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295217
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.10</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291919
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28314891
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292501
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293277
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292165
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295865
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296707
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293037
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295743
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292433
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292329
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293225
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28299295
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.5</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294365
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.2</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291925
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294097
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296641
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293323
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292733
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.0</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293451
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.11</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292153
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292777
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292893
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298809
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292915
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292483
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292745
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294677
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296803
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293993
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.22</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292521
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.20</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291849
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292031
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.3</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292127
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293041
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297603
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28317201
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293455
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293261
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293833
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28302379
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293639
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293329
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303789
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293253
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.6</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295001
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.4</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292599
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.25</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291847
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293573
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291953
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294685
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296129
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28301327
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28301159
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293213
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294815
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28305583
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296441
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297325
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28308871
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28315971
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294857
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.14</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292063
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292523
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.9</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28299473
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28302227
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.12</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291859
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293525
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293641
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292159
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293347
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303057
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292357
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295327
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293047
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28305081
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292045
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292331
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293791
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303623
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292225
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293411
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303611
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296953
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298375
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292117
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297009
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294123
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292839
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292147
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296307
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292901
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292251
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28296619
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292025
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292783
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292457
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292359
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292321
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298321
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292401
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293759
----http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295325
----http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28307977
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28300331
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303715
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295763
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28307685
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28294965
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28313449
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297127
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28301825
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293445
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293483
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292125
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295549
---http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297005
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293475
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292819
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.26</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292047
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.15</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292111
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.18</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291939
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292813
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292089
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28293535
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292059
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28306665
--http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28298443
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.23</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291963
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.24</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28291985
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.13</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292565
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28297941
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28303863
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28299345
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28302393
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.16</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292051
</commentlist>
</conversation>
<conversation>
	<id>http://www.semanticweb.org/ontologies/ConversationInstances.owl#conversation09_06_11_1228209.21</id>
	<commentlist>http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28292293
-http://www.semanticweb.org/ontologies/ConversationInstances.owl#comment09_06_11_1228209.28295131
</commentlist>
</conversation>
