Question:
Why is Fortran still so popular in scientific computing?
2014-01-04 14:58:06 UTC
Does anyone know why Fortran is still so popular for scientific computing? I read on Wikipedia (http://en.wikipedia.org/wiki/Fortran) that Fortran is the language used for the programs that benchmark and rank the world's fastest supercomputers. I am not an expert programmer, but I think C++ and Java are more convenient to program in and their code is more readable. Please don't answer in computer science jargon, as I'm not a computer scientist.
Six answers:
Vincent G
2014-01-05 16:18:32 UTC
Several reasons. For instance:



1- a lot of work has been done to tune Fortran compilers so that they produce the fastest compiled code, bar none. Fortran will almost always outpace code written in a different language because it lacks some features that make programming easier but optimization harder. Java, for instance, always takes a performance penalty because of the VM layer, which claims its share of additional processing. C++ has dynamic memory allocation, which may be neat to program with but puts a burden on processing. In contrast, a Fortran code typically reserves all its memory space from the get-go and uses static addresses that can be optimized



2- lots of scientific libraries and optimized algorithms have been developed in Fortran. No one sees any benefit in reinventing the wheel, so Fortran remains because Fortran was there first



The types of computation done in science and engineering are not those where the transaction volume fluctuates. You would have that in a video game, for instance, where the number of obstacles to avoid in a race could change, or in an inventory system. On the other hand, if you want to compute the stress in a car during a crash, the number of parts stays the same; it is only the physical load applied to each element that changes, and that load has to be computed for all the elements thousands of times.



Those are instances where ease of programming takes second place to computing efficiency. A weather prediction model will run for hundreds of hours on supercomputers; the extra time required to program it in a less 'user-friendly' language is not a big problem, as the code will be used hundreds of times with only the input changed. Also, with a language that is a bit more demanding, there is less chance that non-expert programmers will get involved and introduce new bugs. You see, expert programmers are not that common. The flip side of 'easy' languages is that a lot of people who are not, and never will be, experts are writing programs. If a video game has a flaw, nobody gets hurt. If an aircraft is flawed because of a problem in the code that designed it, people may die.
?
2016-12-15 15:03:00 UTC
Wikipedia Fortran
Jonathan
2014-01-05 04:19:32 UTC
The reasons ARE technical. There certainly is history, too. But without the technical reasons, the history would certainly have been much different. So it's essentially based on some early technical design decisions which turned out to be beneficial for vector-processing machines (which were commercially available in the late 1970s and into the 1980s and beyond).



The ONE thing that no one seems to have hit on (because I guess none of them actually USE FORTRAN for numerical processing themselves) is that FORTRAN makes certain contractual guarantees which permit compilers to optimize the resulting code better for both scalar and vector processing. Note that I'm not saying FORTRAN compilers use magical methods that C compilers mysteriously lack. I'm saying the language itself makes statements about the code you are permitted to write that differ from what a C coder is permitted to write. And these differences matter when it comes to optimization.



I'll provide a single example. You can find more, if you need them.



In FORTRAN, you cannot pass two arrays as arguments if those arrays overlap anywhere. In C, you can pass an overlapping array as a second parameter, for example; in FORTRAN, you cannot. This simple guarantee alone allows for ready vectorizing on VLIW, pipelined, or parallel-functional-unit systems. A C compiler, because it has no such guarantee, cannot generate such code for the function. It has to assume the memory "might" overlap and generate appropriately conservative code.



This difference was one important reason for the history you see. It started early, was found to be valuable as optimization technology leapt ahead during the 1980s (see Bulldog: A Compiler for VLIW Architectures, 1985, by Dr. Ellis), and was only recently headed off at the pass, so to speak, with the "restrict" keyword in C. There are other reasons that still exist, both historical and real, that still cause FORTRAN to be preferred. But the gap is diminishing somewhat.



A great deal of optimization effort for the highest-performance computers (vector-processing, VLIW, and "transputer"-array styles as represented by high-end Intel and NVIDIA hardware) has already been plowed into FORTRAN. If you want to get the most out of the fastest machines, you use FORTRAN (and/or mix in assembly). You won't find the ability to move code across block edges to fill functional units, recognize DRAM cache-refresh cycle boundaries for aligning data sets, and so on, elsewhere. It may happen in special cases here and there with other languages. But if you have a high-end supercomputer to sell, you port a high-performance optimizing FORTRAN to it first; you won't care about the other languages until later. Obviously, if you are porting existing code, you will depend on that first-out compiler, too. And if you are developing new code, you will use the first compiler you can lay hands on and be reasonably sure is solid.
Me2
2014-01-05 01:56:48 UTC
There are some very good reasons that Fortran is still widely used.  First and most obviously is that it HAS been widely used, and it is far easier to refine or add a feature to an existing large program than to recode it from scratch.  Second, a great many programmers are skilled in the language.



Third, fourth, and fifth are that Fortran is still a relatively simple programming language, it is the most suitable for computationally-intensive tasks, and it is FAST — Fortran compilers still generate some of the best optimized object code.



And sixth, although deserving higher priority than this tail-end mention, is the truly enormous Fortran code base, exceeded by COBOL's but having few other rivals.



Fortran is widely used in crash test simulations, nuclear weapons modeling and analysis of seismic data to detect nuclear events, satellite and probe data, weather and climate modeling and prediction, financial data, fluid dynamics work in aerospace (and the 1,000 MPH car), seismic data from petroleum exploration, etc., etc.
adaviel
2014-01-04 16:09:20 UTC
There are still code libraries that represent thousands of person-years of effort and still work just fine. No one wants to re-code them in C++ and re-certify them. Even if the original code has bugs, it is what has been used to analyze lots of scientific data. If you analyze new data with a new program and get a different result, no one can tell whether that's because the program is different or the data is different.



But Fortran has declined in popularity, and I don't think there is as much new code being written. However, I think it sometimes gives better-optimised, faster binaries, which may be significant - that may be the difference between an analysis run taking 3 days or 1 day.
modulo_function
2014-01-04 15:39:55 UTC
Fortran stands for Formula Translation. It was one of the very first high-level languages developed. It was developed before the developers of Java were born. It might even have been developed before their parents were born.



I learned FORTRAN IV when I started engineering school in 1971.


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.