|
Southmountain wrote: also I searched around and try to find a good IDE for GFortran. any recommendations for such IDEs? Ehr.. Fortran??
We keep burying VB6. Fortran?? Do we need to shoot it again?
Is the Fortran code worth anything? If yes, then maybe rewrite it in a modern language? You made me curious; I'm awaiting your answer.
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
yes, it implements some algorithms from astrophysics, I guess...
diligent hands rule....
|
If you plan to do programming in the area of astrophysics ...
There was an excellent program called Astronomy Lab for Windows, ALW - or ALW203 for the last version; I believe it was published in 1996. Maybe I should say that there is (not 'was') this program: It will still run under Windows 10. The functionality is great.
The only problem is that the user interface suggests it was developed under Windows 1.0. You can't zoom fonts, all text is monospaced 'fixedsys' (the 'Set font' command has no effect), tables are drawn using |, + and - characters, graphs are drawn using hairlines, and there is no support for cut & paste, the scroll wheel, or several other standard Windows functions. 30 years ago I was in contact with the developer, offering to update the user interface (I had a visually handicapped daughter who couldn't see the hairlines and needed a bigger font), but I was turned down: he had no intention of continuing to develop the program, so there was no reason for him to let anyone improve the UI(!)
I have never found any replacement for ALW providing the same functionality and simplicity of use (in spite of its UI shortcomings). So if you go ahead developing anything with similar functionality, and a more modern UI, please use ALW as a checklist for what to include, and present the results as a CP article!
ALW203 is still available for downloading from a large number of sites on internet.
|
Eddy Vluggen wrote: We keep burying VB6. Fortran?? Do we need to shoot it again? Don't worry - it was shot, the Fortran giving you nightmares.
C.A.R. Hoare was right in his remark on the proposed extensions to Fortran 77: "I don't know what programming languages will look like in the year 2000, but they will be named Fortran!"
I suspect that if you were presented with a sample of modern Fortran code, you would never guess that the language is named Fortran. The evolution from Fortran IV to modern Fortran is more drastic than the evolution from the original thick-coax 3 Mbps linear-bus Ethernet to the Ethernet of today, using Cat6/RJ45, 1 Gbps, star topology.
A couple of years ago I talked with a friend of mine who works at the supercomputing center of the Norwegian universities. He told me that Fortran (in its modern form) is still a very significant language in supercomputer environments. Lots of scientists / developers find Fortran much more suitable than C/C++ for array manipulation, and lots of engineering problems are essentially array manipulation.
|
Fortran must be one of the most maligned programming languages ever -- and that's saying something. FORTRAN IV was pretty awful, admittedly, but versions from Fortran 77 onwards were both very useful and usable -- regardless of the opinions of academic computer scientists. And modern Fortran is very much alive and well. Although I've spent most of the last 35 years using C, C++ and Python, I have no complaints about the Fortran versions I used (a lot) way back when and still use for hobbyist purposes. C++, on the other hand ... (although C++11 onwards is pretty decent).
|
trønderen wrote: Don't worry - it was shot, the Fortran giving you nightmares. It was a one night stand.
trønderen wrote: C.A.R. Hoare was right in his remark to the proposed extensions for Fortran 77: "I don't know what programming languages will look like in year 2000, but they will be named Fortran!" I'm awaiting Fortran.NET.
Most people do not own a supercomputer. The thing that Fortran has going for it is that it isn't VB6.
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
You sort of remind me of Peter.
|
Eddy Vluggen wrote: Is the Fortran code worth anything? If yes, then maybe rewrite it in a modern language? "If it works, don't fix it!"
Rewriting software in another, potentially poorly suited, language just because that language is fashionable in many software development communities may be a bad idea. Not always, but you need some stronger arguments than "We don't think Fortran IV reflects modern ideas about programming languages."
|
#define "a lot"?
Yours and mine may differ a bit. VB6 is still used "a lot", probably more than Fortran. Most business applications aren't in either language. There's "a lot" of software developed by businesses, "a bit" more than by scientists.
Being used a lot isn't an argument, but an observation. It might be because lots of scientists learned "just" Fortran and COBOL in their younger years. That's why Java is a success: many universities refuse any commercial software, and hence they prefer what small business doesn't.
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
I did read this page and decided to go with Code::Blocks...
diligent hands rule....
|
I was going to recommend Code::Blocks. I use it a lot. Great editor.
"A little time, a little trouble, your better day"
Badfinger
|
What did you do to get punished like that?
I’ve given up trying to be calm. However, I am open to feeling slightly less agitated.
|
personal hobby for astrological questions in financial areas...
diligent hands rule....
|
I used Fortran (77, 90) for years back in the day. Its claim to fame is very fast execution of mathematical problems, maybe the fastest. I was told it was the main compiler used on NOAA's supercomputers to predict/simulate weather systems. Today, of course, GPUs have come to the forefront, so unless Fortran has been ported to these systems it may be eclipsed. A lot of Fortran code is still around.
"A little time, a little trouble, your better day"
Badfinger
|
If I am not mistaken, CUDA can be used with FORTRAN.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
I would not be surprised. Thanx, I didn't know that.
"A little time, a little trouble, your better day"
Badfinger
|
Another good thing about old-style Fortran: you wouldn't experience any stack overflows, no out-of-heap-space errors, no null pointer exceptions. Pre-90 Fortran didn't allow recursion and had no dynamic allocation. All memory could be statically allocated. No run-time load, no risk.
Lots of computing tasks can be solved without recursion and without new()/pointers. If you absolutely must do a recursion, you can manage your own stack in an array. The same goes for linked lists.
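To make the "manage your own stack in an array" trick concrete, here is a minimal sketch, in Python rather than Fortran for brevity. It mimics the pre-Fortran-90 constraints: one fixed-size array declared up front (standing in for a static DIMENSION) and an integer stack pointer managed by hand. The names and the factorial example are purely illustrative.

```python
STACK_SIZE = 100            # fixed at "compile time", like a Fortran DIMENSION
stack = [0] * STACK_SIZE    # statically "allocated" stack array

def factorial_no_recursion(n):
    """Factorial without recursion: the call stack is our own array."""
    top = 0                  # integer stack pointer
    # Push phase: unwind what would have been the recursive calls.
    while n > 1:
        stack[top] = n
        top += 1
        n -= 1
    result = 1
    # Pop phase: what would have been the returns.
    while top > 0:
        top -= 1
        result *= stack[top]
    return result
```

The same pattern (array plus integer index, no allocation at run time) carries over directly to Fortran 77 and earlier.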
In my student days, I was a TA for the Introduction to Programming course, taught to all Tech. University students. Some of the "classical" departments were still clinging to Fortran as the only viable language; half of the students were more modern, learning Pascal. The courses were identical, except for the language (even the textbook was identical but for the coding examples). 3 out of 4 hand-ins were identical. For the last one, the Pascal students were to build and manipulate a linked list, so the Fortran students had a completely different #4 hand-in.
One of the 'Fortran students' approached me, rather cross: why should the others learn something that we don't get to learn? So I tried to explain to her how you could have a record field tell where you could find the next piece of data; I believe I explained it by referring to memory as a large array, with the pointer being an index into that array. A few days later, this freshman girl approached me again, this time with Fortran code solving the Pascal linked-list problem: the 'heap' was a Fortran array, pointers were integers indexing into it, and the code certainly did solve the problem, giving the proper output.
If a freshman, non-computer girl (I think she was studying chemistry) can do it in Fortran, then a seasoned Fortran programmer with thirty years of experience should be able to!
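For the curious, one plausible shape of that solution, sketched in Python rather than Fortran: the "heap" is a pair of fixed arrays, a "pointer" is an integer index, and -1 plays the role of NIL. This is an illustration of the technique, not the student's actual code; all names are made up.

```python
HEAP_SIZE = 50
data = [0] * HEAP_SIZE       # payload field of each "record"
nxt = [-1] * HEAP_SIZE       # "pointer" field: index of the next record
free_top = 0                 # the "allocator" is just a bump counter
NIL = -1                     # our stand-in for a null pointer

def cons(value, tail):
    """Allocate a record from the array heap and link it in front of tail."""
    global free_top
    i = free_top
    free_top += 1
    data[i] = value
    nxt[i] = tail
    return i                 # the "pointer" we hand back is an integer

def to_list(head):
    """Walk the chain of integer pointers, collecting the payloads."""
    out = []
    while head != NIL:
        out.append(data[head])
        head = nxt[head]
    return out

# Build the list 1 -> 2 -> 3, front to back.
head = cons(3, NIL)
head = cons(2, head)
head = cons(1, head)
```

Translated back into Fortran, `data` and `nxt` become two statically dimensioned arrays and `cons` a subroutine; nothing here needs dynamic allocation.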
|
Yes, you're correct. Code was all statically determined at compile time. Recursion was simulated with one's own stack arrays, again pre-allocated. If it was memory-resident at execution time, it was bullet fast.
For many problems, the only thing not static was disk I/O. True, the memory it used was static, but I/O times were not guaranteed fixed. Virtual memory was another variable not guaranteed fixed, but for most purposes it was as good as fixed.
"A little time, a little trouble, your better day"
Badfinger
|
In my Fortran IV days, I wrote stacks / heaps / linked lists / trees using array and integer indices. That experience proved useful when I worked on a mainframe assembler. I was able to write recursive routines (e.g. an optimised quicksort) using the same techniques I had used in Fortran, emulating procedure calls as an in situ stack of return keys.
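A sketch of that kind of recursion-free routine, in Python for brevity: quicksort with an explicit stack of pending (lo, hi) ranges standing in for call frames, the way one would write it in Fortran IV or assembler. This is an illustration of the general technique, not jsc42's actual code; the Lomuto partition is chosen for simplicity, not because it matches the optimised original.

```python
def quicksort_explicit_stack(a):
    """Sort list a in place without recursion, using an explicit stack."""
    stack = [(0, len(a) - 1)]      # pending subranges instead of call frames
    while stack:
        lo, hi = stack.pop()
        if lo >= hi:
            continue               # range of length 0 or 1: nothing to do
        # Lomuto partition around the last element of the range.
        pivot = a[hi]
        i = lo
        for j in range(lo, hi):
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]  # pivot lands at its final position i
        # Push both halves -- these replace the two recursive calls.
        stack.append((lo, i - 1))
        stack.append((i + 1, hi))
    return a
```

In Fortran IV the `stack` list would be a pair of pre-dimensioned integer arrays with a hand-managed top-of-stack index, exactly as in the array-heap trick above.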
|
jsc42 wrote: In my Fortran IV days, I wrote stacks / heaps / linked lists / trees using array and integer indices. That experience proved useful when I worked on a mainframe assembler. 40 years ago I wrote a coroutine mechanism in assembler. This was on a VAX-like 32 bit supermini CISC, and I loved "having full control".
I recently picked up the Aarch64 documentation to learn the ARM instruction set. I've never worked on a machine that register-oriented, and I'm really itching to see both how bad the code that compilers generate is, and how much you can save through assembler coding. My experience from that supermini, a heavily microcoded CISC, 40 years ago, was that even if the generated code looked inefficient, it was very difficult to beat the compiler by more than a single-digit percentage: pipelining, fancy prefetching and other optimization techniques flushed linear sequences through so rapidly that the execution time was almost proportional to the number of jumps taken, killing all gain from the prefetch (branch prediction wasn't common in those days). Interrupts were extremely expensive.
So I am curious to see if a RISC-type CPU is much different. (After studying the Aarch64 "reduced" instruction set, my reaction is "Thank heaven that the ARM doesn't have a complex instruction set!") So I am itching to get myself something like that "Windows Dev Kit 2023", aka "Volterra". But I fear that, looking back five years from now, we'll view both it and the ARM version of Windows that comes with it as early prototypes. So I will try to hold back, hoping that within a year or two there will be a nice crop of competitors to choose from.
|