|
You wouldn't want too many of those in your app - they would eat stack like it was going out of style!
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Nowadays, stack is a problem even in Fortran
Recursion didn't enter Fortran until Fortran 90. All memory could be statically allocated. One of the reasons Fortran has survived so well is that it would never crash on stack overflow (or an empty heap); it is 'safe', memory-wise. Even though recursion has been allowed from Fortran 90 on, seasoned Fortran programmers are not familiar with the term and won't ever use it.
Several other languages of that age did allow recursion, but only with functions explicitly flagged as recursive. Chill and Ada are examples - and for Ada, recursion was only allowed if the maximum recursion depth (hence, stack requirements) could be statically determined, at compile time.
In this case, both the old Fortran IV compiler and the new Fortran 77 one "secretly" allowed recursion. The compiler writers used back ends shared with other languages, and they didn't spend resources on making a special stack-less back end for Fortran. The static nature of Fortran made it trivial to determine the maximum stack requirement for a (non-recursive) Fortran program. I suspect the same was true for a lot of other Fortran compilers around the 1970s-80s. Actually, memory requirements were reduced, as the static allocation for each function could be shrunk, with several functions sharing the same stack locations if they could never be running at the same time. (I have a vague memory of an article claiming that Fortran optimizers could reuse even statically allocated space across functions that couldn't be active at the same time, but I remember no details.)
Which reminds me of an old thought - warning: sidetrack follows:
(Maybe this should be moved into the 'Architecture' forum)
There is no real reason why stack frames are allocated edge to edge! The stack head has a pointer to the frame below, but that pointer could go anywhere. Stack-relative addressing never goes outside the current stack frame. Traditionally, even if you have a million objects, each running its own thread, each thread is allotted a stack sized for its maximum requirement. This can tie up quite a few megabytes that are rarely, if ever, used - most certainly not all at the same time. Never is every single thread preempted at its very deepest nesting of calls at exactly the same moment.
Even though on desktop PCs you can just plug in another 64 GiB of RAM to satisfy stack needs, in other systems (such as embedded ones) this is not always possible.
Stack frames could be allocated from the heap, with space occupied only as long as a function is active. Thereby, a given amount of RAM could handle a much larger number of threads. Especially in event-driven architectures, a large fraction of threads idle at a low stack level: a thread receives an event, descends into a nest of calls to handle it, and then returns to the low stack level to wait for the next event. The thread might do some sort of 'yield' halfway, but the great majority of threads would be back at 'ground base', or at a moderate call-nesting level, most of the time.
The argument against this is of course the cost of heap allocation. There were machines offering microcoded function-entry instructions that did heap allocation from a buddy heap, so stack frame allocation was essentially unlinking the top element from the freelist of the appropriate size. Function return linked the frame back to the top of the list. This requires only a couple of memory cycles for each operation.
Even though another 64 GiB of RAM is cheap nowadays, lots of programmers recoil in horror at buddy allocation: it causes lots of internal fragmentation! 25% with binary buddies! Well, but how much can you save in stack requirements? Besides, lots of architectures demand word, doubleword or paragraph stack alignment anyway - that leads to internal fragmentation/waste as well! With a buddy scheme, allocation might be as cheap as two instructions. What is that against the total instruction budget for a modern-style method call?
If the allocation cost worries you, lots of optimizations are possible, in particular if frame allocation is the responsibility of the caller. E.g. if a function whose stack frame has a lot of unused (internal fragmentation) space calls a function asking for only a small frame, the callee's frame could fit into that unused space, not requiring another heap allocation. There are dozens of such optimizations.
I have never heard of any software allocating stack frames on the heap. I have never heard of anyone using the microcoded stack-frame heap allocation on the one architecture I know of that provides it. Is that because I am ignorant? Is it commonplace? Or has it been considered and rejected? If so, what were the arguments against it?
|
|
|
|
|
When I wrote embedded code, I never used a heap at all - my code had to run 24/7/365, and dynamic heap allocation can cause fragmentation problems that only ever show up at the customer site, once the kit has been working for a long time; longer than I can practically test for!
Static allocation - with a small stack - was the order of the day.
And in those days, I had 32 KB of RAM if I was lucky ... it was normally 8 KB.
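The static-allocation style described above can be sketched in a few lines of C: every buffer the firmware will ever need is declared up front, so there is no malloc, no free, and nothing to fragment after months of uptime. The names and sizes here are invented for illustration:

```c
#include <stdint.h>

#define RX_BUF_LEN 128
#define MAX_SENSORS 8

/* All storage is static - fixed at link time, never freed */
static uint8_t rx_buf[RX_BUF_LEN];     /* serial receive buffer */
static int16_t readings[MAX_SENSORS];  /* latest sensor samples */
static uint16_t rx_count;              /* bytes currently held in rx_buf */

/* ISR-style byte sink: when the buffer is full, data is dropped
 * rather than the buffer grown - behaviour is bounded by design */
void uart_rx_byte(uint8_t b)
{
    if (rx_count < RX_BUF_LEN)
        rx_buf[rx_count++] = b;
}
```

The cost is that worst-case sizes must be chosen at build time, but that is exactly the discipline that makes the memory behaviour testable before shipping.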
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
So you never worked on an 8051 with 256 bytes?
|
|
|
|
|
No, but I did some work with the PIC10 - 32 bytes of register memory, plus a two-deep call stack, tiny code ROM - and no interrupts. Handy little buggers, but a pain to program anything complex.
I preferred the Z80 based stuff - you had room to breathe!
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
But the code style guideline from your customer stated that each parameter had to be on a separate line!
With a comment line ahead of it, if your compiler allowed that.
Fortran was my most-used language at university, but I never tried putting a comment line between continuations. It seems like it would be legal, but a grader would have taken points off.
|
|
|
|
|
Wordle 574 4/6
⬛⬛🟩⬛⬛
⬛⬛🟩⬛⬛
⬛🟨🟩🟨⬛
🟩🟩🟩🟩🟩
|
|
|
|
|
Wordle 574 3/6
🟨⬜⬜⬜⬜
⬜🟩⬜⬜⬜
🟩🟩🟩🟩🟩
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
Wordle 574 5/6
⬛🟨🟩⬛⬛
⬛⬛🟩⬛🟨
⬛⬛🟩🟩⬛
⬛⬛🟩🟩🟨
🟩🟩🟩🟩🟩
|
|
|
|
|
Wordle 574 4/6*
🟨⬜🟨⬜⬜
⬜🟨🟨🟨⬜
⬜🟩⬜🟨🟨
🟩🟩🟩🟩🟩
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
⬜⬜🟩⬜⬜
⬜⬜🟩🟩⬜
🟩🟩🟩🟩🟩
Life should not be a journey to the grave with the intention of arriving safely in a pretty and well-preserved body, but rather to skid in broadside in a cloud of smoke, thoroughly used up, totally worn out, and loudly proclaiming “Wow! What a Ride!" - Hunter S Thompson - RIP
|
|
|
|
|
Wordle 574 5/6
🟨⬜⬜⬜⬜
⬜🟩⬜🟨🟨
🟨🟩🟩⬜⬜
⬜🟩🟩🟩⬜
🟩🟩🟩🟩🟩
|
|
|
|
|
Wordle 574 6/6
🟨🟨🟨⬛⬛
🟨🟨⬛🟨⬛
⬛🟩🟩🟩⬛
⬛🟩🟩🟩⬛
⬛🟩🟩🟩⬛
🟩🟩🟩🟩🟩
Get me coffee and no one gets hurt!
|
|
|
|
|
Wordle 574 X/6
⬜🟩⬜⬜⬜
⬜🟩⬜⬜⬜
⬜🟩🟨🟨⬜
🟨🟩🟩⬜⬜
⬜🟩🟩⬜⬜
⬜🟩🟩⬜⬜
I lose
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
|
|
ditto
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
|
Wow, thanks!
|
|
|
|
|
|
#Worldle #357 1/6 (100%)
🟩🟩🟩🟩🟩🎉
https://worldle.teuteuf.fr
pretty easy
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
Just got a new version of Office 365 pushed out to my work laptop, and the "Simplified" Ribbon now looks more like the original toolbar used in the olden days. ROFLMAO.
Use the Simplified Ribbon - Microsoft Support[^]
Perhaps this will go full-circle in the next couple of years and we will be back with the classic toolbar.
|
|
|
|
|
Slacker007 wrote:
Perhaps this will go full-circle in the next couple of years and we will be back with the classic toolbar.
Yeah, "simplified" to me means going back to a menu bar.
|
|
|
|
|
Personally, the ribbons never really clicked with me. I'm not saying they are bad or anything, and I do use them in development - they can give a "modern" look to old MFC or WinForms apps. But evidently, as a user I'm not wired to take advantage of them, and I prefer normal menus or nav bars.
Advertise here – minimum three posts per day are guaranteed.
|
|
|
|
|
I have loved the ribbon since the beginning, the only thing I like more is the Quick Access toolbar at the top.
Cheers,
Vikram.
|
|
|
|