In my first university course on programming, we handed in 'coding sheets', which were copied to punch cards by a group of ladies for whom card punching was their meaning of life. (At least of their working life.) They then put the card decks into the batch job entry system for the huge mainframe. A day later, the job had been run, and we could pick up the listing (from both the compilation and the run, if compilation was successful) and the card deck from the handout shelves.
Those ladies were certainly not perfect, error-free typists. And in rush periods, it could take two days before the mainframe got around to running our job. So we grumbled a lot ...
Around Christmas time, after four mandatory hand-in coding exercises, one of the girls in my class couldn't understand our grumbling. Who cares if it takes a couple of days before you get the results? And what is this thing about 'error messages', really? We slowly realized that after half a year as a programming student (with no prior coding experience), she had never made a single coding error, neither in syntax nor in semantics, in any of the four exercises. Furthermore, the typing ladies had not made a single typo when copying her coding sheets. So after a full semester, she didn't have a clue about what an error message is! We tried to explain it to her, and she had problems understanding why we didn't fix such errors before handing in the coding sheets.
When she left the room, the rest of us were very much in agreement: she had been missing out on some very important learning experiences.
|
trønderen wrote: So after a full semester, she didn't have a clue about what an error message is!
Well, she could have been one of the super-programmers you occasionally hear about. I would be interested in knowing how her studies (and career) progressed...
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
We had to do much the same in college, except that we had to punch our own cards. All in all, it wasn't a big challenge, if one didn't mind having to arrive at school at 3 AM to find an available machine. But speaking of errors: I tried writing one program in COBOL; IIRC, it was 83 lines, but it managed to spew 107 errors. Even the COBOL expert in the white coat who worked in the computer center couldn't find a single error in my code. I never tried COBOL again, since FORTRAN was all the school required us to learn.
Will Rogers never met me.
|
trønderen wrote: ...she had problems understanding why we didn't fix such errors before handing in the coding sheets.
Maybe she was a fan of one of Knuth's famous quotes.
"One man's wage rise is another man's price increase." - Harold Wilson
"Fireproof doesn't mean the fire will never come. It means when the fire comes that you will be able to withstand it." - Michael Simmons
"You can easily judge the character of a man by how he treats those who can do nothing for him." - James D. Miles
|
trønderen wrote: So after a full semester, she didn't have a clue about what an error message is
Sounds like a case of "on error resume next"...
|
Sometimes, when I don't get bug reports from my software, I wonder if it's even being used.
|
Maybe there is a bug in sending the bug reports.
|
My observation is that if nobody is reporting any problems in a new feature, it's not because it's bug-free; it's because it's not being used.
Then wait six months (until you've forgotten all the important details), and someone will find something, and it'll take you forever to get re-acquainted with your own code...
|
One of the things that really helps me when coding in C++ with templates is that I can easily visualize the result of the template expansion/instantiation as raw C++ (no templates).
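For instance, here is a minimal sketch of what I mean (my own toy example, not from any real code base):

    #include <cstddef>

    // A generic accumulator, written once:
    template <typename T>
    T sum(const T* xs, std::size_t n) {
        T total{};
        for (std::size_t i = 0; i < n; ++i) total += xs[i];
        return total;
    }

    int main() {
        int data[] = {1, 2, 3};
        return sum(data, 3);   // instantiates sum<int>
    }

When I write sum<int>, what I "see" is the compiler stamping out the non-template equivalent:

    // Roughly what the instantiation becomes:
    int sum(const int* xs, std::size_t n) {
        int total{};
        for (std::size_t i = 0; i < n; ++i) total += xs[i];
        return total;
    }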
When I'm coding in C++, I can visualize the equivalent C as I write.
I can also, to a degree, visualize the assembly. Not in any specific sense, but in a sort of pseudo-code way: "I know we're doing a shift, an add, and a push here."
I do this even in C#, to a lesser degree, when I'm considering performance: I think about the C++ code required to get it to do the same thing - but, like I said, to a lesser degree.
Do you do this? I'm just curious. It feels like such a blessing sometimes.
To err is human. Fortune favors the monsters.
|
I have a good sense of what's happening underneath and will think about it if it seems important.
I never coded in C but spent a lot of time in a proprietary, procedural language that might be described as a cross between Modula and a stripped-down Ada. We sometimes implemented polymorphism, inheritance, and encapsulation manually, so how OO languages did it wasn't a surprise.
Having to mentally parse and instantiate template code significantly improved my understanding of what was going on there.
My most recent experience actually writing assembler was on the PDP-10 😲, but what happens in the depths hasn't changed much. Once in a while, I'd like to step into x64 assembler to see what the O/S is doing, but I can't follow it very well. Maybe someday it'll be important enough that I get my butt in gear and dig into it.
|
Greg Utah said: I'd like to step into x64 assembler to see what the O/S is doing, but I can't follow it very well. Maybe someday it'll be important enough that I get my butt in gear and dig into it.
If you decide to learn x64 assembly, take a look at this fantastic new book (which I am reading a bit at a time):
The Art of 64-Bit Assembly, Volume 1: x86-64 Machine Organization and Programming [^]
Really great book, by a master author.
Is anyone else working their way through it?
|
I have a vague memory of working with the older Visual Studio versions (pre-.NET), which provided a view of the actual machine code as you were debugging, and you could step through at that level. I found that quite useful at times.
Is that still available when working with C++ in VS? It's been a long time since I worked with that...
|
It exists for C++ in VS 2017 (what I'm using at present for maintaining an older project) - Debug | Windows | Disassembly. I don't know if you can do the equivalent (see the IL?) in C# or similar...
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
It's still there in VS2022, and you can step into code for which you don't have the source.
|
I call it x-ray vision: I look at some code and visualize, approximately, what it will be when it runs.
Common? Probably not. I came to C and C# etc. from an assembly background, but most programmers nowadays do not - hence they're happy to use constructs that are "nice on the surface but gross underneath".
|
I don't use a "UI designer" in WPF or UWP; it's all XAML and/or code for my UI. I can visualize the whole visual tree.
I can't do the same in Windows Forms; there you "need" the UI designer (IMO).
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
|
A similar situation: I met a young software developer who was almost completely blind (he had very narrow tunnel vision), so he did all his software development using a Braille display, reading a single line at a time by stroking his fingertips over it. He had the entire source code inside his head.
I willingly admit that I think it a great help to see an entire method at a glance on screen or printout. I would be using a screen even if I had the mental capability to keep every source line in my head.
|
Don't you remember ed[^]? I used to write assembler using ed, and everything was in my head. Now, if I have to work with a single monitor, I start complaining. I have become such a wimp!
Mircea
|
In those days we made printouts, marking corrections, additions and deletions with a ballpoint pen. Most times the printouts were compiler listings, so we had the error messages readily at hand when we sat down to make all the corrections in one sweep from the top of the file to the bottom.
Printouts / compiler listings were essential! That's where we did most of our development and debugging - "debugging by cranial massage", as one of my university lecturers called it. As we got more experienced, learning to insert debug output statements, we read two listings side by side: the compiler output (maybe with warning messages, but no fatal errors) and the printed output from the debug run.
An essential compiler quality metric in those days was the ability to report all errors in a single compiler run, to reduce both the number of compiler runs and the amount of listing paper. The first compiler I read was a Pascal compiler, and I was extremely impressed by how much effort was spent on recovery: the compiler carried on, building records describing, e.g., an unknown symbol with all its possible interpretations: "It might be an integer, it might be a real, but not a string, not a label." If a real literal was then assigned to the symbol, the integer option was canceled, and subsequently the symbol was treated as if it were a declared real variable. A second quality metric was that one coding error, such as a missing declaration, should lead to one error message, not cause a cascade of hundreds of messages.
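(A hypothetical sketch of that kind of recovery bookkeeping - my own illustration in modern C++, certainly not the actual Pascal compiler:)

    #include <iostream>
    #include <map>
    #include <set>
    #include <string>

    enum class Ty { Int, Real, Str, Label };

    // For each undeclared identifier, remember every type it could still be,
    // and narrow that set as later uses supply evidence.
    struct Recovery {
        std::map<std::string, std::set<Ty>> unknowns;

        void firstUse(const std::string& name, std::set<Ty> possible) {
            unknowns[name] = std::move(possible);
        }

        void constrain(const std::string& name, const std::set<Ty>& allowed) {
            auto& opts = unknowns[name];
            for (auto it = opts.begin(); it != opts.end(); )
                if (allowed.count(*it)) ++it; else it = opts.erase(it);
        }
    };

    int main() {
        Recovery r;
        r.firstUse("x", {Ty::Int, Ty::Real});  // unknown symbol: integer or real?
        r.constrain("x", {Ty::Real});          // x := 3.14 cancels the integer option
        std::cout << "x is now treated as a declared real\n";
    }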
It looks to me as if modern compilers have thrown away both of these qualities. They are not good at recovery; rather they say: fix this first error and recompile, and then I will tell you about more errors! And if you do ask the compiler to go on, it may cancel the compilation on reaching 1000 messages ... all of which may be follow-on errors from a single typo ...
I guess I prefer the modern interactive, frequent-recompile development style. Yet I sometimes long for those days when the number of error messages was at least within an order of magnitude or two of the number of actual coding errors. With the compilers of today, I certainly would not want to work my way through a compiler listing with a few thousand error messages, trying to sort the distinct errors from the follow-on ones, fixing the real ones with ed.
(Actually, I haven't used ed much at all, but rather several other line-oriented, 'teletype-oriented' editors that were not much different. Plus, of course, classic BASIC, where you address the line you want to edit, or where you want to insert new lines, by the line number, mandatory on every line.)
|
In a sense, any developer "worth their salt" can model what the machine is doing in their head. The merit of their code will reflect their ability to do just that. I don't think the approach you take to the model matters all that much; all that matters is the fidelity of the model to the task at hand.
I've worked with a couple of 'cargo cult' programmers who genuinely couldn't do this. All they did was recognize a pattern in the task and copy/paste code they'd seen that matched the pattern. They would then bash the code with a hammer until it more-or-less did what was required. It's hard working with them, because they don't understand what's wrong with what they did.
Software Zen: delete this;
modified 26-Jun-22 12:38pm.
|
Any decent programmer should have a good idea of what the VM (the one we program to, not the physical device) is doing. A good programmer will be able to see how this affects the processor (what CPU instructions are being executed). An outstanding programmer will take this one level further: how this affects the system (memory usage, caching, disk swapping, etc.).
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
I rarely do my programming on a VM. Understanding how a VM works is not at all relevant to my problem solving. Memory usage, caching, disk swapping, etc. are equally relevant on a non-virtualized machine.
Just out of curiosity I would certainly like to understand the virtualization mechanisms provided by modern CPUs - but the documentation is certainly not written for software people! I cannot imagine that more than one percent of all the software people I have worked with have the slightest clue about the actual mechanisms.
A few years ago, I bought a pile of books claiming to explain virtualization; they were college-level textbooks, giving you an understanding of the real mechanisms at a level comparable to the understanding of international politics you gain from watching TV newscasts. Fair enough for cocktail party conversations ...
I am quite sure that the majority of software people claiming to know something about VMs are not much above the cocktail party level. They probably know what the VM will do for you, but not how. If I brought in an architecture reference manual for a recent Intel CPU and asked questions about how this or that mechanism works to support the what, and about the consequences of not having the mechanism available, as on older CPU families ... the great majority of software people would be lost - even among those claiming to know what a VM is.
If you are active in development of the Linux kernel, or writing register-level drivers for Windows 11, you must of course have an understanding of the virtualization mechanisms way above cocktail party level. But that applies to a tiny little speck of the programming world. For the great majority, there are far more important skills and areas of knowledge that would make them more outstanding software developers.
|
I should have made myself more clear - the language's virtual machine.
All higher-level language programming is done against a "virtual machine" of some type. Even C assumes certain things about the hardware it is running on. It is in this sense that I refer to a virtual machine.
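A tiny illustration (my own, nothing canonical): the C and C++ abstract machine says signed integer overflow is undefined, so an optimizer may assume it never happens and fold the test below to "return true" - a promise no physical CPU makes:

    #include <iostream>

    // The abstract machine forbids signed overflow, so at -O2 a compiler
    // typically rewrites this whole function as 'return true'.
    bool always_true(int x) {
        return x + 1 > x;
    }

    int main() {
        // Unoptimized, the wrapped value may make this print 0 instead.
        std::cout << always_true(2147483647) << '\n';
    }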
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
honey the codewitch wrote: It feels like such a blessing sometimes.
It is...
I can't do it. But coming from the PLC world, I (at least) still know how to be efficient and careful about resources, way more than the average programmer.
My senior once told me he enjoyed brainstorming with me, because many times I managed to blow his mind with questions like "mmm... if I understood you correctly, you are trying X; couldn't it be done using Y?" And Y was way simpler, and many times performed better than what he was coding.
M.D.V.
If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
I have met people who had to be able to imagine the microcode, and the specific signals sent to the various units of the CPU, to feel that they had a good grip on the software. Knowing the nature of static vs. dynamic RAM is essential to understanding the effect of cache size on performance.
When compilers started using optimizing techniques (that was with Fortran II, wasn't it? Long before my time), I lost my grip on how the assembly code generated from my high-level code would look. The optimizer moves stuff all over the place, may remove significant parts (e.g. as unreachable, or because a subexpression was calculated earlier and is still available in a temporary location), and so on.
Donald Knuth didn't trust any high-level concepts. If he had, maybe his Bible would have gained a larger following; instead, it ended up as something that sits on the bookshelf. To learn algorithms, you rather go to books based on concepts relevant to the programmer's problem, not to the CPU designer's.
Gradually, I have come to trust the compiler. I know that not everyone does: in my scrapbook archive, I have preserved a discussion from a network forum where one guy fiercely insisted that the VAX C compiler should have generated a different instruction, which he thought more appropriate. Others pointed out that the code actually generated was faster, but he stuck to his conclusion: the compiler was defective, generating inappropriate code.
Yet ... I am sometimes shocked by how willingly youngsters, even with a university degree, accept that "if you just flip this switch and push that button, it works - but I have no idea why!" I still maintain that you should understand what is going on at least one level below the one you are working on. The thing is that today, I work at least two levels up from where I was as a student. Microcode and assembler are way below my current programming problems. I relate to objects and structures and parallelism, not to bytes and instructions.
Take subclassing and superclassing: those I understand conceptually, not by how they map to machine instructions - not even by how they map to C! My first encounter with C++, in my student days, was a compiler translating C++ to K&R C code, which required a second compilation pass. I enjoyed that then, but nowadays I no longer find it worth the effort. Same with overloading: I once learned how the compiler generates method labels based on the argument classes. Today, I treat that problem as solved; I have to know the rules for legal overloading, but not which signals the instruction decoder sends to the various parts of the CPU - not even the binary instruction.
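(To make the overloading point concrete with an example of my own - the exact labels are ABI-specific; the comments show what the Itanium C++ ABI used by g++ and clang produces:)

    #include <iostream>

    // Two overloads become two distinct linker symbols via name mangling:
    int area(int w, int h) { return w * h; }           // mangled: _Z4areaii
    double area(double w, double h) { return w * h; }  // mangled: _Z4areadd

    int main() {
        std::cout << area(3, 4) << ' ' << area(1.5, 2.0) << '\n';
    }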
Obviously: if I were working with low-level drivers directly interfacing with hardware, instruction and binary data formats would be essential to understand. But not even a driver programmer needs to be concerned about internal signal paths in the CPU.