|
Being a 1970 model, I learnt to debug Cobol code using hexadecimal dumps on reams of paper.
Sounds awful, but in practice it was not too bad; it was just a case of being methodical when going through the 90+ pages of paper.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
New? Wellllllllll - in the 1980s I had an FTIR* from Hewlett Packard - the software was written in England on IDRIS. The software was rev. 5 and buggy as hell.
We got updates - rev 6, 7, . . . etc. - and rarely was a bug fixed.
Not contesting what you mention above, but there's also the problem of coders so arrogant that they don't check for bugs.
If we go into the current climate, it's beyond coding: so much is manufactured as cheaply as possible and it's left to the consumer to get a replacement. As it turns out, it's cheaper to do no Q/C and replace the items that fail.
So, why not with code?
After all, how many innocent users bring their PC in for repair due to software problems, and the solution is to reload the operating system - trashing all of their data and non-OEM apps.
Not debugging? Perhaps it's because it's what has come to be acceptable.
* Fourier Transform Infrared Spectrometer
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein | "As far as we know, our computer has never had an undetected error." - Weisert | "If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010 |
|
Wait.. are you implying that posting questions in QA ISN'T debugging??
If it's not broken, fix it until it is
|
Kevin Marois wrote: Wait.. are you implying that posting questions in QA ISN'T debugging?? With alarming frequency ... yes.
«Tell me and I forget. Teach me and I remember. Involve me and I learn.» Benjamin Franklin
|
It's not the art of debugging that is missing, it's the whole process of problem solving, for which debugging is a tool. Most quickie programming courses teach how to write code, but not how to:
1. methodically analyze problems,
2. review the evidence,
3. read the code,
4. formulate a hypothesis for the primary cause,
5. use the debugger and other available tools to prove or disprove the hypothesis,
6. repeat as required.
This is stuff you had to do when compiles took many minutes and had to be submitted to a mainframe, possibly on punch cards. Now, with a couple of seconds to compile massive solutions, you see them hacking at the code and re-running to see if it is fixed, not checking to see if their changes have 'suboptimal' side effects.
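The hypothesis-driven loop above can be made concrete. Below is a minimal, hypothetical Python sketch (the `average` function and its crash report are invented for illustration): the evidence is an exception, the hypothesis names a primary cause, and a tiny reproduction proves it before any code is changed.

```python
def average(values):
    # Buggy version: crashes on an empty list (ZeroDivisionError).
    return sum(values) / len(values)

# Steps 2-3: the evidence is a crash report mentioning ZeroDivisionError.
# Step 4: hypothesis - the function is being called with an empty list.
# Step 5: prove or disprove it with a minimal reproduction, not guesswork.
try:
    average([])
    hypothesis_confirmed = False
except ZeroDivisionError:
    hypothesis_confirmed = True

print(hypothesis_confirmed)  # True: the hypothesis holds, so fix the cause

def average_fixed(values):
    # The fix addresses the confirmed cause, not just the symptom.
    if not values:
        return 0.0
    return sum(values) / len(values)

print(average_fixed([]))      # 0.0
print(average_fixed([2, 4]))  # 3.0
```

The point is step 5: the change is made only after the reproduction confirms the cause, so re-running it afterwards also confirms the fix.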
|
I like this general discussion - and your post struck a chord. Now, I'm not so good at this yet (because I know how to debug and use logging etc., so to some extent I 'get away with it'), but I believe using tests / a testing framework appropriate for whatever dev toolchain is being used would assist greatly. Not sure where it fits in 1-6, but it's another 'tool' somewhere in the mix.
|
Unit, Integration, and Acceptance Testing is a whole other discipline with the aim to prevent errors and prove the code meets requirements and works. Debugging is what happens when something slips through the Testing.
Both have the goal of creating bug free code, but Testing is proactive and Debugging is reactive.
Testing usually has the pointy-haired manager complaining about all the effort that doesn't create new code.
With debugging there is usually panic and pointy-haired managers demanding status every 10 minutes.
Poor PMs will try to save time/money by doing a poor job of Testing, and then blame the programmers when they more than pay for it with bugs, debugging, and large code updates/rewrites.
Ideally, when you are trying to fix a bug you should write a Unit Test to verify the bug, and validate the fix. Then fix the bug. The Test will tell you when you are done and the pre-existing Tests will tell you that you didn't break anything else.
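That test-first fix cycle can be sketched in a few lines. This is a hypothetical example (the `slugify` function and the "kept capitals" bug report are invented): the first test was written to reproduce the reported bug and fail, the fix was applied until it passed, and a second test guards the pre-existing behaviour.

```python
import unittest

def slugify(title):
    # The fix: lower-case the title. The original version forgot this,
    # which is the bug the first test below was written to reproduce.
    return title.strip().lower().replace(" ", "-")

class SlugifyBugRepro(unittest.TestCase):
    def test_reported_bug(self):
        # Written first, while the bug was still present, so it failed;
        # it passes only once the fix above is in place.
        self.assertEqual(slugify("Hello World"), "hello-world")

    def test_existing_behaviour(self):
        # Pre-existing expectation: the fix must not break trimming.
        self.assertEqual(slugify("  spaced  "), "spaced")

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(SlugifyBugRepro)
)
```

The failing test tells you when you are done, and the second test tells you that you didn't break anything else - exactly the workflow described above.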
|
This topic struck a note with me as well, as I find that most new developers have next to no skill in doing any sort of deep debugging.
I find myself in the group that learned debugging in the "old days" - I actually honed my skills in the '70s on the early hobbyist computers, teaching myself machine language/assembler programming (side note - the first processor I programmed on was an 1802). Back then there wasn't a lot of good programming documentation - you were lucky to get a description of the ports/memory map locations and a brief description of each. And there was no internet as such, so unless you were part of a user group or had access to a good local BBS, you had to figure these things out for yourself.
So often, you would have to really think through how to solve a problem, find the right approach for the hardware you were using, and then write a test case you could work through in a debugger to see what was actually happening. You would learn something, update the code, and try again. This iterative approach was great for gaining deep insight into your computer environment, and it really made you think through the problem you were trying to solve. (Side note - I remember loving Microsoft CodeView, and how you could bring up the application on your EGA display and the debugger on your monochrome display.)
Yes, today we do have test frameworks to help catch problems before they occur. But it is next to impossible to anticipate every possible situation. Especially when you have to integrate 3rd party libraries and frameworks, which may or may not work precisely as documented. There will always be times when something goes awry, and you better know how to dig into the deeper levels of the code with a debugger, isolating the issue, and coming up with a reasonable fix.
I don't think debugging per se is a lost art, but there are certainly degrees of expertise in debugging. I know that when I have a customer support issue that needs to be fixed, the first thing I do is not pull up the debugger, but think about the problem - what the issue might be, and the approaches that might be required - in short, come up with a plan. I find that in doing this first, I tend to have a better idea of where in the code to look first, what hints of issues I should be paying attention to, and what approach I should take to debugging the code.
A lot of this is simple experience. Those of us from "the old days" didn't have a lot of choice, and had a lot of experience in debugging code - as that was the only way we had available. Today though, with many more "proactive" approaches to dealing with issues, developers don't get the same level of deep debugging experience. And that is a cause for concern, as I doubt that the need for debugging will ever go away.
|
That's a very nice summary and I fully agree with it! What a glorious time, when we had to read memory dumps in order to find a bug that was caused by a module written in Assembler!
|
Got my CS degree in '03. The only how-to-debug education I was given was in HS; a number of my peers in college at the time appeared clueless about debugging. The lack of debugging training was made worse by the fact that someone with pointy hair had settled the MSVC vs GCC holy war by mandating that freshman C++ 1/2 were to be taught using Borland's tools. Neither of my profs, nor any of the student TAs in the labs, really had a clue about Borland's stuff, and they weren't able to show us how to use any of the debuggering tools (Freudian slip - the Borland IDE was horrible at the time). More than a few basically said that for grading they normally just used their favorite compiler and only tried running in Borland's if it didn't work in theirs. Beyond freshman year the general req was just to state which compiler you used when you submitted.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, waging all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
|
When I was in HS taking CS we had an entire section on learning to use the debugger. That was because we were taught by someone who knew it was important to understand the entire process no matter the language. Starting with Pascal, then moved into C++, Delphi, Java, etc.
In college, one of the required courses for all CS majors was Logic. Imagine requiring someone know how to think through a problem!
More often than not, when I get called over to help fix someone else's code because it won't run, they have done zero debugging. If I ask whether they have added breakpoints and stepped through the code, all I get is a blank stare, like I've grown a second head. I've gotten some of them trained, because I stopped helping them if they hadn't.
Some of the logic I've seen just hurt my brain. Why on earth would you fire a request to get all the information for a record for each and every page during onload? If you don't need it or it doesn't change, why do you get it again? Are you worried that the database fairy decided to play a prank on you?
Along with debugging, I question whether orders of efficiency (Big O notation) are taught. It is as if, when computers got powerful, all the basics got tossed out the window.
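The point about orders of efficiency is easy to demonstrate. A minimal sketch (the sizes and repetition counts are arbitrary choices for illustration): membership testing is an O(n) scan on a list but O(1) on average for a hash set, and the difference is dramatic even at modest sizes.

```python
import timeit

n = 50_000
as_list = list(range(n))
as_set = set(as_list)

# Worst case for the list scan: the probe is the last element.
probe = n - 1

# `x in list` walks the list (O(n)); `x in set` hashes once (O(1) average).
list_time = timeit.timeit(lambda: probe in as_list, number=200)
set_time = timeit.timeit(lambda: probe in as_set, number=200)

print(f"list: {list_time:.4f}s  set: {set_time:.4f}s")
```

Same data, same question, wildly different cost - which is exactly what Big O notation is there to predict before the code ever runs.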
|
I agree on 'efficiency' - what data structure to use where, knowing its 'general' characteristics with respect to lookups, traversals, etc. I'm just not sure how to express the relevant characteristics - I guess that's why Big O is important. It just seems pretty 'dry' as a topic.
|
When I was in University I took a few Electrical Engineering courses (counted for credit towards CS degree). ELEC 256 was all about basic circuit design, using standard gates (AND, NOR, etc.). Beyond the fact it was a lot of fun, it taught me a lot about basic logic theory, and ways to optimize logic circuits (simplifying circuits through Venn diagrams and such).
Since that time, I find I still apply those same techniques when programming, for example simplifying a complex set of logical conditions into a much smaller and simpler set of conditions.
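That kind of simplification can be shown directly in code. A hypothetical sketch (the `eligible` predicate and its flag names are invented): the tangled condition is reduced using the distributive and absorption laws from basic logic design, and exhaustively checked against the original over all inputs.

```python
from itertools import product

def eligible_verbose(is_admin, is_owner, is_active):
    # The original tangle of conditions, as it might appear in code.
    if (is_admin and is_active) or (is_owner and is_active) \
            or (is_admin and is_owner and is_active):
        return True
    return False

def eligible_simplified(is_admin, is_owner, is_active):
    # Factor out is_active (distributive law); the three-way clause is
    # absorbed by the first two:  A*C + B*C + A*B*C  =  C*(A + B)
    return is_active and (is_admin or is_owner)

# Exhaustive truth-table check: both forms agree on all 8 inputs.
assert all(
    eligible_verbose(a, o, c) == eligible_simplified(a, o, c)
    for a, o, c in product([True, False], repeat=3)
)
```

The exhaustive check is the software analogue of comparing the two circuits' truth tables before swapping in the smaller one.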
Do such things get taught anymore in schools? Especially tech schools (as opposed to University or College), where there might be less emphasis on topics like Logic Design, Linear Algebra, Calculus, etc.?
|
I was required to take an EE course for my degree, very similar to what you mentioned. By that time I had already taken my general Logic course and understood the theory but not how to optimize the paths.
Completely agree with you about still applying those techniques on the job. The better I can filter down the initial dataset, the better the server handles that complex logic, the less data transfers back and forth, and the faster the response times. At least most of the time.
The answer to your question is "It depends", a friend of mine who attended the same Uni and went after a Management of Information Systems degree instead of a Computer Science degree did not have to take the Logic, Calculus, and other theory classes. He got a BB, I got a BS. He learned how to make a program run but he did not learn how to make it run well. He is really good at Copy/Paste.
|
I agree - this seems to be a bit of a lost art.
I've noticed many new software engineers focused on tools as opposed to understanding what really goes on, so when they run into a problem they often run to me to help them. I think this is part of the "coding is easy" mindset that so many of them have.
It takes a lot of practice to debug very well. Learning the basics is relatively easy.
"Computer games don't affect kids; I mean if Pac-Man affected us as kids, we'd all be running around in darkened rooms, munching magic pills and listening to repetitive electronic music."
-- Marcus Brigstocke, British Comedian
|
It's a great discussion - so what do 'we', i.e. 'we as a collective group of (cough) more senior programmers', do about it?
Do we collaborate on an article series?
|
I learnt debugging back in the early 80s for R/Basic, an interpretive language from an OS called Pick.
There was no such thing as an IDE back then, pretty much a suck-it-and-see approach.
That's where I also learned that what I really did was 85% plan, 5% code and 10% debug.
I also learnt that I could move those goalposts to as much as 93% plan, 5% code and 2% debug.
Debug back then was embedding print and input statements to step through the code throwing the values of variables to my screen (or lineprinter) as I went.
Now, with all the wondrous tools I am probably doing 45% plan 10% code and 45% debug because the tools have made it easy. I think they try and rephrase what I do as 'Agile'.
I simply call it lazy (and, yes, I know that implies I am lazy - let's remove the inference: I am ;-) )
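The embedded print-statement stepping described above still works in any language; here is a minimal, hypothetical Python sketch (the compound-interest function is invented) of throwing variable values to the screen at each step, the way one did to a screen or line printer before interactive debuggers.

```python
def compound(balance, rate, years):
    # Poor man's tracing: dump state at each key point so a wrong
    # intermediate value points straight at the faulty step.
    for year in range(1, years + 1):
        balance = balance * (1 + rate)
        print(f"DEBUG year={year} balance={balance:.2f}")
    return balance

final = compound(100.0, 0.05, 3)
print(f"final={final:.2f}")  # 115.76
```

Crude, but the trace is a permanent record of the run - which is exactly what a lineprinter listing gave you.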
|
Not sure if this is on the same line of QA that you're talking about; however, with one team member I am trying to 'train', the issue is mostly getting them to understand what the actual issue is, see past the symptoms, and fix the issue at the root (if possible).
|
I can only speak for myself. I never really learned debugging, I just knew that debugging is that thing that pops up when you press "Go" in your IDE. This was my starting point.
Now, I don't want to miss my debugger. I've learned everything on the go: how to watch values, set breakpoints, watch memory (those pesky C coders in my team hate symbolic debugging and love calculating pointers instead of using them as references).
I will never ever touch a development environment without a debugger again. Except Arduino, but this I do as a hobby, not during work. Debugging is, I think, never obsolete. All documentation out there won't help jack if my code misbehaves.
As for what to do for them: well, if you encounter a question answerable by debugging, don't give the man a fish, teach the man how to fish. That is, don't answer the question right away, but write a longer post about how to find the answer yourself. This isn't easy: first, it takes much longer; second, it will be shot down by people who just love handing out fish instead of teaching fishing. But hey, if someone wants to learn, then he will read!
|
Who's going this Friday? Let's see if you're a true fan:
http://dotnetify.net/trivia
|
What a rubbish site.
Wait.
Wait.
Wait some more.
Question appears.
Click on right answer.
Nothing happens.
Twiddle thumbs for 29 seconds until timer runs out.
Close browser page...
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
OriginalGriff wrote: What a rubbish site. I didn't think it was that good.
|
I was trying to keep it KISS...
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|
Wow, plenty of jaded replies - my fault, I guess, for whipping out a few hours of fan work for master developers here; but point taken - I'll 'agile' your 'user voice' and shorten the timer.
|
Why have the timer, except as an "Out of time!" reminder?
If I select an answer, I want feedback immediately - not half a minute later.
So when I get to the site, get that question up quickly.
When I press the button respond immediately: right, wrong, or "are you sure?" and get ready to move on to the next.
If this site took thirty seconds to respond to anything you did, you wouldn't visit it again, right? So why make yours move like a stunned slug on Mogadon deliberately?
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
|