|
The workplace of an Internet Spider[^], of course.
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
For a start, it's a desktop program that uses the same web-service calls as the web-based version. As I said, it's a hybrid program.
- I would love to change the world, but they won’t give me the source code.
|
|
|
|
|
February?!??
Try Tuesday!
If you can't laugh at yourself - ask me and I will do it for you.
|
|
|
|
|
What annoys me the most is when I've commented the code but can't even interpret my own comments anymore.
It does not solve my Problem, but it answers my question
Chemists have exactly one rule: there are only exceptions
modified 19-Jan-21 21:04pm.
|
|
|
|
|
Rarely do I look back at my old code and find myself amazed. More likely I'm cussing myself because I was an idiot, and if I had only looked at it differently I could have saved myself so much time.
But there was that one time at 3am, drunk coding. Still not sure why or how it works, but it does. Also not sure what it does, either.
To err is human; to really mess up you need a computer
|
|
|
|
|
This is great, I thought I was alone.
I watched someone using one of my tools once. He would start it, and due to the size of the project it would take about 20-30 minutes to complete. He was literally just playing with his phone for three-quarters of his day instead of doing other things.
I got so angry I literally rewrote the whole thing an entirely different way while consuming an entire bottle of scotch. I woke up with no recollection of what I had done, but it worked and that same task only took 30 seconds! Years later I did unravel what I did and still don't understand how I made something relatively nice while hammered drunk.
Clearly my first implementation was not a good one, but in my defense it wasn't meant to be run on excessively large data sets at the time.
|
|
|
|
|
I knew I wasn't the only one. <grin>
To err is human; to really mess up you need a computer
|
|
|
|
|
It's called "flow" (when it happens).
It was only in wine that he laid down no limit for himself, but he did not allow himself to be confused by it.
― Confucian Analects: Rules of Confucius about his food
|
|
|
|
|
honey the codewitch wrote: I was able to pick it up and start maintaining it right away despite me having written it in February
In my case, I look at code from a few years back, think "what turkey wrote this crap?", then discover that it was I.
honey the codewitch wrote: I understand how it works, but I still don't understand how I did it.
Occult powers?
(You are a witch, are you not?)
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
I can provide a well-aged perspective here. Most of the software I write professionally has a long life. Development typically takes a couple of years and then maintenance lasts for a decade or more. I have an internal tool that got its start in 2000 and I'm still actively developing and maintaining it. In other words, I get to go back and look at code I wrote a long time ago - a lot.
Over time my coding and commenting style have matured. I write things as simply as possible. I avoid being clever, and I avoid 'clever' language features. I name things carefully, especially when they are tied to a particular part of the product or hardware. I use comments only to say things the code can't and to link the code to hardware or documentation when necessary.
During maintenance I refactor to simplify things or just to improve readability. The worst case is that I branch the code in question and refactor the branch until the cognitive dissonance from it is tolerable. Usually it doesn't go that far, as I'm probably trying to fix a problem, and I'll find it during the refactoring. Often I can just discard the branch, fix the bug in the original code, and go on. Sometimes I'll keep some of the refactoring. It depends upon the scope of the problem, the fix, and the risk associated with them.
The end result is that it's fairly rare for me to look back at something I wrote and be baffled or confused by it. The point here is that you only acquire this skill by doing it. If you only write code that you never revisit, you've lost the learning opportunity that arises when you do.
Software Zen: delete this;
|
|
|
|
|
I'm not confused by it. It's actually fairly easy for me to understand, considering what it does.
And what it does is complicated, any way you slice it. For starters, it relies on an LALR algorithm, which is confusing no matter how you break it down. LALR is just complicated.
On top of that, I have a non-deterministic worker that finds all possible trees for a parse based on an ambiguous grammar. Again, it's just complicated, but unlike LALR it can be simplified a bit, and I have simplified it.
Finally, it has to take all of this and generate code in most major .NET languages (usually C# or VB.NET).
It's just a complicated project. I avoid "clever" as well when I don't have to, but some of the features my generator has are implemented cleverly because the alternative is far larger in terms of code, and slower to execute.
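The "find all possible trees for an ambiguous grammar" idea can be sketched in miniature. This is not the project's GLR implementation — just a hypothetical Python toy that enumerates every parse of an expression under the classically ambiguous grammar E → E '+' E | NUM, using a memoized split search instead of LALR/GLR machinery:

```python
# Toy sketch (not the actual generator): enumerate every parse tree
# for an ambiguous grammar via a memoized split search.
from functools import lru_cache

def all_parses(tokens):
    """Return every parse tree for `tokens` under the ambiguous grammar
    E -> E '+' E | NUM, encoded as nested tuples."""
    @lru_cache(maxsize=None)
    def parse(lo, hi):
        trees = []
        # Base case: a single numeric token is a complete E.
        if hi - lo == 1 and tokens[lo].isdigit():
            trees.append(tokens[lo])
        # Try every '+' as the top-level operator; each viable split
        # yields a distinct tree, which is where the ambiguity lives.
        for k in range(lo + 1, hi - 1):
            if tokens[k] == '+':
                for left in parse(lo, k):
                    for right in parse(k + 1, hi):
                        trees.append((left, '+', right))
        return tuple(trees)
    return list(parse(0, len(tokens)))

# '1+2+3' parses two ways: (1+2)+3 and 1+(2+3).
trees = all_parses(('1', '+', '2', '+', '3'))
```

A real GLR parser gets the same effect with a graph-structured stack rather than this quadratic span search, but the output — a forest of all derivations — is the same in spirit.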
Edit: Just so you know I'm not blowing smoke, here's an article on what it does: GLR Parsing in C#: How to Use The Most Powerful Parsing Algorithm Known[^]
Real programmers use butterflies
|
|
|
|
|
In my case, the complication arises from the number of agents acting on a variety of time scales. We build commercial ink-jet printing systems. At one scale, you have a 40,000-foot roll of paper that may take up to an hour to be printed through the machine. At the opposite end of the scale, you are generating and tracking over a billion drops of ink per second, each measuring 6-9 pL in volume. In between, that paper is moving through the press at 17 feet per second while a user navigates a touch screen. The agents I mentioned include PLCs, custom processors and hardware managing the press, the actual ink-jet, image-quality cameras, and system timing. Our product consists of a UI application and several Windows services which divvy up responsibilities. All of them, including the UI, are heavily multithreaded.
My point in all this is that complexity in a given project can arise for any number of reasons. My experience has been that the key to managing that complexity is professionalism and craft. I'm afraid your work hits something of a nerve with me. I've had a couple of unfortunate experiences with folks whose work was more computer science than engineering, and who had a generally low opinion of coders in the trenches.
Software Zen: delete this;
|
|
|
|
|
I feel I need to clarify that I don't have a low opinion of coders in the trenches. I used to be one.
That said, the code I post here isn't bizdev code, or even team-developed. I code for the situation I'm in. My professional business software source doesn't look like the source I write in my free time, where I can make it look and perform how *I* want. It takes me less work to do it my way, and I find the freedom of it liberating.
I think it's weird that you consider my code more CS than engineering, since I've never taken a CS course in my life.
I learned in the field.
Real programmers use butterflies
|
|
|
|
|
honey the codewitch wrote: I think it's weird that you consider my code more CS than engineering
My bad. I'm stereotyping your code based on the subject matter: parsers and the surrounding ecosystem. That area of expertise has always seemed to be dominated by academics, in my experience.
I occasionally do work on the side from my M-F/8-4 job. One job was for a university professor who used graduate students as slave labor. They needed a multithreaded app to set up and control some hardware they were developing for sale outside the university. There was quite the culture shock when I started submitting code to them. They were used to using and writing code that started with the bare minimum necessary to perform some function, and then layered error handling and UI on top. The notion of architecting a solution in advance that kept these considerations in mind was utterly foreign to them.
The more noteworthy job was software to run a prototype machine. An intern at the company had written hardware control primitives that were quite good. A scientist wrote code that performed the detailed mathematics required to execute the machine's actual function. The scientist was a good mathematician, but a terrible programmer. I was hired to write a test bed application to let the company demonstrate the hardware to their customer. I wrote UI and integrated the intern's hardware primitives in short order. Integrating the mathematics was a disaster. I routinely set the warning level on my compilers at maximum just to ensure that the stupid mistakes are caught. The scientist's code wouldn't compile clean, even at warning level zero. Lots of ill-advised pointer arithmetic, a global misunderstanding of type casting, random switching between float and double, a firm belief that array indices in C started at 1 (see the pointer arithmetic), and so on. I tried to work around the problems for a while, but finally gave up.
The scientist wouldn't give me a copy of his design notes for the mathematics, so I finally went to the head of the project for them. Between those notes and reverse-engineering his code, I was able to replace the mad scientist code with something a lot more robust. Interestingly, this was one time in my career where my courses in numerical methods in college really came in handy. I replaced some of the scientist's integration and differentiation code with other algorithms to address precision issues. I even found some operational errors in the design as I coded the replacement. Since this was a prototype and a demonstrator, it wasn't too hard to make the code switchable to demonstrate the original mathematics versus mine. Since the original math crashed the app well over half the time, or took minutes to produce a result, while my code took a couple of seconds and never crashed, it made an impression.
The funny part of the whole thing was that I didn't know anything about the problem domain that the math was being used in. I just knew when the syntax of the operations being performed didn't make sense (multiplying a 5x7 matrix by a 4x3, for example), or that the order of operations was likely to cause an overflow or underflow, or that units conversions were not being handled correctly.
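Those sanity checks are mechanical enough to illustrate. A hypothetical Python sketch (none of this is the actual project code) of two of them — matrix conformability and order-of-operations precision loss:

```python
# Hypothetical illustrations of the checks described above.

def can_multiply(a_shape, b_shape):
    """A matrix product A @ B is only defined when A's column count
    equals B's row count -- the 5x7-times-4x3 case fails this."""
    return a_shape[1] == b_shape[0]

# Order of operations matters in floating point, too: folding small
# terms one at a time into a huge accumulator silently drops them,
# while grouping the small terms first preserves the sum.
big = 1e16
naive = big
for _ in range(1000):
    naive += 1.0  # each 1.0 is below a double's resolution near 1e16
grouped = big + sum([1.0] * 1000)  # 1000.0 is added in one exact step
```

Here `naive` never moves off `1e16`, while `grouped` correctly ends up 1000 higher — the same class of error as an overflow or underflow caused by a poorly chosen evaluation order.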
Software Zen: delete this;
|
|
|
|
|
Gary R. Wheeler wrote: I'm stereotyping your code based on the subject matter: parsers and the surrounding ecosystem. That area of expertise has always seemed to be dominated by academics, in my experience.
It is, but only because I never went to school for software development, so now that I have time, I'm picking up on the CS fundamentals I never learned. It's not to make my code more academic, but to round out my knowledge.
Gary R. Wheeler wrote: The scientist wouldn't give me a copy of his design notes for the mathematics, so I finally went to the head of the project for them. Between those notes and reverse-engineering his code, I was able to replace the mad scientist code with something a lot more robust.
That doesn't surprise me actually. This might be my failing in assuming code produced by academics has no place in production, but that's where I'm at and how I feel. We may even share that opinion.
Still, I don't want to be too hard on them, and I think being able to give your algorithms formal mathematical treatment has its place, especially with really complicated algorithms.
Real programmers use butterflies
|
|
|
|
|
When I look back into code I wrote in the past, I always admire the me-from-the-past. Elegant, lean, well documented. But then, I am a really good coder, this does not come as a surprise.
* struggles to remain serious *
|
|
|
|
|
In all seriousness, I've been told I'm good before, but I figure as long as there is room for improvement I'd rather think of myself as still learning - I've also been told that humility is the seed of wisdom.
A lot of my code is pretty lean though, and sometimes elegant. My documentation is spotty when left to my own devices but I'm getting better about it (again).
That's not to say I haven't written a lot of WTF code. In fact, my first attempt at doing anything non-trivial in terms of an application is often garbage. I even plan for that. I consider my first attempt a draft. It's that bad sometimes.
Real programmers use butterflies
|
|
|
|
|
honey the codewitch wrote: I consider my first attempt a draft.
The legendary Fred Brooks, in The Mythical Man-Month (1975), p116: The management question, therefore, is not whether to build a pilot system and throw it away. You will do that. The only question is whether to plan in advance to build a throwaway, or to promise to deliver the throwaway to customers. Seen this way, the answer is much clearer. Delivering that throwaway to customers buys time, but it only does so at the cost of agony for the user, distraction for the builders while they do the redesign, and a bad reputation for the product that the best redesign will find hard to live down.
Hence, plan to throw one away; you will, anyhow. Ain't that the truth!
Software rusts. Simon Stephenson, ca 1994. So does this signature. me, 2012
|
|
|
|
|
I used to have a copy of that book.
Real programmers use butterflies
|
|
|
|
|
Many times I've gotten really into programming with C# LINQ or T-SQL and then later wondered what the heck it all does.
|
|
|
|
|
No. It is senile dementia.
|
|
|
|
|
I have done this many times and am glad to see someone else post the same scenario. I've been utterly amazed looking back that I was able to be "in the zone" as you say, and whip out something so complex.
|
|
|
|
|
For sure. This is a joy. For me, it is usually the things that took months that I did by myself, products of immersion.
The other side of this is when someone asks me about something that I did 8 years ago with the expectation that I understand it like I wrote it yesterday. Suddenly I feel like a newbie in my own world, but then it comes flowing back.
|
|
|
|
|
Yep, been there. Months later - years later as well. And yes, you wonder what kind of zone you were in, and whether you can be there again. To me, that was one of the greatest feelings (I’m retired now).
Cheers!
Time is the differentiation of eternity devised by man to measure the passage of human events.
- Manly P. Hall
Mark
Just another cog in the wheel
|
|
|
|
|
I've made it a habit to explain every non-trivial part of my code in detail, using very_long_expressive_names and multiline comments to explain how I arrived at the algorithm or formula. I do that for my future self most of all, but occasionally my coworkers benefit from it, too. For that reason, most of the time it is my future me who thanks my past self for spending that extra time when I have to look at that code months or years later!
With a sufficient level of explanation, that work looks a lot less like magic, so I typically end up with 'what the hell was I thinking?' more often than 'boy, what a brilliant idea'.
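A hypothetical illustration of that style — the name, formula, and note are mine, invented for this example, not from any real project:

```python
def monthly_payment_for_amortized_loan(principal, annual_rate, number_of_months):
    """Standard annuity payment formula.

    Note for future me: with monthly rate r, the remaining balance
    after n payments of p is  principal*(1+r)**n - p*((1+r)**n - 1)/r.
    Setting that balance to zero and solving for p gives the
    expression returned below.
    """
    monthly_rate = annual_rate / 12.0
    if monthly_rate == 0.0:
        return principal / number_of_months  # zero-interest edge case
    growth_factor = (1.0 + monthly_rate) ** number_of_months
    return principal * monthly_rate * growth_factor / (growth_factor - 1.0)
```

The long name makes call sites self-explanatory, and the derivation comment records the *why* that the code alone can't.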
GOTOs are a bit like wire coat hangers: they tend to breed in the darkness, such that where there once were few, eventually there are many, and the program's architecture collapses beneath them. (Fran Poretto)
|
|
|
|
|