Unfortunately the scanner isn't intelligent enough to know what #ifdef means. We just took all the unit tests out of the repo because it was complaining about our use of assert, among other things. So now Jenkins has to pull two repos to run the tests.
Memtha wrote: only because I am forced to run it through a painfully awful "security" code scanner.
...and because you are working for a company that doesn't know what they are doing. Or perhaps an individual up the authority chain that doesn't.
working for a company that doesn't know what they are doing
Close. The company is very good, but the client is... government. Need I say more?
Some battles just aren't worth the effort. We had to teach the "security lead" how to use git to get the zip and upload it to the scanning service they have forced on us. It's not even the government personnel who chose the scanner; it's a rival contractor that loves to tout the sheer number of "vulnerabilities" that "they" "addressed" by forcing us to add try/catch/throw to every single method.
Memtha wrote: but the client is... government. Need I say more?
lol - nope.
honey the codewitch wrote: The code is evil. It's absolutely terrible to read, almost as if they were *trying* to hide intent. I am now responsible for a piece of code that's been part of our business for over thirty years. This software handles rendering our proprietary printer language into bitmaps, and then shipping those bitmaps out to be printed. It was originally written by a guy who I'll call BK. BK worked on this one code base for 25 years. It had been 'stable' (bugs are features at that point) for quite a while, and they tried to reassign BK to another project. After six months of doing absolutely nothing he decided to retire when he was confronted.
When this landed in my lap, we had not even recompiled this stuff in over ten years. I had looked through the source code a few times to answer questions, but hadn't needed to really understand it. Last fall we had a piece of customer-specific hardware go obsolete, so we needed to design a replacement. I started looking through the code to see what would need to be changed.
Good ****ing grief. This code was written explicitly so that only a single person could maintain it. It was in C. Function prototypes weren't used. Header files didn't declare the functions or data in the corresponding .c source file. There were numerous global structs, variables, and #define's with three-, two-, and yes, even one-character names! I think the best bit of ****ery was a group of #define's he'd added before #include <windows.h>, which changed the definition of a number of values in the Windows headers that his comment (one of the few) claimed "Microsoft got wrong".
I spent over 100 man-hours identifying precisely the changes I needed to make and where. My replacement took less than a day from the time I started writing it to when it was compiled, tested, and ready for a trial on customer equipment.
This code is [in]famous for something else. I'm a vulgar man, and I swear a lot in casual conversation with people I know, including my coworkers. That said, I've never cussed or used foul language in my source code. It just didn't seem professional. Until. This. Crap. There is now a comment block in this code following my modifications which looks something like this:
Software Zen: delete this;
Gary R. Wheeler wrote: I am now responsible for a piece of code that's been part of our business for over thirty years...Function prototypes weren't used. Header files didn't define the functions or data in the corresponding .C source file. There were...
So that's at least 1992. And yes, that was par for the course for C code then. Function prototypes came into common use around the same time as C++ and ANSI C89, which added them (perhaps not entirely because of C++, but for the same reasons C++ added them). Someone used to the older style would not have used them. Globals were something any C app would be using. Single-character variable names were common.
Gary R. Wheeler wrote: a group of #define's he'd added before #include <windows.h> which changed the definition of a number of values in the Windows headers which his comment (one of the few) claimed "Microsoft got wrong".
I couldn't speak specifically to that, but as I recall there were a lot of problems with early Microsoft C++. I know for a fact that for years after the ANSI C++ spec was released, Microsoft scored very poorly on implementing it. That was back when people could actually post comparisons of software without getting sued.
In some circumstances what you're saying could be true. This code was adapted to Windows in the early 2000s, though, when function prototyping was the norm. No excuse there.
jschell wrote: Could not speak specifically to that but as I recall there were a lot of problems with the early Microsoft C++. I know for a fact that for years after the ANSI C++ spec was released the Microsoft scored very poorly on implementing it. That was back when people could actually post comparisons of software without getting sued.
This is a case where BK was redefining constants for arguments to and return values from Windows API functions, thereby guaranteeing they would not work as documented.
BK was an asshole.
Software Zen: delete this;
C gets you rather close to the machine, but it keeps you there. Meaning there isn't really a way in C not to hide intent, since you're kept busy spelling out the mechanics of the "how" explicitly, burying the intent.
I very much agree with you on C++ making it way easier to spell out the intent, letting the library do the how, or at least abstracting it away.
The one line C contest is holding on line 1. It would like a word.
Real programmers use butterflies
Quote: I'm poring over C code right now - C really isn't that much better, but fortunately you can do less with it. The code is evil. It's absolutely terrible to read, almost as if they were *trying* to hide intent.
How do you hide intent in C? No overloaded operators, no overloaded functions, no implicit calls to constructors that need explicit calls to destructors, no symbols with identical names in different namespaces ...
The amount of "magic happens implicitly behind the scenes" things in C is ridiculously small.
By making your code do something that is non-obvious.
Real programmers use butterflies
but to be honest, that can be done in any language.
I'm reading through "learning python" and just hit the description of formatting strings.
I don't know what that guy was smoking when he came up with that approach, but I want some.
Charlie Gilley
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
Has never been more appropriate.
Yeah, it can be done in any language; it just seems like some languages make it easier to write incomprehensible code than others.
Real programmers use butterflies
charlieg wrote: but to be honest, that can be done in any language.
Err..no it can't.
I had a C program that had machine code in a string literal. I would cast it to a function pointer and then call it.
Pseudo-code for that was something like the following:
char* code = "\xCC\xFF...";
typedef int (*myfunct)(int i);
int result = ((myfunct)code)(45);
If I hadn't put a lot of comments around that, including the exact assembly, then a maintenance programmer, even an experienced one, would have spent a lot of time figuring that out.
I'm sorry, but what is your point?
You have special code, you knew it was special code, so you made it real obvious what you were doing.
Charlie Gilley
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
Has never been more appropriate.
I totally get the love/hate relationship with any code, frankly. I work with legacy code, so loving and hating is an everyday thing for me.
You are doing good work. Keep it up.
Well, I think the thing with all midway languages (which to me means any compiled language between FORTRAN's readability and assembler) is that readability will always be second to performance... although that should mean a proportional amount of comments gets added to the code (at least a full page for the dark arts).
Some of the coolest code I've seen uses pointer arithmetic like hell to bypass C++ permissions (this was a videogame engine) and the only comment it had was:
/* Do not change this code or ponies will cry */
I had been tracing the source code through 7 different files when I was rewarded with a hearty laugh there. I don't know how readable it was to any other person, but since then I've known that game engines, compilers, and O.S. source code are never going to be readily accessible to just anyone (despite our best comments). That is the nature of language, human, computer, or otherwise.
Breaking encapsulation by offsetting from a class's base address seems like bad form, even for a game engine. That's what the "friend" keyword is for.
Of course I have some nasty nasty code in my SPI bus code because it interacts with the ESP32's SPI hardware registers directly for performance. It hurts.
Real programmers use butterflies
Lol
For the time it was written it was most certainly a hack. That portion was still C++98, and the pointer redirection was for a scripting language internal to the engine (to directly call any method of any class in the engine).
SPI bus code... you reminded me of assembler code I found in our main Nintendo DSi engine (we called it coldbits) to write directly to the buffer of the image processor. It was not child's play, but I don't think the use of the asm keyword in C++ is hackish (more like a "Here be dragons" kind of warning).
The reason I don't like the asm keyword is I like my C++ code to be portable. I target a lot of IoT, and those devices come in a variety of architectures, even among the same lines of chips.
I've studied the output of GCC in many cases to where I can structure my C++ code to generate the asm I want while maintaining its higher level structure. Of course that doesn't work if you need to set specific registers and such. Still, I use a lot of compile-time tricks to achieve my goals in the real world, but the result tends to be a best of both worlds scenario - you get efficient, highly structured code.
Real programmers use butterflies
After having lived with Microsoft's version of C++, and the C++ source code of applications written by C developers, I have come to the point where I like encapsulation, but inheritance is a multi-headed hydra and usually not worth the effort. My experience has been that if you go past one level of inheritance, you are elephanting doomed.
Reading about objects and what not sounds nice, but when you get your nice inheritance hierarchy set up, in maintenance you realize that you have an inverted pyramid. Touch one base class, and it all falls down.
Now I would agree with you - if you are sharing pointers amongst objects, you have design issues.
Charlie Gilley
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
Has never been more appropriate.
These days I don't use inheritance that much except for doing compile-time computation tricks.
I usually use templates rather than have strict base classes.
In GFX I expose a "caps" template structure that indicates the capabilities of an object. GFX calls certain methods on that object based on the values in that template structure, so, for example, it can determine if the object supports reading, or certain optimized operations. Normally you'd inherit from a base class to do this, but you can do similar using templates. Code size can get to be an issue depending on what you're doing, but I like the flexibility.
Real programmers use butterflies
Inheritance for me mainly works best in a very contained environment. I suppose if I were responsible for a core set of code that would apply to multiple applications (like a GUI control set), it might make sense. My career has been spent developing one application after another, and rarely do they inherit from each other. Maybe basic concepts, but as soon as some other team member does not understand something, they code up their own solution and off we go. Inheritance broken.
Templates - well, there be magic, but in fact, yet again, the ivory tower folks seem to come up with a pristine solution, and the folks shoveling $^&&^& in Dixie code up something they understand to get the job done.
Charlie Gilley
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
Has never been more appropriate.
I just love templates, personally. I'm not as big a fan of the STL, but I don't have much occasion to use it since I often target the Arduino platform with my code, and its STL implementation is only partial on some of its targets. Part of the joy of coding C++ for me is seeing what I can schlep from runtime to compile time in order to increase performance and maintain flexibility.
I have a pixel template class that allows you to declare individual color channels and the bit depth of each, and compose a pixel with as many channels as you want, up to the machine's word size.
It will then allow you to modify the individual channels so you can set the red channel of an RGB pixel, or the U channel of a Y'UV pixel, and it will recompute the overall value.
If you use constants it will compute all of it at compile time, including getting the compiler to bit shift arbitrary bits an arbitrary direction.
Real programmers use butterflies
honey the codewitch wrote: but would it kill people to write readable code, or at least comment it with something *helpful*?
In my experience, and based on the cries of anguish, gnashing of teeth, and pulling out of hair when I even suggest that comments have a place in code, I am guessing the answer is yes, it would kill them.