That's enough Griff. No more bulldog please!
"If we don't change direction, we'll end up where we're going"
Possibly, but wouldn't a hot dog be more likely to start a frank discussion?
Oh burger! I didn't think of that!
Sent from my Amstrad PC 1640
Never throw anything away, Griff
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
AntiTwitter: @DalekDave is now a follower!
That's really the wurst thing to post.
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010
Hey, they can't all be wieners...
I do a lot of development, migrating, porting, etc., in different environments, different systems, different languages, different versions,....get the idea? I have specialized over the years in doing weird stuff.
I just finished chasing a bug in some vintage commercial software that was written to accommodate big-endian and little-endian integers depending on the platform.
It turned out that it applied big-endian logic in one place where little-endian was required.
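As a sketch of the fix, reading multi-byte fields with an explicit byte order (rather than trusting the host) avoids exactly this class of bug. The function names here are invented for illustration:

```cpp
#include <cstdint>

// Hypothetical sketch: read a 16-bit field with an explicit byte order,
// so the result never depends on the host machine's endianness.
uint16_t read_u16_be(const unsigned char* p) {
    return static_cast<uint16_t>((p[0] << 8) | p[1]);  // big-endian: high byte first
}

uint16_t read_u16_le(const unsigned char* p) {
    return static_cast<uint16_t>(p[0] | (p[1] << 8));  // little-endian: low byte first
}
```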
The whys and wherefores of doing this are not important. What is important is the line of thought I reached upon conclusion, i.e., what I want to see in software.
1) I want to be able to declare my variables of a type and then have the necessary conversions take place automatically.
E.g., I want to declare "X" as a 2-byte integer and "S" as an ASCII (or UTF-8, UTF-16, whatever...) character string, and be able to simply specify 'S = X AS "format"' -- not "S = X.toString("fmt");", "S = itoa(i);", or whatever conversion functions the language provides.
I know what I have going in, I know what I want coming out--make it easy to get from A to B!!
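For contrast, this is roughly what that conversion costs today in C++; to_text and its default format are hypothetical names for illustration, not a real API:

```cpp
#include <cstdio>
#include <cstdint>
#include <string>

// Today's reality in C/C++: an explicit, per-direction conversion call.
// A hypothetical "S = X AS format" statement would hide this boilerplate.
std::string to_text(int16_t x, const char* fmt = "%d") {
    char buf[32];
    std::snprintf(buf, sizeof buf, fmt, x);  // int16_t promotes to int for "%d"
    return std::string(buf);
}
```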
2) I want my data declarations to be consistent across all platforms/languages/environments/...!
Before some reader gets all hot and bothered about using C typedefs, etc., in declarations, I know this can be done.
What I want is consistency--i.e., a "short" to be a 2-byte integer, a "long" to be a 4-byte integer... you get the drift. Part of the problem I was chasing had to do with the software expecting a C int to be 32 bits, but some 64-bit environments define an int as 64 bits.
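For what it's worth, C99 and C++11 already offer this much via the fixed-width types in <stdint.h>/<cstdint>; a minimal sketch (the variable names are invented):

```cpp
#include <cstdint>

// Exact-width types from <cstdint> have the same size on every
// conforming platform, unlike plain short/int/long, whose sizes vary.
int16_t crc      = 0;  // always exactly 2 bytes
int32_t offset   = 0;  // always exactly 4 bytes
int64_t filesize = 0;  // always exactly 8 bytes

static_assert(sizeof(int16_t) == 2, "int16_t is 2 bytes everywhere");
static_assert(sizeof(int32_t) == 4, "int32_t is 4 bytes everywhere");
static_assert(sizeof(int64_t) == 8, "int64_t is 8 bytes everywhere");
```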
3) I want my utilities and commands to operate the same way, with the same results, across all platforms.
If I do a "myutil -x" command on Linux, I want to see EXACTLY the same output and results across Cygwin, Ubuntu on Windows 10, Debian, etc.
4) I want clear, simple, understandable, comprehensible programming constructs.
I am tired of chasing errors such as where somebody fat-fingered "=" when "==" was meant or where a brace was misplaced or omitted. I want to be able to look at a piece of code and understand what the author intended easily and clearly.
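A minimal illustration of that fat-finger bug (the function names are made up; with -Wall, gcc and clang will at least warn about the assignment used as a condition):

```cpp
// The classic typo: "=" (assignment) where "==" (comparison) was meant.
// Both compile in C and C++, because any nonzero value counts as "true".
bool has_typo(int x) {
    if (x = 5)        // BUG: assigns 5 to x, so the condition is always true
        return true;
    return false;
}

bool as_intended(int x) {
    return x == 5;    // the comparison the author meant
}
```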
5) I want clear, complete commercial documentation.
I have seen thousands of circular definitions such as:
returntype FunctionXYZ(int p1, int p2); Returns XYZ of p1 and p2.
BIG WHOPPING DEAL! I'm not an idiot--I can plainly see that from the function call. I often need to know HOW it uses p1 and p2 to arrive at XYZ. (Of course, by now, some readers will be questioning the part about "idiot", but that's OK.)
Somebody once said there's a beautiful language inside C++ just waiting to get out. I agree. But C++ has gotten overly complex, overly confusing. (I ask you to consider why there are projects such as wxWidgets, Qt Creator, Boost and std::...)
My desire is to "eschew obfuscation."
So, if someone plans on responding defensively or criticizing--Don't.
I am looking for ideas, things that people would want or would like to see done differently so that we can be more effective, more productive and more competent.
What are your thoughts?
According to Microsoft, VB6 should meet all your needs!
- I would love to change the world, but they won’t give me the source code.
VB6 is one of the few languages I have not messed with. I'll check it out. Thanks.
Don't. You are clean, stay that way.
"It is easy to decipher extraterrestrial signals after deciphering Javascript and VB6 themselves." - ISanti
He's playing with you: VB6 died for new projects in 2002 or so when .NET was released.
I don't.
Some languages - like C# - are strongly typed for a reason: to catch errors early. Firstly, catching type conversions at compile time means that the code does exactly what you wanted, or it doesn't compile. Secondly, you are made to explicitly convert things like user input to the type you want, with exceptions (or "failed" responses as appropriate) if the user input doesn't match up.
The global implicit typing you seem to prefer leads to errors because the compiler has to "guess" what you wanted - and that means that bad data gets into the system undetected. And until you've had to unpick a 100,000 row DB to try and fix dates that are entered as dd-MM-yy and MM-dd-YY you probably don't realise just how much of a PITA that is.
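A sketch of the explicit alternative: forcing the caller to state the format means ambiguous input fails loudly instead of being silently misread. parse_date here is an illustrative wrapper around std::get_time, not a library function:

```cpp
#include <iomanip>
#include <sstream>
#include <string>
#include <ctime>

// Parse a date with an EXPLICIT format string, so "03-04-21" cannot
// silently mean 3 April to one user and March 4th to another.
bool parse_date(const std::string& text, const char* fmt, std::tm& out) {
    std::istringstream in(text);
    in >> std::get_time(&out, fmt);   // fails (sets failbit) on mismatch
    return !in.fail();
}
```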
And then there is the "pointer problem": a pointer to an ASCII char is a pointer to a byte, but a pointer to a Unicode character is a pointer to a word. You can - by casting - explicitly convert a byte pointer to a word pointer, but that doesn't change the underlying data, and it means that half the data accesses aren't going to work properly, because the byte pointer can be "half way up" a word value.
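A minimal demonstration of that stride difference (the arrays are invented for illustration):

```cpp
// The same two letters stored as bytes and as 16-bit code units.
// Pointer/index arithmetic steps by ELEMENT size, so reinterpreting a
// byte pointer as a word pointer lands "half way up" the word values.
const char narrow[] = "AB";      // 1 byte per element (plus terminator)
const char16_t wide[] = u"AB";   // 2 bytes per element (plus terminator)
```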
And strong typing (in C#, if not in C or C++) also eliminates your "=" vs "==" problem in most cases, because the result of an assignment is not a bool, so it can't be used in a conditional. You can set the C or C++ compiler to give warnings or errors when you type it wrong, but the code is valid because, being old languages, they don't have a native boolean type: any nonzero value is "true".
You want to get away from the complexity and want consistent declarations? Try C# ... (but hurry, it's getting complicated as well now).
I agree with everything you wrote except the last paragraph. I despise everything about dot nyet.
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
.NET is useful - it provides a huge library of classes that work in a consistent way, unlike the libraries you had to play with in C / C++, and reduces the memory leak problems endemic to pointer based code written by people who think they know what they are doing ...
It's a tool - and one that works across multiple platforms with a high degree of "similarity". Try that with a native C compiler, and it becomes a struggle to get anything to work in a short timeframe.
Don't get me wrong, I miss my assembler days (or decades more accurately) - but .NET is here to stay and I'm happy using it.
OriginalGriff wrote: "and reduces the memory leak problems endemic to pointer based code written by people who think they know what they are doing ..."
In other words: .NET is for those who don't know what they are doing. Excellent argument.
OriginalGriff wrote: "It's a tool - and one that works across multiple platforms with a high degree of "similarity". Try that with a native C compiler, and it becomes a struggle to get anything to work in a short timeframe."
I have been trying exactly that over the last few days. It's not as bad as you think anymore.
OriginalGriff wrote: "Don't get me wrong, I miss my assembler days (or decades more accurately) - but .NET is here to stay and I'm happy using it"
In other words: you have become too comfortable, and now you finally love Big Brother.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
"but .NET is here to stay and I'm happy using it."
I can happily say I hardly ever used .NET. As an embedded developer (now retired), .NET and C# were never options for the projects I worked on. Heck, I think only one project in my career had more than a meg of RAM. For my projects it's been C/C++ for at least the last 15 years of my career.
Rick York wrote: I despise everything about dot nyet.
I'm curious about why? Particularly since I have quite the opposite reaction.
Latest Article - Azure Function - Compute Pi Stress Test
Learning to code with python is like learning to swim with those little arm floaties. It gives you undeserved confidence and will eventually drown you. - DangerBunny
Artificial intelligence is the only remedy for natural stupidity. - CDP1802
Just one word: Mickeysoft
.Net can never be so good that they get me to marry them and then let them move in and do whatever they like.
CodeWraith wrote: just one word: Mickeysoft
Subjective religious reasons, then. Any objective reasons?
OriginalGriff wrote: Some languages - like C# - are strongly typed for a reason: to catch errors early.
I found going from Object Pascal (Delphi) to C# was pretty easy, since they are both strongly typed. Of course, going from ALGOL to Pascal was also fairly easy, but going from FORTRAN to ALGOL was painful.
CQ de W5ALT
Walt Fair, Jr., P. E.
Comport Computing
Specializing in Technical Engineering Software
I also have had to fix problems such as the MM-DD-YY problem you described.
Strong typing often causes as many problems as weak typing, because people want to find a solution for their problem. I'm thinking along the lines of "definable" strong type conversions. For conversions such as integer to character string, I would simply like to say "string S = i", having previously declared "i" as an integer. Then, add optional meta-data such as a format.
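One sketch of such a "definable" conversion in today's C++ uses a wrapper type with a conversion operator; Formatted is an invented name for illustration, not a proposal for a real API:

```cpp
#include <string>

// Hypothetical sketch: a wrapper whose conversion operator supplies the
// "string S = i" behaviour described above. The conversion itself is
// user-defined, so the "meta-data" (formatting rules) could live here.
struct Formatted {
    int value;
    operator std::string() const { return std::to_string(value); }
};
```

Usage would then read as the post suggests: `Formatted i{42}; std::string S = i;`.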
Regarding pointers, C++ pointers have made debugging difficult, and having to cast causes even more confusion. I'm thinking the majority of pointer and casting problems are caused by less experienced or lazy developers trying to find a quick, workable (in most cases) solution to their problem.
I'm just wondering if there isn't perhaps a better solution.
I disagree - they are a different type of problem. Strong typing reduces problems because they can be located at compile time, while your weak-typing problem exists because the developer hasn't thought about the code sufficiently. If that forces him to look at what he's doing instead of assuming that the compiler will do the right thing, then that improves the code and reduces the chances of bad data.
You make typing mistakes, I make them - we all do. The more that the compiler can pick up instead of assuming that it's correct and blindly converting the wrong data the better, no?
string s = id; bad if you actually meant to do this:
string s = IansName;
Why is it such a problem for you to spell out what you want the compiler to do?
Adding formatting defaults and suchlike makes it more confusing if the developer doesn't realise what format is being used ... which is why you get dd/MM/yy and MM/dd/yy confusion because different users use different defaults!
What I want is for my local 5-Guys burger joint to stop burning the bacon on my double-bacon cheeseburgers.
I concur - but I skip the cheese part.
...All I Want