|
Mike Hankey wrote: Yeah, I'm brain dead working on the follow-up ARM article... Timers. In the meantime I am playing with the SysTick of the poor cousin of your chip (STM32C0).
"In testa che avete, Signor di Ceprano?"
-- Rigoletto
|
|
|
|
|
It's odd to me that there doesn't seem to be much information about the SysTick timer, especially the registers and how to program them.
I don't discuss SysTick or watchdog timers in my article; I thought I'd save those for another one.
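In the meantime, for anyone else hunting for register-level details, here's a minimal bare-metal sketch for a Cortex-M0+ part like the STM32C0. It assumes the usual CMSIS startup file (which names the vector slot SysTick_Handler) and a 12 MHz core clock; the register addresses come from the ARMv6-M Architecture Reference Manual, and CORE_CLOCK_HZ is a placeholder you'd replace with your actual clock:

```c
#include <stdint.h>

/* SysTick lives in the Cortex-M core itself, not the STM32 peripherals. */
#define SYST_CSR  (*(volatile uint32_t *)0xE000E010u)  /* control/status */
#define SYST_RVR  (*(volatile uint32_t *)0xE000E014u)  /* reload value   */
#define SYST_CVR  (*(volatile uint32_t *)0xE000E018u)  /* current value  */

#define CORE_CLOCK_HZ 12000000u   /* assumption: replace with your clock */

volatile uint32_t g_ticks;        /* incremented once per millisecond */

void SysTick_Handler(void)        /* name fixed by the CMSIS vector table */
{
    g_ticks++;
}

void systick_init_1ms(void)
{
    SYST_RVR = CORE_CLOCK_HZ / 1000u - 1u;  /* 24-bit reload, 1 ms period */
    SYST_CVR = 0u;                          /* any write clears the count */
    SYST_CSR = (1u << 2)   /* CLKSOURCE: run from the processor clock */
             | (1u << 1)   /* TICKINT:   fire the SysTick exception   */
             | (1u << 0);  /* ENABLE:    start the down-counter       */
}
```

If you have the CMSIS headers anyway, SysTick_Config(SystemCoreClock / 1000u) does the same thing in one call.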
|
|
|
|
|
I have just gone through four QA questions, each of which is about an error caused by a null reference. And yet none of the posters seems to have any idea a) how to diagnose and fix it, or b) even what the error means. Do those of you who still work in teams find this is a common problem with younger team members?
|
|
|
|
|
Perhaps because they are so spoiled that they have never in their lives encountered "nothing", and have no empathy for the poor code where they would otherwise put in a null check or try/catch to help it succeed.
Prolly not. But kids today.
|
|
|
|
|
Well, after all it is the Billion Dollar Mistake[^], as Tony Hoare himself put it.
Recently I had to go through a large code base where the authors went crazy using Non-nullable pointers[^] almost everywhere. What can I say, it seems to be a complicated problem for many people.
Mircea
|
|
|
|
|
Mircea Neacsu wrote: Tony Hoare himself put it
But I'm not sure he really believes it himself; nulls are still a very good idea, just dangerous in the wrong hands. But so are chainsaws and hair spray. Big deal.
|
|
|
|
|
The song "Absolute beginners" by David Bowie springs to mind 
|
|
|
|
|
Well, in a world where we have programming languages that have concepts like "truthy" and "falsey", what can you expect?
|
|
|
|
|
Marc Clifton wrote: in a world where we have programming languages that have concepts like "truthy" and "falsey", what can you expect?
In that Universe, I expect Qubits.
Just putting a Quantum spin on things.
|
|
|
|
|
Not sure... there's nothing to understand
|
|
|
|
|
I read somewhere, years ago, that zero/nothing is one of the most difficult concepts for the human brain to understand. I think it was suggested that that was why there is no zero in Roman numerals.
|
|
|
|
|
It isn't. I have one apple in my hand. How many do you have in yours?
Even prehistoric hunters came back with "zero".
There's no 0 in Roman numerals because it would not make sense to count nothing. A farmer who owes no taxes gets ignored; they counted what was owed. "Zero" would have no use there; even if that is the return of your hunting trip, 0 is not recorded. Writing was too precious to waste on recording zeros.
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
In Norwegian, the name of numeric zero is 'null'. So Norwegian kids 'sort of' have an excuse for confusing the two.
But they are quite different. Zero is a distinct, well defined numeric value that you may treat 100% like any other numeric value.
'null' is nothing: not a numeric value, but a void. Emptiness. An abyss. Not a valid numeric value. Some programming languages use the term 'void'; it is really much more descriptive.
I feel like digging up my old Robert Heinlein collection to re-read the short story "—And He Built a Crooked House"[^]. The story tells of a crazy architect (in California, obviously) who designs a house which is a 3-dimensional projection of a 4-dimensional cube, a tesseract. The night before the owners move in, there is an earthquake that makes the house fold up as a true tesseract, in 4 dimensions, not just as a 3-dim projection.
I believe that Heinlein has taken liberties in his description of how a real tesseract would appear. But his description of the view out one window, of a total emptiness, not even black, gave me shivers when I first read it, many years ago. It is a beautiful literary description of the concept of a 'null'. I think that I didn't fully understand the concept of null, void, myself until I read the Heinlein story.
|
|
|
|
|
trønderen wrote: 'null' is nothing, not a numeric value, but a void.
Which arises after learning to count, and has nothing to do with the difference between a numeric nothing and nothing at all.
What does Heinlein have to say about having no apples 40,000 years ago? Would it be 0 apples, null apples, or would the net result of no apples be the same?
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
trønderen wrote: 'null' is nothing, not a numeric value, but a void.
This is not strictly true. In any (programming) language, null is a value representing an abstract concept, just like an integer is a value representing an instance of a specific class of numbers, which are themselves an abstract concept used to quantify the world.
It is true that 'null' is the concept of 'not having a valid value' (i.e. nothing), but it is represented in a program by a value in various lexical forms: null/NULL/etc.
The OP clearly raised a very good question that could, perhaps, be more pointedly made with the more abstract question: Why do so many developers not understand abstraction?
|
|
|
|
|
Do you ever really have 0 apples from a weird sort of ship of Theseus-like perspective?
Some cultures lack that sort of integer concept from a similar view... all apples are not created equal. No point talking about N of them as the potential distance from reality only increases the further from 1 you get. Forget apples to oranges, they don't even do apples to apples.
More recently, quantum physics is saying there is no "nothing" and that there is always "quantum foam" that has always been.
Maybe the young ones are so much smarter they look dumb. Nahhh haha.
But maybe sending null across a wire as a matter of course is a little bit dumb too.
|
|
|
|
|
jochance wrote: Do you ever really have 0 apples from a weird sort of ship of Theseus-like perspective?
Prehistoric man? Pretty sure they knew the concept of "no apples".
jochance wrote: More recently, quantum physics is saying there is no "nothing" and that there is always "quantum foam" that has always been.
Physical BS. Quantum physics is a bunch of nonsense. Your mom has always been fat, because quantum mechanics says so.
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
I guess my point was more that while I get the usefulness of bad models, they're still maybe bad in some specifics of practice.
Which is still fine and all. It's not a full and total indictment of concepts. Tons of that self-referential integrity stuff going on in math, yeah?
But most probably do not look at 0/null the same way as they look at other deeper mathematical concepts. It just sits there all happy exactly 1 unit from both -1 and 1. The reality seems to be that you can really never have 0 if you go splitting hairs. That said, it maybe makes it an even more perfect "init" value.
Null though? Null was maybe never destined to be anything but a logical error.
"But I really want to store that something is specifically not known/unknowable!"
"What about all the other stuff you don't even know you don't know?"
"Look, you could've brought this up before we were a decade into RDBMS development. We're keeping it."
"But I didn't know."
"Exactly! And now we can record it!"
|
|
|
|
|
jochance wrote: But most probably do not look at 0/null the same way as they look at other deeper mathematical concepts.
Like ALL programmers, I have had to deal with floats nearing zero, which is not NULL.
jochance wrote: Null though? Null was maybe never destined to be anything but a logical error.
Nope. Let me explain, little padawan:
0 is having no apples.
null is having no concept of apples.
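In C terms, a minimal sketch of the distinction (the identifiers are purely illustrative):

```c
#include <stddef.h>

int  apples = 0;     /* 0: we counted, and there are no apples          */
int *count  = NULL;  /* NULL: there is no count at all, not even a zero */
```

Dereferencing count here would be undefined behaviour: you cannot ask "how many?" of a concept that does not exist. apples, by contrast, is a perfectly good number you can add to or compare.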
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
And here I might've supposed you'd want people talking less about things they have no concept of.
What's really important is the ability to dictate to the apple that it is inferior, and they mostly left that out altogether.
lol
|
|
|
|
|
jochance wrote: What's really important is the ability to dictate to the apple that it is inferior, and they mostly left that out altogether. That makes no sense at all.
Pears are superior, though.
Bastard Programmer from Hell
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
|
|
|
|
Zero and null are not remotely the same thing. Zero is a value like any other. Null is the absence of any value, "undefined" if you like.
|
|
|
|
|
In C zero and NULL are very much the same thing.
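At the source level, at least; a quick illustration (the exact expansion of NULL is implementation-defined, but 0 and ((void *)0) are the common ones):

```c
/* In C source, the integer constant 0 is a null pointer constant,
   and NULL from <stddef.h> is just a macro for one. */
#include <stddef.h>
#include <stdio.h>

int main(void)
{
    int *p = 0;        /* legal: 0 converts to a null pointer */
    if (p == NULL)     /* true: the two compare equal         */
        puts("0 and NULL compare equal");
    if (!p)            /* the idiomatic null test             */
        puts("!p is true for a null pointer");
    return 0;
}
```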
|
|
|
|
|
And so is the color black (assuming an RGB color space).
Or anything else represented as binary zeroes in your favorite computer architecture. That two values of completely different semantics have identical internal representations doesn't imply that the two values are "very much the same thing" - unless you simply overlook the type/class information.
In an OO world, any object instance, regardless of class, having a single member set to some value represented as all bits reset is also "very much the same thing" as a numeric zero. Or a null pointer. If you ignore its semantics, that is - the way you do to claim that zero and null are very much the same thing.
|
|
|
|
|
What I actually said was:
Quote: In C zero and NULL are very much the same thing.
Not that zero and null are the same everywhere.
|
|
|
|