|
Since colonial times... Interesting. The Declaration of Independence is month day year, but I think Lincoln used day month year. But he also used four score and seven years ago, which is confusing to everyone today.
My theory has always been that it was the IRS in the early 1900s, using month-day-year, that forced the U.S. to "standardize" on the wacky date format.
|
|
|
|
|
Yeah, he even points out an instance where they used MM/DD/YYYY and then DD/MM/YYYY in the same sentence. It seems that as soon as people came over to the Americas, you see the MM/DD/YYYY format start being used alongside DD/MM/YYYY, and then at some point people just decided to use MM/DD/YYYY exclusively. A historical mystery. Maybe it was just to spite the British?
|
|
|
|
|
Chris Maunder wrote: whether we as developers have a responsibility to ensure that the information we present to the world is always presented unambiguously Absolutely.
Chris Maunder wrote: Is this something you do? Always.
Chris Maunder wrote: Is it something your lead actually stops you doing? I am the lead. But if someone suggested that ambiguous display of dates was acceptable, I don't think they would last very long in our organization.
/ravi
|
|
|
|
|
I manipulate and persist dates/times in an unambiguous fashion. I present them according to the user's preferences as indicated by the Windows locale or other mechanism.
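As a sketch of that approach (illustrative Python only, not the poster's actual code; the user-preference format string is a made-up stand-in for whatever the Windows locale or other mechanism would supply):

```python
from datetime import datetime, timezone

# Persist: one canonical, unambiguous representation (UTC, ISO 8601).
stamp = datetime(2021, 3, 2, 14, 30, tzinfo=timezone.utc)
stored = stamp.isoformat()            # '2021-03-02T14:30:00+00:00'

# Present: whatever the user's preference says. This pattern is a
# hypothetical stand-in for one read from the user's locale.
user_fmt = "%d/%m/%Y %H:%M"           # e.g. a UK-style preference
shown = datetime.fromisoformat(stored).strftime(user_fmt)

print(stored)   # 2021-03-02T14:30:00+00:00
print(shown)    # 02/03/2021 14:30
```

Storage stays unambiguous and sortable; only the rendered string varies per user.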
Software Zen: delete this;
|
|
|
|
|
Like a good developer should, thank you very much.
But in the current "developer" world (i.e. Silicon Valley style), minding that "not all of your millions of users are from the US" is a mind-boggling concept. You think dates are bad? How about keyboard shortcuts that absolutely cannot work (looking at you, Android Studio)? Every time I see a "Ctrl+/" or something like that, it's apparent that nobody tried the software with a non-US/UK keyboard. (Hint: most national keyboard layouts use key combinations for (), [], \, etc., so shortcuts that require those keys will not work.)
|
|
|
|
|
André Pereira wrote: How about keyboard short-cuts that absolutely cannot work I have been through that scenario. I create user interfaces for our line of commercial ink-jet printing systems. Our UI is largely touch-screen driven. For one product line I implemented several locale-specific on-screen keyboard layouts for entry of file names and such. Most of them were fairly easy, except for the Japanese, Korean, and Simplified Chinese. For those I sent photos of the physical keyboard to colleagues in-country, and had them send me text files with the keytop characters encoded in UTF-16.
Software Zen: delete this;
|
|
|
|
|
Quote: photos of the physical keyboard
Quote: had them send me text files with the keytop characters encoded in UTF-16
That's a very hands-on solution; I like it. It's something even a script kiddie could do instead of researching the whole thing. But no, they just look at their Mac and that's it.
|
|
|
|
|
Not forgetting degrees (temperature) and degrees of angle. The latter is odd because almost all humans use the 0..360 scale, whereas almost every math library uses radians. It's easy to visualise a 35-degree slope, but 0.4 radians is how much? With pi an irrational number and computers not capable of doing infinite digits yet (not that long ago computers couldn't do more than 6 dp very well), what a stupid choice that was.
Another that's slipping is currency: I'm starting to see single decimals popping up, e.g. $5.5. Sure, cents (pennies if you must) are annoying, but it's just lazy to skip that last digit.
(currently the temperature here is 298 degrees and my chair tilted at about .1 degrees, just the way this grumpy irrational old man likes it.)
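For the record, the conversion each way is a one-liner; a minimal Python sketch (nobody's production code, just stdlib) showing what 0.4 radians and 35 degrees actually are:

```python
import math

def deg(radians: float) -> float:
    """Convert radians to the 0..360 scale humans visualise."""
    return math.degrees(radians)

def rad(degrees: float) -> float:
    """Convert degrees to the radians most math libraries expect."""
    return math.radians(degrees)

# The 0.4-radian slope you can't picture is about 23 degrees.
print(round(deg(0.4), 1))    # 22.9
print(round(rad(35.0), 4))   # 0.6109
```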
Sin tack
the any key okay
|
|
|
|
|
Actually I'm not sure why we don't stick to more fundamental units like that. 2π rad = a full circle - what could be easier? And frankly I'd be happy to switch to Kelvin if it meant never having to look at another negative temperature.
cheers
Chris Maunder
|
|
|
|
|
I'd be happy just to never feel another negative temperature.
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
It seems we've come 2π rad.
/ravi
|
|
|
|
|
Now you're just being obtuse.
"the debugger doesn't tell me anything because this code compiles just fine" - random QA comment
"Facebook is where you tell lies to your friends. Twitter is where you tell the truth to strangers." - chriselst
"I don't drink any more... then again, I don't drink any less." - Mike Mullikins uncle
|
|
|
|
|
|
That's acute one.
"the debugger doesn't tell me anything because this code compiles just fine" - random QA comment
"Facebook is where you tell lies to your friends. Twitter is where you tell the truth to strangers." - chriselst
"I don't drink any more... then again, I don't drink any less." - Mike Mullikins uncle
|
|
|
|
|
Maybe, but there are degrees of cute.
/ravi
|
|
|
|
|
This is just going to go around in circles.
cheers
Chris Maunder
|
|
|
|
|
I have a bone to pick with you - my radius.
/ravi
|
|
|
|
|
That's pretty humerus.
cheers
Chris Maunder
|
|
|
|
|
Are you trying to strong arm me?
/ravi
|
|
|
|
|
I can't think of an on-topic comment that uses "uranus".
|
|
|
|
|
Shirley you mean 2π rads.
"the debugger doesn't tell me anything because this code compiles just fine" - random QA comment
"Facebook is where you tell lies to your friends. Twitter is where you tell the truth to strangers." - chriselst
"I don't drink any more... then again, I don't drink any less." - Mike Mullikins uncle
|
|
|
|
|
Tau master race.
𝜏 = 1 full circle. F*** π!
|
|
|
|
|
We use radians because, in a naively implemented math library, the series coefficients are reciprocals of integers (the factorials of the Taylor expansion), and that only works when the argument is in radians.
As all serious math libraries use economised polynomials, this is less of a problem these days.
Note that IEEE 754-2008 recommends functions such as sinPi, defined as sin(pi*x). This could easily be modified to take angles in degrees, grads, mils, etc.
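A rough Python illustration of the sinPi idea and why the degree variant is attractive: the period (2 for sinPi, 360 for degrees) is exactly representable in binary floating point, so argument reduction is exact. This is a sketch, not any particular library's implementation:

```python
import math

def sin_pi(x: float) -> float:
    """sinPi in the IEEE 754-2008 sense: sin(pi*x). The period 2.0
    is an exact binary float, so fmod-based reduction is exact."""
    x = math.fmod(x, 2.0)
    return math.sin(math.pi * x)

def sin_deg(degrees: float) -> float:
    """Same idea scaled for degrees; the period 360.0 is also exact."""
    return sin_pi(math.fmod(degrees, 360.0) / 180.0)

# sin_deg(360.0) is exactly 0.0, whereas math.sin(math.radians(360.0))
# misses by a few ulps because 2*pi isn't representable.
print(sin_deg(360.0))
print(sin_deg(30.0))
```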
If you have an important point to make, don't try to be subtle or clever. Use a pile driver. Hit the point once. Then come back and hit it again. Then hit it a third time - a tremendous whack.
--Winston Churchill
|
|
|
|
|
The question seems a bit odd to me in that, since I never went to computer school, I always assumed the point of a UI is to give users what they want. I've often returned a SQL date, for example, as LEFT(datefield, 11), which gave MMM DD YYYY automatically (as long as one remembers to sort by the real datetime values). Really, anything else the user needs to look at should be made intelligible too; otherwise the calls come in and it has to be changed.
From my point of view, the European convention, dd-mm-yyyy (regardless of delimiters), is every bit as dumb as the US convention: neither will sort correctly without a pain in the ass.
So I've taken to YYYYMMDD, or, for human-readable, YYYY.MM.DD, when it's for my own use.
Big Endian, I think, is surely the way to go for dates.
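The sorting point is easy to demonstrate; a small Python sketch (illustrative dates only):

```python
from datetime import date

dates = [date(2021, 12, 1), date(2021, 2, 28), date(2020, 7, 4)]

# Big-endian YYYY-MM-DD strings sort lexically in the same order
# as the underlying dates, so plain string sorting just works.
iso = sorted(d.isoformat() for d in dates)
print(iso)  # ['2020-07-04', '2021-02-28', '2021-12-01']

# The same dates as MM/DD/YYYY strings sort in a useless order.
us = sorted(d.strftime("%m/%d/%Y") for d in dates)
print(us)   # ['02/28/2021', '07/04/2020', '12/01/2021']
```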
Ravings en masse^
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010
|
|
|
|
|
W∴ Balboos wrote: Big Endian, I think, is surely the way to go for dates.
Agreed. Especially if some decent delimiter is used (something like -, /, or .), it is as readable, usable, workable, and consistent as possible. And thus it ends up with the best of all worlds: easy to sort, consistent for storage, no ambiguity, straightforward, and still easy to pick out the day and month. I mean, how difficult is it to read the last 4 digits of a date instead of the first 4?
For me, I try to use the ISO standard date format as much as possible, though for display I would convert to whatever locale setting is in effect. For storage and for working with dates, especially text-based dates, there is simply no alternative to the ISO standard (even allowing for time-zone variations in DateTime data).
Here's the irritating thing though: most programs are designed for the US market, so most by default show the US-only "randomised" date format (not counting some small country somewhere wanting to be "different too" - now that sounds oxi-mor-ish). And no, the US is not the largest part of the world, not by a long shot. Meaning that until users (the largest part of users throughout the world) realise the date is in some unexpected, rearranged order, they're reading it wrong to begin with.
IMO the default (if not, more properly, displaying to user preferences) should be ISO (i.e. YYYY-MM-DD), as it is impossible to misinterpret: there's never anything like YYYY-DD-MM, or at least not that I've come across. So even someone used to DD-MM-YY, or to the US "non-world-citizen's" MM-DD-YY, can quickly pick up exactly what is meant (especially if you stick to the 4-digit year).
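The ambiguity is easy to show concretely; a quick Python sketch (the date strings and format patterns are just examples):

```python
from datetime import datetime

s = "02/03/2021"  # Feb 3 or 2 March? Depends entirely on who wrote it.
print(datetime.strptime(s, "%m/%d/%Y").date())  # 2021-02-03
print(datetime.strptime(s, "%d/%m/%Y").date())  # 2021-03-02

# The ISO form admits only one reading.
iso = "2021-02-03"
print(datetime.strptime(iso, "%Y-%m-%d").date())  # 2021-02-03
```

The same eight characters parse to two different dates depending on the assumed convention, which is exactly the problem; the ISO string cannot do that.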
|
|
|
|