|
This is just going to go around in circles.
cheers
Chris Maunder
|
I have a bone to pick with you - my radius.
/ravi
|
That's pretty humerus.
cheers
Chris Maunder
|
Are you trying to strong arm me?
/ravi
|
I can't think of an on-topic comment that uses "uranus".
|
Shirley you mean 2π rads.
"the debugger doesn't tell me anything because this code compiles just fine" - random QA comment
"Facebook is where you tell lies to your friends. Twitter is where you tell the truth to strangers." - chriselst
"I don't drink any more... then again, I don't drink any less." - Mike Mullikin's uncle
|
Tau master race.
𝜏 = 1 full circle. F*** π!
|
We use radians because the constants in (naively implemented) math libraries are inverses of integers.
As all serious math libraries use economised polynomials, this is less of a problem these days.
Note that IEEE 754-2008 recommends functions such as sinPi, defined as sin(pi*x). This could easily be modified to take angles in degrees, grads, mils, etc.
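The sinPi idea is easy to sketch. Below is a minimal Python version (the names `sinpi` and `sindeg` are my own; IEEE 754-2008 only specifies the mathematical definition sin(pi*x)). The point is that reducing the argument modulo 2 happens exactly in floating point, so large angles don't lose precision the way `sin(math.pi * x)` would, and a degrees version follows the same pattern:

```python
import math

def sinpi(x):
    """sin(pi * x). Reduce the argument modulo 2 first: fmod on
    floats is exact, so large arguments keep full precision, and
    whole turns map to exactly zero."""
    x = math.fmod(x, 2.0)
    if x == math.floor(x):   # sin(pi * n) is exactly 0 for integer n
        return 0.0
    return math.sin(math.pi * x)

def sindeg(degrees):
    """Sine of an angle in degrees, via sinpi (180 degrees = one half-turn)."""
    return sinpi(degrees / 180.0)
```

Grads or mils would just change the 180 to 200 or 3200, which is exactly the modification suggested above.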
If you have an important point to make, don't try to be subtle or clever. Use a pile driver. Hit the point once. Then come back and hit it again. Then hit it a third time - a tremendous whack.
--Winston Churchill
|
The question seems a bit odd to me in that, since I never went to computer school, I always assumed the point of a UI is to give the user what they want. I've often returned SQL dates, for example, as LEFT(datefield, 11), which gave "MMM DD, YYYY" automatically (as long as one remembers to sort by the real datetime values). Or, really, anything else the user needs to look at should be made intelligible. Otherwise, the calls come in and it has to be changed.
From my point of view, the European convention, dd-mm-yyyy (regardless of delimiters), is every bit as dumb as the US convention: it won't sort correctly, which is a pain in the ass.
So - I've taken to YYYYMMDD, or, for human readable, YYYY.MM.DD, when it's for my use.
Big Endian, I think, is surely the way to go for dates.
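A small Python sketch of that split - sort on the real date values, and format (here in the YYYY.MM.DD style suggested above) only at the display boundary:

```python
from datetime import date

rows = [date(2017, 8, 7), date(2016, 12, 31), date(2017, 1, 15)]

rows.sort()  # sort on the real date values, never on the display text

# Format only for the UI, after sorting.
display = [d.strftime("%Y.%m.%d") for d in rows]
print(display)  # ['2016.12.31', '2017.01.15', '2017.08.07']
```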
Ravings en masse^
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010
|
W∴ Balboos wrote: Big Endian, I think, is surely the way to go for dates.
Agreed. Especially if some decent delimiter is used (something like '-', '/' or '.'), it is as readable, usable, workable, and consistent as possible. And thus it ends up with the best of all worlds: easy to sort, consistent for storage, no ambiguity, straightforward, and still easy to pick out the day and month. I mean, how difficult is it to read the last 4 digits of a date instead of the first 4?
For me, I try to use the ISO standard date format as much as possible. Though for display I would try to convert to whatever local setting is in effect. But for storage and working with dates, especially if working on text based dates, there is simply no alternative to ISO standard (even allowing for variations in time zones for DateTime data).
Here's the irritating thing, though: most programs are designed for the US market, so by default they show the US-only "randomised" date format (not counting some small country somewhere wanting to be "different too" - now that sounds oxi-mor-ish). And no, the US is not the largest part of the world, not by a long shot. Meaning that until users (the largest part of users throughout the world) realise that the date is in some unexpected, rearranged order, they're reading it wrong to begin with.
IMO the default (if not more properly displaying to user preferences) should be ISO (i.e. YYYY-MM-DD) as it is impossible to misinterpret, there's never something like YYYY-DD-MM, or at least not that I've come across. So even if someone's used to DD-MM-YY, or for the US "non-world"-citizen's MM-DD-YY, it's quick to pick up exactly what is meant (especially if sticking to the 4 digit year).
|
In my own writings I always use big endian dates, always with a 4-digit year, and may change e.g. the naming of files received to suit my preferences. I also move the date ahead of any descriptive term, so that it starts the file name / table entry / whatever. Main reason: It allows sorting on the date (which is my most common sorting criterion) as text.
But I have a slight feeling of being somewhat nerdy when I do so. Humans can sort the dates as they were originally written; my rewriting is for the machine, not for humans.
And, I must admit, I frequently do not do it that way for other kinds of data. My address book is not sorted big-endian but little-endian. If someone asks me for my birthdate, I state it in little-endian form - and that is a date. Time of day is usually little endian ("ten to nine" - "eight fifty" sounds like something from an army guy). Friends are named by their first name preceding their family name.
Little endianness is, in a way, user friendly in that it focuses first on the nearness, and then gradually puts things into a bigger scope. Big-endianness either requires you to start with the universe and narrow down from there, step by step - otherwise, it might be ambiguous. If you make a new friend, telling him where you live may be limited to giving the street name and number; the town, county, state, nation, continent, planet and galaxy are implicit. So, little endian may be more user friendly.
Actually, you have a similar issue in programming! In most programming languages, the opening of a statement may identify it as an assignment ("X = ..."). But after the assignment operator, which gives you the Grand Overview of the statement, you dive deep into the details of the expression, with priority rules en masse, some of which are so obscure that you have to ignore/override them by use of parentheses. The APL language is 100% consistent: No priorities, main things first, and if you want details, continue reading. "X = 3 * <something>": X is being changed, that is the essential thing. It is being set to 3 times some calculated value, no matter how it is calculated; in most cases, the 3 has high semantic importance (e.g. number of units bought). If you want the details, read on to break the <something> up. If you only need an overview, you can read only the first parts of the statements.
And I have worked with languages going the other way (but with operator priorities): "(A+4) * B =: C" - first assemble the pieces, then tell what to do with it (i.e. storing in C).
The most common type of statement, "C = (A+4) * B", jumps up and down in semantic levels, just like month-day-year. Or, "Smith, Jim, Black Falls". Or "Prius car, baby blue".
Our endianness is inconsistent in hundreds of areas; dates is only one. We must learn to live with it, and program our computers to handle it.
To answer the original question: YES, it is our responsibility. Even though sometimes no solution is possible. (E.g. sorting: The Norwegian and Swedish alphabets both add a few characters to the A-Z set - but in different order! So how do you correctly sort a table of names containing both Norwegian and Swedish names?)
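To make the Norwegian/Swedish problem concrete: both languages add their extra letters after Z, but in different orders (Æ Ø Å vs. Å Ä Ö), so the very same names sort differently depending on which alphabet you assume. A toy Python sketch of per-language collation (the explicit-alphabet approach is purely illustrative; real code should use proper locale collation such as ICU):

```python
# Hypothetical toy collation: order letters by an explicit alphabet.
# Real code should use proper locale collation (e.g. ICU / PyICU).
NO_ALPHABET = "abcdefghijklmnopqrstuvwxyzæøå"   # Norwegian: ...z æ ø å
SV_ALPHABET = "abcdefghijklmnopqrstuvwxyzåäö"   # Swedish:   ...z å ä ö

def collation_key(name, alphabet):
    """Map each letter to its position in the given alphabet;
    letters outside the alphabet sort last."""
    return [alphabet.index(ch) if ch in alphabet else len(alphabet)
            for ch in name.lower()]

names = ["Ås", "Øye", "Ärla"]
print(sorted(names, key=lambda n: collation_key(n, NO_ALPHABET)))  # ['Øye', 'Ås', 'Ärla']
print(sorted(names, key=lambda n: collation_key(n, SV_ALPHABET)))  # ['Ås', 'Ärla', 'Øye']
```

Same three names, two different "correct" orders - which is exactly why a mixed Norwegian/Swedish table has no single right answer.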
|
Member 7989122 wrote: So how do you correctly sort a table of names containing both Norwegian and Swedish names?
Got similar issues in my home tongue Afrikaans, lots of áâäèéêëïôöûü going on, not to mention other characters from French and German which made their way into it as well.
Usually for sorting purposes I tend to first convert them to an invariant culture, you do get some decent libraries which are pretty fast with this.
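That "convert to an invariant culture first" step can be roughly approximated in plain Python with Unicode decomposition: strip the combining marks, then compare case-insensitively. A sketch only - a real collation library handles the many edge cases this ignores:

```python
import unicodedata

def fold_accents(s):
    """Rough invariant-culture fold: NFD-decompose, then drop combining marks."""
    decomposed = unicodedata.normalize("NFD", s)
    return "".join(ch for ch in decomposed if not unicodedata.combining(ch))

names = ["Müller", "Muller", "Mêller"]
# Sort on the folded, case-folded form; break ties with the original string.
print(sorted(names, key=lambda n: (fold_accents(n).casefold(), n)))
# ['Mêller', 'Muller', 'Müller']
```

Note this only works for true accents (á, ê, ü...); letters like the Nordic å or ø are separate letters, not accented ones, and need real locale collation.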
Member 7989122 wrote: Our endianness is inconsistent in hundreds of areas; dates is only one.
Definitely correct. Even just addresses are a problem over here (well, not so much a problem as an inconsistency). E.g. in English you state number then street, in Afrikaans it's the other way round, but in both they're then followed by suburb/area, etc.
Member 7989122 wrote: Big-endianness either requires you to start with the universe and narrow down from there, step by step - otherwise, it might be ambiguous.
Not completely in agreement here. If I follow the same principle for little-endianness, the same idea then means you never stop until you reach the universe. And the same idea about stopping once it becomes obvious can be applied to big-endian too: Only start where you think it's no longer obvious (e.g. don't state the country if you're already there).
To show an example of big-endian we very commonly use: Telephone numbers. We seldom state the area code if they're in the same city, we even more seldom state the country code when they're in the same country. Just because they're big-endian doesn't mean people HAVE to start with the universe.
|
My problem with the dd/mm/yy format is that there are a lot fewer numbers you can make into math holidays.
In speaking the dates, Americans tend to say something along the lines of "August 4, 2017", in which case our format of mm/dd/yy makes sense. Do you non-Americans say it differently?
|
I saw a video on this subject recently explaining why US dates are presented as MM/dd/yyyy rather than the more common dd/MM/yyyy. Apparently it became common practice in the US to state dates as August 3rd or December 15th rather than the other way around, and that translated into the short form that is used to express a date.
You get used to it. You never like it, but you learn to live with it.
|
Have you ever considered what "Go back to where you came from!" would be to most white Americans?
Do you really want that to happen?
|
Obviously, there are at least 3 parts to this:
0) How the data is handled in the back-end.
1) How the data is entered on the front-end.
2) If and how "1" is labelled on the UI.
Given your subject line it seems you're more interested at the moment in 2. If the whole world had one method then problem solved - no label required. Unfortunately, all 7.5 billion* of us can't agree on much of anything so the data needs to be labelled. Just treat it like anything that has a unit of measure. Mass, volume, speed, temperature, etc...
*Or should I write 7,500,000,000? Or is that 7500000,000? Or is that 75,00,00,00,00? Or is that 7.5 × 10⁹? Or should I write 7.5 milliard?
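Python's format specifiers make the "which separator?" question explicit. A quick sketch of the locale-independent options (locale-aware grouping would need the locale module, whose behaviour depends on what locales the OS has installed, so it's left out here):

```python
n = 7_500_000_000

print(f"{n:,}")    # 7,500,000,000 - comma grouping
print(f"{n:_}")    # 7_500_000_000 - underscore grouping, unambiguous in code
print(f"{n:.1e}")  # 7.5e+09       - scientific notation
```

Just like dates, the grouping is a unit-of-measure-style label problem: the number is the same, only the presentation convention differs.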
Whenever you find yourself on the side of the majority, it is time to pause and reflect. ~ Mark Twain
|
In short, yes. But also we have a responsibility to the user in particular rather than just anyone willy-nilly. The fact is, people will always find a way to disagree on stuff globally. It helps us feel unique. And in fact it's quite healthy; otherwise we'd all be mindless zombies never challenging or changing the status quo. Of course, it's all about balance; otherwise we as people would never agree on anything and thus never get anywhere.
So, in short, stuff like date format is context sensitive. It's why we have locales, etc. But you're right in the fact we as UI designers need to make certain things obvious. There are ways to do it, just a lot of people are lazy and only do the bare minimum.
I recently had a JSON feed with UTC dates in it... UTC dates mind you!! The feed tried to get all fancy with dashes to make it look like an ISO format too. But no, the person that made the feed put it in as mm-dd-YYYY - dashes, not slashes. Let's overlook the fact that a UTC date should never be formatted this way IMO, but this dude is apparently anti-slash and anti-YYYY-mm-dd.
Point being, the dashes are a hint for most normal people that it should be an ISO date. And in a way that's what the UI should try to accomplish: find ways to give us hints and visual cues. When done properly it helps tremendously - as in, don't clutter up the screen with information overload; find ways to make the cues subtle.
And another way of helping people figure it out is to give the user what the user expects to see in the first place, depending on its context.
Anyway, hopefully in the future paper receipts will be digital and they'll account for this kinda stuff.
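One defensive habit that pairs well with those visual cues: parse feed dates strictly, so a fake-ISO mm-dd-YYYY string like the one above fails loudly instead of being silently misread. A small Python sketch (`parse_feed_date` is a hypothetical helper, not from the feed in question):

```python
from datetime import datetime

def parse_feed_date(s):
    """Accept only genuine ISO 8601 calendar dates (YYYY-MM-DD)."""
    return datetime.strptime(s, "%Y-%m-%d")

print(parse_feed_date("2017-08-07").date())  # parses fine

try:
    parse_feed_date("08-07-2017")  # dashes, but mm-dd-YYYY: not ISO
except ValueError:
    print("rejected non-ISO date")
```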
Jeremy Falcon
|
Particularly annoying with receipt printers is that the processor (the company that handles the CC transaction) often dictates the format of the receipt in case of a dispute, including the date format, specifically so that it is unambiguous to them. At least, such was the case when I had to generate the receipts for CC purchases and check cashing.
Marc
Latest Article - Create a Dockerized Python Fiddle Web App
Learning to code with python is like learning to swim with those little arm floaties. It gives you undeserved confidence and will eventually drown you. - DangerBunny
Artificial intelligence is the only remedy for natural stupidity. - CDP1802
|
I don't display dates in mm/dd/yy unless the stake holder demands it. I always prefer to use dd-MMM-yyyy for the very reason you cite.
".45 ACP - because shooting twice is just silly" - JSOP, 2010
- You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010
- When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
|
The real goal is to switch everybody to yyyy-mm-dd (e.g. 2017-08-07).
1. It contains all the needed information.
2. It's good for sorting because it goes bigger --> smaller.
3. No one would get confused over which number means what (bigger --> smaller principle).
4. It's already used by 1.5 billion Chinese and possibly in other places in Asia too.
While we're at it, we should also switch to a 24H clock (basically for the same reasons).
If you like it: use it in your daily life!
You might still want to use mm/dd/yy on your IRS reports, but other than that - use it whenever you can and by god we will make the world unite around this one proper date/time format.
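Reason 2 is worth seeing in action: for yyyy-mm-dd (and a 24H clock appended to it), plain string comparison is date comparison, so even tools that know nothing about dates sort correctly:

```python
dates = ["2017-08-07", "2016-12-31", "2017-01-15"]
# Plain string sort *is* chronological sort for yyyy-mm-dd.
print(sorted(dates))  # ['2016-12-31', '2017-01-15', '2017-08-07']

# The same holds with a 24-hour time appended - no AM/PM ambiguity.
stamps = ["2017-08-07 09:05", "2017-08-07 21:05"]
print(sorted(stamps) == stamps)  # True
```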
|
Unlike programmers, who always simplify THEIR OWN lives, there are normal people who don't care about your way of sorting.
We should think about the usage of dates in every case, not only the dates of files! And if you're human, you hardly like months in numeric form. What's that month "10"??? November? August? Don't give a damn - it should be LETTERS! And of course, unless you're a time traveller, you know what year it is today! So first place should be occupied by the most important parts - day and month. My preference is "dd MMM yyyy" - it's HUMANLY and it's convenient. And it never leaves you guessing what the order is. "17 Feb 2017" - simple, obvious.
|
During the US military attack on Iraq in 2003, Americans renamed (or tried to do so) "french fries" to "freedom fries" to indicate their dissatisfaction with French opposition to the bombings.
I think that a nice similar political action, in these days of terrorism from the Arabic world, would be to reverse the digit order in our numbers. Our digits are, as most people know, Arabic numerals - exactly the same as the digits used in Arabic writing. Except that in Arabic they are little endian, while in Latin script they are big endian: We write them the same, but the Arabs read them from right to left, we read them from left to right. To clearly dissociate from Arab culture, we should reverse the digit order, and reject doing it "the Arab way".
This of course pinpoints another problem: By trying to be different from the Arabs, by writing numbers the other way around, in another sense it makes us more like them! They do it little endian, we change to little endian, too... Hmmmm....
|
First of I'd answer your actual question (do devs have a UI responsibility) with a resounding "yes".
As to why US companies insist on forcing the rest of the world to use an illogical date format that practically no other country uses? I think it's the same underlying reason why we get poor UIs in general: wrong mindset.
As a dev, your job is to write software for your users, not for yourself/your ego. This is something a lot of developers seem to forget/aren't aware of. That means putting yourself in your users' shoes and designing software that meets your users' functional requirements while also being a joy to use.
Poor UI design generally is the result of only the first part of this equation being deemed important: meeting the functional requirements. The ease of use is an afterthought, and you can guarantee the dev never actually used the UI in a real-life situation (they might, if you're lucky, test a few boundary cases). A prime example of this was the drop-down list in Outlook years ago for choosing the year of birth for a contact. Not only was this a drop-down list with 100 entries or so (one per year), the default value was the current year at the top of the list.
Now if they'd tried entering the data of a real person or two, they would immediately have realised that none of their users will have an address book full of babies born this year, and that selecting something like "1967" from a massively long list is a UI fail.
But if all you test is "date in the past, today, date in the future", you won't ever see the massive design flaw. MS fixed this in a subsequent update.
And this mindset - whereby people find it difficult to empathise and put themselves in other people's shoes - is also the reason for these weird date formats nobody (apart from Americans) wants.
As a programmer, you must know that there are different date formats (you should at least know ISO and your local format if it's not ISO). So it can't be ignorance, it can only be an unwillingness to put yourself in your users' shoes. Because if you did, you'd immediately understand that no one wants to use an unfamiliar and confusing date format. If your mindset is "we don't care about anyone else", your software suffers as a result. And it won't only be the date format that suffers.
|
Chris Maunder wrote: Somewhere a programmer decided to output the date this way. I have another thought about this, on this surprisingly long thread.
US Date is written the way we say it. August 7, 2017. I don't know all of the Europolyglot, but I believe a German date would be 7 August 2017. These are how they are spoken. Now, just transform them to a number format whilst maintaining this order natural to the reader.
At least to begin with.
Conventions and adaptations have followed suit. Consider: even the metric system could be recast with a different-size 'meter', and everything else calculated to remain the same relative to the alternate size (rename them if you wish). But it isn't that way. Only one base-ten system was adopted by (so far as I know) all parties using it.
This desire for convention is relatively new. Think of what is now the ultimate example of adopting convention: the Euro. With its particular value. Not a value with any real symbolic significance (or perhaps, trying to be similar to the US dollar in value!)
So - unless it's from a purely logical standpoint (YYYYMMDD), which is useful for sorting, or the internal representation we so love (which, you recall, is actually an FP value - stardates, like in Star Trek), the date is in a human-readable form (UI, remember!). Honestly - doesn't it make sense to write it the way the reader will naturally say it?
Ravings en masse^
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010
|