|
Eleven plus two
Twelve plus one
|
|
|
|
|
Now you added the second line ... we have a winner!
11 + 2 ELEVEN PLUS TWO
= 13
no matter how you look at it (anag)
TWELVE PLUS ONE
You are up tomorrow
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
|
I liked it!
I wasn't sure if anyone would get it, but I knew they would all know the words ...
|
|
|
|
|
I either love it or I hate it. My brain is outright refusing to decide which one it is.
|
|
|
|
|
I had seen it before - can’t remember where but relatively recently
|
|
|
|
|
Damn.
Somebody is stealing my ideas before I have them ...
|
|
|
|
|
Guess you must have subconsciously seen it too.
Be quite something if you came up with that independently.
|
|
|
|
|
Could be - but I have no idea where I might have; I don't read any newspapers ...
|
|
|
|
|
|
Hot Damn! Two years ago ... I'd forgotten all about that - or thought I had!
Well done!
|
|
|
|
|
It seemed familiar to me too. Not the clue, but the anagram. Perhaps somebody posted it in the lounge at some point. Otherwise it would have been FB or LI.
Found this[^]. Although it's longer ago than I expected.
|
|
|
|
|
There was a very similar clue in one of the better British papers years ago
"We can't stop here - this is bat country" - Hunter S Thompson - RIP
|
|
|
|
|
I'm talking about the Cloud and the abstraction of hardware architecture.
I remember, back in the 2000s, when I joined a company as a rookie: a team of IT guys, all big experts, sitting down and discussing the hardware configurations and networking needed for a new project.
As things evolved, gradually, in the next companies I worked for, there were no datacentres. All the projects were deployed on the cloud (yep, typical start-up). And more recently, teams are talking about "serverless" - which means you don't even get to read the configurations on paper.
Now I'm back at a mid-size company that has a datacentre, but most of the projects are still in the cloud and nobody gets a chance to discuss hardware.
What do you think? Is cloud advancement a positive evolution for developers,
or is it clearly dumbing down our brains with respect to hardware architecture?
Great experience that I would love to have[^]
modified 16-Mar-20 6:44am.
|
|
|
|
|
Who works for those cloud companies?
|
|
|
|
|
AndyChisholm
|
|
|
|
|
Nand32 wrote: clearly dumbing down the brains
I feel that ever since we moved to languages higher-level than C (or even C++), it has been a continuous dumbing down of the brains (with respect to software).
|
|
|
|
|
I am curious about your criteria for classifying a language as "higher than C (or even C++)".
I may be disagreeing with you, both on what makes a language "higher" and on the degree to which the language will "dumb down" the brain.
It could be "dumbing down" in a similar sense that car driving was "dumbed down" when the synchronized gearbox appeared, and even more when automatic transmission became common. Shifting gears is not the problem of driving today (with the possible exception of the racing track); it is not what distinguishes a good driver from a bad driver. It takes quite different qualities to become a good driver than the ability to handle an unsynchronized gearbox.
An analogy: Programming languages went from all-static allocation (old-time Fortran didn't even allow recursion; it didn't have a stack), to "pedal-driven" (malloc/free) heap management, to automatic garbage collection. You could declare a Fortran array and hand-craft functions for allocating fragments of it as if it were a heap. In C/C++, you can leave the "how" to the compiler and run-time library, but you are yourself responsible for managing the allocated space, making sure it is properly disposed of, and disposed of only once. In newer languages, you don't have to be your own garbage man. Is leaving your garbage to a garbage collector an example of "dumbing down"?
Generally speaking, "dumbing down" is 98% how you use the language, regardless of the language itself. When I started programming, "structured languages" (such as Algol, Pascal, Simula, ...) were pushing Fortran to the side. Not every programmer got a grip on the high-level flow constructs, and a common saying was that "you can do Fortran programming in any language". Until you learn to use higher abstraction mechanisms properly, you won't benefit from them. Once you learn, they may help you make more robust programs, increase your productivity and create more readable programs, which bear a much closer resemblance to the real-world problem they attempt to solve.
|
|
|
|
|
Hmm, I think about the same at times. But then it reminds me of the Assembly-language and punched-card experts and their ideas about C/C++. Maybe it's relative?
|
|
|
|
|
Thanks.
|
|
|
|
|
The brain can only contain so much... If you need to fill it up with information on how to run servers, something else will have to go.
Software always develops to the complexity level that the people working on it can handle (well, a bit more) somewhat efficiently. Unskilled people reach that level fast, as they make complex solutions to simple problems - but no matter your experience level, you will reach the limit.
So "shut up and run my code" (a.k.a. serverless - a name making no sense, as it is running on... wait for it... SERVERS) is ideal in my book. Why should I care about load balancers etc.? If I need 10 instances, of course there should be a load balancer in front of them; that does not require a meeting to find out. If I define that job x needs to talk with job y, of course there needs to be a network allowing this (and only this). For once Microsoft appeared to have understood that with the original, now-deprecated Azure roles, but no-one else understood, and all the idiots headed for virtual machines in the cloud (WTF).
|
|
|
|
|
Evolution always moves in the direction of ordered systems, so it's perfectly natural that systems with a lot of more complex but transparent background processing will take the place of less complex systems.
"The fittest", in the computer world, is measured by the number of users (which is essentially the same as in real-world evolution -- "survivor" apps and concepts prosper and proliferate). The survivors are inevitably more complex under the hood than their forebears, to allow greater range and/or simplicity for users, who ratify their evolution by preferring and using the newer products.
"Dumbing down" doesn't enter into it. A more advanced product/system must be easier to use, or it will not prosper and proliferate (yes, this means you, Linux desktop), just as a dim(ension) saw is easier to use than a manual ripsaw -- the carpenters who use dim saws are by no means dumber; they just have the opportunity to get more done in the same amount of time.
The things to watch out for in IT evolution are false evolutions, where nothing actually improves except the marketing, and where systems become more complex for no good reason, becoming more cumbersome to use, not less (yes, this means you, windows).
I wanna be a eunuchs developer! Pass me a bread knife!
|
|
|
|
|
Mark_Wallace wrote: The things to watch out for in IT evolution are false evolutions, where nothing actually improves except the marketing, and where systems become more complex for no good reason, becoming more cumbersome to use, not less
So ... The Cloud, then ...
|
|
|
|
|
No doubt whatsoever about that. The cloud was an obvious failure-fad, good only for a small, niche market, from the moment it was first touted.
|
|
|
|
|
Great thought.
|
|
|
|
|