|
Nerds are studious. Geeks are circus performers.
|
|
|
|
|
Geeks are (like circus geeks) unusual in the way they show their passion. Nerds are unusual in the depth of their passion. A geek will bite the head off a chicken. A nerd will spend days analyzing the bite marks.
|
|
|
|
|
For me, "geek" applied to technology, and geeks were the more socially capable of the two.
"Nerd" covered the socially awkward kids playing Dungeons and Dragons, getting too deep into ANY of the sci-fi stuff, etc.
I was a geek; I had friends who were both nerds and geeks.
But in a foreign country... I could see the challenge. The two words could be interchanged. In The Big Bang Theory, I consider them mostly nerds!
|
|
|
|
|
Thanks for referring to TBBT!
Yes, exactly, this is the point - at least regionally, around Vienna where I live, nobody ever uses the term "geek" - in fact, if I tell someone that I am, in theory, more a geek than a nerd, I get back "Geek? What's that?"
The thing is, here (again: at least regionally), TBBT "created" the word "nerd" for the masses. Now everybody refers to "crazy tech/science people" as nerds - no matter whether they play games, develop software, repair printers for their mother-in-law for a living, or are chemists/physicists.
It's been years since I last heard the word "geek". Even some of the very young junior devs know "nerd" very well, but you get a... confused look from them if you refer to geeks.
|
|
|
|
|
|
The Venn is missing "wears pocket protector" and "has calculator strapped to their hip in a zippered leather (or simulated plastic leather) holster."
In my day (the 1970s), nerds - a pejorative term, btw - gave themselves away by what they wore.
The term geek, also pejorative, existed, but in my circle of friends it wasn't used as often.
Maybe because that's what we were.
Cheers,
Mike Fidler
"I intend to live forever - so far, so good." Steven Wright
"I almost had a psychic girlfriend but she left me before we met." Also Steven Wright
"I'm addicted to placebos. I could quit, but it wouldn't matter." Steven Wright yet again.
|
|
|
|
|
If I had the inclination to really consider it, I would say that the two are closely akin to each other; however:
Geek - Tends to suggest one who, while being brilliant, may often be disconnected from the world around them.
While...
Nerd - Seems to suggest to me that same individual who has managed to circumvent the disconnected state of a burgeoning geek and succeeded at bridging the gap between the technical and the physical world.
Consequently, geeks are more common than nerds, and nerds are typically much more successful at integrating and likely more accomplished by worldly standards.
Just my thoughts.
|
|
|
|
|
In my life, the term "nerd" is often a pejorative, used as a label for people you do not like. "Geek" is applied to someone who has deep technical knowledge.
|
|
|
|
|
|
I just looked up the dictionary definition of a geek.
Originally, a geek was a carnival performer who performed wild or disgusting acts, such as biting the heads off live chickens. How it came to be applied to our present understanding of the term I do not know, but this original definition has fallen to second place. The first definition of geek is now "an unfashionable or socially inept person."
Maybe geeks bite the heads off live computers!
|
|
|
|
|
A geek finds a great deal of interest in computers and electronic gizmos; they show little interest in athletic sports and tend toward goofiness and shyness. Their lack of physical prowess suggests they are more easily dominated than others.
A nerd is a victim of society due to a lack of aptitude, both physically and mentally. Someone to destroy.
In a mechanistic world, they are co-victims, and therefore the terms become confused as people attempt to distance themselves from the contests of bullies, and so on, of schoolyards and barrooms. Of course, the entertainment media settles upon stereotypes to which the audience can relate.
These words can be confused easily. You shouldn't think of yourself as a nerd, IMHO, unless you're really weird.
Remain Calm & Continue To Google
|
|
|
|
|
You're right on the nerd-as-victim. The movie "Revenge of the Nerds" had a ring to it because you wouldn't imagine nerds being ept enough to exact an effective revenge. "Revenge of the Geeks" would be something else entirely.
|
|
|
|
|
|
It's the golden anniversary of the development of the Simula programming language, and I was reading an article evidently from the university that developed it.
50 years anniversary of Simula, the first object-oriented programming language - Universitetet i Oslo[^]
The article says that object-oriented languages were not of general interest to researchers until the '80s. However, my understanding of the famous Xerox PARC program of the early '70s was that Smalltalk was developed there. And then C++ was developed starting in the late '70s (its developer stated that he borrowed very heavily from Simula).
So is this article wrong on this? Or am I wrong?
|
|
|
|
|
|
swampwiz wrote: the developer stated that he borrowed very heavily from Simula
Because "borrowed" is not a recognised OO practice.
had he "inherited,", or perhaps "encapsulated" or even "polymrphed" from Simula, now that would be different.
But yeah, actually I believe you are correct, I've heard/read the same too (- I think way back in my distant past uni days so no idea of the references.)
Format Success.
Welcome to your new signa&*(gD@@@ @@@@@@*@x@@
|
|
|
|
|
It has two sides...
All those who copied the OO ideas didn't turn it into a science, but used it because it was good and useful. In the late '80s it was written down in a scientific way, more languages were designed along its lines, and the OO method took over... until that time it was mostly a 'lab issue', and it became of 'general interest' only when it became popular...
Skipper: We'll fix it.
Alex: Fix it? How you gonna fix this?
Skipper: Grit, spit and a whole lotta duct tape.
|
|
|
|
|
I started computer science at university in 1988.
Back then OO was not taught until the second year - if you chose that subject.
In the first year we were taught structured programming in Pascal.
I think this was largely because, even in 1988, there were still some people on a computer science degree course who had never programmed a computer before.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
|
|
|
|
I was in college at that time too (1986 - 1992).
If memory serves, Turbo Pascal v5.5 and Turbo C++ v1 were released in 1989/1990.
I bought both as soon as they hit the shelves.
There appeared to be more excitement concerning OOP being added to familiar languages than OOP-specific languages.
I still prefer multi-paradigm languages.
|
|
|
|
|
PIEBALDconsult wrote: I still prefer multi-paradigm languages.
I think that is why I really like C# and .NET, as I can use the procedural parts, the OO parts, and the functional and declarative parts (lambdas and LINQ), and not be forced into any one paradigm.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
|
|
|
|
GuyThiebaut wrote: I really like C# and .NET
Oh, let me share this information with you then, assuming that you didn't hear of it already!
Quote: Anders Hejlsberg. He was the original author of Turbo Pascal and the chief architect of Delphi. He currently works for Microsoft as the lead architect of C# and core developer on TypeScript.
Anders Hejlsberg - Wikipedia[^]
|
|
|
|
|
Researchers? I think that identifies a big part of why OO wasn't adopted earlier. Researchers, as much as they may be vocal about claiming to be original thinkers, tend to stay in their comfort zone of what they know and what they got their PhD in. They tend to have "think outside of the box" hammered out of them. Such, at least, has been my experience over the years.
It's also interesting to note that OO became more adopted by the masses when computers and programming in general became available to the masses. Certainly I wasn't introduced to it until the late 80's, early 90's, when Borland C++ was an affordable $99.
On the other hand, I also think OO was (and continues to be) a long and winding dead end. Certainly some aspects of it are useful, but in hindsight, I think it would have been much more useful for mainstream programming to have glommed onto the ideas of agent-based programming rather than object oriented programming. But I can see how agent-based programming failed to take hold, particularly as hardware and operating systems were not up to the task (pun intended) of supporting asynchronous, multi-threaded, autonomous agents, except perhaps at the very low level kernel of the OS -- something certainly not accessible to high level languages, and something certainly not even implementable in most CPU architectures.
So after hardware, CPU's, and languages finally started supporting true multi-tasking, we're still stuck for the most part with OO as the scaffolding in which we program.
Marc
|
|
|
|
|
Agent-based programming looks very interesting; I hadn't heard of it before.
I think OO is a good tool to have, like most other things; right tool for right job.
Someone's therapist knows all about you!
|
|
|
|
|
Marc Clifton wrote: hardware and operating systems were not up to the task (pun intended) of supporting asynchronous, multi-threaded, autonomous agents, except perhaps at the very low level kernel of the OS -- something certainly not accessible to high level languages, and something certainly not even implementable in most CPU architectures.
Per the following, 1965 was when threads became available to higher-level languages.
Comp.os.research: Frequently answered questions [1/3: l/m 13 Aug 1996]Section - [2.2.3] The history of threads[^]
Asynchrony, at least in some form of the definition, was available in the earliest CPU chips. Naturally, under the more common definition it requires threading, but that is not the only definition.
All of the following CPU chips had one or more interrupt pins: 8008, 6502, Z80, 8080. The 4004 didn't. So one might suppose that either the idea existed with the 4004 but there was no way to implement it, or that it came into existence right about then.
|
|
|
|
|
jschell wrote: 1965 was when threads became available to higher-level languages.
Interesting writeup, particularly the evolution of Unix with regards to processes (separate memory) and threads (shared memory.)
jschell wrote: All of the following CPU chips had one or more interrupt pins: 8008, 6502, Z80, 8080.
Except for the 8008, I've programmed all of those. Interrupts were really nothing more than a hardware-induced subroutine call - the handler had to save all the registers it was going to modify and restore them before doing an interrupt return. Plus, none of those processors had any concept of locking shared memory.
Hyperthreading is probably the closest I've seen to the CPU handling the register exchange between two or more processes.
Marc
|
|
|
|