|
In my experience, the term "nerd" is often a pejorative, used as a label for people you do not like, whereas "geek" is applied to someone who has deep technical knowledge.
|
|
|
|
|
|
I just looked up the dictionary definition of a geek.
Originally a geek was a carnival performer whose act consisted of wild or disgusting feats such as biting the heads off live chickens. How it came to be applied to our present understanding of the term I do not know, but this original definition has fallen to second place. The first definition of geek is now "an unfashionable or socially inept person."
Maybe geeks bite the heads off live computers!
|
|
|
|
|
A geek finds a great deal of interest in computers and electronic gizmos; they show little interest in athletic sports and tend toward goofiness and shyness. Their lack of physical prowess suggests they are more easily dominated than others.
A nerd is a victim of society due to a lack of aptitude, both physical and mental. Someone to destroy.
In a mechanistic world, they are co-victims, and so the terms become confused as people attempt to distance themselves from the contests of bullies in schoolyards and barrooms. Of course, the entertainment media settle on stereotypes the audience can relate to.
These words can be confused easily. You shouldn't think of yourself as a nerd, IMHO, unless you're really weird.
Remain Calm & Continue To Google
|
|
|
|
|
You're right on the nerd-as-victim. The movie "Revenge of the Nerds" had a ring to it because you wouldn't imagine nerds being ept enough to exact an effective revenge. "Revenge of the Geeks" would be something else entirely.
|
|
|
|
|
|
It's the golden anniversary of the development of the Simula programming language, and I was reading an article evidently from the university that developed it.
50 years anniversary of Simula, the first object-oriented programming language - Universitetet i Oslo[^]
The article says that object-oriented languages were not of general interest to researchers until the '80s. However, my understanding of the famous PARC program of the early '70s was that Smalltalk was developed there. And then C++ was developed in the late '70s (the developer stated that he borrowed very heavily from Simula).
So is this article wrong on this? Or am I wrong?
|
|
|
|
|
|
swampwiz wrote: (the developer stated that he borrowed very heavily from Simula)
Because "borrowed" is not a recognised OO practice.
had he "inherited,", or perhaps "encapsulated" or even "polymrphed" from Simula, now that would be different.
But yeah, actually I believe you are correct, I've heard/read the same too (- I think way back in my distant past uni days so no idea of the references.)
Format Success.
Welcome to your new signa&*(gD@@@ @@@@@@*@x@@
|
|
|
|
|
It has two sides...
All those who copied the OO ideas didn't turn it into a science; they used it because it was good and useful. In the late '80s it was written down in a scientific way, more languages were designed along its lines, and the OO method took over... Until then it was mostly a 'lab issue', and it became of 'general interest' only once it became popular...
Skipper: We'll fix it.
Alex: Fix it? How you gonna fix this?
Skipper: Grit, spit and a whole lotta duct tape.
|
|
|
|
|
I started computer science at university in 1988.
Back then OO was not taught until the second year - if you chose that subject.
In the first year we were taught structured programming in Pascal.
I think this was largely because, even in 1988, there were still some people on a computer science degree course who had never programmed a computer before.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
|
|
|
|
I was in college at that time too (1986 - 1992).
If memory serves, Turbo Pascal v5.5 and Turbo C++ v1 were released in 1989/1990.
I bought both as soon as they hit the shelves.
There appeared to be more excitement concerning OOP being added to familiar languages than OOP-specific languages.
I still prefer multi-paradigm languages.
|
|
|
|
|
PIEBALDconsult wrote: I still prefer multi-paradigm languages.
I think that is why I really like C# and .NET, as I can use the procedural parts, the OO parts, and the functional and declarative parts (lambdas and LINQ), and not be forced into any one paradigm.
“That which can be asserted without evidence, can be dismissed without evidence.”
― Christopher Hitchens
|
|
|
|
|
GuyThiebaut wrote: I really like C# and .NET
Oh, let me share this information with you then, assuming that you didn't hear of it already!
Quote: Anders Hejlsberg. He was the original author of Turbo Pascal and the chief architect of Delphi. He currently works for Microsoft as the lead architect of C# and a core developer on TypeScript.
Anders Hejlsberg - Wikipedia[^]
|
|
|
|
|
Researchers? I think that identifies a big part of why OO wasn't adopted earlier. Researchers, as much as they may be vocal about claiming to be original thinkers, tend to stay in their comfort zone of what they know and what they got their PhD in. They tend to have "think outside of the box" hammered out of them. Such, at least, has been my experience over the years.
It's also interesting to note that OO became more adopted by the masses when computers and programming in general became available to the masses. Certainly I wasn't introduced to it until the late 80's, early 90's, when Borland C++ was an affordable $99.
On the other hand, I also think OO was (and continues to be) a long and winding dead end. Certainly some aspects of it are useful, but in hindsight, I think it would have been much more useful for mainstream programming to have glommed onto the ideas of agent-based programming rather than object oriented programming. But I can see how agent-based programming failed to take hold, particularly as hardware and operating systems were not up to the task (pun intended) of supporting asynchronous, multi-threaded, autonomous agents, except perhaps at the very low level kernel of the OS -- something certainly not accessible to high level languages, and something certainly not even implementable in most CPU architectures.
So even after hardware, CPUs, and languages finally started supporting true multi-tasking, we're still stuck for the most part with OO as the scaffolding in which we program.
Marc
|
|
|
|
|
AOP looks very interesting, hadn't heard of it before.
I think OO is a good tool to have, like most other things; right tool for right job.
Someone's therapist knows all about you!
|
|
|
|
|
Marc Clifton wrote: hardware and operating systems were not up to the task (pun intended) of supporting asynchronous, multi-threaded, autonomous agents, except perhaps at the very low level kernel of the OS -- something certainly not accessible to high level languages, and something certainly not even implementable in most CPU architectures.
Per the following, threads were available to higher-level languages as early as 1965.
Comp.os.research: Frequently answered questions [1/3: l/m 13 Aug 1996]Section - [2.2.3] The history of threads[^]
Asynchrony, at least in some form of the definition, was available in the earliest CPU chips. Naturally, the more common definition requires threading, but that is not the only definition.
All of the following CPU chips had one or more interrupt pins: 8008, 6502, Z80, 8080. The 4004 didn't. So one might suppose either that the idea existed with the 4004 but there was no way to implement it, or that it came into existence right about then.
|
|
|
|
|
jschell wrote: Per the following, 1965 was where threads were available to higher level languages.
Interesting writeup, particularly the evolution of Unix with regards to processes (separate memory) and threads (shared memory.)
jschell wrote: All of the following CPU chips had one or more interrupt pins: 8008, 6502, Z80, 8080.
Except for the 8008, I've programmed all of those. Interrupts were really nothing more than a hardware-induced subroutine call - the handler had to save all the registers it was going to modify and restore them before returning from the interrupt. Plus, none of those processors had any concept of locking shared memory.
Hyperthreading is probably the closest I've seen to the CPU handling the register exchange between two or more processes.
Marc
|
|
|
|
|
Marc Clifton wrote: Interrupts were really nothing more than a hardware induced subroutine call
Yes, but at least the definition I found for "asynchronous" allowed for that. So, for the time period I was suggesting, it was equivalent.
|
|
|
|
|
"general" being the operative word.
|
|
|
|
|
I did my first university-level programming course in 1977, programming in Pascal. Later, I have realized that my professor was way ahead of his time: from Day 1, we were taught to put all attributes of <whatever> into a RECORD, and to collect all manipulation of that RECORD type in a well-defined set of functions/procedures that did nothing but operate on that RECORD type. All of these functions/procedures were to have a RECORD instance as their first parameter. So, when OO became mainstream a few years later, to me it was simply a matter of changing ThingFunction(Thing, A, B, C); into Thing.ThingFunction(A, B, C);
So what's the big deal with OO? Of course: OO is syntactical sugar. But I had tasted lots of that sugar already, in plain Pascal.
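To make that concrete, here is a minimal sketch of the two styles side by side - written in C++ rather than Pascal, and with made-up names (Thing, ThingFunction) purely for illustration - showing that the member-function call is essentially the same code with the record parameter moved in front of the dot:

// Pre-OO style: a plain record plus free functions that take it as their first parameter.
struct Thing {
    int count;
    double value;
};

void ThingFunction(Thing& t, int a, double b) {
    t.count += a;    // every manipulation of Thing lives in functions like this one
    t.value += b;
}

// OO style: the same operation as a member function; 'this' replaces the explicit first parameter.
struct ThingOO {
    int count;
    double value;

    void ThingFunction(int a, double b) {
        count += a;
        value += b;
    }
};

// Usage:  ThingFunction(t, 1, 2.0);   versus   t2.ThingFunction(1, 2.0);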
The first C++ compiler we got hold of - around 1981, a couple of years before the official release - was really a prototype that wasn't a full compiler: it transformed C++ constructs into plain C, which could be compiled by any K&R C compiler. (I believe that even the first official releases worked that way.) It gave us the opportunity to see how the C++ compiler did it: the class objects, the function pointer tables and so on. That was really great, helping us to understand how the mechanisms were intended to be used.
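As a rough illustration of what that translation made visible - not the actual output of any particular compiler, and with invented names - a class with a virtual function can be lowered to a plain struct carrying a hidden pointer to a per-class table of function pointers:

// A C++ class with a virtual function...
struct Shape {
    double x, y;
    virtual double Area() const { return 0.0; }
};

// ...lowers, roughly, to a plain struct plus a per-class table of function pointers.
struct Shape_vtbl;               // the "function pointer table" the translator generates

struct Shape_c {
    const Shape_vtbl* vptr;      // hidden pointer to the class's table
    double x, y;
};

struct Shape_vtbl {
    double (*Area)(const Shape_c*);   // one slot per virtual function
};

// A virtual call  s->Area()  then becomes an indirect call through the table,
// with the object itself passed as a hidden first argument:
double CallArea(const Shape_c* s) {
    return s->vptr->Area(s);
}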
It also made me recognize OO written in assembly code! I got access to the source code of a 1974-vintage OS (not a toy one, but a full-blown one with multi-user time sharing and real-time functionality). The driver model of the OS had class objects, inheritance, function tables, class instance objects... all the stuff we had seen the C++ compiler create. I don't think the OS creators ever put the label 'OO' on it; it was just a proper way to implement an OS (very similar to the methodology taught by my 1977 professor). So again: the OO methodology and mindset were not revolutionary at that time - it was primarily the high-level language support for those ideas, and, of course, the terminology, putting the right labels on it.
I had my first Simula course in 1979/80. At that time, no one called it an "OO language"; it was a "simulation language". Not until C++ made OO mainstream did the Simula people jump up, exclaiming "But we did that years ago - we did OO, too!" Even though Simula is older than C++, C++ was the one that introduced the term "OO" to the general programming world, and OO in Simula appeared as an afterthought, not as a predecessor OO language.
|
|
|
|
|
It's a sliding scale. In the mid-'80s, when and where I was in college, we weren't taught anything OO. However, I talked to people who were using Smalltalk in their research. I even ran into one guy who claimed to be working on OO extensions to COBOL (yeah, none of us believed him at the time either).
So, as of the mid-'80s, OO and OO languages were already being used for research in some groups, but many others still weren't using them, and it hadn't made it into the curriculum where I went to school.
|
|
|
|
|
|
Clever - unlike the person who chose the muzak.
Bad command or file name. Bad, bad command! Sit! Stay! Staaaay...
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Poor thing - did you look at the cage they're keeping it in?
It's a jungle beast, not supposed to be in a hard, cold stainless steel box.
Format Success.
Welcome to your new signa&*(gD@@@ @@@@@@*@x@@
|
|
|
|
|