charlieg wrote: I'm living in a static global world
My World is pretty dynamic.
charlieg wrote: control c / control v must be banned
No! I don't want to type that same basic template for model classes again and again. This and other goodies are there in my repository of common things I need in almost every project I work on. And I will copy and paste them. Wow, I am gangsta!
"It is easy to decipher extraterrestrial signals after deciphering Javascript and VB6 themselves.", ISanti[ ^]
|
the control c / control v comment was mostly in jest... mostly.
Charlie Gilley
Stuck in a dysfunctional matrix from which I must escape...
"Where liberty dwells, there is my country." B. Franklin, 1783
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
|
I'm glad I'm not the only one thinking this way
|
charlieg wrote: Inheritance results in hemorrhoids and other digestive issues
Only if applied cluelessly (which I have done on several occasions, and hopefully I learned from them, though I wouldn't bet on it).
charlieg wrote: I have a file that contains data. I have several of them.
charlieg wrote: I'm thinking that to advance software development to the next level, control c / control v must be banned.
You'd have fixed 90% of the bugs in our codebase and driven one of my less favourite coworkers to the mental hospital. Motion approved!
When I begin a new project I encapsulate everything, and I do the same when I'm tasked to modify something that exists. The only problem in my environment is that we don't have a single product but about five hundred of them and growing, all with more or less the same code at different points in time over 20 years, but no shared file. Each version has its own copy and the source control is a plain .zip file. This means that I may have to fix the same thing a dozen times before it becomes the new baseline for the future - the existing products still have to be maintained with the old code and architecture.
GCS d-- s-/++ a- C++++ U+++ P- L+@ E-- W++ N+ o+ K- w+++ O? M-- V? PS+ PE- Y+ PGP t+ 5? X R+++ tv-- b+(+++) DI+++ D++ G e++ h--- ++>+++ y+++* Weapons extension: ma- k++ F+2 X
|
I would have agreed with you even as recently as 2 years ago.
Now I am really facing the market reality, and I am less evangelistic about holy code. Actually, getting things done faster and cheaper has superseded the need for quality - alas - and people do not care about quality anymore. Security issues in Facebook due to poor coding? After a bit of show from Zuckerberg, everybody has already forgotten. Thousands and thousands of bugs in Microsoft products? People earn money by blogging and youtubing about workarounds.
So if the demand is "please get me this done for yesterday and for no money", then copy & paste code is plain OK.
In my current position, which goes a bit beyond the boundaries of SW development, management made the choice to get products done and tested in low-cost locations, with a ratio of 3 people designing for 2 people fixing the design mistakes as they pop up in series production. Altogether, these 5 people still cost 1/3 of one solid and experienced engineer here, and they are processing about 2 to 3 times as many projects, so... The trend in the industry is to have low-cost locations do rapid prototyping as product design and rapid fixing on a case-by-case demand, since - barring very big issues - nobody cares about quality. The experience of the designers in low-cost locations grows faster than their cost, so in about 2 to 3 years we can expect design skill equivalent to high-cost locations, but still for half the price, and the 2 fixing guys can probably be reduced to 1.
The only element sacrificed in all this is a bit of quality, but foremost innovation - the future will tell if the business model can survive.
|
As a guy that wears a cybersecurity hat, I thank you for ensuring my future employment
"There are three kinds of lies: lies, damned lies and statistics."
- Benjamin Disraeli
|
charlieg wrote: control c / control v
Hey, isn't cloning man's answer to evolution?
|
charlieg wrote: I'm thinking that to advance software development to the next level, control c / control v must be banned.
I wouldn't dream of using control c / control v...
...now that I've coded them into a couple of redundant gaming keys.
|
Polymorphism is DAMN useful. MS broke inheritance in .NET when they coded themselves into a corner, resulting in the seemingly arbitrary restriction that a programmer can only inherit from a single class, but implement multiple interfaces. What a crock.
None of these OOP constructs would be confusing if instructors actually knew how to, oh, I don't know - INSTRUCT.
".45 ACP - because shooting twice is just silly" - JSOP, 2010 ----- You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010 ----- When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
modified 4-Jun-18 8:42am.
|
Well, it was not something Microsoft came up with. Languages like Java had that restriction way before C# did. People moving from C++ usually find it annoying, but after a few years of coding in C# or Java, most people find it not to be much of a limiting factor at all. And compiler authors don't have to deal with the diamond problem and all the complexities it adds to the compiler definitions.
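For readers unfamiliar with the restriction being discussed, here is a minimal Java sketch (all names are invented for illustration): a class may extend exactly one base class but implement any number of interfaces, and since interfaces carry no instance state, the diamond of conflicting inherited fields never arises.

```java
// Single class inheritance plus multiple interfaces: the Java/C# model
// discussed above. Interfaces declare behavior but hold no instance
// state, so there is no "diamond" of conflicting inherited fields.
interface Swimmer {
    String swim();
}

interface Flyer {
    String fly();
}

class Animal {
    String name() { return "animal"; }
}

// Legal: extends exactly one class, implements two interfaces.
// Something like "class Duck extends Animal, Machine" would not compile.
class Duck extends Animal implements Swimmer, Flyer {
    @Override public String name() { return "duck"; }
    @Override public String swim() { return name() + " paddles"; }
    @Override public String fly()  { return name() + " flaps"; }
}
```

A `Duck` is still usable polymorphically through `Animal`, `Swimmer`, or `Flyer`, which covers most of what multiple class inheritance would have been used for.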
|
Nish Nishant wrote: People moving from C++ usually find it an annoying restriction
That's me - in spades...
Nish Nishant wrote: but after a few years of coding in C# or Java, most people usually find it to not be much of a limiting factor at all
Nope. Still annoying, even after 11 years of coding in C#. I'm honestly not interested in or concerned with the burdens experienced by compiler authors. The only reason C# even exists is because Microslop was bitch-slapped for trying to take over the java domain.
It's okay, though. I'm old, and soon I'll be dead, allowing Microsoft to continue on unmolested, because soy-boys willingly accept their assault on programming languages, and all the real programmers that raged against their absurdities will have passed away.
Eventually, you'll be able to write complete programs with a series of words like "flopgloop" and "pardultary". No matter what order you place them in, the app will still compile, but the functionality will change.
EDIT ==============
Microslop has proven once again they don't have any original thoughts left, by purchasing GitHub.
".45 ACP - because shooting twice is just silly" - JSOP, 2010 ----- You can never have too much ammo - unless you're swimming, or on fire. - JSOP, 2010 ----- When you pry the gun from my cold dead hands, be careful - the barrel will be very hot. - JSOP, 2013
modified 4-Jun-18 9:52am.
|
charlieg wrote: Why would you not write a class to handle it?
It's called "dumbing down."
|
Some people think that they are programming simply by blindly copying/pasting working blocks of code and "searching" the internet for other snippets to copy. Ask for a minor change, and they'll replace the entire block with another that has been "found".
Bastard Programmer from Hell
If you can't read my code, try converting it here[^]
"If you just follow the bacon Eddy, wherever it leads you, then you won't have to think about politics." -- Some Bell.
|
charlieg wrote: why encapsulation is ignored
Because history repeats.
|
In my experience (~= 15 years of research about minimalistic design approaches) OOD is a flawed concept.
In theory, there's nothing wrong with it and it should lead to a better than average design, consistently.
In practice, abstraction is horrible, because people are really awful at defining, sharing and accepting the initial purpose of abstracted objects.
In part, this is because making the abstraction is rewarding in and of itself, while accepting an existing one is horrendously tedious.
The only form of OOD that seems to work consistently is when you severely limit it to communication interfaces, DBOs and DTOs, and drop polymorphism and inheritance altogether.
Anything else, and people will misinterpret the abstracts made and muck it up beyond comprehension.
Also, reusability at object level is a maintenance nightmare. Reusability at API level is where it's at.
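The stripped-down style described above might look like this in Java (the names are hypothetical, not from any real codebase): a dumb DTO plus a narrow communication interface, with no inheritance hierarchy to misinterpret, and the reusable surface being the API rather than the objects behind it.

```java
// A plain data carrier (DTO): fields only, no behavior to abstract over.
final class OrderDto {
    final String id;
    final int quantity;
    OrderDto(String id, int quantity) {
        this.id = id;
        this.quantity = quantity;
    }
}

// The reusable surface is the API contract, not the objects behind it.
interface OrderGateway {
    String submit(OrderDto order);
}

// One concrete, non-extensible implementation; callers see only the
// interface, so there is no hierarchy to "muck up beyond comprehension".
final class InMemoryOrderGateway implements OrderGateway {
    @Override public String submit(OrderDto order) {
        return "accepted:" + order.id + ":" + order.quantity;
    }
}
```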
|
charlieg wrote: control c / control v must be banned
No problem with that.
I use shift-delete / shift-insert.
|
I think you mean
control+insert / shift+insert
shift+delete is the same as control+X
As further evidence of MS's lack of originality (per a previous post): Control+C, V, and X were copied from Apple's Macintosh.
|
Actually no, I do mean shift-delete + shift-insert. (I find it easier to cut and immediately paste, then paste the copy.) No need to move the right hand then. It also means that VS marks the "copied from" line as changed, which is a useful reminder when scrolling back to copy other code from the same area.
|
gotcha. I have a coworker that uses that pattern.
[shift+delete shift+insert] [navigate] [shift+insert]
Since I am writing this response later, I am scratching my head and wondering how we started talking about copy+paste when the subject has "OO design" in it! lol [but I am not going to go back and figure it out].
|
Everything I develop is centered around OOD/OOP as I use objects for the organizational constructs of my projects.
However, with very rare exception I do not use inheritance due to the many problems that senior software engineering analysts have found with it over the years.
This is not to say that inheritance is a bad concept but that it has to be used sparingly and carefully. Unfortunately, many developers rush into using it thinking that it is the real meaning of OOD.
When I was learning OOD many years ago I actually studied the history of its invention in Norway (Simula-67). My conclusion was that it was created so that developers could better organize their development endeavors, not for the primary constructs that people promote.
In business applications, in general, there is little actual need for the use of inheritance and polymorphism, though there are applications that certainly could make good use of these attributes. However, here I am speaking generally.
As OOD emerged in popularity within the development community a small book was produced by a computer scientist back in the 1980s. This book went through all of the types of applications that could truly benefit from the use of OOD. None of the applications related to business development as all were from the scientific and internals communities.
In addition, the use of generic modules and the concepts of reuse were completely disproven many years ago simply from the statistical studies that were made against so many such projects.
Generic modules are nearly impossible to create as regular implementations in most applications, since such modules must be relatively static in nature (ie: a date validation module). The rise in popularity of generically designed modules was a direct result of incompetent technical managers who tried to foresee every possibility for any module developed. The result was terrible frustration on the part of developers, and projects that did not succeed very well, if at all.
This was because, in reality, very few modules can be designed to be generic, just as not every type of design can be shoe-horned into being an "Expert System" when that too was a craze in the late 1980s and early 1990s.
Object re-use failed as a direct result of organizations not setting up working repositories for re-usable modules that were designed to be used in such a manner (ie: a data access layer). The result here, which was eventually written about in our technical journals, was that development teams/developers simply recreated the same modules multiple times for their own projects.
Also, the concept of reuse, when taken to the extreme as it was at NBC when I worked there in the 1980s, very easily gives rise to an environment of boredom, as developers are rarely allowed to experiment with their own creativity.
If new developers are not using OOD in a significant way, then it is probably a result of the training they are receiving. However, not everything needs to hold to OOD through strict orthodoxy. Like everything else, OOD is a tool in our toolboxes and should be used with good understanding and care...
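One common way to follow that "sparingly and carefully" advice is to reach for composition before inheritance. A hypothetical Java sketch (all names invented for illustration):

```java
// Composition instead of inheritance: Report does not extend a Formatter
// base class; it holds one and delegates to it. Swapping behavior needs
// no subclass, which keeps the inheritance graph flat.
interface Formatter {
    String format(String text);
}

final class UpperCaseFormatter implements Formatter {
    @Override public String format(String text) { return text.toUpperCase(); }
}

final class Report {
    private final Formatter formatter;   // has-a, not is-a
    Report(Formatter formatter) { this.formatter = formatter; }
    String render(String body) { return formatter.format(body); }
}
```

Here a different `Formatter` can be dropped in at construction time, whereas a subclass-per-format design would grow a hierarchy that is hard to prune later.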
Steve Naidamast
Sr. Software Engineer
Black Falcon Software, Inc.
blackfalconsoftware@outlook.com
|
One of the things that happened was CRUD. When I learned OOAD back in the 80's, we designed our objects around what they did - methods first, attributes when necessary, and properties weren't just for exposing backing variables. "Data Objects" lived only in the back tiers. Things like MVC soon moved them to the clients. Then the Web, JavaScript, and CSS (where until recently everything was global and single threaded) returned us to procedural coding. Welcome back to the 70's. When all of your objects are just data objects and everything is global and procedural, what need is there for OO design? Then, (misinterpreting) Agile - hack in circles until it (sort of) works. On top of all of that, OO design is hard. Really hard to do well. Few initial design objects make it into the final code unscathed. Finally, design is documentation. We're Agile - "Working code over comprehensive documentation..."
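The contrast drawn above, methods-first objects versus CRUD-era data bags, might be sketched like this in Java (illustrative names only, not from any post above):

```java
// CRUD-era "data object": all state, no rules. Any caller can put the
// object into an invalid state, so the business logic leaks elsewhere.
final class AccountRecord {
    public double balance;
}

// Methods-first design: the behavior defines the object, attributes
// exist only to support it, and the invariant lives inside the class.
final class Account {
    private double balance;
    void deposit(double amount) {
        if (amount <= 0) throw new IllegalArgumentException("amount must be positive");
        balance += amount;
    }
    double balance() { return balance; }
}
```

With `Account`, "deposit a negative amount" is impossible by construction; with `AccountRecord`, every caller has to remember the rule.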
|
On my team of 4 programmers, I am the only one who even knows that the four pillars exist. Getting any of them to use a property is near impossible, but reflection is in almost every program. Mind you, there are utility classes for email, sFTP, etc, but the class that uses them is C code consisting of inline functions with everything forced through the same path. Special cases require switches everywhere they differ from the rigid flow. (carp mode off).
I have 5.5 years of OOP before this job and wish OOP was appreciated. The problem is that just like 'cheapest programmers', management is 'cheapest management' and no longer has the tech savvy to use the proper techniques. It is a self-feeding trend that I do not see changing. If a programmer can write code that stumps his coworkers, he gets a raise.
|
I would argue that there are really five pillars of OOP and OOAD. Without loose-coupling, using the other four becomes spaghetti.
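A minimal Java sketch of that fifth pillar (names invented for illustration): the consumer depends on an abstraction, so swapping the concrete class touches nothing else.

```java
// Loose coupling: Notifier depends on the Sender abstraction, not on
// any concrete transport, so the other four pillars don't tangle into
// spaghetti when the transport changes.
interface Sender {
    String send(String message);
}

final class LogSender implements Sender {
    @Override public String send(String message) { return "sent:" + message; }
}

final class Notifier {
    private final Sender sender;               // injected abstraction
    Notifier(Sender sender) { this.sender = sender; }
    String notifyUser(String message) { return sender.send(message); }
}
```

Replacing `LogSender` with, say, an SMTP-backed implementation would require no change to `Notifier` or its callers.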
|
charlieg wrote: it baffles me to the point of wanting to cut myself why encapsulation is ignored.
Depending on what you cut with, and where, and how you cut, it may violate encapsulation.
«... thank the gods that they have made you superior to those events which they have not placed within your own control, and rendered you accountable for that only which is within your own control. For what, then, have they made you responsible? For that which is alone in your own power—a right use of things as they appear.» Discourses of Epictetus, Book I:12
|