|
It only makes sense to me when there can be more than one "is a". Otherwise it's just another "ritual", and all you keep doing is going back and forth between (one) "implementation" and its interface, until it's obvious one needs (or can benefit from) an interface.
Then you also have to deal with the school that says "no inheritance", which in essence means no "base methods", virtual or otherwise. Another pointless ritual that only becomes "real" because someone "ordered" it, or can't decide when it is appropriate. See: "abstract" TextBoxBase.
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
|
|
|
|
|
I do like the rule of no concrete super/base classes. One concrete type extending another concrete type always causes grief down the road when someone adds a third concrete type into the mix.
|
|
|
|
|
englebart wrote: One concrete type extending another concrete type
The problem there, however, is overuse of inheritance.
The solution is to use composition instead.
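A minimal sketch of the difference, with invented names (not from anyone's actual codebase):

```csharp
// Instead of ReportExporter : CsvWriter (one concrete type extending
// another), the exporter owns a writer behind a small interface.
public interface IWriter
{
    void Write(string path, string content);
}

public class CsvWriter : IWriter
{
    public void Write(string path, string content) =>
        System.IO.File.WriteAllText(path, content);
}

public class ReportExporter
{
    private readonly IWriter _writer;   // HAS-A, not IS-A

    public ReportExporter(IWriter writer) => _writer = writer;

    // A third format later means a new IWriter implementation,
    // not a new branch in an inheritance tree.
    public void Export(string path) => _writer.Write(path, "report body");
}
```

Adding a PdfWriter never touches ReportExporter; that's where the grief from the concrete-extends-concrete chain disappears.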
|
|
|
|
|
Whoever gave that directive is a man after my own heart.
It's extreme, to be sure. Realistic to literally follow 100% of the time? Probably not. But as an aspiration, a philosophy - absolutely.
If you do this, you will be able to grow and scale your products effortlessly for decades - basically for as long as the programming language you use is supported - without a rewrite.
Nuget packages, even entire application frameworks will come and go, yet your core code will be snug as a bug in a rug, wrapped in layers of abstraction that shield it from the chaos.
When your favorite library is deprecated, revealed to have a critical vulnerability, or the vendor jacks up the price on you, you scoff at how simple it is to assign someone to find a replacement and write the wrapper layer - completely independently of everyone else. Your customer tells you the application you designed for Azure now needs to run on AWS? "No problem", you say, "give me a week." Microsoft decides to make 100 new breaking changes to ASP.NET Core? Bah! The upgrade takes an hour.
You will never be stuck relying on proprietary technology outside of your control ever again. The term "technical debt" won't even be part of your vocabulary.
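A rough sketch of the shape being described - names invented, and the vendor client is a stand-in, not any real SDK:

```csharp
using System.IO;
using System.Threading.Tasks;

// The core code only ever sees this interface. Azure, AWS, or a
// test fake all live behind it.
public interface IBlobStore
{
    Task UploadAsync(string key, Stream data);
    Task<Stream> DownloadAsync(string key);
}

// Placeholder for whichever vendor SDK you actually wrap.
public class VendorClient
{
    public Task Put(string key, Stream data) => Task.CompletedTask;
    public Task<Stream> Get(string key) => Task.FromResult<Stream>(new MemoryStream());
}

// One thin adapter per vendor. Swapping vendors means rewriting
// this class only; the rest of the codebase never notices.
public class VendorBlobStore : IBlobStore
{
    private readonly VendorClient _client;

    public VendorBlobStore(VendorClient client) => _client = client;

    public Task UploadAsync(string key, Stream data) => _client.Put(key, data);
    public Task<Stream> DownloadAsync(string key) => _client.Get(key);
}
```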
So yes. Those who know, do this.
|
|
|
|
|
Yep, I think you nailed the truth of it. It is definitely a core practice to follow.
Very few people think of software dev in these terms.
I'm guessing you've built or worked on extremely large projects?
|
|
|
|
|
Peter Moore - Chicago wrote: If you do this, you will be able to grow and scale your products effortlessly for decades - basically for as long as the programming language you use is supported
And have you actually done that?
I have worked on multiple legacy products and never seen anything like that.
At a minimum, I can't see it happening in any moderate-to-large business unless the following were true:
- Dedicated high-level architect (at least director level) whose job is technical, not marketing. That person enforces the design.
- Same architect for a very long time, with perhaps a couple of other architects trained solely by that individual.
- Very strict controls on bringing in new idioms/frameworks.
- Very likely extensive business requirements to support multiple different configurations, from the beginning. That would ensure the initial design actually supports them.
What I have seen is, even in a company started with a known requirement to support multiple different implementations in rapid order (about a year), new hires decided to implement their own generalized interface on top of the original design without accounting for all the known (not hypothetical) variants, making the addition of the newer variants a kludge of code grafted onto what the new hires did.
|
|
|
|
|
jschell wrote: At a minimum, I can't see it happening in any moderate-to-large business unless the following were true:
- Dedicated high-level architect (at least director level) whose job is technical, not marketing. That person enforces the design.
You make a good point. It takes an experienced technical team to lay down guidelines like these.
Over the past 20 years I've worked mostly at early stage companies with very experienced small teams, each of which was tasked with implementing portions of a larger complex product. Because requirements are almost always less known early in a product's evolution, using the technique of enforcing interface definitions allows the code to naturally evolve as the requirements change and become more solidified. Coupled with a strict regimen of writing automated unit and integration tests, defensive programming designs like these increase the chances of developing a complex app with fewer bugs.
/ravi
|
|
|
|
|
|
raddevus wrote: Would you balk? or think, "Yes, that is the way it is and should be."
Depends.
If one complex layer (A) is dependent on another complex layer (B), then unit testing A becomes quite a bit more difficult if B does not provide an interface.
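For instance (hypothetical names, just to make the seam concrete): if B sits behind an interface, the test for A can substitute a trivial fake instead of standing up all of B.

```csharp
public interface IPricingService           // layer B's seam
{
    decimal GetPrice(string sku);
}

public class OrderCalculator               // layer A, the code under test
{
    private readonly IPricingService _pricing;

    public OrderCalculator(IPricingService pricing) => _pricing = pricing;

    public decimal Total(string sku, int quantity) =>
        _pricing.GetPrice(sku) * quantity;
}

// In the test project: no database, no network, no real layer B.
public class FakePricingService : IPricingService
{
    public decimal GetPrice(string sku) => 10m;
}

// new OrderCalculator(new FakePricingService()).Total("X", 3) == 30m
```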
raddevus wrote: Do you know how crazy it is to look at project that has been designed this way?
Designing general solutions based on one implementation will fail. The general solution will encapsulate all of the assumptions about the single implementation, so it achieves nothing in terms of generalization.
Even when multiple implementations are known, it requires rigorous oversight to ensure that someone doesn't attempt to generalize a subsection of the implementations. They end up doing the same thing: implementing based on just the subsection.
|
|
|
|
|
Another corollary, triggered by a comment:
Quote: Create a facade around any API that you are using to protect your code from changes in the API.
|
|
|
|
|
Quote: Create a facade around any API that you are using to protect your code from changes in the API.
A great idea that no one ever does.
OK, not no one, but it is done more rarely than it should be.
Also, there are practical limitations to it.
We use a 3rd-party component that has hundreds of methods.
We should wrap the component, but it's going to take a while.
|
|
|
|
|
A 1:1 mapping is not what I was envisioning. (Beginners might miss the joke there: if the wrapper is 1:1, what's the point?)
Facade/Wrapper/Adapter/Proxy/etc
|
|
|
|
|
englebart wrote: Facade/Wrapper/Adapter/Proxy/etc
Yes, but then all of those are classes also, and thus those, too, require an interface...
I am joking, but I remember quite a few years ago someone posting somewhere that all the classes in the application were required to have an interface and a factory.
|
|
|
|
|
A facade should expose the simplest surface your code's clients actually need. If the library I am using has 5 classes and hundreds of methods/properties, my facade could be as simple as one class with 5 properties and 1 method. Since my facade has a hard dependency on the library, I would use an interface so that my consumers do NOT have a compile-time dependency on the library.
Like I said earlier (possibly paraphrased): if you find yourself wrapping everything 1:1, then just accept the hard dependency and ship! ship! The facade might be added later (YAGNI), and then only if the next major update to the library has breaking changes. Without the facade I would have 70 projects to change. With the facade I have one project to change.
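As a sketch of that shape (every name here is invented, not any real vendor API):

```csharp
// The library's five classes and hundreds of members collapse to the
// one question my clients actually ask.
public interface IAddressValidator
{
    bool IsDeliverable(string address);
}

// The only project that references the third-party package.
public class AddressValidatorFacade : IAddressValidator
{
    public bool IsDeliverable(string address)
    {
        // Internally this might touch the vendor's parser, geocoder,
        // rules engine, and so on; consumers never see any of that.
        // (Trivial placeholder logic for the sketch.)
        return !string.IsNullOrWhiteSpace(address);
    }
}
```

The 70 other projects reference only IAddressValidator; when the library breaks compatibility, the facade project is the single thing that changes.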
I do agree with the release often philosophy.
A lot of this depends on team size, pace, code stability, etc.
And maybe IDE? Creating an interface from a class or a class from an interface should not take more than a second. Right click…
Someone earlier mentioned they were stuck on VS2013. Yuck!
|
|
|
|
|
Why? How difficult is it to adopt the old semantics of a given dependency and shim in a new replacement library when it becomes necessary to jump ship? I have done this a few times, though not often. Seems like deferring the pain ('YAGNI') until it becomes necessary is optimal overall.
|
|
|
|
|
hpcoder2 wrote: Seems like deferring the pain ('YAGNI') until it becomes necessary is optimal overall
Yeah, exactly.
You can either have:
1) Pain now (All Interfaces)
2) Pain later (that may never occur)
I figure take the pain later -- because a lot of software rots for other reasons and is completely re-written anyway. So you may never reach the "pain later" stage.
As a matter of fact, I've rarely seen it in 35 years of software development.
And when another manager comes in, they think something totally different and wipe away the "old" code anyway, even if it is extensible thanks to all those interfaces.
|
|
|
|
|
I have been programming for more than 40 years and developing software for more than 30.
I first heard about DI containers 15 years ago, and for the past 10 years I have used IODA as a principle to avoid them.
Only integration classes are allowed to call other operations.
If you put the whole logic into operation classes, with no logic in data classes or integration classes (integrations contain only something like "rail switches"),
and separate input and output from the logical operations,
then you no longer need to discuss things like DI at all.
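A minimal sketch of that split, with invented names:

```csharp
// Data class: just values, no logic.
public record Order(string Sku, int Quantity);

// Operation classes: all the logic lives here, and operations
// never call one another.
public static class OrderInput
{
    public static Order Read(string sku, int quantity) => new(sku, quantity);
}

public static class Pricing
{
    public static decimal Total(Order order, decimal unitPrice) =>
        unitPrice * order.Quantity;
}

// Integration class: no logic of its own, only wires operations
// together (the "rail switches").
public static class CheckoutIntegration
{
    public static decimal Run(string sku, int quantity, decimal unitPrice)
    {
        var order = OrderInput.Read(sku, quantity);   // input operation
        return Pricing.Total(order, unitPrice);       // logic operation
    }
}
```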
I use inheritance very rarely; composition of classes is my favorite.
But I also use actions and closures.
You don't have to hide every operation class behind an interface, but it can be helpful.
Consider: when you use interfaces, you can no longer jump to the executed code just by pressing F12 in Visual Studio.
There is no universal answer to this question; it just depends on ...
What is IODA? See here:
https://www.infoq.com/news/2015/05/ioda-architecture/
|
|
|
|
|
Fantastic and interesting reply.
Thanks so much for sharing your experience here.
Interesting that you specifically avoid DI.
Ralf Peine 2023 wrote: When you use interfaces, you can no longer jump to the executed code just by pressing F12 in Visual Studio.
That is definitely one of the problems that I encounter with DI.
It is a pain that the implementation is always somewhere else that you have to track down.
Must say, though, that I downloaded Visual Studio 2023 to help with this, and now it can (as you would hope) navigate to the exact implementation -- even if the code is in another DLL, it will use reflection to show you the code. Very good.
|
|
|
|
|
Hmmm... once you decide that *all* of something must do *something* before you even know what problem you're about to solve, you've decided that reality takes second place to what you've decided reality should be. "Dijkstra would not like this" <- his immortality, for those who get the reference.
|
|
|
|
|
From CP newsletter
Create AI pro register to uphold ethics, says BCS • The Register
In the UK there is a proposal to license/regulate corporations' use of AI.
The article then uses, as an example of the problem, the UK government prosecuting post office managers for something that never happened.
Except of course that had nothing to do with AI. And the proposed rules are for corporations. No idea how that is supposed to prevent the government from making false allegations.
Reminds me of the silliness of a case in the US where people were prosecuted due to 'recovered memory' testimony from very young children against a number of people, accusing them of satanic rituals. The testimony that was allowed defied logic.
McMartin preschool trial - Wikipedia
That was far from the only case where recovered memories were allowed.
Unfortunately, quite a few people in the US still believe in that nonsense, along with so-called professionals who keep promoting it.
|
|
|
|
|
So at some point, along with the "harassment training" (which, ironically, is supposed to teach you how to identify and prevent harassment and not be a harasser yourself, rather than training you on how to excel at harassment as the course name would seem to suggest, but I digress), corporations will have an "ethical AI training" class we'll all have to take. Except for management.
|
|
|
|
|
Hmm. My employer has us do 'business conduct' and other training every year. No one is exempt, including the president. It's not too arduous and can usually be completed in an hour or so. Completing the training is a condition for continued employment. I've never heard of anyone being fired for not completing it, but I wouldn't want to work with anyone who was so argumentative and confrontational that they wouldn't comply with a reasonable request.
I could see the company creating a policy for using AI-generated material on the job. Intellectual property rights are a significant concern for us in places, and given the cloudy (pardon the pun) nature of much AI training data, it's a reasonable response.
As far as governmental regulation goes, I'm sure that's going to be a goat from the word go.
Software Zen: delete this;
|
|
|
|
|
It all boils down to where the spotlight is shining - if it gets too close to home, move the light onto something else to divert attention, so we can go on doing the "right thing" irrespective of public view. AI has the spotlight now, like any other tech that was out there before it.
|
|
|
|
|
In the absence of hardware, while waiting for it to ship, I've taken to producing a starting codebase for my new project.
It's a Cortex-A based CPU, and those are not typically "real time" - for those of you familiar with embedded, these work like your smartphone and are complex enough that they generally require an operating system like Linux or Android in order to function.
The problem with that is boot times. There have been successful efforts to get Linux boot times down to less than half a second, but it requires so much U-Boot hacking and Linux kernel hacking that it's not much easier than going bare metal.
I'm going bare metal. I'm terrified that I've accepted a contract for something I've never done before. It has been years since I coded anything in assembly. I read it, but I don't typically write it these days. That's about to change.
I'm dealing with stripped-down C and C++, multiple cores but no scheduler, and therefore no synchronization primitives other than, say, suspending the other core entirely - if that's even possible? And that's yikes.
I just managed to scrape together enough of the CubeMX HAL under this environment to give me SPI support (I think), but I won't know until I get this board and connect my logic analyzer, and I'm really anxious about it. Lots of moving parts, and I haven't tested a single thing yet.
It's out for delivery now. I was excited yesterday. Right now I'm kinda freaked out.
Check out my IoT graphics library here:
https://honeythecodewitch.com/gfx
And my IoT UI/User Experience library here:
https://honeythecodewitch.com/uix
|
|
|
|
|
Ah, the unknown. Exciting, terrifying, challenging, mysterious, bound to be frustrating and also with great successes and stories. It's times like these, if you had a crystal ball to show you the future (in detail!), would you look into it?
|
|
|
|