|
I'll Pay the Postage!
Been thinking of a new printer. I have a Canon MP560 and it has been great for 10 or more years.
The issue is that the ink is now $65.00, which is half of what I paid for the printer.
What Canon do you have or recommend? I seldom print more than once a month.
Thanks
|
|
|
|
|
My Canon was a steal; it was a floor model of an imageCLASS laser at Office Depot. I had gone in to buy $80 worth of HP cartridges and left with that printer for less than twice the price of the cartridges.
At a glance, the Canon PIXMAs look pretty good. Best Buy has a decent sale on a few for $100 or less, and the cartridge set for some is only $30. I haven't used a Canon inkjet in years, so I can't compare it to today's models.
|
|
|
|
|
I understand your frustration, but it is just a business proposition that HP is using to try to make enough money to run their business. A few years ago they had a CEO who wanted to get out of the PC business, or the laptop business, I forget exactly. Why? Because they were losing money at it.
The printer business model is to give you a five hundred dollar printer for a hundred dollars and make the difference back on the ink. HP is not the only company that does that. With the subscription, I imagine their reasoning is that low-volume users such as yourself won't see it as a good value proposition and will go to another printer. That does not bother them, because they were not making enough money from your ink-buying pattern. Users who print more may find the subscription service quite acceptable. But it is possible that it will piss off all their users, and they will have to adapt to stay in business.
I remember how pissed off I got when Netflix changed from shipped discs to streaming. I didn't get it, dumb me. But now I can't imagine stuffing a disc into one of my old players.
|
|
|
|
|
Kevin,
I think my frustration was with two things. The first was the horrible wording of the email, someone's poor idea of a marketing ploy.
I waited a month before I bit on the subscription free trial. As soon as I signed up, they shipped me two high-yield cartridges, with a note not to install them until the printer ran out of ink. That was about two months later, so I've had an XL set in the printer for just over two months. The existing cartridges are well more than half full (God knows you cannot get that info from the printer). So, they'll shut them off if I don't pay; I could go buy a $32 set of original low-volume cartridges and toss out two cartridges that have more ink left in them than the new ones. Not sure I see the sense in that.
It was about $150 when I got it, and it has dropped some since. There is no way this is close to a $500 printer. My Canon color laser listed for four hundred and change; I paid less than half that since it was a floor model that had printed 16 or 18 pages, lol. And it doesn't play Big Brother watching, either.
HP needs to be more honest if that is their scheme. Of course, if they put the "we'll be watching" message in their marketing, they probably wouldn't have much in the way of sales unless they really gave the printers away. Since they split off HPE, I'm not sure what they'd have left.
|
|
|
|
|
I learned long ago to never, ever buy an HP printer. It's Brother for me all the way. They work under Windows and under Linux. Also, there are third-party toner cartridges that seem to work perfectly fine in them. I got the wifey a black-and-white all-in-one and got myself a color model.
|
|
|
|
|
|
Well, according to the PR comments, a few other users/devs are super glad for your fix/PR.
Most excellent, I would say.
Brag away!
|
|
|
|
|
I totally understand your feeling of pride.
I'm still very pleased with myself for finding a similar initialization problem in the ages-old telnet client and submitting a fix to GNU.
You may be smiling about this for years to come.
|
|
|
|
|
We need this to work; the version we are currently having to use has vulnerabilities that newer versions address, but the newer versions had scoping issues when under load.
I really don't want to have to move to another tool.
I’ve given up trying to be calm. However, I am open to feeling slightly less agitated.
|
|
|
|
|
Proposed for discussion:
The net, “middle of the bell curve”, result of programming by AI will be the further influx of “programmers” who write even more awful code, but work cheap.
First, it was offshoring and hiring cheap H-1B labor for programmers, taking our discipline from the level of professionals down to assembly-line technicians. Non-tech bean counters, MBAs (full disclosure: I earned my MBA), and CTOs looking for better bonuses bought into those ways of reducing the development-phase cost of the Software Development Life Cycle (SDLC). Now our industry is "et up" with the results: low-quality code that drives up the biggest part of the SDLC costs, support and extension. Not all cultures encourage applying excellence and deductive reasoning in one's work; some encourage, in varying degrees, making more money at the cost of excellence and just following "best practices" and other recipe books. The concepts of value engineering and defensive programming are rather alien to the cheap programmers.
If you, as a developer (full disclosure: I have 40+ years of experience as a hands-on software developer/engineer/architect, and am still going strong), have ever had to clean up (or throw away and start over) outsourced/H-1B code, you know what I mean. (Full disclosure: I have worked with H-1B and offshore programmers for almost 30 years, and there are some excellent ones, a minority of them to be sure, who do not fit the description.)
Now, even less knowledge of the discipline is needed when AI-driven programming just spits code out, with even less "thinking with an engineer's mind" and attention to the full SDLC. Low-cost programmers can now be replaced by even lower-cost "widget assemblers". If you think too many software projects go south now (to wit: over budget, missed deadlines, buggy, high support costs, etc.), wait until the AI-assisted widget assemblers invade, making those CTO bonuses and short-term labor-overhead reductions even bigger. You know: cut costs and nab the bonus, then leave for another company before the support-cost chickens come home to roost.
I am not against AI/ML. I love using the AI/ML services in Azure, as well as Microsoft's ML.NET library. Training a model to be useful and accurate takes a LOT of data, but once it is trained, and with a self-learning routine based on how it processes real-world data, it has very useful applications.
AI as it is being used in Visual Studio is sometimes useful in code completion, and sometimes just annoying. MS needs to improve its adaptive behavior.
Remember, the driving forces in corporate software development (which are usually not defined by knowledgeable, experienced software engineers) are:
1 - Having someone/something to blame when there are failures.
2 - Keeping those bonuses coming in increasing amounts.
3 - Short term thinking.
4 - Just get a minimum viable product (MVP) out the door and don’t worry about future SDLC costs for support and extension.
5 - Use #1 above when projects fail or customer revenue streams are lost. It is much easier than getting it right the first time.
So, you may agree or disagree, in whole or in part, but I hope you have a lively and respectful discussion.
I do know, from reason and experience (the benefit of “been there, done that”), it does not have to be this way and such situations are correctable. I’ll bet a lot of you know that, also.
|
|
|
|
|
Not sure this is a cynical view at all, more like a pragmatic view.
For grins, I asked Bing for an authentication script. Now, I understand that Sydney, or whatever it calls itself, isn't a programming AI, but hey, it was fun. It returned a working solution. Would an inexperienced programmer have just copied and pasted it in, changing out the vars? I don't know. It certainly wouldn't accomplish what we'd want it to if they did.
How carefully does one need to architect or pseudo out what you want written and, at that point, is it any more efficient than working with a human?
|
|
|
|
|
Everyone these days (apparently) goes right into programming; no requirements gathering. AI will be great at programming the wrong solutions. Maybe it has a place doing user interviews; i.e. requirements gathering. Then I'd like it to design and program the thing. With pictures.
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
|
|
|
|
|
Concur. I've been suffering that list above for the last 15 years.
Charlie Gilley
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
Has never been more appropriate.
|
|
|
|
|
MSBassSinger wrote: The net, “middle of the bell curve”, result of programming by AI will be the further influx of “programmers” who write even more awful code, but work cheap.
Because of course in the good ol' days (fill in a date here) everyone knows that (fill in something here) was much better.
Of course, defining "better", and how it is known that it was in fact better, is very murky.
MSBassSinger wrote: If you, as a developer (full disclosure-I have 40+ years experience as a hands-on software developer/engineer/architect,
My decades of experience only allow me to state with certainty that, long ago, I absolutely did not have enough knowledge or experience to judge whether anything was good or bad.
With that experience and knowledge, I can now state that, certainly not surprisingly to me, programming, like everything else, drives toward the average. If it did not, it would probably require some supernatural explanation.
|
|
|
|
|
So, given the wide range of human abilities, there's a fair chance that the outcome from one of us will exceed the average. Over my 25+ years, I'm probably average or below, but I can also say that we've produced some way-above-average results as well.
How does an AI exceed the average if it is built on the average? And how does it judge the feedback that it gets? All we have to do is visit Stack Exchange to see the complexity of that issue.
|
|
|
|
|
In other news, Bing is constrained to five answers to avoid going wild. How many of those lines are still useful once the include statements at the top of the file eat into them?
|
|
|
|
|
Not cynical. Entirely realistic.
|
|
|
|
|
Until incentives change to ones that prevent short-sighted decisions and rewards are given based on long-term success of the product, this will not change. And for that to happen, well, some serious shocks to the system will be required.
|
|
|
|
|
I'm not sure AI is the issue here.
The university where I work initially had a mainframe that handled student billing for many thousands of students, and that system worked very well for 25+ years. It made sense to really invest in getting the code right, because the time you put in was rewarded with two decades of service.
The system was replaced with some difficulty in 2018 and yet five years later they're shopping for a replacement.
Coding as an engineering exercise makes sense when you're building something with hardware and software that will be around for two decades, but it doesn't make sense to put that thought and effort into something that gets used for six months before being replaced by the latest framework, flavor-of-the-month language, or cloud next-big-thing.
Look at the .NET framework: how does a three-year support window for a version compare to two decades? If you start writing for .NET 6 right now and do a real quality job, on a complex system you may not even be finished before the framework is out of support. Why take all the extra time to produce good code when it will be an obsolete security hazard before you can even get it out the door?
I'll worry about being an engineer when I get an environment that isn't completely upended every 24 months.
|
|
|
|
|
I’ve been developing systems in .NET for 22 years. It has never been upended. It has grown, expanded, and improved. Code I wrote 20 years ago still runs.
I remember the days of writing in FORTRAN and COBOL. Those languages grew, expanded, and improved over time, also.
There is a difference between replacing a program and upgrading it by extending its features.
|
|
|
|
|
If you've been maintaining it for 22 years, you've either replaced the underlying framework a couple of times or you're running insecure code.
Regardless, I think technology churn is a huge driver for code quality problems.
|
|
|
|
|
I am not sure you know how this works.
The point is that there is no wholesale replacing of anything for the last 22 years of .NET. Expanded API, expanded OSS, added features, etc. But it is still .NET. Nothing like the false premise you offer.
All good, long lasting programs of any consequence are regularly updated and extended. It was true in the mainframe/minicomputer days, and is still true today.
To say that technology change, in and of itself, is the cause of churn shows a lack of understanding or experience in software engineering.
As with any discipline, there are those who change something for change’s sake (always chasing the new and shiny), and there are those who change/amend/refactor/revise based on 1) need and 2) application of value engineering.
If you think it is “upending” to go from .NET 5 to 6 to 7, then you don’t understand .NET.
|
|
|
|
|
MSBassSinger wrote: If you think it is “upending” to go from .NET 5 to 6 to 7, then you don’t understand .NET.
I'm talking about decades of stability and somehow you believe migrating an application from .NET 5 (2020) to .NET 6 (2021) is a reasonable comparison?
Okay.
|
|
|
|
|
No, I think you don’t know what you think you know.
No production program remains unchanged for 20 years. The required business logic changes. In industrial automation, the hardware changes as it is replaced, forcing API or buffer-location changes. The backend databases or third-party APIs change.
Production software always changes to meet production requirement changes. The core purpose of the program can be stable for decades, but there are always changes. I’ve seen that in banking and HR with programs that had the same core responsibilities for 20+ years, written in COBOL.
The same is true where production programs were written to use .NET.
In both cases, stable production programs were updated to meet changing business requirements, not because .NET grew and improved like any language does.
Yes, churn does happen because unqualified and ignorant management chooses to ignore ROI and just chase after something new and shiny, or falls for the latest “best practices” silliness. That is not the fault of .NET. That is the fault of an organization hiring the incompetent and putting them in charge of something.
|
|
|
|
|
For some reason you keep moving the conversation to weird extremes.
For example:
MSBassSinger wrote: No production program remains unchanged for 20 years.
I never made that claim.
I've been more than clear. When you want to have a discussion with me, and not with weird caricatures of what I'm posting, I'll re-engage.
Until then, good day.
|
|
|
|
|