I completely agree with you.
During my studies I tried to learn everything, ending up demotivated whenever I realized I didn't have enough knowledge about a subject. At some point I hit an inflection point and changed my "motto"; after that I took university as "the way to learn efficiently how to learn different stuff on the fly".
I am always in touch with new technologies because of the different customers and projects. I just keep myself up to date with "new stuff XXX does YYY", "there is a new possibility to do ZZZ" and the like. That is enough to propose new ideas in the brainstormings for new projects. If something gets selected, then it is time to learn it. If I like it, I will automatically learn it more deeply without big effort. If not, I just learn enough to get the job done with quality, write good documentation about the what, where, why and so on, so I can understand it in the future (just in case of one-use solutions), and move on to the next one.
M.D.V.
If something has a solution... why do we have to worry about it? If it has no solution... for what reason do we have to worry about it?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
Maybe it's because I'm in my 50s and have been developing systems since the '70s, but every time I see a new technology I just 'know' it will only last a short while before it's no longer flavour of the month and gets cast aside into the 'legacy' bin, where some poor old developer will have to unenthusiastically maintain it until they eventually can take it no more, lol. I remember a time when it was possible to 'know' most things in computing (before the term IT was invented), or at least have a clue, but now there's so much that it's impossible. BASIC and C (in one form or another) seem to be the only languages that have stood the test of time from the halcyon days, and even COBOL is still hanging on in there in some quarters, but why we need so much diversity in some of the newer stuff from the last 10 years, gawd only knows.
GL
Bob
I'm pretty sure I've used all of those excuses and more.
Nowadays there are more software solutions than there are problems. New frameworks, languages, etc. are constantly being introduced to solve problems that are dubious at best.
Everyone seems to want to make a better wheel, but C/C++/Java/.NET still dominate for good reasons. The wheel is fine; a lot of the newest trends are the software equivalent of putting heavy spinner rims on a sports car. Flashy, but pointless and inefficient.
I'm happy to learn new things when they can be of benefit, but newness for newness' sake is becoming an annoying trend.
The time I was most annoyed at someone for pushing a new language, it was Perl. The new developer in a group was pushing a C++/C# guy to learn Perl, while the executive director agreed with me to just let him create whatever-it-was in C#. The reasons I objected:
- Perl is a dying language; PowerShell is a better option
- Perl is complicated, with limited future use
- Perl would not advance the developer's career
- The developer is most effective in his primary languages
- Perl would make future support complicated and harder to fill
As for myself, I gladly learn technologies that I think will be useful, might advance my career, or that I find intellectually engaging, although it is usually not a new language but more likely a new architecture, which often is not all that new.
My background is in MS Office, desktop, and database development, with light but varied work in web UIs. I was recently expected to pick up more ASP.NET MVC, which is not a radical departure from my own MVP/MVVM desktop designs, as well as things like Entity Framework. The work is fun enough, they are marketable skills, and they are natural extensions of what I currently work with, although I would gladly pick up some newer technologies given the opportunity:
- Functional languages
- NoSQL Db's
- Hadoop
- SAS / MATLAB / R
Learn a new tech which isn't an extension of your existing skill base and you will be headed for junior/trainee land, if you're lucky. If you're not, you'll be stuck on the grid.
'Hoped it would just go away'
In the old days languages were created by committees and there were fewer platforms. So if you learned BASIC, C, COBOL, Fortran, RPG, or even BAL, there wasn't that much variance among platforms. (Yeah, I know that's a stretch but bear with me.) These days entire platforms run with completely different languages like Java or Objective-C. And any Joe can create a new framework claiming to be the best at whatever it does with JavaScript or PHP. But these are rarely compatible. Some of them don't last. The author gets a real job or a girlfriend or just gets bored, the project dies, and someone forks off another new-and-improved version. Meanwhile any of us who have invested in the grass-roots tech now have to look elsewhere or roll our own.
It's tough to latch onto anything anymore and stick with it for any length of time. Paradigms change too quickly and there are too many of them. A client might call and request specific tools be used for their project - it used to be they just wanted to solve a business problem but now many of them have their own notion of what tools are best. The problem here is that we can't be experts at everything so we lose a lot of gigs unless we can assert the technology to be used.
We've seen all of the issues with HTML, browser wars, and now the need for cross-platform mobile development. The entire industry goes through these cycles of diversity followed by a general kick-back from the developer community saying "stop it already": we want one tool for all platforms, even if it's not perfect.
One of these days I hope the tail stops wagging the dog, that individual notions of what's cool stop driving the industry. We need to respect individual innovation but subject it to more rigorous standards to avoid this over-fragmentation of the industry.
We need to focus more on solving business issues than on tools that claim to help us do things faster, because in the long run we've wasted decades on tools that all claim to help us to save time.
EDIT: Dovetailing with my personal thoughts above, I found this article on the frustrations of "polyglot programming".
modified 22-Sep-14 14:26pm.
I judge a technology by the opportunities it brings. A lot of companies make a lot of buzz about their "new and groundbreaking" technologies. A bad example from recent years is Microsoft: Vista, Zune, and Windows Phone 7 were all launched with a lot of hype.
Or smartwatches: nothing but "bananaware" hype.
Press F1 for help or google it.
Greetings from Germany
.. a programmer isn't very motivated to learn it. A few days ago a company tried to hire me to write the documentation for their API, but didn't succeed, because their API was something different, something I had never seen. I didn't have time to devote attention to a whole new environment.
Sometimes these things are really hard to learn and understand: how can I do this, and so on. APIs are sometimes not well written and fail to explain to the developer how to accomplish a task.
These are the major reasons a new technology or API gets ignored and left behind, even when the new technology might be the better one.
Favourite line: Throw me to them wolves and close the gate up. I am afraid of what will happen to them wolves - Eminem
~! Firewall !~
It's been a while since I've seen anything really new and interesting. All the "latest and greatest" tech is just a rehashed version of something that has existed for decades. I can't help but laugh when I read about Node.js and its "revolutionary" event loop; BSD sockets introduced the select() function with the same design ages ago, and then the world moved away from it because it doesn't scale well.
I laugh too, but that's life, man. Only the hardcore types see this happening. The lightweight coders with little experience don't, because they don't know what they don't know. I remember the MVC hype that RoR helped push, but those patterns date back to the late '70s, and yet people thought they were new and shiny.
Jeremy Falcon
I think the same.
Computer science (or handicraft) has reinvented the wheel a thousand times.
It's like a cycle that shifts from the server to the client and back again. I remember the dumb terminals, the initial web applications with PHP and classic ASP, and now HTML5..
The funny thing is that enterprises want the same as 60 years ago: controlling processes. Research and science have always wanted processing power and calculations. Normal users have always wanted fun: reading content, seeing and hearing media, and having a good time... same inputs, same outputs, and a thousand paths between them.
And checkmark *ALL* the answers!!
Mmwwahahaha!
(Either that, or get back to work...)
My answer would be :
the advantages of the new technology are not enough to justify the effort of learning
That was the one I was looking for as well. Along with: Nothing on which to use it.
How about "No pressing need".
I'm close to retirement and, pretty much, always busy. I don't need to advance my career and I don't have time to kill for curiosity. I'll only learn something new now if there's a clear and obvious need.
The last "new" technology I learned was GP Integration Manager; we had just implemented GP and I needed to get my applications feeding it data!
Life is like a s**t sandwich; the more bread you have, the less s**t you eat.
Learning a new programming language is well-defined.
What's learning a new technology? iPad, cellphone, tablet? IPv6?
If we use it, does that mean we've learned it?
I will usually not go out of my way to learn a new technology for the sake of learning a new technology. I am more apt to learn a new technology if it is going to be relevant in my current work domain or is going to be an industry-standard base for doing something, e.g. WPF.
My free time is usually spent with family and friends and not on the computer. So far, I have managed to stay highly marketable.
I found no option given for my answer, but I like to learn new technologies and see what is new in the market. E.g. a task that might have taken us a lot of time to achieve may be very simple in a new technology, isn't it?
I was also looking for that option.
That's just it as the title says.
Why would one downgrade to a new technology?
"I have not failed. I've just found 10,000 ways that won't work." Thomas A. Edison
"Politicians and diapers should be changed often and for the same reason." Eça de Queiroz (1845 - 1900)
I agree. In several cases some new library or language comes along and I look at it and realise this is pointless - what I currently do is easier|simpler|better|less overhead|doesn't need an extra library or component|will still be supported next year|all of the above.
- I would love to change the world, but they won’t give me the source code.
Agree with you 100%. For my work (doing CAD/CAE solutions) I don't even need .NET.
Just super-fast native C++ satisfies the criteria.
Funny thing, Microsoft now doesn't even have a native C++ certification track