jeffery c wrote: Those 70-80's languages are only used in nasa mainframes
I'm with you that some languages have become less popular, even to the point where they could be called esoteric, though I'd draw the line at calling them irrelevant.
Perhaps something like Fortran or COBOL tends to be used just (or rather mostly) in legacy systems, yes - does that make them irrelevant? Assembly is still used, quite a bit in fact: wherever you see hardware with extreme resource limits, you're sure to find at least some ASM code (even in lieu of C) - think of something like a washing-machine controller. Another place it's often used is drivers. This new "fashion" of IoT is also a prime place for ASM to help out where a higher-level language just doesn't provide full control. And then there are still those who go and optimise by hand after a compiler has done its best.
Not to mention that in some cases certain things are simply impossible in the higher-level language. Even in .NET there are features impossible to implement in C# without dropping down to IL instead (i.e. .NET's "ASM").
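A concrete case, for the curious: the System.Runtime.CompilerServices.Unsafe class ships methods whose bodies are hand-written IL, because C# has no syntax for them at all. A minimal sketch of calling one (assuming the System.Runtime.CompilerServices.Unsafe package is referenced):

using System;
using System.Runtime.CompilerServices;

class IlDemo
{
    static void Main()
    {
        object boxed = "hello from IL-land";

        // Unsafe.As<T> reinterprets a reference with no runtime type check.
        // Its body is raw IL (essentially "ldarg.0; ret"), because C# offers
        // no construct for an unchecked reference cast.
        string s = Unsafe.As<string>(boxed);

        Console.WriteLine(s);
    }
}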
No matter the language used to produce a program, it's rather rare to see users trained in the language rather than in how to operate the program. Take something like SAP: 99% of all its training is aimed at end users, explaining which value to put into what field and which button to click when - nothing about the language used to develop those forms. Why would training on a "mainframe" be any different, especially for the operators? In fact, it's probably more useful to train users on a higher-level language (for things like scripts and automation) while leaving the lower-level stuff to actual programmers.
I'm with you on most of this, except that most users barely know where the power button is on their computer. I worked in IT at a public K-12 school system, and the multimedia/computer teachers were the only ones who had half a clue about computers (I'm from KY). The science teachers didn't know how to program, or care to learn; the upper-level science classes were probably the exception.
jeffery
jeffery c wrote: most users barely know where the power button is on their computer Well, I suggest that you don't make the kind of glaringly deceptive statement you'd use with those people when you post here on CP, which comprises millions of people who have made a living coding for years (many of us, decades).
The only programming languages that have become "irrelevant" are the faddish ones, which came and went, while the old -- faithful, solid, and trustworthy -- languages are still in use, and still being used creatively, alongside the new-fangled (if you're as old as I am) or recent (if you're not) OO languages.
The whole ms "universal" thing will go the way of all their fads, and in short order, because other people are doing better things -- whether individual gifted developers finding their own ways of doing things, or large firms looking to set standards. ms has already lost the battle, no matter what their marketing morons and wu-maos say.
Had windows phone been a success, things might have been different, but they killed windows phone with the very direction they took it in.
Understand that very clearly: UWP killed the windows phone -- no-one wanted it on their computers, therefore they didn't want it on their phones, either. If something stinks, you don't want to put it in your pocket.
Shame, really, because the baby-blocks thing, with its low-ish graphics overhead, was a good interface for a phone.
Note that I have really only responded to the title of your posting, not to the content, which is not entirely related to the title (don't get me started on titling things appropriately!). So, in response to your "shall I write articles about it?" question, I reply: why the Hell not?
There is always something to be learned from every direction taken in computing.
I wanna be a eunuchs developer! Pass me a bread knife!
Mark, sorry for the confusion; I meant that in the context of my previous IT job. No offense to the programmers on here, but normal people can be idiots, especially in KY, and some don't want to learn. I was referring more to normal users, not programmers, who deserve a category all to themselves.
jeffery
Fair enough, but don't forget that "normal people" can be absolute geniuses in areas where developers are absolute idiots.
"Bert isn't good at what I'm good at" doesn't mean "I'm good at what Bert is good at".
I wanna be a eunuchs developer! Pass me a bread knife!
No. The guy doesn't know anything. He thinks that UWP apps take over our whole screen. Ha ha ha.
The nuclear bomb launch systems AREN'T written in COBOL.
The bomb simulation routines were written in JOVIAL, a variant of Algol.
Most programs connected with the nuclear industry were written in Fortran. One just needs to look at the occasional ads from Los Alamos National Laboratory and Lawrence Livermore to figure this out.
COBOL figures prominently in systems for banks, stock trading, and large bureaucracies such as the IRS and the Social Security Administration.
By the way, Ada was used when the government replaced the creaky Air Traffic Control system serving the USA. The IBM processors used for Air Traffic Control had not been manufactured for a decade or more (no, they were not the 360/370/390 processors), but I think the room-sized processors could have been replaced with small CMOS processors with the identical instruction set.
Reading the ads in the Washington Post (during the glory days of the Cold War), one could see that the ELINT (electronic intelligence gathering) systems and C3 (command, control and communications) systems were also written in assembly language, for the unique processors produced by IBM.
Just FYI.
Thanks for that. I forgot about Fortran for a moment; if I'd remembered, I would have mentioned it. Was assembly language really used in nuclear systems?
jeffery
That question jogged my memory and I remembered the name AN/UYK. Googling that, I found they were the computers used by the US Navy. All the advertisements for that computer in the mid-1970s were for assembly-language programmers. With about 4K of memory, they were certainly programmed only in assembler.
Googling for Semi-Automatic Ground Environment (SAGE) turns up a network of 27 computers across the US, connected to radar sites that scanned the skies for incoming Russian bomber fleets. The vacuum-tube IBM mainframes had a maximum of 64K of memory. You can be sure that they, too, were programmed in assembler.
While SAGE gave the order to launch the missiles, the missiles themselves were controlled by on-board computers with 4K of memory. You can be pretty sure nothing would fit in that memory except assembly-language programs.
As the missile fleet was modernised, I am sure the US Department of Defense moved to Ada. Perhaps now it is in C or C++.
Agreed. I think the only real, creative "UWP" work being done inside Microsoft right now is WSL (the Windows Subsystem for Linux), a.k.a. cancer. Once those guys have implemented all the syscalls, WSL will be more attractive to develop for than UWP.
Speaking of COBOL, if you know who Grace Hopper is, I was actually in her office in the basement of the Pentagon once.
The fundamental design of COBOL is good; it was intended to be understood by non-programmers. Most other languages, like C/C++ and PHP, are too cryptic and peculiar.
Sam Hobbs wrote: I was actually in her office in the basement of the Pentagon once. That's an experience I'd swap a lot for.
The best thing about COBOL is that it was optimised specifically for handling financial data, so it's blisteringly fast when used in that field. The only language that can even come near to keeping up with it is C (with no plusses, sharps, etc.).
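Part of that is because COBOL does fixed-point decimal arithmetic natively (PACKED-DECIMAL / COMP-3 fields), so money never has to round-trip through binary floating point. A rough C# illustration of why that matters, with decimal standing in for packed decimal:

using System;

class MoneyDemo
{
    static void Main()
    {
        // Binary floating point cannot represent 0.1 or 0.2 exactly,
        // so the rounding error shows up immediately in financial sums.
        Console.WriteLine(0.1 + 0.2 == 0.3);    // False

        // Base-10 arithmetic (decimal here, packed decimal in COBOL)
        // keeps the cents exact, which is what ledgers demand.
        Console.WriteLine(0.1m + 0.2m == 0.3m); // True
    }
}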
I wanna be a eunuchs developer! Pass me a bread knife!
I should have returned to Grace Hopper's office sometime when she was there and told her I wanted more challenging work. I didn't think of that back then, but my life would have been different if I had. At the time I was working in the Pentagon, but I had very, very little to do.
As for COBOL, we need a language with similar requirements but improved with modern advances. I don't know what the current standard looks like, but it is probably held back by COBOL's legacy. I think it would be interesting to design a language based on COBOL's design, for use in internet servers the way PHP is. It could be, and should be, more professional than PHP.
COBOL was held back by the fact that it became one of the major languages used in banking, and those guys just don't handle advances the way everyone else can.
I'm still jealous as Hell of the people you had access to, though, even though it's hindsight that gives me the possibility of such jealousy -- you can't tell that something will be thought of as amazing in the future when you're living through it day by day.
It's good to know that I know someone who was there. Kudos.
I wanna be a eunuchs developer! Pass me a bread knife!
I don't think I ever talked with Grace Hopper directly; I attended a presentation by her. She gave all the participants a nanosecond (one of her famous lengths of wire, the distance light travels in a nanosecond). I was in the Army. I also wish I had appreciated Grace Hopper back then.
As for COBOL, it is more appropriate to say "financial" instead of "banking". COBOL was also used substantially by non-financial companies, such as aircraft manufacturers: the engineering and manufacturing systems for aircraft such as the L-1011 and the F-22 Raptor ATF were all COBOL. So it is probably most appropriate to say "business" -- it is, after all, called the Common Business-Oriented Language. You are right that it is oriented neither to scientific work nor to systems software (operating systems and compilers); the C language was designed to cover those systems-software uses.
Spoken like one who is "truly" outside the loop.
I love the "Oh, we're dumping ReactJS. We've rewritten it completely, with only 'small' breaking changes."
Yeah, let's go with AngularJS, the "more versions than fleas on a baboon" framework. Pick one you like. Uh, no, that one is obsolete now.
Besides, browsers are soooooo stable across platforms that they should be declared a standard... actually, there are so many standards you can't keep up with them.
You are probably right...
Having a true write-once, where you just change the I/O to match the operating system and platform, is probably a pipe dream...
Oh, wait...
I already have it working, using ASP.NET Core...
I know, I know... a Raspberry Pi is not really a computer...
Actually it is, and it's very stable if you use a 2-amp dongle.
So C# and a Pi talking to an MVC app running in the cloud is a lot of fun...
But MS is sooo confused.
Yes, they are probably just stupid to make almost all of their software releases backwards compatible, which protects everyone's investment in IP.
I know, being open source is so stupid, and letting the user community submit enhancements is way dumb... but they plod along just the same...
I can only say one thing...
If you want to write something... do it. It can't be worse than some of the stuff that's been getting approved lately.
There are some millions of users on CP; I guess someone will find what you post useful.
M.D.V.
If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
I would like to answer your title in one simple sentence: UWP is, in no way, better than WPF. Period.
Then, as for your articles, come on, man! Keep them coming. The thing is that the entire Windows Runtime was developed on top of the async/await pattern, and that pattern requires a lot of reconfiguration: of the way we think, develop, debug, and release an application, not the configuration of software, etc. There is nothing different, or confusing, about it. If you want to be a better Windows Runtime developer, I recommend becoming a hardcore async/await developer. I mean it.
1) Didn't they have problems with Windows 8 and Windows 8.1? How about the Aero effects? How about the other versions? People don't like change. And developers hate change, because, at least for me, we have to change, upgrade, migrate, or sometimes rebuild everything just because the SDK changed.
2) I don't use Hyper-V, as I don't like it. I personally prefer VirtualBox, and as for testing, I'd rather write good code than break a live or virtual app.
The sh*t I complain about
It's like there ain't a cloud in the sky and it's raining out - Eminem
~! Firewall !~
I've heard of people using VirtualBox with the Windows Phone image; is there a converter app out there that does that? I mean linking it to the XDE emulator (somebody in an article I read a while back was doing registry hacks, though). Personally, I like getting MSDNAA software for free. I plan on going back to school and learning more about computer science as I get older.
jeffery
jeffery c wrote: using VirtualBox with the Windows Phone image That would be the worst thing to do. Don't do it.
1) It would be extremely slow.
2) Even if you could get something x86-like working that way, there would be a lot of consequences.
3) Those who do it are IT guys. They don't know anything about programming; all they care about is that blue Windows logo.
If you really have to, consider Hyper-V, as that is the platform they require you to use. There is a method where you set up two boot entries, so you can boot with Hyper-V enabled when working and boot with Hyper-V inactive when at home. Otherwise, consider an online solution.
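For reference, the dual-boot toggle usually described works by cloning the current boot entry and switching the hypervisor off in the copy. Roughly, from an elevated command prompt (a sketch: the GUID is whatever bcdedit prints for the new entry):

bcdedit /copy {current} /d "Windows - no Hyper-V"
bcdedit /set {new-entry-guid} hypervisorlaunchtype off

Reboot and pick whichever entry you need from the boot menu.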
The sh*t I complain about
It's like there ain't a cloud in the sky and it's raining out - Eminem
~! Firewall !~
Afzaal Ahmad Zeeshan wrote: the entire Windows Runtime was developed on top of the async/await pattern
I thought that was a fantastic improvement, but the fact that it only runs on W10 (I finally have a W10 laptop, purchased a couple of months ago) was a total showstopper for me in bothering to even learn more about it.
Marc
Latest Article - Merkle Trees
Learning to code with python is like learning to swim with those little arm floaties. It gives you undeserved confidence and will eventually drown you. - DangerBunny
Artificial intelligence is the only remedy for natural stupidity. - CDP1802
Marc Clifton wrote: bothering to even learn more about it. Let me tell you what I learnt in my past 6 months of literally-hair-pulling-ly-rough experience: it works fine, in 2 different ways.
- You don't use async/await at all, and perform all the operations synchronously.
- You write every function call with an await operator. That way, most of the errors (such as illegal operation, invalid state, etc.) go away.
I would go with the second one, but like you said, it still requires us to learn more about it. I built one of my applications 1.5 years ago with zero async/await applied, because I did not know anything about it. But I was working on another application a few weeks back, and since I now knew async/await, I added a few function calls as awaited. After that, it forced me to make sure every call in the stack was awaited; otherwise,
- Illegal state
- Invalid operation
- Infinite loop
were only a few of the problems to tackle and work on. I think in the next update Microsoft should work on these things, to make sure asynchrony is managed by the OS itself and, even while remaining a language feature, stays a pain of the kernel only.
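A minimal sketch of the difference between the two approaches (the names are made up for illustration):

using System;
using System.Threading.Tasks;

class AwaitDemo
{
    static async Task SaveAsync()
    {
        await Task.Delay(100); // stands in for real I/O
        Console.WriteLine("saved");
    }

    static async Task Main()
    {
        // Fire-and-forget: the call returns immediately, its exceptions are
        // lost, and anything that assumes the save finished races against it.
        _ = SaveAsync();

        // Awaited: completion and failures flow back to the caller, which is
        // why one await tends to force awaits all the way up the call stack.
        await SaveAsync();
    }
}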
The sh*t I complain about
It's like there ain't a cloud in the sky and it's raining out - Eminem
~! Firewall !~
Afzaal Ahmad Zeeshan wrote: I think in the next update Microsoft should work on these things, to make sure asynchrony is managed by the OS itself and, even while remaining a language feature, stays a pain of the kernel only.
I'm not convinced this is reasonably possible. I assume that by "asynchrony is managed by the OS itself" you mean the OS should re-use a blocking thread's time-slice? How would you then maintain state for the blocking code? If you save the stacks, registers, etc. before loading the new code, you're still basically performing a context switch, with even more overhead. The OS and hardware are already both asynchronous and multi-threaded from their standpoint: the hardware has a specific number of cores ("threads"), and the OS switches kernel threads ("tasks") across them to make the best use of them, including suspending kernel threads that block. Abstracting those kernel threads into another thread/task pattern is of no concern to the OS, and rightfully so.
Making the language asynchronous by default would be possible, but I'm not convinced it would be a benefit. Multi-threaded applications might rely on the main thread being synchronous, which would no longer be guaranteed. Still, just enabling asynchrony by default might be an alright idea: you could simply wrap the return value in a Task<T> under the covers, and if the function doesn't use await then it just runs synchronously:
using System.Threading.Tasks;

public class Whatever
{
    // Runs synchronously anyway; the compiler just wraps 2 in a completed
    // Task<int> (and warns, CS1998, that the method contains no await).
    public async Task<int> ReturnTwoSynchronously() => 2;
    public int ReturnTwoSynchronously2() => 2;
}
At the end of the day though, is it worth the effort? My opinion - probably not.
Oh, some good points.
The sh*t I complain about
It's like there ain't a cloud in the sky and it's raining out - Eminem
~! Firewall !~
For you old C/C++ devs: await is the new const correctness.
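And like const, it is viral: mark one method and every caller has to change too. A hypothetical sketch:

using System.Threading.Tasks;

class Propagation
{
    // Once this method becomes async...
    static async Task<int> LoadAsync() => await Task.FromResult(42);

    // ...its caller has to become async in order to await it...
    static async Task<int> ProcessAsync() => await LoadAsync() + 1;

    // ...and so on up the chain, much as adding const to one C++ member
    // function pushes const onto everything that touches it.
    static async Task Main() => await ProcessAsync();
}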