|
bada bing
"A little time, a little trouble, your better day"
Badfinger
|
|
|
|
|
Yep, I totally agree with you! The grouping thing is so, so annoying! I wasted so many hours of my life clicking/looking/complaining over the taskbar. Luckily, I solved it using Stardock Start11, which I found in a comment here on CodeProject. And thanks to the people who like to share their findings; I was going crazy, too!
You are not alone
|
|
|
|
|
I too have struggled with this missing feature in Windows 11. The grouped setup is the default in Windows 10, so I'm guessing that's why so many who have replied do not know what you are talking about. It really is wild how much more time switching between windows of the same application takes with the default (grouped) taskbar setup. But then again, I'm always amazed at how many technology professionals and developers don't really know how to use Windows and primarily work in applications that do not have multiple windows at the top level.
|
|
|
|
|
Solve the problem by using Windows 10.
Those that put class in JavaScript are the same that put var in C#
|
|
|
|
|
I have been sticking with Windows 10 and Stardock's menu system. Why change when there is no need to do so?
Microsoft is notorious for changing things just for the sake of changing them. And they also "throw out the baby with the bathwater" for some reason...
After so many years working with the Microsoft development environments, I have decided to stop upgrading my tools on their say-so. As a result, I won't use their Core web development tools (i.e., Blazor) because, after working on a very large MVC project a number of years ago, I saw no reason to replace ASP.NET WebForms. Those who contend that the new environments are much more efficient may be correct about their internals, but from an implementation standpoint these environments are simply far more complex and, as a result, big time-wasters for most professionals.
Concentrating mostly on Desktop development in the past several years, I am sticking with WPF even though a host of other branches of XAML tools have cropped up. Given that most of them are merely forks of WPF, I decided to stay with the original.
.NET Core? Meh! Microsoft took out more than they put in leaving it to third parties to rewrite what already existed in the original .NET Frameworks. One example of this was WCF, which was left out and as a result, a third-party team has just released their 1.0 version of CoreWCF.
Before anyone starts yelling at me, please note that I am not saying these new technologies are not better refined than what was found in the original .NET Frameworks. My contention is: why should we constantly upgrade from a mature, very stable platform to one that is still being developed? In short, why bother?
Do we really need to constantly rewrite our applications simply because Microsoft has this penchant for putting the entire community through massive trauma every time it decides to create a new product?
To date, I have stuck with the .NET Framework 4.6. It is very stable and does what I need it to do. And it has all the development tools I could possibly need.
True, the original frameworks will be going out of support, but who cares? When was the last time anyone called Microsoft because they needed support for an internal issue with one of the versions of the .NET Framework? Microsoft did great work with this framework and should have simply left it alone and refined it.
The idea that we need cross-platform development may be true for some developers, but for the most part professionals work in fairly stable OS environments that will not change. For the cross-platform requirements, Microsoft could have followed already existing models that provided tools for such development.
To hear a multi-billion dollar corporation such as Microsoft complain that it would have been too hard to take the original software and refine it to work in other environments is nothing but BS. Open Source developers did it with Mono, MonoDevelop, and MonoGame, to name one example. Another well-known one is Xamarin.
Need I say more?
To date I still do not understand the drive to Windows 11 when just a few years ago Microsoft made a very big thing out of the fact that Windows 10 would be the last major version of Windows. This sounded very fishy given Microsoft's penchant for breaking their word but it appeared to be true.
Given that Windows 11 doesn't really bring all that much to the table for most developers, I would stick with Windows 10 for as long as one can. Unfortunately, all of us will be eventually forced to use Windows 11 once new machines no longer offer Windows 10 operating systems.
The thing of it is, there is always a tool to work around the nonsense that Microsoft implements with its upgrades. The only other choice we have is to turn to Linux for a desktop OS. I have been researching that for years, and increasingly this is becoming an option as more and more language compilers are offered under Linux, such as Python with JetBrains' freely available PyCharm IDE, which is quite nice.
Satya Nadella has to literally get his head out of the "Cloud," as he has admitted that this is his focus. If he keeps it there, I suggest that over time many of us will eventually leave the Microsoft development ecosystem simply out of frustration and exhaustion with their erratic behavior...
Steve Naidamast
Sr. Software Engineer
Black Falcon Software, Inc.
blackfalconsoftware@outlook.com
|
|
|
|
|
Steve Naidamast wrote: Why change when there is no need to do so?
It is a work laptop, and I have no say about the OS... At home I have been using Fedora for more than a decade, and for at least the last six years I haven't even had VMs of Windows...
"If builders built buildings the way programmers wrote programs, then the first woodpecker that came along would destroy civilization." ― Gerald Weinberg
|
|
|
|
|
When I worked on the mainframes back in the years when dinosaurs were still considered an option for pets, I did both batch and online development. Only once in 12 years in this part of the field did we have or even need an upgrade to the COBOL compiler.
With Microsoft, both the C# and VB.NET (no longer being evolved) languages have had so many features added to them that the source code can look so arcane as to be unintelligible. This was one of the factors that destroyed the Nantucket Clipper compiler, which was very popular in the 1980s and 1990s.
Microsoft could land itself in the same situation the way things are going with them.
As professionals, we don't need constant change. What we really require is stability...
Steve Naidamast
Sr. Software Engineer
Black Falcon Software, Inc.
blackfalconsoftware@outlook.com
|
|
|
|
|
Steve Naidamast wrote: Microsoft is notorious for changing things just for the sake of changing them. And they also "throw out the baby with the bathwater" for some reason...
You must need an Altair computer.
Charlie Gilley
When MS changes the interface and it bothers you, remember, somewhere there is an Altair computer out there for you.
|
|
|
|
|
Does Open Shell work on Win11? If so, it's free. I've been using it on Win10 since Win10 was released, as it provides a Win7 style Start menu.
|
|
|
|
|
Yes, but I think you need to use the beta version. That's what I'm running.
|
|
|
|
|
Thanks for the tip. I'm continuing to use Win10 for the foreseeable future. While my desktop (upgraded a year ago) will run Win11, my 7-year-old laptop will not, so I'm holding off until I have to replace the laptop.
I tried other Win10 menu replacements, but Classic Shell / Open Shell has been the best and problem-free.
FYI for other readers -- Classic Shell was available from 2009 to 2017, when the author quit supporting it. It was transitioned to Open Shell, which I'm currently using.
I'm using the current release 4.4.170, although there is a 4.4.189 pre-release. @sasadler, is that the version you're recommending for Win11?
Releases · Open-Shell/Open-Shell-Menu · GitHub[^]
I'll sometimes use a pre-release on my laptop, but don't mess around with my desktop.
|
|
|
|
|
Yep, I'm using the 4.4.189 version.
|
|
|
|
|
That's why I'm waiting for Windows 12 to upgrade. The rule is, every other release is worthless.
Da Bomb
|
|
|
|
|
Kevin bless the ARM Cortex A family.
As soon as you're using one, you basically need to run linux or android.
Particularly if it does HDMI and has a GPU.
I've been combing through ZephyrOS's compatibility, and there's nothing useful.
Even if there was, nothing would support HDMI and a GPU.
ZephyrOS+LVGL+Arm Cortex A53+Mali GPU would be a winning team.
Instant boot. Nice, modern-looking UI. Very little DDR3 RAM required, and the whole thing could fit in a hundred megabits of flash or so, even for the most involved applications.
It's what I want. I know how to wire the hardware now. Those SBC thingies are actually easy to wire up. The trick is messing with BGA (so I'd just leave it to the professionals)
The problem is the software. ZephyrOS doesn't support it, and LVGL doesn't support OpenGL 2D acceleration, though I could probably implement that. The issue is no driver level support for the HDMI facilities of these chips. I wouldn't know where to begin.
I need a friggin team. This needs an open source project. But yeah. Far be it from me to think I can get any traction on something like that without an initial offering to show people.
And getting from here to there? It may as well be the moon. I probably won't live long enough if I had to do it on my own. I don't have enough man years left in me.
Why isn't there something - if not this - some kind of parallel out there? The use cases are endless. The predictability of such a package compared to something linux or android based - there's no comparison. The former is more predictable, real time, more testable, and has less boot time (almost none) and far less resource requirements.
It would make building smart machines so much less expensive, both to build and maintain an ongoing product. They would be less "crashy" - they would be less impossible to flowchart, if you needed to do that.
These damned things can run webkit, sure, but what if you don't need all that, don't want it, and don't want to pay for it?
This is a software problem, ultimately. That's what kills me. It's solvable. I just can't do it myself.
To err is human. Fortune favors the monsters.
|
|
|
|
|
If someone has built an Android smartphone or tablet around it (or one of its close cousins), then there is an appropriate hardware abstraction layer for Linux out there for it. I would look for companies that produce development boards for such devices and see if the software for the development board is downloadable in source code form, possibly after registration.
You probably (certainly?) won't be able to use the code in a commercial product, but at least it'll get you running.
Your Google-fu for this sort of thing is probably better than mine.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
So far the only "source" I found was for the Linux images, but rather than source, it's a collection of binary blobs and scripts.
I'm not sure if the source code for, say, the Orange Pi Zero 2 is available at all, despite these boards supposedly being open source.
To err is human. Fortune favors the monsters.
|
|
|
|
|
It might have a bit of a heavy footprint, but would RiscOS fit the bill?
|
|
|
|
|
I doubt it. I mean, getting an RTOS running on an ARM Cortex A53 is one thing. Getting it running on the H3 SBC that the ARM Cortex A53 is embedded in is another story.
I can pretty much guarantee RISC OS doesn't support an AllWinner H3 or an H616. At least not the peripherals on them, like HDMI.
To err is human. Fortune favors the monsters.
|
|
|
|
|
I missed the start of this, but I sense from the wording this may be part of an ongoing "question"...
How much processing oomph is needed?
If it is not too much, is the RP2040 chip enough? (As featured in the Raspberry Pico and various Adafruit boards, etc.)
It has a couple of Arm cores, some RAM, some Flash and a bank of PIO - this last bit is the interesting bit: in my head it's like a little bit of FPGA, but in any case you can program it (in a C-like language) to off-load intensive operations from the CPUs.
The crux here being that folks have programmed the PIO to drive HDMI displays and so forth...
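For a flavour of what that looks like: PIO state machines are written in a tiny assembly dialect (pioasm) and then loaded and driven from the C-SDK. Here is a minimal sketch, essentially the classic square-wave example from the SDK material, that toggles one pin forever with no CPU involvement (the label and comments are mine):

```
.program squarewave
    set pindirs, 1   ; configure the pin as an output
again:
    set pins, 1 [1]  ; drive pin high, then delay one extra cycle
    set pins, 0      ; drive pin low
    jmp again        ; loop back; total period is 4 PIO clock cycles
```

The C side just loads this into a state machine, maps the pin, and sets the clock divider; after that the state machine runs independently of the cores.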
TBH, I've never used the PIO to drive HDMI, but I have used the RP2040 in a number of projects and it's been surprisingly good. I've only ever used the C-SDK, but there's support for various embedded-Python dialects too, they tell me.
The C-SDK also includes ports of LWIP (for Ethernet stuff) and TinyUSB (for USB host or device stuff) amongst other libs.
I've not used LWIP on this (I have used it on other boards in the past; it worked well, but its API is quite unlike BSD sockets!), but I have used TinyUSB a fair bit and was surprised by how well it worked (mainly because I found, and still find, the API confusing, and was surprised when my prototype worked at all!).
Also, cheap as chips* (*or other item idiomatically identified as low in cost by your respective cultures.)
|
|
|
|
|
PIO can't handle this.
Nothing Arduino capable can drive a 40 pin dot clk display, AFAIK.
To err is human. Fortune favors the monsters.
|
|
|
|
|
I'm not bitbanging HDMI. Bitbanging something doesn't even count. You can bitbang just about anything in theory, but that doesn't mean it's "supporting" it. You're basically hacking at that point, and using up CPU cycles to do something that should be done in hardware and *is* done in hardware on an SBC.
The RP2040 cannot drive a 24-bit color 40 pin dot clk display. It just can't. And doing it by bitbanging HDMI and using up all my cycles on that is not realistic.
It calls for hardware that's actually meant for it.
The RP2040 is not.
And PIO cannot be used to program against most Arm Cortex A based SBCs. There are not even board entries for them.
To err is human. Fortune favors the monsters.
|
|
|
|
|
I agree that avoiding Linux would be very elegant. But I fail to grok why Linux is a showstopper. Crashy Linux? Really?
There are also people talking about Yocto Linux. "Highly customizable" say the fanbois, "But dodgy AF toolchains" say others. Me, I am deploying on 64 core monsters, so all this is just speculation, and curiosity.
"If we don't change direction, we'll end up where we're going"
|
|
|
|
|
If you're thinking embedded: The machine runs a single fixed set of functions. You don't load arbitrary new executables at run time into an embedded system. Like any specific Linux executable, it utilizes a tiny little speck of the total Linux offering.
An IoT runtime such as Zephyr is split into tiny functional fragments, and only those fragments actually referenced by the embedded code are linked into the image for the embedded system. The OS footprint may be surprisingly small.
A Linux system is prepared for additional new executables being loaded at run time. It must include all the functionality that these executables might request. In a standard Linux system, the unused code may reside on disk, but most embedded systems have no disk. So all the code that might be requested at some future time must be loaded to flash or to RAM from an external source at every restart (and then the external source must be available!).
Linux, at least in some distros, is quite configurable. Yet the flash/RAM footprint is very much higher than for dedicated embedded OSes. Maybe the configurability does not extend to removing every OS reference to, e.g., the disk or the memory management system - smaller embedded CPUs may lack an MMS. You might say that this careful shaving of standard Linux down to only what your specific embedded functionality needs is exactly what the providers of specialized embedded OSes (such as Zephyr) have done for you. (Note: I do not know whether Zephyr is based on pieces of Linux code or is completely independent.) They may have shaved off some core code needed for drivers hardly ever used by embedded systems - the UI is typically based on pushbuttons and dials, LED indicators, and small b/w (no gray!) low-resolution LCD panels. Drivers are typically tailor-written - the general driver architecture is much too general to fit in.
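In Zephyr that per-application shaving is driven by Kconfig: anything not enabled in the application's configuration is simply never compiled or linked. A sketch of a hypothetical prj.conf (the CONFIG_* names are real Zephyr Kconfig symbols, but the particular selection here is purely illustrative):

```
# prj.conf (hypothetical application) - only enabled subsystems end up in the image
CONFIG_GPIO=y          # pushbutton/LED style I/O
CONFIG_DISPLAY=y       # small display panel support
CONFIG_LVGL=y          # LVGL UI library
CONFIG_NETWORKING=n    # no network stack built in
CONFIG_SHELL=n         # no interactive shell built in
```

The build then links only the subsystems the application enables and references, which is why the resulting image can be so much smaller than even a trimmed Linux.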
HDMI is not a typical UI device in embedded systems! You may write a HDMI driver (assuming that required hardware is available), but I suspect that Linux HDMI drivers lean heavily on the standard driver architecture, assuming that a lot of functionality is handled by standard code.
ARM started out making embedded CPUs, and the smaller models are still used for that purpose. AArch64 is certainly not aimed at the embedded market. Running Linux on a 64-bit ARM is fully possible and has been done on countless ARM-based machines for years. They are not embedded systems, but general Linux machines. If that is your kind of system, maybe with a few gigabytes of RAM and many gigabytes of disk, then go for Linux, and you will have lots of drivers for all sorts of peripherals.
My impression is that the OP leans much more towards the embedded side, and a full Linux is like shooting sparrows with cannonballs.
|
|
|
|
|
Any full fledged OS is going to be harder to guarantee and verify than an RTOS.
When I say "crashy" it's relative. In this case relative to an RTOS, linux is much more likely to fail in unpredictable ways.
To err is human. Fortune favors the monsters.
|
|
|
|
|