Management's 5th amendment: "it all depends"
I have had both go swimmingly and both fart loudly.
A recent update of Debian 11 to 12 wound up in a loop trying to configure the kernel. I had to use Timeshift to go back and remove 2 packages that had patched the kernel, then redo the upgrade. I like to keep my home folders on a separate partition, and data on its own as well.
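For anyone hitting a similar loop, the recovery boiled down to something like this (the two package names below are placeholders - yours will differ):

    sudo timeshift --restore                    # roll back to the pre-upgrade snapshot
    sudo apt purge my-kernel-patch-a my-kernel-patch-b   # remove the offenders (names made up)
    sudo apt update && sudo apt full-upgrade    # redo the 11 -> 12 upgrade
    sudo dpkg --configure -a                    # finish any half-configured packages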
Did an upgrade of a domain controller (ca. 2010) and had to start over when it rolled over and died. Fortunately, that was a VM and backed up. What failed? Beats me. We do Windows servers in VMs for a reason.
I think a format/install is best, but it all depends on how much stuff you have added and how easy it is to recover. I also think the lack of a registry in Linux makes it easier, but I have no empirical data to prove it. Just an old fart's feelings.
>64
Some days the dragon wins. Suck it up.
|
Yeah, the registry's gotta be a mess for an upgrade to process...
Incidentally, I have a domain controller (2008 R2) that I've been needing to upgrade to the latest for...a few years now?
Being a DC, I don't want to restart with a clean install...and I dread the in-place upgrade. OTOH the VM only has that one role, and it's only used to authenticate a few users here in my home environment, very little else. You'd hope the upgrade would be straightforward...
|
Quote: You'd hope the upgrade would be straightforward...
Shirley, you jest. We went from 2003 to 2008 to 2010 to 2012 and will go to 2019 in a week or two. Most were accompanied by new hardware, but they all had VMs and went well, with one big exception. As long as you follow the Windows server upgrade bouncing ball, it should go well. Fortunately, here they have few users/systems.
I helped an accountant's office recover from ransomware; they were still running Server 2003 SBS. I think they had only 5 users, so I just did a full reinstall. I have recovered/helped recover from 3 events. None are fun, but all had very recent, protected backups of at least the data. One was weird: they only encrypted office files and PDFs. The PBX system never flinched. Fun-filled weekends.
I am now running a domain with a Debian 12 server using Samba as the domain controller. Testing for the future.
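For anyone curious, provisioning a Samba AD DC on Debian 12 boils down to something like this (the realm and domain names below are made up; note that Debian masks the samba-ad-dc service by default):

    sudo apt install samba winbind krb5-user
    sudo samba-tool domain provision \
        --use-rfc2307 \
        --realm=EXAMPLE.LAN --domain=EXAMPLE \
        --server-role=dc --dns-backend=SAMBA_INTERNAL
    sudo systemctl disable --now smbd nmbd winbind   # file-server daemons conflict with DC mode
    sudo systemctl unmask samba-ad-dc && sudo systemctl enable --now samba-ad-dc

After that, it's mostly pointing clients' DNS at the new DC and joining them as usual.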
>64
Some days the dragon wins. Suck it up.
|
A colleague of mine always did a clean install instead of an in-place upgrade on Ubuntu in the early 2010s, because he had burned himself more than once with incompatibilities in Gnome. So, GUI Linux was different.
There is a lot of crap under Linux, too, here and there, as I've seen personally on a Raspberry Pi after upgrading.
The feeling? Do you have the same understanding of your Linux as you have of your Windows installation? Maybe on the former, the crap is considered magic and left untouched.
|
Peter Adam wrote: The feeling? Do you have the same understanding of your Linux as you have of your Windows installation?
Good point. No, I definitely don't know Linux as well as Windows. That, in itself, probably explains my bias against Windows and why I don't trust it as much not to screw up an upgrade.
Maybe if I knew more about Linux, I'd come to the conclusion that it's just as likely to screw something up. I know it can. I just have nothing to go on to estimate the likelihood of it happening.
As an aside - that Debian 11 -> 12 upgrade went off without a hitch, and didn't even require a reboot. Try that on Windows...
|
I have (mostly) successfully upgraded Linux (Debian) one release at a time with no serious after-effects. However, in one case it killed the main app the machine was for (NextCloud), because the upgrade changed the PHP and Python versions and a lot of the app is version-specific. Regressing to PHP 7 was a total pain; it took me three days of fiddling to get it done properly. Python was easier.
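For anyone facing the same thing, the Debian-side mechanics of running an older PHP are roughly as follows - assuming the 7.x packages are still installable (e.g. from a pinned third-party repo) and an Apache + FPM setup:

    sudo apt install php7.4 php7.4-fpm                     # pull the older runtime back in
    sudo update-alternatives --set php /usr/bin/php7.4     # make the CLI default to 7.4
    sudo a2disconf php8.2-fpm && sudo a2enconf php7.4-fpm  # point Apache at the 7.4 FPM socket
    sudo systemctl restart apache2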
Another upgrade that went sideways was my Domoticz server, however that was fixed by a simple re-install.
So, in conclusion.... for Linux, it all depends.
As for Windows, I upgraded from 95 to XP and then to 7 with very little trouble; upgrading that to 10 caused one or two driver problems, and USB behaviour became very flaky. Since I needed a 64-bit OS for some software, this was the occasion to finally do a clean install.
I have kept a pared-down copy of the final 7 version as a VM since 10 killed some software.
So old that I did my first coding in octal via switches on a DEC PDP 8
|
I had rather the opposite experience. In the past, for example, I tried to use Fedora on a laptop of mine. With new versions coming out rather frequently (twice a year), I had problems each and every time, from around Fedora 16 or 17 until Fedora 24. Each time, something wouldn't work, would hang the computer on reboot, or would even hang the installer. WiFi pretty much never worked afterwards; I had to reinstall it manually over and over again.
Luckily, with Linux Mint, things got better. Though I am currently at the same point, where it won't update to the latest version and always complains about some weird dependency change (I am just using Linux to develop applications; I don't even have time to futz around with the OS itself). Same for my RPi4: I just downloaded the latest image and will have to do a clean install. It just won't do a proper in-place upgrade; while it starts off fine, in the end it grumbles about something not being updated and leaves me where I started.
I did dozens of upgrades, for example from Windows XP to Windows 7 (skipping the nonsense that was Vista), just fine. Maybe a newer printer or scanner driver and deleting the Windows.old folder, and the user kept going without issues.
Same when people upgraded from Windows 7 or Windows 8.1 to Windows 10. Very little issue; maybe some user application didn't like the newer .NET crap and an older version had to be installed by hand (looking at you, Intuit/TurboTax). I haven't bothered with any upgrades to Windows 11 yet, though a couple of clients fell for the M$ bullying and clicked the upgrade. The most common complaint was that Classic Shell was deactivated and they had to deal with that horrid, nonsensical start button/tile kind of user interface instead of a proper Start Menu "like it used to be". And those fancy cartoon icons...
|
I have never had a problem doing a Windows in-place upgrade since W7, as long as I have done a thorough update (drivers, etc.), tune-up, and virus scan of the current setup first. An upgrade is an upgrade NOT a repair.
|
AAC Tech wrote: An upgrade is an upgrade NOT a repair.
There's a lot of wisdom in that.
I have a system that was set up with Windows 10 (clean), for the very first time, back in June. From the get-go, it has NOT been able to install Microsoft's monthly cumulative updates (CUs) - starting from a clean state! None of the articles on failed updates I've come across have helped. Every month, I keep hoping that month's CU will somehow manage to get things sorted out.
The October update seemed promising at first, when it tried to install itself, as it also included a servicing stack update. It ran for a lot longer than previous updates, and went farther (%-wise) than any previous update so far. But in the end, it still failed just the same. I'm probably just going to bite the bullet and repave that machine.
This is not unique to that system. I also used to have a Server 2019 VM that couldn't install any update, even from a fresh install. And given it was a VM, on Hyper-V, there were even fewer chances of a "bad" third-party driver or some such causing some obscure failure. So, end-to-end, it was all Microsoft software, including the VM's abstracted hardware... I ended up nuking that VM, reinstalling from the same ISO, and that time around it worked fine... go figure.
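For reference, the usual first-aid for stuck CUs - the kind of thing those articles suggest - looks like this, run from an elevated prompt:

    DISM /Online /Cleanup-Image /RestoreHealth
    sfc /scannow
    net stop wuauserv && net stop bits
    ren C:\Windows\SoftwareDistribution SoftwareDistribution.old
    net start bits && net start wuauserv

Worth burning through before a repave, even if it didn't crack this particular case.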
|
Depends on your use-case.
I've had more Debian in-place upgrades fail than I care to remember.
About half of them.
It depends heavily on what packages you use:
- do you have additional apt sources configured?
- do you package code to fill in dependencies that aren't readily available?
- do you rely on closed source drivers?
Any of the above can cause issues; a quick pre-flight check for them is sketched below.
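Something like this, before pulling the trigger on a dist-upgrade (the last pattern assumes aptitude is installed):

    # any third-party apt sources enabled?
    grep -rh '^deb ' /etc/apt/sources.list /etc/apt/sources.list.d/ 2>/dev/null
    # any packages on hold that could wedge the resolver?
    apt-mark showhold
    # any installed packages that don't originate from Debian itself?
    aptitude search '~i(!~ODebian)'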
Also, when it breaks, it often breaks spectacularly, with no way to recover.
That is why I moved from Debian and Debian-based to Arch.
At least with the rolling releases, it breaks in a way that's easy to fix.
Since WSL1, however, I've been sticking to Windows Pro exclusively.
I love running shell-based Linux without needing a hypervisor.
WSL2 has no value for me though, because that's basically running a VM.
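For anyone who also prefers WSL1, it's easy to pin (the distro name below is just an example - use whatever wsl -l -v reports):

    wsl --set-default-version 1    # new distros install as WSL1
    wsl --set-version Ubuntu 1     # convert an existing distro back to WSL1
    wsl -l -v                      # confirm which version each distro runs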
|
Kate-X257 wrote: WSL2 has no value for me though, because that's basically running a VM.
Yeah, that came as a surprise to me. I was rather impressed with the WSL1 architecture, in that it would work at all... but then, to throw all of that away and essentially turn WSL2 into a plain ol' VM...? That was somewhat disappointing, since all-out VMs are so much heavier.
|
I work on an air-gapped development network. Every year or so, I go out and spend a week or so on an internet-connected machine and download Visual Studio and Android Studio and all the bits and pieces required for those products, plus the libraries they want to download to support our projects. I copy all that onto DVDs, transfer it to the air gap, and set up some scripts to install it all. Yes, it is a colossal PITA, but it's what I've got to do for my work environment. VS has been getting worse each release as Microsoft ignores their offline developers more and more. If you need all those damn NuGet packages to do the most basic development actions like unit tests, include the *&(^# things in the offline packages! ... Okay, rant over.
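For those who haven't done it, the offline download itself is the VS bootstrapper's documented --layout switch; the workloads below are just examples, trimmed to a desktop-style setup:

    vs_professional.exe --layout D:\VSLayout ^
        --add Microsoft.VisualStudio.Workload.ManagedDesktop ^
        --add Microsoft.VisualStudio.Workload.NativeDesktop ^
        --includeRecommended --lang en-US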
Android Studio hasn't gotten worse; its process has been the same for years now: run it online, then bundle up the repository cache and take that offline.
I'm getting ready to do the big nasty for this year, and my question deals with Visual Studio. Currently, we have 2019 (with about half the major components) on the air-gap network, with all current patches applied. I'm soliciting opinions on:
- how much more does 2022 want to access the internet in its normal course of operations, once you have all components for your project on the local machine? For reference, 2019 works fine with no delays trying to access the internet to do "other stuff" or look for updates or phone home.
- is it worth it to upgrade to 2022?
We have a suite of .NET Framework 4.7.2 WinForms applications, along with both C and C++ programs. No web and no database... although I can see some small local DB stuff coming. This question only applies if we stick with .NET Framework. We're thinking about migrating to .NET 7 (or whatever the current version is), which would force us to upgrade and render this question moot.
- how easy is it to set up a local NuGet server with just those packages put out by Microsoft, and maybe a few other select sources?
We are excessively paranoid about third-party stuff here, so we don't really use much that we can't get the source for and compile ourselves - so I'm not talking about all those random open-source packages that are out there. Yes, I know MS isn't qualitatively better, but my overlords are much happier if I can point to them, or some other recognized corporate purveyor of SW tools, as the source of a binary.
Data is transferred the old fashioned sneaker-net way, using DVDs. Having 10s of them is not a problem, but having 100s is.
Also, don't suggest creating a VM on the internet side and then transferring that VM to the air-gap side. This one is a NO for non-negotiable reasons; please don't ask what they are and don't try to argue why this should be an option.
Thanks in advance for your thoughts.
Be wary of strong drink. It can make you shoot at tax collectors - and miss.
Lazarus Long, "Time Enough For Love" by Robert A. Heinlein
|
Quote: Having 10s of them is not a problem, but having 100s is.
How about a USB SSD drive? 4 TB for about $250.
|
Corporate policy matches the standard for any air-gapped network: no removable writable media. The DVD sessions are closed, therefore not writable, before being used in the air-gap system.
Be wary of strong drink. It can make you shoot at tax collectors - and miss.
Lazarus Long, "Time Enough For Love" by Robert A. Heinlein
|
Can you do Blu-ray? They can hold up to 50 GB on a dual-layer disc.
|
You're not gonna like my thoughts, so feel free to skip this altogether. I understand what I'm about to write is a non-starter for you. My intent is not to rock the boat. And I fully realize none of this helps you in your current situation.
My thinking is, if a dev machine absolutely, positively, by design, has to be air-gapped, then from the get-go, something's very wrong with the picture.
I absolutely understand the need to air-gap things. But you don't develop/test against real servers with real data, you do all of that with a lab you can take down/rebuild on-demand. Externalize your connection configuration. User accounts shouldn't lead to valuable data. Work with made-up data. If that gets breached? There should be nothing of value lost.
I say this with all due respect. I understand you don't necessarily have a say in this. All I'm pointing out is that things are being made unnecessarily complicated for you because someone along the chain is making bad decisions. Why does a dev box need to be kept isolated from the rest of the world?
All that being said - sometimes you lose your live internet connection for reasons outside your control, and VS (2022 especially) has become awful at managing connected/disconnected states, but that's a rant for another day. If you can stick with VS2019 and it's working well for you...stick with it. My offline experience with VS2022 hasn't been a positive one.
|
That may work if the data is the protected IP. What happens if the algorithm is the protected IP? In this case, air-gapping the developer's platforms is the only sure way to protect the IP.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
The word "patent" comes to mind, though I can't imagine that, in the real world, it's all that effective at protecting said IP. At the same time, I can't imagine that even the largest software companies today, working on the most secretive stuff, go completely offline. Lots of security checks, everything being monitored 24/7 in an automated fashion, sure, but completely offline? Maybe a few small labs here and there, and that's it...
But then, I've never really had the exposure to that sort of thing, so who am I to speculate...
|
I would think that in certain defence-oriented organizations, all IP is secured in an air-gapped network. Every developer would have two platforms, one connected to the Internet and the other connected to the air-gapped network.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
Thanks for your experience with 2022 working offline. If we decide to migrate to the new .NET, we're going to have to upgrade to 2022 - unless someone knows a way to make 2019 support .NET 7. If 2022 is that unreasonable about working offline, that becomes a con in the debate about migrating. I'll be doing some experimenting with that when I'm out in internet land.
As for the rest -- given your underlying assumptions (which you can deduce, based on your post), your comments are correct in every manner, and I agree with them. But ... you knew there was a but coming ... in my case, those assumptions are not correct and the development environment is absolutely correct for what I work on.
Be wary of strong drink. It can make you shoot at tax collectors - and miss.
Lazarus Long, "Time Enough For Love" by Robert A. Heinlein
|
I can only speculate, and you don't owe me any explanation or need to elaborate. I'm sure it can all be justified. All I can say is good luck, and I hope you're well compensated for the extra hoops you're being made to jump through.
|
Daniel had it in one guess.
Be wary of strong drink. It can make you shoot at tax collectors - and miss.
Lazarus Long, "Time Enough For Love" by Robert A. Heinlein
|
I'm opinionated, but in my defense, never made the claim to know it all.
|
I would suggest skipping .NET 7 and waiting for .NET 8, which will be out in a few weeks. .NET 8 is a long-term support (LTS) version, and Microsoft will provide security patches for it for a longer period than for .NET 7.
|
VS 2022 normally checks for package updates. It is very easy to set up a local NuGet repo on either an internal server or a local directory on the dev PC. Nexus can operate as a proxy service for NuGet, but it's probably a lot faster and easier to build your own copy from the "public" sources onto a local machine.
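A folder feed is the zero-infrastructure version - no server needed at all. The path and package below are just examples:

    rem seed the feed; 'nuget add' lays out the folder structure a feed needs
    nuget add Microsoft.Extensions.Logging.7.0.0.nupkg -source C:\NuGetLocal
    rem register the feed on the air-gapped dev box
    dotnet nuget add source C:\NuGetLocal --name AirGapFeed

Point VS at that source (or ship a nuget.config next to the solution) and restores work offline.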