|
For any practical purpose: no. Keep in mind that you have a repository on your computer(s) that you push to GitHub. Even if GitHub dies or explodes or whatnot, you still have the repository on your computer(s). Assuming you have two computers, a desktop and a laptop, plus the repo on GitHub, there are already three copies of your code, and you are following the 3-2-1 backup rule (3 copies, on 2 media, with 1 offsite).
If you want an added layer of security, you can make an account with another Git provider like Bitbucket or GitLab and have two or more remotes. At some point, though, keeping all of them in sync becomes a hassle.
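For illustration, adding a second remote might look like this (the remote name "backup" and the Bitbucket URL are only placeholders, not anything you already have):
    git remote add backup https://bitbucket.org/yourname/yourproject.git
    git push backup --all
    git push backup --tags
After that, every push has to be repeated for each remote, which is exactly where the sync hassle starts.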
Mircea
|
|
|
|
|
Thank you! I do have another SVN service to host my projects...
diligent hands rule....
|
|
|
|
|
You’re welcome!
Keep in mind that Git, as opposed to SVN, keeps the whole repository on your machine. If the remote repository disappears, you still have all the code and history. It is normal to work and commit locally and push to the remote repository only from time to time.
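A minimal sketch of that workflow (the branch name "main" is just an assumption):
    git add -A
    git commit -m "describe the change"
    # ...keep committing locally, entirely offline, as often as you like...
    git push origin main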
Mircea
|
|
|
|
|
Your info gives me a better understanding of Git. Thanks again.
diligent hands rule....
|
|
|
|
|
One suggestion is:
For all your public projects, write interesting articles here on CP and attach the identical code as a zip to each article. This way you'll have two publicly hosted copies of the same code, and losing both of them at once has a low probability. (You need to take care personally that there are no version differences between these two public copies.)
For your private repositories, the only possible backups are your USB drives.
|
|
|
|
|
The probability of losing data is directly proportional to the importance of it.
Some days the dragon wins. Suck it up.
|
|
|
|
|
If you don't trust GitHub, you can also run your own Git server with Gitea[^], which is quite easy to use as it mimics the GitHub user interface.
We have been using it for years on a Windows 10 server without any major problems; it is also available for Linux and Mac.
Our reason for self-hosting is not so much that we think GitHub can fail, but company policy dictates that no code may leave the premises.
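As a rough sketch only, if you run it via the official Docker image (the ports and data path here are placeholders; Gitea also ships as a plain binary for Windows, Linux, and Mac):
    docker run -d --name gitea -p 3000:3000 -p 2222:22 -v /srv/gitea:/data gitea/gitea:latest
The web UI then comes up on port 3000, and you create repositories through it much like on GitHub.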
|
|
|
|
|
Thank you for that link to Gitea.
Charlie Gilley
“They who can give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety.” BF, 1759
Has never been more appropriate.
|
|
|
|
|
Southmountain wrote: is there any possibility that GitHub lost my projects
No. Don't delete your repo.
Jeremy Falcon
|
|
|
|
|
If I remember correctly, Git can also work with a plain SSH server. You don't get any of GitHub's functionality, but as an alternative you can keep your sources on an SSH server that you personally control.
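A minimal sketch of that setup (the host, user, and paths are placeholders):
    ssh user@myserver "git init --bare /srv/git/myproject.git"
    git remote add backup user@myserver:/srv/git/myproject.git
    git push backup --all
Any machine you can SSH into and that has Git installed can act as such a remote.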
|
|
|
|
|
No hosting platform is truly rock-solid, as people have pointed out in this thread. Personally, I keep some projects on GitHub but always keep a local copy as well, and (hopefully) that's enough for me.
|
|
|
|
|
Quote: is there any possibility that GitHub lost my projects?
Well, that's past tense, so does it appear that way?
Assuming future tense, is there "any possibility"? Of course it is possible. But a repo, which is arguably hosting, should never be the only copy of your work. Never, ever. Whether it's pushed from your local repo or uploaded however you want, you should have a local copy, and I'm a fan of that being backed up as well. Twice.
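One hedged way to do that (the bundle file name is just an example): from inside the working copy, run
    git bundle create myproject.bundle --all
and copy that single file to each backup location; it can be cloned from directly with git clone if the originals are ever lost.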
But if you're really using it to "host" public projects where you are not looking for editors, personally, I'd have it on a real website where you can do a detailed explanation, demo, whatever, that a wider audience can utilize.
|
|
|
|
|
After installing an update to VS, I mean. Right now, I just updated to 17.5.4 (released earlier this week) from the previous 17.5.3.
Without fail, every time Visual Studio gets updated (I'm on 2022, but saw the same with previous versions), Task Manager shows many, many instances of mscorsvw.exe launching and running one after another--sequentially, not in parallel--for mere seconds each (< 2 seconds on average). It's actually a bit difficult to visualize with Task Manager, as you might not even spot much going on, except that on each refresh the process (mscorsvw.exe) has a new PID and args, which means an instance has shut down and a brand new one has started.
I actually wrote a utility a while ago to log all process launches and shutdowns, with full command-line args, etc., and right now - about an hour after the VS update process itself has completed and exited - I'm looking at over 800 instances of that process that have launched/closed (and still counting), each with params such as:
C:\Windows\Microsoft.NET\Framework64\v4.0.30319\mscorsvw.exe -StartupEvent d80 -InterruptEvent 0 -NGENProcess d04 -Pipe d00 -Comment "NGen Worker Process"
I realize this is one of those things that are ultimately "normal" and entirely to be expected (don't worry your pretty little head, after all, just let MS do its thing), but I'm curious, if for no other reason than that it seems excessive. I don't recall ever seeing any article or blog entry discussing what's going on during that phase.
[Unrelated: My utility shows Win10 consistently launching/closing roughly 10x more processes on its own, 24/7, than any predecessor ever did in the same amount of time. For all its claims of progress in power saving, this seems counterproductive. Again, why should anyone care? I'm not sure. I'm just sharing the observation...]
|
|
|
|
|
NGEN is a tool that takes .NET code and compiles it to native images so they can be cached for reuse. Assuming it eventually stops, VS is precompiling all the shiny new .NET code you've received, from MSIL to native binaries, in the background.
Ngen.exe (Native Image Generator) - .NET Framework | Microsoft Learn
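If you're curious, you can inspect or drain that queue yourself from an elevated prompt with the standard ngen.exe commands (same directory as the worker process you're seeing):
    C:\Windows\Microsoft.NET\Framework64\v4.0.30319\ngen.exe queue status
    C:\Windows\Microsoft.NET\Framework64\v4.0.30319\ngen.exe executeQueuedItems
The second command runs the deferred compilations immediately instead of letting mscorsvw.exe trickle through them in the background.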
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
|
|
|
|
|
[For argument's sake]
But the VS binaries I downloaded can only run on x86 anyway. Why aren't the binaries I downloaded already precompiled as native x86, saving everyone that extra step?
|
|
|
|
|
dandy72 wrote: Why are the binaries I downloaded not already precompiled as native x86,
Some guesses...
It includes security info that is specific to the machine itself.
Although the OS is conceptually the same, it is actually different on different machines, so some compilations might produce different outcomes.
It doesn't just compile the assemblies but also installs them. That can vary by machine.
It is just easier for them to do it this way in case any of the above is true or might be true, even if only for a couple of actual deployed configurations.
|
|
|
|
|
There are ARM versions of Visual Studio. And even on the Intel x86 architecture, there are newer and faster instructions to take advantage of. Or would you prefer to be like C++, whose generated code sticks to the archaic, minimum x86 instruction set?
|
|
|
|
|
Shao Voon Wong wrote: There are ARM versions of Visual Studio.
And how many people even have access to that, outside of MS?
|
|
|
|
|
Anything vectorizable would have a bunch of versions depending on how new your CPU is. Older CPUs also need extra defensive programming around speculative-execution exploits.
In theory they could still ship precompiled binaries, but the choice is a lot more than just x86, x64, or ARM.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, weighing all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
|
|
|
|
|
Another thing to consider... even though the VS Installer is pulling the packages and running the install for you, the .NET packages are available separately and cover multiple platforms. The packages themselves are not all VS- and platform-specific...
|
|
|
|
|
The code is compiled to IL, just not JIT-compiled to machine code. NGEN does this ahead of time (AoT) to improve startup performance at the expense of disk space, assuming disk reads are much quicker than JIT compilation. This also saves time and bandwidth, since only the IL has to be transferred from Microsoft.
|
|
|
|
|
In principle, it could even start executing the code up to the first point where it depends on something outside the image, and freeze that as a modified starting image. If the initialization code that ran had no external references, it could be peeled off once it had executed.
I doubt that it is doing that, although it would be possible.
|
|
|
|
|
*lol* A decade (and more) ago, people complained about DLL hell.
Nowadays I have the strange feeling it has become much worse...
... VS: which one now?
... .NET x/y/z/Core this and that: which one now?
... NuGet: I changed here and there
|
|
|
|
|
It merely morphed from DLL hell into NuGet hell.
There are no solutions, only trade-offs. - Thomas Sowell
A day can really slip by when you're deliberately avoiding what you're supposed to do. - Calvin (Bill Watterson, Calvin & Hobbes)
|
|
|
|
|
The end of DLL hell was a sales pitch. Different names these days. Same principle.
Jeremy Falcon
|
|
|
|