|
Wangle is a client/server application framework for building asynchronous, event-driven modern C++ services. For << those -> who* don't :: mind using& all #the keys on their keyboard
|
|
|
|
|
Ancient Egyptians used to leave their stories on walls and pyramids by writing in C++.
Thankfully, the new ages changed how we communicate with each other.
|
|
|
|
|
Whoever would think that the performance benefit of C/C++ outweighs the pain of dealing with memory issues in an environment where 99% of the time is spent waiting for the database?
|
|
|
|
|
Users must upgrade to Windows 8.1 or Windows 10 to keep getting patches. Or just 8.1, for those allergic to Win10.
|
|
|
|
|
Kent Sharkey wrote: Users must upgrade to Windows 7 8.1 or Windows 7 10 to keep getting patches.
FTFY
Skipper: We'll fix it.
Alex: Fix it? How you gonna fix this?
Skipper: Grit, spit and a whole lotta duct tape.
|
|
|
|
|
Unparalleled distribution strategy is the story. Do you want to install Windows 10? How about now? Now? What about now?
|
|
|
|
|
Kent Sharkey wrote: Do you want to install Windows 10? How about now? Now? What about now?
No. Can you hear me now? No! How about here? Can you hear me now? NO!
Marc
|
|
|
|
|
Made me think of Adele's Hello - Hello from the other side! I must have bugged you a thousand times, to tell you to upgrade.
|
|
|
|
|
Quote: In itself, pre-loading the upgrade was not that dissimilar to how any automatic update, including patches for Windows or a new version of Chrome, are downloaded to a user's device. But the timing of the Windows 10 pre-fetching -- before availability -- was unusual. When software makers wrap up development and release the product, they release it: It makes no sense to withhold it from customers when it's finished, but instead push it to their devices to await an installation date and time.
This might be new in the OS world, but Steam has been doing it for years to handle releases of games whose downloads (e.g. GTA V at ~50GB) can dwarf your OS. Pre-loading, where the entire game downloads before release (with an unlock mechanism of some sort to keep you from playing ahead of the official release date), has been a feature for AAA games for years. Downloading and installing patches in the background (e.g. while you're sleeping or at work) is available for all games, so that when you are ready to play you almost never have to wait for a patch to download and install first.
Did you ever see history portrayed as an old man with a wise brow and pulseless heart, waging all things in the balance of reason?
Is not rather the genius of history like an eternal, imploring maiden, full of fire, with a burning heart and flaming soul, humanly warm and humanly beautiful?
--Zachris Topelius
Training a telescope on one’s own belly button will only reveal lint. You like that? You go right on staring at it. I prefer looking at galaxies.
-- Sarah Hoyt
|
|
|
|
|
Discovered by mathematicians at the Great Internet Mersenne Prime Search (GIMPS), the bug occurs when using the GIMPS Prime95 application to find Mersenne primes. So that's what happened to the Pentium floating point engineers
|
|
|
|
|
I knew there was a reason I chose to build my PC around an AMD processor!
The difficult we do right away...
...the impossible takes slightly longer.
|
|
|
|
|
Geeze, I just built a new Skylake box too. Anyway, I love it, and I'm not likely to ever do calculations like that.
Core i7-6700 with 32GB of RAM and a 1TB SSD... lol...
|
|
|
|
|
I recently wrote a pair of at least slightly controversial articles about 64-bit vs. 32-bit applications and the performance costs associated with going to 64 bit. Another 4600 bytes on bits (more or less)
|
|
|
|
|
Beating a dead horse. Doesn't a 64bit process run faster than a 32bit process, and hence it's all BS anyway?
Decrease the belief in God, and you increase the numbers of those who wish to play at being God by being “society’s supervisors,” who deny the existence of divine standards, but are very serious about imposing their own standards on society.-Neal A. Maxwell
You must accept 1 of 2 basic premises: Either we are alone in the universe or we are not alone. Either way, the implications are staggering!-Wernher von Braun
|
|
|
|
|
I think the usual answer to that is, "it depends".
TTFN - Kent
|
|
|
|
|
Well, sure. But a 64bit processor is presumably newer, faster, better tech than the older 32bit processor, so one would think it'd be at least a wash. I do agree with rincon (WTFWHN?) that data structures containing pointers will be bigger and thus take more memory. Moving more memory takes more time, and code that deals with 64bit pointers will necessarily be bigger, thus taking more time to move into the processor --- all other things being equal. But that's just it, they're not equal: the newer 64bit processor is faster and better, so it should be at least a wash if not faster anyway.
But I want Moooooore, Mooooooore, Mooooooooooooore!!!!!!!!!!!!!
|
|
|
|
|
You're forgetting that the newer processor will also run 32bit code faster than the old one; and without the cache miss penalties from bigger pointers.
|
|
|
|
|
While I didn't mention it, I didn't miss it. Why? Because most code, when moving from 32bit to 64bit, is also moving to a new processor.
|
|
|
|
|
Moving to the new processor is a speedup; and unless your old hardware is really ancient or really low power, it's already 64bit. Porting to 64bit is normally at best a wash; in most cases it's a marginal slowdown. If you want something to be faster, you buy faster hardware; you don't port the code to 64bit.
|
|
|
|
|
Dan Neely wrote: Moving to the new processor is a speedup
Exactly.
Dan Neely wrote: you don't port the code to 64bit.
Maybe, maybe not. It depends. Much code doesn't need to be "ported", just recompiled, and as such it may or may not be faster. But speed isn't usually the reason to "port" the code; the reason is access to more memory than 32bit code will allow. So fighting, arguing, and debating the speed implications is ridiculous and pointless.
|
|
|
|
|
Excited rumors began circulating on Twitter this morning that a major experiment designed to hunt for gravitational waves—ripples in the fabric of spacetime first predicted by Albert Einstein—has observed them directly for the very first time. Fire up the macroscope!
|
|
|
|
|
That'd be nice.
"But what...is it good for?"
- Engineer at the Advanced Computing Systems Division of IBM, 1968, commenting on the microchip
|
|
|
|
|
Like I commented the last time around, the design sensitivity of the Advanced LIGO hardware is so much better than the previous generation's that detection odds have gone from "maybe if we get lucky" (a neutron star closer than any known objects, or beating the odds and having a merger transient happen in a much shorter than statistically expected window) to "if we don't get a detection, something is wrong" (with the hardware, the data analysis, or the physics itself).
|
|
|
|
|
Open source has won -- as a cauldron for innovation as well as a frictionless means of software distribution. So why are we still messing with obscure licensing minutiae? I blame the lawyers
|
|
|
|
|
Princeton University - Computer scientists launch campaign to guarantee bug-free software[^]
I realize this is old news (Kent posted it last week [^]), but I don't feel that it has been sufficiently ridiculed yet.
Then again, perhaps I am just unduly skeptical about a group of 8 students and 2 professors that "aims to eliminate out [sic] bugs in complex software" with $10M over 5yrs. It certainly is a venerable goal; and who am I to suggest that they might possibly be a tad over-ambitious? After all, they have already recognized that their "initial challenge will be to dissect the overwhelming complexity of modern hardware and software to uncover the factors that determine how various computer components work together," which sounds like a prudent starting point, even if it does end up taking a good couple of weeks out of their schedule. And, of course, I shouldn't overlook the fact that they wisely have planned to develop their so-called "deep specifications" using such proven strategies as "deductive reasoning, syllogisms and mathematics". It's only a shame that we have had to wait so long for these eminent tools to be applied to the plebeian field of Computer Science!
So... never mind my suggestion. I humbly retract my call for ridicule. Clearly what we need to do is herald this project as the "coming of age for the industry". And not just for this industry! Indeed -- as the team has presciently observed -- this breakthrough could impact "not only computer science disciplines, but many other disciplines as well." Think of it! With just a few extra years and some more far-sighted grants from the NSF, we might be on the cusp of seeing the eradication of errors from all modern scientific endeavors!
Wow! I don't know about you, but I think we could use a few less mistakes in science as we know it. And all that for only $10M? What a bargain!
|
|
|
|