|
haha!
The problems crop up when trying to divine hierarchies/trees of data from flat text.
Real programmers use butterflies
|
Chris Maunder wrote: I have a problem and am randomly looking around for a solution Let me introduce you to your next secret weapon: Quick Answers[^]
M.D.V.
If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
I used CUDA, after looking at OpenCL. Opinion: OpenCL was what AMD got IBM, HP et al. to impose on NVIDIA, so that "the same code" could run on AMD's (ATI's) video chips too. Having written asm for the latter, it's ridiculous; you need to use different algorithms when the underlying chipset is that much less powerful. CUDA was really straightforward: high-level, but targeting a GPU built for GPGPU.
That being said, I have not used it in 10 years.
|
Yes, I have. We are rewriting a significant piece of an application to utilize it. This is just for HPC stuff. We haven't gone into machine learning yet but we have some targets in mind.
I have also messed around with fractal generation and other graphical things using CUDA, and it is lightning fast at that. On the cards I have been using, double-precision performance is considerably slower than single (by more than a factor of two), but it is still much faster than using a CPU. I can see the difference in detail in my graphics work when using single precision vs. double.
I went to NVIDIA's GTC (GPU Technology Conference) last year and was going to go this year as well, until it was cancelled. I will be certain to catch the online sessions when they happen next week.
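What makes fractals fly on a GPU is that every pixel is an independent escape-time loop with no shared state. A rough CPU-side sketch of that per-pixel work (plain Python, names mine, just to show the shape of what each CUDA thread would compute):

```python
def mandelbrot_iters(cr: float, ci: float, max_iter: int = 256) -> int:
    """Escape-time count for one pixel of the Mandelbrot set.

    On a GPU this whole function becomes the body of one thread;
    a 1920x1080 render launches ~2M independent instances of it.
    """
    zr = zi = 0.0
    for n in range(max_iter):
        # z = z^2 + c, in real/imaginary parts
        zr, zi = zr * zr - zi * zi + cr, 2.0 * zr * zi + ci
        if zr * zr + zi * zi > 4.0:  # |z| > 2 => the orbit escapes
            return n
    return max_iter  # assumed inside the set
```

This is also where single vs. double precision bites: in a deep zoom, once the pixel spacing falls below what float32 can resolve, neighbouring pixels receive identical c values and the detail washes out.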
"They have a consciousness, they have a life, they have a soul! Damn you! Let the rabbits wear glasses! Save our brothers! Can I get an amen?"
|
Nice
cheers
Chris Maunder
|
I used CUDA in my Master's for simulating spiking neural networks.
I ended up using Cudafy (a .NET library) as I wanted C# familiarity.
It certainly allowed me to run my simulations far quicker, but don't underestimate the amount of effort required to tune (and get right - debugging 1000s of threads isn't fun) non-trivial algorithms.
|
I used CUDA in my doctoral work in physics. Solving a non-linear partial differential equation via finite differences, I achieved a speedup of 32x on the NVIDIA GPU in my laptop (about 96 cores). It requires a different mode of thinking than we are used to, but it's worth it.
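The nonlinear PDE itself isn't shown here, but the reason finite differences parallelize so well is visible even in the simplest linear diffusion case: each grid point's update reads only its old neighbours. A minimal sketch (Python standing in for the per-thread kernel body; names are mine):

```python
def diffuse_step(u: list[float], r: float) -> list[float]:
    """One explicit finite-difference step of u_t = u_xx.

    r = dt / dx^2; stability of this explicit scheme needs r <= 0.5.
    Each interior point depends only on its old neighbours, so on a
    GPU every point can be updated by its own thread in parallel.
    """
    new = u[:]  # fixed (Dirichlet) boundaries: endpoints unchanged
    for i in range(1, len(u) - 1):
        new[i] = u[i] + r * (u[i - 1] - 2.0 * u[i] + u[i + 1])
    return new
```

On a GPU, the loop over `i` disappears: each interior point becomes one thread, and the whole grid advances in a single kernel launch per time step.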
|
Used it for a basic convolution-like problem with a large overlap. The code I wrote is rather basic; the code around it needed some attention to get working, but it delivered in spite of my not having studied it that much.
But I'll wait for another real life application before delving into it again.
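For the curious, the GPU-friendliness of a convolution-like problem comes from each output element being an independent dot product over an overlapping window. A hypothetical minimal sketch (unflipped kernel, i.e. cross-correlation, as is common in GPU image code; names are mine):

```python
def convolve_valid(signal: list[float], kernel: list[float]) -> list[float]:
    """'Valid' 1-D convolution: output only where the kernel fully
    overlaps the signal.

    Every output element is an independent dot product, which is why
    this kind of problem maps so cleanly onto one GPU thread per
    output element, with neighbouring threads re-reading overlapping
    input (typically staged through shared memory on the device).
    """
    n = len(signal) - len(kernel) + 1
    return [
        sum(signal[i + j] * k for j, k in enumerate(kernel))
        for i in range(n)
    ]
```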
|
I have used CUDA/C/C++ for simple pattern matching on fairly large data sets. Keep in mind that there are some performance limitations when using CUDA, due to the time required to marshal data to and from GPU memory and when the algorithm requires multiple synchronizations, but the performance is still impressive.
That said, CUDA is not the solution for all problems; a clever algorithm implemented on the CPU alone can match or even outperform GPU code in some scenarios. It is fun to play with good old C and the different memory types of the GPU. Debugging is more challenging, and separate compilation, which requires two compilers (NVCC and C/C++), sometimes creates unexpected issues. Finding help on the web is more difficult than with more established technologies. I am using Visual Studio to do all of that on Windows.
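As a sketch of why pattern matching maps well to CUDA despite the marshaling cost: each candidate start offset is independent, so a GPU version can assign one thread per offset, and the transfer overhead is amortized by shipping the data set to device memory once and running many queries against it. A CPU reference in plain Python (my own naming, not the poster's code):

```python
def count_matches(text: bytes, pattern: bytes) -> int:
    """Count occurrences of pattern in text (naive, overlaps allowed).

    A GPU formulation gives each candidate start offset its own
    thread; every thread does its inner comparison independently
    and the per-offset results are reduced to a single count.
    """
    m = len(pattern)
    if m == 0 or m > len(text):
        return 0
    return sum(
        1 for i in range(len(text) - m + 1)
        if text[i:i + m] == pattern
    )
```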
|
I used CUDA for parallelization of a Java program. The task was about accelerating a fairly simple algorithm for projecting bitmaps. It was fun, and I made an infographic on my method. Found GPGPU quite exciting but lost it from sight, anyway... Regards, Jürgen.
|
I tried out CUDA as a way of speeding up a raytracing graphics engine I was working on. My goal was to do raytracing without expensive (RTX) hardware. If I remember correctly, my program ran about as fast using most of the CUDA cores on a GTX 1050 Ti as it did using all the CPU cores on a Ryzen 7 1700X. I probably could have got it to run faster by optimizing it more for CUDA (I think I was using doubles), but the main problem for that project was my core algorithm being slow on anything.
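The per-ray work in such a tracer is dominated by intersection tests, and rays are independent, which is exactly the CUDA sweet spot. A hedged sketch of the classic ray-sphere test (plain Python, my own naming; a float32 CUDA version of this same math is where the double-vs-float throughput gap shows up):

```python
import math

def ray_sphere_t(ox, oy, oz, dx, dy, dz, cx, cy, cz, radius):
    """Nearest positive hit distance t for the ray o + t*d against a
    sphere, or None on a miss. Direction d is assumed normalized.

    A ray tracer runs this once per ray per object; since rays are
    independent, CUDA gives one thread per pixel. Using float32
    instead of float64 roughly doubles throughput on consumer GPUs,
    at the cost of precision artifacts ("shadow acne") that this
    epsilon is guarding against.
    """
    # Vector from ray origin to sphere centre
    lx, ly, lz = cx - ox, cy - oy, cz - oz
    tca = lx * dx + ly * dy + lz * dz            # projection onto the ray
    d2 = lx * lx + ly * ly + lz * lz - tca * tca # squared miss distance
    if d2 > radius * radius:
        return None                              # ray passes beside sphere
    thc = math.sqrt(radius * radius - d2)
    t = tca - thc                                # near intersection
    if t < 1e-4:                                 # epsilon against self-hits
        t = tca + thc                            # far side (origin inside?)
    return t if t >= 1e-4 else None
```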
|
The endless, endless texts, Slacks, Skypes, tweets, emails, FaceTimes, calls and messages.
cheers
Chris Maunder
|
Unlike meetings, where your absence or lack of attention is obvious, most of these can be ignored or handled at your convenience, or you can do other things while pretending to be engaged.
One of the managers where I used to work told me he liked it when he had two meetings scheduled simultaneously. If he went to neither, people would assume that he was in the other one.
|
Greg Utas wrote: two meetings scheduled simultaneously. If he went to neither, people would assume that he was in the other one.
Ingenious!!
Those would be the two best meetings ever!
The best meeting is a cancelled meeting!
|
Greg Utas wrote: If he went to neither, people would assume that he was in the other one.
I love it!
Real programmers use butterflies
|
Oh, off topic, but I wrote an article a bit ago and was thinking of you. It demonstrates scannerless recursive descent parsing like you're doing versus doing it with a scanner.
I don't normally like to plug my own stuff but I thought you might be interested in it. On the off chance you are, here you go: How to Build a Recursive Descent Parser[^]
Real programmers use butterflies
|
Chris Maunder wrote: The endless, endless texts, Slacks, Skypes, tweets, emails, FaceTimes, calls and messages.
Don't forget Lounge posts!
|
raddevus wrote: Don't forget Lounge posts! It's only a pandemic; you can't expect people to give up essentials because of it.
I wanna be a eunuchs developer! Pass me a bread knife!
|
Yeah mate. We're not savages.
cheers
Chris Maunder
|
BTW you're out of toilet paper!
I'm hiding from exercise...I'm in the fitness protection program.
JaxCoder.com
|
Chris Maunder wrote: The endless, endless texts, Slacks, Skypes, tweets, emails, FaceTimes, calls and messages.
The worst is when you realize that you are someone else's time-waster/productivity killer/jester.
I suppose all of us have been intently focused on some bit of code...stepping through in the debugger, just about have it figured out...phone rings...dammit! It happens to me all the time.
In an age of constant distractions, getting back into 'the zone' every 10 minutes can really get annoying, not to mention the negative impact on code quality. (speaking for myself)
"Go forth into the source" - Neal Morse
|
The solution, from personal practice and experience:
1 - I don't text; I won't text
2 - I don't read texts (anyone who knows me knows this)
3 - No social networks (except for this time-waster[^])
4 - email - but only from a PC. I don't always have them on and so I get a break from that, too.
5 - you get the picture
Moreover, with a few domains that all include free email forwards (unlimited number of them, too) I keep my mail sorted in such a manner that I get remarkably little spam.
Just one more bit of advice: all the eating of garlic and drinking of beer, to induce any amount of near-toxic flatulence, will neither stop the hell you have welcomed into your life nor slow it down. Smart phones are, at least, smart enough not to have noses . . . the best you can hope for is to soil the screen.
Ravings en masse^
"The difference between genius and stupidity is that genius has its limits." - Albert Einstein
"If you are searching for perfection in others, then you seek disappointment. If you seek perfection in yourself, then you will find failure." - Balboos HaGadol Mar 2010
modified 19-Mar-20 16:08pm.
|
Unfortunately as someone trying to run a business there are some things I can run from but never hide.
Gotta be there, gotta be ready to help out everyone on the team.
It's just interesting how a quick hallway conversation becomes a major event when it's turned into a video conference meeting.
cheers
Chris Maunder
|
Chris Maunder wrote: there are some things I can run from but never hide. Intelligence is pursuing you but you are faster?
M.D.V.
If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
For me, this was Day 3 of working from home for the foreseeable future. At one point I was:
Updating help text on my machine for a work application.
Remote Desktop'ed to my machine at work where I was gathering source code to work offline.
Remote Desktop'ed to a lab machine at work that was being recalcitrant about installing a test application.
Handling work e-mails with the work cell because my machine and the work machine session were busy.
On my personal phone wrangling doctor appts and prescription refills for my wife and me because we've reached the age where keeping our body chemistry regulated is complicated.
When not on the phone, talking to my wife about why it's okay for her to go for a drive (she's getting a little stir crazy), and no she shouldn't stop in any stores.
Occasionally turning around and talking to my dogs because I needed to see a friendly face. They're especially friendly right now because it's dinner time.
Communications are slightly more frenetic than usual. At one point our secure FTP provider at work went belly-up and I had to resort to emailing stuff to myself at home, 5MB at a time.
The one advantage is that the commute is easy (up one flight of stairs) and the dress code is... relaxed.
Software Zen: delete this;
|