|
They're getting nasty lately.
Wordle 285 5/6
⬛⬛⬛⬛⬛
⬛⬛🟨⬛⬛
⬛🟩🟨⬛🟩
🟩🟩⬛⬛🟩
🟩🟩🟩🟩🟩
|
|
|
|
|
Wordle 285 3/6
⬜⬜⬜🟨🟨
🟨⬜🟨🟩⬜
🟩🟩🟩🟩🟩
|
|
|
|
|
First, thank you for sharing this. I had no idea it was on NYT...
I finally found it. Luckily I found another site that lets me play an unlimited number of games.
Wordle Game - Play Unlimited
Anyways, after playing quite a few games, I found about 6 words (4 I use to start EVERY puzzle,
which greatly reduces the remaining letters, and 2-3 alternatives that allow me to split bfd/bvd/fzd)...
The downside is that after a LOT of games, you see the words repeating... And I type the first four words in so fast, I almost miss that I already have 5 letters. LOL.
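Not my actual word list or tooling, just a quick sketch of the idea in Python: a handful of canned opening guesses eliminates most of the alphabet, and the remaining candidates can be filtered against the coloured feedback. The words and feedback strings below are invented for illustration.

def feedback(guess, answer):
    # Wordle-style feedback: 'g' green, 'y' yellow, 'b' black/grey.
    result = ['b'] * 5
    remaining = list(answer)
    for i, (g, a) in enumerate(zip(guess, answer)):   # greens first
        if g == a:
            result[i] = 'g'
            remaining.remove(g)
    for i, g in enumerate(guess):                      # then yellows
        if result[i] == 'b' and g in remaining:
            result[i] = 'y'
            remaining.remove(g)
    return ''.join(result)

def filter_candidates(words, guess, observed):
    # Keep only the words that would have produced the observed feedback.
    return [w for w in words if feedback(guess, w) == observed]

words = ["cargo", "waste", "blood", "thyme", "fiber"]   # toy word list
print(filter_candidates(words, "crane", "gyybb"))        # -> ['cargo']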
I guessed this 5-letter word in 6/6 tries.
⬛⬛⬛🟩🟩
⬛⬛🟨⬛⬛
⬛🟨⬛⬛⬛
⬛⬛⬛⬛⬛
🟨🟩🟨⬛⬛
🟩🟩🟩🟩🟩
Can you guess this word?
https://wordlegame.org?challenge=ZmliZXI
I guessed this 5-letter word in 5/6 tries.
⬛⬛⬛🟨⬛
⬛🟨🟨⬛🟨
⬛⬛⬛⬛⬛
🟨⬛⬛⬛🟨
🟩🟩🟩🟩🟩
Can you guess this word?
https://wordlegame.org?challenge=dGh5bWU
I guessed this 5-letter word in 5/6 tries.
⬛🟨⬛⬛🟨
⬛⬛⬛⬛⬛
⬛⬛⬛🟨⬛
⬛🟩⬛🟩⬛
🟩🟩🟩🟩🟩
Can you guess this word?
https://wordlegame.org?challenge=Y2FyZ28
I guessed this 5-letter word in 5/6 tries.
⬛⬛🟨🟨⬛
⬛⬛⬛⬛🟨
⬛🟨🟨⬛⬛
⬛🟩⬛⬛⬛
🟩🟩🟩🟩🟩
Can you guess this word?
https://wordlegame.org?challenge=d2FzdGU
I guessed this 5-letter word in 6/6 tries. [Forced to use one of my alternative words]
🟨🟨⬛⬛⬛
⬛⬛⬛⬛⬛
⬛⬛⬛⬛⬛
⬛⬛⬛⬛⬛
🟩⬛⬛⬛🟩
🟩🟩🟩🟩🟩
Can you guess this word?
https://wordlegame.org?challenge=Ymxvb2Q
[This one doesn't seem obvious until you apply a bit of logic/doubling]
I like that it gives you a link to the specific game.
|
|
|
|
|
Spurred by this week's survey (Do you get up in the middle of the night to code because you can't sleep?[^]):
How much of your work hours and/or mental energy goes into finding an "algorithmic solution" to your problem?
I have been coding for a few decades, but my experience is that at least 90% of my time and energy goes to collecting background information, putting pieces together (rather mechanically), typing the code, writing tests, managing the build scripts, reading compiler listings and logs, writing documentation, presenting stuff to coworkers, ... I cannot recite all sorts of algorithms by heart, so sometimes I dig up a text (or open-source code) describing how to solve the problem. Very rarely am I stuck with a problem where I cannot fairly easily either devise a method (usually composed from a set of partial solutions) or find a workable solution in the literature or on the internet.
Those 'eureka moments' are for the most part limited to when I understand the logic in a textbook presentation of an algorithm. I can't imagine lying awake because I am unable to devise a new, great, hitherto unknown algorithm.
Of course: If you are an advanced research scientist in a field such as, e.g., numerical methods, then your job is to develop new algorithms for their own sake. Few of us are.
Maybe I am different. Do you really spend any significant fraction of your working hours or mental capacity on developing new methods/algorithms?
|
|
|
|
|
Short: About 80%
But I do it because I like to do it
|
|
|
|
|
Are you saying that 80% of your time goes to developing the algorithm? That only 20% goes to typing it in, building, debugging, documenting, and communicating with users/customers and other developers?
|
|
|
|
|
Mental: Yes
Working hours: No
|
|
|
|
|
Still, if you spend 80% of your mental effort on algorithmic decisions, it sounds to me as if you consider, e.g., choosing an if - elseif - else sequence over a switch statement to be part of 'algorithm design'. Or while(){} versus do{}while(). I see those as trivial coding details.
I'd say that if a quick glance at the alternatives tells you that they have the same complexity, in the big-O sense, then there is no significant algorithmic development from one to the other.
I am probably too ambitious. I really wish that university professors required any hand-in code taking variable-size input to be accompanied by documentation stating the O() complexity of the function. Actually, I have never seen any lecturer, professor or lower level, make such a requirement. I wish it not only for college homework, but as the norm for any published library or source code. It really should be part of our professional code of conduct to always include complexity as a basic part of the documentation.
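Purely as an illustration of what that could look like (a sketch, not a prescribed format), the documentation of a routine taking variable-size input might state its complexity like this:

# Illustration only: stating the complexity of a routine that takes
# variable-size input as part of its documentation.

def has_duplicates(items):
    """Return True if any value occurs more than once in items.

    Complexity: O(n) time and O(n) extra space, where n = len(items).
    (The naive pairwise comparison would be O(n^2) time, O(1) space.)
    """
    seen = set()
    for x in items:
        if x in seen:
            return True
        seen.add(x)
    return False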
You might develop an algorithm of the same complexity as an existing one for the same task. It may have other traits; e.g. execution complexity and data space complexity are not necessarily parallel. But trivial code changes/decisions affect neither.
|
|
|
|
|
If you do all your work on one platform and in a single language, you can build a library of plug-and-play modules, and the work can become mechanical.
But I work with Embedded; various platforms and various languages.
I'm developing a large WPF application and WPF is very powerful but can be frustrating to say the least.
So yeah I often get up in the middle of the night to get some relief (I'm old and this occurs frequently) and I end up just staying up and coding. Seems like I get the most done in the early morning hours.
And recently I've gotten into CNC and am building a large machine, so the extra learning of CAD/CAM and all that goes with fabricating the machine has been a major undertaking.
The less you need, the more you have.
Even a blind squirrel gets a nut...occasionally.
JaxCoder.com
|
|
|
|
|
Getting up at night to get things done: Fair enough.
But is the reason you are sleepless that your brain is struggling with how to solve the problem? With finding the right algorithm?
I could be coding (or writing documentation) after midnight because it is quiet, cool, nothing to disturb me. But mostly to "get the work done". Not because I am struggling with finding a possible solution, having none available.
|
|
|
|
|
trønderen wrote: How much of your work hours and/or mental energy goes into finding an "algorithmic solution" to your problem? At the moment I'm doing a rewrite of one of the components in a product. This component was originally written over 20 years ago, and has migrated from product to product and engineer to engineer. It got dropped in my lap a while ago, and the version of it in our newest product is having problems. I could probably fix it, if I really had to.
Instead, I convinced my boss to let me do a rewrite. I've got the work about 80% complete, and I've reached a point where I need to start connecting the major bits together and fill in the details. One of those connections needs to aggregate a large number of error bits and possibly trigger a state change in one or more devices. The key is to do this efficiently, only evaluating the error bits as needed, and only evaluating the device state when necessary. In other words, I'm looking for an algorithmic solution.
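Not the actual design, of course, but a minimal sketch of the lazy-evaluation idea described above: changing an error bit only marks the aggregate dirty, and the "should the device change state?" decision is evaluated only on demand, and only when something has actually changed. All names here are hypothetical.

# Hypothetical sketch: error bits set a dirty flag; the state decision is
# evaluated lazily, and the state change is triggered at most once.

class ErrorAggregator:
    def __init__(self, on_fault):
        self._bits = 0          # packed error bits
        self._dirty = False     # re-evaluate only when something changed
        self._faulted = False
        self._on_fault = on_fault

    def set_error(self, bit, active):
        mask = 1 << bit
        new_bits = (self._bits | mask) if active else (self._bits & ~mask)
        if new_bits != self._bits:
            self._bits = new_bits
            self._dirty = True  # defer evaluation until someone asks

    def faulted(self):
        if self._dirty:         # evaluate device state only when necessary
            was_faulted = self._faulted
            self._faulted = self._bits != 0
            self._dirty = False
            if self._faulted and not was_faulted:
                self._on_fault()   # trigger the state change once
        return self._faulted

# Usage sketch:
agg = ErrorAggregator(on_fault=lambda: print("device -> FAULT state"))
agg.set_error(3, True)
agg.set_error(3, True)   # no change, nothing to re-evaluate
print(agg.faulted())     # True; the state-change callback fires once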
I'm having great fun.
Software Zen: delete this;
|
|
|
|
|
Isn't every routine/function/method/class/program an "algorithm" of sorts? As for my own time, much of it is spent watching the twirly blue circle in Visual Studio and occasionally the white screen of death. Then there are the many, many, many 15m builds and the occasional language error leaving me scratching my head, attempting to decipher the cryptic, not very informative compiler error message. - Cheerio
"I once put instant coffee into the microwave and went back in time." - Steven Wright
"Shut up and calculate" - apparently N. David Mermin possibly Richard Feynman
My sympathies to the SPAM moderator
“I want to sing, I want to cry, I want to laugh. Everything together. And jump and dance. The day has arrived — yippee!” - Desmond Tutu
“When the green flag drops the bullshit stops!”
"It is cheaper to save the world than it is to ruin it."
"I must have had lessons" - Reverend Jim Ignatowski / Christopher Lloyd
|
|
|
|
|
PaltryProgrammer wrote: Isn't every routine/function/method/class/program an "algorithm" of sorts? Sure, but not an algorithm that takes a great amount of mental effort to develop. Just like every little shed is a "building", yet putting it up is not a great "building construction task" that really takes your expertise as a construction engineer to calculate right.
I see myself more as a carpenter putting boards together, laying the tiles on the roof. Some effort goes into deciding the "floor plan", which pieces to put together and in which way, but for most of what I do, that is really a minor part, both in hours and in mental effort. The carpentry is the essential thing: coding, debugging, testing, documenting. Not the architectural work.
|
|
|
|
|
As for mental effort, programmers are merely engineers. In my view there are three levels, to wit: [0] scientists discover new knowledge, [1] engineers utilize this new knowledge to solve new problems, [2] technicians utilize the tools engineers fabricate to solve the same problems again and again. - Best
|
|
|
|
|
How big is your code base? I have a fairly big code base (50k lines maybe) and it takes about 15s, not 15m, to compile on my 32-hyperthread CPU.
|
|
|
|
|
Approximately 15K lines, mostly templates, plus 1.5K lines of test code. When this project is done, pretty soon now, I intend to return to a project finished some years ago, 50K lines in C, and convert it to C++ w/ a modern UI/GUI and many improvements. No way will I tolerate the build times for that size. I will have to get a proper development machine. My current horse and buggy is https://www.hp.com/us-en/shop/pdp/hp-slimline-desktop-290-a0035z[^]
|
|
|
|
|
Yes, I would say you need a new development machine!
|
|
|
|
|
I had your experience with business development.
With IoT it's all about stuffing everything you can into as small a footprint as possible, and I mean that, because once you get it to actually run on the hardware, you still have power concerns. Unless you want the user going to the charging station once every couple of hours.
So algorithms are king. It's why I enjoy it so much, I think. After all this time it feels like *real programming* again.
To err is human. Fortune favors the monsters.
|
|
|
|
|
I worked with IoT for more than ten years. For my first programming assignment, I was given a total budget of 1200 bytes to implement Bluetooth DTM (Direct Test Mode). That job involved no development of new algorithms; the algorithm was essentially given by the DTM spec. I'd say that two thirds of the work was shaving corners, trying out alternative statement types, repacking data in different ways to save a byte here, a byte there. (If you have ever tried to implement Bluetooth DTM, you will know that 1200 bytes total is a rather tight limit!)
That is more like the way I experience IoT work: Shaving, rubbing off and polishing, and testing, testing, testing, ...
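Just to illustrate the kind of "repack the data to save a byte here and there" work meant above (the real thing would be C on the device; the fields here are made up), three small fields stored naively cost a byte each, but packed into bit fields they fit in one:

# Hypothetical example: a 2-bit mode, a 3-bit channel and a 3-bit length
# packed into a single byte instead of three separate bytes.
def pack(mode, channel, length):
    assert 0 <= mode < 4 and 0 <= channel < 8 and 0 <= length < 8
    return (mode << 6) | (channel << 3) | length

def unpack(b):
    return (b >> 6) & 0x3, (b >> 3) & 0x7, b & 0x7

packed = pack(mode=2, channel=5, length=7)
print(hex(packed), unpack(packed))   # 0xaf (2, 5, 7)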
Did I ever, in my IoT work, develop some new algorithm that I could present in pseudocode form to an audience, explaining that this is the new algorithm I have developed for solving this problem that had no earlier solution? Have I spent a significant part of my working hours and mental effort on finding out how to do it, at the conceptual level? The kind of mental activity that might keep your brain awake when you should be sleeping?
No. Almost all I've been doing has been stuff that, if written as a semi-abstract algorithm (by that I mean e.g. in pseudocode, independent of any specific implementation), would be rather trivial in terms of algorithmic development beyond the already well-known state of the art. My working hours are spent going from the conceptual solution to a working, reasonably error-free, ready-for-sale implementation.
In my studies, the Systems Engineering book claimed that if making a 'bare' freestanding 'program' to solve a problem takes 1 unit of work, doing the same solution as a 'program component' in a larger software infrastructure (where you have to relate to standards, interfaces, various conventions) typically takes 3 units of work. If you are making a commercial 'program product', with test procedures, documentation, support system, marketing and sales effort, it likewise takes 3 units of work compared to the bare freestanding program. Combine these two axes into a 'program component product', and it takes the product of the two factors, roughly 10 times the work of making the bare, freestanding program.
My experience is that this is a fairly good rule of thumb. If you manage to do it significantly cheaper, most likely you are either ignoring the rest of the world and any infrastructure requirements, making a simple freestanding program, or you are making a bare-bones delivery with no or very limited support, documentation, ...
Algorithmic development relates only to that bare freestanding program, and even there, it makes up a limited part of the work. As a developer, you certainly do not do all the ten units of work of a program component product, yet you have to relate to (at a resource cost!) technical writers, marketing people and management, who depend on information from you to do their work.
I guess I have been sleepless more from sales people and management than from problems finding an algorithmic solution to my problems!
|
|
|
|
|
My primary client had a career as an electrical engineer and now handles sales and interfacing with our end-end clients.
He's realistic, easy to work with and treats me well.
trønderen wrote: I'd say that two thirds of the work was shaving corners, trying out alternative statement types, repacking data in different ways to save a byte here, a byte there. (If you have ever tried to implement Bluetooth DTM, you will know that 1200 bytes total is a rather tight limit!)
That's part of what I mean when I'm talking about developing algorithms - this sort of refinement necessary to make them work on little devices.
And I don't handle all facets of development and delivery. I do some of the hardware, and all of the software on these projects. I do spend a lot of time testing and documenting, but when I'm not doing that, I'm writing wicked code, like getting TrueType to work realistically on an ESP32.
To err is human. Fortune favors the monsters.
|
|
|
|
|
I only do line-of-business stuff, and the OP's experience pretty much mirrors my own. Despite my work falling under the umbrella of "computer science," I do not like to think of myself as a computer scientist (or even an engineer, really); I tend to reserve that term for the people who are devising new technologies and novel algorithms. I take the excellent, innovative ideas of other people and apply them more or less intelligently to the business problem at hand. I'm comfortable with being good at that, even though I'm certain I'm not changing the world.
|
|
|
|
|
I call this surfing on the down trough of the hype cycle.
If some new tech makes it to the 3rd iteration, that is a good time to use it.
|
|
|
|
|
For me it's 80% algorithm / 20% code.
Once the algorithm is clear in my mind (and maybe on paper) the coding part is easy.
If I can explain my approach to my colleagues - and they understand it - then I know I'm on the right track. If I can't explain it adequately, or their eyes glaze over, then I need to do more thinking.
|
|
|
|
|
Yes, I spent most of a year developing two completely novel, efficient algorithms, resulting in half-a-dozen U.S. patents. I'm "just a developer" but I happened to work at a company where a much more efficient algorithm became necessary because Moore's Law was making our product way too slow on modern circuits. To develop these algorithms, I also had to characterize the performance, in big-O terms, of a dozen existing algorithms, including three algorithms we had previously developed ourselves.
In another instance, I had to reverse-engineer the inefficient sort algorithm a customer was using from the English description of a non-technical person, and then implement an efficient algorithm to sort his data in an acceptable time.
I have the same experience as trønderen, spending much of my life integrating known puzzle-pieces off the web into an effective, novel solution. But I have to be aware of algorithms to separate the efficient answers from the inefficient ones. There are recurring patterns underlying efficient algorithms that you can use every day if you are aware of them, and only thinking about and studying algorithms leads you to these patterns. You can tweak your code forever to make it faster, but the only way to achieve an order-of-magnitude performance improvement is to find a faster algorithm. It's just something I do.
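A toy illustration of that last point (nothing to do with the actual patented work, just a made-up example): micro-tweaking an O(n*m) comparison only buys constant factors, while switching to a hash-based count changes the growth rate, and that is where the order-of-magnitude win comes from.

# Counting how many pairs (x, y) with x from a and y from b are equal.
import time
from collections import Counter

def common_pairs_quadratic(a, b):
    # O(len(a) * len(b)): compare every element against every element.
    return sum(1 for x in a for y in b if x == y)

def common_pairs_linear(a, b):
    # O(len(a) + len(b)): count occurrences with hash maps instead.
    ca, cb = Counter(a), Counter(b)
    return sum(ca[k] * cb[k] for k in ca.keys() & cb.keys())

a = list(range(0, 6000, 2))
b = list(range(0, 6000, 3))
for f in (common_pairs_quadratic, common_pairs_linear):
    start = time.perf_counter()
    result = f(a, b)
    print(f.__name__, result, f"{time.perf_counter() - start:.3f}s")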
|
|
|
|
|
The amount of think time is usually proportional to how much you "own" what you're working on.
As technical lead, you might spend all your time "thinking"; assuming someone else is doing the coding.
A system / app is just a big algorithm, IMO; but if you don't get the overall one right, the little ones don't matter.
"Before entering on an understanding, I have meditated for a long time, and have foreseen what might happen. It is not genius which reveals to me suddenly, secretly, what I have to say or to do in a circumstance unexpected by other people; it is reflection, it is meditation." - Napoleon I
|
|
|
|
|