|
I did not have much time this week for MacGuyvered serial I/O on MacGuyver's processor. We had only this week to 'MacGuyver' together a working prototype of something that our customer had only received nice six-digit estimates for elsewhere. 'MacGuyvering' is a broad field, and unless you are a professor for and against everything (or your name really is MacGuyver), you have no choice but to train this skill a little. Look at Q&A if you want to see what happens if you don't.
Anyway, the serial I/O now works fine: 9600 bps at a 6 MHz clock frequency. Half duplex, of course. I don't want to have to work out the timing for a subroutine that pulls off full duplex. Having an emulator for the old Elf that precisely emulates both the Elf and the terminal and their interaction was really helpful. I was able to develop a formula to calculate the timing constants for the subroutines at different clock frequencies and bitrates, and I could conveniently test everything with the emulator. If my values worked there, then the Zwölf and the PC could also live with them. Every time. My compliments to those who wrote that emulator, did not cut corners, and went through the trouble of emulating both sides extremely accurately.
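The formula itself isn't given in the post, but the general shape of such a calculation can be sketched. The cycle counts below are hypothetical placeholders, not the actual values for the Zwölf's serial routines:

```python
# Sketch of deriving a bit-bang serial delay constant from clock and bitrate.
# cycles_per_iteration and overhead_cycles are hypothetical placeholders;
# the real values depend on the actual delay-loop and I/O instructions.

def delay_constant(clock_hz, bitrate, cycles_per_iteration=16, overhead_cycles=64):
    """Loop count that stretches one bit to 1/bitrate seconds."""
    cycles_per_bit = clock_hz / bitrate              # clock cycles per bit time
    loop_cycles = cycles_per_bit - overhead_cycles   # cycles left for the delay loop
    return round(loop_cycles / cycles_per_iteration)

# Example: 9600 bps at a 6 MHz clock
n = delay_constant(6_000_000, 9600)
```

The rounding error is what ultimately limits the highest usable bitrate at a given clock: the fewer clock cycles per bit, the larger the relative error of a one-count change in the loop constant.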
And now something for all MacGuyvers here. A programming question:
I need a way to let the processor measure its own clock frequency. With that, I could calculate the timing constants for the serial I/O at initialization and let the processor find the highest possible bitrate, no matter how slow or fast it is clocked. With an independent reference, like a timer or a real time clock, that would not be a big deal. But without?
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
modified 23-Feb-20 7:48am.
|
|
|
|
|
Just an untested idea:
If you have a spare output pin and a spare input pin, you could connect them with an RC (resistor + capacitor) circuit. When the output pin goes to 1, the voltage on the input should rise according to V * (1 - e^(-t/(R*C))). Check for when the input value changes. The number of cycles, R, C, and V should give you the answer.
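To make the arithmetic concrete: solving the RC charging equation for t gives the charge-up time to the input threshold, and the counted cycles divided by that time give the clock frequency. All component values below are hypothetical, chosen only to show the shape of the calculation:

```python
import math

def clock_from_rc(loop_count, cycles_per_loop, R, C, v_supply, v_threshold):
    """Estimate the CPU clock from a timed RC charge-up.

    loop_count: polling-loop iterations until the input pin read 1.
    cycles_per_loop: clock cycles per iteration of the polling loop.
    """
    # Vth = V * (1 - e^(-t/(R*C)))  solved for t:
    t = -R * C * math.log(1.0 - v_threshold / v_supply)
    return loop_count * cycles_per_loop / t

# Hypothetical example: 10 kOhm, 100 nF, 5 V supply, 2.5 V input threshold,
# and 433 iterations of a 16-cycle polling loop (~10 MHz clock)
f = clock_from_rc(433, 16, 10e3, 100e-9, 5.0, 2.5)
```

In practice the component tolerances dominate the accuracy, so this gives a ballpark figure rather than a calibrated measurement.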
If you want a real McGyver solution:
Take a piece of radioactive material with short half-life, place it on top of the memory, and use the processor to count the number of random bit flips induced per period of time. Knowing the half life, the rate of change in the number of bit flips will give you the length of the period of time, which (with the known number of cycles taken by the code) will give you the frequency.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
You forgot to mention to do that (RC) only if the input is a Schmitt trigger. Otherwise there is a chance of destroying the input if the slew rate is too slow.
It does not solve my Problem, but it answers my question
modified 19-Jan-21 21:04pm.
|
|
|
|
|
I'm a physicist, not an electronics engineer.
Freedom is the freedom to say that two plus two make four. If that is granted, all else follows.
-- 6079 Smith W.
|
|
|
|
|
Not sure if this excuses you
It does not solve my Problem, but it answers my question
modified 19-Jan-21 21:04pm.
|
|
|
|
|
Daniel Pfeffer wrote: If you have a spare output pin and a spare input pin, you could connect them with an RC (resistor + capacitor) circuit. When the output pin goes to 1, the voltage on the input should rise according to V * (1 - e^(-t/(R*C))). Check for when the input value changes. The number of cycles, R, C, and V should give you the answer. That is a good idea. I already have something like that as a simple power-on reset. I would add a Schmitt trigger[^] to make the transition from the analog RC signal to a digital input less bumpy.
Edit: MacGuyver just told me about this: 74HC24: quad 2-input Schmitt trigger NAND gate
I am going to need a Schmitt trigger anyway, and with NAND gates I can build a simple RC oscillator (a flip flop that clocks itself) and eliminate the need for the output bit, of which I only have a single one. With a variable capacitor I should also be able to calibrate the frequency precisely enough. Just what frequency? Something in the range of 100 Hz - 1 kHz should be comfortable and precise enough.
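As a rough sanity check on component values for that range: a Schmitt-trigger RC oscillator runs at roughly f = 1/(k*R*C). The constant k below is an assumed ballpark, not a datasheet value; in practice it depends on the gate's hysteresis thresholds, which is exactly what the variable capacitor would trim out.

```python
def rc_osc_freq(R, C, k=0.8):
    # f ~ 1 / (k * R * C); k depends on the gate's hysteresis thresholds,
    # so 0.8 is only a typical ballpark assumption for HC-family parts.
    return 1.0 / (k * R * C)

# Hypothetical example: 100 kOhm with 22 nF lands inside the 100 Hz - 1 kHz range
f = rc_osc_freq(100e3, 22e-9)
```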
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
modified 23-Feb-20 9:28am.
|
|
|
|
|
My preference would be for something more stable than an RC circuit. If you already want to throw in some hardware, why not use a CD4521[^] driven by a quartz crystal? With an 8.3886 MHz crystal you can get output frequencies down to 0.5 Hz.
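The arithmetic behind that figure (assuming the divider's deepest tap divides by 2^24, which is what makes the 8.3886 MHz crystal value convenient):

```python
# 8.3886 MHz divided by the 2^24 stage comes out at almost exactly 0.5 Hz
crystal_hz = 8.3886e6
divided = crystal_hz / 2**24
```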
Just my $0.02
|
|
|
|
|
Mircea Neacsu wrote: If you already want to throw in some hardware The next step will be to replace the simple reset logic and oscillator with a PIC microcontroller. This opens up many new possibilities, including a multiprocessor system and a modest memory expansion up to 16 megabytes. Why not include support for a compatibility mode with the old Elf and put all the legacy hardware on a second, optional board? At the flip of a switch, the PIC will reset the processor, switch to the right clock frequency, and enable the good old stuff. This way I can have both compatibility with the old computer and all the freedom I need to take the new one in any direction I want.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
|
|
|
|
Daniel Pfeffer wrote: If you want a real McGyver solution:
Take a piece of radioactive material with short half-life, place it on top of the memory, and use the processor to count the number of random bit flips induced per period of time. Knowing the half life, the rate of change in the number of bit flips will give you the length of the period of time, which (with the known number of cycles taken by the code) will give you the frequency. You forgot to say the radioactive material has to be glued to the circuit using chewing gum.
M.D.V.
If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
What is it that's dictating the frequency? There must be something pumping the CPU, right?
Find that device. On a PC it's something like a PIC controller or something like that (I forget), but it has a timing crystal or whatever in it. Accessing that would be a trick on a PC, but it depends on your setup.
If it's not using a timing crystal, how is it regulating itself?
Real programmers use butterflies
|
|
|
|
|
A simple crystal oscillator on the breadboard at the moment, but I intend to replace it with a PIC for several reasons, including controlling the processor's clock. Besides that, the PIC will also help with getting rid of any slow ROM in the processor's memory map and with synchronizing up to eight processors on the same bus without any (dead)locks or bus collisions. Letting the processor(s) talk to the PIC usually is not done, simply because that feature would occupy 11 of the PIC's I/O pins, of which you never seem to have enough.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
|
|
|
|
I see. I've got nothing, but good luck!
Edit: I think most of the time people would just drive everything off the PIC's timing rather than try to detect it, but obviously you can't do that or you would have.
Real programmers use butterflies
|
|
|
|
|
That's actually not so wrong. The old Elf was bound to a specific clock frequency; otherwise all the MacGuyvered components and software would fail, including the graphics chip. Why not build in an 'old Elf' compatibility mode that enables all these things only when the clock frequency is right?
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
|
|
|
|
I can't claim credit for thinking of it. It's how so many simple embedded CPU-based systems work. It's how the Amiga worked. It's how the Nintendo worked.
And I figure if it worked for them, why not? But then I don't know all your requirements.
Real programmers use butterflies
|
|
|
|
|
Ah, the Atari Amiga! Never had one, but I always liked it. The 68000 was a good processor and the Amiga's chipset was better than that of the Commodore ST.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
|
|
|
|
It was a Commodore Amiga, I thought?
I'm getting old
Real programmers use butterflies
|
|
|
|
|
I did that deliberately. That was the last nerd war I did not participate in.
The 8 bit Ataris were really ahead of their time because of their custom chipset. The designers of that chipset later went on to open their own company and wanted to design a new chipset for a game console. The name of that console and the company was Amiga. Later they were bought by Commodore, and guess what Commodore did with the chipset and the console. The Amiga was the direct successor of the 8 bit Ataris.
Before that, Commodore had more or less thrown out its CEO. He had to go, and he took along some of the guys who had developed the Commodore 64. He bought Atari and had his guys design a new 16 bit computer. The Atari ST really is the direct successor to the Commodore 64.
Not knowing that, the Atari nerds went to war with a Commodore, and the Commodore nerds fought with their Atari. It can't get any more absurd.
I have lived with several Zen masters - all of them were cats.
His last invention was an evil Lasagna. It didn't kill anyone, and it actually tasted pretty good.
|
|
|
|
|
Ha! I didn't know any of that. I was a 6502 nerd too, but Apple.
They had some dust-ups WRT CEOs as well.
And John Sculley can (I can't complete this sentence because of the forum rules)
He ripped my family off to the tune of $2k in 1986 money by not supporting the brand new (at the time) Apple ][gs, making it basically a brick right out of the box.
6 months after it was released you couldn't get any new software for it.
Real programmers use butterflies
|
|
|
|
|
I significantly sped up the LALR(1) table generation code for my GLR parser. I think it's faster than Gold Parser now, despite being much more powerful.
LALR(1) is difficult to understand, but I've debugged and now optimized it enough that I think I can say I understand it, almost.
I can explain how to make the tables, and I have, but I still can't explain the why of it. That's why I say almost.
Still, optimizing it taught me some things.
Real programmers use butterflies
|
|
|
|
|
I don't remember who said it, but...
Quote: 1st rule of optimization: Don't do it.
2nd rule of optimization (only for experts): Don't do it... yet.
I forgot the "only for experts" in the 2nd rule
M.D.V.
If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
modified 23-Feb-20 4:40am.
|
|
|
|
|
3rd rule of optimisation: Backup first.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
Whoever believes that never dealt with an unoptimized LALR(1) algorithm.
It was so slow I wanted to get out and push.
Real programmers use butterflies
|
|
|
|
|
honey the codewitch wrote: Whoever believes that... It was Michael Jackson: http://wiki.c2.com/?MichaelJackson[^] - and not the "Black or White" and "Thriller" Michael Jackson
M.D.V.
If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|
|
I mean, good for him for being a successful author but that doesn't make his statement correct.
As a counter-example, I just sped up Glory by at least 100% in all cases through some well-placed optimizations.
And since it's a build tool, speed matters, because it impacts your dev cycle: modify->compile->run gets nasty long during the compilation part.
It makes editing grammars arduous. Speed is important for the user experience, even when the user is another developer.
On the other hand, "Don't optimize, yet" I can totally get behind. Or as I like to say, "first make it work, then make it fast".
Real programmers use butterflies
|
|
|
|
|
honey the codewitch wrote: that doesn't make his statement correct. I know... if you look again at my first message, there were smilies because I was just joking around
honey the codewitch wrote: As a counter-example I just sped up Glory by at least 100% in all cases through some well placed optimizations. I re-invented the wheel in an old project, reducing cycle time from 84 msec to 17 msec while handling 35% to 40% more data than the previous version.
honey the codewitch wrote: Or as I like to say "first make it work, then make it fast" But don't forget that many times "best is the enemy of very good"
M.D.V.
If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
|
|
|