Daniel Pfeffer wrote: I don't stress it as much as some people do...
Now swing that around: Does it stress you out?
|
There are many ways to be a Linux evangelist.
|
The SSD this machine came with sits on the table or the floor, depending on the cat's current mood.
The HDD with Win7 on it that came out of the last laptop will be cloned onto a new SSD sometime soon.
If windows 10 was burning in a dumpster, I'd look for more refuse to chuck in.
|
enhzflep wrote: If windows 10 was burning in a dumpster, I'd look for more refuse to chuck in.
Nice one.
|
An SEO expert walks into a bar, bars, pub, tavern, public house, Irish pub and drinks, beer, alcohol, and booze.
Social Media - A platform that makes it easier for the crazies to find each other.
Everyone is born right handed. Only the strongest overcome it.
Fight for left-handed rights and hand equality.
|
And doesn't get drunk because he didn't have any spirits.
|
/ravi
|
When the policeman asked me where my passengers were, I said: "Because of social distancing, they are in the vehicle behind me!"
I'll get my coat and show myself out.
I'm not sure how many cookies it makes to be happy, but so far it's not 27.
JaxCoder.com
|
On a serious note, I have always found it silly that all you need is a second human being. That human being could be an infant. The law should require 2 or more drivers.
Also, Atlanta is nuts in that they enforce HOV restrictions 24 hours a day!
Social Media - A platform that makes it easier for the crazies to find each other.
Everyone is born right handed. Only the strongest overcome it.
Fight for left-handed rights and hand equality.
|
I hate going through Hotlanta!
I'm not sure how many cookies it makes to be happy, but so far it's not 27.
JaxCoder.com
|
Hmmm. Wouldn't that complicate things if one of them gets his / her licence taken away? Where would the actual driver stand then?
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
AntiTwitter: @DalekDave is now a follower!
|
Use a mannequin. Works for me.
If you can keep your head while those about you are losing theirs, perhaps you don't understand the situation.
|
You know they are talking about driving - right?
If you can't laugh at yourself - ask me and I will do it for you.
|
No amount of caffeine is getting me through my morning, and I've found a bug in my MidiSlicer app. It's not a show stopper, but it's annoying. There are two ways to fix it, and one way is a dastardly hack. The other is complicated, but the Proper Way(TM) to do things.
The problem is this: in MidiSlicer you can modify a file while it is playing. Playback merges all of the tracks together, since they all must play at once, so if you "mute" a track by removing it from playback, the number of "events" in the stream changes, and your current cursor position during playback becomes invalid.
Say, for example, we're playing a file with the following layout:
Drums: 50 events
Bass: 35 events
Guitar: 60 events
In total that's 145 events. So let's say I'm near the end of the loop during playback and I remove the bass track. I'm at position 140, but that position no longer exists, because there are now only 110 events in the stream. Make sense?
The dumb way to fix it: add a "dummy" NOP MIDI message to the API that never gets played or saved to disk, and replace any events I remove with that dummy event.
The right way to fix it: before modification, get the absolute time of the cursor position, expressed basically as system ticks or a timespan. After modifying the events, seek back within the event stream to that same *time*-based position, wrapping if needed, and derive the actual position within the event stream from that.
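A toy model makes the invalidation concrete (the track names and counts come from the example above; nothing here is the real MidiSlicer code):

```python
# Toy model: the merged playback stream is just the events of all
# unmuted tracks interleaved. Muting a track shrinks the stream,
# so an index into the old stream no longer points at the same event.
tracks = {"drums": 50, "bass": 35, "guitar": 60}

def merged_length(muted=()):
    # Count events contributed by every track that is not muted.
    return sum(n for name, n in tracks.items() if name not in muted)

cursor = 140                                    # near the end of the loop
assert merged_length() == 145                   # full stream
assert merged_length(muted=("bass",)) == 110    # bass removed
assert cursor >= merged_length(muted=("bass",)) # cursor is now out of range
```

Any saved index into the merged stream has to be re-derived after the mute, which is what the two fixes are about.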
The latter way is preferable for many reasons, but while I've written code to convert from the current position to an absolute time based position, I do not have the code to go the other way around.
It sounds like a simple computation of time - basically the time each quarter note takes in system ticks, with the score measured in quarter notes - which is simplified from what I'm actually doing, but close. However, it won't work, because the tempo can change throughout the track, meaning the duration of a quarter note can change throughout the score.
Measuring it involves starting at 0 and moving through the track, accumulating times at the current tempo and recomputing durations whenever the tempo changes. That's what I do in the first direction. Converting from time to position should be roughly the same thing, but my brain isn't working this morning.
Real programmers use butterflies
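The two conversions can be sketched over a tempo map. This is a hedged sketch, not the MidiSlicer code: it assumes each event is a `(delta_ticks, new_tempo)` pair, with tempo in MIDI's microseconds per quarter note and a fixed `timebase` in ticks per quarter note:

```python
# Each event: (delta_ticks, new_tempo_or_None). A tempo change takes
# effect after the event it rides on, so the event's own delta is
# measured at the old tempo. Names are illustrative, not a real API.

def position_to_time(events, pos, timebase, tempo=500000):
    """Absolute time in microseconds of the event at index pos."""
    t = 0
    for i, (delta, new_tempo) in enumerate(events):
        if i > pos:
            break
        t += delta * tempo // timebase   # elapsed time for this delta
        if new_tempo is not None:        # recompute durations from here on
            tempo = new_tempo
    return t

def time_to_position(events, target, timebase, tempo=500000):
    """Index of the last event at or before absolute time target."""
    t = 0
    for i, (delta, new_tempo) in enumerate(events):
        t += delta * tempo // timebase
        if t > target:                   # overshot: previous event is it
            return max(i - 1, 0)
        if new_tempo is not None:
            tempo = new_tempo
    return len(events) - 1               # past the end: clamp (or wrap)
```

Both directions are the same forward walk; the inverse just stops when the accumulated time passes the target instead of when the index does.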
|
honey the codewitch wrote: No amount of caffeine is getting me through my morning
Have you tried with bacon?
M.D.V.
If something has a solution... Why do we have to worry about?. If it has no solution... For what reason do we have to worry about?
Help me to understand what I'm saying, and I'll explain it better to you
Rating helpful answers is nice, but saying thanks can be even nicer.
|
I really should.
Real programmers use butterflies
|
Why not just find the nearest preceding non-muted object, storing its 'this' as well as the current offset tick, then recreate the vector without shuffling, find that 'this', set the container's location to 'that', and play the next object in the list at the appropriate time?
Or something like that.
|
It's more complicated than it sounds, because I'm not working on the merged tracks. I only see the individual tracks, still separate, at that point.
They get merged only at the last second.
I'd have to show you the code, but putting in dummy messages is the easiest, if not the cleanest, solution. It's far easier than recomputing all the delta times in each of the events in each of the tracks.
Real programmers use butterflies
|
If you are creating a pure MIDI stream and sending it to some process that only works in MIDI delta-times, then you will have to do something like you are stating. If not, you can just add a check to the C# object in the container when it reaches play time, and if its track is muted, simply don't play it.
|
It doesn't work that way, because the driver takes events with deltas attached to them. You queue them up with the deltas attached, and it plays each one when the time indicated by its delta comes up. It does this in the background. All the deltas are relative to each other, so if you remove an event, it shifts all the notes that follow it back by whatever its delta was. So I'd have to recompute all the deltas any time I remove an event.
Real programmers use butterflies
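In isolation, the bookkeeping for one removal is just folding the removed event's delta into its successor (a sketch with made-up names, not the actual code); the pain is repeating this across every affected event in every track:

```python
def remove_event(events, index):
    """Remove events[index] from a (delta, message) list without
    shifting the absolute times of the events that follow it."""
    removed_delta, _ = events[index]
    out = events[:index] + events[index + 1:]
    if index < len(out):
        # Fold the removed delta into the next event so its absolute
        # time (the sum of all deltas up to it) stays unchanged.
        delta, msg = out[index]
        out[index] = (delta + removed_delta, msg)
    return out
```

For example, removing `(20, "c")` from `[(0, "a"), (10, "b"), (20, "c"), (5, "d")]` yields `[(0, "a"), (10, "b"), (25, "d")]`: "d" still lands at absolute tick 35.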
|
Is it your driver or something you're using? If yours, maybe a change to absolute time (since the start of the piece), or even bar plus beat offset, would make this easier.
|
No, I don't have control over that. It's part of the MIDI spec, and the Win32 driver API doesn't do translation.
Real programmers use butterflies
|
I should add that I do actually use the absolute-time technique when I'm doing non-streaming playback, but that doesn't do background playback. It blocks the thread and just chews up CPU: no waits, no timers, nothing except querying the current system ticks and playing the event stream in a tight loop.
Real programmers use butterflies
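That tight-loop approach looks roughly like this (a Python sketch, not the actual Win32 code; `send` stands in for whatever actually emits a MIDI message):

```python
import time

def blocking_play(timed_events, send):
    """Play (absolute_time_seconds, message) pairs in a tight loop.
    No timers, no waits: just poll the clock and fire every event
    whose time has come. Burns CPU, but keeps the timing tight."""
    start = time.perf_counter()
    i = 0
    while i < len(timed_events):
        now = time.perf_counter() - start
        # Fire everything due at or before the current instant.
        while i < len(timed_events) and timed_events[i][0] <= now:
            send(timed_events[i][1])
            i += 1
```

Because the loop never sleeps, jitter is limited only by the clock-poll granularity, which is the whole point of the technique.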
|
If you can't do as Greg says, then it is a matter of going through each non-muted track individually, and finding the last non-muted event played. Recreate the new container without the muted events, find that last event played, and do the corresponding finagling. Unless I'm missing something.