I’m building a TouchOSC template that uses OSCulator to communicate with four different systems at once (dLive being one of them). Currently trying to wrap my brain around the TCP MIDI values and translate that data into something useful. So far, cue recall is easy: program changes. Fader movements are… somewhat more complex.
Just curious if any of you guys are doing interesting stuff with TCP MIDI yet?
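For anyone else untangling the fader side: as I read the A&H MIDI docs, fader moves arrive as NRPN, i.e. a burst of three CCs (99 = parameter MSB/strip, 98 = parameter LSB, 6 = data entry). Here’s a minimal decoder sketch in Python — the 0x17 “fader level” parameter number is my assumption from the spec, so verify it against your firmware’s MIDI document:

```python
# Minimal NRPN assembler: collects the CC 99/98/6 triplet that the
# desk sends for each fader move (per my reading of the A&H MIDI
# spec -- verify the parameter numbers against your firmware).
FADER_LEVEL = 0x17  # NRPN LSB for "fader level" (assumption)

class NRPNDecoder:
    def __init__(self):
        self.msb = None   # CC 99: which channel strip
        self.lsb = None   # CC 98: which parameter

    def feed(self, cc, value):
        """Feed one (controller, value) pair; returns a decoded
        (strip, parameter, value) tuple when a triplet completes."""
        if cc == 99:
            self.msb = value
        elif cc == 98:
            self.lsb = value
        elif cc == 6 and self.msb is not None and self.lsb is not None:
            result = (self.msb, self.lsb, value)
            self.msb = self.lsb = None   # reset for the next triplet
            return result
        return None

dec = NRPNDecoder()
for cc, val in [(99, 0), (98, FADER_LEVEL), (6, 98)]:
    msg = dec.feed(cc, val)
    if msg:
        strip, param, level = msg
        print(f"strip {strip} param {param:#04x} level {level}")
```

Once you have the (strip, param, value) tuples, mapping them onto TouchOSC fader addresses in OSCulator is the easy part.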
Yes. I have created a very useful program for my band with the A&H QU-PAC.
With two pedals, the two singers can turn their microphones into talkback mics to talk to the band through their IEMs.
Basically, it removes the mics from the LR and floor-speaker mixes so they don’t go out front, and sends them only to the IEM auxes.
When the pedal is released, everything is set back to the original values.
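That save-and-restore behaviour is basically a tiny state machine: on press, remember the current send levels, kill the FOH sends, open the IEM sends; on release, put the saved values back. A rough sketch — the dict of send levels is a hypothetical stand-in for whatever actually transmits the NRPN/SysEx to the Qu:

```python
# Sketch of the pedal's save-and-restore logic. Desk I/O is abstracted
# as a dict of send levels (0-127); in reality each write would be a
# MIDI message to the console (hypothetical interface, not the Qu spec).
class TalkbackPedal:
    def __init__(self, sends):
        self.sends = sends   # e.g. {"LR": 100, "floor": 90, "iem1": 0}
        self.saved = None

    def press(self, iem_buses):
        # Remember where everything was, then mute FOH and open the IEMs.
        self.saved = dict(self.sends)
        for bus in self.sends:
            self.sends[bus] = 127 if bus in iem_buses else 0

    def release(self):
        # Put every send back exactly where it was before the press.
        if self.saved is not None:
            self.sends.update(self.saved)
            self.saved = None

sends = {"LR": 100, "floor": 90, "iem1": 0}
pedal = TalkbackPedal(sends)
pedal.press({"iem1"})
print(sends)   # LR/floor muted, iem1 wide open
pedal.release()
print(sends)   # original values restored
```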
Another thing I will add: one button per IEM aux, labeled with the aux’s name.
When pressing this button, it also PAFLs that aux, so I can hear what the musician is actually hearing in their IEM.
And maybe 32 channel faders too, so I can edit them more easily than in the Qu app.
I hadn’t thought about using it for talkback in that way.
I finally got my TouchOSC template going: scene recall + 8 faders and controls for lights/media, all on one page (with full back-mapping so you can see where you currently are in lighting/media cues). Took me a minute to parse out the MIDI messages in OSCulator for the dLive, but it seems to be working fairly well.
One odd thing I’ve run into: if I do a scene recall, my faders DON’T jump to position in TouchOSC, but they DO move any time I move a fader on the desk, in Director, or on an iPad, etc. So the console is reporting the movement. Stranger still, they started working properly on scene recall for a few minutes, then stopped again. Have to dig a bit deeper into that one.
I haven’t seen anything. The GLD spec seems to cover it though, yeah? I’ve had really good luck just sniffing input messages from the desk in OSCulator, but you could do that with any MIDI monitoring software.
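If you don’t have a monitor handy, a few lines of Python will at least label the raw status bytes for you while sniffing. A rough sketch — channel-voice decoding only; running status and SysEx payloads are deliberately left out:

```python
# Label raw MIDI status bytes by message type -- handy for eyeballing
# what a desk actually sends. Running status and SysEx bodies are
# deliberately ignored to keep the sketch short.
STATUS_NAMES = {
    0x8: "note off", 0x9: "note on", 0xB: "control change",
    0xC: "program change", 0xE: "pitch bend",
}

def classify(status_byte):
    """Return (message type, MIDI channel 1-16) for a channel-voice status byte."""
    kind = STATUS_NAMES.get(status_byte >> 4, "other")
    channel = (status_byte & 0x0F) + 1
    return kind, channel

print(classify(0xB0))  # ('control change', 1)  <- NRPN fader data rides on these
print(classify(0xC0))  # ('program change', 1)  <- scene recalls
```

On the wire you’d feed this whatever bytes your TCP MIDI bridge hands you; CCs on controllers 99/98/6 are the NRPN triplets, program changes are the scene recalls.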
Where can I find the TCP MIDI driver for Windows computers? There’s one for Mac on the A&H website in the dLive section, but no Windows version that I can find.
Theoretically, yeah, but as far as I’m aware that’s Mackie only, which means you’d still have to interpret the Mackie data into something useful with some Windows equivalent of OSCulator. (Does that exist??)
No, that’s true - you need something between TouchOSC and the MIDI NRPN to mediate communication (which is why I use OSCulator). In a Windows environment I’d be willing to bet you could use Reaper as that intermediary. TouchOSC definitely works with it, and Reaper can handle NRPN natively, IIRC.
I would like to see (like Yamaha has) a completely SysEx-based MIDI implementation, instead of a combination of PC/SysEx/CC/NRPN messages. And a message for every parameter on the console - there are many parameters that aren’t implemented. I know that every parameter is already communicated in whatever protocol the console uses to talk to iPads, Director, surfaces, etc. It should be easy enough to translate those messages to SysEx and OSC.
Anyone ever “sniffed” the communication between the MixRack and controller?