The desk's control surface, and the control API used by MixPad etc., can change everything a user can control on the desk, but the MIDI implementation can only access a small subset of this. Given that the desk must already contain event-handler code for everything that comes in via the MixPad API, is there any chance of allowing MIDI to access the same things?
I’m sure this would be useful in many areas. My own use case is radio broadcasting. There are loads of features we are used to in radio studio and outside broadcast desks which I don’t expect the SQ to provide, but which are tantalisingly close to being possible with an external application sending MIDI to the desk. For example, I have written a touchscreen Raspberry Pi application which I use to control the outputs to my loudspeaker monitors: level, balance, mono on L&R, mono on L, cut L, cut R, polarity reverse, and speaker selection. What could be a simple and elegant solution is fouled up a bit by the fact that I can’t get at the channel polarity switch via MIDI. Instead, I have to waste an extra channel on the desk with its polarity button pressed, send identical signal and controls to it and to my main ‘monitor L’ channel, and switch between the two when the polarity button is pressed in the app (sketched below).
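To make that workaround concrete, here is a minimal sketch of what my app does, using Python and the mido library. The port name, MIDI channel, and the NRPN addresses of the two mute switches are placeholders I've made up for illustration; the real values come from the SQ MIDI Protocol document and your own desk configuration, which also defines the exact message format (including whether a fine data byte is needed).

```python
# Minimal sketch of the polarity workaround: two desk channels carry the
# same signal, one with its polarity button pre-pressed, and the app swaps
# their mutes to emulate a polarity switch. NRPN addresses are hypothetical.
import mido

PORT_NAME = "SQ"      # substitute the actual MIDI port name
MIDI_CHANNEL = 0      # desk's MIDI channel (zero-based in mido)

# Placeholder (MSB, LSB) NRPN addresses for the mute switches of the two
# desk channels: 'monitor L' and its duplicate with polarity flipped.
MUTE_MON_L_NORMAL = (0x00, 0x10)   # hypothetical
MUTE_MON_L_FLIPPED = (0x00, 0x11)  # hypothetical

def send_nrpn(port, msb, lsb, value):
    """Send a standard NRPN sequence: CC99/CC98 select, CC6 data."""
    port.send(mido.Message('control_change', channel=MIDI_CHANNEL, control=99, value=msb))
    port.send(mido.Message('control_change', channel=MIDI_CHANNEL, control=98, value=lsb))
    port.send(mido.Message('control_change', channel=MIDI_CHANNEL, control=6, value=value))

def set_polarity(port, flipped):
    """Emulate a polarity switch by swapping mutes between the two channels."""
    on, off = ((MUTE_MON_L_FLIPPED, MUTE_MON_L_NORMAL) if flipped
               else (MUTE_MON_L_NORMAL, MUTE_MON_L_FLIPPED))
    send_nrpn(port, *off, 1)   # mute the channel we are leaving
    send_nrpn(port, *on, 0)    # unmute the one we are switching to

with mido.open_output(PORT_NAME) as port:
    set_polarity(port, flipped=True)
```

The point is not the code itself, which is trivial, but that two desk channels and extra routing are consumed just to fake one switch the desk already has.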
For all similar problems I’ve hit so far there have been workarounds, but the workarounds always involve some bodgery which consumes more resources on the desk and makes the plumbing a bit more tortuous. A more complete MIDI implementation would make the desk even more attractive to those of us with specialist requirements that we couldn’t reasonably expect A&H to include in a general-purpose desk.
I would like my ‘radio studio’ app, when complete, to handle loudspeaker control; comprehensive talkback control, so that I can speak, for example, to individual mixes or to configurable groups of mixes; and muting of the speaker in the room where the mics are whenever a mic channel is unmuted or its fader is opened (a sketch of that last rule is below). The talkback solution I currently have ‘wastes’ a channel on the desk, but if I had MIDI access to the talkback routing screen facilities, the solution would be more efficient and more understandable.
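Purely to illustrate that speaker-muting rule, here is a sketch of the interlock logic. The channel numbers, the fader threshold, and the shape of the update callback are all assumptions, and set_studio_speaker_cut() stands in for whatever message would actually cut the speaker.

```python
# Sketch of the mic-live interlock: cut the studio loudspeaker whenever any
# mic channel is unmuted or its fader is open. Channel numbers, the fader
# threshold, and the callback shape are illustrative assumptions only.
MIC_CHANNELS = {1, 2, 3}      # desk channels carrying studio mics
FADER_OPEN_THRESHOLD = 5      # raw MIDI level treated as 'fader open'

mutes = {ch: True for ch in MIC_CHANNELS}   # last known mute states
faders = {ch: 0 for ch in MIC_CHANNELS}     # last known fader levels

def set_studio_speaker_cut(cut):
    # In the real app this would send the speaker-mute message to the desk
    # (as in the earlier sketch); here it just reports the state.
    print("studio speaker:", "CUT" if cut else "LIVE")

def any_mic_live():
    return any(not mutes[ch] or faders[ch] > FADER_OPEN_THRESHOLD
               for ch in MIC_CHANNELS)

def on_desk_update(ch, mute=None, fader=None):
    """Call this whenever the desk reports a mute or fader change via MIDI."""
    if ch not in MIC_CHANNELS:
        return
    if mute is not None:
        mutes[ch] = mute
    if fader is not None:
        faders[ch] = fader
    set_studio_speaker_cut(any_mic_live())

# Example: opening the fader on mic channel 1 cuts the speaker.
on_desk_update(1, mute=False, fader=90)
```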
An alternative approach, if you didn’t think it was giving away the family jewels, would be to release the documentation for the existing API which MixPad uses, so we could use that instead, since it already does all we need. I realise this might cause hiccups when firmware updates modify the API. However, the hard work of coping with those falls upon those of us developing external apps - and we can hold off on desk firmware updates until we’ve fixed our own code.