After quite a bit of head-scratching over how best to handle channels, I’ve pinned down the core problem: some widgets, xyPad for example, support multiple channels, which in turn means multiple ranges. Bolting ranges on separately seems clunky when a channel can never really exist without a range anyway.
So I’ve drafted a new structure that feels reasonably future-proof to me. Ranges are now part of a channel object, and each channel object resides within a ‘channels’ array. The channel id
is the channel that Csound communicates on, and it also serves as the parameter name shown in a DAW. Note that under this new scheme each channel has an event associated with it. The updated structure looks like this:
```json
[
  {
    "type": "button",
    "channels": [
      {
        "id": "startStop",
        "event": "valueChanged",
        "range": { "min": 0, "max": 1, "value": 0 }
      }
    ]
  },
  {
    "type": "rotarySlider",
    "channels": [
      {
        "id": "harmonic1",
        "event": "valueChanged",
        "range": { "min": 0.0, "max": 1.0, "value": 0.0, "skew": 1.0, "increment": 0.001 }
      }
    ]
  },
  {
    "type": "image",
    "channels": [
      {
        "id": "cursorX",
        "event": "mouseDragX",
        "range": { "min": 0, "max": 1, "value": 0 }
      },
      {
        "id": "cursorY",
        "event": "mouseDragY",
        "range": { "min": 0, "max": 1, "value": 0 }
      },
      {
        "id": "leftButton",
        "event": "mousePressLeft",
        "range": { "min": 0, "max": 1, "value": 0 }
      },
      {
        "id": "rightButton",
        "event": "mousePressRight",
        "range": { "min": 0, "max": 1, "value": 0 }
      }
    ]
  },
  {
    "type": "xyPad",
    "channels": [
      {
        "id": "cutoff",
        "event": "mouseDragX",
        "range": { "min": 100, "max": 10000, "value": 1000, "skew": 0.5, "increment": 1 }
      },
      {
        "id": "resonance",
        "event": "mouseDragY",
        "range": { "min": 0.1, "max": 20, "value": 5, "skew": 1, "increment": 0.1 }
      }
    ]
  }
]
```
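To make the shape of the proposal concrete, here is a rough TypeScript sketch of the schema above. The interface and function names (`Range`, `Channel`, `Widget`, `listParameters`) are purely illustrative, not part of any existing API; `listParameters` just shows that, because every channel id doubles as a DAW parameter name, a flat parameter list falls straight out of the structure.

```typescript
// Hypothetical shapes for the proposed schema; names are illustrative only.
interface Range {
  min: number;
  max: number;
  value: number;
  skew?: number;      // optional, as in the rotarySlider/xyPad examples
  increment?: number; // optional step size
}

interface Channel {
  id: string;     // the Csound channel name, also the DAW parameter name
  event: string;  // the frontend event this channel responds to
  range?: Range;  // optional; a default 0..1 range applies when omitted
}

interface Widget {
  type: string;
  channels: Channel[];
}

// Flatten every widget's channels into a single list of parameter names,
// e.g. for registering DAW-automatable parameters.
function listParameters(widgets: Widget[]): string[] {
  return widgets.flatMap(w => w.channels.map(c => c.id));
}
```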
Under this scheme users can also define custom mappings for buttons, mouse positions, etc. One-dimensional controllers like sliders, buttons, checkboxes, etc., use a single valueChanged
event. Multi-dimensional widgets can define channels for any of these events:
Event | Description |
---|---|
mouseDragX | User is dragging along the X axis; the channel value updates accordingly. |
mouseDragY | User is dragging along the Y axis; the channel value updates accordingly. |
mouseDragZ | (Optional) 3rd axis, if supporting 3D gestures or depth. |
mouseMove | Movement of the pointer over the widget (normalized X/Y can be separate channels or combined). |
mousePressLeft | Left mouse button pressed; channel goes 0→1. |
mousePressRight | Right mouse button pressed; channel goes 0→1. |
mouseReleaseLeft | Left mouse button released; channel goes 1→0. |
mouseReleaseRight | Right mouse button released; channel goes 1→0. |
mouseClickLeft | Optional shorthand for press+release detection. |
mouseClickRight | Optional shorthand for press+release detection. |
Indeed, any widget can define channels for these events. If you need to know when a user clicks on a slider, for example, you can add a channel with a ‘mousePressLeft’ event. These events are all defined in the frontend; the backend is agnostic to events and simply parses the channel/value pairs. Users developing their own widgets can define whatever events they like, so the system scales to other user interactions.
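A minimal sketch of how a frontend might route a UI event to whichever channel a widget has declared for it. Everything here is an assumption for illustration (`dispatchEvent`, the normalised 0..1 input, linear scaling only); the point is that the backend never sees event names, just the resulting channel/value pair.

```typescript
// Illustrative types matching the proposed schema.
interface Channel {
  id: string;
  event: string;
  range?: { min: number; max: number; value: number };
}

interface Widget {
  type: string;
  channels: Channel[];
}

// Look up the channel bound to this event (if any) and scale the
// normalised 0..1 UI value into the channel's range.
// Skew/increment handling is omitted for brevity.
function dispatchEvent(
  widget: Widget,
  eventName: string,
  normalised: number // 0..1 position/state reported by the UI layer
): { id: string; value: number } | null {
  const ch = widget.channels.find(c => c.event === eventName);
  if (!ch) return null; // widget defines no channel for this event
  const { min, max } = ch.range ?? { min: 0, max: 1, value: 0 };
  return { id: ch.id, value: min + normalised * (max - min) };
}
```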
Let me know what you think.
p.s. if range is left out, a default range between 0 and 1 will be used.
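For completeness, the default-range rule from the postscript could be as simple as the sketch below; `withDefaultRange` is a made-up helper name, and I'm assuming the default value is 0, matching the examples above.

```typescript
interface Range { min: number; max: number; value: number }
interface Channel { id: string; event: string; range?: Range }

// Fill in the default 0..1 range when a channel omits "range".
function withDefaultRange(channel: Channel): Channel & { range: Range } {
  return { ...channel, range: channel.range ?? { min: 0, max: 1, value: 0 } };
}
```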