I also love that this could blur the lines of music playing and dance.
Great job OP, thanks for sharing.
Reminds me... I even used two PlayStation Eyes (EUR 5 each) with OpenCV and the EVM algorithm[2] on a ThinkPad X230 for a dance performance piece back in 2015. Movements rather than gestures and OSC instead of MIDI, but it worked great!
[0]: https://midipaw.com/
Using the gloves during an NPR Tiny Desk concert: https://www.youtube.com/watch?v=3QtklTXbKUQ&t=555s
Minor UX feedback
- I find the internal scroll of the preset setup limiting. Because the whole top half is fixed, it's hard to see all my settings at once. Could the whole panel scroll instead?
- In dark mode, the "Save" button is black on dark grey and looks disabled.
- The delete button is scarily prominent and has no confirmation. Could it be moved inside the preset panel so I don't accidentally hit it while performing?
Reminds me of the Moog Theremini - that was a fun bit of kit.
(Prompt: "Make an in browser midi theremin using Mediapipe")
There's a Web MIDI API, so you might be able to reproduce all the functionality in a way that works across devices and operating systems.
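For anyone curious, here's a minimal sketch of what that could look like with the Web MIDI API: build a Control Change message and send it to an output port. The `ccMessage` helper and the choice of CC 74 are just illustrative assumptions on my part, not anything from the project.

```typescript
// Build a 3-byte MIDI Control Change message: status byte, controller, value.
// Channel is 0-15; controller and value are clamped to 7 bits per the MIDI spec.
function ccMessage(channel: number, cc: number, value: number): number[] {
  return [0xb0 | (channel & 0x0f), cc & 0x7f, value & 0x7f];
}

// In a browser, the Web MIDI API hands out output ports to send to.
// Guarded so the pure helper above still runs outside a browser.
declare const navigator: any;
if (typeof navigator !== "undefined" && navigator.requestMIDIAccess) {
  navigator.requestMIDIAccess().then((access: any) => {
    for (const output of access.outputs.values()) {
      // e.g. map a normalized hand X position (0..1) onto CC 74, channel 0
      const handX = 0.5;
      output.send(ccMessage(0, 74, Math.round(handX * 127)));
    }
  });
}
```

From there it's the same idea as the gloves: one tracked coordinate per controller number, and the DAW sees it as any other CC source.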
I seem to only be able to map each axis to a single finger?
It seems like if it's detecting two fingers at a time, I should be able to use the X and Y of each of them simultaneously?
Do you have any suggestions on how to hook this up to Logic, for anyone who hasn't used MIDI before?
I wonder how this would perform under live stage lighting conditions, i.e. strong coloured lights and high contrast.
Then, if you want the track to receive a specific MIDI channel from a specific device (for example, AirBending channel 2), select it in the dropdown in the MIDI inspector section of that same MIDI track.
This is not a theremin, it’s a programmable controller which you can tie to various midi inputs. Think “slider” or “dial”.
Also, with an iPhone I'd have to figure out how to transmit the MIDI data to the DAW on a laptop; most likely via USB or network.