I hope this is a reasonable place to put this request/idea?
I am loving having three external MIDI notes to send via HIT, SS and RS, but I was wondering if there is any room within the memory of the eDRUMin module to, one day, train it to recognise more gestures, such as placing the tip of one stick on the head and striking it with the other stick, or two head notes distinguished by playing nearer to or further from the sensor (kind of the reverse of hot spot suppression, I guess)?
I am using my eDRUMin module with my own custom Norns script called D2MS (Drums to Midi Stuff), and I love the amount and flexibility of MIDI note and velocity data I can send on to the script to be reinterpreted by all sorts of melodic instruments, converting all that velocity data into CC data to control filter sweeps, FX etc. The eDRUMin module has proved more than capable of giving me very easy-to-repurpose inputs, so I thought it was worth asking whether this could be considered. A lot of the people I am showing my project to seem to be on the cusp of embracing this style of "augmented drumming", given the rise of this kind of one-person song creation.
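To give a rough idea of the velocity-to-CC repurposing described above, here is a minimal sketch in Python. This is not the actual D2MS script (which runs as a Norns script); the function name, curve parameter, and values are invented for illustration.

```python
# Hypothetical sketch of remapping incoming MIDI note velocity (1-127)
# to a CC value, e.g. to drive a filter sweep. Names are illustrative.

def velocity_to_cc(velocity, curve=0.6, cc_min=0, cc_max=127):
    """Map a note velocity onto a CC range with a tunable response curve."""
    norm = max(0, min(127, velocity)) / 127.0
    shaped = norm ** curve  # curve < 1 boosts soft hits, curve > 1 tames them
    return round(cc_min + shaped * (cc_max - cc_min))

# A soft hit opens the filter only slightly; a hard hit opens it fully.
soft = velocity_to_cc(30)
hard = velocity_to_cc(127)
```

The `curve` exponent is the interesting design choice: drum velocity tends to cluster at the loud end, so a sub-linear curve spreads the musically useful range out.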
Re: Training Gestures for more MIDI Notes
You're talking about some sort of machine learning for the eDRUMin? Not very likely. I've actually done something similar for 'Bell Sense', where I took large sets of data and passed them through optimization models to fine-tune parameters, but that's all done on a computer. The device itself would not be able to do that level of processing.
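The offline workflow described above (capture data on a computer, search for parameters that best fit it, then ship the tuned values to the device) can be sketched in principle. This is NOT AudioFront's actual 'Bell Sense' pipeline; the feature, labels, and grid search below are invented to illustrate the idea.

```python
# Illustrative only: tuning one detection threshold against captured,
# hand-labelled hit data by brute-force grid search on a computer.
# The real process presumably uses far larger data sets and models.

def classify(sample, threshold):
    """Toy detector: call a hit a 'bell' when its feature exceeds a threshold."""
    return sample["feature"] > threshold

def accuracy(data, threshold):
    correct = sum(1 for s in data if classify(s, threshold) == s["is_bell"])
    return correct / len(data)

def tune_threshold(data, candidates):
    """Grid search: keep the candidate threshold with the best accuracy."""
    return max(candidates, key=lambda t: accuracy(data, t))

# Invented training set of captured hits
data = [
    {"feature": 0.2, "is_bell": False},
    {"feature": 0.3, "is_bell": False},
    {"feature": 0.7, "is_bell": True},
    {"feature": 0.9, "is_bell": True},
]
best = tune_threshold(data, [i / 10 for i in range(10)])
```

The point of the reply stands out clearly here: the search itself is cheap only on a computer with the whole data set in memory; the device just receives `best` as a fixed parameter.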