Tim from the London Music Hackspace is working on a gestural music controller toolkit and has a survey up about what people want from it. Here's my comment:
I'm interested in how to go beyond basic mapping of body movements to sounds. The examples I've seen so far tend to be either very simple and give little control (arm up makes the note go up, etc.) or so hard to understand and repeat that they're more or less aleatory.
I'm interested in how to combine gestures with some kind of "virtual world", by which I mean less a 3D visualization than a space full of virtual "objects" which have their own autonomous, music-making behaviours and which the user can invoke in some way ... e.g. by "touching" them or "moving" them.
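
To make that a bit more concrete, here's a rough Python sketch of the kind of thing I have in mind. Everything in it is hypothetical: the SoundObject class, the stubbed hand tracking, and the printed note events all stand in for real gesture input (camera, Kinect, Leap, whatever) and real synthesis or MIDI/OSC output.

```python
# A minimal sketch of the "virtual objects" idea: each object sits at a point
# in a 2D space, runs its own autonomous pulse, and is "touched" when a
# tracked hand position comes within reach. Names and numbers are made up.

import math
import random

class SoundObject:
    """An autonomous music-making object living at a point in 2D space."""

    def __init__(self, name, x, y, base_note, period):
        self.name = name
        self.x, self.y = x, y
        self.base_note = base_note   # MIDI-style note number
        self.period = period         # how often it fires on its own (in ticks)
        self.energy = 0.0            # raised by touches, decays over time

    def touched_by(self, hand_x, hand_y, reach=0.15):
        """True if the hand is close enough to 'touch' this object."""
        return math.hypot(hand_x - self.x, hand_y - self.y) < reach

    def tick(self, t):
        """Autonomous behaviour: emit a note on its own schedule,
        louder while it still has energy from recent touches."""
        self.energy *= 0.95  # decay
        if t % self.period == 0:
            velocity = int(40 + 80 * min(self.energy, 1.0))
            print(f"{self.name}: note {self.base_note} vel {velocity}")

def run(objects, hand_path, ticks=32):
    """Step the world: feed it a hand position each tick, let objects behave."""
    for t in range(ticks):
        hx, hy = hand_path(t)
        for obj in objects:
            if obj.touched_by(hx, hy):
                obj.energy += 0.5  # a touch energises the object
                print(f"{obj.name}: touched at t={t}")
            obj.tick(t)

if __name__ == "__main__":
    world = [
        SoundObject("drone", 0.2, 0.8, base_note=48, period=8),
        SoundObject("bell",  0.7, 0.3, base_note=72, period=5),
    ]
    # Stand-in for gesture input: a hand wandering around the space.
    random.seed(1)
    run(world, hand_path=lambda t: (random.random(), random.random()))
```

The point of the sketch is just that the objects keep playing on their own schedule whether or not you touch them; the gesture doesn't map directly onto a note, it nudges something that already has behaviour of its own.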