Interactive Dance Floor with Motomi
Artist and producer Sherin Vergese approached Sound.Codes after being intrigued by one of our interactive installations based on Motomi. He wanted to bring some form of interactivity to the live shows of his outfit, the Liquid Bass Project.
We accepted the challenge and set out to let some 500-odd people interact with and contribute to the music while the band performed live on stage. As the event was held in a nightclub, recognising people in low light under constantly changing ambient tonality was the key issue.
We began brainstorming how to give every individual an identity in terms of computer vision and recognise them accurately. Another concern was scale: the installation had to remain robust with larger crowds, producing the same results even with 1,000+ people.
We crossed this hurdle by using glow-in-the-dark wristbands. The bands worked really well and allowed us to cut through the club's strobes and the constantly changing shades of the house lighting.
We started to play with an idea: what if the computer could recognise an individual pixel in terms of its RGB data and match that data within a selected pixel range? The second challenge was analysing the change in a pixel's position and the rate at which it changed, if at all.
We started modifying Motomi by adding colour-recognition logic to the algorithm. Using this logic, Motomi isolated a given range of RGB data and matched it against every pixel in the frame. This required working with an error range, since lighting in a club setting cannot be uniform: the colour information carries gradient changes that depend on the positions of the object and the light source.
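The matching step can be sketched as follows. This is not Motomi's actual code, just a minimal NumPy illustration of comparing every pixel against a target colour with a per-channel tolerance; the function name, target colour, and tolerance value are assumptions for the example.

```python
import numpy as np

def match_colour(frame, target_rgb, tolerance):
    """Return a boolean mask of pixels within `tolerance` of `target_rgb`.

    `frame` is an (H, W, 3) uint8 array. The tolerance absorbs the
    gradient caused by non-uniform club lighting.
    """
    diff = np.abs(frame.astype(np.int16) - np.asarray(target_rgb, dtype=np.int16))
    return np.all(diff <= tolerance, axis=-1)

# Tiny synthetic frame: one "neon band" pixel among dark ones.
frame = np.zeros((2, 2, 3), dtype=np.uint8)
frame[0, 1] = (60, 250, 70)  # close to the wristband colour
mask = match_colour(frame, target_rgb=(50, 255, 60), tolerance=20)
```

Only the pixel near the target colour survives the mask; the dark background is rejected because it falls outside the tolerance on every channel.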
Here, both the light sources and the objects are highly dynamic and change constantly. Working with an error range, in effect a gradient around the given colour, gave us the leverage to analyse and track movements with few errors.
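Once a band is isolated in each frame, its movement and rate of change follow from frame-to-frame comparison. A minimal sketch, assuming we track the centroid of the matched pixels (the helper names and the one-second frame gap are illustrative, not Motomi's internals):

```python
import numpy as np

def centroid(mask):
    """Centroid (row, col) of True pixels in a boolean mask, or None if empty."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return ys.mean(), xs.mean()

def displacement_rate(mask_prev, mask_curr, dt):
    """Pixels-per-second movement of the tracked blob between two frames."""
    a, b = centroid(mask_prev), centroid(mask_curr)
    if a is None or b is None:
        return 0.0
    return float(np.hypot(b[0] - a[0], b[1] - a[1])) / dt

# A blob that moves three pixels to the right between frames.
prev = np.zeros((1, 5), dtype=bool); prev[0, 0] = True
curr = np.zeros((1, 5), dtype=bool); curr[0, 3] = True
rate = displacement_rate(prev, curr, dt=1.0)
```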
To test the algorithm we captured the audience at the venue on various nights, from a full house to a partially full one. The results gave us confidence in the neon bands.
To manage the uncertainty introduced by the number of people, we averaged the detections and scaled the output to make it stable. With an error margin of 5% on the 'y' axis, we averaged all nearest neighbours.
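As a sketch of that smoothing step, assuming the 5% margin is measured against the frame height (the function name and the sample values are illustrative):

```python
def smooth_y(points, frame_height, tolerance=0.05):
    """Average y-coordinates that lie within `tolerance * frame_height`
    of their neighbours, collapsing a noisy crowd of detections into a
    few stable values."""
    eps = tolerance * frame_height
    points = sorted(points)
    clusters, current = [], [points[0]]
    for y in points[1:]:
        if y - current[-1] <= eps:
            current.append(y)
        else:
            clusters.append(sum(current) / len(current))
            current = [y]
    clusters.append(sum(current) / len(current))
    return clusters

# Four detections collapse into two stable levels on a 1000 px frame.
levels = smooth_y([100, 120, 600, 630], frame_height=1000)
```

With a 1000-pixel frame the margin is 50 pixels, so 100 and 120 merge into one level and 600 and 630 into another.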
The Execution
We used #Livesync to synchronise with Ableton Live, and Live's Scale device to scale the incoming signal. Two choreographers were in place to assist and lead the audience, and the audience's feedback was projected on four projectors.
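The scaling stage amounts to mapping the tracker's output into the range a MIDI control expects. A minimal sketch of that linear mapping (in the actual rig this job was done by Live's Scale device; the function below is only an illustration, not part of #Livesync):

```python
def scale_to_midi(value, in_min, in_max):
    """Linearly map an incoming control value into the 0-127 MIDI CC
    range, clamping out-of-range input."""
    if in_max == in_min:
        return 0
    norm = (value - in_min) / (in_max - in_min)
    return int(round(max(0.0, min(1.0, norm)) * 127))

# A normalised motion value of 0.5 lands mid-range on the controller.
cc = scale_to_midi(0.5, in_min=0.0, in_max=1.0)
```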
Over 500 people danced and created a layer of music alongside the band.