The next thing I would like to show is the progress I have made. For my part of the collaboration I will be working on beat tracking and tempo detection, along with any other information I can extract from the audio to send to the animated dancer to drive the visuals.
The first problem I had was getting the sound I wanted into Max. In the final product I would like any user to be able to plug in a device and play whatever music they have available; for now, however, I am playing Spotify on my computer and routing it into Max through Soundflower.
I started working in Max on my own beat recognition, using some articles I found online about detecting attacks as a starting point. After a while, though, it became apparent that while building my own detector may be possible, making one that is actually effective would be very difficult.
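To illustrate the attack-detection idea I was experimenting with, here is a minimal sketch in Python rather than Max: an onset is flagged whenever the short-term energy of the signal jumps sharply from one frame to the next. The frame size and threshold ratio are arbitrary values chosen for the example, not anything from my patch.

```python
import math

def detect_onsets(signal, frame_size=512, threshold=1.5):
    """Return indices of frames whose energy jumps sharply (an 'attack')."""
    # Short-term energy per non-overlapping frame
    energies = []
    for start in range(0, len(signal) - frame_size + 1, frame_size):
        frame = signal[start:start + frame_size]
        energies.append(sum(s * s for s in frame) / frame_size)

    # Flag a frame as an onset when its energy exceeds the previous
    # frame's energy by the given ratio (with a floor to avoid
    # dividing-by-silence issues).
    onsets = []
    for i in range(1, len(energies)):
        if energies[i] > threshold * max(energies[i - 1], 1e-12):
            onsets.append(i)
    return onsets

# Synthetic test: silence, a 220 Hz burst, silence, another burst
sig = [0.0] * 2048
sig += [math.sin(2 * math.pi * 220 * t / 44100) for t in range(1024)]
sig += [0.0] * 2048
sig += [math.sin(2 * math.pi * 220 * t / 44100) for t in range(1024)]

print(detect_onsets(sig))  # → [4, 10]
```

This catches clean, isolated attacks, but on real music (sustained notes, drums buried in a mix) the energy curve is far messier, which is exactly why a naive detector like this falls short.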
After some time on the Cycling '74 forums and talking to Darren, I found some external objects for Max, such as bonk~, fiddle~, and qm.btrack~, and found them all to be very useful. qm.btrack~ has excellent beat recognition and a built-in tempo calculator, while bonk~ has let me create a graphic spectrum to visually analyse any song that is being played. This is how it is working so far, using a few different styles of songs.
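To show the kind of arithmetic behind a built-in tempo calculator like the one in qm.btrack~ (this is just an illustration of the concept, not that object's actual algorithm): once a tracker has produced beat timestamps, the BPM is 60 divided by the typical inter-beat interval, with the median used so that the occasional missed or extra beat does not throw the estimate off.

```python
def tempo_from_beats(beat_times):
    """Estimate BPM from a list of beat timestamps in seconds."""
    # Intervals between consecutive beats
    intervals = [b - a for a, b in zip(beat_times, beat_times[1:])]
    # Median interval is robust to one-off tracking errors
    intervals.sort()
    median = intervals[len(intervals) // 2]
    return 60.0 / median

# Beats exactly 0.5 s apart correspond to 120 BPM
print(tempo_from_beats([0.0, 0.5, 1.0, 1.5, 2.0]))  # → 120.0
```

A tempo value like this, sent alongside the beat triggers, is the sort of information I plan to pass on to the animated dancer.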