Work In Context

When I started this project, the work had a clear context. The original idea for the projection mapping was an installation for others to participate in, choosing and playing songs for the visuals to react to. However, due to the timeframe and the constraints we have had to put on the project, it has become a smaller-scale idea that can be used to facilitate other projects. For example, it can still be used as an installation on a larger scale by mapping a projection onto a surface and having the dancer move around in the same way, representing the space being projected onto. It can also be used on a smaller scale, as Darren and I discussed: a simple screen set up with the character ready to dance, with some sort of audio input so that someone can play music from their phone or MP3 player and have the character dance to it.

One of the more interesting ideas to me would be holographic projection: setting up a full-scale, life-size 3D projection much like the one shown at the start of this blog (with the TV) and having people interact with the dancer. The interaction would not be physical, as the projection would be inside glass, but having it life-size and in 3D would add another level to the project.

My second idea, which is the most interesting to me and my future research, is to use this projection in virtual reality: placing the dancer in a digital environment that you can populate as well, so that you can dance with the animated character. This could incorporate several people and more characters, making a virtual disco. Combining this with the idea of silent discos could create a complete extension of the real world, and being able to manipulate that world in real time leaves the possibilities endless. It should not be viewed as a way of replacing discos, nightclubs, or dance events, but as a way of extending, enhancing, and augmenting the reality that we know.


Other Works

While researching the work that I wanted to do, I had to look at work in a similar field. Beat detection and audio analysis are used quite a lot today, especially in the world of VJ'ing. However, a lot of these projects (such as the one shown below by Tony Broyez) focus on using music to control patterns, and they respond to the frequency content of the sound rather than the beat.
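To show the difference in code, here is a rough Python sketch of the kind of per-band frequency analysis that drives these VJ visuals. The band ranges and sample rate are my own assumptions, not taken from any of the projects above:

```python
import numpy as np

SAMPLE_RATE = 44100  # assumed mono input at 44.1 kHz

def band_energies(frame, bands=((20, 250), (250, 2000), (2000, 8000))):
    """Rough per-band levels of the kind VJ patches map onto visuals."""
    windowed = frame * np.hanning(len(frame))
    spectrum = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / SAMPLE_RATE)
    # Sum the magnitude in each band: lows (kick), mids (voice), highs (hats).
    return [spectrum[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in bands]
```

Feeding successive frames through this gives a level per band that can be mapped straight onto visual parameters, but it says nothing about where the beat falls, which is what my project needs.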

Another example shows a version of the same project projected on a larger scale.

 

There is an article on Airtight Interactive with a section about audio analysis. It gives a lot of insight into beat detection and the use of a threshold to register the pulse of the music. This was helpful when I was trying to create my own beat-tracking patch in Max; however, as stated in the article, it is very hard to get reliable beat detection this way, which is why I chose to use qm.btrack instead. The article can be found here.
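For reference, this is roughly what the threshold approach from the article looks like in Python. The frame size, history length, and threshold factor are guesses on my part; the article describes the idea rather than these exact numbers:

```python
from collections import deque

import numpy as np

class ThresholdBeatDetector:
    """Naive energy-threshold beat detection.

    A 'beat' is flagged when the current frame's energy exceeds the
    recent average energy by some factor.
    """

    def __init__(self, history=43, threshold=1.4):
        # ~1 second of history for 1024-sample frames at 44.1 kHz.
        self.energies = deque(maxlen=history)
        self.threshold = threshold

    def is_beat(self, frame):
        energy = float(np.dot(frame, frame))
        # Average is taken before appending so the current frame
        # does not dilute its own comparison.
        avg = sum(self.energies) / len(self.energies) if self.energies else 0.0
        self.energies.append(energy)
        return avg > 0 and energy > self.threshold * avg
```

The weakness is visible in the code: the threshold factor has to be tuned per track, which is exactly the unreliability that pushed me towards qm.btrack.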

 

This type of beat detection can be used in other contexts as well. LightJams has made it possible to drive all sorts of lighting with music. A great use they have documented is an LED light-up dance floor that reacts to the music being played.

Having visuals that react to music is a great way to bring the senses together. Visuals that respond to music you can hear, and in some situations feel, give a full sensory immersion.

Final Touches

From testing the final project I have found a few issues which Darren will need to sort out. The first is that the lights on the floor are also supposed to follow the tempo of the music, which doesn't seem to be happening at the moment. The second is that the camera is too close to the character, meaning he is not very visible during some of the dance moves.

Apart from these few small issues everything seems to be working fine. I am, however, still trying to figure out how to create some sort of beat or bar tracking to further synchronize the dance moves with the music.

Problem Solving

Before Darren was able to update the file, however, I needed to send him my patch to see if it would work with his scripting. There was an issue, though, in the fact that Darren only has a Windows machine and I primarily use a Mac, which meant Darren was unable to use qm.btrack, which is essential to the beat and tempo tracking. I managed to solve this by suggesting that he create another type of tempo input for the purposes of testing, which worked fine in the end.
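A stand-in tempo input can be as simple as a steady pulse at a fixed BPM (in Max, a metro object does this). Sketched in Python with the python-osc library, and with a placeholder address and port rather than the real values from Darren's receiver, it would look something like this:

```python
import time

from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # placeholder address and port
BPM = 120  # fixed test tempo instead of a tracked one

while True:
    client.send_message("/beat", 1)  # one "bang" per beat for the receiver
    time.sleep(60.0 / BPM)
```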


The Link

Darren has now sent me an updated version of the Unity project, which has the OSC receiver to receive the tempo and animation instructions from the Max patch. He has told me which strings to send to link up with the Unity coding he has done, and now I have been working out how to control the dancer from certain musical gestures.

The first thing I wanted to work out was how to select which animation the dancer does. There are four overall: one idle animation and three dancing. I started off by using the beat track and a counter to determine how long each animation would last. I have gone for 32 beats, as this seems to be the best amount of time to complete the animation and have a smooth transition. I then used a random number driven by the carry count, with the select object attached to three bangs, to randomly select which animation to play. However, this proved unsuccessful, as it would sometimes stay on the same animation for three or four rotations. Instead, I have taken the carry count of the first counter and placed it into another counter object counting from 0 to 3 and repeating. As one of the animations fits better than the others, and is what I consider to be the main animation, I have used two of the four outputs to trigger it, making it play more frequently than the others.
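The counter chain is easier to follow written out as code, so here is a rough Python equivalent. The clip names are placeholders; the real strings come from Darren's Unity setup:

```python
class AnimationSelector:
    """Mirror of the Max counter chain: 32 beats per animation, with a
    second 0-3 counter choosing the next clip on each carry."""

    BEATS_PER_ANIMATION = 32
    # Two of the four slots point at the main dance, so it plays
    # twice as often as the other two (hypothetical clip names).
    SLOTS = ["dance_main", "dance_a", "dance_main", "dance_b"]

    def __init__(self):
        self.beat = 0
        self.slot = 0

    def on_beat(self):
        """Call once per detected beat; returns a clip name on each carry."""
        self.beat = (self.beat + 1) % self.BEATS_PER_ANIMATION
        if self.beat == 0:  # carry: 32 beats have elapsed
            clip = self.SLOTS[self.slot]
            self.slot = (self.slot + 1) % len(self.SLOTS)
            return clip
        return None
```

Because "dance_main" occupies two of the four slots, it comes up twice per rotation, which is exactly what the doubled-up counter outputs do in the patch.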

string selection

What you can see here is the counter on the left, under the beat track, running into another counter as described. You can also see the sprintf object being used to send the tempo string, which is /1/### (### being the BPM), and then the other messages sending the name of the animation to choose. You can also see in the picture the idle message, which I will explain now.
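Written out, the messages the patch sends look roughly like this (using the python-osc library; the port and the /animation address are stand-ins, as only the /1/### tempo string is confirmed by Darren's receiver):

```python
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # placeholder port

def send_tempo(bpm):
    # Equivalent of the sprintf object: the BPM is baked into the
    # address itself, as /1/### with ### being the BPM.
    client.send_message(f"/1/{int(bpm)}", [])

def send_animation(name):
    # The animation messages carry the clip name; "/animation" here
    # is a stand-in address, not the actual string Darren gave me.
    client.send_message("/animation", name)
```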

idle string

To organise the idle animation, I decided that I wanted it to be selected after no music had been played for 3 seconds, causing the dancer to stop dancing. I have used the amplitude output from the fiddle object, as it had the best accuracy for numerical output, and I have then used less-than and greater-than functions to determine when the music is on or off. At first this caused a problem: even when the music was off it was still sending a signal, which was glitching any bang that I placed in the sequence, meaning the idle message would constantly be sent. The most accurate way around this that I found was to use two of the same greater-than/less-than circuits: one to control a ggate (to stop the constant signal) and another to control the metro counter (3 seconds and then send a bang to idle).
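The same logic, minus the Max-specific workaround, is easy to state in code. Here is a Python sketch of the gate; the amplitude threshold is a placeholder value:

```python
import time

class IdleGate:
    """If the amplitude from the tracker stays under the threshold for
    3 seconds, emit 'Idle' once and stop re-sending it."""

    def __init__(self, threshold=0.01, hold=3.0):
        self.threshold = threshold  # placeholder amplitude cutoff
        self.hold = hold            # seconds of silence before idling
        self.quiet_since = None
        self.idle_sent = False

    def on_amplitude(self, level):
        if level >= self.threshold:      # music is playing again
            self.quiet_since = None
            self.idle_sent = False
            return None
        if self.quiet_since is None:     # silence just started
            self.quiet_since = time.monotonic()
        if not self.idle_sent and time.monotonic() - self.quiet_since >= self.hold:
            self.idle_sent = True        # fire once, like the gated bang
            return "Idle"
        return None
```

The idle_sent flag does the job of the ggate in the patch: once the idle message has fired, the constant low-amplitude signal can no longer re-trigger it.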

Progress

Due to my brother having work commitments, and also his wedding happening last weekend, development has taken a bit of a break. From conversations with Darren over the weekend we have organized what is left to do, the time frame to do it in, and who will do what by which day for the final deadline.

To Do

For the purposes of my evaluation, I have asked Darren to detail the work that he has done for the project.

Wavetick / VDMX

As my part of the collaboration is to create the tempo tracking, I started looking at beat-tracking and tempo-tracking software. The best that I found, which was suggested by Darren, is a program called 'Wavetick', created by Wavesum.

Wavetick

As you can see from the picture, there is no visual representation of the BPM. However, a great trait of Wavetick is its beat tracking. The three boxes at the top represent Bar, Beat, and Atom. The Bar square will flash at what it recognizes as the beginning of the bar, the Beat square will flash on every beat, and the Atom square will flash on 8th notes, giving a very accurate representation of tempo. This feature would be very useful to have in my project, especially the Bar meter, so that the 3D character can recognise when to start dancing: at the beginning of the bar.
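The arithmetic behind the three indicators is simple once the bar position is known. Assuming a 4/4 bar (Wavetick works the bar out itself; this is only the counting):

```python
def wavetick_flashes(atom_index, beats_per_bar=4):
    """Which of the three boxes would flash on a given 8th-note tick."""
    atom = True                                  # every 8th note
    beat = atom_index % 2 == 0                   # every quarter note
    bar = atom_index % (beats_per_bar * 2) == 0  # first beat of the bar
    return bar, beat, atom
```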

The other program Darren mentioned is VDMX, which, with the addition of Waveclock, is a way of automating visuals to music. This looked like a good way to go at first, as it would be a simple way to make everything work. However, since we decided to work with 3D visuals, we decided to abandon this and work within Unity. Here is a video showing the capability of VDMX.