Thursday 20 August 2009

Live Visuals Continued...


Continuing my everlasting theme of live audio/visual synchronisation, I have some new links on the topic of using Blender - specifically its game engine - as a visuals tool.

I had previously come across a script for Blender called 'Midi_Import_X' (blog post for that here). The script doesn't allow for real-time import; instead it takes pre-built MIDI sequences and lets each note trigger an action when the scene is rendered. So for instance a note might make an eye blink at a certain point in the timeline.
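Just to illustrate the idea (this isn't the script's actual code), here's a minimal sketch of the same approach using the current Blender Python API (bpy) and the mido library to read the MIDI file - both assumptions on my part, since Midi_Import_X predates them. Each note-on event becomes a set of keyframes on an object, e.g. scaling an 'Eye' object down and back up to make it blink:

```python
# Sketch only: map note-on events from a MIDI file to Blender keyframes.
# Assumes the 'mido' package is installed into Blender's Python and that
# the scene contains an object named "Eye" (both hypothetical).
import bpy
import mido

scene = bpy.context.scene
fps = scene.render.fps
eye = bpy.data.objects["Eye"]

time_s = 0.0
for msg in mido.MidiFile("/path/to/sequence.mid"):
    time_s += msg.time  # iteration yields delta times in seconds
    if msg.type == "note_on" and msg.velocity > 0:
        frame = int(time_s * fps)
        # Blink: open before the note, closed on the note, open again after.
        eye.scale.z = 1.0
        eye.keyframe_insert(data_path="scale", frame=max(frame - 2, 1))
        eye.scale.z = 0.1
        eye.keyframe_insert(data_path="scale", frame=frame)
        eye.scale.z = 1.0
        eye.keyframe_insert(data_path="scale", frame=frame + 3)
```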

Click on the image below to see a nice example of this.

Now if only this could be done live...

The next example of a script/plug-in is this one - called Pyppet (link for download here). It allows certain elements to be controlled with various input devices (including the Wiimote). However, at the moment neither MIDI nor the Blender game engine is mentioned, and as far as I can tell it is only meant for the animating stage of linear movies. If it were to allow MIDI control in the GE, it would be perfect...

Again, click the image for a video...

All this would aid my quest to create visuals that link up perfectly with sounds as they are played. One of my current projects - using the track 'Starter' by Boys Noize - uses a lot of keyframing to hook sounds to visuals. Whilst the result is satisfying, the process is time consuming and also linear - it doesn't allow for syncing up to other songs.

Here is a clip of one of the sections...




Going back to Pyppet, if certain animation actions were saved - like the flashing X in the video - and then assigned to controllers, the flashing X could be synced to this song or indeed any other song being played. The only catch with this method is that MIDI sequences would need to be created for every song that is going to be played live. That isn't really much of a downside though - just a little more preparation if the song you are going to be playing is not your own (e.g. when DJing). If the track is your own and you already have the MIDI sequences, this method of visuals would be perfect.
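As a rough sketch of what the live version might look like (none of this is Pyppet's code): a Python controller inside the Blender game engine, run off an Always sensor, could poll a MIDI input with the python-rtmidi library and play a saved action - say the flashing X - whenever a matching note arrives. The library choice, the port number, the note and the action name are all assumptions here.

```python
# Sketch only: trigger a stored action in the Blender game engine from live MIDI.
# Assumes python-rtmidi is importable from the game engine's Python and that
# the object owns an action named "FlashX" (both hypothetical).
import bge
import rtmidi

NOTE_ON = 0x90
TRIGGER_NOTE = 60  # middle C, arbitrary choice

cont = bge.logic.getCurrentController()
own = cont.owner

# Open the MIDI port once and keep it on the object between logic ticks.
if "midi_in" not in own:
    midi_in = rtmidi.MidiIn()
    midi_in.open_port(0)
    own["midi_in"] = midi_in

msg = own["midi_in"].get_message()
if msg:
    data, _delta = msg
    if len(data) == 3 and data[0] & 0xF0 == NOTE_ON:
        note, velocity = data[1], data[2]
        if note == TRIGGER_NOTE and velocity > 0:
            # Play the pre-saved "flash" action over frames 1-20.
            own.playAction("FlashX", 1, 20)
```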


More on this soon I'm sure...

ANDY GILMORE .II

I finally came across Andy Gilmore's own site.

I'd seen his stunning work on Flickr, but not his website, which includes a couple of blogs with some interesting pieces on them.

Monday 3 August 2009

AMUSING IMAGE

I just found this image amusing when I came across it...

YELLENA JAMES

Some really nice illustrations here from the artist Yellena James. Liking the organic feel to them.

FLASH AUDIO

Some very interesting and inspiring works here from André Michelle, using ActionScript in Flash and some synthesised sounds. He's even put some of his works together to make a full-blown music-making interface that works in your browser...

See:

aM Laboratory

Hobnox Tool

WORRIED NOODLES

This vid is awesome.

GAME ENGINE AS LIVE VISUALS

Another post from Create Digital Motion that has caught my interest is this one, featuring MAX and the Unity game engine.


In the video featured on the page they show a cube and a sphere being controlled externally by Jitter - with the hues being altered, movement being applied, and even video with various blending modes being used as live textures. This setup made me wonder whether the same thing was possible in Blender - the free 3D modelling software with a built-in game engine.
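I don't know exactly how the Max/Jitter-to-Unity link is wired up, but as a sketch of the general idea in Blender: an OSC listener (here using the python-osc package, which is an assumption, as are the object name and addresses) could receive hue and position messages from any external controller and apply them to an object.

```python
# Sketch only: drive a Blender object's hue and position from OSC messages,
# roughly mirroring what Jitter is doing to Unity in the video.
# Assumes the python-osc package and an object named "Cube" (both hypothetical).
import colorsys
import threading

import bpy
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import ThreadingOSCUDPServer

cube = bpy.data.objects["Cube"]

def set_hue(address, hue):
    # Hue arrives as 0.0-1.0; convert to RGB for the object's colour.
    r, g, b = colorsys.hsv_to_rgb(hue, 1.0, 1.0)
    cube.color = (r, g, b, 1.0)

def set_position(address, x, y, z):
    cube.location = (x, y, z)

dispatcher = Dispatcher()
dispatcher.map("/cube/hue", set_hue)
dispatcher.map("/cube/position", set_position)

# Run the OSC server on a background thread so Blender stays responsive.
# (In practice you'd queue the values and apply them on the main thread.)
server = ThreadingOSCUDPServer(("127.0.0.1", 9000), dispatcher)
threading.Thread(target=server.serve_forever, daemon=True).start()
```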

This game (GooBall) is running under the Unity game engine as shown in the video.

(Still from GooBall)

And also a shot from a game produced in Blender...

Links to the game site, and the Blender project page.


It would be cool to see environments like these being controlled in realtime through alternative controllers - that is, not gamepads - which brings me onto my next point...

Awaiting future release is the Max for Live extension, which allows MAX/MSP to connect directly with Ableton Live (music production and performance software). With this link, plus a link into a game engine, visuals set within 3D environments could truly be synced with music - whether by playing a keyboard live and having things happen within the engine, as you would expect from pressing buttons in a videogame, or via precomposed MIDI sequences performed as in a common Ableton setup. Precomposed MIDI would allow patterns of visuals to be stored for instant triggering.
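Max for Live isn't out yet, so purely as a sketch of the routing idea: Ableton (or a Max patch) could send its MIDI out over a virtual/loopback port, and a small dispatcher on the visuals side could map note numbers to named visual cues - the 'stored patterns for instant triggering' mentioned above. The port name, the note mapping and the trigger_cue function are all assumptions.

```python
# Sketch only: map MIDI notes arriving from Ableton (via a virtual/loopback
# port) onto named visual cues. The port name, the note numbers and the
# trigger_cue function are all hypothetical.
import mido

# Each precomposed MIDI note number stands for one stored visual pattern.
NOTE_TO_CUE = {
    36: "kick_flash",
    38: "snare_strobe",
    48: "bassline_pulse",
}

def trigger_cue(name):
    # Placeholder: in a real setup this would fire the saved action/pattern
    # inside the game engine (e.g. over OSC or a shared queue).
    print("trigger:", name)

with mido.open_input("Ableton Out") as port:  # loopback port name is an assumption
    for msg in port:
        if msg.type == "note_on" and msg.velocity > 0:
            cue = NOTE_TO_CUE.get(msg.note)
            if cue:
                trigger_cue(cue)
```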

On top of this, the live game engine feed could be run through another program as an input - perhaps in VDMX somehow. Then post-effects could be applied to the source to enhance the image.

The possible outcomes for this kind of set-up would be endless. With the level of detail going into today's games and the kind of work being produced in the motion graphics and animation world, making something that runs in realtime would be fantastic for events and installations. Music events could have live visuals designed specifically for the artist and synced specifically to their music. That's not to say this isn't happening already - there certainly are artists out there with accompanying visuals by top design agencies - but my point is that showing them through a game engine that links directly to Ableton is possibly the most straightforward and best-synced setup you can have, and, being a game engine, it allows for visuals with a lot more depth than a set of pre-rendered clips. For instance, imagine that an environment like this...

(Zeitguised - Perepetics)

was created in a game engine - and that several of the different components were hooked up to different parts of a song like a drum pattern or a bassline. Like I said, the possibilities would be endless. And beautiful.
