Thursday 20 August 2009

Live Visuals Continued...


Continuing on my everlasting theme of live audio/visual synchronisation, I have some new links based around the topic of using Blender - specifically its game engine - as a visuals tool.

I had previously come across a script for Blender called 'Midi_Import_X' (blog post on that here). The script doesn't allow real-time input; instead it imports pre-built MIDI sequences and allows actions to be triggered from each note when the scene is rendered. So, for instance, a note might make an eye blink at a certain point in the timeline.
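To make the idea concrete, here's a rough sketch of how pre-built MIDI could be baked into Blender keyframes. This is not the actual Midi_Import_X script - it assumes the mido library for reading the MIDI file and Blender's Python API (bpy) for inserting the keys, and the object name and 'blink' scale mapping are made up for illustration.

```python
# Sketch only: bake note-on events from a MIDI file into Blender keyframes.
# Assumes the 'mido' library is available inside Blender's Python, and that
# the scene contains an object named 'Eye' - both are assumptions, not part
# of Midi_Import_X itself.
import bpy
import mido

FPS = bpy.context.scene.render.fps

def bake_midi_to_keyframes(midi_path, object_name="Eye"):
    obj = bpy.data.objects[object_name]
    time_s = 0.0  # running time in seconds
    for msg in mido.MidiFile(midi_path):
        time_s += msg.time  # delta time between messages, in seconds
        if msg.type == "note_on" and msg.velocity > 0:
            frame = int(time_s * FPS)
            # 'Blink': squash the object on the note, restore it a few frames later.
            obj.scale = (1.0, 1.0, 0.1)
            obj.keyframe_insert(data_path="scale", frame=frame)
            obj.scale = (1.0, 1.0, 1.0)
            obj.keyframe_insert(data_path="scale", frame=frame + 5)

bake_midi_to_keyframes("/path/to/sequence.mid")
```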

Click on the image below to see a nice example of this.

Now if only this could be done live...

The next example of a script/plug-in is this one - called Pyppet (link for download here). It allows certain elements to be controlled with various input devices (including the Wiimote). However, at the current time MIDI is not mentioned, and nor is the Blender game engine. As far as I can tell, it is only for use at the animating stage of linear movies. If it were to allow MIDI control in the GE, it would be perfect...

Again, click the image for a video...

All this would aid my quest to create visuals that link up to sounds perfectly as they are being played. One of my current projects - using the track 'Starter' by Boys Noize - uses a lot of keyframing to hook sounds to visuals. Whilst this process is satisfying when done, it is time-consuming, and also linear - it doesn't allow for syncing up to other songs.

Here is a clip of one of the sections...




Going back to Pyppet, if certain animation actions were saved (like the flashing X in the video) and then applied to controllers, the flashing X could be synced up to this song, or indeed any other song being played. The only catch with this method is that MIDI sequences would need to be created for every song that is going to be played live. This isn't really much of a downside, however - just a little more preparation if the song you are going to be playing is not your own (i.e. when DJing). If the track is your own and you already have the MIDI sequences, this method of visuals would be perfect.
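As a thought experiment, here's roughly what that could look like inside the Blender game engine - a sketch only, assuming pygame.midi for the live MIDI input and the BGE's playAction() for firing the saved actions (both would be wired up through Python controllers on an Always sensor). The note-to-action mapping is invented for illustration.

```python
# Sketch: trigger pre-saved Blender actions from live MIDI notes inside the
# game engine. Assumes pygame (for MIDI input) is importable from the BGE,
# and that actions named 'FlashX' and 'Pulse' exist - all assumptions.
import bge
import pygame.midi

NOTE_TO_ACTION = {60: "FlashX", 62: "Pulse"}  # hypothetical mapping

def init(cont):
    pygame.midi.init()
    bge.logic.midi_in = pygame.midi.Input(pygame.midi.get_default_input_id())

def update(cont):
    owner = cont.owner
    midi_in = bge.logic.midi_in
    while midi_in.poll():
        for (status, note, velocity, _), _timestamp in midi_in.read(16):
            if status & 0xF0 == 0x90 and velocity > 0:  # note-on message
                action = NOTE_TO_ACTION.get(note)
                if action:
                    # Play the stored action once, frames 1-20.
                    owner.playAction(action, 1, 20)
```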


More on this soon I'm sure...

ANDY GILMORE .II

I finally came across Andy Gilmore's own site.

I'd seen his stunning work on Flickr before, but hadn't seen his website - which includes a couple of blogs with some interesting works on them.

Monday 3 August 2009

AMUSING IMAGE

I just found this image amusing when I came across it...

YELLENA JAMES

Some really nice illustrations here from the artist Yellena James. Liking the organic feel to them.

FLASH AUDIO

Some very interesting and inspiring works here from André Michelle, using ActionScript in Flash and some synthesised sounds. He's even put some of his works together to make a full-blown music-making interface that works in your browser...

See:

aM Laboratory

Hobnox Tool

WORRIED NOODLES

This vid is awesome.

GAME ENGINE AS LIVE VISUALS

Another post from Create Digital Motion that has taken my interest is this one featuring MAX and the Unity game engine.


In the video featured on the page they show a cube and a sphere being controlled externally by Jitter - with the hues being altered, movement being applied, and even video with various blending modes being used as live textures. This set-up made me wonder whether the same thing was possible in Blender - the free 3D modelling software with a built-in game engine.

This game (GooBall) is running under the Unity game engine as shown in the video.

(Still from GooBall)

And also a shot from a game produced in Blender...

Links to the game site, and the Blender project page.
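For what it's worth, here's a rough guess at how the Blender side of that plumbing could work: a game engine script listening on a UDP socket for simple text messages (which a Max/Jitter patch could send with [udpsend]) and applying them to an object's colour and position. The "property value" message format is entirely made up, and this is a sketch rather than anything official.

```python
# Sketch: drive a Blender game engine object from external messages over UDP.
# Run from a Python controller wired to an Always sensor (true pulse).
# The text protocol is invented for illustration; Max/Jitter would need a
# matching patch sending the same strings on the other end.
import socket
import bge

PORT = 9000

def get_socket():
    # Create the listening socket once and cache it on the bge.logic module.
    if not hasattr(bge.logic, "udp_sock"):
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("127.0.0.1", PORT))
        sock.setblocking(False)
        bge.logic.udp_sock = sock
    return bge.logic.udp_sock

def update(cont):
    obj = cont.owner
    sock = get_socket()
    while True:
        try:
            data, _addr = sock.recvfrom(1024)
        except BlockingIOError:
            break  # no more messages this frame
        parts = data.decode().split()
        if not parts:
            continue
        if parts[0] == "hue" and len(parts) == 2:
            h = float(parts[1])
            obj.color = (h, 1.0 - h, 0.5, 1.0)   # RGBA tint on the object
        elif parts[0] == "move" and len(parts) == 4:
            obj.worldPosition = [float(v) for v in parts[1:4]]
```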


It would be cool to see environments like these being controlled in realtime through alternative controllers - i.e. not gamepads, which brings me onto my next point...

Awaiting future release is the Max for Live extension, allowing MAX/MSP to connect directly with Ableton Live - music production and performance software. With that link, plus a link into a game engine, visuals set within 3D environments could truly be synced with music; whether that means playing a keyboard live and having things happen within the engine - just as you would expect from pressing buttons in a videogame - or precomposed MIDI sequences performed as per a common Ableton setup. The precomposed MIDI would allow patterns of visuals to be stored for instant triggering.
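Just to illustrate the 'stored patterns' idea, here's a tiny sketch of the sort of dispatcher that could sit at the game engine end, mapping incoming note numbers (however they arrive from Max for Live - the transport isn't shown here) onto named visual patterns. The note numbers, pattern names and trigger function are all hypothetical.

```python
# Sketch: map precomposed MIDI notes onto stored visual 'patterns'.
# trigger_pattern() stands in for whatever the engine actually does
# (playing an action, swapping a texture, flashing a layer, etc.).
PATTERNS = {
    36: "kick_flash",      # bass drum -> full-screen flash
    38: "snare_strobe",    # snare -> strobe a layer
    48: "bassline_wobble", # bass notes -> wobble the environment
}

def trigger_pattern(name, velocity):
    intensity = velocity / 127.0   # scale the effect by how hard the note was hit
    print("triggering %s at intensity %.2f" % (name, intensity))

def on_note(note, velocity):
    name = PATTERNS.get(note)
    if name and velocity > 0:
        trigger_pattern(name, velocity)

# e.g. a kick drum note arriving from the precomposed sequence:
on_note(36, 100)
```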

On top of this, the live game engine feed could be run through another program as an input - perhaps in VDMX somehow. Then post-effects could be applied to the source to enhance the image.

The possible outcomes for this kind of set-up would be endless. With the level of detail going into today's games and the kind of work being produced in the motion graphics and animation world, making something that runs in realtime would be fantastic for events and installations. Music events could have live visuals designed specifically for the artist and synced specifically to their music. That's not to say this isn't happening already - there certainly are artists out there with accompanying visuals by top design agencies - but my point is that showing them through a game engine that links directly to Ableton is possibly the most straightforward and best-synced set you can have, and being a game engine, it allows for visuals with a lot more depth than a set of pre-rendered clips. For instance, imagine that an environment like this...

(Zeitguised - Perepetics)

was created in a game engine - and that several of the different components were hooked up to different parts of a song like a drum pattern or a bassline. Like I said, the possibilities would be endless. And beautiful.


Friday 26 June 2009

GRAFFITI WALL

I love stuff like this...by ALEX BEIM

RESONANT MIGRATION

Here's an interesting concept - tracking where musicians (or their instruments) are standing with various devices, and then having those devices detect variables such as volume, pitch and so on.


I'd like to see a full swing jazz band with this kind of setup creating insane compositions as they jam. Perhaps creating images like this.
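Out of curiosity, here's a very rough sketch of the detection side - how volume and pitch might be pulled out of a block of audio samples, assuming numpy and a plain autocorrelation approach. This has nothing to do with the artists' actual method; it's just the kind of analysis such a setup would need somewhere.

```python
# Sketch: estimate volume (RMS) and pitch (autocorrelation) from one block
# of mono audio samples - a toy version of the analysis, not the real thing.
import numpy as np

def analyse_block(samples, sample_rate=44100):
    samples = np.asarray(samples, dtype=float)
    volume = np.sqrt(np.mean(samples ** 2))          # RMS level

    # Autocorrelation pitch estimate: find the lag with the strongest
    # self-similarity within a plausible range (50 Hz - 1000 Hz).
    corr = np.correlate(samples, samples, mode="full")[len(samples) - 1:]
    min_lag = sample_rate // 1000
    max_lag = sample_rate // 50
    lag = min_lag + np.argmax(corr[min_lag:max_lag])
    pitch_hz = sample_rate / lag
    return volume, pitch_hz

# Quick test with a synthetic 220 Hz tone:
t = np.arange(2048) / 44100.0
vol, pitch = analyse_block(0.5 * np.sin(2 * np.pi * 220 * t))
print("volume %.3f, pitch ~%.0f Hz" % (vol, pitch))
```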

FUNKY FOREST

During my final year of university I researched a lot of visual and interactive design, which eventually led to my Crazy Landscape project. One piece that I found extremely inspiring in the context of my work was 'Funky Forest'.
It was an interactive installation consisting of a room with projections onto each wall and the floor. When a person stood against a wall, a tree would appear in the shape that they made - for instance, standing straight would create a fairly upright tree, whilst splaying your arms and at least one leg (dependent on how many legs you have as to whether this is possible) would create a tree with branches that splayed in a similar way.
Along the floor and up the entrance ribbons (like those you get at the back of old corner shops) ran a stream of virtual water that could be guided towards the trees by standing in front of it. Some clever coding and detection here. Something I'd very much like to achieve myself one day...

(click image for more)
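For the detection side of something like this, a first stab might be simple background subtraction with OpenCV - a sketch along these lines, assuming a fixed camera and the cv2 bindings (the real installation is obviously far more sophisticated than this).

```python
# Sketch: pull a person's silhouette out of a fixed camera feed with
# background subtraction, then grab its bounding box - the sort of raw data
# you'd need before 'growing' a tree in that shape. Not the real system.
import cv2

cap = cv2.VideoCapture(0)                       # first attached camera
subtractor = cv2.createBackgroundSubtractorMOG2()

while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)              # white = foreground (the person)
    mask = cv2.medianBlur(mask, 5)              # knock out speckle noise
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) > 5000:           # ignore small blobs
            x, y, w, h = cv2.boundingRect(c)
            cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("silhouette", frame)
    if cv2.waitKey(1) & 0xFF == 27:             # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```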

TARATIEL

Some awesome work here that I discovered over a year ago, but thought it was still worth blogging now... The artist combines paint with a lot of 'found' objects to create some interesting installations. The shapes are quite free-form and appear almost doodle-like.

Truly beautiful...


(click image for more)

Thursday 4 June 2009

SPLURGE!

I never blogged this after finishing it so I thought I'd do it now...

Here's a link to the posters that Harry Osbourne and I created for the reception area and editing suites at work. The editing suite ones sit at 140x100cm and the reception posters are a grid of 2x3x3 30x30cm panels. Don't worry, you don't actually have to multiply anything there.




Monday 1 June 2009

MONSTERS!

Some awesome animations that involve some pretty neat tracking and well placed characters in the real world. These guys have done a load of them and have got their own site dedicated to the little critters...

Take a gander.

GRAND IN YOUR HAND

Hurrah! My piece 'Grand in your Hand' is finally finished!

See it here
Vimeo

and here
Flickr


Commissioned by AMV BBDO in September 2008, 'Grand in your Hand' was initially intended as a piece to create awareness of the referral scheme that was taking place within BBDO. Simple idea - Refer someone to the company and receive £1000 if the person is given a contract.

The brief was to create an animation that would be sent around as a viral, plus an accompanying poster, encompassing the notion of finding a person to refer and then seeing them settle well into the company – the final execution is not too obvious (so as not to offend the creative types) and not too abstract (so the suits could understand it!).


Tuesday 7 April 2009

MULTI-TOUCH

In terms of MIDI control and multi-touch devices, the Lemur was truly king - until people started making their own alternatives, ones that were ultimately a lot more customisable (in terms of their control abilities and how pretty you want to make them...).

This example shows someone using MAX/MSP as an app that works in exactly the same way as the Lemur, with multi-touch abilities and a whole load of functions running at once. Now if we all had touch screens (and a copy of MAX), we'd all be taking a big step in terms of music/visuals...

And I would make mine look far more attractive than this screengrab:

VDMX STRUCTURE

I came across this post on Create Digital Motion a while back called 'Structuring a VJ set' by Momo the Monster. In it he explains how he would go about creating various types of media that would run in different media sets - and sit on specific layers - thus allowing more control over what is being displayed on screen. I had tried exactly the same method before (assigning the clips to Ableton tracks) with a degree of success. The limitation was that movies triggered in fast sequences would either run with a delay or, in the worst case, crash the machine.

Click on the shots of my set-up, the output, and what I was trying to achieve...

----------

1.

The first is a shot of Ableton, with a number of MIDI tracks, all sending signals out that can be read by VDMX. There are a couple of 'audio in' tracks for any incoming music that I want the visuals to sync with.

In the MIDI slots, I have a number of pre-made sequences that are sending signals out. What this set-up also allows for is creating sequences (and therefore visual patterns) on the fly - by using a MIDI keyboard, such as the M-Audio Oxygen8 that I have.

There are 12 MIDI slots in total - 2 for background patterns, 2 for drum patterns, 2 for percussion (or other random sound effect) patterns, 2 for bass patterns, and 4 more called 'Main', which can contain main song patterns or live input - so it would literally be like playing videos on a piano. (A rough sketch of this slot-to-bin mapping appears after the walkthrough below.)

----------

2.

The second shot is of VDMX, with 12 media bins linking back to the 12 MIDI slots that I have. From the shot you can see that there isn't much in the way of media here; however, the bins could be filled with large amounts of clips representing each sound or note. It is here, though, that the ambitious size of the project starts to hit pitfalls, as trying to play back so many videos at such speeds uses up massive chunks of processing power - and thus crashes horribly.

----------

3.
This is a shot of the output window, displaying several layers of video - all reacting to MIDI.

----------

4.

And this is an Illustrator image of the kind of visuals scene I was trying to achieve...
Each of these elements would be linked to MIDI and thus react (changing shape, colour, or animating in an entirely different way) to each signal.
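Since the routing described in steps 1 and 2 is really just a lookup from MIDI slot to media bin, here's that idea sketched as plain Python. The slot names follow my set-up above; the clip-chosen-by-note-number part is hypothetical, as VDMX handles all of this through its own MIDI bindings rather than any code of mine.

```python
# Sketch: the 12-slot routing as a plain lookup - one Ableton MIDI track per
# media bin/layer. This is just the logic written out, not how VDMX is driven.
SLOTS = (
    ["background"] * 2 + ["drums"] * 2 + ["percussion"] * 2 +
    ["bass"] * 2 + ["main"] * 4
)  # 12 slots -> 12 media bins

def route(channel, note, velocity):
    """Map a note-on from Ableton MIDI channel 1-12 to (slot name, layer, clip index)."""
    if velocity == 0 or not (1 <= channel <= len(SLOTS)):
        return None
    layer = channel - 1          # media bin / layer number
    clip = note % 16             # e.g. 16 clips per bin, picked by note number
    return SLOTS[layer], layer, clip

print(route(channel=3, note=60, velocity=100))   # ('drums', 2, 12)
```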

I'd like to try a set-up such as this on Modul8, just to check speeds and the ease of linking videos - though the Modul8 demo is, alas, quite limiting because of the lack of module support.

SCI-FI-O-RAMA

A new favourite site of mine -

SCI-FI-O-RAMA

- contains a collection of sci-fi related images by various illustrators and artists, ranging from really old-skool painted book covers depicting fantasy lands to modern-day symbol-like vector art.

Check it out...

OHM 64

And now an impressive piece of hardware...

THE OHM 64


It is a MIDI controller awesomely suited to visuals - with its 64-button square that could be used to trigger a mass of clips, and an array of knobs and sliders for layer control, it just leaves me drooling (though probably not quite as much as a Lemur would).

PIXEL ADDICTS

I recently stumbled across these guys - The Pixel Addicts. A bunch of designers who create some stunning installations and visuals across a whole range of events. They even created the visuals set-up that runs within Matter (the new sister club to Fabric).

It seems they have some impressive custom hardware (and software) for running their visuals as well. It'd be sweet to get to see some of it close up...

Click the image to see more...

Sunday 5 April 2009

QBRK

Some awesome graffiti/illustration work from QBRK, found on flickr.

EXYZT

Some more fantastic examples of projection mapping here from the collective 'Exyzt'. I particularly like the building that has been mapped so perfectly and completely transformed by the projections on it.

See them on Youtube and Vimeo

AUGMENTED REALITY TOYS

I came across this site recently, demonstrating the use of augmented reality in toys. I'm looking forward to having kids just so that I have an excuse to play with stuff like this...

OMEGA CODE

I recently entered this competition for Omega Code with a version of my Crazy Landscape series. There has been a load of great work submitted so far, and the official posters are also stunning pieces of design work - an example is shown below.
I find the work of Omega Code interesting, as they accompany every song with visuals; their site even has a cool Flash visualiser thing...



My image can be seen here at my flickr account...

Wednesday 4 March 2009

PLEASE SAY SOMETHING

I've been following the work of David O'Reilly for a while now, and absolutely loving everything he has done. The series 'Please Say Something' started in short episodes a while back, but they were removed to allow David to enter the project into a film festival and complete the series as one long animation (of 10 minutes, rather than 20-or-so-second stints).
What I love about the story is the way that it unfolds via such simple things - such as the scarf. You get a real sense of character about each...character very quickly through their actions, and learn to love them both by the end.
I also love its style. It is honest - David has mentioned on his site that '[his] goal aesthetically has always been the more broader aim of simply not hiding the artefacts of software, the same way Bacon didn’t hide paint strokes'. I have often found errors in software that have bugged me - most notably in 3D programs, where polygons would spike off into infinity. When I first started using 3D programs all those years ago I hadn't a clue what these were; now there is an artist who is managing to embrace these errors, though not in a way that destroys the integrity of the animation.

And now for the sake of spreading the love and getting more hits or whatever, here is the embedded video...



I urge you to check out his site and his other stuff, especially RGBXYZ -

HERE.

PRE ENTER

I can't remember exactly how I came across this, but I do remember thinking 'wow'. It's a fairly simple concept that has been executed so well. Ordinary desktop scanners (without the lids) were placed on walls around the club, and in the bathrooms, and the attendees of the show were urged to scan their faces (and other body parts) to be used during the show - as live visuals.

Click the picture to see what I'm on about...

MAPPING

With all the talk of projection mapping, I came across this festival for VJs and visual artists...


And on the site for Modul8 (VJ software for OS X) there is a clip that (I think I read somewhere) was from a previous Mapping festival...

It has some interesting use of projections onto material to create the illusion that some of the projections are in the air...

Kind of reminds me of the scene at the end of Terminator 2, except less fiery.

RED HOT CAR

Nice video/visuals for Squarepusher's 'Red Hot Car' via YouTube...


SUPER 2009 !

Another fantastic use of projection mapping, possibly my favourite yet...

by SUPERBIEN

TOUCH. SCREEN.

With touch-screen technologies being all the rage and the must-have, there are more and more of the things springing up everywhere, though mostly doing the same thing... Here is a different type of touch-screen that actually changes properties depending on how it is manipulated - it has a certain elasticity to it, rather than being a static panel.

Click the image for more...

WILL SCOBIE

An illustrator I came across in some design books a while back...

Here is an example of his work from his Behance account. I love the simplistic style of it and the way that it almost looks like it could be mechanical drawings.

Click the image for more...

MMM COLOURS

Some nice LED screens here - the way that club visuals should be done...


PROJ'IT v1.0

Here is a really nice use of projection mapping that makes a simple wall and a block of Post-it notes suddenly seem far more interesting. As far as projection goes, I think the way forward is definitely projecting onto objects/things/whatever rather than just a flat wall, which can be pretty boring...

Wednesday 25 February 2009

DATA MOSHING?

I realise that this has been on loads of blogs recently, but I thought I'd add it here anyway...

Kanye West's Welcome to Heartbreak


I really like this technique, as it is a very creative use of the frame tearing and lossy attributes that go with compressed video formats - as far as I understand it, the keyframes are removed so that the motion data from the following frames gets applied to the wrong image. And as far as I have read in an interview with the director, it's not the easiest of techniques to pull off well. I can't stand the term 'data moshing', however. It comes across as some sort of high-school kid term used by the nerds in an attempt to sound cool.

Similarly, David O'Reilly, one of my favourite animators, used this technique a while back in his showreel. With a nice soundtrack.

DAVID O'REILLY'S COMPRESSION REEL

SUNGLASSES!

Pretty sweet sunglasses designed in collaboration with graphic designers/illustrators such as Genevieve Gauckler and EBoy. It's a shame the currency conversion rate is so awful now...

COLAB EYEWEAR

Monday 23 February 2009

BUILT BY RIMES

I find Processing, VVVV and other such programs quite interesting and have seen a lot of good work produced with them. I have also, however, seen a lot of work that seems to look the same and is instantly tagged as 'generative art'. This photo set by 'Flan [CWM / ICTV]' shows some generative work that also combines a lot of graphical elements - something I'd like to see more of in generative design. I still feel that there is a missing link between motion graphics and realtime generative art, though pieces like this show that the gap is ever so slowly closing...