I’ve been having a good play with Quartz Composer over the last couple of weeks. It is very exciting and a bit scary at this late stage of my MA research: “Does it make what I’ve been doing for the past two years redundant?”; “Should I drop the whole interactive QuickTime thing and start from scratch in this new environment?”; “Should I ignore it for now and continue with QT because it is cross platform and more accessible?”.
In many ways it lets me do what I have been doing, experimenting with, and wanting to do (real-time interactive online video) much more quickly and with exciting new visual results. In some ways it makes basic QT redundant, but it is quite a different beast.
QuickTime excels on the network. Child movies can be sourced from anywhere, and XML and QTLists, while sometimes a pain to set up, are very powerful; I've only really scratched the surface of their potential when combined with server-side scripting such as PHP. Quartz Composer is much more at home on the desktop. It can import still image files from a URL but not movies. It can read RSS very easily, but that is designed for human-readable text, and dealing with generic XML files and attributes requires custom scripting. I have had some success getting QC to load movies from the network via a local QuickTime link file pointing to a URL, but the link file is local, relative to the QC composition, and the link seems to be lost if the composition is exported to a .mov file.
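The server-side end of that combination can be sketched roughly as follows. This is in Python rather than the PHP I actually use, and the element names are purely illustrative, not the real schema my movies consume; it just shows the idea of a script assembling XML on the fly for a movie to fetch.

```python
# Hypothetical server-side helper that builds an XML playlist on the fly.
# Element names ("playlist", "item") are illustrative, not a real QTList schema.
import xml.etree.ElementTree as ET


def build_playlist(urls):
    """Return an XML string listing the given movie URLs, one <item> each."""
    root = ET.Element("playlist")
    for i, url in enumerate(urls):
        item = ET.SubElement(root, "item", index=str(i))
        item.text = url
    return ET.tostring(root, encoding="unicode")
```

A movie (or any other client) could then request this from a dynamic URL and pick items out by index, with the server free to vary the list per request.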
Here is a quick example (requires Mac OS 10.4). Apologies for the cheesy kaleidoscope imagery. Once downloaded and unzipped, the .qtz file should play in Quartz Composer, importing link.mov, which points to a video file on my server. The zip file is about 4k.
The cool thing is that in many ways Quartz Composer builds upon what I have been doing in QuickTime, and its compositions are mostly playable by both the QT Player and plugin. While linking to movie files online is problematic at this point, live video and audio inputs are, surprisingly, supported even in the QuickTime browser plugin! Here is an example which takes a live feed from a FireWire camera, layers it over itself on three differently coloured layers (red, green, and blue), and scales them in real time based upon audio input from the computer's built-in microphone. Link to livergb.mov. This has been tested in Safari with a Sony HandyCam and my PowerBook's built-in microphone. Here's the source .qtz file. While live video input into a movie playing in a browser is pretty exciting, simpler things like keyboard and mouse input are unfortunately missing.
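For anyone curious what the composition is doing per frame, the channel-splitting part can be sketched outside QC. This is a rough Python/NumPy approximation, not the QC patch graph itself: QC scales the layers geometrically with the audio level, whereas this sketch substitutes a simpler audio-driven gain on an additive blend, and the mapping from level to gain is an assumption of mine.

```python
# Rough sketch of the livergb effect: split a frame into R/G/B-only layers,
# then blend them back with a gain driven by the audio input level.
# (QC scales the layers geometrically; the gain here is a simplification.)
import numpy as np


def split_rgb_layers(frame):
    """Return three copies of `frame`, each keeping only one colour channel."""
    layers = []
    for ch in range(3):
        layer = np.zeros_like(frame)
        layer[..., ch] = frame[..., ch]
        layers.append(layer)
    return layers


def composite(frame, audio_level):
    """Additively blend the R/G/B layers, modulated by audio_level in [0, 1]."""
    gain = 0.5 + audio_level  # hypothetical mapping from level to gain
    layers = split_rgb_layers(frame.astype(np.float32))
    out = sum(layer * gain for layer in layers)
    return np.clip(out, 0, 255).astype(np.uint8)
```

With an audio level of 0.5 the gain is 1.0 and the frame passes through unchanged; louder input pushes all three colour layers brighter together.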
Stay tuned for more examples as I play more…