I now mostly post over here: blog.greer-inc.com

I'm a Digital Designer, Art Director and Developer. I'm into photography, interactive design, the web, advertising and art.

Fractal Explorer

Posted: January 16th, 2010 | Filed under: General | Comments Off

Having just started my first forays into Adobe’s Pixel Bender, I’ve been blown away by the scripts currently available for it. As a way of temporarily avoiding actually learning the language, I spent quite a bit of time playing around with Subblue’s Fractal Explorer. Results below.

It’s particularly nice to see these shapes morph as the parameters change. Will get them into After Effects soon to render out these shifts.

Giant Green running.

Posted: January 10th, 2010 | Filed under: video | Comments Off

Finished putting together a quick running test with the rabbit character posted yesterday. This was a little tricky, as I’ve set up the limbs so they don’t deform. The result is an odd bouncing gait, which could be cool with a bit of work. Eventually, he’ll be composited over video of a city. But not yet. For now: running.

Jeremy, the Pan Britannica Bunny

Posted: January 9th, 2010 | Filed under: General | Comments Off

Simple character design for a rabbit to be built in 3D and rendered as a shadeless, one-colour animation. It’d be fun to composite this style over video of cityscapes. Might try out Blender’s new spline IK system to get the joint movements looking really fluid and curvy.

One-colour bunny

The pan britannica bunny

vector file: rabbit (pdf, 204KB)

Broken umbrellas

Posted: January 8th, 2010 | Filed under: photography | Comments Off

I’ve been meaning to post this set of photographs featuring broken umbrellas for a while. Most, but not all, were taken in Glasgow. There is, perhaps in that city more than others, a tendency to cast aside an umbrella with even the slightest fracture, presumably in frustration at the lack of protection it provides against driving rain. I could go on about the form of the broken umbrella in contrast to its urban setting, but I won’t.

Vancouver seems like the perfect place to continue the collection.

Broken umbrella set: on flickr.

Organic glass city.

Posted: January 7th, 2010 | Filed under: 3d, illustration | Comments Off

Well-structured dystopian vision, or lazy arbitrary render? (It’s the second one.)


Blender, Photoshop.

Full size: glasscity2.jpg

Communicating between Blender game engine and Processing

Posted: January 7th, 2010 | Filed under: Blender, processing | Comments Off

The Blender game engine gives you a pretty sophisticated world to play with. Physics calculations like collisions, gravity, rebounds, etc. are taken care of for you in the system. Communicating between the game engine and other systems allows you to work with fewer limitations, or to work in a more familiar system where necessary.

In this example, Processing is used to access the macbook’s sudden motion (tilt) sensor, and to send the data into the game engine using OSC.
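Under the hood, an OSC message like the ones oscP5 sends is just an address pattern, a type-tag string, and packed binary arguments. As a minimal sketch of the wire format, in plain Python (the “/move” address matches this example; the argument values are illustrative):

```python
import struct

def osc_pad(b):
    # OSC strings are null-terminated, then padded to a 4-byte boundary
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address, *floats):
    # An OSC message is: padded address string, padded type-tag string
    # (",ff" for two floats), then each argument packed big-endian
    msg = osc_pad(address.encode())
    msg += osc_pad(("," + "f" * len(floats)).encode())
    for f in floats:
        msg += struct.pack(">f", f)
    return msg

# Two tilt values packed into a single "/move" message
packet = osc_message("/move", 0.5, -0.25)
```

Libraries like oscP5 (and the Python scripts on the Blender side) handle this packing and unpacking for you; it's only shown here to demystify what travels over the UDP socket.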

Some of the python scripts used here are pretty old, and various versions exist online; along the way, the original authors seem to have been lost. Most have been amended in various ways to produce this example.

With both Processing and Blender installed, download the source files here: motion2Blender

Connecting Blender to OSC.
To try out this example, you’ll need Blender 2.49 or later, Processing, the Processing libraries oscP5 and sms, and a MacBook from after 2005 (with a sudden motion sensor in it).

1.  From the blender folder, open motionSensing.blend.
- you may need to open blender from the Terminal in order to see python errors or print commands properly (see footnote).

2.  There are two important python scripts: “connServer.py” and “oscLoc.py”.  The first will attempt to connect to the OSC server when the game is first run, the second listens out for osc messages once the connection is made.

3.  Hit ‘P’ whilst hovering over the 3D viewport to start the game in Blender.

4.  Ensure that you have the libraries oscP5 and sms in your Processing sketchbook libraries folder (see footnote).

5. To get data sending across OSC, open up motion.pde (in processing/motion/).  Hit the run button, or cmd+R.

6.  Tilt your MacBook about, and the ball in the game engine should roll around in concert.

More on Blender:
The connServer.py and oscLoc.py scripts are linked into logic bricks located on the ball in the scene.
- connServer.py is triggered when the game first starts.
- oscLoc.py is connected to an “Always” sensor, and so is running at all times (but will only get any data after connServer.py has been triggered).

You may need to specify the port number for OSC to listen on (line 35 of connServer.py) – this should correspond to the port that Processing is broadcasting to (the myRemoteLocation = new NetAddress("", 12000); line in the setup function).

You’ll need to adjust oscLoc.py to correlate to the messages you’re sending in on OSC.  This example is set up with processing, but it could receive OSC from anything.

Whenever an OSC message is received by Blender, the oscLoc script sends any data it receives to the OSC callback manager: “manage.handle(data)”.  In turn, this handler examines the first element in the array, and sends the message to the appropriate function.  In this example, all data sent will be “/move”, which will call a function called ‘move’:
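The code embed for this step hasn’t survived here, but a sketch of how such a callback manager might route messages — only manage.handle(data) comes from the original; the class and registration method are hypothetical:

```python
class CallbackManager:
    """Route incoming OSC data to the function registered for its address."""
    def __init__(self):
        self.callbacks = {}

    def add(self, func, address):
        self.callbacks[address] = func

    def handle(self, data):
        # data arrives as e.g. ["/move", x, y]; the first element
        # selects which registered callback receives the rest
        address, args = data[0], data[1:]
        if address in self.callbacks:
            self.callbacks[address](*args)

manage = CallbackManager()
received = []
manage.add(lambda x, y: received.append((x, y)), "/move")
manage.handle(["/move", 0.5, -0.25])
```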

Here’s the simple function which is being called:
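The original embed is missing from this page, so here is a rough stand-in: the incoming tilt values are scaled into a force vector for the ball. The scale factor and function name are illustrative, and the game-engine call shown in the comment is how it would be applied inside Blender 2.49:

```python
def tilt_to_force(x, y, scale=0.05):
    # Scale raw sudden-motion-sensor tilt readings into a force vector.
    # Inside the 2.49 game engine this vector would then be applied to
    # the ball with something like (not runnable outside Blender):
    #   own = GameLogic.getCurrentController().owner
    #   own.applyForce([fx, fy, 0.0], False)
    return [x * scale, y * scale, 0.0]

force = tilt_to_force(20, -10)
```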

Force is being applied to the object based on the movements of the laptop.

Launching Blender from the Terminal:
If working on a mac, you may need to launch blender from the terminal in order to be able to see the python outputs for prints and the like.  You should be able to see these in the Console, but it often refuses to update properly.

To launch from the terminal, navigate to your blender folder in Terminal (type “cd /Applications/blender/” and hit enter), then type “blender.app/Contents/MacOS/blender &” to launch. Python output and debugging will be printed to the terminal. Alternatively, right-click on the Blender application and select “Show Package Contents”. From the window which opens, navigate to Contents/MacOS, drag the ‘blender’ file into the terminal window, and hit enter to run.

Installing Processing Libraries
You can now add a library to Processing by adding its folder to the Libraries folder in your Sketches directory.  If Processing is running, you will need to restart it before it will recognise a new library.

To set your sketchbook location, open Processing, and from the top bar select Processing > Preferences. Each library in the Libraries folder should be in a folder containing a subfolder called ‘library’, within which there should be a .jar file, e.g.:
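For oscP5, for instance, the layout would look something like this (the sketchbook path itself varies by machine):

```
Sketchbook/
  libraries/
    oscP5/
      library/
        oscP5.jar
```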

Game of life

Posted: January 6th, 2010 | Filed under: General | Comments Off

I made a flash version of Conway’s ‘Game of Life’ a while ago.

In an early version, I failed to clear out the array of cells which needed to change state that round (from dead to alive, or alive to dead).  The result was far more mathematically arbitrary, but made some interesting patterns. In this version, as the dots flicker across the screen I see a robot, an alien, a fat man, a mouse wearing a crown, and a smug tiger.  It’s way more space invaders than the proper version.
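For reference, here’s the rule in a generic Python sketch (not the Flash source): every cell’s next state is computed from the current generation before any changes are applied — skipping that separation is exactly the bug described above.

```python
from itertools import product

def life_step(live):
    # One generation of Conway's Game of Life over a set of live (x, y)
    # cells. Neighbour counts are taken from the *current* generation
    # only; mutating state mid-count gives the glitchy version above.
    counts = {}
    for (x, y) in live:
        for dx, dy in product((-1, 0, 1), repeat=2):
            if (dx, dy) != (0, 0):
                cell = (x + dx, y + dy)
                counts[cell] = counts.get(cell, 0) + 1
    # A cell is alive next round with exactly 3 neighbours,
    # or 2 neighbours if it is already alive
    return {c for c, n in counts.items()
            if n == 3 or (n == 2 and c in live)}

# A horizontal "blinker" flips to vertical and back every step
blinker = {(0, 0), (1, 0), (2, 0)}
```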

Source (including the proper, fixed Game of Life) available here: Connways_gameOfLife


Posted: January 5th, 2010 | Filed under: General | Comments Off

Just been exploring the excellent toxi color libraries in Processing.

Dots for a slow day.

Dots for a short night.


Posted: January 4th, 2010 | Filed under: General | Comments Off

I’ve been working through a few of the examples in Keith Peters’ Advanced ActionScript Animation book recently, which is as well thought through as its predecessor. The section on steering behaviors with 2D vectors is incredibly informative, and got me working on a game of tag between the little arrows. Using the behaviors I’d just worked through in the book examples, this was surprisingly straightforward. The players all avoid whoever is ‘it’, while he-who-is-it looks for the nearest player and moves towards his predicted position, based on that player’s speed and direction.
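Not Peters’ ActionScript, but the chase logic reduced to a small Python sketch: the chaser steers toward where the prey will be, based on its current velocity, rather than where it is now (the lookahead value here is arbitrary):

```python
import math

def pursue(chaser, prey, prey_vel, max_speed=1.0, lookahead=10.0):
    # Classic "pursuit" steering: aim at the prey's predicted position,
    # then return a velocity of max_speed in that direction
    tx = prey[0] + prey_vel[0] * lookahead
    ty = prey[1] + prey_vel[1] * lookahead
    dx, dy = tx - chaser[0], ty - chaser[1]
    dist = math.hypot(dx, dy) or 1.0
    return (dx / dist * max_speed, dy / dist * max_speed)

# Prey at (10, 0) moving straight up: the chaser cuts the corner,
# heading diagonally rather than straight at the prey
v = pursue((0.0, 0.0), (10.0, 0.0), (0.0, 1.0))
```

Evade is the same calculation with the resulting vector negated, which is how the players can all flee whoever is ‘it’.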


Catches are pretty frequent when the arrows bounce off the walls, as the predator tends to be behind the prey as they turn. Getting rid of the walls shows a more interesting pattern as the prey swarm together.

Things still have a tendency to get stuck on the edges.

May have a go at skinning it with some frantic chickens running aboot after each other.  Then onto the chapters on isometric rendering.  Whoop.

Code available here: catch.

Drops of the world.

Posted: January 3rd, 2010 | Filed under: General | Comments Off

Expanding on yesterday’s idea, I’ve just put together a Processing sketch which takes a ‘panoid’ (extracted from Street View links) and creates a printable net of the location.

To create the net, I modelled the drop in Blender, and used its handy ‘unfold’ python script (Scripts > Mesh > Unfold). This creates an SVG of the flattened net, which I imported into Illustrator to change line weights and add tabs for gluing everything together.


Streetview tiles


Blender model




Unfolded net.


Added tabs.

The processing sketch takes a ‘panoid’, which can be extracted from the link to a google street view.  It then extracts the image tiles, and squashes them to fit the net, before saving it out as a jpg.  At the moment, it’s making pretty small files (water droplets of the world), which may be modified in future to create higher-res, larger nets.
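The tile-fetching side boils down to building one URL per tile from the panoid. A sketch in Python — note the cbk endpoint and its parameters are an assumption based on how the unofficial Street View tile service behaved at the time, not a documented API:

```python
def tile_url(panoid, x, y, zoom=1):
    # Unofficial Street View tile endpoint (undocumented; an assumption
    # from observed behaviour circa 2010 -- subject to change/breakage).
    # Higher zoom levels give more, smaller tiles per panorama.
    return ("http://cbk0.google.com/cbk?output=tile"
            "&panoid=%s&zoom=%d&x=%d&y=%d" % (panoid, zoom, x, y))

# "PANOID_HERE" is a placeholder for a real panoid pulled from a link
urls = [tile_url("PANOID_HERE", x, y) for y in range(1) for x in range(2)]
```

The sketch then stitches the downloaded tiles, squashes the result onto the net geometry, and writes the jpg.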

It’d also be nice to have this accept an address, and do the geocoding to find the nearest Street View point. Looks like this should be possible with the services Google is now offering (getNearestPanoramaLatLng()). With this in place, it could all be wrapped nicely in a user-friendly website.

There are definitely still some problems with the mapping of images to the net that will be resolved when making a higher resolution version, but it’ll be interesting to see what these look like once printed and assembled.

The processing files can be downloaded here: dotw.