
College projects

Cellular automata explorations, etc

Side-by-side comparison of source input image and transformed output

Various sketches, some implemented with Processing and some with openFrameworks, that use cellular automata and particle systems to transform static source images into generative artworks.

Side-by-side comparisons of source input image and transformed output (examples 1–3)
Three different progression stages based on the same source image. Output pixels are determined by the state of a cellular automaton, essentially a variation of Conway's Game of Life.
Abstract landscape based on Game of Life
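
The gist, as a minimal Processing sketch (written fresh for this page, not pulled from the originals; the filename "source.jpg", the cell size, and the brightness thresholds are placeholders): a Game of Life grid is seeded from the source image's brightness, and live cells reveal the color of the corresponding source pixel while dead cells stay dark.

```processing
// Minimal sketch of the idea: a Game of Life seeded from a source image,
// with live cells drawn using the source pixel's color.
PImage src;
int[][] grid, next;
int cell = 4;  // size of one cell in pixels (placeholder value)
int cols, rows;

void setup() {
  size(800, 600);
  src = loadImage("source.jpg");  // placeholder filename
  src.resize(width, height);
  cols = width / cell;
  rows = height / cell;
  grid = new int[cols][rows];
  next = new int[cols][rows];
  // Seed: dark areas of the source start as live cells.
  for (int x = 0; x < cols; x++) {
    for (int y = 0; y < rows; y++) {
      float b = brightness(src.get(x * cell, y * cell));
      grid[x][y] = (b < 128) ? 1 : 0;
    }
  }
  noStroke();
}

void draw() {
  background(0);
  // Standard Game of Life step with wrapping edges.
  for (int x = 0; x < cols; x++) {
    for (int y = 0; y < rows; y++) {
      int n = 0;
      for (int i = -1; i <= 1; i++) {
        for (int j = -1; j <= 1; j++) {
          if (i == 0 && j == 0) continue;
          n += grid[(x + i + cols) % cols][(y + j + rows) % rows];
        }
      }
      if (grid[x][y] == 1) next[x][y] = (n == 2 || n == 3) ? 1 : 0;
      else                 next[x][y] = (n == 3) ? 1 : 0;
    }
  }
  int[][] tmp = grid; grid = next; next = tmp;
  // Live cells take their color from the corresponding source pixel.
  for (int x = 0; x < cols; x++) {
    for (int y = 0; y < rows; y++) {
      if (grid[x][y] == 1) {
        fill(src.get(x * cell, y * cell));
        rect(x * cell, y * cell, cell, cell);
      }
    }
  }
}
```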
Stencil composition 1 Stencil composition 2
In other explorations, the source image is used as a stencil, with cellular automata and particle systems layered underneath; a rough sketch of the idea follows the images below.
Stencil composition 3
Stencil composition 4
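
Roughly how the stencil approach can work, again as a fresh sketch rather than the original code (the filename and threshold are placeholders): a particle system paints underneath, and the source image sits on top with its bright regions masked out to transparency, so the generative layer only shows through where the image is light.

```processing
// Sketch of the stencil idea: wandering particles underneath, the source
// image drawn on top with bright areas made transparent via a mask.
PImage src, stencilMask;
ArrayList<PVector> particles = new ArrayList<PVector>();

void setup() {
  size(800, 600);
  src = loadImage("source.jpg");  // placeholder filename
  src.resize(width, height);
  // Build a mask: white (opaque) where the source is dark, black (clear) where bright.
  stencilMask = createImage(width, height, RGB);
  src.loadPixels();
  stencilMask.loadPixels();
  for (int i = 0; i < src.pixels.length; i++) {
    float b = brightness(src.pixels[i]);
    stencilMask.pixels[i] = color(b < 128 ? 255 : 0);
  }
  stencilMask.updatePixels();
  src.mask(stencilMask);
  // A handful of drifting particles to sit under the stencil.
  for (int i = 0; i < 500; i++) {
    particles.add(new PVector(random(width), random(height)));
  }
  background(0);
}

void draw() {
  noStroke();
  // Underlay: particles do a random walk and accumulate colored trails.
  for (PVector p : particles) {
    p.add(random(-2, 2), random(-2, 2));
    p.x = (p.x + width) % width;
    p.y = (p.y + height) % height;
    fill(random(255), random(255), random(255), 40);
    ellipse(p.x, p.y, 3, 3);
  }
  // Overlay the masked source image as the stencil.
  image(src, 0, 0);
}
```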
Webcam frame self-portrait composition
Self-portrait using a webcam frame as the source image.
Head composition variation 1
Head composition variation 2
Head composition variation 3
Head composition variation 4
Head composition variation 5
Head composition variation 6
Head composition variation 7
Head composition variation 8
Head composition variation 9

WildGarden

Digital painting made with WildGarden

Senior thesis project, implemented using Processing and the oscP5 library.

Archived source code here.

The original project description I wrote (preserved from the thesis project abstract):

WildGarden enables participants to create visual compositions through an exploratory process of invoking and manipulating algorithmically generated elements that simulate paint strokes. WildGarden's digital paint strokes employ particle algorithms such as flocking and path following. While the paint strokes can move and develop independent of user intervention, a participant has the power to use a Nintendo Wii controller to expressively direct how their compositions evolve in real time by affecting core parameters of the paint strokes. Each paint stroke can be modified to affect its colors and size, as well as the behavioral parameters that determine how fast and to what degree of chaos it moves.

WildGarden additionally provides utilities for customizing color palettes and saving and exporting high-res screenshots of compositions.

WildGarden is implemented in Java using the open source Processing library. It can be used as both a standalone application (currently only being tested on OSX) and a toolkit to be applied in other projects.
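
For a sense of the structure, here is a heavily simplified sketch, not the actual WildGarden code: plain Perlin-noise wandering stands in for the flocking and path-following behaviors, and the OSC address "/wii/accel", the port number, and the parameter mapping are illustrative guesses (assuming the Wii Remote's data reaches Processing as OSC messages via a bridge utility). The real project set color, size, speed, and chaos per stroke rather than through two global values; the globals here are just to keep the sketch short.

```processing
// Much-simplified sketch: simulated paint strokes whose speed and "chaos"
// are steered in real time by incoming OSC messages (e.g. from a Wii Remote bridge).
import oscP5.*;
import netP5.*;

OscP5 osc;
ArrayList<Stroke> strokes = new ArrayList<Stroke>();
float chaos = 0.5;   // how erratically strokes wander
float speed = 2.0;   // how fast strokes move

void setup() {
  size(1024, 768);
  background(255);
  osc = new OscP5(this, 9000);  // listen for controller data (port is a placeholder)
  for (int i = 0; i < 20; i++) strokes.add(new Stroke());
}

void draw() {
  for (Stroke s : strokes) {
    s.update();
    s.display();
  }
}

// Map incoming controller values onto stroke behavior in real time.
// The address pattern and argument layout are assumptions, not WildGarden's actual protocol.
void oscEvent(OscMessage msg) {
  if (msg.checkAddrPattern("/wii/accel")) {
    chaos = map(msg.get(0).floatValue(), 0, 1, 0.1, 3.0);
    speed = map(msg.get(1).floatValue(), 0, 1, 0.5, 6.0);
  }
}

// One simulated paint stroke: a point that wanders with Perlin noise,
// leaving a colored trail whose size and hue are per-stroke parameters.
class Stroke {
  PVector pos = new PVector(random(width), random(height));
  float heading = random(TWO_PI);
  float noiseOff = random(1000);
  float size = random(4, 16);
  color c = color(random(255), random(255), random(255), 60);

  void update() {
    // "Chaos" controls how strongly the heading drifts each frame.
    heading += (noise(noiseOff) - 0.5) * chaos;
    noiseOff += 0.01;
    pos.add(cos(heading) * speed, sin(heading) * speed);
    // Wrap around the canvas edges.
    pos.x = (pos.x + width) % width;
    pos.y = (pos.y + height) % height;
  }

  void display() {
    noStroke();
    fill(c);
    ellipse(pos.x, pos.y, size, size);
  }
}
```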

Sketch of early controller concept
Early concept sketch for WildGarden's input controller, which would have been a custom-made scepter emitting infrared light to signal its position. The custom scepter was eventually abandoned in favor of a Nintendo Wii Remote, which has infrared pointing built in.
Home menu screen screenshot
Home menu screen
Configure colors screen screenshot
Configure colors screen
Composition pause screen screenshot
Composition pause screen

Various composition stills created with WildGarden

Composition 1
Composition 2
Composition 3
Composition 4
Composition 5
Composition 6
Composition 7
Composition 8

A footnote

In retrospect, I didn't do a great job documenting and preserving much of my work in college. Notably, most of my programming projects from this time are interactive, and the visuals are animated or otherwise dynamic. The artifacts I have are mostly still images of the visuals these projects produced and, in the case of my thesis project, the original source code (which is painful to look at now).