Painting, data and music

I felt like I was in over my head taking this traditional painting course at OCAD. Fortunately, our instructor has a more open definition of painting.

I’d spent a lot of time learning a Mendelssohn song on the piano, practicing it again and again to focus on each layer of information in the sheet music (notes, volume, pedal, etc.). I thought it would be simple to translate the sheet music into a data set in Excel. It turned out to be more complicated. How do you put right-hand and left-hand notes in the same table when there are different numbers of each? How do you show markings that refer to groups of notes, or that occur in the spaces between notes?

In Processing, I took data from:

  • pedal markings
  • note pitch
  • note length
  • volume (e.g., mp, p, f)

and turned them into:

  • line continuity
  • line curve
  • line length
  • line thickness and colour
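
As a rough sketch of what this kind of mapping can look like in code (plain Java rather than a full Processing sketch, and with markings and scale factors that are purely illustrative, not the ones I actually used):

```java
// A toy version of the note-to-line mapping described above.
// The dynamic markings and scale factors are illustrative only.
public class NoteMapping {

    // Map a dynamic marking (pp..ff) to a stroke weight in pixels.
    static float thicknessFor(String dynamic) {
        switch (dynamic) {
            case "pp": return 1.0f;
            case "p":  return 2.0f;
            case "mp": return 3.0f;
            case "mf": return 4.0f;
            case "f":  return 5.0f;
            case "ff": return 6.0f;
            default:   return 3.0f; // unknown marking: assume a middle value
        }
    }

    // Note length in beats scales directly to segment length in pixels.
    static float segmentLengthFor(float beats) {
        return beats * 40.0f; // 40 px per beat, an arbitrary choice
    }
}
```

Pedal markings can similarly toggle a boolean that decides whether consecutive segments connect or break the line.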

[screenshot]

(An early test)

Instead of creating a line graph to represent the music, the program “paints” a line one segment at a time, and the line curves and twists according to the relationship between each note and its predecessors. To me, this method of showing the data is more representative of the way we listen to music. Most of us don’t have perfect pitch, so we use the relationship between one note and the next to understand what’s being played. We pay attention to this relative pitch as much as we pay attention to absolute pitch.
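
One simple way to encode relative pitch like this: treat each interval between consecutive notes as a change in the line’s heading, so stepwise motion curves the line gently and a big leap bends it sharply. A minimal sketch of that idea (the radians-per-semitone factor is an invented placeholder):

```java
public class RelativePitchPath {
    // Turn a sequence of MIDI pitches into a sequence of line headings
    // (in radians). Each interval bends the line in proportion to its
    // size; a repeated note keeps the line straight.
    static double[] headings(int[] pitches, double radiansPerSemitone) {
        double[] h = new double[pitches.length];
        if (pitches.length == 0) return h;
        double heading = 0.0;
        h[0] = heading;
        for (int i = 1; i < pitches.length; i++) {
            heading += (pitches[i] - pitches[i - 1]) * radiansPerSemitone;
            h[i] = heading;
        }
        return h;
    }
}
```

Each segment’s endpoint then follows from the heading, e.g. x += len * cos(h[i]) and y += len * sin(h[i]).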

[screenshot]

This image is generated from one line of music, divided into the melody and the harmony. The shapes remind me a little of photographs of microscopic organisms.

[screenshot]

I adjusted the code to include more accurate information. In both images, you can see that when the line curves very smoothly, the notes are progressively going up or down. Repeating patterns in the music can be seen through patterns in the line.

[screenshot]

The chunky line above on the left is something I generated after having figured out how to choose notes that would create interesting shapes in the line. It does not use data from a real piece of music.

It may be interesting to use a keyboard and generate these graphics in real time to see how someone might improvise when they have this added visual feedback.


Data Visualization

I worked a lot with data visualization a while back. It has been really fun to work with big data sets and to try to make them understandable. Here are some images of a visualization project I made in Processing with Sally Luc.

Sally was the master coder and UI designer behind the practical side of the program. Users can organize movie data in ways that are most relevant to their interests, and everything is presented cleanly.

[screenshot]

I was more involved with the weirder side of the program, where popular movies are visualized by the dominant colour of their poster. You can see how this changes throughout the years by basically drawing the data onto the screen… or you can just calm yourself down by making some pretty circle art. I particularly like the purple trend around 2008.
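
“Dominant colour” can be computed in a few different ways; one crude version is just averaging the poster’s pixels. A sketch of that simple approach, working on packed 0xRRGGBB ints (the format Processing’s pixels[] array uses, minus the alpha byte):

```java
public class DominantColour {
    // One crude notion of "dominant colour": the per-channel average
    // of all pixels in the image, packed as 0xRRGGBB ints.
    static int averageColour(int[] pixels) {
        long r = 0, g = 0, b = 0;
        for (int p : pixels) {
            r += (p >> 16) & 0xFF;  // red byte
            g += (p >> 8) & 0xFF;   // green byte
            b += p & 0xFF;          // blue byte
        }
        int n = pixels.length;
        return (int) (r / n) << 16 | (int) (g / n) << 8 | (int) (b / n);
    }
}
```

A fancier take would bucket pixels into a histogram and pick the fullest bucket, which holds up better for posters with strong contrasting regions.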

[screenshot]

The program still needs a bit of fine-tuning (e.g., the glitch on the left), but people enjoyed playing with it.

Playing with APIs and sentence structure

I was going through my files when I found a project I had been playing around with early last school year.

[screenshot]

[screenshot]

The program, created in Processing, takes famous quotes from a website, breaks them down into words and allows you to recombine them into “misquotes”. I like games that play with words. This program is my first step into developing an idea about a co-operative game where players sabotage each other’s attempts at communication.
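
The recombination step can be as simple as pooling all the words and sampling them at random. A minimal sketch of that part (the quote-fetching is omitted, and the example quotes in the test are placeholders):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import java.util.Random;

public class Misquoter {
    // Pool the words from several quotes, then draw n of them at random
    // to assemble a nonsense "misquote".
    static String misquote(List<String> quotes, int n, long seed) {
        List<String> pool = new ArrayList<>();
        for (String q : quotes) {
            pool.addAll(Arrays.asList(q.split("\\s+")));
        }
        Random rng = new Random(seed);
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < n; i++) {
            if (i > 0) sb.append(' ');
            sb.append(pool.get(rng.nextInt(pool.size())));
        }
        return sb.toString();
    }
}
```

The interactive version lets the user pick the words instead of the random number generator, which is where the fun actually is.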

Full Feather

[screenshot]

I’ve been working on a game in Unity for what has probably been my longest project at university. Since my semester has been so busy, I had to squeeze working periods into the few free spaces in my schedule. The process of making the demo was really stressful, but I made sure to go with an idea I really cared about, and I’m excited to continue working on it past the stage of it being an assignment.

[screenshot]

I worked mostly alone on this one, which gave me a lot of flexibility. It also meant I would have to learn a lot of things I had never done seriously before. This included coding in Unity, designing repeating patterns (patterns are an integral part of the game) and 3D modelling/rigging/animation. The game has a long way to go. It has been a great learning experience and I have a lot of ideas of how it can grow. You can check out the development blog here.


Escape Darkness – Video Game


In Escape Darkness, fight to stave off the rapid darkening of the sky, follow a path created by sound and blend beautiful colours.

[screenshot]


Escape Darkness is a Java/Processing-based laptop game where players use a cursor to dodge various dangers and touch objectives. Bars continuously move down the screen in pairs, and their length is generated from microphone sound input.

The player must move between the bars or the game background will get darker, to the point where the game ends. The game will also end if the player hits one too many black dots. These cause the “walls” around the player to slowly close in, making the path they must navigate twistier.

[screenshot]

The objective for the player is to replicate a colour, which is shown to them in a circle in the corner of the screen. To do so, they must touch the moving dots of red, green and blue, which will add a little bit of the corresponding RGB value to a palette on the screen.
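
In code, each dot hit can be a small clamped addition to one channel of the palette. A sketch of that mixing step (the step size here is an invented example, not the value from the game):

```java
public class PaletteMix {
    // Add a fixed step to one RGB channel of the palette when the
    // matching dot is hit, clamping at 255. Channel is 'r', 'g', or 'b'.
    static int[] hitDot(int[] palette, char channel, int step) {
        int[] p = palette.clone();
        int i = (channel == 'r') ? 0 : (channel == 'g') ? 1 : 2;
        p[i] = Math.min(255, p[i] + step);
        return p;
    }
}
```

Winning then comes down to comparing the mixed palette against the target colour within some tolerance.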

[screenshot]

The path between the walls (which the player follows) responds to the sound environment the laptop is exposed to. The game can be played alongside music, or even while the player is talking at the computer. Depending on the smoothness of pitch transitions, the path will be easier or harder to follow.

[screenshot]

Process Work/Future Versions

We started the project wanting to use the sensor card and the element of sound. This was probably the biggest challenge of making the game, because getting a sound input to interact meaningfully with the game was quite difficult.

We tried different microphone input feeds through a Processing library called Minim. One thing that helped smooth the signal was mixing some of the previous signal values into each new one before translating it into the horizontal positions of the walls in the game.
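
That mixing is essentially exponential smoothing. A sketch of the idea on a plain array (the mix factor is illustrative, and the real code worked on Minim’s live audio buffer rather than a precomputed array):

```java
public class Smoother {
    // Exponential smoothing: mix a fraction of the previously smoothed
    // value into each new sample, so spikes in the microphone level are
    // damped before they become wall positions.
    static float[] smooth(float[] samples, float mix) {
        float[] out = new float[samples.length];
        float prev = samples.length > 0 ? samples[0] : 0f;
        for (int i = 0; i < samples.length; i++) {
            prev = mix * prev + (1 - mix) * samples[i];
            out[i] = prev;
        }
        return out;
    }
}
```

A higher mix factor gives a calmer path at the cost of responsiveness to the music.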

[screenshot]

The difficulty of the game varied greatly depending on what kind of sound the laptop microphone was exposed to. For example, in a quiet environment with a single instrument song playing out of the laptop speakers, the path created in the game would be curvy. In a noisy environment, or with pop music playing, the path created in the game would be jagged and sometimes impossible to follow.

During play-testing sessions, we brought the laptop out into the hall and the game quality improved significantly. In a future version of the game, an external microphone could be plugged into the laptop. A more sophisticated smoothing algorithm could also be performed on the sound input.

[screenshot]

To balance the fact that hitting walls was inevitable, we added little coins the player could hit to counteract the rapid darkening of the background (which would ultimately end the game).
[screenshot]

Both of us were making our first game in Processing. To tackle this, we started with a simple game and then tweaked and built onto it until it had more interesting mechanics and aesthetics. This kind of exploration gave us a lot of ideas for future games, especially ways we could incorporate sound as more than just a game output. In future versions, it would be interesting to use sound input in the generation of a dynamic game environment, in a way the player can understand and possibly even control.


During play testing, players found it difficult to get into the game at first. Yifat suggested that the game start more slowly and then increase in speed over time. This is something that could be easily implemented in a future version of the game so that players could have a learning session without being bombarded with “Game Over” screens. It would also make the game more interesting for players who have become accustomed to how the game works and who have gained skill in playing it.
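
Yifat’s suggestion is easy to express as a speed that ramps with elapsed time up to a cap. A sketch (all the numbers below are placeholders, not tuned values):

```java
public class SpeedRamp {
    // Bar speed starts slow and grows linearly with elapsed time,
    // capped at a maximum so the game stays playable.
    static float speedAt(float seconds, float startSpeed,
                         float gainPerSecond, float maxSpeed) {
        return Math.min(maxSpeed, startSpeed + gainPerSecond * seconds);
    }
}
```

The same ramp could also drive how quickly the background darkens, so both pressures scale together.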

[screenshot]

[screenshot]

Another comment was that the overlaid display (that showed the player what colour they needed to mix and what colour they had mixed) was causing the player to have to dart their eyes back and forth between the cursor and the corner of the screen. A solution to this would be to constantly move the display up to the vertical location of the cursor. Another would be to make the cursor a little bigger and actually display the colour that is being mixed on the cursor.


Team Information

The game was created by Karina Kurmanbayeva and Sophia Niergarth. Many thanks go to the play testers from Thursday’s class.

Climate Change

For my final coding project of the semester, I wanted to write something that would address the issue of climate change but still be fun to use.

The final prototype is basically a program that enhances a physical hexagonal tile game. Players move around a future “earth”, investing in houses that they hope won’t get destroyed by natural disasters. The program takes information from a weather database on the internet and creates a very simplistic simulation of future weather conditions, which influences the game difficulty.
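
As a toy illustration of how a weather-derived number could drive difficulty, the chance of a natural disaster each turn could scale with a temperature anomaly. The linear model and every number here are invented for illustration; the prototype’s actual simulation was much more ad hoc:

```java
public class DisasterOdds {
    // A toy difficulty model: the chance of a natural disaster each turn
    // rises linearly with the temperature anomaly (in degrees C),
    // clamped to the range [base, 1.0].
    static double disasterChance(double base, double anomalyC, double perDegree) {
        double p = base + anomalyC * perDegree;
        return Math.min(1.0, Math.max(base, p));
    }
}
```

The program rolls against this chance each turn to decide whether a tile’s house is destroyed.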



Digital element

[screenshot]

Physical element

[screenshot]