SoniCity

SoniCity is an abstraction of people's movements through space, an abstraction through sound. It traces pathways and velocities and transforms that data into sonic reverberations. The result is a map that unfolds through time. By tracing people in this way, the map reflects the complexity of human travel as experienced through flow.

SoniCity revolves around recording GPS tracks: the data is brought into a Pure Data patch, parsed, and the individual GPS attributes (e.g. latitude, longitude) are then used to drive the sonic engine.
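
The parsing itself happens inside a Pure Data patch, but as a rough sketch of that step (the sample sentence, class and method names are ours, purely for illustration), the following Java code pulls latitude, longitude, and altitude out of a single NMEA $GPGGA sentence and converts the coordinates to decimal degrees:

    // Sketch only: extract position attributes from one NMEA $GPGGA sentence.
    // The field layout is standard NMEA 0183; everything else is illustrative.
    public class GgaParser {

        // Convert NMEA's ddmm.mmmm / dddmm.mmmm notation to decimal degrees.
        static double toDecimalDegrees(String value, String hemisphere, int degreeDigits) {
            double degrees = Double.parseDouble(value.substring(0, degreeDigits));
            double minutes = Double.parseDouble(value.substring(degreeDigits));
            double decimal = degrees + minutes / 60.0;
            return (hemisphere.equals("S") || hemisphere.equals("W")) ? -decimal : decimal;
        }

        public static void main(String[] args) {
            // Example GGA sentence (illustrative values only).
            String sentence = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47";
            String[] f = sentence.split(",");

            if (f[0].equals("$GPGGA") && !f[2].isEmpty()) {
                double latitude  = toDecimalDegrees(f[2], f[3], 2); // ddmm.mmmm
                double longitude = toDecimalDegrees(f[4], f[5], 3); // dddmm.mmmm
                double altitude  = Double.parseDouble(f[9]);        // metres above sea level

                System.out.println(latitude + " " + longitude + " " + altitude);
            }
        }
    }

In the patch itself these values feed the synthesis objects rather than a print statement.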

SoniCity allows us to think about public space and mobility in novel ways, creating a playful relationship to our journeys and their audible results. We are not attempting to produce an objective cartography. By drawing attention to our patterns, it becomes possible to imagine alternatives, in shared spaces and in sound. In this way, SoniCity becomes a tool for the dérive, i.e. psychogeographical exploration, and also makes public space an instrument with which to play.

****************

project history

In November of 2008, Jesse Scott attended the 'Almost Perfect' locative media residency at the Banff New Media Institute, where he developed the initial technical engine of the project. This consisted of (1) a Java applet running on Symbian phones that streamed NMEA data live to a server, (2) a server-side PHP script that parsed the NMEA data and forwarded it to an IP address, and (3) a Pure Data patch that received the stream and further parsed it into individual GPS attributes such as latitude, longitude, and altitude.
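
The relay in that setup was a PHP script; purely as a sketch of the forwarding idea, written in Java rather than PHP and using a placeholder host and port, the snippet below reads NMEA sentences from standard input and sends each one as a UDP datagram to an address where the Pure Data side (for example a netreceive object) could listen:

    // Sketch only: forward NMEA sentences, one per UDP datagram, to a fixed
    // host and port (both placeholders). The exact framing the receiving
    // patch expects is up to the Pure Data side.
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.net.InetAddress;
    import java.nio.charset.StandardCharsets;

    public class NmeaForwarder {
        public static void main(String[] args) throws Exception {
            InetAddress target = InetAddress.getByName("192.0.2.10"); // placeholder address
            int port = 9000;                                          // placeholder port
            DatagramSocket socket = new DatagramSocket();
            BufferedReader in = new BufferedReader(new InputStreamReader(System.in));

            String line;
            while ((line = in.readLine()) != null) {
                if (line.startsWith("$")) { // forward NMEA sentences only
                    byte[] payload = (line + "\n").getBytes(StandardCharsets.US_ASCII);
                    socket.send(new DatagramPacket(payload, payload.length, target, port));
                }
            }
            socket.close();
        }
    }

In this sketch UDP keeps the stream lightweight and tolerant of dropped packets, which suits a once-per-second position feed.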

In February of 2009, Jesse was awarded a self-directed residency at the BNMI and used this time to reinvestigate the aesthetic, technological, and design concepts surrounding the work. During this residency, basic sonicization and visualization were driven by the GPS data, and a version of the Pure Data program was built to read 'offline' NMEA data from a text file.
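
The file-reading half of that 'offline' mode is simple enough to sketch outside of Pure Data; the snippet below (with a hypothetical filename and a crude one-second delay to mimic a typical 1 Hz GPS stream) steps through a logged NMEA file and keeps only the $GPGGA fix sentences:

    // Sketch only: play back a logged NMEA text file (hypothetical filename),
    // one sentence per line, keeping the position fixes.
    import java.io.BufferedReader;
    import java.io.FileReader;

    public class OfflineNmeaReader {
        public static void main(String[] args) throws Exception {
            BufferedReader reader = new BufferedReader(new FileReader("walk.nmea.txt"));
            String line;
            while ((line = reader.readLine()) != null) {
                if (line.startsWith("$GPGGA")) {   // keep only the position fixes
                    System.out.println(line);      // hand the fix on to the parsing stage
                    Thread.sleep(1000);            // crude pacing to mimic a 1 Hz live stream
                }
            }
            reader.close();
        }
    }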

Research continued at Daimon's Artist in Residency Program (Ottawa, 2013). The technical research involves transcoding different GPS data formats into text files readable in Pure Data, building the sonicization engine in Pure Data, constructing a user interface that allows live rewiring of the GPS data to sound, and programming a Processing application for Android phones to record raw NMEA data.
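
As an illustration of the transcoding step, the sketch below uses Java's built-in XML parser to pull latitude, longitude, and elevation out of a GPX file's track points and writes one space-separated, semicolon-terminated line per point, a plain-text form that a Pure Data object such as textfile can step through (the filename and the exact output format are our assumptions, not a description of the finished tool):

    // Sketch only: transcode GPX track points into plain text lines.
    // GPX stores points as <trkpt lat=".." lon=".."><ele>..</ele></trkpt>.
    import java.io.File;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Document;
    import org.w3c.dom.Element;
    import org.w3c.dom.NodeList;

    public class GpxToText {
        public static void main(String[] args) throws Exception {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder()
                    .parse(new File("track.gpx")); // hypothetical input file

            NodeList points = doc.getElementsByTagName("trkpt");
            for (int i = 0; i < points.getLength(); i++) {
                Element pt = (Element) points.item(i);
                String lat = pt.getAttribute("lat");
                String lon = pt.getAttribute("lon");
                NodeList ele = pt.getElementsByTagName("ele");
                String alt = (ele.getLength() > 0) ? ele.item(0).getTextContent() : "0";

                // One space-separated, semicolon-terminated line per point.
                System.out.println(lat + " " + lon + " " + alt + ";");
            }
        }
    }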

****************

The video below is a proof-of-concept showing a custom Processing application that prints real-time GPS data to the screen of an Android device.

SoniCity: Proof-of-Concept: GPS on Android w/ Processing from the memelab on Vimeo.
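
The application in the video is our own; as just one plausible way of doing the same thing, the sketch below displays the current position on screen in Processing's Android mode, assuming the third-party Ketai library for location events (the library choice and the callback signature are assumptions, not the code of the actual app; the sketch also needs the ACCESS_FINE_LOCATION permission enabled):

    // Sketch only: show the latest GPS fix on screen in Processing's Android
    // mode, assuming the Ketai library for location access.
    import ketai.sensors.*;

    KetaiLocation location;
    double latitude, longitude, altitude;

    void setup() {
      location = new KetaiLocation(this);
      textSize(36);
      fill(255);
    }

    void draw() {
      background(0);
      text("lat: " + latitude, 40, 100);
      text("lon: " + longitude, 40, 160);
      text("alt: " + altitude, 40, 220);
    }

    // Called by Ketai whenever a new fix arrives.
    void onLocationEvent(double _latitude, double _longitude, double _altitude) {
      latitude = _latitude;
      longitude = _longitude;
      altitude = _altitude;
    }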

****************

The following set of beta sonicizations was presented at The Cognitive Cities Conference (Berlin, 26-27 February 2011). These tracks are tests of our Pure Data sound engine for SoniCity. Each track is composed through generative sound synthesis, with frequencies derived from GPS data collected while walking routes that connect the conference's locations.

SoniCity by the memelab
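
The exact mappings behind these tracks are not documented here; as a hedged illustration of the general idea of deriving frequency from GPS data, the sketch below linearly maps latitude values within a walk's bounding range onto an audible band (the coordinate bounds and the 110-880 Hz range are illustrative assumptions, not the settings used for the Berlin tracks):

    // Sketch only: one conceivable mapping from a GPS attribute to a synthesis
    // frequency. Bounds and target band are illustrative assumptions.
    public class GpsToFrequency {

        // Linear map of value from [inMin, inMax] to [outMin, outMax].
        static double map(double value, double inMin, double inMax, double outMin, double outMax) {
            double t = (value - inMin) / (inMax - inMin);
            return outMin + t * (outMax - outMin);
        }

        public static void main(String[] args) {
            // Rough bounding box of a walk (illustrative values near central Berlin).
            double latMin = 52.50, latMax = 52.53;

            // Map each latitude sample into a 110 Hz - 880 Hz band.
            double[] latitudes = { 52.505, 52.512, 52.521, 52.529 };
            for (double lat : latitudes) {
                double freqHz = map(lat, latMin, latMax, 110.0, 880.0);
                System.out.printf("lat %.3f -> %.1f Hz%n", lat, freqHz);
            }
        }
    }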

****************

The following video shows the (non-real-time) parsing of a GPX track into Pure Data, with the resulting values used to drive sound synthesis.

SoniCity betatest from the memelab on Vimeo.