Assignment,Final Project,Submission — Dan Russo @ 3:34 am



Phase Preview from Dan Russo on Vimeo.


The concept for this project started as an exploration into the physicality of mathematics and geometry.  These are concepts that are largely used, explored, and taught in on-screen software environments.  The direct relationship these operations hold with computation makes them an excellent and effective pair.  The goal of this project, however, was to take these abstract and more complex processes and bring them into the real world with the same level of interactivity they have on screen.  By combining digital technology with transparent mechanics, exploration can happen in a very engaging and interactive way.





Phase works by mechanically carrying two perpendicular sine waves on top of each other.  When the waves are in sync (matched in frequency), the translated undulation in the weights below is eliminated.  When the two waves become asynchronous, the undulation becomes more vigorous.  Interaction with the piece is mediated by a simple interface that independently controls the frequency of each wave: when the sliders are matched, the undulation becomes still, and when they are varied, the undulation becomes apparent.





Related Work:

Reuben Margolin

Reuben Margolin : Kinetic sculpture from Andrew Bates on Vimeo.

Reuben Margolin’s work uses mechanical movement and mathematics to reveal complex phenomena in a fresh, accessible way.  The physicality of his installations was the inspiration for this project.  These works set the precedent for beautiful and revealing sculptures, but the distinguishing goal of Phase was to bring an aspect of play and interaction to this type of work.


Mathbox is a programming environment for creating math visualizations in WebGL.  It is an excellent tool for visualizing complex systems.  Phase seeks to take this visual way of explaining and learning and apply it to a physical, tactile experience.


Lessons Learned: (controlling steppers)

Stepper motors are a great way to power movement in a very controlled and accurate way.  However, they can be tricky to drive and work with.  Below is a simple board I made that can drive two stepper motors from one Arduino.  The code is easy to pick up via the AccelStepper library, so this board clears up a lot of the challenging hardware issues and lets you get up and running very quickly.

Things you need:

+Pololu stepper driver (check whether you need the low- or high-voltage variant; see your specific motor’s datasheet)

See Photo Below and Link for Wiring
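As a starting point, a minimal two-motor sketch using the AccelStepper library might look like the following.  The pin numbers are assumptions — use whichever pins your drivers’ STEP/DIR lines are actually wired to, and the speeds will depend on your motors and supply voltage.

```cpp
#include <AccelStepper.h>

// Pin assignments are assumptions: match them to your board's wiring.
AccelStepper stepperA(AccelStepper::DRIVER, 2, 3);  // STEP, DIR
AccelStepper stepperB(AccelStepper::DRIVER, 4, 5);  // STEP, DIR

void setup() {
  stepperA.setMaxSpeed(1000);  // steps per second
  stepperB.setMaxSpeed(1000);
  stepperA.setSpeed(400);      // constant-speed run, independent per motor
  stepperB.setSpeed(600);
}

void loop() {
  // runSpeed() issues at most one step per call, so it must be
  // called as often as possible -- avoid delay() in the loop.
  stepperA.runSpeed();
  stepperB.runSpeed();
}
```

With both motors stepped from the same loop, varying one motor’s `setSpeed()` value is what lets an interface like Phase’s sliders shift the two waves in and out of sync.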

Circles in Motion

Assignment,Final Project — Dan Russo @ 7:23 am

Circles in Motion from Dan Russo on Vimeo.

Precedent Analysis 2

Precedent Analysis,Reference,Uncategorized — Dan Russo @ 7:54 am


Carlo Ratti (2008)

MIT SENSEable City Lab


Double-Taker (Snout), Interactive Robot from Golan Levin on Vimeo. (2008)




J. Mayer H (2002)

“Ar Drone Parro-Tweet” by John Mars, Yeliz Karadayi, and Dan Russo

Assignment,Submission — Dan Russo @ 7:39 am

Parro-Tweet from Dan Russo on Vimeo.

Parro-Tweet utilizes the AR Drone hardware platform coupled with openFrameworks and the Twitter API. openFrameworks allows the drone to be controlled from multiple sources, including a remote laptop with a gaming controller or a Raspberry Pi with a custom control panel.  The drone can be used to seek out photos and instantly place them on a Twitter feed.

The biggest challenge with this hardware platform was the latency of the video feed outside of its designated mobile app.  This is most likely due to the compression the wifi connection uses.  Its proprietary nature made it difficult to find any documentation on how to fully utilize the system.  The latency made it nearly impossible to run any computer vision or navigation through openFrameworks.  However, all manual controls and the Twitter API are fully functional.


Raspberry Pi Open Frameworks Demo

Uncategorized — Dan Russo @ 11:56 am

RasPI GPIO Demo from Dan Russo on Vimeo.

This example demonstrates the use of physical input (GPIO pins) to activate an on-screen graphic response.  All of the computation is done on the Raspberry Pi, which uses openFrameworks to generate the graphics and access the digital read pins.  There are three momentary buttons that each activate a different geometry.  The buttons can be pressed separately or in various combinations to alter the graphic.
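The structure of a demo like this can be sketched as an openFrameworks app.  This is a stripped-down illustration, not the original code: it assumes wiringPi for GPIO access (one common route on the Pi at the time; the original may have used another), wiringPi pin numbering, and buttons wired so a press reads HIGH.

```cpp
#include "ofMain.h"
#include <wiringPi.h>  // assumed GPIO library; pin numbers below are assumptions

const int BUTTON_PINS[3] = {0, 1, 2};  // wiringPi numbering

class ofApp : public ofBaseApp {
public:
    bool pressed[3] = {false, false, false};

    void setup() override {
        wiringPiSetup();
        for (int pin : BUTTON_PINS) pinMode(pin, INPUT);
    }

    void update() override {
        // Poll the digital read pins once per frame.
        for (int i = 0; i < 3; ++i)
            pressed[i] = (digitalRead(BUTTON_PINS[i]) == HIGH);
    }

    void draw() override {
        ofBackground(0);
        // Each held button draws its own geometry; combinations overlap.
        if (pressed[0]) ofDrawCircle(200, 240, 80);
        if (pressed[1]) ofDrawRectangle(360, 160, 160, 160);
        if (pressed[2]) ofDrawTriangle(600, 320, 680, 160, 760, 320);
    }
};
```

Reading the pins in `update()` and drawing in `draw()` keeps the input polling and the graphics in step with openFrameworks’ frame loop.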

This was my first experience using the Raspberry Pi and openFrameworks, as well as my first programming environment built on true C++.  One of the most crucial things I learned while working through this demo was the process of writing and compiling C++.  Learning the role of the uncompiled file structure was a rewarding experience that allowed a closer look into how the language handles memory allocation.  The use of the Raspberry Pi’s GPIO pins was also a crucial part of the demo.  The Raspberry Pi’s GPIO interface differs from other microcontrollers in some ways, so dealing with a more involved process provided a better look into wiring switches and sensors to digital read inputs.




Assignment,Project01 — Dan Russo @ 5:02 am
[vimeo 105531168 w=500&h=280]

Breathe was created to explore the dialog between people and their built environment.  The installation gives walls the ability to subtly communicate through the mimicked act of respiration.  The respiration creates an active dialog that changes with the character of the space.  Breathe samples ambient sound levels and bases its rate of respiration on this data: high ambient sound levels lead to anxious tendencies in the wall panel, reflecting the living energy of a space.




Project01 Precedent Studies

Precedent Analysis,Project01,Uncategorized — Dan Russo @ 5:38 am


Date: July 2014

Project: Light House

Creator: SOFTlab

The purpose of this installation was to respond to real-time sound input from equipment and live performances.  It was SOFTlab’s intention to create an environment that visually displayed characteristics of the music being played at SONOS.  This project engages music in a unique way that goes beyond a visualizer.  The complexity of the patterns made by the lights and the depth of input make the installation very responsive.  The complexity of the display, and how it physically relates to the sound itself, is a great aspect to investigate and think about going forward.



Date: 2012 

Project: VERSUS.

Creator: David Letellier

The purpose of this installation was to explore sound as a constantly evolving system.  The artist wanted to create a space that started with one sound, then built the ambient sound of the space on top of each subsequent recording.  One side records the other, then plays it back for the opposing side.  This creates an ever-evolving loop that includes all active participants and the space itself.  The idea that the character of a space can be displayed and collected in a compressed auditory manner is very intriguing.  The idea of creating opposing components that build on each other makes it a valuable piece to study.




Date: August 2011

Project: Interactive Robot Painting Machine

Creator: Benjamin Grosser


This piece of interactive art turns sound into a visual painting.  It was the intention of the artist to expand the boundary of interaction between music and visual art.  The project uses genetic algorithms to process audio data into a painting style, which becomes an evolving guideline for the image output.  The robotic system uses a controlled paintbrush on canvas to generate the visuals.  This project displays a very unique way of processing audio input into a framework of guidelines.  Studying it may be helpful for clearly defining a relationship between an input and an output.

Dan Russo

Description — Tags: — Dan Russo @ 1:02 am

My educational background in architecture provided broad exposure to many interdisciplinary topics.  While studying architecture, my main focus developed around global culture and the impact technology has made on the built environment.  Both digital and hand fabrication techniques were employed to explore these interests in differing ways.  I also worked as a student web developer, building various tools for faculty and students on mobile platforms.

Both of these opportunities have brought my focus to investigating communication with dynamic environments, in which our experiences become energized two-way conversations involving objects and people alike.  I hope to fully explore environments that raise as many questions as they resolve.

A portfolio of my previous work can be found at


Below is a short video documenting the assembly of a full scale prototype for Reactive Space

[vimeo 104267680 w=500&h=280]


This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.
(c) 2021 Making Things Interactive | powered by WordPress with Barecity