Assignment, Precedent Analysis — ygk @ 1:55 am

“OpenControllers” by ECAL/Marc Dubois (2014)

OpenControllers ECAL/Marc Dubois from ECAL on Vimeo.

OpenControllers lets you use your everyday device as a controller of a different kind by placing it within specially designed artifacts that embed the device and make use of its capabilities, such as the gyroscope and the light sensor. This project takes your physical movements and interprets them. I think this can be done with a range of patterns, which gets into the next video…

“Tilt Brush” by Skillman & Hackett (2014)

This is a lot more similar to what I want to do, except that I am also interested in the painting becoming three-dimensional as you paint. Not that I intend on doing that exact thing; I just mean to draw a parallel. The gestures here look swift and intuitive if it’s real, and the interface is quite honestly very confusing, but maybe that’s because I’m not aware of the gestures it uses. Assuming it is real, I think this is the future of design through tangible interaction.

“3Doodler” by Pete and Max (2013)

This project does a lot of what I am hoping to accomplish: it makes it possible to prototype quickly and efficiently without having to plan ahead of time what will be made.

“3D Printed Blooming Flowers” by Mikaela Holmes (2014)

This kind of movement is exciting as a less mechanical way to actuate an artifact. I wonder how it could be applied at a more complex or responsive level.

I think these videos show the progression of my thoughts on the final project quite well. I want to be able to design something indirectly by using a tool, not unlike how surgeons now use robots to perform the actual surgery. I am struggling to gauge the scope I am capable of and to come up with a good project idea to make it happen. Something I had thought about is using my surface as a tool to communicate with a CNC robot, fabricating sketches drawn on it in real time.

Precedent Analysis 2

Precedent Analysis — pvirasat @ 11:21 pm

Art: “Quotidian Record” by Brian House (2012)

Quotidian Record is a custom-made vinyl record featuring Brian’s location-tracking data over a year. The data is mapped to harmonic relationships, compressing 365 days down to approximately 11 minutes. This project connects digital information to the physical world through music. I find it a unique and interesting way to visualize data that can be easily forgotten, and to make it tangible and memorable.



Industry: Fitbit

I am still deciding on the type of data I’d like to collect, which will also determine the tool I use. Fitbit is definitely at the top of the list. One idea is to track the amount of sleep I get each night and how it affects my activities the following day.


Academia: “Understanding Quantified-Selfers’ Practices in Collecting and Exploring Personal Data” by Eun Kyoung Choe, Nicole B. Lee, Bongshin Lee, Wanda Pratt, and Julie A. Kientz (2014)

link to paper

Even though everyone tracks something about themselves, this paper looks into the “extreme users”: quantified-selfers, who diligently collect many different kinds of data about themselves. Through Meetup groups, blogging, and other outlets, they have shared what they have found useful and the mistakes they recommend avoiding. This paper will help me narrow down the data I should be collecting by thinking about how the data would be beneficial or appropriate for this project.



Since I am looking to work with personal data collected over time, the most appropriate outlet would be publication on a few notable design blogs. However, my ultimate goal for this project is to learn a little bit more about myself by making this data permanent and tangible.


Augmented Windows

Precedent Analysis — John Mars @ 5:25 pm


In 2012, Samsung debuted a transparent LCD. Where typical LCDs use an artificial backlight to shine through their pixels, the transparent display uses the sun (and artificial edge-lighting at night). The display also includes a touchscreen component.

Samsung previewed their technology as a 1:1 desktop monitor replacement, but I see value in using the technology more inventively: as an augmented reality window, where the display builds upon what’s visible in the outside world.


Memo Akten (who, I just realized, authored the ofxARDrone library I used in the last project) created a series of videos for the launch of the Sony PlayStation Video Store. The videos use a live projection-mapping technique in which the content is mapped according to the perspective created by the camera’s position and angle in space.

If you look through a window at a nearby object, draw a circle around it, and then move your head, the circle is no longer in place. With head- or eye-tracking, the perspective projection could change with your changing viewpoint, so that the circle would always stay around the object.
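A quick sketch of that geometry, assuming the window is the plane z = 0 with the tracked eye in front of it (z > 0) and the object outside (z < 0); the function name and coordinate setup are my own illustration, not from any tracking system:

```python
def window_point(eye, obj, window_z=0.0):
    """Return the (x, y) point on the window plane where the line of
    sight from the eye to the object crosses it.  A circle drawn at
    this point stays around the object as the eye moves."""
    ex, ey, ez = eye
    ox, oy, oz = obj
    # Fraction of the way along the eye->object ray where z == window_z.
    t = (ez - window_z) / (ez - oz)
    return (ex + t * (ox - ex), ey + t * (oy - ey))

# Looking straight on, then stepping one unit to the right: the on-glass
# position has to shift to stay locked onto the same object.
print(window_point((0, 0, 1), (0, 0, -1)))  # (0.0, 0.0)
print(window_point((1, 0, 1), (0, 0, -1)))  # (0.5, 0.0)
```

Re-running this every frame with the tracked eye position is all the “circle stays in place” behavior requires for a single viewer.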


What happens if two (or more) people are looking out the window? A conventional display can’t show different images for each person. Alternatives to make that happen would be active- (or maybe even passive-, if you’re really good) shuttered glasses, as used with 3D TVs, or a lenticular display: one that uses lenticular film, the ridged plastic sheets familiar from 3D postcards.

This 1999 paper¹ by N. A. Dodgson et al., in addition to his 2011 workshop, discusses a way to create a glasses-free 3D display using lenticular arrays. In addition to its 3D uses, such a display could also be used for “two-view, head-tracked displays; and multi-view displays”. They’re not talking about displaying unique perspectives per viewer, but instead about delivering correct stereoscopic images to each eye of each viewer; my use should be much simpler.


I definitely want to try to get my project picked up by a media outlet — a technology/design blog, for example. A little boost of fame would be most appreciated. And, heck, if it turns out to be really cool, unique technology (which it might be — I’m not finding all that much in academia), maybe even a paper submission would be in order.

  1. N. A. Dodgson, J. R. Moore, S. R. Lang. 1999. “Multi-View Autostereoscopic 3D Display.” International Broadcasting Convention 99.

Project 2.01 – Oatcam

Uncategorized — rbrill @ 11:24 am

Making Things Interactive: Assignment 2 from Ryan B. on Vimeo.

For our second project, we were tasked with making a spy device using a Raspberry Pi. This was the first time Danita and I had worked with a Raspberry Pi.

We created a motion-sensing spy device using the Raspberry Pi, an oatmeal container, a PS3 Eye camera, and a passive infrared (PIR) sensor. We used the WiringPi mapping for the GPIO pins to take in data from the PIR sensor and camera. It took us a while to get consistent results with these pins until we realized that one of the pins we were using had a built-in pull-up resistor. Changing pins did the trick.
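The shape of that PIR-to-capture loop can be sketched in Python (the project itself used the WiringPi mapping; the pin number, cooldown, and helper names below are made up for illustration):

```python
import time

def should_capture(last_capture, now, cooldown=5.0):
    """Rate-limit captures: fire only if the cooldown has elapsed."""
    return last_capture is None or (now - last_capture) >= cooldown

def watch_pir(read_pin, capture, cooldown=5.0, poll=0.1, clock=time.monotonic):
    """Poll a PIR input; call capture() on motion, at most once per cooldown."""
    last = None
    while True:
        if read_pin() and should_capture(last, clock(), cooldown):
            capture()
            last = clock()
        time.sleep(poll)

# On the Pi itself (hypothetical pin number), the lesson we learned the hard
# way is to choose the pull explicitly rather than trust the pin's default:
#   import RPi.GPIO as GPIO
#   GPIO.setmode(GPIO.BCM)
#   GPIO.setup(17, GPIO.IN, pull_up_down=GPIO.PUD_DOWN)
#   watch_pir(lambda: GPIO.input(17), take_photo)
```

Separating the pure throttling logic from the hardware read also makes it easy to test off the Pi.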

We definitely pushed the Raspberry Pi’s processing power to its limits at times during development, but in the end we had a system that can consistently capture, store, and transfer photos of passers-by.

by Ryan and Danita

OpenFrameworks Awesomeness: TRON Legacy

Software, Technique — John Mars @ 9:47 pm

I remembered this and am not using it for my precedents, but I thought I’d share anyway (read his post, don’t just watch the above video):

Tron Legacy by Josh Nimoy

“Dizzy, The Deceitful” by Patt Vira & Epic Jefferson


Dizzy is a friendly-looking teddy bear who actually watches your every move and makes everything it sees public.

Making use of the Raspberry Pi’s GPIO, we hooked up a PIR sensor to trigger a webcam capture event and publish the image to Tumblr.



By far the most challenging part of this project was working with the API. Since it’s a lot easier to find examples of Raspberry Pi projects written in Python than in C++, we decided to use Python. For example, the Tumblr API page has example code for Python but not C++, and Adafruit has a great Python tutorial for hooking up a PIR sensor to the Pi’s GPIO.

pytumblr is Tumblr’s official API client for Python, but its instructions are unclear.
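For what it’s worth, the flow boils down to two pytumblr calls: constructing a `TumblrRestClient` with the four OAuth credentials, then calling `create_photo()`. In this sketch those two calls are real pytumblr entry points, while the blog name, tag list, and caption format are placeholders of mine:

```python
import time

def photo_post_args(image_path, ts):
    """Build keyword arguments for pytumblr's create_photo().
    Tag list and caption format are placeholder choices."""
    stamp = time.strftime('%Y-%m-%d %H:%M:%S', time.gmtime(ts))
    return {
        'state': 'published',
        'tags': ['dizzy', 'spycam'],
        'caption': 'Seen by Dizzy at %s' % stamp,
        'data': image_path,  # local path to the captured JPEG
    }

# With real credentials from a registered Tumblr app:
#   import pytumblr
#   client = pytumblr.TumblrRestClient(CONSUMER_KEY, CONSUMER_SECRET,
#                                      OAUTH_TOKEN, OAUTH_SECRET)
#   client.create_photo('your-blog-name',
#                       **photo_post_args(path, time.time()))
```

Keeping the post-building step as a pure function lets you check the payload without touching the network.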


Python-to-tumblr Tutorial


Source Code

Streaming live video with ffmpeg/ffplay

Software, Technique — John Mars @ 9:33 am