Planet Wars

Final Project, Submission — priyaganadas @ 1:53 am

The idea of the project is to render the fictional world of a novel through playing cards, so that the content becomes interactive.
Planet Wars is based on the science fiction novel “The Hitchhiker’s Guide to the Galaxy” by Douglas Adams. It consists of 28 cards, one for each planet described in the novel. Every card carries a keyword that captures the key character of the planet or of the people who live on it.

How to Play-
Two or more players shuffle and distribute the cards among themselves. Using the planet set in hand, each player builds a story of how their planet would defeat the opponent’s planet. The opponent then has to counter that argument. Whoever builds the better story wins.
How to see the fictional part and interact with the story-
The cards are placed below a tablet running a custom-made application.
Each card shows the artist’s rendering of its fictional world. The winner of the round gets to tap on the 3D render of the opponent’s planet and blast it.
Once a planet is blasted, it is dead: the card cannot be used anymore, and its augmented object is no longer shown.




I tried multiple things before arriving at the final project described above. It’s worth describing all the steps and what I learned along the way-

1. I started with the idea of sonifying an astronomy dataset. I looked at various datasets online and, within Pure Data, parsed a dataset of planet set and rise times for Pittsburgh over one week. After this step, I realised that I needed to move to Max/MSP to create better sound and gain more control over the data.
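The parsing step can also be sketched outside Pure Data. A minimal Python sketch, assuming a hypothetical CSV layout for the rise/set dataset (the real file's format may differ) and an illustrative time-to-pitch mapping:

```python
import csv
import io

# Hypothetical one-week rise/set dataset; the real file's columns may differ.
SAMPLE = """planet,rise,set
Mars,05:12,16:40
Venus,03:55,15:02
Jupiter,21:30,09:15
"""

def minutes(hhmm):
    """Convert 'HH:MM' to minutes after midnight."""
    h, m = hhmm.split(":")
    return int(h) * 60 + int(m)

def parse_rows(text):
    """Parse rows into (planet, rise_minutes, set_minutes) tuples."""
    reader = csv.DictReader(io.StringIO(text))
    return [(r["planet"], minutes(r["rise"]), minutes(r["set"])) for r in reader]

def to_pitch(rise_min, lo=36, hi=84):
    """Map a rise time (0-1439 min) onto a MIDI pitch range for sonification."""
    return lo + round((hi - lo) * rise_min / 1439)

rows = parse_rows(SAMPLE)
pitches = [to_pitch(rise) for _, rise, _ in rows]
```

Pure Data would do the same thing with a [textfile] or [coll] object feeding number streams into an oscillator.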

2. I started working with Max/MSP. I set up Max with Ableton and a Kinect so that the body gestures of a tracked human skeleton could drive the sound. This video demo shows the music generated as a person moves in front of the Kinect.
The music currently changes in relation to hand movement.

3. After this step, I dove deeper into finding the right dataset for sonification. I came across the following video and read more about the deep field image taken by the Hubble Space Telescope over ten days.

Every dot in this image is a galaxy.
Inspired by this image, I decided to recreate the dataset inside Unity. The aim was to build the deep field image as a world in Unity and use the Kinect to let a person move through the simulation.
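One way to scatter an image like this into a navigable 3D volume can be sketched in Python (Unity itself would do this in C#; the brightness-to-depth mapping below is an illustrative assumption, not the project's actual mapping):

```python
def image_to_points(image, brightness_cutoff=128, depth_scale=100.0):
    """Turn bright pixels of a grayscale image (list of rows of 0-255 values)
    into 3D point positions, one way to scatter deep-field galaxies into a
    volume. Depth is derived from brightness (brighter = closer), which is
    an illustrative assumption."""
    points = []
    for y, row in enumerate(image):
        for x, value in enumerate(row):
            if value >= brightness_cutoff:
                z = depth_scale * (255 - value) / 255.0
                points.append((float(x), float(y), z))
    return points
```

In Unity, each resulting point would become the position of an instantiated galaxy object (or a particle), and the Kinect input would drive the camera through the volume.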

Here is how it looks

(Screenshots of the deep field simulation in Unity, 2014-12-17.)

4. The feedback from the review was that the simulation wasn’t interactive enough and lacked user experience and immersion. We also visited the gallery space where the final exhibit would take place. All of this made me realise that I should use the skills I had learned up to that point to create something fun and playful. That’s when I thought of developing Planet Wars.

1. I learned to parse a dataset in Pure Data and work with sounds.
2. I got introduced to Max and Ableton and made my first project work on both platforms.
3. The technology used for the final project is the Unity 3D game engine.
Learnings- I learned to render 3D objects inside Unity and to add shaders and textures to objects. Creating the explosion animation/special effect required building particle systems. I learned how to make my own ‘prefab’, a reusable set of features applied to an object that can be repeated across other objects inside Unity. I also learned how to add gestural interactions, like tapping on virtual objects, to Android apps developed in Unity. Finally, I attached 3D sounds to the explosions so that the sound differs depending on whether the user is near or far from the virtual object.
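The near/far sound behaviour comes down to distance-based attenuation, which Unity exposes through AudioSource rolloff settings. A minimal Python sketch of a linear rolloff curve (the min/max distances here are made-up values):

```python
def linear_rolloff(distance, min_d=1.0, max_d=50.0):
    """Linear volume falloff: full volume inside min_d, silent beyond max_d.
    Mirrors the idea behind a linear rolloff curve on a 3D audio source."""
    if distance <= min_d:
        return 1.0
    if distance >= max_d:
        return 0.0
    return 1.0 - (distance - min_d) / (max_d - min_d)
```

Unity also offers a logarithmic rolloff, which falls off faster near the source and is usually the more natural-sounding default.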

References and related work
1. Astronomy Datasets- Astrostatistics, NASA dataset
2. SynapseKinect
3. Crater project
4. LHC sounds
5. Data sonification

From the feedback I received, I am planning to add more special effects during planet interactions, and more 3D objects than just the planets: the spaceship and the various artifacts used in the novel.

Update- project part2

Final Project,Technique — priyaganadas @ 4:17 am

I have been working on Max since last week. I have decided to use the Kinect to map the depth and movement of a person; when the person’s location intersects the 3D model of the dataset, sound will be generated.
This is a demo of the test setup with Kinect, Max and Ableton working together. The patch creates sound based on hand motion and distance (depth) from the sensor.
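The mapping the patch performs can be sketched as a single function. A hedged Python sketch, with made-up depth and note ranges (the actual Max patch mapping may differ):

```python
def depth_to_note(depth_mm, near=500, far=4000, low_note=48, high_note=72):
    """Map a Kinect depth reading (mm) onto a MIDI note range: closer hand,
    higher note. The ranges here are placeholder assumptions."""
    d = min(max(depth_mm, near), far)   # clamp to the sensor's useful range
    t = (far - d) / (far - near)        # 1.0 when near, 0.0 when far
    return low_note + round(t * (high_note - low_note))
```

In the real patch this would be a [scale] object between the skeleton-tracking input and the MIDI output to Ableton.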

The next step is to import a 3D model of the dataset into the patch, so that the interaction of person and dataset can result in music.


Uncategorized — priyaganadas @ 7:53 am

RPi-Windows troubleshooting

I recently got tangled in Raspberry Pi – Windows connectivity problems again, so I decided to solve them once and for all.
I have Windows 7 and am running all of the following steps as admin.

Here is the ultimate fix.

1. Download DHCP Server for Windows here-

2. Unzip and install dhcpserver.exe

3. Go to the properties of your Local Area Network connection and assign a static IP, for example- . Then enter a subnet mask, for example- .

4. Run dhcpwiz.exe from the downloaded folder.

5. Select Local Area Network. It should say ‘Disabled’. Hit Next.

6. Do not do anything on this screen, hit Next.

7. Insert a range here (highlighted in the image), for example 100-110.

8. Check the box “Overwrite previous file” and write the configuration file.
You should see “INI file written” after this step. Hit Next/Finish.

9. Now you should see status as “Running” here. If not, hit Admin.

10. That’s it, you are done. On the next page of the interface, enable the option “continue running as tray app”.

Now boot up your RPi; you should see the inet address as the one you assigned in the DHCP server.

Hope this helps future MTI folks with Windows machines.

Similar video here

Sonification of Astronomical Data

Uncategorized — priyaganadas @ 9:29 pm

I have been going through different types of astronomical datasets lately. One interesting direction is to sonify astronomical data, and this is what I have decided to pursue. The following project takes deep space field data captured by the Canada-France-Hawaii Telescope over four years. Based on the brightness, duration, and distance from Earth of the supernovae that were captured, piano notes are played to create a sonata.
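A toy version of such a mapping, in Python. The parameter-to-sound choices below (volume from brightness, pitch from distance, note length from duration) are illustrative assumptions; the actual project's mapping is not documented here:

```python
def supernova_note(brightness, duration_days, distance_mpc):
    """Map one supernova event to a (pitch, velocity, length) triple.
    All three mappings are illustrative assumptions:
      - farther events get lower MIDI pitches,
      - brighter events get higher MIDI velocity (volume),
      - longer-lasting events get longer notes."""
    pitch = 84 - min(int(distance_mpc // 100), 48)   # farther = lower note
    velocity = max(1, min(127, int(brightness)))     # brighter = louder
    length_beats = max(0.25, duration_days / 30.0)   # longer event = longer note
    return pitch, velocity, length_beats
```

Running every supernova in the catalog through a function like this, ordered by detection date, is essentially what turns the dataset into a score.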

An internet radio station, CRaTER, plays a live cosmic ray count as music.

read more about CRaTER

Datasets from NASA

Large Hadron Collider (LHC) sounds

Demo of Color Detector

Assignment,Submission,Technique — priyaganadas @ 10:37 am

Here is the video of how the Spy Object works.

Github Repository is here

When the program runs, the camera takes three consecutive photographs. Each image is scanned to determine the dominant color of every pixel, and each pixel is converted to its dominant color (red, green or blue). The entire image is then scanned to determine the dominant color of the whole image, which is printed out. If all three images match a predefined color sequence, an audio file is played. If the sequence does not match, the program returns nothing.
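The detection pipeline described above can be sketched in Python; the function names are mine, not necessarily those in the repository:

```python
def dominant_channel(pixel):
    """Classify one (r, g, b) pixel by its strongest channel."""
    r, g, b = pixel
    return ("red", "green", "blue")[max(range(3), key=lambda i: (r, g, b)[i])]

def dominant_color(image):
    """Dominant color of a whole image: a majority vote over all pixels."""
    counts = {"red": 0, "green": 0, "blue": 0}
    for row in image:
        for px in row:
            counts[dominant_channel(px)] += 1
    return max(counts, key=counts.get)

def check_sequence(images, secret=("red", "green", "blue")):
    """True when three consecutive photos match the secret color sequence;
    the calling code would then play the audio file."""
    return tuple(dominant_color(img) for img in images) == secret
```

The image here is just a list of rows of RGB tuples; in the real program the frames come from the RPi camera.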

Lessons Learned
1. The original idea was to do face detection on the RPi. We couldn’t find much precedent for that, and processing on the RPi makes it very slow. The only workable method is to create a database of a predetermined face (say 9 different expressions and angles) and train the Pi to detect that face using this database. This method is not scalable: to detect more faces, a larger database has to be built, which the Pi cannot handle.
2. We reduced the image size (to 160×120 pixels) to decrease processing time. Processing time is very high for images larger than that.
3. Color detection is not very accurate. We don’t know if it is the lights, reflections or the camera. The camera can detect the dominant color of a pixel (orange to pink are taken as red, and similarly for blue and green), but differentiating between three closely related colors proved difficult. A possible solution would be to print the RGB value for a colored object and then manually determine a range of detection.
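The calibration idea from point 3 can be sketched as follows; the numeric ranges are placeholders that would come from printing RGB values of each test object under the actual lighting:

```python
# Hypothetical per-channel (lo, hi) ranges found by printing RGB values of
# each colored object under the exhibit lighting; the numbers are placeholders.
CALIBRATED_RANGES = {
    "red":   ((150, 255), (0, 90),    (0, 90)),
    "green": ((0, 90),    (120, 255), (0, 100)),
    "blue":  ((0, 100),   (0, 110),   (130, 255)),
}

def classify_calibrated(pixel):
    """Classify a pixel against manually determined per-channel ranges.
    Returns None when nothing matches, instead of forcing a guess."""
    for name, bounds in CALIBRATED_RANGES.items():
        if all(lo <= c <= hi for c, (lo, hi) in zip(pixel, bounds)):
            return name
    return None
```

Rejecting ambiguous pixels (the None case) rather than snapping them to the nearest primary is what makes this more robust than the pure strongest-channel vote.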

Precedent Analysis 2

Final Project,Precedent Analysis,Uncategorized — priyaganadas @ 6:17 am

Create a data visualization, in physical reality, of an everyday far-away space phenomenon.

Similar Projects that I found


Cosmic gravitational waves are detected (the data is acquired) and showcased through LEDs. What I like about this project is that an unseen, invisible phenomenon is given a physical form.
Moon Creeper: Sublime Gadgets by Auger-Loizeau (2012)

The Apollo 11 mission left giant reflectors on the surface of the Moon. The reflectors can be illuminated by telescope laser beams from Earth, and the reflected beam is measured to calculate the speed at which the Moon is receding from Earth (3.8 cm per year). Moon Creeper is a visualization of the receding Moon. What I find interesting in this project is that a subtle motion is chosen and represented as is, not accelerated, in the visualization.

The Asteroid Impact Probability Drive.

the asteroid impact probabilty drive from luckytwentythree on Vimeo.

The device shows potentially hazardous asteroids rotating around Earth and displays warnings when their orbit matches that of Earth. The device highlights that our lives are the result of a giant probabilistic equation: there are so many things that can go wrong, and yet we go on living our lives as if we were immortal.


This paper by Stéphanie Fleck and Gilles Simon defines an augmented reality platform for young children to learn astronomy. Every marker corresponds to a planet in the solar system, and the augmented objects react based on their position and distance with respect to each other, in real time.

RPi Primer

Assignment,Submission — priyaganadas @ 7:58 am

Three switches were connected to a Raspberry Pi. Switch 1 activates the graphics and generates the first smiley. Switch 2 changes the expression into a happy face, and switch 3 changes it into a worried face. The aim was to learn to use wiringPi with openFrameworks.
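The actual project reads the switches through wiringPi and draws the faces with openFrameworks (C++); the control logic alone can be sketched in Python. The rule that switches 2 and 3 only act once the display is active is my assumption about the behavior:

```python
# Sketch of the switch-to-expression logic only; the real project reads the
# GPIO pins via wiringPi and renders the faces with openFrameworks in C++.
EXPRESSIONS = {1: "smiley", 2: "happy", 3: "worried"}

class FaceState:
    def __init__(self):
        self.active = False
        self.expression = None

    def press(self, switch):
        """Switch 1 activates the graphics and shows the first smiley;
        switches 2 and 3 change the expression (assumed to require the
        display to be active first)."""
        if switch == 1:
            self.active = True
            self.expression = EXPRESSIONS[1]
        elif self.active and switch in EXPRESSIONS:
            self.expression = EXPRESSIONS[switch]
        return self.expression
```

Keeping the state machine separate from the pin reading is also how the real openFrameworks app would be structured: wiringPi input in `update()`, drawing in `draw()`.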




Video is here

Lessons learned-
1. The pinout diagram is different for different models; the GPIO numbers do not match the pin breakout on the board.
2. openFrameworks is fairly easy to use for generating graphics.

Troubleshooting with Windows
1. Before starting, always check that the computer is on the ‘CMU’ network and not on ‘CMU Secure’. Even though I registered my computer on the CMU network, it did not register for a long time (more than 30 minutes). The problem resolved itself when I registered for the fourth time.
2. Go to ‘Network and Sharing Center’, then click on ‘adapter settings’. Find the CMU wireless connection, go to its advanced properties and, under ‘Sharing’, make sure you have selected ‘Local Area Network’ and checked the box that says ‘Allow sharing with devices on the network’.
3. The above settings may change the next time you connect to the CMU network. Check them every time.


Assignment,Project01,Submission — priyaganadas @ 12:47 am

SILO from Priya Ganadas on Vimeo.

SILO is a silent tracker which senses people walking by.

Goal– The idea of the project is to make people realize there is life beyond their own bubble by grabbing their attention during a routine act, such as walking from one building to another. SILO is activated by footsteps and prints out messages that can relate to any situation. These messages add serendipity to everyday life. Little creatures hide inside SILO and come out to see the person who activated them, going back in once SILO has finished giving out its message.




Technology– SILO has a thermal printer that prints out the messages, a piezo sensor to detect footsteps, LEDs, an Arduino, a speakerphone, and a servo to activate the little creatures.
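The footstep-trigger logic can be sketched as follows; the threshold value and the messages are placeholders, and the real device runs this loop on an Arduino:

```python
import random

# Sketch of SILO's trigger logic, assuming an Arduino-style loop that reads
# the piezo on an analog pin; threshold and messages are placeholder values.
THRESHOLD = 80          # raw ADC reading above which we call it a footstep
MESSAGES = ["Look up today.", "Someone is thinking of you.", "Slow down."]

def on_reading(reading, rng=random):
    """Return a message to print when a footstep is detected, else None.
    In the real device this would also raise the servo (so the creatures
    peek out) and lower it again after the printer finishes."""
    if reading > THRESHOLD:
        return rng.choice(MESSAGES)
    return None
```

Tuning the threshold against the ambient vibration of the walkway is the main calibration step for a piezo trigger like this.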

Background- Priya Ganadas

Background — priyaganadas @ 12:48 pm

Hi, I’m Priya. I am an electrical engineer who ventured into interaction design to understand the human side of making things. I freelanced for a year, working mainly on mobile applications, interactive toys and a couple of gaming projects, both board games and digital games. Astronomy is one of my strong interests, and I’m yet to make a solid space-themed game.
Here is my website
I’m posting one of our test videos for the augmented reality game that I was working on before I left India for CMU.

Planet Found from Priya Ganadas on Vimeo.

This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.
(c) 2017 Making Things Interactive | powered by WordPress with Barecity