Directions and Current Market

Final Project — amyfried @ 6:47 pm

In biology class we learn about the heart, how it pumps blood throughout the body, and how oxygen is taken in through the lungs. When you lift, you inhale and exhale to maximize each movement, but why do you do this? How does one know they are training their body in a healthy way, or that their muscles are fatigued, other than by feeling?

Heart Monitor – To know one’s heart rate, we measure the pulse. Using an LED, changes in the light reflected from the finger tell us when blood is moving.
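In code, that idea reduces to detecting when the brightness signal crosses a threshold (a beat) and converting the gaps between beats into BPM. A minimal sketch; the threshold and sample values below are made up, not taken from any particular sensor:

```python
def detect_beats(samples, threshold):
    """Indices where the signal rises above the threshold (one per pulse)."""
    beats, above = [], False
    for i, s in enumerate(samples):
        if s > threshold and not above:
            beats.append(i)
            above = True
        elif s <= threshold:
            above = False
    return beats

def bpm(beat_indices, sample_rate_hz):
    """Average beats per minute from the gaps between beat indices."""
    gaps = [b - a for a, b in zip(beat_indices, beat_indices[1:])]
    avg_gap_s = sum(gaps) / len(gaps) / sample_rate_hz
    return 60.0 / avg_gap_s
```

With a beat every second, `bpm` should come out to 60; a real sensor loop would run this over a sliding window of recent readings.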

Electromyography (EMG) is used to determine muscle fatigue

“Fatigue is a phenomena that accompanies repeated muscular exertion. The ability of the muscle to produce force is reduced as fatigue occurs. Local muscle fatigue is manifested in electromyography (EMG) signals by a shift in the EMG power spectrum from high to low frequencies.”
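That spectral shift is commonly quantified with the median frequency of the EMG power spectrum: the frequency that splits the spectrum’s power in half, which drifts downward as fatigue sets in. A rough sketch using a naive DFT (fine for short windows; a real implementation would use an FFT library):

```python
import math

def median_frequency(samples, fs):
    """Frequency (Hz) below which half of the signal's spectral power lies."""
    n = len(samples)
    spectrum = []
    for k in range(1, n // 2):  # skip DC, go up to Nyquist
        re = sum(samples[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
        im = sum(samples[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
        spectrum.append((k * fs / n, re * re + im * im))
    total = sum(p for _, p in spectrum)
    acc = 0.0
    for freq, power in spectrum:
        acc += power
        if acc >= total / 2:
            return freq
    return spectrum[-1][0]
```

A falling median frequency across successive windows of the same exercise would be the fatigue signal.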

How can this information be simplified for a user and understood outside the realm of medicine? Should visualizations of the body be included?

Current Market

Mio Link

Heart rate monitor with Bluetooth connectivity that indicates your current training zone by the color on the wristband. The image below, from the Mio website, shows the five heart rate zones tracked by the monitor.


My Issues: The wristband shows a simple output of color, but you have to remember what that color means and how to incorporate it into your workout. Feedback on why you are in a zone and for how long would help explain what the training is doing. You know your body is at its max when the band is red, but how fatigued are your muscles compared to your heart rate?



I am currently using the products below to extract data and reduce it to the simplest variables, so I can create a component that outputs the information in a simplified, non-medical manner.

EMG Spikershield from Backyard Brains

PulseSensor by PulseSensor


Teletickle Concept Video

Assignment,Final Project — alanhp @ 1:59 pm

password: teletickle

RPi-Windows troubleshooting

I recently got tangled up in Raspberry Pi – Windows connectivity problems again, so I decided to solve them once and for all.
I am on Windows 7, and I ran all of the following steps as admin.

Here is the ultimate fix.

1. Download DHCP Server for Windows here-

2. Unzip and install dhcpserver.exe

3. Go to the properties of your Local Area Network connection and assign a static IP address and subnet mask.

4. Run dhcpwiz.exe from the downloaded folder.

5. Select Local Area Network. It should say ‘Disabled’. Hit Next.

6. Do not do anything on this screen, hit Next.

7. Insert a range here (highlighted in the image). For example, put 100-110.

8. Check the box “Overwrite previous file” and write the configuration file.
You should see “INI file written” after this step. Hit Next/Finish.

9. Now you should see status as “Running” here. If not, hit Admin.

10. That’s it, you are done. On the next page of the interface, enable the option “continue running as tray app”.

Now boot up your RPi; you should see its inet address as the one you assigned in the DHCP server.
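As a sanity check on the numbers, the pool from step 7 is a range of host offsets inside your subnet. A tiny sketch using Python’s ipaddress module; the addresses here are placeholder examples, not the ones from my setup:

```python
import ipaddress

def in_dhcp_pool(addr, network, lo, hi):
    """True if addr falls inside the DHCP pool offsets lo..hi of the subnet."""
    ip = ipaddress.ip_address(addr)
    net = ipaddress.ip_network(network, strict=False)
    offset = int(ip) - int(net.network_address)
    return ip in net and lo <= offset <= hi

# With a 100-110 pool on a hypothetical 192.168.2.0/24 subnet, the Pi
# should come up with an address between 192.168.2.100 and 192.168.2.110.
```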

Hope this helps future MTI folks with Windows machines.

Similar video here

507 Mechanical Movements

Hardware,Reference — epicjefferson @ 3:20 pm


I found out that the book is now in the public domain, so it’s free to download:

507 Mechanical Movements PDF

And here’s the site that has animated some of the movements.

Live Time Lapse 2

Uncategorized — jk @ 4:41 am

For the second iteration of my Live Time Lapse prototype I have independently developed four aspects of this project:

1. Camera Operation

2. Uploading Photos

3. Creating Video

4. Mobile App

I approached this project with the premise that other people know how to do all of the above better than I could; I saw my job as researching what others had done. I identified requirements for each aspect of the project and found programs and related hardware that would fulfill them.

1. Camera Operations:

I wanted a camera that was cheap and easily configurable, with good image quality. A Raspberry Pi coupled with a RaspiCam fulfilled these requirements: the 5-megapixel images are fine for time lapse, the camera settings are adjustable, and there is a lot of code already written for these cameras. The most interesting code I found was from James Welling of fotosyn, a photo developer that has also created a variety of interesting apps. There is a great blog post about the time lapse camera he developed. I used this code to initiate my time lapse sequence:
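In the spirit of that raspiLapseCam script (this is my own minimal sketch, not fotosyn’s code; the resolution and path are guesses), a time lapse loop just shells out to raspistill on a timer:

```python
import subprocess
import time
from datetime import datetime

def capture_cmd(directory):
    """Build a raspistill command with a timestamped output filename."""
    name = datetime.now().strftime("image_%Y%m%d_%H%M%S.jpg")
    return ["raspistill", "-w", "1296", "-h", "972", "-o", directory + "/" + name]

def timelapse(directory, interval_s, shots):
    """Take `shots` photos, one every `interval_s` seconds."""
    for _ in range(shots):
        subprocess.call(capture_cmd(directory))
        time.sleep(interval_s)
```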

This code can be downloaded here.

2. Uploading Photos

I used an amazing piece of code called Dropbox Uploader on the Raspberry Pi to upload images to Dropbox. It took a while to figure out how to use it given my limited knowledge of Python, but eventually I was able to get it to work. One Instructables post really helped with this, and another post pointed me in the right direction toward writing a Python script to add to the raspiLapseCam script mentioned above.

Here’s the code to make the Dropbox Uploader work.

This piece of the code indicates the directory whose images are uploaded

This piece below indicates the directory which is created in the App portion of your Dropbox folder (which you create when installing Dropbox Uploader on your Raspberry Pi).

In addition, there is another option I am trying to use. In the Dropbox Uploader GitHub instructions, under Optional Parameters, Andrea Fabrizi (the developer) indicates that you can use “-s” to skip uploading images that already exist in Dropbox. I don’t know how to use that option yet. If anyone knows, that would be great!
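If I read the README correctly, “-s” just goes before the upload command. A guess at how that would look when driven from the same Python script (the local and remote paths are examples, not my actual setup):

```python
import subprocess

def upload_cmd(local_dir, remote_dir, skip_existing=True):
    """Build a Dropbox Uploader call; -s skips files already in Dropbox."""
    cmd = ["./dropbox_uploader.sh"]
    if skip_existing:
        cmd.append("-s")
    cmd += ["upload", local_dir, remote_dir]
    return cmd

# On the Pi: subprocess.call(upload_cmd("/home/pi/camera", "/camera"))
```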

3. Creating Video

I identified Max/MSP/Jitter as a program I had some familiarity with, which had a lot of documentation, and for which someone, Gian Pablo Villamil, had already created a patch I could use. The time lapse looping is now working, but it is not yet pointed at the correct Dropbox folder. This should be a simple fix.

What won’t be as simple is perhaps translating this patch to Pure Data so it can work directly on a Raspberry Pi.

4. Mobile App

Raspberry Pis are endlessly configurable, but their user interface is terrible. It would be awesome to operate a camera and time lapse directly from a mobile app, which is something fotosyn is working on. I was able to install their BerryCam Express and get it to work from my iPhone: you can use the BerryCam app to take pictures remotely from your phone. They are also promising time lapse functionality, which would be nice. I found their app worked really well.

The Mobile App isn’t necessary for my final project, but it is exciting to experiment with.

Here’s a video:

Sonification of Astronomical Data

Uncategorized — priyaganadas @ 9:29 pm

I have been going through different types of astronomical datasets lately. One of the interesting directions is to sonify astronomical data, and this is what I have decided to pursue. The following project takes deep-space field data collected by the Canada-France-Hawaii Telescope over four years. Based on the brightness, duration, and distance from Earth of the supernovae that were captured, piano notes are played to create a sonata.

Internet radio station, CRaTER, plays live cosmic ray count as music

read more about CRaTER

Datasets from NASA

Large Hadron Collider (LHC) sounds

Tracking the ISS

Software,Technique — epicjefferson @ 1:32 pm

Searching for an astronomy-related API, I found this neat site called “Where is the ISS at?”. They have an API.

Since there is no sample code, I modified the kimonolabs sample code for Python.
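The reply is plain JSON, so pulling a position out of it only takes a few lines. A sketch; the field names match what the wheretheiss.at docs show, but treat them as assumptions:

```python
import json
import urllib.request

def parse_position(payload):
    """Extract (latitude, longitude) from a JSON reply from the API."""
    data = json.loads(payload)
    return float(data["latitude"]), float(data["longitude"])

# Live call (needs network):
# url = "https://api.wheretheiss.at/v1/satellites/25544"  # 25544 = the ISS
# with urllib.request.urlopen(url) as r:
#     lat, lon = parse_position(r.read())
```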

Then you get back something like this.



Piezo Sensor Prototype

Uncategorized — jk @ 12:17 am

Recently I completed a collaboration with Dakotah Konicek for the Tough Art Artist Residency at The Children’s Museum of Pittsburgh. In conjunction with this project, for the MTI class I attempted to make a piezo sensor that could be used both to sense sound and to activate a pump. This video documents that project in part, as well as the piezo microphones I built for it and my attempt at building an amplifier whose output could be read by an Arduino analog input and played back as sound.
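The sense-then-activate logic is essentially a debounced threshold: only fire the pump once the piezo reading stays loud for several consecutive samples, so a single spike of electrical noise doesn’t trigger it. A sketch with made-up values (on the actual hardware this logic would live in the Arduino loop):

```python
def pump_should_fire(samples, threshold=512, hold=3):
    """True once `hold` consecutive readings exceed `threshold`."""
    run = 0
    for s in samples:
        run = run + 1 if s > threshold else 0
        if run >= hold:
            return True
    return False
```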



Above is the final piece at The Children’s Museum.


Here’s excellent advice on building Piezo Preamps on Alex Rice’s website.

Sound Studies (Precedent Analysis)

Uncategorized — jk @ 12:05 am

Over the past couple of years I’ve become more and more interested in sound and what sound can do. It took me a while to come around to sound, but the first time I remember really being struck by it was at a noise music concert a friend of mine put on, in which, with gleeful transgression, a group of punks blew away Burnside Avenue in Portland, Oregon on a Saturday afternoon. Incidentally, the orchestrator of that event has recently started a deconstructionist film blog, Talking at The Movies, with the tagline: “Spoiler Alert: Meaning is an Artifact of Creation.”

Another formative experience was “The Forty Part Motet (A reworking of ‘Spem in Alium’ by Thomas Tallis, 1573)” by Janet Cardiff and George Bures Miller at The Cloisters in Manhattan. This piece was truly transforming. You could walk around this amazing chapel, which was transported from Spain, and hear each individual voice on its own speaker. It really got you to think about sound, and I believe sound is something we have difficulty focusing on. The piece allowed a modern audience to focus on Thomas Tallis’s truly amazing composition.


Recently I’ve become fascinated with John Luther Adams’s compositions such as Inuksuit, which combine the avant-garde tradition of early 20th-century percussion-oriented composition with the radical site-specificity of Stockhausen’s “Helicopter String Quartet.” John Luther Adams, however, pairs the radicalness of these gestures with truly relatable sounds, in the case of Inuksuit those of the Arctic. He studies these sounds and reinterprets them for ensemble, again, so we can hear them anew.


Composer Nico Muhly has done something similar with the traditional folk song “Oh, The Wind and Rain.” He has essentially deconstructed the song into separate parts, and over the course of a three-part composition (one part of which is in the video above) the song is rebuilt. More about this song and this composition can be found on Nico Muhly’s website.

The commonality I have found between these compositions is the way sound is used to call attention to a composition or sound that already exists but that we pass over and don’t really hear.


Project01: iVolume — a volume-controlled RPi Radio

Assignment,Project01 — pvirasat @ 11:21 pm

iVolume is a device that allows you to listen to your favorite radio station on Pandora, where the volume of the music is controlled based on how loud or quiet the surrounding environment is. As someone who loves listening to music on Pandora, I thought this would be an interesting way to use the input from the microphone, and a productive way to get my hands dirty with the Raspberry Pi.





The video above illustrates the basic working of the device. The electret microphone amplifier senses crowd noise; an Arduino interprets the data and sends it to the Raspberry Pi over serial. Pianobar, an open-source, console-based client for Pandora, comes with many features, including playing, managing, and rating stations. For this project, I used the input data to manipulate the volume of the music. In the video, I plugged a set of speakers into the Raspberry Pi to illustrate how the device works. However, a more ideal setup would use headphones instead, because the microphone picks up the loud sound from the speakers, which would make the volume go up and never come down.
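The core of the volume logic is a mapping from mic level to a number of pianobar volume presses (by default ‘)’ is volume up and ‘(’ is volume down, and they can be sent through pianobar’s control FIFO). A sketch; the baseline, deadband, and step constants are guesses I would tune by ear, not values from my actual script:

```python
def volume_nudge(noise, baseline, deadband=40, step=60):
    """Number of volume presses for one mic reading.

    Positive -> send that many ')' (louder), negative -> '(' (quieter),
    zero -> the room is close enough to the baseline to leave it alone.
    """
    diff = noise - baseline
    if abs(diff) <= deadband:
        return 0
    return int(diff / step)

def as_keys(nudge):
    """Translate a nudge count into the characters to write to pianobar."""
    return (")" if nudge > 0 else "(") * abs(nudge)
```

The deadband keeps the volume from jittering when the room is roughly at its usual level.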

Even though there are quite a number of tutorials for Pianobar + Raspberry Pi, I learned a lot working on this project. This includes figuring out how to manually install the libraries onto the Raspberry Pi, working in terminal, sending data from an Arduino to a RPi, and coding in Python. Looking forward, I plan to create a more durable and portable prototype, and possibly mix in other types of inputs to manage the different stations.


This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.
(c) 2020 Making Things Interactive | powered by WordPress with Barecity