Activate Yourself

Activate Yourself from Amy Friedman on Vimeo.







Activate Yourself is a visualization aid for understanding muscle activity. Users follow on-screen prompts to visually check whether a muscle of their choice is being used during different motions. This is a beginning step toward better understanding our bodies and whether we “activate” ourselves during different activities the way we think we do.

My main interests involve body monitoring and how information is conveyed to users. We often visualize data in ways that not everyone can understand, so our experience with data doesn’t add value to our everyday lives by changing how we act or informing us about healthy activity. Monitoring muscle activity can help stroke victims understand their ability to move during rehabilitation, help trainers, athletes, and kids know when they are using the muscles they intend to, and, overall, maximize training by showing whether your body is responding the way you believe it is. Using Processing 2.0 I created the on-screen prompts and software; it connects to the EMG shield/Arduino via Firmata, with the standard Firmata sketch running on the Arduino.

Using the Backyard Brains Electromyogram (EMG) Arduino Shield I was able to retrieve readable data that indicated whether a muscle had been “activated” while someone was moving. The higher the analog reading, the harder the muscle was working, as measured by the local electrical activity driven by signals from the brain. I first began by testing the different Backyard Brains experiments, such as Muscle Action Potentials (measuring the amount of activity) and Muscle Contraction and Fatigue. The latter is what inspired my original path toward further understanding our bodies.
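The activation check described above can be sketched roughly as follows. This is an illustrative sketch, not the actual project code: the baseline, window size, and threshold values are assumptions, chosen for a 10-bit Arduino analog input centered near its midpoint.

```python
# Hedged sketch: detecting muscle "activation" from raw EMG analog
# readings (0-1023 from an Arduino ADC). Baseline, window, and
# threshold values are illustrative assumptions.

def envelope(samples, baseline=512, window=16):
    """Rectify around the ADC midpoint and smooth with a moving average."""
    rectified = [abs(s - baseline) for s in samples]
    out = []
    for i in range(len(rectified)):
        chunk = rectified[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

def is_activated(samples, threshold=60):
    """A muscle counts as "activated" if the smoothed envelope crosses the threshold."""
    return max(envelope(samples)) >= threshold

# A resting signal hovers near the ADC midpoint; a contraction swings widely.
resting = [512 + (1 if i % 2 else -1) * 5 for i in range(64)]
contracting = [512 + (1 if i % 2 else -1) * 200 for i in range(64)]
print(is_activated(resting))      # False: envelope stays near 5
print(is_activated(contracting))  # True: envelope near 200
```

In the real setup the samples would arrive from the shield’s analog pin via Firmata rather than being synthesized.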

We currently visualize these signals as sine waves, but is there a better way to present this information? I then tried to use the EMG to determine whether a muscle is fatigued, not active, active, or rested; this information could optimize working out and lifting. A versatile wearable device that could be worn on any muscle group, with haptic or LED feedback, would be the idealized version of this project.

I first began by reading about how EMGs measure information, whether muscle fatigue can be recognized, and how I might even do this. I read the following articles:

Sakurai, T., Toda, M., Sakurazawa, S., Akita, J., Kondo, K., Nakamura, Y. “Detection of Muscle Fatigue by the Surface Electromyogram and its Application.” 9th IEEE/ACIS International Conference on Computer and Information Science, 43-47 (2010).

Subasi, A., Kemal Kiymik, M. “Muscle Fatigue Detection in EMG Using Time-Frequency Methods, ICA and Neural Networks.” J Med Syst 34:777-85 (2010).

Reaz, M.B.I., Hussain, M.S., Mohd-Yasin, F. “Techniques of EMG signal analysis: detection, processing, classification and applications.” Biological Procedures Online 8: 11-35 (2006).

Saponas, T.S., Tan, D.S., Morris, D., Turner, J., Landay, J.A. “Making Muscle-Computer Interfaces More Practical.” CHI 2010, Atlanta, Georgia, USA.

I created my own analysis of my data using the procedures in the article:

Allison, G.T., Fujiwara, T. “The relationship between EMG median frequency and low frequency band amplitude changes at different levels of muscle capacity.” Clinical Biomechanics 17 (6):464-469 (July 2002).

I realized that in order to better understand the data I needed to filter out external noise and compare frequencies using signal processing; after filtering the data I could use machine learning or neural network tools to recognize patterns of fatigued, active, rested, or not-active muscles. With the help of Ali Momeni, the CmuArtFab Machine Learning Patch, and my resources above, I created a patch in Max MSP to filter the signal from the Arduino, but this wasn’t enough to recognize the signal. The sample rate of my data is only 1024 Hz, while the sample rate for audio is 44,100 Hz, making my data very small when transformed using the Fast Fourier Transform (FFT) settings in Max MSP. It was recommended that I try Pd. I was able to filter the data there, but as I am not well versed in signal processing methods, it was unclear to me how to go about the next phase of using neural networks.
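The median-frequency approach from the Allison and Fujiwara article can be illustrated with a small sketch: as a muscle fatigues, the EMG power spectrum shifts toward lower frequencies, so the median frequency drops. This is a toy demonstration, not the actual analysis; the 1 kHz sample rate and the test tones are assumptions, and a real pipeline would use an FFT rather than this naive DFT.

```python
# Hedged sketch of median-frequency analysis: fatigue shifts EMG power
# toward lower frequencies, so the median frequency drops. Naive
# pure-Python DFT; sample rate and test signals are illustrative.
import math

def power_spectrum(samples, sample_rate):
    """Naive DFT power spectrum: list of (frequency_hz, power) pairs."""
    n = len(samples)
    spectrum = []
    for k in range(1, n // 2):  # skip the DC bin
        re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        spectrum.append((k * sample_rate / n, re * re + im * im))
    return spectrum

def median_frequency(samples, sample_rate):
    """Frequency below which half of the total spectral power lies."""
    spec = power_spectrum(samples, sample_rate)
    total = sum(p for _, p in spec)
    running = 0.0
    for freq, p in spec:
        running += p
        if running >= total / 2:
            return freq
    return spec[-1][0]

rate = 1000  # Hz, assumed EMG sampling rate
t = [i / rate for i in range(256)]
fresh = [math.sin(2 * math.pi * 120 * x) for x in t]    # energy near 120 Hz
fatigued = [math.sin(2 * math.pi * 60 * x) for x in t]  # energy near 60 Hz
print(median_frequency(fresh, rate) > median_frequency(fatigued, rate))  # True
```

A classifier for fatigued/active/rested states would then track how this median frequency changes over successive windows of the filtered signal.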


Max Patch



PureData Patch

At this point I refocused my project scope on visualizing muscle activity, or what the Backyard Brains experiment calls “Muscle Action Potentials.” Using Processing 2.0, I created the “Activate Yourself” software, which instructed users how to put the EMG on their muscle of choice (the tricep, bicep, or forearm) and gave them on-screen instructions and feedback for each timed activity while showing their activity levels on a metered display. Creating the software took time: it was hard to navigate between menus, and I had trouble moving words on screen while running a timer. I spent so much time making this work that the end “AH HA” moment needed more attention. I also spent time sketching out the interactions and how users should experience the timers.
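The timed on-screen prompt flow can be sketched as a tiny lookup over elapsed time. This is an illustrative sketch, not the actual Processing code; the activity names and durations are invented, and in Processing the equivalent function would be called from `draw()` with the elapsed time derived from `millis()`.

```python
# Hedged sketch of the timed-prompt flow: step through a fixed list of
# activities as time elapses. Activity names and durations are
# illustrative assumptions, not from the actual software.

PROMPTS = [("Relax your arm", 5), ("Flex your bicep", 10), ("Rest", 5)]

def prompt_at(elapsed_seconds, prompts=PROMPTS):
    """Return the prompt to display at a given elapsed time, or None when done."""
    start = 0
    for text, duration in prompts:
        if elapsed_seconds < start + duration:
            return text
        start += duration
    return None  # all timed activities finished

print(prompt_at(2))   # Relax your arm
print(prompt_at(7))   # Flex your bicep
print(prompt_at(25))  # None
```

Keeping the schedule as data like this separates the timing logic from the drawing code, which helps avoid the menu-navigation tangles described above.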




For the physical visual piece, I used Rhino to create a shadow box and 3D printed lightbulbs using the Cube and Formlabs printers. The Formlabs printer was able to print the bulbs without any issues, while the Cube required supports, and the Cubify software doesn’t provide the kind of additive support options that the PreForm software for Formlabs does. In order to print without supports on the Cube I made the sphere flat at the top, but there were still issues with printing the neck of the bulb: there was no support for it to print over or connect to, making that area brittle.


Tests to create design with Cube

I also learned that on a Formlabs printer you can copy several of the same parts into a single print so they print quicker, which sped up the process a lot!


Printing with FormLab Printer


Printing with Cube

Connecting the NeoPixels to stay in sync with the Firmata code was hard, and I am still figuring this part out. I will post when I have it fixed!
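One part of that sync problem, mapping the EMG meter level to how many pixels should light, can be sketched independently of the hardware. This is an illustrative sketch under stated assumptions: an 8-pixel strip and a 10-bit Arduino reading are invented here, and the actual pixel writes need to happen on the Arduino side, since the standard Firmata firmware cannot meet the NeoPixels’ strict timing requirements on its own.

```python
# Hedged sketch: scale an EMG meter level (0-1023 from the Arduino ADC)
# onto the number of lit pixels. The 8-pixel strip length is an
# assumption; actual NeoPixel output would run on the Arduino itself.

def pixels_lit(analog_value, num_pixels=8, max_value=1023):
    """Scale a 0..max_value reading onto 0..num_pixels lit LEDs."""
    clamped = max(0, min(analog_value, max_value))
    return round(clamped * num_pixels / max_value)

print(pixels_lit(0))     # 0
print(pixels_lit(512))   # 4
print(pixels_lit(1023))  # 8
```

With the mapping done host-side, the Arduino sketch only needs to receive a single "number of pixels" value, which is simpler to reconcile with Firmata traffic.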


1. I have limited knowledge of signal processing, but this project was a good start to my task of learning about wearable technology and has helped me focus on what I need to learn next semester.

2. Creating the software took time: it was hard to navigate between menus, and I had trouble moving words on screen while running a timer. I spent so much time making this work that the end “AH HA” moment needed more attention.

3. I got to work with Max MSP and PureData, which was a great opportunity; although I was just beginning, it was nice to work in both programs and understand their basic setups better. I was previously overwhelmed by each.

4. Balancing the physical and software components was not easy, because if one didn’t work, you couldn’t use the other.


Athos – Athos is a wearable fitness shirt that uses six-axis accelerometers and EMG to measure muscle effort, muscle target zones, and muscle fatigue. Heart rate and breathing patterns are tracked to further enhance your overall performance, and the device determines your recovery rate to truly maximize your workout.

Mio Link – A heart rate monitor with Bluetooth connectivity that indicates your current training zone by the color shown on the wristband. The Mio website illustrates the five heart rate zones tracked by the monitor.


Tommy Doyle

Assignment,Description — Tags: — tdoyle @ 2:37 pm

Hello! My name is Tommy Doyle and I am currently a senior majoring in Computer Science and Human-Computer Interaction. My background is highly technical, with experience in software engineering, mobile development, and web development. I also have experience in interaction design (mostly UX and UI) and user research. My goal with any design I create is to bring powerful technology to the user and add a bit of joy to every experience.

With this course, I hope to dive into an area of interaction design that I do not have as much experience in. I hope to challenge myself to get out of my comfort zone and produce some work I’m really proud of.

The work I am most proud of so far has been around Booth during Carnival on campus. Below is a link to the game that I played a part in making in the spring of 2013.

Journey to Paradise Falls from Andy Biar on Vimeo.

Here is a picture of the outside of the booth!


More of my work is available here:

I’m looking forward to the semester and what it holds in store!

Yeliz Karadayi – Background

Assignment,Description — ygk @ 9:32 am

I graduated from the 5-year Architecture program here at CMU, which makes this my 6th year in Pittsburgh. Through my education I have taken an interest in the process of fabrication in architecture, which is currently often designed so that each iteration yields a physical product. I will be exploring design processes that symbiotically integrate the power of digital fabrication: rather than simply hitting “Run” on a robot, looking at the finished prototype, and reacting to it, the designer responds to real-time feedback from the digital fabrication process, re-adjusting it as a prototype is being made.

Some of the work I have done in the past speaks to this concept, but most of it doesn’t directly. Visit my website to see more of my projects:

Petal Pavilion

Digital Nouveau:


Dan Russo

Description — Tags: — Dan Russo @ 1:02 am

My educational background in architecture provided broad exposure to many interdisciplinary topics. While studying architecture, my main focus developed around global culture and the impacts technology has made on the built environment. Both digital and hand fabrication techniques were employed to explore these interests in differing ways. I also worked as a student web developer, where I built various tools for faculty and students on mobile platforms.

Both of these opportunities have brought my focus to investigating communication with dynamic environments, in which our experiences become energized two-way conversations involving objects and people alike. I hope to fully explore environments that can raise as many questions as they resolve.

A portfolio of my previous work can be found at


Below is a short video documenting the assembly of a full-scale prototype for Reactive Space.

[vimeo 104267680 w=500&h=280]


This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.
(c) 2017 Making Things Interactive | powered by WordPress with Barecity