Activate Yourself

Activate Yourself from Amy Friedman on Vimeo.


CONCEPT/PROCESS:

Activate Yourself is a visualization aid for understanding muscle activity. Users follow on-screen prompts to see whether their chosen muscle is being used during different motions. This is a beginning step toward better understanding our bodies and whether we “activate” ourselves during different activities the way we think we do.

My main interests involve body monitoring and how information is conveyed to users. We often visualize data in ways that not everyone can understand, so our experience with data doesn’t add value to our everyday lives, change how we act, or inform us about healthy activity. Monitoring muscle activity can help stroke victims understand their ability to move during rehabilitation, help trainers, athletes, and kids know when they are using the muscles they intend to, and maximize training overall by confirming that your body is responding the way you believe it is. Using Processing 2.0 I created the on-screen prompts and software. The software connects to the EMG shield/Arduino through Firmata, with the Firmata sketch running on the Arduino and its data imported into Processing.

Using the Backyard Brains Electromyogram (EMG) Arduino shield I was able to retrieve readable data that showed whether a muscle had been “activated” while someone was moving. The higher the analog reading, the harder the muscle was being worked, as reflected in the local electrical muscle activity triggered by signals from the brain. I first began by testing the different Backyard Brains experiments, such as Muscle Action Potentials (measuring the amount of activity) and Muscle Contraction and Fatigue. The latter inspired my original path toward further understanding our bodies.
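At its simplest, the activation check boils down to a threshold on the rectified analog readings. A rough illustration in Python (not my Processing code; the 512 baseline and threshold of 100 are assumed values for a 10-bit Arduino ADC and would need tuning per muscle):

```python
def is_activated(samples, baseline=512, threshold=100):
    """Return True if the mean rectified EMG amplitude exceeds a threshold.

    samples:   raw 10-bit analog readings (0-1023) from the EMG shield
    baseline:  assumed resting midpoint of the ADC range
    threshold: assumed activation level; tune per muscle and electrode placement
    """
    if not samples:
        return False
    # Rectify around the baseline, then average the amplitudes.
    mean_amp = sum(abs(s - baseline) for s in samples) / len(samples)
    return mean_amp > threshold

# A resting arm hovers near the baseline; a flexing one swings widely.
print(is_activated([510, 515, 508, 513]))  # False (resting)
print(is_activated([700, 300, 820, 250]))  # True (flexing)
```

Averaging over a window rather than reacting to single samples keeps the meter from flickering with every spike.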

We currently visualize signals as sine waves, but is there a better way to present this information? I then tried to use the EMG to classify muscle activity as fatigued, not active, active, or rested. This information could optimize working out and lifting. The idealized version of this project would be a versatile wearable that could be worn on any muscle group and give haptic or LED feedback.

I first began by reading about how EMGs measure information: can muscle fatigue be recognized, and how would I even do this? I read the following articles:

Sakurai, T., Toda, M., Sakurazawa, S., Akita, J., Kondo, K., Nakamura, Y. “Detection of Muscle Fatigue by the Surface Electromyogram and its Application.” 9th IEEE/ACIS International Conference on Computer and Information Science, 43-47 (2010).

Subasi, A., Kemal Kiymik, M. “Muscle Fatigue Detection in EMG Using Time-Frequency Methods, ICA and Neural Networks”  J Med Syst 34:777-85 (2010).

Reaz, M.B.I., Hussain, M.S., Mohd-Yasin, F. “Techniques of EMG signal analysis: detection, processing, classification and applications.” Biological Procedures Online 8: 11-35 (2006).

Saponas, T.S., Tan, D.S., Morris, D., Turner, J., Landay, J.A. “Making Muscle-Computer Interfaces More Practical.” CHI 2010, Atlanta, Georgia, USA.

I created my own analysis of my data using the procedures in the article:

Allison, G.T., Fujiwara, T. “The relationship between EMG median frequency and low frequency band amplitude changes at different levels of muscle capacity.” Clinical Biomechanics 17 (6):464-469 (July 2002).

I realized that in order to better understand the data I needed to filter out external noise and compare frequencies using signal processing; after filtering the data I could use machine learning or neural network tools to recognize the patterns of fatigued, active, rested, or not-active muscle. With the help of Ali Momeni, the CMU ArtFab machine learning patch, and my resources above, I created a patch in Max MSP that filtered the signal from the Arduino, but this wasn’t enough to recognize the signal. My data’s sample rate is only 1024 Hz while audio runs at 44100 Hz, so my data becomes very small when transformed with the Fast Fourier Transform (FFT) settings in Max MSP. It was recommended that I try Pd. I was able to filter the data, but as I am not versed in signal processing methods it was unclear to me how to go about the next phase of applying neural networks.
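To make the fatigue marker concrete: the Allison & Fujiwara procedure tracks the median frequency of the EMG power spectrum, which drifts downward as a muscle fatigues. A rough illustration in Python rather than Max/Pd (a naive DFT stands in for the FFT object, and the 1024 Hz sample rate matches my data):

```python
import math

def median_frequency(signal, fs=1024):
    """Median frequency (Hz) of a signal's power spectrum.

    A downward drift in median frequency across successive windows is the
    classic EMG fatigue marker (Allison & Fujiwara, 2002). A naive DFT
    stands in for the FFT here; frequency resolution is fs/n.
    """
    n = len(signal)
    mean = sum(signal) / n
    x = [s - mean for s in signal]          # remove DC offset
    power = []
    for k in range(n // 2 + 1):             # bins up to Nyquist
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power.append(re * re + im * im)
    # Median frequency: the bin below which half the total power lies.
    total = sum(power)
    cum = 0.0
    for k, p in enumerate(power):
        cum += p
        if cum >= total / 2:
            return k * fs / n
    return 0.0

# A pure 64 Hz tone sampled at 1024 Hz has its median frequency at 64 Hz.
window = [math.sin(2 * math.pi * 64 * t / 1024) for t in range(256)]
print(median_frequency(window))  # 64.0
```

Comparing this value between a window at the start of a contraction and one near the end is what would flag fatigue; the classification into fatigued/active/rested states would then sit on top of that.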


Max Patch

 


PureData Patch

At this point I refocused my project scope on visualizing muscle activity, or what the Backyard Brains experiment calls “Muscle Action Potentials.” Using Processing 2.0, I created the “Activate Yourself” software, which instructed users how to put on the EMG for their muscle of choice (the triceps, biceps, or forearm) and gave them on-screen instructions and feedback for each timed activity while showing their activity levels on a metered display. Creating the software took time, as it was hard to navigate between menus and I had trouble moving words while running a timer. I spent so much time making this work that the final “AH HA” moment needed more attention. I spent time sketching out the interactions and how users should experience the timers.
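The menu/timer problem boils down to a small state machine: show an instruction, count down, advance to the next activity. A hypothetical sketch of that structure in Python (not my Processing source; the activity names and five-second duration are made up):

```python
import time

class ActivityPrompt:
    """Timed-prompt state machine: show an instruction, count down, advance.

    A hypothetical sketch of the structure, not the original Processing
    code; activity names and the default duration are invented.
    """

    def __init__(self, activities, duration=5.0, now=time.monotonic):
        self.activities = activities
        self.duration = duration
        self.now = now                # injectable clock, handy for testing
        self.index = 0
        self.started = now()

    def current(self):
        """The instruction to draw this frame, or None when finished."""
        if self.index < len(self.activities):
            return self.activities[self.index]
        return None

    def update(self):
        """Call once per frame; advances when the timer expires."""
        if self.current() is not None and self.now() - self.started >= self.duration:
            self.index += 1
            self.started = self.now()

prompt = ActivityPrompt(["Flex your bicep", "Rest your arm"])
```

Calling update() once per frame, the way Processing structures draw(), keeps the countdown logic separate from the code that renders words and meters, which was exactly where I ran into trouble.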


For the physical visual piece, I used Rhino to design a shadow box and 3D printed light bulbs on the Cube and FormLabs printers. The FormLabs printer printed the bulbs without any issues, while the Cube required supports, and the Cubify software doesn’t provide the kind of additive support options that FormLabs’ PreForm software does. To print without supports on the Cube I flattened the top of the sphere, but there were still problems printing the neck of the bulb: with no support for it to print over or connect to, that area came out brittle.


Tests to create design with Cube

I also learned that you can copy several of the same part into one print job on a FormLabs printer, which lets it print quicker and sped up the process a lot!


Printing with the FormLabs printer


Printing with the Cube

Getting the NeoPixels in sync with the Firmata code was hard, and I am still figuring this part out. I will post when I have it fixed!

LESSONS LEARNED:

1. I have little knowledge of signal processing, but this was a good start to my task of learning about wearable technology and has helped me focus on what I need to learn next semester.

2. Creating the software took time, as it was hard to navigate between menus and I had trouble moving words while running a timer. I spent so much time making this work that the final “AH HA” moment needed more attention.

3. I got to work with Max MSP and Pure Data, which was a great opportunity; although it was just a beginning, it was nice to work in both environments and understand their basic setups better. I was previously overwhelmed by each.

4. Balancing the physical components and the software components was not easy: if one didn’t work, you couldn’t use the other.

RELATED WORK:

Athos – Athos is a wearable fitness shirt that uses six-axis accelerometers and EMG to measure muscle effort, muscle target zones, and muscle fatigue. It also tracks heart rate and breathing patterns to further enhance your overall performance, and it determines your recovery rate to truly maximize your workout.

Mio Link – a heart rate monitor with Bluetooth connectivity that indicates your current training zone by the color on the wristband. The image below is from the Mio website; it shows the five heart rate zones the monitor tracks.

Screen Shot 2014-10-26 at 6.14.54 PM

507 Mechanical Movements

Hardware, Reference — epicjefferson @ 3:20 pm


I found out that the book is now in the public domain, so it’s free to download:

507 Mechanical Movements PDF

And here’s the site that has animated some of the movements.

507movements.com/

Hand-gesture controlled sound synthesis

One of my main interests is working with interfaces for sound synthesis. Over the years I’ve been experimenting with a few different techniques to see how different interfaces inspire different styles of performance and how the interface affects the sound produced. Without a clear goal, I’ve delved into circuit bending/hardware hacking, computer vision, touch screens, and web/text-based systems. Here are some of my findings:

Circuit Bending

Elvis box – Circuit Bent DS-1 Distortion

Circuit bending provides a great introduction to incorrect electronics: the idea is that you use wires to make random connections within existing circuits (using only battery power, for safety) and explore the effect those connections have on the sound (or visuals). I think this wires the brain in a great way, because you expect failure; instead of total control you have only curiosity and luck. This got me thinking about how I was going to control these sounds. Why had I decided to use buttons, patch cables, and knobs?


Wired Magazine: Even More Raspberry Pi

Reference — delce @ 4:36 pm

Some interesting hacks with a raspberry pi

www.wired.com/2013/01/even-more-raspberry-pi-projects/#slideid-357791

Precedent Analysis

Precedent Analysis,Reference — delce @ 1:51 pm

I am really interested in wearables and I am intrigued by the work of Anouk Wipprecht, specifically her Intimacy dress, which uses smart e-foils that become transparent when you run a current through them.

www.studioroosegaarde.net/project/intimacy-2-0/

 

I am also interested in projected images, but I haven’t thoroughly thought through how to combine the electric foil with a projected image. Here is one of the projected images I found interesting:

www.facebook.com/video.php?v=684709411615365&set=vb.100002289370012&type=2&theater

Precedent Analysis 2

Precedent Analysis,Reference,Uncategorized — Dan Russo @ 7:54 am

Academic:

Carlo Ratti (2008)

MIT SENSEable City Lab

senseable.mit.edu

Art:

Double-Taker (Snout), Interactive Robot from Golan Levin on Vimeo. (2008)

www.flong.com


 

Industry:

J. Mayer H (2002)

www.jmayerh.de/index.php?article_id=79

How to Upload Image From RaspberryPi (RPi) to Dropbox using OpenFrameworks (OFx)

Reference,Software — alanhp @ 8:11 pm

Okay! Here is the explanation; hopefully it’s not super hard to get working.

Link to explanation with cool colored text: docs.google.com/document/d/1iopRcz5xk_z5ZRB-2PiaK7pi0I-P3qWJQp5L2xNMhjE/pub

For the instructions by the creator of Dropbox-Uploader, go to github.com/andreafabrizi/Dropbox-Uploader. These are not specific to the Raspberry Pi and openFrameworks; they are meant for all environments, so they may be harder to apply to our specific case.

Instructions:

1) Get the Dropbox-Uploader software from GitHub. The full instructions are at github.com/andreafabrizi/Dropbox-Uploader, but the simplest method is to connect to your RPi and, from the home directory (the one you should be in by default when you connect), type the following command in the terminal:
git clone https://github.com/andreafabrizi/Dropbox-Uploader.git

2) Move to the directory where Dropbox-Uploader is (which should just mean typing cd Dropbox-Uploader) and give the script execute permission with the following command (the +x option gave me some trouble; chmod --help will list the options if you need them):
chmod +x dropbox_uploader.sh

3) Run the script, which will guide you through some configuration options:
./dropbox_uploader.sh

4) One of the configuration options is whether your app should have access to all of your Dropbox or just a specific folder. I chose a specific folder, which I think makes things easier to control, and the rest of these instructions assume that option. You should create a folder in the Apps folder in your Dropbox (you should have one by default); you can create it from the Dropbox developers page at www.dropbox.com/developers/apps

5) Switch to your openFrameworks app directory, e.g. opt/openFrameworks/apps/myApps/myCoolApp. Now you can do the cool stuff. The thing I really liked about this integration is that my actual interaction with Dropbox takes only one line of code: I save the file I want to upload on the RPi, then grab it from its directory and send it to Dropbox.

6) Wherever in your app you want to upload your file to Dropbox, place that one line of code. I put it right after saving the image file to my Raspberry Pi (for which I used the image-saving example in openFrameworks’ examples/graphics folder). The line has the form system(“command that we will specify”): whatever is between the quotes is executed from the command line, so we momentarily leave openFrameworks and run a shell command. The specific upload line looks something like this: system(“../../../../../home/pi/Dropbox-Uploader/dropbox_uploader.sh upload ../bin/data/uploadFile.png uploadFile.png”);

Once that is done, your file should get uploaded to Dropbox… but it usually doesn’t happen on the first try; I had to play a lot with the location paths for all the files. I will explain each part of the quoted command in a little more detail so that getting it just right is hopefully easy. So…

../../../../../home/pi/Dropbox-Uploader/dropbox_uploader.sh is the path to the dropbox_uploader.sh file relative to the app’s home directory. With ../../../../../ we go up five directories, basically to just before the opt folder; from there we go into /home/pi/Dropbox-Uploader/, where the uploader lives, and run the script dropbox_uploader.sh.

upload says that we want to use the uploader’s upload functionality.

../bin/data/uploadFile.png is what we want to upload: the file called uploadFile.png in the bin/data folder, which we reach by going up one directory (../) from the src folder where our compiled code is (I think).

And since we configured our app to have access to one specific Dropbox folder, we don’t need to specify a destination directory in Dropbox; we can just give the name we want the file to have there, uploadFile.png.
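The fragile part of the command is the chain of ../ segments. One way to sidestep it (a sketch, not part of the original write-up; the /home/pi/Dropbox-Uploader path is the one assumed in the steps above) is to build the command with absolute paths, here shown in Python:

```python
from pathlib import Path

def dropbox_upload_cmd(local_file,
                       uploader="/home/pi/Dropbox-Uploader/dropbox_uploader.sh",
                       remote_name=None):
    """Build the dropbox_uploader.sh command with absolute paths.

    Resolving the file's absolute path avoids the fragile ../../../../../
    chain; the uploader location is the one assumed in the steps above.
    """
    local = Path(local_file).resolve()
    return [uploader, "upload", str(local), remote_name or local.name]

# On the Pi: subprocess.run(dropbox_upload_cmd("bin/data/uploadFile.png"), check=True)
```

Since the path is resolved at run time, the same call works no matter which directory the app is launched from.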

that’s it.

Precedent Analysis_Project01_Amy Friedman

Assignment,Precedent Analysis,Reference — amyfried @ 10:49 pm

Scanadu Scout™ – The first medical Tricorder



Source: “The Power of Microphones as Medical Sensors,” posted in Electronic Components by Brian Buntz on June 3, 2013. Project credit: John Stankovic, Shahriar Nirjon.

This compact device measures vital signs so people can understand what their body is doing, comparable to what is measured in the emergency room, and it is less invasive than the traditional equipment. It uses the 32-bit Micrium RTOS platform, Bluetooth 4.0, a micro-USB adaptor, and EKG electrodes whose signals are converted to FM sound and picked up by the smartphone’s built-in microphone. I find it interesting how much data can be gathered in a short span of time, and I am interested in healthcare technology, which this qualifies as. I can borrow the idea of compacting several elements into one object. The issue I have is that the data is read only for a short time, not continuously, so not much can be determined from the data the device collects.

The Dash — Wireless Smart In Ear Headphones

Created by: Bragi in Munich, Germany

These earbuds are meant to track your everyday performance while working out. The product uses an ear-bone microphone, Bluetooth 4.0, 4 GB of storage, a 32-bit ARM processor, a digital signal processor, an analog front end with a 22-bit ADC, passive noise isolation, audio transparency, and other sensors. The ear-bone microphone records your voice while reducing the noise around you. When you run, you can follow your vitals and block out noisy surroundings. The product lets you work out without invasive wires or noise while monitoring your fitness. I’m attracted to this project because of its healthcare wearables angle, and I would borrow the idea of a sleek, compact design and the importance of Bluetooth. I find the product would need to be more interactive to reach its true potential, as it relies on the smartphone to convey information.

Metro Dot-Braille Bracelet for Independent Train Travel

metro_dot

Designers: Hoyeoul Lee, Jinwoo Kim and Sangyong Choi
IDEA Award Entry 2012
Article from yankodesign.com
This object is a design concept that hasn’t been built. It uses the user’s voice and electro-active polymer to render in braille which station to get off at once the destination is spoken. It contains silicone rubber, permanent magnets, and an electromagnetic coil. I find it innovative as it is truly interactive and uses a microphone to convert information for riders with special needs. I find this use practical, but I’m not sure how it entirely works.
This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.
(c) 2025 Making Things Interactive