Kibble Control! This is a pet bowl that can connect to the internet! Sure, there are other bowls out there. Bowls that detect RFID. Bowls that schedule your cat's feeding time. Bowls that connect to the internet and update you on when your cat ate. The problem is, none of these bowls can accommodate multiple pets while connecting to the internet and letting the owner control exactly how much each cat should eat at each meal, and at what time. No other bowl notices when one pet tends to bully another away from the food bowl and updates the owner through a connected web application. The other bowls that do connect to the internet do so via a phone app, which requires a smartphone and cannot be accessed from other devices.
The point is, we thought of [almost] everything! It's a work in progress, but we believe we are on to something here. We plan to keep exploring this project: it was a lot of fun to work on, it benefits me personally to make this bowl the best it can be so I can at least use it in my own home, and there seem to be no drawbacks to giving it a go when we have the time.
Hooray for KibbleControl!
KibbleControl from Yeliz Karadayi on Vimeo.
closed back, locked in with magnets
the mess inside
Knock is a more intimate way to connect with specific people on campus. Use it to signal for help, that you’re heading over, or just to drop in on a friend.
Very intriguing approach to making the digital world physically interactive. I'm not sure I really want to do something like this, though. It seems a bit too literal somehow, and limited. It's not a design tool as they have it now.
“OpenControllers” by ECAL/Marc Dubois (2014)
OpenControllers ECAL/Marc Dubois from ECAL on Vimeo.
Open Controllers lets you use your everyday device as a controller of a different kind by placing it inside specifically designed artifacts that embed the device and make use of its capabilities, such as the gyroscope and the light sensor. The project takes your physical movements and interprets them. I think this could be done with a range of patterns, which gets into the next video…
“Tilt Brush” by Skillman & Hackett (2014)
This is a lot closer to what I want to do, except that I am also interested in the painting becoming three-dimensional as you paint it. Not that I intend to do that exact thing…I just mean to draw a parallel. The gestures here look swift and intuitive if it's real, and the interface is honestly quite confusing, but maybe that's because I'm not aware of the gestures it uses. Assuming this is real, I think this is the future of design through tangible interaction.
“3Doodler” by Pete and Max (2013)
This project does a lot of what I am hoping to accomplish with its ability to prototype quickly and efficiently without having to prepare ahead of time what will be made.
“3D Printed Blooming Flowers” by Mikaela Holmes (2014)
This kind of movement is exciting as a less mechanical way to actuate an artifact. I wonder how it could be applied at a more complex or responsive level.
I think these videos show the progression of my thoughts on the final project quite well. I want to design something indirectly by using a tool, not unlike how surgeons now use robots to perform the actual surgery. I am struggling to gauge what scope I am capable of and to come up with a good project idea to make it happen. One thing I have considered is using my surface as a tool to communicate with a CNC robot, fabricating sketches drawn on the surface in real time.
There is a common myth that plants grow faster when you sing to them often. This flower guarantees you its blossom, that is, if you can sing it the right song.
Well, luckily, you get to pick that song. You can be the special person who makes this flower blossom. You can put this flower next to other flowers to get that instant blossom gratification from your beautiful singing efforts while simultaneously testing out the myth…I think my basil plant grew faster this week?
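To give a sense of how the "right song" check might work, here is a minimal sketch. It assumes a separate pitch detector has already turned the singing into a list of note numbers; the names (`TARGET_SONG`, `match_song`, `update_flower`) are illustrative, not from the actual project.

```python
# Hypothetical song-matching logic for the flower: the owner's chosen
# melody is stored as MIDI note numbers and compared, note for note,
# against pitches detected from the microphone.

TARGET_SONG = [60, 62, 64, 65, 67]  # C D E F G, picked by the owner

def match_song(detected_notes, target=TARGET_SONG, tolerance=1):
    """True if the detected melody matches the target within
    +/- `tolerance` semitones at every note."""
    if len(detected_notes) != len(target):
        return False
    return all(abs(d - t) <= tolerance
               for d, t in zip(detected_notes, target))

def update_flower(detected_notes):
    """Open the blossom only on a correct performance."""
    return "blossom" if match_song(detected_notes) else "closed"
```

A slightly off-key performance still counts (within a semitone), which keeps the flower forgiving of imperfect singers.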
This fourth precedent is being posted because Ishin-den-shin was already used and I wanted to be sure to have three new ones. This technology can turn anything into an instrument by sensing how the object is being touched. Like Ishin-den-shin, it has something seemingly low-tech do something unexpected, and it truly immerses users in the experience by involving them directly.
This project is an incredible manifestation of the technological capabilities we have reached today. The ability to use your own body to carry a hidden message and, by a simple touch, deliver it to another person takes interactivity to another level of true immersion. The whole system seems very low-tech, but of course that is because the actual technology at work is unseen, adding a layer of complexity that goes completely unnoticed, except when you touch someone's ear!
The thoughtfulness of this project is appealing as an object identification device. Though it is difficult to consider a realistic application for this, it is intriguing to imagine a world physically marked with identification that you can hear, feel, and see. The requirement of a haptic interaction with an object in order to get feedback from the audio effect is the true function that entices me.
Interact with one thing, produce a resulting sound from it, then that sound is heard. The indirect relationship between the “input” object and the direct input to the program is where I find inspiration.
A Drifting Up is a project I was shown over the summer: a flocking particle system that responds to sound. Its response creates particle sub-systems within the larger whole. It is a complicated system that yields an interesting visual effect, but taking something like this a step further and somehow manifesting it physically could be even more intriguing.
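The sub-system effect can be sketched with classic boids rules where the sound level controls the neighborhood radius. This is my own hedged guess at the mechanism, not A Drifting Up's actual code; all names here are illustrative, and the "amplitude" is just a number a microphone stage would supply.

```python
import math
import random

class Boid:
    """One particle with a position and velocity on a 100x100 field."""
    def __init__(self):
        self.x, self.y = random.uniform(0, 100), random.uniform(0, 100)
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

def step(boids, amplitude):
    """Advance the flock one frame; amplitude in 0.0-1.0 shrinks the
    neighbor radius, so loud sound splits the flock into sub-systems."""
    radius = 30.0 * (1.0 - 0.8 * amplitude)
    for b in boids:
        nbrs = [o for o in boids if o is not b
                and math.hypot(o.x - b.x, o.y - b.y) < radius]
        if nbrs:
            # cohesion: steer toward the local center of mass
            cx = sum(o.x for o in nbrs) / len(nbrs)
            cy = sum(o.y for o in nbrs) / len(nbrs)
            b.vx += 0.01 * (cx - b.x)
            b.vy += 0.01 * (cy - b.y)
            # alignment: nudge toward neighbors' average velocity
            b.vx += 0.05 * (sum(o.vx for o in nbrs) / len(nbrs) - b.vx)
            b.vy += 0.05 * (sum(o.vy for o in nbrs) / len(nbrs) - b.vy)
    for b in boids:
        b.x += b.vx
        b.y += b.vy
```

With amplitude near 0 the whole flock cohesces as one body; near 1 the radius collapses and only tight local cliques influence each other, which is one plausible way to get "particle sub-systems within the larger whole."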
I graduated from the 5-year Architecture program here at CMU, which makes this my 6th year in Pittsburgh. Through my education I have taken an interest in the process of fabrication in architecture, which is currently often designed so that each iteration yields a physical product. I will be exploring design processes that symbiotically integrate the power of digital fabrication: rather than simply hitting "Run" on a robot, looking at the finished prototype, and reacting to that, the designer responds to real-time feedback from the digital fabrication process, re-adjusting it as the prototype is being made.
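The closed-loop idea above can be sketched abstractly. This is only an illustration under stated assumptions: `sense` stands in for whatever measures the actual result (a probe, a depth camera), and a real setup would stream corrected moves to a CNC or robot controller rather than return a list.

```python
# Hypothetical closed-loop fabrication sketch: cut each segment, measure
# what actually happened, and fold the error into the next command
# (a simple proportional correction), instead of running the whole job
# blind and inspecting only the finished prototype.

def fabricate(planned_depths, sense, correct_gain=0.5):
    """Return the depths actually commanded, adjusted mid-run.

    planned_depths: intended cut depth per segment (e.g. mm)
    sense(cmd):     measured depth achieved after commanding `cmd`
    correct_gain:   fraction of the error carried into the next command
    """
    commanded = []
    carry = 0.0
    for target in planned_depths:
        cmd = target + carry          # apply correction from last segment
        actual = sense(cmd)           # measure the real outcome
        carry = correct_gain * (target - actual)
        commanded.append(cmd)
    return commanded
```

For example, with a tool that consistently under-cuts by 0.2 mm, the commanded depths creep upward segment by segment to compensate, which is the "re-adjusting as the prototype is being made" behavior in miniature.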
Some of my past work speaks to this concept, but most of it doesn't, really. Visit my website to see more of my projects: