Sensing the surroundings

We are the sensors

This week in general was very overwhelming. There were days, moments actually, when I was feeling a lot of clarity and things would start to make more sense. And then suddenly, it felt as if I had lost everything and hadn’t understood anything to begin with. I’m of the belief that ITP is actually a language school. The languages we are learning aren’t ones we necessarily use to verbally communicate with one another. Rather, we are exploring ways in which we understand what’s happening around us; we’re exploring how to convey to others what we’re constantly thinking about; we’re exploring ways in which we want to see change in the world. We’re communicating through the things we create and build. And we’re using a variety of languages to do so: Arduino, p5.js, Photoshop, Illustrator, GitHub, Unity, C#. These are some of the languages I’m currently learning, the languages I’m attempting to use to create my ideas. And like any language, the only way to learn one is to use it.

Learning about analog input and output has made more sense than anything we have done so far in PComp, I think because it’s the most like how humans take in and convey information. In the workshop on Tuesday, the residents gave a physical representation of how analog input and output work.

“Let’s say I receive information. I am reading the information. analogRead. And then I want to tell someone else. analogWrite.”

Just like humans, or possibly any living thing in the world, there’s no yes or no answer. We’re much more complicated than that. What’s my takeaway? We are the sensors here. We are the analog input and what we create is the output.
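
To make that concrete, here’s a minimal Arduino sketch in the spirit of the residents’ demonstration: the board reads the world with analogRead and tells someone else with analogWrite. It assumes a potentiometer wired to analog pin A0 and an LED on PWM pin 9, so the pin numbers are just placeholders for however you’ve actually wired things up.

// Minimal analog in / analog out sketch.
// Assumes a potentiometer on A0 and an LED on PWM pin 9 --
// adjust the pins to match your own wiring.

const int sensorPin = A0;  // potentiometer wiper
const int ledPin = 9;      // LED on a PWM-capable pin

void setup() {
  pinMode(ledPin, OUTPUT);
  Serial.begin(9600);      // so we can watch the readings, too
}

void loop() {
  int reading = analogRead(sensorPin);            // 0-1023: "I am reading the information"
  int brightness = map(reading, 0, 1023, 0, 255); // scale to the PWM range
  analogWrite(ledPin, brightness);                // 0-255: "I want to tell someone else"
  Serial.println(reading);
  delay(10);
}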

Below was my favorite of the labs we did this week. I loved it because it inspired me the most. Although I didn’t build anything on top of it, I thought of ways to apply it to future projects.

I love the idea of sensing light. I also love the idea of helping people feel more comfortable, especially when learning. Here at ITP we are absorbing tons and tons of information. Sometimes after long days, when I try to go to sleep, I won’t be able to turn my brain off. Sure, I could watch something mindless on Netflix, but that’ll still keep my brain busy. I want to create a device that senses the light in the room and plays a lullaby-type song. The sound could be altered and personalized for each person: the sounds of the ocean, rain, or even a calming song. I’d also love to adapt this for children.

Imagine a child’s mind whirling with constant wonder. They have so many thoughts going on at once, perceiving and sensing what they see, hear, or feel in the world. All they want to do is tell you about it. Ask questions. Get answers. Give their “output.” Then it’s time for bed. How do you calm their mind down? Other than reading, sometimes a child just wants to hear a song, which often results in an awkward lullaby. With that said, this device would sense the room getting darker, and a soft lullaby would play. It could even be personalized with the voice of a parent or loved one for when they’re sleeping at a friend’s house.
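
As a starting point, here’s a rough sketch of how the sensing side of that lullaby device might work. It assumes a photocell voltage divider on analog pin A0 and a piezo speaker on pin 8, and both the darkness threshold and the little melody are placeholders to tune by ear. A real version would swap the piezo for recorded audio (ocean, rain, a parent’s voice), but the reading-the-light logic would stay the same.

// Rough sketch of the lullaby idea: when the photocell reading drops below
// a threshold (the room getting dark), play a few soft notes on a piezo.
// Assumes a photocell + 10k resistor divider on A0 and a piezo on pin 8;
// the threshold and melody are placeholders.

const int lightPin = A0;
const int speakerPin = 8;
const int darkThreshold = 200;  // lower reading = darker room; adjust for your space

// A placeholder melody: frequencies in Hz and note lengths in ms
const int notes[] = {262, 294, 330, 294, 262};   // C4 D4 E4 D4 C4
const int durations[] = {400, 400, 600, 400, 800};

void setup() {
  pinMode(speakerPin, OUTPUT);
}

void loop() {
  int light = analogRead(lightPin);

  if (light < darkThreshold) {
    // Dark enough: play the lullaby once, gently
    for (int i = 0; i < 5; i++) {
      tone(speakerPin, notes[i], durations[i]);
      delay(durations[i] + 50);  // small pause between notes
    }
    delay(5000);  // wait a bit before checking the light again
  }

  delay(100);
}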

Interactive Technology

I have a few ideas for interactive technology:


  • CVS self-checkout
  • Movie theater kiosk
  • Disney World Fast-Pass kiosk

For each of these, I can’t help but think of Bret Victor’s rant, in that using a screen isn’t as interactive as it really could be. Nothing is changing shape or giving us clues that things are working the way they should.

I’m going to explore the interactive technology of self-checkout, specifically at CVS. I regularly shop at CVS. I have a personal reason for using self-checkout, and I’m curious whether my reason is similar to others’. Whenever I have to buy something I’m slightly embarrassed about, such as Gas-X, or an obscene amount of chocolate that’s obviously for me but that I’m pretending I will share, I will go to self-checkout. This idea ties into Norman’s article on emotional design, as the experience is then more pleasing for the user.

However, funnily enough, the actual interaction most of us have with self-checkout kiosks is not very pleasurable. This ties in with Norman’s ideas about affect and behavior. When you touch the screen but it registers a different button than the one you intended to choose, it’s common to feel frustrated. You might even feel embarrassed. Perhaps this negative affect will cause you to hurry up and finish your transaction before too large a line piles up behind you. Conversely, you might be so excited that virtually no one (that you know of) saw you buy four types of candy bars and a bag of beef jerky that you breeze through the checkout process.

Something I always find funny about self-checkout kiosks is that there is typically an employee monitoring the process. Either they’re keeping an eye on the kiosks, or they’re helping customers figure out how to make the machine stop telling them to place their items in the bag.