Final Project Concept

This project is all about how we “send messages.” We send messages intentionally, for example, when we text someone, and also unintentionally through our body language. This idea came from my interest in how we have grown to communicate in the digital age. As new technologies for communication are introduced, less of our bodies are being used. I want to explore what we lose from digital communication when we send messages to loved ones with our fingertips. What will happen to our bodies if technology continues to grow? How much control do we actually have when we send messages to people?

 

diagram_final-001

Final Project Proposal

I will be combining a final project for Live Web and Interaction Design Studio. My goal is to make a physical communication device for people with loved ones who live far away. What I find fascinating about long-distance communication is that we miss something that cannot be simulated. I want to find a way to not only communicate with someone far away, but to do it in a way that’s unobtrusive, private, and uses the body.

Questions:

How can we communicate using our bodies?

Can we use more of our bodies than our fingers to communicate?

Can I simulate what it feels like to communicate with someone who’s far away?

Initial idea:

fullsizerender-12

Above is an illustration of what this communication device would be like. Essentially, it would be a wearable that sends data about a person’s body movements through a socket server to another person. In the illustration above, I am proposing to use an accelerometer as an input; its readings would be sent to the socket server and then on to the other person, with a vibration as the output. That way, when person 1 sends a message, person 2 receives a vibration, giving the sensation of feeling someone else without them actually being there.
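The core of this idea can be sketched in code, separate from the hardware. This is a minimal sketch of the mapping between an accelerometer reading and a vibration command; the threshold value, the intensity scaling, and the message format are all my own assumptions, not a final design, and the socket-server transport is left out.

```python
import math

# Movement strength (in g) above which we treat motion as a "message" (assumed value)
SHAKE_THRESHOLD = 1.5

def movement_to_message(ax, ay, az):
    """Convert one accelerometer reading to a vibration command, or None if too gentle.

    The returned dict is a hypothetical payload that would be sent through
    the socket server to the other wearer's vibration motor.
    """
    magnitude = math.sqrt(ax**2 + ay**2 + az**2)
    if magnitude < SHAKE_THRESHOLD:
        return None
    # Scale vibration intensity (0-255, like PWM on a small vibration motor)
    # with how strong the movement was
    intensity = min(255, int((magnitude - SHAKE_THRESHOLD) * 100))
    return {"type": "vibrate", "intensity": intensity}
```

Resting still (about 1 g from gravity) produces no message, while a shake produces a vibration whose strength tracks the movement, so person 2 feels roughly how hard person 1 moved.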

fullsizerender-11

Music Interface Design

For this assignment we were asked to redesign the interface for a Max patch that Luke made. Since I am new to Max, I wanted to focus more on understanding the basic elements of Max and how to connect things to make sound happen. Below is a screenshot of the basic patch I made, which controls the sound based on the light in the video. I am not 100% happy with this and hope to continue working on it after this class.
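The underlying idea of the patch, light level driving sound, can be sketched outside Max as a simple mapping function. The function name and the 110–880 Hz range here are my own assumptions for illustration, not the scaling the actual patch uses:

```python
def brightness_to_frequency(brightness, low_hz=110.0, high_hz=880.0):
    """Map a pixel brightness (0-255) linearly onto a frequency range in Hz.

    In the Max patch, the analogous step would be scaling the video's
    luminance into a parameter for an oscillator.
    """
    brightness = max(0, min(255, brightness))  # clamp out-of-range values
    return low_hz + (brightness / 255.0) * (high_hz - low_hz)
```

A dark frame would produce a low tone and a bright frame a high one, which is the same one-to-one relationship the patch sets up visually with patch cords.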

screenshot-2016-10-17-19-06-39

Week 3: Connecting terms

WWND?

What would nature do?

For this assignment I was given four design terms to define and then connect into a narrative. The terms I was given were archetype, mimicry, scaling fallacy, and threat detection.

Definitions based on Lidwell’s Universal Design Principles:

Archetype:

Universal patterns of theme and form resulting from innate biases or dispositions; hardwired ideas and conceptions. For example, Harley-Davidson = the outlaw archetype.

Archetypes represent fundamental human motifs of our experience as we evolved; consequently, they evoke deep emotions.

http://www.soulcraft.co/essays/the_12_common_archetypes.html

Mimicry:

What would nature do? WWND

The act of copying properties of familiar objects, organisms, or environments in order to realize specific benefits afforded by those properties. Mimicry can be surface, behavioral, or functional, and it improves usability.

“I think the biggest innovations of the 21st century will be at the intersection of biology and technology. A new era is beginning.”- Steve Jobs referring to biomimicry

Scaling Fallacy:

A tendency to assume that a system that works at one scale will also work at a smaller or bigger scale. The fallacy occurs when a designer assumes usability will be retained when a design is scaled up or down, i.e., in performance and interaction (e.g., Big Screens).

Threat Detection:

The natural ability to detect threatening stimuli more efficiently than non-threatening stimuli.

Examples: Smoking advertisements, animal protection

Each of these design principles is related to nature and how we naturally see things. When I think of archetype, I think of the brand of an object or design and who is going to be using the product. Compare two types of vacuum designs: a manual one and a robotic one. For example, looking at a Dyson, its functionality is central to the design. It’s designed for people who want to get things done and get their house cleaned; the design reflects its function by exposing how it moves across the floor and collects dirt. Whereas looking at the Roomba, a robotic vacuum, we immediately know it doesn’t need assistance. The form of this vacuum shows us that it works autonomously. Just like threat detection, the design gives us information about what it does without our having to know very much about the product itself.

 

screen-shot-2016-10-02-at-3-10-16-pm screen-shot-2016-10-02-at-3-13-14-pm

Considering that the original vacuum designs were quite large, I would assume the vacuum went through many design iterations in order to scale it down to its modern-day size (scaling fallacy).

The biggest takeaway from understanding these design principles is that one of the main goals designers have is to design products that are intuitive and visually compelling. In order to do this, it’s best to pull from nature, from what people know and are comfortable with. Looking back at last week’s assignment, Good & Bad Design, BMW’s focus is to take the most modern car technology and combine it with the familiar. This way the user doesn’t have to learn two new things at once; rather, they are able to truly utilize the advancements in technology without an experience that is overly complicated or has a steep learning curve.

Examples of Mimicry:
(surface) Software icons that look like the actual objects they represent, e.g., the cloud.
(behavioral) Tamagotchi
A super popular gadget game in the ’90s, it mimicked the behavior of an actual pet. It was also ambiguous, allowing children to define what kind of pet, realistic or not, they were taking care of.
screen-shot-2016-10-03-at-2-40-12-pm
(functional) Claw Machines
The infamous claw machine seen at most arcades and amusement parks, which has likely haunted millions of people, functionally mimics a human hand grabbing for something, making users believe it’s not always about how many quarters you use.
screen-shot-2016-10-03-at-2-45-03-pm

Week 1

“The hurricane is coming, you have 20 minutes, get your stuff and go. You’re not going to be saying, ‘Well, that got an amazing write-up in this design blog’” –Objectified

Good design:

For this aspect of the assignment I decided to dissect the design of the BMW experience. This is more about how I feel, or my emotional journey of starting a BMW. This is a newer car, I believe a 2014 model, which means it has cool features like pushing a button to turn on the ignition and controlling the temperature of the heated seats.

I’m going to describe a few parts of the experience:

  • Opening the car door
  • Starting the ignition
  • Turning on/controlling the A/C and car temperature
  • Switching the gears between Park, Drive, and Reverse

The first part of the experience, which is not pictured, is opening the car door. There is no traditional key; I just need to have the key on my person. I actually had the key in a pocket in my backpack, and all I needed to do was pull the handle on the door. It typically unlocks the car right away. Sometimes I need to take the key out and put it closer to the car if the motion sensor in the car is not working properly (I’m guessing this is what happens).

img_5285

Next, I turn on the car. This is super easy and kind of amazing. There is a button behind the steering wheel, in about the same place where you would put the keys in a traditionally designed car. If you press the button without your foot on the brake, it will just turn the car battery on, meaning you can listen to the radio but cannot switch any gears. (Seen below)

img_5289

If you want to turn the ignition on, you must press down on the brake while pressing the button. As you can see in the image below, the display gives the user more information: an indication that there are more features to use now that the ignition is on.

img_5293

Now that the car is on, let’s turn on the A/C and make the car comfortable.

First, control the temperature by simply turning the knob that corresponds with your position, either passenger or driver.

Driver (left)

img_5299

Passenger (right)

img_5303

Now you can control the fan.

Increase air flow

img_5302

Decrease air flow

img_5300

The buttons indicate which air direction is selected. Below, all three are selected, which means the air is flowing to your feet, your legs/lower half, and your upper body.

img_5304

Finally, if you want to turn on the A/C, simply press the A/C button.

What is nice about all of these buttons is that a light indicator lets the user know that the feature is on. This is a very familiar feedback loop, and even though there are many selections to make, it is simple and fast to make them.

img_5301

Now to put the car in gear. The joystick in the center looks no different from any other gear selector I have seen; the unique difference is the interaction. What I really like about this is that because it is a familiar design, it invites the user to interact with it. It’s likely a user knows it must move back and forth in order to change gears. However, this one is different because its position is always the same, meaning to change from Park to Drive you simply tilt it backward, and to go from Park to Reverse you tilt it forward.

img_5295

Tilt forward to Reverse

img_5297

Tilt backward to Drive

img_5296

And to park, simply press the “P” button on the top of the stick.

img_5298

Another example of interaction design I have been thinking about lately is the flow of coffee shops and restaurants, more specifically how people order food or drinks. I tend to get slightly nervous when I go to stand in line and order something. Sometimes it’s not knowing how to pronounce something, or having to say slightly embarrassing names, like “FAT BURNER” or “HANGOVER HELPER.” If I am not hungover, it feels weird to order the juice with that name, as it implies I am ordering it while hungover. But this is beside the point. I also think about how an establishment designs the experience of ordering: how do they make people feel comfortable ordering even ridiculously named items?

Bad Design

At The Bean coffee shop in Williamsburg, here is the ordering counter from left to right. The menu stretches across almost the entire counter, starting from behind the dessert display. I am pretty near-sighted and can barely read any of the menu all the way to the left. The interesting thing is that the most visible items are very standard coffee shop items. Hot: coffee, tea, espresso, americano, etc. Iced: coffee, tea, latte, mocha, etc. Although people read these off the menu, typically these are items a customer already knows they want or don’t want. The Bean has a variety of innovative smoothies, coffee smoothies, acai smoothies, juices, and smoothie bowls. These are also the priciest items, yet they are the hardest to read. I can’t help but wonder why they designed the menu like this.

I am especially curious about the “Smoothie Bowls” sign pictured in the photo on the left. Customers stand from the center to the right of the counter. If there are more than two people in line, each person is pushed back even further to the right, which means by the time a person gets to the counter they might not have any clue what to order, as they haven’t been able to distinguish anything on the menu. Why not move the free-standing menu all the way to the right? I would be curious to learn whether this would increase sales of those items.

img_5306 img_5307

Bad/Good Design?

Another design I’d like to briefly discuss is the UMBRELLA. I have always thought the umbrella was such a funny thing we have naturally adopted into our society. It’s used differently in some cultures, typically based on climate: it’s usually used to protect from rain, but some use it to protect from sun. For whatever reason, I still feel more comfortable with it as a way to protect from the sun, even though I don’t use the umbrella for that. Similarly to the discussion in Objectified about the toothbrush, for umbrellas it seems that cost and advancements are based on the grip and on the spring mechanism used to open and close the umbrella. We all know a cheap umbrella won’t last: either the spring breaks, or the metal rods bend and no longer support the top cover. It definitely fits the bill as a simple design, as it is essentially just a cover we hold. What I find interesting is that no one has come out with a design where we don’t have to hold the umbrella. I mean, seriously, how many umbrellas are we going to lose before someone redesigns it?

Design evolution of cameras

Digital cameras vs. Film Cameras