For my third week in physical computing, I connected digital input and output circuits to a microcontroller. I first set up my breadboard as shown:

My breadboard with a pushbutton connected to the Arduino

I uploaded a program to my Arduino Nano, which also served as my 3.3V power source. Red and black wires in the upper-left column connect power and ground from my breadboard to the power and ground pins of my microcontroller. I connected a pushbutton to digital pin 12 with a yellow wire, and a 120-ohm resistor connects the pushbutton’s bottom-left pin to the ground bus on the breadboard. Afterwards, I connected a yellow and a red LED, each through a 220-ohm resistor, and two blue wires connect the LEDs to digital pins 11 and 10.
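A minimal sketch for this circuit might look like the following. This is a sketch of my own, assuming the pin reads HIGH while the button is pressed (the resistor pulls pin 12 low otherwise) and that pressing the button switches the lit LED from red to yellow:

```cpp
// Pin assignments from the wiring described above.
const int buttonPin = 12;  // pushbutton, pulled low through the resistor
const int yellowPin = 11;  // yellow LED
const int redPin = 10;     // red LED

void setup() {
  pinMode(buttonPin, INPUT);
  pinMode(yellowPin, OUTPUT);
  pinMode(redPin, OUTPUT);
}

void loop() {
  // With the resistor to ground, the pin reads HIGH only while the
  // button closes the circuit between 3.3V and pin 12.
  if (digitalRead(buttonPin) == HIGH) {
    digitalWrite(yellowPin, HIGH);
    digitalWrite(redPin, LOW);
  } else {
    digitalWrite(yellowPin, LOW);
    digitalWrite(redPin, HIGH);
  }
}
```

This compiles in the Arduino IDE rather than as standalone C++; the same logic works for any switch wired in place of the pushbutton.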

In my next step, I created a simple circuit using a mini pool table.

Hunson pool table

I covered the outer black plastic with aluminum foil, which connects to a blue wire that replaced one of my pushbutton pins. I covered the black 8-ball with copper tape and Sharpie ink, and the ball connects to a flexible wire that replaced the other pushbutton pin.

Close-up picture
Red LED light
Yellow LED light

Each time the black 8-ball entered one of the holes, the lit LED would change from red to yellow. In the future, I would like to create a pool table in which each hole triggers a different light.

Code on Github:

For the second lab, I read an analog input by connecting a variable resistor to a microcontroller.

Arduino Nano connected to a potentiometer and an LED

Red and black wires in the left-side rows connect the 3.3V and ground pins of the Arduino to the breadboard. The potentiometer is connected to the voltage and ground buses, and its center pin is connected to analog pin A0. The LED is connected through a 220-ohm resistor and a wire to digital pin 9 of my Arduino Nano.
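A minimal sketch for this circuit, assuming the goal is to control the LED's brightness with the potentiometer (digital pin 9 supports PWM on the Nano):

```cpp
const int potPin = A0;  // potentiometer center pin
const int ledPin = 9;   // LED through the 220-ohm resistor

void setup() {
  pinMode(ledPin, OUTPUT);
}

void loop() {
  int potValue = analogRead(potPin);                // 10-bit reading, 0-1023
  int brightness = map(potValue, 0, 1023, 0, 255);  // rescale to PWM range
  analogWrite(ledPin, brightness);                  // dim the LED via PWM
}
```

As with the first sketch, this compiles in the Arduino IDE; turning the knob sweeps the LED from off to full brightness.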

More LED lights

I connected two more LED lights in a series circuit. In my code, I created ledPin variables for digital pins 7 and 8 and set them as outputs.

Code on Github:

For my observation, I watched users interacting with a McDonald’s self-service menu. The menus are very large, sometimes even taller than the customers. When I entered the restaurant, two menus stood across from the cash register. I noticed that each interaction took a few minutes and was often longer than simply speaking to a worker. Users often kept their pointing hand dangling awkwardly in midair, and some consumers held on to the side of the menu for physical support.

The first screen is very easy to understand, but the screens that follow become more complex. Customers got stuck scrolling through all of the food options, and after each item is ordered, a deal screen urges them to order more food.

Bombardment of options

According to Crawford, interactivity is a cyclic process between two actors; in this case, the actors are the human and the menu. Since the menu is very tall, most items sit at eye level, but to complete or cancel an item, customers must tap a button at the bottom left or right, so users were constantly bending down to finish their objective. Each item must be ordered individually, which created further obstacles. I believe the menu would be characterized as having a low degree of interactivity because there is no active listening or speaking involved.

As Norman argued in “Emotion & Design,” feelings are significant in interactive technology. Attractive things that make users feel good make them more creative and more tolerant of obstacles. Even though the McDonald’s menu is designed poorly, consumers overlook these inconveniences because they are hungry. In this positive state on the visceral level, the consumer does not analyze every detail and instead focuses on the larger objective.