Final Final Project Proposal – Week 12

Safe Russian Roulette

Introduction

After doing a bit more research and some more ideation, I am now sure of exactly what I am going to create. Of course, there may be some changes down the road, but the project should be at least 90% like what I outline here.

Inspiration

I originally had this idea before the midterm, and it was going to be my midterm project, but I changed it to the Mexican Standoff game instead. What I really wanted for both of these projects was to make a game, or anything interactive, that allowed at least two people to enjoy the experience together.

I was also inspired by the indie game Buckshot Roulette. In it, the player faces a strange-looking dealer; both have lives, represented by lightning bolts, and there are abilities that make the game more tactical. I wanted to almost recreate the game, or at least its aesthetic, but for now I will leave out the abilities, since I need to finish the foundation first, i.e. plain Russian Roulette.

The lightning bolts represent the lives of both the dealer and the player because each of them can be revived with a defibrillator. It was the lightning bolts themselves that led me to the idea of a player receiving an electric shock.

The lives and the little menu on the side of the table then inspired me to make a p5.js sketch that has a similar aesthetic.

How Russian Roulette works
& How this game will work

In the real game of Russian Roulette, a revolver is loaded with one live round, and the rest of the chambers are left empty or loaded with blanks.

In Safe Russian Roulette there will also be at least one live “round”, with the rest blanks, for a total of 6 rounds. Depending on the difficulty chosen, between 1 and 5 rounds will be live. The difficulties, in ascending order, are Standard (1 live round), Hard (1–2 live rounds), Harder (1–3 live rounds), Nearly Impossible (2–4 live rounds), and Masochist (3–5 live rounds).
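As a rough sketch, the loading step might look like the following (plain C++ rather than Arduino code; the `Difficulty` struct and function names are my own, while the live-round ranges come from the difficulty list above):

```cpp
#include <algorithm>
#include <random>
#include <vector>

// Difficulty table from the proposal: min/max live rounds out of 6.
struct Difficulty { int minLive; int maxLive; };

const Difficulty STANDARD{1, 1}, HARD{1, 2}, HARDER{1, 3},
                 NEARLY_IMPOSSIBLE{2, 4}, MASOCHIST{3, 5};

// Load a 6-round chamber: pick a live count in [minLive, maxLive],
// then shuffle so neither player knows which rounds are live.
std::vector<bool> loadChamber(const Difficulty& d, std::mt19937& rng) {
    std::uniform_int_distribution<int> pick(d.minLive, d.maxLive);
    int live = pick(rng);
    std::vector<bool> chamber(6, false);
    std::fill(chamber.begin(), chamber.begin() + live, true); // true = live
    std::shuffle(chamber.begin(), chamber.end(), rng);
    return chamber;
}
```

On the actual Arduino the random source would come from `random()` seeded with an unconnected analog pin, but the shuffle idea is the same.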

There will be two buttons for each player: one to shoot themselves and one to shoot the other player. When players shoot themselves, they either get shocked (a live round) or draw a blank. If the round is live, they lose; if it is a blank, they get a second turn, which means the next round is more likely to be live, and therefore they are more likely to win if they shoot the other player. Each player has one life. It is essentially a game of chance.
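The turn rule could be sketched like this (names are mine; the proposal only specifies that a blank on yourself grants a second turn, so I am assuming that a blank fired at the other player passes the turn, as in Buckshot Roulette):

```cpp
// Turn resolution sketch: shooting yourself with a blank keeps your turn;
// a live round ends the game for whoever was shot.
enum class Target { Self, Other };

struct TurnResult {
    int  nextPlayer;  // whose turn comes next (0 or 1)
    bool gameOver;    // true if a live round was fired
    int  loser;       // only meaningful when gameOver is true
};

TurnResult resolveShot(int current, Target target, bool live) {
    int shot = (target == Target::Self) ? current : 1 - current;
    if (live) return {current, true, shot};                    // shot player loses
    if (target == Target::Self) return {current, false, -1};   // blank on self: go again
    return {1 - current, false, -1};                           // blank on other: turn passes
}
```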

Before the players start shocking each other, they will be greeted with a start page. Using a potentiometer (or knob of some kind) together with a button, the user can select between options displayed in p5.js, such as difficulty and retry.
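One simple way to turn the knob reading into a menu selection (the 0–1023 range is the Arduino's 10-bit ADC output from `analogRead()`; the function name is illustrative):

```cpp
// Map a raw potentiometer reading (0-1023) to one of numOptions menu
// entries by dividing the range into evenly sized bands.
int optionFromPot(int raw, int numOptions) {
    if (raw < 0) raw = 0;
    if (raw > 1023) raw = 1023;
    return (raw * numOptions) / 1024;
}
```

The button would then confirm whichever option the knob currently points at.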

There will also be lights of some kind in the middle of the table that flash in a certain way. At the start, they will display the number of live and blank rounds (the lights can change color: red = live, white = blank). Then, depending on the difficulty, they will either turn off completely or show the total number of rounds remaining; they will never reveal how many live or blank rounds are left, only the total at most.

After one of the players loses, the play screen will indicate who won and who lost, and then ask the players whether they want to play again. During the game it will also display the difficulty the players are playing at, so that others can see it and the players are reminded in case they forgot.

Redha Final Project Update – “Selfies for Two <3”

Upon developing the concept for my final project, I have decided to incorporate a more communal element into my work as a means to utilise internet culture to physically bring people together and create tangible memories.

I will be using the same image classification process as described in my initial post but will create my own machine learning model using Google’s Teachable Machine. Doing this will enable the object classification to be more accurate and specific, ensuring a smoother interaction between users and the project. When gathering the sample images for this, I will make sure that the environment and framing is as close as possible to what the program will see on-site in order to ensure its accuracy while being open to the public.

In terms of the communal aspect, the project will require two phones (and therefore two people) to be detected by the program in order to generate the digital mirror. In making this choice, I hope to utilise the tendency we have to take photographs of ourselves as a means to bring us closer with our friends and loved ones, thus replacing an otherwise self-centred process with one that focuses on interpersonal connection. In order to complement and emphasise this point, I will be generating an ASCII-style digital mirror which fills the thresholded space with the text-heart emoticon ‘<3’. Not only does this link to the theme of interpersonal connection, it also refers back to the theme of internet culture which was ultimately the influence behind my project.


Raya Tabassum: Reading Response 7

“Design Meets Disability” provides a rich examination of how design intersects with disability, with a focus on both the historical context and contemporary innovations in the field. It delves into specific examples like eyewear, hearing aids, and prosthetics, and underscores a cultural shift from viewing these items merely as medical aids to considering them as fashion statements and expressions of personal identity.

As someone interested in design, I find the intersection of fashion and functionality particularly fascinating. The way eyewear evolved from a stigmatized medical device to a stylish accessory exemplifies how cultural perceptions of disability and assistive devices can shift dramatically. This shift challenges us to think about other devices in the same light. Could hearing aids or prosthetics become similarly fashionable? This idea encourages a reconsideration of what we define as “normal” and pushes the boundaries of inclusivity in design. From my perspective, embracing design innovation in disability aids not only enhances functionality but also boosts the self-esteem of users. If more designers viewed these aids as opportunities for creative expression, we might see a broader acceptance and desire for these products, much like what happened with eyeglasses.

Week 12 : Pi Final Project Progress

As for the progress on my final project this week, the body frame is already built. I am still assembling the legs (they take time).

Also, in terms of the electronics, I need to order an additional Li-Po battery and a voltage converter to supply the power to the servo motors.

Also, for the cardboard castle above the walker, I began generating the textures in Midjourney. It is still in the process of being assembled.

I unfortunately lost my ESP32 chip, so I need to reorder it.

Raya Tabassum: Final Project Concept & Progress

Finalized Concept for the Project:
“Interactive Musical Garden” is an immersive installation combining technology and nature, designed to create a harmonious interaction between physical and digital realms. The project features a set of 3D-printed roses, each equipped with capacitive touch sensors and LEDs, laid out on a cardboard surface to mimic a garden. Interacting with these roses triggers specific musical tones and stimulates the blooming of virtual flowers on a screen, managed via a P5 sketch. This setup aims to explore the symbiotic relationship between touch-based interactions and visual-audio feedback, enhancing the user’s sensory experience through art and technology.

Design and Description of the Arduino Program:

Inputs: The Arduino will use capacitive touch sensors placed beneath each 3D-printed rose. These sensors detect the presence and duration of touch.
Outputs: Corresponding to each sensor, LEDs under the roses light up when a touch is detected. Additionally, the Arduino will send serial data to the P5 sketch indicating which rose was touched.
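The per-loop sensor-polling logic could be sketched as follows (plain C++ rather than Arduino code; the four-rose count matches the prototype-plus-three in the to-dos, but the threshold is a placeholder that would need tuning per sensor):

```cpp
// Given the latest readings from the capacitive sensors, return the index
// of the touched rose, or -1 if no reading crosses the threshold. The
// caller would then light that rose's LED and send the index over serial.
const int NUM_ROSES = 4;
const int TOUCH_THRESHOLD = 500;  // placeholder; calibrate per sensor

int touchedRose(const int readings[NUM_ROSES]) {
    for (int i = 0; i < NUM_ROSES; ++i)
        if (readings[i] > TOUCH_THRESHOLD) return i;
    return -1;
}
```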

Interactive Elements:

Visual Outputs: The P5 sketch will display a dynamic garden where each touch on a physical rose causes a new type of flower to bloom at random positions on the screen.
Audio Feedback: Different musical tones are played corresponding to which rose is touched, enhancing the multisensory experience.

User Testing Goals:

Usability: Assess the ease of interaction and intuitiveness of the touch interfaces.
Responsiveness: Measure the system’s responsiveness and the synchronization between physical touches and digital responses.
Aesthetic and Sensory Feedback: Gather feedback on the visual and auditory elements to determine if they effectively enhance the user experience.

To-Dos:

  • Begin assembling the components as per the design (I couldn’t get all the sensors yet because of unavailability but as soon as I get them I’ll try to assemble everything).
  • Complete printing all the roses (I’ve one prototype now but I’d need to replicate another 3).
  • Test the initial versions of the Arduino and P5 code (I’ve written the main code, I’ll be repurposing the Virtual Garden p5 assignment I’d completed, and I’ve shortlisted which sound files I want to use).

Reading Response – Disability and Design

I feel that this week’s reading provides interesting insight into the disability design space, specifically in how referencing other spaces can both support and limit the development of design.

The comparison made between the design approach to glasses and hearing aids led me to consider this point. Over time, the destigmatisation and eventual popularisation of wearing glasses allowed designers to make bolder stylistic choices when producing new glasses. Comparatively, as hearing aids slowly become destigmatised, the reading points out that some designers have started to adopt elements from eyewear in an attempt to push the development of ‘HearWear’. While the effectiveness of a design is ultimately for the user to decide, I feel that this approach introduces even more constraints in a context already limited by social stigma and technical capabilities (how well the hearing aid works, and how seamlessly it can be integrated into a design, discreet or otherwise).

In response to the quote by Charles Eames stating that “design depends largely on constraints”, I do believe that too many constraints can hinder rather than progress design. In this case, I feel that the aforementioned limitations offer enough constraints for designers to experiment and develop different approaches that offer a comfortable solution for users. By adding a stipulation of following the conventions of eyewear, I feel that the designer in this case has prioritised their own creative vision over the usability and practicality of their design.

Week 12 Reading – Jihad Jammal

The reading offers a profound examination of the role design plays in the realm of disability, challenging the conventional perception that design for disability must focus solely on functionality. The discussion centers around the transformative potential of integrating aesthetic and cultural elements into the design of disability aids, proposing that these elements are not merely supplementary but integral to redefining these aids within broader societal contexts.

Historical examples, such as Charles and Ray Eames’s plywood leg splints, serve as a foundation for arguing that design constraints related to disability can catalyze broader innovations in design practice. The Eames’s ability to transform a functional object like a leg splint into a design that influenced their later iconic furniture pieces illustrates how solutions born out of necessity can transcend their origins to impact broader design disciplines. This notion is further reinforced through the transformation of eyewear from a stigmatized medical necessity into a fashionable accessory, which underscores the potential for disability aids to evolve beyond their functional inception towards cultural significance.

However, this integration of aesthetics raises questions about the balance between form and function, reflecting a potential bias towards design that may overlook practical user needs like accessibility and affordability. While the reading persuasively invites a rethinking of disability aids as elements of personal identity and expression, it also prompts critical reflections on ensuring these designs remain accessible and practical. The challenge lies in achieving a harmonious integration where design innovations in the disability sector do not compromise on functionality, ensuring that these aids are not only culturally resonant but also remain true to their primary purpose of serving the needs of the disabled community.

Final Proposal_Week_12 – Jihad Jammal

Final Proposal for End of Semester Project

Instead of a joystick-controlled P5.js game, this project puts a slight twist on what can be considered a “joystick” by using a physical device shaped like a “clam”, which will interact with a digital environment displayed via P5.js. The core idea is to gamify the task of throwing wooden blocks (metaphorically representing tasks or challenges) into the clam, with the difficulty and visual feedback varying according to the intensity of the interaction.

Design and Functionality of the Arduino Setup

Inputs:

The Arduino will be equipped with pressure sensors embedded in the clam-like contraption. These sensors will detect the force exerted by the wooden blocks thrown into the “clam”. The primary input for the Arduino will thus be the varying pressure from these blocks.

Outputs:

The Arduino will control a motor mechanism responsible for opening and closing the “clam”. The speed at which the “clam” opens and closes will be directly proportional to the pressure sensed by the sensors – more pressure will result in faster movement.
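The proportional mapping could be as simple as the following sketch (plain C++; on the Arduino itself this is what `map(pressure, 0, 1023, 0, 255)` computes, assuming a 10-bit pressure reading driving an 8-bit PWM motor speed):

```cpp
// More pressure -> faster clam movement: scale a 0-1023 sensor reading
// linearly onto a 0-255 PWM duty cycle for the motor.
int clamSpeed(int pressure) {
    if (pressure < 0) pressure = 0;
    if (pressure > 1023) pressure = 1023;
    return (pressure * 255) / 1023;
}
```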

Interaction with P5.js:

The Arduino will send data to the P5.js program indicating the current pressure level detected by the sensors. This will inform the P5.js application about how quickly to adjust the visuals accordingly.

Design and Functionality of the P5.js Program

Inputs:

The P5.js program will receive data from the Arduino about the pressure levels. This input will determine the rate at which the in-game visual elements change.

Outputs:

The main output of the P5.js program will be the visual representation on the screen. The size of a dog character in the game will increase as more pressure is applied to the “clam” sensors. This size increment and the speed of the “clam’s” movements will serve as visual feedback to the player, creating a dynamic gaming experience.

Interaction with Arduino:

Apart from receiving pressure data from the Arduino, the P5.js program will also reset the game state (i.e. the “clam” speed and the size of the dog) in both the sketch and the Arduino once the game is over, at which point it will stop sending information back to the Arduino.

Conclusion

By integrating robust physical components with dynamic digital responses, the game not only becomes more engaging but also introduces a novel method of interaction in digital gaming environments. Through user testing and iterative design, this project will evolve into an innovative and enjoyable experience for users, showcasing the potential of hybrid physical-digital interaction platforms.

Final Project Proposal – Darko

Inspiration and Concept

Ever since I’ve come to Abu Dhabi, my passion for jet-skiing and boats has been growing and growing inside me. At the same time, I wanted to do something unique, something that nobody has ever done before. That is why for this final project I decided to make a boat. 

I want the boat to be controlled by the user's hand, with gestures: if you want to go left, you slide your arm to the left; if you want to go right, you slide your arm to the right. The same goes for speed: if you want to go faster, you move your hand closer to the sensor (kind of like a pedal).

Implementation

The implementation phase can be explained very easily. For the user's gestures, I would like to use a gesture sensor that is available on our booking website.

I have also found a 3D model for the boat and matched the size so that it can fit the Arduino, the breadboard, and the batteries. The plan is to attach a fan to a DC motor, and attach the DC motor to a servo, which will be used to set the direction. As the user moves their hand, the servo will adjust the angle accordingly and the DC motor will adjust the power too. The communication will be done through a wireless receiver provided by Pi.
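The gesture-to-actuator mapping might look like this sketch (plain C++; the input ranges, angle limits, and function names are my own assumptions, not values from the sensor's datasheet):

```cpp
// Horizontal hand position steers the rudder servo; hand distance sets
// the throttle -- the closer the hand, the faster the boat.

int rudderAngle(float handX) {      // handX in [-1, 1], left to right
    if (handX < -1) handX = -1;
    if (handX >  1) handX =  1;
    return 90 + (int)(handX * 45);  // servo angle 45..135, 90 = straight
}

int throttle(float distanceCm) {    // assumed useful range: 5..50 cm
    if (distanceCm < 5)  distanceCm = 5;
    if (distanceCm > 50) distanceCm = 50;
    return (int)((50 - distanceCm) / 45.0f * 255);  // PWM 0..255
}
```

On the boat side, the received angle would go to `Servo::write()` and the throttle to `analogWrite()` on the motor driver pin.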

I plan to use p5.js only as a “dashboard” showing the direction the boat is headed (straight, right, left) as well as the speed. It would be interesting if I could animate the boat so that it shows some fun animation if we are speeding up or turning too sharply.

Week 12: Final Project Proposal & Progress

Concept

For my final project, I am yet again choosing to go with a butterfly theme. This time, however, I plan to give it a little life – in the real, physical sense. With the ability to bring hardware components together and control them programmatically, we can simulate many things. I will attempt to do just that and bring a little life to a small butterfly.

My final project will essentially be a physical, mini-interface with a butterfly that reacts to human touch. The interface will display a butterfly atop a flower, undisturbed and resting peacefully. Once touched, its wings will flutter, awakening a swarm of butterflies that begins to flap away from beneath. The interface itself will be a physical surface on a table, with the animation, designed and implemented in p5.js, projected onto it. The interface will have a physical butterfly prototype (made from paper and laminated, perhaps) protruding in the middle. Once a sensor (a capacitive touch sensor) that is connected to it senses a signal, its wings will be activated and so will the butterflies’ animation in the backdrop. Its wings will be attached to two servo motors that will move once a signal is received from p5.

Materials and Hardware

  1. Arduino Uno and breadboard
  2. Capacitive touch sensor
  3. Two servo motors
  4. A general-purpose tripod
  5. A projector

p5.js

The p5.js sketch is responsible for creating and controlling the animation. For the creation of butterfly objects, I tried experimenting with generative art, but it proved to be quite challenging as I needed to simulate the smooth motion of flapping wings. I, hence, will be using standard images (spritesheets) to display the graphics. The p5.js sketch will consistently read the messages transmitted from the Arduino program, waiting for a touch signal to activate the motion of the butterfly wings and the subsequent animation. Once the animation begins, it will send messages to the Arduino program, communicating that the animation is still underway. These messages are going to be used to keep the servo motors, controlling the wings, in motion. Once the animation stops, as indicated by all the smaller butterflies that escaped from beneath the central butterfly leaving the screen display, the sketch will send a terminating message to the Arduino, which will be used to bring the motion of the servos to a halt.

In order to ensure a seamless mapping experience between the sketch and the physical surface, I will be using p5.mapper. This library allows you to calibrate your sketches dynamically in order to match the dimensions of your sketch with that of a surface, once projected.

Arduino

The Arduino program will have a touch sensor as input and servo motors as output. When a touch is detected, the program sends a signal to p5, and it will in turn receive messages used to control the duration of the motor-controlled wing flaps.
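The Arduino-side message handling described above could be sketched as a small state machine (plain C++; the message strings are my own stand-ins for whatever serial protocol is actually used between p5 and the Arduino):

```cpp
#include <string>

// Keep the wing servos flapping while p5 reports the animation is running;
// bring them to rest when the terminating message arrives.
enum class Wings { Resting, Flapping };

Wings handleMessage(Wings current, const std::string& msg) {
    if (msg == "ANIM_RUNNING") return Wings::Flapping;  // keep servos moving
    if (msg == "ANIM_DONE")    return Wings::Resting;   // halt the servos
    return current;  // unknown message: keep current state
}
```

Keeping the servo motion a pure function of the last message received should make the synchronization problem mentioned below easier to debug.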

Progress

p5

So far, I have the major parts of my sketch (the animation, serial communication, and projection mapping) completed for the most part. I want to work on the aesthetics of it a little more next and will have to modify the serial communication implementation to synchronize the movement of the motors.

https://editor.p5js.org/sarah-altowaity1/sketches/5eZkJXShI

Arduino

The main circuitry is wired up to connect the touch sensor and the motors. The program sends the sensor signal over to p5 for detection of touch. I, additionally, implemented the wing flap motion with the motors. However, I still need to work on the synchronization between the animation and the servo movements. I also need to figure out a way to conveniently and securely place the wings on the motor flaps and stabilize the components.

Sketch mapped onto a surface

Next Steps

The main next step is putting it all together and completing the technical aspects of synchronizing the animation with the physical components. I also need to work on designing a neat interface with stable components as I would like to reduce the overhead of setting up (and the possibility of things falling apart or out-of-sync) as much as possible.