Week 4: Thursday, 14 12 17. Presentation + exhibition set-up.

On Thursday, we discussed how to present our work at the exhibition. We planned to show the whole process of form finding with a slide show projected onto the wall and, additionally, to have all of our physical prints and form tests lying on a pedestal next to it. Furthermore, the wearables would be exhibited on pedestals as well, and we would be around to encourage people to test them. To show the creation of the artefact live, we would project the mapping of the sensor values, which happens live in Rhino and Grasshopper, onto the wall above our objects.

The entire code used for the two artefacts is in the following GitHub repository: empathyForAnxiety

 

 

Week 4: Wednesday, 13 12 17. Debugging + media production.

The day before the presentation, all our energy went into having working prototypes with which we could demonstrate our approach live at the presentation.

Our wearables were finished by Tuesday night, but there were still shields and drivers to solder and attach, cables to extend and values to map. Fernando locked himself in the Physical Computing Lab to finish this part, while Gabriel printed the second artefact of empathy and helped Fernando. Nadine edited the contents and the documentation for the presentation and took care of the media production for the final delivery on Friday.

After Gabriel and Fernando had finished all components, we attached and assembled everything and chose Nadine and Gabriel as the two subjects for the presentation, who would demo our wearables live. Nadine would be the scared one and Gabriel the empathetic one. We mapped the values according to their physical state, so the function of our work could be seen and experienced clearly.

 

To transfer the heartbeat between subject 1 (the person feeling the anxiety) and subject 2 (the empath), we chose to use vibration motors. Subject 1 has them at the wrist, subject 2 on the back. The four vibration motors each subject wears imitate, one after another, the heartbeat of the other person.

 


// Drive the four vibration motors one after another to imitate a heartbeat.
void vibrationLoop() {
  if (vibrationActive) {
    if (vibrationLastExec + vibrationInterval * 4 > currentTime) {
      // Each motor is on during its own quarter of the cycle.
      for (int i = 0; i < 4; i++) {
        vibrationStatus[i] = currentTime > vibrationLastExec + vibrationInterval * i
            && currentTime < vibrationLastExec + vibrationInterval * (i + 1);
      }
    } else {
      // Cycle finished: switch all motors off and wait for the next trigger.
      for (int i = 0; i < 4; i++) {
        vibrationStatus[i] = false;
      }
      vibrationActive = false;
    }
    activateVibration();
  }
}

// Map the incoming value (constrained to 50..120 BPM) to a new motor
// interval and start a vibration cycle.
void updateVibrationInterval() {
  if (!vibrationActive && vibrationInput > vibrationInputThreshold) {
    vibrationInterval = map(constrain(vibrationInput, 50, 120), 50, 120, vibrationMapLow, vibrationMapHigh);
    vibrationLastExec = currentTime;
    vibrationActive = true;
  }
}

// Write the computed on/off status to the motor pins.
void activateVibration() {
  for (int i = 0; i < 4; i++) {
    digitalWrite(VIBRATION_PINS[i], vibrationStatus[i] ? HIGH : LOW);
  }
}

Week 4: Tuesday, 12 12 17. Fabrication wearable #01 + #02, coding.

On Tuesday, we wanted to finish the two final wearables and test how everything attaches together. Since we knew that the prototype of wearable #1 was working and we only had to slightly adjust the position of the components, we had mainly been focusing on the complex and time-consuming wearable #2 over the last days. But of course it needed rework, and so Fernando, the only one of us with access to the textile workshop, fabricated our final wearable #1, while Gabriel and Nadine focused on wearable #2.

Fabricating the right prototype for wearable #2 brought us a lot of insights, and we went into an iteration phase, noting what could be improved and where the weak points were: especially the right order of the components to keep the cables as short as possible and everything tidy, enough space to exchange the battery pack, the positioning of the motor with its integrated spool, and how it was fabricated. We had been testing and prototyping with an Arduino Uno, but our final prototype had to use a MKR1000 to be mobile.

After Fernando finished his part of the fabrication, he designed and printed a shield for the MKR that triggers the stepper motor, and went on to the coding, which was almost done but still had issues with the two sensors on one board interfering with each other. He went back to Joël several times for intense debugging and ended up recoding and adjusting the library to fit our needs. This took him nearly the whole day, which made it impossible for us to test the installation as a whole and to finely map our sensor data to correctly trigger the vibration motors and the stepper motor.
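As a point of reference for how the shield could drive the stepper, here is a minimal sketch using the standard Arduino Stepper library; the pin numbers, steps per revolution and mapping range are assumptions for illustration, not the calibrated values from our repository:

#include <Stepper.h>

// Assumed values: steps per revolution and driver pins depend on the
// actual motor and the shield Fernando printed and soldered.
const int STEPS_PER_REV = 200;
Stepper beltStepper(STEPS_PER_REV, 6, 7, 8, 9);

int lastPosition = 0;  // current winding position in steps

void setup() {
  beltStepper.setSpeed(60);  // RPM
}

// Wind or unwind the belt so its tension follows the mapped skin
// conductivity value of subject 1 (0..1023 from analogRead).
void updateTension(int skinValue) {
  int target = map(constrain(skinValue, 200, 900), 200, 900, 0, STEPS_PER_REV);
  beltStepper.step(target - lastPosition);  // move relative to the current position
  lastPosition = target;
}

void loop() {
  updateTension(analogRead(A0));  // placeholder input for illustration
  delay(100);
}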

 

 

 

Week 3: Thursday, 07 12 17. Fabrication right prototype wearable #02, coding.

After setting our goals in a team meeting after the mentoring, Gabriel and Nadine focused on fabricating and refining the prototypes for the output device of subject 2. It should translate the sensor data sent by the pulse sensor of subject 1 into the vibration motors placed around the wrist of subject 2. For this, we wanted to use a glove that could also carry the sensors needed for creating our object of empathy. The values of the other sensor, the skin conductivity sensor, are needed to control a stepper motor located around the chest of subject 2, which winds up a string or belt in order to make the tension of subject 1 experienceable. We sketched and discussed several materials and techniques, especially for the belt, which has to be thin enough to transport an incisive feeling, yet wide enough to feel tension.

Fig. 01: detail sketches of our back output device

 

 

 

Fig. 02: detail sketches of our back output device

We rapidly went into fabricating and testing while making, and ended up with a base layer made of metal, roughly adapted to the shape of a back, which also carries the uncomfortable and cold feeling that comes with the material itself. Technically, for this part of our device we had to assemble a MKR1000 (in this prototype phase we used an Arduino Uno, since Fernando was using the MKR1000 to make the setup mobile) to receive the necessary sensor information to run our motor and the vibration motors, a 3.7 V battery, and a step-up converter that raises the voltage to the 12 V needed to run the stepper motor with the cord. For this set of components we bent metal cases, which we spot-welded onto the base layer to hide and protect the sensitive technical parts. The outcome is an unfriendly and rough-looking wearable, worn on the back of subject 2. Together with the sleeve, which will be produced on Tuesday, it creates an immersive empathic experience.

 

 

 

 

Fig. 03 + 04: sketches of the metal cases used for protection

 

While Gabriel and Nadine were focusing on the fabrication, Fernando took care of the coding, which included mapping the sensor values to a sensible range that could be digested by the MKR1000 and the output devices. This took a lot of time, since the MKR1000 was not able to handle the raw values of the skin conductivity sensor and stay mobile at the same time.
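To give an idea of what this mapping looks like, here is a sketch of the smoothing and scaling involved; the window size and input bounds are assumptions for illustration, since the real bounds had to be calibrated per subject:

// Running average to tame the noisy skin conductivity readings.
const int WINDOW = 16;
int samples[WINDOW];
int sampleIndex = 0;
long sampleSum = 0;

// Return a smoothed reading mapped into a 0..100 range the output
// devices can digest. The input bounds (300..800) are assumed here.
int readSkinLevel(int pin) {
  int raw = analogRead(pin);
  sampleSum -= samples[sampleIndex];  // drop the oldest sample
  samples[sampleIndex] = raw;         // store the new one
  sampleSum += raw;
  sampleIndex = (sampleIndex + 1) % WINDOW;
  int avg = sampleSum / WINDOW;
  return map(constrain(avg, 300, 800), 300, 800, 0, 100);
}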

Fig. 05: outcomes / conclusions of the prototype

 

Fig. 06 + 07: sketches of the sleeve

The pulse sensor library provided by the manufacturer of the sensor wasn’t meant to be used with the Arduino MKR1000. The code uses interrupts to calculate the heartbeat, which isn’t supported on the MKR1000. In collaboration with Joël Gähwiler, Fernando refactored some of the C++ library to remove the dependence on the interrupt mechanic. In the end, the values weren’t as accurate as before, but the whole setup is now mobile and can be used in any location in combination with batteries and a cellphone WiFi hotspot.


// Once the refactored library reports the start of a beat, transmit the
// current BPM and the raw skin conductivity reading over the network
// client, throttled to one transmission per SEND_INTERVAL.
if (pulseSensor.sawStartOfBeat()) {
  if (currentTime - lastTransmit > SEND_INTERVAL) {
    lastTransmit = currentTime;
    client.publish(PUBLISH_BPM, String(pulseSensor.getBeatsPerMinute()));
    client.publish(PUBLISH_SKIN, String(analogRead(PIN_SKIN)));
  }
}

 

 

Week 3: Wednesday, 06 12 17. Mentoring


11.20, Room 5.T04

with Joelle Bitton, Florian Wille, Luke Franzke, Joel Gähwiler

 

Further proceedings

  • make it mobile, which enables us to gather sensor data in real fear situations -> make the MKR 1000 usable
  • testing fabrication methods: the laser is not usable because it does not correctly represent the multilayered data; the 3D print / Form 1 is able to do so
  • refine the sensor wearable

 

 

 

Week 3: Wednesday, 06 12 17. Embodied suffering

We have been focusing on a way to enable subject 2 to establish empathy, which is only achievable by creating an immersive experience through perceiving the emotions of subject 1 in multiple ways. Therefore, we have been looking for a way to use several sensors to measure the skin conductivity and pulse of subject 1 and feed them back to subject 2 in a tangible and experienceable way.

 

Fig. 4: diagram depicting the feedback loops and relations of the components

 

Subject 1 to subject 2

After analyzing the physical expression of fear, we came to the conclusion that this feeling can be mimicked or communicated, firstly, by a device that tightens the chest to simulate the pressure on it and the constricted breathing, and secondly, by simulating the feeling of something giving you a chill, or creeping horrors, with a device that vibrates and additionally cools the spine area.

Practically, we developed a harness, worn by subject 2, with a string around the chest that tightens and eases according to the skin conductivity values generated by subject 1, and another belt running down subject 2’s spine, which contains vibration motors and a cooling and heating element and is controlled by the pulse sensor values generated by subject 1.
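For the cooling part, a plausible wiring is to switch the element through a transistor; here is a minimal sketch of the idea, where the pin and the BPM threshold are assumptions rather than our final values:

// Assumed pin: the cooling element is switched through a transistor.
const int PIN_COOLING = 5;
const int BPM_CHILL_THRESHOLD = 100;  // assumed, needs tuning per subject

void setup() {
  pinMode(PIN_COOLING, OUTPUT);
}

// Cool the spine area whenever subject 1's heart rate spikes,
// imitating the chill that comes with fear.
void updateChill(int bpm) {
  digitalWrite(PIN_COOLING, bpm > BPM_CHILL_THRESHOLD ? HIGH : LOW);
}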

 

 

Fig. 5: wearable empathy harness

 

Subject 1, subject 2 and depicting empathy

These values, generated by subject 1, can be used to control a graph or a shape. To visualize empathy, we also need values from subject 2. As a result, we receive one value from subject 1 and one value from subject 2, which are used to draw the object of empathy. In terms of a visual language, we looked for a metaphor able to visualize both the feeling and the empathy. Practically, we settled on a geode, which has a visible, jagged outer surface, in our case the visible fear, and an inner part with a smoother surface that displays the empathy.

 

Fig. 6: sketched object of empathy as a tangible and interpretable result

 

 

 

Fig. 7.0: 3d-modelled object of empathy as a tangible and interpretable result. The appearance of the outer surface and the inner part can be controlled with sensor data

 

 

 

Fig. 7.1: the geode in Grasshopper with sliders for simulating sensor input

 

 

 

Fig. 7.2: 3d-modelled object of empathy as a tangible and interpretable result. The appearance of the outer surface and the inner part can be controlled with sensor data

 

Next steps + questions

  • fabricating the harness, connecting sensor + Arduino to act according to sensor values
  • use that connection to additionally trigger the object modelled in Rhino
  • in terms of the final fabrication of the object: testing waffling and slicing as well as different ways of fabricating (Form 1, Ultimaker, laser cutting)
  • What could be the feedback from subject 2 to subject 1? Could it be a visual or a tangible image?
  • We should try to enrich the experience for subject 2 material-wise (use metal to emphasize the hot-cold experience at the neck, make it look rough, …)

 

 

 

Week 3: Tuesday, 05 12 17. Rorschach test

The intermediate presentation broadened our view on empathy, particularly regarding how empathy is perceived. Thus, we iterated on our concept draft and did some research on how empathy is communicated, mostly in psychological contexts, but also on how emotions, feelings and perception itself can be communicated in order to make them understandable for others. While researching, we stumbled upon the Rorschach test. This psychological test uses inkblots to analyze the interpretations of a test person in order to establish a common ground from which a broader understanding of the counterpart can be derived. Since this is a very famous approach in the field of communicating emotions, although regarded critically in terms of its validity, we investigated it in particular.

 

 

Fig. 1: exploring the Rorschach test

 

Our approach was to express this form of empathy in a visual way, so we used the test’s typical pattern of mirrored forms, which create an interpretable image. We took the sensor data gained from the skin conductivity sensor on subject 1 and the pulse sensor on subject 2 and loaded it into Grasshopper, which tells Rhino to draw a continuing graph for each of the two sensors.

 

 

 

Fig. 2: first tests with loading sensor data into Rhino via Grasshopper

 

 

 

Fig. 3: first result of generating graphs with two sensors

 

 

We mirrored this graph and thus created a 3D Rorschach test, which shows the anxiety of subject 1 on one axis and, on the other, the emotions of subject 2 while watching and listening to subject 1. The result is an object of empathy for this particular moment.

 

 


Fig. 4: first result of using two sensors to create a shape

 

Based on this first result, we picked up our previous thoughts concerning multilayered visualization, and are now searching for more ways to communicate and express emotions, in our case focusing on anxiety, in a multilayered way. This enables subject 2 to dive profoundly into the mindset of subject 1, gain a broader understanding of their perception and thus establish empathy.

 

 

 

Week 2: Friday, 01 12 17. Contemplating about empathy

What struck us was the possibility our idea offered to talk about empathy in a more experienceable way. From there, we contemplated how to visualize or communicate it.

 

Questions

Is it possible to find a method to communicate one’s perceptions of sensations or fear to somebody else?

Could that be a new approach for psychologists or doctors to understand their patients more profoundly by relating to their sensations?

Week 2: Thursday, 30 11 17. Intermediate Presentation

Presentation PDF

Week 2: Wednesday, 29 11 17. Prototype #01: Vibration Sleeve

To gather a first set of data measuring the stress level caused by vibration on the body, we created a rapid prototype with a small vibration motor. Now we’re exposing the subjects to vibration and measuring the changes with a skin conductivity sensor.
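The logic of that rapid prototype boils down to a few lines; here is a sketch with assumed pins, alternating vibration on and off while logging the skin conductivity readings over serial so the two phases can be compared:

const int PIN_MOTOR = 3;   // assumed PWM pin for the vibration motor
const int PIN_SKIN = A0;   // assumed analog pin for the skin conductivity sensor

void setup() {
  pinMode(PIN_MOTOR, OUTPUT);
  Serial.begin(9600);
}

void loop() {
  // Alternate: 5 s of vibration, 5 s of rest, logging the sensor throughout.
  bool vibrate = (millis() / 5000) % 2 == 0;
  analogWrite(PIN_MOTOR, vibrate ? 180 : 0);  // moderate intensity
  Serial.print(vibrate ? "vibration," : "rest,");
  Serial.println(analogRead(PIN_SKIN));
  delay(200);
}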

Week 2: Tuesday, 28 11 17. Ideation #02: Uncomfortable Situations

After speech, we went further into the topic of “uncomfortable situations”. Can we recreate the feeling of anxiety, for example in a crowded train station?

The question arises of what kind of output we’ll have to fabricate.

Week 1: Thursday, 23 11 17. Ideation #01: Sounds to communicate

Are the sounds we use to communicate just the words we use? There’s much more nuance to it.

  • Pitch
  • Volume
  • Pronunciation
  • Speed

The surroundings of a scene also have an effect

  • Other people talking
  • Noise
  • Movement

Can we transfer the experience of sound to different senses like… touch?

Week 1: Wednesday, 22 11 17. Waste / Bodystorming

The traces we leave behind every day in the form of waste could be a way to collect data about the behaviour of an individual. As with every form of data collection, what could have happened is subject to interpretation.

 

 

 

Fig. 01: human waste traces.

 

 

As an exercise, I decided to track my waste over the course of a day. It’s a small insight into my activities over those 24 hours, not the full picture. Tracking should always be considered a window into someone’s behaviour, not a full account.

 

 

 

Fig. 03: tracked waste of Fernando’s day

 

 

Conclusions / questions

  • Was I stressed that day (12 cigarette butts)?
  • There aren’t many casings from food. Did I eat enough, or did I just eat food without casings?
  • What happened to the other sock?