Week 3: Thursday, 07 12 17. Fabricating prototype wearable #02, coding.

After setting our goals in a team meeting following the mentoring, Gabriel and Nadine focused on fabricating and refining the prototypes for the output device of subject 2. It should translate the sensor data sent by the pulse sensor of subject 1 into vibration, using vibration motors placed around the wrist of subject 2. For this we wanted to use a glove that could also carry the sensors needed for creating our object of empathy. The values of the other sensor, the skin conductivity sensor, are needed to control a stepper motor located around the chest of subject 2, which winds up a string or belt in order to make the tension of subject 1 experienceable. We sketched and discussed several materials and techniques, especially for the belt, which has to be thin enough to transmit an incisive feeling, yet wide enough to convey tension.

Fig. 01: detail sketches of our back output device

 

 

 

Fig. 02: detail sketches of our back output device

We went rapidly into fabricating and testing while making, and ended up with a base layer made of metal, roughly adapted to the shape of a back, which also carries the uncomfortable, cold feeling that comes with the material itself. Technically, for this part of our device we had to assemble an MKR 1000 (in this prototype phase we used an Arduino Uno, since Fernando was using the MKR 1000 to make the setup mobile) to receive the sensor information needed to run the stepper motor and the vibration motors, a 3.7 V battery, and an amplifier that raises the voltage to 12 V so the stepper motor can wind the cord. For this set of components we bent metal cases, which we spot-welded onto the base layer to hide and protect the sensitive technical parts. The outcome is an unfriendly, rough-looking wearable that can be worn on the back of subject 2. Together with the sleeve, which will be produced on Tuesday, it creates an immersive empathic experience.
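To illustrate the actuator side, here is a minimal Arduino sketch of how the received values could drive the two outputs: subject 1's heart rate sets the intensity of the wrist vibration motors, and changes in the skin conductivity value wind or release the chest belt via the stepper. Pin numbers, value ranges and the use of the standard Stepper library are assumptions for the sketch, not the exact prototype wiring; in the real setup the values arrive wirelessly from subject 1's sensors.

#include <Stepper.h>

const int PIN_VIBRATION = 5;                       // PWM pin driving the wrist vibration motors via a transistor (assumed)
const int STEPS_PER_REV = 200;                     // steps per revolution of the belt stepper (assumed)

Stepper beltStepper(STEPS_PER_REV, 8, 9, 10, 11);  // chest belt stepper on pins 8-11 (assumed)

void setup() {
  pinMode(PIN_VIBRATION, OUTPUT);
  beltStepper.setSpeed(60);                        // rpm
}

// Map subject 1's heart rate to the intensity of the wrist vibration.
void applyPulse(int bpm) {
  int intensity = constrain(map(bpm, 50, 150, 0, 255), 0, 255);
  analogWrite(PIN_VIBRATION, intensity);
}

// Map a change in subject 1's skin conductivity to belt tension:
// rising values wind the belt tighter, falling values release it.
void applySkin(int skinValue, int previousValue) {
  int steps = map(skinValue - previousValue, -100, 100, -STEPS_PER_REV / 4, STEPS_PER_REV / 4);
  beltStepper.step(steps);
}

void loop() {
  // In the real setup these values arrive wirelessly from subject 1's sensors;
  // the constants here are only placeholders for testing the actuators.
  applyPulse(80);
  applySkin(520, 500);
  delay(500);
}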

 

 

 

 

Fig. 03 + 04: sketches of the metal cases used for protection

 

While Gabriel and Nadine focused on the fabrication, Fernando took care of the coding, which included mapping the sensor values to a usable range that the MKR 1000 and the output devices can process. This took a lot of time, since the MKR 1000 struggled to handle the raw, hard-to-interpret values of the skin conductivity sensor while staying mobile.
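A minimal sketch of what this mapping can look like for the skin conductivity sensor, using a simple moving average to smooth the raw reading and map()/constrain() to clamp it into a range the output devices can digest. The pin number, averaging window and input range are assumptions, not Fernando's exact values.

const int PIN_SKIN = A1;     // analog input of the skin conductivity sensor (assumed)
const int SAMPLES = 16;      // size of the averaging window (assumed)

// Smooth the jittery raw reading with a simple moving average.
int readSkinSmoothed() {
  long sum = 0;
  for (int i = 0; i < SAMPLES; i++) {
    sum += analogRead(PIN_SKIN);
    delay(2);
  }
  return sum / SAMPLES;
}

// Clamp and rescale the smoothed reading into a fixed 0-255 range
// that the output devices can digest.
int mapSkinToOutput(int raw) {
  return constrain(map(raw, 200, 900, 0, 255), 0, 255);
}

void setup() {
  Serial.begin(9600);
}

void loop() {
  Serial.println(mapSkinToOutput(readSkinSmoothed()));
  delay(100);
}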

Fig. 05: outcomes / conclusions of the prototype

 

Fig. 06 + 07: sketches of the sleeve

The pulse sensor library provided by the manufacturer of the sensor wasn't meant to be used with the Arduino MKR1000: it relies on interrupts to calculate the heartbeat, which isn't supported on the MKR1000. Together with Joël Gähwiler, Fernando refactored parts of the C++ library to remove the dependency on the interrupt mechanism. In the end the values weren't as accurate as before, but the whole setup is now mobile and can be used in any location, in combination with batteries and a cellphone acting as a WiFi hotspot. The snippet below shows the transmit logic: whenever a heartbeat is detected and the send interval has passed, the current BPM and the raw skin conductivity reading are published.


// Publish at most once every SEND_INTERVAL, each time a heartbeat is detected.
if (pulseSensor.sawStartOfBeat()) {
  if (currentTime - lastTransmit > SEND_INTERVAL) {
    lastTransmit = currentTime;
    // Publish the current BPM and the raw skin conductivity reading.
    client.publish(PUBLISH_BPM, String(pulseSensor.getBeatsPerMinute()));
    client.publish(PUBLISH_SKIN, String(analogRead(PIN_SKIN)));
  }
}
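For reference, the PulseSensor Playground library also offers a polling mode that avoids interrupts entirely; the actual refactor by Fernando and Joël touched the library internals, so the following sketch is only an approximation of the idea, with assumed pin and threshold values.

#define USE_ARDUINO_INTERRUPTS false   // run the library in polling mode, no timer interrupt
#include <PulseSensorPlayground.h>

const int PIN_PULSE = A0;              // analog input of the pulse sensor (assumed)
const int THRESHOLD = 550;             // beat detection threshold (assumed)

PulseSensorPlayground pulseSensor;

void setup() {
  Serial.begin(9600);
  pulseSensor.analogInput(PIN_PULSE);
  pulseSensor.setThreshold(THRESHOLD);
  pulseSensor.begin();
}

void loop() {
  // sawNewSample() reads the sensor itself once enough time has passed,
  // so the heartbeat can be detected without any interrupt support.
  if (pulseSensor.sawNewSample()) {
    if (pulseSensor.sawStartOfBeat()) {
      Serial.println(pulseSensor.getBeatsPerMinute());
    }
  }
}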

 

 

Week 3: Wednesday, 06 12 17. Embodied suffering

We have been focusing on a way to enable subject 2 to establish empathy, which is only achievable through an immersive experience in which the emotions of subject 1 are perceived in multiple ways. Therefore, we have been looking for a way to measure the skin conductivity and pulse of subject 1 with several sensors and feed them back to subject 2 in a tangible, experienceable way.

 

Fig. 4: diagram depicting the feedback loops and relations of the components

 

Subject 1 to subject 2

After analyzing the physical expression of fear, we came to the conclusion that this feeling can be mimicked or communicated, firstly, with a device that tightens the chest to simulate the pressure on it and the constricted breathing, and secondly, by simulating the feeling of something giving you a chill or creeping horror with a device that vibrates and, additionally, cools the spine area.

Practically, we developed a harness, worn by subject 2, with a string around the chest that tightens and eases according to the skin conductivity values generated by subject 1, and a belt running down subject 2's spine that contains vibration motors and a cooling and heating element, controlled by the pulse sensor values generated by subject 1.

 

 

Fig. 5: wearable empathy harness

 

Subject 1, subject 2 and depicting empathy

These values, generated by subject 1, can be used to control a graph or a shape. To visualize empathy, we also need values from subject 2. As a result, we receive one value from subject 1 and one from subject 2, which together are used to draw the object of empathy. In terms of a visual language, we looked for a metaphor able to visualize both the feeling itself and the empathy. We settled on a geode: a jagged, peaked outer surface, in our case the visible fear, and an inner part with a smoother surface that represents the empathy.

 

Fig. 6: sketched object of empathy as a tangible and interpretable result

 

 

 

Fig. 7.0: 3d-modelled object of empathy as a tangible and interpretable result. The appearance of the surface and the inner part can be controlled with sensor data

 

 

 

Fig. 7.1: the geode in Grasshopper with sliders for simulating sensor input

 

 

 

Fig. 7.2: 3d-modelled object of empathy as a tangible and interpretable result. The appearance of the surface and the inner part can be controlled with sensor data

 

Next steps + questions

  • fabricating the harness, connecting the sensors + Arduino so the harness acts according to the sensor values
  • using that connection to additionally drive the object modelled in Rhino
  • in terms of the final fabrication of the object: testing waffling and slicing as well as different ways of fabricating (Form 1, Ultimaker, laser cutting)
  • What could be the feedback from subject 2 to subject 1? Could it be a visual or a tangible image?
  • We should try to enrich the experience for subject 2 material-wise (use metal to emphasize the hot-cold experience at the neck, make it look rough, …)

 

 

 

Week 3: Tuesday, 05 12 17. Rorschach test

The intermediate presentation broadened our view on empathy, particularly regarding how empathy is perceived. We therefore iterated on our concept draft and did some research on how empathy has been communicated, mostly in psychological contexts, but also how emotions, feelings and perception itself can be communicated in order to make them understandable for others. While researching, we stumbled upon the Rorschach test. This psychological test uses inkblots to analyze a test person's interpretation in order to establish a common ground from which a broader understanding of the counterpart can be derived. Since this is a very famous approach in the field of communicating emotions, although critically regarded in terms of its validity, we investigated it in particular.

 

 

Fig. 1: exploring the Rorschach test

 

Our approach was to express this form of empathy visually, so we used the test's typical pattern of mirrored forms, which together create an interpretable image. We took the sensor data gained from the skin conductivity sensor on subject 1 and the pulse sensor on subject 2 and loaded it into Grasshopper, which tells Rhino to draw a continuous graph for each of the two sensors.
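On the Arduino side, the simplest way to feed the two sensors into Grasshopper is to stream the raw readings over serial as comma-separated pairs, which a serial-reading component (such as the Firefly plug-in) can split into two continuous graphs. A minimal sketch, with assumed pin assignments and sample rate:

const int PIN_SKIN = A1;    // skin conductivity sensor on subject 1 (assumed)
const int PIN_PULSE = A0;   // pulse sensor on subject 2 (assumed)

void setup() {
  Serial.begin(9600);
}

void loop() {
  // One comma-separated line per sample; Grasshopper splits it into
  // the two continuous graphs shown in the figures below.
  Serial.print(analogRead(PIN_SKIN));
  Serial.print(",");
  Serial.println(analogRead(PIN_PULSE));
  delay(50);
}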

 

 

 

Fig. 2:  first tests with loading sensor data into Rhino via Grasshopper

 

 

 

Fig. 3: first result of generating graphs with two sensors

 

 

We mirrored this graph and thus created a 3D Rorschach test, which shows the anxiety of subject 1 on one axis and, on the other, the emotions of subject 2 while watching and listening to subject 1. The result is an object of empathy for this particular moment.

 

 


Fig. 4: first result of using two sensors to create a shape

 

Based on this first result, we picked up our previous thoughts on multilayered visualization and are now searching for more ways to communicate and express emotions, in our case focusing on anxiety, in a multilayered way. This lets subject 2 dive deeply into the mindset of subject 1, gain a broader understanding of their perception and thus establish empathy.

 

 

 

Week 2: Wednesday, 29 11 17. Prototype #01: Vibration Sleeve

To gather a first set of data for measuring the stress level caused by vibration on the body, we created a rapid prototype with a small vibration motor. We are now exposing the subjects to vibration and measuring the changes with a skin conductivity sensor.
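A minimal sketch of such a test rig: the vibration motor is pulsed in fixed intervals while the skin conductivity reading is logged over serial for later comparison. Pin numbers and timing are assumptions, not the exact prototype values.

const int PIN_VIBRATION = 5;   // vibration motor, driven via a transistor (assumed)
const int PIN_SKIN = A1;       // skin conductivity sensor (assumed)

void setup() {
  pinMode(PIN_VIBRATION, OUTPUT);
  Serial.begin(9600);
}

// Log the skin conductivity reading over serial for a given duration.
void logSkin(unsigned long durationMs) {
  unsigned long start = millis();
  while (millis() - start < durationMs) {
    Serial.println(analogRead(PIN_SKIN));
    delay(100);
  }
}

void loop() {
  // Alternate two seconds of vibration with two seconds of rest
  // while continuously logging the skin response.
  digitalWrite(PIN_VIBRATION, HIGH);
  logSkin(2000);
  digitalWrite(PIN_VIBRATION, LOW);
  logSkin(2000);
}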