Week 3: Wednesday, 06 12 17. Embodied suffering

We have been focusing on enabling subject 2 to establish empathy, which is only achievable through an immersive experience created by perceiving the emotions of subject 1 in multiple ways. We have therefore been looking for a way to use several sensors measuring the skin conductivity and pulse of subject 1 and to feed their values back to subject 2 in a tangible, experienceable way.


Fig. 4: diagram depicting the feedback loops and relations of the components


Subject 1 to subject 2

After analyzing the physical expression of fear, we came to the conclusion that this feeling can be mimicked or communicated in two ways: first, with a device that tightens around the chest to simulate the pressure on it and the constriction of breathing, and second, by simulating the sensation of a chill or creeping horror with a device that vibrates and additionally cools the spine area.

Practically, we developed a harness worn by subject 2. A string around the chest tightens and eases according to the skin conductivity values generated by subject 1, while a second belt running down subject 2’s spine contains vibration motors and a cooling and heating element, controlled by the pulse sensor values generated by subject 1.
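As a rough sketch of the mapping we plan to run on the Arduino (written here in plain Python for clarity; the value ranges, servo angles and the 100 bpm cooling threshold are our own assumptions, not measured values):

```python
def scale(value, in_min, in_max, out_min, out_max):
    """Linearly map value from one range to another, clamped
    (like Arduino's map(), but with clamping)."""
    value = max(in_min, min(in_max, value))
    span = (value - in_min) / (in_max - in_min)
    return out_min + span * (out_max - out_min)

def harness_outputs(skin_conductivity, pulse_bpm):
    """Translate subject 1's sensor readings into actuator targets
    for subject 2's harness."""
    # Higher skin conductivity (arousal) -> tighter chest strap
    # (assumed 10-bit analog reading mapped to a servo angle in degrees).
    strap_angle = scale(skin_conductivity, 0.0, 1023.0, 0.0, 90.0)
    # Faster pulse -> stronger vibration on the spine belt (PWM duty, 0-255).
    vibration_pwm = int(scale(pulse_bpm, 50.0, 150.0, 0.0, 255.0))
    # Cooling element switches on above an (assumed) pulse threshold.
    cooling_on = pulse_bpm > 100
    return strap_angle, vibration_pwm, cooling_on
```

On the actual board the same mapping would sit in `loop()`, reading the sensors with `analogRead()` and driving the servo and motors directly.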



Fig. 5: wearable empathy harness


Subject 1, subject 2 and depicting empathy

These values, generated by subject 1, can be used to control a graph or a shape. To visualize empathy, however, we also need values from subject 2. As a result, we receive one value from subject 1 and one from subject 2, which together are used to draw the object of empathy. In terms of visual language, we looked for a metaphor able to visualize both the feeling itself and the empathy. We settled on a geode, which has a jagged, visible outer surface, in our case the visible fear, and an inner part with a smoother surface that displays the empathy.
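One way the two values could shape the geode's two surfaces can be sketched as follows (plain Python; the radii, spike frequency and scaling factors are illustrative assumptions, to be tuned later with real sensor data):

```python
import math

def geode_profile(fear, empathy, points=36):
    """Return (outer, inner) radius lists for one cross-section of the geode.

    fear (subject 1, normalized 0..1) drives the spikiness of the outer shell;
    empathy (subject 2, normalized 0..1) smooths and grows the inner surface."""
    outer, inner = [], []
    for i in range(points):
        angle = 2 * math.pi * i / points
        # Outer shell: base radius plus fear-scaled spikes.
        spike = fear * 0.5 * abs(math.sin(angle * 7))
        outer.append(1.0 + spike)
        # Inner cavity: larger and smoother as empathy grows.
        ripple = (1.0 - empathy) * 0.1 * math.sin(angle * 3)
        inner.append(0.5 + 0.3 * empathy + ripple)
    return outer, inner
```

In Grasshopper the same logic would live in a GHPython component, with the two sliders (later: the two live sensor streams) as inputs and the radius lists lofted into the surfaces.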


Fig. 6: sketched object of empathy as a tangible and interpretable result




Fig. 7.0: 3d-modelled object of empathy as a tangible and interpretable result. The appearance of the surface and the inner part can be controlled with sensor data




Fig. 7.1: the geode in Grasshopper with sliders for simulating sensor input




Fig. 7.2: 3d-modelled object of empathy as a tangible and interpretable result. The appearance of the surface and the inner part can be controlled with sensor data


Next steps + questions

  • fabricating the harness, connecting the sensors + Arduino to act according to sensor values
  • use that connection to additionally drive the object modelled in Rhino
  • in terms of final fabrication of the object: testing waffling and slicing as well as different ways of fabricating (Form 1, Ultimaker, laser cutting)
  • What could the feedback from subject 2 to subject 1 be? Could it be a visual or a tangible image?
  • We should try to enrich the experience for subject 2 material-wise (use metal to emphasize the hot-cold experience at the neck, make it look rough, …)




Week 3: Tuesday, 05 12 17. Rorschach test

The intermediate presentation sharpened our view on empathy, particularly regarding how empathy is perceived. We therefore iterated on our concept draft and did some research on how empathy has been communicated, mostly in psychological contexts, but also on how emotions, feelings and perception itself can be communicated in order to make them understandable for others. While researching, we stumbled upon the Rorschach test. This psychological test uses inkblots to analyze a test person's interpretations in order to establish a common ground from which a broader understanding of the counterpart can be derived. Since this is a very famous, although critically regarded in terms of scientific defensibility, approach to communicating emotions, we investigated it in particular.



Fig. 1: exploring the Rorschach test


Our approach was to express this form of empathy visually, so we used the test’s typical pattern of mirrored forms, which together create an interpretable image. We took the sensor data gained from the skin conductivity sensor on subject 1 and the pulse sensor on subject 2 and loaded it into Grasshopper, which tells Rhino to draw a continuing graph for each of the two sensors.
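The graph construction itself is simple and can be sketched like this (plain Python standing in for the GHPython component; the step size is an arbitrary assumption):

```python
def sensor_graphs(skin_samples, pulse_samples, x_step=1.0):
    """Build one polyline per sensor: x advances by x_step with each
    incoming sample, y is the raw reading. New samples simply extend
    the point list, producing the continuing graph drawn in Rhino."""
    skin = [(i * x_step, v) for i, v in enumerate(skin_samples)]
    pulse = [(i * x_step, v) for i, v in enumerate(pulse_samples)]
    return skin, pulse
```

In Grasshopper the point lists would then be passed to a polyline component (or `rs.AddPolyline` in a script) to draw the two curves.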




Fig. 2:  first tests with loading sensor data into Rhino via Grasshopper




Fig. 3: first result of generating graphs with two sensors



We mirrored this graph and thus created a 3D Rorschach test, which shows the anxiety of subject 1 on one axis and, on the other, the emotions of subject 2 created while watching and listening to subject 1. The result is an object of empathy for this particular moment.
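The mirroring step amounts to reflecting each graph across its axis, which could be sketched as (plain Python; mirroring across the y-axis here, whereas the actual model mirrors in 3D):

```python
def rorschach_mirror(points):
    """Mirror a graph polyline across the y-axis to get the Rorschach-style
    symmetric outline. The mirrored copy comes first, reversed, so the
    combined curve runs continuously from left to right."""
    mirrored = [(-x, y) for x, y in reversed(points)]
    return mirrored + list(points)
```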



Fig. 4: first result of using two sensors to create a shape


Based on this first result, we picked up our previous thoughts concerning multilayered visualization and are now searching for more ways to communicate and express emotions, in our case focusing on anxiety, in a multilayered way. This lets subject 2 dive profoundly into the mindset of subject 1, gaining a broader understanding of their perception and thus establishing empathy.




Week 2: Tuesday, 28 11 17. Ideation #02: Uncomfortable Situations

Following the topic of speech, we went further into the topic of “uncomfortable situations”. Can we recreate the feeling of anxiety, for example in a crowded train station?

The question arises of what kind of output we’ll have to fabricate.

Week 1: Thursday, 23 11 17. Ideation #01: Sounds to communicate

Are the sounds we use to communicate just the words we use? There’s much more nuance to it.

  • Pitch
  • Volume
  • Pronunciation
  • Speed

The surroundings of a scene also have an effect:

  • Other people talking
  • Noise
  • Movement

Can we transfer the experience of sound to different senses like… touch?