We have been focusing on a way to enable subject 2 to establish empathy, which we believe requires an immersive experience created by perceiving the emotions of subject 1 in multiple ways. We have therefore been looking for a way to use several sensors measuring the skin conductivity and pulse of subject 1 and to feed these values back to subject 2 in a tangible, experiential way.
Fig. 4: diagram depicting the feedback loops and relations of the components
Subject 1 to subject 2
After analyzing the physical expression of fear, we concluded that this feeling can be mimicked or communicated in two ways: first, through a device that tightens around the chest to simulate the pressure on it and the constriction of breathing; second, through a device that vibrates and additionally cools the spine area to simulate the chill or creeping sensation of horror.
Practically, we developed a harness worn by subject 2. A string around the chest tightens and eases according to the skin conductivity values generated by subject 1, and a second belt running down subject 2’s spine contains vibration motors and a cooling and heating element, which are controlled by the pulse sensor values generated by subject 1.
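The mapping we have in mind could be sketched as follows. The value ranges, the linear mapping, and the function names are our assumptions for illustration, not calibrated values; the real mapping will run on the Arduino.

```python
def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly map a sensor reading to an actuator range, clamped to bounds."""
    value = max(in_min, min(in_max, value))
    span = (value - in_min) / (in_max - in_min)
    return out_min + span * (out_max - out_min)

def harness_outputs(skin_conductance, pulse_bpm):
    """Translate subject 1's readings into subject 2's harness settings.
    All ranges below are placeholder assumptions to be replaced by calibration."""
    # Higher skin conductance -> tighter chest strap (assumed servo angle 0-90 deg)
    strap_angle = map_range(skin_conductance, 0.0, 1023.0, 0.0, 90.0)
    # Higher pulse -> stronger vibration (assumed PWM 0-255) ...
    vibration_pwm = map_range(pulse_bpm, 60.0, 140.0, 0.0, 255.0)
    # ... and stronger cooling of the spine element (normalized 0-1)
    cooling_level = map_range(pulse_bpm, 60.0, 140.0, 0.0, 1.0)
    return strap_angle, vibration_pwm, cooling_level
```

Clamping keeps noisy sensor spikes from over-tightening the strap or overdriving the motors.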
Fig. 5: wearable empathy harness
Subject 1, subject 2 and depicting empathy
These values, generated by subject 1, can be used to control a graph or a shape. To visualize empathy, we also need values from subject 2. As a result, we receive one value from subject 1 and one from subject 2, which together are used to draw the object of empathy. In terms of a visual language, we looked for a metaphor able to visualize both the feeling itself and the empathy. Practically, we settled on a geode: its visible, spiked outer surface represents the visible fear, while its smoother inner surface displays the empathy.
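As a minimal sketch of how the two values could drive the geode geometry: the star-shaped cross-section below and the 0–1 value ranges are our own simplifying assumptions; the actual model is built in Grasshopper with sliders standing in for the sensors.

```python
import math

def geode_section(fear, empathy, n=24):
    """Return (outer, inner) point rings of a 2D geode cross-section.
    fear (0-1, from subject 1) controls how spiked the outer surface is;
    empathy (0-1, from subject 2) controls how smooth the inner surface is."""
    outer, inner = [], []
    for i in range(n):
        angle = 2 * math.pi * i / n
        # Outer ring: alternating radius creates peaks; amplitude grows with fear
        r_out = 1.0 + fear * 0.5 * (1 if i % 2 == 0 else -1)
        # Inner ring: residual bumpiness shrinks as empathy grows
        r_in = 0.6 + (1 - empathy) * 0.1 * (1 if i % 2 == 0 else -1)
        outer.append((r_out * math.cos(angle), r_out * math.sin(angle)))
        inner.append((r_in * math.cos(angle), r_in * math.sin(angle)))
    return outer, inner
```

With fear = 0 the outer ring collapses to a circle, and with empathy = 1 the inner surface becomes perfectly smooth, mirroring the metaphor in the text.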
Fig. 6: sketched object of empathy as a tangible and interpretable result
Fig. 7.0: 3d-modelled object of empathy as a tangible and interpretable result. The appearance surface and the inner part can be controlled with sensor data
Fig. 7.1: the geode in Grasshopper with sliders for simulating sensor input
Fig. 7.2: 3d-modelled object of empathy as a tangible and interpretable result. The appearance surface and the inner part can be controlled with sensor data
Next steps + questions
- fabricating the harness, connecting the sensors and the Arduino so the actuators respond to the sensor values
- using that connection to additionally drive the object modelled in Rhino
- in terms of final fabrication of object: testing waffling and slicing as well as different ways of fabricating (form1, Ultimaker, laser cutting)
- What could be the feedback from subject 2 to subject 1? Could it be a visual or a tangible image?
- We should try to enrich the experience for subject 2 material-wise (use metal to emphasize the hot-cold experience at the neck, make it look rough, …)