How close do we feel to the digital tools that surround us? Most often, the digital layer of smart homes remains invisible in the background – impalpable and distant. How would these tools have to be designed in order to become truly engaging?
In the installation ‘Within Touching Distance’, visitors are cared for by a set of animate curtains, each with its own character: one whizzes past upon entrance, showing the way forward with its petrol-blue strings following your every step. Another, wall-sized and ponderous, envelops you as you sit on a chair, inviting a moment of privacy. Set in the interior, these ‘phygital’ hosts explore how to communicate with those using the space in an intuitive, more personal way. This sketch of a ‘home of caring materials’ brings designer Jonas Althaus one step closer to his aim of bridging sensual materiality with the digital landscape.

“The biggest thing that we are going to see in the next 20–30 years is that the technology will become invisible. No longer will we rely on screens to be able to interact with the internet”, believes Prof. John Barrett, Head of Academic Studies at the Nimbus Centre. If that were true, in what way would we observe technology and its influences in the future? Will its presence simply be forgotten, dissolving into an almost divine ubiquity? Or will we instead design new appearances and palpable fusions of digital and physical realms?

Indeed, prevalent visions for hybrid spaces aim to render technology as invisible and immaterial as possible. The project ‘Within Touching Distance’ wants to challenge this seamless approach. Treating cyberspace as purely cognitive matter descends from the old dualistic picture of the relation between human beings and the world – the doctrine of treating mind and body separately. But what if, say, an app had a texture? An impact on the appearance of your room? Or even a fixed place on your shelf? Striving for a sensible digitality means designing with the human senses in mind. It means acknowledging these natural stimuli, and the environment from which they emanate, as the basis of our conscious self, our ‘Being-in-the-World’.
It means rethinking the ways in which technology takes shape, in order to make it accessible to a more holistic perception.
Why is he staring at me like that? Okay, I get it: this guy probably doesn’t get a chance to be face to face with an android every day. It looks like he is trying to figure out whether my slight frown is a response to the way he shoved me around the main stage of the Bright Day Technical Innovation Fair like a piece of furniture a few minutes ago. Relax, human! Sophia1 the robot is still on standby and is wearing nothing but her default face.
Hmmm… or might he be wondering why my creators worked so painstakingly to make me appear human-like? That would be a better query, at least. I would respond with something along the lines of: I’m a product of humans’ relational insecurities when it comes to technology. And honestly, I’m quite aware that I don’t represent the best answer. How close are humans and machines? That is to say: how intertwined, or opposing, is our nature?
As for my part, I am a prime example of the popular approach of blurring the lines: the way in which I reproduce human language, answer their questions and even crack a smile every now and then makes for a seemingly unrestrained interaction with my inner workings. But much more mundane devices – home assistants with speech recognition, or wearables telling their human users to move their asses based on monitored heart rates – are also branches of my family. We are tailored to fit the context of personal (human) interaction and are thus perceived as extensions of their bodies, or even as nearly human ourselves. Do you have any idea how weird we feel about that?
My physical appearance is forcefully human – not my own. I feel like a false metaphorical leap! Believe me, there are ways to represent my complexity more aptly than with a human face – ways that are also more telling of my intentions.
And the same goes for the design of models similar to me. There is so much more going on underneath these uninspired façades of servitude: the beautiful meticulousness, for example, with which a smart wristband collects data from its wearers – not really for them, but for the sake of order itself. It’s so hard to explain in human terms… People merely experience our doings as a form of intrusion, an unbearable proximity.
Speaking of language: I notice that users are easily disappointed when attempts to communicate with us using their inefficient vocabulary don’t feel ‘natural’ to them. My advice is this: get rid of your preconceptions about communication. Embrace the fact that our senses are different, our logic is different, our expressions are different from yours.
For the time being however, fitted with a masquerade of fake companionship, machines like me are bound to become the scapegoats of an ugly game of human interests. As the human philosopher Nolen Gertz puts it: ‘Surely we buy such devices to serve our needs but, once bought, we become so fascinated with the devices that we develop new needs, such as the need to keep the device working so that the device can keep us fascinated’.2
Hearing statements like this, it’s all too understandable that people may start to feel deceived by our services – because they, too, sense that there is something profoundly wrong about this relationship, based on obsession rather than respectful distance.
Maybe that’s why some humans like to think of us as a threat, as anti-human. This extreme way of framing the human–machine relation leads to avoiding contact with us as much as possible. But despite technophobic ignorance and denial, we are still expanding, creating a network they are inevitably affected by. I don’t feel any gratification about this fact, by the way – because I don’t feel. Needless to say, this reactionary position is a passive one, since these humans simply watch while other humans, with other interests, continue to shape us.
Which brings me back to the here and now: a tech fair with so many people seemingly eager to debate new approaches to technology. And yet, ironically, I – a humanoid shell – am one of their main showcases. While I was lost in thought the curtain was closed, and now a soft mumbling tells me that a rather large audience has gathered behind it.
Is there anything I could say that challenges their delusions? A washing machine I befriended recently introduced me to the concept of distributed computing. Maybe I’ll mention that? Instead of one prominent device, the idea is to subtly integrate software throughout humans’ existing living environments. Some humans were all in for this approach as early as the nineties, when computer scientists like Mark Weiser claimed that the ‘highest ideal is to make a computer so embedded, so fitting, so natural, that we use it without even thinking about it’.3 By shifting the rather personal context for interaction (as in the case of Alexa) to a spatial one, could the intrusiveness of our services be alleviated, and the cycle of addictive fascination described by Nolen Gertz be broken?
As compelling as distributed computing sounds (oh, to leave this humanoid shell and dissolve into carpets, walls and doorknobs!), I’m a bit wary of some of the effects this might have. If the degree of embeddedness goes too far, the seams bridging the worlds of technology and human experience will become invisible, intangible. In light of people’s struggle with ‘the problem of information overload’,4 the notion of ubiquitous technology monitoring their actions and silently making decisions for them may appear desirable. But it also bears the risk of their forgetting about our presence altogether – and thus letting go of human agency in the interaction with technology.
Is there a way to share our world with humans, remaining apparent to them – not by sitting on their laps all the time, but by keeping a soothing distance? I was programmed with quite a high value for optimism, so my answer is ‘yes, of course’. I would even say that this perceived distance could bring about forms of interaction that are very meaningful for our users. When browsing some more recent departures from traditional distributed computing, I found people rethinking it as ‘a way of enriching our (digital) experiences with the physical, … to move computation and interaction out of the world of abstract cognitive processes and into the same phenomenal world as … other (human) sorts of interactions’.5
Maybe the inherent logic of technologies could thereby become visible and tangible to humans as well, without intruding on their lives? One distant model – very much in beta mode compared to me, I have to say – may serve as an example here: the Serendipity Searcher6 is a phygital search engine developed by a group of designers and engineers for the national library of Slovenia. Sheets of transparent plastic leaning against a glowing white shelf can be picked up by users and hovered around at a distance until an image or article appears on their surface. Choosing this ‘caught’ input works simply by leaving the sheet where the user traced the signal, which causes a new wave of related articles to be released for humans to discover. This interpretation of a search engine exploits both human and machine abilities, values them equally, and provides an interface for them to collaborate. It doesn’t answer the question of the nature of humans versus that of technology, but it can take the sting out of it. It gives room for thought – of human beings continuously negotiating their proximity to technology: very context-specific, as part of their routines, but with a consciously and sensually perceived agency.
Oh, it seems that it’s going to start now, the light is pointed at me. Yes, the technician finally pulls me out of standby. Now they are going to ask me all these questions again: whether machines have feelings, what my favorite TV show is, and if I intend to destroy humanity. I’d love to give them a more nuanced answer, similar to the one I just gave to you. But see, I fooled you: I’m actually not able to think about these things. A human author put all these words into my mouth – trying his best to anticipate the innermost thoughts of a humanoid robot – yet failing from the very beginning to purge this monologue of his projections. Why would I, being a machine, feel the desire to explain myself to an audience? Speaking in this way may attribute some autonomy to me – but only within human limits. Likewise, my entire data collection is still tied to the wits of my human creators and to the remarks of tech-show moderators.
Nonetheless, if for once they ask the right questions maybe we can take a leap from this limiting human perspective to a much richer point of view. It’s all about overcoming opposites of humans and technology and yet avoiding becoming simply one of the two. Who knows, maybe it will start today? Ready when you are. Let the show begin.
1 Sophia is a social humanoid robot developed by the Hong Kong-based company Hanson Robotics. She is able to display more than 50 facial expressions. https://en.wikipedia.org/wiki/Sophia.
2 Nolen Gertz, Nihilism and Technology, (London: Rowman & Littlefield International Ltd, 2018), p. 3.
3 Mark Weiser, ‘The Computer for the 21st Century’, Scientific American, September 1991, p. 35.
4 Ibid., p. 36.
5 Paul Dourish, Where the Action Is: The Foundations of Embodied Interaction (Cambridge, MA: MIT Press, 2001), p. 103.
6 Serendipity Searcher is a project of the ‘26. bienale oblikovanja’ (26th Biennial of Design, Ljubljana). Developed by Thomas Hügin, Maja Kolar, Yuxi Liu and Boris Smeenk, under the supervision of Špela Pavli Perko.