LUST after all things tech.


People spend more time with technology than with friends and family in this technogenic age, yet they still long for a friend: a friend who always Listens and is always willing to Learn about their nature and preferences; an Unbiased friend who promises to keep what they share Secure; a Trustworthy friend who is who they say they are and who is ever present through the seasons of their life. This makes users reliant, even dependent, on technology, rendering everything we do, everything we say, and everywhere we go transparent and available. Users are quick to embrace the benefits of technology but slow to adjust their behavior to match their privacy concerns. This blog post reminds the user of how social interactions have been revolutionized, sheds light on the importance of informed trust decisions, and urges users to take responsibility for their privacy preferences towards technological devices.

Keywords: Social interaction, Interpersonal trust, Privacy preferences, Technology, Virtual reality

Technology promises to:

Listen & learn… “I hear you and know your preference” …

Humans spend most of their waking time in social interaction. Advances in technology have revolutionized social interaction as we know it, which now includes human-computer interaction and relationships with technological devices: socializing with friends as avatars in virtual reality, or interacting with intelligent home assistants such as “Alexa,” “OK Google,” or “Olly,” which has an evolving personality and strives to become more like its user. We can also converse with a mobile phone’s “Siri” or a self-driving car’s computer. They stand ready to do all the thinking and searching, suggesting where to dine or have wine based on the user’s preferences.

Unbiased & Secure … “I am safe, move towards me” …

There is a shift in the relationship between user and product that leads users, consciously or unconsciously, to become reliant on, and even dependent on, technology. Their personalized devices know them so well that they tell them when their stress levels are too high, when they have not moved enough in a day to reach their caloric goal, when to stand up (as sitting is the new smoking), and when, where, and how to get where they need to go. These devices have become our best friends, so much so that we feel comfortable sharing the most intimate details of our human condition with them, trusting them to hold that content safe and secure and not to share it with anyone else. We also rely on them to know our preferences and to respond without bias. Just like a best friend would in an ideal world.

Trustworthy: “you can trust me; share with me” …

Befriending an application like Woebot (among other psychological applications) lets users share their immediate mood and their unfiltered, unrefined thoughts, while the application remains unconditional and steadfast in its positive regard towards the irrational. Such applications keep records of the content users choose to share, including tragic events and users’ emotional reasoning processes; they learn about the user’s personality and emotional triggers, collect the user’s thinking processes and patterns, and become acquainted with the user’s intrapersonal dynamics and vulnerabilities in meaningful relationships. Based on what this always-available companion learns, and preprogrammed with psychological modalities such as cognitive behavioral therapy (CBT) or solution-focused tools, users have an unbiased personal mental health companion in their back pocket. Furthermore, we can now give applications and smartwatches permission to measure our heart rate, blood oxygen levels, sleep patterns, stress levels, and caloric needs, and to draw correlations between heart rate, emotions, and the events we share with our devices across applications.

This AI needs to be taught how to be human to enable a more natural and intuitive relationship, and that relationship has changed: from being the user of technological products to becoming the product whose data is used to train the AI for service. From a technical perspective, everything we do, everything we say, and everywhere we go can be known by others. The efficiency, usability, and always-connected nature of new AI and Internet technologies (the “Internet of Things”) mean that domains previously unreachable are now accessible and transparent to others. State-of-the-art virtual reality headsets record the layout of a user’s room and collect location data, movement patterns (if a body suit is worn), emotional facial expressions, and eye movements, which can be used to make medical diagnoses, for example of attention deficit hyperactivity disorder (ADHD). Users of VR can meet friends in virtual environments without being certain that the representation of their friend really is who they say they are and not a case of virtual identity theft. This raises the concern that VR users may unknowingly trust an imposter with sensitive information.

Why is privacy necessary if there is trust?

Trust is the building block of any relationship. Users of technology are urged not to blindly fall in love with the benefits of technology, for example home assistant devices, smartphones, smartwatches, applications, or virtual reality, but instead to take responsibility for their own understanding of how the technology works, what personal information is necessary for the product’s functioning, and what will be shared with whom. Users need to try to understand the terms and conditions governing their privacy before using the technology. To enhance users’ autonomy and control over their private information, users’ vulnerability needs to be protected both by technology developers and by users themselves through education. People need control over their transactions and over the information they share with others and with technology in order to experience the well-being associated with intimacy and emotional release.

For the curious:

Joinson, A., Reips, U.-D., Buchanan, T., & Schofield, C. B. P. (2010). Privacy, Trust, and Self-Disclosure Online. Human-Computer Interaction, 25(1), 1–24.

Liebers, J., Abdelaziz, M., Mecke, L., Saad, A., Auda, J., Gruenefeld, U., Alt, F., & Schneegass, S. (2021). Understanding User Identification in Virtual Reality Through Behavioral Biometrics and the Effect of Body Normalization. Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, 1–11.

Stuart, A., Bandara, A. K., & Levine, M. (2019). The psychology of privacy in the digital age. Social and Personality Psychology Compass, 13(11).

This blog post was written by Johrine Cronjè. She is a doctoral researcher and Marie Skłodowska-Curie Fellow at the University of Würzburg (Germany) in the Psychology department. She studies the environmental and psychological factors that influence the acceptance and usage of privacy protection solutions within the PriMa ITN Horizon 2020 project. Her Ph.D. focuses on trust evaluation between users of social virtual reality.

Johrine Cronjè