Can Social VR Substitute Real Life?

From a Perspective of Identity, Privacy, and Authenticity


The 'metaverse' bubble seems to have burst, with Meta's market value evaporating and its Reality Labs division losing $9.4 billion this year. However, this recently hyped concept is, in fact, nothing new: social VR has been studied as one of the most important topics in VR research for decades. Its rapid development has led to many exciting applications serving different purposes. Not only can you explore fascinating user-created virtual worlds with friends while dressed as fantasy characters (e.g., VRChat), but you can also meet and collaborate with coworkers in a virtual workplace (e.g., Horizon Workrooms). In addition, social VR has already been applied and well received in education, healthcare, and manufacturing, to name a few. Admittedly, at the current stage, social VR is nowhere near the imagination of sci-fi (e.g., Ready Player One), or even the vision Meta wants to sell, in either its graphics or its interaction techniques. Despite that, as VR technology advances, the future of social VR shows tempting and promising prospects, but it also raises a question: can social VR become a substitute for, or an extension of, real life? Leaving aside for the moment the limitations of current VR technology and the gap with public expectations, in this blog post we will discuss whether social VR can comprehensively substitute real-life social, interpersonal, and professional scenarios from the perspectives of identity, privacy, and authenticity.

The Relevance of Authenticity

Authenticity plays an essential role in building trust and credibility in communication, e-commerce, information dissemination, collaboration, and, more importantly, extending real-life societal and interpersonal relationships to the virtual world.

While current social VR applications mainly focus on entertainment and online socializing, they hardly consider the importance of establishing authenticity. Users usually enter the virtual world with a second identity or an alter ego. Admittedly, we cannot ignore the efforts developers and providers make to prevent account theft and identity impersonation. For example, AltspaceVR's terms of service prohibit creating "a false identity or impersonating another person or entity in any way," and Rec Room likewise prohibits "impersonating or misrepresenting your affiliation with any person or entity" [1]. However, such protections are implemented more from the perspective of "protecting virtual property" and are insufficient for "establishing authenticity."

On the other hand, most social VR applications, such as VRChat and AltspaceVR, can hardly establish authenticity, since users enter the virtual world represented by cartoon or fantasy characters with no association to their real-life identity. In comparison, social media platforms that encourage users to create a profile representing their real-life identity (e.g., Facebook) foster user relationships that are more like an extension of real-life social and interpersonal relationships. Social VR users can, of course, also form real-life connections through the virtual community, but there is often a lack of measures to protect the authenticity of users.

One possible way to establish authenticity is by using personalized realistic avatars. Compared to other characteristics, human appearance is intuitive, unique, and highly recognizable, with substantial effects on emotion contagion [2]. In some social VR applications, the choice of avatars has shifted toward photo-realistic avatars that accurately capture the likeness of their users. Thanks to advances in 3D scanning and processing pipelines, we can now generate our digital bodies efficiently [3]. Such a graphical representation works as an identifier that other users recognize and acknowledge in the virtual community, and it helps them build a connection with your unique personality even if they do not know you in real life, thus establishing authenticity.

The Necessity of Verification and Authentication

However, this raises another question: how do you know the avatars you encounter in the virtual world are who you think they are, or who they claim to be? Verification and authentication of identity are often disregarded in social VR. For example, one social VR application for workplace collaboration allows users to create a realistic avatar from nothing but a profile photo, with no verification of whom the photo belongs to, nor any authentication of who is using the account and controlling the avatar.

Therefore, verification and authentication are necessary to ensure that social VR users are who they claim to be; together they contribute to the protection of authenticity. Traditional schemes such as PINs can be vulnerable and are impractical for continuous authentication, while specialized techniques utilizing biometrics acquired by VR sensors, such as body movements and tracking data [4], could be the answer.
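To make this concrete, here is a deliberately simplified, hypothetical sketch of how tracking data could support continuous authentication. It is not the machine-learning approach of [4]; it merely illustrates the idea: summarize a session of head-tracking frames into a few movement statistics, enroll a per-user template, and accept a live session only if its statistics stay close to that template. All function names, the feature choice, and the threshold are our own illustrative assumptions.

```python
import math
import random

def motion_features(frames):
    """Summarize a sequence of head positions (x, y, z) into simple
    statistics: mean and standard deviation of per-frame speed."""
    speeds = [math.dist(a, b) for a, b in zip(frames, frames[1:])]
    mean = sum(speeds) / len(speeds)
    var = sum((s - mean) ** 2 for s in speeds) / len(speeds)
    return (mean, math.sqrt(var))

def enroll(sessions):
    """Average the feature vectors of several enrollment sessions
    into a single template for the claimed user."""
    feats = [motion_features(s) for s in sessions]
    return tuple(sum(f[i] for f in feats) / len(feats) for i in range(2))

def authenticate(template, frames, threshold=0.02):
    """Accept only if the live session's features are close to the template."""
    return math.dist(template, motion_features(frames)) < threshold

# Synthetic data: two "users" whose head movements differ in scale.
random.seed(0)
def session(step, n_frames=200):
    """Random-walk head positions, standing at ~1.6 m eye height."""
    pos, frames = [0.0, 1.6, 0.0], []
    for _ in range(n_frames):
        pos = [p + random.gauss(0, step) for p in pos]
        frames.append(tuple(pos))
    return frames

alice_template = enroll([session(0.01) for _ in range(3)])
assert authenticate(alice_template, session(0.01))      # genuine motion accepted
assert not authenticate(alice_template, session(0.05))  # different motion rejected
```

Real systems would of course use far richer features (full-body motion sequences, learned embeddings) and calibrated error rates; the point here is only that the sensor data social VR already streams is enough to tie an avatar to the person controlling it, continuously and without interrupting the session.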

Privacy and Ethical Issues

We have argued above for the relevance of establishing authenticity in social VR. Inevitably, doing so may also raise privacy and ethical issues. We summarize three concerns that should not be overlooked as social VR develops towards authenticity: harm to anonymity, invasion of privacy, and threats to autonomy.

Harm to Anonymity

The need for anonymity should not be ignored: users have the right to decide whether or not to disclose personal information, and to enter the virtual world anonymously. Since anonymity is often not welcomed in real life, introducing authentication mechanisms into social VR will likely expose anonymous users to discrimination and cyberbullying. Besides, since internet users do not easily distinguish between authenticity and integrity [5], anonymous users may be perceived as less trustworthy and reliable, hindering their active participation in the virtual community.

Another potential harm to anonymity is that users may become more self-censoring in their behavior. Since all user activities take place on servers, anonymity is effectively taken away from those who worry that their words and actions will be recorded or even monitored.

In response, the social VR communities should strive to increase their tolerance for anonymity and safeguard the user experience of anonymous users. In addition, there should be clear boundaries, corresponding regulations, and technical countermeasures to ensure that users can genuinely be alone in the virtual world.

Invasion of Privacy

User authentication is essential to ensuring authenticity, yet mishandling it can lead to serious privacy violations. Personal information is collected during the enrollment and verification of user identities. Using more sensitive data, such as biometrics, can increase accuracy but also leads to new privacy challenges, including unauthorized collection and disclosure and user profiling.

Two principles are essential for protecting users' privacy. First, developers should ensure on the technical side that personal information and biometric templates are protected; privacy-preserving authentication schemes should be taken into consideration. Second, developers and providers should clarify what data is collected and how it is used. Only very few current VR applications have privacy policies, let alone explicitly mention the use of data [6]. Standards and ethical guidance are urgently needed.
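As a toy illustration of the first principle (our own example, not a scheme from the literature), the sketch below stores only a salted hash of a coarsely quantized biometric template, so the raw measurements never reach the database. Real deployments would need error-tolerant constructions such as fuzzy extractors, since plain quantization fails for measurements near cell boundaries; all values and the quantization step here are made up.

```python
import hashlib
import secrets

def protect_template(features, salt, step=0.05):
    """Quantize the feature vector coarsely, then keep only a salted
    hash -- the raw biometric values never touch the database."""
    quantized = tuple(round(f / step) for f in features)
    return hashlib.sha256(salt + repr(quantized).encode()).hexdigest()

# Enrollment: derive a per-user salt and store only (salt, digest).
salt = secrets.token_bytes(16)
enrolled = protect_template((0.12, 0.34), salt)

# Verification: a fresh measurement close to the enrolled one falls in
# the same quantization cell and therefore reproduces the digest.
assert protect_template((0.11, 0.36), salt) == enrolled
# A clearly different user's features do not.
assert protect_template((0.50, 0.90), salt) != enrolled
```

Even a leak of the stored digests then reveals neither the user's movement profile nor anything reusable on another service, which is exactly the property plaintext biometric templates lack.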


Threats to Autonomy

Being autonomous means being able to deliberate and make decisions without being influenced or manipulated by external sources [7]. Implementing an authentication mechanism may pose a potential threat to autonomy: users may come to rely excessively on the authentication system to decide whose identities are authentic, and lose their own judgment. Just as earlier findings show that Twitter users tend to assign higher credibility to "verified accounts" [5], system failures and bias within algorithms could significantly influence, or even manipulate, trust and relationships among users. The way authenticity information is conveyed to users should avoid manipulating their judgment and leave enough room for independent evaluation.


So, can social VR become a substitute for, or an extension of, real-life socio-cultural interaction? We believe it has a very promising future if authenticity can be established. At the same time, developers and policymakers should always consider the privacy and ethical issues that encouraging authenticity may bring. With further breakthroughs in VR technology and the effective resolution of these issues, social VR could become an alternative realm for human socio-cultural activities in the near future and bring substantial changes to our lives.

Recommended resources:

[1]   D. Maloney, G. Freeman, and A. Robb, “Social Virtual Reality: Ethical Considerations and Future Directions for An Emerging Research Space,” in 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Mar. 2021, pp. 271–277. doi: 10.1109/VRW52623.2021.00056.

[2]   M. Volonte et al., “Effects of Virtual Human Appearance Fidelity on Emotion Contagion in Affective Inter-Personal Simulations,” IEEE Trans. Visual. Comput. Graphics, vol. 22, no. 4, pp. 1326–1335, Apr. 2016, doi: 10.1109/TVCG.2016.2518158.

[3]   J. Achenbach, T. Waltemate, M. E. Latoschik, and M. Botsch, “Fast generation of realistic virtual humans,” in Proceedings of the 23rd ACM Symposium on Virtual Reality Software and Technology, Gothenburg Sweden, Nov. 2017, pp. 1–10. doi: 10.1145/3139131.3139154.

[4]   C. Schell, A. Hotho, and M. E. Latoschik, “Comparison of Data Representations and Machine Learning Architectures for User Identification on Arbitrary Motion Sequences,” in Proceedings of the IEEE International Conference on Artificial Intelligence and Virtual Reality (IEEE AIVR), 2022.

[5]   T. Vaidya, D. Votipka, M. L. Mazurek, and M. Sherr, “Does Being Verified Make You More Credible?: Account Verification’s Effect on Tweet Credibility,” in Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, Glasgow Scotland Uk, May 2019, pp. 1–13. doi: 10.1145/3290605.3300755.

[6]   D. Adams, A. Bah, C. Barwulor, N. Musaby, K. Pitkin, and E. M. Redmiles, “Ethics emerging: the story of privacy and security perceptions in virtual reality,” in Fourteenth Symposium on Usable Privacy and Security (SOUPS 2018), 2018, pp. 427–442.

[7]   F. O’Brolcháin, T. Jacquemard, D. Monaghan, N. O’Connor, P. Novitzky, and B. Gordijn, “The Convergence of Virtual Reality and Social Networks: Threats to Privacy and Autonomy,” Sci Eng Ethics, vol. 22, no. 1, pp. 1–29, Feb. 2016, doi: 10.1007/s11948-014-9621-1.

This blog post was written by Jinghuai Lin. He obtained his Bachelor's degree (B.Sc.) in Electronic Information Science and Technology from Sun Yat-sen University in China, and his Master's degree (M.Sc.) in Computer Graphics, Vision and Imaging from University College London. Since September 2020, he has been working as ESR13 of PriMa and is pursuing his PhD at the University of Würzburg, focusing on the protection of personalized photo-realistic avatars against identity theft in social virtual reality.