Document Type
Article
Publication Date
2019
Abstract
Technologies that mediate social interaction can put our privacy and our safety at risk. Harassment, intimate partner violence and surveillance, data insecurity, and revenge porn are just a few of the harms that bedevil technosocial spaces and their users, particularly users from marginalized communities. This Article seeks to identify the building blocks of safe social spaces, or environments in which individuals can be free of privacy and safety dangers. Relying on analogies to offline social spaces—Alcoholics Anonymous meetings, teams of coworkers, and attorney-client relationships—this Article argues that if a social space is defined as an environment characterized by disclosure, then a safe social space is one in which disclosure norms are counterbalanced by equally powerful norms of trust that are both designed in endogenously and backed exogenously by law. Case studies of online social networks and social robots are used to show how both the design of and law governing technosocial spaces today not only fail to support trust, but actively undermine user safety by eroding trust and limiting the law’s regulatory power. The Article concludes with both design and law reform proposals to better build and protect trust and safe social spaces.
Recommended Citation
Waldman, Ari Ezra, "Safe Social Spaces" (2019). Articles & Chapters. 1317.
https://digitalcommons.nyls.edu/fac_articles_chapters/1317
Included in
First Amendment Commons, Privacy Law Commons, Science and Technology Law Commons, Torts Commons
Comments
Washington University Law Review, Vol. 96, Issue 6 (2019), pp. 1537-1580