Click here if you would like to read the conversation in its original language, with an introduction in French.
By Juliana Friend, Ndeye Ndiaye, and Yacine Diouf*
*Ndeye and Yacine are pseudonyms for the co-authors’ anonymity and safety
Editor’s note: In early 2022, the Center for Long-Term Cybersecurity issued a call for researchers from UC Berkeley to pursue projects exploring “alternative digital futures,” focused on how the future of digital security could or should be reimagined to be more inclusive of diverse perspectives. Among the scholars selected for this program was Dr. Juliana Friend, an anthropologist working at the intersection of health equity and tech policy, who facilitated the conversation below and authored its introduction.
Below is a transcribed conversation with Ndeye and Yacine, two Senegalese activists who practice sex work and want to share their perspectives on digital privacy with international audiences.
I first met Ndeye and Yacine in 2017. I was conducting fieldwork as part of my anthropology dissertation research on how criminalized communities in Senegal navigate digital privacy risks. A mutual friend involved in sexual health activism connected me with Ndeye, and Ndeye connected me with Yacine. Both were exceptionally generous in sharing their experiences with, and anxieties about, social media.
They encouraged a community-engaged, participatory approach to research. Since the summer of 2022, as an extension of my dissertation research, we have been collaborating on a cybersecurity tool kit for sex-working communities in Senegal and other Francophone countries. We’ve translated English-language materials for Senegalese peers, exploring the differences between national digital security contexts along the way.
In addition to academic research, Ndeye and Yacine told me they wanted to speak directly to English-speaking readers. They have ideas they want tech leaders and policy makers to hear. They want Americans like me to challenge the status quo. So when I visited them in Senegal in June 2022, we decided to record a conversation, transcribe it, anonymize it, translate it, and publish the bilingual transcript. (An academic article on our participatory research is forthcoming.)
Tech justice activists assert that sexual autonomy must be a guiding principle for technologists.¹ The Hacking//Hustling collective conducts peer-led research — research by sex workers, for sex workers — on topics ranging from deplatforming to the impact of FOSTA/SESTA legislation. This research amplifies the expertise of communities for whom digital security has incredibly high stakes.
In the following conversation, Ndeye and Yacine advance this project. They foreground the urgency of addressing digital privacy risks; for instance, they would far prefer online labor to working corps à corps (“body to body”) if it weren’t for the risk of image-based abuse (IBA), the obtaining and/or sharing of intimate images or videos of a person without their consent. As Ndeye and Yacine describe below, working online would eliminate the risk of contracting COVID-19 or STDs. But if clients were to record and distribute intimate images of them without their consent, and their families viewed these images, it would be devastating. It would threaten their safety, social position, and livelihood. For Ndeye and Yacine, the social risks of IBA outweigh the physical risks of transmissible diseases.
If clients were to record and distribute intimate images of them without their consent, and their families viewed these images, it would be devastating.
Because the stakes of digital privacy are so high for Senegalese sex workers, Ndeye and Yacine offer crucial insights into the importance of online safety and security more broadly. Indeed, IBA affects internet users worldwide. One recent study found that roughly 1 in 12 US adults have experienced IBA.² Another study, conducted in the UK, New Zealand, and Australia, found that 1 in 6 respondents had perpetrated some form of IBA against others.³
The conversation below invites readers to reflect on a range of important questions. What if technology companies heeded the insights of sex workers in low- and middle-income countries? Would platforms be designed more equitably if the expertise of criminalized and marginalized communities was valued and amplified?
This is the first part of a series. Ndeye and Yacine also engaged in a dialogue with US-based activists. I served as the live interpreter and translator in that transnational conversation. A bilingual transcript is forthcoming.
Confidentiality is crucial in these conversations. We publish a written transcript rather than audio, since the sound of their voices could reveal their identities.
This caution reflects one of the conversation’s key themes: sutura. Sutura is the Senegalese ethic of discretion. One’s honor, social status, and moral belonging are all predicated on the ability to shield aspects of life deemed “intimate” from public view.⁴ While sex workers are particularly vulnerable to non-consensual exposure, they might themselves be accused of lacking sutura. This paradox is entwined with Senegal’s colonial history; French colonizers sanctioned sex workers for purportedly exposing intimate life in public space.⁵
But sutura has another valence: community protection. For instance, in the name of sutura, communities may provide social and material support to sexual assault survivors, shielding them from further contact with their perpetrators.⁶
In the following conversation, Ndeye and Yacine define sutura for international audiences. What would platforms look like if they were founded on the ethic of sutura’s community protection? What if digital privacy were conceived not in terms of individual awareness and responsibility, but as a matter of mutual aid and obligation?⁷
What if digital privacy were conceived not in terms of individual awareness and responsibility, but as a matter of mutual aid and obligation?
It will take collaborative thinking to fully explore these questions. It will take collective action to convince tech companies to integrate these ethics into platform design. Toward these ends, Ndeye and Yacine pose crucial questions about how best to fight image-based abuse. This fight is essential, they argue, to the health and flourishing of sex workers and other marginalized communities.
Transcript of the Conversation
The conversation excerpted below took place in Yacine’s apartment in Dakar, Senegal, on June 20, 2022.⁸ We spoke in a mix of French and Wolof; the latter is Senegal’s most widely spoken national language. I turned the recorder on and off as Yacine intermittently answered phone calls from potential clients. The demands of labor punctuated our conversation.
Juliana Friend: Yacine, Ndeye, thank you for coming and participating in this conversation to help people outside Senegal better understand your perspectives on how to make the internet more secure. For my first question, when I spoke of “good security hygiene” [in French], Yacine, you translated that as “suturalante.” For Americans and others who don’t know what sutura is, could you explain what it is? Could you and Ndeye share your perspectives?⁹
Yacine: OK, I’m Yacine. I’ll answer Juliana’s question. Suturalante¹⁰ means, well, it starts with the family, even before we start talking about sex workers. People should provide discretion to their peers, that is, their family, their friends, and sex workers. The term sutura encompasses everything. Whatever you’re going through. If something doesn’t concern other people, people should sutural¹¹ me.
That is, what I’m doing, if it only concerns me, and Ndeye knows about it, she should sutural me, because I’m concealing it. Maybe I don’t want anyone to know. There are people who find out about a problem and expose it. We, among ourselves, have to suturalante, especially sex workers. We have to suturalante. You understand? Suturalante is a good thing. That’s my brief commentary.
Ndeye: Like Yacine was saying, what is sutura? Sutura, in French, is kind of like “confidentiality.” Even clients who see me, as a sex worker, if he’s a minister, or president of the nation, no one should know [who he is]. I should sutural him. There is confidentiality in my relationship with my client. There should be confidentiality with my clients and with my sex worker peers. Even with the world in general, the general population. We should sutural each other.
Juliana: And does this affect internet life in any way?
Yacine: In the context of the internet, as I was saying, during the first wave, second wave [of the COVID pandemic], I saw people who kept working. The internet, if there was a little more security, just a little security, I believe that soon nobody would be working body-to-body.¹² If you look at it from my perspective, if you do body-to-body, you can catch all kinds of diseases. Even COVID you can catch. Versus if you have internet and you do your trade online. There are those that don’t even apply for a card!¹³ They do their internet, wait, get some number of euros, or dollars, and it’s done! There you don’t risk anything. You don’t have disease.
The internet, if there was a little more security, just a little security, I believe that soon nobody would be working body-to-body.
But do you know what the danger is? You will see people who do video calls, and their face doesn’t show. But someone takes your body and exploits it by going to sell it on [a website advertising sexual services]¹⁴ or a porn site. What’s the sense in that? While you have only earned 50 thousand or 30 thousand,¹⁵ he can earn 100 thousand or 150 thousand. Or millions! So there is no security [online]. Security is lacking. So for sex workers, they will go back to body-to-body.
Ndeye: Like she was saying, sutura has to start with the internet. When the first COVID wave hit, there were some people who were working. But those who couldn’t work couldn’t eat. Their families couldn’t eat. So they risked their lives. They worked body-to-body. But the majority did internet video calls. But the problem was what I was explaining: the capture of videos, the capture of images. That is our problem. There’s no sutura there.
Juliana: What Yacine was saying before is also interesting: that “sutura is us.”
Ndeye: It’s us. Because me, you will tell me something bad. I give you sutura so that no one will know it. You, I will tell you something bad. You give me sutura. You won’t talk about it to anybody.
Juliana: And you’ve taught me two words: suturalante and jappalante¹⁶.
Ndeye: Yes jappalante is, like I often say, MSM [men who have sex with men],¹⁷ they help each other. They have solidarity among themselves. We should help each other. Me, I have a problem, and you as a sex worker, you know that I have a problem. Even if there isn’t money for me to fix my problem financially, you will support me morally. You know what I mean. Support me morally. Support me, hold me close, guide me until I’m not worried, not stressed.
Juliana: How does that work on the internet? Could you expand on that?
Ndeye: Jappalante, the general population should help each other, give sutura to each other, not expose their peer on the internet. They won’t know. Even on the internet, you know, if you see someone who has published something bad, you take it down, remove the recording.
Yacine: Before it gets a single view.
Ndeye: Before it gets a single view, you take it down. In that case, the person who uploaded it will not get any satisfaction. Because me as a sex worker, we the sex workers who are involved, we who see this early before it has a single view, we take it down.
Yacine: For example, I am a sex worker. Someone uploaded something damaging to my reputation. The internet shouldn’t accept it unless it asks me security questions. Do you understand? That’s very important. Someone uploads something without consent. But here is the problem: how do you know what is without consent? That’s a question. How do you identify lack of consent? Because the owner of Facebook can’t know if this person is consenting and that person is not. It’s very difficult.
Now there’s another thing that is often on my mind. Because Facebook is doing business. It can’t just concentrate on sex workers. That’s impossible. Maybe something else is possible. An internet that resembled Signal. There is security, you see. It could be like Facebook, but where sex workers owned it. That could be possible, or a WhatsApp that sex workers owned. That could be possible. So no one has access, no one has access to image capture. No one could upload a photo unless they owned it. That is what’s possible.
But as for Facebook, they wouldn’t accept that. You enter the site and see something worrying, then simply click and it takes it down for you? That’s impossible because Facebook people are in business.
Maybe though, sites like [website name omitted], like designated romantic meeting sites,¹⁸ you know [website name omitted], you can’t sell things like soft drinks. That wouldn’t fly. No one would buy it. But if you do a sexy photo, they will know that it’s a special site. There is a site especially made for sex workers. But if it was more intense it would be even better. It would be a meeting site where sex workers can make good money. They would see themselves in that site. That’s important. But Facebook, I don’t think so.
Juliana: That’s very interesting. Can I throw something out there? Because in the US there was a network that was similar to that. But now it’s against the law. They said it encourages nefarious things, but many activists have observed that since they banned that website, many [sex workers] were forced to return to street work or take on more risks.¹⁹
Ndeye: Now, what I was saying, I was saying that whoever is looking for a sex worker-only website — but regarding that, say I am an innocent, part of the general population. I’m not a sex worker. But I’m curious. I will go on the site to see if I know anyone on it. So that I can expose them to the general population. That’s why for this, if we want sutura or security on Facebook, it has to be generalized. It won’t be just for sex workers. It would be generalized. Because even if you just do it for the sex worker population, members of the general population will go on it. Mm hm!
Juliana: That’s very interesting.
Ndeye: That’s what I wanted to say.
Juliana: So you’re saying that sutura should be generalized.
Ndeye: The general population. You, I should give you sutura. You should give me sutura. You should give Yacine sutura. She should give you sutura. At the same time, she should give me sutura. The general population, even if they are not sex workers, should give sutura to each other.
Juliana: With that site, it was possible, if there was a violent client, you could signal that on the website.²⁰ That way other sex workers wouldn’t work with them. In my personal opinion, that site embodied suturalante.
Ndeye: That website was good.
Yacine: It was good.
Ndeye: That really was good. If you’re on it, and you know that I’m a criminal, and I’m a client, I’m violent, you go onto the site and tell Yacine, tell Juliana, tell everyone, “if the client comes, no one should take them.” That person who calls won’t be approved. You won’t speak with them. Because you’ll know that this person is violent. They’re a criminal. That website is good.
Yacine: Yes, if Senegal had that it would be great. If you post something too, and the bandit knows that you posted it, they could look for you and catch you. You’d want no one to know who posted it.
Juliana: Yes, it would be anonymous.
Yacine: Yes, anonymous is possible.
Ndeye: So you asked, if sex workers owned the internet now. That’s what you said?
Juliana: Right, if it was them who ran the internet, how should it be run? What would it be like?
Ndeye: That’s what I was talking about. If they were running the internet, they wouldn’t run it for sex workers only, but for the general population. Because suturalante, and image capture, and audio capture and the like, it concerns the whole population. Maybe it surfaces the most for sex workers, but the general population faces this. In that case, as sex workers, we would run internet sites for the whole population.
We would make it so that no one can perform image captures. Maybe you, your photo, you can capture it. Your own video, you can capture your own video. But as for my own video, I would talk with you first. If you want to capture my video, you can’t capture my video.
Ndeye: Jappalante, suturalante, regarding that –
Yacine: It’s everywhere. It would be generalized. That’s what I was saying. Within the family, among friends, in everything really. What is the actual foundation of this conversation? It’s sex workers. But suturalante is everywhere.
Juliana: In another conversation, someone suggested, why not conduct advocacy with the owners of Facebook? What are your thoughts about this?
Ndeye: Regarding that kind of advocacy with the owners of Facebook and WhatsApp, it’s possible. That kind of advocacy is viable. We should do it. To tell them to help sex workers. So that those who do captures are not online, so that they do not capture others’ videos or audio. Don’t circulate them or do blackmail. Do you know what I’m talking about? The leader of Facebook, if I could talk with him and do that kind of advocacy with him, it would be very good. Sutura, whoever exposes you, when you see it, you make it so that it doesn’t have a single view by deleting it. You understand?
Juliana: Yes, I understand.
After about two hours of discussion, Ndeye had to return home to her family. Yacine and I stuck around a while, chatting. Discussion turned to the digital traces we leave online. The conversation became very rich, so I asked Yacine if we could turn the recorder back on, and potentially add this section to the published document. Below is an excerpt from this “bonus conversation.” It begins with a discussion about digital privacy advocates.
Juliana: Some worry about the fact that companies are always learning about us. They worry about that. They want to protect our private lives, not just our photos but –
Yacine: Everything! Documents. Private life, that says it all.
Juliana: What you research in Google.
Yacine: They protect it.
Juliana: What you search.
Yacine: What you search, they protect everything.
Juliana: But some Americans and Europeans, some researchers have observed that they focus on the individual. What you do to protect yourself. Maybe you change some settings on your phone to increase your own security.²¹
Yacine: But that should apply to the world in general. There are people who haven’t been to school. We should protect everybody. There are some people who can enter into settings. There are some that don’t know how, you know? But we can protect them on that. For example, I’ve studied a little. If you explain it to me I will do it. But my mother didn’t study. My younger sibling didn’t study. How are they going to protect themselves? It should be something automatic. That’s my perspective.
Juliana: What you’re saying is extremely important.
Yacine: There are those who speak French, there are those who speak English but haven’t studied. But if it was the world in general, that’s better. You understand? Versus, you enter settings, activate this, activate that, deactivate this. It’s too complicated.
Juliana: Does your mother need to protect herself, or is it Facebook that should be protecting everybody?
Yacine: Everyone. It should protect everyone. But it’s not just Facebook. It’s everybody. Facebook should protect people. The internet should protect people.
Juliana: Those who haven’t studied have the same right to private life.
Yacine: Thank you. Exactly. You know, our mothers, there are some who haven’t studied. They use WhatsApp calls without knowing if it’s protected or not protected. But if you have generalized protection, protection is built in, everyone will be in peace. That will facilitate total sutura. Globally you will have a good sutura.
Now me, people have told me, iPhones, there is something where if you don’t take it out, iPhone will follow everything you do. But that’s not normal! You can’t go and buy a telephone for 400,000 or 500,000 or 500 Euros, all the while you don’t have a private life! iPhone shouldn’t follow what you’re doing. It’s not normal!
Juliana: Yes, Apple learns about you.
Yacine: But it’s not normal!
Juliana: The hard thing is that they get money from that.
Yacine: But the problem there is, they shouldn’t do it without my consent.
¹ See Stardust, Zahra, Garcia, Gabriella, and Egquatu, Chibundo. 2020. What can tech learn from sex workers? Sexual Ethics, Tech Design & Decoding Stigma. Available from https://medium.com/berkman-klein-center/what-can-tech-learn-from-sex-workers-8e0100f0b4b9 (accessed Apr 27, 2022).
² Ruvalcaba, Y., and Eaton, A. A. 2020. Nonconsensual pornography among U.S. adults: A sexual scripts framework on victimization, perpetration, and health correlates for women and men. Psychology of Violence, 10(1), 68–78. https://doi.org/10.1037/vio0000233
³ Powell, A., Scott, A. J., Flynn, A., and McCook, S. 2022. Perpetration of image-based sexual abuse: Extent, nature and correlates in a multi-country sample. Journal of Interpersonal Violence. Epub ahead of print. https://doi.org/10.1177/08862605211072266
⁴ Mills, Ivy. 2011. Sutura: Gendered honor, social death, and the politics of exposure in Senegalese literature and popular culture. University of California, Berkeley.
⁵ Fouquet, Thomas. 2011. Filles de la nuit, aventurières de la cité: Arts de la citadinité et désirs de l’Ailleurs à Dakar. École des Hautes Études en Sciences Sociales.
⁶ Packer, Beth D., and Friend, Juliana. 2021. Why few women in Senegal speak out about their rapists. The Conversation. Available from http://theconversation.com/why-few-women-in-senegal-speak-out-about-their-rapists-160269 (accessed Apr 20, 2022).
⁷ For other powerful ways to think about privacy beyond individual responsibility or awareness, see Baik, Jeeyun (Sophia). 2021. Privacy for all: Enshrining data privacy as a civil right in the information age. University of Southern California. https://digitallibrary.usc.edu/asset-management/2A3BF1SQPCV84; Arora, P. 2019. Decolonizing privacy studies. Television & New Media, 20(4), 366–378. https://doi.org/10.1177/1527476418806092; and Mantelero, A. 2017. From group privacy to collective privacy: Towards a new dimension of privacy and data protection in the big data era. In Taylor, L., Floridi, L., and van der Sloot, B. (eds), Group Privacy. Philosophical Studies Series, vol. 126. Springer, Cham. https://doi.org/10.1007/978-3-319-46608-8_8
⁸ Translation note: These excerpts have been lightly edited for length and clarity in dialogue with Ndeye and Yacine. The transcript of the conversation has been purposely left fairly raw, to stay true to Ndeye and Yacine’s modes of expression while still making the transcript interpretable to an international audience.
⁹ Translator’s note: I decided to keep the term sutura in its Wolof form for most of the translation, except in cases where grammatical awkwardness hindered readability.
¹⁰ The suffix “-ante” indicates reciprocity.
¹¹ Sutural is the imperative, or command, form of sutura. For more context, see Ndeye’s comment below; this piece’s introduction provides historical background on sutura.
¹² In-person sex work.
¹³ In-person sex work is legal in Senegal under particular conditions. One such condition is that one must obtain a carnet sanitaire, a “health notebook,” and undergo regular sexual health screenings. The “card” Yacine references here is this medical notebook. For one discussion of the benefits and harms of this policy, see Ito, S., Lépine, A., and Treibich, C. 2018. The effect of sex work regulation on health and well-being of sex workers: Evidence from Senegal. Health Economics, 27, 1627–1652. https://doi.org/10.1002/hec.3791
¹⁴ We have omitted the name of this and other websites on which people advertise sexual or erotic services, so that the websites do not attract attention from law enforcement, which could jeopardize sex workers’ livelihoods.
¹⁵ Around 100 and 60 US dollars, respectively.
¹⁶ Jappalante is the reciprocal form of jappale, which can be glossed as “to help” or “to support.” Jappalante might best be translated as mutual help, mutual support, or mutual aid. Like sutura, I leave these Wolof words untranslated in the English text because they are important and powerful concepts for Yacine and Ndeye.
¹⁷ Abbreviation for “men who have sex with men,” a term often used in public health contexts.
¹⁸ Translation note: Indicates sites for erotic and sexual services.
¹⁹ For discussions of the deplatforming of sex workers, see Hacking//Hustling, https://hackinghustling.org/erased-the-impact-of-fosta-sesta-2020/, and Bronstein, Carolyn. 2021. Deplatforming sexual speech in the age of FOSTA/SESTA. Porn Studies, 8(4), 367–380.
²⁰ In this article, sex workers discuss how they used “bad client lists” to warn others about improper or dangerous conduct by clients. Stryker, Kitty. 2018. 6 Sex Workers Explain How Sharing Client Lists Saves Lives. Vice News. https://www.vice.com/en/article/ne975m/sesta-fosta-sex-workers-sharing-client-lists-saves-lives
²¹ See Arora, P. 2019. Decolonizing privacy studies. Television & New Media, 20(4), 366–378. https://doi.org/10.1177/1527476418806092; and Mantelero, A. 2017. From group privacy to collective privacy: Towards a new dimension of privacy and data protection in the big data era. In Taylor, L., Floridi, L., and van der Sloot, B. (eds), Group Privacy. Philosophical Studies Series, vol. 126. Springer, Cham. https://doi.org/10.1007/978-3-319-46608-8_8