‘Unmochon’: A Tool to Combat Online Sexual Harassment over Facebook Messenger
Sultana Sharifa, Deb Mitrasree, Bhattacharjee Ananya, Hasan Shaid, Alam S.M. Raihanul, Chakraborty Trishna, Roy Prianka, Ahmed Samira Fairuz, Moitra Aparna, Amin M Ashraful, Islam A.K.M. Najmul, Ahmed Syed Ishtiaque

Author's accepted manuscript (AAM) of a publication by the Association for Computing Machinery; this parallel published version can differ from the original published article. Please cite the publication as follows: Sultana Sharifa, Deb Mitrasree, Bhattacharjee Ananya, Hasan Shaid, Alam S.M. Raihanul, Chakraborty Trishna, Roy Prianka, Ahmed Samira Fairuz, Moitra Aparna, Amin M Ashraful, Islam A.K.M. Najmul, Ahmed Syed Ishtiaque. (2021). ‘Unmochon’: A Tool to Combat Online Sexual Harassment over Facebook Messenger. CHI '21: Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, May 2021, Article No. 707, pp. 1–18. DOI: 10.1145/3411764.3445154. © 2021 ACM, Inc.

Women in the global south often seek justice for online harassment by unveiling the harassers and sharing screenshots of the harassing texts and visual content they sent on social media. However, women often experience further backlash with the argument that these screenshots may have been manipulated. We studied these issues through a survey with 91 female Facebook users and interviews with 43 females and other stakeholders from Bangladesh, and designed ‘Unmochon’, a tool that helps women collect such harassing texts and visual content from Facebook Messenger, share them with their intended group of users, and prove their authenticity. Our user study with 48 participants revealed various challenging aspects of seeking justice on social media using such technologies and the assumptions they are built on. Based on these findings, we further discuss how technologies should be designed to address women's harassment on social media in such a complex social setting.

Additional Key Words and Phrases: Online Gender Harassment, Social Media, Facebook, Messenger, Solidarity, Feminism, Bangladesh, Unmochon, User-evaluation

1 INTRODUCTION
Women's harassment is one of the age-old problems in the history of humankind. Many international reports explain how prevalent this problem is, from the enlightened western world to the low-resourced global south. For example, United Nations Women reported that up to 70 percent of women across the world face physical and/or sexual violence in their lifetime [7], and the National Intimate Partner and Sexual Violence Survey (NIPSV) reported in 2018 that about 1 out of every 5 women (21.3%) in the United States experienced at least one attempted or completed rape [82]. The picture in the global south is equally disturbing, if not worse, as 66% of women reported experiencing sexual harassment between two and five times in Delhi, India [7]. Today, with the widespread advancement of information technology, this problem has taken over cyberspaces.
Thus, online gender harassment is a widely discussed agenda across the world. Prior studies have shown that female social media accounts experience a significantly larger number of harassment incidents than male accounts [9, 21, 29, 62] and have thus concluded that women are more vulnerable to online gender harassment. The situation is more severe in patriarchal societies like Bangladesh, as reports show that more than 70 percent of the complaints filed at the government's Information and Communication Technology Division's Cyber Help Desk in 2017 were about women's online harassment [50]. Such harassment incidents often stress out the victims and their families and lead to further effects like exclusion, humiliation, and public resentment, especially when they live in conservative and patriarchal societies [55]. However, harassers often remain beyond the scope of law enforcement, as Mishuk Chakma, Additional Deputy Commissioner of the Cyber Security and Crime Division of Dhaka Metropolitan Police (DMP), commented, “However, most (female) victims of online harassment do not go to the police station" [26]. The very small portion of female victims of online gender harassment who break the silence and seek help often prefer exposing the harassers on social media, with screenshots of their chats as supporting evidence, and seek justice that way [47, 52]. But in most cases, the reports and press releases by concerned authorities show that the victims' attempts to seek help and justice go in vain, with the argument that the advancement of technology now allows the manipulation of screenshots. Thus, offenders often get away with the benefit of the doubt [47, 52]. However, no formal research has yet reported or clarified what level of extremity of harassment pushes women to take such steps and what still blocks them from seeking help.
We address this gap in the literature through a two-phase study. In the first phase, we investigated the above-mentioned queries through a survey with 91 female Facebook users and interviews with 43 Facebook users. Over the course of this phase, we sought answers to the following research questions:
RQ1: What kind of gender harassment do Bangladeshi women often face online, and which kind of challenges often block them from seeking help?
RQ2: When and why do Bangladeshi women prefer exposing their online harassers, and what factors are significant while choosing possible online spaces for such actions?
RQ3: How do the existing tools, platform policies, and online environment often fail to assist them in this regard?
Our findings from this first phase show that their complaints about harassment and supporting evidence are often challenged on grounds of authenticity. Building on these, the second phase of the study aimed at designing a tool to support the women who seek help and justice for their online harassment. We build on the shame-based model of justice in this adversarial design [12, 28]. In this regard, we prototyped an application for women to help them seek justice for their online harassment. This paper introduces ‘Unmochon’, an application that helps with the authentication issues associated with the screenshots of chat history with the harasser that women often share while seeking help and justice. We evaluated this application with 48 participants using interviews and focus group discussions.
We found that women often prefer hiding their identity as a victim, fearing that this might bring them further victim-blaming and slut-shaming and may even engender real-life security threats. During the user evaluation of the tool, some participants also mentioned that any misuse of such technologies could lead to counter-harassment of the accused party and urged us to include such concerns in the set of agendas while designing feminist technologies. Our work makes a four-fold contribution to feminist human-computer interaction (feminist-HCI), social media studies, and information and communication technologies and development (ICTD). First, our findings detail Bangladeshi female internet users' online gender harassment and what kind of measures they take to address it. Second, drawing on our findings, we build a set of design assumptions that are practical and appropriate for assisting victims in seeking justice. Here, we bring the model of shame-based justice to design [12]. Prior works suggest that societies may have lower crime rates if shame about the crime is communicated effectively [16]. Therefore, shame about harassing women must be communicated effectively for seeking online gender justice and reducing women's harassment. Third, we prototype and evaluate Unmochon and report the perceptions of victims and other stakeholders of such tools that support online gender justice. Finally, we report findings from our discussions with the participants and other stakeholders that unfold different complicated issues entangled with cyber-harassment and social justice and suggest possible design directions for a safer online experience for women in the global south.

2 RELATED WORK
In building our research questions and making arguments based on our findings, we draw on the literature of three broader themes: technologies supporting women's safety, online movements for gender justice, and the literature of justice and gender harassment in Bangladesh. In this section, we briefly discuss some of the existing works built on these themes and situate our research questions in this regard.

2.1 Gender Harassment, Cyber Spaces, and Related Technologies
Online harassment is a widely discussed agenda in HCI, social media research, and gender studies. Online harassment not only stresses users out but may also cause economic damage to the platform when users choose to leave it or limit their activity [33, 59, 90]. Even though most online platforms and social media offer users help to handle such cases, users often find those processes slow, inconsistent, and dismissive [40, 91, 92]. Researchers working at the intersection of social media, web security, and HCI design have been engaged in developing machine learning models to detect harassment for years [12, 20, 36, 57, 70, 97]. Some of these models have also integrated human intervention in some cases for further effectiveness of the models [12, 20, 36, 70, 97]. However, these models often fail to address the needs of the users due to higher odds of deception and misclassification, and the presence of biases in the process of their training mechanisms [11, 46].
Along with all of the above-mentioned concerns and limitations, the existing state-of-the-art tools and mechanisms for fighting online harassment often fail to address culturally contextual gendered needs in such cases of harassment, including users' vulnerability due to their gender identity and a balance between their needs and expectations and their power and agency on social media [12, 33, 80, 84, 90]. In recent years, a group of researchers has shown that users' perception of online safety is subjective and depends on many factors, including their digital privacy and security and the values upheld by the community [73]. Some researchers have also engaged in understanding forms of harassment, its gendered angles, sealioning, and other relevant abuse on social media, and have called for designing more options on the platforms than blocking harassers and deactivating one's own ID in this regard [12, 13, 76, 90]. Jhaver et al. studied Twitter users' blocklists to interrogate online harassment and reported that users may not feel entirely protected by blocking the harassers and that blocked users often feel they are unfairly treated [51]. However, the design, features, and affordances of social media, including Facebook, vary across regions (see Facebook Lite and Messenger Lite for the global south [78]), and thus users' social media behavior is also diverse and culture-specific [61, 98]. Thus, the aforementioned rich body of work needs to expand by investigating how the patterns of harassment vary for users of different populations and how the existing supporting tools and techniques are challenged by those. Another body of literature has progressed over time in explaining how internet connectivity, messenger services, multimedia calls, and social media have been useful for women addressing and fighting their harassment and abuse. For example, Satchell and Foth found that women in physical spaces and on the street pretended to talk or text on their phones so that prospective harassers nearby would think that the women were active on a social network right at that moment and had the option to seek help right away if needed [77]. To boost their courage against prospective street harassers, some women also reported preferring to remain connected to their loved ones during their travel and to inform their loved ones over a phone call upon arriving at their destination [14]. Furthermore, there are many designs and interventions to address harassment on streets or in other public places in western countries. They include ComfortZones [14], CityWatch [53], Hollaback! [27], and CampusWatch [71], among others. However, addressing gender harassment using technology is more challenging in low-resourced countries in the global south because many of the users live in strictly patriarchal societies, and raising one's voice against men is often challenging and usually discouraged through victim-blaming and slut-shaming [3, 56, 86, 88]. Still, a thread of such designs has evolved over time to address harassment and crime in physical space. For example, Protibadi, ProtibadiNext, Safestreet, and HearMe in Bangladesh [2–5], Harassmap in Egypt [99], Safe Mathare in Kenya [39], CrimeID in Indonesia [96], and Safetipin, Samrisa, and Panic-button in India [56, 83, 89] were designed to address street harassment, seek help from passers-by, friends, or police, share stories online, and/or map harassment incidents on an interactive digital map.
However, gender harassment on social media, messenger services, and cyberspaces in the context of the global south has remained a largely understudied topic in HCI, feminist design, social media research, and related domains. One notable exception is Nova et al.'s work on understanding the pattern of sexual harassment on social media in Bangladesh [68], where they revealed how people close to women harass them anonymously using social media. However, their work does not involve any preventive design interventions to stop such harassment. This calls for growing a deeper understanding of the types and ways of harassment that Bangladeshi women often face online, and which kind of challenges often block them from seeking help (RQ1), so that we can design a safer online space for them.

2.2 Online Gender Movements and Protests
The history of online movements and protests is fairly new but rich. Often such movements and protests come with keywords that have specific meanings, followed by hashtags and associated activities. Covering all of them is beyond the scope of this paper. Here, we briefly describe some of the major and relevant feminist online movements and protests. Many of the early online feminist movements were launched based on ideological conflicts. For example, Pink Hijab Day (PHD) in 2004 was related to breast cancer awareness and aimed at countering the stereotypes of Muslim women [75]. The movement started in the US and gradually spread through social media calling for solidarity. However, this movement did not receive much organizational support outside of the US, and many Arab women denied their solidarity because of local constraints. Another significant Muslim women's solidarity movement also took place in the last decade in Germany when a gender activist organization, namely FEMEN, aimed at gender freedom and called for several on-street activities that did not align with the core sentiments of many Muslim women [93]. Those groups of Muslim women, self-defined as ‘Muslimah Pride’, protested on the street and spread the movement through Facebook, Twitter, and Instagram using the hashtag #MuslimahPride. Black feminist agendas hold a significant part of online movements when it comes to protesting against injustice. Most of those lie at the intersection of racism and gendered crimes. For example, #SayHerName started in 2015 to tell the story of violence against black women while circumventing traditional media barriers [95]. It started with the idea of making the sexual assaults and deaths of black transgender women noticeable, which the mainstream media kept ignoring, bringing out the stories of police brutality and violence against black women, and gaining justice for them. Similarly, #JadaPose, #StandWithJada, and #JusticeForJada on Twitter and other social media in 2014 stood for soliciting justice for Jada, a sixteen-year-old black teen who was a rape victim [94]. Another significant movement, #JusticeforLiz, shook the internet and social media in 2013 and sought justice for a Kenyan teenage rape victim pseudonymously tagged as ‘Liz’ [45]. Arguably the biggest online movement broke the internet in October 2017, in which women across the world used the hashtag #MeToo (or some variant of it) and publicly shared their untold stories of being sexually harassed [81].
The movement was founded in 2006 by Tarana Burke, a black female activist from New York [17, 18, 81], and the 2017 events followed Hollywood actress Alyssa Milano's call on Twitter on October 15, 2017: “If all the women who have been sexually harassed or assaulted wrote ‘Me too’ as a status, we might give people a sense of the magnitude of the problem" [38]. This call for sharing harassment experiences with the #MeToo hashtag followed Milano's own story of being sexually abused [63], and millions of women joined the movement. Some literature sees #MeToo as a descendant of the prior #MyHarveyWeinstein, #YouOkSis, and #SurvivorPrivilege movements [42, 66] and argues that it only gained prominence when several Hollywood female celebrities came out with their stories of sexual harassment [79]. Within a few days, #MeToo reached the Indian subcontinent, as many women flooded Twitter and other social media platforms with stories of their sexual harassment [43, 58, 66]. However, this wave also faced a similar backlash, in that victims here felt discomfort in disclosing stories of harassment if their harassers were someone at an upper level of the power structure. For example, Sarkar, a Dalit Queer, cohered with #MeToo by preparing a list of harassers in Indian academia [35]. In this regard, Sarkar shared that the victims did not provide any context while reporting and did not share any details explaining the incidents. The official investigation found a few academicians guilty, with no news of ongoing or planned inquiries for other names on the list [24, 79]. A retired female showbiz celebrity shook the media in late September 2018 with her posts about the experience of harassment by a colleague [25, 60]. Many other female and queer colleagues quickly joined her, breaking their silence [79]. While this movement successfully triggered academia, industry, and showbiz, the harassment of a large group of women from other professions, classes, and social statuses remained beyond its scope. Also, reports show that #MeToo in Bangladesh was less successful than expected because of cultural differences, lack of hope, and a lack of reliance on alternatives [43, 66]. The limited success of #MeToo in the Indian subcontinent aligns with the previous pattern of the online movement #SlutWalk back in 2012, which also flopped in this region [65]. Following the western wave of fighting for women's right to choose their attire and not be slut-shamed [64], there was a call for #SlutWalk in Bhopal, India. This was officially declared a failure due to insufficient in-person participation, while thousands of people showed their interest on Facebook and other social media platforms and promised to join [65]. However, not all online movements remain unsuccessful in the Indian subcontinent. For example, the protest against increased tax on female hygiene products (#LahukaLagaan, see [30]) is known to be one of the biggest women-led Twitter campaigns to demand policy change in India. This provokes us to investigate how social media and other technologies together leverage gender justice in the Indian subcontinent, and we ask what kind of roles different tools and technologies play or underplay in this regard (RQ3).

2.3 Model of Justice and Gender Harassment in Bangladesh
Finally, we turn to the literature of justice relevant to our context and research questions.
Prior works on gender justice showed that many female victims of street harassment are reluctant to report to police and law-enforcement authorities, since those authorities may not have been helpful previously and the investigations may lead to further inconvenience and cost them another round of harassment [48]. Also, reports show that women often feel that their reports to law-enforcement authorities may also bring them shame and blame for supposedly provocative behavior [48]. Another study revealed that such incidents often leave great stress on women, making them feel helpless and further limiting their mobility [3]. Also, they are generally not comfortable engaging in discussions regarding such harassment incidents because there is social backlash for bringing attention to street harassment, and the disclosure of being harassed could itself be used as a source of shame and embarrassment for the female victims [3]. It has been reported that women often vent their frustration regarding such incidents on social media and disclose their experience to friends, peers, or followers [6, 27], and there are tools and interventions that support such an agenda of gender justice sought online by disclosing identifiers, for example, an image of the harasser's face [27]. The reports on seeking justice online in the context of Bangladeshi gender harassment also cohere with this pattern, as we see from the reports [26, 50]. In most cases, women are fed up with the response from their situated infrastructure of legal and social support and thus find alternatives by exposing and shaming the harassers in front of internet citizens, often termed ‘Netizens’. This model of seeking justice aligns with the idea of ‘Mob Justice’, defined as “ruling by a group of mass" [41, 54]. Here the Netizens constitute the mob who decide the conviction. Although part of such a justice model seems to be unpredictable public opinion, mob justice has often been differentiated as a model of justice that emerges when proper democratic participation is missing in a justice system and people feel insecure and lose their faith in legal systems [69]. This practice of Bangladeshi women seeking mob justice motivates one of our research questions. In this study, we investigate when and why they prefer exposing their online harassers and what factors are significant while choosing possible online spaces for such actions (RQ2).

3 UNDERSTANDING ONLINE GENDER HARASSMENT IN BANGLADESH
We conducted an anonymous survey and investigated the nature of women's experiences of online gender harassment, their existing strategies for seeking help and justice for it, and how their networks often react to that. Then, through interviews with 43 participants, we grew a deeper understanding of various angles of online gender harassment to find possible design avenues. This phase of work was conducted between June and October 2019. This study was approved by the ethics review committee of the authors' institutions.
Table 1. Demographic Characteristics of the Survey and Interview Participants (details of the pre-design survey and interviews)
Survey (total number of survey responses: 91)
  Age range (in years): 18-25: 52 (57%); 26-35: 36 (40%); 36-45: 2 (2%); 46+: 1 (1%)
  Occupation: Student: 53 (58%); Public/private job-holder: 27 (29%); Teacher: 5 (6%); Others: 6 (7%)
Interview (total number of interview participants: 43; Female: 37, Male: 6)
  Age range (in years): 18-25: 18 (42%); 26-35: 23 (53%); 36-45: 2 (5%); 46+: 0 (0%)
  Occupation: Public/private job-holder: 27 (63%); Student: 7 (16%); Housewife: 3 (7%); Others: 6 (14%)

3.1 Methods
To get a broader understanding of Bangladeshi women's harassment experiences online and to develop ideas of their resistance and associated challenges, we conducted both an online survey and an interview study. The anonymous online survey allowed many women to talk about this taboo issue more comfortably. On the other hand, the interviews allowed us to go deeper into some of the stories coming from women who felt comfortable talking about this issue with us. We describe both of our methods below in further detail.
3.1.1 Online Survey. We hosted our anonymous online survey on Google Survey and circulated it online, including via emails to the authors' friends and in Facebook groups that the authors frequently used. Participants took part in the online survey by clicking on a link attached to the emails or the Facebook post. We requested responses from Bangladeshi female users only. The request for participants' consent came along with the post and the email. In the call for responses to the survey, we explained the purpose of the survey and ensured the anonymity of the respondents. The participants were allowed to proceed upon confirming at the beginning that this was their first time participating in the survey. The language of the survey was Bengali. It asked the participants about the type of harassment they faced on Facebook and other social media platforms, their relationship with the harassers, whether they sought help from friends and law-enforcement authorities, and other relevant information. Although most of the questions were structured as check-boxes or multiple-choice questions, the survey also included several optional open-ended text boxes where participants could freely express their opinions and concerns around their harassment on Facebook Messenger and other social platforms. No personally identifiable information (e.g., name, location, etc.) was collected from the survey participants in this study. We did not offer any compensation for participation. It generally took the participants around 10-15 minutes to complete the survey. The participants were allowed to leave the survey at any time without any further consequences just by closing the survey window in their computer browser. A total of 91 participants completed the survey. See Table 1 for the age range and occupation of the participants.
3.1.2 Interviews. To grow a deeper understanding of different angles of online gender harassment, we conducted semi-structured interviews with 37 female and 6 male Facebook users. Among them, there were 3 male police officers who have been working for several years on gender harassment in cyberspaces and one professional female lawyer who has been working on gender issues for the past three years. We recruited the participants from our professional networks on Facebook through convenience sampling [49].
We explained the purpose of the work to them while scheduling the one-on-one interviews. Since gender harassment is a stigmatized topic in Bengali culture, we let them choose between online and in-person meetings. We conducted the interviews in Bengali. We asked them about their and their friends' experiences of online harassment, especially in comments and chats on social media, what kind of actions they took, what kind of help they sought, and other relevant information. We moved to the deeper discussions only when they confirmed to us their comfort in talking about those incidents. The police officers and the lawyer helped us with different socio-cultural, ethical, and legal angles of online gender harassment and the existing support systems in the country for the victims. All the interviews started with oral consent from the participants and were audio-recorded on the researchers' mobile phones with the permission of the participants. We also took notes during the interviews. Generally, the interview sessions lasted around 30-40 minutes.
3.1.3 Data Collection and Analysis. All the survey responses were initially recorded in the storage of the Google survey. The data came anonymized, and we retrieved them in comma-separated values (CSV) format. Later we stored them in our secured storage space. First, we translated the CSV from Bengali to English and cleaned the data. We used open-source statistical tools in Python for further analysis. We also evaluated the responses qualitatively to understand the nuances of the online gender harassment faced by Bangladeshi female social media users. The interviews generated a total of 11 hours of audio recordings. We transferred them to a secured computer owned by the researchers. Then we translated and transcribed them. The transcriptions and the interview notes generated 80 pages of documented data. We removed the identifiers before conducting open coding and thematic analysis on them [15, 85]. Four of the authors independently read through the transcripts carefully and allowed codes to develop. Later they shared their codes with each other. A total of thirty-five codes were spontaneously developed initially, for example, stigma, victim-blaming, skepticism and mistrust, denial, slut-shaming, betrayal, blackmail, threat, etc. After a few iterations, we clustered related codes into themes and drew our design assumptions on them.
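To illustrate the kind of quantitative tallying this step involved, the following is a minimal Python sketch; the file name and column names are hypothetical placeholders rather than the actual fields of our translated survey export.

# Minimal sketch of tallying the translated survey export.
# "survey_responses_translated.csv", "harasser_relationship", and
# "harassment_forms" are hypothetical names used only for illustration.
import pandas as pd

df = pd.read_csv("survey_responses_translated.csv")

# Multiple-choice fields (e.g., the respondent's relationship to the harasser)
# can be tallied directly, with percentages over all responses.
relationship_counts = df["harasser_relationship"].value_counts()
print(relationship_counts)
print((relationship_counts / len(df) * 100).round(1))

# Check-box questions (e.g., forms of harassment) arrive as one
# delimiter-separated string per respondent and are split before counting.
harassment_forms = (
    df["harassment_forms"]
    .dropna()
    .str.split(";")
    .explode()
    .str.strip()
    .value_counts()
)
print(harassment_forms)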
3.2 Understanding Online Gender Harassment in Bangladesh
The findings from the survey and the interviews inform us about the harassers and the types of harassment the victims often face, the effects of harassment on them, the existing social and legal supports and how helpful those are to them, and how they often resist. This subsection highlights some of the key findings.
3.2.1 The Harassers. We found that more than 70 of our survey respondents (58%) were harassed by unknown people online [see fig-1(left)]. 11 of the interview participants mentioned that they tried visiting the Facebook profiles of their unfamiliar harassers, but their information was hidden or their Facebook profile used the "locked" feature. However, unknown online IDs may not always be strangers, as 5 of the interview participants mentioned that they had clues and evidence that their unknown harassers were someone they knew. We quote one of them,
“The unknown ID in the Facebook other-box shamed me for my bigger breasts and my skin being dark. Later we found that a group of students from our class together ran that account. Rather than apologizing for what they did, they said, "Relax, chill, and take it easy. It was not to insult you, we were just kidding." But that was no fun for me!", (P15)
45 of the survey respondents (36%) said that their harassers were their classmates, friends, and relatives who did not even bother hiding and harassed them upfront [see fig-1(left)]. A survey participant expressed her frustration and intimidation, as her harasser was a law student at her university, and hoped that this survey might help her find ways.
Fig. 1. (Left) Relationship between the harassers and the female victims in the context of harassment over Facebook Messenger; (right) whether women consider seeking legal support and, if not, why.
Another interview participant mentioned that her husband lived overseas and her sister-in-law and sister-in-law's husband are the temporary guardians of her and her toddler. She explained how her sister-in-law's husband tried to take advantage of her and used Facebook Messenger for that,
“The man keeps texting me on Facebook asking to see him outside alone, sends me vulgar images, and deletes those from his end in a few minutes. I can not even block him because then what will I tell my sister-in-law? They live next door, I often need to leave my baby to them if I go outside. I can not offend her, right?", (P28)
Another 13 of the interview participants shared similar stories of their online harassment incidents and explained how those were challenging since they came from family members or close relatives. Five of them even refused to discuss the details of the incidents, as they tagged those as a “private problem within the family" and thought discussing them would be inappropriate even for research purposes.
3.2.2 Types of Harassment. Our survey reports that online harassers often send women vulgar text messages and inappropriate multimedia content, request nude photos of them, threaten to defame them using photoshopped pornographic images of them, and sometimes even threaten to rape and murder them [see fig-1(right) for the breakdown]. We divide the patterns of harassment into five major categories: forcing a romantic relationship, seeking revenge, imposing conservative and religious sentiments, mocking using anti-religious sentiments, and venting random frustration.
Forcing for a Romantic Relationship. 39 of the survey respondents reported that they were forced to go into a romantic relationship with the harassers, date them, or marry them [see fig-1(right)]. Sometimes such proposals come with even worse kinds of threat, as 11 of the survey participants mentioned in their responses. We quote one of the survey participants' explanations from her response in the survey,
“One of my friends was threatened (on Facebook) by a senior student (who wanted to date her) at (name of the university where she studies), he said, “I will kidnap you from the university campus if you do not meet me in person". She was extremely frightened.", (S80)
15 of our interview participants informed us that they were repeatedly asked out and asked to marry the harassers through texts and calls, and even in the comment sections of their and their friends' posts.
Seeking Revenge. 13 of our survey respondents mentioned that their harassers were their ex-boyfriends or someone whom they had refused to date or marry before. Two of the respondents mentioned that their harassers continued even though the respondents have been married for several years.
Often these harassers tried to defame them in public, as we quote one of them,
“(After I refused to date my classmate)...he told people the truth by mixing lies in my name and tried to gain sympathy. He wanted to portray that it's me who played dirty with him, and threatened me that he will reveal all the things I share with him when we used to be friends and will prove to everyone that I am bad and so on...", (S34)
Thus, harassers of this category are generally known to the victims, and often they have something to defame the victims with, including romantic chat history, previous intimate photos, and other private multimedia content.
Imposing Conservative and Religious Sentiments. 19 of the female interview participants mentioned that their harassers, both online and in real life, used conservative and religious sentiments to harass them. In many cases, the female victims are cursed for numerous reasons, including their working or studying outside the home and their attire, as a justification for their harassment. Our participants opined that online spaces are easier for imposing these. Even our survey received such a response that name-called one of our male researchers and expressed objection to his wife's attire, as we copy that response,
“Girls need to cover themselves properly by doing “Parda". That will help to lessen harassment. People along with (one of the male co-authors' names) and others, please keep your wives covered following the rules of “Parda".", (S27)
Mockery Using Anti-Religious Sentiments. 4 of our interview participants were Hindu females, and all of them mentioned that their online harassers frequently used mockery and disrespectful words towards the Hindu religion. They explained that being women of a minority religious group makes them easier targets for such harassment, as we quote one of them,
“Being a Hindu is already challenging here, being a Hindu female is even worse... One Facebook account sent me some awful vulgar texts about Draupadi and her five husbands (characters from Hindu religious myths). It said that they were also five people running that account and kept asking me out with inappropriate offers... Using my religion to harass me is twice as hurtful.", (P15)
Venting Out Random Frustration. Our survey informs us that 17 participants' harassers called them over the phone or messenger call services and initiated threatening and inappropriate conversations. 21 participants received threatening and erotic texts, vulgar messages, inappropriate audio clips, and multimedia content from unknown accounts. Harassment for no reason may also come from known people, as one of the participants gave an example of her friend's experience,
“... Another friend of mine was consoling a guy she knew after his breakup, and he started saying things like ‘I named my dick the hulk’ and then sent her pics (images). It was so uncalled for, considering we knew him as an elite person.", (S80)
11 of our female interview participants also mentioned that they thought most of the harassers they encountered were possibly frustrated, and harassing random women online was probably a reflection of that. They also opined that harassers probably think women's inboxes are the safest places for venting out their random frustrations.
3.2.3 Effects of Harassment on Victims. In many cases, women's harassment is considered a normal phenomenon by their friends, family, and other possible supports.
Fig. 2. (a) Breakdown of the forms of harassment that female Facebook users face over the Messenger inbox.
Sometimes their complaints and frustrations are even discarded as a topic of discussion, as if pretty women are supposed to be harassed, as one of the survey participants explained,
“...(T)he comment was like, “As a beautiful girl, it's pretty common that guys will fall for you, so it's very normal. When pretty girls are given freedom, the girl and her family members/husband need to accept this kind of situations and understand that these kinds of incidents can happen with her.", (S60)
Furthermore, one female participant discovered that she had never experienced any gender harassment, at least not that she was aware of, and she expressed her surprise in her survey response. In the survey and during the interviews, many participants mentioned that taking those incidents was never easy for them and that these incidents caused some emotional, social, and educational damage, as we discuss below.
Restricting Social Life. Often women who face harassment are shamed in society, as many of our participants informed us. 14 of our participants mentioned that in many cases their friendly behavior was misread as sexual signaling and they were harassed. As one of the participants explained by giving an example of her experience,
“I was involved in a lot of organizational activities. But then some students started saying that I am doing those for attention. One day a text from a co-organizer arrived on my phone asking me how many times he needs to use his (body-part) so that I am convinced to be his partner in some event. When I saw him on campus, I asked him and he said he did not do it. But I know either he or his friends have done that. I have stopped being social since then, I do events occasionally, but I have a cold face now.", (P19)
Our interview participants also mentioned that this applies equally to domestic environments as well. 14 of the interview participants reported that at some point in their lives, they were harassed by their family or next-door neighbors. One of the participants shared her experience of being harassed by a cousin and how their families went into a bitter relationship,
“My cousin who is 8 years younger than me used to text me frequently and I replied nicely. Then he started sending me inappropriate media contents. I was shocked and asked him not to send those to me. Rather than stopping, he called me old-fashioned. I blocked him because he won't listen to me. Later my aunt, my mom, and I had an unpleasant conversation and my aunt cursed me. Now my mother blames me for speaking up. But I am not going to be social with that family anymore.", (P32)
Thus, women often stop being social in their real life to avoid harassment. This sometimes helps and sometimes fails, as several participants mentioned. Furthermore, sometimes they also need to limit their online social life because of their harassment, as 9 of the women mentioned, and here we quote one of them,
“I was first harassed online on Mig chat. I was only 17 then and it was only a few months that I started using it. An ID wanted me to act like I am doing physical intercourse with him, I did not even know that was called sexting. I was scared and stopped using it. But that was a way for me to stay connected to a lot of friends, so I lost those connections.", (P19)
Limiting Scope of Education and Career. Our participants mentioned that things are worse when their harassers share the same academic community.
13 of the interview participants mentioned that they have experienced harassment from their classmates, and some of them had to compromise their academic scope and opportunities because of this issue, as we quote one of them,
“I used to ask a lot of questions in the Facebook group of the class. That person put likes on all my comments. Then he texted me asking to see him separately and help him study. He would never stop nagging so I blocked him. Then he confronted me in the class one day. I was scared. I went to the registrar's office immediately, changed my class shift to the morning ones. Also, I deactivated my Facebook so that there is no way he can reach out to me, not even using another Facebook ID. But this is not a sustainable solution. It is very easy to find me on campus and harass me again.", (P23)
However, things are more complicated when the harasser is somewhere at an upper level of the power structure. In such situations, the victims are thought to be making things up to climb the ladder and further become victims of slut-shaming. 15 of the interview participants mentioned that either they or their friends have become victims of such slut-shaming, as we quote one of them,
“He is a junior teacher in my department at the university. He seemed very friendly with the boys in my class but he texted me strange things using Facebook message. I was astonished. I did not know whom to talk to, so when I saw my advisor next time, I informed him and asked him what to do. My advisor said that I should stop flirting with him as he would be beyond my league. But I did not flirt with him! And beyond my league!? What league? He was the awful one!", (P27)
9 of our participants also mentioned that at some point in their lives, their parents thought of stopping them from going to school, the tutoring center, or education in general, because they were victims of harassers from their schools or tutoring centers and avoiding those places was the best solution their parents could think of. 5 of the participants also mentioned that their mothers were child brides, as their grandparents married their mothers off to avoid the trouble of eve-teasing on their way to school. Furthermore, 13 of our participants reported that their or their friends' professional careers were impacted because of online harassment at some point. One of them explained,
“My friend used to work at a USA-based foreign company and her boss was located in the USA. So, often she had to work in the night shift and keep talking to her boss over messenger. He used to send her adult contents jokingly and later explained that such contents would keep her awake at night. And he used to send those in the messenger's secret conversation mode, so those would disappear in a few seconds. My friend told him that she is uncomfortable. But nothing worked. My friend just stopped going to the office, stopped receiving their calls, and did not even go there to retrieve her experience letter.", (P9)
Another 3 female participants mentioned that they also changed their jobs because of harassment that took place within their office network and internal platform, and they found no other way than quitting.
Emotional Damage. 21 of our interview participants termed online harassment ‘stressful’. They mentioned how such incidents stopped them from concentrating on their work, both momentarily and for longer periods.
Even if they tried working around it, things did not go back to normal as before, as one of the participants explained,
“It is not easy to stay calm once someone you do not know threatens you that they will enter in the night through your window and rape you. How do I even know that I do not know them? What if they are someone who has access to my house? When I first received a threat like this in my other box, I could not sleep a whole night.", (P22)
However, 7 women mentioned in their interviews that the harasser being a known person was difficult for them to handle and that is how it stressed them out. We quote one of them,
“One of my male friends in the class sent me awful jokes in the middle of one night. I thought he must have been sleepy while sending those. I ignored it and pretended that I did not see since talking about those would initiate further awkwardness. But things got worse when a male friend of his cracked a joke of me being his sexting partner. That was untrue and humiliating in front of a bunch of classmates. No matter how much I explained, I received that sarcastic gaze from them that I am one slutty bitch. I have stopped making friends. If friends are supposed to not support you rather make fun of you then I do not need one.", (P31)
Sometimes, even if the victims are classmates, their lack of control over the environment gives the harassers some extra benefit, and this further damages the victims emotionally. One of the participants explained how it was difficult for her, as we quote her below,
“I was one of the friendliest persons, but not anymore. They misinterpreted me intentionally and further posted my caring comments and questions as if I am trying to play slut with them. Seeing a strange vulgar framing of my questions and comments as Facebook posts was traumatic for me. I begged them to stop, but they did not listen and said, “it is just a joke". I was so broken that I had to take sleeping pills for the next few nights. I had to take sessions (at the psychiatrist's).", (P17)
When we asked her if she took any step against it, she mentioned that she made a complaint at the school, but her school refused to look into the case and told her that cyberspace is beyond the scope of the school campus.
3.2.4 Existing Social and Legal Support. In the survey, we asked the participants if they had ever considered seeking help from police and law-enforcement authorities. 19 of them responded that they did not know the appropriate process. Another 10 mentioned in their survey responses that their parents are reluctant to get into legal issues. Some of the respondents mentioned their mistrust in local law and law-enforcement agencies, as we quote one of them,
“After I reported the incident to one of the faculties, he told me to contact the police authority. But I didn't dare to speak about this at home because my family is a little backdated, so they will blame me instead. And I have enough doubts about whether the police will take it seriously if I tell them about this incident.", (S34)
One of the survey participants also mentioned in her response that involving the law-enforcement agencies might even bring further trouble to her day-to-day security, as we quote her,
“I do not know if it can be brought under the law.
And if there is a law which is not enforced, I am not sure whether it will be a threat to my security or not.", (S2)
When we discussed this with three of our professional participants, they explained to us that many times it is difficult to find the harassers and bring them under the law because they either use a fake Facebook ID or a cloned ID. They also mentioned the challenge with digital evidence,
“Often we receive complaints about online gender harassment and they mention comments and private chats. They also send us a copy of those in images, but those images may not work as solid evidence. See, today, there are many software tools that can manipulate the images of chats and comments on Facebook. One way to solve this challenge is that they would come by our office, log in to their Facebook from our computer, and show us the evidence from there. Or, they can log in using a desktop and the browser to enter Facebook and video-record the chat history along with the address bar on the top so that we can see who exactly was the person they chatted with. This might minimize the scope of manipulation.", (P41)
However, in many cases these options are troublesome for many of the victims because of their limited resources. Also, filed police cases take a long time to resolve. Thus, many of our participants mentioned that even if they knew the process, they would just pass on complaining.
3.2.5 Resistance. During the sessions, we investigated how the participants address their online harassment and documented their responses. We cluster the ways into three themes: ignoring and blocking, reporting to the authority and the platform, and exposing and shaming. We briefly discuss them below:
Ignore and Block. 27 of our interview participants informed us that they think it is better to ignore the harasser silently. 75 of our survey respondents and 23 of the interview participants mentioned that they have blocked their harassers on the social media they used. But blocking does not end the story in many cases, especially when the harassers are desperate and fearless. As one of the interview participants explained,
“I had a friend who was harassed by a man from her village. He used to text her asking her out and my friend never agreed. He sent awful things he said he would do to her once he could force-marry her. No matter how many times she blocked his phone number on her phone, and his Facebook IDs, it does not help. He would change the phone number and get a new Facebook ID and restart.", (P9)
Another participant gave an example of her friend who was harassed by her school teacher in class and also over Imo a few years back. The participant explained how blocking and ignoring were not feasible for her, as we quote her,
“He used to touch from the side while pretending to check her papers. In the night he would explain on Imo how much fun it was to touch her breast that way. My friend could not tell her parents because he was the teacher, could not even block him because her parents are connected to him using Imo.", (P7)
Thus, when women are harassed by known people with whom they have connections in real life, they try blocking and ignoring, but those do not solve their problems due to their social relationships.
Report to Platforms and Rightful Authorities. 23 of our interview respondents mentioned that they have reported their unknown harassers on the social media platform they used.
But the process of reporting to a social media platform often also just suggests that they block that person, which is not a practical solution in many cases, as we explained earlier. Furthermore, online harassment in professional groups is more challenging for women to handle this way. Over the course of the interviews, we collected 13 cases where women's online harassment happened in their workspace, and reporting to the rightful authority was difficult since those harassers were either their seniors at work or their favored ones, as 11 of the stories suggest. In the remaining 2 cases, when the victims complained to their higher officials, those cases were resolved with minimal punishment, as one of the respondents shared her friend's experience,
“Her teammate wanted to date her and she was not interested. He started cursing her in texts and even in the office emails since she ignored him. When she reported to her team-lead with copies of those emails, he ‘helped’ resolve the case and said, ‘This is a minor problem and this may happen when we work together. Let it go.’ Their office asked her to delete the copies of the emails from her end since they wanted to keep the environment toxicity-free and keep the reputation of the team neat and clean.", (P12)
Thus, this participant explained her friend's frustration, as she could not block or ignore him, nor did her reporting to the rightful authority work, and she was requested to destroy the evidence.
3.2.6 Motivation to Expose and Shame. Our participants mentioned several reasons that influence online gender harassment victims to expose their harassers online. These reasons include: i) many of the victims find justice in shaming the harassers in public, ii) women want to warn others to be aware of the harassers, iii) exposing the harassers online helps them prove the case of harassment, and iv) in many cases, exposing helps push formal reporting to the rightful authorities. We discuss these reasons below.
Justice in Shaming. Sometimes women facing harassment feel that exposing their harassers should be a possible response while seeking gender justice. 21 of the interview participants mentioned that they think shaming is the way to prevail when it is challenging to bring the harassers to justice through social, professional, and legal systems. One of the participants who lived in a small town shared a story of a school teacher who harassed her and many other girls over messengers,
“He picked the under-performing girls, sent them porn images, and asked them to explain the details. If someone refused, then he would complain to her parents that she is inconsistent with studies. One of the girls finally exposed his stupid activities. I am glad that what I could not do during my days, she did it. Now he is known as a predator in the town, he has lost most of his private tuition. Girls like me were scared and suffered alone, he should suffer in public.", (P37)
Thus, 15 of the interview participants mentioned that they have considered exposing or have exposed their harassers by publishing the harassment stories and a copy of their conversations, if available, in public. The victims believed that this would impact the harassers' social and professional lives and that this is the punishment they deserve.
Blowing the Whistle to Warn Others.
9 of the interview participants who went to the same university mentioned a similar story of a harasser at the university who harassed many female students in their Facebook inboxes and later got exposed in a Facebook group related to that university. One of the participants explained the story to us,
“My roommate was a victim of his consistent dating proposals. And a girl living 2 rooms down from ours also kept receiving the same proposals. My friend was afraid that the guy might get vindictive. Meanwhile, his harassment stories started coming out in (name of the group). My friend and I convinced a male friend to share screenshots from our side to show that there are more victims of this harasser. And we wanted to tell the other female students that they should be aware of this person.", (P29)
Thus, these female participants mentioned that they did not see justice prevailing only through punishment or shaming, but through finding a way that no person becomes a victim of the same thing again. They also explained that since the society is patriarchal and men are considered superior here, it is an injustice to a girl not to inform her about the predators around her.
Burden of Proof. 13 of our participants opined that when female victims raise their voices against their repeat harassers, they are often asked to submit proof. In many cases, their cases are dismissed because they did not save or post the evidence. One of the participants explained this by sharing her experience with us. She explained that she was involved in organizing an event with a group of senior alumni at her school and one of them harassed her online. He kept asking her to see him alone and spend some ‘quality time’, as he termed it. When she raised her voice against it, she was shut down because of a lack of proof, as she explained,
“He used to send me strange texts using secret message mode (on Facebook Messenger). Once the celebration was over, I raised my voice and posted the story with a copy of the most recent text from him on the school alumni group. But his friends asked me where is the proof that he ridiculed me consistently and why am I saying it now... I wish I saved the previous texts as well.", (P23)
The participant further explained how the friends of that harasser started dogpiling. Another 5 participants also shared a similar pattern of experiences where people asked for proof even for repeated harassment incidents.
Exposing is Reporting. As we showed, when formal reports to the concerned authorities fail or are dismissed, organizing mini-movements on social media against the harassers might help establish gender justice. For example, here is a story that one of the interview participants shared. A renowned professor in their department was accused of online gender harassment. Initially, some of his students and colleagues refused to accept that because of his skyrocketing reputation. We quote one of the interview participants,
“When the first girl raised her voice on Facebook, they challenged her by asking, “Did he really harass you in the chat?". Then she gave screenshots and they said she made it up. Then gradually more women came out, some from other universities and even school kids who got connected to him through a competition. Some girls claimed that they formally reported at the university but their cases were dismissed as he is a star professor.
Finally, when he lost his reputation, the university stepped in and fired him.", (P20) Our participants mentioned that sometimes it is difficult even to start a conversation on harassment due to fear of further victim-blaming and slut-shaming. Thus, 17 women mentioned that while reporting these to anyone outside their friends, women should either use a fake ID, possibly a male one, to hide behind it and avoid victim-blaming and slut-shaming, or request a male friend to speak for them for more credibility.
3.3 Summary and What is Missing
In summary, we found that women are often victim-blamed and slut-shamed for their harassment. Therefore, they generally prefer to keep harassment incidents a secret. They share them within their circle of friends and solicit possible solutions. Sometimes, they also try to solve the problem locally while hiding their identity. The existing process of seeking legal help is often complicated, lengthy, and challenging for them for a number of reasons. So they respond to their harassment by exposing the harasser with evidence from the conversation history and by showing solidarity with other victims of the same harasser. Still, they often face backlash from the people they seek support from, as these groups often refuse to believe the incidents of harassment. In such cases, even if the victims present screenshots, people are often skeptical about whether the evidence is authentic. The professionals working for the police informed us that confirming the authenticity of digital evidence of harassment is challenging and the forensic procedure is also complicated. Still, they try to work it out by either collecting evidence in person or recording videos of the chat where they actually get to see the harasser's ID. However, this complicated process further discourages the victims from seeking legal help. Today, social media platforms are poorly designed in this respect. Widely used social media platforms in Bangladesh, including Facebook, are missing a feature that would help online gender harassment victims prove the authenticity of their harassment evidence and help them engage in conversation with law enforcement organizations to establish online gender justice with minimum hassle.
4 DESIGNING ‘UNMOCHON’: A TOOL SUPPORTING GENDER JUSTICE
We draw on the findings from the survey responses and the interviews and translate them into a design. We named our tool Unmochon. It is a Bengali word which means 'disclosing' or 'unveiling' something. One of our pre-design interview participants discussed the idea of a hypothetical anti-harassment tool and named it 'Unmochon'. Later, we borrowed the name from them. This section describes our design goals and the components of the application, followed by the details of implementation.
4.1 Design Goals
Our findings from the pre-design survey and interviews inform us of the denial that victims receive while seeking online gender justice. Therefore, we set our objective to help women collect evidence of their harassment in a way that could be accepted as authentic by the concerned parties. We borrow the idea of shaming from shame-based design for gender justice [12]. We select Facebook as the platform for the deployment since Facebook is one of the most widely used social media platforms in Bangladesh. We draw on our findings to determine four design goals necessary for a successful online anti-gender harassment tool:
G1: Capture Screenshot with Authenticity.
The participants described scenarios where they saw general people on Facebook being suspicious of whether the screenshots were genuine. In some cases, the screenshots were manipulated and some words were replaced. This kind of scenario urged us to set the goal of designing a tool that captures the screenshot with minimal or no opportunity for manipulation.
G2: Confirm that the Facebook Account in the Chat and the Facebook Account Being Accused Are the Same. In some cases, someone opened a cloned Facebook ID of a particular person, pretended to be him, and harassed women over chat to defame him. This scenario urged us to set the goal of designing a tool that confirms that the Facebook account in the chat and the Facebook account being accused are the same. This will also make sure that no innocent individual gets reported and shamed in this process.
G3: Hide the Identity of the Victims. Some of the participants reported that they would prefer opening another Facebook account or requesting friends to make the case on their behalf so that they could stay behind the curtain and avoid victim-blaming and slut-shaming. Thus, our third design goal is to create such an affordance and lessen women's dependency on others.
G4: Share Screenshot in Public. In most cases, victims share the evidence of harassment in public to shame the harasser, find more victims of the same harasser and share moral support, and seek mob justice. Our final goal is built on this existing practice, and we aim to create a convenient platform for them to share their evidence and proof of its authenticity with minimum hassle.
4.2 Components and Workflow of Unmochon
From the user needs and design goals arising from the interviews, we prototyped Unmochon. It is a Windows-based application. It has three main components: the browser plugin at the user-end, a server for storage, and a dedicated Facebook group run by human admins.
Fig. 3. Work-flow diagram of the Unmochon application with three major components: the plugin at the user-end, the server with end-to-end encryption, and the customized Facebook group.
4.2.1 The Browser Plugin. The application comes with an unmochon.exe plugin. The purpose of the plugin is to take screenshots from the chat window of the victim and prepare them for sharing. The initial version of the plugin that we used for the study only works on the Chrome browser. It is developed on the Java platform.
Taking Screenshot. Once the plugin is installed, a welcome window pops up with a button on it saying 'Take Screenshot'. After pressing the button, the application gives the user 10 seconds to open, in the Chrome browser, the Facebook Messenger chat thread they want to capture. Once the 10 seconds are over, it automatically captures a screenshot and copies the URL from the address bar, which contains the unique Facebook ID number of the account the user was chatting with.
Hiding Identifiers. Upon capturing the screenshot, the application opens an editing window that allows the users to use a drawing pen only, so that they can hide their own name, display-picture thumbnail, and any related sensitive information that they prefer not to make public (see the components and the flow in fig-3 (left)). Once they are done hiding such identifiers, the application allows them to save the image of the chat, ready to be sent to the server for further processing. However, the step of hiding identifiers is optional, and users can skip it by pressing the 'cancel' button and proceed to the next step of reporting, which sends the image and the harasser's Facebook account number to the server.
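To make the capture step concrete, the following is a minimal sketch of how a delayed screen capture and address-bar ID extraction could be implemented with the standard java.awt.Robot API. The class and file names (CaptureSketch, evidence.png) are our own illustrative choices, the sketch assumes the Chrome window holding the chat is focused, and the actual plugin's implementation may differ.

    import java.awt.Rectangle;
    import java.awt.Robot;
    import java.awt.Toolkit;
    import java.awt.datatransfer.DataFlavor;
    import java.awt.event.KeyEvent;
    import java.awt.image.BufferedImage;
    import java.io.File;
    import javax.imageio.ImageIO;

    // Illustrative sketch only: delayed capture of the focused Chrome window and
    // extraction of the chat partner's numeric Facebook ID from the address bar.
    public class CaptureSketch {
        public static void main(String[] args) throws Exception {
            Robot robot = new Robot();
            robot.delay(10_000); // give the user 10 seconds to open the chat thread in Chrome

            // Grab the whole screen; the real plugin then hands this image to its editing window.
            Rectangle screen = new Rectangle(Toolkit.getDefaultToolkit().getScreenSize());
            BufferedImage shot = robot.createScreenCapture(screen);
            ImageIO.write(shot, "png", new File("evidence.png")); // hypothetical file name

            // Focus the address bar (Ctrl+L), copy it (Ctrl+C), and read it from the clipboard.
            pressCombo(robot, KeyEvent.VK_CONTROL, KeyEvent.VK_L);
            pressCombo(robot, KeyEvent.VK_CONTROL, KeyEvent.VK_C);
            robot.delay(300);
            String url = (String) Toolkit.getDefaultToolkit().getSystemClipboard()
                    .getData(DataFlavor.stringFlavor);

            // A Messenger thread URL typically ends with the numeric account ID,
            // e.g. https://www.facebook.com/messages/t/100012345678901
            String accountId = url.substring(url.lastIndexOf('/') + 1);
            System.out.println("Captured chat with account: " + accountId);
        }

        private static void pressCombo(Robot r, int modifier, int key) {
            r.keyPress(modifier);
            r.keyPress(key);
            r.keyRelease(key);
            r.keyRelease(modifier);
        }
    }

In the real application, the captured image is handed to the editing window described above rather than written straight to disk, and the captured account ID travels to the server together with the image.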
4.2.2 Storage in Server. Once the screenshots reach the server, they are stored there. The server then puts a stamp on the image that says 'Verified by Unmochon', which means that the screenshot of the harassment chat history was taken and reported using the application and thus is unlikely to have been fabricated. The mechanism for putting the stamp is preset and functions only when the Facebook account number reported by the user and the one retrieved by the plugin while taking the screenshot are the same (see the components and the flow in fig-3 (top-right)). The mechanism also detects the mode of the report by analyzing whether the user has kept their identifiers (mode-1) or hidden them (mode-2) in the sent image, before sending it to the Facebook group. Thus a package of four items is sent to the Facebook group: the image, the victim's requested mode and Facebook account number, the accused harasser's Facebook account number, and the number of times the same Facebook account number has previously been reported as a harasser. We used the Google Cloud Platform for the server. All the communication between the server and the Facebook users is end-to-end encrypted.
4.2.3 Facebook Group. This is a Facebook group with all the posts preset to the privacy setting of 'public'. Only the application developed by us can post here, upon approval by human admins (see the components and the flow in fig-3 (bottom-right)). This admin panel consists of responsible members of society, including members of law enforcement, psychologists, and gender activists. Once the package from the server arrives at the group and requests posting, the admins check whether the report is spam. There are two modes of reporting. If it is 'mode-1', then the post comes with the victim's name and Facebook account number, the screenshot of the harassment chat history, the accused harasser's Facebook account number, and how many times this person has been reported previously. For a 'mode-2' report, the post omits the victim's name and Facebook account number. We used the Facebook Graph API for the communication between the server and the Facebook group.
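As an illustration of the verification step described above, the following is a minimal sketch rather than the production code: the names VerificationSketch and Report are hypothetical, the mode detection is simplified to a flag instead of analyzing the image, and the posting of the resulting package to the moderated group via the Facebook Graph API is not shown.

    import java.awt.Color;
    import java.awt.Font;
    import java.awt.Graphics2D;
    import java.awt.image.BufferedImage;
    import java.util.HashMap;
    import java.util.Map;

    // Illustrative sketch only: stamp the screenshot when the user-reported account
    // number matches the one the plugin captured, and build the four-item package.
    public class VerificationSketch {
        private final Map<String, Integer> reportCounts = new HashMap<>();

        public Report verify(BufferedImage shot, String reportedId, String capturedId,
                             String victimId, boolean identifiersHidden) {
            if (!reportedId.equals(capturedId)) {
                return null; // IDs disagree: do not stamp, do not forward to the group
            }

            Graphics2D g = shot.createGraphics(); // overlay the 'Verified by Unmochon' stamp
            g.setFont(new Font("SansSerif", Font.BOLD, 24));
            g.setColor(Color.RED);
            g.drawString("Verified by Unmochon", 20, 40);
            g.dispose();

            int priorReports = reportCounts.merge(reportedId, 1, Integer::sum) - 1;
            int mode = identifiersHidden ? 2 : 1; // mode-2 keeps the victim anonymous
            return new Report(shot, mode, mode == 1 ? victimId : null, reportedId, priorReports);
        }

        // The package handed to the moderated Facebook group (posted via the Graph API, not shown).
        public record Report(BufferedImage image, int mode, String victimId,
                             String accusedId, int priorReports) { }
    }

The essential property is that the stamp is applied only when the account number reported by the user matches the one the plugin captured from the address bar, so a mismatch silently stops the report from ever reaching the group.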
5 USER-EVALUATION OF UNMOCHON
Our findings from the survey and pre-design interviews suggest that online harassment of women is a sensitive issue and victims are often in a vulnerable position due to a lack of socio-cultural support. So, we were wary of conducting an intervention with the participants; rather, we decided to share our plugin with the users with no active connection to the server and the Facebook group, explain the whole idea, and seek their feedback. We first prepared a user-study package, which included a prototype version of the plugin and a user guide to the application.
Fig. 4. The Unmochon application takes a screenshot of the Chrome tab that is currently open. It also copies the harasser's unique Facebook ID from the address bar. Before sending it to the database and posting it to the dedicated Facebook page for Unmochon, the application window only allows the users to mark the photo with red ink. (All the purple geometric shapes in the figure were post-edited for anonymization.)
To avoid unintended posting during the user-study, we shared the version of the prototype with no connection to the server and the Facebook group (see fig-4, where the attempt to post on Facebook fails). For their better understanding, we also created a demo video and added it to the package that we shared with the participants. This section details the methods used and the data collection and analysis process.
5.1 Methods Used to Collect User-Feedback
5.1.1 Focus Group Discussion (FGD). We conducted a total of 6 FGD sessions with 19 female (4 groups) and 10 male (2 groups) participants. 13 of the female participants were from the set of participants of the pre-design interviews. We recruited the rest of the participants from our professional networks on social media, from Facebook groups, and from friends. Upon reaching out to them, we explained the purpose of the work. We grouped participants of the same gender who knew each other relatively well and set up FGDs with them. We also sent them the user-study package a few hours before the session so that they could try it out and have some time to think about it. In the FGDs, we explained the purpose of the application and helped them go through it step by step if needed. Then we discussed what kind of benefits this application might bring for women fighting their harassment on Facebook Messenger in different kinds of cases. We also asked them what kind of challenges and troubles this application might generate for the users and other stakeholders. The sessions were conducted in the Bengali language. The average length of the sessions was 35 mins. We requested the participants' permission to audio-record the sessions and received permission to record 5 of them. We also told them that they could withdraw from the discussion at any moment if they felt uncomfortable. However, no such incident happened.
Table 2. Demographic Characteristics of Participants of Post-design User-Feedback (Details of Post-Design User-Evaluation)
FGD (Total Number of FGD Participants: 29; Female: 19, Male: 10)
  Age Range (in Years): 18-25: 25 (86%); 26-35: 4 (14%); 36+: 0 (0%)
  Occupation: Public/private job-holder: 17 (59%); Student: 12 (41%); Others: 0 (0%)
Interview (Total Number of Interview Participants: 19; Female: 13, Male: 6)
  Age Range (in Years): 18-25: 6 (32%); 26-35: 11 (58%); 36+: 2 (11%)
  Occupation: Student: 6 (32%); Public/private job-holder: 13 (68%); Others: 0 (0%)
5.1.2 Interviews. Along with the FGDs, we conducted 19 interviews (13 females and 6 males) with Facebook users who requested a one-on-one conversation instead of a group discussion. 8 of them were from the set of participants of the pre-design interviews. As before, we recruited the rest of the participants from our professional networks on social media, from Facebook groups, and from friends. Upon reaching out to them, we explained the purpose of the work. We also sent them the user-study package a few hours before the interview so that they could try it out and have some time to think about it. During the interviews, we explained to them the purpose of the application and helped them go through it step by step if needed. Then we asked them what benefits they saw and how this application might be effective for different kinds of gender harassment scenarios on Facebook Messenger. We also asked them what kind of challenges and troubles they could imagine this application generating for the users and other stakeholders. All of the interview sessions used Bengali as the primary language.
Each interview generally took between 25 and 40 minutes to complete. 13 of the interviews were audio-recorded with the permission of the participants. We gave the participants every opportunity to leave the interview if they felt uncomfortable, even during an ongoing session. We also informed them that we would discard the record of their participation if they wanted. However, no such event took place over the course of the study.
5.2 Data Collection and Analysis
The audio files of the interviews were recorded on the researchers' phones and later saved to a hard drive for further data processing. We collected a total of approximately 8 hours of audio recordings and around 100 pages of interview notes. In the analysis process, we first transcribed the audio recordings and later translated them into English. We then performed thematic analysis on the transcriptions and our detailed notes [15, 85]. Four of the authors independently read through the transcripts carefully and allowed codes to develop. Later, they shared their codes with each other. A total of 24 codes emerged spontaneously during the first round of coding. We then clustered related codes into themes after a few iterations. Some of the themes recurred, for example, vindictiveness, privacy, posting, recognition, and justice. Such themes influenced the organization of our findings section presented next.
6 FINDINGS FROM USERS' FEEDBACK
We organize the findings from the user evaluation into three categories: design concerns and suggestions, justice concerns, and feedback on the user interface and process. We discuss them below.
6.1 Design Concerns and Suggestions
One of the suggestions that came out during an FGD session with 6 participants was whether it would be possible to check the page metadata and include a verification status with the screenshots before sending them to the server, and to add that status to the reports when posting them on Facebook. They explained that this approach could strengthen the authenticity of the evidence presented by the victim. Another group of 5 participants discussed the possible disadvantages of letting the harassment reporters hide their identifiers by drawing lines over them. They discussed a hypothetical scenario where someone manipulated the page through its HTML elements in the browser, then captured the screenshot and hid the identifiers. In the same conversation, one of the participants suggested that a possible solution to such a scenario could be to let the application report the screenshot along with the metadata of the page. They also suggested that the tool should automatically reload the page by itself before it captures the screenshot, to avoid possible manipulation of the chat through some advanced engineering with the HTML elements of the page (we sketch one possible interpretation of this suggestion at the end of this subsection). Concerning this point, one of them suggested developing this as a mobile app. 10 of our FGD participants anticipated that once such a tool is available, the server will be flooded with reports, many of them from random people sending random reports that have nothing to do with harassment. They also proposed that this tool needs an intelligent filter to select and let in harassment-related posts only. Also, in these conversations, the participants discussed the need for a guideline defining harassment. Another group of 4 participants raised the concern that a clever harasser would immediately know which of their victims has reported them just by seeing the screenshot in the report. They therefore suggested reporting the name of the harasser upon verifying the authenticity of the evidence, without exposing the image of the chat, in order to hide all possible identifiers of the victim.
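One way to read the reload-and-metadata suggestion is sketched below. This is a hypothetical interpretation (ReloadAndMetadataSketch and CaptureRecord are our own names) that stays within the current Robot-based desktop approach: a reload only discards in-page DOM edits, and an image hash only protects the screenshot after capture, so neither by itself proves that the page was genuine, which is partly why participants also suggested richer page metadata or a mobile application.

    import java.awt.Rectangle;
    import java.awt.Robot;
    import java.awt.Toolkit;
    import java.awt.event.KeyEvent;
    import java.awt.image.BufferedImage;
    import java.io.ByteArrayOutputStream;
    import java.security.MessageDigest;
    import java.time.Instant;
    import java.util.Base64;
    import javax.imageio.ImageIO;

    // Hypothetical interpretation of the participants' suggestion: reload the chat
    // before capturing, then attach simple capture metadata (time and image hash).
    public class ReloadAndMetadataSketch {
        public static CaptureRecord capture() throws Exception {
            Robot robot = new Robot();

            // Ctrl+R in the focused Chrome tab: a reload discards in-page DOM edits.
            robot.keyPress(KeyEvent.VK_CONTROL);
            robot.keyPress(KeyEvent.VK_R);
            robot.keyRelease(KeyEvent.VK_R);
            robot.keyRelease(KeyEvent.VK_CONTROL);
            robot.delay(5_000); // wait for the chat thread to finish reloading

            BufferedImage shot = robot.createScreenCapture(
                    new Rectangle(Toolkit.getDefaultToolkit().getScreenSize()));

            // Hash the PNG bytes so the server can detect later modification of the image.
            ByteArrayOutputStream png = new ByteArrayOutputStream();
            ImageIO.write(shot, "png", png);
            byte[] digest = MessageDigest.getInstance("SHA-256").digest(png.toByteArray());

            return new CaptureRecord(shot, Instant.now(), Base64.getEncoder().encodeToString(digest));
        }

        public record CaptureRecord(BufferedImage image, Instant capturedAt, String imageSha256) { }
    }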
6.2 Justice Concerns
During the user evaluation, our participants also discussed and expressed concerns about the hegemonic legal consciousness of the stakeholders in the conversation around gender justice; about many aspects of mob justice on social media and how those influence gender justice; and about offenders' privacy in this regard. We discuss these below.
6.2.1 Hegemonic Legal Consciousness. One of our interview participants, P61, is a lawyer, and she informed us that institutional justice in Bangladesh is neither transparent nor female-friendly. The system is inclined more towards power-politics. Thus, the victims often find it easier to seek justice in social shaming. She told us some stories from the gender harassment cases she had recently handled and shared one of them as a relevant example. A female debater developed an intimacy with a fellow male debater, and that affinity turned toxic at some point. Later, the female debater turned in a written complaint to their association and accused the male colleague of gender harassment in her messenger inbox on a particular social media platform. As P61 explained, "The female submitted some screenshots of their chat history to their association to make the case of harassment. Then the accused offender submitted some evidence from earlier chat histories to prove that they were in a romantic relationship at some point and claimed that the female colleague is accusing him of harassment to break his upcoming marriage. On the basis of the submitted evidence, the association came to the conclusion that the female colleague was NOT harassed, but the male colleague should also have behaved himself." (P61) P61, thus, explained how gender, social position, and other hegemonies influence the transparency of the model of justice in Bangladeshi society, including professional spaces. She further stretched this conversation and explained how such a social and professional setting often disregards women's sensitivity to their online harassment. She pointed out that it would generally be difficult for women to seek out online gender justice, even with a technology like Unmochon.
6.2.2 Mob Justice on Social Media. Our participants brought up several challenges in seeking gender justice on social media. During our discussion with one of the male participants, he informed us that communication skill matters the most in seeking online mob justice. He explained, "When a female victim exposes the online harasser to the online community shared by both of them, often her communication skill plays a role here. Sometimes people visit the victim's Facebook profile and thoroughly investigate if she looks suspicious. If she wears modern clothes, then people often redirect the case and say she was asking for it. Often women lose control over their temper at that point and start fighting. However, if the victim stays calm, ignores the slut-shaming, and responds to the audiences with relevant evidence then the victim wins their support." (P73) In the same discussion, he added that storytelling is equally important. When a woman keeps telling a consistent story on social media, and the clues that she leaves connect to reality and to the versions told by the other stakeholders, then that story is widely accepted and the victim is likely to be trusted by the people.
Therefore, whether the evidence is compiled into a coherent narrative rather than scattered truths often makes the difference, as he explained, "There could be multiple versions of the same story. But it is the woman's burden to speak out the version in a way that aligns with plausibility. If she sounds strange and illogical in view of the public, then her claims are denied. For example, a group of words, "I really need to meet you this evening, if you do not meet me then that is going to be a problem" could mean a lot of things. People will be curious about why the accused person wanted to meet her. People on Facebook would say it is her job to provide more context and coherent evidence so that it can be categorized as harassment and not some business or real needs." (P73) 8 of the interview participants were also concerned about the standard of judgment of the people on Facebook, as there are diverse groups of people with different ideologies and philosophies. They were concerned about who would decide the metrics in a complex social setting of multiple moralities, as they expressed it this way, "There are different types of people on Facebook. Some are extremely patriarchal, some are extremely conservative. There are also people who are against women using Facebook since they think women meeting unknown people on Facebook are losing their purdah, and this is against the women's purity concept in their practicing Muslim ideology. So when a woman comes and seeks justice here, if she has any such people in the community in front of which she is trying to establish gender justice, she might face a backlash as the mob would say she should not have talked to men or created a scope to let the men talk to her. They might say she invited it. So in such cases, even if she proves that the harassment was true, she might still fail to establish justice because of this blame." (P64) 13 of our participants pointed to the fact that repeatedly exposing the offender in front of the online community shared by both parties using screenshots of chat history might lessen the gravity of their complaints and could be problematic for them. Moreover, the community becomes involved in the process of investigation and may grow curious about further details that are irrelevant to the complaint made, but those details can later become the subject of further harassment. For example, P67 explained the story of one of her female friends, "A few months back, my friend first complained against a male classmate in our class's Facebook group with no solid proof. Many of the group-members found that baseless and dismissed her accusation. Recently, when she complained against the same person again, the harasser and his fellows said that she was deliberately doing this again and pointed to the previously dismissed complaint. Many people in the group then dissed her recalling her previous complaint, my friend felt sick and left the group." (P67) Thus, our participants informed us through many discussions that securing online gender justice with a mob requires not only authentic evidence; how the victim communicates and how they align with mob sentiment are equally important factors. They further stretched the discussion and urged that our design goals should include such needs of the victims in the upcoming versions of this prototype.
6.2.3 Offender's Privacy.
5 of our interview participants also pointed out that the harasser might make a case of privacy breach if a victim shares their personal conversation with a wider audience. One of the female participants shared her experience of accusing a male friend of harassment and the consequences that followed, "First when I shared the photos of our chat history with some of the other friends, he claimed that the photos are made up. Like I fabricated them. Then one day I logged into my Facebook profile in front of people at the university and proved that I did not manipulate it, he started countering me by saying whatever happened, he did it privately and yelled at me asking why did I invade his privacy in front of people. Do I really have to care about his privacy after he had done so much damage to my mental health?" (P59) While making this point, the participant also added that the tool we have developed may not be able to secure the privacy of the harasser and might be much less helpful than victims expect while seeking gender justice for their online harassment.
6.3 Feedback on User Interface and Process
During the user evaluation, the participants gave us some feedback on the user interface. For example, 5 FGD participants expressed their frustration with the lengthy processing time of the image after it is captured by the tool. Another 6 participants suggested having a progress bar so that the users know the approximate waiting time for particular processes. 9 of the FGD participants suggested converting the tool into a mobile application and putting it as a floating button on top of the screen. They added that a lot of users in Bangladesh do not have access to a desktop. Therefore, developing it as a mobile application would benefit them. 11 of the participants insisted on making the application more informative, interactive, and guided. During the interview session with P71, she explained her problem, "While running the application, I did not understand if it had already taken the screenshot and where did it go. I think it will be more helpful if the application windows are a bit more verbose and tell the users what exactly is happening in the ongoing step." (P71) The browser version required to run the tool did not match 5 of our participants' browser versions, and thus they had a hard time using the tool. After testing for some more time, participant P60 noticed that while capturing a screenshot, it is difficult to move the application window away from the area intended for the screenshot. Participants P61 and P59 both tried, but the application window could not be moved or minimized once the button was clicked.
7 DISCUSSION
In this paper, we have described the details of Bangladeshi female internet users' gender harassment and the measures they take to address it. Then, drawing on our findings, we built a set of design goals that are more practical and appropriate for helping the victims fight their online gender harassment. We prototyped a version of Unmochon, conducted a user evaluation, and reported the perceptions of victims and other stakeholders of such feminist designs. The findings of the work lead us to discuss some broader agendas of feminist technology design, social support and law, and gender justice.
7.1 Feminist Technology Design for the Global South
Our study joins the feminist-HCI discussion in the context of the global south.
Our findings cohere with previous literature in showing that women in this particular region are socially and culturally trained to accept or ignore their harassment and not to seek out external help [3, 88]. However, women in such strongly patriarchal societies still find ways to cope with the situation and fulfill their needs even within their adversarial environments [86, 87]. Some of the feminist-HCI literature in this social setting argues for designing for women using all possible local support available to them and enabling other forms of situated tactics [87, 88]. Our design of Unmochon adapts the assumptions of the 'designing within patriarchy' orientation [88] and joins this body of feminist technology design [8]. Feminist-HCI calls for establishing a relationship with subjects [8]. Similarly, researchers working on gender harassment on social media often call for engaging more women, especially gender harassment victims, in designing gender justice tools and techniques [12]. We have responded to these calls since the majority of the authors and designers in this work are female and have experienced gender harassment on social media. We believe that our perspectives have helped us understand the participants and integrate cultural appropriateness into the features and affordances of tools designed to support gender justice. Building on the findings from our study, we argue that along with focusing on the users and their immediate needs, feminist-HCI and ICTD should also aim at political agendas, such as supporting gender movements on social media, for a better online experience for female social media users in such complex patriarchal societies.
7.2 Social Support and Law
Our study of online gender harassment also joins the design-for-social-good movement within computer science, Ubicomp, and HCI [1, 10, 19, 31, 34, 72]. Building on the concept of community-based social support, the existing body of work adapts the assumptions of affirmative design and persuasive design [32, 44, 74] and contributes to the goals of social good for all. With our study, we have extended this line of literature to the context of the global south, where users and multiple stakeholders often hold diverse and tangential agendas in a given social setting [3]. In this paper, we show that although women seeking gender justice online are often dissed and humiliated by online populations, they still seek support from the community via friends or by using a fake Facebook ID. We have also shown that victims often believe that their sporadically and locally organized small-scale protests might help them more than legal systems, and we have listed a number of reasons for this belief. All of these angles arising from the findings of our study lead us to argue that technology should be designed to balance and bridge social support and legal perspectives in order to better support the victims.
7.3 Shaming Platform and Transformative Justice
Finally, we highlight the transformative aspect of the design that we implemented through 'Unmochon'. Transformative design is based on a broad idea of fairness. Instead of punishing an individual for a crime, transformative justice investigates why that person committed the crime. This allows us to find deeper problems in our society, including poverty, discrimination, lack of education, lack of social support, lack of a good childhood, etc., which contribute to making a person a criminal [23, 37, 67]. Transformative justice, hence, focuses on fixing those bigger issues [22].
This happens through three channels: (a) restoring the social imbalance caused by the individuals through social support, (b) changing the mindset of individuals who committed a crime through education and social support, and (c) addressing the bigger social problems that contributed to making a person commit crimes. Thus, transformative justice provides a more sustainable way to reduce crime in a society. One of the design principles of Unmochon is 'shaming', which may be interpreted as an adversarial action toward punishing someone for their misconduct. However, shaming also provides a person with a way, through their social peers, to reflect on what they have done and to repent for it. Therefore, shaming may reduce crime in a society [16]. The social process of shaming often also involves seeking forgiveness and committing not to repeat the crime. At the same time, social pressure is imposed on the person that keeps them from repeating the same crime. The perpetrator then gets an opportunity to learn and change their behavior and mindset. At the same time, the discourse that is created around a shaming incident allows society to reflect on the broader picture: where are the perpetrators coming from? Why is this happening? How can we put a stop to this? How can families, communities, and societies come together to fix the problem? Such conversations often lead to social welfare activities. From this perspective, we hope that Unmochon also creates a platform for transformative justice. Having said that, we also acknowledge that whether this transformative potential is realized largely depends on how people use the platform, and there is a chance that many people may focus only on the immediate punishment aspect of shaming. However, we believe that, given enough time, more and more people will start focusing on the long-term sustainable solutions to the problem, and the transformative aspects of Unmochon will prevail.
8 LIMITATIONS OF THE WORK
Our work has several limitations. First, both online and real-life gender harassment are stigmatized topics in Bangladesh, and for that reason, recruiting participants was challenging for us. Most of the interview and focus group discussion participants were recruited through snowball sampling, and thus they are mostly people from the researchers' primary and secondary networks. Since we recruited the participants through convenience sampling, our work is not free from participation bias and selection bias. The opinions of our participants and the arguments derived from them may not represent the collective view of the women of the whole nation. Second, we failed to engage with any participant of the third gender due to our limited reach. Thus, we did not gain any insight into their experiences of harassment or their opinion of our tool. Despite these limitations, the findings of our study will be useful for feminist technology design in low-resource and patriarchal settings. Also, the arguments and lessons from this study will contribute to women-empowerment policy-making in Bangladesh and other countries with similar resources, environments, and social settings.
9 IMPLICATIONS FOR DESIGN AND FUTURE WORK
In future iterations of the design, we plan to overcome or minimize some of the limitations of the existing version. Our user evaluation also points to some of the necessary future design iterations. First, we prototyped our tool for Facebook Messenger only.
Our participants mentioned that they use Imo, Viber, WhatsApp, and other text messaging services frequently and face gender harassment on those platforms as well. Following their suggestions, one avenue of future work could be developing similar tools for Imo, Viber, WhatsApp, and the other text messaging services that they use. Second, our participants pointed out that sometimes it is difficult to explain the whole story of harassment using just one image, and thus the offenders often get away with the benefit of the doubt. In the existing version of the prototype, there is no way to collage multiple screenshots in the same report to explain the flow of the conversation. Our participants requested such functionality in future iterations of the tool's design. Third, this version of the prototype does not support submitting the evidence to the police directly, although it fulfills some of the qualities and matches the protocol of standard legal evidence collection followed by the Bangladeshi police in cases of cyber-crime. This opens up a new opportunity for us to design technologies supporting gender justice within legal spheres. Finally, we will find ways to set up better communication between the victim and the law-enforcement authorities through this application and revisit our design assumptions in this regard.
10 CONCLUSIONS
In this paper, we present a deeper understanding of the online gender harassment faced by Bangladeshi female social media users. First, we conducted an extensive investigation of women's online experiences of harassment. Building on the findings from that, we then set our design goals to support the gender justice on Facebook often sought by female victims and designed the Unmochon application for women who are harassed on Facebook Messenger. Our user evaluation with female Facebook users and other stakeholders suggests further improvements in design assumptions, goals, and the user interface. We further extend our discussion to different aspects of the legal, ethical, and pragmatic trade-offs of this type of approach. Taken together, our findings expand the ICTD and feminist-HCI communities' understanding of gender-justice technology design within the resource-constrained and patriarchal social setting of Bangladesh and other similar countries.
11 ACKNOWLEDGMENTS
Anonymized.
REFERENCES
[1] Rediet Abebe and Kira Goldner. 2018. Mechanism design for social good. AI Matters 4, 3 (2018), 27–34. [2] Nova Ahmed, Sadd Azmeen Ur Rahman, Rahat Jahangir Rony, Tanvir Mushfique, and Vikram Mehta. 2016. Protibadinext: Sensor support to handle sexual harassment. In Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing: Adjunct. 918–921. [3] Syed Ishtiaque Ahmed, Steven J Jackson, Nova Ahmed, Hasan Shahid Ferdous, Md Rashidujjaman Rifat, ASM Rizvi, Shamir Ahmed, and Rifat Sabbir Mansur. 2014. Protibadi: A platform for fighting sexual harassment in urban Bangladesh. In Proceedings of the 32nd annual ACM conference on Human factors in computing systems. ACM, 2695–2704. [4] Saad Ahmed Akash, Md Al-Zihad, Tamal Adhikary, Md Abdur Razzaque, and Arifa Sharmin. 2016. Hearme: A smart mobile application for mitigating women harassment. In 2016 IEEE International WIE Conference on Electrical and Computer Engineering (WIECON-ECE). IEEE, 87–90. [5] Mohammed Eunus Ali, Shabnam Basera Rishta, Lazima Ansari, Tanzima Hashem, and Ahamad Imtiaz Khan. 2015.
SafeStreet: empowering women against street harassment using a privacy-aware location based application. In Proceedings of the Seventh International Conference on Information and Communication Technologies and Development. ACM, 24. [6] Nazanin Andalibi, Oliver L Haimson, Munmun De Choudhury, and Andrea Forte. 2016. Understanding social media disclosures of sexual abuse through the lenses of support seeking and anonymity. In Proceedings of the 2016 CHI conference on human factors in computing systems. 3906–3918. [7] UN Women – Asia-Pacific. 2020. Facts and Figures. https://asiapacific.unwomen.org/en/focus-areas/end-violence-against-women/evaw-facts-and-figures. [8] Shaowen Bardzell. 2010. Feminist HCI: taking stock and outlining an agenda for design. In Proceedings of the SIGCHI conference on human factors in computing systems. 1301–1310. [9] Jamie Bartlett, Richard Norrie, Sofia Patel, Rebekka Rumpel, and Simon Wibberley. 2014. Misogyny on twitter. Demos (2014), 1–18. [10] Russell Beale, Nicola Bidwell, Eli Blevis, Stephen Brewster, Anxo Cereijo Roibas, Keith Cheverst, Andrew Deardon, Jussi Impio, Amit A Navavati, Abigal Sellen, Yvonne Rogers, and Lucia Terrenghi. 2009. UBICOMP 2009 Workshop CFP: Globicomp - Taking Ubicomp Beyond Developed Worlds from Julie Kientz on 2009-02-04 (www-multimodal@w3.org from February 2009). https://lists.w3.org/Archives/Public/www-multimodal/2009Feb/0000.html. [11] Reuben Binns, Michael Veale, Max Van Kleek, and Nigel Shadbolt. 2017. Like trainer, like bot? Inheritance of bias in algorithmic content moderation. In International conference on social informatics. Springer, 405–415. [12] Lindsay Blackwell, Jill Dimond, Sarita Schoenebeck, and Cliff Lampe. 2017. Classification and its consequences for online harassment: Design insights from heartmob. Proceedings of the ACM on Human-Computer Interaction 1, CSCW (2017), 1–19. [13] Lindsay Blackwell, Mark Handel, Sarah T Roberts, Amy Bruckman, and Kimberly Voll. 2018. Understanding "Bad Actors" Online. In Extended Abstracts of the 2018 CHI Conference on Human Factors in Computing Systems. 1–7. [14] Jan Blom, Divya Viswanathan, Mirjana Spasojevic, Janet Go, Karthik Acharya, and Robert Ahonius. 2010. Fear and the city: role of mobile services in harnessing safety and security in urban use contexts. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 1841–1850. [15] Richard E Boyatzis. 1998. Transforming qualitative information: Thematic analysis and code development. Sage. [16] John Braithwaite. 2000. Shame and criminal justice. Canadian Journal of Criminology 42, 3 (2000), 281–298. [17] Tarana Burke. 2006. me too. Movement. https://metoomvmt.org/. [18] Tarana Burke. 2018. Why We Need to Acknowledge the True Founder of the #MeToo Movement. https://www.blackburncenter.org/post/2018/02/06/why-we-need-to-acknowledge-the-true-founder-of-the-metoo-movement-tarana-burke. [19] Daniela Busse, Eli Blevis, Richard Beckwith, Shaowen Bardzell, Phoebe Sengers, Bill Tomlinson, Lisa Nathan, and Samuel Mann. 2012. Social sustainability: an HCI agenda. In CHI'12 Extended Abstracts on Human Factors in Computing Systems. 1151–1154. [20] Eshwar Chandrasekharan, Mattia Samory, Anirudh Srinivasan, and Eric Gilbert. 2017. The bag of communities: Identifying abusive behavior online with preexisting internet data. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. 3175–3187. [21] Rhitu Chatterjee. 2018. A New Survey Finds 81 Percent Of Women Have Experienced Sexual Harassment.
https://www.npr.org/sections/thetwo-way/2018/02/21/587671849/a-new-survey-finds-eighty-percent-of-women-have-experienced-sexual-harassment. [22] Donna Coker. 2002. Transformative justice: Anti-subordination processes in cases of domestic violence. (2002). [23] Erin Daly. 2001. Transformative justice: Charting a path to reconciliation. Int'l Legal Persp. 12 (2001), 73. [24] Priyashree Dasgupta. 2018. #MeToo In India: 75 Professors, 30 Institutes, What Happened To Raya Sarkar's List Of Sexual Harassers? https://www.huffingtonpost.in/2018/10/25/metoo-in-india-75-professors-30-institutes-what-happened-to-raya-sarkar-s-list-of-sexual-harassers_a_23571422/. [25] Damayanti Datta, Shweta Punj, and Chinki Sinha. 2018. #MeToo hits home - Cover Story News. https://www.indiatoday.in/magazine/cover-story/story/20181022-metoo-hits-home-1360419-2018-10-12. [26] Dhaka Tribune Desk Report. 2019. 70% of women facing cyber harassment are 15-25 years in age. https://www.dhakatribune.com/bangladesh/dhaka/2019/09/24/70-of-women-facing-cyber-harassment-are-15-25-years-in-age. [27] Jill P Dimond, Michaelanne Dye, Daphne LaRose, and Amy S Bruckman. 2013. Hollaback!: the role of storytelling online in a social movement organization. In Proceedings of the 2013 conference on Computer supported cooperative work. ACM, 477–490. [28] Carl DiSalvo. 2012. Adversarial design. The MIT Press. [29] Hande Eslen-Ziya. 2013. Social media and Turkish feminism: New resources for social activism. Feminist Media Studies 13, 5 (2013), 860–870. [30] Deepa Fadnis. 2017. Feminist activists protest tax on sanitary pads: attempts to normalize conversations about menstruation in India using hashtag activism. Feminist Media Studies 17, 6 (2017), 1111–1114. [31] Maria Angela Ferrario, Will Simm, Peter Newman, Stephen Forshaw, and Jon Whittle. 2014. Software engineering for 'social good': integrating action research, participatory design, and agile development. In Companion Proceedings of the 36th International Conference on Software Engineering. 520–523. [32] Brian J Fogg. 2009. A behavior model for persuasive design. In Proceedings of the 4th international Conference on Persuasive Technology. 1–7. [33] Jesse Fox and Wai Yen Tang. 2017. Women's experiences with general and sexual harassment in online video games: Rumination, organizational responsiveness, withdrawal, and coping strategies. New Media & Society 19, 8 (2017), 1290–1307. [34] Sarah Fox, Mariam Asad, Katherine Lo, Jill P Dimond, Lynn S Dombrowski, and Shaowen Bardzell. 2016. Exploring social justice, design, and HCI. In Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems. 3293–3300. [35] Radhika Gajjala. 2018. When an Indian whisper network went digital. Communication Culture & Critique 11, 3 (2018), 489–493. [36] Björn Gambäck and Utpal Kumar Sikdar. 2017. Using convolutional neural networks to classify hate-speech. In Proceedings of the first workshop on abusive language online. 85–90. [37] Paul Gready and Simon Robins. 2014. From transitional to transformative justice: A new agenda for practice. International Journal of Transitional Justice 8, 3 (2014), 339–361. [38] Cristela Guerra. 2017. Where did 'Me Too' come from? Activist Tarana Burke, long before hashtags. https://www.bostonglobe.com/lifestyle/2017/10/17/alyssa-milano-credits-activist-tarana-burke-with-founding-metoo-movement-years-ago/o2Jv29v6ljObkKPTPB9KGP/story.html [39] Margaret Hagan, Nan Zhang, and Joseph 'Jofish' Kaye. 2012.
Safe mathare: a mobile system for women's safe commutes in the slums. In Proceedings of the 14th international conference on Human-computer interaction with mobile devices and services companion. 47–52. [40] Randi Lee Harper. 2016. Putting out the Twitter trashfire. Art + Marketing. (13 February 2016). Retrieved September 8 (2016), 2017. [41] Jasmin Hasanović. 2015. Ochlocracy in the practices of civil society: a threat for democracy? Studia Juridica et Politica Jaurinenisis 2, 2 (2015), 56–66. [42] N. Hassan, M.K. Mandal, M. Bhuiyan, A. Moitra, and S.I. Ahmed. 2019. Nonparticipation of Bangladeshi women in #MeToo movement. In ACM International Conference Proceeding Series. https://doi.org/10.1145/3287098.3287125 [43] Naeemul Hassan, Manash Kumar Mandal, Mansurul Bhuiyan, Aparna Moitra, and Syed Ishtiaque Ahmed. 2019. Nonparticipation of Bangladeshi Women in #MeToo Movement. In Proceedings of the Tenth International Conference on Information and Communication Technologies and Development. 1–5. [44] Eric B Hekler, Jennifer C Taylor, Steven P Dow, Michèle Morris, Faren J Grant, Sayali S Phatak, Don Norman, mc schraefel, and Dana M Lewis. 2019. Exploring, Defining, & Advancing Community-Driven Design for Social Impact. In Companion Publication of the 2019 on Designing Interactive Systems Conference 2019 Companion. 373–376. [45] Eleanor Tiplady Higgs. 2015. #JusticeforLiz: Power and Privilege in Digital Transnational Women's Rights Activism. Feminist Media Studies 15, 2 (2015), 344–347. [46] Hossein Hosseini, Sreeram Kannan, Baosen Zhang, and Radha Poovendran. 2017. Deceiving google's perspective api built for detecting toxic comments. arXiv preprint arXiv:1702.08138 (2017). [47] Shampa Iftakhar. 2020. #MeToo in Bangladesh: Can You Change? Journal of International Women's Studies 21, 2 (2020), 126–142. [48] Nadia Ilahi. 2009. Gendered contestations: An analysis of street harassment in Cairo and its implications for women's access to public spaces. Surfacing: An Interdisciplinary Journal for Gender in the Global South 2 (2009), 56–69. [49] Justin Jager, Diane L Putnick, and Marc H Bornstein. 2017. II. More than just convenient: The scientific merits of homogeneous convenience samples. Monographs of the Society for Research in Child Development 82, 2 (2017), 13–30. [50] Syeda Gulshan Ferdous Jana. [n.d.]. 7. Bangladesh: Social media, extremism and freedom of expression. TRANSNATIONAL OTHERING GLOBAL DIVERSITIES ([n. d.]), 103. [51] Shagun Jhaver, Sucheta Ghoshal, Amy Bruckman, and Eric Gilbert. 2018. Online harassment and content moderation: The case of blocklists. ACM Transactions on Computer-Human Interaction (TOCHI) 25, 2 (2018), 1–33. [52] Natasha Kabir. 2018. Cyber Crime a New Form of Violence Against Women: From the Case Study of Bangladesh. Available at SSRN 3153467 (2018). [53] Cristina Kadar and Irena Pletikosa Cvijikj. 2014. CityWatch: the personalized crime prevention assistant. In Proceedings of the 13th International Conference on Mobile and Ubiquitous Multimedia. 260–261. [54] Yoshiro Kamitake. 2007. From democracy to ochlocracy. Hitotsubashi journal of economics (2007), 83–93. [55] Semanur Karaman. 2017. Women support each other in the face of harassment online, but policy reform is needed | LSE Women, Peace and Security blog. https://blogs.lse.ac.uk/wps/2017/11/29/women-support-each-other-in-the-face-of-harassment-online-but-policy-reform-is-needed/. [56] Naveena Karusala and Neha Kumar. 2017.
Women's Safety in Public Spaces: Examining the Efficacy of Panic Buttons in New Delhi. In Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems. ACM, 3340–3351. [57] George Kennedy, Andrew McCollough, Edward Dixon, Alexei Bastidas, John Ryan, Chris Loo, and Saurav Sahay. 2017. Technology solutions to combat online harassment. In Proceedings of the first workshop on abusive language online. 73–77. [58] N. Kumar, N. Karusala, A. Ismail, M. Wong-Villacres, and A. Vishwanath. 2019. Engaging Feminist Solidarity for Comparative Research, Design, and Practice. In Proceedings of the ACM on Human-Computer Interaction CSCW. ACM, 167. [59] Amanda Lenhart, Michele Ybarra, Kathryn Zickuhr, and Myeshia Price-Feeney. 2016. Online harassment, digital abuse, and cyberstalking in America. Data and Society Research Institute. [60] C Mallapur and A Alphonso. 2018. #MeTooIndia: 54% rise in sexual harassment reported at workplaces between 2014-17. https://www.indiaspend.com/metooindia-54-rise-in-sexual-harassment-reported-at-workplaces-between-2014-17/ [61] Claudia Manzi, Sharon Coen, Camillo Regalia, Ana Maria Yévenes, Cristina Giuliani, and Vivian L Vignoles. 2018. Being in the Social: A cross-cultural and cross-generational study on identity processes related to Facebook use. Computers in Human Behavior 80 (2018), 81–87. [62] Robert Meyer and Michel Cukier. 2006. Assessing the attack threat due to IRC channels. In International Conference on Dependable Systems and Networks (DSN'06). IEEE, 467–472. [63] Alyssa Milano. 2017. How We Can Help Women Come Forward. Time (Oct 2017). [64] Kathy Miriam. 2012. Feminism, neoliberalism, and SlutWalk. Feminist Studies 38, 1 (2012), 262–266. [65] Durba Mitra. 2012. Critical perspectives on SlutWalks in India. Feminist studies 38, 1 (2012), 254–261. [66] Aparna Moitra, Naeemul Hassan, Manash Kumar Mandal, Mansurul Bhuiyan, and Syed Ishtiaque Ahmed. 2020. Understanding the Challenges for Bangladeshi Women to Participate in #MeToo Movement. Proceedings of the ACM on Human-Computer Interaction 4, GROUP (2020), 1–25. [67] Ruth Morris. 2000. Stories of transformative justice. Canadian Scholars' Press. [68] Fayika Farhat Nova, MD Rashidujjaman Rifat, Pratyasha Saha, Syed Ishtiaque Ahmed, and Shion Guha. 2019. Online sexual harassment over anonymous social media in Bangladesh. In Proceedings of the Tenth International Conference on Information and Communication Technologies and Development. 1–12. [69] Rogers TE Orock. 2014. Crime, in/security and mob justice: The micropolitics of sovereignty in Cameroon. Social Dynamics 40, 2 (2014), 408–428. [70] Ji Ho Park and Pascale Fung. 2017. One-step and two-step classification for abusive language detection on twitter. arXiv preprint arXiv:1706.01206 (2017). [71] Sangkeun Park, Sujin Kwon, and Uichin Lee. 2018. CampusWatch: Exploring Communitysourced Patrolling with Pervasive Mobile Technology. Proceedings of the ACM on Human-Computer Interaction 2, CSCW (2018), 1–25. [72] Nimmi Rangaswamy and Nithya Sambasivan. 2011. Cutting Chai, Jugaad, and Here Pheri: towards UbiComp for a global community. Personal and Ubiquitous Computing 15, 6 (2011), 553–564. [73] Elissa M Redmiles, Jessica Bodford, and Lindsay Blackwell. 2019. "I just want to feel safe": A Diary Study of Safety Perceptions on Social Media. In Proceedings of the International AAAI Conference on Web and Social Media, Vol. 13. 405–416. [74] Johan Redström. 2006. Persuasive design: Fringes and foundations.
In International Conference on Persuasive Technology. Springer, 112–122. [75] Rebecca S Robinson. 2016. Pink Hijab Day: Mediation of the Hijab as a Symbol of Protest. International Journal of Communication 10 (2016), 20. [76] Jennifer D Rubin, Lindsay Blackwell, and Terri D Conley. 2020. Fragile Masculinity: Men, Gender, and Online Harassment. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems. 1–14. [77] Christine Satchell and Marcus Foth. 2011. Welcome to the jungle: HCI after dark. In CHI'11 Extended Abstracts on Human Factors in Computing Systems. 753–762. [78] Vijay Shankar. 2015. Announcing Facebook lite. Facebook Newsroom (2015). [79] Ditilekha Sharma. 2018. What Is Missing In the #MeToo Movement? Economic and Political Weekly 53, 49 (2018). https://www.epw.in/engage/article/what-is-missing-metoo-movement-limitation-law-justice [80] Tamara Shepherd, Alison Harvey, Tim Jordan, Sam Srauy, and Kate Miltner. 2015. Histories of hating. Social Media + Society 1, 2 (2015), 2056305115603997. [81] Emily Shugerman. 2017. Me Too: Why are women sharing stories of sexual assault and how did it start? [82] Sharon G Smith, Xinjian Zhang, Kathleen C Basile, Melissa T Merrick, Jing Wang, Marcie-jo Kresnow, and Jieru Chen. 2018. The national intimate partner and sexual violence survey: 2015 data brief–updated release. (2018). [83] Navya R Sogi, Priya Chatterjee, U Nethra, and V Suma. 2018. SMARISA: a raspberry pi based smart ring for women safety using IoT. In 2018 International Conference on Inventive Research in Computing Applications (ICIRCA). IEEE, 451–454. [84] WHOA Comparison Statistics. 2013. Working to Halt Online Abuse. http://www.haltabuse.org/resources/stats/index.shtml. [85] Anselm Strauss and Juliet Corbin. 1990. Open coding. Basics of qualitative research: Grounded theory procedures and techniques 2, 1990 (1990), 101–121. [86] Sharifa Sultana and Syed Ishtiaque Ahmed. 2019. Witchcraft and HCI: Morality, Modernity, and Postcolonial Computing in Rural Bangladesh. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems. ACM, 356. [87] Sharifa Sultana, Syed Ishtiaque Ahmed, and Susan R. Fussell. 2019. "Parar-daktar Understands My Problems Better": Disentangling the Challenges to Designing Better Access to Healthcare in Rural Bangladesh. Proceedings of the ACM on Human-Computer Interaction 1, CSCW (2019), 25. [88] Sharifa Sultana, François Guimbretière, Phoebe Sengers, and Nicola Dell. 2018. Design within a Patriarchal Society: Opportunities and Challenges in Designing for Rural Women in Bangladesh. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems. ACM, 536. [89] Kalpana Viswanath and Ashish Basu. 2015. SafetiPin: an innovative mobile app to collect data on women's safety in Indian cities. Gender & Development 23, 1 (2015), 45–60. [90] Jessica Vitak, Kalyani Chadha, Linda Steiner, and Zahra Ashktorab. 2017. Identifying women's experiences with and strategies for mitigating negative effects of online harassment. In Proceedings of the 2017 ACM Conference on Computer Supported Cooperative Work and Social Computing. 1231–1245. [91] Charlie Warzel. 2016. A Honeypot for Assholes: Inside Twitter's 10-Year Failure to Stop Harassment. BuzzFeed News (2016). [92] Charlie Warzel. 2017. Twitter is still dismissing harassment reports and frustrating victims. Buzzfeed. (17 July 2017). Retrieved September 8 (2017), 2017. [93] Beverly M Weber. 2016.
Kübra Gümüşay, Muslim digital feminism and the politics of visuality in Germany. Feminist Media Studies 16, 1 (2016), 101–116. [94] Sherri Williams. 2015. Digital defense: Black feminists resist violence with hashtag activism. Feminist media studies 15, 2 (2015), 341–344. [95] Sherri Williams. 2016. #SayHerName: using digital activism to document violence against black women. Feminist media studies 16, 5 (2016), 922–925. [96] Yohanes Sigit Purnomo WP, Theresia Devi Indriasari, Kusworo Anindito, Yoshua Andrean, and Jaka Galih Prasetyo. 2019. CrimeID: Towards Crime Prevention and Community Safety in Indonesia using Mobile and Web Technology. International Journal of Interactive Mobile Technologies (iJIM) 13, 09 (2019), 52–65. [97] Ellery Wulczyn, Nithum Thain, and Lucas Dixon. 2017. Ex machina: Personal attacks seen at scale. In Proceedings of the 26th International Conference on World Wide Web. 1391–1399. [98] Susan P Wyche, Sarita Yardi Schoenebeck, and Andrea Forte. 2013. "Facebook is a luxury": an exploratory study of social media use in rural Kenya. In Proceedings of the 2013 conference on Computer supported cooperative work. 33–44. [99] Chelsea Young. 2014. HarassMap: using crowdsourced data to map sexual harassment in Egypt. Technology Innovation Management Review 4, 3 (2014), 7.