
The Ethical Pitfalls of Crime Prevention Apps

January 19, 2016

Our current atmosphere of digital connectedness has spawned innovative new ways to help citizens feel secure in their surroundings. Unsure of the safety of an area you're visiting? There's an app for that. See a suspicious character lurking around your neighbor's home? There's an app for that. Witness a crime taking place? Never fear: a new wave of mobile apps allows you to be aware of, and alert authorities to, suspicious characters in neighborhoods, stores, and other venues through real-time tracking and user-reported incidents. Their goal is to keep you safer by providing notice of unsafe areas and criminal activity, as well as alerting authorities and other community members to crime-related incidents. On initial examination, this seems like a useful and efficient way to deter crime and increase personal safety, but it appears these crime-fighting superhero apps have a dark side.

There's no doubt that apps like CrimePush can help citizens get fast assistance in the midst of a crime. This particular app sends authorities the location, photo, video, audio, and text description of the crime at the push of a button. Similar technology is also in use at some police departments to encourage citizens to submit anonymous tips about suspicious behavior for follow-up. Apps like SketchFactor (which has since been pulled from the market amid criticism concerning racial profiling) and GhettoTracker (also pulled) mainly targeted geographical areas with unsafe reputations, although SketchFactor allowed reporting of individuals. Other mobile tools, such as Nextdoor and GroupMe, help connect community and business members with one another and with local authorities to monitor local criminal activity and perceived threats.

Unfortunately, "perceived" is the operative word. While these apps allow members to alert one another to suspicious activity, they have also seemingly opened the door to a McCarthy-era level of racial bias. In a recent example, businesses and residents of Georgetown, an affluent neighborhood in Washington, D.C., used the GroupMe app in an attempt to curtail the area's growing shoplifting problem. News outlets reported allegations in October of this year that the group was racially profiling African American shoppers, since over 72 percent of the "suspicious individual" GroupMe reports targeted African Americans. A representative of the Georgetown Business Improvement District defended the group, noting that less than 5 percent of the African American individuals identified on GroupMe were arrested. As further evidence of his community's neutrality, he explained that group members who post inappropriate content are either told to work within the specified rules or are kicked off the app. He did not specify what counted as "inappropriate content," nor did he say whether the African Americans tagged in reports but never arrested were approached by the police. After the controversy was reported in the media, the group reconsidered its use of the app.

Georgetown is one of the "whitest" neighborhoods in the D.C. area, with over 85 percent of the population reported as Caucasian and just over 3 percent African American, as opposed to a neighboring district with 38 percent Caucasian and 50 percent African American populations, respectively. With so few African American residents in Georgetown, a black individual would be easy to notice and might seem out of place. However, the African Americans who live in Georgetown are, like their white neighbors, affluent, well educated, and law-abiding. A Georgetown University associate professor of sociology explains: "Crime does occur in Georgetown. And quite often when people describe the perpetrators of those crimes, they're usually young men of color. But that doesn't mean every person of color is an automatic suspect." One February exchange underscores her statement. An employee at a Georgetown retail establishment took a photo of a tall, well-dressed African American man whom he described as "very suspicious, looking everywhere." Later, an employee at another store responded, "He was just in Suitsupply. Made a purchase of several suits and some gloves."

As the prior example demonstrates, apps like these can quickly become a forum for unfairly categorizing members of another race or socioeconomic status as dangerous or sketchy. For instance, riders on San Francisco's Bay Area Rapid Transit system (BART) can use a BART-created app for iOS and Android called BART Watch to report suspicious activity, crimes, and other unwanted behavior to authorities instantly. When a local newspaper requested a month's worth of these complaints, it found a disproportionate number of reports aimed at blacks: approximately 68 percent of the complaints that included a description referenced blacks, even though only about 10 percent of BART ridership is attributed to blacks, with whites and Asians making up the majority of the remainder. Many of the "offenses" included in the reports were also relatively benign activities such as playing loud music, smelling bad, and taking up more than one seat. The executive director of the Ella Baker Center for Human Rights decries the app, arguing that "By encouraging passengers to report these types of complaints, BART is furthering our punishment economy, wherein we find punitive solutions to social problems that actually require reinvestment in communities."
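The scale of that imbalance is easy to quantify. As a rough illustration, taking the reported figures at face value, a few lines of Python compute how over-represented black riders were in the complaints relative to their share of ridership:

```python
# Illustrative calculation using the figures reported above
# (68 percent of described complaints vs. roughly 10 percent of ridership).
share_of_complaints = 0.68  # complaints with a description that referenced black riders
share_of_ridership = 0.10   # approximate black share of BART ridership

# How many times more often black riders appeared in complaints
# than their ridership share alone would predict.
disparity_ratio = share_of_complaints / share_of_ridership
print(f"Over-representation factor: {disparity_ratio:.1f}x")  # 6.8x
```

A factor near seven is the kind of signal an app operator could monitor automatically, rather than waiting for a newspaper's records request to surface it.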

While many of these apps have a polarizing effect on demographically separate groups, at least one was created in response to an already sensitive social situation. Hollaback, an app designed to reduce street harassment aimed at women, people of color, and the LGBT community, allows real-time reporting of incidents along with a location map of the occurrence. The problem is that there is no strict definition of street harassment. A well-meaning compliment to one person may be a serious infraction to another. Reports may also be fabricated, exaggerated, or created in an attempt to hassle another individual and even purposefully get them in trouble with authorities. In fact, the argument that Hollaback may overlook the harassment of men, mainly white or straight men, has surfaced in online forums, demonstrating that apps that call out (or leave out) some segment of society risk fomenting social discontent. Finally, any app that relies upon communal reporting may also contribute to the proliferation of vigilante justice, where community members take matters into their own hands based on a mobile report.

Evidence such as the Georgetown incidents shows that in some cases these apps have a way of marginalizing certain members of society. They can also depersonalize the impact that anonymous reports, and the follow-up investigations they trigger, can have upon innocent persons. The apps play on people's suspicion of what is different and allow individuals with deep-seated anger toward another ethnicity, religion, age, gender, or sexual preference to harass others through erroneous reports of dangerous activity.

While they may seem like a reasonable way to keep an eye on crime, there are flaws in the design of these mobile group-reporting apps that can contribute to a more significant racial divide. In a speech to Georgetown University students earlier this year, one speaker talked candidly about racial tension and overcoming bias. Importantly, he noted that racial bias and misunderstanding run both ways, and that to overcome them, people need to see and understand one another. "It's hard to hate up close," he explained in a question-and-answer session following his speech. Unfortunately, apps like these, with their snarky digital anonymity, allow prejudice and misunderstanding to snowball as the accuser, the accused, and authority figures grow even further disconnected from one another. Anonymity may protect the informant, but it can also enable emotional distance and contribute to incident exaggeration. One article noted that the CrimePush application lets users report crime anonymously "so that they may continue with their busy lives knowing that with a push of a button, police will know and have everything to pursue the criminal." This cavalier attitude toward situational reporting minimizes the impact reports like these can have on innocent individuals.

We mustn't forget the pluses in this crime-prevention-app equation: sometimes having a mobile reporting app saves lives and property. A community in Arizona that used the Nextdoor app to keep tabs on criminal activity saw its rate of burglaries plummet, and another community was able to help authorities apprehend a group of burglary suspects through the same app.

There is no doubt that there is a need for apps that can send help to victims in distress, allow crime reporting on the fly, keep neighbors and businesses aware of suspicious activity in their area, and let travelers know where it is safe to trek. But app developers need to be aware of the social risks and costs of this type of anonymous, instantaneous reporting. They should engage with lawmakers, citizens, and law enforcement authorities to determine and build in fail-safes that reduce false reports and discourage, perhaps even penalize, biased targeting.
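What might such a fail-safe look like in practice? One simple, entirely hypothetical approach is to hold back for human review any reporter whose submissions concentrate on a single demographic group far beyond that group's share of the local population. The sketch below is illustrative only; the field names, threshold, and minimum-report count are assumptions, not features of any real app:

```python
from collections import Counter

def flag_biased_reporters(reports, baseline_share, threshold=3.0, min_reports=10):
    """Flag reporters whose reports concentrate on one demographic group
    far beyond its share of the local population.

    reports: list of dicts like {"reporter": "u1", "described_group": "X"}
    baseline_share: dict mapping group -> share of local population (0 to 1)
    threshold: over-representation factor that triggers review (assumed value)
    min_reports: minimum report count before a pattern is judged (assumed value)
    """
    by_reporter = {}
    for r in reports:
        by_reporter.setdefault(r["reporter"], []).append(r["described_group"])

    flagged = []
    for reporter, groups in by_reporter.items():
        if len(groups) < min_reports:
            continue  # too few reports to infer a pattern
        counts = Counter(groups)
        for group, n in counts.items():
            share = n / len(groups)
            base = baseline_share.get(group)
            # Over-representation ratio: reporter's focus vs. population baseline.
            if base and share / base >= threshold:
                flagged.append((reporter, group, round(share / base, 1)))
    return flagged
```

A moderator (or the police department receiving the tips) could then review flagged accounts before forwarding their reports, which directly addresses the "punishment economy" dynamic described above without shutting the app down entirely.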


Nikki Williams
is a bestselling author based in Houston, Texas. She writes about fact and fiction and the realms between, and her nonfiction work appears in both online and print publications around the world. Follow her on Twitter @williamsbnikki or at nbwilliamsbooks.com.
