Community platform Nextdoor is courting police across the U.S., creating concerns among civil rights and privacy advocates who worry about possible conflicts of interest, over-reporting of crime, and the platform’s record of racial profiling, per a Thursday report by CityLab.
That effort included an all-expenses-paid meeting in San Francisco with members of Nextdoor’s Public Agencies Advisory Council, which includes community engagement staffers from eight U.S. police departments and mayor’s offices, according to CityLab. Other outreach has included enlisting current and former law enforcement officers to promote the app, as well as partnerships with local authorities that enable them to post geo-targeted messages to neighbourhoods and receive unofficial reports of suspicious activity through the app. According to CityLab, attendees of the meeting in San Francisco had to sign nondisclosure agreements that could shield information on the partnerships from the public.
Charles Husted, the chief of police in Sedona, Arizona, told CityLab that the intent of the meeting was to ensure that community officials are in touch with locals and hearing their concerns. He said his department uses social media networks like Facebook and Twitter as well as Nextdoor: “We have to evolve with the times, and the times have to do with social media: That’s where our communities are at. We have to find a way to be there too.”
Nextdoor has “crime and safety” functions that allow locals to post unverified information about suspicious activity and suspected crimes, acting as a sort of loosely organised neighbourhood self-surveillance system for users. That raises the possibility that Nextdoor is facilitating racial profiling and over-policing, especially given its efforts to build relationships with authorities and its booming user base (reportedly past 10 million). During the ongoing coronavirus pandemic, Nextdoor has seen skyrocketing user engagement—an 80 per cent increase, founder Prakash Janakiraman told Vanity Fair earlier this month.
“There are compelling reasons for transparency around the activities of public employees in general, but the need for transparency is at its height when it comes to law enforcement agencies,” ACLU Speech, Privacy, and Technology Project staff attorney Freed Wessler told CityLab. “It would be quite troubling to learn that police officers were investigating and arresting people using data from private companies with which they have signed an NDA.”
Nextdoor and its fellow security and safety apps, including Amazon’s Ring doorbell camera platform and the crime-reporting app Citizen, are also implicitly raising fears of widespread crime at a time when national statistics show crime rates have plummeted across the country, Secure Justice executive director and chair of Oakland’s Privacy Advisory Commission Brian Hofer told CityLab. Nextdoor marketing materials, for example, assert that Nextdoor played a role in crime reduction in Sacramento.
“Our entire country—and California, and the Bay Area specifically—we’re at a 40-year historic low for violent crime … and yet people are walking around like we’re living in the most violent place in the world,” Hofer told CityLab. “These vendors are doing a really great job of creating a sense of fear.”
In 2016, responding to concerns about rampant racial profiling on the platform, primarily users reporting anyone with dark skin as suspicious, Nextdoor rolled out a series of major changes. Those included prompting users for additional information if a post mentioned race, asking them whether they would still find a person suspicious if race or ethnicity were taken out of the equation, and adding categories for users to flag conversations as inappropriate.
Nextdoor said the changes reduced the number of racist posts by around 75 per cent, but that hasn’t eliminated bias on the platform, especially when it’s expressed in coded language. Investigations by BuzzFeed in 2017 and by Gizmodo’s sister site, The Root, in 2019 found that racism remained rampant on the site. One weak point BuzzFeed identified: the system for manually flagging profiling that escapes automated detection is only as fair as the community moderators appointed to enforce it.
“I now live in a conservative, white subdivision and I literally know every time a black person comes in the neighbourhood,” Madison, Wisconsin, teacher Rebecca Stinton told The Root. “It makes me so angry to see so much casual racism that I deleted the app. I imagine it’s like that everywhere.”
“I haven’t actually seen a lead post a message where they said, ‘I will not enforce the racial profiling guidelines,’ but I have certainly seen leads participate in threads where [the guidelines] were being ridiculed,” Jackson, Mississippi, resident Tom Head told BuzzFeed. “In majority-white communities in Mississippi, the idea of opposing racial profiling as a matter of policy is not necessarily a popular one.”
The Public Agencies Advisory Council is “a forum for [public agency] partners to share their expertise and experiences with each other and our product development team,” Nextdoor told CityLab in a statement.
Nextdoor added, “As we look to build a product that best serves all of our customers, it is critical to engage with them and gather direct input to inform our product development decisions. Nextdoor greatly values this best practice and, as part of that, we work closely with experts such as academics, community leaders, and sociologists as they bring diverse perspectives and advise on important areas of Nextdoor like civic engagement, neighbourhood vitality, and member experience.”