Maine and California voters show skepticism of the government’s ability to responsibly use certain smart city technology


  • Global spending on smart city projects is expected to more than double in the next five years.
  • But in a sign of public skepticism of the government’s ability to responsibly use certain smart city technologies, voters in Maine and California voted to limit government use of facial recognition and carceral risk assessment algorithms, respectively.
  • Insider Intelligence publishes hundreds of insights, charts, and forecasts on the Connectivity & Tech industry with the Connectivity & Tech Briefing. You can learn more about subscribing here.

Global spending on smart city projects is expected to more than double in the next five years, growing from $131 billion in 2020 to $295 billion by 2025, per Insider Intelligence estimates from March 2020. Smart city projects promise to reduce commute times by 15-20%, shrink a city’s environmental footprint by 10-30%, and overall make citizens happier and more engaged, per McKinsey.

[Image: More cities are beginning to crack down on the use of facial recognition technology. David McNew/Getty Images]


But many smart city projects elicit public backlash stemming from concerns about government overreach. For instance, nearly half of US adults do not trust law enforcement to use facial recognition technology responsibly, and 69% are uncomfortable with how governments handle their data, according to Pew Research.

This election cycle, voters in Portland, Maine and California got a chance to directly vote on the acceptable use of technology by the government—here’s what happened:

  • Voters in Portland, Maine approved a ballot measure to ban government use of facial recognition technologies. The ballot measure builds on a temporary ordinance city council members approved in August, per the Portland Press Herald. Public scrutiny of law enforcement use of facial recognition reached new heights this summer amid the George Floyd protests—in response, Amazon, IBM, and Microsoft suspended or terminated the sale of facial recognition services to law enforcement agencies. The companies all advocated for federal regulation of the technology rather than an outright ban. But regulation might not be enough to overcome the inherent privacy issues posed by facial recognition, such as the consent issue highlighted by Canada’s investigation into Clearview AI, or the racial bias on display when Detroit police falsely arrested Robert Julian-Borchak Williams in January. The majority of voters in Portland, Maine seem to agree with this assessment, and the city now joins Boston, San Francisco, and Portland, Oregon in opting to ban government use of the technology outright.
  • Voters in California rejected Proposition 25, which would have replaced the cash bail system with an AI-based risk assessment algorithm. Private bail bond companies typically charge individuals a nonrefundable fee of 10–15% of their total bail amount in exchange for taking on the full bail obligation from the court, and in the US the bail industry disproportionately profits from poor minority communities. Under Prop. 25, individuals the algorithm categorized as low- or medium-risk would have been released from jail within 36 hours, while individuals deemed high-risk would have remained in jail. The ballot proposal included commitments to regularly demonstrate that the risk assessment tool minimized implicit bias, but opponents were skeptical of these claims. For instance, Alice Huffman, the president of the NAACP’s California State Conference, said of the ballot measure: “Computer models may be good for recommending songs and movies, but using these profiling methods to decide who gets released from jail or who gets a loan has been proven to hurt communities of color.” The fact that Prop. 25 didn’t pass isn’t necessarily reflective of public mistrust in AI, however—the bail bond industry poured over $5 million into opposing the initiative, promoting the notion that removing the cash bail incentive would let defendants skip court hearings without consequence.
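The release rule Prop. 25 described can be summarized in a few lines. The sketch below is purely illustrative—the tier names and function are hypothetical and do not reflect the actual assessment tool, only the low/medium/high decision logic the measure proposed:

```python
# Hypothetical sketch of the pretrial release rule described under Prop. 25:
# low- and medium-risk individuals are released within 36 hours, while
# high-risk individuals remain detained. Tier names are illustrative only.

def release_decision(risk_tier: str) -> str:
    """Return the pretrial outcome for a given risk tier."""
    if risk_tier in ("low", "medium"):
        return "release within 36 hours"
    if risk_tier == "high":
        return "detain"
    raise ValueError(f"unknown risk tier: {risk_tier}")

print(release_decision("medium"))  # release within 36 hours
```

The controversy, of course, was never over this simple branch but over how the underlying algorithm assigns people to tiers in the first place.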

Want to read more stories like this one? Here’s how you can gain access:

  1. Join other Insider Intelligence clients who receive this Briefing, along with other Connectivity & Tech forecasts, briefings, charts, and research reports to their inboxes each day. >> Become a Client
  2. Explore related topics more in depth. >> Browse Our Coverage


Hirsh Chitkara