“We do not sell facial recognition technology to police departments in the United States today,” Microsoft president Brad Smith said during a Washington Post Live event Thursday. “But I do think this is a moment in time that really calls on us to listen more, to learn more and most importantly to do more. Given that, we’ve decided that we will not sell facial recognition technology to police departments in the United States until we have a national law in place, grounded in human rights, that will govern this technology.”
Smith noted that the tech giant will put “additional review factors” in place to regulate uses of facial recognition technology in “other scenarios.”
He also called for action from the federal government, stressing that the move by tech companies to stop selling the technology will not be enough by itself.
“If all of the responsible companies in the country cede this market to those that are not prepared to take a stand, we won’t necessarily serve the national interest or the lives of the black and African American people of this nation well,” Smith added. “We need Congress to act, not just tech companies alone. That is the only way that we will guarantee that we will protect the lives of people.”
The move comes after IBM CEO Arvind Krishna sent a letter to Congress on Monday calling for reforms on how facial recognition technology is being used in the wake of protests across the country sparked by the death of George Floyd.
“IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by other vendors, for mass surveillance, racial profiling, violations of basic human rights and freedoms,” Krishna wrote. “We believe now is the time to begin a national dialogue on whether and how facial recognition technology should be employed by domestic law enforcement agencies.”
Krishna called on Congress to “bring more police misconduct cases under federal court purview” and “make modifications to the qualified immunity doctrine that prevents individuals from seeking damages when police violate their constitutional rights.”
“Artificial Intelligence is a powerful tool that can help law enforcement keep citizens safe,” Krishna added. “But vendors and users of AI systems have a shared responsibility to ensure that AI is tested for bias, particularly when used in law enforcement, and that such bias testing is audited and reported.”
In addition, he called for a federal registry of police misconduct and measures compelling states and localities to review police departments' use-of-force policies. He also asked Congress to consider the Walter Scott Notification Act, which would require states receiving federal funding to report more details on the use of deadly force by law enforcement officers to the Department of Justice.