Why facial recognition’s racial bias problem is so hard to crack
Nearly 40 percent of the false matches by Amazon’s facial recognition tool, which is being used by police, involved people of color.
Tech companies have responded to the criticism by improving the data used to train their facial recognition systems, and they are also calling for government regulation to help prevent the technology from being abused.
Source: Why facial recognition’s racial bias problem is so hard to crack – CNET