
Facial recognition app used by more than 600 law enforcement agencies suffers significant data breach - Fox News

A new facial recognition company, whose app it says is used by more than 600 law enforcement agencies, has suffered a data breach, the company confirmed.

Clearview AI confirmed to Fox News that its entire client list was stolen in the breach. According to the company, someone gained access to a list of all of Clearview’s customers, the number of accounts used by those customers and the number of searches they have conducted.

The software, first revealed in a New York Times investigation earlier this year, claims to identify anyone from a single image by instantly running it through a database of more than 3 billion photos pulled from the internet. Law enforcement agencies had been using the app in secret, and some have reported that the technology has been accurate and helpful in identifying suspects.


The breach is significant because privacy advocates, and some law enforcement officials, had already voiced cybersecurity concerns about the new app’s potential vulnerability to hacking. Clearview AI stopped short of calling the incident a hack.

“Security is Clearview’s top priority,” Tor Ekeland, an attorney for Clearview AI, said in a statement. “Unfortunately, data breaches are part of life in the 21st century. Our servers were never accessed. We patched the flaw, and continue to work to strengthen our security.”

Just a couple of weeks ago, Hoan Ton-That, the founder of Clearview AI, boasted in an interview with Fox News: “We’ve never had any breaches on our servers yet.”


“We have some of the best engineers in the country who were at some of the most prominent firms running our cybersecurity,” Ton-That said. “And we've done outside audits of our code and our setup to make sure it's as good as it can be.”

Gurbir Grewal, the attorney general of New Jersey, has temporarily barred the state’s police departments from using the application, citing cybersecurity and privacy concerns, even though one department had already used it to help identify a pedophile.

“Some New Jersey law enforcement agencies started using Clearview AI before the product had been fully vetted,” Grewal said in a statement. “The review remains ongoing.”

Two weeks ago, a class action lawsuit was filed in New York federal court against Clearview AI, claiming the company illegally collects people’s biometric information without their consent.


The Clearview AI software runs images through its database, which the company claims contains more than 3 billion photos pulled from websites.

“It searches only publicly available material out there,” Ton-That said in February. “This is public data. We're not taking any personal data ... things that are out there on the Internet, in the public domain.”


However, Google, YouTube, Facebook, Twitter, Venmo and LinkedIn have sent cease-and-desist letters to Clearview AI in an effort to shut the app down. The companies said photos users put on their accounts are not public domain and taking people’s photos, a practice known as scraping, violates their terms of service.



