The far scarier face book: A new facial recognition program solved a murder in 20 minutes. And with a database of 3bn people, it can identify you, too … with chilling consequences for freedom, writes TOM LEONARD

Consider the scenario: you are walking down a street and everyone knows who you are.

This is not the cosy familiarity of a small town or village but a chilling situation where complete strangers are wordlessly stripping away your privacy.

Who you are, what you do, where you live, what you buy and even who your friends are — all of it disclosed whether you like it or not.

The ‘theft’ of your personal information might be conducted via a CCTV camera high above the ground or a smartphone pointed in your direction for just a moment.

It might be even more covert — a passer-by wearing high-tech ‘smart’ glasses linked to the web.

Now, in a bittersweet example of the biter bit, the face-stealers have been raided themselves.

This week it emerged that the entire client list of secretive New York company Clearview AI — the firm behind this controversial facial recognition system — had been stolen by someone who ‘gained unauthorised access’ to documents and data.

In a notification sent to customers, obtained by the Daily Beast website, the company said the intruder had also discovered the number of user accounts those customers had set up and the number of searches they had conducted.

Savour the anonymity of modern life while you can — it won’t be available for long. 

Facial recognition technology has advanced so dramatically that Clearview’s new computer program can now identify almost anyone from a single snatched photo.

The company has been supplying police forces across the U.S. with a system that potentially allows them to recognise a face by comparing it with three billion publicly available images of people ‘scraped’ from all across the internet, including social media giants like Facebook, Twitter and YouTube.

Last night, it was claimed Clearview is now working with 2,200 organisations including the Metropolitan Police, as well as banks, major U.S. retailers such as Walmart and Macy’s, sports and entertainment venues and casinos.

The list, which included a sovereign wealth fund in the United Arab Emirates, was reported by Buzzfeed News.

It claimed that, while the Met says Clearview is not being used in its recently deployed live facial recognition tool, Clearview’s logs show the force has used the service for more than 170 searches.

Clearview claims at least 600 law enforcement agencies, including the FBI, are using this technology — even though it is condemned by the internet companies — to recognise criminal suspects.

The development has horrified tech experts as well as privacy and human rights campaigners.

Some now predict Clearview — a tiny set-up which has also licensed its software to a string of private companies for supposed security purposes — could end up destroying privacy as we know it by exploiting the vast size and reach of social media.

Even Clearview’s own employees wouldn’t necessarily disagree with this dark prognosis.

With the arrogant insouciance typical of Silicon Valley, a backer of Clearview acknowledged to a newspaper that its facial recognition system ‘might lead to a dystopian future or something, but you can’t ban it’.

In the UK, alarm was fuelled last month by the news that the Met police is deploying facial recognition cameras, scanning people on London streets for criminal suspects. More than half a dozen other UK police forces have been conducting similar trials.

Yet speaking this week, Britain’s top police officer, Met Commissioner Dame Cressida Dick, attempted to downplay privacy fears, claiming the security benefits of spy cameras make them worth it.

She said: ‘Concern about my image and that of my fellow law-abiding citizens passing through LFR [live facial recognition] . . . feels much, much, much smaller than my and the public’s vital expectation to be kept safe from a knife through the chest.’

Her words sparked concern among privacy campaigners, who seized on them as proof that Britain is turning into a surveillance state — though it should be noted that the Met’s technology is much less intrusive than Clearview’s.

Although the Met’s system can still analyse faces in seconds using CCTV cameras, phones or officers’ body cameras, it checks them only against a watchlist of suspects rather than the entire internet.

And, according to a recent report by Scotland Yard, the Met’s software is far from accurate. In fact, it can only recognise a third of women and two-thirds of men. And black people are far more likely to be wrongly flagged up than white people.

Clearview, however, has billions of faces to compare and is said to be far more efficient, allegedly finding matches up to 75 per cent of the time.

Its creators — a young Australian computer expert and former model named Hoan Ton-That and Richard Schwartz, a former aide to New York mayor Rudy Giuliani — started out by simply vacuuming up faces online.

A Clearview programme was developed to automatically collect images of people’s faces, not only from social media but also from company home pages, and news and educational sites.

Clearview engineers then designed a complicated algorithm to convert all these images into mathematical formulas, or ‘vectors’, based on facial geometry, such as the distance between nose, mouth, ears and jaw.

When a new photo is uploaded into the system — from a police camera, for instance — the software converts it into a vector and compares it with similar vectors in the database, flashing up links to the websites where the original photos came from.
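The matching step described above can be sketched in a few lines of Python. This is a hedged illustration, not Clearview’s actual code: the toy vectors, the example URLs and the use of cosine similarity are all assumptions, standing in for whatever proprietary representation the company uses.

```python
# Illustrative sketch only: compares a "probe" face vector against a toy
# database and returns the source pages of any close matches.
import math

def cosine_similarity(a, b):
    """Similarity between two face vectors: closer to 1.0 means more alike."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def find_matches(probe, database, threshold=0.9):
    """Return (url, score) pairs whose stored vector closely matches the probe."""
    hits = []
    for url, vector in database.items():
        score = cosine_similarity(probe, vector)
        if score >= threshold:
            hits.append((url, round(score, 3)))
    return sorted(hits, key=lambda hit: -hit[1])

# Invented database: each entry maps a source web page to a face vector.
database = {
    "https://example.com/profile/alice": [0.9, 0.1, 0.4],
    "https://example.com/news/bob":      [0.1, 0.8, 0.2],
}

probe = [0.88, 0.12, 0.41]  # vector computed from a newly uploaded photo
print(find_matches(probe, database))
```

At real scale, a linear scan like this would be replaced by an approximate nearest-neighbour index, but the principle is the same: the new photo becomes a vector, and the system returns the web pages whose stored vectors sit closest to it.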

Clearview’s creators say they don’t want the public to get hold of their creation for fear of what people will do with it, but their investors and police predict it will eventually be freely available.

And even if it isn’t, someone else is likely to copy it.

Clearview claims it originally envisaged using its creation for mundane tasks such as vetting babysitters or helping hotel staff identify guests.

Then it dawned on the company that the software had much bigger potential, and it started marketing it to law enforcement agencies.

Its first success came last February when Indiana State Police used Clearview to solve a murder case in just 20 minutes.

Two men had been in a fight that led to one of them fatally shooting the other. A bystander filmed the crime, and the gunman’s face was matched to that of a man in a video posted online, which helpfully also provided his name.

So how does facial recognition software actually work?

Step one

Using a complex algorithm, facial recognition software measures the facial geometry of a person, such as the distance between their nose, mouth, ears and jaw.

Step two

These values are then matched with similar photos across the internet — including social media giants such as Facebook, Twitter and YouTube.

Step three

This means that the 600 law enforcement agencies in the U.S. using this technology can recognise a face by comparing it with three billion publicly available images of people ‘scraped’ from all across the internet, allowing them to identify a suspect or bystander.
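Step one can be sketched in code. The landmark coordinates below are invented for illustration, and real systems locate such points automatically in the photo; Clearview’s actual formulas are not public. The key idea is that ratios of distances, rather than the raw distances, describe the face regardless of how large it appears in the image.

```python
# Illustrative sketch of measuring facial geometry from landmark points.
import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def geometry_vector(landmarks):
    """Turn landmark coordinates into ratios of distances, so the result
    does not depend on how large the face appears in the photo."""
    nose, mouth, jaw = landmarks["nose"], landmarks["mouth"], landmarks["jaw"]
    ear_span = distance(landmarks["left_ear"], landmarks["right_ear"])
    return [
        distance(nose, mouth) / ear_span,
        distance(nose, jaw) / ear_span,
        distance(mouth, jaw) / ear_span,
    ]

# The same face photographed at twice the size yields the same ratios.
face = {"nose": (50, 50), "mouth": (50, 70), "jaw": (50, 95),
        "left_ear": (10, 45), "right_ear": (90, 45)}
double = {k: (2 * x, 2 * y) for k, (x, y) in face.items()}
print(geometry_vector(face) == geometry_vector(double))
```

Steps two and three then amount to comparing such vectors against billions of stored ones and following the matches back to their source pages.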

The suspect hadn’t been on official police databases so Clearview’s help proved crucial. The company started offering police officers a 30-day free trial.

According to Clearview’s promotional material, the software helped identify serial bank fraudsters, an unidentified man found dead on a pavement and — most astonishingly — an alleged child abuser whose face was caught in the mirror of someone else’s photo, taken at a gym.

Investigators across the U.S. and Canada are now using Clearview to identify child sex abuse victims.

Police officers say that Clearview offers several advantages over other facial recognition tools. For one, its database of faces is so much larger.

Also, its algorithm doesn’t require people to be looking straight at the camera; it can even identify a partial view of a face — under a hat or behind large sunglasses.

But not only is it swiping photos that people have innocently posted online over the years, it is also ignoring the expressed wishes of the web companies that have forbidden such data ‘scraping’.

Former Google boss Eric Schmidt has said facial recognition is the one technology he won’t pursue as it could be used ‘in a very bad way’.

It isn’t illegal for U.S. police to use Clearview so long as it’s not their only evidence for arresting someone. However, some law enforcement officials are already uneasy. New Jersey’s attorney general has banned the state’s police from using it.

Critics say Clearview is ripe for endless misuse — even by the police. 

Eric Goldman, co-director of the High Tech Law Institute at Santa Clara University, said he wouldn’t be worried if he were sure this technology would be used solely to investigate crime. 

‘However, we have a long history of law enforcement misusing such tools, such as rogue officers using it to stalk potential romantic partners,’ he said.

He also predicted that unscrupulous governments might use what would be a ‘really juicy database’ to ‘dig up secrets about people to blackmail them or throw them in jail’.

It’s unclear whether any British police force has yet expressed interest in Clearview. But the Home Office plans to invest £97 million in biometric technology, including facial recognition, and thanks largely to our history of terror attacks, the UK is more tolerant than other Western countries of surveillance cameras.

There will be many in Britain who, worried by declining police numbers and soaring knife crime, may applaud any development that takes criminals off the streets.

However, others are appalled by the idea of facial recognition being used at all. Silkie Carlo, director of the UK privacy campaign group Big Brother Watch, accused the Met of turning people into ‘walking ID cards’.

She said: ‘’My fear is this is going to end up across the CCTV networks which means we lose anonymity in public spaces.’

The technology, particularly the ‘incredibly intrusive’ Clearview, will have a ‘chilling effect’ on private life, she predicted.

‘People will think twice before they go on a rally or a protest.’ 

And for those who reassure themselves they’re not on Facebook or Instagram, she warned: ‘There are very few people who don’t have a data trace online.’

Even before Clearview appeared on the scene, a cross-party group of MPs including former Brexit Secretary David Davis backed Big Brother Watch in calling for a stop to facial recognition surveillance.

Ms Carlo says she knows of at least one British CCTV security company which has boasted that it can use the same data-scraping technology employed by Clearview to identify people recorded by its cameras.

Soon computer algorithms won’t even need a face to identify people. Artificial intelligence installed in a mobile phone is being trained to recognise the person carrying it by the way they walk or the pattern of their heartbeat.

Meanwhile, programs are said to be able to analyse our faces and read our emotions.

Many of us crave fame and the thrill of being recognised. In future, the real luxury may be to live our lives in blissful obscurity.
