Researchers unveil colorful patch that makes wearers unrecognizable to AI surveillance

Researchers say powerful, artificially-intelligent surveillance systems capable of tracking your every move may have an unlikely kryptonite: a simple printable patch. 



In a demonstration, engineers from KU Leuven, a university in Belgium, showed how subjects wearing specially designed patches -- about the size of a vinyl record -- were able to elude an algorithm designed to identify humans. 

In a YouTube video, two researchers, one without the patch and another with the colorful card hung around his neck, are shown standing in front of a camera working in tandem with an algorithm to identify both objects and humans in the room. 


In a demonstration, engineers from KU Leuven, a university in Belgium, showed how subjects wearing specially designed patches -- about the size of a vinyl record -- were able to elude an algorithm designed to identify humans. Above, the person on the right remained undetected

While the demonstrator without the patch is clearly marked by the algorithm, his counterpart seems to be completely invisible.   


Patterns printed on the patches could be used to design t-shirts or other types of clothing to achieve the same results, say researchers. 

Their cloaking patterns currently only work on one type of algorithm, they say. 

'We believe that, if we combine this technique with a sophisticated clothing simulation, we can design a T-shirt print that can make a person virtually invisible for automatic surveillance cameras,' they say in a paper published on arXiv. 
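The general recipe behind patches like these is to treat the patch's pixels as adjustable numbers and nudge them, by gradient descent, in whatever direction lowers the detector's confidence that a person is present. The sketch below illustrates that idea on a toy scale: the "detector" is a stand-in logistic model over a flattened 8x8 patch, not the YOLO-style person detector the KU Leuven researchers actually targeted, and the weights are random assumptions for demonstration only.

```python
import numpy as np

# Toy illustration of adversarial-patch optimization. The real work
# attacks a neural-network person detector; here a stand-in logistic
# "detector" scores a flattened 8x8 patch, and we lower that score by
# gradient descent on the patch pixels themselves.

rng = np.random.default_rng(0)
D = 8 * 8                        # flattened patch size (toy)
w = rng.normal(size=D)           # stand-in detector weights (assumption)
b = 0.5

def person_score(patch):
    """Probability the toy 'detector' assigns to 'person present'."""
    z = w @ patch + b
    return 1.0 / (1.0 + np.exp(-z))

patch = rng.uniform(0.0, 1.0, size=D)    # start from random pixels
lr = 0.5
start = person_score(patch)

for _ in range(200):
    p = person_score(patch)
    grad = p * (1.0 - p) * w             # gradient of the logistic score w.r.t. pixels
    patch = np.clip(patch - lr * grad, 0.0, 1.0)   # keep pixels printable, in [0, 1]

print(f"detector score before: {start:.3f}, after: {person_score(patch):.3f}")
```

The same loop, run against a real detector's gradients and with extra terms that keep the pattern printable and robust to angle and lighting, is what produces the colorful record-sized cards in the demonstration.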

Patterns like the ones demonstrated by researchers could be used for a number of purposes, say experts, not all of them benevolent. 



For one, adversarial patterns could be strategically deployed by thieves or criminals to evade detection by cameras designed to identify shoplifting, or even by authorities looking to make an arrest. 

Secondly, and perhaps more concerning, the patterns have proven effective in tricking non-surveillance systems like those used in self-driving cars.

In a recent report, researchers showed how adversarial patterns and objects could potentially trick a Tesla powered by the company's Autopilot software into swerving into oncoming traffic or ignoring stop signs.

The applications of the cloaking tools aren't all boons for bad actors, however. 

As AI systems become more advanced and their use in surveillance more prevalent, some critics have warned that they could give rise to a culture of mass surveillance by governments and others. 





Some AI surveillance systems have already been deployed, with concerning results. Stock image

Recently, in a detailed report from the New York Times, China's government was revealed to have been using AI to surveil an ethnic minority, the Uighurs, with shocking results. 

In just one month, the report states, the systems identified 500,000 Uighurs.

To help disguise oneself from the watchful eye of AI-powered camera systems, some researchers have also proposed various hairstyles and makeup that make it difficult for algorithms to read one's face.   

Using a flashy mix of hair colors, makeup, and other accessories may trick a computer, but unfortunately for those looking to fly under the radar, the getups will likely have the opposite effect on human observers. 

HOW DOES FACIAL RECOGNITION TECHNOLOGY WORK?

Facial recognition software works by matching real-time images to a previous photograph of a person. 

Each face has approximately 80 unique nodal points across the eyes, nose, cheeks and mouth which distinguish one person from another. 

A digital video camera measures the distance between various points on the human face, such as the width of the nose, depth of the eye sockets, distance between the eyes and shape of the jawline.

A different smart surveillance system (pictured) that can scan 2 billion faces within seconds has been revealed in China. The system connects to millions of CCTV cameras and uses artificial intelligence to pick out targets. The military is working on applying a similar version of this with AI to track people across the country 

This produces a unique numerical code that can then be linked with a matching code gleaned from a previous photograph.
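The matching step described above can be sketched in a few lines: each face becomes a short numerical code, and two faces are declared a match when their codes are close enough. The measurements and threshold below are made-up illustrative values, not figures from any real system, and real systems use far longer codes.

```python
import numpy as np

# Sketch of face matching via numerical codes: a face is reduced to a
# vector of measurements (e.g. nose width, eye spacing), and two faces
# "match" when the Euclidean distance between their codes is small.
# All numbers here are illustrative assumptions.

def matches(code_a, code_b, threshold=0.5):
    """Declare a match when the distance between the two codes is below threshold."""
    return np.linalg.norm(np.asarray(code_a) - np.asarray(code_b)) < threshold

stored = [0.42, 0.31, 0.77, 0.55]       # code gleaned from a previous photograph
live_same = [0.44, 0.29, 0.75, 0.56]    # new image of the same person
live_other = [0.10, 0.62, 0.33, 0.91]   # new image of someone else

print(matches(stored, live_same))    # small distance -> True (match)
print(matches(stored, live_other))   # large distance -> False (no match)
```

The adversarial hairstyles and makeup mentioned earlier work by distorting exactly these measurements, so the live code no longer lands near the stored one.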

A facial recognition system used by officials in China connects to millions of CCTV cameras and uses artificial intelligence to pick out targets.

Experts believe that facial recognition technology will soon overtake fingerprint technology as the most effective way to identify people. 
