
A view to a brawl

Speed read
  • Public video surveillance is more prevalent than ever
  • Researchers are working to help officials spot crowd violence
  • Will AI make the world safer or restrict human rights?

Ever feel like you’re being watched? You probably are.

<strong>Who is watching?</strong> Camera-equipped drones use AI trained to recognize poses associated with aggressive behavior, monitoring crowds for signs of violence. Courtesy Amarjot Singh.

Millions of video surveillance cameras are installed on streets and in businesses throughout the world. Some of these cameras have helped law enforcement agencies solve crimes. The suspects in the 2013 Boston Marathon bombing, for example, were identified using this technology.

But the question now is whether video surveillance can be used to prevent future crimes. Amarjot Singh of the University of Cambridge thinks that if cameras are trained to spot suspicious behavior, future attacks like the Boston bombing might be averted.

Highly accurate automated video surveillance systems have been developed to identify abandoned objects that might contain bombs. Other systems can detect activities such as purse snatchings and child abductions. These technologies work well but are limited by the amount of area they can surveil.

<strong>Peacekeeper.</strong> An aerial view of a volatile crowd could potentially help keep the peace by reassuring authorities on the ground that a tense situation is not actually turning violent. Courtesy Gunnery Sgt. Robert Piper.

To alleviate this coverage problem, Singh, along with Devendra Patil of the National Institute of Technology in Warangal, India, and S.N. Omkar of the Indian Institute of Science in Bangalore, developed Eye in the Sky. The project uses camera-equipped drones and artificial intelligence (AI) to spot violent behavior in large crowds.

The proposed Drone Surveillance System (DSS) uses aerial imagery to detect individuals who are engaging in violent activity. To do this, the AI is trained to recognize the human figure by looking for 14 body key-points.
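As a rough illustration of the key-point idea, a detected person can be represented as a list of 14 (x, y) body coordinates and flattened into a feature vector for a downstream pose classifier. This is a minimal sketch; the article does not list the exact 14 points the DSS uses, so the names below follow common pose-estimation conventions and are assumptions:

```python
# Hypothetical sketch: represent one detected person as 14 (x, y) body
# key-points and flatten them into a feature vector for a pose classifier.
# The key-point names are assumptions based on common pose-estimation
# conventions; the article only states that 14 points are used.

KEYPOINTS = [
    "head", "neck",
    "right_shoulder", "right_elbow", "right_wrist",
    "left_shoulder", "left_elbow", "left_wrist",
    "right_hip", "right_knee", "right_ankle",
    "left_hip", "left_knee", "left_ankle",
]  # 14 key-points, as in the article

def to_feature_vector(pose):
    """Flatten a {keypoint: (x, y)} dict into a 28-value feature vector."""
    vector = []
    for name in KEYPOINTS:
        x, y = pose[name]
        vector.extend([x, y])
    return vector

# Example: a (made-up) pose with every joint at image center
example_pose = {name: (0.5, 0.5) for name in KEYPOINTS}
features = to_feature_vector(example_pose)  # 28 values per person
```

A fixed-length vector like this is what lets a network compare poses across people and frames regardless of where each person appears in the image.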

The DSS then learns to distinguish poses associated with violent behavior by using a ScatterNet Hybrid Deep Learning (SHDL) network and the annotated Aerial Violent Individual (AVI) dataset. Singh explained that the ScatterNet extracts the low-level features normally learned by the first layer of a deep network. This streamlines training because the network has fewer layers left to learn.

<strong>Stabbing pose.</strong> Singh’s deep learning network uses 14 key points on the human body to identify violent poses such as strangling, punching, kicking, shooting, and stabbing. Courtesy S.N. Omkar.

Created by Singh and his colleagues, the AVI dataset comprises 2,000 images containing 10,863 humans, each engaged in at least one violent activity such as kicking, punching, or stabbing. To train the system for real-life conditions, the images in the dataset include elements such as lighting changes, shadows, poor resolution, and blurring.

Singh’s team created the AVI dataset using a Parrot AR Drone with a 1GHz CPU running Linux. The drone is fitted with both front- and downward-facing cameras: the front-facing camera has a higher resolution and records the imagery, while the downward-facing camera monitors the drone’s state, such as its roll, pitch, and altitude.

Once recorded, images are sent to the Amazon cloud for real-time identification. Singh says that cloud computing gave the team the flexibility of using unlimited computational resources. “The cost of using the Amazon machine is $0.100/hour. Since we trained the model on our local machine, overall the experiments were not too expensive.”
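At the quoted rate, the cloud bill is easy to estimate. The back-of-the-envelope sketch below assumes continuous use of a single instance at the $0.10/hour figure Singh cites; the actual instance type and usage hours are not given in the article:

```python
# Back-of-the-envelope cloud cost at the quoted $0.10/hour rate.
# Assumes one instance running continuously; instance type and real
# usage hours are not specified in the article.
RATE_PER_HOUR = 0.10  # USD per hour, as quoted by Singh

def cloud_cost(hours):
    """Estimated cost in USD for the given number of instance-hours."""
    return RATE_PER_HOUR * hours

# e.g. a full week of continuous surveillance:
weekly = cloud_cost(24 * 7)  # 168 hours -> $16.80
```

Even round-the-clock operation stays under $20 a week at that rate, which is why Singh describes the experiments as inexpensive once training was done locally.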

<strong>Eye in the Sky.</strong> The system becomes less accurate with larger crowds. Due to the potential for abuse, co-developer Amarjot Singh believes the use of AI for crowd control should be regulated. Courtesy Benny Jackson/Unsplash.

Tests of the DSS produced varying results. The researchers found that as the number of humans in an aerial image increased, the system’s ability to detect violent activity decreased. The DSS nevertheless outperformed existing state-of-the-art technology by 10%.

Singh envisions police using his system as a tool to help officers quickly curtail violent activity in large crowds. When asked about the risk of the technology being used to violate human rights, Singh says that AI’s potential to make society safer outweighs that risk.

But he does concede that safeguards should be applied. “AI is extremely powerful and should be regulated for specific applications like defense, similar to nuclear technology,” says Singh.

For more information about the impact of video surveillance, visit the American Civil Liberties Union (ACLU) page dedicated to the issue.


Copyright © 2018 Science Node ™

