
Community policing officer – Requesting backup!

Capgemini
2022-01-17

  • Too much information (‘infobesity’) threatens the effectiveness of community policing officers.
  • An Intelligent Virtual Assistant (IVA) can filter information based on relevance, and can monitor behavior and stress in order to deliver information effectively.
  • Pitfalls of IVAs include ‘filter bubbles’, which narrow an officer’s focus, and the limited extent to which some AI systems can explain their choices. Properly designed, an IVA can help community policing officers stay in control of their own work.

The work of community policing officers has changed due to the stream of information from social media, sensors, and statistics. Having your own Intelligent Virtual Assistant (IVA) can help.

In community policing work, officers are the link between residents, companies, and institutions in the neighborhood. They are crucial in the integrated approach to safety in the communities they serve. Based on the principle of ‘knowing and being known’, community policing officers endeavor to increase trust in the police and reduce the distance between the police and civilians. In this digital age, close contact with civilians and organizations is more important than ever. The first to arrive at the scene of any event in the neighborhood are civilians, who use smartphones, social media, and local neighborhood apps to share high-quality videos and photos, as well as real-time information. In addition, more and more information comes in digitally from local authorities, governments, and both public and private sensors. And, of course, community policing officers also receive information from their colleagues. From all of this, they must filter what is relevant and what they need to act on.

There are significant differences in information needs between community policing officers in large cities and those who work in rural areas. In cities, community policing teams are often used, with individual officers taking responsibility for specific aspects of policing, such as youth crime, environmental offenses, drugs, and cybercrime, each with its own information needs. In the current landscape, information is filtered via intake channels, such as service centers, online support, and coordinators within police teams, which are unable to provide the desired customization for individual community policing officers. To prevent relevant information from being overlooked, more information is shared with officers than they can actually handle. As a result, officers are buried under a growing mountain of information.

AI as a solution for information overload

Could Artificial Intelligence (AI) help to extract the information that is relevant to a specific district or expertise of a community policing officer from that mountain of information? We discussed the possibilities offered by Intelligent Virtual Assistants (IVAs) with Tibor Bosse, Professor of Social AI at Radboud University Nijmegen, and Charlotte Gerritsen, Assistant Professor of Social AI and Criminology at VU University Amsterdam. They defined the problem as one of infobesity[i].

They indicated that infobesity causes a person to lose sight of the overall picture due to a continuous flow of information. In particular, people tend to read information that matches their own preferences and prejudices. A 2016 Belgian evaluation of the terrorist attacks in Zaventem and Maalbeek warned about the dangers of infobesity within policing[ii]. It revealed that infobesity and fragmented sources of information are among the biggest problems facing today’s police force. The police have traditionally had an information policy of ‘need to know’, which has led to the fragmentation of information throughout police forces. Based on the evaluation, the police in Belgium decided to shift from ‘need to know’ to ‘need to share’. This shift is also noticeable in other European countries. But with a ‘need to share’ approach, the problem of infobesity will only increase further.

Could AI help tackle the issue of infobesity?

In the world of AI, IVAs are increasingly being used to help people in their work and protect them from infobesity. For example, IVAs can be trained to effectively filter information flows for the user and can even proactively search through various fragmented sources of information. In order to arrive at a good final selection of information, an IVA can use a Recommender Engine. We’re already familiar with these recommendation algorithms in our private lives, for example on YouTube and Netflix, where we receive recommendations for products or films based on our interests. These algorithms initially make use of historical data, and subsequently adapt the recommendations to the specific user’s viewing and listening behavior.
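To make this concrete, here is a minimal sketch of content-based filtering in Python: incoming reports are ranked by their textual similarity to items the officer previously marked as relevant. Everything here (the function name `rank_reports`, the sample reports, the choice of TF-IDF) is an illustrative assumption, not a description of an actual police system.

```python
# Minimal content-based recommender sketch (hypothetical data and names):
# rank incoming reports by similarity to the officer's past relevant items.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

def rank_reports(relevant_history: list[str], incoming: list[str]):
    vectorizer = TfidfVectorizer(stop_words="english")
    # Fit on all texts so history and incoming share one vocabulary.
    matrix = vectorizer.fit_transform(relevant_history + incoming)
    # The interest profile is the average vector of previously relevant items.
    profile = np.asarray(matrix[: len(relevant_history)].mean(axis=0))
    scores = cosine_similarity(profile, matrix[len(relevant_history):])[0]
    return sorted(zip(scores, incoming), reverse=True)

history = ["drug dealing reported near the park",
           "youths loitering around the schoolyard at night"]
incoming = ["new complaint about dealing near the park entrance",
            "lost bicycle handed in at the station"]
for score, report in rank_reports(history, incoming):
    print(f"{score:.2f}  {report}")
```

A real IVA would of course use richer signals than raw text, but the principle is the same: start from historical data, then keep re-scoring against a profile that tracks the individual officer.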

In the case of the community policing officer, the recommendation algorithm would be trained on historical data from police intake channels and from fellow officers in different geographical neighborhoods. At first use, the virtual assistant will present similar kinds of information from the community to each community policing officer but, over time, the recommendation algorithm will learn more and more about the individual officer’s reading behavior. In addition, the officer will be able to give specific feedback on how relevant the presented results are, so the IVA gets better and better at identifying which information is more or less relevant to each community policing officer. The virtual assistant could also use, for example, skin conductance measured by a smartwatch, or the phone camera, to detect stressful situations and adapt how it delivers information accordingly. This would support the community policing officer and protect them from the growing dangers of infobesity, even in acute situations.
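As a sketch of that feedback loop, assume the IVA keeps a simple weight per topic and nudges it after every thumbs-up or thumbs-down from the officer. The topic names, neutral starting value, and learning rate below are all hypothetical choices for illustration.

```python
# Hypothetical relevance-feedback loop: one weight per topic, nudged toward
# 1.0 on a thumbs-up and toward 0.0 on a thumbs-down.
LEARNING_RATE = 0.1

def update_profile(profile: dict[str, float],
                   item_topics: list[str], relevant: bool) -> None:
    target = 1.0 if relevant else 0.0
    for topic in item_topics:
        current = profile.get(topic, 0.5)  # unknown topics start neutral
        profile[topic] = current + LEARNING_RATE * (target - current)

profile: dict[str, float] = {}
update_profile(profile, ["drugs", "nightlife"], relevant=True)
update_profile(profile, ["parking"], relevant=False)
print(profile)  # {'drugs': 0.55, 'nightlife': 0.55, 'parking': 0.45}
```

Stress signals could then modulate the output side, for example by raising the relevance threshold so that only the highest-scoring items reach the officer in an acute situation.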

Explainable AI

We also asked Bosse and Gerritsen whether there were any risks associated with the use of these IVAs. As it turned out, there are several. For example, there is the risk of creating a filter bubble or ‘fable trap’, as one Dutch TV host called it on his show[iii]. Since the system only shows information it considers interesting, the user ends up in a digital information bubble, which also creates the notion that other information is irrelevant. A 2015 study by Facebook researchers[iv] showed that, on social media, people tend to focus on their own interests 15% more than they would when watching TV or reading newspapers. One way to counteract this is to use technology to draw extra attention to different or even opposing information in the user’s area of interest. This approach is widely used in online retail, where websites entice users to view products other than those they were initially looking for. An Intelligent Virtual Assistant would have to be configured in the same way to prevent a filter bubble.
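One common way to build this counter-bubble behavior into a recommender is to re-rank results with Maximal Marginal Relevance (MMR), which trades relevance off against similarity to items already selected. The sketch below is generic; the relevance scores and similarity function are assumed to come from the recommender itself.

```python
# Generic MMR re-ranking sketch: greedily pick items that are relevant but
# not too similar to anything already selected, keeping the feed diverse.
def mmr_rerank(items, relevance, similarity, k=5, trade_off=0.7):
    selected: list[int] = []
    candidates = list(range(len(items)))
    while candidates and len(selected) < k:
        def score(i: int) -> float:
            # Penalty grows with similarity to the closest selected item.
            redundancy = max((similarity(i, j) for j in selected), default=0.0)
            return trade_off * relevance[i] - (1 - trade_off) * redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return [items[i] for i in selected]
```

With `trade_off` close to 1.0 this reduces to a pure relevance ranking; lowering it deliberately pushes different, or even opposing, items into the officer’s feed.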

Another risk we discussed with Bosse and Gerritsen was the extent to which the IVA’s choices can be explained. This is the domain of ‘explainable AI’: a human must be able to trace the path an AI system took to reach a decision. Neural networks are a well-known example of an AI system where this is often not the case; they are ‘black boxes’. That would be unsuitable for community policing officers, who must always be able to retrace how information choices are made. The IVA will therefore have to use explainable AI. This would allow community policing officers to, for instance, trace a recommendation back to topics they had previously labelled as being of interest.
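As a sketch of what this could look like, the IVA could attach to every recommendation the previously labelled topics that triggered it, instead of only a black-box score. The keyword matching below is deliberately simple and hypothetical.

```python
# Hypothetical explainable recommendation: every suggestion carries the
# officer's own previously labelled interests that triggered it.
def explain_recommendation(report: str,
                           labelled_interests: dict[str, list[str]]):
    text = report.lower()
    matched = [topic for topic, keywords in labelled_interests.items()
               if any(keyword in text for keyword in keywords)]
    return bool(matched), matched  # (recommend?, why)

interests = {"drugs": ["dealing", "narcotics"],
             "youth crime": ["vandalism", "loitering"]}
recommend, why = explain_recommendation(
    "Report of dealing near the schoolyard", interests)
print(recommend, why)  # True ['drugs']: the officer can retrace the choice
```

However the underlying model is built, the requirement is the same: the officer must be able to ask ‘why am I seeing this?’ and get an answer in terms they themselves defined.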

After a discussion about the pros and cons, we concluded that IVAs can offer many possibilities for community policing officers. However, it is important to take all the pitfalls of AI into account. If that is done properly, community policing officers with an Intelligent Virtual Assistant in their pocket will be able to keep control of their work and cope with the growing digitization of our society. Spending less time on irrelevant information means more time making neighborhoods safer for everyone.

At the end of our conversation, Professor Bosse briefly discussed his work with the European Space Agency[v]. ESA is working on the development of floating digital e-Partners for astronauts: small robots that provide contextual information, help with complex situations during, for example, a Mars mission, and can even give astronauts emotional support. If this kind of futuristic technology could be added to the IVA of a community policing officer, then protection against infobesity might be just the tip of the iceberg of what AI can do for our future police officers.

Find out more

This article has been adapted from a chapter in the Trends in Safety 2021-2022 report giving European leaders insight into the safety and security trends affecting citizens in the Netherlands.

  • The full report in Dutch can be found here.
  • An executive summary in English can be found here.

Authors

Arul Elangovan
Enterprise Architect
Arul Elangovan mainly focuses on digital and customer experience for citizens in the public safety domain. He also provides training in customer journey design and agile architecture.
Email: arul.elangovan@capgemini.com
Frank Inklaar
Business Analyst
Frank Inklaar is a Senior Consultant at Capgemini. He focuses on the application of advanced analytics and Artificial Intelligence in the field of public security and safety.
Email: frank.inklaar@capgemini.com

[i] Infobesity and psychological factors: https://www.researchgate.net/publication/336612222_DECISION_MAKING_IN_THE_ERA_OF_INFOBESITY_A_STUDY_ON_INTERACTION_OF_GENDER_AND_PSYCHOLOGICAL_TENDENCIES

[ii] 2016 Belgian evaluation of the attacks in Zaventem and Maalbeek: https://www.politieacademie.nl/kennisenonderzoek/kennis/mediatheek/pdf/94442.pdf

[iii] Broadcast ‘De Fabeltjesfuik’ (‘The Fable Trap’) by Arjen Lubach: https://youtu.be/FLoR2Spftwg

[iv] Exposure to ideologically diverse news and opinion on Facebook, E. Bakshy, S. Messing, and L.A. Adamic: https://science.sciencemag.org/content/348/6239/1130

[v] Supporting Human-Robot Teams in Space Missions using ePartners and Formal Abstraction Hierarchies, T. Bosse, J. van Diggelen, M.A. Neerincx, and N.J.J.M. Smets, 2015