Op-ed

Below the radar of legislators and the public, AI systems used by the police are making "predictions" about who might commit crimes


The Israel Police is acquiring technological systems without sufficient public and professional debate, and their use is only being discovered after the fact. We need to rethink the processes and mechanisms for adopting new technologies, which can be valuable but may also be used by the police in harmful ways.


Let's talk for a moment about your daughter. We'll call her Noga. She's 26 years old and is on her way back from a vacation in the Caribbean. Of course, Noga, the apple of your eye, does not have any criminal record. Let's imagine that you are on your way to the arrivals hall to meet her when she calls you, hysterical, to say that she has been detained after landing and has had her baggage and person searched. Shockingly, they found three small bottles of something—ketamine, perhaps. Or maybe they found nothing. When you finally meet her, Noga is in tears. She says that each of the three police officers at the airport with whom she had contact said to her, separately, "You're a lesbian, right?"

In this scenario, you would certainly be asking yourself whether this search was legal, why it was carried out, and under what authority it was conducted. If something was found in Noga's suitcase and she was charged, your lawyer would submit a request for the investigative material under Section 74 of the Criminal Procedure Law. If nothing was found and you were furious and wanted to know why Noga was subjected to such a humiliating experience for no reason, you would hire a PR consultant. Either way, you would discover something interesting: your dear Noga's name was on a list that the Israel Police passed on to border control officers, allowing them to search her body and belongings. Who prepared this list? According to what operational procedure? On what legal basis?

Now let's assume that you discover, after struggling with avoidance tactics and with court orders banning publication of the details, that the list was prepared by an AI-based machine that issues statistical predictions about incoming passengers' potential involvement in drug crimes. You don't know where the machine gets its training data, how it generates its statistical predictions, or what weight it assigns to different factors. How did your daughter, with no criminal record, end up on this mysterious list? The system may take into account all kinds of data: sexual orientation, political opinions, location data, activity on social media. You know: big data, black box, et cetera.

While this story may be surprising, the system described is (more or less) at the heart of several confidential criminal proceedings, yet it is already known to the press, civil society organizations, and members of the Knesset. The chair of the Knesset Constitution, Law and Justice Committee, MK Simcha Rothman, submitted a request to the police for further details back in November and convened a committee discussion on the issue in recent weeks. But more fundamentally, this case, involving the use of "predictive policing" systems, is just one more in a series of incidents reflecting a troubling modus operandi used by the police.

This method was evident in the "Eagle Eye" affair (in which the police deployed a vehicle registration number recognition system without the legal authority to do so and were forced by the High Court of Justice to suspend its use until relevant legislation is enacted) and in the NSO Pegasus spyware affair (in which more than 1,000 cellphones were remotely infected with spyware that extracted the phones' entire contents, a practice the Merari Commission found had no legal basis and was subject to no checks and balances).

Under this model, the police purchase technological systems without sufficient public and professional debate, and the facts only emerge after the event. A public discussion should be held in the Knesset, addressing both the implications of using these technological systems and the new possibilities they open up. This is a moral debate over the very legitimacy of this method and over what the use of such technology means for the balance of power between the branches of government and for social processes. But this public discussion should also be preceded and followed by a professional one.

What emerges from the troubling model currently in use is that the Israel Police headquarters does not do sufficient, thorough work to examine the extent to which a proposed technology is aligned with existing law; does not carry out regulatory assessments to evaluate the impact of such technologies on human and civil rights; and does not formulate detailed operating procedures before the technologies are deployed. In addition, the police do not conduct broad evaluations of the systems to identify failures and successes, and where more limited evaluations are conducted, their findings are not passed on for internal and external oversight.

Technological systems, certainly those based on machine learning, are not inherently "bad" or "good." But the way in which they are designed and developed, and the context in which they are used, especially in law enforcement and policing, have such a profound impact that both the European Union and the United Nations have proposed banning certain of these systems outright. Examples include systems for facial recognition, intrusive surveillance, and crime prediction.

In Israel, a thorough and fundamental assessment of these issues is needed within law enforcement bodies and the systems that oversee them (the Attorney General, the courts, and the Knesset), rather than addressing only individual cases or specific aspects of the issue. Otherwise, we will find ourselves in a world in which our rights are taken away in secret, systems are used without any oversight, and state enforcement encroaches on our lives in unimaginably extensive and powerful ways.

This article was originally published in the Jerusalem Post.