
The Municipal Elections 2018: Digital Resilience


Digital Election Advertising: The Prohibition on Using Municipal Resources


From a legal perspective, the digital space of election advertising in Israel is like the Wild West. The rapid technological advances of the last two decades have created an untenable legal situation: on the one hand, when it comes to the traditional media—the press, radio, and television—the law regulates the content and scale of political advertising during election campaigns, and does so in strict and exacting detail. Consequently, these channels are subject to close regulation, even though the impact of election advertising in these media is declining. On the other hand, although digital platforms’ influence on voters is increasing, candidates, political parties, and others (“third parties”) advertise there freely and with no statutory restraints.

The fact that there are no regulations governing election advertising in the digital space may have grave consequences for the values on which democratic elections are based—freedom of expression, multiplicity of opinions, equality of opportunity, privacy, fairness, and honest elections; it may also lead to a waste of public resources. These values are undermined when any individual who is sufficiently wealthy or close to the centers of media or political power can flood the public space with unlimited messages. There is also the possibility that the special characteristics of the digital media will be abused.For more on the subject, see Guy Lurie and Tehilla Shwartz Altshuler, “A Reform of Election Propaganda Laws” (Policy Study 109) (Jerusalem: Israel Democracy Institute, 2015).

In this paper we offer a general description of the challenges that political advertising in the digital world poses to Israeli democracy, especially in the context of municipal elections, and, more specifically, of the problem of the use of a municipality’s own resources for advertising purposes.

Local election advertising requires separate treatment from that for Knesset elections, because of three unique features, enumerated by Amnon Reichman. First, the legal provisions governing local elections differ from those for Knesset elections. Second, unlike Knesset elections, in which the party organizations supervise what content will be published or distributed, in local elections there are no internal balances to check the circulation of digital advertising. Third, digital media offer unique possibilities not only because of the direct link that social networks create between council heads or mayors (hereinafter “mayors,” even those not chosen in direct elections) and their constituents, but also because of the characteristics of these networks, such as the possibility of running negative or positive campaigns that target local residents.Amnon Reichman, “Innovations and Developments in Public Law in Israel in 2014,” Din Udvarim - Haifa Law Review 11 (2018): 205–259, esp. 254. Nevertheless, there has yet to be any serious discussion about regulating the use of digital media in local election campaigns.

In this paper, we focus on a problem unique to local elections; namely, the relative ease with which incumbent mayors can disseminate election advertising via the digital media and harness public resources for their campaigns.

Computational Political Advertising in the Digital World

The growing role of digital marketing has wrought a fundamental change in politics, with an impact on the interaction among candidates, lists, the traditional media, and of course the electorate. All agree that the digital space makes an immense contribution to improving the political process by multiplying the sources of information, increasing the options for civic participation, empowering individuals, maintaining connections and involvement between elections, enabling crowdfunding, and so on. At the same time, however, we also see the manipulation of voters’ thoughts and of the voting process, infringements of privacy, and a lack of transparency. Sometimes these take the form of disinformation or fake news, which are amplified by techniques of planned marketing or targeting based on the processing of personal data.Alison Weissbrot, “MAGNA and Zenith: Digital Growth Fueled by Programmatic, Mobile and Video,” Ad Exchanger, 20 June 2016. In recent years it has become increasingly evident that data analysis, automation, opaque algorithms, and computational advertising based on the analysis of big data can be exploited for unprecedented manipulation of public opinion and can undermine the viability of a public sphere based on individual choice and autonomy.

Part of the problem stems from the fact that techniques originally devised to sell products and services are now being used to influence beliefs, ideas, and democratic elections. The fact that algorithms can identify and target specific sectors, quite independently of what content will be conveyed to these groups and individuals (fake news, legitimate advertising, terrorism, pornography, pirated content), of the place, site, or profile on which the ads will be sold, or of who stands behind them, raises the question of whether advertising and persuasive content and methods that relate to the democratic process, rather than to commercial marketing, have unique attributes.Jack Nicas, “Fake-news Sites Inadvertently Funded by Big Brands,” Wall Street Journal, 8 Dec. 2016; Suzanne Vranica, “Advertisers Try to Avoid the Web’s Dark Side, From Fake News to Extremist Videos,” Wall Street Journal, 18 June 2017; Julie Clark, “Fake News: New Name, Old Problem. Can Premium Programmatic Help?” AdAge, 25 Apr. 2017.

In many ways, the techniques of digital political advertising and marketing are uncharted territory. This is why there are advertising and persuasion services that are incompatible with general legal provisions, such as the ban on deception and impersonation, or with the community rules of the social networks on which these campaigns are usually conducted. For example, automatic and semiautomatic networks of bots that mimic human beings (automated programs that seem to be real users) were recently discovered on Hebrew-language social networks.Ran Bar-Zik, “Army of Bots that Replied in Favor of Bugi Yaalon Caught on the Web,” Internet Israel, 19 August 2018; Noam Rotem, “How a Network of Political Tweets in Hebrew was Uncovered,” 5 August 2018, online.

In the context of local elections, an investigative report by Yedioth Ahronoth in August 2018 uncovered methods of questionable legality: circulating fake news, slandering candidates and sullying their reputations, creating fictitious profiles or Facebook pages that appear to belong to one’s opponent, setting up seemingly innocent Facebook groups that attract thousands of members and aim at promoting politicians, using trolls to spread anonymous slanders on the web, and purchasing products in order to create behavioral profiles of web surfers and thereby mislead public opinion.Shahar Ginossar and Guy Lieberman, “Grunt Work,” Yedioth Ahronoth Weekend Supplement (7 Yamim), 9 Aug. 2018; Guy Lieberman and Shahar Ginossar, “Racism and Hate? That’s What Works Best in Elections,” Yedioth Ahronoth Weekend Supplement, 15 August 2018. Another investigative report, published on the Mako website and Channel 2 News, found a similar picture of “hundreds or thousands of accounts that are suspected of being fictitious and that serve political goals—of the major parties—and also many on the municipal level: Herzliya, Nahariya, Haifa, Tiberias, Yavne, Qiryat Motzkin, Hod Hasharon, and many others.”Dror Globerman, “Investigation of the Fakes that Got Politicians Excited,” Mako, 23 August 2018.

From the Yedioth Ahronoth report we learn that there are no clear boundaries between what is permitted and what is forbidden for many forms of advertising. According to Ron Tannenbaum, CEO of the Media Group company, “you can monitor and identify political opinions, find your psychological profile and influence. Anyone who has the means can move entire groups.”Ginossar and Lieberman, “Grunt Work.” In fact, we can identify the growing sophistication of digital political campaigns over the last decade, whose goal is to identify individual voters and interact with them. Such campaigns may include the following:

• An attempt to generate support for candidates or parties during campaigns or, alternatively, to stimulate antagonism towards candidates or parties; 

• An attempt to generate support for a “yes” or “no” vote on a referendum;

• An attempt to persuade people to come out to vote or to stay home;

• An attempt to generate support for particular viewpoints, or alternatively to drum up opposition to other opinions.

They may also include:

• A package of relevant public information or, by contrast, of incorrect or slanted information (distribution of junk news or fake news, disinformation) during election campaigns or between them, in normal times, and during wars or other humanitarian catastrophes;

• A focus on the pros and cons of a particular position, or, alternatively, a one-sided presentation of the issue;

• Serious consideration of a leader’s or candidate’s attributes, or, alternatively, slander;

• Explicit advertising (labeled as such), non-advertising organic content (sometimes disguised), or talkbacks, shares, and likes—organic or paid—concerning content that has already been published.

At the polar extremes of the content offered to users, one can note the following:

• Legitimate use of Facebook accounts and other social networks, versus the creation and use of fictitious profiles or impersonations of other individuals and the opening of accounts intended to embarrass them;

• The use of human responders and automated or semi-automated bots, digital trolls, and biased commentators in order to interfere with the dialogue on a social network, give some individuals or opinions a false appearance of popularity, make radical views acceptable, and deter other participants in the discussion, thereby influencing the political discourse. For instance, political bots are used to flood networks with messages linked to a particular hashtag, to promote or attack a politician, or to produce fake likes or followers on the social networks. There is also increasing use of such bots to deploy certain words strategically in order to deceive the platforms’ algorithms and make certain content more popular. Sometimes they are used to tag and flag accounts and content as inappropriate under a social network’s community rules, which may lead to the blocking of accounts or the deletion of content;

• Paid advertising;

• Changing a site’s rank in search results through search-engine optimization or manipulation of search engines (“black hat” techniques);

• Making information go viral, optimizing content for a social network, producing a significant echo for offensive information or hate speech on a social network, or creating clickbait, that is, content designed to go viral.

It is important to emphasize that the use of disinformation in an election campaign is not limited to the open web and open social networks; it also occurs in messaging applications such as WhatsApp, Telegram, and Signal.


The Use of Personal Data for Election Advertising 

In the early 1990s, digital marketing already relied on long-term data collection and on monitoring users’ digital behavior patterns.K. C. Montgomery, “Safeguards for Youth in the Digital Marketing Ecosystem,” in D. G. Singer and J. L. Singer, eds., Handbook of Children and the Media (2nd ed.) (Thousand Oaks, CA: Sage, 2011), 631–648. This culture developed especially in the United States, where there was almost no government intervention in the digital world, ranging from the principle that platforms bear no responsibility for the content published on them to the minimal protection of consumers and privacy.O. Solon and S. Siddiqui, “Forget Wall Street: Silicon Valley is the New Political Power in Washington,” The Guardian, 3 Sept. 2017. All of this, along with constant innovations such as the massive switch to social networks and portable devices, facilitated the collection of data about many aspects of users’ lives, which could be processed in order to target commercial messages back at them.C. Smith, “Reinventing Social Media: Deep Learning, Predictive Marketing, and Image Recognition Will Change Everything,” Business Insider, 20 Mar. 2014. Today there is an ever-growing arsenal of analytical tools and software for forecasting, rating, and scoring, and especially for categorizing and segmenting individuals based on precise sets of demographic, psychographic, and behavioral traits and markers. Taken together, all of this generates an accurate profile that is used later, for example to tailor advertising on the social networks.
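To make the mechanics of this categorizing and segmenting concrete, the following minimal sketch (in Python, using pandas and scikit-learn) pools demographic and behavioral markers into per-user feature vectors and clusters them into audience segments. Every field name and value here is a hypothetical illustration rather than real data, and the sketch is of course far cruder than commercial tools.

```python
# A minimal sketch of audience segmentation: demographic and behavioral
# markers are pooled per user and clustered into target segments.
# All field names and values are hypothetical illustrations, not real data.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical per-user records combining demographic and behavioral markers
users = pd.DataFrame({
    "age":               [23, 47, 35, 62, 29, 51],
    "posts_per_week":    [14,  2,  8,  1, 20,  3],
    "political_shares":  [ 9,  0,  4,  1, 12,  2],
    "local_news_clicks": [ 3,  7,  5, 10,  2,  8],
})

# Standardize the features so that no single marker dominates the distance metric
features = StandardScaler().fit_transform(users)

# Cluster the users into a handful of audience segments
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0)
users["segment"] = kmeans.fit_predict(features)

# Each segment can then be addressed with differently tailored messages
print(users.groupby("segment").mean().round(1))
```

The point of the sketch is only that, once the data has been pooled, the segmentation step itself is trivial; the real power lies in the breadth of the data being collected.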

In other words, for some time now it has not simply been a matter of purchasing advertising on YouTube, but rather of integrating tools for data collection, audience segmentation, and the targeting of a deluge of messages designed to produce a reaction and engagement. Data collection and data analysis are the fuel of the industry; they are becoming more automated and optimized thanks to machine learning. What is new is that the broad range of techniques employed for data collection and targeting gives campaigns unprecedented power and creates a qualitative difference from what was formerly the norm in political advertising. For our purposes we will mention one technique here—emotion-based psychographic targeting.

The digital industry is constantly creating and improving tools to study the stimulation of diverse emotions and conscious and unconscious reactions, with the goal of reinforcing the link to brands or messages or of creating an emotional bond with them.C. McEleny, “Ford and Xaxis Score in Vietnam Using Emotional Triggers Around the UEFA Champions League,” The Drum, 16 Oct. 2016. Companies such as Facebook (but also Nielsen) use what is known as “neuro-marketing”— tools developed for research in the neurosciences in order to determine the emotional impact of advertising messages. One such technique is emotion analytics, developed by Google to make use of new types of data and track users’ reactions so that advertisers can understand the impact of their campaigns and the emotional assets they produce.T. Kelshaw, “Emotion Analytics: A Powerful Tool to Augment Gut Instinct,” Think with Google, August 2017.  

In 2016, Experian Marketing Services offered political campaign managers clusters of big data that combined demographics, psychographics, and emotional traits, so as to reach voters in a way that would allow campaign managers to investigate the “heart, soul, and mind” of those targeted, chiefly with regard to their political profile, pooled with their political attitudes in general, expectations, behavior, lifestyle, purchasing history, and media preferences. This is where Cambridge Analytica got into the game. It used a “five-factor personality model,” known as OCEAN (Openness, Conscientiousness, Extroversion, Agreeableness, and Neuroticism), to characterize the personality of every adult in the United States.J. Albright, “What’s Missing from the Trump Election Equation? Let’s Start with Military-Grade Psyops,” Medium, 11 Nov. 2016; M. Kranish, “Trump’s Plan for a Comeback Includes Building a ‘Psychographic’ Profile of Every Voter,” The Washington Post, 27 Oct. 2016.

The analysis was based on Facebook data, on voting histories available to the parties, and on marketing data acquired from leading companies such as Acxiom, Experian, Nielsen, Data Trust, Aristotle, L2, and Infogroup. This made it possible for Cambridge Analytica to develop an internal database with thousands of data points for each individual, identify weaknesses, target content, and design advertising that eventually appeared on various digital channels, including television (which is still considered to be an influential medium).Advertising Research Foundation, Cambridge Analytica: Make America Number One (case study, 2017); A. Nix, “The Power of Big Data and Psychographics in the Electoral Process,” presented at the Concordia Annual Summit, New York, 2016 (YouTube). The strategy devised was based on the identified weaknesses of individual voters.D. Karpf, “Will the Real Psychometric Targeters Please Stand Up?” Civicist, 1 Feb. 2017; N. Confessore and D. Hakim, “Data Firm Says ‘Secret Sauce’ Aided Trump; Many Scoff,” New York Times, 6 Mar. 2017; M. Schwartz, “Facebook Failed to Protect 30 Million Users from Having Their Data Harvested by Trump Campaign Affiliate,” The Intercept, 30 Mar. 2017.
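As a deliberately simplified illustration of this kind of psychographic scoring (not a reconstruction of Cambridge Analytica’s actual model), the sketch below maps a handful of hypothetical per-voter data points onto crude OCEAN trait estimates and then selects a segment for a tailored message. All fields, thresholds, and records are invented for illustration.

```python
# A deliberately simplified illustration of psychographic scoring:
# hypothetical per-voter data points are mapped onto crude OCEAN trait
# estimates, and a segment is then selected for tailored messaging.
# All fields, thresholds, and records are invented for illustration only.
from dataclasses import dataclass

@dataclass
class VoterRecord:
    voter_id: str
    culture_page_likes: int    # hypothetical count of "liked" culture pages
    planning_keywords: int     # hypothetical count of planning-related posts
    group_memberships: int     # hypothetical count of online group memberships
    supportive_reactions: int  # hypothetical count of agreeable reactions
    anxiety_keywords: int      # hypothetical count of anxiety-related posts

def ocean_scores(v: VoterRecord) -> dict:
    """Map raw counts onto crude 0-1 trait estimates (illustrative only)."""
    scale = lambda value, cap: min(value, cap) / cap
    return {
        "openness":          scale(v.culture_page_likes, 50),
        "conscientiousness": scale(v.planning_keywords, 25),
        "extroversion":      scale(v.group_memberships, 30),
        "agreeableness":     scale(v.supportive_reactions, 40),
        "neuroticism":       scale(v.anxiety_keywords, 20),
    }

voters = [
    VoterRecord("v001", 42, 5, 6, 35, 18),
    VoterRecord("v002", 3, 20, 25, 10, 2),
    VoterRecord("v003", 10, 8, 9, 30, 15),
]

# Select voters scoring high on neuroticism for a fear-framed message
targets = [v.voter_id for v in voters if ocean_scores(v)["neuroticism"] > 0.6]
print("fear-framed message targets:", targets)  # ['v001', 'v003']
```

The significant step is not the arithmetic but the decision to address different voters with different emotional framings based on inferred personality traits, without their knowledge.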

The companies that run campaigns on social networks stand at the intersection of machine-learning algorithms and advertising technologies. Hootsuite, Sprinklr, HubSpot, Sprout Social, and others offer diverse packages of messages for targeted sectors, based on standard posts and paid content. The software they sell draws on the analysis of behavioral data and on the monitoring and tracking of activity on the networks, with the goal of delivering to the campaign’s targets messages based on successful segmentation and targeting, timed appropriately to enhance their persuasive power.
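The “appropriate timing” point can also be illustrated with a minimal sketch. Assuming a hypothetical log of past engagement per audience segment and hour (all names and figures below are invented), a campaign tool simply schedules its messages for the hour with the highest average engagement in each segment.

```python
# A small sketch of timing optimization: for each audience segment,
# choose the posting hour with the highest average historical engagement.
# The engagement log below is invented for illustration.
from collections import defaultdict

# (segment, hour_of_day, engagement_count) -- hypothetical log entries
engagement_log = [
    ("young_renters", 8, 120), ("young_renters", 21, 340), ("young_renters", 21, 310),
    ("retirees",      8, 280), ("retirees",      21,  90), ("retirees",      10, 300),
]

totals = defaultdict(lambda: defaultdict(list))
for segment, hour, count in engagement_log:
    totals[segment][hour].append(count)

# Pick, per segment, the hour with the highest average engagement
best_hour = {
    segment: max(hours, key=lambda h: sum(hours[h]) / len(hours[h]))
    for segment, hours in totals.items()
}
print(best_hour)  # {'young_renters': 21, 'retirees': 10}
```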

As stated, the danger inherent in the misuse of these capabilities is augmented when they are employed by those of means, by controlling interests, and by those with access to databases about the virtual space of local residents, especially because there is no regulation or supervision. 

The Prohibition on the Use of Public Resources

There are few or no regulations that apply to local election advertising in the digital space. Here we will focus on one principle that does, however, apply to the digital world: the prohibition on the use of public resources. Section 2A of the Elections (Modes of Propaganda) Law 5719-1959 (hereafter “Election Propaganda Law”) bars the use of public resources for election propaganda; this includes the resources of a local authority:

No use will be made, in connection with election propaganda, of the funds or of the tangible or intangible assets of a supervised body […] or of a corporation in whose management or equity the Government or a local authority has a share.

This provision always applies, not only during election periods. Its goal is to prevent the use of public funds to serve the interests of candidates and to preserve, to the extent possible, an equal opportunity between incumbents and challengers. Incumbents enjoy a built-in advantage by virtue of their position; the ban on the use of public resources is intended to prevent them from wielding an even more unfair advantage.See, e.g., KEF (Knesset Elections File) 2/21, Rami Cohen v. Miri Regev, the Minister of Culture and Sports, 18 Apr. 2018. 

The question, of course, is what constitutes the use of public resources in this context. Is it a use of public resources when a municipality publishes a review of its successes during the past year? Is a concert or show organized by the municipality, with free admission for residents, a use of public resources?

The directives issued by the Attorney General and the rulings of the chairs of the Central Elections Committee and the District Elections Committees clarify what constitutes election propaganda that makes use of a municipality’s public resources. The criterion is the test of dominance; that is, what is the dominant effect of a municipal publication as perceived by reasonable voters: is it election propaganda, or is it simply conveying information to the public? With this in mind, the Supreme Court, the chairs of the elections committees (central and district), and the Attorney General have defined auxiliary tests, such as the date of the publication and its proximity to Election Day, who sponsored the advertisement, whether it is a regular advertisement or a special publication in advance of the election, common sense, and so on.See, e.g., KEF 2/20, Ometz Movement v. the Consumer Protection and Fair Trade Authority et al., 18 Jan. 2015; KEF 2/21, Cohen v. Regev (supra, n. 17); LEF (Local Elections File) 14/20, Members of the For the Residents Faction on the Raanana City Council v. Nahum Hofri, Mayor of Raanana, 16 June 2013; LEF 16/21, Gabriel Gaon v. the Mazkeret Batya Local Council, 10 July 2017; LEF 21/21, Shlomo Zino v. the Mayor of Nesher, Avraham Mahlouf Binamo, 13 Dec. 2017; LEF 24/21, Shimon Shmueli v. Yosi Bachar, Mayor of Bat Yam, 11 Feb. 2018.

The ways in which elected officials attempt to employ public resources in their campaign advertising, in violation of this prohibition, have changed over time and in keeping with the advances in technology. The ban was enacted in 1961, following a series of complaints about the 1959 elections, which alleged the use of public resources for election rallies (such as election propaganda during a review of the Home Guard) and activity related to election advertising by employees of municipalities and government corporations (such as the clerks of the Citrus Marketing Board).See: Complaint submitted by MK Y. Sapir to Justice Y. Sussman (21 October 1959), Israel State Archive (ISA) 5706/3-g; complaint from Dr. Robinson to Justice Minister Pinchas Rosen about the activity by clerks of the Citrus Marketing Board in favor of Mapai (21 October 1959), ISA 5706/3-g; Justice Minister Pinchas Rosen to the Government Secretary concerning the complaints about political parties’ propaganda methods (18 October 1959), ISA 5706/3-g; the Justice Minister to A. Yadin et al. about the appointment of an intraministry committee to study the election laws (7 February 1960), ISA 5706/3-g; the Elections (Modes of Propaganda) Law (Amendment) 5721-1961 bill, Knesset Bill 466, 227 (3 May 1961). In recent years the complaints about breaches of this prohibition have dealt with other forms of activity, at least when it comes to local elections. For example, there have been many complaints about advertisements or events sponsored by municipalities that were alleged to be thinly disguised election advertising on behalf of the incumbent mayor.See, e.g., LEF 1/21, Soshi Kahlon Kidor, chair of the There’s a New Day faction et al. v. Efraim (Efi) Deri, mayor of Kfar Yona et al., 30 July 2015; LEF 26/21, Shmueli v. Bachar (supra, n. 18); LEF 27/21, Mordechai Ben-David v. Shlomo Buhbut, mayor of Ma’alot-Tarshiha, 29 Mar. 2018. Recently the chair of a Local Elections Committee, Judge Oded Mudrick, ruled that an advertising package in a local newspaper that included explicit advertisements and articles with marketing content, purchased by an NPO funded by City Hall, together with the follow-up interviews involving the mayor’s assistant, constituted improper election advertising. Even unpaid interviews in local papers, he ruled, are an inappropriate use of the authority’s resources, because the municipality purchases advertising space in that newspaper.Oren Persico, “Astonishingly,” The Seventh Eye, 12 June 2018.

There have also been complaints about material published on municipal websites and Facebook pages.See, e.g., LEF 7/21, Gabriel Gaon v. the Mazkeret Batya Local Council, 11 Feb. 2016; LEF 12/21, Alon Geyer, head of the Gederatyim Faction on the Gedera City Council, v. Yoel Gamliel, head of the Gedera council et al., 19 Dec. 2016. The chairs of the Central Elections Committee and of the district elections committees have ruled that these digital channels are public resources and must not be employed for election propaganda.LEF 16/21, Gaon v. Mazkeret Batya Local Council (supra, n. 18). Section 2A of the Election Propaganda Law is worded in a way that covers all the media; that is, its total ban on the use of public resources for election propaganda does not depend on the medium over which the propaganda is disseminated.Tomer Tarbes, “Election Propaganda on the Internet in Conditions of Changing Technology,” LL.M. thesis, Tel Aviv University, 2003, p. 61 (Hebrew). This wording makes it easier to adapt the rulings and directives to new forms of propaganda—ranging from Home Guard reviews in 1959 to Facebook pages in 2018. However, technological progress requires constant vigilance on the part of the regulators, chiefly the chairs of the elections committees and the Attorney General. The Attorney General’s directives on this matter, based on the rulings of the elections committee chairs, provide local authorities with the clearest guidance as to what uses of digital means are forbidden; the Interior Ministry distributes them to mayors in advance of local elections.“Prohibition of Election Propaganda Paid for with the Funds of a Supervised Body: Publications Distributed by Government Ministries,” Attorney General’s Directive 1.1900, December 2014; “Attorney General’s Directives in advance of the Local Authority Elections,” Interior Ministry Director General’s Bulletin 5/2018, 17 Apr. 2018.

The Use of Mayors’ Personal Facebook Pages

One of the most recent clarifications in this matter—which began with a ruling by elections committee chairs and was extended in a directive issued by the Attorney General—relates to the personal Facebook pages of incumbent mayors. After it had been made clear that the municipality’s website and Facebook page are public assets that may not be used for election advertising,LEF 16/21, Gaon v. Mazkeret Batya Local Council (supra, n. 18); LEF 12/21, Geyer v. Gamliel (supra, n. 22). the question arose about the mayor’s own Facebook page. On the surface this would not seem to be a public resource. However, in the wake of several appeals, the chairs of district elections committees ruled that this too may be illegal.

An example is when content produced by the local authority is shared on the mayor’s own Facebook page. District Judge Oded Mudrick, the chair of an elections committee, rejected a petition on this issue, but stated as follows: “Because sharing is defined as acceptable, it can be used as the basis for evasion and the use of municipal resources. […] Suppose that the mayor wishes to publicize his extensive activity on one of his personal Facebook pages and needs original (authentic) materials […]. He can, directly or indirectly, or even silently, see to it that this material is generated by municipal officials and published on the local authority’s page. Those personal elements will then be linked to personal pages by sharing that is totally permissible.”LEF 29/21, Liora Pur v. Motti Sasson, mayor of Holon, p. 4, 4 May 2018.

This is a problematic statement, because it fails to take into account the symbiotic value of a public figure’s personal page (similar, for example, to the individual account of a journalist who works for the establishment media): the public figure’s personal account also serves the public body—the local authority—because it is a venue for public involvement and contact with the mayor. On the other hand, the public figure enjoys popularity on her private account thanks to her public position. The fear cited by Judge Mudrick is indeed justified, but a ban on sharing would mean that only the mayor loses out, while every other citizen or owner of a social media account can share the content in question.

Another interesting example of the forbidden use of an individual Facebook page is a maneuver that transferred the followers of a municipal Facebook page to the mayor’s personal Facebook page. According to the ruling in that case, “by virtue of the fact that the respondent employed the pool of subscribers (those who “liked”) to the local authority’s Facebook page, every one of his posts will appear by default on the homepage of the 6,000 users who liked another page. […] A pool of 6,000 users is incontestably a public resource according to the provisions of Section 2A.”LEF 22/20, Dr. Yoav Rosen, head of the New Direction faction in Hod Hasharon v. Mr. Hai Adiv, mayor of Hod Hasharon, 1 July 2013. Relying on these decisions, the Attorney General issued the following directive (which remains in force today): 

“To eliminate all doubt, we would like to make it clear that internet sites, social media pages, and government or municipal apps, including those that allow residents of the state to receive information about what the bodies enumerated in this section are doing, constitute a ‘public resource’ and their use for election advertising contravenes the provisions of the law. The same applies to the subscription lists of a site, pages, and these apps.”Attorney General’s Directive 1.1900 (supra, n. 25).  

Election Advertising Making Use of Sensitive Personal Data

Is the Attorney General’s directive sufficiently detailed? Technology and municipal practices have continued to advance. Although this directive was updated as recently as late 2014, a further update has become essential. The subscriber lists of sites, pages, and municipal apps are indeed an important resource, but they are certainly not the only important information resource.

Here are two additional such resources:

1. A panel, dashboard, or tracking data covering activity on social networks related to the local authority. Many Israeli municipalities employ the services of web-data pooling and tracking companies in order to learn about the public mood, issues of concern to residents, demographic segmentation, and so on. Sometimes this information is anonymized by the service provider; in other cases the source of the data can be identified (that is, the user’s Facebook profile, which frequently includes a name, address, and photograph).

2. Databases maintained by the municipality as part of its ongoing activities: These databases include vast quantities of identifying personal information about residents—ranging from statistical and demographic data through their tax liabilities and discounts, children in regular and special-education schools run by the authority, citizens who are welfare clients, and so on. What is more, the advent of a world full of sensors and what are known as “smart cities” has significantly increased the volume of information held by the local authority; this includes, among other things, travel patterns (collected by means of traffic cameras, including license plate numbers), behavior in public (captured by the networks of security cameras), and data from sensors attached to garbage bins and streetlights.

These techniques for pooling and segmenting information for use in political campaigns require more and more layers of information on the targets of political messages. Hence, there is growing concern that incumbent mayors will combine the information in the local authority’s possession with other data and run tailor-made campaigns that target those whom they wish to reach and persuade.

The Fear: Use of Information Held by the Municipality to Help Mayors Be Reelected

As follows from the above, the fear is that information-management companies and the managers of digital political campaigns will attempt to persuade mayors to make use of the information held by the municipality in order to increase their prospects of reelection. Some examples:

• Retargeting of identified profiles that social network tracking has tagged as opposed to the mayor and critical of him. The retargeting of messages can also include the use of information about a critical resident, for example the fact that he is in arrears on his property tax payments or that his children are in special-education frameworks.

• Creating a precise profile of the target audience by adding layers of data to those already available on information platforms or from information brokers, including data in the municipality’s possession, such as family information (number of children, ages), habits (travel patterns, garbage removal, entering and leaving the street), economic situation (entitled to discounts or not), and so on. A minimal sketch of how such data can be pooled appears after this list.
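The following minimal sketch makes the concern behind these two examples concrete. It assumes two hypothetical tables: a municipal database extract and the output of a social-network tracking service whose profiles have already been matched to residents. A single join is enough to turn a public information resource into an identified, enriched retargeting list; all names, fields, and values are invented.

```python
# A minimal sketch of the concern described above: joining a hypothetical
# municipal record set with an equally hypothetical social-media tracking
# export yields an enriched, identifiable retargeting list.
import pandas as pd

# Hypothetical extract from a municipal database (a public resource)
municipal = pd.DataFrame({
    "resident_id":      ["r1", "r2", "r3"],
    "tax_arrears":      [True, False, True],
    "special_ed_child": [False, True, False],
})

# Hypothetical output of a social-network tracking service,
# with profiles already matched to residents
tracking = pd.DataFrame({
    "resident_id":       ["r1", "r2", "r3"],
    "profile_url":       ["fb.example/r1", "fb.example/r2", "fb.example/r3"],
    "critical_of_mayor": [True, True, False],
})

# One join produces a tailor-made retargeting list of identified critics,
# enriched with sensitive data the municipality holds for other purposes
enriched = municipal.merge(tracking, on="resident_id")
critics = enriched[enriched["critical_of_mayor"]]
print(critics[["resident_id", "profile_url", "tax_arrears"]])
```

Nothing in this sketch requires special technology; the barrier to combining these resources is organizational and legal, not technical.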

In addition to the concerns about the misuse of private data, it is important to add that such data is a resource with great economic value for producing persuasive campaigns, and its use therefore falls into the category of misuse of municipal resources. Given the continuing exploitation of municipal resources for election campaigns, and given the improper uses we already see in digital political campaigns in other contexts, there is no reason to assume that such attempts will not be made here as well in the Information Age.

Summary and Recommendations

Only in recent years has it become clear just how much the digital space influences—and will continue to influence—election advertising. Here we have presented only one example of the problems related to the enforcement of election advertising laws for local elections in the digital space. In our view, the ban on the use of public resources, as it applies to information held by the local authority—whether acquired through its normal activities or by tracking the public mood on social networks—must be updated and made much broader than what is currently covered by the Attorney General’s directive.

Updating the Attorney General’s Directive 

As stated above, the Attorney General’s directive—last updated in late 2014—makes it clear that “internet sites, social media pages, and government or municipal apps […] constitute a ‘public resource’ and their use for election advertising contravenes the provisions of the law. The same applies to the subscription list of a site, pages, and these apps.”Attorney General’s Directive 1.1900 (supra, n. 25). This directive is incapable of coping with the continuing sophistication of data analytics and the capacity to create “autonomy traps” (influencing an election and individuals’ preferences without their being aware of the manipulation) and manipulative persuasion. Hence it needs to be adapted and updated accordingly.

It is important to remember that it is essential to modify current regulations to suit the digital age and its unlimited capacity for mining, pooling, and using data about individuals, not only in order to protect citizens’ right to privacy and autonomy, but also to safeguard the very possibility of conducting a democratic process based on free choice.

Information is collected, analyzed, and processed in order to segment the population, create behavioral profiles, and produce targeted advertising based on a precise segmentation of individuals. In the predigital age, advertising to and persuading specific groups was an imprecise science. The situation is different today. Information about individuals is a political asset, and its commercialization has become one of the most important parts of managing political campaigns and influencing public opinion. Information about individuals is a valuable asset of political intelligence, because it makes it possible to understand inclinations, groups, modes of influence, preferences, and the campaign’s effectiveness.“The Influence Industry: The Global Business of Using Your Data in Elections,” https://ourdataourselves.tacticaltech.org/posts/influence-industry/

This is why any discussion of the use of municipal resources during election campaigns cannot focus exclusively on financial resources, but must also deal with the possibility that candidates will make use of databases and personal information, anonymized or identifiable. The Attorney General’s directive needs to be updated in the following ways:

1. It should prohibit—in addition to the existing ban in the Protection of Privacy Law—the use of any database that contains personal information in the possession of a municipality or that was collected by it in the course of its performance of its duties or by someone acting on its behalf, because such information can help generate a behavioral profile for voters or expand such profiles; hence this is a use of municipal resources for purposes of election advertising. It may be necessary to totally ban any contract to obtain services that makes use, directly or via a third party, of sensitive personal information in order to design or disseminate election-related messages, because of the concern that such companies will pressure the municipality to make use of the personal databases in their possession.

2. It should bar contracts with companies that monitor activity on the social networks in a way that provides identifying information about residents. In addition to the infringement of privacy, the use of municipal resources for such contracts indicates that the main intention is later use of this information for retargeting as part of election advertising; consequently it is a forbidden use of public resources.

Applying a General Requirement for Transparency in the Election Propaganda Law

The no-man’s land of the use of municipal digital resources for election advertising exemplifies the difficulty that current regulations have in keeping pace with developments in the digital world. Another example of the failure of the directives and of current law to meet the challenge of advertising in the digital world is the problem of transparency.

The requirement that election advertising be transparent—meaning that every political advertisement must clearly state who is behind its publication—applies to the print media. Is it possible to apply this requirement to election advertising in the digital domain? About a year ago, around the time of the special election for the mayor of Ramle, negative advertising was circulated anonymously in a local newspaper and on Facebook. Judge Avraham Tal, the chairman of the District Elections Committee, issued an order barring the local paper from violating the principle of transparency, and also instructed the candidates not to publish “any propaganda advertisement, in any medium, that is not identified,” even though there is no statutory provision explicitly authorizing him to do so;LEF 17/21, Motti Yitzhaki v. Ramlod Plus newspaper et al., 7 July 2017. he relied instead on similar rulings by the chair of the Central Elections Committee.See KEF 16/19, the Jewish Home List v. the Likud-Beitenu List et al., 3 Jan. 2013; KEF 25/19, the Likud National Liberal Movement v. the Likudnik website, 13 Jan. 2013; KEF 48/19, Gal Adler v. nrg Maariv, 21 Jan. 2013.

Was Judge Tal’s action justified, given the lack of explicit authority in the law?Yarom Sagan and Yosef Ben-David, “Is it Possible and Appropriate to Apply the Obligation of Publishing an ‘Information Tag’ as per the Elections (Modes of Propaganda) Law on the Internet as Well?” Ha’arat Din 8 (2003): 117 (Hebrew). How could such authority be compatible with a Supreme Court ruling handed down several months later, which interpreted the limits of the authority of the chairman of the elections committee narrowly and recognized his authority to issue restraining orders only in order to prevent the infractions enumerated in the Election Propaganda Law or in other laws mentioned explicitly there (Section 17B(a))?High Court of Justice, Rehearing 1525/15, MK Dr. Ahmad Tibi v. the Yisrael Beitenu Party (published by Nevo, 23 Aug. 2017). Even if we are satisfied that he does have such powers, is there any point in instructing Facebook Israel—as Judge Tal did—to contact the appropriate office at Facebook Ireland and make it aware, “to the extent of its ability,” of the Israeli legislation on election propaganda, given that Facebook in Ireland, and not Facebook in Israel, is responsible for these publications?

In this case, in order to clarify the legal situation, it is imperative to pass new legislation that defines a general obligation of transparency applying to all media, as we have already proposed in the past and in line with the recommendations issued a few months ago by the Public Committee for the Examination of the Election Propaganda Law, headed by retired Chief Justice Dorit Beinish (the “Beinish Committee”).The Public Committee to Study the Elections (Modes of Propaganda) Law 5719-1959, headed by retired Chief Justice Dorit Beinish, Report, 21 Nov. 2017; cf. Lurie and Altshuler, “A Reform of Election Propaganda Laws,” 64–65.

Enhancing the Digital Literacy of Enforcement Agencies and the Public

The public interest in the diverse uses of big data is growing precisely because information about data-based practices usually remains in the shadows and below the radar. Hence an external evaluator—a concerned citizen, an investigator, or a regulator employed by the Elections Committee—is apt to be hard pressed to track, monitor, and evaluate the use of these tools. The contradictory testimony given by those who commission the service and by the firms they employ about the character of the activity, the difficulty of monitoring and creating transparency in data-based advertising, and the fact that enforcement agencies lack sufficient digital literacy in these rapidly advancing matters have together created a real threat.

This is what underlies our recommendation that skilled personnel—staff of the elections committees and of the offices of the State Comptroller and the Attorney General—be trained to enable them to identify, in real time, improper uses of information belonging to municipalities or other harmful practices; understand allegations or complaints submitted to the elections committees on these matters; advise the chairs of the elections committees and provide them with relevant information; and create and update appropriate directives.

In this context it is also important to increase public literacy, so that users of social networks who are also local residents will improve their digital resilience in the face of attempts at manipulation and, by developing awareness of the phenomena they encounter, will be better equipped to expose illegal practices.