By Linda Apollo

The Data Protection Act needs to be fully operationalized as Kenya heads into the 2022 election cycle, and a sensitization exercise undertaken concerning the use of personal data in campaigns.

Considering the various threats posed to electoral integrity by digital platforms, it is imperative to discuss the use and regulation of personal data. The link between access to personal data on the one hand and the commission of electoral fraud or voter manipulation on the other has been examined repeatedly in the news media.

The pertinence of this discussion in Kenya is clear, considering that two major developments have occurred since the last election cycle: Parliament enacted the Data Protection Act (DPA) and approved the appointment of a Data Commissioner.

Are these changes likely to result in a positive material change in the conduct of campaigns, and if not, what can be done to ensure that they do? We focus on the use and regulation of personal data in the context of political messaging and campaigning.

Political messaging is central to electoral integrity. How political actors craft and disseminate their messages can either promote or undermine democracy. The aim of political messaging is often persuasion.

Through their messages, political actors hope to convince voters to support their policy positions or candidature. In the not-so-distant past, political messaging in Kenya, and generally around the world, was disseminated through traditional mass media. Radio, newspapers, and television served as the primary means through which political actors could reach their audiences.

The nature of these means of communication, and the context surrounding their use, often meant that political messaging was easily discernible from regular content.

In other words, audiences could easily tell when they were looking at a political advertisement due to the overt nature of the means and the message. Further, since these were mass forms of communication, there existed little opportunity for targeted messaging: differentiating the messages disseminated based on the receiving audience and thereby disguising the political aims sought through them.

This meant that the electorate often had a shared experience of elections because they were subjected to uniform persuasion tactics by political actors.

Nonetheless, even when using one-to-many forms of communication, there were attempts to use targeted messaging. During the 2007/8 elections, for example, some local language radio stations were used to fan the flames of ethnic violence by exploiting the homogeneity of their respective listeners to disseminate messages of hate.

In another example, bulk text messages targeted at specific communities were used to divide Kenyans along tribal lines to the extent that the then Safaricom CEO, Michael Joseph, considered blocking text messaging services.

The premise of targeting is simple. With basic demographic information, a person crafting a message can do so in a manner that appeals to specific subsets of the target population with a view to persuading the recipients.

The demographic information required for targeting is often clearly observable and easily obtainable: names, ethnicity, age, occupation, and so on. Through targeting, the messages disseminated to members of one demographic may vary considerably from the messages sent to the rest.

Targeting has been shown to be practically effective, and in some cases beneficial. In Wajir, community radio has been used to educate the local community on the effects of climate change as it relates to them.

The fact that the information has been presented in the community’s language, Somali, coupled with the relation of the messaging to their lived experiences, has led to robust community engagement on the topic.

In political contexts, targeted messaging may be used to raise awareness around key policy or legislative decisions to ensure affected individuals are involved in the decision-making process. However, it may equally be used to achieve undesirable outcomes as we noted in relation to the bulk text messages used in the 2007/8 elections.

Targeting and microtargeting

One election cycle later, political parties involved in the 2013 elections had significantly increased their reliance on digital campaigning and engaged in more detailed targeting.

With an increased rate of internet connectivity and smartphone penetration in the country, political actors were better able to reach audiences at an individual level. For example, messaging targeting younger audiences appealed to their concerns about unemployment, while older audiences were informed of candidates’ plans for national stability.

This was perhaps aided by the fact that a lot more demographic information was readily available on social media, and there existed no legislation regulating the collection and use of such personal data.

However, the use of this ordinary targeting did not reflect the state of technology at the time. With the introduction of social media and the large-scale collection of personal data that takes place on such platforms, the nuance that could be applied to targeting had developed considerably by the 2013 election cycle.

The sheer amount and scope of personal data available to political actors through these platforms meant that the precision of targeting could be refined to an unprecedented degree.

Essentially, there was a shift from targeting to microtargeting, with the major difference being the amount and scope of personal data used.

While targeting involves using basic demographic data to craft messages for subsets of the target audience, microtargeting makes use of a wider range of data points such as online habits gleaned from trackers on social media platforms.

With a broad enough range of data points, individuals conducting microtargeting can create a profile of each audience member and tailor individual messages that are a lot more subtle and convincing than those produced by ordinary targeting.

If a political actor were deploying ordinary targeting, their message would focus on the homogeneity of the receiving audience, since the factors that would persuade its members are assumed to lie in that homogeneity.

In microtargeting, the audience, despite being homogeneous, would be broken down further at a granular level, bringing out each individual’s unique profile and the motivations behind their political positions.

The messaging targeted at such individuals is often presented in a seemingly organic manner. For example, by tracking an individual’s social media use, either directly or through analytics firms, political actors can create a profile of the said individual and use it to inform the type of online advertisements they purchase and place, seemingly organically, in the individual’s social media feed.

In essence, microtargeting campaigns home in on the specific trigger points of an individual or of small blocs of voters, seeking to influence their behaviour during campaigns and on voting day in subtle ways.
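To make the contrast concrete, the short sketch below, written in Python with entirely hypothetical voter attributes and invented messages, illustrates the difference: ordinary targeting keys one message to a broad demographic trait, while microtargeting assembles a message from many individual-level data points. It is a toy illustration of the idea described above, not a depiction of any actual campaign tool.

```python
# Toy illustration only: hypothetical attributes, rules, and messages.

# Ordinary targeting: one message per broad demographic segment.
def targeted_message(voter):
    if voter["age"] < 30:
        return "Our plan will create jobs for young people."
    return "Our plan guarantees stability and security."

# Microtargeting: a message assembled from many individual-level data points,
# such as concerns inferred from a person's online activity.
def microtargeted_message(voter):
    concerns = voter.get("inferred_concerns", [])
    lines = []
    if "unemployment" in concerns:
        lines.append("We will bring jobs to your county.")
    if "land_disputes" in concerns:
        lines.append("We will protect your family's land.")
    if "school_fees" in concerns:
        lines.append("We will cap school fees.")
    # Fall back to the broad demographic message if no individual signals exist.
    return " ".join(lines) or targeted_message(voter)

if __name__ == "__main__":
    voter = {
        "age": 26,
        "county": "Nakuru",
        "inferred_concerns": ["unemployment", "school_fees"],  # e.g. gleaned from trackers
    }
    print("Targeting:      ", targeted_message(voter))
    print("Microtargeting: ", microtargeted_message(voter))
```

Even in this toy form, the asymmetry is visible: the more data points available about an individual, the more closely the message can be fitted to that individual’s inferred concerns, and the harder it becomes for the recipient to recognise it as political advertising.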

There is not enough publicly available evidence to assess the extent to which political actors in Kenya engaged in microtargeting during the 2013 and 2017 election cycles, beyond perhaps the documented use of social media advertising.

However, in both cycles, it is widely reported that Cambridge Analytica rendered its services to various political actors in the country.

Cambridge Analytica’s involvement in Kenya, which it described as “the largest political research project ever conducted in East Africa,” entailed a large-scale gathering of Kenyans’ data through participant surveys.

This, coupled with the personal data it had already improperly acquired through Facebook, ostensibly allowed it to carry out microtargeting. It claimed to be able to craft messages specific to individuals as opposed to broad demographics.

It admitted to developing messaging to leverage voters’ fears of tribal violence. The risk posed to electoral integrity by practices such as microtargeting is clear.

An inability on the electorate’s part to discern organic content from political advertising calls into question their democratic autonomy and the legitimacy of political processes.

The lexicon adopted by some commentators in relation to these practices, “digital gerrymandering” and “computational politics”, is therefore unsurprising.

Personal data use in targeting and microtargeting

The idea that one can sort personal data based on certain traits and analyze it for purposes of targeting is not novel. Neither is the audacity of the attempt.

In Kenya, such data has previously been easy to obtain, with little-to-no controls on its usage. In everyday life, Kenyans encounter dozens of vectors through which their personal data is collected. From mobile money payments to entry logs at government buildings, Kenyans are forced to part with crucial personal data to obtain various services.

The value of this personal data for commercial advertising has been recognized by data brokers who reportedly harvest such data for direct marketing.

Political parties have also collected personal data from such brokers for targeting. For political parties and candidates, the avenues through which they can harvest personal data are not limited to brokers.

However, one of the material differences arising from the involvement of Cambridge Analytica was the vast amount of personal data it collected both directly and indirectly, likely rendering even this regular targeting more potent than usual. It was able to collect such data because of Kenya’s weak regulatory framework.

As Cambridge Analytica’s CEO at the time explained, Kenya’s virtually non-existent privacy laws provided the firm with a conducive environment for its activities.

This is arguably one of the main reasons political actors have been able to get away with the improper harvesting and use of personal data for both targeting and microtargeting in the past. With the enactment of the DPA, it is hoped that this will change.

The DPA brings the practices around personal data collection and use under the supervision of the Data Commissioner, with whom these political actors would be required to register.

It is not yet clear what tangible effects (if any) the DPA has had, or will have, on the practice of targeting and microtargeting other than, perhaps, a broader awareness of privacy rights among individuals.

It is also too soon to measure this because the operationalization of the DPA is, at the time of writing, still ongoing. To be clear, the DPA is fully in force and is binding.

However, key components such as the draft regulations are yet to be put in place; they were only recently developed. Without these, the Data Commissioner would be unable to, among other things, register data controllers and data processors (in our case political parties and candidates) to ensure that their activities are monitored.

The proposed regulations, for example, would require individuals and entities involved in canvassing for political support to mandatorily register under the DPA, enhancing the Data Commissioner’s visibility of such actors, and facilitating enforcement action (if required).

The fact that the DPA is yet to be fully operationalized has not prevented Kenyans from relying on it to hold institutions accountable.

The Data Commissioner commendably provides the public with an opportunity to file a complaint through its website even though the regulations relating to compliance and enforcement are yet to be enacted.

In June of this year, many Kenyans discovered through the Office of the Registrar of Political Parties (ORPP) online portal that they were registered as members of political parties without their knowledge or consent.

After receiving over 200 complaints, the Data Commissioner held a meeting with the ORPP to arrange for the deregistration of those individuals.

Less than a month after the ORPP scandal, the guest list of an upscale hotel in Nairobi was leaked online to reveal that a certain politically connected individual had resided there for a period.

Shortly thereafter, an advocate filed a public interest complaint with the Data Commissioner. In response, the Data Commissioner indicated that it would investigate the possibility of a data breach. The implications of these complaints to the Data Commissioner are twofold.

On one hand, it is a positive development that Kenyans are aware of the office and its mandate. However, on the other, it is concerning that the improper handling of personal data is still common nearly two years after the enactment of the DPA.

Such practices are indicative of either the absence of a sufficient understanding of the DPA and its requirements or a blatant disregard of those requirements, though the two are not mutually exclusive.

Putting in place the systems and infrastructure required to operationalize the DPA is important. However, it may not be very effective if the culture around data use is not reformed.

From the improper handling of personal data, it is apparent that broad sensitization around digital rights is required.

Content regulation

The efforts to improve the culture around personal data use in campaigns could further be supplemented by regulation of the actual political messaging that results from this data use.

Kenya’s legal framework governing political advertising is currently underdeveloped. Aside from the Communication Authority’s (CA) guidelines on bulk messaging, there are no detailed guidelines on how political advertising ought to be carried out and how transparency can be achieved.

The CA’s guidelines effectively aim to increase the transparency of political advertising done through bulk text messages.

This is the aim of the regulation of political advertising: reclaiming the transparency lost over time through advancements in technology.

Considering the subtle nature of messaging derived from microtargeting campaigns, an increase in transparency would likely contribute to restoring (or at least safeguarding) some level of autonomy for the electorate.

The CA guidelines would sufficiently cover the use of ordinary targeting in the form of bulk text messages as we head into the 2022 elections.

Authorities such as the IEBC and the Data Commissioner may be able to work with social media platforms to identify appropriate transparency tools that could be deployed in the forthcoming elections.

Such a collaboration would have to be alive to unique local contexts. Heading into the 2022 election cycle, Kenya ought to do a few things. First, the DPA should be fully operationalized.

Second, the Data Commissioner should collaborate with political actors and the IEBC to engage in widespread sensitization around data protection and the use of personal data in campaigns.

Third, political parties should commit to the proper use of personal data in their campaigns, perhaps even signing public pledges as a show of goodwill.

Fourth, political advertising on social media platforms should be more closely regulated to ensure transparency.

Finally, the Data Commissioner and the IEBC should work with social media platforms to develop appropriate tools that would be applied in Kenya to enhance platform accountability and transparency of messaging.

Image courtesy of Gordon Johnson from Pixabay