Professors Elizabeth Dubois and Michael Pal appeared before the House of Commons

On October 2nd, 2018, Professors Elizabeth Dubois and Michael Pal, members of the Centre for Law, Technology and Society, appeared before the House of Commons’ Standing Committee on Access to Information, Privacy and Ethics (ETHI).

The Committee is currently studying the breach of personal information involving Cambridge Analytica and Facebook.

Professor Dubois is an Assistant Professor in the Department of Communication at the University of Ottawa's Faculty of Arts.

Professor Pal is an Associate Professor in the Common Law Section at the University of Ottawa's Faculty of Law. He is also the Director of the Public Law Group.

  • See below for Prof. Dubois's speaking notes.
  • The audio recording of the hearing is available here.

Speaking Notes (Prof. Dubois)

Hello and thank you for inviting me to speak today. I am an Assistant Professor at the University of Ottawa and completed my doctoral studies at the University of Oxford. My research focuses on political communication in a digital media environment. I have examined issues such as the use of AI and political bots, echo chambers, and citizens’ perceptions of how third parties such as journalists, government, and political parties use their social media data. I conduct research both in Canada and internationally.

Today I want to talk about four things:

  1. Analogue versus digital voter targeting
  2. Changing definitions of political advertisements
  3. Self-regulation of platforms
  4. Artificial intelligence

I will use the term “platform” throughout my testimony today. When I do, I am referring to technology platform companies, including social media, search engines, and others.

Voter targeting is not a new phenomenon but it is evolving at a spectacular rate.

It is typical, and in fact considered quite useful, for a political party to collect information by going door to door in a community and asking people whether they plan to vote and for whom. In some cases, the issues they care about are also recorded. This helps political parties learn where to dedicate their limited resources. It also offers voters an opportunity to connect with their political system in a tangible way. However, even with this analogue approach there are concerns, because voter disengagement and discrimination can be exacerbated: for example, certain groups may be identified as unlikely voters and then essentially ignored for the remainder of a campaign.

Digital data collection can amplify issues present in analogue settings and pose new challenges. I see four key differences in the evolving digital context as opposed to a primarily analogue context:

  1. There are meaningful differences between digital and analogue data. The speed and scope of data collection are immense. While data collection used to require substantial human resources to knock on doors or make phone calls, huge swaths of data can now be collected automatically through sophisticated tools. Collected data can also be joined more easily with other datasets, such as credit history. Similarly, voter data can now be shared and transported much more easily. It can also be searched more easily, and predictive analytics can be employed to a greater degree because there are more data points and more tools for analyzing large amounts of data.
  2. Citizens may no longer be aware when their data is being collected and used, unlike when they had to answer their door and respond to questions. They are also unlikely to know what is even possible. This means they are not in a position to demand protection, so it is up to government to ensure its citizens are protected in this complex technical context. Notably, in a study of Canadian Internet users, my colleagues at Ryerson University and I found that most Canadians are uncomfortable with political uses of even publicly available social media data.
  3. The uses of data are evolving. Since online advertisements can now target niche audiences, personal data has more uses. At the same time, these uses are less transparent to regulators and less clear to citizens. This means that emerging uses could be breaking existing laws, but they are so hard to trace that we do not know. We need increased transparency and accountability in order to respond adequately.
  4. Political entities are incentivized to collect data continually, not just during election times, which means election laws may be insufficient. Notably, this applies not only to political parties; third parties such as non-profits, unions, and others also collect data in this way.

These changes are particularly concerning because political uses of these data are not typically covered under our privacy laws.

These data uses will continue to evolve, and it is important that we do not outlaw all uses of digital data, because there are positive outcomes, such as increasing voter turnout. We need to:

  1. Include political parties’ use of data under PIPEDA
  2. Create provisions to ensure transparency and accountability of political uses of data
  3. Ensure citizens are digitally literate, which includes better informed consent statements, among other things

Stemming from emerging political uses of digital data are a range of other potential concerns.

First, political advertising is not as clear-cut as it once was. In addition to paying placement costs for what platforms call “advertisements,” political entities may place and disseminate paid content via other means, such as sponsored stories, brand ambassadors, “rented” social media accounts, troll farms, or bots and other types of automated accounts that can game algorithms in order to modify what content shows up first in a user’s feed, recommendations, or search results.

In response, we need to redefine what constitutes a political advertisement in order to continue enforcing existing laws with their intended outcomes. This is particularly important when we consider the worldwide rise in the use of instant messaging applications and the emerging uses of those tools by political entities.

Second, self-regulation is insufficient. While the steps some platforms are taking are encouraging, there are still huge gaps, and the fact that these steps have largely been reactive should concern us. Platforms should be held responsible for what they allow on their sites, but at the same time platforms must ensure their processes for deciding what is and is not allowed are transparent. We need a mechanism for accountability to ensure that privacy is protected and that other laws, such as those governing hate speech, are respected, while free speech is maintained and certain groups are not unfairly silenced. This is an extremely difficult task, which is why it needs to be open and accountable.

Finally, the use of artificial intelligence is already complicating matters. The typical narrative at the moment is that when learning algorithms are used, it is impossible to open the black box and unpack what happened and why. While this may be true from a narrow technical perspective, there are in fact steps we can take to make the use of AI transparent and accountable: for example, clear testing processes in which data is open for government and/or academics to double-check procedures; regular audits of algorithms, much as financial audits are required; and documented histories of algorithm development, including information about how and why decisions were made by team members. Furthermore, clear labelling of automated accounts on social media and instant messaging applications, registration of automated digital approaches to voter contact similar to the Voter Contact Registry, and widespread digital literacy programs, likely involving journalists, can help identify positive uses of these technologies.

Ultimately, I see value in the political uses of digital data, but those uses must be transparent and accountable in order to protect the privacy of Canadians and the integrity of Canadian democracy. This requires privacy laws to be updated and applied to political parties, the Privacy Commissioner to have increased power to enforce regulation, platforms to be held responsible for the content they choose to allow, and citizens to be given the information they need to be digitally literate.