Our special coverage of the Bill C-11 hearings continues with the last known segment for now: the second segment of the tenth hearing.
It's been a very long and difficult road to get here, but with this summary and analysis, we are confidently wrapping up the current round of hearings.
A quick overview of this epic coverage can be found below:
Hearing 1 – Privacy Commissioner and Global Affairs/Justice Department
Hearing 2 – Digital rights organizations and lobby groups (1)
Hearing 3 – Lobby groups (2) and platforms (1)
Hearing 4 – Lobby groups (3) and lobby groups (4)
Hearing 5 – Lobby groups (5) and lobby groups (6)
Hearing 6 – Music Canada / platforms (2) and lobby groups (7)
Hearing 7 – Scholars/researchers and digital first creators (1)
Hearing 8 – Digital first creators (2) / lobby groups (8)
Hearing 9 – Digital first creator (3) / lobby group (9) / platform (3) / lobby group (10)
The most recent posting on these hearings revolved around the first segment of the tenth hearing. It featured a record label as well as some lobbyists. Interestingly, the hearing largely revolved around the record label, which called for, among other things, clarification that user generated content is out, clarification that the CRTC cannot meddle with algorithms, and better support for Canadian creators.
So, one more time with feeling, the video we are watching can be found here. Nothing will beat the thoroughness of an official transcript or the video itself, however, we are happy to provide a detailed summary and offer analysis of what we’ve heard. So, with that out of the way, let’s dive into this final (?) hearing.
Opening Remarks
Fenwick McKelvey, an associate professor, opened with his statement. For him, change cannot wait. He notes that creators need rights – issues like these are not addressed, or not clearly addressed, in Bill C-11. He recommends that Bill C-11 be narrowed in scope and that subsequent legislation address digital creators and their rights. He notes that platforms is a broad term. So, he has four recommendations for Bill C-11.
First, he says that discoverability has undermined a broader movement for AI accountability. He recommends that discoverability be defined and its feasibility assessed against the government's commitments under the USMCA. Through this, he recommends a broader AI Accountability Act.
Second, the discoverability question needs to be reframed. He recommends an amendment that recognizes the privacy rights of individuals as well as the cultural significance of information about them. The amendment recognizes the need to harmonize privacy and broadcasting policy.
Third, he says that he is disappointed that the act does not implement recommendations for CRTC reform – especially the establishment of a public interest committee. He worries that the role of the CRTC places too much focus on corporate interests and too little on public interests.
Finally, he says that he continues to not know the scope of the act. The lack of willingness to add clarity to Bill C-11 continues to be the bill's shortcoming. He recommends that Section 4.1 be clarified and that it define how platforms would be subject to regulations.
He believes that the legislation will continue to be stalled without reform to C-11 – something that matters given the importance of the future of cultural policy.
Emily Laidlaw, an associate professor, then opened with her remarks. She focused her remarks on social media because that is her area of expertise. She asked whether social media should be captured by the bill and, if so, whether the provisions are sufficiently scoped to protect freedom of expression. She said that she can help by grounding what freedom of expression means in law and how this could impact the legality of Bill C-11.
She notes that freedom of expression includes the right to seek, receive, and impart information and ideas regardless of frontiers. We protect it because it matters to a democratic society – from the search for truth to figuring out who we are and what makes us tick. It is the building block of a creative, resilient, and informed community. So, in examining freedom of expression in the context of this bill, the question is not only whether restricting freedom of expression is the purpose of the bill, but also whether that is simply its effect.
So, she notes, the indirect knock-on effect of this legislation on what users seek and share online is important to the analysis. If the adverse effects of the provisions on social media users are too great, then the interference with free expression is disproportionate and unconstitutional. The thing is that freedom of expression must be given a large and liberal interpretation and any restrictions narrowly construed.
This, she says, poses a dilemma for Bill C-11. To be nimble, it makes sense to push the details into regulations developed by the CRTC – indeed, in the area of online harms, she advocates leaving such details to the regulator to develop. However, as drafted, this bill leaves too much to be decided later, and user generated content on social media ends up being captured within it – almost the entire social media ecosystem.
So, she notes, the provisions as they stand are vague and overbroad. She understands the desire to target the sliver of commercial content, but the problem is in the drafting because of what it captures. It will fail the test that restrictions on rights be narrowly construed. She lists the issues:
For one, what is social media? The bill does not say. If the goal is to target the biggest players, then be explicit. One way, though imperfect, is to target very large platforms. Does social media include private messaging? Private messaging groups can be enormous. The bill applies to uploads. Does this include links? What if a link posted to Facebook takes a user to a subscription website where a person posts songs or videos? What about tweets that embed a YouTube video?
The other issue, she says, is that the bill captures a program that directly or indirectly generates revenue. This provision in particular fails the test of a narrow restriction on rights. Generates revenue for whom? How indirect? The web of revenue streams on social media is complex. Social media is largely audio and visual content, so a lot of bog-standard content could be captured.
She thinks the goal is to target specific kinds of commercial content. She recommends a narrow provision that errs on the side of being under-inclusive.
The other issue, she comments, is discoverability. The bill imposes the discoverability of Canadian commercial content on social media. If we are looking to narrowly focus on commercial content on very large platforms, there is some scope for this to be reasonable. In terms of proportionality, consider: will it achieve the outcomes that are desired? Does it restrict more speech than necessary? Does it incentivize privacy invasive practices? Does it compel speech by platforms in ways that cannot be justified?
So, she notes, there has been some testimony that it does not impose specific algorithms, only outputs. That, she says, is neither here nor there in terms of the legality of the approach.
So, she says, some of the legal risks highlighted can be cured by the requirement that these provisions be consistent with free expression. This forces a rights analysis, which is good, but here's the problem: what are the metrics for free speech compliant broadcasting regulation applied to social media content? This is new territory in Canada and globally, so telling the Commission (CRTC) to consider free expression doesn't tell the Commission how to do it. She adds that Canadian constitutional law is far too narrow a framework to consider free expression. We are operating in a global ecosystem and it is imperative that the infrastructure of the internet is considered.
So, she comments, for the Commission to properly consider free expression, it would need to consider Canadian free speech jurisprudence, including Charter and private law, but also international human rights, the work of standard setting bodies like the OECD and ITU, and internet regulation broadly. What is the internet infrastructure? How does the information flow?
Ultimately, she says, social media is not a broadcasting program, and that is the problem here. There are legitimate reasons to regulate some aspects of what is posted to social media, but either not under this legislation or done in a different way.
If the Senate does choose to proceed, she says she has two pieces of advice: 1) Narrow the targeting of social media to platforms of a certain size, and narrow the range of content and behaviours being targeted. In short, be under-inclusive in the name of certainty and constitutionality. 2) Broaden the focus to algorithmic accountability, which should guide the thinking of the Commission and how Senators proceed in the Senate.
(This really feels like a massive vindication to me because I've said for a long time that there are constitutional concerns with this bill. There are not a lot of people who agree with me when I say that this bill is very likely unconstitutional. I suspect some might have been skeptical about my position and may have thought my views a little too extreme for questioning the constitutionality of the bill. This opening statement really does a fantastic job of laying out why there are legitimate concerns about the constitutional nature of Bill C-11.)
Blayne Haggart, another associate professor, then opened with his remarks. He focused his comments on the discoverability requirements. He believes that, if passed, these provisions represent a significant advancement in Canada's regulation of global intermediaries, which have emerged as significant and unaccountable cultural regulators in and of themselves.
The goal, he says, is to promote Canadian content and the work of Canadian creators on online intermediaries, or platforms. As such, the rules represent continuity with long established cultural objectives. He says that these rules do not interfere in an otherwise free market of ideas, which is the traditional terrain for cancon debates. Rather, they target platforms that are acting as cultural regulators.
(He was interrupted for speaking too fast for the interpreters.)
He continued that the traditional cancon debate asks whether the government should interfere in the free market of ideas. That, he claims, is not what is going on here. It's targeting platforms that are themselves acting as consequential regulators.
This is where algorithms come in, he says. Algorithms have become magic and scary words that intimidate people, but all an algorithm is is a set of rules that are repeated over and over again. He thinks of it as a form of automated bureaucracy or a form of regulation (that is really stretching the terminology here). YouTube and TikTok, in this case, depend on these rules to determine what to show users. In other words, they are private discoverability rules. These private regulatory rules are not designed to show the most popular content, or the content you, the viewer, are likely most interested in (if that were the case, YouTube and TikTok would fail). These companies do not just decide what is popular, they define what popular means. They already create winners and losers.
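(To make the "set of rules repeated over and over" framing concrete, here is a minimal, purely hypothetical sketch of what a private discoverability rule set could look like. Every signal and weight below is invented for illustration – no platform publishes its actual ranking logic, which is exactly the point being made.)

```python
# A toy "private discoverability rule set" - purely illustrative.
# Real platform rankers are proprietary, far more complex, and often
# machine-learned rather than hand-written rules like these.

def score(video: dict, viewer: dict) -> float:
    """Apply a fixed set of rules to one video for one viewer."""
    s = 2.0 * video["watch_time_hours"]        # engagement signal
    s += 1.0 * video["click_through_rate"]     # popularity signal
    if video["topic"] in viewer["interests"]:  # personalization signal
        s *= 1.5
    if video["advertiser_unfriendly"]:         # platform self-interest:
        s *= 0.1                               # quietly demote "sensitive" content
    return s

def rank(videos: list, viewer: dict, top_n: int = 10) -> list:
    """The same rules, applied to everything, pick winners and losers."""
    return sorted(videos, key=lambda v: score(v, viewer), reverse=True)[:top_n]
```

(Note how a single self-interest rule silently buries whole categories of content – that is the shape of the demonetization concern raised in the next paragraph, even though the real systems are black boxes rather than readable code like this.)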
They also pick winners and losers to protect their own interests – however they decide to define them. This self-interest can actively disadvantage certain groups and creators. For example, YouTube has long been criticized for allegedly demonetizing LGBTQ content. He says allegedly because YouTube doesn't publicize this ranking. It also means that creators and researchers have to reverse engineer the algorithms to understand how they work.
The need for this guesswork makes it difficult for government to craft legislation to deal with these platforms. One of the hopes is that the CRTC can compel data from these platforms so that decisions can be made based on hard evidence rather than anecdotes. It's also important to understand that these companies change the rules all the time in ways that help some creators and hurt others (not avoidable). There is nothing normal about this privatized status quo.
He says that, in short, the questions before Parliament are: what criteria should the government use to ensure content gets promoted to Canadians, and who should be allowed to make these decisions? These are all legitimate reasons for action from the government and the CRTC.
The biggest problem with the discoverability requirements, he says, is that we don't really know if they are working or not (I'm not even sure that cracks the top 10 list, personally). The idea of discoverability is in line with ensuring that cultural policy is supply driven, not just demand and market driven. Targets are ideal to ensure that different cultures are represented, as well as creators that are not in line with the Canadian mainstream. There needs to be a robust discussion on what those targets should be, and he doesn't think that has happened.
He concludes that he doesn't believe Bill C-11 should be seen as the end of the journey, but rather, the beginning step of robust platform regulation. The government should begin to build a system to better understand and regulate these companies. This capacity requires that regulators be given the tools to actually regulate. Mistakes will happen, and that's completely normal in a new area like this (a lot of these mistakes right now are avoidable, assuming the government actually listens to the critics – though I don't know if that's going to happen). When this happens, we reassess and we adapt. He cites Germany as an example of a country regulating online hate. Canada hasn't even gotten out of the starting gate of regulating platforms, and he says it's about time that we got under way.
(The whole opening remarks from this guy make a lot of assumptions, including the assumption that platforms have perfect knowledge and have completely perfected what people should and shouldn't see. In practice, that is not the case. He also falls into the trap of thinking that if regulations were good in the past, then they are automatically good for modern realities, which is also most assuredly not the case. Platforms are obviously very much different from terrestrial radio in a huge host of ways. So, a lot of assumptions that make it sound like the solutions are easy when that is definitely not the case.)
Questioning the Witnesses
Senator Rene Cormier opened up the questions. He commented that he followed with great interest the Creator Act that was published in Policy Options. It speaks to the power of social media and online creators. What interests him is the power relationship between the creators and the platforms. How does Bill C-11 maintain the balance between the platforms that make a lot of money and the creators – some of whom make a lot of money and some, much less? He wanted to hear how Bill C-11 is able to strike and maintain a balance so that creators are able to find their place, and how C-11 secures better working conditions. It interests him because creators are at the heart of the content that is circulating on social media.
McKelvey responded that he should emphasize that there is a spectrum of creators, and one of the challenges is the confusion around, or maybe the need to delineate, that spectrum. On one end are commercial creators with a designation before the government and, on the other end, user generated content creators. One of the things that he thinks complicates the bill is the growing involvement of platforms in cultivating creators through things like affiliate programs or podcasting deals with Spotify. So, it's important to say that when they are talking about creators, it's not all the same category.
The problem, he points out, is that this nuance is lacking and, at best, you have Section 4.1, which alludes to the possibility that certain creators entering into contractual relations with platforms might fall under the bill.
That, however, leads to two big issues on the table for him. One, it's not particularly clear how those particular creators are going to be captured – like when Twitch recently and dramatically cut the revenue share for its streamers. It speaks to the many different market relations, and it's hard to see how that is captured in the one-size-fits-all approach in C-11.
So, to him, it's really important to have a bill and regulation for digital creators. It's more complicated than what is being considered in C-11. That's his worry. It's not that there isn't a need for reform, but C-11 is cumbersome in doing that. In particular, in the ways it ultimately downloads decisions onto the CRTC, it raises risks around known issues in the CRTC's consultation process. Without the proper representation of creators and creator unions, you're not going to be seeing anything that is seen as effective or meaningful – especially for a whole generation of creators who haven't grown up seeing themselves as part of the cancon system.
Haggart responded that, in terms of people who are disadvantaged by this, the key point is that when we are talking about algorithms, they have to choose what they are going to promote, how you are going to measure popularity, and what's considered beyond the pale. For instance, in March 2020, there were worries that YouTubers mentioning COVID-19 were being demonetized. Afterwards, it was found out that YouTube did so to protect advertisers from sensitive content. This disadvantages people who want to talk about that. It creates a level of precariousness between creators and their platforms. The reality is much less stable than the idea that YouTube simply surfaces the most popular content everywhere.
He then commented that, with respect to McKelvey's point about the lacking nuances, that's a very good point. At the same time, though, he's a bit sympathetic with the government's position here because of the problem with getting too precise with definitions. User generated content is a huge category, and precision risks locking the regulator into a specific moment in time, unable to adapt. More guidance could be used to figure out where the government wants to go in this area.
(I always had a problem with this argument that being too precise will prevent the CRTC from being flexible, because no satisfactory solution is ever presented whenever this argument is brought up. Instead, it's always a case of pounding the fist on the table and screaming "regulate it all!" and damn the consequences. Up to this point, this has always been a really bad argument that has all the usefulness of a meaningless buzzword – a bad attempt to wish the whole issue away.)
Senator Leo Housakos commented that they heard clearly from Haggart that algorithms can be used by platforms in different ways. The question is, who determines how those algorithms are used at the end of the day? Is it government, the CRTC, or consumer choice? At the end of the day, he has a lot more confidence in Canadians than in people he doesn't know – who they are or what they are doing. Is allowing platforms to use algorithms any different from allowing radio stations to determine what to play using ratings, or a bookstore using its best seller list to determine what books to showcase at the front of the store? At the end of the day, even Canadian artists need revenue, and it gets to the point where, if they can't get customers to keep coming back to watch what they are producing, we're going to have a nation where the only culture we support is government-subsidized radio, music, books, and so on that Canadians don't want to see, listen to, or hear. He thinks that's the crux of the debate here.
Haggart responded that a lot of this debate is gussied up. They are talking about platforms and streaming and algorithms, but basically, it comes down to who should be allowed to set the course for Canadian culture. Is it the free market – which isn't free, because the platforms determine what gets seen – or is there a role for government intervention in this area?
He then said that, to answer the first question, right now the rules are being set by private companies. So, it is the algorithms; it's not consumer choice. Consumer choice is just one input into that system, not the only one. Other inputs include whether content is seen as a good fit for advertising or as something controversial. We don't know, because these algorithms are proprietary knowledge and trade secrets. So, we can't know exactly how these things work.
From there, he commented on whether it's similar to a store displaying books at the front. He would argue that it's different from that. It's similar to cancon regulation, where the goal is not just to promote the commercially popular stuff, but to promote culture that we see as necessary to the betterment of the country – for instance, francophone content, which does not have a global market share. Also, it's not a world where we have thousands of booksellers. We have four or five platforms who are determining what Canadians see, which Canadian creators are promoted, and which ones are left by the wayside.
Laidlaw chimed in, saying that she has a few remarks about this because she wants to make sure that they don't conflate a few things when they talk about accountability – at least for algorithms. One is that there is a legal question that is going to have to be grappled with: the extent to which any legislation can compel algorithmic outcomes and whether that is essentially a form of compelled speech. There is a lot of caselaw in the US that suggests this may be a problem – for instance, the question of whether mandating certain search results is compelled speech. The caselaw may not be as extreme in Canada, but it certainly means that you need to be careful about imposing too hard of requirements.
What is important, she says, is to talk about algorithmic accountability. She's not in favour of pointing at the platforms and saying these are the big bad platforms fully driven by self interest. She thinks it's a lot more complex than that. There is a lot of curing of certain behaviour that we need to do, and she thinks we need to implement algorithmic accountability. At the same time, platforms do engage in private curating of algorithms, and we need them to do that. The question is just how much government oversight is needed.
One thing that is key, she says, is general algorithmic accountability and transparency on the part of these companies about how their algorithms are operating and what they are doing with them, along with proper auditing and oversight of these algorithms.
The other thing, she says, is the legitimacy of requiring certain algorithmic outputs. It's a question mark legally, though she thinks it can be supported. Then the question becomes whether it delivers on the broadcasting objectives – something Patrick Aldous dug into in the previous panel, providing valuable feedback.
(I think there is a misconception that every algorithmic change and every moderation decision is driven purely by the platforms. That is obviously not the case. There are well-documented cases where religious organizations have put pressure on payment processors like Visa, Mastercard, and PayPal to crack down on the use of their services for adult oriented content – and consequently, LGBT content among other things. So, there are external pressures being applied to platforms to behave in a certain way.)
(Another good takeaway is the not very well talked about issue of compelled speech. I mentioned it in passing in a previous analysis of this bill, but I think Laidlaw is doing a great job of pointing at one of the better cloaked elephants in the room. That question is whether compelled speech is a violation of Canadian Charter rights and, further, whether this bill actually goes too far in this way. While I did try to dig into this in the past, I didn't really find much. Apparently, though, it's not just me, so it was interesting to learn that caselaw really doesn't have much to say about that. Still, if this law goes through unamended, that is a very real avenue for subsequent litigation – possibly on the part of platforms or an organization representing them.)
McKelvey commented that, checking his notes, there is a lot that he agrees with Laidlaw on. There is a conflation between algorithmic accountability, which is the real issue, and whether algorithmic accountability is going to fix the cancon issue. They are two separate issues. (He was pressed for time and kept the answer short.)
Senator Paula Simons commented on Section 4.1, Section 4.2, and Section 4.2(2) – this is where we seem to get mired in the muck. She's looked at suggested language from TikTok and YouTube for ways of narrowing the scope so that it includes full songs instead of little snippets of songs, and so that it only includes commercial content. Then they had Monica Auer come before them, who briskly said: why not get rid of Section 4 altogether? She's trying to decide if that would be the simpler and more elegant solution, rather than re-writing Section 4 to make it more complicated and more narrow. She wondered what the witnesses thought of that approach.
(To be honest, I’d be down with that idea. It would nip a lot of problems in the bud right there.)
Laidlaw, after getting clarification, responded that, for her, the cleanest approach is to scope out user generated content entirely. She is very conscious of the specific example of YouTube. There is legitimacy in targeting that specific type of content, but maybe not through the lens of social media or through the lens of it being user generated content.
What she says she doesn't know is the transparency of that relationship between YouTube and, say, Sony, for instance. Is this even a situation where YouTube is acting like social media, with user generated content being posted, or is it more of a licensing arrangement? She doesn't actually know what the arrangement is here. It might still be captured even if it's not traditionally considered user generated content.
Senator Simons responded that it’s uploaded by the record labels, not uploaded by YouTube. This is the rabbit hole that we spin down.
Laidlaw responded by asking: is there some sort of contractual arrangement with YouTube that changes the nature of it? Now we are getting into monetization, whether direct or indirect. However, there might be a different way to scope that in, in relation to those contractual arrangements, so that it can be better targeted.
Senator Simons commented that she does not believe the government intended to target an indie-folk singer intent on selling sweatshirts or an Indo-Canadian skateboarding mom on TikTok (I think that's a reference to AuntySkates), and yet, she fears that this is what the language allows.
Laidlaw responded that this is her fear too, and even the examples she gave were examples of what is not supposed to be targeted by the legislation, but that is the effect of it. It creates a ripple effect of uncertainty. At minimum, this would have to be drastically narrowed in scope – even just to target the narrow set of what they are trying to go after now, with the hopes of maybe revisiting the legislation later. That would be the better approach: either get rid of it or make it very, very narrow.
Senator Simons turned to Haggart and said that he raises an interesting point. The algorithms tend to favour controversial material or salacious material that is buzzy, so it's important for them to be mindful that the algorithms are not agnostic or benign. At the same time, she doesn't see anything in the bill that addresses the issues that he and another witness raised. The bill doesn't try to solve that problem. It's dealing with a cancon issue, not the issue of whether the algorithms are fair to artists or whether they are privileging some works over others.
(I agree, there really isn’t anything in the bill that’s really trying to tackle that specific issue.)
Haggart responded that he thinks it's trying to. This is where the discoverability issue comes through. Transparency is an important thing, but it is only useful if we have a direction of where we want to go – especially if these things are black boxes. The way you evaluate an algorithm is to evaluate the outcome. We have to figure out a destination in order to figure out if it's a good thing or not. (He was the same guy who brought up the issue of downranking LGBTQ content, which isn't related to the bill in the first place?)
Senator Simons commented that the outcome YouTube wants is to maximize ad revenue.
Haggart responded that, yes, this is true, but that's not in the interest of Canada in terms of cultural policy – at least that's the way it's been historically. It's about promoting our culture and making sure we aren't leaving these things entirely to the marketplace. The idea of having discoverability rules is not an insane idea; it's in keeping with that kind of tradition. The problem, at least as it stands – and there's a great deal of frustration among the witnesses about this – is that discoverability is great, but to what end? What is that going to look like?
He thinks that it's probably a good idea to have some, or even a lot of, flexibility in this bill, but he doesn't think the government has done itself any favours by failing to say: these are the kinds of content we want to target, it's a moving target, so we need to give ourselves the flexibility to go after it.
Senator Simons turned to McKelvey, noting that he raised the issue of the USMCA (sometimes referred to as CUSMA, as heard here) in relation to the lack of a discoverability definition. She wanted to know what sort of issues might emerge.
McKelvey responded by giving a shoutout to multiple people and an organization for their work in this area. There is a question of whether this counts as user generated content anymore. There continues to be a lack of definition of what types of creators we are talking about. The contractual agreements between record labels and platforms are not well investigated. The consequence of promoting Canadian content is that it puts the onus on platforms to produce outcomes. That seems at odds with the algorithmic carve-outs in the USMCA, as well as the platform liability exemptions within the USMCA. Both, he feels, are an issue of policy coordination, and ones that are going to be worked out by lawyers, because he thinks this raises real issues, just as there have been around the CRTC and simultaneous substitution for football. It's a known issue and one that should be addressed before Senators pass the bill along.
Senator Simons requested that the academic work mentioned be forwarded to the committee because that would be helpful.
Senator Fabian Manning commented that many witnesses have raised what the broader implications of Bill C-11 will be in terms of culture. Were other countries to follow suit in a similar manner, Canada would have a much smaller cultural role in the world, with most artists only reaching a domestic audience. Many smaller artists would find that unsustainable. He asked if the witnesses agree with these concerns and whether the concerns would become reality if this bill is passed as written.
Haggart responded that he is relatively less concerned about that because these algorithms and platforms are privatized discoverability. It has its own biases; it's not a neutral setup. What that means is that the companies are able to modify it to do what they would like it to do. If Canada had a cancon requirement that moved the ranking of Canadian content within Canada away from the norm of Google's algorithm, there is nothing stopping Google from introducing a modifier to the algorithm to bring it back in line with what they see as Google's larger universe. This would be a technical fix, if the company wanted to do it.
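(Haggart's "technical fix" is easier to picture with a small sketch. To be clear, this is a hypothetical illustration with invented names and numbers – nothing any platform has confirmed doing. The idea is simply that a regulator-mandated boost can be confined to one region, cancelling out any impact on rankings elsewhere.)

```python
# Hypothetical sketch of a region-scoped ranking modifier.
# The multiplier and region codes are invented for illustration only.

CANCON_BOOST = 1.25  # imaginary regulator-mandated multiplier

def adjusted_score(base_score: float, is_canadian: bool, viewer_region: str) -> float:
    """Apply the domestic boost only where it is required."""
    if viewer_region == "CA" and is_canadian:
        return base_score * CANCON_BOOST  # comply inside Canada
    return base_score  # elsewhere, ranking stays on the platform's norm
```

(Whether a platform would actually wire things up this way is unknowable from the outside, which loops back to Haggart's earlier point about having to reverse engineer these systems.)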
McKelvey chimed in, saying that there has often been the use of this threat by platforms, and this is something he saw in the Australian News Code. The issue of platforms threatening to leave markets is something that he would want to have investigated more. There certainly are potential risks. Cultural policy is not simply an economic one. Too often, we neglect the welfare state of arts and culture. Drawing important distinctions and nuances is important.
Laidlaw also chimed in, saying that the story of one country has a ripple effect in other countries, but she doesn't want that to dissuade the imposition of domestic law. When you look at the legal scope, you ask what the objectives are and whether the law achieves those objectives while minimally impairing rights. That is key in whether this helps creators and industries in the way that is intended. If it does, then there may be justification for that interference, but if it doesn't, then it's far less justified.
Senator Manning said that they have heard concerns about the power of the platforms and what it does to regulatory authority. Senators have also heard from witnesses about the potential power of the CRTC, about closed door hearings, and about online creators being swept aside by traditional broadcasters who know how to navigate the regulatory system. The question is finding that balance where everyone has, at least, a level playing field. How do we walk towards that balance within the legislation so that smaller creators can feel that they aren't being swept away and have a space of their own?
Laidlaw responded that the answer you are not going to want to hear is that there is no easy answer to this. You're damned if you do, damned if you don't with this type of legislation, because some of the changing nature of the landscape needs to be kicked to a regulator. One of the issues we are facing is the lack of detail in the legislation needed to confidently direct the CRTC in what it's supposed to do. As it stands now, she doesn't think the bill cures the problem of holding platforms accountable for what they do. She doesn't really see that accountability mechanism. She sees output requirements, but she doesn't see the accountability requirements that we should be looking for from platforms.
She also said that we should be narrowly focusing on broadcasting regulation and not veer into online harms. This is just one piece of the pie.
McKelvey added that if we are talking about platform accountability and delegating that to the CRTC, it sounds like a joke to him. The CRTC has issues holding the current monopolies within Canada to account. In particular, we can talk about Red Friday with Rogers (I think that's a reference to the massive Rogers outage). As someone who has deep experience with technical matters before the CRTC involving confidentiality, algorithms, and AI, he says the Commission's inability to consistently deal with technical issues should be particularly concerning and demands more work in the bill. This is something that he is deeply worried about as someone who will have to deal with this for years to come. He can understand confidentiality, but the CRTC has issues doing basic internet measurement, which is a vastly simpler problem than discussions of algorithmic outcomes. If you seriously believe that the CRTC has the ability to fix this now, he thinks that's a huge oversight.
Senator Julie Miville-Dechene commented that the idea of private discoverability is something that interests her a lot. They've been hearing over several weeks now from platforms and creators about the extent to which platforms have a secret recipe that would take into account the popularity and interests of consumers. Could Haggart delve into further detail? These algorithms have to take into consideration what's popular and what's not. What other criteria are being used? There are ads. Could we theoretically have contracts between music giants and, for example, YouTube or Spotify to put forward singers or groups that would be favoured by these platforms? We don't know, because we aren't working for these companies. We don't know how they structure their algorithms. Could he also explain more specifically how he thinks the platforms can counteract the negative effects of discoverability with regards to Canadian content? Haggart had said that they would just have to change a few things and this would cancel out the impact internationally when we are talking about the promotion of Canadian content in Canada. This is a way of seeing algorithms differently.
Haggart responded that, in terms of the question of whether the platforms can adjust the algorithm to counteract the cancon effect, it's just a thought exercise because he doesn't know how the algorithm works. He doesn't work for YouTube. He does know that these are made by humans and are designed to say 'if this, then that'. That's how these things work. It's convoluted, but it can be as straightforward as that when thinking about it in the abstract. If, for instance, they saw a piece of Canadian content being surfaced in a certain way, it would be possible to see how that surfaces to a different audience and decide whether to pick it up or not.
On the other point, he says, with respect to the earlier question, these algorithms are inscrutable by design, partially so people can't game them. However, it also means that they change all the time. He quoted a YouTuber who said: this is YouTube, it's my job to figure out what changed again. This is the situation we are dealing with. It's not just the creators; it's also the regulators who have to figure out what is going on with all of this.
He then responded to McKelvey, saying that he is very much in favour of government intervening in this market. However, he is very conscious of the fact that the CRTC hasn't shown itself to be a very strong regulator in the telecom space in general. That doesn't get rid of the need for regulation in this area. What it means is that we need an overhaul of the CRTC in some way so that it can do things like include small groups and small creators.
Senator Yuen Pau Woo noted that witnesses have encouraged them to look at algorithmic accountability rather than the specific tampering of algorithms, or even offering some proposed outputs so that companies can tamper with the algorithms themselves. The idea of algorithmic accountability is, intellectually, very appealing. However, he wonders if that's it. What about policy tractability? What is achievable and what can be done? So many have said that these algorithms are inscrutable, secret, and always changeable. They are closely guarded, and there is probably a lot of duplicity even when some information is released. Do they not think that a secondary approach to achieve certain policy objectives – not by tampering with the algorithm, but by offering some desired goals – is perhaps the best that policy can get insofar as achieving the objective of promoting Canadian content? He is assuming that there is some sort of agreement on the need to promote Canadian content.
Laidlaw responded by saying that she thinks there is an elegant solution. She has been looking at this issue in the context of online harms and platform accountability, where the question was how you make platforms more accountable without telling them specifically what they are supposed to do. There, the goal is to reduce online harms and address hate speech. In a lot of ways, it is the same here. It's about promoting the cultural industry in Canada and protecting it. One way towards that is improving access for regulators and third parties so they can hold platforms accountable by seeing what they are doing to regulate their algorithms.
The other, she adds, is to set out certain policy goals, saying 'this is what is hoped to be achieved'. Platforms can decide how to achieve those goals and report back on the steps being taken – and many of those platforms are making those efforts. That way, you are improving accountability on both fronts. So, that is a potential approach that could be explored, and it's being explored elsewhere. A lot of the shifts that we are seeing in European law reform are much more process based.
The takeaway, for her, is to echo the concerns about the CRTC. She would be more direct in saying that they are not a technology regulator; they are not an internet, technology, and human rights regulator. So, when we start talking about the issues of algorithmic accountability, we are really looking at the idea of some sort of digital regulator. This is not restricted to the legislation that they are talking about today. It may be beyond the scope of what's before them, but that's what concerns her.
McKelvey chimed in and said that he has care and respect for the CRTC. He doesn't mean to slag it too much, but he does want to see it improved. With respect to algorithmic accountability, does this fall within C-27 or something else? It would be great to see more coordination on that.
He then noted that we are often using algorithms as a proxy for something that might be more in line with artificial intelligence – something where there are no coded statements, but rather a neural network of emergent behaviour. That brings up a whole other host of risks. It worries him, and he'd like to see these discussions keep up with the pace of technology.
He said that the second thing is the introduction of an age appropriate design code in the UK, an innovation where you start thinking of ways that standards can be set for AI and algorithms.
Haggart commented that in order to have accountability, you have to define what your goals need to be. It can be something vague like promoting Canadian culture, but at the end of the day, someone is going to have to put some meat on that and say what it means. This is the issue. Returning to traditional broadcasting, the goal was to promote Canadian culture, but what they measured was how many times a Canadian song was played on the radio, which had its own knock-on effects.
He says he takes the point about tractable policy goals, but that's what a cancon target is. It will get there. It is a proxy for the thing we want, which is to promote Canadian culture, even if it's not all of Canadian culture.
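(The "proxy" point is easy to make concrete. Here's a toy sketch of the traditional radio-style measurement Haggart describes: count the share of Canadian songs played and compare it against a quota. The 35% figure is the commonly cited commercial radio cancon quota for popular music; the playlist data is invented.)

```python
# Toy cancon-quota check - the "what they measured" Haggart describes:
# counting plays is a measurable proxy for the goal of promoting culture.

CANCON_QUOTA = 0.35  # commonly cited commercial radio quota for popular music

def meets_quota(playlist: list) -> bool:
    """Return True if the share of Canadian plays meets the quota."""
    canadian_plays = sum(1 for song in playlist if song["is_canadian"])
    return canadian_plays / len(playlist) >= CANCON_QUOTA

playlist = [
    {"title": "Song A", "is_canadian": True},
    {"title": "Song B", "is_canadian": False},
    {"title": "Song C", "is_canadian": True},
]
print(meets_quota(playlist))  # True: 2 of 3 plays are Canadian
```

(The point being: the quota measures plays, not cultural impact – hence the knock-on effects he mentions.)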
Senator Housakos commented that they have talked a lot about algorithms, which has been a fascinating thing for him over the last little while. He listens to Spotify. He is on Twitter. He watches Netflix. He Googles all of the time. It seems that all the things that pop up are things in his interest zone. Maybe it is simplistic, but maybe these are businesses that are so successful because they give people like him and their clientele the things they want to see or hear in a quick, expeditious fashion. Correct him if he's wrong. He then handed things over to Senator Cormier.
Senator Cormier said that his question is completely different. He wanted to go back to the power dynamic between the creators and the platforms. The creators are not protected in their power to negotiate. Maybe that question is not in the scope of the bill, but in their opinion, how can we further protect the creators in their relationship with the platforms and how can we better balance their ability to negotiate with the platforms?
McKelvey responded that this has really been his hope from the start – seeing a Digital Creators Act: something similar to news remuneration, where there are collective bargaining rights, or finding particular ways of entering into binding arbitration with the platforms and seeing what the role for government would be. He really thinks that Cormier is opening a very important line of questioning, but he just doesn't see it in the bill presently. That is something that is quite detrimental.
That, he says, is partially why he favours carving this bill down to a limited function, because Cormier is rightly raising deeper questions about bargaining power between smaller creators and large platforms. Twitch's decision to lower the rates for its streamers really demonstrates the imbalance of those bargaining powers. He doesn't see how C-11 is going to move in those directions, but he does see them as being of eminent importance for government legislation going forward.
Haggart responded to Housakos' question by saying that this is what makes this issue so difficult to get a grasp on. With a discoverability algorithm or search engine, you don't know what you are not seeing. What you end up having is the illusion of comprehensiveness. Google will return thousands of search results for a certain query, but it's not clear how they've been ranked, nor how that ranking works. It looks like it is providing you with everything that you would need, but Google's search engine has become worse. It's being overwhelmed with ads, and results relevant to you are harder to find. These are decisions meant to privilege certain kinds of content over others, and they may not have your best interests, as the searcher, at heart.
Also, he says, there might be 100 pages worth of results, but most people don't go past the first 10 pages anyway because that's good enough, even though what you are really looking for might be on page 4 or 5.
With that, the hearing adjourned.
Concluding Thoughts
So, quite a lot of stuff to take in to say the least.
One theme that I took out of this is the credibility of the CRTC. While the witnesses tried to remain polite about it, let's face it, the CRTC has made a number of decisions that severely hurt its credibility. It has earned a reputation of only doing things that are in the interest of the big telecom sector businesses – the very organizations that it is charged with regulating in the first place. That reputation extends into the idea that the CRTC has taken consumer choice and consumer interest and shoved them into the back seat of its decision making.
So, the real question is, should we be handing such an enormous amount of power over to them to basically regulate the internet? Some people might argue about semantics, but that is what C-11 is trying to do – hand over a massive amount of power over a massive amount of content that takes up the bulk of the modern internet, and just leave it up to the CRTC to regulate it as it sees fit.
I think it's fair to say that it's not unreasonable to ask that we get the CRTC's ducks in a row before we hand off such a huge amount of responsibility to it. To hand the keys to the kingdom over to the CRTC when it obviously has not earned that trust or social credibility strikes me as completely reckless and dangerous. We don't even know if the CRTC is remotely qualified or even ready to regulate in this manner.
Saying that the CRTC can regulate the internet because they regulated TV and radio for many years is completely insane. As said numerous times already throughout the hearings, the internet is a completely different beast compared to TV and radio. We’re talking about two way communication rather than simply content that streams one way. We are talking about limited choice vs effectively unlimited choice. Anyone with any shred of credibility would never in a million years say that regulating the internet is just a small step over from TV broadcasting and radio broadcasting and practically the same thing anyway. The notion is completely ludicrous.
Another theme that I don't think gets enough attention is the question of whether or not the legislation is constitutional. That, ultimately, is the main thrust of Laidlaw's comments: it's a tough sell to say that this bill respects free speech given just how wide the net is cast over human expression. I've always felt that the issue of compelled speech is going to bite supporters in the rear one way or another. We may not have the most comprehensive caselaw on this issue, but that library could very quickly expand should this legislation become law.
While Senators may have passively given some interest and attention to these comments, it won't matter in the world of litigation. There is a strong case for questioning whether this law is constitutional or not, and the angle of compelled speech is a very vulnerable area for Bill C-11.
Some people might look at this and say that there was no constitutional question with regards to cancon requirements for TV and radio, so why would that be different for the internet? The answer is simple: TV and radio had limited choice. If your content wasn't on one of 12 channels, then your content wasn't being shown at all. So, a case could be made that, in light of such limited choice, such regulations were, in fact, justified. That argument simply doesn't exist for the internet because you have unlimited channels and unlimited choice. So, the justification defence is much weaker in the absence of scarcity within the market.
Further, the remarks about trying to set up something that gives digital first creators some kind of power or bargaining capability are interesting, but it isn't something that is addressed in the bill. It's a problem because this is another way that we have basically put the cart before the horse (the previous one being the disorganized and disastrous state the CRTC is in right now). As brought up in numerous other hearings, the idea of actually empowering creators is really an issue that should have been tackled first.
If creators have that extra power, are getting that financial backing, are creating lots of content, and there is still an issue of them not being discovered (and a case can be made now that they are absolutely getting discovered without the intervention of government), then we can talk about whether cancon rules should be brought in. This further shows just how out of place and bizarre this legislation truly is.
As for Haggart, he is, in my view, just the latest example of why not all professors are created equal. The way he presented algorithms carried the strong implication that platforms are doing something nefarious. The reality is that, yes, they are choosing some content over other content, but that is, to a good degree, how they should operate and run. If I search for Roblox video game footage, I don't want video game footage of Doom Eternal. In that instance, I would expect video game footage of Roblox to be picked as the winner over that other content, because that is how a search feature should work.
Further, if I am watching a video analyzing poker hands at the World Series of Poker, it would be completely out of place to see the recommendations feature compilation clips of horse jumping. It would make little sense to include a bunch of recommendations of that nature because they simply aren't relevant. Again, in that case, I expect some content to win out over other content. It's not nefarious; it's how it should work.
What's more is that I have a really hard time believing the notion that platforms like YouTube show videos that are in the interest of their bottom line rather than in the user's interest. For the most part, both are one and the same. A platform like YouTube wants you, the viewer, to keep watching videos. If you aren't interested, then you are going to click away. Anyone with even the smallest amount of web knowledge knows that the attention span of web users has always been incredibly short, so keeping the attention of users is often a really difficult thing to do. So, the idea that platforms are shoving videos that are not in the user's best interest before them is a really tough sell alone, let alone coupled with the fact that he offered little to no evidence to back up these claims.
Because of that, the argument that we would be better off if the government intervened in algorithmic choice pretty much collapses. It wound up being incumbent on the other witnesses to present a case for why transparency and accountability in such things can be important. They approached it from a different angle and pretty much salvaged the case that Haggart was, well, letting fall flat on its face. All Haggart was basically saying was that these choices are better left in the government's hands because the platforms are somehow engaged in a massive conspiracy to show results not in the user's interest. You really can't facepalm hard enough over that.
All in all, though, this hearing does a great job of shining a light on just how unprepared the country is for Bill C-11. There is so much foundational work that needs to be done before we even get to the step of asking whether or not such a bill is needed. There is no question about just how much damage such a bill could do to the internet ecosystem, free speech, innovation, and even the economic ecosystem – it's really difficult to keep up with it all. As a result, Bill C-11 is not even close to being ready for prime time.
Drew Wilson on Twitter: @icecube85 and Facebook.