We continue our analysis of Bill C-63, also known as the Online Harms Bill, starting with Section 63.
While there are a number of questions surrounding the legislation, one thing that isn’t in doubt is that this bill is long… very long. Like, over 100 pages long. Analyzing this legislation was never going to be an easy task, and that was proven when the first part of our analysis recently eclipsed 5,000 words. Still, just because this bill is really large doesn’t necessarily mean the task is overly daunting. It just takes a little more time, is all.
Throughout the previous analysis, we noted that the bill isn’t exactly a speed read. The text has definitions broken up and scattered throughout, so making sense of it wasn’t going to be as easy as simply reading it through once and coming away with a clear understanding. In fact, even we aren’t immune to misreading the legislation. Several provisions make vague references to website operators, and for a while, we took that to mean websites in general, not just large social media websites. However, after a discussion with Matt Hatfield of Open Media about the legislation, it turns out this bill really does focus on large social media websites. While we updated that article upon finding this out, we thought we’d start this one by, again, noting what we had learned, to make clear the point we missed earlier.
The source of the problem is two separate definitions. The first is this:
For greater certainty — social media service
(2) For greater certainty, a social media service includes
(a) an adult content service, namely a social media service that is focused on enabling its users to access and share pornographic content; and
(b) a live streaming service, namely a social media service that is focused on enabling its users to access and share content by live stream.
The second, which is a little further down in the bill, is this:
Regulated service
3 (1) For the purposes of this Act, a regulated service is a social media service that
(a) has a number of users that is equal to or greater than the significant number of users provided for by regulations made under subsection (2); or
(b) has a number of users that is less than the number of users provided for by regulations made under subsection (2) and is designated by regulations made under subsection (3).
Even put back together, it’s not entirely straightforward to read this in the context of who is covered by this bill and who is not. The answer, however, is that a regulated service is simply another term for a social media service (specifically, one that meets the user thresholds set by regulation), not an umbrella term encapsulating every website out there. So, in later provisions, when the bill says “regulated service”, it well and truly means a social media service. We don’t care about inflating an ego or declaring a particular political party good or bad no matter what. What we do care about is getting the facts right. That has been true since the very beginning of this site and, for me personally, throughout my entire writing career.
So, unless we see something that says otherwise as we move through the bill, that is how we are reading it. We don’t want anyone else to make the same reading mistake we did. The legislation is, indeed, tricky to read. Cool? Cool.
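To make that threshold logic concrete, here’s a minimal sketch of the section 3(1) test as I read it. Note that the actual “significant number of users” is left to future regulations, so the threshold below is purely a hypothetical placeholder, and the function and variable names are my own, not the bill’s.

```python
# Hypothetical sketch of the "regulated service" test in section 3(1).
# The real user threshold will be set by regulation; this number is made up.
SIGNIFICANT_USER_THRESHOLD = 1_000_000  # placeholder, not from the bill


def is_regulated_service(is_social_media: bool, user_count: int,
                         designated_by_regulation: bool) -> bool:
    """Return True if a service would count as a regulated service, on this reading."""
    if not is_social_media:
        # Ordinary websites fall outside the bill entirely.
        return False
    # Paragraph (a): at or above the "significant number of users" threshold.
    if user_count >= SIGNIFICANT_USER_THRESHOLD:
        return True
    # Paragraph (b): below the threshold, but specifically designated by regulation.
    return designated_by_regulation
```

The key takeaway from the sketch is that a non-social-media website never even reaches the threshold test, which is exactly the reading mistake we wanted to correct.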
Now, let’s move forward and dive deeper into this legislation. For those who want to follow along, the text of the legislation can be found here. So, you can read this bill yourself if you so desire (and I encourage you to do so). For clarity, we are continuing on from Section 63 with the heading “Duty to preserve certain harmful content”. You can CTRL+F this to make it easier to find.
Part 4 Continued
Requirements of Social Media
The next section is this:
Duty to preserve certain harmful content
63 (1) If the operator of a regulated service makes inaccessible to all persons in Canada content that incites violence or content that incites violent extremism or terrorism, the operator must preserve that content, and all other computer data related to it that is in the operator’s possession or control, for a period of one year beginning on the day on which the content is made inaccessible.
This is simply related to specific kinds of content (i.e., content that incites violence, violent extremism or terrorism). The implication here is that social media sites are being asked to hold information related to the user who posted that content. I would imagine that would include an IP address, a related cell phone number, and any other identifiable information that the platform has on file. This is likely so that authorities like the RCMP can obtain a copy of that information if they feel they need access to it.
The next section clarifies that the platform must then destroy that information after the one-year period:
Duty to destroy
(2) After the end of the one-year period, the operator must, as soon as feasible, destroy the content and any other computer data related to it that would not be retained in the ordinary course of business, as well as any document that was prepared for the purpose of preserving that content and data, unless the operator is required to preserve the content, data or document under a judicial order made under any other Act of Parliament or an Act of the legislature of a province or under a preservation demand made under section 487.012 of the Criminal Code.
This might run into a fuzzy area for platforms. For instance, what if another jurisdiction has laws requiring the platform to store that information for two years? Would that be in violation of 63(2) here? I… actually don’t have an answer to that. My guess is that it would fall under “ordinary course of business”, but I wouldn’t know for sure.
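To make the timeline concrete, here’s a rough sketch of the preserve-then-destroy cycle as I read sections 63(1) and (2). The one-year clock and the judicial-order exception come from the bill itself; the names and structure are my own, and the 365-day figure glosses over leap years.

```python
# Hypothetical sketch of the section 63 preservation window.
from datetime import date, timedelta

PRESERVATION_PERIOD = timedelta(days=365)  # "a period of one year", roughly


def must_still_preserve(made_inaccessible_on: date, today: date,
                        judicial_order_in_effect: bool) -> bool:
    """Whether the operator must still hold the content and related data."""
    within_window = today < made_inaccessible_on + PRESERVATION_PERIOD
    # After the window closes, the duty flips to destroying the data "as soon
    # as feasible", unless a judicial order or a Criminal Code preservation
    # demand keeps it alive.
    return within_window or judicial_order_in_effect
```

The foreign-retention question I raised above is exactly what this sketch can’t answer: nothing in the bill’s text tells us how a conflicting two-year requirement elsewhere would feed into that final condition.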
Part 5
Research
This section largely concerns itself with researchers analyzing data obtained under the bill. Throughout my reading of it, my only question was whether people’s personal information would be protected, and the section makes clear that it would be. As a result, there is nothing really exciting to see in this section as far as I can tell, so I’ll just move on.
Part 6
Remedies
The next notable section we came across is this:
Submissions from public
78 (1) A person in Canada may make submissions to the Commission respecting harmful content that is accessible on a regulated service or the measures taken by the operator of a regulated service to comply with the operator’s duties under this Act.
This is related to one of the biggest concerns I had with the original 2021 version. In the 2021 version, anyone could make a complaint directly to any website about anything, anonymously. My big concern was that a botnet could be established to hammer a given website and force it to process millions of obviously frivolous complaints, overwhelming the system in the process.
In this version of the bill, it seems that the complaints would be handled by the Commission (AKA the Digital Safety Commission of Canada) instead. This, in my view, makes significantly more sense. As a result, I would say this clears up another one of my fears. What’s more, there are protections for the identity of the individual making the complaint:
Operator informed of submissions
(2) The Commission may inform an operator of any submissions that it receives under subsection (1), in a manner that protects the identity of the person who made the submissions.
Information made public
(3) The Commission may make public information respecting any submissions that it receives under subsection (1), in a manner that protects the identity of the person who made the submissions and the regulated service in respect of which the submissions were made.
I think this is about as good of a system as you can hope for, honestly. There’s much less likelihood that the system can be abused and, as an added bonus, the person making the complaint still retains their anonymity. If there’s a better way of handling this aspect, I’m not really sure what that system would be.
For the rest of this part, there is nothing that really stands out to me that I think needs exploration. It all reads as reasonable to me.
Part 7
Administration and Enforcement
So, this section deals with what happens when a social media company has to be hauled in front of the Commission. One thing that I did notice is this:
Hearing
88 (1) The Commission may hold a hearing, in accordance with any rules made under subsection 20(1), in connection with
(a) a complaint made under subsection 81(1); or
(b) any other matter relating to an operator’s compliance with this Act.
Private hearing
(2) A hearing must be held in public, but the Commission may decide that it is to be held in private, in whole or in part, if the Commission considers that
(a) it would be in the public interest;
(b) it would be in the interest of victims of harmful content;
(c) it would be in the national interest, including if there is a risk of injury to Canada’s international relations, national defence or national security;
(d) a person’s privacy interest outweighs the principle that hearings be open to the public; or
(e) the following information may be disclosed:
(i) information that is a trade secret,
(ii) financial, commercial, scientific or technical information that is confidential and that is treated consistently in a confidential manner by the person to whose business or affairs it relates, or
(iii) information whose disclosure could reasonably be expected to
(A) result in material financial loss or gain to any person,
(B) prejudice the competitive position of any person, or
(C) affect contractual or other negotiations of any person.
Technically, this opens up the possibility of secret hearings by the Commission. It renders “must be held in public” fairly pointless because the Commission can simply say, “nah, we don’t feel like it.” I would worry that something like that would get abused further down the road, personally. It’s similar to the problem in the US with the push to reinstate secret courts, where rulings and decisions are made out of the public eye. So, I think this is a bit of a slippery slope toward instituting a very similar problem here.
The legislation also has provisions for inspectors who are charged with verifying a social media platform’s compliance:
Designation of inspectors
90 (1) The Commission may designate as inspectors persons or classes of persons that the Commission considers qualified for the purposes of verifying compliance or preventing non-compliance with this Act.
Certificate
(2) The Commission must provide every inspector with a certificate of designation. An inspector must, if requested to do so, produce their certificate to the person appearing to be in charge of any place that they enter.
Power to enter
91 (1) Subject to subsection 92(1), an inspector may, for a purpose related to verifying compliance or preventing non-compliance with this Act, enter any place in which they have reasonable grounds to believe that there is any document, information or other thing relevant to that purpose.
Entry by means of telecommunication
(2) An inspector is considered to have entered a place if they access it remotely by a means of telecommunication.
Limitation — access by means of telecommunication
(3) An inspector who enters remotely, by a means of telecommunication, a place that is not accessible to the public must do so with the knowledge of the owner or person in charge of the place and must be remotely in the place for no longer than the period necessary for the purpose referred to in subsection (1).
Other powers
(4) An inspector may, for the purpose referred to in subsection (1),
(a) examine any document or information that is found in the place, copy it in whole or in part and take it for examination or copying;
(b) examine any other thing that is found in the place and take it for examination;
(c) use or cause to be used any computer system at the place to examine any document or information that is found in the place;
(d) reproduce any document or information or cause it to be reproduced and take it for examination or copying; and
(e) use or cause to be used any copying equipment or means of telecommunication at the place to make copies of or transmit any document or information.
Now, there is a bit of an odd provision shortly after:
Warrant to enter dwelling-house
92 (1) If the place referred to in subsection 91(1) is a dwelling-house, an inspector is not authorized to enter it without the occupant’s consent except under the authority of a warrant issued under subsection (2).
I would be really shocked if a Facebook server were located in the basement of someone’s house, to be perfectly honest. I guess this is an effort to cover all legal bases, but it is a bit strange seeing this pop up in this context.
What is a bit worrying, however, is this:
Compliance order
94 (1) If the Commission has reasonable grounds to believe that an operator is contravening or has contravened this Act, it may make an order requiring the operator to take, or refrain from taking, any measure to ensure compliance with this Act.
Enforcement of orders
95 (1) An order of the Commission may be made an order of the Federal Court and is enforceable in the same manner as an order of that court.
Procedure
(2) An order may be made an order of the Federal Court by following the usual practice and procedure of that court or by filing a certified copy of the order with the registrar of that court.
Essentially, the Commission, an oversight body, would have powers similar to those of a federal court in the event of non-compliance. I struggle to think this is something the standard court system couldn’t handle, and I find it puzzling that such power would be taken away from it. In my head, it would’ve made more sense for the Commission to simply refer the matter to the court and have the court handle things from there.
One safeguard I can see is that a ruling can be appealed back into the standard court system afterwards. Still, I find it really strange that the Commission is taking on judicial powers like this.
In a subsequent section, we see this:
Maximum penalty
101 The maximum penalty for a violation is not more than 6% of the gross global revenue of the person that is believed to have committed the violation or $10 million, whichever is greater.
This provision was a major red flag for me back in 2021. In this version, though, there are two major improvements. For one, it applies in the context of social media platforms, not every website out there. For another, those penalties are now a ceiling instead of a hard and fast “this is the penalty, deal with it.” So, I’m honestly satisfied with this and hugely relieved as well.
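To put that ceiling in concrete terms, here’s the section 101 arithmetic as a quick sketch. The 6% and $10 million figures come straight from the quoted provision; the function is just my own framing, and remember this is a cap, not the penalty itself.

```python
# The section 101 ceiling: the greater of 6% of gross global revenue or $10 million.
def max_penalty(gross_global_revenue: float) -> float:
    return max(0.06 * gross_global_revenue, 10_000_000)


# A platform with $1 billion in gross global revenue:
print(max_penalty(1_000_000_000))  # 60000000.0, since 6% of $1B is $60M
# A smaller operator with $50 million in revenue:
print(max_penalty(50_000_000))  # 10000000.0, since 6% of $50M is only $3M
```

Note that “whichever is greater” cuts the other way for smaller operators: the $10 million figure becomes the operative cap whenever 6% of revenue falls below it.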
What’s more, the section was also expanded on:
Factors — determination of penalty
102 The amount of the penalty is to be determined by taking into account the following factors:
(a) the nature and scope of the violation;
(b) the history of compliance with this Act by the person that is believed to have committed the violation;
(c) any benefit that the person obtained by committing the violation;
(d) the ability of the person to pay the penalty and the likely effect of paying it on their ability to carry on their business;
(e) the purpose of the penalty;
(f) any factor prescribed by regulation; and
(g) any other relevant factor.
To my recollection, none of this even existed back in 2021. Reading it now, it’s pure improvement as far as I can tell.
What’s more, there are provisions allowing for the platform in question to challenge a complaint:
Representations
104 (1) A person that is served with a notice of violation may make representations to the Commission within the time and in the manner set out in the notice, in which case the Commission must decide, on a balance of probabilities, after considering any other representations that it considers appropriate, whether the person committed the violation.
Decision — violation committed
(2) If the Commission decides that the person committed the violation, it may
(a) impose the penalty set out in the notice of violation, a lesser penalty or no penalty;
(b) suspend payment of the penalty subject to any conditions that the Commission considers necessary; and
(c) make an order requiring the person to take, or refrain from taking, any measure to ensure compliance with this Act.
Honestly, I find this to be very fair.
Another element that I think is quite fair is this:
Due diligence defence
113 A person is not to be found liable for a violation if they establish that they exercised due diligence to prevent its commission.
So, there is some give in this law that takes into account circumstances beyond a person’s control.
Penalties
There are some pretty big penalties in this for platforms that violate the law:
Penalty
(2) Every operator that commits an offence under subsection (1) is liable
(a) on conviction on indictment, to a fine of not more than 8% of the operator’s gross global revenue or $25 million, whichever is greater; or
(b) on summary conviction, to a fine of not more than 7% of the operator’s gross global revenue or $20 million, whichever is greater.
Then there are separate penalties for a person as well; a quick sketch of both fine schedules follows the quoted text:
Penalty
(2) Every person that commits an offence under subsection (1) is liable,
(a) on conviction on indictment,
(i) to a fine of not more than 3% of the person’s gross global revenue or $10 million, whichever is greater, in the case of a person that is not an individual, and
(ii) to a fine at the discretion of the court, in the case of an individual; or
(b) on summary conviction,
(i) to a fine of not more than 2% of the person’s gross global revenue or $5 million, whichever is greater, in the case of a person that is not an individual, and
(ii) to a fine of not more than $50,000, in the case of an individual.
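Here’s the promised sketch of the two fine schedules as I read them. The percentages and dollar figures come straight from the quoted provisions; the function names and structure are my own, and I model the court’s-discretion case (an individual convicted on indictment) as None, since there’s no fixed cap there.

```python
# Hypothetical sketch of the offence fine ceilings quoted above.
from typing import Optional


def operator_fine_cap(gross_global_revenue: float, on_indictment: bool) -> float:
    """Ceiling for an operator: 8%/$25M on indictment, 7%/$20M on summary conviction."""
    if on_indictment:
        return max(0.08 * gross_global_revenue, 25_000_000)
    return max(0.07 * gross_global_revenue, 20_000_000)


def person_fine_cap(gross_global_revenue: float, on_indictment: bool,
                    is_individual: bool) -> Optional[float]:
    """Ceiling for a person; None means the fine is at the court's discretion."""
    if is_individual:
        # Individuals: court's discretion on indictment, $50,000 cap on summary.
        return None if on_indictment else 50_000
    if on_indictment:
        return max(0.03 * gross_global_revenue, 10_000_000)
    return max(0.02 * gross_global_revenue, 5_000_000)
```

Same “whichever is greater” mechanics as before, just with higher percentages for operators and lower ones for other persons.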
Still, the legislation makes it clear that due diligence is a defence here:
Due diligence defence
122 A person is not to be found guilty of an offence under subsection 120(1) or 121(1) if they establish that they exercised due diligence to prevent the commission of the offence.
One of the things I’ve heard from others is that there are provisions about imprisonment but, as far as this section is concerned, there are no prison penalties here. In fact, the bill expressly states that no one can be imprisoned for defaulting on a fine:
Imprisonment precluded
124 If an operator or individual, as the case may be, is convicted of an offence under subsection 80(1), 120(1), 121(1) or 128(1), no imprisonment may be imposed in default of payment of any fine imposed as punishment.
Part 8
This part contains a lot of provisions surrounding confidentiality, but where things got interesting for me is with this section:
Report — Commission
130 (1) The Commission must, within three months after the end of each fiscal year, submit a report on its activities in that fiscal year to the Minister.
Contents
(2) The report must contain information respecting
(a) any complaints that the Commission received under subsection 81(1), presented in a manner that protects the identity of the complainants;
(b) any orders made under paragraph 81(4)(b) or subsection 82(5) or 94(1);
(c) any inspections conducted under this Act; and
(d) any agreements or arrangements that the Commission enters into under section 136.
Additional information
(3) The report must also contain any information that the Minister requests.
In and of itself, this section isn’t necessarily all that interesting, but it becomes more interesting when combined with this one:
Tabling
134 The Minister must cause each report made under sections 130 to 132 to be laid before each House of Parliament on any of the first 15 days on which that House is sitting after the Minister receives the report.
I wish I had more knowledge of how parliamentary procedures work, but I do idly wonder if the public will get to find out what sorts of actions were taken against platforms. I’m sure some would find data like that interesting, at the very least. Of course, this assumes that a report being tabled in Parliament means it becomes available to the public. I’m not entirely sure about that, admittedly.
Another interesting provision is this:
Consultation
137 The Commission and the Canadian Radio-television and Telecommunications Commission must consult with each other to the extent that they consider appropriate in carrying out their respective mandates.
So, apparently, there will be CRTC involvement in this after all. It’s only from a consultation perspective, but the CRTC does have a presence in this sprawling regulatory apparatus.
Part 9
One section that stood out to me was the fact that the Commission can make regulations respecting… well… a lot of different things in the bill:
Commission
140 (1) The Commission may make regulations
(a) respecting the information that must be provided under section 4;
(b) respecting the factors referred to in paragraphs 55(2)(a) to (d);
(c) respecting factors that the Commission must take into account under paragraph 55(2)(e);
(d) respecting measures that operators must implement under section 56;
(e) respecting the guidelines referred to in section 57, including the duty under that section to make the guidelines publicly available;
(f) respecting the tools referred to in section 58, including the duty under that section to make those tools available to users;
(g) respecting the tools and processes referred to in subsection 59(1), including the duty under that section to implement those tools and processes;
(h) respecting the duty under section 60 to label harmful content;
(i) respecting the duty under section 61 to make a resource person available to users and the manner in which the resource person fulfills their role;
(j) respecting the duty under subsection 62(1) to submit a digital safety plan, including the time within which and the manner in which it must be submitted and the period to which it must relate;
(k) respecting the information required under paragraphs 62(1)(a) to (l), including the manner in which the information must be organized in the digital safety plan;
(l) respecting information required under paragraph 62(1)(m);
(m) respecting the duty under subsection 62(4) to make the digital safety plan publicly available, including the time within which and the manner in which it must be made publicly available;
(n) respecting the duty under section 63 to preserve harmful content;
(o) respecting design features for the protection of children referred to in section 65, such as account options for children, parental controls, privacy settings for children and other age appropriate design features;
(p) respecting the duty under sections 67 to 70 to make certain content inaccessible to persons in Canada, the making of representations under subsections 69(1) and 70(2) and requests for reconsideration under subsection 70(1);
(q) respecting the duty under section 72 to keep records, including the length of time for which the records must be kept;
(r) respecting the accreditation of persons under subsection 73(1), including
(i) conditions that apply to persons who are accredited, and
(ii) criteria and procedures for the suspension or revocation of an accreditation;
(s) respecting access to inventories of electronic data given under subsection 73(2), including the conditions to which access is subject, including with respect to confidentiality, data security and the protection of personal information;
(t) respecting requests made under section 74 for access to electronic data, orders made under that section and access to electronic data granted under those orders;
(u) respecting conditions with respect to confidentiality, intellectual property, data security and the protection of personal information under which access to electronic data is granted under orders made under section 74 and any other conditions under which that access is granted;
(v) respecting the revocation or amendment of orders under section 75, including requests and determinations made under that section;
(w) respecting the case management of complaints made under subsection 81(1) and the transmission, preservation and treatment of content that is the subject of a complaint and any information related to that content;
(x) respecting an operator’s duties under subsections 81(5) and (6) and 82(6); and
(y) respecting the manner in which a person is to publish the notice referred to in subsection 119(2).
The changes won’t be made in secret, however:
Publication of proposed regulations
(2) Subject to subsection (3), a copy of each regulation that the Commission proposes to make under subsection (1) must be published in the Canada Gazette and operators and other interested persons must be given a reasonable opportunity to make representations to the Commission with respect to the proposed regulation.
It’s still strange that so much of how the law operates can be changed like this through regulation rather than by going back to Parliament.
Some further adjustments can be made by the Governor in Council:
Governor in Council
141 (1) The Governor in Council may make regulations
(a) respecting the meaning of the expression “significant psychological or physical harm” for the purposes of paragraphs 62(1)(h) and (j);
(b) providing for a period for the purposes of subsection 67(2); and
(c) providing for a period for the purposes of subsection 68(5).
Another interesting bit is the fact that this legislation is reviewed every 5 years:
Review and report
142 No later than the fifth anniversary of the day on which this section comes into force, and every five years after that, the Minister must cause a review of this Act and its operation to be undertaken. The Minister must cause a report on the review to be laid before each House of Parliament within one year after the review is completed.
Part 10
Coming into Force
The coming into force provision… doesn’t really say much:
Order in council
143 The provisions of this Act come into force on a day or days to be fixed by order of the Governor in Council.
So, there doesn’t appear to be any real time limit on that one; it’s whatever the government ultimately decides. Still, once the Act is brought into force, the five-year review cycle starts counting from that day.
Additional Acts and Bills Affected
There are a bunch of Acts and bills this legislation changes, but most of it is just adding references to the three offices being created under this bill.
Part 2
(There’s no Part 1 heading in case you are wondering).
There are a bunch of changes to the Criminal Code but, as far as I can tell, many of the offences also carry a defence clarifying that simply offending someone isn’t grounds for a criminal offence.
The Rest of the Bill
The remainder of the bill makes several amendments to multiple laws. After going through it, nothing else really stands out to me as far as I can tell.
Conclusions
I’ll go through my general notes in more detail, but the quick conclusion is that this is a substantial improvement over the 2021 version. A huge chunk of my fears are no longer in play. That’s not to say this legislation is problem-free, but I can definitely say it is not a massive, ominous threat anymore. I’m hugely relieved that I’m not looking at a massive threat to my operation, nor a huge and imminent threat to personal rights and freedoms. For once, it really does sound like the government listened. Better late than never, I’d say.
Drew Wilson on Twitter: @icecube85 and Facebook.