Authors are clearly responsible for the statements they post online, but website administrators may also be held liable for defamatory statements posted by others.

Defamation
The tort of defamation seeks to protect the reputations of individuals and businesses against unfounded and unjustified attacks. To establish the tort, a plaintiff (person suing) must prove that the defendant (person sued) published a defamatory statement about the plaintiff. A defamatory statement is one that is likely to lower the plaintiff’s reputation in the eyes of a reasonable person.
Authors’ potential liability for online defamation
When a person writes and posts a defamatory statement online, they can be held liable (legally responsible) for the tort of defamation. A court may order them to pay the victim damages (financial compensation).
For example, in Premier Finance Limited v Ginther (Ginther), the defendant, a disgruntled customer, posted negative reviews about the plaintiffs, a business and its owners, on Google and Yelp. The defendant’s reviews said the plaintiffs were “fraudulent” and “deceitful”. The defendant claimed the plaintiffs had scammed him by charging him for a product he did not order and creating “fake invoices” to cover their lies.
One possible defence to defamation is justification, or truth. If the defendant shows that their defamatory statements were substantially true, they will not be held liable for defamation. The defendant in Ginther raised this defence, but the British Columbia judge who heard the case rejected it. She found that the defendant had failed to prove his statements about the plaintiffs were substantially true and that his online reviews contained several untrue statements.
The judge also found that the defendant acted with malice when he posted the online reviews. She noted that, in publishing the defamatory statements, the defendant was trying to discourage others from doing business with the plaintiffs.
The judge found the defendant liable for the tort of defamation and ordered him to pay the plaintiffs $80,000 in general damages and $10,000 in aggravated damages.
The Ginther case is just one example of an author of online comments being held liable for the tort of defamation and ordered to pay damages. There are several other recent Canadian examples, including: Carnegie v Descalchuk; D’Alessio v Chowdhury; Pacific Granite Manufacturing Ltd v Lee; and Moen v Mackay.
Suggestions for people who post comments online
To manage their risk of liability for the tort of defamation, people who post comments online should:
- Be honest and truthful in what they say.
- Limit what they say to what they can prove (back up with evidence).
- Keep records that support what they say.
- Avoid using inflated or exaggerated language.
- Have someone proofread their comment before posting it.
Website administrators’ potential liability for online defamation
A less well-established issue is whether website administrators can be held liable for defamatory comments that other people (third parties) post on their websites.
An Ontario judge recently held a Facebook group administrator liable for defamatory comments posted by other members of the group: Belliveau v Quinlan (Belliveau).
The Facebook group was called “Are We Dating the Same Guy in London Ontario”. It provided a space for its members to communicate about men they were dating, so they could identify potential safety risks and men dating more than one of them.
In April 2023, members of the Facebook group began posting comments about the plaintiff. Some of the comments were defamatory. For example, one comment called the plaintiff an awful human being, another described him as sleazy, a third said he was a liar, and a fourth accused him of stealing money from a single mother. The plaintiff sued the defendant, a co-founder and administrator of the Facebook group. He alleged the defendant was legally responsible for the defamatory comments posted by other members of the group. The judge who heard the case agreed.
The judge concluded the plaintiff had met the test for establishing liability on the part of a website administrator for defamatory comments others (third parties) had posted on the website. The plaintiff proved that:
- the defendant was aware of the defamatory comments members of the Facebook group had posted,
- she had the authority (power) to remove them, and
- she chose not to remove them.
In the result, the judge found the defendant personally liable for the defamatory comments posted by other members of her Facebook group and ordered her to pay the plaintiff $7,500 in general damages.
The judge in Belliveau based his decision on a British Columbia case decided nearly a decade before: Pritchard v Van Nes (Pritchard). The judge in Pritchard recognized that liability for third-party defamatory comments on an internet-based platform was “an emerging legal issue in Canadian law”. After reviewing several case authorities, he suggested the test for establishing such liability should have three elements:
- actual knowledge of the defamatory material posted by the third party,
- a deliberate act that could include not acting in the face of actual knowledge, and
- power and control over the defamatory content.
He then held that, when these three elements are met, “it may be said that a defendant has adopted the third party defamatory material as their own”. Based on this test, the judge found the defendant in Pritchard liable for defamatory comments others posted on her Facebook page in response to her own defamatory posts.
Like Belliveau, other recent Canadian cases have employed the test set out in Pritchard to hold website administrators liable for defamatory comments posted by their website’s users (third parties). For example, I Buy Beauty LLC v Dong dealt with defamatory comments posted by viewers of the defendant’s YouTube channels.
Suggestions for website administrators
To manage their risk of liability for online defamation, an administrator of a website that allows others to post comments should:
- Establish rules or guidelines for their website that prohibit users from posting potentially defamatory comments.
- Actively monitor their website for potentially defamatory comments.
- Offer a process (like a “Report” button or link) to allow users to flag potentially defamatory comments and/or request that such comments be removed.
- Quickly remove potentially defamatory comments that are identified/reported.
DISCLAIMER: The information in this article was correct at time of publishing. The law may have changed since then. The views expressed in this article are those of the author and do not necessarily reflect the views of LawNow or the Centre for Public Legal Education Alberta.