TikTok is a video-hosting service that allows users to create and share content. A popular subject for these videos is the ‘challenge’ – effectively an entreaty to users to undertake some action and film themselves doing so. Challenges range from the humorous and harmless to the seriously dangerous.
One such challenge was undertaken by a young girl, and tragically resulted in her death. A recent ruling in the US means TikTok may be held liable for her passing.
The deceased’s mother, Ms Anderson, sued TikTok and ByteDance, the corporation that owns TikTok, alleging that they were responsible for her daughter’s death. The substance of her complaint was based on the way TikTok works.
In addition to allowing users to search video content, TikTok’s algorithms recommend videos based on several factors, including the user’s age and other demographics, as well as other on-line interactions and metadata. The challenge video in this case was recommended by the algorithms, not sought out by the user.
The mother brought claims in, among other things, negligence and products liability. She alleged that TikTok:
- Was aware of the challenge;
- Allowed users to post themselves participating in it; and
- Recommended it to other users, including minors.
She was unsuccessful in the District Court, which dismissed the claim pursuant to s230 of the Communications Decency Act of 1996 (CDA). The CDA began life as an attempt to regulate pornographic material on the internet, and s230 effectively provides immunity for platforms that allow the posting of third party content, largely on the basis that they are not considered publishers of that content.
The mother appealed that decision, on the basis that the process TikTok’s algorithms undertake – amalgamating third party content and producing a curated stream of videos that is offered to users as being interesting to those users – is different from merely hosting videos. This, she alleged, constituted a product that was TikTok’s own content, not shared third party content.
Ms Anderson was successful in having the lawsuit reinstated; a persuasive factor was that the challenge video had been suggested to the user, rather than being the result of a search by the user for the video’s subject matter.
Tellingly, Judge Matey (in a partial concurrence and partial dissent) noted (footnotes omitted):
TikTok reads § 230 of the Communications Decency Act, 47 U.S.C. § 230, to permit casual indifference to the death of a ten-year-old girl. It is a position that has become popular among a host of purveyors of pornography, self-mutilation, and exploitation, one that smuggles constitutional conceptions of a “free trade in ideas” into a digital “cauldron of illicit loves” that leap and boil with no oversight, no accountability, no remedy.
The Judge went on to find that this interpretation was not within the words of the section, and that the lawsuit could continue. Although this is far from the final decision, it is not the first time that the US courts have acted to limit the operations of s230 of the CDA, as Proctor has previously reported.
It may be an indication that the US courts are moving towards a position more in line with that in Australia, where the High Court has held on-line platforms responsible for third party comments which they encourage or facilitate (see Fairfax Media Publications Pty Ltd v Voller [2021] HCA 25 (8 September 2021)).
What is clear, however, is that courts are beginning to hold companies responsible for the activities of their algorithms. As the software in use progresses beyond merely searching the existing internet, and into curating – and in the case of some AI products, creating – content in response to customer requests, likes and reams of metadata, the perils increase.
Practitioners considering the use of AI-based tools to engage with clients and increase customer reach should be cautious; the sins of the software could well become theirs.