HOW THE RULING WAS DECIDED
In the Los Angeles case, Kaley's lawyers argued that Meta and Google deliberately targeted children through platform design, rather than content, and made choices that prioritised profit over safety.
The lawyers' strategy made it harder for the companies to hide behind legal provisions such as Section 230, which generally shields platforms from liability over user-generated content.
Jurors were shown internal documents revealing how Meta and Google sought to attract younger users, and heard testimony from executives, including Meta CEO Mark Zuckerberg.
One juror, who identified herself only as Victoria, said the panel focused heavily on what protections the platforms had in place to shield Kaley from harm, as well as on the long-term consequences for future young users.
"We looked at the history of everything that Kaley went through, and what was the process that these platforms had in place that was going to possibly prevent any harm," she said.
Collin Walke, partner and head of the cybersecurity and data privacy practice at law firm Hall Estill, said the case's focus on platform design rather than content mattered in the eventual ruling.
The content posted on social media is not the responsibility of the companies, Walke explained.
"But what is their responsibility is the manner and method by which they design their algorithms in order to show you that content," he said.
"And that is a unilateral choice that they make in the design of their products, and that is why they were found liable here."
