Record-setting FTC settlements with Fortnite owner Epic Games are the latest “Battle Royale” against violations of kids’ privacy and use of digital dark patterns


By Lesley Fair
December 18, 2022

Two separate settlements with Epic Games, owner of the massively popular online game Fortnite, send the unmistakable message to business that the FTC means business when it comes to enforcing online protections for kids and fighting back against dark patterns designed to rack up charges without consumers' express consent. If that isn't enough to make companies take notice, perhaps these numbers will. Epic will pay a record-shattering $275 million civil penalty for alleged violations of the Children's Online Privacy Protection Act. The company will turn over an additional $245 million for allegedly using dark patterns to dupe millions of Fortnite players into making unintentional purchases, the largest FTC administrative settlement ever.

This post will focus on the FTC's allegations of COPPA violations and on Epic's choice of default settings, which allowed strangers to communicate with children and teens under 18. Subscribers to the Business Blog can expect a second post shortly that will take a deep dive into how the FTC says Epic used design tricks to zap Fortnite players with unauthorized charges. You definitely don't want to miss Part 2.

First, a refresher about what the COPPA Rule requires. Section 312.3 makes it clear that the Rule covers operators of child-directed sites and online services – a determination made by evaluating the subject matter, visual content, use of animated characters or child-oriented activities and incentives, and other factors – and operators of sites and online services who have actual knowledge they're collecting or maintaining personal information from a child under 13. If a company is covered by COPPA, it must (among other things) get verifiable parental consent before collecting, using, or disclosing personal information from children under 13.

According to the FTC, a substantial number of the 400 million people who play Fortnite are kids under 13, and through its registration process, Epic collected kids' personal information – including their full names, email addresses, and usernames – without getting their parents' consent.

The complaint cites a number of factors to establish that Fortnite is a "child-directed" service. First, there's a 2019 survey reporting that 53% of U.S. children aged 10-12 played Fortnite weekly, compared to 33% of teens between 13 and 17 and 19% of those between 18 and 24. The style of game play is relevant, too, including Fortnite's cartoon-like graphics and colorful animation. Indeed, according to the complaint, Fortnite has proven so popular with children that Epic Games has approved licensing deals – and pocketed millions of dollars – for Fortnite-branded merchandise aimed at kids, including children's clothing, Halloween costumes, school supplies, and toys.

Other persuasive evidence came from Epic's own employees. The complaint quotes statements like "We want to be living room safe, but barely. We don't want your mom to love the game – just accept it compared to alternatives," "Agree with the idea that, generally, all theming should be relevant to a 8-14 y.o., as a litmus test," and "We are NOT adult: experience must allow for parental comfort for ages 10+."

The FTC alleges that Epic launched Fortnite with no parental controls and minimal privacy settings. Instead, the company included one paragraph toward the end of a lengthy Privacy Policy disavowing that the game was kid-directed. Thus, for two years, Epic allegedly took no steps to seek parental consent before collecting children's personal information. Furthermore, as the complaint alleges, "Even when Epic obtained actual knowledge that particular Fortnite players were under 13, Epic took no steps to comply with COPPA" during this timeframe. The company finally instituted an age gate in 2019, but Epic didn't apply it to most of the hundreds of millions of Fortnite players who already had accounts. The complaint alleges that Epic violated COPPA by failing to honor the requirements designed to ensure that parents – not companies – are in control of kids' information online.

But according to the FTC, Epic's law violations didn't end there. In designing Fortnite to match users to play the game together, Epic set it up so that, by default, players could engage in direct, real-time voice chat with other players. Given the number of Fortnite players who were young kids or teenagers, the inevitable result was that children and teens were often matched with strangers.

Epic's then-Director of User Experience spotted the problem early on. Noting that "surely a lot of kids" are playing Fortnite, the Director of User Experience urged Epic leadership to institute "basic toxicity prevention" mechanisms to "avoid voice chat or have it opt-in at the very least." An Epic employee raised a similar concern after a high-profile gamer verbally harassed a young player while publicly streaming to an audience of thousands. As the employee acknowledged: ". . . we honestly should have seen this coming or [at least] expected this with an on-by-default voice chat system. Situations like this are bound to happen . . ." Another employee summed up the problem this way:

"I think you both know this, but our voice and chat controls are total crap as far as kids and parents go. It's not a good thing. It was on my list a year ago, but never bubbled to the surface. This is one of those things that the company generally has weak will to pursue, but really impacts our overall system and perception. I've made a COPPA compliant game and we are far from it, but we don't need to be that far . . ."

How did Epic respond to its own employees' concerns? According to the FTC, with lip service – followed by crickets. Despite entreaties from its staff, Epic chose to maintain its on-by-default in-game communications that allowed personal interactions between kids and strangers. When the company introduced a toggle switch allowing Fortnite players to turn voice chat off, the FTC says the control was buried on a hard-to-find settings page. Furthermore, even after Epic ultimately implemented an age gate, the FTC says the company continued to enable direct communication by default for all players, including those who identified themselves as under 13 or as teens.

The complaint outlines disturbing allegations of how Epic's choice of default settings resulted in harm to kids and teens, including threats, bullying, and sexual harassment. Numerous news stories reported that predators had coerced youngsters they met through Fortnite into sharing explicit images or meeting offline for sexual activity. In addition, some kids and teens were exposed to traumatizing encounters involving self-harm, suicide, and suggestions by others that a player "kill themselves." As one parent reported to Epic, "This morning, while on Fortnite, my 9 year old son had a 'friend' (someone he doesn't know in real life, but has been playing with for months) tell him that he was going to kill himself tonight. It shook him to the core."

In addition to the $275 million civil penalty, which by law goes to the U.S. Treasury, the proposed court order prohibits Epic from enabling voice and text communications unless the parents of users under 13, or teenage users (or their parents), give their affirmative consent through a privacy setting. Epic also must delete personal information previously collected from Fortnite users in violation of COPPA's parental notice and consent requirements unless the company obtains parental consent to retain that data or the user identifies as 13 or older through a neutral age gate. To protect kids and other users in the future, Epic must establish a comprehensive privacy program that addresses the issues challenged in the FTC complaint.

What can other companies take from the record-setting settlement?

Companies can't disclaim their way out of COPPA coverage. Simply saying your business isn't covered by COPPA doesn't absolve you of your legal obligations. The COPPA Rule includes detailed definitions of the sites and online services subject to the law's protective provisions. If there is any doubt in your mind about whether COPPA applies to your business, now is the time to clear up that ambiguity.

Listen to what your employees are telling you. When a knowledgeable staffer says, "Houston, we have a problem," take their concerns seriously. One of a company's best tools for reducing the risk of legal quicksand is a staff that feels empowered to call management's attention to potential difficulties.

Default settings that harm consumers can be unfair under the FTC Act. As the complaint alleges, Epic's choice to configure its system for on-by-default voice and text chat injured both kids in the under-13 COPPA age group and teens. Think through the potential for harm your default settings could have for users of all age groups.

Looking for COPPA compliance resources? Visit the FTC's Children's Privacy page. And be sure to read the follow-up Business Blog post about the FTC's challenge to Epic's use of digital dark patterns.


