Family businesses should prioritize protecting consumers’ privacy and personal information—particularly children’s personal information—more than ever. Two bipartisan bills approved by the U.S. Senate Commerce Committee on July 27, 2022, and now headed for a full vote in the Senate would expand online privacy protections for children and minors and impose on certain businesses greater responsibilities for protecting children’s privacy.
Data protection is challenging for many businesses, including family businesses, because the United States does not currently have a national privacy law that establishes a uniform set of rights and obligations for the collection, processing, or disclosure of personal information. However, the U.S. Senate Committee on Commerce, Science, and Transportation on July 27 approved two bipartisan bills to expand federal protections for children and teenagers online. Specifically, the Senate Commerce Committee passed the Kids Online Safety Act (KOSA) and an update to the Children’s Online Privacy Protection Act (COPPA). The two bills are now headed for a full vote in the Senate.
COPPA, which was enacted in 1998, implemented through Federal Trade Commission (FTC) rules, and revised by the FTC in 2013 through rulemaking, aims to protect children’s privacy by mandating informed parental consent before personal information is collected from or processed about children. It imposes requirements on operators of websites or online services (operators) directed to children that collect, use, or disclose personal information from children under 13, as well as on operators with actual knowledge that they collect, use, or disclose personal information from children under 13. COPPA also applies to operators who knowingly collect personal information from users of other websites or online services directed to children. To learn more about COPPA, read the DWT article titled "Children’s Online Privacy Protection Act: What to Know and How to Comply."
The digital landscape has changed since Congress enacted COPPA. Today, companies collect, use, and disclose a wide range of digital information in ways that were not imaginable when COPPA was first enacted. The proposed expansion of COPPA would strengthen protections for children under 13 and extend to minors (ages 13–15) privacy protections that previously applied only to children 12 and younger.
The Children and Teens’ Online Privacy Protection Act (CTOPPA), which would extend the COPPA Rule to protect 13-to-15-year-old users, would prohibit an operator of a website, online service, online application, or mobile application directed toward a child or minor from collecting that child’s or minor’s personal information without providing sufficient notice and obtaining consent. Sufficient notice must be provided to, and consent obtained from, the parent in the case of a child and from the minor themselves in the case of a minor. CTOPPA would also prohibit targeted marketing directed to a child in all instances and to a minor without the minor’s consent. The bill would further prohibit the collection of more information than is reasonably required to participate in a game or use the website, service, or application.
The bill would further require operators to implement procedures to inform children of deletion mechanisms, to delete personal information upon request (except for information provided by a third party), and not to discontinue service in response to a deletion request if the service could be provided without that information. Further, companies would have to establish and implement reasonable security safeguards to protect the personal information collected from children and minors, as well as cybersecurity safeguards to protect information collected through internet-connected products targeted toward children and minors.
There is some opposition to amending COPPA. However, many senators from both parties expressed support for the bill’s purpose and said they could vote for it when it comes up for a full vote.
Unlike CTOPPA, KOSA was approved by the Senate Commerce Committee unanimously, 28-0. KOSA would require certain technology companies to prevent harm to minors while mandating more transparency in their algorithms. A "covered platform" under KOSA would be a "commercial software application or electronic service that connects to the internet and that is used, or is reasonably likely to be used, by a minor." A minor is any individual who is age 16 or younger.
KOSA would impose obligations on covered platforms to protect children online. For instance, it would require covered platforms to implement the strongest default safeguard settings for minors and to establish a method for minors and parents to submit reports of harm to minors, which the platform must receive and respond to. KOSA would also create a duty of care requiring covered platforms to act in the best interests of the minors using their products or services.
Moreover, before registration, use, or purchase, a covered platform would have to:
(a) provide minors and parents with notice of its policies regarding the use of personal data, along with appropriate safeguards that allow them to control their experience and personal data on the covered platform, in a manner that is age appropriate and does not encourage minors to weaken or turn off safeguards; and
(b) provide minors and parents notice about any systems that pose heightened risks of harm to minors by materials on, or engagement with, the platform.
The heightened risks of harm to minors by content on the platform are enumerated in the bill and would include: the promotion of self-harm, suicide, and eating disorders; patterns of use that show or encourage addiction-like behavior; sexual exploitation; physical harm, online bullying, and harassment of a minor; promotion and advertisement of products or services that are unlawful for minors; and predatory marketing practices.
Covered platforms that use algorithmic recommendation systems would have to clearly state how they use the personal data of minors. The platforms would have to provide options for minors or their parents to modify the results of the algorithmic recommendation system, including opting out of or down-ranking the recommendations.
To further aid transparency, KOSA would require platforms to conduct an annual independent audit assessing risks to minors. The audit would have to evaluate the platform for foreseeable harms to children and explain the prevention and mitigation measures intended to be taken in response to the known and emerging risks identified in the audit.
Children and teens continue to increase their use of technology and social media. Companies must take proactive steps to protect the personal information of these vulnerable users. All family businesses—particularly those that direct their products or services to children or teens online—must ensure compliance with ever-changing state and federal privacy laws to create a safe online environment for children and teens.
DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.
© Davis Wright Tremaine LLP | Attorney Advertising
Copyright © JD Supra, LLC