Startup News Digest 05/05/23

The Big Story: “Children’s safety” push for more data collection, content scanning 

Lawmakers reintroduced and advanced legislation that would push Internet companies of all types and sizes to collect more data from their users and scan public and private user content in the name of children’s safety. The bills—including the Kids Online Safety Act and the Children and Teens’ Online Privacy Protection Act, which were reintroduced this week, and the Eliminating Abusive and Rampant Neglect of Interactive Technologies (EARN IT) Act, which advanced out of the Senate Judiciary Committee this week—would remake the regulatory framework that enables startups to grow and succeed.

The EARN IT Act amends Section 230 and would open up Internet companies to lawsuits under state laws—which have varying and lower bars for when a company can be held liable—if the companies host child sexual abuse material (CSAM), even unknowingly. That liability would make companies that host user content less willing to do so, more likely to proactively scan content like images and private messages, and less likely to offer privacy-protecting tools like encryption that would prevent that scanning. Ahead of the markup, technology industry stakeholders and over 100 civil society organizations wrote to the Committee to voice these privacy and security concerns, which were echoed by some lawmakers on Thursday. The committee delayed votes on two other bills—the STOP CSAM Act and the Cooper Davis Act—which would also push companies to proactively scan user content and report illegal content (CSAM and controlled substances, respectively) to authorities.

The Children and Teens’ Online Privacy Protection Act would update existing rules by adjusting the legal standard that determines when companies are responsible for knowing the age of their users, which would require companies to collect additional data about their users. Right now, companies have obligations around data use that kick in when they have “actual knowledge” that a user is under the age of 13. The proposed change would shift that standard to “constructive knowledge” of a user’s age, meaning companies would be expected to infer a user’s age based on their actions. Another bill, the Kids Online Safety Act, would give parents control over how their children under the age of 17 use Internet platforms and open up companies to liability if they let young users see harmful content. These proposals would have companies collect and analyze additional information about all users to figure out which users are not adults. Beyond the implications for user privacy and expression, most startups don’t have the capacity to collect, analyze, and secure information beyond what they actually need to deliver their services.

As lawmakers introduced and considered these proposals this week, much of the conversation focused on large tech companies, most of which already have programs in place to report illegal content to authorities, can make inferences based on additional data collection, and are generally better equipped to navigate compliance burdens and litigation costs. As policymakers consider changes to intermediary liability and privacy frameworks, they should take into account the limited resources of the startups that currently operate under those frameworks.

Policy Roundup: 

Importance of startups on display during National Small Business Week. This week marked National Small Business Week, with policymakers, advocates, and startups themselves voicing the need for a policy environment that enables entrepreneurs to succeed. For example, bipartisan groups of lawmakers introduced pieces of legislation to support educational opportunities and capital access for entrepreneurs.

State censorship bills threaten Internet infrastructure. In a new op-ed this week, Chamber of Progress founder Adam Kovacevich explained how bills restricting information about reproductive health would undermine the open Internet. Earlier this year, Republican lawmakers introduced a bill in Texas that, if enacted, would force websites to remove content about reproductive health care and would force Internet service providers to block websites that feature such information. Subverting open Internet principles and attaching liability for user content dissuades innovators from creating spaces for people to talk about issues important to them, including their health.

Impending H-1B rule highlights need to raise low visa cap. The U.S. Citizenship and Immigration Services is working on a rule to modernize the H-1B visa process to address, among other things, potential abuse by beneficiaries submitting multiple entries to increase their chances. The latest program cycle saw record submissions, which the agency will likely attempt to address by limiting the lottery to “unique beneficiaries” based on identification like passport numbers. The extreme lengths that some companies and individuals will go to in order to hire talented workers underscore the need for additional fixes to the immigration system—including increasing the cap for H-1B visas—to allow more pathways for talent from around the world to contribute to the U.S. economy.

EU AI Act moving forward, vote expected next week. Members of the European Parliament reached a deal ahead of key committee votes next week to solidify their position on the bloc’s AI Act. The proposed legislation would regulate artificial intelligence systems, assigning obligations according to risk, with higher-risk categories subject to more stringent requirements. Startups on both sides of the Atlantic have worried that the law could subject innocuous uses of AI—including general purpose AI—to additional obligations and therefore make developing products in those areas uncompetitive.