What’s in the AI executive order, and what does it mean for startups?

(This post was updated on Nov. 6 to include additional information related to government use of AI.)

On Monday, President Biden signed a sweeping executive order addressing artificial intelligence, touching on many issues that will impact the startup and AI ecosystem. Among other provisions, the order seeks to regulate the most powerful “frontier” models, address government use of AI, direct the creation of new oversight mechanisms, instruct agencies to monitor potential biases, authorize an AI resource pilot, and ease some pathways for immigrants with AI skills. Taken together, the order is a mixed bag for startups, and with much left to agencies, the details of implementation will prove important.

The Good

Creating AI resources

The order directs the creation of a pilot program for the National AI Research Resource (NAIRR) that is consistent with the NAIRR Task Force’s recommendations. The order also directs the Small Business Administration to assess and revise existing small business grant, investment, and support programs, and to create new ones, to ensure they’re relevant and accessible to AI innovators.

Why it matters for startups: 

Innovating in AI is capital-intensive, and the resources advanced by the order can help level the playing field for startups. Ensuring the existing 7(a) loan, Small Business Investment Company, and other capital access programs are fit for the purpose of supporting AI firms will help startups access capital and help their existing resources go further. Meanwhile, the NAIRR, which startups helped to shape, stands to benefit startups twofold: by directly providing compute, datasets, and other resources, and by bolstering the U.S. AI talent pool. Of course, the president cannot unilaterally appropriate the requisite funds, so it falls to Congress to ensure these resources are adequately funded and successful.

Bolstering AI talent

The order includes a number of measures aimed at bolstering U.S. AI talent, including prompting changes to U.S. visa and immigration programs, supporting AI training programs, and fostering an AI-ready workforce. It directs the extension of domestic renewal to J and F visa types (researchers and students), streamlining the process, reducing costs, and mitigating interruptions to AI-related research, among other things. It will establish a program to identify and attract top AI talent to the U.S., and it will precipitate changes to streamline pathways for O-1A, EB-1, and EB-2 visas and for startup founders under the International Entrepreneur Rule.

Why it matters for startups: 

The U.S. faces a shortage of skilled talent, especially in AI, and these provisions should help to close that gap, bringing advanced AI hires within reach for more startups. What’s more, the changes should enable more founders to launch or relocate and scale their startups from the U.S. Immigration challenges have created headwinds for the startup ecosystem and immigrant founders for several years, especially in the face of an intransigent Congress, so these changes mark a promising step in the right direction. Meanwhile, the workforce training and retraining programs the order contemplates should further bolster AI talent while ensuring that retraining is forward-looking toward the jobs of the future.

Guidance around AI & existing law

The order seeks to confront a primary concern of many policymakers, civil society organizations, and others: potential bias in AI. It does so both through several steps related to government use of AI and by directing agencies to evaluate and issue guidance on how the existing laws they are responsible for, including those governing healthcare, education, finance, housing, and employment, interact with AI.

Why it matters for startups:

Engine has long highlighted that startups and others have been innovating with AI for years, including in key sectors like education, health, lending, and employment. We have underscored that many issues policymakers are worried about, especially around discrimination, are often already addressed by, or illegal under, existing law. As a result, it makes good sense for regulation of AI to begin with existing law, and we have previously suggested that agencies of jurisdiction should issue guidance making clear how they view the interaction of the laws they are charged with enforcing and the use of AI technologies. Clear guidance can help startups understand the law without leaving them to parse it alone, and (though the type and extent of guidance agencies are directed to issue varies) this seems to be a step in the right direction.

The Not-so-Good

Limiting open source and opening the door to regulatory capture

The most publicized part of the order is concerned with regulating powerful multi-purpose AI models and the compute resources used to build them. Above certain (for now, very high) thresholds, it subjects computing clusters and those building or intending to build such models to several reporting requirements, disclosures, and testing and security measures. The order primarily does this by (somewhat controversially) leveraging the Defense Production Act, a Korean War-era law usually reserved for use in national emergencies. It additionally prompts the National Telecommunications and Information Administration to study open source AI models.

Why it matters for startups: 

The compute-based thresholds in the executive order don’t come anywhere near directly impacting startups. In fact, they may be set above current frontier models or apply to only a few companies. However, they raise a few distinct concerns. One is regulatory capture: the reporting requirements will help to entrench the largest firms and slow others trying to catch up, perhaps leaving startups and others to innovate on top of those firms’ models. The other is that computational power and availability will continue to improve, meaning that what is far out of reach today will be in scope tomorrow (consider Cold War-era export controls on CPUs or the tax code). The order grants authority to revise the thresholds, but government moves slowly and usually expands, rather than narrows, applicability.

Perhaps most alarming for innovation is the likely impact on open source models, usually meaning publicly available pre-trained models that can be fine-tuned for other purposes or innovated upon. While the outcome of the NTIA study technically belongs in the ‘TBD’ column, the order’s security and reporting requirements presently discourage firms that were considering open sourcing their models from doing so. Open source plays a critical role in innovation and in the startup ecosystem in particular. Startups are able to iterate with open source, which has helped to drive innovation and lower barriers to technology entrepreneurship, so limiting open source in AI will be bad for startups.

The TBD

Intellectual property guidance

The order directs the U.S. Patent and Trademark Office (USPTO) and the U.S. Copyright Office to issue guidance related to AI and IP. The USPTO will need to address thorny questions around inventorship and patent eligibility, while the Copyright Office will issue guidance related to the copyrightability of works produced using AI and the treatment of copyrighted works in AI training data—issues addressed in the Office’s ongoing study on Copyright and AI.

Why it matters for startups: 

Depending on its contents, the guidance issued by the respective IP agencies could significantly harm the startup ecosystem, or it could create clarity for innovators and mitigate abuse and the threat of potentially ruinous litigation. As we have previously told the agencies, current law around patent eligibility is critical to ensure only truly novel inventions are patentable and to avoid bad faith litigation that arises from low-quality patents. Likewise, startups need to be able to ingest lots of data, and the current copyright framework allows startups on bootstrap budgets to build and scale AI-powered tools without having to find and negotiate a license every time a piece of copyrighted material is included in vast datasets. If that were to change (a real possibility, given the Copyright Office’s ongoing study exploring whether to require such licenses), it would chill innovation and harm startup competitiveness, in addition to being an impractical misapplication of existing copyright law.

Government use of AI

Perhaps the most direct implications of the order will be for how U.S. government agencies use, procure, and evaluate AI to advance their missions. Quickly following the order, the Office of Management and Budget, the agency responsible for overseeing federal agencies, released proposed guidance regarding government use of AI. Together, the order and the guidance will lay out how agencies should approach governance, manage risks, and leverage AI to improve public services.

Why it matters for startups:

Many startups, including those innovating in AI, currently have contracts with government agencies, plan to sell to the government in the future, or create products or services that would improve the provision of public services. Engine has repeatedly highlighted ways technology and the ingenuity of startups could be leveraged to improve government. Encouragingly, the order refocuses agencies on doing just that, but at the same time, the additional steps required to serve agencies’ needs may raise the already-high barriers startups face in government contracting.

* * *

This week’s executive order, parts of which lawmakers in Congress are already looking to pass into law, is just the latest development as policymakers around the world move to regulate AI. Startups are innovating in AI across all sectors and in many ways, both with their own unique AI products and by building on others’ general purpose models. Startups are key stakeholders in AI, and their perspectives should be included in policymaking to arrive at AI policy that promotes innovation and competition.

* * *

Engine is a non-profit technology policy, research, and advocacy organization that bridges the gap between policymakers and startups. Engine works with government and a community of thousands of high-technology, growth-oriented startups across the nation to support the development of technology entrepreneurship through economic research, policy analysis, and advocacy on local and national issues.