What Lawyers and AI Companies Need to Know About Washington's HB 1170

Governor Ferguson signed HB 1170 into law, imposing new disclosure and technical requirements on providers of generative AI systems. At a high level, the Act requires covered providers to enable users to determine whether content was generated or materially altered by the covered provider's generative AI systems.

This blog breaks down the Act into seven key questions that matter most for legal and AI product teams.

1. How do you know if you are a "covered provider"? 

A covered provider is an entity that creates a generative AI system with more than one million monthly users that is publicly accessible to consumers in Washington for personal use. State, local, and tribal governments are expressly excluded from this definition.

The threshold is relatively clear, but generative AI providers should closely monitor user counts: once a system gains traction, the one-million-user threshold may be crossed quickly. It also remains unclear how users are counted across multi-product ecosystems, raising the possibility that counts may be aggregated across products.

2. What qualifies as "AI-generated or materially altered" content? 

Content is AI-generated if the video, image, or audio is created by the covered provider’s generative AI system. Content is “materially altered” if the covered provider’s generative AI system made significant changes that substantially altered the content’s underlying data.

The Act provides examples of minor modifications, including adjustments to brightness, contrast, or color, as well as the application of filters, resizing, format conversions, resampling, and denoising.

The Act does not provide examples of material alterations. Covered providers should work with engineers and legal counsel to determine what qualifies, while closely monitoring how regulators are likely to interpret the standard.

3. What does "provenance data" actually require covered providers to build?

Covered providers must include provenance data in content generated or materially altered by their generative AI systems, so that users can determine the content's origin. The provenance data can take the form of watermarking or metadata, but it must be difficult to remove or tamper with.

Although framed as a disclosure obligation, this requirement calls for deliberate technical integration. Covered providers will need to design and implement durable technical solutions. They should work with their engineers to assess feasibility and determine the appropriate approach based on the content type. 
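To make the engineering conversation concrete, the sketch below shows one illustrative building block for tamper-evident provenance: a signed record tying a provider and system name to a hash of the generated content, so any alteration of the content or the record is detectable. This is a simplified assumption-laden example, not what the Act prescribes; the key, field names, and functions are hypothetical, and real deployments would use managed signing keys and an established standard (such as C2PA Content Credentials) embedded in the content itself rather than a standalone record.

```python
import hashlib
import hmac
import json

# Hypothetical signing key; a real system would use a securely managed key.
SIGNING_KEY = b"example-provider-signing-key"

def attach_provenance(content: bytes, provider: str, system: str) -> dict:
    """Build a tamper-evident provenance record for generated content."""
    record = {
        "provider": provider,
        "system": system,
        "content_sha256": hashlib.sha256(content).hexdigest(),
    }
    # Sign the canonicalized record so edits to any field are detectable.
    payload = json.dumps(record, sort_keys=True).encode()
    record["signature"] = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return record

def verify_provenance(content: bytes, record: dict) -> bool:
    """Check that the content matches the record and the record is unaltered."""
    unsigned = {k: v for k, v in record.items() if k != "signature"}
    if unsigned.get("content_sha256") != hashlib.sha256(content).hexdigest():
        return False
    payload = json.dumps(unsigned, sort_keys=True).encode()
    expected = hmac.new(SIGNING_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, record["signature"])
```

Note that a detached record like this is tamper-evident but easily discarded; satisfying the Act's "difficult to remove" requirement is the harder problem and typically points toward embedding provenance in the content's metadata or a watermark, which is why early engineering involvement matters.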

4. What are the enforcement and litigation risks? 

This Act will take effect on February 1, 2027, and will be enforced by the Washington Attorney General under the authority of the Consumer Protection Act. The Act does not provide a private right of action.

Importantly, the Act deems violations to be unfair or deceptive acts in trade or commerce that affect the public interest. This gives the Attorney General a clear path to bring CPA claims.

Enforcement risk will depend on the Attorney General's priorities; if AI is a priority area, activity could escalate quickly.

5. How does this law relate to other AI laws?

This Act adds to the growing patchwork of AI transparency and disclosure laws recently enacted and broadly aligns with California’s AI Transparency Act (SB 942).

As the patchwork continues to grow, covered providers will need to decide how to operationalize compliance and whether to standardize it across jurisdictions: follow the strictest applicable standard everywhere, or maintain a jurisdiction-by-jurisdiction approach.

6. Are there exceptions? 

Yes! There are a few notable exceptions. This Act does not apply to:

  • Business-to-business use, sale, licensing, or distribution of generative AI systems
  • Products, services, websites, or applications that exclusively provide video games or interactive experiences
  • Systems used solely for upscaling, noise reduction, or compression

Because these exceptions are narrow, covered providers should avoid over-relying on them without a clear factual basis.

7. What should covered providers do now? 

Whether or not they clearly qualify as covered providers, creators of generative AI systems should start preparing now by taking the following steps:

  • Audit generative AI system user counts, including user counts in multi-product ecosystems, to determine whether systems are approaching the user threshold.
  • Map where AI-generated or altered content is produced across generative AI systems.
  • Audit what kind of AI-generated or altered content is produced by generative AI systems.
  • Evaluate commercially feasible, technical options for embedding provenance data into AI-generated or altered content.
  • Align with legal counsel and engineering on what constitutes “material alterations.”
  • Monitor regulatory developments, advisory opinions, and enforcement signals from the Attorney General.

Proactive steps taken now can significantly reduce future compliance burdens and litigation risks. If you have questions about how this Act may affect your business, or if you’d like help assessing your AI systems, our Technology Transactions team is here to assist.

Valerie Shmigol, Attorney

About this Blog

Stay current on legal news and issues, and learn more about Summit Law Group's practice groups.
