AI regulation is coming: what should companies do now?

December 15, 2025 · 4 min read
For businesses in the United States, AI governance will soon become a compliance imperative, not just a best practice.

When the European Union adopted its Artificial Intelligence Act in 2024, it set a global benchmark for risk-based AI regulation. The law's phased rollout, beginning with bans on the most harmful AI practices and moving to comprehensive governance obligations by 2026, has influenced legislative agendas around the world. For U.S. companies, the law is a signal that AI governance will soon become a compliance imperative, not just a best practice.

Landmark Colorado law will have ripple effects

In May 2024, Colorado became the first state to enact a comprehensive AI law, the Colorado Artificial Intelligence Act. Modeled in part on the EU framework, the law imposes obligations on developers and deployers of high-risk AI tools, such as those used to make employment, housing, health care and lending decisions. Those obligations include impact assessments, risk management programs, transparency and human oversight.

Originally scheduled to take effect in February 2026, enforcement of Colorado's law has been delayed until June 2026 amid industry pressure and legislative changes. Nonetheless, Colorado's landmark law has inspired similar measures in other states, such as California and Illinois, and could still influence New Hampshire's current legislative session.

Momentum builds in state legislatures

While Colorado leads with a comprehensive AI law, other state legislatures are advancing both broad and targeted measures. California has passed several such laws, including the Transparency in Frontier Artificial Intelligence Act, which mandates disclosures and safety protocols for developers of advanced AI models. California also has laws addressing chatbot safety and consumer protection that take effect in 2026.

Illinois and New York City have focused on employment-related AI. Their laws require notice to, or consent from, applicants before AI tools are used in hiring, and either prohibit automated employment decisions or require that they be audited. General privacy laws, including New Hampshire's, also restrict automated decision-making, in employment as well as other contexts.

New Hampshire has yet to pass a general AI law, opting instead for narrower measures that address specific risks. For example, current New Hampshire law prohibits state agencies from using AI for warrantless real-time biometric surveillance or discriminatory profiling, and restricts specific uses of generative AI, such as deepfakes and communications with minors.

Federal legislation and executive orders

At the federal level, comprehensive AI legislation remains elusive. Instead, the policy landscape is shaped by executive action. In early 2025, President Trump signed Executive Order 14179, titled Removing Barriers to American Leadership in Artificial Intelligence, which revoked earlier safety-focused mandates and prioritized innovation. Most recently, a draft executive order leaked in November 2025 signaled an intent to preempt state AI laws, citing concerns that a “patchwork” of regulations could stifle competitiveness. The draft proposed creating a federal AI task force and conditioning federal funding on states’ compliance with national rules. Although the order has not been finalized or released, it highlights the tension between federal uniformity and states’ rights, a debate that will shape AI governance in 2026 and beyond.

Businesses should start preparing for compliance now

Whether state or federal regulations emerge during this legislative session or in the near future, businesses should start preparing to comply now. Here are three main steps to achieve this.

1. Conduct an AI-use assessment. Inventory all AI tools the company already uses and identify AI technologies that would benefit the organization (a minimal illustrative sketch of such an inventory follows this list).
2. Establish an AI governance framework. Create a cross-functional AI governance team that includes leaders from across the company, along with technology and legal advisors with AI expertise. Develop written policies that align with existing regulations and emerging standards, such as the EU AI Act and the AI Risk Management Framework promulgated by the National Institute of Standards and Technology.
3. Integrate AI into operations. Operationalize AI through testing, prototyping and production deployment. Ensure that appropriate due diligence is carried out and that contracts with suppliers address their use of AI.
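To make the inventory step concrete, here is a minimal sketch of what an internal AI-use register might capture, written in Python purely for illustration. The fields and the example tool are assumptions, not requirements drawn from the Colorado law, the EU AI Act or the NIST framework; the point is simply to record each tool, what it does, whether it touches a high-risk decision area, and who is accountable for it.

```python
# Minimal sketch of an AI-use inventory (illustrative only; field names are
# assumptions, not terms defined by any statute or framework).
from dataclasses import dataclass, asdict
import json

@dataclass
class AIToolRecord:
    name: str              # tool or model in use
    vendor: str            # internal team or third-party supplier
    business_use: str      # decisions or tasks it supports
    high_risk: bool        # touches employment, housing, health care, lending, etc.
    data_inputs: str       # categories of personal data it consumes
    human_oversight: str   # who reviews or can override its outputs
    owner: str             # accountable business or governance contact

inventory = [
    AIToolRecord(
        name="ResumeRanker (hypothetical)",
        vendor="Third-party SaaS",
        business_use="Screens and ranks job applicants",
        high_risk=True,  # employment decisions fall in the high-risk category
        data_inputs="Applicant resumes, assessment scores",
        human_oversight="Recruiter reviews every ranked shortlist",
        owner="HR / AI governance team",
    ),
]

# Surface the records that would likely need impact assessments
# and risk management programs under high-risk AI rules.
print(json.dumps([asdict(r) for r in inventory if r.high_risk], indent=2))
```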
AI is not a distant concept; it is a current business reality. With hundreds of AI-related bills introduced in the United States and global frameworks such as the EU AI Act and the NIST AI Risk Management Framework setting the bar for compliance, businesses must act now to maintain their competitive advantage. Don’t wait for a law to force you to comply. Lead the way.

Cam Shilling founded and chairs McLane Middleton’s Cybersecurity and Privacy Group. The group of six lawyers and a paralegal helps businesses and private clients improve their AI security, privacy and compliance, and address any incidents or breaches that occur.
