EU AI Act Countdown: Essential Compliance Requirements Starting August 2025

AI Governance • July 25, 2025 • 8 min read

With just days remaining until August 2, 2025, startups using or developing AI systems must prepare for the EU AI Act's first major compliance wave. Here's everything you need to know about the upcoming requirements and how to ensure your startup is ready.
Critical Deadline Alert

August 2, 2025 is a firm deadline: the European Commission has confirmed that there will be no delays or postponements. Obligations for General-Purpose AI models take effect on that date.

What Changes on August 2, 2025?

The EU AI Act introduces its first wave of binding obligations, primarily targeting General-Purpose AI (GPAI) models and establishing the foundation for broader AI governance across Europe.

General-Purpose AI Models: The First Priority

A GPAI model is an AI model that displays significant generality and can competently perform a wide range of distinct tasks across various applications. Under the European Commission's July 2025 guidelines, a model that can generate text and/or images and was trained using more than 10^22 floating point operations (FLOP) of compute is presumed to be a GPAI model; models trained with more than 10^25 FLOP are additionally presumed to pose systemic risk.
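To get a first sense of where a model sits relative to these thresholds, you can estimate training compute with the widely used 6·N·D heuristic (roughly 6 FLOP per parameter per training token). The heuristic and the example model size below are illustrative assumptions, not figures from the Act; a formal determination should follow the Commission's guidelines.

```python
# Indicative thresholds from the Act and the Commission's July 2025 guidelines
GPAI_PRESUMPTION_FLOP = 1e22   # presumed GPAI above this training compute
SYSTEMIC_RISK_FLOP = 1e25      # presumed systemic risk above this

def estimate_training_flop(n_params: float, n_tokens: float) -> float:
    """Rough training-compute estimate via the common 6*N*D heuristic."""
    return 6 * n_params * n_tokens

# Hypothetical 7B-parameter model trained on 2T tokens
flop = estimate_training_flop(n_params=7e9, n_tokens=2e12)  # ≈ 8.4e22 FLOP
print("Presumed GPAI:", flop > GPAI_PRESUMPTION_FLOP)
print("Presumed systemic risk:", flop > SYSTEMIC_RISK_FLOP)
```

For this hypothetical model the estimate lands above the GPAI presumption but well below the systemic-risk threshold, so the core GPAI obligations would apply but not the additional systemic-risk ones.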

Core GPAI Compliance Requirements:

  • Technical Documentation: Comprehensive "model cards" detailing development, training, and evaluation processes
  • Training Data Summaries: Public documentation of data sources, types, and preprocessing methods
  • Copyright Compliance: A policy to comply with EU copyright law, including respect for text-and-data-mining opt-outs by rights holders
  • Code of Practice Adherence: Following the GPAI Code of Practice published in July 2025

The Code of Practice: Your Compliance Roadmap

Released on July 10, 2025, the Code of Practice provides a voluntary but highly recommended framework for GPAI compliance. It's structured around three critical areas:

Transparency

Clear documentation of model capabilities, limitations, and intended use cases.

Copyright

Comprehensive tracking and legal compliance for all training data sources.

Safety & Security

Risk assessment, mitigation measures, and incident reporting protocols.

Code of Practice Benefits

Companies signing the Code benefit from a simplified route to demonstrating compliance, reduced administrative burden, and greater legal certainty while enforcement powers phase in through August 2026. Non-signatories can expect closer regulatory scrutiny.

Systemic Risk Models: Higher Stakes

GPAI models with systemic risk face additional requirements including:

  • Advanced Model Evaluation: State-of-the-art testing using standardized protocols
  • Adversarial Testing: Documented testing to identify and mitigate systemic risks
  • Incident Reporting: Reporting serious incidents to the AI Office without undue delay
  • Cybersecurity Protection: Adequate security measures for high-risk models

What About Startups?

The EU AI Act includes several provisions specifically designed to support startups and SMEs:

Regulatory Sandboxes

Startups receive priority access to regulatory sandboxes, allowing real-world testing under flexible conditions with no administrative fees. Documentation produced during sandbox participation can help demonstrate AI Act compliance.

Financial Relief Measures

  • Reduced Fees: Conformity assessment fees adjusted based on company size
  • Proportionate Penalties: Fines scaled to SME size (though maximum penalties remain at €35M or 7% of global turnover)
  • SME Relief: Simplified technical documentation requirements for SMEs and startups

Dedicated Support Resources

  • Simplified documentation forms for SMEs
  • Specialized guidance channels for startup queries
  • Tailored AI literacy and compliance training programs
  • Dedicated AI service desks in member states

Immediate Action Plan for Startups

Pre-August 2025 Checklist:

Assessment Phase
  • Complete AI system inventory
  • Classify risk levels
  • Determine your role (provider/deployer)
  • Identify GPAI models
Documentation Phase
  • Prepare technical documentation
  • Conduct copyright audit
  • Create training data summaries
  • Implement risk management procedures

1. Conduct a Comprehensive AI Audit

Start with a complete inventory of all AI systems your startup uses or develops. For each system:

  • Classify the risk level (minimal, limited, high-risk, unacceptable risk)
  • Determine if it qualifies as a GPAI model
  • Identify your role: provider, deployer, modifier, or distributor
  • Document intended use cases and potential risks
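As a sketch of what such an inventory might look like in practice, the record below mirrors the Act's risk tiers and operator roles; the field names and structure are our own assumptions, not a prescribed format.

```python
from dataclasses import dataclass, field
from enum import Enum

class RiskLevel(Enum):
    MINIMAL = "minimal"
    LIMITED = "limited"
    HIGH = "high-risk"
    UNACCEPTABLE = "unacceptable"

class Role(Enum):
    PROVIDER = "provider"
    DEPLOYER = "deployer"
    MODIFIER = "modifier"
    DISTRIBUTOR = "distributor"

@dataclass
class AISystemRecord:
    """One entry in the startup's AI system inventory (illustrative schema)."""
    name: str
    risk_level: RiskLevel
    role: Role
    is_gpai: bool
    intended_use: str
    known_risks: list[str] = field(default_factory=list)

inventory = [
    AISystemRecord(
        name="support-chatbot",
        risk_level=RiskLevel.LIMITED,
        role=Role.DEPLOYER,
        is_gpai=False,
        intended_use="Customer support triage",
        known_risks=["hallucinated answers"],
    ),
]

# Surface any GPAI models first, since those obligations bite in August 2025
gpai_systems = [r for r in inventory if r.is_gpai]
```

Keeping the inventory as structured data rather than a spreadsheet makes it easy to filter for the systems that need immediate attention.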

2. Prepare Technical Documentation

Create comprehensive "model cards" that include:

  • Model architecture and design decisions
  • Training methodologies and datasets used
  • Performance metrics and evaluation results
  • Known limitations and potential biases
  • Intended use cases and deployment guidance
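A lightweight completeness check over these sections can catch gaps before documentation is submitted. The section names below follow the checklist above, not an official template from the AI Office.

```python
# Hypothetical model-card skeleton; section names mirror the checklist above.
REQUIRED_SECTIONS = [
    "architecture",
    "training_data",
    "evaluation",
    "limitations",
    "intended_use",
]

def missing_sections(model_card: dict) -> list[str]:
    """Return checklist sections that are absent or left empty."""
    return [s for s in REQUIRED_SECTIONS if not model_card.get(s)]

card = {
    "architecture": "decoder-only transformer, 7B parameters",
    "training_data": "web crawl + licensed corpora (see data summary)",
    "evaluation": {"benchmark_scores": {"mmlu": 0.62}},
    "limitations": "",  # not yet written
    "intended_use": "general text generation via API",
}
print(missing_sections(card))  # -> ['limitations']
```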

3. Address Copyright Compliance

For any GPAI models, ensure you have:

  • Complete documentation of all training data sources
  • Legal permissions for copyrighted content
  • Filtering systems to avoid unauthorized content
  • Attribution systems where required
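A per-source training-data log supports this checklist by making rights status auditable. The fields and the review rule below are illustrative assumptions; the text-and-data-mining opt-out flag refers to the reservation mechanism under Article 4(3) of the DSM Directive.

```python
from dataclasses import dataclass

@dataclass
class DataSource:
    """One training-data source in the copyright audit log (illustrative)."""
    name: str
    licence: str              # e.g. "CC-BY-4.0", "proprietary-licensed", "unknown"
    opt_out_respected: bool   # TDM opt-outs honoured for this source

def flag_for_review(sources: list[DataSource]) -> list[str]:
    """Sources with unclear rights or unhonoured opt-outs need legal review."""
    return [
        s.name for s in sources
        if s.licence == "unknown" or not s.opt_out_respected
    ]

sources = [
    DataSource("licensed-news-archive", "proprietary-licensed", True),
    DataSource("public-web-crawl", "unknown", False),
]
print(flag_for_review(sources))  # -> ['public-web-crawl']
```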

4. Establish Governance Infrastructure

Implement internal processes including:

  • AI literacy training for relevant staff
  • Appointed responsible persons for AI compliance
  • Incident monitoring and reporting systems
  • Regular risk assessment procedures
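An incident record like the sketch below can anchor the monitoring and reporting process. The schema is our own illustration; the Act itself only requires that serious incidents involving systemic-risk GPAI models be reported to the AI Office, without prescribing a format.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class IncidentReport:
    """Illustrative serious-incident record for internal tracking."""
    system: str
    description: str
    detected_at: datetime
    reported_to_ai_office: bool = False

    def mark_reported(self) -> None:
        self.reported_to_ai_office = True

report = IncidentReport(
    system="support-chatbot",
    description="Model produced unsafe medical advice in production",
    detected_at=datetime.now(timezone.utc),
)
report.mark_reported()
```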

Looking Ahead: Future Compliance Waves

While August 2025 focuses on GPAI models, additional requirements are coming:

February 2026: Commission guidelines on classifying high-risk AI systems are due
August 2026: Most remaining provisions apply, including requirements for high-risk AI systems listed in Annex III; enforcement powers fully activate
August 2027: Requirements for high-risk AI embedded in regulated products (Annex I) apply, and GPAI models placed on the market before August 2025 must achieve full compliance

Penalties and Enforcement

The EU AI Act carries significant penalties for non-compliance:

Maximum Fines

  • Prohibited AI practices: up to €35M or 7% of global annual turnover
  • Most other violations, including high-risk system obligations: up to €15M or 3%
  • GPAI model violations: up to €15M or 3%
  • Supplying incorrect or misleading information to authorities: up to €7.5M or 1%

However, enforcement powers don't fully activate until August 2026, providing a crucial preparation window for startups.

Practical Next Steps

Ready to Get Started?

Take our AI Governance Assessment to evaluate your current compliance status and receive a detailed report with personalized recommendations.

  1. Assess Your Current State: Use compliance checkers and professional assessments to understand your obligations
  2. Prioritize GPAI Models: Focus immediate attention on any general-purpose AI systems
  3. Consider Code of Practice: Evaluate signing the voluntary code for the grace period benefits
  4. Seek Professional Guidance: Consult AI Act specialists for complex classifications and compliance strategies
  5. Monitor National Implementation: Stay updated on member state variations and enforcement approaches

The Bottom Line

The EU AI Act represents a fundamental shift in how AI systems must be developed, deployed, and maintained. While the August 2025 deadline focuses primarily on GPAI models, it marks the beginning of the world's most comprehensive AI regulatory framework.

Startups that proactively embrace compliance will not only avoid penalties but position themselves as trusted, responsible AI developers in an increasingly regulated market. The support measures and grace periods available show that regulators understand the unique challenges facing startups – but these benefits require active engagement with the compliance process.

The countdown has begun. The time for preparation is now.

Need Help with AI Act Compliance?

Privacy Pad specializes in helping startups navigate complex regulatory requirements. Our AI governance experts can assess your current compliance status and create a tailored roadmap for EU AI Act compliance.

Get Expert Guidance