EU AI Act 2026 — What UK Small Businesses Actually Need to Do
Most UK small business owners have never heard of the EU AI Act. Fair enough. You have invoices to chase, customers to deal with and about 40 more urgent things on the list.
But if you sell into the EU, market to EU customers, or use AI tools whose output is used in the EU, this law may affect you. And if it does, the right response is not a meltdown. It is a simple compliance tidy-up.
Key takeaway
- The EU AI Act is not just for big tech firms. It can apply to UK businesses if their AI systems or AI outputs are used in the EU.
- For most small firms, the urgent jobs are basic ones. Work out where you use AI, what personal data goes into it, whether EU customers are involved, and whether staff understand the risks.
- UK GDPR still matters just as much. The ICO is already making clear that AI and even more autonomous “agentic” AI must comply with existing UK GDPR rules on lawfulness, accuracy, transparency and automated decisions.
What the EU AI Act actually is
In plain English, the EU AI Act is the EU’s main law for regulating AI. It uses a risk-based approach. Some AI uses are banned outright. Some are allowed but tightly regulated. Others are lower risk and face lighter rules. The Act entered into force on 1 August 2024. Some rules already apply, including the ban on prohibited practices and AI literacy requirements from 2 February 2025, and obligations for general-purpose AI models from 2 August 2025. The broader framework is being phased in, with major obligations applying from 2 August 2026.
That is the first useful thing to know. This is not a future problem. Parts of it are already live. More of it bites in 2026.
The second useful thing is that this is not only about firms physically based in the EU. The official scope says the Act applies to providers placing AI systems or general-purpose AI models on the EU market, deployers located in the EU, and also providers and deployers outside the EU where the output produced by the AI system is used in the Union. That is the bit many UK firms will miss.
So if you are a UK business using AI to support an EU-facing service, or supplying AI-driven outputs to EU customers, you should stop assuming Brexit magically removed the issue.
Does this apply to your business?
For many UK small businesses, the honest answer is: possibly, but not in every case.
If you are just using AI internally to tidy meeting notes, draft blog posts for a UK-only audience, or rewrite emails for UK clients, the EU AI Act may not be the main issue. Your bigger concern is usually UK GDPR, supplier terms, and common sense.
But the Act becomes more relevant if any of these apply:
- you sell goods or services to EU customers
- your website, ads or onboarding are aimed at people in the EU
- you use AI outputs as part of an EU-facing service
- you embed AI into a product or service used in the EU
- you build or heavily customise an AI-driven feature and put it on the EU market
Those are not edge cases. Plenty of UK freelancers, agencies, software firms, ecommerce brands and consultants do at least one of them.
A simple rule helps here:
If you just use AI tools inside the business
Your biggest legal risk in 2026 is usually UK GDPR compliance, not the full EU AI Act. You still need to know what personal data goes in, whether the outputs are reliable enough to use, and whether you are making any significant decisions on a purely automated basis.
If you sell AI-enabled services into the EU
Now the EU AI Act becomes much more relevant. At that point, you need to know whether you are simply a user of someone else’s tool, or whether you are acting more like a provider, deployer or supplier under the Act.
That distinction matters, because the obligations are not the same.
What UK small businesses should do first
This is where most articles go foggy. So let’s make it simple.
If you are a UK small business and you think the EU AI Act might touch you, do these jobs first.
1. Make a basic list of every AI tool you use
Not next month. This week.
Write down:
- the tool name
- what you use it for
- whether staff use it or customers see it
- whether personal data goes into it
- whether any EU customer or EU lead is involved
- whether the output affects a real decision
This sounds dull. It is. It is also the fastest way to stop guessing.
If your team already uses Notion (affiliate link) for internal docs, keep the register there. If you use HubSpot (affiliate link) or GetResponse (affiliate link) for sales and email workflows, include any AI features switched on inside those platforms too. The point is not to buy a “compliance tool”. It is to know what is already happening in your business.
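If it helps to keep the register consistent, the six questions above map straight onto columns. Here is a minimal, purely illustrative Python sketch that writes them to a CSV file — a spreadsheet or a Notion table does exactly the same job, and the tool names and answers below are made-up examples, not recommendations.

```python
import csv

# Columns mirror the six questions in the checklist above.
COLUMNS = [
    "tool_name",
    "used_for",
    "staff_or_customer_facing",
    "personal_data_goes_in",
    "eu_customers_involved",
    "affects_real_decisions",
]

# Made-up example rows, purely for illustration.
rows = [
    ["Example Writer AI", "drafting blog posts", "staff only", "no", "no", "no"],
    ["Example Chat Widget", "website customer chat", "customer-facing", "yes", "yes", "yes"],
]

with open("ai_register.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(COLUMNS)
    writer.writerows(rows)
```

However you store it, the point is the same: one row per tool, one honest answer per question, reviewed whenever a new tool is switched on.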
2. Sort your use into “internal help” and “customer-facing”
This is the next big filter.
Internal help includes things like:
- meeting summaries
- draft emails
- blog outlines
- internal research notes
- rewriting rough copy
Customer-facing or decision-facing use includes things like:
- AI chat on your website
- AI-written responses sent without review
- lead scoring or profiling
- screening candidates
- pricing or credit decisions
- fraud flags
- anything that could materially affect a person
The more your AI use affects real people, especially automatically, the more careful you need to be under both the EU AI Act and UK GDPR.
3. Check whether personal data goes into the tool
The ICO’s point is very plain: if you are processing personal data with AI, you need a lawful basis and you need to understand your role. Are you the controller, joint controller or processor? That applies even when you are using somebody else’s model or product.
This matters because many firms casually paste customer emails, call notes, CVs or support messages into AI tools without stopping to ask:
- should this data be in there at all?
- what is our lawful basis?
- what does the supplier do with it?
- is there a retention setting?
- is the data used for training?
That is not theory. That is the compliance work.
4. Decide whether you need a DPIA
The ICO says you must carry out a DPIA before deploying an AI system where the processing is likely to result in a high risk to people. If you cannot sufficiently mitigate that risk, you may need to consult the ICO before processing starts.
In plain English, you should be thinking seriously about a DPIA if your AI use involves:
- sensitive or large volumes of personal data
- profiling people
- automated decisions with legal or similarly significant effects
- vulnerable people
- monitoring behaviour
- a new or opaque use of personal data
For a lot of basic writing or design use, a full DPIA may not be necessary. But if you are using AI in recruitment, finance, risk scoring, support triage, health-related services, education or anything similar, do not skip this step.
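The bullet points above work as a triage checklist: if any one of them applies, a DPIA should be on the table. As a rough sketch only — the factor names are simplified assumptions and this is not legal advice — that triage could look like this:

```python
# Rough DPIA triage based on the risk factors listed above.
# Simplified illustration, not legal advice: if any high-risk
# factor applies, flag the use case for a DPIA.

HIGH_RISK_FACTORS = {
    "sensitive_or_large_scale_data",
    "profiling",
    "automated_decisions_significant_effects",
    "vulnerable_people",
    "behaviour_monitoring",
    "novel_or_opaque_processing",
}

def dpia_likely_needed(factors: set) -> bool:
    """Return True if any listed high-risk factor applies."""
    return bool(factors & HIGH_RISK_FACTORS)

# Example: a recruitment screening tool that profiles applicants.
print(dpia_likely_needed({"profiling", "automated_decisions_significant_effects"}))  # True
```

A "True" here does not mean you are doing anything wrong — it means you should document the risks properly before deploying, and get advice if you cannot mitigate them.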
The bit most UK firms will get wrong: “we’re just using a third-party tool”
This is the favourite excuse.
It is also not enough.
Using a well-known third-party platform does not remove your responsibilities. If your business is using Canva, Grammarly, Writesonic, Frase.io, Surfer SEO, Semrush, Notion, HubSpot or GetResponse, you still need to know what goes in, what comes out, and whether personal data is involved.
For many marketing uses, the legal risk is manageable. If you are using Canva to knock together a social graphic or Grammarly to tidy a blog draft, that is a very different risk profile from using AI to score leads, reject applicants or handle sensitive customer data automatically.
The practical step is this: review the supplier’s data processing terms, retention settings, training policy, security documentation and admin controls. Then set some house rules for staff. No customer data into public tools without approval. No AI-generated customer response sent blind. No fully automated decision that could seriously affect somebody without proper checks.
That will do more for compliance than reading 40 pages of legal commentary.
UK-specific context: what UK businesses actually need to do
This is the bit that matters most.
There are 5.7 million private sector businesses in the UK, and most are small. About 75% do not employ anyone other than the owner. That means most firms do not have a compliance team, an AI ethics board, or a spare day to decode regulation.
So what do you actually need to do?
UK GDPR still sits underneath everything
The ICO’s AI guidance says organisations must still meet the normal data protection standards around lawfulness, fairness and transparency when using AI. The ICO also stresses that its guidance applies to generative AI, and that organisations using generative AI that processes personal data need to ask basic questions around lawful basis and roles.
That means you should:
- identify your lawful basis for any personal data used in AI
- update privacy information where needed
- keep records of what tools are used and why
- limit what data staff can paste into tools
- make sure outputs are reviewed if accuracy matters
- have a process for dealing with rights requests
Be careful with automated decisions
The ICO is clear that people have rights under UK GDPR where they are subject to solely automated decisions, including profiling, that produce legal or similarly significant effects.
So if your AI setup is doing more than drafting copy — for example, filtering applicants, approving customers, prioritising support, setting prices, or flagging risk in a way that materially affects someone — you need to take that seriously.
Do not ignore accuracy and transparency
Skadden’s summary of the ICO’s latest thinking on agentic AI is useful because it cuts to the practical problems. The UK GDPR requires data to be accurate, yet agentic AI can hallucinate. It also requires transparency, yet more autonomous systems can make it harder to explain what data was used and how.
That is not just a problem for big firms building agents. It matters for any small business tempted to let AI handle customer-facing work without decent oversight.
Make AI literacy someone’s job
The EU AI Act’s AI literacy requirement is already in play. The Commission’s guidance says providers and deployers should ensure a sufficient level of AI literacy among staff and others dealing with AI systems on their behalf. It does not demand a formal exam. But it does expect organisations to make sure people understand what AI is being used, what the risks are, and how it should be used in context.
For a small UK business, this can be very simple:
- a one-page AI policy
- a short training session
- a list of approved tools
- a rule on personal data
- a rule on human review
- a rule on checking outputs before use
That is enough to put you miles ahead of the average firm.
What you do not need to do
You do not need to hire a big consultancy because somebody on LinkedIn used the words “governance framework”.
You do not need to stop using AI altogether.
You do not need to assume the EU AI Act applies in full just because you occasionally get an enquiry from France.
And you do not need to turn this into a six-month project.
Most small firms need a tighter grip on data, decisions and staff behaviour. That is the job.
If you want a broader starting point on useful tools before you get lost in compliance detail, read our guide to the best AI tools for small business. And if you want more plain-English updates like this, keep an eye on our newsletter.
Plain English verdict — what should you actually do right now?
Here is the short version.
If you are a UK-only business using AI internally
Your priority in 2026 is UK GDPR compliance. Make a tool list. Stop staff dumping personal data into random tools. Check your suppliers. Put a simple AI policy in place. Review anything customer-facing before it goes out.
If you sell into the EU or your AI outputs are used in the EU
Assume the EU AI Act may be relevant. Map your tools and use cases. Work out whether you are just using third-party software or providing an AI-enabled service into the EU. Train staff. Review contracts and supplier terms. And get proper legal advice if you are in a higher-risk area.
If you do one thing this week
Create a one-page AI register and a one-page AI policy.
That will give you more practical control than 90% of small firms currently have.
Want this kind of insight every week? Get the free AIBrief Weekly at aibrief.uk/newsletter/