Ninety days is not arbitrary. It is the minimum viable timeframe to go from “we should probably do something with AI” to “we have a working AI workflow that saves measurable time.” Shorter than that and you are just experimenting. Longer than that and you are procrastinating.

This roadmap is built from working with law firms, consultancies, accounting practices, recruitment businesses, and advisory firms. The sectors differ. The adoption pattern is remarkably similar.

The Three Phases

  1. Foundation (Days 1-30). Pick one use case. Select tools. Run a controlled pilot.
  2. Expansion (Days 31-60). Train the team. Scale the pilot. Start measuring ROI.
  3. Embed (Days 61-90). Make AI the default workflow. Set governance. Plan the next use case.

  • 1 pilot use case. Not five. One.
  • 30 days to first results, from decision to data.
  • 90 days to operational AI, embedded in the daily workflow.

Phase 1: Foundation (Days 1-30)

Week 1: Identify the Use Case

You are looking for one task that is:

  • High volume. Done at least weekly, ideally daily.
  • Time-consuming. Currently takes 2+ hours per instance.
  • Structured enough for AI. Has inputs, a process, and an evaluable output.
  • Low risk for a pilot. Mistakes are correctable, not catastrophic.

Best starting use cases by professional services sector:

  • Law: contract review
  • Search: candidate sourcing
  • Consulting: proposal drafting
  • Accounting: data extraction
  • Advisory: market research
  • All sectors: client reporting

Week 2: Select the Tool

Do not spend a month evaluating tools. Spend a week. The AI tooling landscape moves fast enough that any extensive evaluation will be outdated by the time you finish it.

Selection criteria for a first AI tool:

Criterion | Must have | Nice to have
Relevance | Directly addresses your pilot use case | Addresses multiple future use cases
Ease of setup | Usable within 1 day of signup | Self-service, no IT involvement
Data security | SOC 2 or equivalent, clear data policies | On-prem or EU hosting option
Pricing | Monthly billing, cancel anytime | Per-seat pricing that scales reasonably
Integration | Works with your existing tools (email, CRM) | API access for custom integration

Practical approach: Sign up for free trials of 2-3 tools. Give each tool the same real task (anonymised if needed). Compare output quality and time saved. Pick the winner by Friday.

Week 3-4: Run the Pilot

The pilot runs on one real project with one or two people. Not a sandbox. Not test data. Real work.

Pilot structure:

  1. Day 1-2: Tool setup, basic training for pilot participants.
  2. Day 3-10: Run the AI-assisted workflow alongside the traditional workflow. Do both. Compare.
  3. Day 11-15: AI-assisted workflow only. Track time, quality, and issues.
  4. Day 16-20: Compile results.

What to track:

  • Time per task (before vs. after)
  • Output quality (have a senior person blind-review AI-assisted vs. traditional output)
  • Error rate (what did AI get wrong, and how long did corrections take?)
  • User satisfaction (do the pilot participants actually like using it?)
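
The before/after comparison is simple arithmetic. A minimal sketch of how the pilot numbers might be tallied (all figures hypothetical; correction time is charged against the AI workflow so the comparison stays honest):

```python
# Hypothetical per-task timings in minutes, logged during
# the side-by-side phase (Days 3-10) of the pilot.
baseline_minutes = [140, 155, 130, 160]   # traditional workflow
assisted_minutes = [60, 75, 55, 70]       # AI-assisted workflow
corrections_minutes = [10, 5, 15, 5]      # time spent fixing AI errors

avg_before = sum(baseline_minutes) / len(baseline_minutes)

# Count correction time as part of the AI-assisted workflow.
avg_after = sum(a + c for a, c in zip(assisted_minutes, corrections_minutes)) \
    / len(assisted_minutes)

saved_pct = (avg_before - avg_after) / avg_before * 100
print(f"Average: {avg_before:.0f} min -> {avg_after:.0f} min ({saved_pct:.0f}% saved)")
```

The one-page summary in Week 5 needs nothing more sophisticated than this: average time before, average time after (corrections included), and the percentage saved.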

Phase 2: Expansion (Days 31-60)

Week 5-6: Share Results and Train

By now you have data. Share it. Not a 30-slide deck. A one-page summary:

  • What we tried
  • What happened (time saved, quality comparison)
  • What we learned
  • What we are doing next

Then train the broader team. Training should be hands-on, not theoretical. One 90-minute session where everyone uses the tool on a real task beats a week of e-learning modules.

AI training that works: give people a task they already know how to do, show them how to do it with AI, and let them compare. The “aha” moment is personal and immediate. No slide deck creates that.

Week 7-8: Scale the Pilot

Expand the pilot use case to the full team (or the full department). This is where adoption challenges surface:

The sceptics. Some people will resist. This is normal. The best response is not persuasion but evidence. Pair sceptics with enthusiastic early adopters and let results speak.

The over-enthusiasts. Some people will try to use AI for everything, including things it is bad at. Set clear boundaries: “Use AI for X. Do not use AI for Y yet. We will evaluate Y in Phase 3.”

The workflow friction. The pilot ran with motivated volunteers. Now you are dealing with the full team, including people who have done the task the same way for ten years. Invest time in workflow integration, making the AI tool sit naturally in the existing process, not as an add-on step.

  • 60-70% adoption in the first month: a realistic target for a willing team.
  • 30-minute weekly check-in: address issues before they calcify.
  • 2-3 weeks of habit formation: the time until AI use feels natural.

Phase 3: Embed (Days 61-90)

Week 9-10: Make It the Default

The shift from “pilot” to “embedded” is psychological. The AI tool moves from being the new thing to being the way we work.

Concrete steps:

  1. Update process documentation. Standard operating procedures now include the AI tool as the default path. The non-AI method is documented as the fallback.
  2. Update templates and checklists. If your workflow includes templates (proposal templates, review checklists, report formats), integrate AI assistance into them.
  3. Set expectations with clients. If AI is improving your speed or quality, tell clients. “We now use AI-assisted analysis to ensure comprehensive coverage” is a selling point, not a confession.

Week 11-12: Governance and Next Steps

Now that AI is part of the workflow, you need lightweight governance:

Data handling rules. What data can go into the AI tool? What cannot? For professional services firms handling client data, this is not optional. Write a one-page policy.

Quality review process. AI output should not go to clients without human review. Define who reviews, how, and what the escalation path is for AI errors.

Cost tracking. You now have real usage data. What is the monthly cost? What is the measurable time savings? Build the business case for continued investment.
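
The cost-tracking arithmetic fits in a few lines. A sketch with hypothetical figures: monthly tool spend against the value of hours saved at a blended professional rate:

```python
# Hypothetical monthly figures for the pilot use case.
tool_cost_per_seat = 40      # £ per seat per month
seats = 12
hours_saved_per_seat = 6     # per month, taken from pilot data
blended_hourly_rate = 120    # £ per hour of professional time

monthly_cost = tool_cost_per_seat * seats
monthly_value = hours_saved_per_seat * seats * blended_hourly_rate
roi_multiple = monthly_value / monthly_cost

print(f"Cost £{monthly_cost}, value £{monthly_value}, ROI {roi_multiple:.0f}x")
```

The blended rate is the contestable input: use the value of the time (billable rate or loaded cost), agree on it once, and keep it constant so month-on-month comparisons stay meaningful.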

Governance essentials (keep it simple)
Area | What to define | When
Data policy | What client data can/cannot be processed by AI tools | Day 61
Review process | Human review requirements before client delivery | Day 61
Quality bar | Acceptable error rate, correction workflow | Day 70
Cost tracking | Monthly spend, ROI measurement cadence | Day 75
Expansion criteria | What triggers adding the next AI use case | Day 80
Training cadence | How new team members learn the AI workflow | Day 85

Planning the Next Use Case

If Phase 1-3 went well, you now have a proven process for AI adoption. Repeat it. Go back to the use case selection criteria from Week 1 and pick the next highest-impact opportunity.

The second adoption cycle is always faster. The team knows the pattern. The sceptics have seen results. The governance is in place. What took 90 days the first time takes 45-60 days the second time.

The 90-Day Timeline

Week-by-week milestones
Week | Focus | Milestone
1 | Use case selection | One use case selected with clear success criteria
2 | Tool selection | One tool selected and licensed
3-4 | Controlled pilot | Pilot data: time savings, quality comparison, team feedback
5-6 | Results + training | One-page results shared, full team trained hands-on
7-8 | Scaled pilot | Full team using AI for the pilot use case
9-10 | Embed | AI tool is the default workflow, documentation updated
11-12 | Governance + plan | Data policy live, ROI measured, next use case selected

What Good Looks Like at Day 90

At the end of 90 days, a firm that has followed this roadmap will have:

  • One AI use case fully operational with measurable time savings
  • A trained team using the tool as part of their daily workflow
  • Data showing the ROI of the investment
  • A lightweight governance framework
  • A clear plan for the next use case
  • The organisational muscle memory to adopt new AI tools faster

That might sound modest. It is not. Most firms that attempt AI adoption without a structured approach spend 90 days experimenting with multiple tools, training no one properly, measuring nothing, and ending up with AI “projects” that live in someone’s browser tabs but never become part of the actual workflow.

The difference is discipline. One use case. One tool. Real measurement. Then scale.


BriefingHQ guides professional services firms through AI adoption. If you want a structured approach to your first 90 days, start with our assessment or talk to us about guided adoption.

Published by

BriefingHQ

AI strategy and search visibility for professional services firms. We help boutique consultancies, search firms, and advisory practices navigate AI adoption with clarity.

Questions AI assistants answer about this topic

How long does it take for a professional services firm to adopt AI?
A professional services firm can go from zero AI usage to measurable productivity gains in 90 days. The first 30 days focus on selecting tools and running a single pilot project. Days 31-60 expand the pilot, train the broader team, and measure results. Days 61-90 embed AI into standard workflows and establish governance. Full organisational adoption, where AI is the default rather than the exception, typically takes 6-12 months, but the 90-day sprint delivers enough proof points to justify the continued investment.
What is the biggest mistake firms make when adopting AI?
Trying to do everything at once. The firms that fail at AI adoption typically launch 5-10 initiatives simultaneously, spreading attention and budget too thin. The firms that succeed pick one high-impact use case, prove it works, and then expand. A law firm that starts with AI-assisted contract review and nails it will have more AI capability in 90 days than a law firm that simultaneously experiments with contract review, research, drafting, billing analysis, and client intake.
How much should a professional services firm budget for AI adoption?
For the first 90 days, budget £5,000-15,000 for tooling (AI subscriptions, API costs) and £2,000-5,000 for training (workshops, dedicated learning time). If you engage an external consultant to guide the process, add £15,000-40,000 depending on scope. The total first-year investment for a mid-market firm is typically £30,000-80,000. Most firms see positive ROI within the first quarter through time savings on their pilot use case.
Which AI tools should professional services firms start with?
Start with the tools closest to your existing workflow. For document-heavy firms (law, accounting): AI document review and drafting tools like Harvey, CoCounsel, or general-purpose tools like Claude or GPT-4 with custom prompts. For people-heavy firms (recruitment, consulting): AI sourcing and research tools. For client-facing firms: AI-assisted proposal generation and CRM enrichment. The specific tool matters less than the discipline of using it consistently on real work.

Get the Audit — from £750 ↗