
AI Strategy Workshop: How I Help Companies Find Their First $100K in AI Savings

Most AI strategies fail because they start with technology, not problems. Here is the exact workshop framework I use to find $100K+ in AI savings for every client.

Sebastian
March 23, 2026
16 min read

A 40-person SaaS company asked me to run a 2-day AI workshop last quarter. The CEO told me they wanted to "become an AI-first company." I told him to forget about that goal entirely.

By the end of day two, we had identified 7 workflows that were costing them $340K per year in manual labor. Three of those workflows were automated within two weeks using off-the-shelf AI tools. No custom models. No data science team. No six-month roadmap. Just targeted automation where it actually mattered.

That engagement is not unusual. I have run 20+ AI strategy workshops for companies ranging from seed-stage startups to 500-person organizations. The pattern is always the same: companies walk in thinking they need to "adopt AI." They walk out with 3-5 specific automations that will save them $100K or more in the first year.

Here is the exact framework I use.

Why Most AI Strategies Fail

Let me be blunt: you do not need an AI strategy. You need an automation strategy that happens to use AI.

The difference matters. When companies start with "we need to use AI," they end up with solutions looking for problems. They hire a data science team, spend six months building an internal ML platform, and then realize nobody in the organization has a clear use case for it.

I have seen this play out at least a dozen times. The three most common failure modes:

1. Technology-first thinking. "GPT-4 is amazing, let's find a way to use it." This is backwards. Start with the pain, not the tool.

2. Boiling the ocean. "We need an enterprise AI platform that handles everything." No, you need to automate the one workflow that is eating 20 hours of someone's week.

3. Ignoring data quality. Every AI implementation depends on data. If your data is messy, inconsistent, or siloed, no amount of AI sophistication will save you. Data quality remains the number one roadblock in AI implementation, and it needs to be addressed before anything else.

The companies that succeed with AI start small. They pick structured, high-value tasks. They prove ROI in weeks, not quarters. Then they scale.

That is exactly what my workshop framework is designed to do.

The AI Strategy Workshop Framework: 4 Phases

I have refined this framework over 20+ engagements. It compresses what most consulting firms stretch into 3-month discovery phases into an intensive 2-day workshop followed by a focused sprint.

Here is the overview:

Phase                             Duration   Output
Phase 1: Workflow Mapping         Half day   Complete map of high-cost manual workflows
Phase 2: AI Opportunity Scoring   Half day   Prioritized list of AI automation candidates
Phase 3: Quick Win Sprint         Half day   One working automation prototype
Phase 4: 30-60-90 Day Roadmap     Half day   Actionable implementation plan with owners

Before the workshop even starts, I run a 4-hour AI opportunity audit with the leadership team. This is where I learn the business, understand the org structure, and identify which teams to pull into the workshop. It is the boring pre-work that makes everything else possible.

Let me walk you through each phase.

Phase 1: Workflow Mapping (The Boring Part That Matters Most)

This is the phase nobody wants to do and the phase that delivers 80% of the value.

I bring together 8-12 people from across the organization — operations, customer support, finance, engineering, sales. Not just leadership. The people who actually do the work every day. Then I ask one question:

"Walk me through the most tedious part of your week."

Not "where could AI help?" — that question leads to fantasy. I want to hear about the real, grinding, repetitive work that makes people dread Monday mornings.

Here is what the mapping process looks like:

  1. Each participant writes down their top 5 most time-consuming repetitive tasks on sticky notes (or a Miro board if remote). One task per note.
  2. We group them into workflow categories: data entry, reporting, communication, review/approval, scheduling, research.
  3. For each workflow, we document:
     - Who does it and how often
     - How many hours per week it consumes
     - What the input and output look like
     - What tools are currently used
     - What the error rate is

By the end of Phase 1, we typically have 15-25 documented workflows with rough time estimates. I then calculate the annual cost of each workflow using a simple formula:

text
Annual Cost = Hours/Week × 52 × Fully Loaded Hourly Rate

For example, if a customer support manager spends 8 hours per week writing ticket summaries and follow-up emails, and their fully loaded rate is $55/hour:

text
8 hours × 52 weeks × $55/hour = $22,880/year

Multiply that across a team of 4 support agents doing similar work, and you are looking at $91,520 per year on a single workflow.
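In code, the costing step is just that multiplication. A minimal sketch (the function name and the optional `people` parameter are mine; the numbers are from the example above):

```python
# Phase 1 costing formula: Annual Cost = Hours/Week × 52 × Fully Loaded Rate.
def annual_cost(hours_per_week: float, hourly_rate: float, people: int = 1) -> float:
    """Annual cost of a manual workflow at a fully loaded hourly rate."""
    return hours_per_week * 52 * hourly_rate * people

# The support-manager example from the text: 8 h/week at $55/hour.
manager = annual_cost(8, 55)          # 22880
team = annual_cost(8, 55, people=4)   # 91520
print(f"${manager:,.0f} per manager, ${team:,.0f} across the team")
```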

This is always the moment in the workshop where the room goes quiet. Nobody has ever added up these numbers before.

Phase 2: AI Opportunity Scoring (The 2x2 Matrix)

Now we have a list of expensive workflows. The question becomes: which ones should we automate first?

Not every workflow is a good AI candidate. Some are too unstructured. Some depend on tribal knowledge that is hard to encode. Some touch sensitive data that requires careful governance.

I use a 2x2 scoring matrix that evaluates each workflow on two dimensions:

Business Impact (Y-axis, scored 1-10):
  • Annual cost savings potential
  • Error reduction impact
  • Employee satisfaction improvement
  • Customer experience improvement
Implementation Feasibility (X-axis, scored 1-10):
  • Data availability and quality
  • Technical complexity
  • Integration requirements
  • Regulatory and compliance risk
  • Change management difficulty

Every workflow gets plotted on the matrix:
text
High Impact │ SCHEDULE THESE    │ DO THESE FIRST
            │ (High impact,     │ (High impact,
            │  hard to build)   │  easy to build)
            │                   │
────────────┼───────────────────┼──────────────────
            │                   │
Low Impact  │ IGNORE THESE      │ QUICK WINS
            │ (Low impact,      │ (Low impact,
            │  hard to build)   │  easy to build)
            │                   │
            └───────────────────┴──────────────────
            Low Feasibility      High Feasibility

The top-right quadrant is where the gold is: high business impact, high feasibility. These are your first automations.

In my experience, every company ends up with 3-5 workflows in that top-right quadrant. These become the core of the AI implementation roadmap.

I also pay special attention to the bottom-right quadrant — quick wins. These are low-impact individually but easy to implement. Stacking 3-4 quick wins creates visible momentum and builds organizational buy-in for the bigger projects.
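The scoring and bucketing can be sketched in a few lines. The cutoff of 5 on each 1-10 axis and the sample workflows are illustrative assumptions, not fixed parts of the framework:

```python
# Bucket each workflow into a quadrant of the 2x2 matrix.
# Cutoff of 5 on the 1-10 scales is an assumed midpoint.
def quadrant(impact: int, feasibility: int, cutoff: int = 5) -> str:
    if impact > cutoff and feasibility > cutoff:
        return "DO THESE FIRST"
    if impact > cutoff:
        return "SCHEDULE THESE"
    if feasibility > cutoff:
        return "QUICK WINS"
    return "IGNORE THESE"

# Hypothetical workflows scored in Phase 2: (name, impact, feasibility).
workflows = [
    ("Ticket triage", 9, 8),
    ("Contract review", 8, 4),
    ("Meeting notes", 4, 9),
    ("Demand forecasting", 3, 2),
]
for name, impact, feasibility in sorted(workflows, key=lambda w: -(w[1] + w[2])):
    print(f"{name:20s} -> {quadrant(impact, feasibility)}")
```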

Phase 3: Quick Win Sprint (Automate One Thing in the Room)

This is the phase that separates my workshop from a consulting deck.

We pick the single highest-scoring workflow from Phase 2 and we build a working prototype before lunch. Not a mockup. Not a slide. A working automation.

For most workflows, this means connecting existing tools with AI in the middle. Common patterns:

  • Email triage and drafting: Connect Gmail/Outlook to an LLM via Zapier or Make. Incoming emails get classified, prioritized, and draft responses are generated.
  • Report generation: Pull data from a spreadsheet or database, feed it to an LLM with a structured prompt, output a formatted report.
  • Meeting summarization: Connect a transcription tool (Otter, Fireflies) to an LLM that extracts action items, decisions, and follow-ups.
  • Document review: Upload contracts or proposals to an LLM with a checklist prompt that flags missing sections, inconsistencies, or compliance issues.

The prototype does not need to be production-ready. It needs to be real enough that the room can see what the future looks like. When a CFO watches a 45-minute monthly reporting process happen in 90 seconds, the conversation shifts from "should we do AI?" to "what else can we automate?"

Here is a real example from a recent workshop. A fintech company had a compliance team spending 12 hours per week manually reviewing customer onboarding documents for completeness. We built a prototype using a structured prompt chain:

text
Step 1: Extract document fields (name, address, ID number, etc.)
Step 2: Cross-reference against required field checklist
Step 3: Flag missing or inconsistent fields
Step 4: Generate a review summary with confidence scores

Total build time: 3 hours. Estimated time savings: 10 hours per week. At the team's $50/hour fully loaded rate, that is $26,000 per year from one workflow, prototyped in a single morning.
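A rough shape of that four-step chain in code. `call_llm` is a stand-in for whatever LLM client you wire up, and the required-field checklist is illustrative, not the client's actual compliance list:

```python
import json

# Illustrative checklist; a real one comes from the compliance team.
REQUIRED_FIELDS = ["name", "address", "id_number", "date_of_birth"]

def call_llm(prompt: str) -> str:
    """Placeholder: connect your LLM provider of choice here."""
    raise NotImplementedError

def check_fields(extracted: dict) -> dict:
    """Steps 2-4: cross-reference the checklist, flag gaps, score the doc."""
    missing = [f for f in REQUIRED_FIELDS if not extracted.get(f)]
    return {
        "flags": [f"missing: {f}" for f in missing],
        "confidence": 1 - len(missing) / len(REQUIRED_FIELDS),
    }

def review_document(document_text: str) -> dict:
    # Step 1: extract the fields as JSON via a structured prompt.
    extracted = json.loads(call_llm(
        f"Extract these fields as JSON ({', '.join(REQUIRED_FIELDS)}):\n\n{document_text}"
    ))
    return {"extracted": extracted, **check_fields(extracted)}
```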

Phase 4: The 30-60-90 Day Roadmap

The final phase turns insights into action. I have learned the hard way that without a concrete plan with named owners and deadlines, workshop insights die within a week.

Here is the template I use:

Days 1-30: Foundation

  • [ ] Deploy the quick win prototype to production
  • [ ] Assign an AI champion (one person accountable for the program)
  • [ ] Conduct a data quality audit for the top 3 target workflows
  • [ ] Set up measurement baselines (time spent, error rates, cost)
  • [ ] Define governance guardrails: what AI can and cannot decide autonomously
Owner: AI Champion + Department leads of affected teams

Days 31-60: Expansion

  • [ ] Implement automation #2 and #3 from the prioritized list
  • [ ] Build internal documentation and runbooks for each automation
  • [ ] Train affected team members on new workflows
  • [ ] Collect quantitative results from the quick win deployment
  • [ ] Redesign affected roles — be explicit about how responsibilities shift
Owner: AI Champion + HR/People team for role redesign

Days 61-90: Scale

  • [ ] Implement automations #4 and #5
  • [ ] Establish a recurring AI opportunity review (monthly or quarterly)
  • [ ] Build an internal case study from quick win results
  • [ ] Present ROI report to leadership with recommendations for next quarter
  • [ ] Evaluate build-vs-buy decisions for more complex automations
Owner: AI Champion + CTO/VP Engineering

The most critical element here is change management. I always build role redesign, communication plans, and performance agreement updates into the roadmap. AI automation changes how people work. If you do not manage that transition explicitly, you will face resistance that kills adoption regardless of how good the technology is.

The 5 Workflows I Find in Every Company

After 20+ workshops, certain patterns repeat. These five workflows exist in nearly every company I work with, and all of them are strong AI automation candidates:

1. Status Reporting and Updates

Before: Team leads spend 2-4 hours per week compiling status updates from Jira, Slack, email, and docs into a weekly report for leadership.
After: An AI agent pulls data from project management tools, summarizes progress, flags blockers, and generates a formatted report. Human reviews and sends.
Typical savings: 3 hours/week per team lead. For a company with 5 teams: $42,900/year.

2. Customer Support Ticket Classification and First Response

Before: Support agents read each ticket, categorize it, assign priority, and write an initial response. Average handling time: 8 minutes per ticket.
After: AI classifies incoming tickets, assigns priority, routes to the right team, and drafts a first response. Agent reviews and sends. Average handling time: 2 minutes per ticket.
Typical savings: 75% reduction in first-response time. For a team handling 200 tickets/day: $156,000/year.

3. Invoice Processing and Expense Review

Before: Finance team manually reviews invoices, matches them to POs, flags discrepancies, and enters data into the accounting system. 15-20 minutes per invoice.
After: AI extracts invoice data, matches to POs automatically, flags only the exceptions for human review. 2-3 minutes per invoice for exception handling only.
Typical savings: 80% reduction in processing time. For a company processing 500 invoices/month: $62,400/year.

4. Meeting Notes and Action Item Tracking

Before: Someone takes notes during meetings, then spends 20-30 minutes after each meeting cleaning them up and distributing action items. Often, action items are lost or forgotten.
After: Meetings are transcribed automatically. AI extracts decisions, action items with owners, and follow-up dates. Summary is distributed immediately after the meeting ends.
Typical savings: 30 minutes per meeting. For a company running 40 meetings/week: $52,000/year.

5. Candidate Screening and Interview Prep

Before: Recruiters spend 15-20 minutes per resume reviewing qualifications, cross-referencing with job requirements, and preparing interview questions.
After: AI screens resumes against job requirements, generates a fit score, and prepares tailored interview questions based on the candidate's background. Recruiter reviews the AI output and makes the call.
Typical savings: 70% reduction in screening time. For a company reviewing 100 candidates/month: $36,400/year.

Add those up across a mid-sized company, and you are looking at roughly $350,000 per year in efficiency gains. Even capturing half of that puts you well past the $100K mark.
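As a quick sanity check, the five estimates above sum to just under $350K:

```python
# The five typical-savings figures quoted above, summed.
estimates = {
    "status reporting": 42_900,
    "ticket triage": 156_000,
    "invoice processing": 62_400,
    "meeting notes": 52_000,
    "candidate screening": 36_400,
}
total = sum(estimates.values())
print(f"${total:,} per year")  # → $349,700 per year
```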

How to Calculate AI ROI Before You Build Anything

I never let a client invest in an AI automation without a clear ROI projection. Here is the framework I use:

Cost of the Current Workflow (Annual)

text
A = Hours spent per week on the workflow
B = Number of people doing the workflow
C = Fully loaded hourly rate (salary + benefits + overhead, usually 1.3-1.5x base salary)

Current Annual Cost = A × B × 52 × C

Cost of the AI-Automated Workflow (Annual)

text
D = Hours spent per week AFTER automation (review, exceptions, oversight)
E = AI tooling cost (monthly subscription × 12, or build cost amortized over 3 years)
F = One-time implementation cost (amortized over first year)

Automated Annual Cost = (D × B × 52 × C) + E + F

Net Savings and ROI

text
Annual Savings = Current Annual Cost - Automated Annual Cost
ROI = (Annual Savings - F) / F × 100

Payback Period = F / (Annual Savings / 12) months

Let me run a real example. That compliance document review workflow from earlier:

text
Current state:
- 12 hours/week × 1 person × 52 weeks × $50/hour = $31,200/year

Automated state:
- 2 hours/week (exception review) × 1 person × 52 weeks × $50/hour = $5,200/year
- AI tooling: $200/month × 12 = $2,400/year
- Implementation: $8,000 one-time

Automated Annual Cost = $5,200 + $2,400 + $8,000 = $15,600 (year 1)
Annual Savings = $31,200 - $15,600 = $15,600 (year 1)
Annual Savings = $31,200 - $7,600 = $23,600 (year 2+)

ROI (year 1) = ($15,600 - $8,000) / $8,000 × 100 = 95%
Payback Period = $8,000 / ($15,600 / 12) = 6.2 months

A 95% first-year ROI with a 6-month payback period. That is the kind of number that gets budget approved immediately.
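The whole calculation fits in one small function. This sketch follows the definitions above exactly (the year-1 automated cost includes the one-time implementation cost F) and reproduces the compliance review numbers:

```python
# ROI framework: current cost, year-1 automated cost, savings, ROI, payback.
def ai_roi(hours_before: float, hours_after: float, people: int,
           rate: float, tooling_per_year: float, one_time: float):
    current = hours_before * people * 52 * rate
    automated_y1 = hours_after * people * 52 * rate + tooling_per_year + one_time
    savings_y1 = current - automated_y1
    roi_pct = (savings_y1 - one_time) / one_time * 100
    payback_months = one_time / (savings_y1 / 12)
    return savings_y1, roi_pct, payback_months

# Compliance review example: 12 h/wk → 2 h/wk, 1 person, $50/hour,
# $2,400/year tooling, $8,000 one-time implementation.
savings, roi, payback = ai_roi(12, 2, 1, 50, 2400, 8000)
print(f"Year-1 savings ${savings:,.0f}, ROI {roi:.0f}%, payback {payback:.1f} months")
```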

Common Mistakes: What Kills AI Initiatives

I have seen enough failed AI projects to spot the patterns. Here are the three mistakes that kill more initiatives than any technical challenge:

1. Vibe Coding the Solution

This is the trap that technical founders fall into most often. They see a workflow that could be automated, fire up an IDE, and start building a custom solution with the latest AI APIs. Two weeks later, they have a fragile prototype that handles 60% of edge cases and requires constant babysitting.

The fix: always evaluate existing tools first. Zapier, Make, n8n, and similar platforms can handle 70% of AI automation use cases without writing a single line of custom code. Only build custom when the off-the-shelf tools genuinely cannot do what you need.

2. Ignoring Data Quality

I cannot stress this enough. If the data feeding your AI automation is inconsistent, incomplete, or spread across 5 different systems with no single source of truth, the automation will produce garbage output. It will not matter how sophisticated your prompt engineering is.

Before any implementation, I run a data readiness checklist:

  • [ ] Is the input data structured and consistent?
  • [ ] Is there a single source of truth for each data type?
  • [ ] Are there documented data quality issues?
  • [ ] Can we access the data programmatically (API, database, export)?
  • [ ] Are there privacy or compliance constraints on data access?

If more than two of these boxes are unchecked, we fix the data problem before touching AI.
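That gate is simple enough to encode directly. A minimal sketch, with the checklist items paraphrased:

```python
# Data readiness gate: more than two failed checks means fix the data first.
READINESS_CHECKS = [
    "structured and consistent input data",
    "single source of truth per data type",
    "no documented data quality issues",
    "programmatic access (API, database, export)",
    "no blocking privacy/compliance constraints",
]

def data_ready(results: dict) -> bool:
    """results maps each check to True (passed) or False (failed)."""
    failed = [c for c in READINESS_CHECKS if not results.get(c, False)]
    if len(failed) > 2:
        print("Fix the data first:", "; ".join(failed))
        return False
    return True
```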

3. Boiling the Ocean

"Let's build a company-wide AI platform" is the most expensive sentence in enterprise software. I have watched companies spend $500K on AI infrastructure that nobody uses because it was designed for hypothetical future use cases instead of present-day pain.

Start with one workflow. Prove it works. Measure the ROI. Then expand. The companies that report 40% faster release cycles and 35% reductions in planning overhead did not get there by launching an AI moonshot. They got there by systematically automating one process at a time.

What Happens After the Workshop

The workshop is not the end. It is the starting gun.

In the weeks following every workshop, I stay engaged as an advisor. The AI champion we identified in Phase 4 becomes my primary point of contact. We have a weekly 30-minute check-in for the first 90 days to:

  • Remove blockers on implementation
  • Adjust priorities as we learn what works
  • Capture ROI data as automations go live
  • Identify new automation opportunities as the team's AI literacy grows

The best outcomes I have seen come from companies that treat AI automation as an ongoing practice, not a one-time project. They build the muscle of identifying repetitive work, scoring it, and automating it. After 6 months, they do not need me anymore. They have internalized the framework.

And that is the real goal. Not to make companies dependent on consultants, but to give them a repeatable system for finding and capturing efficiency gains with AI.

If you are a CTO, founder, or ops leader sitting on a pile of manual workflows and wondering where to start with AI — start here. Map the work. Score the opportunities. Build one prototype. Plan the rollout. The $100K in savings is almost certainly there. You just need a structured way to find it.



~Seb 👊
