A couple of months back, I'm sitting here thinking, "Damn, we're not adopting AI company-wide like I expected."
And according to a McKinsey study, C-suite leaders estimate that only 4% of employees use generative AI for at least 30% of their daily work. But when employees were asked directly, that number was actually three times higher.
If execs are missing that much of what’s actually happening on their own teams… maybe I was one of those execs too?
I saw this post in our Slack and was like… damn. We're doing this much work in AI?
How did I not know? Then I looked at the names. This many *different* people were contributing? We had execs on the board. Associates. New hires. Veterans.
I was impressed (here is just one example) and, to be honest, surprised. How did this happen? What did we do?
I believe in interviewing SMEs as the way to make content that stands out, so I sat down with our President Crystal, our VP of People Ops Emily, and our VP of SEO & AI Alisa to unpack it. Here is what I learned.
We Didn't Have an AI Adoption Problem. We Had a Clarity Problem.
"Work on stuff" is not a strategy. You need structure: enablement, incentives, communication, and accountability, otherwise even the BEST ideas aren’t going to happen.
My style, right or wrong, is VERY much "learn it yourself if you want greatness, surround yourself with other people who want greatness, and it'll all work out." But in a company of our size, that just doesn't work.
I want to build Seer into a company that has a prototype employee. Back when combining SEO & PPC data was all the rage (still is), I wanted to build teammates that, when hired elsewhere, people knew they were getting someone steeped in Power BI, Tableau, cross-divisional data, etc. When it comes to AI, I want the same: I want working at Seer to mean something instantly on your resume. To do that, we've gotta re-invent, democratize, and try to have one of the leading voices once again. We've learned a lot of lessons from our previous failed attempts, which is making this one go a BUNCH better.
Here's the blueprint for making AI everyone's job:
- Tie Real Money to Real Behavior
You get what you pay for. If you want AI adoption, pay for it.
Emily put it best: “Tracking time? That’s table stakes—like powering your laptop. No one hires you for that. We swapped out time tracking compliance for AI contributions.”
Boom. Real dollars tied to real work. Not lip service.
- Build a System People Can Actually Use
Good ideas don’t die from lack of talent. They die from lack of structure.
That’s why we built Grab & Go: a frictionless way for anyone to spot a problem and build an AI-based solution.
“Top-down mandates don’t work here. I wanted visibility, transparency, and peer-to-peer competition. Most of all, I wanted a scoreboard.” – Alisa
- Define the Job, Not the Buzzword
“Experiment with AI” means nothing.
Grab & Go gives people clear prompts, real business problems, and structured success metrics.
- Make Innovation Visible
Those 5-, 6-, and 7-figure checks get written for AI investments, but leaders like me rarely see where the work is actually happening. This elevates the work actually being done: a live view of who's building, what's shipping, and where the impact is.
You Don’t Need Another AI Pilot. You Need a System.
Let’s be real:
You don’t have an AI adoption problem. You’ve got a follow-through problem.
Seer, like so many other companies, has smart people with good ideas. But where do those ideas go? They sit in Slack threads. They vanish in one-off emails. They float through meetings and disappear.
We had all this internal innovation potential, but we weren't capturing it efficiently. We needed a clear, transparent place to collect, organize, prioritize, and implement the good ideas.
8 Steps to Build the System Yourself
Because good ideas die when you don’t give them a place to live.
Step 1. AI Idea Submission
The submission criteria are simple: Have an idea? Share it in the chat, and trust that the AI Council will find it. Zero barrier to entry.
Throwing a budget at a tool and hoping the team runs with it … doesn’t work. You need people with time dedicated to spotting ideas, supporting experimentation, and seeing builds through.
At Seer, we have two public Slack channels focused on AI. One is a more general channel for sharing ideas, reads, and AI news. The other is a more tactical channel where the team shares their wins, their struggles, and their questions.
Step 2. AI Council Review
The council’s job is to evaluate the idea from a technical perspective, asking questions like:
- Is this technically feasible with our current AI capabilities?
- Does it leverage or advance our internal expertise?
- Are there known solutions or past attempts that could inform or improve this idea?
- Could this solution inadvertently duplicate something already in progress or recently implemented elsewhere?
If it passes this initial test, the AI Council creates a task to represent the idea on the Grab & Go board. And, importantly, folks can expect an initial review within 24-48 hours.
Now, the idea is up for grabs for anyone in the company to build. This also lowers the barrier to entry when the person who had the idea doesn't have the bandwidth to execute it.
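If you're curious what that lifecycle looks like end to end, here's a minimal sketch of how a Grab & Go task and its states could be modeled. However the board is actually implemented, the field and status names below are just my shorthand for the stages described in this post (the review stages are condensed), so treat it as illustration, not our real system.

```typescript
// Illustration only: a hypothetical model of a Grab & Go task's lifecycle.
// Status names mirror the stages described in this post; field names are assumptions.
type GrabAndGoStatus =
  | "submitted"       // idea dropped in the Slack channel
  | "council_review"  // AI Council feasibility check (24-48 hours)
  | "up_for_grabs"    // approved and waiting for a builder
  | "in_progress"     // grabbed; the 2-week build window is running
  | "completed";      // built, tested, documented, and released

interface GrabAndGoTask {
  title: string;
  submittedBy: string;
  status: GrabAndGoStatus;
  grabbedBy?: string; // set when someone claims the task
  grabbedAt?: Date;   // start of the 2-week build window
}

// The 2-week urgency rule (see Step 6) as a simple check.
function isOverdue(task: GrabAndGoTask, now: Date = new Date()): boolean {
  if (task.status !== "in_progress" || !task.grabbedAt) return false;
  const twoWeeksMs = 14 * 24 * 60 * 60 * 1000;
  return now.getTime() - task.grabbedAt.getTime() > twoWeeksMs;
}
```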
Step 3. Project Brief Creation
Before you start building, map out the problem you’re trying to solve. With the help of our Project Brief GPT, any individual who has assigned themselves to the task answers a series of questions about the concept:
- What is the goal of the task and how do we measure success?
- What is your hypothesis? If we invest time on solving this problem, how do our clients benefit?
- Describe the user story for how the team would engage with the workflow.
- What are the inputs and outputs?
You define the problem, not the solution. Rob Corso on the Creative team understood the friction in his work, translated it into a buildable idea, and laid it out clearly with a “UX Audit Advisor”, directly mapping to future AI agent workflows:
“We talk a lot about AI agents, but Rob, without explicit training or instruction, immediately started thinking in terms of agentic workflows. Seeing that happen organically was exciting because it showed real alignment and understanding at a deep level.”
Step 4. Technical Review
A dedicated member of the AI Council reviews the project brief to ensure the initiative is viable and achievable for the team. If a concept has legs but requires a more advanced builder, the council can route the next steps accordingly. Especially when someone is new, we don’t want them to bite off a project that sounded deceptively simple.
If a concept is strong and a good opportunity for the team to exercise their AI training, it’s approved and sent for Division Leadership review.
Not every innovation needs to be flashy. Just ask Heather from Paid Media, who used AI to create a Google Apps Script that automatically generates repetitive slide decks for client reporting calls:
“Stuff like automated slide decks isn't flashy on sales calls, but unless you're the one stuck doing these decks every week, you don't realize how impactful it is. It reduces cognitive load significantly—freeing our team from mundane, repetitive tasks to focus on high-value work.”
This is exactly why you make it easy for everyone to participate. Real innovation often looks like someone making their Tuesday suck less.
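For anyone wondering what a build like that involves, here's a rough Apps Script-style sketch. It's not Heather's actual script, and the template, sheet, and placeholder names are all made up: it just copies a Slides template and fills its placeholders from a reporting spreadsheet.

```typescript
// Illustrative sketch only, not Heather's actual build.
// Google Apps Script: copy a Slides template and fill placeholders from a Sheet.
// TEMPLATE_ID, SHEET_ID, and the {{placeholder}} names are hypothetical.
function buildReportingDecks(): void {
  const TEMPLATE_ID = "YOUR_SLIDES_TEMPLATE_ID";
  const SHEET_ID = "YOUR_REPORTING_SHEET_ID";

  // Each row: [clientName, dateRange, headlineMetric]
  const rows = SpreadsheetApp.openById(SHEET_ID)
    .getSheetByName("ThisWeek")
    .getDataRange()
    .getValues()
    .slice(1); // skip the header row

  for (const [clientName, dateRange, headlineMetric] of rows) {
    // Copy the template so the original stays untouched.
    const copy = DriveApp.getFileById(TEMPLATE_ID)
      .makeCopy(`${clientName} - Reporting Deck (${dateRange})`);

    // Swap the {{placeholders}} baked into the template slides.
    const deck = SlidesApp.openById(copy.getId());
    deck.replaceAllText("{{client}}", String(clientName));
    deck.replaceAllText("{{date_range}}", String(dateRange));
    deck.replaceAllText("{{headline_metric}}", String(headlineMetric));
    deck.saveAndClose();
  }
}
```

From there, a time-driven trigger could run it automatically before each reporting call.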
Step 5. Leader Alignment Approval
After the technical review, the relevant leader reviews the idea and makes sure it creates value as they define it. Division or team leaders provide critical context like:
- Does this idea align with our team’s immediate goals and priorities?
- Would this genuinely benefit our customers, clients, or internal workflows?
- Is this a priority worth dedicating resources to right now, given other initiatives in flight?
- Could this solution unlock broader strategic opportunities we hadn't considered?
When you have a team that is charged up and excited to get on the AI train, you owe it to them to make sure they don't waste effort solving the wrong problems or duplicating existing initiatives. One example: we've seen someone propose automating a deliverable, only to have the division leader flag immediately that we're deprecating it, redirecting that energy to something with greater strategic value.
Once it passes the technical and strategic review, the task is open for the taking and can be grabbed by anyone. That said, the countdown begins once the task is grabbed…
Step 6. Build It in 2 Weeks
“We didn't want people to grab an idea and then sit on it for months and months. We wanted there to be a sense of urgency because we need the team to feel momentum moving toward AI solutions. We can't have the mentality of, 'Oh, I've got all these good ideas, but Q2 is busy, so I'll grab this now and dig in months later.' That's exactly what we want to avoid,” Alisa said, making it clear that urgency (not just interest) drives our AI momentum.
Once an idea is grabbed, the team starts building. They use their training and tools like ChatGPT to design the ideal workflow. Most of what we’re building are custom GPTs because we see that as the foundation for future agentic workflows. But it’s not just GPTs. Sometimes it’s a complex Google Apps Script. Other times it’s a well-structured NotebookLM project or a smart prompt system. The tool doesn’t matter. Solving the problem does.
In 2024, we invested $30,000 in AI certifications from the AI Marketing Institute for the whole team. We don't want those newly acquired skills to start atrophying. The two-week sprint points people clearly in the direction of applying those skills immediately: when a team member finishes their AI certification, they're encouraged to grab a task and go build it.
Step 7. Build, Test, and Refine
Once the AI build is ready, our AI expert, Jordan, reviews and tests it rigorously.
Jordan took the system he uses to improve GPTs and turned that into its own GPT. It grades your instructions, points out exactly what needs improvement, and even offers to fix certain parts automatically. It’s extremely meta, but very effective.
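Jordan's grader lives as a custom GPT inside ChatGPT, so there's no code behind it to share, but the rubric-style idea translates pretty directly. Below is a minimal sketch using the OpenAI SDK; the rubric items, model name, and wrapper function are my own assumptions, not Jordan's actual setup.

```typescript
// Illustrative sketch only: a rubric-driven "grade my GPT instructions" call.
// Jordan's real version is a custom GPT; the rubric and model below are assumptions.
import OpenAI from "openai";

const GRADER_SYSTEM_PROMPT = `
You review custom GPT instructions. For the instructions you are given:
1. Score clarity, scope, guardrails, and output format from 1-5 each.
2. Quote the exact lines that need improvement and explain why.
3. Offer a rewritten version of any section scoring 3 or below.
`;

export async function gradeInstructions(instructions: string): Promise<string> {
  const client = new OpenAI(); // reads OPENAI_API_KEY from the environment
  const completion = await client.chat.completions.create({
    model: "gpt-4o",
    messages: [
      { role: "system", content: GRADER_SYSTEM_PROMPT },
      { role: "user", content: instructions },
    ],
  });
  return completion.choices[0].message.content ?? "";
}
```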
Approved solutions then move to final documentation and integration within existing workflows.
Step 8. Change Management & Communication
When a solution is finalized, it's handed off to leaders with clear documentation for integration into standard operating procedures.
The individual responsible for the build packages the initial documentation and submits it to our change management process, at which point responsibility changes hands to the team representative in charge of change management. We hold a biweekly release, through asynchronous notes or synchronous meetings, depending on the complexity of the changes.
The Real Question: Did it Work?
Right. So how do you know something like this worked? We track and measure effectiveness in a few very clear, very transparent ways. It should never be a guessing game, especially for those doing the work. Here's what we look at (a rough sketch of the rollup math follows the list):
- Activity Levels: How many ideas get submitted, approved, and executed monthly.
- Speed & Agility: How quickly do ideas go from submission to execution and adoption? Our 2-week sprint approach helps ensure fast outcomes.
- Cross-Divisional Engagement: We measure how widely the system is used across teams, especially important because innovation often happens when ideas cross traditional team boundaries.
- Impact & ROI: Ultimately, we measure effectiveness by real-world impact: are these ideas reducing cognitive load, removing friction, and helping teams deliver better client work faster?
- Team Feedback & Adoption: We're regularly checking in with the team about how easy it is to participate, how meaningful their contributions feel, and what friction remains.
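Here's that rollup math, sketched against a hypothetical task record with a couple of extra fields beyond the shape shown earlier. None of this is our actual dashboard code; it's just what the scoreboard boils down to.

```typescript
// Illustration only: rolling Grab & Go task records up into the scoreboard numbers.
// The task shape and field names are hypothetical, not our real dashboard code.
interface ScoreboardTask {
  status: "up_for_grabs" | "in_progress" | "completed";
  division: string;    // which team submitted or built it
  grabbedAt?: Date;
  completedAt?: Date;
}

function scoreboard(tasks: ScoreboardTask[]) {
  const count = (s: ScoreboardTask["status"]) =>
    tasks.filter((t) => t.status === s).length;

  // Speed & agility: average days from grab to completion.
  const shipped = tasks.filter((t) => t.grabbedAt && t.completedAt);
  const avgDaysToShip =
    shipped.reduce(
      (sum, t) =>
        sum + (t.completedAt!.getTime() - t.grabbedAt!.getTime()) / 86_400_000,
      0
    ) / Math.max(shipped.length, 1);

  // Cross-divisional engagement: how many distinct teams are participating.
  const divisionsParticipating = new Set(tasks.map((t) => t.division)).size;

  return {
    upForGrabs: count("up_for_grabs"),
    inProgress: count("in_progress"),
    completed: count("completed"),
    avgDaysToShip,
    divisionsParticipating,
  };
}
```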
And like we said above, we can see EVERYTHING. I know exactly where my team stands on AI innovation:
Since launching our AI Slack channel in May 2023, we’ve had 9,924 messages posted. That’s not hype. That’s people in the weeds, sharing, asking questions, and helping each other.
Our Grab & Go system currently tracks 147 active tasks.
- 81 still up for grabs
- 52 in progress
- 28 completed
The beauty of Grab & Go? We’re not guessing.
I now have dashboards showing who’s building, where things stall, and what’s actually shipping. Real accountability. Real momentum.
Before, we had a suggestion box: threads, emails, docs. Now it's a system, attached to real data that impacts people's real pay. It kills the chaos that happens when ideas get lost, duplicated, or ignored, which, as the head of innovation, has always frustrated me.
Grab & Go gives literally everyone a shot to drive real change. Visibility. Voice. Impact.
You don’t need another AI idea.
You need a system to make those ideas real.