For executives whose board has started asking about AI governance
Technology implementations aren't one-time decisions. They require constant gardening.
Technology governance frameworks, policies, and compliance checklists all require the same thing to work: an organization with named owners, functioning decision-making bodies, and a formal structure for what happens when technology initiatives, including AI experiments, cross a line. The DGS System builds that infrastructure — in 12 weeks.
This is the system that drove coordination inside a 12,000-person federal agency and across dozens of entities in the Executive Branch. It was so effective that DOGE shut it down — and then had to reboot it. Now we've streamlined it to serve both private and public sector entities. The DGS System establishes an Executive Board and Digital Council inside your organization and gets them running sustainably. Think of it as Agile self-governance — but for whole organizations. This structure keeps your strategic vision intact while your teams move fast, make decisions, and do their best work.
Book a free call to learn more. No cost or commitment.

Decrease in digital footprint: General Services Administration
Scales of implementation: TTS, GSA, Federal Web Council
Agencies using this structure
Every organization deploying AI has a governance problem. Most of them don't realize it's a structural one.
The conversation happening in boardrooms right now is real. AI deployments are proliferating across every business line, under every kind of budget label, by teams moving fast and trying things. That's not the problem. The problem is that most organizations have no formal structure for what happens when an AI initiative crosses a line — between teams, between budgets, between the work a tool was approved for and the work it's actually doing.
AI governance frameworks promise to solve this. They produce policies, principles, and compliance checklists. And then those documents sit on a shared drive while the deployments continue accumulating, the costs continue growing, and the CFO continues finding out about new AI initiatives at the budget review.
A governance framework is not a governance structure. The framework tells you what should happen. The structure is what makes it happen — named owners for every deployment, a Digital Council that handles operational AI questions, and an Executive Board with the authority to greenlight, redirect, or stop an AI initiative before it becomes a compliance problem or a budget surprise.
If your organization already has operational digital governance through the EDX System, the DGS System adds the Executive Board layer that handles the decisions too consequential for the Digital Council to resolve alone. If your organization is starting from scratch, the DGS System installs both bodies. Either way, the result is the same: a governance structure that actually governs.
AI deployments aren't one-time decisions. They're a continuous management problem — and most organizations have no structure to manage them.
Every AI tool your organization deploys needs someone responsible for monitoring its outputs, maintaining its compliance as laws and policies evolve, and deciding when it needs to be updated, restricted, or retired. That's not an IT question. It's a governance question. And it requires the same infrastructure as every other digital asset — named owners, defined processes, and a body with authority to make decisions when something changes.
Right now, AI tools are being evaluated, piloted, and quietly adopted by teams across your organization. Some of those tools are overlapping with what another team is already running. Some are accumulating compute and storage costs under budget lines that were approved for something else. Some are doing work that, on reflection, someone in your organization would want to review before it continues.
Any one of those, said about a digital initiative, is a governance problem. Said about an AI deployment, it's a material risk. The gap isn't technical and it isn't intentional. Nobody is hiding anything. There's simply no structure for when to escalate, who to escalate to, or how to get a fast, binding answer when an AI initiative crosses a line. So it doesn't get escalated. And the CFO finds out when the invoice arrives.
The DGS System closes that gap. It installs the structure that makes "AI governance" an operational reality instead of a policy document.
Your teams should be experimenting with AI. You need to know when an experiment becomes a commitment.
This is not a problem with your people. Teams should move fast. They should pilot tools, test models, and try things without needing executive sign-off on every decision. Slowing that down costs you more than it saves.
But some AI decisions aren't single-team decisions. A new model deployment that touches customer data crosses a compliance line. A tool that two teams are both evaluating independently is a procurement inefficiency waiting to happen. An AI initiative that succeeds will eventually require resources, contracts, and organizational commitments that one team can't authorize alone.
Right now, when a team hits that threshold, the default response is: "I think I should check with someone on that." Then that check goes looking for a place to land — and finds none. So either the initiative stalls, or it moves forward without the review ever happening. Both outcomes are expensive.
Operational AI decisions within a single team's scope
Tool evaluations, pilot programs, use policy questions, day-to-day compliance monitoring, and maintenance of deployed AI systems. The Digital Council handles these. They don't need to come up to the board.
AI decisions that cross teams, cross budgets, or carry material risk
Cross-functional AI deployments. Disputes between teams evaluating the same tool. Proposals to build or retire a shared AI system. Initiatives that require compliance review at the organizational level. A new model deployment touching customer or regulated data. These come to the Executive Board.
AI governance requires the same infrastructure as every other kind of governance. This is that infrastructure.
The DGS System installs three things that make AI governance operational rather than theoretical: named owners for every digital and AI asset, a Digital Council that handles ongoing governance questions at the operational level, and an Executive Board with the authority to make binding decisions when something crosses a line.
An AI policy without this infrastructure is a document. It describes what should happen. It doesn't determine what does happen when a team deploys a model that touches regulated data, when two business lines are both running pilots of competing tools, or when an AI initiative that started as an experiment has grown into a budget line nobody approved.
The governance structure the DGS System installs is where those situations get resolved — consistently, on a defined timeline, by the right people.
The person who helped build federal AI governance from scratch — as a government employee, accountable for making it work.
Most AI governance consultants arrived at the problem recently. Ana Monroe was a founding member of GSA's AI Safety Team — the body responsible for creating the governing structure for AI deployments across a U.S. federal agency operating under some of the most demanding legal and ethical requirements in the country.
Her team built the first platform-agnostic AI use training in the federal government. Not a vendor-specific certification. Not a policy summary. A training program designed to work across every AI platform an agency might deploy — because she understood from the beginning that AI governance has to govern the behavior and the structure, not just the tool.
Her work in GSA also focused on injecting ethical frameworks into agency-deployed AI models — ensuring that the models themselves were operating in compliance with law, ethics, and mission, not just that a policy document existed saying they should. That's the full stack: governance architecture and model behavior, built together, by the same person.
The DGS System brings that experience to your organization. The structure transfers because the problem does — and because the team behind it was working on this problem before “AI governance” became a social media buzzword.
Six sprints. Each one builds what the next one requires.
Map what exists
Qualitative, quantitative, and desk research produces a verified map of every digital asset your organization owns — websites, platforms, applications, and tools across every business line. Most organizations find 15–30% more than they expected.
Attach people to that map
Working with HR, we identify the employee responsible for each digital asset and produce the position language to make that responsibility official. Ownership becomes documented. Accountability becomes durable.
Surface what IT didn't have on the list
In partnership with IT and business lines, we find the technologies that didn't appear in Sprint 1 — shadow tools, legacy platforms, informal systems, and AI tools being evaluated or quietly used under discretionary budgets. We add them to the map, assign ownership, and flag anything with HR or compliance implications.
Where the AI cost surprises live

Establish the Executive Board and Digital Council
We appoint the members of both governing bodies and produce founding charters for each — including sitting members, meeting structure, scope of deliberations, and the escalation pathway between them. We produce the first version. You edit it to reflect your leadership, your culture, and your priorities.
The architecture that changes how decisions get made

Run the first sessions for both bodies
We facilitate the soft launch of the Executive Board and Digital Council — working through the charters together, stress-testing what's realistic, and establishing the working norms that will govern how decisions actually get made. What doesn't fit gets revised before the bodies take on real decisions.
Govern actual problems from your organization
Both bodies take on live issues — using the pitch, debate, and resolution structures the charter defines. By the end of this sprint, your governance architecture has a track record. The Executive Board has reviewed something real. The Digital Council has escalated something real. The pathway works.
First follow-on Digital Council meeting — facilitated by Ishmael Interactive
Within 30 days of the engagement closing, we return to observe and facilitate the Digital Council's first independent meeting. We identify what's working, surface anything that needs adjustment, and help the Council refine its operating rhythm before it's fully on its own.
Included · Must be scheduled within 30 days of engagement close

First follow-on Executive Board meeting — facilitated by Ishmael Interactive
Within 30 days of the engagement closing, we return for the Executive Board's first independent meeting. We observe the decision-making process in practice, note where the charter language needs refinement, and give the board direct feedback on how the session went. At the end of this meeting, the system is running without us.
Included · Must be scheduled within 30 days of engagement close

Twelve weeks from now, your organization governs AI deployments through a structure — not through whoever escalates loudest or fastest.
Here is what is in place at the end of the engagement.
A functioning Executive Board Senior leadership with a defined mandate, a regular cadence, and binding authority over cross-functional digital decisions.
A functioning Digital Council The operational governing body that handles day-to-day decisions and knows exactly when and how to escalate.
A formal escalation pathway between them When something crosses a line, there's a structure for where it goes — and a timeline for getting an answer.
Founding charters — edited to fit your organization Not templates. Working documents built around your structure, your leadership, and your specific decision boundaries.
A verified digital asset inventory Every property, every platform, every tool — mapped, attributed, and visible to both the Council and the Board.
Shadow technology found and governed The platforms and tools that were running outside the map — now on it, now owned, now visible to your cost review.
Both bodies have already made real decisions The Executive Board reviewed something live before the engagement ended. The pathway has been tested.
Facilitated follow-on sessions after the engagement closes Ishmael Interactive returns for the first independent Digital Council and Executive Board meetings — to observe, refine, and confirm the system is running the way it should.
Build the governance infrastructure your AI strategy requires.
The discovery call is 30 minutes. We'll look at your current digital ecosystem together and tell you honestly whether the DGS System is the right fit — and what the engagement would look like for your organization.
See if we're the right fit

Free 30-minute call. No cost or commitment.
What each sprint includes and what it produces.
The DGS System is a 12-week engagement. Here is what each sprint delivers and why it matters.
Digital Asset Inventory
Sprints 1–3

A verified map of every digital property your organization owns or operates — built from primary research, not from whatever list already exists. Most organizations discover assets in this process they didn't know were still running, still licensed, and still generating costs. The inventory is the foundation every subsequent sprint builds on.
HR Accountability Structure and Position Language
Sprint 2

Named owners for every digital asset, with position description language your HR team can adopt. This sprint makes accountability official. It also surfaces the employees who have been doing governance work without it appearing anywhere in their role — and gives you a way to recognize and formalize that work.
Founding Charter: Digital Council
Sprint 4

A working charter for your Digital Council — including sitting members, meeting structure, scope of deliberations, and the conditions under which issues escalate to the Executive Board. If your Digital Council was established through an EDX System engagement, we build from that foundation. If it's new, we establish it here.
Founding Charter: Executive Board
Sprint 4

A working charter for the Executive Board — including appointed members, the types of decisions that belong to the board, how proposals are structured, how disputes are resolved, and what a binding decision looks like. We produce the first version. You edit it to reflect your organization's leadership culture and specific boundaries.
Facilitated Governance Sessions — Soft Launch
Sprint 5

Facilitated sessions for both the Digital Council and Executive Board. We work through each charter in session — stress-testing what's realistic, identifying where the language needs revision, and establishing working norms before either body takes on live decisions. What doesn't work gets fixed before the real work begins.
Live Governance Sessions on Real Organizational Problems
Sprint 6

Both bodies take on actual issues from your organization — using the pitch, debate, and resolution structures established in their charters. By the end of Sprint 6, the Executive Board has reviewed something real. The Digital Council has escalated something real. The pathway between them has been tested under working conditions. That's what makes the system a system.
Post-Engagement Facilitation: First Independent Digital Council Meeting
Included · Within 30 days

Within 30 days of the engagement closing, we return to facilitate the Digital Council's first meeting without us in a formal role. We observe how the council operates independently, surface anything that needs to be adjusted in its charter or cadence, and give direct feedback before the council is fully on its own. This is where small issues get caught early — before they become structural problems.
Post-Engagement Facilitation: First Independent Executive Board Meeting
Included · Within 30 days

Within 30 days of the engagement closing, we return for the Executive Board's first independent meeting. We observe the decision-making process in practice, note where the charter language needs refinement, and give the board direct feedback on how the session went and what to adjust. At the end of this session, the DGS System is running without external support. That's the intended outcome — and this meeting is how we confirm it's actually happening.
We already have an AI policy. We have acceptable use guidelines, a responsible AI framework, and a compliance team reviewing deployments.
Those are the right things to have. They're also not sufficient on their own, for a specific reason: a policy defines what should happen. A governance structure is what makes it happen — consistently, at scale, when the team making a deployment decision doesn't have time to read the policy document and isn't sure which part applies to their situation.
Right now, when a team encounters an AI decision that feels like it might cross a line, they have two realistic options: proceed and hope, or stall and send an email to someone who may or may not respond. Neither of those is compliance. Neither is governance. They're the absence of a structure for making the decision well.
The DGS System doesn't replace your AI policy. It installs the structure that enforces it — a Digital Council that handles ongoing compliance questions at the operational level, and an Executive Board that resolves the ones too consequential for any single team to decide alone. Your policy tells people what to do. The DGS System gives them somewhere to go when they're not sure.
Policies have a shelf life. Deployed AI systems have a compliance requirement that compounds as the models update, the regulations evolve, and the use cases expand. The governance structure the DGS System installs is designed to manage that complexity over time — not just at the moment of deployment.
What executives ask before starting.
Our organization procures external engagements through a formal vendor or RFP process. Can the DGS System be purchased that way?
Yes. We have procurement specialists on our team who have navigated formal vendor processes, government contracting structures, and institutional procurement channels before. If your organization requires a specific procurement pathway, let us know on the discovery call and we'll work with your process requirements.
We're deploying AI tools across multiple business lines right now. Is this the right moment to install this, or should we wait until the deployments stabilize?
Now is the right moment, specifically because the deployments are active. The DGS System is most valuable when there are live decisions being made — because the governance bodies have real work to do from the moment they're operational. Waiting for deployments to stabilize means waiting for the governance gap to widen. The cost of installing the structure while things are moving is lower than the cost of untangling decisions made without it.
Does the DGS System address AI compliance specifically, or just governance structure generally?
Both. The Digital Lifecycle Program that governs how assets move through your organization applies to AI deployments the same way it applies to every other digital asset — including the ongoing monitoring and compliance requirements that don't end at deployment. The governing bodies the DGS System installs are also the bodies that enforce your AI policy in practice, not just on paper. If you have specific regulatory requirements around AI, the discovery call is where we talk through how the structure addresses them.
Do we need the EDX System before we can do the DGS System?
No. The DGS System installs both the Digital Council and the Executive Board — so if your organization doesn't have the operational layer yet, we build it as part of this engagement. If you already have a Digital Council from an EDX System engagement or otherwise, we build from that foundation instead of duplicating it.
What stops the Executive Board from becoming another committee that doesn't actually govern anything?
Two things. First, the charter defines what belongs to the board — so there's no ambiguity about when something should come up and who has authority to resolve it. Second, Sprint 6 requires both bodies to work through real problems before the engagement closes. The Executive Board leaves the engagement with a decision on record, not just a mandate. That track record is what makes it credible to the teams who will escalate to it.
AI deployments aren't slowing down. The governance gap compounds the longer it's open.
Every quarter without a governance structure is another quarter of AI tools adopted informally, compliance decisions made by whoever happened to be in the room, and initiatives that crossed an organizational line without anyone with the authority to greenlight or stop them.
Your teams are doing the right thing. They're experimenting, moving fast, and trying to use every available tool to do better work. What they need is a structure that tells them when something is theirs to decide and when it needs to go to a body with broader authority. What you need is that body to exist — and to have already made decisions before you need it for something consequential.
The DGS System builds that structure. In 12 weeks. Built by the person who helped build federal AI governance from the inside.
See if we're the right fit