Executive Summary
We’re introducing a significant new feature to Tako, our AI agent for Okta, with the addition of secure real-time API query capabilities. This upgrade allows Tako to connect directly with Okta’s APIs, intelligently generating and executing secure code based on natural language questions. Beyond enhancing Tako’s current functionality, these capabilities establish the foundation for future autonomous operations. We’ve implemented a measured approach—starting with read-only operations while developing the architecture for supervised write operations in future releases. For IAM teams managing growing complexity, this represents a practical evolution in identity management: simplifying operations while strengthening security posture.
Introduction
In our previous article, we introduced Tako, our AI agent that simplified access to Okta data through plain-English queries. No more complex scripts or API calls – just ask a question and get an answer. IAM engineers, managers, and auditors quickly embraced this approach for extracting insights from their Okta environments.
Today, I’m thrilled to share Tako’s next big leap forward – real-time capabilities that lay the groundwork for autonomous identity management. While Tako’s database mode is still your go-to for fast, comprehensive data analysis (and we’re not changing that!), this new dimension lets Tako talk directly to Okta’s APIs and make smart decisions about which tools to use when tackling complex questions.
What Are Tako’s Real-time Capabilities?
When we say “real-time capabilities,” what does that actually mean for your day-to-day work with Tako? When you type in a question now, Tako:
- Analyzes your question to figure out what you’re really asking for
- Maps out a plan by picking the right tools for the job
- Writes secure Python code on the spot to handle your specific request
- Runs this code in a locked-down sandbox environment
- Cleans up and presents the results in a way that actually makes sense
The magic happens behind the scenes – Tako decides which API endpoints to call, how to handle the data it gets back, and how to pull information from multiple sources to answer even your trickiest questions. And it all happens in seconds.
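To make that concrete, here’s a minimal sketch of the *style* of code Tako might generate for a question like “Which active users haven’t logged in for 90 days?” The endpoint, pagination via the Link header, and the SSWS token header are standard Okta API conventions; the structure, variable names, and environment variables are illustrative assumptions, not Tako’s actual output.

```python
# Hypothetical example of the kind of code Tako generates (illustrative only).
import os
import requests
from datetime import datetime, timedelta, timezone

OKTA_DOMAIN = os.environ["OKTA_DOMAIN"]      # e.g. "acme.okta.com" (placeholder)
API_TOKEN = os.environ["OKTA_API_TOKEN"]     # read-only API token (placeholder)
HEADERS = {"Authorization": f"SSWS {API_TOKEN}", "Accept": "application/json"}

cutoff = datetime.now(timezone.utc) - timedelta(days=90)
url = f"https://{OKTA_DOMAIN}/api/v1/users"
params = {"filter": 'status eq "ACTIVE"', "limit": 200}

stale_users = []
while url:
    resp = requests.get(url, headers=HEADERS, params=params, timeout=30)
    resp.raise_for_status()
    for user in resp.json():
        last_login = user.get("lastLogin")
        if last_login is None or datetime.fromisoformat(
            last_login.replace("Z", "+00:00")
        ) < cutoff:
            stale_users.append(user["profile"]["login"])
    # Okta paginates with a rel="next" Link header
    url = resp.links.get("next", {}).get("url")
    params = None  # the next-page URL already carries the query string

print(f"{len(stale_users)} active users with no login in the last 90 days")
```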
The Evolution of Tako: Adding Real-time Intelligence to Power Autonomy
We didn’t build Tako’s new real-time capabilities to replace its current database-based engine – quite the opposite. We’ve created something more powerful by combining the best of both worlds.
Think of it this way: Tako’s database mode is like having instant access to a comprehensive report that’s updated regularly. It’s fast, thorough, and great for most questions. The new real-time capabilities are like having an expert assistant who can go check on specific things right now and combine information in clever ways. Together, they’re helping Tako grow from a smart query tool into an intelligent assistant that doesn’t just analyze data but makes decisions about how to get you exactly what you need – and eventually, with your approval, will make changes directly in your Okta environment.
The Security-First Foundation
I know what you’re thinking – “generating and executing code based on natural language sounds risky!” That’s why we’ve built Tako’s real-time capabilities on a rock-solid security foundation. Our secure sandbox environment for AI-directed operations includes:
- Method and Module Whitelisting: We explicitly list which SDK methods and Python modules Tako can use – anything else is off-limits
- AST-Based Code Analysis: Every bit of generated code gets scanned at the abstract syntax tree level to catch potentially harmful operations
- URL Validation: All API calls are checked to ensure they only target your authorized Okta domain
- Operation Restrictions: We’ve built a permission model that limits which operations can be performed on specific entities
This comprehensive approach means Tako can generate and run code based on your questions while staying within strict security boundaries – something we believe is a genuine breakthrough for AI-powered identity management.
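To give a feel for what AST-based checking and whitelisting involve, here’s a minimal sketch using Python’s built-in ast module. The allow-lists and rules below are placeholders for illustration; they are not Tako’s actual security configuration, which is far more extensive.

```python
# Minimal sketch of an AST-based allow-list check (illustrative only).
import ast

ALLOWED_MODULES = {"requests", "datetime", "json"}   # placeholder module allow-list
ALLOWED_CALLS = {"get"}                              # e.g. read-only HTTP verbs only

def validate(source: str) -> list[str]:
    """Return a list of violations found in generated code; empty means it passes."""
    violations = []
    tree = ast.parse(source)
    for node in ast.walk(tree):
        if isinstance(node, (ast.Import, ast.ImportFrom)):
            names = (
                [a.name for a in node.names]
                if isinstance(node, ast.Import)
                else [node.module]
            )
            for name in names:
                if (name or "").split(".")[0] not in ALLOWED_MODULES:
                    violations.append(f"disallowed import: {name}")
        elif isinstance(node, ast.Call):
            # Simplified: only whitelisted attribute calls such as requests.get(...)
            if isinstance(node.func, ast.Attribute) and node.func.attr not in ALLOWED_CALLS:
                violations.append(f"disallowed call: {node.func.attr}")
    return violations

print(validate('import subprocess\nsubprocess.run(["rm", "-rf", "/"])'))
# -> ['disallowed import: subprocess', 'disallowed call: run']
```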
Tako’s Real-time Architecture: The Multi-Agent Orchestration System

Tako isn’t just a single AI agent – it’s actually an orchestrator managing a specialized team of AI agents, each with its own role in processing your questions. This multi-agent approach allows Tako to handle complex queries that would overwhelm traditional single-agent systems:
The Orchestration Process
- Tako Orchestrator – The central controller that manages the entire workflow and communication between specialized agents
- Plan Generation Agent – This specialized AI analyzes your natural language query and determines which tools are needed and in what order. It breaks down complex questions into logical execution steps, essentially creating a blueprint for solving your query.
- Code Generation Agent – Once a plan is established, this agent translates it into executable Python code. It works within strict security parameters defined in Tako’s security configuration, ensuring all generated code follows best practices and security protocols.
- Execution Manager – A specialized utility that runs the generated code within a highly secure sandbox environment, enforcing all security rules and managing API interactions.
- Result Processor Agent – This final agent receives output from the execution process (either complete results or carefully sampled data for large datasets) and transforms it into clear, insightful answers. It can generate summaries, create relationship tables, or highlight key findings based on what would be most helpful.
This multi-agent orchestration allows Tako to maintain context across complex multi-step operations, process larger datasets than would fit in a single AI model’s context window, and provide more sophisticated analysis than any single agent could deliver.
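Here’s a highly simplified sketch of how those pieces fit together. Every name in it (generate_plan, generate_code, and so on) is an illustrative stand-in, not one of Tako’s internal interfaces, and the stubbed return values exist only so the example runs end to end.

```python
# Simplified sketch of the orchestration flow described above (names are illustrative).
from dataclasses import dataclass

@dataclass
class Step:
    description: str   # natural-language description of the step
    tool: str          # which registered tool the step relies on

def generate_plan(question: str) -> list[Step]:
    """Plan Generation Agent: break the question into ordered steps (stubbed)."""
    return [Step("List users assigned to the Salesforce app", tool="okta_api")]

def generate_code(step: Step) -> str:
    """Code Generation Agent: turn a step into sandbox-safe Python (stubbed)."""
    return f"# generated code for: {step.description}"

def execute(code: str) -> dict:
    """Execution Manager: run validated code in the secure sandbox (stubbed)."""
    return {"rows": 3, "sample": ["user-a", "user-b", "user-c"]}

def summarize(results: list[dict], question: str) -> str:
    """Result Processor Agent: turn raw (possibly sampled) output into an answer (stubbed)."""
    return f"{sum(r['rows'] for r in results)} matching records found for: {question}"

def answer(question: str) -> str:
    """Tako Orchestrator: coordinate the specialized agents end to end."""
    plan = generate_plan(question)
    results = []
    for step in plan:
        code = generate_code(step)
        results.append(execute(code))  # raw data stays outside the model's context
    return summarize(results, question)

if __name__ == "__main__":
    print(answer("Which users can access the Salesforce app?"))
```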
Intelligent Real-Time Processing and Context Awareness
What makes Tako’s approach special is how it breaks down complex questions, chooses the right tools, and runs everything within a secure framework. This multi-phase execution lets Tako:
- Handle datasets far larger than what would fit in typical AI context windows
- Keep track of complex multi-step operations without getting confused
- Apply sophisticated reasoning to spot patterns across your identity environment
- Generate targeted code that addresses specific parts of your question
In plain English? Tako can now answer questions that would completely overwhelm traditional AI assistants, giving you deeper insights into complex identity problems.
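As a rough illustration of how large datasets can be handled without stuffing them into a context window, here’s a sketch of reducing a big result set to a compact digest before any of it reaches the language model. The thresholds and digest fields are assumptions for the example, not Tako’s actual settings.

```python
# Sketch: shrink a large result set to a compact digest before it reaches the model.
import json
import random

MAX_FULL_RESULTS = 50   # below this, pass results through unchanged (assumed)
SAMPLE_SIZE = 10        # records shown to the model for large sets (assumed)

def digest(records: list[dict]) -> dict:
    """Return either the full result set or a sampled summary for the model."""
    if len(records) <= MAX_FULL_RESULTS:
        return {"mode": "full", "count": len(records), "records": records}
    return {
        "mode": "sampled",
        "count": len(records),                  # exact total, computed outside the model
        "fields": sorted(records[0].keys()),    # schema, so the model knows what exists
        "sample": random.sample(records, SAMPLE_SIZE),
    }

# Example: 10,000 users never enter the model's context; only this digest does.
users = [{"id": f"00u{i}", "status": "ACTIVE"} for i in range(10_000)]
print(json.dumps(digest(users), indent=2)[:400])
```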
Tako Realtime vs. MCP Server Approaches: Breaking Through Context Limitations
If you’ve been following our journey (or experimenting with AI agents yourself), you might be wondering how Tako’s approach differs from the traditional Model Context Protocol (MCP) servers we discussed in our previous posts. The difference is significant – and it’s all about overcoming three critical limitations that have plagued MCP implementations:
Limitation #1: Data Volume Constraints
Traditional MCP servers struggle when API endpoints return large datasets. Why? Because they have to push all that raw data into the AI model’s context window, quickly exhausting token limits. This forces developers to implement artificial caps on results or resort to complex pagination schemes that confuse the AI.
Limitation #2: Tool Confusion
The more tools and functions you expose to an MCP-based system, the more detailed documentation you need to include in every prompt. This creates a vicious cycle: more capabilities require more documentation, which consumes more context space, which limits how much data can be processed.
Limitation #3: Cost and Privacy Concerns
With traditional MCP approaches, the more data you process, the more tokens you consume – driving up costs substantially as your organization scales. Worse yet, passing complete datasets to AI models raises significant privacy concerns, especially with sensitive identity data.
Tako’s Breakthrough Solution
We’ve engineered a fundamentally better approach to overcome these limitations. Tako’s architecture processes data outside the AI’s context window, intelligently selects only relevant tools for each query, and breaks complex operations into manageable steps. This breakthrough lets Tako handle enterprise-scale identity datasets without the “sorry, too many results” messages that plague traditional implementations.
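One way to picture the “only relevant tools” part: rather than packing every tool description into every prompt (the pattern that causes the tool confusion described above), a lightweight selection pass can pick the handful of tools a given question actually needs. The registry contents and keyword matching below are deliberately naive stand-ins for Tako’s selection logic.

```python
# Sketch: choose only the tools relevant to a query before prompting the model.
# Registry contents and keyword matching are illustrative stand-ins.
TOOL_REGISTRY = {
    "list_users":        {"doc": "List or search Okta users",      "keywords": {"user", "users", "login"}},
    "list_groups":       {"doc": "List Okta groups and members",   "keywords": {"group", "groups", "member"}},
    "list_applications": {"doc": "List apps and app assignments",  "keywords": {"app", "apps", "application", "access"}},
    "system_log":        {"doc": "Query the Okta system log",      "keywords": {"event", "log", "sign-in", "failed"}},
}

def select_tools(question: str, max_tools: int = 3) -> dict:
    """Return only the tool docs whose keywords overlap the question."""
    words = set(question.lower().replace("?", "").split())
    scored = {name: len(spec["keywords"] & words) for name, spec in TOOL_REGISTRY.items()}
    relevant = sorted((n for n, s in scored.items() if s > 0), key=lambda n: -scored[n])
    return {name: TOOL_REGISTRY[name]["doc"] for name in relevant[:max_tools]}

print(select_tools("Which users have access to the Salesforce app?"))
# -> only the list_applications and list_users docs go into the prompt
```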
The Path to Autonomy
Right now, Tako’s real-time capabilities are limited to read-only operations – a boundary we’ve deliberately set while the architecture matures. But we’ve laid the groundwork for supervised autonomous operations in the future.
The components we’ve built for real-time operations include:
- Reasoning Agent: Analyzes what you’re asking for and creates structured plans
- Execution Manager: Handles dependencies between steps and manages errors
- Tool Registry: Provides a collection of functions Tako can use
- Human Confirmation Workflow: Ensures you verify plans before any changes are made
These are the building blocks for a future where Tako can not only answer your questions but also help fix issues or implement changes – all while keeping you in control with a carefully designed approval process.
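To sketch what that human confirmation workflow could look like once supervised write operations arrive, here’s a simple approval gate. The operation names and prompt flow are hypothetical; today Tako remains strictly read-only.

```python
# Sketch of a human-in-the-loop approval gate for future write operations.
# Operation names and the prompt flow are hypothetical; Tako is read-only today.
from dataclasses import dataclass

READ_ONLY_OPS = {"list_users", "get_user", "list_groups", "list_applications"}

@dataclass
class PlannedAction:
    operation: str
    target: str
    details: str

def requires_confirmation(action: PlannedAction) -> bool:
    """Anything that is not an explicitly read-only operation needs sign-off."""
    return action.operation not in READ_ONLY_OPS

def confirm(action: PlannedAction) -> bool:
    """Show the planned change and wait for an explicit yes from the operator."""
    print(f"Tako wants to run: {action.operation} on {action.target} ({action.details})")
    return input("Approve this change? [y/N] ").strip().lower() == "y"

def run(actions: list[PlannedAction]) -> None:
    for action in actions:
        if requires_confirmation(action) and not confirm(action):
            print(f"Skipped: {action.operation} on {action.target}")
            continue
        print(f"Executing: {action.operation} on {action.target}")  # dispatch would happen here

run([
    PlannedAction("list_users", "org", "read-only, runs without prompting"),
    PlannedAction("deactivate_user", "user-123", "flagged by an access review"),
])
```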
Business Impact Today
We’ve been working with early adopters of Tako’s real-time capabilities, and they’re already seeing concrete benefits:
- Less Custom Scripting: Identity teams are saving 5-10 hours weekly that they used to spend writing scripts for complex queries (Based on feedback from a dozen enterprise customers)
- Faster Security Response: Security incidents involving identity issues are being investigated 60% faster using natural language queries (We measured this across more than 30 security incident simulations)
- Better Decision Making: Questions that once took days of analysis are now answered in minutes (The average time reduction is a whopping 94%)
- Broader Access to Insights: Non-technical team members report 78% more confidence in getting identity information through natural language
User Community and Feedback
I want to take a moment to personally thank everyone who’s reached out with questions, suggestions, and feature requests since we first launched Tako. Your feedback hasn’t just been helpful – it’s been absolutely essential in shaping this update and guiding our roadmap. The enthusiasm we’ve seen from the identity community has validated our approach and inspired many of the improvements in this release. We’re committed to keeping this collaborative spirit alive as Tako continues to evolve. Your real-world use cases and challenges drive our development priorities more than anything else.
Looking Ahead: Tako’s Future
As we continue developing Tako, we’re working toward creating a true AI co-pilot for identity management – one that can help with increasingly complex tasks while keeping security at the forefront. The foundation we’ve built for real-time capabilities gives us a clear path toward supervised write operations that could automate routine identity management tasks, from access reviews to provisioning optimizations.
Tako’s evolution isn’t just a technical achievement – it represents a fundamental shift in how we approach identity management. Rather than replacing human expertise, Tako augments it, combining the efficiency of AI with human judgment and oversight. Our goal is to make identity management more accessible, efficient, and secure for teams of all sizes.
Get Started with Tako
Ready to see Tako in action? Visit our GitHub repository at https://github.com/fctr-id/okta-ai-agent or drop us a line at support@fctr.io with any questions.
Need help troubleshooting? You can:
- Open an issue on GitHub
- Email our team at support@fctr.io
- Contact Dan directly:
  - Email: dan@fctr.io
  - Slack: dan@fctr.io
We’re committed to ensuring your success with Tako and welcome your feedback!
BONUS:
Tako also has access to SPECIAL TOOLS (for example: can a user access an app?), as shown in the image below – answering one of the questions Okta admins get asked the most!

