The conversation around generative engine optimization has largely focused on getting your brand mentioned and cited in AI-generated responses.
That’s already outdated.
"The future isn't about how your static content is found on the web,” says Chelsea Carter, product manager at Microsoft working on NLWeb. “It's about how users can interact with your content, and how they discover that ability to interact.”
That's the vision behind NLWeb, Microsoft's infrastructure that gives websites conversational search capabilities and makes them discoverable by AI agents.
Here, as part of our Human in the Loop interview series, Carter shares how marketers can prepare for an ecosystem where agents, not pages, are the primary interface between brands and consumers.
How long have you been working with AI?
I've been working with AI since mid-2022, initially focused on identifying interesting use cases and connecting Microsoft teams doing innovative model training.
Early in 2025, I transitioned to working on NLWeb, which has been an incredible opportunity to apply AI at a practical level across products.
My role as a technical program/product manager allows me to bridge the gap between AI capabilities and real-world applications that solve customer problems.

How has your relationship with AI changed over the years (or months)?
My perspective has evolved significantly. I've moved from being focused on the models themselves to concentrating on how we use AI to solve specific customer problems and identify product changes that truly move the needle. The reality is that the models are already quite good. What's lacking are products that take full advantage of them. This shift has made me much more pragmatic and outcome-focused in my approach to AI.
What's your AI-search philosophy right now?
Users are increasingly trained to want conversational interfaces, and search across the web will need to adapt accordingly. We fundamentally believe this should be an open ecosystem where publishers maintain control over how they monetize their content.
It's critical that there isn't “one agent to rule them all.” Instead, there should be many sources of intelligence that can all communicate with each other. This is the vision for NLWeb. This philosophy has crystallized as I've worked more directly with publishers and seen the real stakes involved in this transformation.
Can you share a recent success in improving visibility in LLMs?
I think we need to reframe this question entirely. The future isn't about how your static content is found on the web. It's about how users can interact with your content, and how they discover that ability to interact.
This is where agent discovery becomes essential, and it's a core part of the ecosystem that Microsoft is building with NLWeb and what we casually call “Agent Finder.”
Effectively, a lot of agents are being built, but there is no low-friction way to find and install them, and that's what we're aiming to solve. The success metric isn't traditional visibility anymore; it's about being discoverable and accessible as an interactive agent that can respond to natural language queries in real time.
The key mindset shift is moving from content optimization to interaction design. Rather than thinking about keywords and page rankings, we're focused on creating endpoints that can understand and respond to complex natural language queries.
The tactic is building infrastructure that enables publishers to participate in the agentic web without requiring them to become AI experts—that's what NLWeb provides.
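NLWeb itself supplies this infrastructure, so publishers don't write the matching logic by hand. Still, as a rough illustration of the interaction-design idea Carter describes, here is a toy sketch of a conversational "ask" endpoint: it takes a natural-language query and returns matching items from a schema.org-style catalog. The catalog entries and the keyword-overlap matching are invented for illustration; a real deployment would use an LLM and a retrieval index instead.

```python
# Toy sketch of a conversational endpoint over a content catalog.
# Items are shaped loosely after schema.org types; in production the
# matching would be done by an LLM plus a vector index, not keyword overlap.

CATALOG = [
    {"@type": "Recipe", "name": "Lemon Garlic Chicken",
     "keywords": ["chicken", "lemon", "dinner"]},
    {"@type": "Recipe", "name": "Vegan Chili",
     "keywords": ["vegan", "beans", "spicy"]},
    {"@type": "Article", "name": "Guide to Sourdough",
     "keywords": ["bread", "baking", "sourdough"]},
]

def ask(query: str) -> dict:
    """Return a schema.org-style result list for a natural-language query."""
    terms = set(query.lower().split())
    hits = [item for item in CATALOG
            if terms & set(item["keywords"])]
    return {"@type": "ItemList", "itemListElement": hits}
```

The point of the sketch is the shape of the contract, not the matching: the site exposes an endpoint that accepts free-form language and answers with structured, typed results an agent can consume.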
Which AI tools do you use the most?
GitHub Copilot: This is my go-to for coding assistance and technical implementation. It dramatically speeds up development work and helps me write better code.
M365 Copilot: The collaboration features are game-changing. Meeting transcripts and summarization save me hours every week and ensure I never miss important details from conversations.
Claude: I rely heavily on Claude for completing technical tasks, particularly when I need detailed explanations or help working through complex product challenges. Its ability to understand context and provide thoughtful, nuanced responses makes it invaluable for my work.
What excites you the most about AI-SEO/GEO right now?
I love the idea of having an incredibly intelligent landscape of agents to interact with. This approach removes the pressure—and frankly, the inherent bias—of having a central agent that must be super smart and likely costs multiples of what smaller, more targeted agents with specific corpora of data would cost. The distributed intelligence model is not only more efficient but also more democratic and aligned with how the web should evolve.
What's the biggest challenge for marketers diving into AI-SEO today?
For marketers and publishers, the biggest challenge is figuring out what the new advertising and monetization models will be. We're in the early phases, and these models have yet to fully emerge, which also means there's tremendous opportunity for those willing to experiment and innovate. The old playbook doesn't apply anymore, and the new one is still being written.
What advice do you have for those looking to improve performance in AI search?
Build with an open ecosystem in mind. NLWeb was designed with this very idea at its core—to bring the agentic landscape together in a way that benefits everyone: users, publishers, and AI systems. Don't wait for the future to be defined for you. Start experimenting now with conversational interfaces, structured data, and agent-based interactions. The publishers who will thrive are those who embrace this shift early and help shape what comes next.
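One concrete, low-risk way to start with the structured data Carter mentions is schema.org JSON-LD, which is the kind of markup agent-facing systems can read. Below is a minimal sketch that emits a JSON-LD block for a page; the product and its values are invented for illustration, though the property names follow schema.org.

```python
import json

# Hypothetical product record marked up with schema.org vocabulary.
# Property names ("@context", "@type", "offers", etc.) follow schema.org;
# the item and prices are made up for this example.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Trail Running Shoe",
    "description": "Lightweight shoe for mixed-terrain running.",
    "offers": {
        "@type": "Offer",
        "price": "89.00",
        "priceCurrency": "USD",
    },
}

# Serialize to the JSON-LD string that would go in a <script> tag.
json_ld = json.dumps(product, indent=2)
print(json_ld)
```

Embedding output like this in a page costs little today and makes the same content legible to conversational and agent-based interfaces as they arrive.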