SEO in 2025: Optimizing for Humans and AI Agents

Humans scan in 3 seconds. AI agents parse in milliseconds. What I learned revamping Neoflo's website - and why llms.txt might be more important than your sitemap.

One website interpreted by humans and AI agents, illustrating modern SEO in 2025.

Your Website Now Has Two Brains Reading It

I was supposed to do a quick website revamp for Neoflo. Update copy. Fix meta tags. Classic SEO stuff.

Three rabbit holes later, I realized something that changed how I think about websites entirely.

You're no longer optimizing for one audience. You're optimizing for two - and they read completely differently.

The Problem Nobody's Talking About

Here's what's actually happening when someone looks for information about your company:

Scenario 1: They Google you. Visit your site. Scan for 3 seconds. Decide if you're worth their time.

Scenario 2: They ask ChatGPT. Or Perplexity. Or Claude. The AI reads your entire site, extracts facts, and generates an answer. No clicking required.

Same content. Two completely different brains consuming it.

And most websites are optimized for neither.

I learned this the hard way while revamping Neoflo's site. Let me show you what actually matters.

How Google Actually Works (Not What You Think)

Most people think Google "matches keywords." It doesn't. Not anymore.

Here's what happens when Google visits your site:

Discovery → Crawling → Rendering → Indexing → Ranking

The part that surprised me? Google actually runs your JavaScript. It sees your page almost exactly like a human would in a browser.
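Here's a toy example of what that rendering step handles - a page whose raw HTML contains no content at all. The copy is illustrative, not Neoflo's actual markup:

<div id="app"></div>
<script>
  // The raw HTML above is empty. The headline only exists after this runs.
  // Google's rendering step executes the script and indexes the finished text.
  document.getElementById('app').innerHTML =
    '<h1>AI that automates the CFO tech stack</h1>';
</script>

Simpler crawlers - including many AI agents, as we'll see - never run that script. For them, the page is empty.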

Then it tries to understand meaning - not just words.

That's a massive shift from the old "keyword stuffing" days. Which brings me to...

The Graveyard of Black Hat SEO

Quick detour into what doesn't work anymore.

The classic tricks that got crushed:

Hidden text. White text on a white background, stuffed with keywords. Users can't see it, but Google can.

Cloaking. Show Google a keyword-stuffed page. Show humans something completely different.

Keyword stuffing. "Best AI engineer. Top AI engineer. #1 AI engineer. AI engineer AI engineer."

These worked. Until they didn't.

Google's major updates killed them:

  • 2011: Panda (killed content farms)

  • 2012: Penguin (killed fake backlinks)

  • 2019: BERT (true language understanding)

Sites lost 90%+ of traffic overnight. Some businesses shut down entirely.

Here's what I learned going down the "Adversarial Information Retrieval" rabbit hole:

Google uses 200+ ranking factors. You might game one or two. But gaming all 200 consistently? Mathematically nearly impossible.

They cross-verify everything. If you claim to be "Head of Product at Neoflo" on your site, Google checks if Neoflo's website, LinkedIn, and news articles agree. Inconsistency = red flag.

The lesson: Manipulation has a short shelf life. Being genuinely useful is the only strategy that survives.

How Humans Actually Read Your Website

Forget bots for a second. Humans are a different animal.

Humans don't read. They scan.

You have 3 seconds to hook someone before they bounce. Eyes move in an F-pattern - across the top, down the left side. Headlines get read. Body text? Maybe.

What humans want:

  • "What is this?" (answer in 3 seconds)

  • "Why should I care?" (what's in it for me)

  • "Can I trust you?" (credentials, proof, design)

  • "What next?" (clear CTA)

What humans ignore:

  • Walls of text

  • Corporate jargon ("synergies," "leverage," "solutions")

  • Generic stock photos

  • Anything requiring effort

Humans are selfish readers. They don't care about you. They care about their problem.

Your job? Convince them - fast - that you can help.

The New Player: AI Agents

Here's where it gets interesting.

People don't just Google anymore. They ask AI.

"Who's a good AI product manager in India?" "What's an example of agentic AI in healthcare?" "What does Neoflo do?"

If an AI can't find accurate information about you, you don't exist in that conversation.

How AI agents are different:

Google crawls, indexes, ranks. Gives you a list of links.

AI agents search, extract, synthesize. They read your page, pull out facts, generate an answer. No clicking required.

What AI agents want:

  • Clear, factual statements ("Shubham is Head of Product at Neoflo.ai")

  • Structured information with headings

  • Consistency across your web presence

  • Something called llms.txt (more on this in a second)

What AI agents struggle with:

  • Vague marketing speak ("transforming the future of innovation")

  • Content buried in images or complex JavaScript

  • Contradictory information across pages

Same website. Two completely different parsing strategies.

llms.txt - The New Robots.txt

This blew my mind.

You know robots.txt - the file that tells Google what it can crawl?
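For reference, a minimal robots.txt is just a few plain-text directives:

User-agent: *
Allow: /
Sitemap: https://yoursite.com/sitemap.xml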

There's a new convention: llms.txt. Same idea, but for AI agents.

Put it at yoursite.com/llms.txt - a plain-text summary of who you are and what you do. AI agents can read it directly without parsing your entire site.

Here's what I put in mine:

# Shubham Shrivastava
> Head of Product at Neoflo.ai. Building AI that automates the CFO tech stack.

Simple. Plain text. Designed for machines to understand instantly.
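If you want to go further, the llms.txt convention (documented at llmstxt.org) also supports markdown sections of links, so agents can jump straight to your key pages. A slightly fuller sketch - the paths below are placeholders, not my real URLs:

# Shubham Shrivastava
> Head of Product at Neoflo.ai. Building AI that automates the CFO tech stack.

## Pages
- [About](https://heyshubh.com/about): who I am, credentials, how to reach me
- [Writing](https://heyshubh.com/blog): essays on AI products and SEO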

Google doesn't use this yet. But ChatGPT, Perplexity, and Claude can.

As AI search grows, this becomes your SEO edge.

The Two-Audience Problem

So now you're writing for two very different readers:

Humans: Scan, skim, bounce in 3 seconds. Want to know "why should I care?" Trust signals from design and social proof.

AI Agents: Parse, extract, synthesize in milliseconds. Want to know "what are the facts?" Trust signals from consistency and verification.

The good news: There's overlap. Clear writing, good structure, and factual statements work for both.

The bad news: Clever-but-vague marketing copy fails for both.

"We're revolutionizing the paradigm of enterprise synergies" helps no one.

What I Actually Did for Neoflo

Here's my checklist after all this research:

For Google:

  • Page titles under 60 characters

  • Meta descriptions under 155 characters

  • Proper heading structure (H1 → H2 → H3)

  • Alt text on all images

  • Fast load times, mobile-friendly
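In practice, most of that Google list lives in a few lines of markup. Roughly what it looks like - the copy here is illustrative, not Neoflo's actual tags:

<head>
  <!-- Title: under 60 characters, front-loaded with what the page is -->
  <title>Neoflo - AI for the CFO Tech Stack</title>
  <!-- Meta description: under 155 characters, written as the search snippet you want -->
  <meta name="description" content="Neoflo builds AI that automates the CFO tech stack for finance teams.">
</head>
<body>
  <h1>AI that automates the CFO tech stack</h1>
  <h2>What Neoflo does</h2>
  <!-- Every image gets alt text describing what it shows -->
  <img src="dashboard.png" alt="Neoflo dashboard automating invoice workflows">
</body>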

For Humans:

  • Clear headline in 3 seconds

  • Scannable sections

  • Zero jargon

  • Obvious call to action

For AI Agents:

  • Added JSON-LD Schema (structured data markup)

  • Created llms.txt file

  • Made sure facts are stated clearly, not buried in fluff

  • Consistent information across all pages
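"JSON-LD Schema" sounds heavier than it is. It's one script tag of machine-labeled facts. A minimal Person schema, roughly what I mean - a sketch, not the production markup:

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Person",
  "name": "Shubham Shrivastava",
  "jobTitle": "Head of Product",
  "worksFor": {
    "@type": "Organization",
    "name": "Neoflo.ai",
    "url": "https://neoflo.ai"
  },
  "url": "https://heyshubh.com"
}
</script>

Same fact - "Shubham is Head of Product at Neoflo.ai" - but labeled so a machine doesn't have to infer it from prose.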

For Trust:

  • Same story on website, LinkedIn, and company pages

  • Real credentials that can be verified

  • Links to actual work

Do This Next

If you're revamping your website or optimizing for visibility, here's your action plan:

  • [ ] Create your llms.txt file. Put it at yoursite.com/llms.txt. Include who you are, what you do, and how to contact you. Plain text. Factual.

  • [ ] Audit your homepage for clarity. Open it on your phone. Can someone understand what you do in 3 seconds? If not, rewrite the headline.

  • [ ] Cross-verify your facts. Google yourself. Check LinkedIn. Check company pages. Make sure the story is consistent everywhere. AI agents check this.

  • [ ] Test for AI readability. Ask ChatGPT "What does [your company] do?" If it gives a vague or wrong answer, your site isn't clear enough.

  • [ ] Cut the jargon. Find every instance of "leverage," "synergy," "solutions," "paradigm." Delete or replace with plain language.

Start with llms.txt. It takes 10 minutes and immediately makes you more discoverable to AI agents.

Key Takeaways

  • You're optimizing for two audiences now. Humans who scan in 3 seconds. AI agents that parse in milliseconds. Both reward clarity.

  • Black hat SEO is dead. Google uses 200+ ranking factors and cross-verifies everything. The only strategy that survives is being genuinely useful.

  • Humans are selfish readers. They don't care about you. They care about their problem. Answer "why should I care?" in 3 seconds or lose them.

  • AI agents are the new gatekeepers. If ChatGPT can't find accurate info about you, you don't exist in that conversation. llms.txt fixes this.

  • Consistency compounds trust. Same story across your website, LinkedIn, company pages, and llms.txt. Inconsistency triggers red flags for both Google and AI agents.


More AI Product Insights

I write about what I'm building and learning at heyshubh.com.

Connect with me on LinkedIn - always up for talking about AI product challenges.

Next week: Claude's new Chrome integration - is it actually better than Comet and Dia, or just more hype? I've been testing all three. The results aren't what I expected.