I’ve been seeing a lot of folks vibe coding websites for small businesses and startups lately, so I finally decided to jump in myself.
First observation: most AI builders default to SPAs (single-page apps). Great for modern UX, terrible by default for SEO. You don’t really get clean, crawlable, search-engine-friendly pages unless you’re intentional about it (a.k.a. burning through credits, as I'll discuss later).
That said, the stuff people are shipping with vibe coding is genuinely impressive. After leaving my last company post-acquisition, I wanted to experiment and build something useful.
One of my first projects was a directory—but I wanted it to actually show up in search.
Enter: sitemaps.
Exit: my sanity.
The project
The first place I hit this wall was Allergy Safe Menus, a directory of restaurants in North America with food allergy menus.
I have both friends and family members with severe food allergies, and every trip or night out turns into a scavenger hunt through PDFs, outdated menus, and awkward conversations with staff. I realized there wasn’t a single, up-to-date place to find allergy menus.
So I:
- Used Grok to help collect menus
- Generated summaries + allergen tags
- Stored everything in a Google Sheet
- Shipped an MVP once I had a couple dozen restaurants
Cool. Pages existed. Content existed. Time to get indexed.
“No problem, just add a sitemap”
I’ve handled SEO for thousands of website clients. This is muscle memory.
“Create a sitemap.xml with all content pages, submit to Google Search Console, done.”
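For context, a standard sitemap.xml is just a `urlset` of `<loc>` entries per the sitemaps protocol. A minimal sketch of generating one (the function name and URLs are illustrative, not my actual code):

```typescript
// Build a standard sitemap.xml string from a list of absolute page URLs.
function buildSitemapXml(urls: string[]): string {
  // One <url><loc>…</loc></url> entry per page.
  const entries = urls.map((u) => `  <url><loc>${u}</loc></url>`).join("\n");
  return (
    `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n` +
    entries +
    `\n</urlset>\n`
  );
}
```

Generating the string is the easy part. Serving it with the right headers from an SPA is where the fun starts.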
It was not done.
Issue 1: Works locally, breaks in prod
Classic. XML route worked in testing, failed after deploy. A few prompts, some Chrome console debugging, eventually fixed at the expense of 4-5 credits.
Issue 2: Wrong domain
The sitemap technically loaded… just not on allergysafemenus.com/sitemap.xml
It was living on some randomly generated lovable.dev URL, which Google rejected since it wasn't on the main domain.
More credits burned, but eventually got the sitemap route on the correct domain.
Issue 3: Dynamic pages not updating the sitemap
New restaurants were being added. Pages were rendering. Sitemap wasn’t updating.
Back to prompting. Eventually fixed dynamic generation.
But Google still rejected it. 😡
Issue 4: React routing + headers from hell
This was the real fight.
React routing is built for HTML, not XML. No matter what I did, the sitemap response kept getting HTML headers wrapped around it. The URL worked. The content looked fine in-browser. Google still said nope.
At this point I’d burned ~25–30% of my Lovable credits on what should be a 10-minute task.
The workaround that finally worked
This is where old-school web dev instincts kicked in.
Instead of continuing to fight headers and routing:
1) I switched from XML to TXT
Google allows TXT sitemaps, and TXT doesn’t care about XML parsing or headers. I had a hunch it would be easier: Google Search Console would simply read it as plain text instead of validating the formatting and headers.
2) I made it a static file.
No dynamic routing. No React interception. Just a plain file Google can fetch.
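For reference, the entire TXT sitemap format is just one absolute URL per line. No tags, no namespaces, no Content-Type drama (these URLs are illustrative):

```
https://allergysafemenus.com/
https://allergysafemenus.com/restaurants/example-restaurant
```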
Google accepted it instantly. 🎉
Keeping it up to date
Of course, static files introduce a new problem: freshness.
So I added a hook that updates the sitemap file whenever a new page is created. Static delivery, dynamic updates.
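I won’t pretend this is my exact code, but the hook boils down to something like this Node sketch (the file path, function name, and URLs are assumptions for illustration):

```typescript
// Hypothetical hook: when a new page is published, append its URL to the
// static sitemap.txt, skipping duplicates. Path is an assumed location.
import { readFileSync, writeFileSync, existsSync } from "fs";

const SITEMAP_PATH = "public/sitemap.txt";

function addToSitemap(pageUrl: string, path: string = SITEMAP_PATH): void {
  // Read whatever URLs are already listed (one per line).
  const existing = existsSync(path)
    ? readFileSync(path, "utf8").split("\n").filter(Boolean)
    : [];
  if (existing.includes(pageUrl)) return; // already listed, nothing to do
  existing.push(pageUrl);
  // Rewrite the file: one absolute URL per line is the whole TXT format.
  writeFileSync(path, existing.join("\n") + "\n");
}
```

Call it from wherever new pages get created, and the static file stays fresh without any dynamic route for Google to choke on.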
Problem solved. Take that, AI!
Update
I’ve since built multiple projects on Lovable, Bolt, Rork, and Base44 and hit this exact issue again.
On my latest project, an AI WordPress theme generator & website builder, I skipped the pain by applying the same solution from the start, so the free WordPress themes in our WordPress theme directory would get indexed. The coding tools still tripped up, but a whole lot less: only 6 or 7 credits to get it right.
If you’re vibe coding on Lovable (or any similar builder) and wondering why SEO feels cursed, this is probably why. Take my advice and do the above.
AI builders CAN ship amazing apps as an MVP.
SEO, though? Still needs a human in the loop.
