
FoxChat for docs sites — turn your documentation into a conversation

A vertical-specific look at what Foxy does on developer documentation sites: where it beats broken keyword search, where it hands off to humans, and how to set it up on Mintlify, Docusaurus, GitBook, ReadMe, or a custom docs stack.

01 The docs-site visitor problem

Docs site search is broken in a specific way. Every docs platform ships a search box, every search box runs on keyword indexing, and keyword indexing fails the moment a visitor uses different words than the docs author used. A developer searches for "rate limit reset" and gets nothing because the page calls it "request budget refresh interval". A reader searches "how to bulk update" and finds nothing because every example in the docs is titled "batch operations". The content is there. The visitor cannot get to it.

The deeper failure is that docs visitors are not searching, they are asking. They want to ask "how do I retry a failed webhook delivery?" and get a working answer with the right code snippet and the next caveat to watch out for. Forcing them through a search box that needs keyword matching turns every question into a guessing game about the docs writer's vocabulary, and most visitors give up after two failed searches. Some go to Stack Overflow. Most close the tab and assume your product cannot do what they want.

The compounding problem is that docs are paged and chunked. A real answer often spans two or three pages — setup on one page, configuration on another, troubleshooting on a third. Keyword search returns one page at a time and leaves the visitor to assemble the answer themselves. That assembly cost is invisible to docs maintainers but corrosive to the developer-experience metric every API team cares about.

02 What Foxy does on a docs site

Foxy reads your entire docs tree — API references, guides, tutorials, changelogs, FAQs — and answers visitor questions in plain language with the source still attached. The retrieval is hybrid: a semantic search ranks passages by meaning so synonyms work, and a full-text search catches literal API method names, error codes, and parameter strings. Both signals are blended, which is why Foxy keeps answering correctly when a visitor types "rate limit reset" against docs that say "request budget refresh interval".
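The blending described above can be sketched in a few lines. This is an illustrative sketch only, not FoxChat's actual retrieval code: the interface, scores, weighting, and page IDs below are all made up to show why a synonym-heavy query can still rank the right page.

```typescript
// Hypothetical hybrid-retrieval blend: every name and number here is
// illustrative, not FoxChat's implementation.
interface Passage {
  id: string;
  semanticScore: number; // 0..1, similarity of meaning to the query
  keywordScore: number;  // 0..1, normalized full-text match score
}

// Blend both signals so synonym-heavy queries (high semantic, low keyword)
// and literal identifiers (low semantic, high keyword) both rank well.
function blendScores(passages: Passage[], alpha = 0.6): Passage[] {
  const blended = (p: Passage) =>
    alpha * p.semanticScore + (1 - alpha) * p.keywordScore;
  return [...passages].sort((a, b) => blended(b) - blended(a));
}

// "rate limit reset" against docs that say "request budget refresh interval":
// the right page scores high semantically despite zero keyword overlap.
const ranked = blendScores([
  { id: "request-budget-refresh", semanticScore: 0.91, keywordScore: 0.05 },
  { id: "unrelated-rate-table",   semanticScore: 0.20, keywordScore: 0.40 },
]);
```

With these toy scores the semantically matched page wins the blend even though a pure keyword ranker would have preferred the other one.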

Foxy assembles answers across pages. When a visitor asks "how do I retry a failed webhook", Foxy pulls the webhook overview from the conceptual guide, the retry policy from the configuration reference, the example payload from the troubleshooting page, and the error-code list from the API reference. The answer reads as one coherent response with citations to each source page. The visitor never has to assemble it themselves.

Where retrieval confidence is too low to answer well — an obscure edge case that is not documented, a question about a deprecated endpoint, a feature request — Foxy says so and offers a clean handoff. Most docs teams route those to a developer-relations inbox so an engineer can answer in the same thread, and the answer they write becomes a new docs entry on the way out.

03 Where Foxy helps most

Three docs surfaces account for most of the lift, and API references are the highest-value of the three. API ref pages are dense, structured, and almost always require cross-referencing to be useful. A developer reading the create-user endpoint also needs the auth header format, the rate-limit rules, and the related update-user endpoint to actually build against the API. Foxy assembles that cross-reference automatically and offers a working code example in the language the visitor's browser implies.

How-to guides are the second. Guides usually cover a happy path and skip the variations a real implementer hits. Foxy can take a guide question like "how do I do this but with SSO?" and stitch the SSO authentication guide into the implementation guide so the visitor sees the combined steps in one answer. Same for "how do I do this against a sandbox environment", "how do I do this from a server-side language", and any other axis-of-variation that gets glossed in the canonical guide.

Troubleshooting is the third. Troubleshooting pages are read by visitors with an error in front of them. They paste the error message into search and want the cause, the fix, and the prevention. Foxy handles error messages well because semantic retrieval still works on terse error strings, and Foxy can pull in the relevant API ref page so the visitor sees what the offending field expected. That triage is the single tightest feedback loop a docs site can offer.

A fourth lift that compounds over time is onboarding and quickstart coverage. New users opening the docs for the first time tend to ask the same set of questions in the same order: where do I get my API key, what is the minimum viable request, how do I authenticate, what is the cheapest plan that includes feature X. Foxy answers all of those from your quickstart and your pricing-detail pages, and it does so without forcing the new visitor to read four pages of conceptual material before they hit the first code sample. The onboarding deflection effect is often invisible until you look at the data and see new-visitor support tickets down by half in the first quarter after deploy. The lift is real, it is just quiet, and that is the right shape for a developer-experience improvement.

04 Where Foxy does not help

Custom code debugging needs a human. When a visitor pastes 200 lines of their own code and asks "why is this not working", Foxy is not the answer. The reasons are real: Foxy does not run code, does not see the visitor's environment, does not know which versions of which dependencies are pinned, and does not have access to the actual response payload the failing request produced. Any answer Foxy gives in that thread would be a confident guess, and confident guesses on debugging questions burn trust faster than no answer at all.

The right pattern is a clean handoff. Foxy can confirm the relevant docs sections, narrow the failure to a likely cause, and then offer to escalate the thread to a developer-relations operator with the visitor's code snippet, the error they hit, and the documentation they have already read attached. The operator picks up a triaged thread instead of a cold ticket. That is significantly faster for both sides than dumping the visitor into a Discord or a forum and hoping someone in a different timezone answers.

05 Examples of docs-site conversations Foxy handles

Developer on an API reference page
"What's the difference between the v1 and v2 create-user endpoint?"
Foxy pulls both endpoints from the reference, surfaces the parameter differences, links to the migration guide, and notes that v1 is deprecated and scheduled for removal next quarter. It offers a side-by-side example payload so the developer can see the diff without flipping tabs. No human in the loop.
Reader searching the guides section
"How do I retry a failed webhook delivery? I'm getting 503s intermittently."
Foxy pulls the retry policy from the webhook configuration page, the recommended backoff strategy from the reliability guide, and the related dead-letter queue documentation, and assembles a single answer that says: enable automatic retries, set exponential backoff with jitter, configure a dead-letter URL for permanent failures. It cites all three pages. The reader has the working answer.
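The "exponential backoff with jitter" part of that answer is a standard pattern, and it can be sketched briefly. The function name, base delay, and cap below are illustrative defaults, not any platform's actual retry configuration; the jitter source is injectable only so the schedule is testable.

```typescript
// Illustrative retry schedule: exponential backoff with full jitter and a
// cap. baseMs/capMs are made-up defaults, not a documented configuration.
function backoffDelaysMs(
  attempts: number,
  baseMs = 500,
  capMs = 30_000,
  // Injectable randomness so tests are deterministic; use Math.random in practice.
  jitter: () => number = Math.random,
): number[] {
  const delays: number[] = [];
  for (let attempt = 0; attempt < attempts; attempt++) {
    // Double the window each attempt, but never past the cap.
    const windowMs = Math.min(capMs, baseMs * 2 ** attempt);
    // "Full jitter": pick uniformly in [0, windowMs] so many failing
    // webhooks don't all retry at the same instant.
    delays.push(Math.floor(jitter() * windowMs));
  }
  return delays;
}
```

The cap matters for intermittent 503s: without it, a long outage pushes retries hours apart; with it, delivery resumes promptly once the receiver recovers.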
Visitor pasting an error message
"I'm getting an invalid-audience error on token exchange, what does this mean?"
Foxy retrieves the matching error entry from the troubleshooting page, explains the cause (audience claim does not match the configured client), and walks the visitor through verifying the audience parameter in their token-request payload against the dashboard setting. It includes a sample request. The visitor fixes the integration without opening a support ticket.
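The check behind that walkthrough is simple enough to show. This is a debugging-only sketch under stated assumptions: it decodes a JWT payload without verifying the signature, and the token and client ID below are fabricated for the example, not real credentials or a real FoxChat response.

```typescript
// Debugging sketch for the invalid-audience case: decode the token's
// payload (NO signature verification, never use this for auth decisions)
// and compare its `aud` claim to the client ID configured in the dashboard.
function decodeJwtPayload(token: string): Record<string, unknown> {
  const payloadB64 = token.split(".")[1];
  const json = Buffer.from(payloadB64, "base64url").toString("utf8");
  return JSON.parse(json);
}

function audienceMatches(token: string, configuredClientId: string): boolean {
  const aud = decodeJwtPayload(token).aud;
  // Per RFC 7519, `aud` may be a single string or an array of strings.
  return Array.isArray(aud)
    ? aud.includes(configuredClientId)
    : aud === configuredClientId;
}

// Fabricated unsigned token for the example; only the payload segment matters here.
const payload = Buffer.from(JSON.stringify({ aud: "client-abc" })).toString("base64url");
const token = `header.${payload}.signature`;
```

Running `audienceMatches(token, ...)` against the dashboard's client ID tells the visitor whether the `aud` claim is the field to fix before they touch anything else in the token request.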
Developer asking about a deprecated feature
"How do I use the legacy XML export endpoint?"
Foxy confirms the endpoint exists but is deprecated, points to the migration guide for the JSON export that replaced it, and notes the deprecation timeline. If the visitor insists they need XML specifically because of a downstream legacy system, Foxy offers to escalate to a developer-relations operator who can suggest a workaround or confirm whether an extension is planned.

06 Setup specifics for docs sites

Mintlify, Docusaurus, GitBook, ReadMe. All four hosted-docs platforms allow custom HTML in the layout or a script-tag injection in the site configuration. The FoxChat widget is one script tag, so it slots into whichever escape hatch your platform exposes. The crawler respects your sitemap and the standard docs metadata, so the knowledge base is built from your published content without manual page-by-page upload. For docs platforms that namespace content under versions, FoxChat indexes the version you mark as the canonical default, so visitors do not get answers from a year-old version of the API.
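On Docusaurus, for example, the script-tag escape hatch is the `scripts` field in the site config. The `scripts` mechanism is Docusaurus's own; the widget URL and `data-site-id` attribute below are placeholders for whatever the FoxChat dashboard actually gives you, not the real embed snippet.

```typescript
// docusaurus.config.ts -- the src and data attribute are placeholders;
// substitute the snippet from your FoxChat dashboard.
const config = {
  title: "Example Docs",
  url: "https://docs.example.com",
  baseUrl: "/",
  scripts: [
    {
      src: "https://cdn.example.com/foxchat-widget.js", // placeholder URL
      async: true,
      "data-site-id": "YOUR_SITE_ID", // placeholder ID
    },
  ],
};

export default config;
```

The other hosted platforms expose the same idea under different names: a custom-HTML block in the layout settings rather than a config field.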

Custom docs stacks. A static-site-generated docs build — Hugo, Eleventy, Astro, MkDocs, anything else — gets the same single script tag in the base template. If you publish OpenAPI specs alongside your docs, FoxChat can index those directly so the API reference is searchable by parameter name, method, and example payload, not just by the rendered HTML. The reindexer runs on a cadence you set in the dashboard so new pages and updated endpoints are searchable within hours of publish. See the install guides for a custom stack.

07 Try Foxy on your docs site

The 14-day trial gives you a fully indexed docs knowledge base from your live site within minutes of signup. The widget runs alongside your existing search, so you can compare the two surfaces head to head before deciding to remove the search box or keep both. Most docs teams keep both for the first month and watch which one their visitors choose. Start your trial, see the full feature list, or check the pricing page first.

Start a 14-day FoxChat trial on your docs site

No credit card. One script tag. Foxy is answering API and how-to questions from your existing docs within an hour.

Start your free trial
See the demo