Your content stays yours. Always.
Five principles. Zero compromises.
Read-Only Access
enhancely reads your rendered HTML output — the same way Google's crawler does. It never writes to your CMS, modifies your files, or accesses your backend. No database connections. No write permissions. No admin access.
No Content Storage
Your page content is processed in real time and never persisted. Only the generated JSON-LD schema is stored — the structured data output, not your source content. When we re-crawl, we read from your live site, not from stored copies.
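To make "structured data output" concrete: what gets stored is a JSON-LD block like the one below. This is a hypothetical LocalBusiness example for illustration only, not output from any real page:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "12 Example Street"
  },
  "openingHours": "Mo-Fr 08:00-18:00"
}
```

Only this generated block is kept; the HTML it was derived from is discarded after processing.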
Factuality Guarantee
The factuality layer cross-references every piece of generated markup against your actual page content. No hallucinated opening hours. No phantom prices. No outdated addresses. Only verifiable facts from your live pages end up in the schema.
Full Compliance
Every schema output is validated against Google, Bing, and AI search engine guidelines before deployment. Non-compliant markup is blocked — it never reaches your site. This protects you from potential penalties for misleading structured data.
Full Control
The enhancely dashboard shows every schema generated for every page. Delete anything, block any page, trigger re-generation at any time. You retain full control over what structured data represents your content.
Every schema passes three gates.
Validity
Factuality
Compliance
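The three gates above can be sketched as sequential checks that markup must clear before deployment. This is a minimal illustration in Python under assumed gate logic — the actual rules each gate applies are not described here, so every function below is an illustrative stand-in, not the real pipeline:

```python
import json


def gate_validity(schema_str):
    """Gate 1: the markup must parse as JSON and declare a Schema.org type."""
    try:
        data = json.loads(schema_str)
    except json.JSONDecodeError:
        return False
    return isinstance(data, dict) and "@type" in data


def gate_factuality(schema, page_text):
    """Gate 2 (illustrative): every plain string value must appear verbatim
    in the live page text -- no hallucinated hours, prices, or addresses."""
    return all(
        value in page_text
        for key, value in schema.items()
        if not key.startswith("@") and isinstance(value, str)
    )


def gate_compliance(schema, required=frozenset({"@context", "@type"})):
    """Gate 3 (illustrative): fields required by search-engine guidelines
    must be present, or the markup is blocked before deployment."""
    return required.issubset(schema)


def passes_all_gates(schema_str, page_text):
    """A schema is deployed only if it clears all three gates in order."""
    if not gate_validity(schema_str):
        return False
    schema = json.loads(schema_str)
    return gate_factuality(schema, page_text) and gate_compliance(schema)
```

For example, markup naming "Example Bakery" passes against a page that actually mentions that name, and is blocked against a page that does not.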
Read-only. Always.
Numbers don't lie.
Built on research. Not promises.
The University of Nantes published peer-reviewed research on LLM-generated Schema.org markup. Finding: 40–50% of markup generated by AI without a curation pipeline is invalid, non-factual, or non-compliant.
After applying a three-layer curation pipeline, GPT-4 outperforms human annotators. enhancely is built on exactly this architecture. That's not a claim. It's published science.