AI Uses the Same Data Quality Signals as Search Engines
Companies need structured data strategies before AI search dominance becomes permanent

AI adoption moves fast. Faster than companies realize.
While executives debate whether AI resembles "that internet thing," the technology is already changing how consumers find information.
This isn't gradual change. It's sudden displacement. AI systems don't wait for businesses to catch up.
We aren’t adopting AI. AI is adopting us.
The discussions at the SIINDA (Search and Information Industry Association) conference in Cologne surfaced some great insights. After my presentation, I must have had a dozen conversations that boiled down to "okay, but how?!"
These new AI systems operate on frameworks we've understood for decades.
They just apply them differently.
The Foundation We Already Know
Current AI systems build on the information quality framework established by Wang and Strong (Wang & Strong, 1996).
Their research identified the dimensions that determine how both humans and systems assess information quality. Search and AI systems need ways to evaluate information about a brand or business, focusing on signals such as accuracy, frequency, recency, consistency across platforms, and depth of data.
These principles haven't changed. Search engines have always relied on the same quality signals. Authority refers to a source's expertise or recognized official status, while frequency and consistency help establish reliability across sources. What's different now is how AI agents weigh and process these signals.

[Illustrative chart: the same quality signals, weighted differently by each system. The bar values are not meant to be precise.]
AI systems use the same data quality framework, but with different model constructs and citation weightings. Each system interprets authority differently. Google's algorithm values specific signals. Claude prioritizes others. ChatGPT weighs consistency against recency using its own calculations.
This fragmentation creates new complexity. Where traditional search offered relatively consistent ranking factors, AI agents operate like separate search engines with unique preferences.
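To make the fragmentation concrete, here's a toy sketch in Python. The signal names mirror the quality dimensions discussed above, but every number is invented for illustration; no vendor publishes its actual weightings.

```python
# Toy model: the same quality signals, weighted differently per system.
# All signal and weight values below are invented purely for illustration.

signals = {"accuracy": 0.9, "recency": 0.6, "consistency": 0.8, "depth": 0.5}

# Hypothetical weight profiles; no AI vendor publishes its real numbers.
profiles = {
    "search_engine": {"accuracy": 0.4, "recency": 0.1, "consistency": 0.3, "depth": 0.2},
    "ai_assistant": {"accuracy": 0.3, "recency": 0.4, "consistency": 0.2, "depth": 0.1},
}

def quality_score(signals, weights):
    """Weighted average of quality signals under one system's weightings."""
    return sum(signals[name] * w for name, w in weights.items()) / sum(weights.values())

# The same underlying data earns a different score from each system.
for system, weights in profiles.items():
    print(f"{system}: {quality_score(signals, weights):.2f}")
```

The data never changes between the two calls; only the weight profile does, which is exactly why the same brand can surface prominently in one system and barely register in another.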
Knowledge Graphs Change Everything
Knowledge graphs represent the most efficient way to feed information to AI systems. They solve two critical problems simultaneously.

"Knowledge graphs are infinitely better than vector search for building the memory of AI agents. With five lines of code, you can build a knowledge graph with your data. When you see the results, you'll never go back to vector-mediocrity-land."
First, they reduce computational costs. Structured data enables large language models to process information more efficiently because it is already organized.
The AI doesn't need to parse and interpret unstructured content during every interaction.
Second, they address the mismatch between human-designed websites and machine consumption. Our current web infrastructure optimizes for human browsers, not AI agents.
Consider the size difference, as I covered a few weeks ago…
Traditional websites force AI systems to download massive files, render complex layouts, and extract meaning from content designed for human eyes. This approach wastes computational resources and slows response times.

You can try this yourself: ask Claude to extract structured data from your page.
Knowledge graphs flip this equation. They present information in machine-readable formats that AI can consume immediately. No rendering required. No visual parsing needed.
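As a toy illustration of the difference, consider the same cafe listing (a made-up example) expressed as page markup versus as schema.org JSON-LD. Real pages carry orders of magnitude more HTML, CSS, and JavaScript than this snippet, so the real-world gap is far larger.

```python
import json

# The same cafe listing rendered for humans (markup) vs machines (JSON-LD).
# All names and values here are placeholders.
html = (
    "<html><head><title>Acme Coffee</title>"
    '<link rel="stylesheet" href="styles.css"><script src="app.js"></script>'
    '</head><body><div class="hero"><h1>Acme Coffee</h1>'
    "<p>Open Mon to Fri, 7am to 6pm, at 12 Example St, Cologne.</p>"
    "</div></body></html>"
)

jsonld = json.dumps({
    "@context": "https://schema.org",
    "@type": "CafeOrCoffeeShop",
    "name": "Acme Coffee",
    "address": "12 Example St, Cologne",
    "openingHours": "Mo-Fr 07:00-18:00",
})

print(len(html.encode()), "bytes of markup to download, render, and parse")
print(len(jsonld.encode()), "bytes an agent can consume directly")
```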
Start Building Structured Data Now
AI adoption happens to us, not because of us.
Every day a company delays building structured data systems, it falls further behind in AI search rankings.
"One way to think about AI Agent opportunities is to figure out which software categories are customers not fully taking advantage of because they don't have the resources to leverage the tools. The vast majority of software in the world is underutilized relative to what it can do, solely because there aren't people on the other end to use the service productively."
Companies should immediately begin converting their key content into structured formats. This includes product information, company details, expertise demonstrations, and customer case studies. The goal is to create machine-readable versions of everything currently locked in human-optimized web pages.
Focus on these action steps:
Audit existing content for structured data opportunities
Implement JSON-LD markup for key business information (see the sketch after this list)
Create knowledge graph representations of products and services
Build knowledge graphs for any data set that would help a consumer or business understand what you have to offer
Establish consistent data formats across all customer touchpoints
Monitor how different AI systems interpret and cite your information
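For the JSON-LD step, here's a minimal sketch of what that markup might look like for a hypothetical business; every name, URL, and product below is a placeholder.

```python
import json

# JSON-LD describing a hypothetical business and one of its products.
# Every name, URL, and value here is a placeholder, not real data.
org = {
    "@context": "https://schema.org",
    "@type": "Organization",
    "name": "Acme Coffee",
    "url": "https://example.com",
    "sameAs": ["https://www.linkedin.com/company/example"],
    "makesOffer": {
        "@type": "Offer",
        "itemOffered": {
            "@type": "Product",
            "name": "Barista Espresso Machine",
            "description": "Dual-boiler espresso machine for small cafes.",
        },
    },
}

# Embed the output in a <script type="application/ld+json"> tag on the page.
print(json.dumps(org, indent=2))
```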
The Infrastructure Challenge
Building knowledge graphs doesn't have to be a significant technical effort. Companies like Yext (my employer) offer services that make it easier to get your data into this kind of structure. You can also think of a graph as a series of connected tables.
The point is that you're not feeding AI paragraph after paragraph of fluffy marketing copy about your products, services, and locations. You're feeding it highly structured data that it can process much faster.
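If the connected-tables framing helps, here's a tiny sketch: two entity tables plus a link table whose rows are, in effect, the edges of a graph. All data is made up.

```python
# Two entity tables, keyed by ID, plus a link table of relationships.
# All rows are made up for illustration.
products = {"p1": {"name": "Espresso Machine"}, "p2": {"name": "Grinder"}}
stores = {"s1": {"city": "Cologne"}, "s2": {"city": "Berlin"}}

# Each row of the link table is an edge: (product_id, relation, store_id).
stocked_at = [
    ("p1", "stocked_at", "s1"),
    ("p1", "stocked_at", "s2"),
    ("p2", "stocked_at", "s1"),
]

# Walking the link table answers questions without parsing any prose.
for pid, rel, sid in stocked_at:
    print(f"{products[pid]['name']} {rel} -> {stores[sid]['city']}")
```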
I know it's uncomfortable, but the reality is that we're headed to a place where AI agents are the only visitors to our websites, and it will be a lot better for everyone if you can feed them more efficiently.
AI companies won't rebuild infrastructure to accommodate poorly structured business data. They expect information in formats their systems can consume efficiently.
Expecting them to render complex websites just to extract your data is a losing bet.
This creates a divide between early adopters and laggards. Companies that invest in structured data now will dominate AI search results. Those who wait will find themselves invisible to AI agents.
Memory Changes User Behavior
Memory features add context to user interactions, creating persistent relationships between consumers and AI systems. This contextual memory changes how people interact with brands through AI.
AI doesn't adopt gradually. It adopts us suddenly. This represents a significant change from previous technology adoption patterns. We've never had technology that understands us better than we understand ourselves.
This changes search dynamics completely. AI agents will make recommendations based on data patterns humans can't process. They'll identify connections we miss and suggest solutions we wouldn't consider.
Brands that establish trust signals now through consistent, high-quality structured data will benefit from AI recommendation engines. Those that don't will lose visibility as AI agents struggle to interpret their information.
Companies that value consistency, frequency, updates, real-time information, and data depth need to start now.
Adding data into AI systems later remains possible, but early signals carry more weight.
The Acceleration Timeline
AI agent adoption accelerates daily. The more structured data companies provide to AI agents today, the better positioned they'll be tomorrow. Financial services and healthcare companies have enormous opportunities here, but also a considerable distance to cover.
Traditional search took years to establish ranking factors. AI systems establish preferences in months or weeks. The acceleration creates both opportunity and risk.
Bottom Line?
AI systems mirror traditional search frameworks but operate at unprecedented speed with unique weights.
Companies must build structured data systems now or risk invisibility in AI search.