Publishing in the AI era

How AI changed the way readers resolve questions on recipe pages


by Nikhil Mundhra and Ashwin Chepuru

Apr 17, 2026

7 min read
[Illustration: a recipe page with a subtle AI-generated question hovering above it]

What this article is about

Everyone in publishing is talking about traffic loss from AI search. This article is about something else. When a reader is already on a recipe page and a question forms in their mind, how they resolve that question is quietly shifting. The reader stayed. The question just went somewhere else.

The questions a recipe sparks

A recipe page generates a cascade of questions in the reader's mind. Not quick factual lookups; those were always Google's job. The questions that matter here are specific to this recipe, where the answer requires context, testing, or experience.

My dough is sticky after 4 cups of flour. Is this normal for this recipe or should I add more?

Does the overnight marinade actually matter, or can I get away with 2 hours?

I'm at 7,000 feet. How does this affect the rise time for this bread?

Someone used bread flour instead of all-purpose in this pizza dough and got triple the volume. Is that reliable?

I don't have Shaoxing wine. Will dry sherry change the flavor of this dish, or is it close enough?

These are questions where the right answer matters more than a fast one. They need someone who has actually made this recipe, tested the variation, or understands why the publisher insists on a specific technique.

Where these questions used to go

The primary destination was the comment section. Not just for the publisher's reply. Recipe comment sections are active communities. Readers post results of substitutions, warn about what didn't work, encourage each other, and build a collective knowledge layer around the recipe. Someone tried bread flour and got triple the volume. Someone skipped the overnight marinade and burned the sugar. Someone who lives at altitude figured out the adjustment and wrote it down.

This community-generated knowledge is one of the most valuable assets on a recipe site. Future readers browse it, search it, and absorb it. It gives them the confidence to start.

What's changing and what isn't

[Diagram: two side-by-side flows. In the past, reader questions stayed on the recipe page and flowed back to the publisher; now, questions leave for AI tools and no signal returns.]

What's changing is the retrieval. When a reader wants to find an answer that someone already wrote (in the comments, in the headnotes, in the publisher's FAQ), that retrieval is shifting to AI. Instead of scrolling through 140 pages of comments to find someone with the same question, the reader asks AI.

How this happens varies. They might ask the AI sidebar built into their browser while still on the page. They might copy the recipe URL and paste it into ChatGPT, Claude, or Perplexity. They might be browsing in an AI-native browser like Perplexity's Comet or OpenAI's Atlas, where the AI reads the page alongside them and answers contextually. They might ask their phone's assistant while their hands are in the dough. Or they might type the question into Google, where an AI Overview answers it using content extracted from the publisher's page, and they never scroll down to the actual results.

What is not changing is the posting. A reader who tries a substitution and wants to share the result will still post in the comments. A reader who had a great experience will still leave a review. Contribution is not driven by retrieval; it is driven by the desire to share. AI does not replace the impulse to tell other people "I tried this and it worked."

What is diminishing is the volume signal. When fewer readers leave questions in the comments because they are getting answers from AI, the publisher loses visibility into what readers struggle with. If 50 people asked whether bread flour could replace all-purpose in the comments over three years, the publisher would know that substitution is a common pain point and might add a section about it. If those 50 readers ask AI instead, the publisher never sees the pattern. The signal that would have informed future recipe development disappears.

The reader did not leave. The question did.

The broader shift is well documented

[Chart: zero-click searches now account for nearly 7 in 10 queries]
May 2024: 56% · May 2025: 69% · absolute change in 12 months: +13 percentage points

13 points in a single year. This trend is not slowing.

Source: Similarweb, May 2024 to May 2025. Share of Google searches that end without a click to any external result. The rise coincides with the rollout of AI Overviews and expanded on-SERP answers.

The behavior around AI search is well measured now. Recipe blogs were among the first categories affected by AI search summaries (AdExchanger, January 2026). Zero-click searches on Google rose from 56% to 69% between May 2024 and May 2025 (Similarweb). Pew Research tracked 68,000 searches and found a 46% click-through reduction on queries that surfaced AI summaries.

Those numbers describe what happens before the reader arrives at a recipe. What happens after the reader arrives, on the page itself, has been largely invisible.

What we measured

At Elephany we have been measuring the on-page shift. The widget we run on recipe sites fires an analytics event when a reader engages with an on-page question, and we can see how long the reader was on the page when that happened, what category the question fell into, and whether the reader stayed for the answer. Across millions of question impressions, three patterns stood out.
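To make the measurement concrete, here is a minimal sketch of what such an on-page question event might look like. The field names and values are illustrative assumptions, not Elephany's actual schema.

```python
from dataclasses import dataclass

# Hypothetical shape of the on-page question event described above.
# Field names are illustrative, not Elephany's actual schema.
@dataclass
class QuestionEvent:
    page_url: str            # the recipe page the reader was on
    seconds_on_page: float   # time on page when the reader engaged
    category: str            # e.g. "substitution", "scaling", "other"
    read_full_answer: bool   # did the reader stay for the answer?

event = QuestionEvent(
    page_url="https://example.com/pizza-dough",
    seconds_on_page=68.0,
    category="substitution",
    read_full_answer=True,
)
```

An event like this, aggregated across sessions, is enough to recover all three patterns discussed below: depth of engagement, question category mix, and answer consumption.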

Readers are deep into the recipe when the question forms.

[Chart: when the question surfaces, readers are already deep into the page]
Median time on page before first engagement: 68s · sessions over 1 minute before engaging: 52% · sessions over 2 minutes: 32%

These are the deepest readers. The question found them mid-recipe.

Source: Elephany widget analytics on recipe sites. A session counts here when a reader loaded the page and engaged with a question.

The median reader who engages with an on-page question has been on the page for about a minute already. Over half spend more than a minute before engaging. About a third spend more than two minutes. Only a small minority engage immediately. This is not bounce behavior. These are readers who have read the headnotes, scrolled to the ingredients, maybe scanned the method, and then hit a point where they need an answer that isn't on the page in front of them.
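The statistics above are straightforward to compute from per-session engagement times. A minimal sketch, using made-up timestamps rather than the actual dataset:

```python
from statistics import median

# Hypothetical time-on-page values (seconds) at the moment a reader
# first engaged with an on-page question. Illustrative data only.
engage_seconds = [15, 30, 45, 55, 68, 80, 100, 130, 180]

med = median(engage_seconds)
share_over_1m = sum(t > 60 for t in engage_seconds) / len(engage_seconds)
share_over_2m = sum(t > 120 for t in engage_seconds) / len(engage_seconds)

print(f"median: {med}s, >1 min: {share_over_1m:.0%}, >2 min: {share_over_2m:.0%}")
```

Note that the shares are cumulative: every session counted in the over-two-minutes bucket is also counted in the over-one-minute bucket.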

Substitution is the single largest question category.

[Chart: the questions that actually get asked on a recipe page — share of reader questions by category]

One in five engaged questions is a substitution query.

Substitution: 19.5% of reader questions. Real examples:
  • I don't have ricotta. What's a good substitute for this pizza?
  • I only have salted butter, is that okay to use in this recipe?
  • I don't have any sour cream, what can I use instead?
  • What can I use instead of golden syrup in this flapjack recipe?
  • No buttermilk in the fridge. Any alternatives for this spelt pizza dough?

Source: Elephany widget analytics on recipe sites. Over 11,600 reader questions categorized by keyword pattern. An additional 33% of questions fell into an "Other" bucket containing additions and customization questions, confidence checks, sensory expectations, and technique variations that did not match keyword patterns above. All categories are on-page practical questions about a specific recipe.

Roughly one in five engaged questions is a substitution query: "I don't have X, what can I use instead?" Clarification questions about technique, make-ahead and storage, equipment swaps, scaling, troubleshooting, and dietary adaptations together account for most of the rest. These are not questions Google answers well in a search box. They are questions that, a few years ago, would have gone into the comment section.
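The source note above says questions were categorized by keyword pattern. A minimal sketch of how that could work, with assumed patterns (the actual rule set is not published):

```python
import re

# Illustrative keyword patterns; the real categorizer's rules are assumptions.
# First matching category wins; everything else falls into "other".
CATEGORIES = [
    ("substitution", re.compile(r"\b(instead of|substitute|don't have|alternative)", re.I)),
    ("storage",      re.compile(r"\b(store|fridge|freeze|make ahead|ahead of time)", re.I)),
    ("scaling",      re.compile(r"\b(double|halve|half the|scale)", re.I)),
]

def categorize(question: str) -> str:
    for name, pattern in CATEGORIES:
        if pattern.search(question):
            return name
    return "other"  # additions, confidence checks, sensory expectations, ...

print(categorize("I don't have ricotta. What's a good substitute for this pizza?"))
# → substitution
```

A keyword approach like this is cheap and auditable, at the cost of a sizable "other" bucket for anything the patterns miss, which matches the 33% "Other" share reported above.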

When the answer is good, readers stay for it.

Of the readers who engage with a question, nearly 9 in 10 stay long enough to read the full answer. The question has demand. The answer has consumption. The behavior we measured on the page is a compressed version of what used to happen in the comments: reader has a question, reader gets an answer, reader goes back to cooking.

The publisher's content is the source. The engagement isn't.

The AI's answer is powered by the publisher's years of work. Their tested recipes, their headnotes, their comment replies, their community's collective wisdom. But the question-and-answer exchange happens on someone else's surface. The publisher's content did its job. The publisher just wasn't there when it happened.

The on-page engagement migration is no longer invisible.

It is measurable, and it is happening now.

Sources

  • AdExchanger, January 2026. Recipe blogs among first categories affected by AI search summaries.
  • Similarweb, May 2024 to May 2025. Zero-click searches rose from 56% to 69%.
  • Pew Research Center. 68,000 tracked searches, 46% click-through reduction with AI summaries.
  • Elephany widget analytics on recipe sites.