
Is your content hiding in plain sight?
Yay! It’s another article about AI! This one isn’t going to be long, it isn’t going to be speculative, and it won’t foretell the impending doom of the human race. It will, however, warn of the probable impact of AI on the visibility of your content to potential customers looking for answers from their favourite research tools.
An ever-expanding toolbox
We say ‘research tools’ because we are now talking about a range of distinct product categories: search engines (Google, Bing), voice search (Alexa, Siri), pure AI applications (ChatGPT, Perplexity) and AI tools and snippets shown with (or as extra options to) traditional search results (Google AIO, Copilot Chat, GPT Deep Search). All these services gather data in different ways and show different results, and people choose them for various reasons: familiarity, a default browser setting, the only option available on a work computer, curiosity or trust. It is essential to publish your content in such a way that it can readily be read, interpreted and listed or re-published by as many of these services as possible.
Semantic markup, metadata and JavaScript
At AlphaQuad, we have always paid attention to the way content is marked up in HTML, and how pages are described with metadata and represented in other ways for consumption by bots and spiders. Our best practice has evolved with the available technologies, and we have always kept on top of whether certain presentation techniques (tabbers, sliders, accordions etc.) would prevent key content from being read and listed by the services web users rely on to find what they need.
These discussions are happening again as LLMs crawl sites to learn about the world, so they can answer the growing number of questions asked of them by users who are coming to rely on them every day. It turns out that many of these crawlers don’t use JavaScript well… or at all.
It looks like it is time to review content presentation to make sure key messages are readily readable in context without JavaScript.
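As a minimal sketch of what this means in practice (the markup and FAQ wording below are invented for illustration, not taken from any particular build), compare an ‘accordion’ whose answer is fetched by JavaScript with one that uses the native <details> element, where the text is already in the HTML:

```html
<!-- Illustrative only: the same FAQ answer published two ways. -->

<!-- Pattern 1: the answer is loaded by JavaScript after a click.
     A crawler that doesn't execute JavaScript may never see it. -->
<div class="accordion" data-answer-url="/ajax/faq/delivery-times"></div>

<!-- Pattern 2: the answer is in the initial HTML, collapsed with the
     native <details> element - readable with no JavaScript at all. -->
<details>
  <summary>How long does delivery take?</summary>
  <p>Standard delivery takes 3–5 working days; express options are
     available at checkout.</p>
</details>
```

Both look like an expanding panel to a human visitor; only the second is guaranteed to be readable by a bot that ignores scripts.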
As a bit of a sidebar about JavaScript usage by the general browsing public: in mid-January 2025, Google made their search results inaccessible to visitors (bot or human) without JavaScript enabled. They reckon (and they of all people should know) that fewer than 0.1% of human visitors have JavaScript disabled – their ‘error’ page gives links showing people how to turn it on if that is what they want to do.
This move by Google could be to facilitate the inclusion of their AI Overviews, or to prevent competing AI LLMs from scraping their listings. It also meant that a number of SEO tools that provide ranking data stopped returning information around 15th January. Fortunately, AlphaQuad’s go-to tool wasn’t one of those affected. It did cause a bit of a stir at the time, as many analysts thought another major unannounced algorithm change was underway…
Feeding the bots
The pages we build already have features that are designed to be read by machines. Sitemaps, metadata, schema and RSS are all there to help summarise, contextualise and label content at various levels, so machines can compare it to other similar-looking content and make sure it gets listed in the right place to match the search intent of people typing questions and phrases into their favourite search tool. All this is – and will continue to be – relevant for AI.
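For instance, a simplified sketch of the kind of schema.org labelling we mean might look like the JSON-LD below (the headline, date and organisation name are placeholders, not real site data):

```html
<!-- Illustrative only: schema.org Article markup embedded as JSON-LD. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Is your content hiding in plain sight?",
  "datePublished": "2025-02-01",
  "author": { "@type": "Organization", "name": "Example Agency" },
  "description": "How AI research tools read (or miss) your web content."
}
</script>
```

Markup like this doesn’t replace the visible content; it gives any machine reading the page an unambiguous summary of what that content is.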
Our preference for publishing well-written, clear, long-form content also gives us an early advantage – sites that have prioritised style over content to make an impact will have much less for the LLMs to get their teeth into and learn from. A picture is probably not worth a thousand words to an AI bot (for now, at least).
Redirection
Another thing to take into account with the AI bots is that they generally won’t crawl sites as often as the familiar search engines do. Crawling is resource intensive and therefore expensive for them – you will have seen in the news just how much processing power it takes to run these services (unless you are DeepSeek, apparently). This becomes especially relevant when making structural changes, renaming pages or re-launching a site: redirects need to be put in place and kept there for longer than has previously been necessary. It may take these bots longer to catch up with changes, which may result in pages linked from AI results showing a ‘404 page not found’ error.
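As a rough illustration (the paths are placeholders, and the exact syntax depends on your server or CMS), a permanent redirect for a renamed page on an Apache server could be as simple as:

```apache
# Illustrative only: send the old URL permanently (301) to the new one.
# Nginx, IIS and most CMS platforms have their own equivalents.
Redirect 301 /old-services-page/ /services/web-design/
```

The important part is leaving rules like this in place well after the move, so that slower-moving AI crawlers still land on a working page.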
What now?
It is our usual practice to keep content under review to make sure it performs as well as it can against the competition. Keeping an eye on the requirements of AI is now another part of these ongoing optimisation activities. If you are concerned about a particular aspect of your site (whether you are currently an AlphaQuad customer or not), get in touch and we’ll take an expert look, either as part of your retainer or as a standalone audit.
Photo by Markus Winkler on Unsplash