1. The Part Everyone Sees

The room photography. The booking widget. The tagline. The hero image of the pool at dusk, or the breakfast spread, or the view from the suite. This is the hotel website as most people picture it — as most clients commission it, as most agencies deliver it.

It is a legitimate and important brief. A hotel website that misrepresents the property, or that presents a four-star experience with two-star design, fails at the first test. The visible layer matters. But it is approximately ten percent of the actual work that determines whether the site does its job.

The other ninety percent is invisible to the guest. It is also, in most cases, invisible to the hotelier. And its condition — whether it was built carefully or carelessly, whether it has been maintained or left to drift — determines, more than any design choice, whether the site actually performs.

2. What Sits Below the Surface

The invisible layer is not mysterious. It is simply not visible.

Every hotel website needs a working canonical URL structure, so search engines know which version of a page to index. It needs hreflang tags, so a French-speaking guest arriving from a Google search lands on the French version of the site rather than the English one — or, worse, a mixed-language hybrid that signals confusion to both user and algorithm. It needs a robots.txt file that has been deliberately configured, not left at its default state, which in many cases means accidentally restricting access to crawlers that would otherwise bring traffic.
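
To make the hreflang point concrete, here is a rough sketch, in Python, of the reciprocal link tags every language version of a page should carry. The domain, languages and URL pattern below are placeholders; the shape is the point: each version lists all the others, itself included, plus an x-default fallback.

```python
# A minimal sketch of reciprocal hreflang generation. The domain, language
# codes and URL pattern are hypothetical; a real site maps each page to its
# actual translated URLs, and every version must carry the full set
# (including itself and x-default) or search engines may ignore the tags.

LANGUAGE_VERSIONS = {
    "en": "https://www.example-hotel.com/en/rooms/",
    "fr": "https://www.example-hotel.com/fr/chambres/",
    "de": "https://www.example-hotel.com/de/zimmer/",
}

def hreflang_tags(versions: dict[str, str], default_lang: str = "en") -> str:
    """Return the <link> tags that every language version should include."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(versions.items())
    ]
    # x-default tells crawlers which version to serve when no language matches.
    tags.append(
        f'<link rel="alternate" hreflang="x-default" href="{versions[default_lang]}" />'
    )
    return "\n".join(tags)

if __name__ == "__main__":
    print(hreflang_tags(LANGUAGE_VERSIONS))
```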

It needs structured data — schema markup — that tells search engines not just that this is a hotel, but which hotel, where, with how many room types, at what price range, with which amenities, receiving what average rating from which platform. It needs page speed that has been measured and optimised, because a hotel homepage that loads in four seconds on mobile loses a measurable share of potential bookings before the design is ever seen.
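
As an illustration, a minimal Hotel schema might look like the sketch below, here generated in Python and embedded in the page as a script tag of type application/ld+json. Every value is a placeholder for a hypothetical property; what matters is that the facts are typed, complete and machine-readable.

```python
import json

# A minimal sketch of the JSON-LD a hotel page can embed. Every value below
# is a placeholder; the point is the structure: typed facts a machine can
# read without guessing.
hotel_schema = {
    "@context": "https://schema.org",
    "@type": "Hotel",
    "name": "Hotel Example",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "Geneva",
        "addressCountry": "CH",
    },
    "starRating": {"@type": "Rating", "ratingValue": "4"},
    "checkinTime": "15:00",
    "checkoutTime": "11:00",
    "amenityFeature": [
        {"@type": "LocationFeatureSpecification", "name": "Pool", "value": True},
        {"@type": "LocationFeatureSpecification", "name": "Free WiFi", "value": True},
    ],
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.6",
        "reviewCount": "812",
    },
    "priceRange": "$$$",
}

print(json.dumps(hotel_schema, indent=2))
```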

It needs clean redirects, not redirect chains that add 300 milliseconds to every page load. Unique meta descriptions of the right length, because they appear in Google snippets and affect click-through rate from search results. A sitemap submitted to Search Console, updated when content changes. Heading structure that is logical, not decorative. Image alt text that describes the content, not just the filename.
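
The redirect point is easy to check. A rough sketch, assuming the Python requests library and a hypothetical URL: follow the redirects and count the hops.

```python
import requests

# A rough sketch of a redirect-chain check, assuming the third-party
# `requests` library is installed. The URL is a placeholder.
# response.history holds every hop followed before the final response.
def redirect_chain(url: str) -> list[str]:
    response = requests.get(url, timeout=10, allow_redirects=True)
    return [r.url for r in response.history] + [response.url]

if __name__ == "__main__":
    hops = redirect_chain("http://example-hotel.com/rooms")
    if len(hops) > 2:
        # More than one hop (e.g. http -> https -> trailing slash) is a chain
        # worth collapsing into a single 301.
        print("Redirect chain detected:", " -> ".join(hops))
    else:
        print("Clean:", " -> ".join(hops))
```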

None of this is visible to the guest. All of it affects whether the guest finds the site, trusts it, and books.

3. "Not Just a Pretty Website"

We have used that phrase for as long as we have been building hotel websites. It is not a tagline chosen for positioning purposes. It describes a problem we encountered early and chose to address directly.

Hotels come to web agencies — as they should — primarily with a brief about how the site should look. The photography, the colour palette, the tone of voice, the booking flow. These are real concerns and they deserve real answers. But we have seen, and continue to see, a category of hotel website that looks good and performs badly.

Strong photography, modern layout, clear call to action — and underneath, a technical structure that is quietly undermining everything visible. Duplicate title tags across every room-type page. Schema markup that was copied from a template and never corrected, with wrong property values or missing required fields. Hreflang errors that cause search engines to treat the French and English versions of the site as competing against each other for the same queries. A robots.txt file that has not been reviewed since the site launched, now silently blocking crawlers the hotel wants to be found by.

These websites look professional. They do not work professionally. The visible work was done; the invisible work was not. And because the invisible work is invisible, the problem is rarely noticed until something measurable deteriorates — a drop in organic traffic, a fall in direct bookings, an unexplained decline in Google ranking.

We build the whole iceberg. We always have. That choice was not a strategy; it was a refusal to deliver something we knew was incomplete.

4. GEO Pushed the Waterline Deeper

About eighteen months ago, a new layer appeared at the bottom of the iceberg. Generative Engine Optimisation — GEO — is the practice of making a website legible and recommendable not just to search engine crawlers, but to the AI systems that generate direct answers to user queries. ChatGPT, Perplexity, Gemini, and the growing range of AI-powered search interfaces do not simply index pages and rank them. They read, extract, synthesise, and generate text. The hotel that appears in those answers is not necessarily the hotel that ranks well on Google. It is the hotel whose website communicates clearly, accurately, and completely in machine-readable terms.

What does that require? Can AI crawlers access the site at all? That is a robots.txt question — the same robots.txt file we have been configuring for years. Is the Hotel schema complete: does it include the address, the star rating, the amenities, the check-in time, the languages spoken at reception? That is a structured data question — the same structured data layer we have always insisted on building. Is there a clear, factual, well-structured description of the property that an AI system can extract and reproduce without introducing errors? That is a content structure question — headings, logical information hierarchy, unambiguous prose.
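
The first of those questions can be answered in a few lines. A minimal sketch, using Python's standard robots.txt parser, a hypothetical domain and a representative (not exhaustive) set of AI crawler user agents:

```python
from urllib.robotparser import RobotFileParser

# Can AI crawlers fetch the site at all? The domain is hypothetical; the
# user-agent strings are a representative, not exhaustive, list of known
# AI crawlers.
AI_USER_AGENTS = ["GPTBot", "PerplexityBot", "ClaudeBot", "Google-Extended", "CCBot"]

parser = RobotFileParser()
parser.set_url("https://www.example-hotel.com/robots.txt")
parser.read()  # fetches and parses the live robots.txt

for agent in AI_USER_AGENTS:
    allowed = parser.can_fetch(agent, "https://www.example-hotel.com/")
    print(f"{agent}: {'allowed' if allowed else 'blocked'}")
```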

There is also llms.txt — a new file, modelled on robots.txt, that hints to AI systems how to work with a site's content. It is not yet a standard, and not having one carries no penalty today. But it takes an hour to write, costs nothing to add, and positions a site correctly for a convention that may solidify quickly. It is the kind of thing a technically attentive team adds as a matter of course, without waiting to be asked.
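
For what it is worth, a minimal llms.txt, following the emerging convention of a short markdown file at the site root, might look like the sketch below. The hotel, the summary and the URLs are placeholders, and the convention itself is still settling, so treat the format as indicative rather than definitive.

```python
# A minimal sketch of what an llms.txt might contain, following the emerging
# (not yet standardised) convention of a short markdown file at the site
# root. Hotel name, summary and URLs are hypothetical placeholders.
LLMS_TXT = """\
# Hotel Example

> A 45-room four-star hotel in Geneva, with lake-view suites, a restaurant
> and a spa. Direct bookings at https://www.example-hotel.com/book.

## Key pages

- [Rooms and suites](https://www.example-hotel.com/en/rooms/): types, sizes, rates
- [Amenities](https://www.example-hotel.com/en/amenities/): pool, spa, parking
- [Contact and location](https://www.example-hotel.com/en/contact/): address, check-in times
"""

with open("llms.txt", "w", encoding="utf-8") as f:
    f.write(LLMS_TXT)
```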

GEO did not introduce a new discipline. It extended an existing one. The iceberg became deeper. The habits required — careful bot access management, complete structured data, clean content hierarchy — are the same habits that distinguish good hotel web infrastructure from the alternative. They just now have a new test to pass.

5. Why the History Matters

An agency that has spent years building the invisible layer of hotel websites is not starting from scratch when a client asks about GEO. The practices are already in place: reviewing robots.txt as part of every launch, implementing complete schema markup rather than minimal schema, checking hreflang correctness, monitoring Core Web Vitals, treating the technical substrate of the site as seriously as the visual surface.

An agency that treats web development as a design problem — for whom the work is complete when the site looks right — is starting from scratch. Not because GEO is technically demanding, but because the orientation required to do it consistently, as a matter of course rather than as an afterthought, develops slowly. It is a set of instincts, not a checklist. It shows in what gets built without being asked: the correctly nested heading structure, the schema that covers edge cases, the multilingual redirect logic that handles every variant of the URL.

What we are describing is specific to hospitality. The schema types that matter are different for a hotel than for an e-commerce site. The multilingual requirements are different. The bot access considerations, the local search signals, the relationship with online travel agencies (OTAs) that affects how AI systems weight first-party information — these are not generic web problems. They are hotel web problems, and they require hotel web expertise.

6. The Full Iceberg in 2026

At the surface: design. Photography. Brand identity. The user experience a guest can see and evaluate. This is what the brief usually specifies, and it should be excellent.

Just below the waterline: performance. Page speed. Core Web Vitals. Mobile rendering. Uptime. Security certificates. These affect user experience directly and are frequently neglected after launch.

In the middle depths: SEO infrastructure. Canonical structure. Hreflang correctness. Meta descriptions. Title tags. Heading hierarchy. Sitemap hygiene. Internal linking. These determine whether search engines find, understand, and rank the site correctly.

Deeper: structured data. Schema markup — Hotel, LodgingBusiness, Room, Offer, FAQPage — correctly implemented, completely populated, kept current. This is the layer that allows search engines, and now AI systems, to understand the hotel as a data object rather than simply as a web page.

Deepest, and newest: AI visibility. Bot access configured for the AI crawlers specifically. Content structured to be machine-extractable. Information complete enough that an AI system can describe the property accurately, without fabrication or omission. And optionally, an llms.txt file — not yet a norm, but easy to add and worth having before it becomes one.

The guest sees none of this. The guest knows only whether the site feels right, loads quickly, and gives them reason to book. What makes that possible — or prevents it — is almost entirely below the surface.

We think about what is below the surface. We always have. GEO just confirmed why that mattered.

Want to know how deep your iceberg goes? AIscore scans your hotel website across 91 signals — structured data, bot access, content structure, metadata — and shows you exactly what sits below your waterline. Free, no sign-up required.
