Brand Experience · Origin Studios

Building Our Own GEO-Optimised Web Presence from Scratch

How we applied our own methodology to build originstudios.dev — a Next.js site with Three.js 3D, GSAP motion design, full JSON-LD schema, and AI search visibility scoring 74/100 on GEO readiness.

95+ · Lighthouse Score
74/100 · GEO Readiness
5 types · Schema Entities
8 bots allowed · AI Crawler Access
8 weeks · Build Time
0 · Templates Used

Next.js 16 · Three.js · GSAP · Tailwind CSS · Vercel Edge · JSON-LD

The Challenge

We needed a website that practised what we preach. Every agency claims to build great websites — but how many agencies have sites that AI search engines actually cite? We set out to build originstudios.dev as a proof of concept for our entire methodology: strategy-first design, motion and 3D as standard, and full Generative Engine Optimization from the ground up.

The Approach

We followed our own five-sprint, eight-week process:

Sprint 1 — Understand (Weeks 1-2): We audited 15+ competitor studios, mapped the GEO keyword landscape, and identified that "Generative Engine Optimization" and "AI-optimised website development" were high-intent queries with almost no structured data competition. Most studio sites had zero JSON-LD beyond basic Organization schema.

Sprint 2 — Prototype (Weeks 3-4): High-fidelity Figma prototypes with real copy. We designed for two audiences simultaneously: human visitors who respond to motion and visual impact, and AI crawlers who need semantic structure and machine-readable data. Every section was planned with both a visual hook and an extractable content unit.

Sprint 3 — Engineer (Weeks 5-6): Built in Next.js 16 with App Router. The Three.js galaxy scene loads dynamically with SSR disabled for the WebGL component only — all text content is server-rendered and visible to crawlers. We implemented five JSON-LD schema types (ProfessionalService, WebSite, WebPage, FAQPage, and a Service ItemList) with full @id cross-referencing.

Sprint 4 — Optimise (Week 7): Performance tuning brought the site to a 95+ Lighthouse score. We created llms.txt for AI crawler discoverability, configured robots.txt to explicitly allow 8 AI crawlers, and built a dynamic sitemap that regenerates on each deployment. The GEO audit scored the site at 74/100 — strong for a brand-new domain with no backlinks.
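The robots.txt piece can be sketched as a small helper. The crawler user-agents below are common AI bots shown as examples — the post doesn't name the eight the site actually allows — and `buildRobotsTxt` is a hypothetical function, not code from the site:

```typescript
// Hypothetical helper that emits a robots.txt explicitly allowing a set of
// AI crawlers. The user-agent names are illustrative examples.
const AI_CRAWLERS = [
  "GPTBot",          // OpenAI
  "ClaudeBot",       // Anthropic
  "PerplexityBot",   // Perplexity
  "Google-Extended", // Google AI training
];

function buildRobotsTxt(bots: string[], sitemapUrl: string): string {
  // One explicit Allow block per AI crawler.
  const blocks = bots.map((bot) => `User-agent: ${bot}\nAllow: /`);
  // A default rule for all other crawlers, plus the sitemap pointer.
  blocks.push("User-agent: *\nAllow: /");
  blocks.push(`Sitemap: ${sitemapUrl}`);
  return blocks.join("\n\n") + "\n";
}

const robotsTxt = buildRobotsTxt(AI_CRAWLERS, "https://originstudios.dev/sitemap.xml");
```

In a Next.js app this text could equally be served from an `app/robots.ts` metadata route, which Next.js renders to `/robots.txt` at build time.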

Sprint 5 — Ship (Week 8): DNS cutover to Vercel Edge, monitoring setup, and launch. The site went live with every marketing integration pre-wired: GA4, GTM, and structured data that AI engines could parse from day one.
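The GA4 wiring typically follows Google's standard gtag.js loader. This is a generic sketch — the measurement ID is a placeholder, and whether the site loads it directly or through GTM isn't specified in the post:

```html
<!-- Standard GA4 gtag.js loader; G-XXXXXXXXXX is a placeholder ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag() { dataLayer.push(arguments); }
  gtag("js", new Date());
  gtag("config", "G-XXXXXXXXXX");
</script>
```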

Technical Highlights

SSR + CSR hybrid: The main page uses CSS visibility control instead of React conditional rendering. Content is always in the DOM for crawlers, but hidden until the loader animation completes. Three.js and WebGL components use dynamic(() => import(...), { ssr: false }) to avoid server-side rendering of GPU-dependent code.
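The visibility approach can be as simple as a class toggle. A minimal sketch, with illustrative class names rather than the site's actual ones:

```css
/* Server-rendered content is always present in the HTML the crawler fetches. */
.page-content {
  visibility: hidden; /* hidden visually, but still in the DOM */
}

/* The loader animation's completion callback adds this class to <body>. */
body.loader-done .page-content {
  visibility: visible;
}
```

Unlike React conditional rendering, which would omit the markup from the server response entirely until hydration, the content here ships in the initial HTML and is merely revealed later.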

Five-layer JSON-LD graph: ProfessionalService (with ImageObject logo, INR pricing, founder entity, sameAs), WebSite, WebPage (with datePublished/dateModified), FAQPage (4 Q&A pairs optimised for AI citation), and an ItemList of 5 Service entities. All cross-referenced via @id anchors.
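The cross-referenced graph shape can be sketched as follows. This is a trimmed, illustrative version — the entity values and `@id` anchor names are examples, not the site's actual markup:

```typescript
// A trimmed sketch of an @id-cross-referenced JSON-LD graph.
// URLs and anchor fragments are illustrative.
const BASE = "https://originstudios.dev";

const jsonLd = {
  "@context": "https://schema.org",
  "@graph": [
    {
      "@type": "ProfessionalService",
      "@id": `${BASE}/#organization`,
      name: "Origin Studios",
      url: BASE,
    },
    {
      "@type": "WebSite",
      "@id": `${BASE}/#website`,
      url: BASE,
      publisher: { "@id": `${BASE}/#organization` }, // cross-reference by @id
    },
    {
      "@type": "WebPage",
      "@id": `${BASE}/#webpage`,
      isPartOf: { "@id": `${BASE}/#website` }, // cross-reference by @id
      about: { "@id": `${BASE}/#organization` },
    },
  ],
};

// Serialised and injected as <script type="application/ld+json">…</script>
const serialized = JSON.stringify(jsonLd);
```

Because every reference is a bare `{"@id": …}` pointer to a node defined elsewhere in the same `@graph`, parsers resolve the entities into one connected graph instead of treating each script block as an island.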

Motion design: GSAP ScrollTrigger animations, character-level text reveals, horizontal scroll sections, velocity marquees, and a cursor reveal interaction — all built with progressive enhancement, so the content remains fully accessible when JavaScript animations don't run.

Results

The site launched with a 95+ Lighthouse performance score, 74/100 GEO readiness score, and full AI crawler accessibility. Five JSON-LD schema types provide machine-readable entity data. The llms.txt file gives AI crawlers a structured summary without requiring HTML parsing. Eight AI crawlers are explicitly allowed in robots.txt.

This project is our living proof of concept — and the foundation we build on for every client engagement.