Today, Jamie will go into one of the most valuable topics in technical SEO: rendering and JavaScript. 95% of sites use JS, so many that Google has had to reconsider how it crawls and indexes JavaScript-generated content. Let’s look at the new rules for a dynamically generated web.
11. This model assumes:
1. Googlebot renders JS as it crawls.
2. Indexing is based on DOM content.
3. These actions occur simultaneously in a single sequence.
#engagePDX | @jammer_volts
24. If Googlebot can’t see pricing or product availability, it can’t understand our PDPs’ purpose.
Product detail pages are designed to meet a transactional intent.
32. Know the difference between your source HTML and your rendered HTML
33. Want to be sure it’s seen? Think like a bot.
Test like Googlebot:
● URL Inspection tool (one URL at a time)
● Configure your own web crawler!
● Audit with Lighthouse
● Advocate for user-centric KPIs
Study Googlebot:
● V8, the JavaScript and WebAssembly engine Googlebot uses
● Chrome 41, the basis of Googlebot’s Web Rendering Service
● Learn to spot the difference between light DOM and shadow DOM
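The source-vs-rendered gap above is easy to see in miniature. Here is a minimal sketch using two hypothetical HTML snapshots (not fetched from a real site): the HTML as served, where a price is injected later by JavaScript, and the DOM after rendering. A real audit would diff full DOM trees with a headless browser such as Puppeteer; this only checks for one piece of content.

```javascript
// Source HTML as served: the price element is empty until app.js runs.
const sourceHtml = `
  <div id="price"></div>
  <script src="/app.js"></script>`;

// Rendered HTML: what the DOM looks like after the script has executed.
const renderedHtml = `
  <div id="price">$49.99</div>`;

// Naive check: does a given snapshot contain the content we need indexed?
function containsPrice(html) {
  return /\$\d+(\.\d{2})?/.test(html);
}

console.log('source has price:', containsPrice(sourceHtml));     // false
console.log('rendered has price:', containsPrice(renderedHtml)); // true
```

If the content you care about only appears in the rendered snapshot, a crawler that does not execute JavaScript will never see it.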
45. Everyone has to wait while JS does its thing.
46. How to cut the time and cost of JavaScript
● Make a developer friend!
● Load only what you need for the requested page (code splitting)
● Make resources as small as possible (minification, compression)
● Prioritize loading resources
● Use Service Workers for browser-native JS functionality
Credit: gizmo-art
52. ● Build your own dynamic rendering setup using Rendertron
● Get hands-on with code in Google’s Codelab
● Learn to speak dev (start with the developer’s guide to search)
● Join the Webmaster Forum
● Watch Google’s new JavaScript SEO web series
● Dig through the Definitive SEO JavaScript Resource List by Barry Adams at State of Digital
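The decision at the heart of a Rendertron-style dynamic rendering setup is simple: serve prerendered HTML to known bots, and the normal JS app to everyone else. Here is a minimal sketch of that routing decision; the bot list is partial and the response strings stand in for real handlers (a full middleware would proxy bot requests to a Rendertron instance).

```javascript
// Partial, illustrative list of crawler user-agent patterns.
const BOT_PATTERNS = [
  /googlebot/i,
  /bingbot/i,
  /linkedinbot/i,
  /twitterbot/i,
];

function isBot(userAgent) {
  return BOT_PATTERNS.some((re) => re.test(userAgent || ''));
}

// Hypothetical request handler: the two return values are stand-ins for
// proxying to Rendertron vs. serving the normal application shell.
function handleRequest(userAgent) {
  return isBot(userAgent)
    ? 'prerendered HTML from Rendertron'
    : 'normal JS application shell';
}

console.log(handleRequest('Mozilla/5.0 (compatible; Googlebot/2.1)'));
console.log(handleRequest('Mozilla/5.0 (Windows NT 10.0) Chrome/120'));
```

Keeping this branch small and the bot list explicit makes the setup easy to audit, which matters: serving different markup to bots and users is only safe when the content is equivalent.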
Get your hands dirty.