Allow bots to read dynamically injected content

by gidgicken   Last Updated December 01, 2016 08:01 AM

I've got a fairly large Angular SPA, and I currently use ?_escaped_fragment_ to serve static versions of all our pages. I've discovered, however, that this often breaks on newly deployed/updated pages: the prerendered snapshots still reference the old CSS files, which no longer exist because we name our CSS according to the deployment version. Google then sees pages with no styling, which makes them look like link-heavy garbage.

We could implement some workarounds to get the prerendering working, but I'd love to see whether Google can just crawl our AJAX directly.

Here's my issue...

We currently have "Disallow: /api/" in our robots.txt because we don't want our API to be public. But our dynamically injected content depends on data from that API, so in "Fetch and Render", Googlebot gets a 404: any time the page tries to pull data from the API, the request is blocked.
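For context, the relevant part of our robots.txt looks roughly like this (the Googlebot group is a hypothetical change I'm considering, not what we have today; Googlebot obeys the most specific user-agent group that matches it, so a Googlebot group would override the `*` group):

```
# Current rule: blocks ALL crawling of /api/, including the XHR
# requests Googlebot makes while rendering the page.
User-agent: *
Disallow: /api/

# Hypothetical workaround: give Googlebot its own group so it can
# fetch the API during rendering, while other bots stay blocked.
User-agent: Googlebot
Allow: /api/
```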

The browser user-agent (the right pane in Fetch and Render) renders it fine, but the Googlebot user-agent just shows a 404.

Any ideas on how to get around this? Do I have some basic misunderstanding of crawlers? I'm really stumped here...
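If I did open /api/ to crawling, here's a minimal sketch of how I imagine keeping the API responses themselves out of the search index (assuming an Express-style Node server; `noindexApi` is a hypothetical name, not something we have):

```javascript
// Hypothetical Express-style middleware: let Googlebot fetch /api/*
// so client-side rendering works, but mark those JSON responses as
// noindex so they never appear in search results.
function noindexApi(req, res, next) {
  if (req.path && req.path.startsWith('/api/')) {
    // X-Robots-Tag applies robots directives to non-HTML responses.
    res.setHeader('X-Robots-Tag', 'noindex');
  }
  next();
}
```

The idea being that robots.txt controls *crawling* while X-Robots-Tag controls *indexing*, so the bot could fetch the data without the raw API URLs showing up in results.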
