I followed Google's guideline "Making AJAX Applications Crawlable" to make my AngularJS application crawlable for SEO purposes. So I am using #! (hashbang) in my route config:

    $locationProvider.hashPrefix('!');
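
For context, a minimal version of my route configuration looks something like this (module, route, and template names are simplified placeholders, not my real ones):

    // minimal sketch of the app config (names are placeholders)
    angular.module('app', ['ngRoute'])
        .config(['$locationProvider', '$routeProvider',
            function ($locationProvider, $routeProvider) {
                // produce #! URLs instead of plain # URLs
                $locationProvider.hashPrefix('!');

                $routeProvider
                    .when('/page1.html', {
                        templateUrl: 'partials/page1.html'
                    })
                    .otherwise({ redirectTo: '/page1.html' });
            }]);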

So my URLs look like this:

http://www.example.com/#!/page1.html

http://www.example.com/#!/page2.html

...

Since Google replaces the hashbang (#!) with ?_escaped_fragment_=, I rewrite requests from the Googlebot via my .htaccess file to a snapshot of the page:

    DirectoryIndex index.html

    RewriteEngine On
    RewriteCond %{QUERY_STRING} ^_escaped_fragment_=/?(.*)$
    RewriteRule ^(.*)$ /snapshot/%1? [NC,L]

So far everything works like a charm. When a bot requests the URL http://www.example.com/#!/page1.html, it replaces the hashbang and actually requests http://www.example.com/?_escaped_fragment_=/page1.html, which I rewrite to the static/prerendered version of the requested page.
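
This mapping can be checked by hand, e.g. with curl (or any other HTTP client):

    # should return the snapshot at /snapshot/page1.html
    curl 'http://www.example.com/?_escaped_fragment_=/page1.html'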

I then submitted my sitemap.xml via the Search Console in Google Webmaster Tools. All URLs in my sitemap are indexed correctly by Google, but the domain itself is not. That means a page like:

http://www.example.com/#!/page1.html 

is indexed correctly, and when I google for specific content from any of my subpages, Google finds the correct page. The problem is the start/home page itself, which "naturally" has no hashbang:

http://www.example.com/

The hashbang here is appended (via JavaScript in my router configuration) when a user visits the site, but apparently this does not happen for the Googlebot.

So the crawler does not "see" the hashbang and hence does not use the static version here, which is a big issue because this page in particular holds my most important content. I already tried to rewrite and redirect / via .htaccess to /#!/, but this ends up in too many redirects and breaks everything; my attempt looked roughly like the sketch below.
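
A minimal sketch of that attempt (from memory, so the exact flags may have differed; NE is needed so the # is not URL-escaped):

    RewriteEngine On
    # redirect the bare root to the hashbang route
    RewriteRule ^$ /#!/ [R=302,NE,L]

Since the #!/ fragment is stripped by the client before the request is sent, the next request is for / again, the rule matches again, and the redirects loop.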

I also tried to put

    <meta name="fragment" content="!">

in the header of the index.html. According to Google's specification, this tag should make the crawler request http://www.example.com/?_escaped_fragment_= for the homepage, which my rewrite rule above would then map to /snapshot/. But this did not help at all. Has anybody else faced this problem before?
