So here's the story: we recently migrated our site to an AngularJS platform. We opted not to use prerendering, since we'd heard it shouldn't be much of an issue with Google. When we tested indexing with Fetch and Render, everything looked fine, but we would occasionally see certain resources come back as "Temporarily unreachable".
Fast forward to now: while it looks like Google is crawling our site, it hasn't indexed anything except the homepage in over three weeks. We set up redirects from all the old pages (they were .html) to the new ones, and still nothing.
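In case it's relevant, here's roughly the kind of check I can run to confirm an old URL returns a permanent (301) redirect rather than a temporary one. This is just a Python sketch; example.com/old-page.html is a placeholder for one of our real legacy URLs:

import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    # Returning None makes urllib raise HTTPError instead of following
    # the redirect, so we can inspect the redirect response itself.
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

opener = urllib.request.build_opener(NoRedirect)
try:
    opener.open("https://example.com/old-page.html")  # placeholder URL
    print("no redirect at all")
except urllib.error.HTTPError as e:
    # Hoping for a 301 with Location pointing at the new Angular route;
    # a 302 here would be worth fixing.
    print(e.code, e.headers.get("Location"))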
I've tried all the typical indexing fixes, but there seems to be something else going on. Fetch and Render still mostly works, but it occasionally returns the same "Temporarily unreachable" error as before. On top of that, when I run the mobile-friendly test, Google says Googlebot smartphone can't access the site because of my robots.txt file.
This is my robots.txt file:
User-agent: *
Disallow: /logs/
Disallow: /temptables/
Disallow: /procedures/
Disallow: /tpweb/
Disallow: /includes/
Disallow: /infopages/
Disallow: /write-review
Disallow: /cart
Disallow: /reservation
Disallow: /bodystyle
Disallow: /tires/search
Disallow: /wheels/search
Allow: /styles/
Allow: /scripts/
Allow: /vendor/
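As far as I can tell, nothing in there should block the homepage. Here's a rough sanity check with Python's urllib.robotparser (its matching rules are simpler than Google's longest-match logic, so treat it as approximate; example.com stands in for our domain, and main.css/app.js are made-up asset names):

from urllib import robotparser

# The rules from the file above, verbatim.
ROBOTS_TXT = """\
User-agent: *
Disallow: /logs/
Disallow: /temptables/
Disallow: /procedures/
Disallow: /tpweb/
Disallow: /includes/
Disallow: /infopages/
Disallow: /write-review
Disallow: /cart
Disallow: /reservation
Disallow: /bodystyle
Disallow: /tires/search
Disallow: /wheels/search
Allow: /styles/
Allow: /scripts/
Allow: /vendor/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# The homepage and app assets should be crawlable; /cart should not be.
for path in ["/", "/styles/main.css", "/scripts/app.js", "/cart"]:
    allowed = rp.can_fetch("Googlebot", "https://example.com" + path)
    print(path, "->", "allowed" if allowed else "blocked")

That reports "/" as allowed, which is what makes the mobile test's robots.txt complaint so confusing. Any ideas what I'm missing?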