Don't combine a robots.txt disallow with a noindex tag: if crawling is blocked, Google never fetches the page and therefore never sees the noindex. Use noindex when you want a page crawled but kept out of search results. Use robots.txt disallow for pages that should never be crawled at all. Google ...
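As a rough sketch, the two mechanisms live in different places (the path below is illustrative):

```
# robots.txt — blocks crawling entirely; note the page can still
# appear in results if other sites link to it
User-agent: *
Disallow: /private/

# noindex — a meta tag in the page's <head>; the page must remain
# crawlable so Google can actually see the tag
<meta name="robots" content="noindex">
```

Because the robots.txt rule stops the crawl before the HTML is fetched, a noindex tag on a disallowed page is never read, which is why the two should not be combined.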
Google has updated its JavaScript SEO basics documentation to clarify how Googlebot handles noindex tags on pages that use JavaScript. In short, if “you do want the page indexed, don’t use a ...
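A minimal sketch of the anti-pattern the documentation warns against, assuming a page that ships noindex in its initial HTML and tries to remove it client-side:

```
<!-- Anti-pattern: noindex in the initial HTML, removed later by
     JavaScript. When Googlebot finds noindex on the first pass it
     may skip rendering entirely, so the removal never runs and the
     page stays out of the index. -->
<meta name="robots" content="noindex">
<script>
  // Illustrative only — this runtime removal is not seen by Googlebot
  document.querySelector('meta[name="robots"]').remove();
</script>
```

If a page should be indexed, the safer approach is to omit the noindex tag from the served HTML in the first place rather than rely on JavaScript to strip it.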