Thursday, July 16, 2020

Googlebot blocked by robots.txt in Laravel

I added structured data to my job posting project. During testing, I ran into some page loading issues, shown below:

[screenshot of the errors reported by the testing tool]

I edited the public/robots.txt file and added a Googlebot rule to let Google crawl the page:

User-agent: Googlebot
Allow: /

User-agent: *
Allow: /
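As a sanity check of what these rules actually permit, the file above can be parsed locally with Python's standard-library `urllib.robotparser` (a minimal sketch; the path `/jobs/123` is just an illustrative URL, not one from the project):

```python
from urllib.robotparser import RobotFileParser

# Same contents as the edited public/robots.txt above.
rules = """\
User-agent: Googlebot
Allow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Both the explicit Googlebot group and the wildcard group allow everything.
print(parser.can_fetch("Googlebot", "/jobs/123"))      # True
print(parser.can_fetch("SomeOtherBot", "/jobs/123"))   # True
```

If this prints `True` for Googlebot, the file you edited is not what is blocking the crawler.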

But when I click through to the robots.txt file, it shows me this instead:

User-Agent: *
Allow: /ads/preferences/
Allow: /gpt/
Allow: /pagead/show_ads.js
Allow: /pagead/js/adsbygoogle.js
Allow: /pagead/js/*/show_ads_impl.js
Allow: /static/glade.js
Allow: /static/glade/
Disallow: /
Noindex: /

Am I missing some configuration in my robots.txt file? I'm new to this part of development. Also, what do the other errors about script and stylesheet types mean? I hope someone can help me figure this out.



via Chebli Mohamed
