How To Fix Blocked By Robots.txt Error In Blogger | Google Search Console Robots.txt Error Solution

A "Blocked by robots.txt" error in Blogger can be a significant hurdle, especially if you're trying to improve your site's visibility on search engines like Google. This error occurs when Google's bots are prevented from crawling your site due to restrictions in your robots.txt file. Here’s a step-by-step guide to fix this issue:

Step 1: Access Your Blogger Dashboard

  1. Log in to your Blogger account.
  2. Go to your blog's dashboard.

Step 2: Navigate to the Settings

  1. From the dashboard, click on "Settings" in the left-hand menu.
  2. Scroll down to find the "Crawlers and Indexing" section.

Step 3: Enable Custom robots.txt

  1. Look for the "Enable custom robots.txt" option.
  2. Toggle the switch to enable it. This will allow you to customize the file.

Step 4: Edit Your robots.txt File

  1. Click on "Custom robots.txt."
  2. You will see a text box where you can enter your custom robots.txt directives.

Step 5: Update the robots.txt File

  1. Remove any restrictive directives that block search engines from crawling content you want indexed. A common setup looks like this:

         User-agent: *
         Disallow: /search
         Allow: /
         Sitemap: http://yourblog.blogspot.com/sitemap.xml

  2. Make sure to replace "yourblog" with your actual blog name. A quick way to test rules like these before saving is sketched below.
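
If you want to verify how a crawler will interpret your draft rules before pasting them into Blogger, Python's standard library ships a robots.txt parser. This is a minimal sketch; the blog address and post URL are placeholders:

    from urllib import robotparser

    # Draft rules to test before saving them in Blogger (example values).
    draft = [
        "User-agent: *",
        "Disallow: /search",
        "Allow: /",
    ]

    rp = robotparser.RobotFileParser()
    rp.parse(draft)

    # Ask the parser the same question a crawler would.
    print(rp.can_fetch("Googlebot", "https://yourblog.blogspot.com/2024/01/post.html"))  # True
    print(rp.can_fetch("Googlebot", "https://yourblog.blogspot.com/search?q=test"))      # False

The second check returning False confirms that only search-result pages stay blocked, which is intended behavior rather than an error.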

Step 6: Save the Changes

  1. After updating the robots.txt file, click "Save" to apply the changes.

Step 7: Verify in Google Search Console

  1. Go to Google Search Console and log in to your account.
  2. Select your property (your blog).
  3. Navigate to the "Pages" indexing report (formerly called "Coverage") to check if the "Blocked by robots.txt" error has been resolved.
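
Search Console can take a while to re-check your pages, so it also helps to confirm directly that Blogger is serving the updated file, since that file is exactly what Googlebot downloads. A short sketch using only the Python standard library (substitute your own blog address):

    from urllib.request import urlopen

    # Fetch the robots.txt that Blogger serves publicly (placeholder address).
    with urlopen("https://yourblog.blogspot.com/robots.txt") as resp:
        print(resp.read().decode("utf-8"))

If the printed rules match what you saved in Step 5, any remaining error in Search Console is just stale data awaiting a re-crawl.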

Step 8: Submit a Request for Re-crawling

  1. In Google Search Console, go to the URL Inspection tool.
  2. Enter the URL of the affected page and click "Request Indexing" to prompt Google to recrawl it.

By following these steps, you should be able to resolve the "Blocked by robots.txt" error and improve your site's search engine visibility. If the problem persists, consider consulting Google's documentation or seeking professional assistance.


Below is the custom robots.txt used on this blog. The Mediapartners-Google group leaves the AdSense crawler unrestricted, while the general group blocks only search-result and tag pages:

    User-agent: Mediapartners-Google
    Disallow:

    User-agent: *
    Disallow: /search?q=
    Disallow: /tag/
    Allow: /

    Sitemap: https://www.baji88.top/sitemap.xml
    Sitemap: https://www.baji88.top/atom.xml?redirect=false&start-index=1&max-results=500
    Sitemap: https://www.baji88.top/atom.xml?redirect=false&start-index=501&max-results=500
    Sitemap: https://www.baji88.top/atom.xml?redirect=false&start-index=1001&max-results=500
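
The three atom.xml entries exist because Blogger's feed returns at most 500 posts per request, so each additional block of 500 posts needs its own URL with a higher start-index. A small sketch that generates these lines; the domain and post count are placeholders:

    # Emit one paginated Blogger feed URL per 500 posts.
    BASE = "https://www.baji88.top/atom.xml?redirect=false"
    PAGE = 500
    total_posts = 1500  # placeholder: use your blog's actual post count

    for start in range(1, total_posts + 1, PAGE):
        print(f"Sitemap: {BASE}&start-index={start}&max-results={PAGE}")

Run with total_posts = 1500, this prints exactly the three Sitemap lines listed above.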
