How to Set up Robots.txt and a Custom 404 Page

It is important to have a robots.txt file on your domain as well as on any subdomains. It tells search engines which pages should not be crawled. Even if you have no pages that need to be kept out of the index, you can serve an empty robots.txt file; this signals to search engines that your website is fully accessible to them.

Now, how do you set up a robots.txt file on your website?

Step 1: Open a plain-text editor such as Notepad (avoid word processors like Microsoft Word, which add hidden formatting), and save the file as robots (all lowercase) with the .txt extension. Make sure it is saved as plain text.

Step 2: Add the following lines:

# Group 1

User-agent: Googlebot

Disallow: /nogooglebot/

# Group 2

User-agent: *

Allow: /

Sitemap: http://www.example.com/sitemap.xml

Here, Group 1 tells Googlebot not to crawl any URL that starts with http://www.example.com/nogooglebot/. Group 2 tells every other user agent that it may crawl the entire site, and the final line points crawlers to your sitemap.

Step 3: Once it's done, save the file to the root directory of your website. If your domain is www.example.com, the file must be reachable at www.example.com/robots.txt.

Step 4: Once the file is uploaded, check it again for errors, as a single typo can block pages you meant to keep crawlable.
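One way to sanity-check the rules is programmatically. As a quick sketch, Python's standard urllib.robotparser module can parse the same rules shown above and report which user agents may fetch which URLs (the example.com URLs are placeholders from the example):

```python
from urllib.robotparser import RobotFileParser

# The same rules as in the example robots.txt above.
rules = """\
User-agent: Googlebot
Disallow: /nogooglebot/

User-agent: *
Allow: /
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# Googlebot is blocked from the /nogooglebot/ folder...
print(rp.can_fetch("Googlebot", "http://www.example.com/nogooglebot/page.html"))  # False
# ...but may crawl everything else.
print(rp.can_fetch("Googlebot", "http://www.example.com/index.html"))  # True
# All other user agents may crawl the whole site, including that folder.
print(rp.can_fetch("SomeOtherBot", "http://www.example.com/nogooglebot/page.html"))  # True
```

For a live site, you could instead call rp.set_url("http://www.example.com/robots.txt") followed by rp.read() to fetch and test the deployed file.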

Custom 404 Page

You put a lot of effort into creating a flawless, efficient, and user-friendly website, but if someone mistypes a page name or follows a broken link, they land on a generic "404: page not found" error. For those visitors, you can create a much friendlier custom 404 page that addresses them in a better way.

How to set up a 404 page?

Step 1: Create your page using your favorite HTML or any webpage editor.

Step 2: To match your site's look, open an existing page (such as your About Us page) in a web browser, choose View > Source, copy the HTML, and paste it into your editor as a starting point.

Step 3: Edit the page to include a clear statement that the page could not be found, advice to help visitors get back on track (such as a link to your homepage), and an option to get in touch with the site owner.
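As a starting point, a minimal page covering those three elements might look like this (the links and email address are placeholders to replace with your own):

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Page Not Found</title>
</head>
<body>
  <!-- A clear statement of what happened -->
  <h1>Sorry, we can't find that page</h1>
  <p>The page may have been moved, or the address may have been mistyped.</p>
  <!-- Advice to get back on track -->
  <p><a href="/">Go to the homepage</a> to keep browsing.</p>
  <!-- A way to reach the site owner -->
  <p>Still stuck? <a href="mailto:owner@example.com">Contact us</a>.</p>
</body>
</html>
```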

Step 4: Save the page as 404.html.

Step 5: Upload 404.html to your web server.
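Note that uploading the file is not always enough: the web server must also be told to serve it when a page is not found. How to do this depends on your hosting; on an Apache server (a common setup, but an assumption here), one line in the .htaccess file at your site root is enough:

```apache
# Serve /404.html whenever a requested page is not found
ErrorDocument 404 /404.html
```

Other servers have equivalents (nginx, for example, uses the error_page directive), so check your host's documentation.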

Done!  You have created an effective 404 page.

Verdict:

Those are the steps to set up a custom 404 page and a robots.txt file. To get more insightful tips on SEO, check our small business SEO tips, and also check our tips for local business SEO.
