Friday, 25 December 2015

How to create a robots.txt file




What is a robots.txt file?

Robots.txt is a plain-text (not HTML) file you place on your site to tell search robots which pages you would like them not to visit. Robots.txt is in no way mandatory for search engines, but well-behaved crawlers generally obey what they are asked not to do. It is important to clarify that robots.txt is not a way of preventing search engines from crawling your site (i.e. it is not a firewall or a kind of password protection); placing a robots.txt file is more like putting a "Please do not enter" note on an unlocked door: it will not keep burglars out, but the good guys will not open the door and walk in. That is why we say that if you have truly sensitive data, it is naive to rely on robots.txt to keep it from being indexed and displayed in search results.
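To see why this is only a "note on the door", consider the minimal sketch below (the /confidential/ path is hypothetical). The file itself is publicly readable, so a disallow rule can even advertise the location it is meant to hide:

    # robots.txt is public: anyone can fetch this file and read the rule,
    # so it reveals the very path it asks crawlers to skip.
    User-agent: *
    Disallow: /confidential/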

How does it work? Tips from SEO Company Bangalore

Before a search engine crawls your site, it looks at your robots.txt file for instructions on which pages it is allowed to crawl (visit) and index (save) in its search results.
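Crawlers look for the file at the root of your host (e.g. http://www.example.com/robots.txt), and each crawler follows the group of rules that matches its user-agent. Here is a small sketch under that assumption; the /drafts/ directory is hypothetical, while Googlebot is Google's actual crawler token:

    # Googlebot is asked to stay out of /drafts/; every other crawler
    # falls through to the catch-all group and may crawl everything.
    User-agent: Googlebot
    Disallow: /drafts/

    User-agent: *
    Disallow: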

Robots.txt files are useful in the following cases (a combined example follows the list):

·         If you want search engines to ignore duplicate pages on your site
·         If you do not want search engines to index your internal search result pages
·         If you do not want search engines to index certain areas of your site, or a whole site
·         If you do not want search engines to index certain files on your site (images, PDFs, etc.)
·         If you want to tell search engines where your sitemap is located
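The sketch below covers each of these cases; all of the paths and the sitemap URL are placeholders, not recommendations. Note that the * and $ wildcards used in the PDF rule are supported by major crawlers such as Googlebot and Bingbot, but they are not part of the original robots.txt standard.

    User-agent: *
    # Keep a duplicate (e.g. printer-friendly) section out of the index
    Disallow: /print/
    # Keep internal search result pages out of the index
    Disallow: /search/
    # Block an entire area of the site
    Disallow: /private/
    # Block a file type (* and $ wildcards: Google/Bing extension)
    Disallow: /*.pdf$

    # Tell search engines where your sitemap is located
    Sitemap: http://www.example.com/sitemap.xml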
How to create a robots.txt file

Tips and tricks:

1. How to allow all search engine spiders to index all files
Use the following content for your robots.txt file if you want to allow all search engine spiders to index all files on your website:
    User-agent: *
    Disallow:

2. How to disallow all spiders to index any file
If you do not want search engines to index any file on your website, use the following:
    User-agent: *
    Disallow: /
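
You can also combine these two patterns to shut out a single spider while leaving the site open to everyone else. In the sketch below, "BadBot" is a placeholder user-agent, not a real crawler:

    # Block one specific spider...
    User-agent: BadBot
    Disallow: /

    # ...and allow all files for every other spider
    User-agent: *
    Disallow: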

3. Where to find more complex examples.
If you want to see more complex examples of robots.txt files, look at the robots.txt files of large websites:
        http://www.cnn.com/robots.txt
        http://www.nytimes.com/robots.txt
        http://www.spiegel.com/robots.txt
        http://www.ebay.com/robots.txt 

Author:

Nexevo Technologies is an award-winning website development company in Bangalore. Our web designers will make your website stand out from your competitors'. We visualize your dreams with affordable cost and excellent quality. Have you decided to outsource your projects to another company? We are absolutely the right fit for your requirements. Our 25+ member team works with Joomla, Magento, e-commerce, WordPress, OpenCart, and HTML5.

Contact Information:

Nexevo Technologies – Web Design Company Bangalore
http://www.nexevo.in
Skype:  Nexevotechnologies
Office: +91-8880102111
Call / Whatsapp: +91-9591505948


