Robots.txt Generator

Default - all robots are:
Crawl-delay:
Sitemap: (leave blank if you don't have one)
Search robots:
  Google
  Google Image
  Google Mobile
  MSN Search
  Yahoo
  Yahoo MM
  Yahoo Blogs
  Ask/Teoma
  GigaBlast
  DMOZ Checker
  Nutch
  Alexa/Wayback
  Baidu
  Naver
  MSN PicSearch
   
Restricted directories: the path is relative to the site root and must end with a trailing slash "/". The example below shows how these options come together in a generated file.
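For instance, with crawling allowed by default, a crawl-delay of 10, two restricted directories, and a sitemap, the generated file would look like this (the URL and paths are placeholders):

    User-agent: *
    Crawl-delay: 10
    Disallow: /cgi-bin/
    Disallow: /tmp/
    Sitemap: https://example.com/sitemap.xml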



Now, create the 'robots.txt' file in your root directory. Copy the text above and paste it into the text file.


About Robots.txt Generator

Generate your Robots.txt with the Robots.txt Generator and save a lot of time!

As you know, robots.txt is one of the most important files on your website. It tells Google and other search engines which parts of your site are off limits to crawlers, which in turn shapes what can appear in search results. When we created the Robots.txt Generator, we wanted to make this important file easier to create, so now all you have to do is upload your sitemap and generate your robots.txt with the click of a button! It's the quickest way to steer search engines toward exactly what you want them to see.

How does it work?
The Robots.txt Generator is an easy-to-use tool which allows you to create (and modify) your robots.txt file for free. Just select your directories or enter them manually into our robots file generator and click on 'create'. You can generate any type of robots file in seconds, whether it points to an XML sitemap or an HTML site map or follows WordPress conventions, with our online webmaster tool! The generated directives are neatly arranged in the proper order, so your robots file will look like it was created by an expert.
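Under the hood, the logic is simple. Here is a minimal sketch of what such a generator does, written in Python; the function name, parameters, and defaults are illustrative assumptions, not the tool's actual code:

    # Hypothetical sketch of a robots.txt generator; names and defaults
    # are illustrative, not the tool's actual implementation.
    def generate_robots_txt(disallowed_dirs, crawl_delay=None,
                            sitemap_url=None, user_agent="*"):
        """Build robots.txt text from the options the form collects."""
        lines = [f"User-agent: {user_agent}"]
        if crawl_delay is not None:
            lines.append(f"Crawl-delay: {crawl_delay}")
        # One Disallow line per restricted directory (paths end in "/").
        for directory in disallowed_dirs:
            lines.append(f"Disallow: {directory}")
        if not disallowed_dirs:
            lines.append("Disallow:")  # an empty value allows everything
        if sitemap_url:
            lines.append(f"Sitemap: {sitemap_url}")
        return "\n".join(lines) + "\n"

    print(generate_robots_txt(["/cgi-bin/", "/tmp/"], crawl_delay=10,
                              sitemap_url="https://example.com/sitemap.xml"))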

What do you need?
First, you need to know that there are two levels at which you can regulate bot behaviour on your website: the robots.txt file itself, and the page-level directives index, follow, and noindex/nofollow, which this tool brings together in one simple interface. Also, make sure you don't confuse the Robots Exclusion Protocol (REP) with Web Content Accessibility Guidelines (WCAG) checkpoints: WCAG is about accessibility and has nothing to do with regulating search engine crawlers. You should treat them as separate entities requiring separate treatment.
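The page-level directives mentioned above live in an HTML meta tag rather than in robots.txt. For example:

    <meta name="robots" content="noindex, follow">

Roughly speaking, robots.txt controls which URLs a compliant crawler fetches, while the meta robots tag controls whether a fetched page is indexed and whether its links are followed.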

Where can I use it?
The robots.txt file itself lives in a single place: the root of your website. Its rules, however, can target any path, and the typical targets are pages you don't want search engines to index, such as login/logout pages, or anything you do not want non-users to find when they Google for information about your website.
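For example, a site that wants compliant crawlers to skip its login and logout pages could add rules like these (the paths are hypothetical):

    User-agent: *
    Disallow: /login/
    Disallow: /logout/

Keep in mind that Disallow only stops compliant crawlers from fetching those URLs; a page that must never show up in results should also carry the noindex directive described above.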

Why should I use it?
Have you ever tried to generate or edit your robots.txt file? It can be hard sometimes, even for SEOs! With our simple tool, you don’t have to do anything except choose which directories should not be crawled by search engines.
