Better Robots.txt creates a WordPress virtual robots.txt, helps you boost your website SEO (indexing capacities, Google ranking, etc.) and your loading performance – compatible with Yoast SEO, Google Merchant, WooCommerce and directory-based network sites (MULTISITE).
With Better Robots.txt, you can identify which search engines are allowed to crawl your website (or not), specify clear instructions about what they are allowed to do (or not) and define a crawl-delay (to protect your hosting server against aggressive scrapers). Better Robots.txt also gives you full control over your WordPress robots.txt content via the custom setting box.
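For example, a crawl-delay instruction looks like this in a robots.txt file (the 10-second value is illustrative; note that Googlebot ignores Crawl-delay, while Bing and Yandex honor it):

```
# Ask crawlers to wait 10 seconds between requests (value is illustrative)
User-agent: *
Crawl-delay: 10
```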
Reduce your site’s ecological footprint and the greenhouse gas (CO2) production inherent to its existence on the Web.
SUPPORTED IN 7 LANGUAGES
The Better Robots.txt plugin is translated and available in: Chinese – 汉语/漢語, English, French – Français, Russian – Русский, Portuguese – Português, Spanish – Español, German – Deutsch
Did you know that…
- The robots.txt file is a simple text file placed on your web server which tells web crawlers (like Googlebot) whether they should access a file;
- The robots.txt file controls how search engine spiders see and interact with your web pages;
- This file, and the bots it interacts with, are fundamental parts of how search engines work;
- The first thing a search engine crawler looks at when visiting a website is the robots.txt file.
The robots.txt file is a source of SEO juice just waiting to be unlocked. Try Better Robots.txt!
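To make this concrete, a minimal WordPress robots.txt can look like the following (this mirrors the default WordPress virtual file; paths may vary with your setup):

```
# Allow every crawler, but keep it out of the admin area
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
```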
About the Pro version (additional features):
1. Boost your content on search engines with your sitemap!
Make sure your pages, articles, and products, even the latest, are taken into consideration by search engines!
The Better Robots.txt plugin was made to work with the Yoast SEO plugin (probably the best SEO plugin for WordPress websites). It detects whether you are currently using Yoast SEO and whether the sitemap feature is activated. If it is, it automatically adds instructions to the robots.txt file asking bots/crawlers to read your sitemap and check whether you have made recent changes to your website (so that search engines can crawl the new content that is available).
If you want to add your own sitemap (or if you are using another SEO plugin), then you just have to copy and paste your Sitemap URL, and Better Robots.txt will add it into your WordPress Robots.txt.
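For illustration, the sitemap instruction added to your robots.txt looks like this (the URL is a placeholder; yours depends on where your SEO plugin publishes the sitemap):

```
# Point crawlers to the sitemap so new content is discovered quickly
Sitemap: https://www.example.com/sitemap_index.xml
```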
2. Protect your data and content
Block bad bots from scraping your website and commercializing your data.
The Better Robots.txt plugin helps you block the most popular bad bots from crawling and scraping your data.
When it comes to things crawling your site, there are good bots and bad bots. Good bots, like Googlebot, crawl your site to index it for search engines. Others crawl your site for more nefarious reasons, such as stripping out your content (text, prices, etc.) for republishing, downloading whole archives of your site or extracting your images. Some bots have even been reported to bring entire websites down through sheer bandwidth consumption.
The Better Robots.txt plugin protects your website against spiders/scrapers identified as bad bots by Distil Networks.
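As a sketch, blocking a scraper works by naming its user agent and disallowing everything (the bots shown here are illustrative; the plugin ships with its own curated list):

```
# Deny site-wide access to known site-copier bots (illustrative user agents)
User-agent: HTTrack
Disallow: /

User-agent: WebCopier
Disallow: /
```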
3. Hide & protect your backlinks
Stop competitors from identifying your profitable backlinks.
Backlinks, also called “inbound links” or “incoming links,” are created when one website links to another. A link received from an external website is called a backlink. Backlinks are especially valuable for SEO because they represent a “vote of confidence” from one site to another. In essence, backlinks to your website are a signal to search engines that others vouch for your content.
If many sites link to the same webpage or website, search engines can infer that the content is worth linking to, and therefore also worth showing on a SERP. So, earning these backlinks generates a positive effect on a site’s ranking position or search visibility. In the SEM industry, it is very common for specialists to identify where these backlinks come from (competitors) in order to sort out the best of them and generate high-quality backlinks for their own customers.
Considering that creating highly profitable backlinks for a company takes a lot of time (time + energy + budget), allowing your competitors to identify and duplicate them so easily is a pure waste of that investment.
Better Robots.txt helps you block all SEO crawlers (Ahrefs, Majestic, SEMrush) to keep your backlinks undetectable.
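By way of illustration, the generated rules for these crawlers take roughly this form (the user agent strings are the ones each service publicly documents):

```
# Keep backlink-analysis crawlers out (Ahrefs, Majestic, SEMrush)
User-agent: AhrefsBot
Disallow: /

User-agent: MJ12bot
Disallow: /

User-agent: SemrushBot
Disallow: /
```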
4. Avoid Spam Backlinks
Bots that populate your website’s comment forms telling you ‘great article,’ ‘love the info,’ ‘hope you can elaborate more on the topic soon’ or even providing personalized comments, including an author name, are legion. Spambots get more and more intelligent with time, and unfortunately, comment spam links can really hurt your backlink profile. Better Robots.txt helps you prevent these comments from being indexed by search engines.
5. SEO tools
While improving our plugin, we added shortcut links to 2 very important tools (if you are concerned with your ranking on search engines): Google Search Console & Bing Webmaster Tools. In case you are not already using them, you may now manage your website indexing while optimizing your robots.txt! Direct access to a mass ping tool was also added, allowing you to ping your links on more than 70 search engines.
We also created 4 shortcut links related to the best online SEO tools, directly available in Better Robots.txt SEO PRO. Whenever you want, you can now check your site’s loading performance, analyze your SEO score, identify your current ranking on SERPs with keywords & traffic, and even scan your entire website for dead links (404, 503 errors, …), directly from the plugin.
6. Be unique
We thought that we could add a touch of originality on Better Robots.txt by adding a feature allowing you to “customize” your WordPress robots.txt with your own unique “signature.” Most major companies in the world have personalized their robots.txt by adding proverbs (https://www.yelp.com/robots.txt), slogans (https://www.youtube.com/robots.txt) or even drawings (https://store.nike.com/robots.txt – at the bottom). And why not you too? That’s why we have dedicated a specific area on the settings page where you can write or draw whatever you want (really) without affecting your robots.txt efficiency.
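If you are curious what such a signature can look like: robots.txt treats any line starting with # as a comment that crawlers simply skip, so this kind of doodle (purely illustrative) is harmless:

```
# --------------------------------------
#  Hello, human! Robots read past this.
#      __
#     [oo]   Crafted with care.
#     /||\
# --------------------------------------
```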
7. Prevent robots from crawling useless WooCommerce links
We added a unique feature allowing you to block specific links (“add-to-cart”, “orderby”, “filter”, cart, account, checkout, …) from being crawled by search engines. Most of these links consume a lot of CPU, memory & bandwidth (on the hosting server) because they are not cacheable and/or create “infinite” crawling loops (while being useless). Optimizing your WordPress robots.txt for WooCommerce when you run an online store frees up processing power for the pages that really matter and boosts your loading performance.
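As a sketch, the WooCommerce-related exclusions typically look like this (the paths assume default WooCommerce permalinks; adjust them if yours differ):

```
# Keep crawlers away from cart, checkout, account and filtered catalog URLs
User-agent: *
Disallow: /cart/
Disallow: /checkout/
Disallow: /my-account/
Disallow: /*add-to-cart=*
Disallow: /*?orderby=
Disallow: /*?filter_
```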
8. Avoid crawler traps
“Crawler traps” are a structural issue within a website that causes crawlers to find a virtually infinite number of irrelevant URLs. In theory, crawlers could get stuck in one part of a website and never finish crawling these irrelevant URLs. Better Robots.txt helps prevent crawler traps which hurt crawl budget and cause duplicate content.
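For instance, a faceted navigation or an endless calendar archive can be fenced off with pattern rules like these (the parameter names are illustrative):

```
# Stop infinite URL combinations from filters and calendar archives
User-agent: *
Disallow: /*?date=
Disallow: /*?sort=
```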
9. Growth hacking tools
Today’s fastest growing companies like Amazon, Airbnb and Facebook have all driven breakout growth by aligning their teams around a high-velocity testing/learning process. We are talking about growth hacking. Growth hacking is a process of rapidly experimenting with and implementing marketing and promotional strategies that are solely focused on efficient and rapid business growth. Better Robots.txt provides a list of 150+ tools available online to skyrocket your growth.
10. Robots.txt Post Meta Box for manual exclusions
This Post Meta Box allows you to set “manually” whether a page should be visible (or not) on search engines by injecting a dedicated “disallow” + “noindex” rule inside your WordPress robots.txt. Why is it an asset for your ranking on search engines? Simply because some pages are not meant to be crawled/indexed. Thank-you pages, landing pages, and pages containing only forms are useful for visitors but not for crawlers, and you don’t need them to be visible on search engines. Also, pages containing dynamic calendars (for online booking) should NEVER be accessible to crawlers because they tend to trap them in infinite crawling loops, which directly impacts your crawl budget (and your ranking).
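The injected rules take roughly this shape for an excluded page (the path is a placeholder; also note that Google stopped honoring Noindex inside robots.txt in 2019, so for Googlebot the Disallow line does the heavy lifting):

```
# Exclude a thank-you page from crawling and indexing (path is a placeholder)
User-agent: *
Disallow: /thank-you/
Noindex: /thank-you/
```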
11. Ads.txt & App-ads.txt crawlability
In order to ensure that ads.txt & app-ads.txt can be crawled by search engines, the Better Robots.txt plugin makes sure they are allowed by default in the robots.txt file, no matter your configuration. For your information, Authorized Digital Sellers for Web, or ads.txt, is an IAB initiative to improve transparency in programmatic advertising. You can create your own ads.txt files to identify who is authorized to sell your inventory. The files are publicly available and crawlable by exchanges, Supply-Side Platforms (SSP), and other buyers and third-party vendors. Authorized Sellers for Apps, or app-ads.txt, is an extension to the Authorized Digital Sellers standard. It expands compatibility to support ads shown in mobile apps.
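The corresponding allowance is simply two Allow lines that carve these files out of any broader Disallow rules:

```
# Always let crawlers reach the advertising transparency files
User-agent: *
Allow: /ads.txt
Allow: /app-ads.txt
```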
More to come as always …
Be Part of the Conversation with WordPress Enthusiasts
Using Better Robots.txt - Index...? Great, join the conversation now!
Let’s talk about overall quality, ease of use, stellar support, unbeatable value, and the amazing experience Better Robots.txt - Index... brings to you.