Robots.txt 2.0

Robots.txt is a visual editor for robot exclusion files combined with a log analyzer. It lets a user quickly and easily create the robots.txt files that instruct search engine spiders which parts of a Web site are not to be indexed and made searchable by the general Web public, and then identify the spiders that do not follow those instructions. The program lets the user log on to an FTP or local network server and select the documents and directories that are to be kept out of search results. With this program you can:

- visually generate industry-standard robots.txt files;
- identify malicious and unwanted spiders and ban them from your site;
- direct search engine crawlers to the appropriate pages of multilingual sites;
- use robots.txt files for doorway page management;
- keep spiders out of sensitive and private areas of your Web site;
- upload correctly formatted robots.txt files directly to your FTP server without leaving the Robots.txt editor;
- track spider visits and create spider visit reports in HTML, Microsoft Excel CSV and XML formats, and more.

Program updates and upgrades are free with no time restriction, and there is no limit on the number of Web sites you can work with.
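
For illustration, robots.txt files follow the plain User-agent/Disallow syntax of the Robot Exclusion Standard, and a file of the kind the editor generates might look like the sketch below. The directory names and the "BadBot" spider name are hypothetical examples, not actual program output:

    # Keep all compliant crawlers out of private areas of the site
    User-agent: *
    Disallow: /admin/
    Disallow: /private/

    # Ban one misbehaving spider from the entire site
    User-agent: BadBot
    Disallow: /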

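The spider-tracking side of the program can be illustrated with a short sketch: scan a Web server access log for requests made by known crawler user-agents and flag any that hit a disallowed path. The Python below is only a sketch of that idea, assuming an Apache-style combined log format and a hypothetical file named access.log; it is not the program's actual implementation:

    import re

    # Substrings identifying known crawlers (illustrative, not exhaustive)
    SPIDERS = ("Googlebot", "Slurp", "msnbot")
    # Paths disallowed in robots.txt (hypothetical examples)
    DISALLOWED = ("/admin/", "/private/")

    # Apache combined log: ... "GET /path HTTP/1.1" ... "User-Agent" at the end
    LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*".*"(?P<agent>[^"]*)"$')

    def misbehaving_visits(logfile):
        """Yield (user_agent, path) for spider requests to disallowed paths."""
        with open(logfile) as f:
            for line in f:
                m = LINE.search(line)
                if not m:
                    continue
                agent, path = m.group("agent"), m.group("path")
                if any(s in agent for s in SPIDERS) and \
                        any(path.startswith(d) for d in DISALLOWED):
                    yield agent, path

    for agent, path in misbehaving_visits("access.log"):
        print(f"{agent} requested disallowed path {path}")
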
This software is shareware. You can download and test Robots.txt for a trial period; if it does what you need, you must then purchase the full version. The trial version available for download on www.softandco.com is 2870 KBytes in size. For additional information and support requests, please contact the publisher of Robots.txt directly.

Robots.txt 2.0 was released by NetPromoter on Tuesday, 27 September 2005. It will run on Windows 9x, Windows Me, Windows NT, Windows 2000 and Windows XP.
