The user-agent-string.info project collects information about robots by detecting their access to the robots.txt file (http://www.example.com/robots.txt). The syntax and use of robots.txt are described in the Robot Exclusion Standard.
The idea is simple: before crawling a website, a correctly coded robot first downloads the robots.txt file. If access to that file is monitored, we have exactly what we need.
If you place a monitoring script on your server, you will help us collect this information. If you do, don't forget to write to us so that your site can be added to the list of cooperating websites.
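The monitoring idea can be sketched as a small HTTP handler that serves robots.txt while appending one log record per request. This is a minimal illustrative sketch, not the project's actual script; the names (`log_robot_access`, `ROBOTS_TXT`, the log file path) and the robots.txt contents are assumptions.

```python
import datetime
import http.server

# Example permissive robots.txt body (assumption; use your own site's rules).
ROBOTS_TXT = "User-agent: *\nDisallow:\n"

def log_robot_access(user_agent, ip, now=None):
    """Build one log record: only the UA, the IP, and the time of access."""
    when = now or datetime.datetime.utcnow()
    return f"{when.isoformat()}\t{ip}\t{user_agent}"

class RobotsHandler(http.server.BaseHTTPRequestHandler):
    """Serves /robots.txt and logs each access (hypothetical handler)."""

    def do_GET(self):
        if self.path == "/robots.txt":
            record = log_robot_access(
                self.headers.get("User-Agent", "-"),
                self.client_address[0],
            )
            # Append the record locally; a real script would also report
            # it to the collecting service.
            with open("robots_access.log", "a") as fh:
                fh.write(record + "\n")
            body = ROBOTS_TXT.encode()
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

# To run locally (blocks forever):
# http.server.HTTPServer(("", 8080), RobotsHandler).serve_forever()
```

Any equivalent server-side hook (a rewrite rule routing robots.txt through a script, a middleware, etc.) would work the same way: serve the file unchanged and record who fetched it.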
What will be recorded into our DB?
- We record only the user-agent (UA) string, the IP address, and the time of access, nothing else.
Can the unavailability of user-agent-string.info cause me any problems?
- No. The only consequence is that the information about the robot won't be recorded in our DB.