I have a dynamic robots.txt program on GitHub that lets me tell unwanted bots unambiguously not to scan my site. Its overall effect is minor, since only well-behaved bots actually honor robots.txt.
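The core idea behind a dynamic robots.txt is simple: instead of serving a static file, generate the response per request, denying everything to bots on a block list and allowing everyone else. This is only a minimal sketch of that idea, not my actual program; the bot names in the deny list are hypothetical examples.

```python
# Minimal sketch of a dynamic robots.txt generator.
# BAD_BOTS is a hypothetical deny list, not the real one.
BAD_BOTS = {"ExampleBadBot", "AnotherScraper"}

def robots_body(user_agent: str) -> str:
    """Return a robots.txt body tailored to the requesting user agent."""
    ua = user_agent.lower()
    if any(bot.lower() in ua for bot in BAD_BOTS):
        # Known-unwanted bot: disallow the whole site.
        return "User-agent: *\nDisallow: /\n"
    # Everyone else: allow everything (empty Disallow means no restriction).
    return "User-agent: *\nDisallow:\n"
```

In a real deployment this function would be wired into whatever serves `/robots.txt`, keyed off the incoming `User-Agent` header. As the post notes, this only deters bots that choose to obey it.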
While scanning my logs, I found a few items that warranted further analysis. I am keeping a record here for my own future reference, but feel free to read along.
Requests is an HTTP library, hosted on GitHub, that makes it easy to fetch websites from a Python script. It also shows up in my logs as a hacking tool: scripts using it to probe for vulnerabilities.
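Scripts that use Requests without customization are easy to spot, because the library sends a default `User-Agent` header of the form `python-requests/<version>`. A minimal sketch of filtering access-log lines for that signature (the sample log lines here are invented for illustration):

```python
# Hypothetical sample access-log lines; real logs would come from a file.
SAMPLE_LOG = [
    '1.2.3.4 - - "GET /wp-login.php HTTP/1.1" 404 "python-requests/2.31.0"',
    '5.6.7.8 - - "GET / HTTP/1.1" 200 "Mozilla/5.0 (Windows NT 10.0)"',
]

def flag_requests_clients(lines):
    """Return log lines whose user agent is the Requests library default."""
    return [line for line in lines if "python-requests/" in line]
```

Note this only catches lazy scanners: anyone serious about probing a site will set a browser-like `User-Agent` and blend into ordinary traffic.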