Closed rb2k closed 14 years ago
Tried with the version before the last commit:
ruby-head > require "./lib/robots.rb"
=> true
ruby-head > bla = Robots.new("test")
=> #<Robots:0x00000100a60580 @user_agent="test", @parsed={}>
ruby-head > bla.allowed?("http://www.google.de/search")
=> false
That seems to be working.
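For context, `Robots#allowed?` decides whether a URL's path is blocked by a matching Disallow rule in the site's robots.txt. Below is a minimal, self-contained sketch of that kind of check using only the Ruby standard library; the `allowed?` helper and its signature are hypothetical illustrations, not the gem's actual implementation.

```ruby
require "uri"

# Hypothetical helper (not the robots gem's code): returns true unless some
# Disallow rule for the given user agent (or "*") is a prefix of the URL path.
def allowed?(robots_txt, user_agent, url)
  path = URI(url).path
  path = "/" if path.empty?

  rules = []           # Disallow prefixes that apply to this user agent
  current_agents = []  # User-agent values of the record being parsed
  collecting = nil     # whether we last saw agent lines or rule lines

  robots_txt.each_line do |line|
    line = line.sub(/#.*/, "").strip  # drop comments and surrounding space
    next if line.empty?

    field, value = line.split(":", 2).map(&:strip)
    case field.downcase
    when "user-agent"
      # A User-agent line after rule lines starts a new record.
      current_agents = [] unless collecting == :agents
      collecting = :agents
      current_agents << value
    when "disallow"
      collecting = :rules
      if current_agents.any? { |a| a == "*" || user_agent.include?(a) }
        rules << value unless value.empty?  # empty Disallow allows everything
      end
    end
  end

  rules.none? { |rule| path.start_with?(rule) }
end

robots = "User-agent: *\nDisallow: /search\n"
allowed?(robots, "test", "http://www.google.de/search")  # => false
allowed?(robots, "test", "http://www.google.de/")        # => true
```

A real implementation would also need to fetch and cache robots.txt per host, handle Allow rules, and respect per-agent record precedence, which is exactly the kind of parsing where a bug like the one above can hide.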
Found the bug and fixed it; sent a pull request:
http://github.com/rb2k/robots/commits/master
Closing.
System: ruby-head and 1.9.1, using the newest gem version.
I came across the error here: http://danielwebb.us/robots.txt
But still:
What is even more disturbing:
ruby-head > robots.allowed?("http://www.google.de/search")
=> true
Did something break in one of the last updates?