ekalinin/robots.js
Parser for robots.txt for node.js · MIT License · 66 stars · 21 forks
Issues
#35 · Fix user agent not being sent · rune-cats · open · 3 years ago · 0 comments
#34 · Fix user agent header not actually being sent · rune-cats · closed · 3 years ago · 0 comments
#33 · Add a timeout handler · andymurd · open · 3 years ago · 0 comments
#32 · Use require to get package version number · allenevans · closed · 5 years ago · 1 comment
#31 · Fix incorrect JSDoc comments + add linting of them · voxpelli · closed · 5 years ago · 1 comment
#30 · HTTP request User-Agent header styled incorrectly · midnightfreddie · open · 6 years ago · 0 comments
#29 · parser.parse() throwing "Cannot read property 'length' of undefined" · zackiles · closed · 6 years ago · 1 comment
#28 · Provide a CLI interface to test if a specific URL is accessible using a specific user agent · gajus · open · 6 years ago · 0 comments
#27 · Check that rules exist first · skysteve · closed · 7 years ago · 1 comment
#26 · Add getDisallowedPaths method · skysteve · closed · 7 years ago · 3 comments
#25 · Do not leak 'clone' as a global variable · thomas-hilaire · closed · 7 years ago · 1 comment
#24 · Guard against infinite redirect loop · bsander · closed · 7 years ago · 4 comments
#23 · setUrl() is blocked · fprivitera · open · 8 years ago · 0 comments
#22 · Access is true while robots.txt disallows all · basvdijk · open · 8 years ago · 3 comments
#21 · Fix redirection when there is no Location header · tfoxy · closed · 9 years ago · 0 comments
#20 · Fix Rule.appliesTo to return true when the rule has end_char ($) but no wildcard (*), and path === url + end_char · valabau · closed · 9 years ago · 1 comment
#19 · 'Allow all' statement or blank line returns true for disallowed URLs · sanderheilbron · open · 9 years ago · 0 comments
#18 · Don't return first rule match for canFetch · sebastianwessel · open · 9 years ago · 4 comments
#17 · Everything returns true... · morganrallen · closed · 9 years ago · 3 comments
#16 · Ignoring rules without * · SoAG · closed · 9 years ago · 5 comments
#15 · Allow port indication in sitemap URLs · avovsya · closed · 10 years ago · 1 comment
#14 · Allow custom HTTP options · richorama · closed · 10 years ago · 1 comment
#13 · https://github.com/ekalinin/robots.js/issues/12 · scherler · closed · 10 years ago · 1 comment
#12 · Init parser from a robots.txt string or allow serializing the parser · scherler · closed · 10 years ago · 1 comment
#11 · getCrawlDelay not public/documented? · reezer · closed · 10 years ago · 2 comments
#10 · Nothing to see here. · heath · closed · 10 years ago · 0 comments
#9 · Always consume response data in order for the connection to close · bsander · closed · 11 years ago · 1 comment
#8 · Handle http(s) request errors · kunev · closed · 10 years ago · 4 comments
#7 · Sitemap · reezer · closed · 11 years ago · 5 comments
#6 · Allow ports in redirect locations · bingalls-compete · closed · 11 years ago · 0 comments
#5 · Fixed a bug with relative redirects and added unit tests · bingalls-compete · closed · 11 years ago · 1 comment
#4 · Two bug fixes · mlodz · closed · 12 years ago · 1 comment
#3 · Update rule matching to allow wildcards · mlodz · closed · 12 years ago · 3 comments
#2 · Added code for dealing with globbing inside allow/disallow rules · XooR · closed · 12 years ago · 1 comment
#1 · 3xx redirects, Crawl-Delay, and response callback · mlodz · closed · 12 years ago · 1 comment
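Several of the issues above (#2, #3, #16, #20) concern how '*' wildcards and the '$' end-of-URL anchor in Allow/Disallow rules are matched. A minimal self-contained sketch of that matching semantics, not the library's actual implementation; the name ruleAppliesTo is illustrative only:

```javascript
// Sketch of robots.txt path matching with '*' wildcards and the '$'
// end anchor. '*' matches any run of characters; a trailing '$'
// anchors the rule to the end of the URL (the #20 corner case is a
// rule with '$' but no '*' where the path equals the rule minus '$').
function ruleAppliesTo(rulePath, url) {
  // Escape regex metacharacters, then translate robots.txt wildcards.
  let pattern = rulePath
    .replace(/[.+?^${}()|[\]\\]/g, '\\$&')
    .replace(/\*/g, '.*');
  // A rule ending in '$' becomes a regex end anchor.
  if (pattern.endsWith('\\$')) {
    pattern = pattern.slice(0, -2) + '$';
  }
  // Rules match as URL prefixes, so anchor only at the start.
  return new RegExp('^' + pattern).test(url);
}

console.log(ruleAppliesTo('/private/*', '/private/data.html')); // true
console.log(ruleAppliesTo('/*.pdf$', '/docs/report.pdf'));      // true
console.log(ruleAppliesTo('/*.pdf$', '/docs/report.pdf?x=1'));  // false
console.log(ruleAppliesTo('/tmp$', '/tmp'));                    // true (the #20 case)
console.log(ruleAppliesTo('/tmp$', '/tmpdir'));                 // false
```

Translating rules to anchored regular expressions is one common way to get these semantics; it avoids the special-casing of '*'-free rules that #16 and #20 report as buggy.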