samclarke / robots-parser
NodeJS robots.txt parser with support for wildcard (*) matching.
MIT License · 148 stars · 19 forks
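Several of the issues below (e.g. #15 and #16) turn on how wildcard patterns are matched. As an illustration of the wildcard semantics the description mentions, a robots.txt pattern with `*` (any sequence of characters) and a trailing `$` (end of URL) can be sketched as a regular expression. This is illustrative only, not the library's actual code:

```javascript
// Sketch of robots.txt wildcard semantics (illustration, not the
// library's implementation). `*` matches any character sequence; a
// trailing `$` anchors the pattern to the end of the URL path.
function patternToRegExp(pattern) {
  const anchorEnd = pattern.endsWith('$');
  const body = anchorEnd ? pattern.slice(0, -1) : pattern;
  // Escape regex metacharacters, then turn `*` into "any sequence".
  const escaped = body
    .replace(/[.+?^${}()|[\]\\]/g, '\\$&')
    .replace(/\*/g, '.*');
  return new RegExp('^' + escaped + (anchorEnd ? '$' : ''));
}

console.log(patternToRegExp('/fish*.php').test('/fish/salmon.php')); // true
console.log(patternToRegExp('/*.php$').test('/index.php'));          // true
console.log(patternToRegExp('/*.php$').test('/index.php?x=1'));      // false
```

Issue #15 ("Incorrectly rejecting deep directories with wildcards") and its fix #16 added the leading start-of-string anchor shown above, so that patterns only match from the beginning of the path.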
Issues (newest first)
#36 · Add explicit disallow feature · SimonC-Audigent · opened 2 weeks ago · 2 comments
#35 · Bump braces from 3.0.2 to 3.0.3 · dependabot[bot] · closed 3 months ago · 0 comments
#34 · The library should validate the document before processing it · sneko · opened 9 months ago · 2 comments
#33 · Bump @babel/traverse from 7.14.7 to 7.23.2 · dependabot[bot] · closed 9 months ago · 0 comments
#32 · Empty 'Disallow:' statement incorrectly gobbles the next statement · skibum3d · closed 1 year ago · 2 comments
#31 · Fix `https:` defaulting to port `80` if no port given · samclarke · closed 1 year ago · 0 comments
#30 · Library adds port when port is never defined · dskvr · closed 1 year ago · 3 comments
#29 · Bump json5 from 2.2.0 to 2.2.3 · dependabot[bot] · closed 1 year ago · 0 comments
#28 · Bump minimatch from 3.0.4 to 3.1.2 · dependabot[bot] · closed 1 year ago · 0 comments
#27 · Preferred host breaks isAllowed · WillSquire · closed 2 years ago · 1 comment
#26 · Update dev dependencies · samclarke · closed 2 years ago · 0 comments
#25 · Bump ansi-regex from 5.0.0 to 5.0.1 · dependabot[bot] · closed 2 years ago · 1 comment
#24 · Bump minimist from 1.2.5 to 1.2.6 · dependabot[bot] · closed 2 years ago · 0 comments
#23 · Use global URL constructor · brendankenny · closed 2 years ago · 2 comments
#22 · Added typedefs · danhab99 · closed 3 years ago · 4 comments
#21 · Returns true for disallowed route · david2am · closed 3 years ago · 1 comment
#20 · Add coveralls support · samclarke · closed 3 years ago · 0 comments
#19 · Add GitHub Actions CI · samclarke · closed 3 years ago · 0 comments
#18 · Need help maintaining this project? · AnandChowdhary · closed 3 years ago · 2 comments
#17 · getPreferredHost() is spelled wrong in code example and is throwing an error · mabry1985 · closed 3 years ago · 0 comments
#16 · Add start position assertion to pattern (#15) · kourylape · closed 3 years ago · 1 comment
#15 · Incorrectly rejecting deep directories with wildcards · kourylape · closed 3 years ago · 0 comments
#14 · Support for ignoring protocols and ports · ColinRoberts · closed 3 years ago · 2 comments
#13 · Recommend robots-agent · gajus · closed 3 years ago · 0 comments
#12 · ReDoS vulnerability · ghost · closed 5 years ago · 2 comments
#11 · Support for URL object · gsouf · closed 3 years ago · 2 comments
#10 · Robots-parser not working · menthos984 · closed 3 years ago · 0 comments
#9 · Files without User-agents can't add rules · Andrewnino12 · closed 6 years ago · 1 comment
#8 · Avoid using deprecated parts of the 'url' module · kdzwinel · closed 6 years ago · 3 comments
#7 · isAllowed should support multiple "User-agent" lines (case insensitive) · brendonboshell · closed 7 years ago · 1 comment
#6 · Patch for colon at the end or beginning of a line · ghost · closed 7 years ago · 0 comments
#5 · Patch for colon at the end or beginning of a line · ghost · closed 8 years ago · 0 comments
#4 · Throws error with a robots.txt with a colon at the end or beginning of a line · ghost · closed 8 years ago · 0 comments
#3 · Code coverage · schornio · closed 7 years ago · 1 comment
#2 · Empty disallow statements are interpreted as disallow all · guiweber · closed 9 years ago · 1 comment
#1 · Crash when passing an invalid URL · guiweber · closed 6 years ago · 2 comments
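Issues #30 and #31 both concern default ports: `https:` URLs were incorrectly being given port `80`. A minimal sketch of origin comparison with the correct per-scheme defaults, so `https://example.com` and `https://example.com:443` compare equal (illustrative only; `normalizedOrigin` is a hypothetical helper, not part of the library's API):

```javascript
// Sketch of the default-port pitfall from issues #30/#31 (illustration,
// not the library's code). WHATWG URL reports an empty `port` when the
// port is the scheme's default, so we fill in the right default per scheme.
function normalizedOrigin(urlString) {
  const url = new URL(urlString);
  const defaultPorts = { 'http:': '80', 'https:': '443' };
  const port = url.port || defaultPorts[url.protocol] || '';
  return `${url.protocol}//${url.hostname}:${port}`;
}

// https must default to 443, not 80:
console.log(normalizedOrigin('https://example.com/robots.txt')); // 'https://example.com:443'
// Explicit default port and no port are the same origin:
console.log(normalizedOrigin('https://example.com:443/') === normalizedOrigin('https://example.com/')); // true
```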
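Issues #2 and #32 both stem from empty `Disallow:` lines, which by robots.txt convention mean "allow everything". A parser must therefore drop such lines entirely rather than treating them as "disallow all" (#2) or letting them swallow the following statement (#32). A minimal sketch of that behavior (illustrative only; `parseDisallowRules` is a hypothetical helper, not the library's implementation):

```javascript
// Sketch of the empty-Disallow pitfall from issues #2 and #32
// (illustration, not the library's code). A bare `Disallow:` line
// produces no rule at all; it must not affect neighboring lines.
function parseDisallowRules(robotsTxt) {
  const rules = [];
  for (const line of robotsTxt.split(/\r?\n/)) {
    const match = line.match(/^\s*disallow\s*:\s*(.*)$/i);
    if (!match) continue;
    const path = match[1].trim();
    if (path === '') continue; // empty Disallow: no restriction at all
    rules.push(path);
  }
  return rules;
}

const rules = parseDisallowRules('User-agent: *\nDisallow:\nDisallow: /private/');
console.log(rules); // ['/private/'] -- the empty line adds no rule
```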