faradayio / scrubcsv
Remove bad records from a CSV file and normalize
56 stars · 7 forks
Issues
#24 · Add option to interpret null byte as an empty cell · tomplex · opened 2 years ago · 0 comments
#23 · Scrubcsv pre-build package · calebeaires · opened 3 years ago · 0 comments
#22 · Remove unnecessary escapings · ngirard · opened 3 years ago · 1 comment
#21 · Be more verbose when normalizing · ngirard · opened 3 years ago · 2 comments
#20 · Make it clear(er) that Scrubcsv favors snake_case over CamelCase in header · ngirard · opened 3 years ago · 0 comments
#19 · Deal with sep= metadata line · ngirard · opened 3 years ago · 0 comments
#18 · README: describe normalization features · ngirard · opened 3 years ago · 0 comments
#17 · scrub null byte into empty cell · seamusabshere · opened 4 years ago · 0 comments
#16 · Close #13 save bad rows to a file · dimus · opened 4 years ago · 3 comments
#15 · Optionally trim whitespace from cells (fixes #8) · emk · closed 4 years ago · 0 comments
#14 · Add `--clean-column-names` · emk · closed 4 years ago · 0 comments
#13 · an option to collect bad records to a file · dimus · opened 4 years ago · 1 comment
#12 · --fixlengths option · seamusabshere · opened 4 years ago · 3 comments
#11 · example of null removal · seamusabshere · opened 5 years ago · 0 comments
#10 · Modernize `scrubcsv` a bit · emk · closed 5 years ago · 0 comments
#9 · --drop-row-if-null=col1 · seamusabshere · closed 5 years ago · 2 comments
#8 · --strip-whitespace option · seamusabshere · closed 4 years ago · 1 comment
#7 · Option to scrub CSV injection vectors · erno · opened 5 years ago · 1 comment
#6 · Optionally replace newlines with spaces · emk · closed 5 years ago · 0 comments
#5 · doesn't fix or detect linebroken headers · seamusabshere · closed 5 years ago · 4 comments
#4 · Only call a row bad if it has *more* than the expected cols · seamusabshere · opened 7 years ago · 1 comment
#3 · Use csv's zero-copy API · Dr-Emann · closed 4 years ago · 5 comments
#2 · no binary · seamusabshere · closed 5 years ago · 2 comments
#1 · GB/s vs MB/s · seamusabshere · closed 7 years ago · 2 comments