Closed UlisseMini closed 4 years ago
I just noticed that nom has a "6.0.0-alpha1" on crates.io. It may be worth updating to nom 6 at the same time. I should have some free time tonight, and I can probably start on the engine parsers and get those converted, so we can divide and conquer this task.
Nice, I've got some time now & I'm working on converting the rest of `parsers.rs`.
So, I had some trouble converting `space` & `non_newline_space` to not use macros. The best I could come up with was:
```rust
pub fn space(input: &str) -> IResult<&str, &str> {
    input.split_at_position(|c| !(" \t\r\n").find_token(c))
}

pub fn non_newline_space(input: &str) -> IResult<&str, &str> {
    input.split_at_position(|c| !(" \t\r").find_token(c))
}
```
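For context, `split_at_position` splits the input at the first position where the predicate returns true, so the negated `find_token` check means "consume while the character is in the separator set". Here is a dependency-free sketch of the same idea (the `take_space` helper is hypothetical, not nom's API):

```rust
// Standalone sketch of what split_at_position does for `space`:
// consume characters while they belong to the separator set, and return
// (remaining_input, matched_prefix), mirroring nom's (rest, output) order.
fn take_space(input: &str) -> (&str, &str) {
    // Find the first character that is NOT a separator; everything
    // before it is the matched whitespace run.
    let end = input
        .find(|c: char| !" \t\r\n".contains(c))
        .unwrap_or(input.len());
    (&input[end..], &input[..end])
}
```

Note that an empty match is still a success here, which matches `eat_separator`'s zero-or-more behavior.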
Much more verbose than `eat_separator`, but I could not find a function that did exactly what `eat_separator` did.
Also, I'm not 100% positive that

```rust
map(tuple((p1, p2, p3)), |(x1, x2, x3)| { ... })
```

is the same as

```rust
do_parse!(
    x1: p1 >>
    x2: p2 >>
    x3: p3 >>
    ({ ... })
)
```
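For what it's worth, both forms sequence parsers the same way: each parser consumes from the remainder left by the previous one, and the outputs are collected. A toy, self-contained sketch of that input threading (the `tag`/`pair` helpers here are illustrative stand-ins, not nom's real implementation):

```rust
// Toy parser result: Ok((remaining_input, output)) on success.
type PResult<'a, T> = Result<(&'a str, T), ()>;

// Match a literal prefix, like nom's `tag`.
fn tag<'a>(t: &'a str) -> impl Fn(&'a str) -> PResult<'a, &'a str> {
    move |input| input.strip_prefix(t).map(|rest| (rest, t)).ok_or(())
}

// Sequence two parsers: run p1, feed its remainder to p2, pair the outputs.
// This is the same threading that both `tuple((p1, p2))` and
// `do_parse!(x1: p1 >> x2: p2 >> ...)` perform.
fn pair<'a, A, B>(
    p1: impl Fn(&'a str) -> PResult<'a, A>,
    p2: impl Fn(&'a str) -> PResult<'a, B>,
) -> impl Fn(&'a str) -> PResult<'a, (A, B)> {
    move |input| {
        let (rest, a) = p1(input)?;
        let (rest, b) = p2(rest)?;
        Ok((rest, (a, b)))
    }
}
```

The main visible difference is that the `map` closure receives all outputs at once as a tuple pattern, instead of binding them one at a time with `>>`.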
Is it okay if I add some tests to `parsers.rs`? I found some things that were not covered well.
It's ugly, but it works; I'd say we can refactor later? If you've got some simple things I could do, though, I'll go ahead and apply them.
```rust
pub fn parse_fen(input: &str) -> IResult<&str, Board> {
    map(
        tuple((
            take_while(|y| "pPnNbBrRqQkK12345678/".contains(y)),
            space,
            alt((tag("w"), tag("b"))),
            space,
            take_while(|y| "-kKqQ".contains(y)),
            space,
            take_while(|y| "abcdefgh12345678-".contains(y)),
            space,
            take_while(|y| "0123456789".contains(y)),
            space,
            take_while(|y| "0123456789".contains(y)),
        )),
        |(board, _, player, _, castle, _, ep, _, m1, _, m2)| {
            Board::from_str(&format!(
                "{} {} {} {} {} {}",
                board, player, castle, ep, m1, m2
            ))
            .unwrap() // we parsed it, therefore it must be a valid FEN?
            // this is risky because Board has another fen parser that might disagree,
            // however I'm not familiar enough with nom to know how to do this
            // without massive boilerplate
        },
    )(input)
    .map_err(|_| nom::Err::Failure(("Invalid FEN", nom::error::ErrorKind::Verify)))
}
```
This looks good. The `unwrap` in `parse_fen` scares me a bit, though; perhaps we need a test for that? We should be able to do a `.map_err` instead of an `unwrap`, or (in general) find a better error management system.
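nom has a combinator for exactly this situation, `map_res`, which applies a fallible function to a parser's output and turns its `Err` into a parse error instead of panicking. A dependency-free sketch of the idea (toy types here, not nom's actual signatures):

```rust
// Toy version of nom's map_res: run the parser, then apply a fallible
// function to its output, turning an inner Err into a parse failure
// instead of unwrapping.
fn map_res<'a, A, B, E>(
    parser: impl Fn(&'a str) -> Result<(&'a str, A), ()>,
    f: impl Fn(A) -> Result<B, E>,
) -> impl Fn(&'a str) -> Result<(&'a str, B), ()> {
    move |input| {
        let (rest, a) = parser(input)?;
        let b = f(a).map_err(|_| ())?;
        Ok((rest, b))
    }
}

// Example fragment parser: one or more ASCII digits.
fn digits(i: &str) -> Result<(&str, &str), ()> {
    let end = i.find(|c: char| !c.is_ascii_digit()).unwrap_or(i.len());
    if end == 0 { Err(()) } else { Ok((&i[end..], &i[..end])) }
}
```

With the real combinator, the `map` + `unwrap` in `parse_fen` could become `map_res(tuple((...)), |(board, ...)| Board::from_str(...))`, so a FEN that nom accepts but `Board` rejects surfaces as a parse error rather than a panic.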
I'll take a closer look tonight, and maybe add some more of the parsers to it.
Merging now, because I'm not sure how to add more commits to the pull request. I checked out the changes locally, and fixed the unwrap, but otherwise left it about the same.
In addition, I converted all the other parsers.
DO NOT MERGE, UNFINISHED
Working towards moving all the parsers to using functions instead of macros, à la nom 5.