Closed gfr10598 closed 2 years ago
Changes Missing Coverage | Covered Lines | Changed/Added Lines | %
---|---|---|---
parser/json.go | 96 | 113 | 84.96%
<!-- Total: | 128 | 145 | 88.28% -->
Totals | |
---|---|
Change from base Build 129: | 0.2% |
Covered Lines: | 665 |
Relevant Lines: | 682 |
parser/json.go, line 107 at r1 (raw file):
```go
// ExtractHops extracts the hop IP address from nodes in a tracelb json record.
func ExtractHops(data []byte) ([]string, error) {
```

How about something like the following code, which parses all of the IP addresses? I have tested it and will send you the input file and a CLI separately.
```go
// ExtractHops extracts all IP addresses from scamper's output in JSON format.
func ExtractHops(data []byte) ([]string, error) {
	hops := []string{}
	// "src": "172.19.0.2",
	// "dst": "172.24.129.116",
	for _, s := range []string{"src", "dst"} {
		hop, err := jsonparser.GetString(data, s)
		rtx.Must(err, "failed to parse "+s)
		addHop(&hops, hop)
	}
	// "nodes" -> "addr": "172.19.0.1",
	nodec, err := jsonparser.GetInt(data, "nodec")
	rtx.Must(err, "failed to parse nodec")
	for i := 0; i < int(nodec)-1; i++ {
		hop, err := jsonparser.GetString(data, "nodes", fmt.Sprintf("[%d]", i), "addr")
		rtx.Must(err, fmt.Sprintf("failed to parse nodes[%d].addr", i))
		addHop(&hops, hop)
	}
	// "nodes" -> "links" -> "addr": "100.116.79.252",
	linkc, err := jsonparser.GetInt(data, "linkc")
	rtx.Must(err, "failed to parse linkc")
	for i := 0; i < int(linkc); i++ {
		hop, err := jsonparser.GetString(data, "nodes", fmt.Sprintf("[%d]", i), "links", "[0]", "[0]", "addr")
		rtx.Must(err, fmt.Sprintf("failed to parse nodes[%d].links[0][0].addr", i))
		addHop(&hops, hop)
	}
	return hops, nil
}
```
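The snippet above calls an `addHop` helper that is not shown. A minimal sketch, assuming it simply skips empty and duplicate addresses (the actual helper in the PR may differ):

```go
package main

import "fmt"

// addHop appends hop to *hops unless it is empty or already present.
// Hypothetical sketch of the addHop helper referenced above.
func addHop(hops *[]string, hop string) {
	if hop == "" {
		return
	}
	for _, h := range *hops {
		if h == hop {
			return
		}
	}
	*hops = append(*hops, hop)
}

func main() {
	hops := []string{}
	addHop(&hops, "172.19.0.2")
	addHop(&hops, "172.19.0.2") // duplicate, ignored
	addHop(&hops, "172.24.129.116")
	fmt.Println(hops) // [172.19.0.2 172.24.129.116]
}
```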
Since this is a draft PR, I didn't get into all the details and would like to review again once the PRs are merged. However, I have suggested simpler code for extracting hops (IP addresses). Please see my inline comment.
TODO: rebase after merging #96 and #97, write test/validation section
This replaces the detailed parsing of scamper output into structs with an in-place parser that just looks for nodes and addresses.
It is faster, eliminates a lot of code, and makes the parser more robust to changes in the scamper output schema.
golint flags missing doc comments on the exported error definitions, but adding those comments makes the code less readable. Discussion?
Testing: