I've been playing around with the Swift interface to sajson and have been working on adopting Swift 4's `Decodable` protocol to use sajson instead of Foundation's `JSONSerialization`. However, I came across a problem with decoding floating-point numbers. I'm not yet sure whether the error is in the Swift bridge or in sajson itself. Maybe the original author of the Swift code has an idea?
Below is a test case in which decoding two `ValueReader.double` values results in one being slightly off and the other completely off.
```swift
func test_floats() {
    let doc = try! parse(allocationStrategy: .single, input: "{\"start\": 12.948, \"end\": 42.1234}")
    doc.withRootValueReader { docValue in
        guard case .object(let objectReader) = docValue else { XCTFail(); return }
        XCTAssert(objectReader.count == 2)
        guard let startValue = objectReader["start"], case let .double(start) = startValue else { XCTFail(); return }
        guard let endValue = objectReader["end"], case let .double(end) = endValue else { XCTFail(); return }
        XCTAssertEqual(start, 12.948)
        // XCTAssertEqual failed: ("12.9480152587891") is not equal to ("12.948") -
        XCTAssertEqual(end, 42.1234)
        // XCTAssertEqual failed: ("981442340.4544") is not equal to ("42.1234") -
    }
}
```