pteichman opened 7 years ago
And here's a goroutine dump while it's hanging. At least this time it was in reflect.DeepEqual, though that function itself runs fine on my two inputs. Maybe an infinite loop in the string similarity code?
```
SIGQUIT: quit
PC=0x5f50b m=0

goroutine 0 [idle]:
runtime.mach_semaphore_wait(0x7fff00000d03, 0x1, 0x55310, 0x7fff5fbff404, 0xffffffffffffffff, 0x5139e0, 0x7fff5fbff450, 0x55343, 0xffffffffffffffff, 0x513120, ...)
	/usr/local/go/src/runtime/sys_darwin_amd64.s:418 +0xb
runtime.semasleep1(0xffffffffffffffff, 0x513120)
	/usr/local/go/src/runtime/os_darwin.go:435 +0x4b
runtime.semasleep.func1()
	/usr/local/go/src/runtime/os_darwin.go:451 +0x33
runtime.systemstack(0x7fff5fbff478)
	/usr/local/go/src/runtime/asm_amd64.s:314 +0xab
runtime.semasleep(0xffffffffffffffff, 0x0)
	/usr/local/go/src/runtime/os_darwin.go:452 +0x44
runtime.notesleep(0x514330)
	/usr/local/go/src/runtime/lock_sema.go:166 +0x9f
runtime.stopm()
	/usr/local/go/src/runtime/proc.go:1594 +0xad
runtime.findrunnable(0xc42001c000, 0x0)
	/usr/local/go/src/runtime/proc.go:2021 +0x228
runtime.schedule()
	/usr/local/go/src/runtime/proc.go:2120 +0x14c
runtime.park_m(0xc420001040)
	/usr/local/go/src/runtime/proc.go:2183 +0x123
runtime.mcall(0x7fff5fbff670)
	/usr/local/go/src/runtime/asm_amd64.s:240 +0x5b

goroutine 1 [chan receive]:
testing.(*T).Run(0xc420092300, 0x37c0d8, 0xe, 0x3a6ba8, 0x5f215)
	/usr/local/go/src/testing/testing.go:647 +0x316
testing.RunTests.func1(0xc420092300)
	/usr/local/go/src/testing/testing.go:793 +0x6d
testing.tRunner(0xc420092300, 0xc42003ce30)
	/usr/local/go/src/testing/testing.go:610 +0x81
testing.RunTests(0x3a6c28, 0x50b020, 0x1, 0x1, 0xf9ab)
	/usr/local/go/src/testing/testing.go:799 +0x2f5
testing.(*M).Run(0xc42003cef8, 0x43)
	/usr/local/go/src/testing/testing.go:743 +0x85
main.main()
	github.com/yudai/gojsondiff/_test/_testmain.go:56 +0xc6

goroutine 17 [syscall, locked to thread]:
runtime.goexit()
	/usr/local/go/src/runtime/asm_amd64.s:2086 +0x1

goroutine 5 [syscall]:
os/signal.signal_recv(0x0)
	/usr/local/go/src/runtime/sigqueue.go:116 +0x157
os/signal.loop()
	/usr/local/go/src/os/signal/signal_unix.go:22 +0x22
created by os/signal.init.1
	/usr/local/go/src/os/signal/signal_unix.go:28 +0x41

goroutine 6 [runnable]:
reflect.DeepEqual(0x302bc0, 0xc420376a24, 0x302bc0, 0xc4200987f0, 0x38d)
	/usr/local/go/src/reflect/deepequal.go:176
github.com/yudai/golcs.(*lcs).Table(0xc4200dc300, 0x34fee0, 0x1, 0xc4200dc300)
	/Users/peter/go/src/github.com/yudai/golcs/golcs.go:55 +0x1e6
github.com/yudai/golcs.(*lcs).Length(0xc4200dc300, 0xc4200dc300)
	/Users/peter/go/src/github.com/yudai/golcs/golcs.go:67 +0x2b
github.com/yudai/gojsondiff.stringSimilarity(0xc420336000, 0xe3f9, 0xc420368000, 0xa247, 0x119101)
	/Users/peter/go/src/github.com/yudai/gojsondiff/gojsondiff.go:394 +0x157
github.com/yudai/gojsondiff.(*Modified).similarity(0xc4200168c0, 0xc4202c1170)
	/Users/peter/go/src/github.com/yudai/gojsondiff/deltas.go:290 +0x1b8
github.com/yudai/gojsondiff.similarityCache.Similarity(0x4f1f80, 0xc4200168c0, 0xbff0000000000000, 0xc420324a48)
	/Users/peter/go/src/github.com/yudai/gojsondiff/deltas.go:36 +0x50
github.com/yudai/gojsondiff.(*TextDiff).Similarity(0xc420278060, 0x10)
```
This may be a bug / pathological input in golcs instead.
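For what it's worth, the "hang" may just be quadratic work rather than a true infinite loop: a classic LCS table is O(len(a)×len(b)) cells, and if the hex arguments to `stringSimilarity` in the trace are the string lengths (0xe3f9 ≈ 58k and 0xa247 ≈ 41k), that's over two billion cells. A minimal sketch of that dynamic program (my own illustration of the standard algorithm, not golcs's actual code):

```go
package main

import "fmt"

// lcsLength computes the longest-common-subsequence length with the
// classic dynamic program. Every (i, j) cell of an (m+1) x (n+1) table
// is visited once, so the cost is O(m*n) -- two ~50k-character inputs
// mean billions of iterations, which can look like a hang.
func lcsLength(a, b []rune) int {
	m, n := len(a), len(b)
	// prev and cur are rolling rows of the DP table.
	prev := make([]int, n+1)
	cur := make([]int, n+1)
	for i := 1; i <= m; i++ {
		for j := 1; j <= n; j++ {
			switch {
			case a[i-1] == b[j-1]:
				cur[j] = prev[j-1] + 1
			case prev[j] >= cur[j-1]:
				cur[j] = prev[j]
			default:
				cur[j] = cur[j-1]
			}
		}
		prev, cur = cur, prev
	}
	return prev[n]
}

func main() {
	// Classic textbook pair; the LCS length is 4 (e.g. "BCBA").
	fmt.Println(lcsLength([]rune("ABCBDAB"), []rune("BDCABA"))) // 4
}
```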
I found a pair of valid JSON inputs that cause Differ.CompareObjects to hang indefinitely.
[Edit: changing the link to a branch rather than specific commit]
I've added sanitized versions of those files and a hanging unit test over here: https://github.com/pteichman/gojsondiff/tree/infinite-hang
As an API request: would it be useful to be able to disable substring diffing entirely? For my use case it's enough to know that the values fail a string equality check. Alternatively, should the LCS calculation live in the display logic rather than in the differ itself?
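To sketch what I mean (the option name and helpers here are hypothetical, not the current gojsondiff API): the differ could short-circuit on plain equality and only fall back to the expensive LCS pass when a caller opts in to similarity scoring.

```go
package main

import "fmt"

// Options is a hypothetical knob of the kind suggested above.
type Options struct {
	TextDiff bool // when false, strings are compared only for equality
}

// compareStrings returns 1.0 for equal strings. With TextDiff disabled
// it returns 0.0 immediately instead of running the O(n*m) LCS pass.
func compareStrings(a, b string, opts Options) float64 {
	if a == b {
		return 1.0
	}
	if !opts.TextDiff {
		return 0.0 // caller only cares about equality: report "different" cheaply
	}
	return lcsSimilarity(a, b) // expensive path
}

// lcsSimilarity stands in for the real LCS-based score; it is stubbed
// here with a trivial shared-prefix ratio just so the example runs.
func lcsSimilarity(a, b string) float64 {
	i := 0
	for i < len(a) && i < len(b) && a[i] == b[i] {
		i++
	}
	longer := len(a)
	if len(b) > longer {
		longer = len(b)
	}
	if longer == 0 {
		return 1.0
	}
	return float64(i) / float64(longer)
}

func main() {
	fmt.Println(compareStrings("foo", "foo", Options{}))    // 1
	fmt.Println(compareStrings("foo", "foobar", Options{})) // 0: cheap path, no LCS
}
```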