dpordomingo opened this issue 5 years ago
Is this caused by the way we deploy the containers in engine? It looks like it's a problem in gitbase. @src-d/data-processing, should we transfer this issue to the gitbase repo?
Yeah, it looks like it was in mysql-server: `plan.(*loggingKeyValueIter).Next`
Can you give some details (a repo) on how to reproduce it? Or, if it happens only in Docker, how did you mount the volumes?
Please move this issue to mysql-server.
Judging by the log message `err="<nil>"` and the place where it happens, this seems like a case of "nil is not nil": the error interface is non-nil even though it wraps a nil pointer, so when `err.Error()` is called, it panics.
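For illustration, here is a minimal, self-contained sketch of that Go gotcha (`myError` is a made-up type, not from the gitbase codebase): an error interface wrapping a typed nil pointer compares as non-nil, so the usual `err != nil` guard passes, yet calling `Error()` dereferences a nil receiver.

```go
package main

import "fmt"

type myError struct{ msg string }

// Error reads e.msg, so it panics if the receiver is a nil *myError.
func (e *myError) Error() string { return e.msg }

// doWork returns a typed nil pointer inside the error interface.
func doWork() error {
	var err *myError
	return err // non-nil interface wrapping a nil pointer
}

func main() {
	err := doWork()
	fmt.Printf("err=%q\n", fmt.Sprint(err)) // fmt recovers the nil receiver: prints err="<nil>"
	if err != nil {                         // true: the interface holds (*myError)(nil)
		fmt.Println(err.Error()) // nil pointer dereference -> panic
	}
}
```

Note how fmt still prints `<nil>` even though the guard passed, which would match the `err="<nil>"` in the log.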
I got the same error with the plain `srcd/gitbase:v0.19.0` image over a dataset of 7495 siva files, as follows:

$ docker run --rm --name gitbase -p 3306:3306 -v /repos:/opt/repos srcd/gitbase:v0.19.0
time="2019-03-28T11:49:43Z" level=debug msg="repository added" path=/opt/repos/00/002177a3ddfaba267e7a0a108839932e73855e4d.siva
time="2019-03-28T11:49:43Z" level=debug msg="repository added" path=/opt/repos/00/00269a8f9304aa4ee45794890b9cb38dbfc647be.siva
...
time="2019-03-28T11:49:43Z" level=debug msg="registered database to catalog" db=gitbase
time="2019-03-28T11:49:43Z" level=debug msg="registered all available functions in catalog"
time="2019-03-28T11:49:43Z" level=debug msg="created index storage"
time="2019-03-28T11:49:43Z" level=debug msg="registered pilosa index driver"
time="2019-03-28T11:49:43Z" level=info msg="squash tables rule is enabled"
time="2019-03-28T11:50:08Z" level=info msg="server started and listening on 0.0.0.0:3306"
...
time="2019-03-28T11:51:18Z" level=info msg="NewConnection: client 1"
time="2019-03-28T11:51:18Z" level=error msg="Error parsing auth server config: unexpected end of JSON input"
time="2019-03-28T11:51:18Z" level=info msg="audit trail" action=authentication address="172.17.0.1:55398" success=true system=audit user=root
time="2019-03-28T11:51:18Z" level=debug msg="executing query" query="select @@version_comment limit 1"
time="2019-03-28T11:51:18Z" level=info msg="audit trail" action=authorization address="172.17.0.1:55398" connection_id=1 permission=read pid=1 query="select @@version_comment limit 1" success=true system=audit user=root
time="2019-03-28T11:51:18Z" level=info msg="audit trail" action=query address="172.17.0.1:55398" connection_id=1 duration=2.198346ms pid=1 query="select @@version_comment limit 1" success=true system=audit user=root
time="2019-03-28T11:51:18Z" level=debug msg="executing query" query="CREATE INDEX repo_idx ON repositories USING pilosa (repository_id)"
time="2019-03-28T11:51:18Z" level=info msg="audit trail" action=authorization address="172.17.0.1:55398" connection_id=1 permission="read, write" pid=2 query="CREATE INDEX repo_idx ON repositories USING pilosa (repository_id)" success=true system=audit user=root
time="2019-03-28T11:51:19Z" level=info msg="starting to save the index" async=true driver=pilosa id=repo_idx
time="2019-03-28T11:51:19Z" level=info msg="audit trail" action=query address="172.17.0.1:55398" connection_id=1 duration=1.981242092s pid=2 query="CREATE INDEX repo_idx ON repositories USING pilosa (repository_id)" success=true system=audit user=root
time="2019-03-28T11:51:19Z" level=info msg="ConnectionClosed: client 1"
time="2019-03-28T11:52:26Z" level=debug msg="still creating index" driver=pilosa duration="608.387µs" id=repo_idx rows=10000
panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x58 pc=0x9a1437]
goroutine 39 [running]:
github.com/src-d/gitbase/vendor/gopkg.in/src-d/go-mysql-server.v0/sql/plan.(*loggingKeyValueIter).Next(0xc0010698b0, 0xc0016a06c0, 0x0, 0x270e, 0xc00142a660, 0x2d, 0x30, 0x0, 0x0)
/go/src/github.com/src-d/gitbase/vendor/gopkg.in/src-d/go-mysql-server.v0/sql/plan/create_index.go:501 +0x187
github.com/src-d/gitbase/vendor/gopkg.in/src-d/go-mysql-server.v0/sql/index/pilosa.(*Driver).savePartition(0xc00101afe0, 0xc00169c000, 0x14cfc40, 0xc00101b140, 0x14d4060, 0xc0010698b0, 0xc0011ce080, 0xc000ec16c0, 0x270e, 0xc001575c08, ...)
/go/src/github.com/src-d/gitbase/vendor/gopkg.in/src-d/go-mysql-server.v0/sql/index/pilosa/driver.go:273 +0x451
github.com/src-d/gitbase/vendor/gopkg.in/src-d/go-mysql-server.v0/sql/index/pilosa.(*Driver).Save(0xc00101afe0, 0xc00169c000, 0x14e83a0, 0xc0011ce080, 0x14d40a0, 0xc001130090, 0x0, 0x0)
/go/src/github.com/src-d/gitbase/vendor/gopkg.in/src-d/go-mysql-server.v0/sql/index/pilosa/driver.go:358 +0x515
github.com/src-d/gitbase/vendor/gopkg.in/src-d/go-mysql-server.v0/sql/plan.(*CreateIndex).createIndex(0xc00022d030, 0xc0014d2eb0, 0xc0013213e0, 0x14e2780, 0xc00101afe0, 0x14e83a0, 0xc0011ce080, 0x14d3e60, 0xc00187b0e0, 0xc001746de0, ...)
/go/src/github.com/src-d/gitbase/vendor/gopkg.in/src-d/go-mysql-server.v0/sql/plan/create_index.go:207 +0x46e
github.com/src-d/gitbase/vendor/gopkg.in/src-d/go-mysql-server.v0/sql/plan.(*CreateIndex).RowIter.func1()
/go/src/github.com/src-d/gitbase/vendor/gopkg.in/src-d/go-mysql-server.v0/sql/plan/create_index.go:174 +0x92
created by github.com/src-d/gitbase/vendor/gopkg.in/src-d/go-mysql-server.v0/sql/plan.(*CreateIndex).RowIter
/go/src/github.com/src-d/gitbase/vendor/gopkg.in/src-d/go-mysql-server.v0/sql/plan/create_index.go:181 +0x866
And here is the command that causes the panic:
mysql --port=3306 --user=root --host=127.0.0.1 --execute "CREATE INDEX repo_idx ON repositories USING pilosa (repository_id)"
@dpordomingo I need some repos so I can try to reproduce it locally. So far, based on 10 repos from PGA, I could not reproduce it.
I ran borges to get those 7k repos... I'm not sure if you're interested in downloading them, or if there is any command I could run to get more logs from my laptop. Or we could even wait until there is a new gitbase release, because I saw you made a lot of changes and maybe one of them fixed this problem.
I don't need the repos (maybe just the command you used to get them). Maybe it's fixed on master, but from the log it looks like it happened in `LogKV` in OpenTracing, so maybe you got an `io.EOF` and somehow losing the connection with Jaeger caused that segfault, but this is just guessing.
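If that guess is right, a defensive guard around the logging call would avoid the crash. A minimal sketch, assuming the OpenTracing Go API for the span (`LogKV` is the real OpenTracing method, but `isNilError` and `logIterError` are hypothetical helpers, not the actual go-mysql-server code):

```go
package tracing

import (
	"io"
	"reflect"

	opentracing "github.com/opentracing/opentracing-go"
)

// isNilError reports whether err is nil or an interface wrapping a typed
// nil, in which case calling err.Error() may dereference a nil receiver.
func isNilError(err error) bool {
	if err == nil {
		return true
	}
	v := reflect.ValueOf(err)
	switch v.Kind() {
	case reflect.Ptr, reflect.Map, reflect.Slice, reflect.Chan, reflect.Func:
		return v.IsNil()
	}
	return false
}

// logIterError records a real iterator failure on the span; it skips
// nil-ish errors and io.EOF, which just signals the end of iteration.
func logIterError(span opentracing.Span, err error) {
	if isNilError(err) || err == io.EOF {
		return
	}
	span.LogKV("error", err.Error())
}
```

Skipping `io.EOF` matters because it is the normal end-of-iteration signal, not a failure worth recording on the span.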
This is the list of repos that I retrieved with borges: repos.todo.clean.log

- 86GB
- 7493 siva files
- 8004 remotes
- ~3h downloading

I only fetched HEAD, so no PRs, no tags, no strange branches...
To do so, I used a slightly modified version of borges that can also be started with docker-compose.