+1. We should prepare those asap (thinking about the CM workshop @ ICSE in Montreal at the end of the month, poke @borisbaldassari @phkrief). Imho, it would be enough to have dumps even for short periods (e.g. 1 month), created from a single execution task.
I'll prepare one for Eclipse Epsilon. @mhow2 @ambpro @md2manoppello can you pick a project and do the same?
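For reference, a minimal sketch of how such a dump can be produced, assuming the deployment's oss-db container (named as in the restore commands later in this thread) and a placeholder database name:

```bash
# Sketch only: dump the platform's MongoDB to a gzipped archive.
# The database name "scava" is a placeholder -- adjust to your instance.
docker exec scava-deployment_oss-db_1 \
  /usr/bin/mongodump --db scava --gzip --archive=/scava-dump.gz

# Copy the archive out of the container so it can be shared.
docker cp scava-deployment_oss-db_1:/scava-dump.gz .
```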
Hi @tdegueul. I'd like to help, but for the moment we're stuck with https://github.com/crossminer/scava/issues/198. It seems to be the product of a temporary GitHub malfunction and a RESTMule exception not being handled correctly.
Hi all, I will provide an up-to-date project data dump asap.
@tdegueul +10, yep, having a dataset ready for the Montreal workshop would definitely be reassuring and safe! Thanks for asking and caring about that!
Hello, is there anything I can do to get data about Docker and Puppet metrics into the dumps? @tdegueul @ambpro
@borisbaldassari: I have uploaded a dump of the docdokuplm project (2018 plus a few months) at https://nextcloud.ow2.org/index.php/s/BEiFSWZYBdEYcRW with all metrics enabled. There is also microprofile, but you might already have it.
@mhow2 thanks! I'm not sure what is implied when importing a dump. Won't I lose all the projects I had before? To be checked.
@borisbaldassari I pushed two up-to-date project data dumps to http://ci3.castalia.camp/dl/M30/
> @mhow2 thanks! I'm not sure what is implied when importing a dump. Won't I lose all the projects I had before? To be checked.
We could keep the outdated data dump alongside the up-to-date one (but it wouldn't be compatible with the current version of the platform: API refactoring + process analysis updates). We just need to update the oss-db Dockerfile to restore the data.
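As a rough sketch of that Dockerfile change: the official mongo image runs any `*.sh` script placed in /docker-entrypoint-initdb.d on first initialization, so the Dockerfile could copy the archive plus a script like the one below into that directory (the archive name is a placeholder, not the actual repo contents):

```bash
#!/bin/bash
# Sketch only: init script the oss-db image could ship, executed by the
# mongo base image on first initialization. Archive name is a placeholder.
mongorestore --gzip --archive=/docker-entrypoint-initdb.d/scava-dump.gz
```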
Hey @ambpro, thanks for the new data dump. I'm not sure I understand your point about old data.
But the data we have computed on our instance recently is consistent and modern and OK, so we want to keep it, right?
So could you please tell us / confirm how we can get the new data dumps without losing our recent, consistent set of projects (e.g. epsilon, sirius, etc.)?
@borisbaldassari
> Hey @ambpro, thanks for the new data dump. I'm not sure I understand your point about old data.
Sorry to be unclear.
> But the data we have computed on our instance recently is consistent and modern and OK, so we want to keep it, right?
In this case, we have two possibilities: restore the new dump on top of the existing data (mongorestore without --drop), or replace the existing collections entirely (with --drop).
> So could you please tell us / confirm how we can get the new data dumps without losing our recent, consistent set of projects (e.g. epsilon, sirius, etc.)?
You can restore the dump into the running oss-db container like this:

```bash
# Copy the archive into the oss-db container, open a shell inside it,
# and restore the archive into MongoDB.
docker cp <ARCHIVE_NAME> scava-deployment_oss-db_1:/
docker exec -it scava-deployment_oss-db_1 bash
/usr/bin/mongorestore --gzip --archive=<ARCHIVE_NAME>
```
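A quick sanity check after the restore could look like this (sketch only; adjust to your instance):

```bash
# List the databases and their sizes to confirm the restore landed.
docker exec scava-deployment_oss-db_1 \
  mongo --quiet --eval 'db.adminCommand("listDatabases")'
```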
@MarcioMateus could you please confirm my proposition?
FYI, `mongorestore` will only drop an existing collection if you use the `--drop` argument. If you don't use `--drop`, all documents will be inserted into the existing collection, unless a document with the same `_id` already exists. Documents with the same `_id` will be skipped, they are not merged. So `mongorestore` will never delete or modify any of the existing data by default.
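In other words, the two modes look like this (archive name kept as a placeholder):

```bash
# Default: insert documents alongside existing data; documents whose
# _id already exists are skipped, never merged.
mongorestore --gzip --archive=<ARCHIVE_NAME>

# With --drop: each collection in the archive is dropped first, so the
# restored data replaces the existing collections.
mongorestore --drop --gzip --archive=<ARCHIVE_NAME>
```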
Hello! I have prepared a data dump about Docker and Puppet projects. Where should I upload it?
Hello @blueoly,
> Hello! I have prepared a data dump about Docker and Puppet projects. Where should I upload it?
The current up-to-date data dumps are available at http://ci3.castalia.camp/dl/M30/. If you don't already have an account on the server, you could upload yours to a hosting service and then I'll move them to the ci3 server.
Hello @ambpro !
Thank you for your help. I had a short talk with Boris and I have already found out what I should do to upload the dump. I did not find time to do it before the closing of the M30 deliverables, but I will do it soon.
The dumps available in the docker-compose are not up-to-date and don't reflect the current status of the platform. It would be useful to update the dumps with fresh data, not just for internal testing but also for demo/showcase purposes.