crossminer / scava-deployment

This repository contains the Docker configuration files for the deployment of the Scava platform
Eclipse Public License 2.0

Update project dumps in docker-compose #49

Closed. valeriocos closed this issue 5 years ago.

valeriocos commented 5 years ago

The dumps available in the docker-compose are not up-to-date and don't reflect the current status of the platform. It would be useful to update the dumps with fresh data, not just for internal testing but also for demo/showcase purposes.

tdegueul commented 5 years ago

+1. We should prepare those asap (thinking about the CM workshop @ ICSE in Montreal at the end of the month, poke @borisbaldassari @phkrief). Imho, it would be enough to have dumps covering even short periods (e.g. 1 month), created from a single execution task.

I'll prepare one for Eclipse Epsilon. @mhow2 @ambpro @md2manoppello can you pick a project and do the same?

mhow2 commented 5 years ago

Hi @tdegueul. I'd like to help, but for the moment we're stuck with https://github.com/crossminer/scava/issues/198. It seems to be the product of a temporary GitHub malfunction and a RESTMule exception not being handled correctly.

ambpro commented 5 years ago

Hi all, I will provide an up-to-date project data dump asap.

borisbaldassari commented 5 years ago

@tdegueul +10, yep, having a dataset ready for the Montreal workshop would definitely be reassuring and safe! Thanks for asking and caring about that!

blueoly commented 5 years ago

Hello, is there anything I can do to incorporate data about the Docker and Puppet metrics into the dumps? @tdegueul @ambpro

mhow2 commented 5 years ago

@borisbaldassari: I have uploaded a dump of the docdokuplm project (2018 plus a few months) at https://nextcloud.ow2.org/index.php/s/BEiFSWZYBdEYcRW with all metrics enabled. There is also microprofile, but you might already have it.

borisbaldassari commented 5 years ago

@mhow2 thanks! I'm not sure what is implied when importing a dump. Won't I lose all the projects I had before? To be checked.

ambpro commented 5 years ago

@borisbaldassari I pushed two up-to-date project data dumps to http://ci3.castalia.camp/dl/M30/

> @mhow2 thanks! I'm not sure what is implied when importing a dump. Won't I lose all the projects I had before? To be checked.

We could keep the outdated data dump alongside the up-to-date one (though it wouldn't be compatible with the current version of the platform: api-refactoring + process analysis updates). We just need to update the oss-db Dockerfile to restore the data.
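For illustration, here is a minimal sketch of what that oss-db Dockerfile update could look like. It assumes the image is based on the official mongo image and that the dump ships as projects-dump.archive.gz (the base image tag and both file names are hypothetical); the official image executes any *.sh script placed in /docker-entrypoint-initdb.d/ once, when the database is first initialized:

FROM mongo:3.4
# Bundle the dump and a restore script; the entrypoint runs *.sh
# scripts in this directory on first initialization only.
COPY projects-dump.archive.gz /docker-entrypoint-initdb.d/
COPY restore-dump.sh /docker-entrypoint-initdb.d/

where restore-dump.sh would simply contain:

#!/bin/bash
# Import the bundled dump into the freshly initialized database.
mongorestore --gzip --archive=/docker-entrypoint-initdb.d/projects-dump.archive.gz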

borisbaldassari commented 5 years ago

Hey @ambpro, thanks for the new data dump. I'm not sure I understand your point about old data.

  1. Obviously the old data dump is not useful anymore.
  2. But the data we have computed on our instance recently is consistent, up-to-date, and OK, so we want to keep it, right?

So could you please tell us / confirm how we can get the new data dumps without losing our recent, consistent set of projects (e.g. epsilon, sirius, etc.)?

ambpro commented 5 years ago

@borisbaldassari

> Hey @ambpro, thanks for the new data dump. I'm not sure I understand your point about old data.

Sorry for being unclear.

> But the data we have computed on our instance recently is consistent, up-to-date, and OK, so we want to keep it, right?

In this case, we have two possibilities:

> So could you please tell us / confirm how we can get the new data dumps without losing our recent, consistent set of projects (e.g. epsilon, sirius, etc.)?

  1. Copy the data dump archive into the oss-db container:

     docker cp <ARCHIVE_NAME> scava-deployment_oss-db_1:/

  2. Move into the oss-db container:

     docker exec -it scava-deployment_oss-db_1 bash

  3. Import the MongoDB dump:

     /usr/bin/mongorestore --gzip --archive=<ARCHIVE_NAME>
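A quick way to sanity-check the import afterwards is to list the databases from the host; the command below is only an example (adjust the container name if yours differs):

docker exec scava-deployment_oss-db_1 mongo --eval 'db.adminCommand("listDatabases")'

If the restore worked, the databases from the dump should show up in the output.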

@MarcioMateus could you please confirm my proposition?

ambpro commented 5 years ago

FYI, mongorestore will only drop existing collections if you use the --drop argument.

If you don't use --drop, all documents will be inserted into the existing collections, unless a document with the same _id already exists. Documents with the same _id are skipped, not merged, so by default mongorestore never deletes or modifies any of the existing data.
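So, assuming the same <ARCHIVE_NAME> as in the steps above, the two behaviours look like this:

/usr/bin/mongorestore --gzip --archive=<ARCHIVE_NAME>          # default: merge, skip documents with duplicate _ids
/usr/bin/mongorestore --gzip --archive=<ARCHIVE_NAME> --drop   # drop each collection present in the dump, then import it fresh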

blueoly commented 5 years ago

Hello! I have prepared a data dump about Docker and Puppet projects. Where should I upload it?

ambpro commented 5 years ago

Hello @blueoly,

> Hello! I have prepared a data dump about Docker and Puppet projects. Where should I upload it?

The current up-to-date data dumps are available at http://ci3.castalia.camp/dl/M30/. If you don't already have an account on the server, you can upload yours to a hosting service and I'll move them to the ci3 server.

blueoly commented 5 years ago

Hello @ambpro !

Thank you for your help. I had a little talk with Boris and I have already found out what I should do to upload the dump. I hadn't found time to do it before the closing of the M30 deliverables, but I will do it soon.