This project is no longer maintained and has been superseded by the CollectionSpace CSV Importer.
Migrate data to CollectionSpace.
The converter tool is a Ruby on Rails application. The database backend is MongoDB (v3.2).
For deployments, a Docker image and documentation are available.
To run the converter locally, install Ruby and Bundler, then run:

```
bundle install
```

See `.ruby-version` for the recommended version of Ruby.
There is a default `.env` file that provides example configuration. Override it by creating a `.env.local` file with custom settings.
```
# DEVELOPMENT .env.local
CSPACE_CONVERTER_BASE_URI=https://core.dev.collectionspace.org/cspace-services
CSPACE_CONVERTER_DB_NAME=nightly_core
CSPACE_CONVERTER_DOMAIN=core.collectionspace.org
CSPACE_CONVERTER_MODULE=Core
CSPACE_CONVERTER_USERNAME=admin@core.collectionspace.org
CSPACE_CONVERTER_PASSWORD=Administrator
```
The `CSPACE_CONVERTER_BASE_URI` variable must point to an available CollectionSpace Services Layer backend.
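A quick way to sanity-check the configured endpoint is a one-off Ruby script. This is a sketch, not part of the converter itself; it assumes only the environment variable names shown in the example above.

```ruby
require 'net/http'
require 'uri'

# Read the services endpoint and credentials from the converter's env vars.
def services_config(env = ENV)
  uri      = URI(env.fetch('CSPACE_CONVERTER_BASE_URI'))
  user     = env.fetch('CSPACE_CONVERTER_USERNAME')
  password = env.fetch('CSPACE_CONVERTER_PASSWORD')
  [uri, user, password]
end

# Ping the services layer. Any HTTP response at all (even a 401)
# proves the backend is reachable at the configured URI.
def services_reachable?(env = ENV)
  uri, user, password = services_config(env)
  req = Net::HTTP::Get.new(uri)
  req.basic_auth(user, password)
  res = Net::HTTP.start(uri.host, uri.port, use_ssl: uri.scheme == 'https') do |http|
    http.request(req)
  end
  res.is_a?(Net::HTTPResponse)
rescue StandardError
  false
end
```

If `services_reachable?` returns `false`, check the URI and network before debugging the converter itself.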
Run MongoDB using Docker:

```
docker run --name mongo -d -p 27017:27017 mongo:3.2
```

MongoDB should then be accessible on localhost port 27017.
If you prefer to run MongoDB directly on your system, follow the official installation docs.
You can dump and restore the database with MongoDB tools:

```
sudo apt-get install mongo-tools # ubuntu
mongodump --archive=data/dump/cspace_converter_development.gz
mongorestore --archive=data/dump/cspace_converter_development.gz
```
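Repeated dumps to the same archive path overwrite each other. A small, hypothetical Ruby helper (not part of the converter) can generate timestamped archive names instead:

```ruby
require 'time'

# Hypothetical helper: build a timestamped archive path so that
# successive dumps do not clobber one another.
def dump_archive_path(db_name, dir: 'data/dump', now: Time.now)
  stamp = now.strftime('%Y%m%d%H%M%S')
  File.join(dir, "#{db_name}_#{stamp}.gz")
end

# Shell out to mongodump with the generated path, e.g.:
#   system('mongodump', "--archive=#{dump_archive_path('cspace_converter_development')}")
```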
Robo 3T is recommended as a GUI client.
Before the tool can import CSV data into CollectionSpace, it first "stages" the data from the CSV files into the MongoDB database.
Create a data directory and add the CSV files. For example:

```
data/core/
├── mymuseum_cataloging.csv
```
Note that where you save the CSV files does not matter: you can browse to any file on your computer using the web UI, or provide a full path via the CLI.
In the `data` directory of this repo, there are sample data files for testing each supported CollectionSpace profile. These files can also be used as templates for creating CSV data to import.
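Before staging a file, it can help to confirm its headers and row count. This is a sketch using Ruby's standard CSV library; the column names below are illustrative only, since real columns depend on the CollectionSpace profile and record type:

```ruby
require 'csv'

# Parse a CSV (string or IO) and report its headers and data row count --
# a quick pre-staging check that the file is well-formed.
def inspect_csv(io)
  table = CSV.parse(io, headers: true)
  { headers: table.headers, rows: table.size }
end

# Illustrative sample; not the converter's actual schema.
sample = <<~DATA
  objectnumber,title
  2020.1.1,Example object
DATA

info = inspect_csv(sample)
# info[:headers] => ["objectnumber", "title"]; info[:rows] => 1
```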
```
./reset.sh
./bin/rails s
```
Once started, visit http://localhost:3000 with a web browser.
To execute jobs created using the UI, run:

```
./bin/delayed_job run --exit-on-complete
```
The general format for the command is:

```
./import.sh [FILE] [BATCH] [PROFILE]
```

- `FILE`: path to the import file
- `BATCH`: import batch label (for future reference)
- `PROFILE`: profile key from config (`config.yml` `registered_profiles`)

For example:

```
./import.sh data/core/cataloging_core_excerpt.csv cataloging1 cataloging
```
Then to transfer:

```
./remote.sh transfer CollectionObject cataloging1
```

And to delete:

```
./remote.sh delete CollectionObject cataloging1
```
```
# provides a list of records
./bin/rake remote:client:get[collectionobjects]

# get an object record
./bin/rake remote:client:get[collectionobjects/$CSID]

# get list of concept authorities
./bin/rake remote:client:get[conceptauthorities]

# get a concept authority record
./bin/rake remote:client:get[conceptauthorities/$AUTHORITY_CSID/items/$TERM_CSID]
```
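These rake tasks issue GET requests against the services REST API. When building such request URLs by hand, a trailing-slash mismatch on the base URI silently drops the last path segment; a small, hypothetical helper (not part of the converter) avoids that:

```ruby
require 'uri'

# Hypothetical helper: join a resource path onto the services base URI,
# normalizing the trailing slash so URI.join keeps the base path intact.
def services_url(base, path)
  URI.join(base.end_with?('/') ? base : "#{base}/", path)
end

# services_url('https://core.dev.collectionspace.org/cspace-services', 'collectionobjects')
#   => https://core.dev.collectionspace.org/cspace-services/collectionobjects
```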
Start a Rails console:

```
./bin/rails c
```

```ruby
# See the first existing DataObject
puts DataObject.first.inspect

# Get the CSID from a cached CollectionSpaceObject.
# The second argument is the identification number from the record.
CollectionSpaceObject.find_csid('CollectionObject', 'A 291/000004')

# Get the CSID from a remote CollectionObject.
# The string in the second line is the identification number from the record.
# This returns nil unless exactly one matching CollectionObject is found.
service = Lookup.record_class('CollectionObject').service(nil)
RemoteActionService.find_item_csid(service, 'A 1/000261')

# Or, do this with an existing CollectionSpaceObject record
obj = CollectionSpaceObject.where(identifier: '123456').first
RemoteActionService.new(obj).ping
obj.csid
```
```
./bin/rake db:nuke
```
Warning: this deletes all data, including failed jobs.
```
./bin/rake spec # requires Mongo
```
The project is available as open source under the terms of the MIT License.