org_data_transfer
Building an export:
Turn on the extract database:
assume-role -e p -- aws rds start-db-instance --db-instance-identifier mpdx-api-extract-prod
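Note that `start-db-instance` returns as soon as the start is requested, not when the instance is ready. Assuming the same role and instance identifier as above, you can block until it is reachable with the AWS CLI's built-in waiter:

```shell
# Poll until the extract database reports "available" (the waiter
# times out on its own if the instance never comes up).
assume-role -e p -- aws rds wait db-instance-available \
  --db-instance-identifier mpdx-api-extract-prod
```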
Reset the transfer database:
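No command is recorded for this step. A minimal sketch, assuming the transfer database is Postgres and that "reset" means emptying it by dropping and recreating the public schema (the host and database name below are placeholders, not the real values):

```shell
# Drop and recreate the public schema to empty the transfer database.
# TRANSFER_HOST and mpdx_transfer are hypothetical names; substitute the real ones.
psql -h TRANSFER_HOST -U postgres -d mpdx_transfer \
  -c 'DROP SCHEMA public CASCADE; CREATE SCHEMA public;'
```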
Start the data transfer:
prod_console
Organization::Restorer.restore("05df083a-6836-4148-a15a-ae6d72d00749")
Wait for the Sidekiq batch to finish.
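`Organization::Restorer.restore` appears to enqueue its work as a Sidekiq batch. A sketch of polling for completion from the same prod console, assuming the app uses Sidekiq Pro batches and that the batch id (`bid` below, a hypothetical variable) is printed or returned when the restore is kicked off:

```ruby
# In the prod console (Sidekiq Pro already loaded), given the batch id `bid`:
loop do
  status = Sidekiq::Batch::Status.new(bid)
  break if status.complete?
  puts "#{status.pending} jobs pending, #{status.failures} failures"
  sleep 30
end
```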
Dump the transfer database to a compressed file:
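No command is recorded here either. A sketch assuming the dump is taken with `pg_dump` in custom format, which is compressed by default and is what the `pg_restore` step later in this document consumes (the host and database name are placeholders):

```shell
# Custom-format (-Fc) dumps are compressed and restorable with pg_restore.
# TRANSFER_HOST and mpdx_transfer are hypothetical names; substitute the real ones.
pg_dump -Fc -h TRANSFER_HOST -U postgres mpdx_transfer -f transfer-test.dmp
```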
Transfer all S3 files to the transfer bucket:
Organization::Restorer.new("05df083a-6836-4148-a15a-ae6d72d00749").restore_attachments(progress_bar: true)
Sync all files from the transfer bucket to a local directory:
assume-role -e p -- aws s3 sync s3://mpdx-transmit-production docker_data/s3
Compress the directory:
tar -cf - docker_data/s3 | xz -1 -c - > docker_data/s3.tar.xz
Turn off the extract database:
assume-role -e p -- aws rds stop-db-instance --db-instance-identifier mpdx-api-extract-prod
Running MPDX locally with the export:
pg restore:
git checkout public-dockerfile
Start the API database server:
docker-compose up -d db
Create database:
createdb -h localhost -p 54322 -U postgres mpdx_extract
Restore data to database:
pg_restore -c -d mpdx_extract -h localhost -p 54322 -U postgres --no-privileges --no-owner ./transfer-test.dmp
Build the mpdx_api Docker image:
bin/public_build.sh
(requires SIDEKIQ_CREDS to be set in the host environment)
Run the MPDX API:
docker-compose up
Run mpdx_web:
cd mpdx_web
npm install
API_URL=http://localhost:50000/api/v2 OAUTH_URL=http://auth.lvh.me:50000/auth/user npm run start