# Fetching data
Fetch data from hosted environments.
- The production data is pulled from the `main` environment.
- The staging data is pulled from the `develop` environment.
## Prerequisites

The following steps must be completed before you can pull data from an environment:
- Register for a Platform.sh account using your work email.
- On the `#ds-etna` or `#ds-etna-dev` Slack channels, request that someone add you to the Etna project.
- Once you have access, generate an API token for your account (the name "Local CLI" will do nicely) and add it to your local `.env` file as `PLATFORMSH_CLI_TOKEN` (see the example below).
Developers from external agencies may not be able to register for a Platform.sh account. In this scenario, ask a developer from The National Archives to share their API key.
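The `.env` entry should look something like the following sketch; the value shown is only a placeholder, not a real token:

```sh
# .env — replace the placeholder with your real Platform.sh API token
PLATFORMSH_CLI_TOKEN=your-api-token-here
```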
## Download environment data
Run the following command from the console. NOTE: This will not work from within an existing shell session, so you may have to exit that first.
### Staging

```sh
$ fab pull-staging-data
```
### Production

```sh
$ fab pull-production-data
```
NOTE:

- Data is automatically anonymised after downloading to protect sensitive information, so user logins from production will NOT work locally.
- Any Django users you created locally before running the command will no longer exist.
- A superuser whose credentials are defined in `docker-compose.yml` will be created.
- You can also run `python manage.py createsuperuser` from a container shell to create a new one for yourself (see the sketch after this list).
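A minimal sketch of running that command via Docker Compose; the `web` service name is an assumption, so check `docker-compose.yml` for the actual name of the Django container:

```sh
# "web" is an assumed service name — substitute the real one from docker-compose.yml
$ docker compose exec web python manage.py createsuperuser
```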
## Download environment media
Run the following command from the console. NOTE: This will not work from within an existing shell session, so you may have to exit that first.
N.B. These commands must be run with `sudo` so that file permissions can be corrected once the media is pulled. This works around an issue that will be resolved once we move platforms.
### Staging

```sh
$ sudo fab pull-staging-media
```
### Production

```sh
$ sudo fab pull-production-media
```
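For reference, refreshing a local environment usually means pulling the data and then the media for the same environment, back to back. A sketch for staging (run from a regular console, not from within an existing shell session):

```sh
# Pull the latest staging database, then the staging media files
$ fab pull-staging-data
$ sudo fab pull-staging-media
```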