Branch | Status |
---|---|
master | (build status badge) |
- Table of contents
- Description
- Install
- External Resources Dependencies
- Docker
- Unit Testing
- Download WMS image legends
- Python Code Styling
- Varia
Next generation services https://api3.geo.admin.ch for https://map.geo.admin.ch
In mid-August 2022 the project was migrated to python3, docker and eu-central-1.
The required environment variables are set in .env.default. They can be adapted, or you can use a copy of .env.default, e.g. .env.mine, and use that instead.
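For example, creating a personal environment file could look like this (a minimal sketch; only .env.default and .env.mine are taken from this README, the editor choice is up to you):

```bash
# Make a personal copy of the default environment file and adapt it.
cp .env.default .env.mine
# Override whichever variables you need in the copy.
$EDITOR .env.mine
```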
Install the Python virtual environment (still virtualenv at this point):
make setup
Build the Pylons settings files and run the local waitress server:
summon -p ssm make serve
You may want to customize the variables. Copy the file .env.default as .env.mine, change the variables you want and use them with:
summon -p ssm make ENV_FILE=.env.mine serve
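Once the server is running you can do a quick smoke test against it. The /checker endpoint exists on api3.geo.admin.ch; the local port below is only an assumption, use whatever port waitress reports on startup:

```bash
# Quick smoke test of the locally running service (port is an assumption).
curl http://localhost:6543/checker
```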
Note: you need some external resources to run the service, see External Resources Dependencies.
To run the service locally you need to have access to the following external resources:
- PostgreSQL database pg-geodata-replica.bgdi.ch
- S3 bucket service-mf-chsdi3-grid-geojsons-dev-swisstopo (NOTE: this bucket is only required by some of the Identify endpoints)
You can use the ssh port forwarding feature to have access to pg-geodata-replica.bgdi.ch by using the jump host:
ssh ssh0a.prod.bgdi.ch -L 5432:pg-geodata-replica.bgdi.ch:5432
Then set the DBHOST environment variable to localhost (you can do this in your own environment file, e.g. .env.mine, and run the make file as follows: summon -p ssm make ENV_FILE=.env.mine serve).
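As an illustration, a .env.mine that points the service at the ssh tunnel opened above only needs to override the database host (a minimal sketch; DBHOST is the only variable named in this README, everything else from your copy of .env.default stays unchanged):

```bash
# .env.mine — route database traffic through the local ssh tunnel
DBHOST=localhost
```

With the tunnel open in another terminal, summon -p ssm make ENV_FILE=.env.mine serve will then reach pg-geodata-replica.bgdi.ch through localhost:5432.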
To have access to the S3 bucket, you can set your AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY environment variables.
Alternatively, if you are using zsh, you can use the aws plugin (see oh-my-zsh aws plugin) with the following command:
acp swisstopo-bgdi-dev
This command will automatically use your AWS profile swisstopo-bgdi-dev for any AWS service connection.
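If you prefer not to use the zsh plugin, exporting the credentials (or a named profile) by hand works just as well; a sketch, using the standard AWS environment variables (only AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and the swisstopo-bgdi-dev profile name appear in this README, the rest is generic AWS CLI/SDK behaviour):

```bash
# Option 1: explicit credentials
export AWS_ACCESS_KEY_ID=...
export AWS_SECRET_ACCESS_KEY=...

# Option 2: rely on a named profile from ~/.aws/credentials
export AWS_PROFILE=swisstopo-bgdi-dev
```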
To build the Docker image and run it locally:
make dockerbuild
summon -p ssm make dockerrun
Note: you need some external resources to run the service, see External Resources Dependencies.
First log in to the AWS ECR registry with:
make dockerlogin
Afterwards you can push the locally built image to ECR with:
make dockerpush
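Putting the Docker targets together, a typical local cycle might look like this (a sketch; the ENV_FILE override on dockerrun is an assumption, mirroring how the serve and test targets are invoked above):

```bash
make dockerbuild                                  # build the image locally
summon -p ssm make ENV_FILE=.env.mine dockerrun   # run it with your own env file (assumed to be supported)
make dockerlogin                                  # log in to the AWS ECR registry
make dockerpush                                   # push the locally built image
```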
- PostgreSQL DB pg-geodata-replica.bgdi.ch must be reachable
- Access to AWS services
  - Read access to S3 bucket service-mf-chsdi3-grid-geojsons-dev-swisstopo (can be disabled with S3_TESTS=0)
  - Read access to S3 bucket

See External Resources Dependencies for more info on those prerequisites.
To run the tests, enter:
summon -p ssm make test
Or, if you use your own environment file:
summon -p ssm make ENV_FILE=.env.mine test
To skip the tests that require access to the S3 bucket:
summon -p ssm make S3_TESTS=0 test
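Both overrides can be combined in a single invocation (a sketch, assuming the make variables compose the same way they do individually):

```bash
# Use a personal env file and skip the S3-dependent tests in one go.
summon -p ssm make ENV_FILE=.env.mine S3_TESTS=0 test
```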
In order to download all images of a layer in the correct format and with the correct dimensions, simply use:
make legends BODID=ch.layername WMSHOST=wms.geo.admin.ch
Alternatively, you can download a WMS legend for a specific scale:
make legends BODID=ch.layername WMSHOST=wms.geo.admin.ch WMSSCALELEGEND=1000
You will need the optipng tool in order to download the legends; use sudo apt install optipng to install it.
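For reference, a single legend image can also be fetched by hand with a standard WMS GetLegendGraphic request against the same host (an illustrative sketch; the exact parameters the make target uses are not documented here, and ch.layername is a placeholder as above):

```bash
# Illustrative GetLegendGraphic request (parameter set follows the WMS 1.3.0 spec,
# the host may expect slightly different options).
curl -o legend.png "https://wms.geo.admin.ch/?SERVICE=WMS&VERSION=1.3.0&REQUEST=GetLegendGraphic&SLD_VERSION=1.1.0&FORMAT=image/png&LAYER=ch.layername"
```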
We are currently using the flake8 convention for Python code. You can find more information about our code styling here:
You can find additional information about autopep8 here:
To check the code styling:
make lint
To autocorrect most linting mistakes
make autolint
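The lint targets presumably wrap flake8 and autopep8; if you want to run the tools directly on the source tree, something like this works (a sketch; the exact flags the Makefile uses are not documented here, and the chsdi/ path is inferred from the repository layout):

```bash
# Check code style with flake8.
flake8 chsdi/
# Auto-fix most style issues in place with autopep8.
autopep8 --in-place --recursive chsdi/
```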
To pretty-print a vector style JSON file with jsonlint-cli from the local npm binaries:
export PATH=$(npm bin):$PATH
jsonlint-cli --pretty temp.json > chsdi/static/vectorStyles/ch.meteoschweiz.messwerte-foehn-10min.json
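If jsonlint-cli is not available yet, it can be installed locally with npm (a sketch; the install step is an assumption, only the jsonlint-cli binary itself appears in the command above):

```bash
# Install jsonlint-cli into the local node_modules and make it visible on PATH.
npm install jsonlint-cli
export PATH=$(npm bin):$PATH
command -v jsonlint-cli   # verify the binary is found
```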