
Installation

hydrafloods itself is a pure Python package, but some of its dependencies are not. Furthermore, the package relies on Google Cloud and Google Earth Engine to stage and process data, which requires specific software and, even more importantly, account authentication. There are two ways to install the package: a manual installation or a pre-built Docker Image.

Manual installation

The most convenient way to install the package and its dependencies is by creating a virtual environment using anaconda. To create and activate a new environment:

conda create -n hydra -c conda-forge python=3.7 -y
conda activate hydra

Next, we need to install the hydrafloods package via pip:

pip install hydrafloods

pip should handle the basic dependencies, such as the Earth Engine Python API, that are needed for the majority of the functionality.
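To confirm that pip installed everything correctly, you can check that the core modules are importable. This is a minimal sketch, assuming the module names `hydrafloods` and `ee` (the import name of the Earth Engine Python API); it only checks importability, not cloud authentication.

```python
import importlib.util

def missing_deps(names):
    """Return the subset of module names that are not importable."""
    return [n for n in names if importlib.util.find_spec(n) is None]

# "ee" is the import name of the Earth Engine Python API,
# which pip installs as a dependency of hydrafloods.
missing = missing_deps(["hydrafloods", "ee"])
if missing:
    print("missing dependencies:", ", ".join(missing))
else:
    print("all core dependencies found")
```

If anything is reported missing, re-run `pip install hydrafloods` inside the activated conda environment.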

You will also need to install the Google Cloud SDK to interface with Google Cloud. Follow the directions provided on the website.

Once all of the source code and dependencies have been installed successfully, you will need to authenticate the cloud APIs (see the Cloud authentication section below).

Using the Docker Image

A convenient way to get up and running with the hydrafloods package is via a Docker Image. Caution: this installation method is recommended for advanced users who plan to deploy workflows on servers. The Docker Image comes with software and dependencies pre-installed, so you do not have to deal with mismatched dependencies or sometimes difficult installations.

To start you will need to have Docker installed on your system and running. You will need to pull the pre-built Docker Image for hydrafloods and start a new Container from the Image using the following command:

docker run --interactive --tty \
  --volume ~/<PROJECT-DIR>/:/mnt/ \
  --name hydrafloods_container kmarkert/hydrafloods

This command is a one-time process that downloads the Image and starts the Container. It also mounts a local directory (i.e. ~/<PROJECT-DIR>/) for use within the Docker Container, which allows you to edit files locally and use them within the container. Be sure to change <PROJECT-DIR> in the command to an existing local directory. Now the Docker Container is running and ready for use!
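On Linux hosts, `docker run` will create a missing host path for a `--volume` mount as a root-owned directory, so it helps to make sure the project directory exists (and is owned by you) before starting the container. A minimal sketch, assuming a hypothetical path `~/hydrafloods-project`:

```shell
# Hypothetical project directory; substitute your own <PROJECT-DIR>.
PROJECT_DIR="$HOME/hydrafloods-project"

# Create the directory if it does not already exist, so the
# --volume flag points at a real, user-owned path.
mkdir -p "$PROJECT_DIR"

# Files placed here will appear under /mnt/ inside the container.
echo "host path $PROJECT_DIR will be mounted at /mnt/"
```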

Within the Docker Container the hydrafloods package and dependencies are pre-installed, so all that is left is to authenticate the cloud APIs; then we will be ready to test and start processing.

If you have exited the Docker Container and want to start it again, use the following command:

docker start -ia hydrafloods_container

This command to restart an existing Container is important, especially after authenticating the cloud environment, so that you do not have to go through the authentication process every time you run the Docker Container. For more information on working with Docker Images/Containers from the CLI, see the Docker command line documentation.

Cloud authentication

After successfully installing the package and dependencies, we need to authenticate our local installation (or the Docker Container) to interface with Google Cloud and Earth Engine. Running these commands will prompt you through the authentication process in a web browser.

Warning: Make sure you initialize the earthengine and gcloud APIs with Google accounts that have permissions to read and write to Google Cloud Storage and Google Earth Engine assets. These do not have to be the same account (although that helps); they simply need to be authenticated to read and write Earth Engine and Cloud Storage assets.

To initialize the Google Cloud environment and authenticate using your credentials, run the following command:

gcloud init

To authenticate the Earth Engine Python API with your credentials, run the following:

earthengine authenticate
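To confirm the Earth Engine authentication took, you can try initializing the Earth Engine client from Python. This is a minimal sketch, assuming the `ee` module is installed; `ee.Initialize()` raises an exception when credentials are missing or invalid, which the hypothetical helper `check_init` reports.

```python
def check_init(initialize):
    """Call an initializer and report whether it succeeded."""
    try:
        initialize()
        return "ready"
    except Exception as exc:
        return f"failed: {exc}"

# With the Earth Engine API installed and authenticated, this prints
# "Earth Engine: ready"; otherwise it prints the initialization error.
try:
    import ee
    print("Earth Engine:", check_init(ee.Initialize))
except ImportError:
    print("Earth Engine API is not installed")
```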

Now we are ready to test our installation!

Testing installation

🚧 Coming soon! 🚧