The Fintech Artificial Intelligence Consortium LAB is a UNSW virtual lab.
This repository defines its global AWS infrastructure (Datalake etc.). It must not be deployed multiple times!
The state of the Terraform configuration is stored in an S3 bucket.
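A minimal sketch of what such an S3 backend block looks like in Terraform (the bucket name, key, and region below are placeholders, not this repo's real values — check the actual backend configuration in the repository):

```hcl
terraform {
  backend "s3" {
    # Placeholder values for illustration only
    bucket = "example-terraform-state-bucket"
    key    = "global/terraform.tfstate"
    region = "ap-southeast-2"
  }
}
```

Because the state is remote and shared, everyone who runs `terraform` against this repo reads and writes the same state, which is why the stack must never be deployed twice.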
Please learn how to use Terraform and how AWS works before doing anything.
1. Configure your AWS credentials (refer to the provider documentation). Do not hard-code your secret token.
2. Modify the infrastructure definition.
3. Run `pipenv install`.
4. Run `make` in the console. Double-check the Terraform plan before accepting the changes.
   - Depending on your OS, you might first need to run `pipenv shell`.
5. Check the `outputs.json` file to find the API URL and API keys.
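The steps above can be sketched as a shell session. This is illustrative only: environment variables are one credential mechanism the AWS provider supports (an AWS profile works too), and the values shown are placeholders.

```shell
# Export credentials instead of hard-coding them in .tf files
# (placeholder values -- use your own, or configure an AWS profile)
export AWS_ACCESS_KEY_ID="AKIA................"
export AWS_SECRET_ACCESS_KEY="your-secret-here"

pipenv install     # install Python dependencies
pipenv shell       # may be required depending on your OS
make               # review the Terraform plan carefully before approving
cat outputs.json   # the API URL and API keys land here after `make output`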
Available `make` targets:

- `make zip_lambdas`: compress the lambdas' code
- `make deploy`: deploy the stack
- `make output`: write outputs to `outputs.json`
- `make destroy`: destroy the stack (bad idea)
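As a rough idea of what those targets might do, here is a hypothetical Makefile fragment. It is a sketch, not the repository's actual Makefile — the zip path and target wiring are assumptions; check the real file before relying on it.

```makefile
# Hypothetical sketch of the make targets described above
zip_lambdas:
	cd src/code && zip -r ../../lambdas.zip .

deploy: zip_lambdas
	terraform apply

output:
	terraform output -json > outputs.json

destroy:  # bad idea: tears down the shared stack
	terraform destroy
```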
A few considerations when modifying the stack:
- All lambda code goes in `/src/code`
- Define global policies in the `policies` module
- Do not add too much abstraction; keep things simple
- One module = one functionality (examples: datalake, docker orchestration, sagemaker env, etc.). Do not split resources belonging to the same functionality (the Datalake API goes with the actual Datalake definition)
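To illustrate the one-module-per-functionality rule, a repository layout along these lines would satisfy it (the directory names besides `src/code` and `policies` are invented for the example):

```
modules/
  datalake/    # Datalake buckets, tables, AND the Datalake API together
  policies/    # global IAM policies shared across modules
  sagemaker/   # hypothetical: everything for the sagemaker env
src/
  code/        # all lambda source code lives here
```

The point is that a reader looking for anything Datalake-related finds it in one module, rather than having its API split off into a separate one.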