Loading environment variables into Bash from AWS Secrets Manager
If you are building a pipeline or some kind of automated script, it is very common to need variables to compile a program or access a database. Some people use “.env” files, which are often versioned in source control and can be a security issue. There is a safer way to do this: keep these variables in an external vault. In this post, we will use AWS Secrets Manager.
First of all, you need to insert your variables into AWS Secrets Manager. This is an easy task and can be done through the AWS Console, or even from your terminal.
Now, we will create a file “app-twitter-dev.json” with our variables:
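The original file contents are not shown here; as a sketch, assuming two hypothetical Twitter credentials (TWITTER_API_KEY and TWITTER_API_SECRET), the file could be created like this:

```shell
# Create app-twitter-dev.json with hypothetical keys; replace the
# placeholder values with your real credentials before uploading.
cat > app-twitter-dev.json <<'EOF'
{
  "TWITTER_API_KEY": "your-api-key",
  "TWITTER_API_SECRET": "your-api-secret"
}
EOF
```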
Once the file is created, we can post its contents to AWS Secrets Manager:
aws secretsmanager create-secret --name dev/app/twitter --secret-string file://app-twitter-dev.json
There is a lot of room for customization in this process; for example, you can choose the encryption key. I recommend reading the AWS Secrets Manager documentation.
Now we can retrieve the values from AWS Secrets Manager and apply them to the system.
aws secretsmanager get-secret-value --secret-id dev/app/twitter --query SecretString --output text > /tmp/secrets.json
There are many ways to do this: regular expressions, awk, and so on. I preferred to use an external library to transform the JSON file exported from AWS Secrets Manager into a temporary KEY/VALUE file.
1) npm install --global convert-json-env
2) convert-json-env /tmp/secrets.json --prefix="export " --out=.test.env
3) eval $(cat .test.env)
4) rm -f /tmp/secrets.json && rm -f .test.env
Let me explain each step:
(1) You installed a Node.js dependency capable of transforming your file. This only needs to be done once on your system.
(2) You called the command “convert-json-env” to transform your JSON file into a KEY/VALUE file. I took the opportunity to prepend the word “export” to each line, which facilitates the next command. The “export” statement makes a variable available to the current shell and its child processes, e.g.: “export name=guilherme”.
(3) You used the Bash built-in “eval” to execute the export statements contained in the file.
(4) You deleted the temporary files used in this operation.
(5) The command “printenv” shows you the variables now set in your environment.
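As an alternative to the npm package, the same conversion can be sketched with a jq one-liner (assuming jq is installed), applied to the /tmp/secrets.json file produced earlier:

```shell
# Convert /tmp/secrets.json into "export KEY=VALUE" statements and
# evaluate them in the current shell. The @sh filter single-quotes
# each value so spaces and special characters survive.
eval "$(jq -r 'to_entries[] | "export \(.key)=\(.value | @sh)"' /tmp/secrets.json)"
rm -f /tmp/secrets.json
```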
My Pipeline Step:
aws secretsmanager get-secret-value --profile mason-dev --secret-id dev/app/twitter --query SecretString --output text > /tmp/secrets.json
convert-json-env /tmp/secrets.json --prefix="export " --out=.test.env
eval $(cat .test.env)
rm -f /tmp/secrets.json && rm -f .test.env
And last but not least: to execute these instructions with the AWS CLI, you need certain permissions attached to your access policy. This subject is a little more technical and deserves more detail; I believe it is the subject of another post. Anyway, I will leave here the necessary policies for each operation.
These policies are for testing only and are highly unrestricted. I recommend you talk to your Cloud Administrator to create an appropriate policy.
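The original policy snippets are not reproduced here; as a sketch, a minimal (and deliberately broad) IAM policy covering the two operations used above might look like this:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "CreateAndReadSecrets",
      "Effect": "Allow",
      "Action": [
        "secretsmanager:CreateSecret",
        "secretsmanager:GetSecretValue"
      ],
      "Resource": "*"
    }
  ]
}
```

Restricting "Resource" to the ARN of your specific secret is the safer choice for anything beyond testing.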
This is it!
You will probably find several other ways on the internet to do the same thing, many of them more elegant or practical. Leave a comment if you find something cool.