Wednesday, March 4, 2020

Upload files to Google Drive using a shell script

If you want to upload backup files, such as SQL dumps, to Google Drive regularly using an automated script, here is the solution for you.

1. First, clone the following repository from GitHub.
git clone https://github.com/labbots/google-drive-upload.git
2. Now run the following commands.
cd google-drive-upload
./google-oauth2.sh
It will ask you to enter the Client ID and Secret Key of your Google account. To generate them, open https://console.developers.google.com/apis/credentials and go to:
Create Credentials > OAuth Client ID > Other > Create
Now copy the Client ID and Secret Key and use them in your script. If you get a 403 restricted-access error, go to the OAuth consent screen and create a project, then generate the Client ID and Secret Key again.

3. Now enter the Client ID and Secret Key. They will be stored in ~/.googledrive.conf. To copy a file to Google Drive, run:
sudo ./upload.sh -v -r 1zfZKk37SgPFc4k_sbAVI_Xo9U427X3KU databasefile.sql.zip
1zfZKk37SgPFc4k_sbAVI_Xo9U427X3KU is the ID of the Drive folder where you want to copy the file; you can find it in the folder's URL.

4. You will get a URL along with a device code; open the URL in a browser and submit the code.

Now this device has access to upload files to Google Drive. Adding the code is a one-time activity; once your Google account knows about the device, it will not ask for the code again. That is why you can create a shell script and add it to cron to take regular backups automatically.
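As a quick sanity check, the folder ID used in the upload command is simply the last path segment of the Drive folder's URL. A minimal sketch (the URL below uses the example ID from this post):

```shell
# Extract the Drive folder ID from a folder URL (example ID from this post)
url="https://drive.google.com/drive/folders/1zfZKk37SgPFc4k_sbAVI_Xo9U427X3KU"
folder_id="${url##*/}"   # strip everything up to and including the last '/'
echo "$folder_id"        # -> 1zfZKk37SgPFc4k_sbAVI_Xo9U427X3KU
```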

Note : 
If you have set up the cron job but the file is not transferred to Google Drive, even though the same script works when you run it manually, the likely cause is that you set up the cron job for the root user. The root user cannot find .googledrive.conf in its home directory. When you run the script manually, you may use sudo, but the user running the script is still not root, so it can find .googledrive.conf in its own home directory.

To fix this issue, set up the cron job for the user who configured Google Drive, or copy the .googledrive.conf file into root's home directory. This should solve the issue.
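The automated backup described above can be sketched as a small wrapper script that cron runs nightly. The database name, paths, and commented-out commands below are assumptions for illustration; adjust them for your setup:

```shell
#!/bin/sh
# Hypothetical nightly backup wrapper; paths and database name are placeholders.
set -e
STAMP=$(date +%F)
DUMP="/tmp/databasefile-$STAMP.sql"
ARCHIVE="$DUMP.zip"

# Dump and compress the database (uncomment and adjust for your database):
# mysqldump -u backupuser mydb > "$DUMP"
# zip "$ARCHIVE" "$DUMP"

# Upload with the cloned repo's script, using the folder ID from this post:
# /path/to/google-drive-upload/upload.sh -r 1zfZKk37SgPFc4k_sbAVI_Xo9U427X3KU "$ARCHIVE"
echo "$ARCHIVE"
```

Install it in the crontab of the user who ran google-oauth2.sh (for example `crontab -e`, then a line like `0 2 * * * /path/to/backup.sh`), so the script can find .googledrive.conf in that user's home directory.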

Export DynamoDB as CSV

You may have DynamoDB tables with lots of data that you want to view in an Excel-friendly file. DynamoDB is schemaless (it has no fixed columns), but it can still be exported to CSV. Here is a way to export your tables.
1. Clone the following repo from GitHub.
git clone https://github.com/edasque/DynamoDBtoCSV.git
2. Install the Node.js dependencies.
cd DynamoDBtoCSV
npm install
3. Fill in the access key, secret key, and region in the file DynamoDBtoCSV/config.json:
{
    "accessKeyId": "QWERTNVEXAMPLEMUJMKI",
    "secretAccessKey": "+WSXCD/SdEfExAmPlEJkjKL/KlOiK45Iu",
    "region": "us-east-1"
}
4. Now export tables using the following commands.
node dynamoDBtoCSV.js -t tableOne > tableOne.csv
node dynamoDBtoCSV.js -t tableTwo > tableTwo.csv
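If you have many tables, you can drive the same export script from a loop instead of listing each table by hand. A sketch, assuming the aws CLI is installed and configured for the same region as config.json:

```shell
# Export every DynamoDB table in the configured region to its own CSV file.
export_all_tables() {
  # `aws dynamodb list-tables` with --output text prints the table names
  for t in $(aws dynamodb list-tables --output text --query 'TableNames[]'); do
    node dynamoDBtoCSV.js -t "$t" > "$t.csv"
  done
}
# export_all_tables   # uncomment to run against your account
```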


AWS: take a backup of all IAM roles and policies


If you want to take a backup of all your IAM roles and policies, here is the solution for you.

1. Configure the AWS CLI with your access key and secret key.
aws configure
2. Save all roles and policies in a JSON file.
aws iam get-account-authorization-details > output.json
There are several ways to read this JSON file using a script.
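For example, with jq (assuming it is installed), you can pull the role and managed-policy names back out of the backup; RoleDetailList and Policies are actual top-level keys in the get-account-authorization-details output:

```shell
# List the role and managed-policy names stored in the backup file.
list_backup_names() {
  jq -r '.RoleDetailList[].RoleName' "$1"
  jq -r '.Policies[].PolicyName' "$1"
}
# Usage: list_backup_names output.json
```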

Now you have complete data for your roles and policies.