Sunday, June 9, 2019

Apache - Make specific GET request forbidden by matching pattern

If there are certain GET requests that you want to forbid (403) on your server by matching a pattern, you can write rules in .htaccess or in the Apache configuration file.

Add the following snippet to your Apache configuration file, and it will block every GET request whose query string matches the pattern.

Suppose an HTTP request is
http://porcupine.com/paymentcontroller.php?id=oculus&name=johnathan
You can block this request by id, by name, or by both. Here I am blocking by id.
<If "%{QUERY_STRING} =~ /id=oculus/">
  Require all denied
</If>

Reload Apache. (The <If> directive requires Apache 2.4 or later.)
Now all requests whose query string contains 'id=oculus' will be forbidden.
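The <If> expression is just a regex test against the query string. As a quick sanity check of the pattern itself (not of Apache), the same match can be sketched in Python:

```python
import re

# The Apache rule above does a regex match on the query string; any
# request whose query string contains 'id=oculus' gets a 403.
pattern = re.compile(r"id=oculus")

def is_forbidden(query_string):
    """Return True if the rule would deny this query string."""
    return bool(pattern.search(query_string))

print(is_forbidden("id=oculus&name=johnathan"))  # True
print(is_forbidden("id=vive&name=johnathan"))    # False
```

If the second call returned True, your pattern is too broad; tighten the regex before deploying the rule.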

DynamoDB Backup and Restore

If you are using DynamoDB on AWS and running into problems importing and exporting tables, here is a solution for you.

In this solution, you download a Python package from GitHub and use it to easily back up tables to your local machine or to any S3 bucket.

1. Clone the dynamodump script from GitHub.
git clone https://github.com/bchew/dynamodump.git

2. cd into the directory
cd dynamodump

3. Now you can back up one table, several tables, or the whole database.
Suppose you want to back up one table.
python dynamodump.py -m backup -r aws-region-name -s dynamo-tablename
In this case my AWS region is us-west-1 and the table name is users_profile.
python dynamodump.py -m backup -r us-west-1 -s users_profile
It will save the backup of table users_profile in a directory named dump inside the cloned dynamodump directory.

If you want to restore this table, you can restore it either locally or on AWS.
a) To restore locally
python dynamodump.py -m restore -r us-west-1 -s users_profile
b) To restore on AWS
To restore the table on AWS, you should have a .boto file in your home directory containing your access and secret keys.
cat ~/.boto
[Credentials]
aws_access_key_id = AKIAJSIE27KKMHXI3BJQ
aws_secret_access_key = 5bEYu26084qjSFyclM/f2pz4gviSfoOg+mFwBH39

Alternatively, AWS credentials can be configured in ~/.aws/credentials:
cat ~/.aws/credentials
[default]
aws_access_key_id = AKIAJSIE27KKMHXI3BJQ
aws_secret_access_key = 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
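Both ~/.boto and ~/.aws/credentials are plain INI files, so you can sanity-check one with Python's standard configparser. A minimal sketch (the path and key values below are dummies, not real credentials):

```python
import configparser
import os
import tempfile

# Write a throwaway .boto-style file and read it back, just to show the
# INI layout boto expects: a [Credentials] section with two keys.
boto_text = (
    "[Credentials]\n"
    "aws_access_key_id = EXAMPLEKEYID\n"
    "aws_secret_access_key = examplesecretkey\n"
)

path = os.path.join(tempfile.mkdtemp(), ".boto")
with open(path, "w") as f:
    f.write(boto_text)

config = configparser.ConfigParser()
config.read(path)
print(config["Credentials"]["aws_access_key_id"])  # EXAMPLEKEYID
```

If the section or key names are misspelled, boto silently falls back to other credential sources, so a quick parse like this can save debugging time.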

These access and secret keys should have permission to import tables into DynamoDB in your AWS account.
python dynamodump.py -m restore -r us-west-1 -s users_profile

4. Similarly, you can back up the complete database and restore it.
python dynamodump.py -m backup -r us-west-1 -s "*"
Quote the * so the shell does not expand it. This backs up all tables into a directory named dump inside the cloned dynamodump directory.

If you want to restore all tables, you can restore them either locally or on AWS.
a) To restore locally
python dynamodump.py -m restore -r us-west-1 -s "*"

b) To restore on AWS
To restore tables on AWS, you should have a .boto file with your access and secret keys, or a .aws directory with a credentials file, in your home directory.
python dynamodump.py -m restore -r us-west-1 -s "*"

5. If you want to back up DynamoDB to an S3 bucket, your access and secret keys should have permission to upload content to that bucket.
python dynamodump.py -m backup -r region-name -s "*" -a zip -b s3_bucket_name
In this case my AWS region is us-west-1 and the S3 bucket name is oculus-db-backup.
python dynamodump.py -m backup -r us-west-1 -s "*" -a zip -b oculus-db-backup
It will copy dump.zip to your S3 bucket. dump.zip contains all the exported JSON files of your DynamoDB tables.
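Once dump.zip is downloaded from the bucket, you can inspect it without extracting it, using Python's standard zipfile module. The archive built here is a tiny stand-in, and the internal dump/<table>/ layout is an assumption based on dynamodump's local output:

```python
import io
import json
import zipfile

# Build a tiny stand-in for dump.zip (the real one holds one schema and
# data dump per table), then list its contents without extracting.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as zf:
    zf.writestr("dump/users_profile/schema.json",
                json.dumps({"Table": {"TableName": "users_profile"}}))

with zipfile.ZipFile(buf) as zf:
    names = zf.namelist()

print(names)  # ['dump/users_profile/schema.json']
```

Listing the archive this way is a quick check that the backup actually contains the tables you expect before you attempt a restore.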

Source :
https://github.com/bchew/dynamodump