Friday, December 30, 2022

PHP error - Request entity too large. The requested resource does not allow request data with POST requests, or the amount of data provided in the request exceeds the capacity limit

WordPress - Request entity too large.

Solution : 

Add the following to apache2.conf (LimitRequestBody is in bytes; 100000000 is roughly 95 MB):

LimitRequestBody 100000000

Update in php.ini:

max_input_time = 300
max_execution_time = 300
post_max_size = 512M
upload_max_filesize = 512M

Restart Apache so the changes take effect.

Wednesday, December 28, 2022

Scan https Web Application for tls vulnerability using openssl

SSL Labs is an efficient way to find out which SSL/TLS protocols are enabled on your web server.

Enter your website URL and click Submit:

https://www.ssllabs.com/ssltest/

It generates an SSL scan report for your web application, where you can check which TLS protocol versions are enabled or disabled.

If you want to check the TLS protocol versions locally, run the following commands for the different versions of TLS.

TLS 1.0 and TLS 1.1 are the vulnerable versions.

openssl s_client -connect app.localhost:443 -tls1
openssl s_client -connect app.localhost:443 -tls1_1
openssl s_client -connect app.localhost:443 -tls1_2
openssl s_client -connect app.localhost:443 -tls1_3

If TLS 1.0 or TLS 1.1 is enabled on your web server but the commands above give no result, your local OpenSSL has probably disabled those versions by default; you need to make configuration changes in your openssl.cnf file.

Take a backup of /etc/ssl/openssl.cnf:

sudo cp /etc/ssl/openssl.cnf /etc/ssl/openssl.cnf.bkup

Add this to the beginning of your config file /etc/ssl/openssl.cnf
openssl_conf = default_conf

And then this to the end:
[ default_conf ]
ssl_conf = ssl_sect
[ssl_sect]
system_default = system_default_sect
[system_default_sect]
MinProtocol = TLSv1.2
CipherString = DEFAULT:@SECLEVEL=1

Now export the path so this config file is picked up:
export OPENSSL_CONF=/etc/ssl/openssl.cnf

Now check again

openssl s_client -connect app.localhost:443 -tls1
openssl s_client -connect app.localhost:443 -tls1_1
 

Now it should show enabled tls protocols on command line

You can disable these protocols in your webserver config.
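In Apache, for example, the vulnerable versions can be disabled with mod_ssl's SSLProtocol directive. A sketch (the exact config file depends on your distribution, e.g. a site file under /etc/apache2/sites-enabled/):

```apache
# Allow every protocol except the legacy ones; keeps TLS 1.2 and 1.3 only
SSLProtocol all -SSLv3 -TLSv1 -TLSv1.1
```

Reload the web server afterwards so the change takes effect.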

ngcc not found, @angular/cli is already installed

If you get the error 'ngcc not found' even though you have already installed all required Angular packages, the Angular compiler CLI is missing from your project.

To solve the issue, run:

npm install @angular/compiler-cli --save

Set npm Registry url locally

If npm takes the registry URL from the global npmrc config file rather than the one in your home directory, you would normally modify the global npmrc, but you cannot update it without root privileges.

You can set a local registry URL for your npm commands:

npm config set registry https://registry.npmjs.org/

Now the URL mentioned in the global npmrc file is no longer effective, and packages are downloaded from this URL only.

When you close the terminal and open a new one, you need to run this command again. To get rid of this, add the command to your ~/.bashrc file; then it is executed automatically whenever you open a terminal.
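For example, the line can be appended to ~/.bashrc from the shell (assumes bash is your login shell):

```shell
# Persist the registry override so every new terminal applies it automatically
echo 'npm config set registry https://registry.npmjs.org/' >> ~/.bashrc
# Confirm the line landed at the end of the file
tail -n 1 ~/.bashrc
```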

Friday, December 23, 2022

Amazon API Generate Access Token and Refresh Token


a)  Log into Amazon Developer Console

https://developer.amazon.com/dashboard

b) Login with Amazon > Create a Security Profile

c) Generate client id and client secret.

d) Add your redirect URL under Web Settings as an allowed (whitelisted) redirect URL.

e) Generate Authorization Code (One Time)

https://www.amazon.com/ap/oa?client_id=xxxxxxxxxxx&response_type=code&redirect_uri=http://localhost&scope=profile&state=SPECIAL

f) Generate Refresh Token (One Time)

curl --request POST --data "code=xxxxxxxxxxx&client_id=xxxxxxxxxxxx&client_secret=xxxxxxxxxxxxx&redirect_uri=http://localhost&grant_type=authorization_code" https://api.amazon.com/auth/o2/token

g) Generate Access Token from Refresh token (Always)

curl --request POST --data "client_id=xxxxxxxxxxx&client_secret=xxxxxxxxxxx&refresh_token=xxxxxxxxxxxxxxxxxxx&grant_type=refresh_token" https://api.amazon.com/auth/o2/token
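The token endpoint returns JSON; for scripting, the access_token field can be pulled out with standard tools. A sketch with a made-up response (real responses contain live tokens and a matching expires_in):

```shell
# Hypothetical response body from https://api.amazon.com/auth/o2/token
response='{"access_token":"Atza|example123","refresh_token":"Atzr|example456","token_type":"bearer","expires_in":3600}'

# Extract the access_token value without needing jq
access_token=$(printf '%s' "$response" | grep -o '"access_token":"[^"]*"' | cut -d'"' -f4)
echo "$access_token"   # Atza|example123
```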
 

Generate secure https certificate for localhost using openssl commands

Create a working copy of the OpenSSL configuration file:

sudo cp /usr/lib/ssl/openssl.cnf /etc/ssl/app.localhost.cnf

Now add the code below to the respective sections of the copied configuration file /etc/ssl/app.localhost.cnf:

[ v3_ca ]
subjectKeyIdentifier=hash
authorityKeyIdentifier=keyid:always,issuer
basicConstraints = critical, CA:TRUE, pathlen:3
keyUsage = critical, cRLSign, keyCertSign
nsCertType = sslCA, emailCA

[ v3_req ]
basicConstraints = CA:FALSE
keyUsage = nonRepudiation, digitalSignature, keyEncipherment
#extendedKeyUsage=serverAuth
subjectAltName = @alt_names

[ alt_names ]
DNS.1 = app.localhost
DNS.2 = localhost
DNS.3 = app1.localhost


Uncomment this line (in the [ req ] section):

req_extensions = v3_req

Create CA Certificate : 

openssl genrsa -aes256 -out ca.key.pem 2048

chmod 400 ca.key.pem 

openssl req -new -x509 -subj "/CN=applocalhostca" -extensions v3_ca -days 3650 -key ca.key.pem -sha256 -out ca.pem -config /etc/ssl/app.localhost.cnf 

openssl x509 -in ca.pem -text -noout

Create Server certificate signed by CA : 

openssl genrsa -out app.localhost.key.pem 2048

openssl req -subj "/CN=app.localhost" -extensions v3_req -sha256 -new -key app.localhost.key.pem -out app.localhost.csr

openssl x509 -req -extensions v3_req -days 3650 -sha256 -in app.localhost.csr -CA ca.pem -CAkey ca.key.pem -CAcreateserial -out app.localhost.crt -extfile /etc/ssl/app.localhost.cnf

openssl x509 -in app.localhost.crt -text -noout

Now add ca.pem to Chrome and Firefox:
Chrome > Privacy and security > Security > Manage certificates > Authorities
Firefox > Privacy and security > View Certificates > Authorities
 

Use this crt and key in your web server as the SSL certificate.
Now open the virtual host over https; no warning should appear.
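As a sketch, an Apache virtual host using the generated files could look like this (the paths and virtual host name are assumptions; adjust to where you stored the files):

```apache
<VirtualHost *:443>
    ServerName app.localhost
    SSLEngine on
    # Certificate and key created in the steps above
    SSLCertificateFile /etc/ssl/app.localhost.crt
    SSLCertificateKeyFile /etc/ssl/app.localhost.key.pem
</VirtualHost>
```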

Tuesday, December 13, 2022

Check cpu usage of last 30 days

To get this, /usr/bin/sar (from the sysstat package) should be installed; it is present by default on many systems.
On CentOS or Red Hat, the CPU logs should be in the /var/log/sa folder.

On Ubuntu there is no /var/log/sa folder; the logs are stored in /var/log/sysstat instead, with one file per day.


username@hostname:~/Desktop$ ls /var/log/sysstat
sa09  sa10  sa11  sa12  sa13  sa14  sa15  sar09  sar10  sar11  sar13  sar14

You can find the last 30 days of logs here.
For example, to check the logs of the 15th, run:
sar -f /var/log/sysstat/sa15

                    CPU    %user    %nice  %system  %iowait   %steal    %idle
11:45:01 AM IST     all    15.59     1.82     3.71     2.11     0.00    76.77
11:55:02 AM IST     all    14.55     0.00     3.59     3.55     0.00    78.31
12:05:01 PM IST     all    14.24     0.00     3.30     0.71     0.00    81.75
12:15:01 PM IST     all    14.06     0.00     3.31     0.58     0.00    82.06
12:25:02 PM IST     all    14.02     0.00     3.39     0.88     0.00    81.71
12:35:01 PM IST     all    15.12     0.00     3.32     0.79     0.00    80.77
12:45:01 PM IST     all    13.96     0.00     3.36     1.09     0.00    81.59
12:55:02 PM IST     all    12.23     0.06     3.13     2.70     0.00    81.87
01:05:01 PM IST     all     2.12     0.00     1.00     0.56     0.00    96.32
01:15:01 PM IST     all    10.70     0.00     3.49     2.65     0.00    83.16
01:25:01 PM IST     all    13.24     0.03     3.99     2.80     0.00    79.94
01:35:01 PM IST     all    11.66     0.04     3.39     3.13     0.00    81.78

You will get the CPU usage for the complete day; the above is part of the output.

If logs are not being stored, enable the flag in the file:
sudo nano /etc/default/sysstat
ENABLED="true"

Restart service
sudo systemctl restart sysstat.service
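To summarize a whole day's file quickly, the sar output can be piped through awk. A sketch with two sample rows shaped like the output above (%idle is the last column; in real use replace the printf with sar -f /var/log/sysstat/sa15):

```shell
# Average the %idle column over all sampled rows
printf '%s\n' \
  '11:45:01 AM IST all 15.59 1.82 3.71 2.11 0.00 76.77' \
  '11:55:02 AM IST all 14.55 0.00 3.59 3.55 0.00 78.31' |
awk '{ idle += $NF; rows++ } END { printf "average idle: %.2f%%\n", idle / rows }'
```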

Thursday, December 8, 2022

Configure localhost with secure https

 

Steps :

sudo apt install libnss3-tools -y 

wget https://github.com/FiloSottile/mkcert/releases/download/v1.4.3/mkcert-v1.4.3-linux-amd64 

sudo cp mkcert-v1.4.3-linux-amd64 /usr/local/bin/mkcert 

sudo chmod +x /usr/local/bin/mkcert 

mkcert -install

Now use your virtual host name in the command below instead of app.localhost. You can generate a single certificate for multiple virtual hosts; add them space separated to the command.

mkcert app.localhost localhost 127.0.0.1 

Use the certificates in your Apache SSL config and restart Apache. Now open localhost and the other virtual hosts in the browser; they should be served securely.

Wednesday, November 30, 2022

Read, Send and Delete Gmails using Gmail API

After generating the Client ID and Client Secret:

1) Generate Authorization Code (One Time)

https://accounts.google.com/o/oauth2/auth?client_id=XXXXXXXX&redirect_uri=http://localhost&response_type=code&scope=https://mail.google.com/&access_type=offline

2) Generate Refresh Token (One Time)

curl --request POST --data "code=XXXXXXXX&client_id=XXXXXXXX&client_secret=XXXXXXXX&redirect_uri=http://localhost&grant_type=authorization_code" https://oauth2.googleapis.com/token

3) Generate Access token from Refresh Token (Every Time)

curl --request POST --data "client_id=XXXXXXXX&client_secret=XXXXXXXX&refresh_token=XXXXXXXX&grant_type=refresh_token" https://oauth2.googleapis.com/token

4) Read Email

List Message IDs of all emails
curl -X GET \
  'https://gmail.googleapis.com/gmail/v1/users/me/messages?q=label:inbox' \
  -H 'Authorization: Bearer XXXXXXXXXXXXXXXXXXXXXXXX'

List Content of Specific email
curl -sX GET   'https://gmail.googleapis.com/gmail/v1/users/me/messages/<msg-id>'   -H 'Authorization: Bearer XXXXXXXXXXXXXXXXXXXXXXXX' | jq -r '.payload.parts[0].body.data' | base64 --decode

5) Send Email
Mail Details in Plain Text

 
echo "From: corecodejs@gmail.com
To: corecodejs@gmail.com
Subject: Test Email Subject

This is the body of the email." > email.txt

cat email.txt

File encoded in base64

base64 email.txt | tr -d '\n' | tr '+/' '-_' > email_encoded.txt
cat email_encoded.txt

Send email
curl -X POST \
  'https://gmail.googleapis.com/gmail/v1/users/me/messages/send' \
  -H 'Authorization: Bearer XXXXXXXXXXXXXXXXXXXXXXXX' \
  -H 'Content-Type: application/json' \
  -d '{
    "raw": "'$(cat email_encoded.txt)'"
}'

6) Delete Email

curl -X DELETE \
  'https://gmail.googleapis.com/gmail/v1/users/me/messages/<msg-id>' \
  -H 'Authorization: Bearer XXXXXXXXXXXXXXXXXXXXXXXX'

Sunday, November 27, 2022

IBM / MAX Text Sentiment Analysis

Run an already trained model instance

git clone https://github.com/IBM/MAX-Text-Sentiment-Classifier.git
cd MAX-Text-Sentiment-Classifier
docker build -t max-text-sentiment-classifier .
docker run -it -p 5000:5000 max-text-sentiment-classifier
This builds and runs the application in a Docker container. We can check the application on port 5000:
http://localhost:5000

Open it via a domain/subdomain, or access it at http://ip:5000.
It opens a web interface. Since the model is already trained, it classifies input as positive or negative.

We pass in a sentence and it tells us whether the sentence is positive or negative. Once the Docker container is running, we can send a request using curl and read the output from the response.

curl -d "{ \"text\": [ \"The Model Asset Exchange is a crucial element of a developer's toolkit.\" ]}" -X POST "http://localhost:5000/model/predict" -H "Content-Type: application/json"

Response :

{
  "status": "ok",
  "predictions": [
    [
      {
        "positive": 0.9977352619171143,
        "negative": 0.0022646968718618155
      }
    ]
  ]
}

To Train the Model :

We are already in the MAX-Text-Sentiment-Classifier folder; cd into the training folder and create a virtual env.

cd training/
pip install -r requirements.txt
python setup_max_model_training.py max-text-classifier-training-config.yaml
This command requires an account set up in IBM Cloud. How accurate the output is depends on the training tsv file.
https://cloud.ibm.com/login
Go to Manage > Access > API Keys > generate a key.
Download the file apikey.json, copy it to the server and provide your key path.

Now run the docker instance; it will be launched based on this training.
docker build -t max-text-sentiment-classifier --build-arg use_pre_trained_model=false .
docker run -it -p 5000:5000 max-text-sentiment-classifier
Open url
http://ip:5000
Now it will include a neutral score in the response too, as the model has been trained on the given tsv input.

To reset everything in the cloud, delete the created service and created storage in your IBM Cloud account.
Remove all key entries from /etc/environment, close the terminal, open it and log in again, and run

python setup_max_model_training.py max-text-classifier-training-config.yaml
Note :

https://github.com/IBM/MAX-Text-Sentiment-Classifier


Saturday, November 19, 2022

Bash Sleep Sort

#!/bin/bash

function f() {
    sleep "$1"
    echo "$1"
}

while [ -n "$1" ]
do
    f "$1" &
    shift
done
wait

Run the script:

./file.sh 6 1 3 7 8 4 2

Output :

It sorts the numbers and prints each number after that many seconds.

For example, 7 is printed after 7 seconds.
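The same technique can be made to finish in under a second by using fractional sleeps (assumes GNU sleep, which accepts decimal seconds):

```shell
# Each number sleeps 0.N seconds, so smaller numbers print first
slow_echo() { sleep "0.$1"; echo "$1"; }
sorted=$(for n in 3 1 2; do slow_echo "$n" & done; wait)
echo "$sorted"   # prints 1, 2, 3 on separate lines
```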

Thursday, November 10, 2022

Logrotate is not running automatically in Redhat or CentOS

When you run the logrotate command manually, it successfully creates archived gz files in the directory mentioned in the config file /etc/logrotate.d/custom-config, but the daily cron does not run the command automatically.

This is a common logrotate problem that you generally do not see in Ubuntu. You might not hit it even on Red Hat or CentOS if you run logrotate for the default log directory, i.e. /var/log; it appears when you run logrotate automatically for a non-default directory like /opt/httpd/logs, /opt/odoo/logs or /opt/tomcat/logs.

Anacron runs logrotate via cron.daily. The issue is that SELinux does not allow logrotate run from anacron to access non-default log directories; they must be labeled explicitly for logrotate to work there.

There are two solutions to this issue.

Solution : 1

Do not depend on anacron. Create your own cron job as the root user. Log into the terminal as root and set a daily cron:

00 05 * * * /usr/sbin/logrotate -f /etc/logrotate.conf

Solution : 2

Configure SELinux to allow logrotate, run via anacron, to access non-default directories.

Follow this solution.

https://access.redhat.com/solutions/39006

Run commands as root user.

semanage fcontext -a -t var_log_t '/opt/httpd/logs(/.*)?'
cat /etc/selinux/targeted/contexts/files/file_contexts.local
restorecon -Frvv /opt/httpd/logs

This will solve the problem of running logrotate automatically via cron.

 

VScode cannot be loaded because running scripts is disabled on this system

The terminal does not open in VSCode; it gives the error:

VScode cannot be loaded because running scripts is disabled on this system

Open VSCode and go to:
File > Preferences > Settings > Extensions > Edit settings.json

Add the following snippet at the start, right after the first curly brace {


"terminal.integrated.profiles.windows": {
  "PowerShell": {
    "source": "PowerShell",
    "icon": "terminal-powershell",
    "args": ["-ExecutionPolicy", "Bypass"]
  }
},
"terminal.integrated.defaultProfile.windows": "PowerShell",

Now restart VS Code, close the Terminal and open it again.

Git Bash: "Unable to get local issuer certificate"

If you are getting the above error while cloning a repository, add the following flag to the command and the error will go away:

 git -c http.sslVerify=false clone https://bitbucket.org/username/reponame.git
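Note that http.sslVerify=false disables certificate checking entirely, which is unsafe on untrusted networks. If the failure is caused by an internal CA, a safer alternative is to point git at that CA bundle via http.sslCAInfo; a sketch for ~/.gitconfig (the path is a placeholder for your actual bundle):

```ini
[http]
    sslCAInfo = /path/to/internal-ca-bundle.crt
```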

Monday, October 10, 2022

SAP BusinessObjects Business Intelligence Platform 4.2 : Installation stuck on Waiting for the Central Management Server (CMS) to start

SAP BusinessObjects Business Intelligence Platform 4.2 : 

Installation stuck on 

Waiting for the Central Management Server (CMS) to start

 Solution :

1. Check whether any other services are running on the default ports 6410, 6400, 8080, 8443, 6405, etc.

2. If no services are running but you still get the same error,

check the error file in the logging folder ${destination folder}/sap_bo/logging

If there are database related issues in the error log, check these posts.

https://linuxamination.blogspot.com/2022/10/sap-businessobjects-business.html


SAP BusinessObjects Business Intelligence Platform 4.2 : The CMS system database connection information provided does not have a valid format

 SAP BusinessObjects Business Intelligence Platform 4.2 Error :

33007 SAP BusinessObjects BI Platform CMS: The CMS system database connection information provided does not have a valid format. Please check that the CMS system database connection information is correct.

35101 The root server reported an error: Initialization Failure.

Reason : SAP BusinessObjects BI platform CMS: The CMS system database connection information  provided does not have a valid format. Please check that the CMS system database connection information is correct. Unable to parse connection string.

Solution :

Your DB user should have admin privileges; use a DBA user.

If you still get the same error, then while entering the DB details, choose option 1 to reset the database and do a fresh installation in an empty destination folder.


SAP BusinessObjects Business Intelligence Platform 4.2 CMS_INCREMENTIDWITHROLLOVER7 Invalid Identifier Error

While Installing SAP Business Object server, if you get following error :

33007 Database access error. Reason ORA-00904: CMS_INCREMENTIDWITHROLLOVER7 Invalid Identifier Error

35101 The root server reported an error: Initialization Failure

Reason: Database access error.  Reason ORA-00904: CMS_INCREMENTIDWITHROLLOVER7 Invalid Identifier Error

Solution :

Your Oracle user should have admin privileges; use a DBA user.

If you still get the same error, then while entering the DB details, choose option 1 to reset the database and do a fresh installation in an empty destination folder.


Saturday, September 17, 2022

Google API Generate Access Token and Refresh Token

Steps :

a) Generate client id and secret key.

b) Add Google logged in user as a Test user in consent screen.

c) Generate Authorization Code (One Time)

https://accounts.google.com/o/oauth2/auth?client_id=xxxxxxxx&redirect_uri=http://localhost&response_type=code&scope=https://www.googleapis.com/auth/drive&access_type=offline
d) Generate Refresh Token (One Time)
curl --request POST --data "code=xxxxxxxx&client_id=xxxxxxxxxxxx&client_secret=xxxxxxxxxxxx&redirect_uri=http://localhost&grant_type=authorization_code" https://oauth2.googleapis.com/token
e) Generate Access Token from Refresh token (Always)
curl --request POST --data "client_id=xxxxxxxxxxx&client_secret=xxxxxxxxxxxxxxx&refresh_token=xxxxxxxxxxxxx&grant_type=refresh_token" https://oauth2.googleapis.com/token

Note : Refresh tokens expire in 1 week if your app is not set to production. Change the publishing status of your app from Testing to Production to keep using your refresh token.

The Publishing Status option can be found on the 'OAuth Consent Screen', under APIs & Services.


 

 


Microsoft API - Get Access Token and Refresh Token

1. Log into Azure Portal Active Directory.

2. Register your application.

3. Create secret for your application.

4. Find your Client ID and Tenant ID.

5. Generate Authorization  Code. (One Time)

https://login.microsoftonline.com/{Tenant ID}/oauth2/v2.0/authorize?client_id={AppReg ID}&response_type=code&redirect_uri=http%3a%2f%2flocalhost%3a8080&response_mode=query&scope={AppReg ID}%2f.default&state=12345&sso_reload=true

6. Save the redirect URL; it contains the authorization response.

7. Generate Refresh Token. (One Time)

curl -X POST \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d 'client_id={AppReg ID}&scope={AppReg ID}%2f.default openid profile offline_access&code={authorization code}&redirect_uri=http%3A%2F%2Flocalhost%3a8080&grant_type=authorization_code&client_secret={AppReg Secret}' \
  'https://login.microsoftonline.com/{Tenant ID}/oauth2/v2.0/token'

8. Generate Access Token from Refresh Token (Every Time)

curl --location --request POST 'https://login.microsoftonline.com/{Tenant ID}/oauth2/v2.0/token' \
  --header 'Content-Type: application/x-www-form-urlencoded' \
  --data-urlencode 'client_id={AppReg ID}' \
  --data-urlencode 'scope={scope returned in previous request}' \
  --data-urlencode 'refresh_token={Refresh Token}' \
  --data-urlencode 'grant_type=refresh_token' \
  --data-urlencode 'client_secret={AppReg Secret}'
 




Friday, September 9, 2022

Error Telethon - TypeError: 'ChannelParticipants' object is not subscriptable

When you are trying to export members of a Telegram public group using python script and you get following error

Traceback (most recent call last):
  File "t1_exportgroup.py", line 49, in <module>
    all_participants = client.get_participants(target_group, aggressive=False)
  File "/home/user/virtualenvs/telegram/lib/python3.8/site-packages/telethon/requestiter.py", line 74, in __anext__
    if await self._load_next_chunk():
  File "/home/user/virtualenvs/telegram/lib/python3.8/site-packages/telethon/client/chats.py", line 223, in _load_next_chunk
    participants = results[i]
TypeError: 'ChannelParticipants' object is not subscriptable

Solution :

You are getting this error because the group has more than 5K or 6K members; you need to provide a limit in your python script.

all_participants = []
all_participants = client.get_participants(target_group, aggressive=False, limit=5000)

Now try again; you should be able to export up to 5000 members.

 Watch the video to see the demo :


 

 

Monday, June 13, 2022

MongoDB Load testing Using JMeter

Database performance testing is an important aspect of JMeter. It supports not only RDBMS performance testing but also NoSQL databases like MongoDB.

We have added a JMeter test plan jmx file and other dependent files like database dump in the following git repository.

https://github.com/linuxamination/jmeter-mongodb

Here is the complete tutorial of the process. Follow the steps and you will be able to perform MongoDB Load Testing on your own.

 


 


Saturday, May 14, 2022

Genymotion install apk error - An error occurred while deploying the file | Failed to extract native libraries, res=-113

If you are trying to deploy an android application on Genymotion Android Emulator and you are getting following error, this solution is for you.

An error occurred while deploying the file.
This probably means that the app contains ARM native code and your Genymotion device cannot run ARM instructions. You should either build your native code to x86 or install an ARM translation tool in your device.

1. First start Genymotion virtual Android Device and connect your Genymotion Android Device to adb. Follow this link

2. Once it is connected to adb, run command

adb shell getprop | grep ro.product.cpu.abi

3. It shows the list of architectures supported by the device.


If your app targets one of these architectures, you can install the app on the device.

4. Now how can you find the architecture of the app?

An apk is a compressed file; you can extract its contents and open the lib directory, which lists the architectures supported by the apk as folders.

 


5. Now this apk has the armeabi-v7a architecture, which is not supported by the device by default, but Genymotion provides ARM translation packages as a patch to install such apps on the device.

6. To install ARM Translation package, open link

https://github.com/m9rco/Genymotion_ARM_Translation/tree/master/package

and download the package matching the Android version of your virtual device. Now drag and drop the downloaded package onto the device. Once the ARM translation package is installed successfully, turn the device off and boot it again.

If your device is not getting started, turn off the Genymotion software completely, kill all the processes related to it and VBox and start again.

7. Now, once the device has started successfully, check the architectures supported by the device using the same command.


You will see that the device can now install apps with the existing architecture, i.e. x86, as well as apks with armeabi and armeabi-v7a.

8. If you are still getting same error

adb: failed to install h.apk: Failure [INSTALL_FAILED_NO_MATCHING_ABIS: Failed to extract native libraries, res=-113]

Your app architecture might be arm64 / arm64-v8a

Currently, Genymotion Desktop does not support applications for aarch64/arm64, even with ARM translation tools.

However, they offer Genymotion Device (PaaS) images for Android 8.0 (Oreo), 9.0 (Pie), 10 and 11, which run on ARM64 virtual machines at AWS and Oracle Cloud Infrastructure (Android 8.0 only).

Sunday, May 8, 2022

icici bank app iMobile Pay UPI error - Mobile number and profile id doesn't belong to same user

If you are trying to perform UPI transactions using the icici bank app iMobile Pay, and it does not allow you to transfer money using UPI and you get the following error:

Mobile number and profile id does not belong to same user

This is the solution for you.

Solution : 1

1. Close the app. Update iMobile Pay from Google Play Store.

Open the settings of your mobile and clear only the cache of the iMobile Pay app.

2. Now open the app again and visit the UPI option. It should solve the issue.

You can try restarting your mobile too.

If you still get the same error when touching any UPI option like 'Send', 'Pay to Contact', 'Transaction History', 'Collect', 'Scan any QR' etc., follow solution 2.

Solution : 2

1. Make sure your registered mobile number is active and you are able to send and receive sms.

2. Close the iMobile Pay app. Open settings and clear all data of the iMobile Pay app.

3. Open the app again. Re-register yourself on the app by selecting the SIM of your registered mobile number (if dual SIM).

4. Select your account number from the drop-down and confirm the code of alphabets written on the back side of your debit card.

5. Once you are registered successfully, visit the BHIM UPI section in the app and try to open the options 'Send', 'Pay to Contact', 'Transaction History', 'Collect', 'Scan any QR' etc.

Now you should be able to transfer money using UPI.

You can try restarting your mobile too.

If you are still facing the same issue, write an email to icici customer care at customer.care@icicibank.com or call them on their customer support number.



Thursday, May 5, 2022

Telegram Export Group Members more than 10000

FloodWaitError - A wait of 30 seconds is required / Telegram Export Group Members more than 10000

The aggressive=True attribute was used to export more than 10K group members.

This attribute was used in the following line of code of Telegram export group members python script.

all_participants = client.get_participants(target_group, aggressive=True)
But recently Telethon has started giving a FloodWait error with this attribute.
If we remove it or set it to False, we can download members, but fewer than 10K.

all_participants = client.get_participants(target_group, aggressive=False)

Removing the attribute or setting it to False can export the members of a group with fewer than 10K members, but it gives the following error for groups with more than 10K members:

TypeError: 'ChannelParticipants' object is not subscriptable
The github page for this issue suggests a solution: upgrade your Telethon package to 1.24.0 using the following command.

python3 -m pip install --upgrade https://github.com/LonamiWebs/Telethon/archive/master.zip
But after upgrading Telethon, some users have started getting this new error:
Traceback (most recent call last):
  File "t1_exportgroup.py", line 1, in <module>
    from telethon.sync import TelegramClient
  File "/home/pavi/virtualenvs/telegram/lib/python3.8/site-packages/telethon/__init__.py", line 4, in <module>
    from ._misc import utils as _  # depends on helpers and _tl
  File "/home/pavi/virtualenvs/telegram/lib/python3.8/site-packages/telethon/_misc/utils.py", line 23, in <module>
    from . import markdown, html
  File "/home/pavi/virtualenvs/telegram/lib/python3.8/site-packages/telethon/_misc/markdown.py", line 8, in <module>
    import markdown_it
ModuleNotFoundError: No module named 'markdown_it'
A solution was also provided for that error: install the following packages.

pip install markdown-it-py~=1.1.0
pip install pyaes~=1.6.1
pip install rsa~=4.7.2
After applying this solution, the above error is gone, but another one appears:
Traceback (most recent call last):
  File "t1_exportgroup.py", line 1, in <module>
    from telethon.sync import TelegramClient
ModuleNotFoundError: No module named 'telethon.sync'

Telethon 2 has removed sync module. If your export script is showing this error, you need to change the script according to the changes suggested in this version 2 migration guide.

Friday, April 15, 2022

Telegram - Import Members into a Channel

There are two conditions to Import / Export members of a telegram channel.

1) You should be an administrator or owner of the channel, then only you will be able to import or export members.

2) If a member has selected the 'My Contacts' option in 'Who can add you to groups and channels', then you must be in his/her contact list; only then can you import him/her into the channel.

To import channel members, first generate an api id and api hash in your telegram account. To do this, open my.telegram.org and log in using your registered phone number.
Now open the link 'API Development Tools' and copy your api id and hash.
We will use the api id and hash in the Python script.

 Create virtual environment with python 3 and install Telethon using pip.

virtualenv telegram -p /usr/bin/python3
source telegram/bin/activate
pip install Telethon==1.23.0
Update API ID, API hash, telegram registered phone number and channel username(it can be found on the channel description page) in the following python script import.py.

from telethon.sync import TelegramClient
from telethon.tl.functions.messages import GetDialogsRequest
from telethon.tl.types import InputPeerEmpty, InputPeerChannel, InputPeerUser
from telethon.errors.rpcerrorlist import PeerFloodError, UserPrivacyRestrictedError
from telethon.tl.functions.channels import InviteToChannelRequest
import sys
import csv
import traceback
import time

api_id = 9999999
api_hash = 'zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz'

client = TelegramClient('session_name', api_id, api_hash)
client.start()

channel = client.get_entity('pythonlovers07')
print(channel)

input_file = 'members.csv'
users = []
with open(input_file, encoding='UTF-8') as f:
    rows = csv.reader(f,delimiter=",",lineterminator="\n")
    next(rows, None)
    for row in rows:
        user = {}
        user['username'] = row[0]
        user['id'] = int(row[1])
        user['access_hash'] = int(row[2])
        user['name'] = row[3]
        users.append(user)
        
 
mode = int(input("Enter 1 to add by username or 2 to add by ID: "))

for user in users:
    try:
        print ("Adding {}".format(user['id']))
        if mode == 1:
            if user['username'] == "":
                continue
            user_to_add = client.get_input_entity(user['username'])
            print(user['username'])
        elif mode == 2:
            user_to_add = InputPeerUser(user['id'], user['access_hash'])
        else:
            sys.exit("Invalid Mode Selected. Please Try Again.")
        target_group = 'pythonlovers07'
        target_group_entity = InputPeerChannel(channel.id,channel.access_hash)
        print(target_group_entity)
        print(user_to_add)               
        client(InviteToChannelRequest(target_group_entity,[user_to_add]))
        print("Waiting 60 Seconds...")
        time.sleep(60)
    except PeerFloodError:
        print("Getting Flood Error from telegram. Script is stopping now. Please try again after some time.")
        break
    except UserPrivacyRestrictedError:
        print("The user's privacy settings do not allow you to do this. Skipping.")
    except:
        traceback.print_exc()
        print("Unexpected Error")
        continue
Run the Python script in the same virtual env.

python import.py

members.csv should be in the same directory from which you run the script. It contains all the exported channel members.

The script adds each member from members.csv to the channel configured at the top of the script.
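The parsing loop in the script expects members.csv to have a header row followed by username, user id, access hash and name columns. The same parsing logic as a self-contained sketch (the sample rows below are made up):

```python
import csv
import io

# Hypothetical sample matching the expected members.csv layout
sample = (
    "username,user id,access hash,name\n"
    "alice01,111111,222222,Alice A\n"
    ",333333,444444,Bob B\n"  # empty username: mode 1 would skip this row
)

users = []
rows = csv.reader(io.StringIO(sample), delimiter=",", lineterminator="\n")
next(rows, None)  # skip the header row
for row in rows:
    users.append({
        'username': row[0],
        'id': int(row[1]),
        'access_hash': int(row[2]),
        'name': row[3],
    })

print(users[0]['username'])         # alice01
print(users[1]['username'] == "")   # True -> skipped when adding by username
```

If a row has a non-numeric id or access hash, int() will raise a ValueError, so a malformed export file fails fast instead of silently adding the wrong user.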

If you want to export telegram channel members, check other post of the blog.

https://linuxamination.blogspot.com/2022/04/telegram-export-members-of-channel.html






       

Telegram - Export Members of a Channel

There are two conditions to import or export members of a Telegram channel.

1) You must be an administrator or owner of the channel; only then can you import or export members.

2) If a member has selected the ‘My Contacts’ option under ‘Who can add you to groups and channels’, you must be in his/her contact list; only then can you add him/her to the channel.

To export channel members, first generate an API ID and API hash for your Telegram account. To do this, open my.telegram.org and log in with your registered phone number.
Then open ‘API development tools’ and copy your API ID and API hash.
We will use them in the Python script.

Create a virtual environment with Python 3 and install Telethon using pip.

virtualenv telegram -p /usr/bin/python3
source telegram/bin/activate
pip install Telethon==1.23.0

Update the API ID, API hash, your registered phone number and the username of the channel whose members you want to export (it can be found on the channel description page) in the following Python script, export.py.

from telethon import TelegramClient, sync

api_id = 9999999
api_hash = 'zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz'
phone_number = '+919999999999'
channel_username = 'rajcomicsoldads'

client = TelegramClient(phone_number, api_id, api_hash).start()

# get all the channels that I can access
channels = {d.entity.username: d.entity
            for d in client.get_dialogs()
            if d.is_channel}

# choose the one that I want list users from
channel = channels[channel_username]
#print(client.get_participants(channel))
# get all the users and print them
print('username', ",", 'user id', ",", 'access hash', ",", 'name', sep='')
for u in client.get_participants(channel):
    # Guard against None fields so the CSV output stays clean
    name = ((u.first_name or '') + ' ' + (u.last_name or '')).strip()
    print(u.username or '', ",", u.id, ",", u.access_hash, ",", name, sep='')

Run the Python script in the same virtual env.

python export.py

It will connect (asking for a login code on the first run) and fetch the members of the channel set in channel_username.

It prints all members of the channel in CSV format. Copy this output into the file members.csv.
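Instead of copying the printed output into members.csv by hand, the loop could write the file directly with the csv module. A sketch with stand-in user objects (with Telethon you would iterate client.get_participants(channel) instead):

```python
import csv
from types import SimpleNamespace

# Stand-ins for Telethon user objects; real ones come from client.get_participants()
participants = [
    SimpleNamespace(username='alice01', id=111111, access_hash=222222,
                    first_name='Alice', last_name='A'),
    SimpleNamespace(username=None, id=333333, access_hash=444444,
                    first_name='Bob', last_name=None),
]

with open('members.csv', 'w', encoding='UTF-8') as f:
    writer = csv.writer(f, delimiter=",", lineterminator="\n")
    writer.writerow(['username', 'user id', 'access hash', 'name'])
    for u in participants:
        # Users with hidden usernames or missing names come back as None
        name = ((u.first_name or '') + ' ' + (u.last_name or '')).strip()
        writer.writerow([u.username or '', u.id, u.access_hash, name])
```

Using csv.writer also takes care of quoting, so a name containing a comma will not break the column layout the way raw print output can.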

Telegram Import Channel Members :

 https://linuxamination.blogspot.com/2022/03/telegram-import-members-of-channel.html

Telegram - Import Members into a Public Group

 

There are two conditions to import members into a Telegram public group.

    1) It must be a public group, but you do not need to be an administrator or owner: you can import or export the members of any public group as long as you are a member of it.

    2) If a member has selected the ‘My Contacts’ option under ‘Who can add you to groups and channels’, you must be in his/her contact list; only then can you import him/her into the group.

To import group members, first generate an API ID and API hash for your Telegram account. To do this, open my.telegram.org and log in with your registered phone number.
Then open ‘API development tools’ and copy your API ID and API hash.
We will use them in the Python script.

Create a virtual environment with Python 3 and install Telethon using pip.

virtualenv telegram -p /usr/bin/python3
source telegram/bin/activate
pip install Telethon==1.23.0

Update the API ID, API hash and your registered phone number in the following Python script, import.py.

from telethon.sync import TelegramClient
from telethon.tl.functions.messages import GetDialogsRequest
from telethon.tl.types import InputPeerEmpty, InputPeerChannel, InputPeerUser
from telethon.errors.rpcerrorlist import PeerFloodError, UserPrivacyRestrictedError
from telethon.tl.functions.channels import InviteToChannelRequest
import sys
import csv
import traceback
import time

api_id = 9999999
api_hash = 'zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz'
phone = '+919999999999'
client = TelegramClient(phone, api_id, api_hash)

client.connect()
if not client.is_user_authorized():
    client.send_code_request(phone)
    client.sign_in(phone, input('Enter the code: '))

input_file = sys.argv[1]
users = []
with open(input_file, encoding='UTF-8') as f:
    rows = csv.reader(f,delimiter=",",lineterminator="\n")
    next(rows, None)
    for row in rows:
        user = {}
        user['username'] = row[0]
        user['id'] = int(row[1])
        user['access_hash'] = int(row[2])
        user['name'] = row[3]
        users.append(user)

chats = []
last_date = None
chunk_size = 200
groups=[]

result = client(GetDialogsRequest(
             offset_date=last_date,
             offset_id=0,
             offset_peer=InputPeerEmpty(),
             limit=chunk_size,
             hash = 0
         ))
chats.extend(result.chats)

for chat in chats:
    try:
        if chat.megagroup:
            groups.append(chat)
    except:
        continue
print('Choose a group to add members:')
i=0
for group in groups:
    print(str(i) + '- ' + group.title)
    i+=1
g_index = input("Enter a Number: ")
target_group=groups[int(g_index)]

target_group_entity = InputPeerChannel(target_group.id,target_group.access_hash)
print('======')
mode = int(input("Enter 1 to add by username or 2 to add by ID: "))

for user in users:
    try:
        print ("Adding {}".format(user['id']))
        if mode == 1:
            if user['username'] == "":
                continue
            user_to_add = client.get_input_entity(user['username'])
            print(user['username'])
        elif mode == 2:
            user_to_add = InputPeerUser(user['id'], user['access_hash'])
            print(user_to_add)
        else:
            sys.exit("Invalid Mode Selected. Please Try Again.")
        print(target_group_entity)
        client(InviteToChannelRequest(target_group_entity,[user_to_add]))
        print("Waiting 60 Seconds...")
        time.sleep(60)
    except PeerFloodError:
        print("Getting a flood error from Telegram. The script is stopping now. Please try again after some time.")
        break
    except UserPrivacyRestrictedError:
        print("The user's privacy settings do not allow you to do this. Skipping.")
    except:
        traceback.print_exc()
        print("Unexpected Error")
        continue
Run the Python script in the same virtual env.

python import.py members.csv

Pass members.csv as an argument. It contains all the exported group members and should be in the same directory from which you run the script.

It will list all your public groups. Choose the number of the group into which you want to import members. The members will then be imported into the group.
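The group menu in the script is just an enumerated list indexed by the number you type. The same selection logic as a tiny self-contained helper (the group titles below are made up):

```python
def choose_group(titles, answer):
    """Mimic the script's menu: print numbered titles, return the chosen one."""
    for i, title in enumerate(titles):
        print(str(i) + '- ' + title)
    return titles[int(answer)]

# Hypothetical group titles; the real script fills these from GetDialogsRequest
picked = choose_group(['python lovers', 'linux users'], '1')
print(picked)  # linux users
```

Typing a number outside the printed range raises an IndexError, which is also how the original script behaves.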


If you want to export telegram group members, check other post of the blog.

https://linuxamination.blogspot.com/2022/04/telegram-export-members-of-public-group.html 

Telegram - Export Members of a Public Group

There are two conditions to export the members of a Telegram public group.

    1) It must be a public group, but you do not need to be an administrator or owner: you can import or export the members of any public group as long as you are a member of it.

    2) If a member has selected the ‘My Contacts’ option under ‘Who can add you to groups and channels’, you must be in his/her contact list; only then can you import him/her into a group.

To export group members, first generate an API ID and API hash for your Telegram account. To do this, open my.telegram.org and log in with your registered phone number.
Then open ‘API development tools’ and copy your API ID and API hash.
We will use them in the Python script.

Create a virtual environment with Python 3 and install Telethon using pip.

virtualenv telegram -p /usr/bin/python3
source telegram/bin/activate
pip install Telethon==1.23.0

Update the API ID, API hash and your registered phone number in the following Python script, export.py.

from telethon.sync import TelegramClient
from telethon.tl.functions.messages import GetDialogsRequest
from telethon.tl.types import InputPeerEmpty
import csv

api_id = 9999999
api_hash = 'zzzzzzzzzzzzzzzzzzzzzzzzzzzzzzzz'
phone = '+919999999999'
client = TelegramClient(phone, api_id, api_hash)

client.connect()
if not client.is_user_authorized():
    client.send_code_request(phone)
    client.sign_in(phone, input('Enter the code: '))


chats = []
last_date = None
chunk_size = 200
groups=[]
 
result = client(GetDialogsRequest(
             offset_date=last_date,
             offset_id=0,
             offset_peer=InputPeerEmpty(),
             limit=chunk_size,
             hash = 0
         ))
chats.extend(result.chats)

for chat in chats:
    try:
        if chat.megagroup:
            groups.append(chat)
    except:
        continue

print('Choose a group to scrape members from:')
i=0
for g in groups:
    print(str(i) + '- ' + g.title)
    i+=1

g_index = input("Enter a Number: ")
target_group=groups[int(g_index)]

print('Fetching Members...')
all_participants = []
all_participants = client.get_participants(target_group)
print('Saving In file...')
with open("members.csv","w",encoding='UTF-8') as f:
    writer = csv.writer(f,delimiter=",",lineterminator="\n")
    writer.writerow(['username','user id', 'access hash','name','group', 'group id'])
    for user in all_participants:
        if user.username:
            username= user.username
        else:
            username= ""
        if user.first_name:
            first_name= user.first_name
        else:
            first_name= ""
        if user.last_name:
            last_name= user.last_name
        else:
            last_name= ""
        name= (first_name + ' ' + last_name).strip()
        writer.writerow([username,user.id,user.access_hash,name,target_group.title, target_group.id])      
print('Members scraped successfully.')

Run the Python script in the same virtual env.

python export.py

It will list all your public groups. Choose the number of the group whose members you want to export.

It will export all group members by creating members.csv in the same directory.

cat members.csv


Telegram Import Group Members : 

https://linuxamination.blogspot.com/2022/04/telegram-import-members-of-public-group.html

Saturday, April 2, 2022

Solve FloodWaitError in Telegram Export Group Python Script

Your export-group Python script was working fine, but recently you tried to export the members of one of your public groups and it showed the following error.

telethon.errors.common.MultiError: ([None, FloodWaitError('A wait of 30 seconds is required (caused by GetParticipantsRequest)')
Solution :

You need to make one change in your export script.

Remove 

aggressive=True

from the line

all_participants = client.get_participants(target_group, aggressive=True)

The line should now look like this:

all_participants = client.get_participants(target_group)

and you will no longer get the following FloodWaitError.

Traceback (most recent call last):
  File "exportgroup.py", line 49, in <module>
    all_participants = client.get_participants(target_group, aggressive=True)
  File "virtualenvs/telegram/lib/python3.8/site-packages/telethon/sync.py", line 39, in syncified
    return loop.run_until_complete(coro)
  File "/usr/lib/python3.8/asyncio/base_events.py", line 616, in run_until_complete
    return future.result()
  File "virtualenvs/telegram/lib/python3.8/site-packages/telethon/client/chats.py", line 503, in get_participants
    return await self.iter_participants(*args, **kwargs).collect()
  File "virtualenvs/telegram/lib/python3.8/site-packages/telethon/requestiter.py", line 113, in collect
    async for message in self:
  File "virtualenvs/telegram/lib/python3.8/site-packages/telethon/requestiter.py", line 74, in __anext__
    if await self._load_next_chunk():
  File "virtualenvs/telegram/lib/python3.8/site-packages/telethon/client/chats.py", line 221, in _load_next_chunk
    results = await self.client(self.requests)
  File "virtualenvs/telegram/lib/python3.8/site-packages/telethon/client/users.py", line 30, in __call__
    return await self._call(self._sender, request, ordered=ordered)
  File "virtualenvs/telegram/lib/python3.8/site-packages/telethon/client/users.py", line 75, in _call
    raise MultiError(exceptions, results, requests)
telethon.errors.common.MultiError: ([None, FloodWaitError('A wait of 30 seconds is required (caused by GetParticipantsRequest)'), FloodWaitError('A wait of 30 seconds is required (caused by GetParticipantsRequest)')
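An alternative to removing aggressive=True is to catch the FloodWaitError and wait for the number of seconds Telegram asks for: Telethon's FloodWaitError carries a seconds attribute. A sketch of that retry pattern, using a stand-in exception class so the example runs without Telethon:

```python
import time

class FloodWaitError(Exception):
    """Stand-in for telethon.errors.FloodWaitError, which carries .seconds."""
    def __init__(self, seconds):
        super().__init__(f'A wait of {seconds} seconds is required')
        self.seconds = seconds

def call_with_flood_wait(func, max_retries=3):
    """Retry func, sleeping for the server-requested delay on flood errors."""
    for attempt in range(max_retries):
        try:
            return func()
        except FloodWaitError as e:
            print(f'Flood wait: sleeping {e.seconds}s (attempt {attempt + 1})')
            time.sleep(e.seconds)
    raise RuntimeError('Still flood-limited after retries')

# Simulated request that fails twice with a zero-second wait, then succeeds
attempts = []
def fake_get_participants():
    attempts.append(1)
    if len(attempts) < 3:
        raise FloodWaitError(0)
    return ['member1', 'member2']

result = call_with_flood_wait(fake_get_participants)
print(result)  # ['member1', 'member2']
```

In the real script you would wrap client.get_participants(target_group) the same way, catching telethon.errors.FloodWaitError instead of the stand-in class.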

 
