Wednesday, February 9, 2022

Publish message to AWS SQS queue by NodeJs

Here, we will discuss how to publish messages to an AWS SQS queue from Node.js. To implement it, we have to follow the steps below:

  1. Create a user in AWS and note down its access key ID and secret access key
  2. Create an SQS queue and note down its URL
  3. Save the user credentials in a shared file on your local machine
  4. Install the aws-sdk library
  5. Create a simple Node Express application with routes for publishing messages to SQS
Now, let's discuss each step one by one.
1) Create a user in AWS and note down its access key ID and secret access key: I have discussed this step in detail in my previous blog post under the heading "File handling in AWS S3 by Node Js". I am giving the link below; please go through it.


2) Create an SQS queue and note down its URL: First, open the AWS console, go to the SQS service and click the "Create Queue" button. Now, add the queue name and keep the other settings at their default values. Then click the "Create Queue" button at the bottom.
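For reference, the queue URL you note down follows a fixed pattern (the account ID and queue name below are placeholders, not real values):

```
https://sqs.<region>.amazonaws.com/<account-id>/<queue-name>
e.g. https://sqs.us-east-1.amazonaws.com/123456789012/MyDemoQueue
```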





3) Save the user credentials in a shared file on your local machine: In this step, we save the user credentials so that the Node.js application can read them while establishing a connection with AWS. We can refer to the AWS documentation below for this step:
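As a quick sketch, the shared credentials file the AWS SDK looks for by default lives at ~/.aws/credentials on Linux/macOS (or C:\Users\USERNAME\.aws\credentials on Windows), and the keys below are placeholders for the values you noted down in step 1:

```ini
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = YOUR_SECRET_ACCESS_KEY
```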


4) Install the aws-sdk library: Create a simple Node.js application and install the package with a command like npm install aws-sdk

5) Create a simple Node Express application with routes for publishing messages to SQS: Here, we create a very simple Node Express application with two routes:
a) "/addMessage": used to add messages to the SQS queue.
b) "/getMessage": used to get messages from the SQS queue.
Here is the code for your reference:

const express = require('express');
const dotenv = require('dotenv');
const path = require('path');
const AWS = require('aws-sdk');

AWS.config.update({region: 'us-east-1'});
const sqs = new AWS.SQS();

const app = express();
app.use(express.json());

dotenv.config({
  path: path.join(__dirname, './.env')
});

app.listen(8000, () => {
  console.log(`Listening port 8000`);
});

// Add a message to the SQS queue
app.post('/addMessage', async (req, res) => {
  try {
    const params = {
      MessageBody: req.body.MessageBody,
      QueueUrl: process.env.SQS_URL // the queue URL noted down in step 2
    };
    const result = await sqs.sendMessage(params).promise();
    res.send(result);
  } catch (error) {
    console.log(error);
    // Send an error response instead of re-throwing, which would leave the request hanging
    res.status(500).send(error);
  }
});

// Receive messages from the SQS queue
app.get('/getMessage', async (req, res) => {
  try {
    const params = {
      QueueUrl: process.env.SQS_URL
    };
    const result = await sqs.receiveMessage(params).promise();
    res.send(result);
  } catch (error) {
    console.log(error);
    res.status(500).send(error);
  }
});
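One detail worth adding to the /getMessage flow: receiving a message does not remove it from the queue. Unless the message is deleted within the visibility timeout, SQS will deliver it again on a later receive. A minimal sketch of the delete step (buildDeleteParams is my own helper name, not part of the SDK):

```javascript
// Build the params for sqs.deleteMessage() from a received message.
// Each received message carries a ReceiptHandle identifying this delivery.
function buildDeleteParams(queueUrl, message) {
  return {
    QueueUrl: queueUrl,
    ReceiptHandle: message.ReceiptHandle
  };
}

// Inside the /getMessage handler, after receiveMessage resolves,
// something like the following would acknowledge the messages:
//   for (const m of result.Messages || []) {
//     await sqs.deleteMessage(buildDeleteParams(process.env.SQS_URL, m)).promise();
//   }
```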





Monday, February 7, 2022

Publish and subscribe to AWS SNS topic through NodeJS

Today, we are going to discuss how to publish a message to an AWS SNS topic, and how to subscribe to the topic so that subscribers receive the messages published to it. We should follow the steps below:

  1. Create a user and user group and note down the accessKeyId and secretAccessKey
  2. Assign the user the required permissions for SNS handling
  3. Create a topic and note down its ARN
  4. Install the aws-sdk library
  5. Save the user credentials in a shared file on your local machine
  6. Create a simple Node Express application and create routes for publish and subscribe
Now, let's discuss each step here:
1) Create a user and user group and note down the accessKeyId and secretAccessKey: I have discussed this step in detail in my previous blog post under the heading "File handling in AWS S3 by Node Js". I am giving the link below; please go through it.


2) Assign the user the required permissions for SNS handling: Here, we can either create a custom policy for SNS and assign it to the group the user belongs to, or assign a default SNS policy already provided by AWS. Here, I am going to assign the default SNS policy.
First, go to the IAM service in the AWS console and click on the name of the user you want to grant permissions to.


Now, click the "Add permissions" button and then "Attach existing policies directly". Search for SNS and select "AmazonSNSFullAccess".
Important note: this policy selection is for demo purposes only. In real projects, we should grant a user the least access required for any service. Always avoid granting full access.


3) Create a topic and note down its ARN: Now, go to the SNS service in the AWS console and click the Topics tab on the left side. Then click the "Create topic" button. In the create-topic window, select the "Standard" radio button and add your topic name and display name. You can keep the rest of the fields at their defaults.

Now your topic will be created. Note down the ARN of the topic; it is going to be used in the Node.js code.
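For reference, a topic ARN follows a fixed pattern (the account ID and topic name below are placeholders, not real values):

```
arn:aws:sns:<region>:<account-id>:<topic-name>
e.g. arn:aws:sns:us-east-1:123456789012:MyDemoTopic
```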

4) Install the aws-sdk library: Use a command like npm install aws-sdk

5) Save the user credentials in a shared file on your local machine: In this step, we save the user credentials so that the Node.js application can read them while establishing a connection with AWS. We can refer to the AWS documentation below for this step:


6) Create a simple Node Express application and create routes for publish and subscribe: Here, we create a Node.js Express application and expose two routes.
The first route subscribes an email address to the SNS topic, identified in the code by the topic's ARN. When the route is triggered, SNS subscribes the given email address to the topic. AWS then sends a confirmation email to that address; it contains a link you must click to complete the subscription.
The second route publishes a message to the SNS topic, and SNS forwards the message to all confirmed subscribers. Here, we provide the Subject and the Message body as part of the request.
Here is the code for your reference:

const express = require('express');
const dotenv = require('dotenv');
const path = require('path');
const AWS = require('aws-sdk');

AWS.config.update({region: 'us-east-1'});
const sns = new AWS.SNS();

const app = express();
app.use(express.json());

dotenv.config({
  path: path.join(__dirname, './.env')
});

app.listen(8000, () => {
  console.log(`Listening port 8000`);
});


// Debug route: returns the SNS client details (for local inspection only)
app.get('/mysns', (req, res) => res.send({ status: 'Ok', sns }));

// Subscribe an email address to the SNS topic
app.post('/subscribe', async (req, res) => {
  try {
    const params = {
      Protocol: 'email', // SNS expects the protocol name in lowercase
      TopicArn: process.env.TOPIC_ARN,
      Endpoint: req.body.email
    };
    const result = await sns.subscribe(params).promise();
    res.send(result);
  } catch (error) {
    console.log(error);
    // Send an error response instead of re-throwing, which would leave the request hanging
    res.status(500).send(error);
  }
});

// Publish a message to the SNS topic
app.post('/publish', async (req, res) => {
  try {
    const params = {
      Subject: req.body.subject,
      TopicArn: process.env.TOPIC_ARN,
      Message: req.body.message
    };
    const result = await sns.publish(params).promise();
    res.send(result);
  } catch (error) {
    console.log(error);
    res.status(500).send(error);
  }
});
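The code above reads TOPIC_ARN through dotenv, so the .env file next to the app would contain something like the following (the ARN is a placeholder; use the one you noted down in step 3):

```
TOPIC_ARN=arn:aws:sns:us-east-1:123456789012:MyDemoTopic
```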



 






Sunday, February 6, 2022

File handling in AWS S3 by Node Js

Today, I am going to discuss file-handling operations with an S3 bucket in AWS: uploading a file, listing all the files, deleting a file and downloading a file. To perform these operations, we should follow the steps below:

  1. Create a user in the IAM service in AWS
  2. Create a bucket in the S3 service in AWS
  3. Save the user credentials in a shared file on your local machine
  4. Create a Node.js application
  5. Install the aws-sdk package
  6. Create routes for handling files
Now, let's discuss each step here.
1) Create a user in the IAM service in AWS: Go to the IAM service in AWS and click the "Add users" button.


Add the user name, select the checkbox for "Access key - Programmatic access" and then click "Next: Permissions".

Click on the "Create group" button and then add a group name and then click on Create group button.

Now, on the group list page, click your newly created group and then click the Permissions tab. Open the "Add Permissions" dropdown and click "Attach Policy".


Now, click "Create Policy" and select S3 as the service. Then select the checkboxes for Read, List and Write.


Up to here, your user and the required group and policy have been created. Now, note down the user's access key ID and secret access key. These two keys are going to be used in the Node.js code.

2) Create a bucket in the S3 service in AWS: Now, go to the S3 service in AWS and create a bucket with a globally unique name.




3) Save the user credentials in a shared file on your local machine: In this step, we save the user credentials so that the Node.js application can read them while establishing a connection with AWS. We can refer to the AWS documentation below for this step:


4) Create a Node.js application: Here, create a very simple Node.js application with Express. The code is given below.

5) Install the aws-sdk package: In your Node.js application, install the aws-sdk package with a command like npm install aws-sdk

6) Create routes for handling files: Below is the code for the different routes: uploading a file to S3, listing all the files in the bucket, deleting a file and downloading a file.

const express = require('express');
const dotenv = require('dotenv');
const path = require('path');
const multer = require('multer');
const AWS = require('aws-sdk');

// The region can also come from the shared AWS config file; set it explicitly here
AWS.config.update({region: 'us-east-1'});

const s3 = new AWS.S3();
const app = express();
const upload = multer(); // memory storage: the uploaded file is available as req.file.buffer
dotenv.config({
  path: path.join(__dirname, './.env')
});

app.listen(8000);

// Creating routes
// Upload a file to S3
app.post('/upload', upload.single('file'), async (req, res) => {
  try {
    const result = await s3.putObject({
      Body: req.file.buffer,
      Bucket: process.env.BUCKET_NAME,
      Key: req.file.originalname
    }).promise();
    res.send(result);
  } catch (error) {
    console.log(error);
    // Send an error response instead of re-throwing, which would leave the request hanging
    res.status(500).send(error);
  }
});

// List the uploaded files
app.get('/fileList', async (req, res) => {
  try {
    const result = await s3.listObjectsV2({
      Bucket: process.env.BUCKET_NAME
    }).promise();
    res.send(result);
  } catch (error) {
    console.log(error);
    res.status(500).send(error);
  }
});

// Delete a file
app.delete('/deleteFile/:fileName', async (req, res) => {
  try {
    const result = await s3.deleteObject({
      Bucket: process.env.BUCKET_NAME,
      Key: req.params.fileName
    }).promise();
    res.send(result);
  } catch (error) {
    console.log(error);
    res.status(500).send(error);
  }
});

// Download a file
app.get('/downloadFile/:fileName', async (req, res) => {
  try {
    const result = await s3.getObject({
      Bucket: process.env.BUCKET_NAME,
      Key: req.params.fileName
    }).promise();
    res.send(result.Body);
  } catch (error) {
    console.log(error);
    res.status(500).send(error);
  }
});
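Since the routes read process.env.BUCKET_NAME via dotenv, the .env file next to the app would hold something like the following (the value is a placeholder; use the bucket name you created in step 2):

```
BUCKET_NAME=my-demo-bucket-name
```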














Tuesday, February 1, 2022

Cluster management in node js application by PM2 Lib

In my previous blog post, we discussed how to manually manage a cluster in a Node.js application by writing the required code ourselves. In production, we may have to write even more code to manage it.

Link: Cluster in node js application

We can avoid writing manual cluster-management code by using the PM2 npm library. Now, suppose we have the code below, where Express creates a server and listens on a port, and we want to make it cluster-enabled using PM2.

const express = require('express');
const port = 8000;

const app = express();  

app.get('/', (req, res) => {
  console.log(`Incoming req accepted by Process ${process.pid}`);
  for(let i=0; i<999999999999999; i++) {

  }
  res.send('hello world');
});

app.get('/test', (req, res) => {
  console.log(`Incoming req accepted by Process ${process.pid}`);
  res.send('Quickly say Hello World');
});

app.listen(port, () => {
  console.log(`app is listening at port ${port} by Process ${process.pid}`);
});




Here, not a single line of cluster-management code is written. If we run it and trigger the "/" route first, it will keep running for a long time because it is processing a very large loop, and if we trigger the "/test" route in the meantime, it will have to wait for the first request to complete.

Now, if we enable a cluster with multiple processes, the second route will not have to wait for the completion of the first. This can be done with the PM2 library as shown below.

First, install the PM2 library globally with a command like:

npm install pm2 -g

Now, if we are working in a Windows environment, we can open cmd, go to the folder containing our app1.js file (where the server-creation code is written) and run a command like:

pm2 start app1.js -i -1

This command will spawn (number of CPUs - 1) processes.

We can see the list of the started processes with a command like:

pm2 list

We can monitor the status of all the processes with a command like:

pm2 monit


If needed, we can stop all processes, or a single one by its id, with commands like:

pm2 stop all

pm2 stop id

Up to here, we have seen that we can start a number of processes in a Node application using pm2 commands, passing the required number of processes each time we run the command. Alternatively, we can create a config file holding the cluster-related settings; pm2 will then read this config file and start the processes accordingly. To create the config file, we can use a command like:

pm2 ecosystem

It will create a file named ecosystem.config.js with some default config, which we can adjust as per our requirements. In the file below, instances: 0 tells PM2 to spawn one process per CPU, and exec_mode: "cluster" enables cluster mode. The config file will look like the following:

module.exports = {
  apps : [{
    script: 'app1.js',
    watch: '.',
    instances: 0,
    exec_mode: "cluster"
  }],

  deploy : {
    production : {
      user : 'SSH_USERNAME',
      host : 'SSH_HOSTMACHINE',
      ref  : 'origin/master',
      repo : 'GIT_REPOSITORY',
      path : 'DESTINATION_PATH',
      'pre-deploy-local': '',
      'post-deploy' : 'npm install && pm2 reload ecosystem.config.js --env production',
      'pre-setup': ''
    }
  }
};

Now, to start the cluster using PM2 with the above ecosystem.config.js file, we run:

pm2 start ecosystem.config.js

This is all about the PM2 library. We can find more details at https://www.npmjs.com/package/pm2











Cluster in node js application

Whenever we create a simple Node.js and Express application, it runs as a single main process without any cluster functionality. Generally, the application works well, except in scenarios where a particular incoming request takes a long time. In such cases, the main process remains occupied with this request, and if a request arrives for any other route in the meantime, it has to wait for the completion of the previous request. This can be understood with the code below.

const express = require('express');
const port = 8000;

const app = express();  

app.get('/', (req, res) => {
  console.log(`Incoming req accepted by Process ${process.pid}`);
  // Keep the bound below Number.MAX_SAFE_INTEGER (2^53 - 1); beyond it,
  // i++ no longer changes i and the loop would never terminate
  for (let i = 0; i < 999999999999999; i++) {

  }
  res.send('hello world');
});

app.get('/test', (req, res) => {
  console.log(`Incoming req accepted by Process ${process.pid}`);
  res.send('Quickly say Hello World');
});

app.listen(port, () => {
  console.log(`app is listening at port ${port} by Process ${process.pid}`);
});




Here, we have created a simple Node Express application with two GET routes:

1) "/test": it just returns a message.

2) "/": it runs a very large loop and only returns its message after a long time.

Now, if we first hit the "/" route in one browser, it will start running, and if we trigger the "/test" route in another browser in the meantime, it will wait until the first request completes. If we look at the output of

 console.log(`Incoming req accepted by Process ${process.pid}`);

for both routes in our terminal, we will find that both requests were processed by the same process ID, which is why the second request waited while the first was running.
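The blocking behaviour itself is easy to reproduce with a few lines of plain Node, without Express (my own illustration):

```javascript
// A 0 ms timer cannot fire while synchronous code occupies the event loop.
const start = Date.now();
let fired = false;

setTimeout(() => { fired = true; }, 0);

// Simulate a long synchronous request handler: busy-wait ~100 ms
while (Date.now() - start < 100) {}

// 100 ms have passed, but the callback is still queued behind this code
console.log(`elapsed ${Date.now() - start} ms, timer fired: ${fired}`); // fired: false

setTimeout(() => {
  // Once the synchronous code yields, queued callbacks finally run
  console.log(`after yielding, timer fired: ${fired}`); // fired: true
}, 0);
```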

The above problem can be solved using the Node.js cluster module. Here, we will create as many child processes as there are CPU cores in our system. Each of these child processes listens on the same port. This can be seen in the code below:

const cluster = require('cluster');
const os = require('os');
const process = require('process');
const express = require('express');
const port = 8000;

const numCPUs = os.cpus().length;
console.log(`No of cpu = ${numCPUs}`);

//Creating child processes from main process based on number of cpus
if (cluster.isMaster) {
  console.log(`Primary ${process.pid} is running`);

  // Fork workers.
  for (let i = 0; i < numCPUs; i++) {
    cluster.fork();
  }

  cluster.on('exit', (worker, code, signal) => {
    console.log(`worker ${worker.process.pid} died`);
    cluster.fork();
  });
} else {
  const app = express();  

  app.get('/', (req, res) => {
    console.log(`Incoming req accepted by Process ${process.pid}`);
    // Keep the bound below Number.MAX_SAFE_INTEGER (2^53 - 1); beyond it,
    // i++ no longer changes i and the loop would never terminate
    for (let i = 0; i < 999999999999999; i++) {

    }
    res.send('hello world');
  });

  app.get('/test', (req, res) => {
    console.log(`Incoming req accepted by Process ${process.pid}`);
    res.send('Quickly say Hello World');
  });

  app.listen(port, () => {
    console.log(`app is listening at port ${port} by Process ${process.pid}`);
  });
}




Here, os.cpus().length gives the number of CPU cores in our system, and we create that many child processes using the fork() function. Each child process listens on port 8000.

Now, if we first hit the "/" route, it will start running, and if we hit the "/test" route in another browser before it completes, the second request will be handled by another child process. It will not wait for the first route to finish and completes immediately.

If we perform load testing, we will get much better results compared with the single-process version. One important thing to note, though: spawning lots of child processes is not automatically helpful, because each child process consumes resources, and as the number of child processes grows it may actually hurt performance.

This is a very simple implementation of clustering in a Node application, but in a production environment it may not be so simple. To manage clustering automatically, without writing this code by hand, we can use a library named PM2. In my next blog post, I will discuss it.