Saturday, August 13, 2022

Assume an AWS role and retrieve temporary credentials to access permitted AWS services

Scenario

Suppose I am a user of an AWS account with account id 1111 and I want to programmatically access an S3 bucket created in another AWS account, 2222. So, how can I access it programmatically across accounts? To fulfill this scenario, we have to follow these steps:

Steps:

  1. Admin of account 2222 should create a bucket, say Bucket2222.
  2. Admin of account 2222 should create an IAM Role with a minimum of 2 policies: one Trust Policy, and a second with the list of permissions on Bucket2222
  3. Admin of account 1111 should grant me the IAM permission to assume that role
  4. In Node.js code, I first have to get temporary credentials for the IAM Role created in account 2222 by using the aws-sdk version 3 STSClient class
  5. If needed, I can cache the credentials on the API side and reuse them until they expire
  6. I will access Bucket2222 in account 2222 by using the S3Client class of aws-sdk version 3, passing the temporary credentials
Below is the complete code to get the temporary credentials of this role and then list the bucket items:

package.json

{
  "name": "aws_node_poc",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "Jitendra Kumar Singh",
  "license": "ISC",
  "dependencies": {
    "@aws-sdk/client-s3": "^3.142.0",
    "@aws-sdk/client-sts": "^3.142.0"
  }
}


constants.js

module.exports = {
    "REGION": "My Region name",
    "ASSUME_ROLE_ARN": "arn:aws:iam::2222:role/my_assume_role_name",
    "TEST_BUCKET_NAME": "Bucket2222"
}

AssumeRoleAndAccessBucket.js


 const { S3Client, ListObjectsCommand } = require("@aws-sdk/client-s3");
 const { STSClient, AssumeRoleCommand, GetCallerIdentityCommand } = require("@aws-sdk/client-sts");
 const { REGION, ASSUME_ROLE_ARN, TEST_BUCKET_NAME } = require('../../util/constants.js');

 //Getting Role credentials
 const assumeRole = async () => {
    try {
        let rolecreds;
        /**
         * The credentials can be cached on the API side (e.g. in Redis or Memcached)
         * and reused until they expire. The credentials carry an Expiration
         * attribute that defines the time at which they expire.
         * We can check this time to verify whether the temporary credentials have expired:
         * if not, reuse them; if expired, call AWS to get new temporary credentials.
         */
        //rolecreds = "Get credential from API cache if cached and if not expired"

        if(!rolecreds) {
            // Create an Amazon STS service client object.
            const stsClient  = new STSClient({
                region: REGION
            });

            // Set the parameters
            const params = {
                RoleArn: ASSUME_ROLE_ARN, //ARN_OF_ROLE_TO_ASSUME
                RoleSessionName: "MyAssumeRoleSessionName",
                DurationSeconds: 3600,
            };

            //Assume Role
            const data = await stsClient.send(new AssumeRoleCommand(params));
            console.log("Cred = ", data);
           
            rolecreds = {
                accessKeyId: data.Credentials.AccessKeyId,
                secretAccessKey: data.Credentials.SecretAccessKey,
                sessionToken: data.Credentials.SessionToken
            };
        }        

        /*const stsParams = { credentials: rolecreds };
        const stsClient1 = new STSClient(stsParams);
        const results = await stsClient1.send(
            new GetCallerIdentityCommand(rolecreds)
        );*/

        return rolecreds;
    } catch (error) {
        throw error;  
    }    
 }

 //Getting Object List from Bucket
 const listObjects = async(roleCred) => {
    try {
        //Creating S3 Client
        const s3Client = new S3Client({
            region: REGION,
            credentials: roleCred
        });
        //Getting object list from bucket
        const bucketParams = { Bucket: TEST_BUCKET_NAME };
        const data = await s3Client.send(new ListObjectsCommand(bucketParams));
        return data;
    } catch (error) {
        throw error;
    }
 }

 //Getting Role credential and calling function to get object list from bucket
 const listBucketObjects = async() => {
    try {
        const roleCred = await assumeRole();
        //get object list of the bucket
        const result = await listObjects(roleCred);
        return result;
    } catch (error) {
        throw error;
    }    
 }

 //Triggering function to initiate process of getting data from bucket by using Role
 listBucketObjects()
 .then(data => {
    console.log(data);
 }).catch(err => {
    console.log("Error in my code = ", err);
 })
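The caching idea mentioned in step 5 can be sketched as below. This is a minimal in-memory sketch, not the implementation from the code above; it assumes the caller also keeps the Expiration attribute returned by AssumeRole, and the helper name and one-minute safety window are illustrative:

```javascript
// Sketch: cache assumed-role credentials in memory and refresh them
// shortly before they expire. `assumeRoleFn` is any async function that
// resolves credentials carrying an Expiration date (as AssumeRole does).
let cachedCreds = null;

const getCachedRoleCredentials = async (assumeRoleFn) => {
    const safetyWindowMs = 60 * 1000; // refresh one minute before expiry
    if (cachedCreds && cachedCreds.expiresAt - Date.now() > safetyWindowMs) {
        return cachedCreds.value; // still valid, reuse without hitting AWS
    }
    const creds = await assumeRoleFn(); // absent or expiring: fetch fresh ones
    cachedCreds = {
        value: creds,
        expiresAt: new Date(creds.Expiration).getTime()
    };
    return cachedCreds.value;
};
```

In the code above, `assumeRole` would be passed as `assumeRoleFn`, provided it also copies `data.Credentials.Expiration` into the returned object. For a multi-instance deployment, the same check would be applied to an entry stored in Redis or Memcached instead of a module-level variable.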
 

This is all I have for dealing with such scenarios. Please let me know in the comments section if you are facing problems in any other scenarios; I will try to post blogs related to those issues.



Sunday, July 17, 2022

NFR : Non-Functional Requirement in Software Engineering or Application Development : Part 4: Extensibility, Observability, Maintainability

In this post, we are going to discuss the remaining 3 NFRs: Extensibility, Observability and Maintainability. The links to all of my NFR-related posts are as follows:

Links:

Extensibility: Extensibility is a measure of the ability to extend an application and the level of effort required to implement the extension. Its components include:

  1. Flexibility: It defines how flexible your application is in accepting growing business needs and new functionality changes. We can make our application more flexible by applying OOP concepts, Dependency Injection, a Microservice/micro-frontend architecture, etc.
  2. Configurability: It measures the extent of configuration used in your application. We should be able to add or hide functionality in an application through configuration. There should not be any code tied to a particular 3rd-party application or vendor; it should be in config files. That way, we can replace the 3rd-party application by changing config files with very minimal code change, which requires less testing, and the application will be ready quickly with less effort and minimum impact.
  3. Customizability: It is very closely related to configurability. If an application is highly configurable, it will be highly customizable. Vendor- or client-specific functionality in your application should work in plug-and-play mode.
  4. Upgradeability: It defines how easily your application's different components can be upgraded. Generally, we should use the built-in features of the technologies in our application, so that when we upgrade the technology, all the built-in features are upgraded automatically and easily. If we are using a 3rd-party vendor's feature, it's difficult to upgrade, as the upgraded feature may not support our application's underlying technology.
  5. Integrity: The application should be well integrated with the other required business systems. For example, your application should be integrated with an analytics tool, data warehouse, etc. It should be capable of sharing correct reports, as CSV or Excel, with other business systems. It should be easy to integrate with any new incoming business component or vertical through configuration changes and minimal code changes.
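As a small illustration of the configurability point above, here is a hedged sketch of swapping a 3rd-party provider purely through configuration; the provider names, the config shape and the charge function are all hypothetical:

```javascript
// Sketch of configuration-driven functionality: swapping a 3rd-party
// provider without touching business logic (all names are illustrative).
const config = { paymentProvider: 'providerA' };

const providers = {
    providerA: { charge: (amount) => `providerA charged ${amount}` },
    providerB: { charge: (amount) => `providerB charged ${amount}` }
};

// Business code only knows the interface; the concrete provider comes from config.
const charge = (amount) => providers[config.paymentProvider].charge(amount);
```

Switching from providerA to providerB then becomes a one-line config change, with no change to the business code that calls `charge`.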
Observability: Observability is all about data exposure and easy access to the information required to find issues when communications fail, internal events do not occur as expected, or events occur when they shouldn't. The observability of your application can be enhanced by proper logging and error handling at every decision point and at every point of communication between different APIs and 3rd-party APIs. We can create dashboards to display different metrics of our application; for this purpose, we can use tools like Grafana, Dynatrace, etc. We should also enable a dashboard, such as Kibana, to visualize the logs. In the case of AWS, we can use CloudWatch and VPC Flow Logs for this purpose.

Maintainability: It refers to the ease with which we can understand, repair and improve the code of our application. The more maintainable our application is, the more easily we can fix bugs, enhance the application's functionality and fix security issues. While performing maintenance activities, we can make 4 types of changes to our application:
  1. Corrective: It is related to fixing the bugs and issues in our application. Application code should be well segregated into different files and folders according to its purpose, so that the required code fixes can be applied easily at the required place.
  2. Adaptive: Our application should be developed in such a way that it can easily accept new functionality. If needed, we can easily remove deprecated or unwanted functionality. It should be able to accept updated libraries, databases, operating systems, etc.
  3. Perfective: This category includes code changes that improve the application's performance.
  4. Preventive: It is associated with the changes required to enhance security. Upgrading security is ongoing, never-ending work, so our application should always be ready to accept such security-related changes.


NFR : Non-Functional Requirement in Software Engineering or Application Development : Part 3: Application Security

In my previous posts, I discussed the NFR definition, why NFRs are needed, and 3 important NFRs: Scalability, Performance and Testability. Below are the links to all of my NFR-related posts:

Links:

Today, I am going to discuss a very important NFR: Application Security. Application security is ongoing, never-ending work. Every day, new security threats appear in cyberspace, and we, along with the Business team, have to keep an eye on them; if a threat is going to impact our application, we should take preventive measures as soon as possible. Generally, application security is the responsibility not only of the Development team but of all the participants in the application, such as:

  • Business team
  • Development team
  • End users
  • Hosting environment provider
  • Network provider
Guidelines and conventions have been provided for each participant on the internet, and only if every participant performs their role cautiously can we achieve the application security goal. Normally, it has been found that most application security issues come either from security loopholes on the application development side or from the end-user side. Security issues from the end-user side can be minimized by increasing awareness among end users about secure use of the application. That is a completely separate set of efforts that the application owner's customer-facing team should make, to help customers use the application as securely as possible. Here, I am not concentrating on customer awareness (although it is the backbone of any application's security). I will discuss the Development team's role in implementing security features in an application.

I have developed several applications in Node.js and applied the security features below, but they are applicable to any public-facing web application developed in any technology.

  1. CORS vulnerability: CORS stands for Cross-Origin Resource Sharing. Generally, our application exposes some APIs to share data with particular 3rd-party applications. So, we should keep a check on whether our exposed APIs are communicating with valid 3rd-party origins or with some malicious services. This can be achieved by defining a whitelist of 3rd-party domains that are allowed to interact with our application; all the rest will be denied. There are several ways of implementing it. In Node.js, we can use the cors NPM package. If we are deploying our application in a cloud like AWS, the cloud provider offers additional security features like NACLs and Security Groups.
  2. CSRF vulnerability: CSRF stands for Cross-Site Request Forgery. Here, some malicious 3rd-party site tries to send a request to your application in such a way that your application assumes the request is coming from a valid authenticated user. In a Node.js application, this vulnerability can be restricted by using the csurf NPM package. Using this package, we create a CSRF token on the Node.js side in a GET method when the very first request comes in, and this token is sent to the client in the response body. The client should then read this token from the response body and set it in the request header every time it sends a request to Node.js. On the Node.js side, we can define a middleware that takes this token from the request header and matches it against the token generated by Node.js. If it matches, the request is processed; if not, an error response is returned.
  3. Request payload size limit: Defining a max request payload size on the API side is very important. Generally, a Node.js API parses the JSON request payload synchronously, and if the payload is very large, your application might hang or become very slow. The payload size should be checked before processing any request on the Node.js side. If using the Express framework, this limit can be defined as app.use(express.json({ limit: 10 })); here the request payload size limit is defined as 10 bytes. The default request payload size limit in Express is 100KB. I have created a post on it: Node JS application security
  4. File upload size limit: Define a max file size limit if your application supports file upload. It can be defined using the Multer package, like multer({limits: { fileSize: maxSize }}), where maxSize is in bytes.
  5. Input data sanitization: Generally, frauds like SQL injection and JavaScript injection are carried out through input form data. This can be minimized by sanitizing the input data on the API side before using it. Several NPM packages are available to sanitize input data, such as express-validator. In any database interaction, we should use parameterized queries or an ORM library to minimize the chances of SQL injection.
  6. Restrict XSS (Cross-Site Scripting) attacks: This is very similar to the security issue discussed just above under the heading "Input data sanitization". Here, along with the above sanitization measures, we can use the xss NPM package when reading data from the request object.
  7. Use HTTPS instead of HTTP
  8. Authorization: We should follow the least-privilege mechanism, with no privilege as the default. This can be implemented either with our own logic, by creating tables where we keep groups of users with roles at the group level and check the user's group and role before processing any request, or with NPM packages supporting it, such as @casl/ability, passport, etc.
  9. Authentication: Use a well-established 3rd-party authentication mechanism instead of developing your own from scratch. Examples are social media authentication and authentication using Azure or AWS authentication services. After successful authentication, we should create a JWT containing the required data, signed with one of the signing algorithms supported by JWT. This signed token should be sent to the client and properly validated before processing every request.
  10. If any important data is coming from the UI to the API side, it must be encrypted. The encryption key must be in a config file on the UI side, and the same key must be used on the API side to decrypt it.
  11. Proper logging: Proper logging should be maintained for each request, keeping its user, IP information, timestamp, operation success or failure information, etc.
  12. OWASP top ten: OWASP is an organization that publishes a list of the top 10 application security vulnerabilities, updated periodically. The details can be found at https://owasp.org/Top10. These top 10 vulnerabilities must be taken care of in our application at any cost. The top 10 security vulnerabilities for the year 2021 are as follows:
  • A01:2021 – Broken Access Control: It occurs if we provide unwanted or extra privileges to a user. It can be restricted in the ways discussed under the heading Authorization above, following least privilege.
  • A02:2021 – Cryptographic Failures: It occurs due to missing required encryption or due to encryption with deprecated or old libraries. It can be restricted in the ways discussed in point (10). The only difference is that point (10) talks about data in transit, while this covers data in transit as well as data at rest. Here, we should use the latest cryptographic libraries and always avoid deprecated algorithms like MD5 and SHA-1.
  • A03:2021 – Injection: It can be restricted by sanitizing request data and using parameterized queries in DB interactions, as discussed in points (5) and (6).
  • A04:2021 – Insecure Design: It is mainly related to project design. If a project is not designed properly, even the best coding practices cannot rectify it. There are several ways to minimize these flaws, like limiting resource consumption per user or service, writing proper and sufficient unit and integration test cases, and using well-established NPM packages.
  • A05:2021 – Security Misconfiguration: Security is very vast, and as software engineers we have to constantly evaluate and upgrade the security features in our application. We have to keep verifying all the cloud security features like Security Groups, NACLs (Network Access Control Lists), IAM (Identity and Access Management) policies, etc. We should keep our 3rd-party libraries and NPM libraries up to date, and we should always try to move to the latest version of the programming language and DB if possible.
  • A06:2021 – Vulnerable and Outdated Components: This can mainly be rectified by using the latest versions of our programming language, DB, NPM packages and any 3rd-party APIs.
  • A07:2021 – Identification and Authentication Failures: It mainly occurs if we fail to keep our identity safe or use very weak identity credentials like passwords or forgot-password questions. To rectify it, we should force users to create strong passwords and keep them properly protected in transit as well as at rest. Design your application to make it safe from brute-force or automated attacks: if any crucial API fails 3 or more times for a particular user, block that user for 30 minutes or slow down processing for them. This kind of design helps restrict automated attacks.
  • A08:2021 – Software and Data Integrity Failures: It is mainly associated with integrity failures or vulnerabilities in the 3rd-party APIs, NPM packages or CDNs we are using in our application. So, if we are using any 3rd-party API, it should be accessed over HTTPS along with proper validation of authentication and authorization tokens. We should use well-established NPM packages.
  • A09:2021 – Security Logging and Monitoring Failures: It is associated with missing required logging and with not monitoring logs regularly. It can be prevented by proper logging of every request and by setting up an alarm or event, like a CloudWatch alarm in AWS, to get notifications via email or SMS about continuous failures of a particular API. In such cases we can examine the logs to identify the cause of the failure and to get clues about any security attack.
  • A10:2021 – Server-Side Request Forgery (SSRF): It mainly occurs when an attacker makes the server fetch or redirect to a URL of the attacker's choosing. It can be restricted by adding code to properly validate the domain, host and port of the URL, ensuring they appear in our whitelist before the server requests or redirects to that URL.
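The least-privilege authorization described in point 8 above can be sketched as a small Express-style middleware. This is an illustrative sketch, not a drop-in implementation; the role names, permission lists and the authorize helper are all hypothetical:

```javascript
// Sketch: role-based authorization with "no privilege" as the default.
const rolePermissions = {
    admin: ['read', 'write', 'delete'],
    editor: ['read', 'write'],
    viewer: ['read']
};

// Returns an Express-style middleware guarding one permission.
const authorize = (requiredPermission) => (req, res, next) => {
    const role = req.user && req.user.role;
    // Unknown roles fall back to an empty list, i.e. no privilege by default.
    const allowed = (rolePermissions[role] || []).includes(requiredPermission);
    if (!allowed) {
        return res.status(403).json({ status: 'forbidden' });
    }
    next();
};
```

A route would then be protected like app.delete('/items/:id', authorize('delete'), handler); in a real system the role-to-permission mapping would live in a database table rather than a constant.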
Here, I have discussed the minimum measures that we must take in application development from a security perspective, but this is not an exhaustive list of security features; there are several more tips and tricks we can apply. Application security implementation is an ongoing process, and the Development team as well as the Business team have to keep working on it. It's not a one-time activity.

In my next post, I will discuss a few other NFR topics.

Saturday, July 16, 2022

NFR : Non-Functional Requirement in Software Engineering or Application Development : Part 2: Scalability, Performance, Testability

In my previous post, I discussed the points below:

  1. What is NFR
  2. What information should be gathered by Development team from Business team with respect to NFR
  3. Listed down few important NFRs
Links to my NFR-related posts:

Continuing my previous post about NFR, I am going to discuss 3 NFRs: Scalability, Performance and Testability.

Scalability:
Scalability is the measure of a system's ability to increase or decrease performance and cost in response to changes in application and system processing demands. There are 3 important types of scalability:
  1. Horizontal Scaling: New servers are added instead of upgrading the existing server.
  2. Vertical Scaling: The existing server is upgraded by increasing CPU, RAM, processor, etc.
  3. Diagonal Scaling: Here, both horizontal and vertical scaling are done.
Below are a few ways of implementing scalability in your application:
  1. Set up an alarm, like a CloudWatch Alarm in AWS, that triggers an alarm or event when CPU utilization reaches a threshold value
  2. Set up autoscaling by using cloud autoscale functionality or Kubernetes
  3. Load Balancing: Define a load balancer to equally distribute load across several servers
  4. Enable clustering if your application technology supports it. E.g., if your application is using NodeJs for API creation, we can easily enable clustering as discussed in my post below:
            Cluster in Node Js application

Performance:
It represents the application's response time. The lower the response time, the higher the performance. Below are a few ways to enhance your application's performance:
  1. Set up a proper load balancer
  2. Use auto-scaling to enhance performance based on a CloudWatch alarm
  3. Set up caching, like Redis or Memcached
  4. Flush the cached data at certain intervals, based on the frequency of write operations in the application and how quickly the latest data must be presented to customers
  5. Always use asynchronous operations where synchronous behavior is not required
  6. Follow all coding conventions and guidelines
  7. Follow all DB best practices, like SQL query optimization and using views, indexes, stored procedures and SQL functions wherever required.
  8. Use code validation tools like Sonar or ESLint
  9. Try to use the latest stable and suitable versions of libraries like NPM packages
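Points 3 and 4 above can be illustrated with a tiny in-memory TTL cache. A real setup would use Redis or Memcached; this class and its names are only a sketch of the flush-on-expiry idea:

```javascript
// Sketch: an in-memory cache whose entries expire after a fixed TTL,
// so stale data is flushed automatically on the next read.
class TtlCache {
    constructor(ttlMs) {
        this.ttlMs = ttlMs;
        this.store = new Map();
    }
    set(key, value) {
        // Each entry remembers when it stops being valid.
        this.store.set(key, { value, expiresAt: Date.now() + this.ttlMs });
    }
    get(key) {
        const entry = this.store.get(key);
        if (!entry) return undefined;
        if (Date.now() > entry.expiresAt) {
            this.store.delete(key); // expired: flush and force a fresh read
            return undefined;
        }
        return entry.value;
    }
}
```

The TTL would be tuned per the trade-off in point 4: short for write-heavy data that must stay fresh, longer for data that rarely changes.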
Testability:
In software, testability refers to the degree to which any module, requirement, subsystem or other component of your architecture can be verified as satisfactory or not. High testability means it is easy to find and isolate faults as part of your team's regular testing process. Our application must meet the following criteria to be more testable:
  1. Controllability: The degree to which it is possible to control the state of the component under test (CUT) as required for testing.
  2. Observability: The degree to which it is possible to observe (intermediate and final) test results.
  3. Isolateability: The degree to which the component under test (CUT) can be tested in isolation.
  4. Separation of concerns: The degree to which the component under test has a single, well-defined responsibility.
  5. Understandability: The degree to which the component under test is documented or self-explaining.
  6. Automatability: The degree to which it is possible to automate testing of the component under test.
  7. Heterogeneity: The degree to which the use of diverse technologies requires diverse test methods and tools to be used in parallel.
In my next post, I will discuss the NFR related to application security.

Friday, March 4, 2022

NFR : Non-Functional Requirement in Software Engineering or Application Development : Part 1

Here are the links to all of my posts on the topic of NFR:

Links:

NFR stands for Non-Functional Requirement. In any application development or maintenance work, the NFRs are the least discussed but most required activities. Generally, the NFRs of an application are not raised by the Business team; it is the Development team's responsibility to discuss all the NFRs with the Business team, understand them, and create a document of the discussion regarding NFR implementation. There may be several discussion points regarding NFRs with the Business team, such as:

  1. Expected minimum or maximum load on the application
  2. The peak and off-peak times of application usage
  3. Whether the application is public-facing or only facing the client's employees
  4. Application downtime required for build deployment or any upgrade
  5. Required response time in milliseconds for a feature
  6. Negotiation of business requirements to achieve the required throughput.
  7. Whether the application's features are read-intensive or write-intensive, to decide how frequently cached data should be flushed if caching is implemented
  8. Whether the customer requires the latest data as soon as it is updated in the system, to decide whether to implement caching
  9. Cost of any 3rd-party API that the Business can bear.
  10. Cost of skilled resources that the Business should bear until the application becomes stable (should be discussed more by the Management team)
  11. Trade-off analysis and discussion with the Business team about conflicts between different NFRs.
There are several types of Non-Functional Requirements. A few important NFRs are as follows:
  1. Scalability
  2. Performance
  3. Testability
  4. Security
  5. Extensibility
  6. Observability
  7. Maintainability
The actual list of NFRs is very vast. Here, I have included only a few that most often need to be considered in application development.

In my next blog, I will discuss the above-mentioned NFRs in short.

Node Express application Security: Set Request Size Limit for JSON data and file uploading

Here, we are going to discuss securing our Node Express application when an unwanted user tries to impact the performance of our application by sending a very large JSON input or by uploading a very large file.

Generally, Express allows 100KB of JSON data by default. If we try to send more than 100KB of JSON data, it will return a 413 error code. In the case of file uploading using the Multer package, there is no default file size limit, so for file uploading we should apply a file size limit in the Node.js code. Let's discuss JSON input data and file uploading separately.

JSON input data: As mentioned above, Express allows 100KB of JSON data by default, which is sufficient in most scenarios. If in some scenario we need to send JSON data larger than 100KB, then we have to increase the default JSON data size limit. We can achieve this by using the two middlewares below.

app.use(express.json({limit: '10mb', extended: true}));
app.use(express.urlencoded({limit: '10mb', extended: true}));

Now, this Express application will allow us to send JSON data of up to 10MB in size.

File upload size limit: Generally, we use the Multer package to handle incoming file uploads. Here, we can restrict the security threat by providing a max file size limit to multer. It can be done in the following way.

const multer = require('multer');
const upload = multer({limits: { fileSize: 1024 * 1024 * 150 }}); //150MB

Here, we are defining the maximum allowed file size as 150MB. Note that Multer applies this fileSize limit to each individual file, for both single and multiple file uploads. So, if we upload a single file of, say, 200MB, or include any file larger than 150MB in a multiple upload, the defined file size limit of 150MB is exceeded, and Multer raises an error with the message "File too large", which Express's default error handler returns as a 500 status code.
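If we would rather return a 413 than the default 500, an Express error-handling middleware can translate Multer's limit error. This is a sketch based on the assumption that Multer reports the violation with an error whose code is LIMIT_FILE_SIZE; the handler name is illustrative:

```javascript
// Sketch: convert Multer's "File too large" error into a 413 response.
// It would be registered after the upload routes: app.use(uploadErrorHandler);
const uploadErrorHandler = (err, req, res, next) => {
    if (err && err.code === 'LIMIT_FILE_SIZE') {
        return res.status(413).json({ status: 'error', message: 'File too large' });
    }
    next(err); // anything else goes on to the default error handler
};
```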

I have created the two routes below for handling single file upload and multiple file upload.

app.post('/singleFileUpload', upload.single("file"), (req, res) => {
  const jsonData = req.body;
  res.status(200).json({
    "status": "success"
  });
});

app.post('/multipleFileUpload', upload.array("file"), (req, res) => {
  const jsonData = req.body;
  res.status(200).json({
    "status": "success"
  });
});

Below is my complete code, which will help in reproducing the outcome of the above discussion.

package.json

{
  "name": "node_test",
  "version": "1.0.0",
  "description": "",
  "main": "index.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1",
    "start": "nodemon ./app"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "express": "^4.17.3",
    "multer": "^1.4.4",
    "nodemon": "^2.0.15"
  }
}

app.js

const express = require('express');
const multer = require('multer');
const app = express();
const upload = multer({limits: { fileSize: 1024 * 1024 * 150 }}); //150MB

//middleware
app.use(express.json({limit: '10mb', extended: true}));
app.use(express.urlencoded({limit: '10mb', extended: true}));

const port = 3000;

app.get('/', (req, res) => {
  res.status(200).json({
    "status": "success"
  });
});

app.post('/bigJson', (req, res) => {
  const jsonData = req.body;
  res.status(200).json({
    "status": "success",
    "jsonData": jsonData
  });
});

app.post('/singleFileUpload', upload.single("file"), (req, res) => {
  const jsonData = req.body;
  res.status(200).json({
    "status": "success"
  });
});

app.post('/multipleFileUpload', upload.array("file"), (req, res) => {
  const jsonData = req.body;
  res.status(200).json({
    "status": "success"
  });
});

app.listen(port, () => {
  console.log("Server has started");
});




Wednesday, February 9, 2022

Publish message to AWS SQS queue by NodeJs

Here, we will discuss how to publish messages to an AWS SQS queue with Node.js. To implement it, we have to follow the steps below:

  1. Create a user in AWS and note down its access key ID and secret access key
  2. Create an SQS queue and note down its URL
  3. Save the user credentials in a shared file on your local machine
  4. Install the aws-sdk library
  5. Create a simple Node Express application with routes for publishing messages to SQS
Now, let's discuss each step one by one.
1) Create a user in AWS and note down its access key ID and secret access key: I have discussed this step in detail in my previous blog under the heading "File handling in AWS S3 by NodeJs". The link is given below. Please go through it.


2) Create an SQS queue and note down its URL: First open the AWS console, go to the SQS service and click on the "Create queue" button. Now, add the queue name and keep the other settings at their default values. Then click the "Create queue" button at the bottom.





3) Save the user credentials in a shared file on your local machine: In this step, we save the user credentials so that the Node.js application can read them while establishing a connection with AWS. We can refer to the AWS documentation below for this step:


4) Install the aws-sdk library: Create a simple Node.js application and use a command like npm install aws-sdk

5) Create a simple Node Express application with routes for publishing messages to SQS: Here, we are creating a very simple Node Express application along with two routes:
a) "/addMessage": used to add messages to the SQS queue.
b) "/getMessage": used to get messages from the SQS queue.
I am providing the code here for your reference:

const express = require('express');
const dotenv = require('dotenv');
const path = require('path');
const AWS = require('aws-sdk');

AWS.config.update({region: 'us-east-1'});
const sqs = new AWS.SQS();

const app = express();
app.use(express.json());

dotenv.config({
  path: path.join(__dirname, './.env')
});

app.listen(8000, () => {
  console.log(`Listening port 8000`);
});

//Add a message to SQS
app.post('/addMessage', async(req, res) => {
  try {
    const params = {
      MessageBody: req.body.MessageBody,
      QueueUrl: process.env.SQS_URL
    };
    const result = await sqs.sendMessage(params).promise();
    res.send(result)
  } catch (error) {
    console.log(error);
    res.status(500).send(error.message);
  }
});

//Receive a message from SQS
app.get('/getMessage', async(req, res) => {
  try {
    const params = {
      QueueUrl: process.env.SQS_URL
    };
    const result = await sqs.receiveMessage(params).promise();
    res.send(result)
  } catch (error) {
    console.log(error);
    res.status(500).send(error.message);
  }
});
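One gotcha with the /getMessage route above: when the queue is empty, receiveMessage resolves with an object that has no Messages property at all, so guard before reading the bodies. A small sketch (the helper name is mine, not part of the SDK):

```javascript
// receiveMessage's result omits the Messages property entirely when the queue
// is empty, so default to an empty array before mapping over the messages.
function extractBodies(result) {
  return (result.Messages || []).map((m) => m.Body);
}

console.log(extractBodies({}));                             // []
console.log(extractBodies({ Messages: [{ Body: 'hi' }] })); // [ 'hi' ]
```

Also remember that receiving a message does not delete it; until you call sqs.deleteMessage with the ReceiptHandle from the result, the message reappears after the visibility timeout.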





Monday, February 7, 2022

Publish and subscribe to AWS SNS topic through NodeJS

 Today, we are going to discuss how to publish a message to an AWS SNS topic and how to subscribe to the topic so that subscribers receive the messages published to it. We should follow the steps below:

  1. Create a user and user group and note down the user's accessKeyId and secretAccessKey
  2. Assign this user the required permissions for SNS
  3. Create a topic and note down its ARN
  4. Install the aws-sdk library
  5. Save the user credentials in a shared file on your local machine
  6. Create a simple Node Express application and create routes for publish and subscribe
Now, let's discuss each step here:
1) Create a user and user group and note down the accessKeyId and secretAccessKey: I have discussed this step in detail in my previous blog under the heading "File handling in AWS S3 by NodeJs". I am giving the link below. Please go through it.


2) Assign this user the required permissions for SNS: Here, we can either create a custom policy for SNS and assign it to the group the user belongs to, or assign a default SNS policy already provided by AWS. I am going to assign a default SNS policy.
First, go to the IAM service in the AWS console and click on the user name you want to grant the permission to.


Now, click on the Add permissions button and then on "Attach existing policies directly". Search for SNS and select "AmazonSNSFullAccess".
Important note: this policy selection is only for demo purposes. In real projects, we should grant the least required access for any service to a user. Always avoid granting full access.


3) Create a topic and note down its ARN: Now, go to the SNS service in the AWS console and click on the Topics tab on the left. Then click on the Create topic button. In the create topic window, select the "Standard" radio button and add your topic name and display name. You can keep the rest of the fields at their defaults.

Now, your topic will be created. Note down the ARN of the topic; it is going to be used in the NodeJS code.

4) Install the aws-sdk library: Use the command npm install aws-sdk

5) Save the user credentials in a shared file on your local machine: In this step, we save the user credentials so that the Node.js application can read them while establishing a connection with AWS. We can refer to the AWS documentation below for this step:


6) Create a simple Node Express application and create routes for publish and subscribe: Here, we create a Node.js Express application and expose two routes.
The first is used to subscribe an email id to the SNS topic identified by the topic ARN in the code. Once the route is triggered, SNS subscribes the given email id to the topic. You will get a confirmation email from AWS on this email id; it contains a link you have to click to complete the subscription.
The second route is used to publish a message to the SNS topic, and SNS then forwards the message to all the subscribed emails. Here, we have to provide the Subject and Body as part of the message.
I am giving the code here for your reference:

const express = require('express');
const dotenv = require('dotenv');
const path = require('path');
const AWS = require('aws-sdk');

AWS.config.update({region: 'us-east-1'});
const sns = new AWS.SNS();

const app = express();
app.use(express.json());

dotenv.config({
  path: path.join(__dirname, './.env')
});

app.listen(8000, () => {
  console.log(`Listening port 8000`);
});


//Get sns details
app.get('/mysns', (req, res) => res.send({"status":"Ok", sns}));

//Subscribe SNS topic
app.post('/subscribe', async(req, res) => {
  try {
    const params = {
      Protocol: 'email', // SNS protocol names are lowercase
      TopicArn: process.env.TOPIC_ARN,
      Endpoint: req.body.email
    };
    const result = await sns.subscribe(params).promise();
    res.send(result);
  } catch (error) {
    console.log(error);
    res.status(500).send(error.message);
  }
});

//Publish to SNS topic
app.post('/publish', async(req, res) => {
  try {
    const params = {
      Subject: req.body.subject,
      TopicArn: process.env.TOPIC_ARN,
      Message: req.body.message
    };
    const result = await sns.publish(params).promise();
    res.send(result);
  } catch (error) {
    console.log(error);
    res.status(500).send(error.message);
  }
});
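SNS rejects malformed endpoints for the email protocol, so it can help to validate the address before calling subscribe. A minimal sketch with a hypothetical helper name and a deliberately loose regex (not a full RFC 5322 validator):

```javascript
// Builds the params object for sns.subscribe, rejecting obviously malformed
// email addresses before the AWS call is made.
function buildSubscribeParams(topicArn, email) {
  if (!/^\S+@\S+\.\S+$/.test(email)) {
    throw new Error('Invalid email address: ' + email);
  }
  // SNS protocol names are lowercase: 'email', 'sms', 'sqs', 'lambda', ...
  return { Protocol: 'email', TopicArn: topicArn, Endpoint: email };
}

console.log(buildSubscribeParams('arn:aws:sns:us-east-1:111122223333:MyTopic', 'user@example.com'));
```

Failing fast like this keeps bad input from ever reaching AWS, which makes the error message clearer for the API caller.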



 






Sunday, February 6, 2022

File handling in AWS S3 by Node Js

 Today, I am going to discuss file handling operations on an S3 bucket in AWS from NodeJs: uploading a file, listing all the files, deleting a file and downloading a file. To perform these operations, we should follow the steps below:

  1. Create a user in the IAM service in AWS
  2. Create a bucket in the S3 service in AWS
  3. Save the user credentials in a shared file on your local machine
  4. Create a NodeJS application
  5. Install the aws-sdk package
  6. Create routes for handling files
Now, let's discuss each step here.
1) Create a user in the IAM service in AWS: Go to the IAM service in AWS and click on the "Add users" button.


Add the user name, select the checkbox for "Access key - Programmatic access" and then click on "Next: Permissions".

Click on the "Create group" button, add a group name and then click on the Create group button.

Now, on the group list page, click on your newly created group and then on the Permissions tab. Open the Add Permissions dropdown and click on Attach Policy.


Now, click on Create Policy, select S3 as the service, and select the Read, List and Write checkboxes.


So far, your user and the required group and policy have been created. Now, note down the user's access key ID and secret access key. These two keys are going to be used in the NodeJS code.
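Instead of ticking checkboxes, you can also paste a policy document directly in the JSON tab of the policy editor. A sketch of a least-privilege policy for this demo (the bucket name is a placeholder for your own bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket"],
      "Resource": "arn:aws:s3:::your-bucket-name"
    },
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": "arn:aws:s3:::your-bucket-name/*"
    }
  ]
}
```

Note that ListBucket applies to the bucket ARN itself, while the object-level actions apply to the /* object ARN.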

2) Create a bucket in the S3 service in AWS: Now, go to the S3 service of AWS and create a bucket with a globally unique name.




3) Save the user credentials in a shared file on your local machine: In this step, we save the user credentials so that the Node.js application can read them while establishing a connection with AWS. We can refer to the AWS documentation below for this step:


4) Create a NodeJS application: Here, create a very simple Node.js application with Express. The code is given below.

5) Install the aws-sdk package: In your Node.js application, install the aws-sdk package using the command npm install aws-sdk

6) Create routes for handling files: Below is the code for the different routes: upload a file to S3, list all the files in the S3 bucket, delete a file and download a file.

const express = require('express');
const dotenv = require('dotenv');
const path = require('path');
const multer = require('multer');
const AWS = require('aws-sdk');

AWS.config.update({region: 'us-east-1'});
const s3 = new AWS.S3();
const app = express();
const upload = multer();
dotenv.config({
  path: path.join(__dirname, './.env')
});

app.listen(8000);

//Creating routes
//uploading file to s3
app.post('/upload', upload.single('file'), async (req, res) => {
  try {
    const result = await s3.putObject({
      Body: req.file.buffer,
      Bucket: process.env.BUCKET_NAME,
      Key: req.file.originalname
    }).promise();
    res.send(result);
  } catch (error) {
    console.log(error);
    res.status(500).send(error.message);
  }
});

//Listing uploaded files
app.get('/fileList', async(req, res) => {
  try {
    const result = await s3.listObjectsV2({
      Bucket: process.env.BUCKET_NAME
    }).promise();
    res.send(result);
  } catch (error) {
    console.log(error);
    res.status(500).send(error.message);
  }
});

//Deleting a file
app.delete('/deleteFile/:fileName', async(req, res) => {
  try {
    const fileName = req.params.fileName;
    const result = await s3.deleteObject({
      Bucket: process.env.BUCKET_NAME,
      Key: fileName
    }).promise();
    res.send(result);
  } catch (error) {
    console.log(error);
    res.status(500).send(error.message);
  }
})

//Download a file
app.get('/downloadFile/:fileName', async(req, res) => {
  try {
    const fileName = req.params.fileName;
    const result = await s3.getObject({
      Bucket: process.env.BUCKET_NAME,
      Key: fileName
    }).promise();
    res.send(result.Body);
  } catch (error) {
    console.log(error);
    res.status(500).send(error.message);
  }
})
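One thing to note about the download route: result.Body returned by getObject in the v2 SDK is a Node.js Buffer. Express sends it as binary, which is fine for downloads, but for text files you may want to decode it first. A tiny illustration (a local Buffer stands in for the S3 response so the snippet runs without AWS credentials):

```javascript
// result.Body from s3.getObject is a Buffer; for text files it can be decoded
// to a string before sending it to the client.
const body = Buffer.from('hello from S3', 'utf-8'); // stands in for result.Body
const text = body.toString('utf-8');
console.log(text); // hello from S3
```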