
Monday, May 8, 2017

Serverless User Activation Mechanism with AWS Lambda, AWS Step Functions and DynamoDB Triggers



In this series of posts, I am writing about various AWS services. In a previous post, I showed how to use Amazon Simple Queue Service and Amazon Simple Email Service to send an activation mail when a user is registered.
The application I developed for that post used SQS to send a message to the activation queue, and a queue listener processed the messages, sending activation mails to users with SES.
AWS Lambda is very popular nowadays. With the serverless architecture, we can focus only on business needs and the rest is handled by AWS. But when we start to use a few Lambda functions together, it gets harder and harder to manage the functions and understand the data flow between them.
AWS Step Functions is very useful for simplifying distributed Lambda executions. Seeing the steps visually while designing the flow makes them easy to understand, and having state and the ability to retry is very valuable for coordinating distributed executions.
In this post, I will use DynamoDB triggers instead of SQS for sending the activation mail. When a User item is inserted into the DynamoDB User table, the table's trigger executes a Lambda function. The Lambda function then starts a new Step Functions execution that generates the mail and sends it. The process is shown in the picture below.


Steps

1. Prepare IAM Roles

2. Implement LF_CardStore_GetUserActivationStatus Lambda Function

3. Implement LF_CardStore_SendUserActivationMail Lambda Function

4. Create the SF_CardStore_SendUserActivationMail Step Function

5. Create LF_CardStore_UserTableTriggerToSendActivationMail Function for User Table Trigger

6. Configure DynamoDB User Table Trigger

7. Change the Application


As a starting point, I will use the code I developed for my previous post. The code can be found here.
The application is developed in Java using Spring Boot. For this post, I will use Node.js to implement the Lambda functions easily.

Let's start.


1. Prepare IAM Roles

For our Lambda functions to call DynamoDB and Simple Email Service (SES), we create a new IAM role named CardStoreLambdaRole with the AmazonDynamoDBFullAccess and AmazonSESFullAccess policies attached.
For the step function, we create a role named CardStoreStepFunctionsRole. For more information on creating IAM roles for Step Functions, see here.
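Each of these roles also needs a trust policy so the corresponding service is allowed to assume it. Below is a minimal sketch of the two trust documents, assuming the regional Step Functions service principal that was used at the time of writing (adjust the region to yours):

```javascript
// Sketch of the trust policies for the two roles. These documents would be
// passed to `aws iam create-role --assume-role-policy-document` when the
// roles are created. The region in the Step Functions principal is an
// assumption (eu-central-1 is used elsewhere in this post).
const lambdaTrustPolicy = {
  Version: "2012-10-17",
  Statement: [{
    Effect: "Allow",
    Principal: { Service: "lambda.amazonaws.com" },
    Action: "sts:AssumeRole"
  }]
};

const stepFunctionsTrustPolicy = {
  Version: "2012-10-17",
  Statement: [{
    Effect: "Allow",
    Principal: { Service: "states.eu-central-1.amazonaws.com" },
    Action: "sts:AssumeRole"
  }]
};

console.log(JSON.stringify(lambdaTrustPolicy));
```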

2. Implement LF_CardStore_GetUserActivationStatus Lambda Function

The code for LF_CardStore_GetUserActivationStatus is below. The DynamoDB client is created in the initialization part, and the handler queries the User table by username. If the user is found, the activationStatus field is returned; otherwise an empty string is returned. In production use, we should handle error conditions properly; for more information, see here.

var aws = require('aws-sdk');

var docClient = new aws.DynamoDB.DocumentClient();

var table = "User";

exports.handler = (event, context, callback) => {
    var username = event.username;
    var activationStatus = "";
    
    if (!username || username === "") {
        callback(null, activationStatus);
        return;
    }
    
    var params = {
        TableName: table,
        Key: {
            "username": username
        }
    };
    
    docClient.get(params, function(err, data) {
        if (err) {
            console.error("Unable to read item. Error JSON:", JSON.stringify(err, null, 2));
            callback(null, activationStatus);
        } else {
            console.log("GetItem succeeded:", JSON.stringify(data, null, 2));
   
            var user = data.Item;
            activationStatus = user == null ? "" : user.activationStatus;
            
            callback(null, activationStatus);
        }
    });
};


After putting this Node.js code in a zip file, we can create the Lambda function with the command below. Replace AWS_ACCOUNT_ID with your AWS account ID.
aws lambda create-function \
--function-name LF_CardStore_GetUserActivationStatus \
--description "Returns activation status of user" \
--runtime nodejs6.10 \
--handler LF_CardStore_GetUserActivationStatus.handler \
--zip-file fileb://LF_CardStore_GetUserActivationStatus.zip \
--role arn:aws:iam::AWS_ACCOUNT_ID:role/CardStoreLambdaRole

Once created, we can test the Lambda function as shown below. In this sample, the activation status of user20 was DONE.

aws lambda invoke \
--function-name LF_CardStore_GetUserActivationStatus \
--payload '{"username":"user20"}' \
out.txt

cat out.txt
"DONE"


3. Implement LF_CardStore_SendUserActivationMail Lambda Function

The code for LF_CardStore_SendUserActivationMail is below. In the initialization part, we initialize the SES and DynamoDB clients. SES is not available in every region, so use a region close to you and configure it as shown below.
The handler first tries to find the user. If the user is found, it checks the user's activationStatus and sends the mail if it has not already been sent. After sending the mail, the user's activationStatus is marked as MAIL_SENT.
Please note that the function uses the activationUrlBase field of the User table to generate the final activation URL. This is required for the Lambda function to work whether the Java application runs in the development environment or on AWS. The Java program generates the base URL according to the environment and saves it to the User table.
Also, the From field of the mails is specified with the FROM_ADDRESS environment variable.

var aws = require('aws-sdk');

var ses = new aws.SES({
    region: 'eu-west-1' 
});

var docClient = new aws.DynamoDB.DocumentClient();

var table = "User";
var fromAddress = process.env.FROM_ADDRESS;

function markActivationStatus(username, activationStatus, callback) {
    var params = {
        TableName: table,
        Key:{
            "username": username
        },
        UpdateExpression: "set activationStatus = :status",
        ExpressionAttributeValues:{
            ":status": activationStatus
        },
        ReturnValues:"UPDATED_NEW"
    };
    
    console.log("Updating the item...");
    docClient.update(params, function(err, data) {
        if (err) {
            console.error("Unable to update item. Error JSON:", JSON.stringify(err, null, 2));
            callback(null, {"result": "Can't mark activationStatus: " + JSON.stringify(err, null, 2)})
        } else {
            console.log("UpdateItem succeeded:", JSON.stringify(data, null, 2));
            callback(null, {"result": "OK"})
        }
    });
}

function sendEmail(user, activationUrlBase, callback) {
    
    var activationUrl = activationUrlBase + "?username=" + user.username + "&token=" + user.activationToken;
    
    var to = user.email;
    var subject = "Activate your Digital Card Store account";
    var mailBody = '<html><body><br/>Dear ' + user.name + '<br/><a href="' + activationUrl
    + '">Please click to activate your user account ' + user.username + "</a><br/>"
    + "</body></html>";

    var eParams = {
        Destination: {
            ToAddresses: [to]
        },
        Message: {
            Body: {
                Html: {
                    Data: mailBody
                }
            },
            Subject: {
                Data: subject
            }
        },
        Source: fromAddress
    };

    console.log('>>> SENDING EMAIL');
    ses.sendEmail(eParams, function(err, data){
        if (err) {
            console.log(err);
            callback(null, {"result": "Can't send email:" + err});
        }
        else {
            console.log(">>> EMAIL SENT");
            markActivationStatus(user.username, "MAIL_SENT", callback);
        }
    });
}

function findUser(username, callback) {
    var params = {
        TableName: table,
        Key:{
            "username": username
        }
    };
    
    docClient.get(params, function(err, data) {
        if (err) {
            console.error("Unable to read item. Error JSON:", JSON.stringify(err, null, 2));
            callback(null, {"result": "Unable to read item. Error JSON:" + JSON.stringify(err, null, 2)});
        } else {
            console.log("GetItem succeeded:", JSON.stringify(data, null, 2));
            
            var user = data.Item;
            var activationStatus = user == null ? "" : user.activationStatus;
            var activationUrlBase = user == null ? "" : user.activationUrlBase;
            
            if (activationStatus === "NONE")
                sendEmail(data.Item, activationUrlBase, callback);
            else
                callback(null, {"result": "Activation status of user " + username + " is not appropriate. It is " + activationStatus});
        }
    });
}

exports.handler = (event, context, callback) => {
    console.log("Incoming: ", event);

    var username = event.username;

    findUser(username, callback);
};
After putting this Node.js code in a zip file, we can create the Lambda function with the command below. Replace AWS_ACCOUNT_ID with your AWS account ID and specify the mail sender address with the FROM_ADDRESS environment variable.
aws lambda create-function \
--function-name LF_CardStore_SendUserActivationMail \
--description "Send activation mail to the user" \
--runtime nodejs6.10 \
--handler LF_CardStore_SendUserActivationMail.handler \
--zip-file fileb://LF_CardStore_SendUserActivationMail.zip \
--role arn:aws:iam::AWS_ACCOUNT_ID:role/CardStoreLambdaRole \
--environment Variables={FROM_ADDRESS=sender@app.com}

Once created, we can test the Lambda function as shown below. In this sample, because the activation status of user22 was DONE, the mail is not sent.
aws lambda invoke \
--function-name LF_CardStore_SendUserActivationMail \
--payload '{"username":"user22"}' \
out.txt

cat out.txt
{"result":"Activation status of user user22 is not appropriate. It is DONE"}


When we test with another user, the mail is sent.

aws lambda invoke \
--function-name LF_CardStore_SendUserActivationMail \
--payload '{"username":"user23"}' \
out.txt

cat out.txt
{"result":"OK"}

We can see the Lambda functions in AWS Console like the picture below.




4. Create the SF_CardStore_SendUserActivationMail Step Function

Now, we are ready to create the step function that uses the Lambda functions we created. The Amazon States Language definition of the step function is below. Replace AWS_ACCOUNT_ID with your AWS account ID.

{
  "Comment": "Step function to send user activation mail", 
  "StartAt": "GetActivationStatus",
  "States": {
    "GetActivationStatus": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:eu-central-1:AWS_ACCOUNT_ID:function:LF_CardStore_GetUserActivationStatus",
      "Next": "CheckActivationStatus",
      "ResultPath": "$.activationStatus"
    },
    "CheckActivationStatus": {
      "Type" : "Choice",
      "Choices": [
        {
          "Variable": "$.activationStatus",
          "StringEquals": "NONE",
          "Next": "SendActivationMail"
        }
      ],
      "Default": "MailSent"
    },
    "SendActivationMail": {
      "Type" : "Task",
      "Resource": "arn:aws:lambda:eu-central-1:AWS_ACCOUNT_ID:function:LF_CardStore_SendUserActivationMail",
      "Next": "MailSent"
    },
    "MailSent": {
      "Type": "Pass",
      "End": true
    }
  }
}


After saving this definition to the SF_CardStore_SendUserActivationMail.js file, we can create the step function with the command below. Replace AWS_ACCOUNT_ID with your AWS account ID.

aws stepfunctions create-state-machine \
--name SF_CardStore_SendUserActivationMail \
--role-arn arn:aws:iam::AWS_ACCOUNT_ID:role/CardStoreStepFunctionsRole \
--definition "$(cat SF_CardStore_SendUserActivationMail.js)"

We can see the step function in the console like below.



Steps for the step function can be seen like the picture below.



The flow is explained below.

1. The step function starts at the GetActivationStatus task. This task is implemented by calling the LF_CardStore_GetUserActivationStatus Lambda function. The username parameter is expected to exist in the input given when starting the step function execution. The input is passed to the function as is. The result of the function is put into the state machine data under the name activationStatus by the
"ResultPath": "$.activationStatus"
expression. The result will be checked in the next step by using this field name. For more information on input and output management for steps, see here.
The next step is specified with the "Next": "CheckActivationStatus" expression.
2. The next step is CheckActivationStatus, whose type is Choice. In this step, the result of the LF_CardStore_GetUserActivationStatus Lambda function is checked. If the activation status of the user is "NONE", the SendActivationMail step is executed. Otherwise, the MailSent step is executed, which ends the step function without sending a mail.
3. The SendActivationMail step is executed conditionally. This step calls LF_CardStore_SendUserActivationMail. The input parameter username is passed as is. The next step is MailSent, which ends the step function.
4. The MailSent step is marked as the end of the step function.
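The effect of ResultPath in step 1 can be illustrated with a small simulation: the task's raw result is grafted onto the state input at the given path, so the original username is still available to the following states. This is an illustrative sketch, not the real Step Functions engine:

```javascript
// Minimal simulation of how "ResultPath": "$.activationStatus" merges a
// task's result into the state input (illustrative only).
function applyResultPath(input, resultPath, result) {
  const key = resultPath.replace(/^\$\./, ""); // "$.activationStatus" -> "activationStatus"
  return Object.assign({}, input, { [key]: result });
}

const input = { username: "user22" };
const lambdaResult = "NONE"; // what LF_CardStore_GetUserActivationStatus returned
const next = applyResultPath(input, "$.activationStatus", lambdaResult);

console.log(next); // { username: 'user22', activationStatus: 'NONE' }
// The Choice state can now test $.activationStatus while $.username is intact.
```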

We can start a new execution with the command below. Replace AWS_ACCOUNT_ID with your AWS account ID.
aws stepfunctions start-execution \
--state-machine-arn arn:aws:states:eu-central-1:AWS_ACCOUNT_ID:stateMachine:SF_CardStore_SendUserActivationMail \
--input '{"username":"user22"}'


We can see the flow of the execution by clicking the execution in the console. The executed steps are shown in green. In this case, the SendActivationMail step is not executed and the mail is not sent.

When we try another user, the mail is sent by executing the SendActivationMail step, as shown below.



Now our step function is ready for sending activation mails.


5. Create LF_CardStore_UserTableTriggerToSendActivationMail Function for User Table Trigger

The code for the trigger is below. When a record is inserted, deleted or updated, our handler will be called. The handler may be called for more than one record, so we loop over event.Records. event.Records[i].dynamodb.NewImage contains the inserted record. For more information, see here.
We want to send activation mails only when a new record is inserted, so we check whether the value of event.Records[i].eventName is "INSERT".
After the username is extracted, we start a new execution using the AWS.StepFunctions API. The step function ARN is retrieved from an environment variable. We pass the username of the record as the username parameter to the step function.
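Note that a DynamoDB stream record carries attribute values in typed form, which is why the code reads username.S rather than username directly. A hypothetical event for one inserted user looks like this:

```javascript
// Hypothetical DynamoDB stream event for a single inserted User record.
// Attribute values arrive wrapped in type descriptors ("S" = string).
const sampleEvent = {
  Records: [{
    eventName: "INSERT",
    dynamodb: {
      Keys: { username: { S: "user42" } },
      NewImage: {
        username: { S: "user42" },
        activationStatus: { S: "NONE" }
      }
    }
  }]
};

// Extract the usernames of inserted records, the same way the trigger does.
const inserted = sampleEvent.Records
  .filter(r => r.eventName === "INSERT")
  .map(r => r.dynamodb.NewImage.username.S);

console.log(inserted); // [ 'user42' ]
```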

const AWS = require('aws-sdk');

var stepFunctionArn = process.env.STEP_FUNCTION_ARN;

function startSendUserActivationMailStepFunctionExecution(username, callback) {
  console.log("Starting SendUserActivationMail StepFunction Execution for user " + username);
  
  const stepfunctions = new AWS.StepFunctions();
  const params = {
    stateMachineArn: stepFunctionArn,
    input: JSON.stringify({ "username": username })
  };

  // start a state machine
  stepfunctions.startExecution(params, (err, data) => {
    if (err) {
      callback(err, null);
      return;
    }
    
    console.log(data);

    callback(null, 'Started SendUserActivationMail StepFunction Execution for user ' + username);
  });
}


exports.handler = (event, context, callback) => {
    console.log('Received event:', JSON.stringify(event, null, 2));
    event.Records.forEach((record) => {
        console.log(record.eventName);
        console.log('DynamoDB Record: %j', record.dynamodb);

        if (record.eventName === "INSERT") {
            var username = record.dynamodb.NewImage.username;

            if (username)
                startSendUserActivationMailStepFunctionExecution(username.S, callback);
        }
    });
};



After saving the code to a file and zipping it, we can create the Lambda function with the command below. Replace AWS_ACCOUNT_ID with your AWS account ID.
aws lambda create-function \
--function-name LF_CardStore_UserTableTriggerToSendActivationMail \
--description "User Table trigger for starting SF_CardStore_SendUserActivationMail execution" \
--runtime nodejs6.10 \
--handler LF_CardStore_UserTableTriggerToSendActivationMail.handler \
--zip-file fileb://LF_CardStore_UserTableTriggerToSendActivationMail.zip \
--role arn:aws:iam::AWS_ACCOUNT_ID:role/CardStoreLambdaRole \
--environment Variables={STEP_FUNCTION_ARN=arn:aws:states:eu-central-1:AWS_ACCOUNT_ID:stateMachine:SF_CardStore_SendUserActivationMail}

Now we are ready to attach this trigger to the User table.


6. Configure DynamoDB User Table Trigger

To attach a trigger to the User table, we should first enable DynamoDB Streams on the table. This way, when the table is modified by any insert, update or delete action, a new record is added to the stream, and we can attach a Lambda trigger to this stream. For more information, see here.
Go to the DynamoDB console and select the User table. Click Manage Stream, select New image for View type, and click Enable, as shown in the picture below.
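If you prefer the SDK over the console, the same stream configuration can be expressed as UpdateTable parameters. This is only a sketch; the actual call is commented out because it requires AWS credentials:

```javascript
// Sketch: parameters for DynamoDB UpdateTable that enable a stream
// delivering the new image of each modified item, matching the
// "New image" view type selected in the console.
const params = {
  TableName: "User",
  StreamSpecification: {
    StreamEnabled: true,
    StreamViewType: "NEW_IMAGE"
  }
};

// The call itself would be (needs AWS credentials):
// const AWS = require('aws-sdk');
// new AWS.DynamoDB().updateTable(params, function(err, data) { /* ... */ });
console.log(params.StreamSpecification.StreamViewType);
```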



Next, in the Triggers tab, click the Create trigger button. Select Existing Lambda Function from the menu, select LF_CardStore_UserTableTriggerToSendActivationMail from the Function combo, enter 1 as the Batch size, check the Enable trigger checkbox, and then click Create to create the trigger.



Now, the User table trigger is configured.



7. Change the Application

For the whole mechanism to work, we should change our Java application to add the activationUrlBase field to the User table record when creating a user.
We also remove the SQS and SES dependencies from the application, because the activation mail will now be sent by the Lambda function.
The final code can be found at my GitHub repository.
After running the application, the activation mail will be sent by the step function triggered by DynamoDB when a new user is registered.

Summary

In this post, I have shown how to create a serverless user activation mechanism using AWS services. The mechanism is triggered by a DynamoDB trigger and implemented using Step Functions. Lambda functions are used for getting the activation status and sending the mail. The code can be found in my GitHub repository.
I will continue to use various AWS services and blog about them.

Thursday, April 27, 2017

Uploading Images to Amazon S3 Directly from the Browser Using S3 Direct Uploads



In this series of posts, I am writing about various AWS services. In my previous posts, I have written about AWS EC2, Elastic Load Balancing, Auto Scaling, DynamoDB, Amazon Simple Queue Service and Amazon Simple Email Service.

In my last post, I added a user activation functionality to my digital card store application. The application is used for managing digital cards. So far I have added user registration, user session management, selling and buying cards, and user activation functionality. The user can add a new card by specifying a name only.

In this post, I will add an upload function that allows the user to attach an image to a digital card. I will use Amazon S3 to store the uploaded image files.

Upload Functionality

When we think about uploading a file, the first option that comes to mind is to upload the file to an EC2 instance from the browser and then send the file to Amazon S3 from the EC2 instance.
While this method accomplishes the image upload requirement, there is a better one. In 2012, Amazon announced CORS support for Amazon S3, which allows any web application to upload files to S3 directly. This enables quick and efficient uploads and eliminates proxying the upload requests.

The picture below shows the upload process.



To use direct uploads to S3, we should follow the steps below.

1. Enable CORS support for the bucket.
2. Configure access permissions.
3. Develop the signing part on the server.
4. Prepare the web front end.

In this post, I will start with the code from my last post. The code can be found here. In the post I wrote about DynamoDB, I generated the Card entity class with an imageURL field. In this post, I will use this field to hold the URL of the uploaded image file. There will be no change in the entity class or the CardController class. After the image is uploaded to S3, its URL will be passed as imageURL in the add card request. The card will be persisted with this URL to DynamoDB, and it will be used as the card image URL to show the card image in the card listing table. The final code for this post can be found here.

Let's start.

1. Enable CORS support for the bucket

To be able to use direct uploads from any web application, the target S3 bucket should be configured to allow requests from a different domain. For more information, see the S3 CORS documentation.

To enable CORS support using the Amazon Console, use the steps below. To use the AWS CLI, see here.

  • Login to the Amazon Console and select S3
  • Select your bucket and click Properties
  • Click Permissions and then click Edit CORS Configuration
  • Paste the configuration below, click Save, and then click Close


<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
       <CORSRule>
             <AllowedOrigin>*</AllowedOrigin>
             <AllowedMethod>GET</AllowedMethod>
             <AllowedMethod>POST</AllowedMethod>
             <AllowedMethod>PUT</AllowedMethod>
             <AllowedHeader>*</AllowedHeader>
       </CORSRule>
</CORSConfiguration>

Please note that I allowed any origin to post to the bucket for ease of development. To provide robust security in production, please restrict the domains accordingly.

2. Configure access permissions

To allow uploads to the S3 bucket, the bucket should be writable. There are a few options for this. The first option is to make the bucket public, but then everybody can write to your bucket and you take the risk of uncontrolled uploads. The second option is to make the bucket writable by a specific IAM user and sign the upload requests with this user's credentials. This option is more secure than the first, and it is the one I will use in this post. Please consider the security options before using the upload function in production.

While using signed requests, S3 expects your upload requests to include an upload policy and a signature. The signature is prepared using your IAM credentials. You can use any IAM credentials, but for stricter security, use a dedicated IAM user that has write access only to the target S3 bucket. This way, you can be sure that in case of a credential disclosure, only the specific S3 bucket is affected, not any other AWS resources. For even stricter security, you can use Temporary Security Credentials.

To configure access permissions using the Amazon Console, use the steps below. To use the AWS CLI, see here.

  • Login to the Amazon Console and select S3
  • Select your bucket and click Properties
  • Click Permissions and then click Edit bucket policy
  • Paste the policy below and click Save


{
       "Version": "2012-10-17",
       "Statement": [
             {
                    "Effect": "Allow",
                    "Principal": {
                           "AWS": "arn:aws:iam::XXXXX:user/mys3user"
                    },
                    "Action": "s3:PutObject",
                    "Resource": "arn:aws:s3:::mys3bucket/*"
             },
             {
                    "Effect": "Allow",
                    "Principal": {
                           "AWS": "arn:aws:iam::XXXXX:user/mys3user"
                    },
                    "Action": "s3:PutObjectAcl",
                    "Resource": "arn:aws:s3:::mys3bucket/*"
             }
       ]
}

This policy allows our IAM user to upload a file and set its access level to make it readable by everyone (public-read). We make the uploaded files publicly readable so the browser can show the card images directly from S3 when listing cards. If you don't want to make the images public, you can generate temporary signed image URLs to show the card images in the browser. You can find more information here.


3. Develop the signing part on the server

First, we create a Spring controller to generate the signature. The class uses the bucket name, the bucket region, the AWS credential for signing the bucket uploads, and the secret of that credential as variables. After generating the signature, the controller returns the signed upload data that will be used in the upload request.

@RestController
public class CardUploadController {

       @Value("${user.card.upload.s3.bucket.name}")
       String s3BucketName;

       @Value("${user.card.upload.s3.bucket.region}")
       String s3BucketRegion;

       @Value("${user.card.upload.s3.bucket.awsId}")
       String s3BucketAwsId;

       @Value("${user.card.upload.s3.bucket.awsSecret}")
       String s3BucketAwsSecret;

       @RequestMapping(value = "/presign", method = RequestMethod.POST)
       @ResponseBody
       public PreSignedS3UploadData presignS3Upload(@RequestParam("contentType") String contentType, @RequestParam("fileName") String fileName, HttpSession session) {
             PreSignedS3UploadData res;
             try {
                    String extension = fileName.lastIndexOf('.') == -1 ? "" : fileName.substring(fileName.lastIndexOf('.'));
                    String s3FileName = "upload_" + (int)(100000 * Math.random()) + extension;
                   
                    res = S3SignUtil.generatePreSignedUploadData(s3BucketName, s3BucketRegion, s3BucketAwsId, s3BucketAwsSecret, contentType, s3FileName);
             }
             catch (Exception e) {
                    res = new PreSignedS3UploadData("Can't generate signature for upload: " + e.toString());
             }
             return res;
       }
 }

The signature generation algorithm first generates a security policy, then generates a signing key from the AWS credentials, and then signs the policy with the signing key. The security policy specifies the expiration date and time, the ACL for the file being uploaded, and some other options. This application generates a policy with a 3-minute expiration time, a public-read ACL to allow public access, and a 1 MB maximum upload size. A sample policy looks like the one below.

{
       "expiration": "2017-04-26T23:09:59.638Z",
       "conditions": [
             { "acl": "public-read" },
             { "bucket": "XXXXX" },
             { "key": "upload_49921.jpg" },
             { "Content-Type": "image/jpeg" },
             ["content-length-range", 0, 1048576],
             { "x-amz-credential": "XXXXXXXX/20170426/eu-central-1/s3/aws4_request" },
             { "x-amz-algorithm": "AWS4-HMAC-SHA256" },
             { "x-amz-date": "20170426T000000Z" }
       ]
}

The policy and signature generation code is in the S3SignUtil class.

For more information on generating the policy and signature, see here.


4. Prepare the web front end.

After completing the code that generates the signed upload data, we can prepare the web front end. We will change the dashboard.jsp file and add a file upload input to the Add Card form. When the selection changes in the file upload input, we generate a signature using the controller we created in the 3rd step. Then we generate a dynamic form to post the file with the signature to the S3 bucket URL.

The script is below.

function cardImageFileUpdated(){ 
       var file = document.getElementById('cardImageInput').files[0];
      
       if (file != null)
             startCardImageFileUpload(file);
}

function startCardImageFileUpload(file) {
       $.ajax({
         type: "POST",
         url: "presign",
         data: 'contentType=' + encodeURIComponent(file.type) + '&fileName=' + encodeURIComponent(file.name),
         success: function(data){
                if (data.errorMessage)
                    alert(data.errorMessage);
               else
                    doCardImageFileUpload(file, data);
               },
       });
}


function doCardImageFileUpload(file, data){

       var formData = new FormData();
      
       formData.append('key', data.fileName);
       formData.append('acl', 'public-read');
       formData.append('Content-Type', data.contentType);
       formData.append('X-Amz-Credential', data.credential);
       formData.append('X-Amz-Algorithm', "AWS4-HMAC-SHA256");
       formData.append('X-Amz-Date', data.date);
       formData.append('Policy', data.policy);
       formData.append('X-Amz-Signature', data.signature);
       formData.append('file', $('input[type=file]')[0].files[0]);
      
       $.ajax({
           url: data.bucketUrl,
           data: formData,
           type: 'POST',
           contentType: false,
           processData: false,
           success: function () {
              var imageUrl = data.bucketUrl + "/" + data.fileName;
          
              document.getElementById('cardImagePreview').src = imageUrl;
              document.getElementById('cardImageUrl').value = imageUrl;
           },
           error: function () {
              alert("Upload error.");
           }
       });
}
  
And we change the Add Card form from

<form id="add-card-form" onsubmit="return false;">
       <input type="text" name="name" placeholder="name" />
       <button onclick="addCard()">Add</button>
</form>

to

<form id="add-card-form" onsubmit="return false;">
       <span>Card Name</span>
       <input type="text" name="name" placeholder="name" /><br/>
      
       <span>Card Image File</span>
       <input type="file" id="cardImageInput" accept="image/*" onchange="cardImageFileUpdated()"/>
       <img style="border:1px solid gray;height:160px;width:120px;" id="cardImagePreview" src="/images/default-card.png"/>
       <input type="hidden" id="cardImageUrl" name="imageUrl" value="/images/default-card.png"/> <br/>
                          
       <button onclick="addCard()">Add</button>
</form>

First, we add an input with the file type to select the image file, and we use an image tag to preview the image after the upload completes. Then we add a hidden input for the imageUrl field to the Add Card form, as mentioned at the beginning of this post.

At this point, we have finished uploading the card image to S3 and saving the card with the URL of the uploaded file. Next, we will show the images of the cards in the card listing tables. We change the buildHtmlTable JavaScript function in dashboard.jsp from

if (cellValue == null) cellValue = "";
row$.append($('<td/>').html(cellValue));

to

if (cellValue == null) cellValue = "";
if (columnList[colIndex] == 'imageUrl')
 cellValue = '<img style="border:1px solid gray;height:160px;width:120px;" src="' + cellValue + '"/>';
row$.append($('<td/>').html(cellValue));

to use the imageUrl field as the card image.

After showing the card images in the card listings, we have completed the changes. If you run the application with this command,

$ mvn spring-boot:run -Drun.jvmArguments="-Duser.activation.queue.name=XXX -Dmail.from.address=XXX -Duser.card.upload.s3.bucket.name=XXX -Duser.card.upload.s3.bucket.region=XXX -Duser.card.upload.s3.bucket.awsId=XXX -Duser.card.upload.s3.bucket.awsSecret=XXX"

you can use the application like the screenshots below.






Summary

In this post, I have shown how to add file upload functionality for card images. I used direct S3 uploads from the browser, without uploading the file to an EC2 instance first. The code can be found at my GitHub repository.

In my next posts, I will continue to use various AWS services to add functionality to my digital card store application.