Accessing SQL Server in AWS Lambda – Part 3

This is Part 3 of a series detailing everything you need to get going on your serverless architecture whilst using your existing SQL Server database. View the source on GitHub or check out Parts 1 & 2 if you haven't already.

Step 6: Handlers
So far we haven't done anything specific to AWS Lambda, so no wonder you are still confused. Now that we have mastered connecting to your SQL Server (even on RDS) in Part 2, we need to be able to handle Lambda events.
To do this, add a new block of code above dependencies in your build.gradle, defining provided as a type of dependency, and then add the following to your dependency list: provided 'com.amazonaws:aws-lambda-java-core:1.0.0'. It should look something like this now:

configurations {
    provided
    compile.extendsFrom provided
}
 
// In this section you declare the dependencies for your production and test code
dependencies {
    compile fileTree(dir: 'libs', include: ['*.jar'])
    provided 'com.amazonaws:aws-lambda-java-core:1.0.0'
    testCompile 'junit:junit:4.12'
    testCompile 'com.fasterxml.jackson.core:jackson-databind:2.8.3'
}

Again, refresh your IDEA module with 'gradlew cIM iM' (last time, I promise).

We then need to be able to receive the JSON data posted to the Lambda event. This is done using AWS serialisation, so all we need to do is create a plain old Java object that matches the fields of the expected payload. So let's create another Java class in your existing domain package, RequestClass.java:

package com.mannanlive.domain;
 
public class RequestClass {
    private Integer value;
    public RequestClass() { }
    public RequestClass(Integer value) { this.value = value; }
    public Integer getValue() { return value; }
    public void setValue(Integer value) { this.value = value; }
}
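The response we send back to the caller also needs a matching POJO. The handler we create next returns a TimeResponseClass; if you don't already have one from the earlier parts, here is a minimal sketch, assuming it carries just the currentTime string that AWS serialises back to JSON (the exact fields are my assumption):

```java
// A sketch of TimeResponseClass; lives alongside RequestClass in
// com.mannanlive.domain (package declaration omitted here for brevity).
public class TimeResponseClass {
    private String currentTime;

    // AWS serialisation needs a no-arg constructor plus getters/setters.
    public TimeResponseClass() { }
    public TimeResponseClass(String currentTime) { this.currentTime = currentTime; }

    public String getCurrentTime() { return currentTime; }
    public void setCurrentTime(String currentTime) { this.currentTime = currentTime; }
}
```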

Now we need to be able to actually receive the Lambda event. To do this, create a handler package and a TimeHandler.java class:

package com.mannanlive.handler;
 
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.mannanlive.domain.RequestClass;
import com.mannanlive.domain.TimeResponseClass;
import com.mannanlive.query.TimeQuery;
import com.mannanlive.repository.QueryDao;
 
public class TimeHandler implements RequestHandler<RequestClass, TimeResponseClass> {
    public TimeResponseClass handleRequest(RequestClass request, Context context) {
        TimeQuery query = new TimeQuery(request.getValue());
        return new QueryDao().executeSingleton(query);
    }
}

Now, when we send {"value":"1"} to our Lambda function, we expect it to query our SQL Server database for the current date plus one day, then return this to the caller as {"currentTime":"2016-10-09"}. Not that impressive for all that work, but you could expand this to do any data access queries and scale as large as your SQL Server can handle.
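As a sanity check on the expected behaviour, the date arithmetic the Lambda performs can be sketched locally with java.time. This is only an illustration of the logic, not the article's code; the real work happens in TimeQuery and QueryDao from Part 2, against SQL Server:

```java
import java.time.LocalDate;

// Local sketch of what the Lambda effectively computes: the SQL query adds
// `value` days to the current date and returns it as an ISO-8601 date string.
public class TimeQuerySketch {
    public static String addDays(LocalDate today, int value) {
        // LocalDate.toString() yields e.g. "2016-10-09"
        return today.plusDays(value).toString();
    }

    public static void main(String[] args) {
        System.out.println(addDays(LocalDate.of(2016, 10, 8), 1)); // prints 2016-10-09
    }
}
```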

Step 7: Deploying to Lambda
In your build.gradle, at the very top, add the following:

import jp.classmethod.aws.gradle.lambda.AWSLambdaMigrateFunctionTask
 
// grab the AWS plugin
buildscript {
    repositories {
        maven { url 'https://plugins.gradle.org/m2/' }
    }
    dependencies {
        classpath 'jp.classmethod.aws:gradle-aws-plugin:0.21.1'
    }
}
apply plugin: 'jp.classmethod.aws.lambda'

This adds a new plugin to the project that will allow us to easily deploy our code to AWS. Now we need to define a task to do this, so add the following at the bottom of the build.gradle:

jar {
    dependsOn configurations.runtime
    from {
        (configurations.runtime - configurations.provided).collect {
            it.isDirectory() ? it : zipTree(it)
        }
    }
}
 
aws {
    profileName = 'default'
    region = 'ap-southeast-2'
}
 
// migrate or create the function to AWS
task migrateTimeQuery(type: AWSLambdaMigrateFunctionTask, dependsOn: jar) {
    role = "arn:aws:iam::${aws.accountId}:role/lambda"
    zipFile = jar.archivePath
    runtime = 'java8'
    timeout = 10
    functionName = 'query-date'
    handler = "com.mannanlive.handler.TimeHandler::handleRequest"
}

This is doing three things. Firstly, when Gradle produces a .jar file with your code, it will pull everything into a "fat jar", including any libraries you use. This is needed because we want to include the Microsoft SQL Server driver jar. Secondly, it is telling the AWS plugin what credentials to use. This is where you have to remember the steps you did in Part 1: if you changed the credentials profile to something other than default, you need to change it here too. The final part defines a new task of type Lambda migration. You can change the function name and timeout here, and likewise, if you changed the role name in IAM to something other than "lambda", you have to change it here as well. Once this is correct, you can run 'gradlew migrateTimeQuery' and it will push your code to AWS. Hopefully it looks a little like this:

gradlew migrateTimeQuery
:compileJava
:processResources UP-TO-DATE
:classes
:jar
:migrateTimeQuery
Function not found: arn:aws:lambda:ap-southeast-2:012345678912:function:query-date
Creating function... query-date

BUILD SUCCESSFUL

Total time: 11.39 secs

Step 8: Testing your Lambda
Now you can go into the AWS Console and view your Lambda Functions. Click on query-date and then Test. You can try with any JSON and it should come back with the current date. You can also pass in {"value":"-1"} and it should return yesterday's date. Success!
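If you prefer the command line to the console, the same test can be sketched with the AWS CLI (the output file name is arbitrary; on AWS CLI v2 you may also need --cli-binary-format raw-in-base64-out for a raw JSON payload):

```shell
# Invoke the deployed function with a test payload and print the response.
aws lambda invoke \
  --function-name query-date \
  --payload '{"value":"1"}' \
  response.json
cat response.json
```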
If you get a timeout issue like I did, go to the Configuration tab, then Advanced Settings, and select a VPC, subnets and a security group that allow access to the RDS database.
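The same VPC change can also be applied from the AWS CLI; this is just a configuration sketch, and the subnet and security group IDs below are placeholders you would swap for your own:

```shell
# Attach the function to a VPC so it can reach the RDS instance.
# subnet-aaaa, subnet-bbbb and sg-cccc are hypothetical IDs.
aws lambda update-function-configuration \
  --function-name query-date \
  --vpc-config SubnetIds=subnet-aaaa,subnet-bbbb,SecurityGroupIds=sg-cccc
```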

Let me know if you found this tutorial useful, or you have any issues in the comments below.

