TRANSCRIPT
AWS Lambda Functions 9/22/15 & 9/24/15
CS 6030 Tyler Bayne
Installing Java
1. Go to http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html and download and install the latest JDK 8.
Installing & Configuring Eclipse
2. Go to https://eclipse.org/downloads/
3. Download and install Eclipse IDE for Java EE Developers
4. Follow http://docs.aws.amazon.com/AWSToolkitEclipse/latest/GettingStartedGuide/tke_setup_install.html to install the AWS Toolkit for Eclipse
5. Go to http://docs.aws.amazon.com/AWSToolkitEclipse/latest/GettingStartedGuide/tke_setup_creds.html to set up the Amazon Web Services Access Credentials
Creating a Lambda Function project
6. Go to File -> New -> Other
7. Expand AWS
8. Select AWS Lambda Java Project
9. Project Name: LambdaExample
10. Package Name: example
11. Keep the rest of the settings the same so that it looks like this:
12. Click Finish
13. Close the Welcome menu so that you can see your new project
14. Open up LambdaFunctionHandler.java inside the LambdaExample/src/example folders.
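The LambdaFunctionHandler.java that the AWS Toolkit generates looks roughly like the sketch below. The interfaces here are simplified stand-ins for the real types in com.amazonaws.services.lambda.runtime and com.amazonaws.services.lambda.runtime.events; they are included only so the shape of the handler is clear outside of the project.

```java
// Simplified stand-ins for the AWS Lambda runtime types
// (com.amazonaws.services.lambda.runtime.*). A sketch, not the actual SDK.
interface LambdaLogger { void log(String message); }

interface Context { LambdaLogger getLogger(); }

interface RequestHandler<I, O> { O handleRequest(I input, Context context); }

// Stand-in for com.amazonaws.services.lambda.runtime.events.S3Event.
class S3Event { }

class LambdaFunctionHandler implements RequestHandler<S3Event, Object> {
    @Override
    public Object handleRequest(S3Event input, Context context) {
        // The generated stub simply logs the incoming event and returns null.
        context.getLogger().log("Input: " + input);
        return null;
    }
}
```

The important parts are the `S3Event input` parameter, which carries the parsed event, and the `Context`, whose logger feeds the Log output we look at later.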
15. As the name implies, this is the handler for your lambda function. We'll modify this handler to perform various actions when an S3 Event is fired.
Testing the Lambda Function
16. There are 3 ways to test this function: locally through Eclipse, on AWS through Eclipse, or on AWS through a web browser.
17. To test locally through Eclipse, navigate to the tst/example folder and you'll see a LambdaFunctionHandlerTest.java file and an s3-event.put.json file.
18. These files are used for writing unit tests of the handler function. LambdaFunctionHandlerTest.java parses the example S3 event in the json file and passes it to the main handler. We can edit the json locally to test different scenarios.
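For orientation, an S3 Put event in that json file is shaped roughly like the abridged sketch below. The bucket and key values here are placeholders, and the real file generated by the toolkit contains additional fields:

```json
{
  "Records": [
    {
      "eventVersion": "2.0",
      "eventSource": "aws:s3",
      "eventName": "ObjectCreated:Put",
      "s3": {
        "bucket": { "name": "sourcebucket" },
        "object": { "key": "HappyFace.jpg", "size": 1024 }
      }
    }
  ]
}
```

Editing fields such as `eventName` or `object.key` is how you make the local test exercise different scenarios.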
19. Open up LambdaFunctionHandlerTest.java and click the green arrow at the top. Your handler will run locally and its output will appear in the Console view.
20. The second way is to run it on AWS through the web browser. Right click on the project (LambdaExample at the top level) and choose Amazon Web Services -> Upload function to AWS Lambda
21. Select your region and choose a name for your new function.
22. Click Next, write a brief description, and then select the IAM Role and S3 bucket that it will use. You will have to create these if they are not already created.
23. Click Finish and the lambda function will upload to AWS.
24. Go to the AWS Console in your web browser and choose Lambda from the list of services.
25. You’ll see your new function in the function list.
26. Click on it to go to the settings.
27. Click the Actions dropdown menu and then choose Configure sample event
28. Choose S3 Put from the dropdown menu. It probably has Hello World in it already.
29. Click Submit and your Lambda function will start running. The output is shown at the bottom in two places. The first place, underneath Execution result, shows what your function returned. We have "null" here because our function is currently returning null.
30. The other spot is the Log output, which shows the logging statements from the function.
31. This same output is recorded using the CloudWatch service. Go back to the main AWS Console page and select CloudWatch.
32. Click Logs on the left
33. Choose your lambda function from the list
34. You'll now see a list of Log Streams with timestamps. A new one is generated every time the lambda function is run through AWS. Click on the latest one.
35. The Log Output (Context logger logs) from before is shown here.
36. The last way to run the function is back in Eclipse. Right click on the project name and choose Amazon Web Services -> Run function on AWS Lambda.
37. Click Invoke to just use the default json as your input. You'll notice that it's the same S3 Put that we had just used in our web browser.
38. The function will upload to AWS, run, and then print out the execution results in the console.
Modifying the Lambda Function
39. Let's first have our function actually return a value. Replace return null with return "Hello World!" at the end of the handleRequest method in LambdaFunctionHandler.java
40. Test the function locally in Eclipse to verify that it works correctly. You should see this:
41. Next, let's start reading in properties from our S3 Event. Open up the s3-event.put.json and look at the various properties. Go back to LambdaFunctionHandler.java. To read the first record, we can call input.getRecords().get(0), and then we can get the different properties by appending getters to that, like input.getRecords().get(0).getEventName()
42. Wrap that in the logger command to get context.getLogger().log("Event Name: " + input.getRecords().get(0).getEventName() + "\n");
43. Run the function in Eclipse again and you’ll see the following output:
44. The last property we want to get is the key. This is the name of the file that is
referenced in the S3 Event. Wrap the following code in another logger command: input.getRecords().get(0).getS3().getObject().getKey()
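The getter chains from steps 41 and 44 can be exercised outside of AWS with the sketch below. The classes are simplified stand-ins for the real aws-lambda-java-events model (the real getters have the same names, but the real classes carry many more fields):

```java
import java.util.List;

// Simplified stand-ins for the aws-lambda-java-events classes.
class S3ObjectEntity {
    private final String key;
    S3ObjectEntity(String key) { this.key = key; }
    String getKey() { return key; }
}

class S3Entity {
    private final S3ObjectEntity object;
    S3Entity(S3ObjectEntity object) { this.object = object; }
    S3ObjectEntity getObject() { return object; }
}

class S3EventRecord {
    private final String eventName;
    private final S3Entity s3;
    S3EventRecord(String eventName, S3Entity s3) {
        this.eventName = eventName;
        this.s3 = s3;
    }
    String getEventName() { return eventName; }
    S3Entity getS3() { return s3; }
}

class S3Event {
    private final List<S3EventRecord> records;
    S3Event(List<S3EventRecord> records) { this.records = records; }
    List<S3EventRecord> getRecords() { return records; }
}

class PropertyChainDemo {
    public static void main(String[] args) {
        // Build an event resembling the S3 Put sample.
        S3Event input = new S3Event(List.of(
            new S3EventRecord("ObjectCreated:Put",
                new S3Entity(new S3ObjectEntity("HappyFace.jpg")))));

        // The same chains used in the handler:
        System.out.println("Event Name: " + input.getRecords().get(0).getEventName());
        System.out.println("Key: " + input.getRecords().get(0).getS3().getObject().getKey());
    }
}
```

In the real handler you would wrap each chain in context.getLogger().log(...) rather than System.out.println.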
45. Test it in Eclipse and you should see something like:
46. Let's run this lambda function in a real-world scenario now. First, upload it to AWS as you did before. Make sure you select your existing function in the upload menus. Your existing function won't show up if you select a different region.
47. Go to the Lambda page in the AWS Console again.
48. Test your function with the S3 Put event just to make sure it works before our real test.
49. You should see something similar to this Log output:
50. Click on Event sources on the lambda test page and select Add event source
51. Choose S3 from the dropdown and the Event Type of Object Created -> Put
52. Make sure Enable Now is checked and click Submit. You'll see your new event source in the list:
53. Go back to the main AWS Console and choose S3
54. Click on the bucket you created during your first AWS test. You'll now be at a page where you can upload files and create new folders.
55. Go ahead and upload a test file. You can use any file, or you can use this one as an example: http://wmich.edu/profiles/wmu/themes/wmu_andalusian/images/w.svg
56. After it's uploaded, go to the CloudWatch page and look at the latest log.
57. You'll see that our function automatically fired when we uploaded a file to the S3 Bucket service and then it logged it here.
Improve this function
On your own, try to edit the function so that it outputs other properties like the eventName, bucket name, key (file name), eventTime, and the principalId of the bucket owner. Test it by uploading the function to AWS Lambda, uploading files to the S3 Bucket, and then checking the CloudWatch logs to see the properties you printed out.
More Improvements
Make a function that reads any uploads to a specific S3 Bucket and moves them to a different S3 Bucket. An example of this could be that you have an S3 Bucket called Dropbox and any files in there get processed and moved to the Archived S3 Bucket after the files are processed.
A previous version of these instructions said "folder" instead of S3 Bucket. Because of that, you can use separate folders inside the same S3 bucket or use separate S3 buckets for this improvement. For example, scenario 1 has a single S3 bucket with two folders inside of it: Dropbox & Archived. Any files that get uploaded to the Dropbox folder get sent to the Archived folder. Scenario 2 has two S3 buckets. One is named Dropbox and the other is named Archived. Any files that get uploaded to the Dropbox S3 bucket get sent to the Archived S3 bucket. Choose ONE scenario.
Amazon has a walkthrough that does something similar here: https://docs.aws.amazon.com/lambda/latest/dg/java-wt-s3-log-event-data.html
References
There are references throughout this guide, but other references that weren't mentioned were:
http://docs.aws.amazon.com/lambda/latest/dg/java-gs.html
http://docs.aws.amazon.com/lambda/latest/dg/intro-invocation-modes.html
http://docs.aws.amazon.com/lambda/latest/dg/API_Reference.html
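As a starting point for the two-bucket version of the "More Improvements" exercise, the move itself is just a copy followed by a delete. The sketch below models the Dropbox and Archived buckets as in-memory maps so it runs anywhere; in the real handler you would replace the map operations with the AWS SDK for Java calls noted in the comments:

```java
import java.util.HashMap;
import java.util.Map;

// In-memory stand-ins for the two S3 buckets from the exercise.
// With the real AWS SDK you would use an AmazonS3 client instead:
//   s3.copyObject(srcBucket, key, dstBucket, key);
//   s3.deleteObject(srcBucket, key);
class BucketMoveDemo {
    static final Map<String, byte[]> dropbox = new HashMap<>();
    static final Map<String, byte[]> archived = new HashMap<>();

    // Called for each record of the S3 Put event; copy + delete = move.
    static void moveToArchive(String key) {
        byte[] body = dropbox.get(key);  // the uploaded object
        archived.put(key, body);         // s3.copyObject(...)
        dropbox.remove(key);             // s3.deleteObject(...)
    }

    public static void main(String[] args) {
        dropbox.put("w.svg", new byte[]{1, 2, 3});
        moveToArchive("w.svg");
        System.out.println("dropbox contains w.svg: " + dropbox.containsKey("w.svg"));
        System.out.println("archived contains w.svg: " + archived.containsKey("w.svg"));
    }
}
```

Inside the Lambda handler, the key to move comes from the event itself via input.getRecords().get(0).getS3().getObject().getKey(), so the function fires on every upload and archives that specific file.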