AWS Made Easy

Tip #17: Serverless computing: AWS Lambda vs AWS Step Functions

How to use AWS Lambda and AWS Step Functions to run your serverless computing workloads

If you are looking into serverless computing on AWS, there is no way around AWS Lambda. Introduced in 2014, AWS Lambda is a highly scalable serverless computing service that lets you run code for virtually any application or service without provisioning or managing servers. You can trigger Lambda functions directly from the web and from over 200 AWS services. AWS Lambda natively supports Node.js, Python, Java, C#, Go, PowerShell, and Ruby.
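As a minimal sketch of what a Lambda function looks like in Python: the handler receives an event and a context object and returns a response. The function name, event shape, and response format below are illustrative assumptions, not from a specific application.

```python
import json

def lambda_handler(event, context):
    """Minimal Lambda handler: reads a value from the event and
    returns an API Gateway-style response (illustrative)."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

When deployed, this handler can be wired to any of the 200+ event sources mentioned above; the event payload shape then depends on the triggering service.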

AWS Lambda vs AWS Step Functions

What happens if you hit Lambda's 15-minute timeout? What can you do when your Lambda function starts to grow in size and complexity? That is when you should consider AWS Step Functions: it lets you break a single, oversized Lambda function into an orchestrated, distributed microservices application.

Focus on higher-value business logic while AWS Step Functions provides:

  • Error handling, retry, and rollback capabilities
  • Native integration with other AWS services
  • Parallelization
  • High observability
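To illustrate the error handling, retry, and rollback capabilities listed above: Step Functions workflows are defined in the Amazon States Language (JSON). Here is a sketch of one, written as a Python dict for readability; the state names and Lambda ARNs are hypothetical placeholders.

```python
import json

# Hypothetical state machine: one Lambda task that retries transient
# failures with exponential backoff, then falls through to a rollback
# step if the error persists. All names and ARNs are placeholders.
state_machine = {
    "Comment": "Retry a Lambda task, roll back on persistent failure",
    "StartAt": "ProcessOrder",
    "States": {
        "ProcessOrder": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:process-order",
            "Retry": [
                {
                    "ErrorEquals": ["States.TaskFailed"],
                    "IntervalSeconds": 2,
                    "MaxAttempts": 3,
                    "BackoffRate": 2.0,
                }
            ],
            "Catch": [
                {"ErrorEquals": ["States.ALL"], "Next": "RollBack"}
            ],
            "End": True,
        },
        "RollBack": {
            "Type": "Task",
            "Resource": "arn:aws:lambda:us-east-1:123456789012:function:undo-order",
            "End": True,
        },
    },
}

# The JSON form is what you would actually upload to Step Functions.
definition_json = json.dumps(state_machine, indent=2)
```

The `Retry` and `Catch` blocks replace error-handling code you would otherwise write inside the Lambda function itself, which is how Step Functions lets the function body stay focused on business logic.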

With its event-driven architecture, you can even build workflows that include human intervention, e.g. when your application requires manual approvals.

How to optimize for performance and costs

When it comes to serverless computing, optimizing your Lambda functions can make a huge difference. Faster response times make customers happy, and less compute time also means lower costs. There are several areas where you can optimize your code, such as:

  • Cold starts: if your function hasn’t been invoked for some time, the first execution will be slower, since a new execution environment must be created and your code downloaded and initialized. Use tree-shaking and other techniques to minimize the deployment package size. You can avoid cold starts altogether with provisioned concurrency, which keeps execution environments initialized and ready.
  • Initialize outside of the function scope: initialize reusable objects and components, such as SDK clients and database connections, outside the handler code. This reduces execution time, since initialization then happens only during a cold start rather than on every invocation.
  • Caching: whenever feasible, avoid invoking your database or external APIs repeatedly, and store the data in a local cache. You can create a cache instance outside the function scope and reuse it while your function is “hot”, or use the ephemeral storage (/tmp), whose maximum size has recently been increased to 10 GB.
  • Memory allocation: AWS charges for execution time and allocated memory. Increasing memory size costs more per millisecond but can reduce execution time. Use AWS Compute Optimizer to find the sweet spot that balances performance and cost.
  • Parallel asynchronous execution: whenever feasible, use parallel execution to reduce the overall response time.
  • External APIs: If you invoke external APIs with long response times, consider using AWS Step Functions to avoid paying for compute time when waiting for the response.
  • ARM vs. x86: if your code can run on ARM, switch to Graviton to benefit from up to 34% better price performance.
  • Concurrency settings: check out provisioned concurrency, auto-scaling, and reserved concurrency to avoid throttling during workload peaks.
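The “initialize outside of the function scope” and “caching” bullets above can be sketched together in Python. `EXPENSIVE_RESOURCE`, `fetch_with_cache`, and the event shape are illustrative assumptions standing in for a real SDK client and data source.

```python
import time

# Module-level initialization: runs once per cold start, then is reused
# across warm invocations. Stands in for an SDK client or DB connection.
EXPENSIVE_RESOURCE = {"created_at": time.time()}

# Simple module-level cache, also reused while the function stays "hot".
_cache = {}

def fetch_with_cache(key, loader):
    """Return the cached value for key, calling loader(key) only on a miss."""
    if key not in _cache:
        _cache[key] = loader(key)
    return _cache[key]

def lambda_handler(event, context):
    # Hypothetical loader; a real one would query a database or API.
    value = fetch_with_cache(event["key"], lambda k: f"loaded:{k}")
    return {
        "resource_age_seconds": time.time() - EXPENSIVE_RESOURCE["created_at"],
        "value": value,
    }
```

On a warm invocation, both the resource and any cached keys survive, so repeated lookups skip the expensive load entirely.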
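The parallel-execution bullet can be sketched with Python’s asyncio. The simulated calls below stand in for real network requests (which would use an async HTTP client); run sequentially they would take the sum of the delays, run in parallel they take roughly the longest one.

```python
import asyncio
import time

async def call_api(name, delay):
    # Simulated slow downstream call; a real one would be an async
    # HTTP or database request (illustrative placeholder).
    await asyncio.sleep(delay)
    return name

async def fan_out():
    # Three independent calls issued concurrently instead of one by one.
    start = time.monotonic()
    results = await asyncio.gather(
        call_api("users", 0.1),
        call_api("orders", 0.1),
        call_api("stock", 0.1),
    )
    elapsed = time.monotonic() - start
    return results, elapsed

results, elapsed = asyncio.run(fan_out())
```

Inside a Lambda handler you would typically wrap the `asyncio.run(...)` call in the synchronous handler function, since the Python runtime invokes the handler synchronously.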
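The memory-allocation trade-off can be shown with back-of-the-envelope arithmetic. The figures below assume the published x86 price of about $0.0000166667 per GB-second (us-east-1 at the time of writing; verify current pricing), and the durations are hypothetical.

```python
# Rough Lambda compute cost per one million invocations.
# Price is an assumption: ~$0.0000166667 per GB-second (x86, us-east-1).
PRICE_PER_GB_SECOND = 0.0000166667

def cost_per_million(memory_mb, duration_ms):
    """Compute cost in USD for 1M invocations at the given memory/duration."""
    gb_seconds = (memory_mb / 1024) * (duration_ms / 1000)
    return gb_seconds * PRICE_PER_GB_SECOND * 1_000_000

# Hypothetical CPU-bound workload: doubling memory also doubles the CPU
# share, here cutting duration by more than half, so the bigger size wins.
slow = cost_per_million(128, 1000)  # 128 MB, 1000 ms per invocation
fast = cost_per_million(256, 400)   # 256 MB,  400 ms per invocation
```

This is the kind of comparison AWS Compute Optimizer automates across your real invocation data.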

