Serverless

Serverless computing is a fully managed cloud service model in which the cloud provider manages the underlying infrastructure and automatically allocates and de-allocates resources to match the application's immediate demand. Developers don't need to worry about how the underlying compute infrastructure is provisioned; instead, they can focus on implementing business features and functionality. In short, developers write application logic that is deployed as functions and executed in response to events triggered by the application or the user. Serverless computing is billed per usage, making it a cost-effective choice when unexpected changes in application load occur often.

Serverless computing is a recent development in cloud computing that has gained significant popularity, built on the idea of executing code without managing or provisioning servers. The Oracle Cloud Infrastructure (OCI) implementation of a serverless platform, OCI Functions, is based on the Fn Project: an open source, container-native, serverless engine that can run in any cloud and on-premises.

Benefits

Serverless provides multiple benefits over other services available in the cloud and has an edge because of:

  • Reduced operational overhead: Serverless computing removes the overhead of managing infrastructure, servers, and networking, keeping development teams focused on writing code and developing new business functionality.
  • Faster time-to-market: Serverless computing allows faster time to market as applications can be deployed faster than other compute services offered in the cloud.
  • Increased scalability: Serverless computing scales automatically, both up and down, based on workload demand, so spikes in load are absorbed without intervention.
  • Cost effective: Serverless computing is cost effective because you only need to pay for the time the application code is running, resulting in cost savings over traditional compute models.
  • Improved reliability: Serverless computing has built-in redundancy and failover mechanisms, which improves reliability of applications.
  • Higher flexibility: Serverless computing supports a range of popular programming languages, giving developers flexibility. It also supports container-based deployment as well as deploying code directly onto the serverless platform.
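To illustrate the pay-per-use point above, the following sketch compares a hypothetical pay-per-execution bill with an always-on virtual machine for a spiky workload. The prices and workload numbers are invented for illustration and are not actual OCI rates.

```python
# Hypothetical per-use vs. always-on cost comparison. The rates below are
# made-up illustration values, not actual OCI pricing.

def serverless_cost(invocations, avg_seconds, gb_memory,
                    price_per_gb_second=0.00001417):
    """Cost when you pay only for execution time (GB-seconds)."""
    return invocations * avg_seconds * gb_memory * price_per_gb_second

def always_on_cost(hours, hourly_rate=0.05):
    """Cost of a VM that runs whether or not requests arrive."""
    return hours * hourly_rate

# A spiky workload: 100,000 short invocations over a month.
pay_per_use = serverless_cost(invocations=100_000, avg_seconds=0.5, gb_memory=0.25)
vm = always_on_cost(hours=730)  # one VM running all month

print(f"serverless: ${pay_per_use:.2f}, always-on VM: ${vm:.2f}")
```

With these illustrative numbers the per-use bill is a small fraction of the always-on cost; the gap narrows as the workload becomes steadier and busier.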

How It Works

Serverless computing, often referred to as function as a service, allows developers to focus on writing and deploying code without worrying about the underlying infrastructure required to run it. The following steps describe how code runs on a serverless implementation such as OCI Functions.

  1. Write your code in your preferred programming language and package the function into a container image.
  2. Provide the function definition in func.yaml, including maximum execution time and memory consumption.
  3. Push the container image to the container registry, from which it is pulled before execution.
  4. Upload function metadata (memory and time restrictions) to Fn Server.
  5. Add the function to the list of functions in the Console.
  6. Trigger the function from a CLI command or an external trigger such as an HTTP request, a schedule, or the Events service.
  7. When the function is triggered, identify the container image and pull it from the container registry.
  8. Execute the logic defined in the container image on an instance in a subnet associated with the application where the function belongs.
  9. After execution or idle period, the container image is removed and resources are released.
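The code packaged in step 1 can be as small as a single handler function. The sketch below follows the general shape of a Python function handler (a context object plus the raw request body in, a JSON response out), but uses only the standard library so it stays self-contained; in a real OCI function you would wire a handler like this to the Fn Python FDK rather than calling it directly.

```python
import io
import json

def handler(ctx, data: io.BytesIO = None):
    """Parse the incoming event payload and return a JSON response body.

    At runtime, `ctx` would carry function metadata (app name, config);
    it is unused in this minimal sketch.
    """
    try:
        payload = json.loads(data.getvalue()) if data else {}
    except json.JSONDecodeError:
        payload = {}
    name = payload.get("name", "world")
    return json.dumps({"message": f"Hello, {name}!"})

# Local smoke test, simulating an invocation payload:
print(handler(None, io.BytesIO(b'{"name": "OCI"}')))
```

Keeping the handler this thin is deliberate: everything outside the business logic (routing, scaling, container lifecycle) is the platform's job, per the steps above.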

Platforms

OCI supports serverless computing through two services: OCI Functions, a function-as-a-service platform, and OCI Container Instances.

  • Functions: Functions is a fully managed, multi-tenant, highly scalable, on-demand, function-as-a-service platform built on enterprise-grade infrastructure and powered by the open source Fn Project engine. With Functions, you can write code in Java, Python, Node.js, Go, Ruby, and C#.
  • Container Instances: Container Instances is a serverless compute service that lets you quickly and easily run containers without managing any servers. It provides the same isolation level as virtual machines.

Best Practices

Serverless is designed to accommodate unpredictable spikes in workload resource demand, but it must be designed and implemented with recommended best practices to achieve efficient utilization.

  • Fast and optimized: Implement functions to execute quickly, utilize resources efficiently, and avoid unnecessary dependencies.
  • Event driven: Serverless is best suited to event-driven architectures, where processing is initiated by an event trigger, which reduces cost and improves scalability.
  • Implement with managed services: Integrating serverless with managed services reduces complexity and offloads scaling. Managed services such as databases, storage, and messaging are the services serverless functions most commonly interact with.
  • Monitoring: Monitor the performance and cost of serverless functions so you can optimize slow or expensive functionality. Leverage the OCI recommended monitoring implementation.
  • Security: Data at rest and in transit must be encrypted, with access controls in place to minimize security risks and attacks.
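The event-driven practice above often takes the form of routing each incoming event to a small, focused handler. The following sketch shows one way to do that; the event shape and handler names are hypothetical, not an actual OCI event schema.

```python
import json

# Hypothetical event-type dispatch for an event-driven function: each
# incoming event declares its type, and the function routes it to a
# small, fast handler instead of one monolithic code path.

def on_object_created(event):
    return f"indexing {event['objectName']}"

def on_object_deleted(event):
    return f"purging {event['objectName']}"

HANDLERS = {
    "object.created": on_object_created,
    "object.deleted": on_object_deleted,
}

def dispatch(raw_event: str) -> str:
    event = json.loads(raw_event)
    handler = HANDLERS.get(event["eventType"])
    if handler is None:
        raise ValueError(f"unhandled event type: {event['eventType']}")
    return handler(event)

print(dispatch('{"eventType": "object.created", "objectName": "report.csv"}'))
```

Because each handler does one small thing, invocations stay fast and cheap, and unhandled event types fail loudly instead of silently consuming billed execution time.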

Use Cases

Serverless has several use cases for individual business and functional needs. Common use cases are:

  • Batch jobs: Processes triggered on a predefined schedule and frequency.
  • Event-driven applications: Processes initiated by an external event such as a message or HTTP request.
  • Microservices: Thin, independent service implementations.
  • Machine learning models: Deploy models behind cost-effective and scalable API endpoints.
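As a concrete example of the batch-job use case, a schedule-triggered function typically pulls a bounded batch of pending work, processes it, and returns a summary. The sketch below is illustrative: `fetch_pending` is a hypothetical stand-in for reading from a real queue or database.

```python
# Hypothetical batch-job body for a schedule-triggered function. Bounding
# the batch size keeps each run well inside the function's execution
# timeout; any remaining items are picked up by the next scheduled run.

def fetch_pending(limit):
    # Stand-in for reading pending records from a queue or database.
    return [{"id": i, "amount": i * 10} for i in range(1, limit + 1)]

def process_batch(limit=100):
    items = fetch_pending(limit)
    total = sum(item["amount"] for item in items)
    return {"processed": len(items), "total_amount": total}

print(process_batch(limit=5))  # {'processed': 5, 'total_amount': 150}
```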

Moving Your Workload to Serverless

Moving your existing workload to serverless can help you reduce cost, increase resiliency, and improve scalability. The following information describes steps for moving your workload from on-premises to serverless.

  1. Identify the workload you want to move onto serverless, and assess its size, complexity, and dependencies.
  2. Evaluate which serverless platform is most suitable: a function or a container instance.
  3. Determine external service dependencies such as database, storage, and messaging for the workload.
  4. Refactor the application code to optimize it for performance, and containerize it if needed.
  5. Test the application for expected functionality in the serverless environment. Use the recommended test environment or similar.
  6. Deploy the application to the serverless platform for execution.
  7. Monitor application performance metrics and resource utilization.
  8. Optimize the application, if needed, to improve performance going forward based on the metrics captured from monitoring.
  9. Automate everything possible using the DevOps pipeline for build and deployment.
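A common part of the refactoring in step 4 is removing local state, since a serverless container can be recycled between invocations. The sketch below shows the target shape: the handler reads and writes an external store on every call. `KeyValueStore` is a hypothetical in-memory stand-in for a real external service such as a database or object storage.

```python
# Sketch of refactoring for serverless: state lives in an external store,
# not on the instance, so any container can serve any invocation.

class KeyValueStore:
    """In-memory stand-in for an external state service (hypothetical)."""
    def __init__(self):
        self._data = {}

    def get(self, key, default=0):
        return self._data.get(key, default)

    def put(self, key, value):
        self._data[key] = value

def count_visit(store: KeyValueStore, user: str) -> int:
    """Stateless handler: each invocation reads and writes the external store."""
    visits = store.get(user) + 1
    store.put(user, visits)
    return visits

store = KeyValueStore()
count_visit(store, "alice")
print(count_visit(store, "alice"))  # 2
```

The same pattern applies to session data, caches, and job progress: anything the next invocation needs must survive outside the function.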

Challenges and Limitations

Serverless has advantages, but consider the following items before deploying a workload.

  • Cold start: Serverless functions start on demand in response to events, so they're not always warm and ready to begin processing. Although startup time is usually negligible, consider it when you have ultra-low-latency requirements.
  • Execution timeout: Serverless is built for quick units of work that release their resources promptly, so functions have an execution timeout to guard against unforeseen scenarios such as deadlocks or infinite loops. A forced termination can lead to data loss or an inconsistent state, so design functions to finish well within the timeout.
  • Limited infrastructure access: Serverless computing has minimal access to the infrastructure on which it runs, so it must rely on external state management or storage services to persist information.
  • Complexity: Traditional architectures with complex workflows and dependencies can become harder to debug and troubleshoot when moved to serverless.

Future of Serverless

The future of serverless involves eliminating the need to manage complex infrastructure and removing manual scaling challenges, while handling unexpected surges in application load and paying only for actual utilization. Several frameworks and solutions are being developed and are evolving to minimize the developer effort required to handle enterprise scenarios out of the box. Review the CNCF landscape for more serverless tools, frameworks, and installable platforms.