OpenShift Serverless: Building Event-Driven Applications

Building event-driven applications with OpenShift Serverless is a game-changer for developers. It allows you to build scalable and highly available applications without worrying about the underlying infrastructure.

OpenShift Serverless uses a function-as-a-service (FaaS) model, which means you can write code without provisioning servers or worrying about scaling. This approach enables you to focus on writing code and delivering value to your customers.

With OpenShift Serverless, you can build applications that respond to events in real-time, making it ideal for use cases like IoT, real-time analytics, and more. This is made possible by the event-driven architecture that OpenShift Serverless provides.

By using OpenShift Serverless, you can take advantage of a pay-per-use pricing model, which means you only pay for the resources you use. This can lead to significant cost savings and a more efficient use of resources.

Knative

Knative is a serverless application layer on top of OpenShift/Kubernetes, providing a best-of-both-worlds experience for developers and sysadmins. It abstracts away infrastructure, allowing developers to focus on code and sysadmins to manage infrastructure.

Knative consists of three building blocks: Build, Eventing, and Serving. These components enable serverless workloads to run on OpenShift.

The Build component uses the developer tools and templates built into OpenShift, making it easy to build pods for serverless applications. Eventing ties into pipelines to use events to trigger actions, and can consume and react to events from sources like Kafka or Jenkins CI/CD builds. Serving enables OpenShift Serverless to scale from 0 to 'N' and back again according to demand, using Istio for intelligent traffic routing and management.

Here are the three building blocks of Knative:

  • Build
  • Eventing
  • Serving
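As a sketch of the Serving block in action, a Knative Service manifest can cap scaling with an annotation while still scaling to zero when idle. The service name and image below are placeholders, and the annotation name may vary slightly between Knative versions:

```yaml
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: greeter                     # hypothetical service name
spec:
  template:
    metadata:
      annotations:
        # Knative Pod Autoscaler: cap each revision at 5 pods; with no
        # traffic the revision scales down to zero automatically.
        # (Newer releases spell this annotation "max-scale".)
        autoscaling.knative.dev/maxScale: "5"
    spec:
      containers:
        - image: quay.io/example/greeter:latest   # placeholder image
```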

Knative also supports various event sources, including Kafka, which is used in the example to exchange messages between apps.

Event-Driven Architecture

Event-Driven Architecture is a design pattern that allows applications to respond to specific events, rather than following a predetermined sequence of actions. This approach enables greater flexibility and scalability.

In an Event-Driven Architecture, events are produced by various sources, such as user interactions or system changes, and are then processed by event handlers. Event handlers can be triggered by multiple events, making the system more robust and fault-tolerant.

Event-Driven Architecture is particularly useful in serverless environments, such as OpenShift Serverless, where the focus is on event-driven computing. This approach aligns perfectly with the serverless model, where resources are dynamically allocated and deallocated based on demand.

OpenShift Serverless provides a managed platform for building and deploying event-driven applications, complete with autoscaling and logging driven by incoming events. This allows developers to focus on writing event handlers rather than worrying about infrastructure and scalability.

Event-Driven Architecture also enables greater decoupling between different components of the system, making it easier to modify or replace individual components without affecting the entire system. This is particularly important in complex systems, where changes can have far-reaching consequences.

Deploying Functions

Deploying functions on OpenShift Serverless is a straightforward process. You can create a serverless function with a single kn command, choosing from multiple runtimes and templates.

To deploy a Quarkus-based function, you'll need to edit the generated pom.xml file, replacing the Java version with 11 and the Quarkus version with the latest. You'll also need to replace System.out with a Logger implementation to print logs.

Once you've made these changes, you can deploy your function using the kn func command, which builds and starts the function locally on Docker and then deploys it to the OpenShift cluster.
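The sequence looks roughly like this; the function name is a placeholder, and flags may differ slightly between kn func versions:

```shell
# Scaffold a new Quarkus-based function project (name is hypothetical)
kn func create -l quarkus event-handler
cd event-handler

# Build and run the function locally on Docker to verify it works
kn func run

# Build the image and deploy the function to the target OpenShift cluster
kn func deploy
```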

Prerequisites

To deploy functions, you need to install and configure two operators on OpenShift: AMQ Streams (based on the Strimzi Kafka operator) and OpenShift Serverless (based on Knative).

You'll also need to create four components using the OpenShift Console: Kafka, KnativeServing, KnativeEventing, and KnativeKafka. Leave the default settings for all components.

Create the Kafka instance in the kafka namespace; if you need to, use a YAML manifest to create a 3-node Kafka cluster.
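As a sketch, a minimal 3-node cluster manifest for the AMQ Streams (Strimzi) operator looks something like this; the cluster name is a placeholder and the exact schema should be checked against the operator version you install:

```yaml
apiVersion: kafka.strimzi.io/v1beta2
kind: Kafka
metadata:
  name: my-cluster          # hypothetical cluster name
  namespace: kafka
spec:
  kafka:
    replicas: 3             # 3-node Kafka cluster
    listeners:
      - name: plain
        port: 9092
        type: internal
        tls: false
    storage:
      type: ephemeral       # fine for a demo; use persistent storage otherwise
  zookeeper:
    replicas: 3
    storage:
      type: ephemeral
  entityOperator:
    topicOperator: {}
    userOperator: {}
```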

Create the KnativeServing component in the knative-serving namespace, and the KnativeEventing component in the knative-eventing namespace.

To enable the sink and source for KnativeKafka, you'll need to install KafkaSink and KafkaSource CRDs and controllers.
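In the KnativeKafka resource this amounts to flipping the sink and source switches on, roughly as follows (the resource is expected to live in the knative-eventing namespace):

```yaml
apiVersion: operator.serverless.openshift.io/v1alpha1
kind: KnativeKafka
metadata:
  name: knative-kafka
  namespace: knative-eventing
spec:
  sink:
    enabled: true     # installs the KafkaSink CRD and controller
  source:
    enabled: true     # installs the KafkaSource CRD and controller
```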

You can run Docker or Podman on your local machine to run builds with CNCF Buildpacks, but I found that Docker works better.

Alternatively, use CodeReady Containers (crc) to run OpenShift Serverless locally, or try the developer sandbox available online.

Finally, install the oc client and kn CLI locally. Make sure to install the Knative CLI version provided by Red Hat, following the detailed installation instructions available.

Other Deployment Models

Deploying functions can be done in various ways, but it's essential to understand the differences between these models to make an informed decision.

Running an application on a physical server requires continuous resource consumption, which can be a significant drawback.

Virtual machines (VMs) offer better resource utilization, but they still need to be sized to fit the workload requirements, which can be time-consuming.

Containers allow for more efficient resource utilization than VMs, thanks to their smaller resource footprint and the ease of scaling up or down with container orchestration.

Here's a comparison of the different deployment models:

| Deployment model | Resource consumption | Scaling |
| --- | --- | --- |
| Physical server | Continuous, even when the application is idle | Manual and slow |
| Virtual machine | Better utilization, but must be sized for the workload | Manual or scripted |
| Container | Small footprint, efficient | Up and down via orchestration |
| Serverless | Released once the function completes | Automatic, down to zero and back |

Serverless deployment offers all the benefits of containers, with the added advantage of releasing resources once the application's function has completed, allowing it to scale up and down quickly as needed.

Deploying Functions

You can deploy serverless functions on OpenShift using the kn func plugin. This plugin allows you to work directly with the source code and uses Cloud Native Buildpacks API to create container images.

The kn func plugin supports several runtimes, including Node.js, Python, and Golang. However, for this example, we'll be using Java runtimes based on the Quarkus or Spring Boot frameworks.

To deploy a Quarkus function, you'll need to create a new project using the kn command. This generates sample application source code with a pom.xml file, which you can then edit to set the latest versions of Java and Quarkus.

The kn func plugin generates a simple function that takes a CloudEvent as input and sends the same event as output. You can modify this code to print logs using a Logger implementation.

Once you've modified the code, run the kn func command in the application root directory. It builds and starts the function in the local Docker environment; if the run finishes successfully, you can proceed to the deployment phase.

To deploy a Spring Boot function, you'll need to create a new project using the kn command with the springboot runtime. You'll then need to edit the Maven pom.xml file to set the latest version of Spring Boot.

The Spring Boot function code uses the Spring functional programming style, where a Function bean represents an HTTP POST endpoint with an input and an output response. You'll need to add a property to the application.properties file to configure the function.
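The functional core of such a bean can be sketched with a plain java.util.function.Function; in a real Spring Cloud Function app it would be declared as a @Bean inside a @SpringBootApplication class so that Spring exposes it over HTTP. The payload type and logic here are placeholders:

```java
import java.util.function.Function;

public class EventFunction {

    // In a Spring Boot app this factory method would be annotated with @Bean;
    // Spring Cloud Function then maps the bean to an HTTP POST endpoint whose
    // request body is the function input and whose response body is the output.
    public static Function<String, String> uppercase() {
        return payload -> payload.toUpperCase();
    }

    public static void main(String[] args) {
        System.out.println(uppercase().apply("hello"));   // prints "HELLO"
    }
}
```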

To deploy both Quarkus and Spring Boot functions on OpenShift, you can use the kn func plugin with the quarkus and springboot runtimes. You'll need to create a custom Maven profile, openshift, that includes the quarkus-openshift dependency and enables deployment.
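For the Quarkus app, the profile can be sketched roughly as follows; the dependency version is assumed to be managed by the Quarkus BOM imported elsewhere in the pom.xml:

```xml
<profile>
  <id>openshift</id>
  <dependencies>
    <!-- Generates OpenShift/Knative manifests and drives the deployment -->
    <dependency>
      <groupId>io.quarkus</groupId>
      <artifactId>quarkus-openshift</artifactId>
    </dependency>
  </dependencies>
  <properties>
    <!-- Target Knative instead of a plain Deployment, and deploy during the build -->
    <quarkus.kubernetes.deployment-target>knative</quarkus.kubernetes.deployment-target>
    <quarkus.kubernetes.deploy>true</quarkus.kubernetes.deploy>
  </properties>
</profile>
```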

Once you've included the quarkus-openshift module, you can use Quarkus configuration properties to customize the deployment process. You can override default autoscaling settings with the quarkus.knative.revision-auto-scaling.* properties.
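For example, a few autoscaling overrides in application.properties might look like this; the exact property names and values are illustrative and should be checked against the Quarkus Knative configuration reference:

```properties
# Keep each revision between 1 and 5 pods
quarkus.knative.min-scale=1
quarkus.knative.max-scale=5
# Target number of concurrent requests per pod for the Knative Pod Autoscaler
quarkus.knative.revision-auto-scaling.target=50
```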

To deploy the apps, you can run the following command in the repository root directory:

```bash
mvn clean package -Popenshift
```

This deploys all the apps to the target OpenShift cluster. You can then verify that all the triggers have been configured properly and open the “Topology” view in the OpenShift console to visualize your serverless architecture.

Here are the steps to deploy a Quarkus function on OpenShift:

1. Create a new project using the kn command

2. Edit the pom.xml file to replace the version of Java and Quarkus with the latest

3. Modify the code to print logs using a Logger implementation

4. Run the kn func command in the application root directory

5. Proceed to the deployment phase

Here are the steps to deploy a Spring Boot function on OpenShift:

1. Create a new project using the kn command with the springboot runtime

2. Edit the Maven pom.xml file to set the latest version of Spring Boot

3. Modify the code to configure the function

4. Add a property to the application.properties file

5. Run the kn func command in the application root directory

6. Proceed to the deployment phase

Integration with AWS

You can integrate AWS Lambda into an OpenShift cluster for a hybrid cloud setup. This allows you to use AWS Lambda to handle certain tasks while still leveraging the benefits of OpenShift.

AWS Lambda and OpenShift Serverless can be used together to create a powerful hybrid cloud setup. You can host your OpenShift cluster on AWS and use AWS Lambda to handle tasks such as scaling your cluster.

To integrate AWS Lambda into an OpenShift cluster, you'll need to create an AWS Lambda function, an OpenShift Route, and then trigger the Lambda function by sending a request to the Route's URL. This can be done manually or automated using OpenShift's built-in cron jobs or event triggers.

Here's a step-by-step guide to integrating AWS Lambda into an OpenShift cluster:

  1. Create an AWS Lambda function containing the code you want to run in response to certain events.
  2. Create an OpenShift Route that points to your Lambda function, providing a URL that can be used to trigger your Lambda function.
  3. Trigger your Lambda function by sending a request to the Route's URL, which can be done manually or automated using OpenShift's built-in features.
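Step 3 can be as simple as a curl call; the Route host below is a placeholder, and in a real setup the Route would front whatever in-cluster component forwards the request on to Lambda:

```shell
# Trigger the Lambda function via the Route URL (hypothetical host and payload)
curl -X POST https://lambda-trigger-myproject.apps.example.com/ \
  -H "Content-Type: application/json" \
  -d '{"event": "nightly-backup"}'
```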

You can also use AWS Lambda to help manage an OpenShift cluster hosted on AWS. This can be done by creating Lambda functions to automate routine tasks such as scaling your cluster, backing up data, or monitoring performance.

To scale your OpenShift cluster using AWS Lambda, you'll need to create a Lambda function containing the code to scale your cluster, create a CloudWatch alarm that triggers your Lambda function when certain conditions are met, and then scale your cluster when the alarm is triggered.

Benefits and Conclusion

OpenShift Serverless offers several benefits, including cost savings and more efficient utilization of resources. With OpenShift Serverless, you only pay for execution time, which means you're not charged for idle time.

Compared with traditional server or VM-based deployments, serverless charges only for execution time, hands infrastructure management and high availability over to the platform, and scales automatically with demand.

By choosing OpenShift Serverless, you can take advantage of these benefits and focus on writing code, not managing infrastructure.

Benefits of Serverless Computing

Serverless computing offers numerous benefits, making it an attractive option for developers and businesses alike.

One of the biggest advantages of serverless computing is cost savings. With serverless, you only pay for the execution time of your code, which means you're not paying for idle time. This can lead to significant cost reductions, especially for applications with variable traffic.

Serverless computing also eliminates the need for infrastructure management, which can be a significant burden. You don't need to worry about patching or security updates, or monitoring your servers. This frees up time and resources for more important tasks.

Serverless computing is also highly scalable, making it easier to handle sudden spikes in traffic. The underlying platforms generally have plentiful resources, so scaling is easier and less of a concern. This lets you focus on building and deploying your application rather than worrying about its performance.

The responsibility for high availability is also on the platform, so you don't need to design your application with high availability in mind. This can be a significant advantage, as it allows you to focus on building a great application, rather than worrying about its reliability.

Here's a comparison of serverless computing to traditional servers and VMs:

| Aspect | Traditional server / VM | Serverless |
| --- | --- | --- |
| Cost | Pay for provisioned capacity, including idle time | Pay only for execution time |
| Infrastructure management | Patching, updates, and monitoring are on you | Handled by the platform |
| Scaling | Capacity must be planned for traffic spikes | Scales automatically with demand |
| High availability | Must be designed into the application | Responsibility of the platform |

Conclusion

Both AWS Lambda and OpenShift Serverless are powerful platforms for deploying and managing serverless applications.

OpenShift Serverless provides more flexibility and a seamless experience for OpenShift users, making it a great choice for those already invested in the platform.

With OpenShift Serverless, you can easily build an event-driven architecture around simple HTTP-based apps in a way that is completely transparent to the app.

OpenShift Serverless brings several features to simplify development, including the ability to leverage Quarkus Kubernetes Extension to easily build and deploy apps on OpenShift as Knative services.

While AWS Lambda offers tight integration with other AWS services, OpenShift Serverless is the better choice for those who want a more flexible and user-friendly experience.

Glen Hackett

Writer
