Serverless technology is changing the development workflow and the way we look at building applications. There are several reasons for this.

Serverless shifts billing to a pay-as-you-go model: you pay only for the CPU time you actually consume (rounded to 100 ms increments). You do not wait for a server to start, do not distribute the load, and do not bother with maintenance. The task is written – the task is done. On the other hand, there are cold start problems, and many projects are a poor fit because you get little direct control over the container. In this article I will explain exactly when Serverless can come in handy and when you should look elsewhere.
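
To make the pay-as-you-go idea concrete, here is a tiny back-of-the-envelope estimate in Python. The rates are purely illustrative (check your provider's actual price list); the point is that the bill is a function of invocations, duration, and memory rather than of servers sitting idle.

invocations_per_month = 1_000_000
avg_duration_s = 0.2              # 200 ms per call
memory_gb = 0.125                 # 128 MB allocated to the function

price_per_million_requests = 0.20       # USD, illustrative rate
price_per_gb_second = 0.0000166667      # USD, illustrative rate

compute_cost = invocations_per_month * avg_duration_s * memory_gb * price_per_gb_second
request_cost = (invocations_per_month / 1_000_000) * price_per_million_requests
print(f"~${compute_cost + request_cost:.2f} per month")   # well under a dollar at these rates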

From microservices to serverless

Serverless builds on the same principles as microservices. The first similarity that catches the eye is event-driven execution. Both architectures use standalone modules that run in logical containers to perform specific tasks and stay hidden from the end user.

The difference is that microservices work on a request-response model. In Serverless, functions are unidirectional (request only or response only) and are queued. Instead of a single proxy function, Serverless uses a set of unidirectional elements. If there is a bug, the application does not crash: only one function fails instead of the whole set, which makes the error easier to find and fix.
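
To make the difference concrete, here is a minimal sketch of such a unidirectional function, assuming AWS Lambda with an SQS trigger (the queue URL and message fields are hypothetical). The function consumes an event from a queue, does its piece of work, and hands the result to the next queue instead of replying to a caller.

import json
import boto3

sqs = boto3.client("sqs")
# Hypothetical queue for the next step in the pipeline.
NEXT_QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/orders-processed"

def handler(event, context):
    # SQS delivers a batch of records; each invocation handles only its own slice of work.
    for record in event["Records"]:
        order = json.loads(record["body"])
        result = {"order_id": order["id"], "status": "processed"}
        # Pass the result downstream rather than returning it to a caller.
        sqs.send_message(QueueUrl=NEXT_QUEUE_URL, MessageBody=json.dumps(result))
    # If one message raises an error, only that message fails and is retried;
    # the rest of the application keeps running.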

Microservices also have to be scaled manually when load spikes. Even with autoscaling configured, you have to work with each service separately, setting the right parameters. Serverless has no such problem – that headache goes to the provider.

A set of microservices is not yet serverless. Serverless has advantages of its own, and if you keep thinking purely in microservice terms, it is easy to miss them. Many have fallen into this trap.

Lego for adults: how to build any project with Serverless

For simple bots or microservices, you do not have to build all the planned functionality into the application at once. You can start small and add pieces as you expand. Scaling is automatic and almost unlimited.

Serverless also lets you integrate quickly with an existing application: almost every platform provides APIs for this (REST endpoints or message queues) with which you can develop add-ons independently of the application core.
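
As a rough sketch of what such an add-on can look like, here is a hypothetical AWS Lambda function behind API Gateway (proxy integration) that extends an existing application by calling its public REST API; the URL and field names are made up for illustration.

import json
import urllib.request

# Hypothetical URL of the existing application's REST API.
CORE_API = "https://app.example.com/api/v1"

def handler(event, context):
    # With the API Gateway proxy integration, the HTTP request arrives in `event`.
    user_id = event["pathParameters"]["user_id"]

    # The add-on talks to the application core only through its API,
    # so it can be developed and deployed independently of the core.
    with urllib.request.urlopen(f"{CORE_API}/users/{user_id}") as resp:
        user = json.loads(resp.read())

    return {
        "statusCode": 200,
        "body": json.dumps({"greeting": f"Hello, {user['name']}!"}),
    }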

Serverless, microservices, clouds: what a raven and a writing desk have in common

Serverless technologies inherit the qualities of both cloud services and microservices in general.

Odes to the cloud always go something like this: you do not need to buy hardware, find a suitable server room, or hire a system administrator – and you are lucky if it is only one. And you pay only for what you use.

Microservices are loved for simple internal code for small functions (FaaS is our everything), for fast turnaround thanks to the ability to change and add code in parts without worrying about bringing the whole project down, and for practically limitless horizontal growth.

These are the basics. What are the benefits of Serverless in general?

Learning the basics of serverless technologies is easier and faster than learning full-blown DevOps. The lower the entry threshold, the easier it is to find a suitable specialist and the sooner you can get on with the project itself.

You do not have to calculate capacity requirements yourself: serverless solutions scale automatically with incoming traffic.

There is no need to configure and maintain Kubernetes or monitor the state of containers. That said, you should be careful with the configuration, otherwise you risk ending up on the verge of bankruptcy, as happened to the developers of the Announce service.
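
One simple safety net on AWS is to cap how many copies of a function may run in parallel, which also caps the worst-case bill. A sketch with boto3 (the function name is hypothetical):

import boto3

lambda_client = boto3.client("lambda")

# With unlimited concurrency, a traffic spike or a retry storm scales your bill
# together with the load; a hard cap bounds the damage.
lambda_client.put_function_concurrency(
    FunctionName="my-announce-like-service",   # hypothetical function name
    ReservedConcurrentExecutions=20,
)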

The way serverless technologies interact with containers is similar in spirit to Docker – both are well suited to microservices. Serverless saves time and nerves for those who do not want to worry about infrastructure, while Docker gives you independence from the service provider and full control over the project at every stage. What to choose? It depends on your priorities – the ShoutOUT platform, for example, moved from Docker to serverless to cut costs and solve scaling issues, among other things.

Where Serverless Doesn’t Fit

Cold start is the most common criticism. To save resources, the provider shuts down functions that have not been called for a while, so when such a function is invoked again it has to be spun up from scratch – and that adds a delay of anywhere from a few milliseconds to whole seconds.

The problem can be mitigated in several ways: you can invoke the functions on a schedule so they stay warm, keep some of the containers running at all times (which makes the delay almost imperceptible), or keep deployment packages as small as possible – the smaller they are, the faster they load. Some providers offer hybrid infrastructure, others provide monitoring tools that help optimize the cold start behaviour of functions.
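
The "invoke it on a schedule" trick is easy to sketch: a cron rule pings the function every few minutes with a marker payload, and the handler recognizes it and returns immediately, so a warm container stays around. The payload field below is a hypothetical convention, not a provider feature.

import json

def handler(event, context):
    # A scheduled rule can invoke the function with {"warmup": true} every few minutes.
    if isinstance(event, dict) and event.get("warmup"):
        return {"statusCode": 200, "body": "warm"}

    # ...normal request handling goes here...
    return {"statusCode": 200, "body": json.dumps({"ok": True})}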

Distributed tracing is also used as part of the solution – AWS X-Ray, for example. X-Ray helps you analyze an application during development and after deployment, making it easier to measure performance and find the issues that prevent it from improving.
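
In Python, instrumenting a Lambda function with the aws-xray-sdk takes only a few lines; the sketch below assumes active tracing is enabled on the function, and the two helper functions are hypothetical stand-ins for real work.

# pip install aws-xray-sdk
from aws_xray_sdk.core import xray_recorder, patch_all

# Patch supported libraries (boto3, requests, etc.) so their calls
# show up as subsegments in the trace.
patch_all()

def load_data(event):
    return event.get("payload", {})   # hypothetical helper

def process(data):
    return {"statusCode": 200, "body": str(data)}   # hypothetical helper

def handler(event, context):
    # Custom subsegments show how long individual steps take.
    with xray_recorder.in_subsegment("load-data"):
        data = load_data(event)
    with xray_recorder.in_subsegment("process"):
        return process(data)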

When working with Serverless, it is important to decide on a programming language right away, because the set of supported languages differs from provider to provider. For example, AWS supports Java, Go, PowerShell, Node.js, C#, Python, and Ruby, while Yandex.Cloud offers Python, PHP, Go, Node.js, and Bash.

That said, AWS Lambda and Azure Functions also let you run code in languages they do not support out of the box. A paradox! Serverless is promoted as an architecture of modules that may run only occasionally, and such modules are not always written in mainstream programming languages.

The problem of limitations

Vendor lock-in. An application is tied to its cloud provider, and switching providers once the application is written is difficult and time consuming. At the operational level providers differ, and there is little to no standardization, though in practice everyone targets the big providers – especially those that offer interoperability. Yandex.Cloud, for example, plays well with AWS and supports the Amazon S3, SQS, and DynamoDB HTTP APIs as integration points. Besides the code, an application is tied to storage, a specific queuing service, and so on, so in many cases you have to move far more than just the functions. Sometimes it is easier to write a new product from scratch.
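
The S3 compatibility is genuinely convenient: the same boto3 code can talk to Yandex Object Storage just by pointing it at a different endpoint. A sketch (bucket and object names are hypothetical, and credentials come from a static access key issued by the cloud rather than from AWS):

import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://storage.yandexcloud.net",   # Yandex Object Storage, S3-compatible API
)

# Hypothetical bucket and object.
s3.put_object(Bucket="my-bucket", Key="report.json", Body=b'{"ok": true}')
print(s3.get_object(Bucket="my-bucket", Key="report.json")["Body"].read())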

The loss of application integrity caused by fragmentation into autonomous services is also often cited as a trait of serverless platforms. On the one hand, Serverless keeps the advantages of microservices – the structure lets you ship updates and additions quickly, and an error in one fragment will not stop the whole project from working. On the other hand, it means that porting a ready-made monolith to Serverless is simply too expensive and labor-intensive. Today, serverless structures are more often served up as an addition to a monolith than as a single, off-the-shelf product.

In addition, as already mentioned, there are constraints that come with limited access. Serverless gives you less control over compute resources: you cannot SSH into the instances and tune the settings by hand.

Differences between providers

Since Serverless appeared on the market relatively recently and depends heavily on the provider, each service has its own characteristics, which have already been described on Habr.

Amazon Web Services

Amazon Web Services is considered the largest and most thoroughly thought-out offering, and the AWS Free Tier lets you get started completely free of charge.

Microsoft Azure

Microsoft's cloud services include more than two hundred different solutions. They come in handy in almost any situation: for example, as a serverless backend for a Slack application.
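
A minimal sketch of such a backend in the Azure Functions Python model (an HTTP-triggered function; the binding is declared separately in function.json, and the Slack payload handling here is simplified for illustration):

import json
from urllib.parse import parse_qs

import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    # Slack slash commands arrive as form-encoded POST requests.
    payload = parse_qs(req.get_body().decode("utf-8"))
    command_text = payload.get("text", [""])[0]

    # Reply in the format Slack expects for a slash-command response.
    body = json.dumps({
        "response_type": "in_channel",
        "text": f"You said: {command_text or 'nothing'}",
    })
    return func.HttpResponse(body, mimetype="application/json")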

Google Cloud

The Google Cloud infrastructure offered to customers is the same infrastructure Google uses for its own products, such as Google Search and YouTube. That inspires confidence, but the platform has abandoned backward compatibility – which leads to some curious consequences. There is also a detailed breakdown of the Google Cloud Platform compute stack worth reading.

Firebase

In 2018, Firebase services were considered practically the standard in mobile app development. Today they are mostly remembered because they are part of the Google Cloud Platform package. Interesting links:

Firebase is the subject of research again 

Introduction to Firebase: writing a simple social app in Swift (Habr)

Yandex Cloud Functions

A classic representative of serverless technology. It integrates easily with other Yandex.Cloud services, so if you want to create a skill for Alice, this is your choice.
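
A skill for Alice boils down to a single webhook handler. A minimal sketch of what Yandex Cloud Functions expects (the reply format follows the Alice webhook protocol; the dialogue logic here is a toy example):

def handler(event, context):
    # Alice passes the user's utterance in the request.
    utterance = event.get("request", {}).get("original_utterance", "")

    reply = f"You said: {utterance}" if utterance else "Hello! Say something and I will repeat it."

    # Minimal response in the Alice webhook format.
    return {
        "version": event.get("version", "1.0"),
        "response": {
            "text": reply,
            "end_session": False,
        },
    }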

Where is Serverless going

Modern development demands the rapid introduction of new features, and Serverless lets you build them quickly and test them immediately. That makes it possible to attach new functionality to existing monoliths fast.

Serverless handles peak loads well thanks to automatic scaling and the isolation of external interactions – if necessary, they can simply be cut off from the main service. That is how an application for taking traffic-rules tests successfully handled more than a hundred thousand participants.

High load comes not only from an influx of users but also from connections to a large number of devices – a developer from Sydney used serverless solutions to build an application that needed integration with several third-party vendors as well as the ability to connect to IoT devices for processing data in the field.

Monolithic designs do not suit the IoT particularly well, so Serverless is a great choice here. Going forward, serverless technologies will be used more and more for the Internet of Things, artificial intelligence, and mobile applications – wherever a monolithic structure is more likely to be a liability.

Personally, I do not expect the process to be quick, and a mass abandonment of the old architecture is not on the horizon either. To work with Serverless, you need to do more than learn a couple of new things – you need to change your mindset for a new type of development.