
20 posts tagged with "database"


· 2 min read
Enes Akar

Upstash supports a REST API in addition to the native Redis API. The REST API lets developers access their Redis database from serverless and edge functions without connection issues. But if you execute multiple Redis commands in the same function, you end up calling the database multiple times. One of our community members (@MasterGates) came up with a great suggestion in our Discord channel: a Pipeline API.

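Here is a minimal sketch of how a pipeline call might look; the endpoint URL and token are placeholders, not real credentials:

```ts
// Placeholders for illustration only: substitute your database's
// REST endpoint and token from the Upstash console.
const UPSTASH_URL = "https://my-db.upstash.io";
const UPSTASH_TOKEN = "YOUR_REST_TOKEN";

// Send several Redis commands in a single HTTP round trip.
async function pipelineExample() {
  const response = await fetch(`${UPSTASH_URL}/pipeline`, {
    method: "POST",
    headers: { Authorization: `Bearer ${UPSTASH_TOKEN}` },
    // Each inner array is one Redis command: [command, ...args]
    body: JSON.stringify([
      ["SET", "counter", "0"],
      ["INCR", "counter"],
      ["GET", "counter"],
    ]),
  });
  // The response contains one result per command, in order.
  const results = await response.json();
  console.log(results);
}
```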

· 4 min read
Enes Akar

In this article, we will build a serverless, Next.js-based TODO application. We will try our best to make it minimalist. It will not have any database connection. It will not have any extra dependency other than Next.js. It will not have any buttons. Besides, minimalism is cool and clean; I love it because I am a lazy developer :)

Why do we avoid database connections?

Next.js is a modern framework that enables front-end developers to build full stack applications. Serverless functions play an important role in simplifying backend development for Next.js developers. As you probably know, serverless functions do not like database connections due to their stateless nature. See here and here for examples of the problems database connections cause inside serverless functions.

· 4 min read
Lisa Natsumi

It is a common need to restrict access to a website to specific IPs. In this post, I will show how to implement an IP allow/deny list using edge computing. Let me first introduce Cloudflare Workers.

Cloudflare Workers

Cloudflare Workers has become quite a popular technology in recent years. It became publicly available in 2017, and Cloudflare KV storage followed in 2019.
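To make the idea concrete, here is a minimal sketch of an allow-list Worker. The IP addresses are hardcoded examples for brevity; in practice the list could live in Cloudflare KV:

```ts
// A minimal sketch: allow only the listed IPs, reject everyone else.
// The list is hardcoded for brevity; it could also be stored in Cloudflare KV.
const ALLOWED_IPS = ["203.0.113.10", "198.51.100.7"]; // example addresses

addEventListener("fetch", (event) => {
  event.respondWith(handleRequest(event.request));
});

async function handleRequest(request: Request): Promise<Response> {
  // Cloudflare sets the client's IP in the CF-Connecting-IP header.
  const clientIp = request.headers.get("CF-Connecting-IP");
  if (!clientIp || !ALLOWED_IPS.includes(clientIp)) {
    return new Response("Access denied", { status: 403 });
  }
  // IP is allowed: pass the request through to the origin.
  return fetch(request);
}
```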

· 7 min read
Enes Akar

Computing at the edge is one of the most exciting capabilities of recent years. CDNs let you keep your files closer to your users. Edge computing lets you run your applications closer to your users. This helps developers build globally distributed, performant applications.

Cloudflare Workers is the leading product in this space right now. It gives you a serverless processing environment without cold starts. You leverage Cloudflare's global network to minimize the latency of your applications. You can write your functions in JavaScript, Rust, C, and C++.

Similar to serverless functions (AWS Lambda etc.), Cloudflare Workers are stateless. As you can see in Cloudflare's survey, developers are asking for ways to connect to their databases from edge functions. Unfortunately, most databases are not designed for serverless environments; they require persistent connections. We developed the REST API over Redis to enable serverless edge functions to access Upstash in the simplest and fastest way possible.
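As a sketch of what this looks like from a Worker, a single Redis command is just an HTTP call; the endpoint and token below are placeholders:

```ts
// Placeholders: substitute your database's REST endpoint and token.
const UPSTASH_URL = "https://my-db.upstash.io";
const UPSTASH_TOKEN = "YOUR_REST_TOKEN";

addEventListener("fetch", (event) => {
  event.respondWith(handleRequest());
});

async function handleRequest(): Promise<Response> {
  // A single Redis command maps to a URL path: /incr/counter
  const response = await fetch(`${UPSTASH_URL}/incr/counter`, {
    headers: { Authorization: `Bearer ${UPSTASH_TOKEN}` },
  });
  const { result } = await response.json();
  return new Response(`View count: ${result}`);
}
```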

· 4 min read
Noah Fischer

Next.js is a very successful web framework which brings together server side rendering and static site generation. SSG speeds up your website thanks to CDN caching, while SSR helps you with SEO and dynamic data.

Server side rendering is a great feature that helps you write full stack applications. But if you are not careful, the performance of your Next.js website can suffer easily. In this blog post, I will explain how to leverage Redis to speed up your Next.js API calls. Before that, I will briefly mention a simpler way to improve your performance.
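As an illustration of the pattern, here is a sketch of a Next.js API route that caches an expensive result in Redis using ioredis; the cache key, TTL, and helper function are arbitrary choices for the example:

```ts
// pages/api/data.ts — a sketch of the cache-aside pattern with ioredis.
import Redis from "ioredis";
import type { NextApiRequest, NextApiResponse } from "next";

const redis = new Redis(process.env.REDIS_URL!); // assumes REDIS_URL is set

export default async function handler(req: NextApiRequest, res: NextApiResponse) {
  // Serve from cache when possible.
  const cached = await redis.get("api:data"); // arbitrary cache key
  if (cached) {
    return res.status(200).json(JSON.parse(cached));
  }
  // Cache miss: do the slow work, then cache the result for 60 seconds.
  const data = await fetchExpensiveData();
  await redis.set("api:data", JSON.stringify(data), "EX", 60);
  res.status(200).json(data);
}

async function fetchExpensiveData() {
  // Hypothetical stand-in for a slow upstream API or database query.
  return { items: [], generatedAt: Date.now() };
}
```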

· 4 min read
Noah Fischer

One of the best things about serverless is its ability to scale even under huge traffic spikes. But unfortunately, scaling is not free, either financially or technically. That's why developers need to control their applications' scalability. Here are the main reasons you will need a rate limiting mechanism in your serverless application:

1- Protect your resources: If you're providing a public API, traffic spikes can degrade the quality of the service and may lead to a service outage for all your users. You need to protect your system against such cascading failures as well as self-inflicted DDoS incidents. A bug in your application can trigger such problems in your system. An internal process which retries an endpoint indefinitely in case of a failure can easily exhaust your resources.

2- Manage user quotas: You may want to define quotas for your users for fair use of your services. You may also need quotas if you provide your services in different pricing tiers.

3- Control the cost: There are many real life examples of how an uncontrolled system can cause large bills. This is a real risk for serverless applications due to their highly scalable nature. Rate limiting will help you control these costs.
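A minimal fixed-window rate limiter can be built with nothing more than Redis's INCR and EXPIRE. This sketch uses ioredis, with arbitrary limits chosen for illustration:

```ts
import Redis from "ioredis";

const redis = new Redis(process.env.REDIS_URL!); // assumes REDIS_URL is set

// Fixed-window limiter: allow `limit` requests per `windowSeconds` per key.
async function isAllowed(userId: string, limit = 10, windowSeconds = 60) {
  const key = `ratelimit:${userId}`;
  const count = await redis.incr(key);
  if (count === 1) {
    // First request in this window: start the countdown.
    await redis.expire(key, windowSeconds);
  }
  return count <= limit;
}

// Usage sketch: reject the request when the quota is exhausted.
// if (!(await isAllowed(clientIp))) return new Response("Too many requests", { status: 429 });
```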

· 2 min read
Noah Fischer

We have been developing example applications to showcase how easy and practical it is to develop serverless applications with Redis. So far, the most popular of those examples is the Roadmap Voting Application. As we started to use it in real life, two main problems emerged:

  • We started to see spam entries. The application does not have an admin dashboard, so one had to connect to Redis directly to delete an entry.
  • We released some features on the list, but there was no way to flag them as released and remove them from the voting list.

· 10 min read
Noah Fischer

In this article, I will compare the latencies of three serverless databases, DynamoDB, FaunaDB, and Upstash (Redis), for a common web use case.

I created a sample news website, and I record the database-related latency with each request to the website. Check the website and the source code.

I inserted 7001 NY Times articles into each database. The articles were collected from the New York Times Archive API (all articles from January 2021). I randomly scored each article. At each page request, I query the top 10 articles under the World section from each database.
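For the Redis case, one natural way to model this query is a sorted set per section, with the article's random score as the member score. This is an assumption on my part, not necessarily the benchmark's exact schema:

```ts
import Redis from "ioredis";

const redis = new Redis(process.env.REDIS_URL!);

// Store each article's score in a per-section sorted set.
async function indexArticle(section: string, articleId: string, score: number) {
  await redis.zadd(`articles:${section}`, score, articleId);
}

// Fetch the 10 highest-scored articles in a section.
async function topArticles(section: string) {
  return redis.zrevrange(`articles:${section}`, 0, 9);
}
```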

· 8 min read
Noah Fischer

When designing a database for serverless, the biggest challenge on our minds was building an infrastructure that supports per-request pricing in a profitable way. We believe Upstash has achieved this. After we launched the product, we saw that there was another major challenge: database connections!

As you know, serverless functions scale from 0 to infinity. This means that when your functions get a lot of traffic, the cloud provider creates new containers (lambda functions) in parallel to scale out your backend. If you create a new database connection within each function, you can rapidly reach the connection limit of your database.

If you try to cache the connection outside the lambda function, another problem occurs. When AWS freezes your Lambda function, it does not close the connection. So you may end up with many idle/zombie connections which can still threaten your database's stability.
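The caching pattern in question looks roughly like this sketch: the client is created at module scope so warm invocations reuse it, but the connection stays open (idle) whenever the container is frozen:

```ts
import Redis from "ioredis";

// Created once per container, outside the handler, and reused across
// warm invocations. When AWS freezes the container, this connection
// is not closed and lingers on the server side as an idle/zombie one.
const redis = new Redis(process.env.REDIS_URL!); // assumes REDIS_URL is set

export const handler = async () => {
  const count = await redis.incr("invocations");
  return { statusCode: 200, body: `Invocation #${count}` };
};
```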