All Hail the Cache


Imagine working in a big file room that holds millions of records.

And one day your boss comes and asks for X data for each record in those files; sadly, you will need to go through all of them and then report back.

Well, let's say it took you Y minutes to get the needed information, plus all the hard work of collecting and processing it.

Now, let's say that not only your boss but the whole department is asking for the same information!

At this point you are doing the same work over and over!

After a couple of days you get an idea: keep a copy of the collected data and set it aside. Now, whenever anyone comes and asks for it, you only need to hand over the copy (no need to search again!).

But after a while you notice that your copy is out of sync with the current data.

Big problem: the copy is not right anymore!

Wait, you can just watch for operations that happen on the data and then… create a new copy to hand out.

WOW! Now you only do the search and collection when an operation touches the files, and you just hand over the copy each time someone asks for it.

This is a very common issue in software: as your database grows, retrieving and processing data in your code slows down.

That means more time per request, which leads to slower response times, which in turn leads to unhappy users.

So by monitoring our data and identifying the pieces that take too long to fetch, or that do not change frequently, we can move them into a cache for faster responses.

ASP.NET Core makes it really easy to get started, as IMemoryCache is baked into the dependency injection pipeline with some really easy and cool methods. Let's dig in!

I have created a simple ASP.NET Core (3.1) Web API project; my values controller looks like the following.

[ApiController]
[Route("[controller]")]
public class ValuesController : ControllerBase
{

    public IMemoryCache MemoryCache { get; set; }
    
    public ValuesController(IMemoryCache memoryCache)
    {
        MemoryCache = memoryCache;
    }

    [HttpGet]
    public async Task<IActionResult> Get()
    {
        await Task.Delay(1000);
        return Ok("Some important data");
    }
    
}

Simple enough to show the point!

As you can see, I have set IMemoryCache as a parameter in my constructor, which ends up being injected via the built-in DI in ASP.NET Core.

But do keep in mind that you need to register it in Startup.cs:

public void ConfigureServices(IServiceCollection services)
{
    services.AddMemoryCache();
    services.AddControllers();
}

Now that we are set up, a call to this API will wait for one second and then return the value.

Let's assume this delay is the time needed to process and fetch the data.

For one user we may be fine with that, but once our application scales up, this is going to be a big issue!

As you can see, I have hard-coded the value to return, so we know our data will not change frequently; as explained, this is a perfect case for caching!

Changing the code to support our needs ends up looking like this:

[HttpGet]
public async Task<IActionResult> Get()
{
    var result = MemoryCache.Get<string>("ImportantData");

    if (result == null)
    {
        await Task.Delay(1000);
        result = "Some important data";

        MemoryCache.Set("ImportantData", result);
    }

    return Ok(result);
}

So we first try to get our data from the cache, which on the first call will be a miss; if we got nothing back, we process the data and save it in the cache using the Set(key, value) method.

After that, if we call the endpoint again, it will be much faster because the value now comes straight from the cache!

This reduces the load on our servers and gives a much faster response time!
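And just like in the file-room story, when the underlying data changes, the stale copy must be thrown away. A minimal sketch of that idea, assuming a hypothetical update endpoint in the same controller (the key "ImportantData" matches the one used above, and Remove is the built-in IMemoryCache method):

```csharp
[HttpPost]
public IActionResult Update([FromBody] string newValue)
{
    // ... persist newValue to the real data store here (omitted) ...

    // Evict the stale copy; the next GET will rebuild the cache entry.
    MemoryCache.Remove("ImportantData");

    return NoContent();
}
```

This way the cache never serves outdated data for longer than it takes the next request to rebuild it.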

There are plenty of other methods and extensions that can help us; GetOrCreate is one of my favorite extensions.

[HttpGet]
public async Task<IActionResult> Get()
{
    var result = await MemoryCache.GetOrCreateAsync("ImportantData", async entry =>
    {
        //Many more options to set
        entry.AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(10);
        entry.SlidingExpiration = TimeSpan.FromMinutes(1);
        entry.Priority = CacheItemPriority.Normal;
    
        await Task.Delay(1000);
    //The returned data will be saved in the cache under the provided key
        return "Some important data";
    });

    return Ok(result);
}

As we can see, we reduced the amount of code, and we can set cache options such as expiration time (sliding or absolute), attach a callback for when an entry is evicted, and much more!

Tips for working with a cache…
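For example, the eviction callback can be attached right on the cache entry. This is a sketch, assuming we just want to log when and why an entry was removed:

```csharp
var result = await MemoryCache.GetOrCreateAsync("ImportantData", async entry =>
{
    entry.SlidingExpiration = TimeSpan.FromMinutes(1);

    // Runs after the entry is evicted (expired, removed, or pushed out under memory pressure).
    entry.RegisterPostEvictionCallback((key, value, reason, state) =>
    {
        Console.WriteLine($"Cache entry '{key}' was evicted: {reason}");
    });

    await Task.Delay(1000);
    return "Some important data";
});
```

The EvictionReason passed to the callback tells you whether the entry expired, was removed explicitly, or was dropped for capacity reasons.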

1- Do not over-cache things without proper management!

Trying to cache a lot of things without managing them well will eventually be your server's last wish.

We need to consider the amount of memory we allocate for each key, as we are sacrificing space for speed here.

2- Guard your misses: at some point a cache miss will coincide with many requests hitting the server at once, making it think it needs to fetch the data multiple times; this wastes processing time and ends up in duplicate writes!
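One common way to handle this is to serialize cache misses behind a lock, so only one request rebuilds the entry while the rest wait and then read the fresh value. A sketch using SemaphoreSlim (the field name _cacheLock and the helper method are my own, not part of the framework):

```csharp
private static readonly SemaphoreSlim _cacheLock = new SemaphoreSlim(1, 1);

public async Task<string> GetImportantDataAsync()
{
    // Fast path: no locking needed on a cache hit.
    if (MemoryCache.TryGetValue("ImportantData", out string cached))
        return cached;

    await _cacheLock.WaitAsync();
    try
    {
        // Re-check: another request may have filled the cache while we waited.
        if (MemoryCache.TryGetValue("ImportantData", out cached))
            return cached;

        await Task.Delay(1000); // simulate the expensive fetch
        cached = "Some important data";
        MemoryCache.Set("ImportantData", cached, TimeSpan.FromMinutes(10));
        return cached;
    }
    finally
    {
        _cacheLock.Release();
    }
}
```

The double-check inside the lock is what prevents the duplicate work: only the first request through pays the one-second cost.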

3- Choose the right cache implementation for your application. Here we went through the built-in cache service, and it is sufficient for many small to medium projects.

But there are many more, such as Redis (which we will talk about in the next blog!).

4- Back to our first point: understand your data well, as that will help you choose the right things to cache.

Happy caching for you all 🙂
