
Spice Up your Caching with Convoyr

by Younes Jaaidi • 
Nov 16, 2020 • 5 minutes

Where it All Started

Most web apps get their value from interacting with HTTP APIs. This is generally done using HTTP clients like the native fetch function or Angular's HttpClient.

Once you set up an HTTP client on a web app, sooner or later you will need to extend its capabilities to handle cross-cutting concerns like authentication, caching, and error handling. Luckily, most HTTP clients can easily be extended using interceptors, so you won't have to wrap them or implement your own client.

Even though implementing an interceptor can sound quick and easy, handling these concerns robustly can get expensive. Wouldn't it be better if someone else could handle these issues for us?
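To make the interceptor idea concrete, here is a minimal, framework-free sketch of the pattern. This is an illustration only, not Angular's actual HttpInterceptor API: the names (Interceptor, compose, authInterceptor) and shapes are invented for this example.

```typescript
// A request/response pipeline where each interceptor wraps the next handler.
type Request = { url: string; headers: Record<string, string> };
type Response = { status: number; body: unknown };
type Handler = (req: Request) => Promise<Response>;
type Interceptor = (req: Request, next: Handler) => Promise<Response>;

// Compose interceptors around a terminal handler (the actual HTTP call).
function compose(interceptors: Interceptor[], terminal: Handler): Handler {
  return interceptors.reduceRight<Handler>(
    (next, interceptor) => (req) => interceptor(req, next),
    terminal
  );
}

// Example: an auth interceptor adds a header without the caller knowing.
const authInterceptor: Interceptor = (req, next) =>
  next({ ...req, headers: { ...req.headers, Authorization: 'Bearer token' } });

const handler = compose([authInterceptor], async (req) => ({
  status: 200,
  body: `handled ${req.url} with ${Object.keys(req.headers).length} header(s)`,
}));
```

Each concern (auth, retry, caching...) becomes one small interceptor that can be added or removed independently, which is exactly why the pattern scales better than wrapping the client.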


That's when my friend and I noticed that we kept solving the same problems over and over, so we decided to react by initiating an open-source library called Convoyr.

💡 The Idea Behind Convoyr

While Convoyr is currently focused on extending Angular's HttpClient, it has been designed as a modular and framework-agnostic set of plugins.

We like to think of Convoyr as an infrastructure-agnostic Service Mesh for web apps and JavaScript... even if we are not there yet.

🐢 The Network Latency Performance Problem

In this blog post, we will focus on performance and how to work around network latency issues using Convoyr.

In most cases, when a user navigates from one route to another on the same web app, the main thing preventing us from displaying the result instantly is the network latency related to fetching data from some remote service.

This can be problematic, especially when it comes to re-fetching data that we fetched just a few minutes ago and that hasn't changed since. We end up making users wait, only to show them the same result they already received. Imagine a list of products where the user clicks on a specific product to see its details, then clicks the "back to the list" button. The latency caused by re-fetching the products creates friction.

Here is a component fetching weather data the imperative way:

import { Component, OnInit } from '@angular/core';
import { HttpClient } from '@angular/common/http';

@Component({
  template: `{{ weather | json }}`
})
export class WeatherComponent implements OnInit {
  weather: Weather;

  constructor(private http: HttpClient) {}

  ngOnInit() {
    this.http.get<Weather>('/weather/lyon')
      .subscribe(weather => this.weather = weather);
  }
}

And the same component written the reactive way:

import { Component } from '@angular/core';
import { HttpClient } from '@angular/common/http';

@Component({
  template: `{{ weather$ | async | json }}`
})
export class WeatherComponent {
  weather$ = this.http.get<Weather>('/weather/lyon');

  constructor(private http: HttpClient) {}
}

To set up Convoyr's cache plugin, install the packages:

npm install @convoyr/core @convoyr/angular @convoyr/plugin-cache
Then register the cache plugin in the root module:

import { ConvoyrModule } from '@convoyr/angular';
import { createCachePlugin } from '@convoyr/plugin-cache';

@NgModule({
  imports: [
    ...
    HttpClientModule,
    ConvoyrModule.forRoot({
      plugins: [createCachePlugin()],
    }),
  ],
  ...
})
export class AppModule {}

The cache plugin can also wrap responses with information about their cache state. Enable this with the addCacheMetadata option:

createCachePlugin({
  addCacheMetadata: true
})

With addCacheMetadata enabled, responses are wrapped in an object holding both the data and its cache metadata:

http.get('/weather/lyon')
  .subscribe(data => console.log(data));

This logs something like:

{
  data: {
    temperature: ...,
    ...
  },
  cacheMetadata: {
    createdAt: '2020-01-01T00:00:00.000Z',
    isFromCache: true
  }
}
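The value of caching here is that the cached response can be shown instantly while the fresh one is being fetched. Below is a simplified sketch of that strategy (often called cache-then-network or stale-while-revalidate), assuming a plain Map-backed cache; it mimics the behavior, not Convoyr's internals, and every name in it is illustrative.

```typescript
// Emit the cached value immediately (if any), then the fresh value once
// the request lands, refreshing the cache for next time.
type Emit<T> = (value: T, fromCache: boolean) => void;

function cacheThenNetwork<T>(
  cache: Map<string, T>,
  key: string,
  fetcher: () => Promise<T>,
  emit: Emit<T>
): Promise<void> {
  if (cache.has(key)) {
    emit(cache.get(key)!, true); // instant, possibly stale
  }
  return fetcher().then((fresh) => {
    cache.set(key, fresh); // keep the cache up to date
    emit(fresh, false);
  });
}
```

On a second visit to the same route, the consumer receives two emissions: the cached value right away, then the fresh one, which is why the isFromCache flag in the metadata above matters.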

You can control which requests the cache plugin handles using the shouldHandleRequest option, combined with the built-in matchers:

import { and, matchOrigin, matchPath } from '@convoyr/core';

createCachePlugin({
  shouldHandleRequest: and(
    matchOrigin('marmicode.io'),
    matchPath('/weather')
  )
})
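To show what such matchers amount to, here are simplified reimplementations in the spirit of and, matchOrigin, and matchPath. These are assumptions for illustration: the real @convoyr/core matchers may support more (e.g. pattern matching) and work on a richer request object.

```typescript
// A matcher is just a predicate over a request.
type RequestInfo = { origin: string; path: string };
type Matcher = (req: RequestInfo) => boolean;

const matchOrigin = (origin: string): Matcher =>
  (req) => req.origin === origin;

const matchPath = (path: string): Matcher =>
  (req) => req.path.startsWith(path);

// Combine matchers: every one of them must accept the request.
const and = (...matchers: Matcher[]): Matcher =>
  (req) => matchers.every((matches) => matches(req));

const shouldHandleRequest = and(
  matchOrigin('marmicode.io'),
  matchPath('/weather')
);
```

Because matchers are plain predicates, you can build your own and combine them freely with the provided combinators.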

You can also cap the cache using the storage option, either by number of stored requests:

createCachePlugin({
  storage: new MemoryStorage({ maxSize: 2000 }), // 2000 requests
})

or by memory size:

createCachePlugin({
  storage: new MemoryStorage({ maxSize: '2 mb' }), // 2MB
})