The Best New Way To Cache API Responses with Angular and RxJs

Let's learn the best new way to implement time-based caching for API response (or any other) RxJs streams in our Angular applications!

Tomas Trajan


@tomastrajan

Jul 7, 2022

9 min read


The time has come, clean RxJs time-based caching is now possible! 📸 by Elena Koycheva & 🎨 by Tomas Trajan

Hey folks! Welcome 👋

This article is a rather special one!

Over the years, I've repeatedly gotten caught up in trying to implement this use case in a clean way, often involving many colleagues and spending a bit too much time, but always without proper success.

Even though we could always come up with a working solution, the solution we got felt downright dirty…

Until now…

Today we’re going change that and learn about the best new way to implement time based caching for the API responses (or any other) RxJs streams in our Angular applications!

☕ This article is pretty focused on a single topic, so you should be able to get through it in one go, still TLDR; can’t hurt nobody 😉

TLDR

  • Original example use case of retrieving and caching the apiKey
  • Previous approaches to solving it (before RxJs 7.1) and their flaws
  • A new, better solution with the help of the improved share operator available in RxJs 7.1+
  • Refactoring our original implementation
  • Caveats, gotchas, and a comparison of possible solutions and their trade-offs
  • Working solution (StackBlitz) & Cheat Sheet

The Original Use Case

Let’s imagine an Angular application where in order to retrieve some data from a API endpoint (aka backend) we have to provide two HTTP headers:

  1. A standard access-token, e.g. a JWT token, which we retrieve when we sign in
  2. A custom api-key which we have to retrieve from its dedicated endpoint (and we can, because we already have the access-token, which is sufficient…)

Besides that, the api-key has time-restricted validity which is not really predictable, please don't ask me why… Let's just say it will always be valid for at least 10 seconds after it was retrieved, potentially all the way up to one month 😅😅
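
To make this concrete, here is a rough sketch of what a "real" request needs to look like once both headers are in place. The '/api/data' endpoint and the loadData helper are made up purely for illustration; the x-api-key header name matches the interceptor code later in this article.

import { HttpClient, HttpHeaders } from '@angular/common/http';

// Illustration only: '/api/data' and loadData are hypothetical
function loadData(httpClient: HttpClient, accessToken: string, apiKey: string) {
  const headers = new HttpHeaders({
    Authorization: `Bearer ${accessToken}`, // 1. standard access token (e.g. JWT)
    'x-api-key': apiKey, // 2. custom API key with time-restricted validity
  });
  return httpClient.get<unknown>('/api/data', { headers });
}

A hypothetical "real" request which needs both headers before it can be sent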

Let’s explore what are our options to solve this use case…

👎 Retrieve key for every request we make

Our Angular interceptor which sets the HTTP headers could retrieve a fresh API key for every request we make:

  • ✅ We can be sure that none of our requests will fail because we're using an outdated API key…
  • ❌ Getting a fresh API key for every request will delay that request significantly because we will effectively have to always make two requests instead of just one
// api key service
@Injectable({ providedIn: 'root' })
export class ApiKeyService {
  // cold stream definition
  apiKey$ = this.httpClient.get<string>(API_KEY_ENDPOINT);

  constructor(private httpClient: HttpClient) {}
}

// auth interceptor
@Injectable()
export class AuthInterceptor implements HttpInterceptor {
  constructor(
    private accessTokenService: AccessTokenService,
    private apiKeyService: ApiKeyService,
  ) {}

  intercept(
    request: HttpRequest<any>,
    next: HttpHandler,
  ): Observable<HttpEvent<any>> {
    const accessToken = this.accessTokenService.getAccessToken();
    return this.apiKeyService.apiKey$.pipe(
      // will trigger extra backend request
      concatMap((apiKey) => {
        request = request.clone({
          setHeaders: {
            Authorization: `Bearer ${accessToken}`,
            'x-api-key': apiKey,
          },
        });
        return next.handle(request);
      }),
    );
  }
}

Example of a solution which triggers an additional backend request to get a fresh API key for every real request

In the example above, the ApiKeyService exposes the apiKey$ stream definition (cold stream), which will then be executed for every real request in the intercept() method of our AuthInterceptor.

👎 Retrieve and cache the API key forever. After that, just close your eyes, hope for the best 🤞 and retry requests that failed because the API key was invalid 😅

  • ✅ Good for performance: the API key is retrieved initially and then cached, and every subsequent real request will reuse the cached API key, but…
  • ❌ The request might fail because the API key became invalid; in that case we have to implement extra logic which re-fetches the API key and retries the failed request, which is a lot of additional code to implement and maintain… (a minimal sketch of this approach follows below)
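
For reference, a minimal sketch of the "cache forever" variant could look something like the following, assuming the same hypothetical API_KEY_ENDPOINT as in the other snippets. The re-fetch and retry logic is deliberately left out, because that is exactly the part which gets complicated.

import { Injectable } from '@angular/core';
import { HttpClient } from '@angular/common/http';
import { shareReplay } from 'rxjs/operators';

// API_KEY_ENDPOINT is assumed to be defined elsewhere, as in the other examples
@Injectable({ providedIn: 'root' })
export class ApiKeyService {
  // the first subscriber triggers the request, the response is then
  // replayed to every later subscriber for the whole application lifetime
  apiKey$ = this.httpClient.get<string>(API_KEY_ENDPOINT).pipe(shareReplay(1));

  constructor(private httpClient: HttpClient) {}
}

Sketch of the "cache forever" approach; the missing re-fetch and retry logic is its real cost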

👎 Refresh key periodically

  • ✅ We can be sure that none of our requests will fail because we're using an outdated key…
  • ❌ Getting the API key periodically can be bad for backend performance and even costly in case of a cloud-based solution. Imagine we have thousands of concurrent users who leave the tab with the application open even though they are not really interacting with it. They don't make any real requests, but we will still re-fetch a new API key every minute or so…
@Injectable({ providedIn: 'root' })
export class ApiKeyService {
  apiKey: string; // will be accessed sync with apiKeyService.apiKey
  constructor(private httpClient: HttpClient) {
    timer(0, 60_000)
      .pipe(switchMap(() => this.httpClient.get<string>(API_KEY_ENDPOINT)))
      .subscribe((apiKey) => {
        this.apiKey = apiKey;
      }); // no need to unsubscribe, global singleton
    // should run for the whole app life time...
  }
}

Example of a solution where we refresh the API key on an interval (e.g. every minute)

Now is the time to explore the first real solution for our API key caching use case!

😑 Retrieve key in a lazy way (only when there are requests) and cache it only for a short period of time

  • ✅ No unnecessary requests when the user doesn't use the application (e.g. leaves the tab open)
  • ✅ No performance penalty, requests which happen in quick succession will reuse the same API key (we don't retrieve a fresh API key for every request)
  • ✅ No need to implement special retry logic because the key is guaranteed to be fresh enough to rule out that scenario
  • ❌ Problematic implementation: manually managed local state, not a self-contained RxJs stream-based solution, and the stream has to be accessed with a factory method instead of just a public property…
@Injectable({ providedIn: 'root' })
export class ApiKeyService {
  apiKey$: Observable<string> | undefined;

  constructor(private httpClient: HttpClient) {}

  getApiKey(): Observable<string> {
    if (this.apiKey$) {
      // if key exists
      return this.apiKey$; // return cached API key
    } else {
      this.apiKey$ = this.httpClient
        .get<string>(API_KEY_ENDPOINT)
        .pipe(shareReplay(1)); // retrieve new API key
      setTimeout(() => {
        // setup cache invalidation
        this.apiKey$ = undefined; // unset stream
      }, CACHE_TIMEOUT); // cache invalidation timeout
      return this.apiKey$;
    }
  }
}

Example of a solution which implements time-based stream caching. The solution works, but it requires manual local state management, in contrast with the desired self-contained RxJs stream…

Problems with our custom RxJs caching solution

The solution above works just fine, but it's not really nice or clean.

We’re manually managing local state with imperative logic around our RxJs stream which just doesn’t feel right…

It is always* possible to define an RxJs stream with all its sources of change from the start, without the need to re-create streams at runtime

*the statement above holds true; in the last 6 years, this was the only use case I was not able to solve without re-creating a stream at runtime 😔

But don’t worry, everything changes with the advent of RxJs 7.1 and its new improved share operator which will allow us to do it right! 💪

Follow me on Twitter because you will get notified about new Angular blog posts and cool frontend stuff!😉

The new improved share operator

RxJs 7.1 brought us a new, improved share operator and, more importantly, a more powerful way to configure it!

Let’s refactor our last local state based caching solution…

const CACHE_TIMEOUT = 10 * 1000; // 10 seconds

@Injectable({ providedIn: 'root' })
export class ApiKeyService {
  apiKey$ = this.httpClient.get<string>(API_KEY_ENDPOINT).pipe(
    tap(() => console.log('[DEBUG] request happened')),
    share({
      // HttpClient.get is a completing stream
      // eg '---a|' (marble diagram)
      resetOnComplete: () => timer(CACHE_TIMEOUT),
      // as it completes, we start a timer which will reset the stream
      // when finished, this means that the last API key value will be
      // shared with all subscribers until the timer is triggered which
      // is the desired time-based caching behavior
    }),
  );

  constructor(private httpClient: HttpClient) {}
}

Example of the new, improved, pure RxJs-based solution to time-based caching for our API key use case

  • ✅ In this solution we do NOT need any kind of local state or a stream factory method, everything is nicely self-contained within the RxJs stream
  • ✅ It is enough to just store the stream definition in a public property of the service, which can then be accessed with a simple apiKeyService.apiKey$ property access (the interceptor from our very first example can consume it unchanged, see the sketch below)
  • ✅ The stream definition is cold (lazy), so it won't trigger any request until the first consumer subscribes to it! In our case it will wait until the application makes its first "real" request to some other endpoint
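
As a quick sanity check, here is an abbreviated sketch of that consumer side. It is essentially the interceptor from the very first example, only now apiKey$ serves the cached key within the caching window instead of triggering a fresh request every time (the access token header is omitted for brevity).

import { Injectable } from '@angular/core';
import {
  HttpEvent,
  HttpHandler,
  HttpInterceptor,
  HttpRequest,
} from '@angular/common/http';
import { Observable } from 'rxjs';
import { concatMap } from 'rxjs/operators';
import { ApiKeyService } from './api-key.service'; // path is an assumption

@Injectable()
export class AuthInterceptor implements HttpInterceptor {
  constructor(private apiKeyService: ApiKeyService) {}

  intercept(
    request: HttpRequest<any>,
    next: HttpHandler,
  ): Observable<HttpEvent<any>> {
    // apiKey$ now serves the cached key, so no extra backend request happens
    // as long as we are within the caching window
    return this.apiKeyService.apiKey$.pipe(
      concatMap((apiKey) =>
        next.handle(request.clone({ setHeaders: { 'x-api-key': apiKey } })),
      ),
    );
  }
}

Abbreviated interceptor reusing the cached apiKey$ stream, unchanged compared to the first example (apart from the omitted access token header)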

Let’s see how our solution behaves during the runtime!

@Injectable()
class AuthInterceptor {
  constructor(private apiKeyService: ApiKeyService) {}

  intercept() {
    // unrealistic, for demonstration purposes only

    this.apiKeyService.apiKey$.subscribe(console.log); // logs: [DEBUG] request
    // logs: apiKey1
    this.apiKeyService.apiKey$.subscribe(console.log); // logs: apiKey1
    this.apiKeyService.apiKey$.subscribe(console.log); // logs: apiKey1
    this.apiKeyService.apiKey$.subscribe(console.log); // logs: apiKey1

    setTimeout(() => {
      this.apiKeyService.apiKey$.subscribe(console.log); // ⚠️ doesn't log anything !?
      this.apiKeyService.apiKey$.subscribe(console.log);
    }, 1000); // less than caching timeout

    setTimeout(() => {
      this.apiKeyService.apiKey$.subscribe(console.log); // logs: [DEBUG] request
      // logs: apiKey2
      this.apiKeyService.apiKey$.subscribe(console.log); // logs: apiKey2
    }, 11_000); // more than caching timeout
  }
}

Example of a timeline of what could happen with our new, pure RxJs, time-based stream caching solution

As we can see, the first subscription triggers the request to retrieve the API key, and the subsequent subscriptions receive the cached API key without triggering another request, great!

Once enough time has passed (more than the cache timeout), the next subscription will trigger another request and the new value will be cached again…

The whole process will keep repeating itself as long as there are more subscriptions (in our case, requests handled by the interceptor), so only as long as the user interacts with our app, which is as lazy as it gets 👍

⚠️ Still, the solution doesn’t work perfectly yet…

The subscriptions which happened after a delay that was still within the caching window (but after the initial subscription) didn't trigger a new request ✅, but they also didn't receive any API key ❌… (look for the ⚠️ icon in the code example above)

Let’s fix this behavior with our last change for today

The share operator uses an RxJs Subject behind the scenes, as the Subject is the RxJs way to implement multicasting of an Observable.

This behavior of the share operator can be adjusted by overriding its connector option.

By default, the connector will use the basic RxJs Subject, which emits events in a mode best described as "fire and forget". This is the reason why subscribers who subscribed to our stream of API keys after some delay did not receive any API key: the event happened in the past, and the basic Subject does not have any mechanism to remember the last emitted value!

Luckily, this can be quickly fixed by overriding the connector and using another RxJs subject called ReplaySubject.

The desired behavior can then be achieved with connector: () => new ReplaySubject(1), as we're only interested in the latest API key!
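
To make the difference tangible outside of the share operator, here is a tiny standalone comparison (illustration only) of how the two subjects treat a subscriber that arrives after the value was already emitted:

import { Subject, ReplaySubject } from 'rxjs';

const plainSubject = new Subject<string>();
plainSubject.next('apiKey1'); // emitted before anybody subscribed
plainSubject.subscribe(console.log); // logs nothing, the value is gone ("fire and forget")

const replaySubject = new ReplaySubject<string>(1); // buffer size 1 = remember the last value
replaySubject.next('apiKey1');
replaySubject.subscribe(console.log); // logs: apiKey1, the last value is replayed

Standalone illustration of the difference between Subject and ReplaySubject(1) for late subscribers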

Let’s see it in action 😉

const CACHE_TIMEOUT = 10 * 1000; // 10 seconds

@Injectable({ providedIn: 'root' })
export class ApiKeyService {
  apiKey$ = this.httpClient.get<string>(API_KEY_ENDPOINT).pipe(
    tap(() => console.log('[DEBUG] request happened')),
    share({
      // fix the problem where later subscribers
      // did not receive cached API key
      connector: () => new ReplaySubject(1), // override default "new Subject()"
      resetOnComplete: () => timer(CACHE_TIMEOUT),
    }),
  );

  constructor(private httpClient: HttpClient) {}
}

Finished example with a properly working solution which behaves exactly as we want!

Let’s see how our fixed solution behaves during the runtime…

@Injectable()
class AuthInterceptor {
  constructor(private apiKeyService: ApiKeyService) {}

  intercept() {
    // unrealistic, for demonstration purposes only

    this.apiKeyService.apiKey$.subscribe(console.log); // logs: [DEBUG] request
    // logs: apiKey1
    this.apiKeyService.apiKey$.subscribe(console.log); // logs: apiKey1
    this.apiKeyService.apiKey$.subscribe(console.log); // logs: apiKey1
    this.apiKeyService.apiKey$.subscribe(console.log); // logs: apiKey1

    setTimeout(() => {
      this.apiKeyService.apiKey$.subscribe(console.log); // ✅ logs: apiKey1
      this.apiKeyService.apiKey$.subscribe(console.log); // ✅ logs: apiKey1
    }, 1000); // less than caching timeout

    setTimeout(() => {
      this.apiKeyService.apiKey$.subscribe(console.log); // logs: [DEBUG] request
      // logs: apiKey2
      this.apiKeyService.apiKey$.subscribe(console.log); // logs: apiKey2
    }, 11_000); // more than caching timeout
  }
}

The subscriptions which happen after a delay that is still within the caching window (but after the initial subscription) don't trigger a new request ✅ and correctly receive the latest API key ✅, hooray! 🎉

Check out the working solution in StackBlitz!

Quick cheat sheet to share with your colleagues 😉

BONUS: Why did we use ReplaySubject instead of the more common BehaviorSubject?

The last step of our implementation was to override the connector option, which uses new Subject() by default, with connector: () => new ReplaySubject(1).

That approach fixed our issue where later subscribers did not receive the last stored API key, because the default Subject handles events in a way best described as "fire and forget".

Some of you might be wondering why we chose new ReplaySubject(1) instead of new BehaviorSubject(''), which is a very interesting question that will bring us further insights!

  • ✅ Both subjects store the last stream value and make it available to subscribers who subscribe AFTER the value was originally emitted; that way we do not lose the value, as was the case with the plain Subject
  • ✅ The ReplaySubject(1) will store the last emitted value and make it available to future subscribers, but it does **NOT** need an initial value
  • ⚠️ The BehaviorSubject('some initial value') mandates an initial value which it will provide to every new subscriber immediately. In our case it would either lead to an error, as 'some initial value' is not a valid API key, or we would need to provide an additional filter(apiKey => apiKey !== 'some initial value') to prevent that… (see the snippet below)
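
Another tiny standalone illustration (not tied to the share operator) shows the difference in behavior around the initial value:

import { BehaviorSubject, ReplaySubject } from 'rxjs';

const behaviorSubject = new BehaviorSubject<string>('some initial value');
behaviorSubject.subscribe(console.log); // logs: 'some initial value' immediately (not a valid API key!)

const replaySubject = new ReplaySubject<string>(1);
replaySubject.subscribe(console.log); // logs nothing yet, no initial value is required
replaySubject.next('apiKey1'); // now logs: apiKey1

Standalone illustration of why BehaviorSubject's mandatory initial value gets in the way for our API key use case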

We made it! 💪

I hope you enjoyed learning about the best way to cache backend API responses in your Angular applications with the help of the new, improved share operator available since RxJs 7.1.

Now go and start caching your streams where it makes sense, to deliver faster apps and save network bandwidth for your users!

Also, don’t hesitate to ping me if you have any questions using the article responses or Twitter DMs @tomastrajan

And never forget, the future is bright

Obviously the bright future! (📷 by [Kamil Kalbarczyk](https://unsplash.com/@kamilkalb))

