
Handling API request race conditions in React

Sébastien Lorber
August 30th, 2019 · 6 min read

Many blog articles talk about loading api/async data in React apps, with componentDidMount, useEffect, Redux, Apollo…

Yet, all those articles are generally optimistic, and never mention something important to consider: race conditions could happen, and your UI may end up in an inconsistent state.

A picture is worth a thousand words:

(animation: searching for Trump after searching for Macron, but getting Macron's results)

You search for Macron, then change your mind and search for Trump, and you end up with a mismatch between what you want (Trump) and what you get (Macron).

If there is a non-zero probability that your UI could end up in such a state, your app is subject to race conditions.

Why does this happen?

Sometimes, multiple requests are fired in parallel (competing to render the same view), and we just assume the last request will resolve last. Actually, the last request may resolve first, or just fail, leading to the first request resolving last.
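This out-of-order resolution is easy to reproduce with plain promises. Here is a small sketch (fakeRequest and its latencies are made up for illustration):

```js
// Simulate two requests fired in order, where the first one is slower
const fakeRequest = (name, latency) =>
  new Promise(resolve => setTimeout(() => resolve(name), latency));

const resolutionOrder = [];
const r1 = fakeRequest('R1', 300).then(name => resolutionOrder.push(name));
const r2 = fakeRequest('R2', 100).then(name => resolutionOrder.push(name));

Promise.all([r1, r2]).then(() => {
  // R2 was fired last but resolved first
  console.log(resolutionOrder); // ['R2', 'R1']
});
```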

It happens more often than you think. For some apps, it can lead to very serious problems, like a user buying the wrong product, or a doctor prescribing the wrong drug to a patient.

A non-exhaustive list of reasons:

  • The network is slow, bad, unpredictable, with variable request latencies…
  • The backend is under heavy load, throttling some requests, under a Denial-of-Service attack…
  • The user is clicking fast, commuting, travelling, or in the countryside…
  • You are just unlucky

Developers rarely see these issues in development, where network conditions are generally good, and where the backend API sometimes runs on your own computer, with close to 0ms latency.

In this post, I’ll show you what those issues do, using realistic network simulations and runnable demos. I’ll also explain how you can fix those issues, depending on the libraries you already use.

Disclaimer: to keep the focus on race conditions, the following code samples will not prevent the React warning if you setState after unmounting.

The incriminating code:

You probably already read tutorials with the following code:

```js
const StarwarsHero = ({ id }) => {
  const [data, setData] = useState(null);

  useEffect(() => {
    setData(null);

    fetchStarwarsHeroData(id).then(
      result => setData(result),
      e => console.warn('fetch failure', e),
    );
  }, [id]);

  return <div>{data ? data.name : <Spinner />}</div>;
};
```

Or with the class API:

```js
class StarwarsHero extends React.Component {
  state = { data: null };

  fetchData = id => {
    fetchStarwarsHeroData(id).then(
      result => this.setState({ data: result }),
      e => console.warn('fetch failure', e),
    );
  };

  componentDidMount() {
    this.fetchData(this.props.id);
  }

  componentDidUpdate(prevProps) {
    if (prevProps.id !== this.props.id) {
      this.fetchData(this.props.id);
    }
  }

  render() {
    const { data } = this.state;
    return <div>{data ? data.name : <Spinner />}</div>;
  }
}
```

Here is a Starwars Hero slider. Both versions above lead to this same result.

When clicking the arrows very fast, even on your good home network and with a very fast API, you can already see something is wrong. And please don’t think debouncing protects you: it just reduces the chances of being unlucky.

Now let’s see what happens when you are on a train with a few tunnels.

Simulating bad network conditions

Let’s build some utils to simulate bad network conditions:

```js
import { sample } from 'lodash';

// Will return a promise delayed by a random amount, picked from the delays array
const delayRandomly = () => {
  const timeout = sample([0, 200, 500, 700, 1000, 3000]);
  return new Promise(resolve => setTimeout(resolve, timeout));
};

// Will throw randomly with a 1/4 chance
const throwRandomly = () => {
  const shouldThrow = sample([true, false, false, false]);
  if (shouldThrow) {
    throw new Error('simulated async failure');
  }
};
```

Adding network delays

You might be on a slow network, or the backend may take time to answer.

```js
useEffect(() => {
  setData(null);

  fetchStarwarsHeroData(id)
    .then(async data => {
      await delayRandomly();
      return data;
    })
    .then(
      result => setData(result),
      e => console.warn('fetch failure', e),
    );
}, [id]);
```

Adding network delays + failures

You are on a train in the countryside, and there are a few tunnels: requests are delayed randomly and some of them might fail.

```js
useEffect(() => {
  setData(null);

  fetchStarwarsHeroData(id)
    .then(async data => {
      await delayRandomly();
      throwRandomly();
      return data;
    })
    .then(
      result => setData(result),
      e => console.warn('fetch failure', e),
    );
}, [id]);
```

As you can see, this code easily leads to weird, inconsistent UI states.


How to avoid this problem

Let’s suppose 3 requests R1, R2 and R3 are fired in this order, and are all still pending. The solution is to only handle the response from R3, the last issued request.

There are a few ways to do so:

  • Ignoring responses from former API calls
  • Cancelling former API calls
  • Cancelling and ignoring

Ignoring responses from former API calls

Here is one possible implementation.

```js
// A ref to store the last issued pending request
const lastPromise = useRef();

useEffect(() => {
  setData(null);

  // fire the api request
  const currentPromise = fetchStarwarsHeroData(id).then(async data => {
    await delayRandomly();
    throwRandomly();
    return data;
  });

  // store the promise to the ref
  lastPromise.current = currentPromise;

  // handle the result with filtering
  currentPromise.then(
    result => {
      if (currentPromise === lastPromise.current) {
        setData(result);
      }
    },
    e => {
      if (currentPromise === lastPromise.current) {
        console.warn('fetch failure', e);
      }
    },
  );
}, [id]);
```

Some might be tempted to use the id to do this filtering, but it’s not a good idea: if the user clicks next and then previous, we might end up with 2 distinct requests for the same hero. Generally this is not a problem (as the 2 requests will often return the exact same data), but using promise identity is a more generic and portable solution.

Cancelling former API calls

It is better to cancel former API requests in-flight: the browser can then avoid parsing the response and prevent some useless CPU/network usage. fetch supports cancellation thanks to AbortSignal:

```js
const abortController = new AbortController();

// fire the request, with an abort signal,
// which will permit premature abortion
fetch(`https://swapi.co/api/people/${id}/`, {
  signal: abortController.signal,
});

// abort the request in-flight
// the request will be marked as "cancelled" in devtools
abortController.abort();
```

An abort signal is like a little event emitter: you can trigger it (through the AbortController), and every request started with this signal will be notified and cancelled.
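You can observe this event-emitter behavior directly, by listening to the signal’s abort event. A minimal sketch:

```js
const controller = new AbortController();
const { signal } = controller;

// Any number of listeners can subscribe to the same signal
let notified = false;
signal.addEventListener('abort', () => {
  notified = true;
});

controller.abort();

// The signal now reports the aborted state, and listeners have fired
console.log(signal.aborted, notified); // true true
```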

Let’s see how to use this feature to solve race conditions:

```js
// Store the abort controller that will permit aborting
// the last issued request
const lastAbortController = useRef();

useEffect(() => {
  setData(null);

  // When a new request is going to be issued,
  // the first thing to do is cancel the previous request
  if (lastAbortController.current) {
    lastAbortController.current.abort();
  }

  // Create a new AbortController for the new request and store it in the ref
  const currentAbortController = new AbortController();
  lastAbortController.current = currentAbortController;

  // Issue the new request, which may eventually be aborted
  // by a subsequent request
  const currentPromise = fetchStarwarsHeroData(id, {
    signal: currentAbortController.signal,
  }).then(async data => {
    await delayRandomly();
    throwRandomly();
    return data;
  });

  currentPromise.then(
    result => setData(result),
    e => console.warn('fetch failure', e),
  );
}, [id]);
```

This code looks good at first, but actually we are still not safe.

Let’s consider the following code:

```js
const abortController = new AbortController();

fetch('/', { signal: abortController.signal }).then(async response => {
  await delayRandomly();
  throwRandomly();
  return response.json();
});
```

If we abort the request during the fetch, the browser will be notified and do something about it. But if the abortion happens while the browser is running the then() callback, it has no way to handle the abortion of that part of the code: you have to write this logic on your own. For example, if the abortion happens during the fake delay we added, it won’t cancel that delay and stop the flow.

```js
fetch('/', { signal: abortController.signal }).then(async response => {
  await delayRandomly();
  throwRandomly();
  const data = await response.json();

  // Here you can decide to handle the abortion the way you want:
  // throwing or never resolving are both valid options
  if (abortController.signal.aborted) {
    return new Promise(() => {});
  }

  return data;
});
```

Let’s get back to our problem. Here’s the final, safe version: it aborts the request in-flight, but also uses the abortion to filter out stale results. It also uses the hook’s cleanup function, as was suggested to me on Twitter, which makes the code a bit simpler.

```js
useEffect(() => {
  setData(null);

  // Create the current request's abort controller
  const abortController = new AbortController();

  // Issue the request
  fetchStarwarsHeroData(id, {
    signal: abortController.signal,
  })
    // Simulate some delay/errors
    .then(async data => {
      await delayRandomly();
      throwRandomly();
      return data;
    })
    // Set the result, if not aborted
    .then(
      result => {
        // IMPORTANT: we still need to filter the results here,
        // in case abortion happens during the delay.
        // In real apps, abortion could happen while you are parsing the json,
        // with code like "fetch().then(res => res.json())",
        // but also in any other async then() you execute after the fetch
        if (abortController.signal.aborted) {
          return;
        }
        setData(result);
      },
      e => console.warn('fetch failure', e),
    );

  // Trigger the abortion in useEffect's cleanup function
  return () => {
    abortController.abort();
  };
}, [id]);
```

And now, finally, we are safe.

Using libraries

Doing all this manually is complex and error-prone. Fortunately, some libraries solve this problem for you. Let’s explore a non-exhaustive list of libraries generally used for loading data into React.

Redux

There are multiple ways to load data into a Redux store. Generally, if you are using Redux-saga or Redux-observable, you are fine. For Redux-thunk, Redux-promise and other middlewares, you might check the “vanilla React/Promise” solutions in the next sections.

Redux-saga

You might notice there are multiple take methods on the Redux-saga API, but generally you’ll find many examples using takeLatest. This is because takeLatest will protect you against those race conditions.

> Forks a saga on each action dispatched to the Store that matches pattern. And automatically cancels any previous saga task started previously if it's still running.

```js
function* loadStarwarsHeroSaga() {
  yield* takeLatest(
    'LOAD_STARWARS_HERO',
    function* loadStarwarsHero({ payload }) {
      try {
        const hero = yield call(fetchStarwarsHero, [payload.id]);
        yield put({
          type: 'LOAD_STARWARS_HERO_SUCCESS',
          hero,
        });
      } catch (err) {
        yield put({
          type: 'LOAD_STARWARS_HERO_FAILURE',
          err,
        });
      }
    },
  );
}
```

The previous loadStarwarsHero generator executions will be “cancelled”. Unfortunately the underlying API request will not really be cancelled (you need an AbortSignal for that), but Redux-saga will ensure that the success/error actions will only be dispatched to Redux for the last requested Starwars hero. For in-flight request cancellation, follow this issue.

You can also opt-out from this protection and use take or takeEvery.

Redux-observable

Similarly, Redux-observable (actually RxJS) has a solution: switchMap:

> The main difference between switchMap and other flattening operators is the cancelling effect. On each emission the previous inner observable (the result of the function you supplied) is cancelled and the new observable is subscribed. You can remember this by the phrase switch to a new observable.

```js
const loadStarwarsHeroEpic = action$ =>
  action$.ofType('LOAD_STARWARS_HERO').switchMap(action =>
    Observable.ajax(`http://data.com/${action.payload.id}`)
      .map(hero => ({
        type: 'LOAD_STARWARS_HERO_SUCCESS',
        hero,
      }))
      .catch(err =>
        Observable.of({
          type: 'LOAD_STARWARS_HERO_FAILURE',
          err,
        }),
      ),
  );
```

You can also use other RxJS operators like mergeMap if you know what you are doing, but many tutorials will use switchMap, as it’s a safer default. Like Redux-saga, it won’t cancel the underlying request in-flight, but there are solutions to add this behavior.

Apollo

Apollo lets you pass down GraphQL query variables. Whenever the Starwars hero id changes, a new request is fired to load the appropriate data. Whether you use the HOC, the render props, or the hooks, Apollo will always guarantee that if you request id: 2, your UI will never show you the data of another Starwars hero.

```js
const { data } = useQuery(GET_STARWARS_HERO, {
  variables: { id },
});

if (data) {
  // This is always true, hopefully!
  assert(data.id === id);
}
```

Vanilla React

There are many libraries to load data into React components, without needing a global state management solution.

I created react-async-hook: a very simple and tiny hooks library to load async data into React components. It has very good native TypeScript support, and protects you against race conditions by using the techniques discussed above.

```js
import { useAsync } from 'react-async-hook';

const fetchStarwarsHero = async id =>
  (await fetch(`https://swapi.co/api/people/${id}/`)).json();

const StarwarsHero = ({ id }) => {
  const asyncHero = useAsync(fetchStarwarsHero, [id]);
  return (
    <div>
      {asyncHero.loading && <div>Loading</div>}
      {asyncHero.error && <div>Error: {asyncHero.error.message}</div>}
      {asyncHero.result && (
        <div>
          <div>Success!</div>
          <div>Name: {asyncHero.result.name}</div>
        </div>
      )}
    </div>
  );
};
```

Other options protecting you:

There are many other library options, for which I won’t be able to tell you whether they protect you: take a look at their implementations.

Note: it’s possible react-async-hook and react-async will merge in the coming months.

Note: it’s possible to use <StarwarsHero key={id} id={id} /> as a simple React workaround, to ensure the component remounts every time the id changes. This will protect you (and is sometimes a useful feature), but gives more work to React.

Vanilla promises and JavaScript

If you are dealing with vanilla promises and JavaScript, here are simple tools you can use to prevent those issues.

Those tools can also be useful to handle race conditions if you are using thunks or promises with Redux.

Note: some of these tools are actually low-level implementation details of react-async-hook.

Cancellable promises

React has an old blog post, isMounted() is an antipattern, in which you’ll learn how to make a promise cancellable to avoid the setState-after-unmount warning. The promise is not really cancellable (the underlying API call won’t be cancelled), but you can choose to ignore or reject its result.
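The pattern from that blog post can be sketched roughly like this (a simplified version, not necessarily the exact code from the React post):

```js
// Wrap a promise so that its result can be ignored after cancel() is called.
// The underlying work still runs; we just refuse to deliver its result.
const makeCancelable = promise => {
  let hasCanceled = false;
  const wrappedPromise = new Promise((resolve, reject) => {
    promise.then(
      val => (hasCanceled ? reject({ isCanceled: true }) : resolve(val)),
      error => (hasCanceled ? reject({ isCanceled: true }) : reject(error)),
    );
  });
  return {
    promise: wrappedPromise,
    cancel() {
      hasCanceled = true;
    },
  };
};
```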

I made a library awesome-imperative-promise to make this process easier:

```js
import { createImperativePromise } from 'awesome-imperative-promise';

const id = 1;

const { promise, resolve, reject, cancel } = createImperativePromise(
  fetchStarwarsHero(id),
);

// will resolve the returned promise manually
resolve({
  id,
  name: 'R2D2',
});

// will reject the returned promise manually
reject(new Error("can't load Starwars hero"));

// will ensure the returned promise never resolves or rejects
cancel();
```

Note: all those methods have to be called before the underlying API request resolves or rejects. If the promise is already resolved, there’s no way to “unresolve” it.

Automatically ignoring last call

awesome-only-resolves-last-promise is a library to ensure we only handle the result of the last async call:

```js
import { onlyResolvesLast } from 'awesome-only-resolves-last-promise';

const fetchStarwarsHeroLast = onlyResolvesLast(fetchStarwarsHero);

const promise1 = fetchStarwarsHeroLast(1);
const promise2 = fetchStarwarsHeroLast(2);
const promise3 = fetchStarwarsHeroLast(3);

// promise1: won't resolve
// promise2: won't resolve
// promise3: WILL resolve
```
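Under the hood, such a wrapper can be built with the call-identity filtering technique shown earlier. Here is a minimal sketch (my own simplified version, not the library’s actual code):

```js
// Wrap an async function so that only the promise returned by the most
// recent call ever settles; promises from earlier calls stay pending forever.
const onlyResolvesLastSketch = asyncFn => {
  let lastCallId = 0;
  return (...args) => {
    const callId = ++lastCallId;
    return new Promise((resolve, reject) => {
      asyncFn(...args).then(
        result => {
          if (callId === lastCallId) resolve(result);
        },
        error => {
          if (callId === lastCallId) reject(error);
        },
      );
    });
  };
};
```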

What about Suspense?

It should prevent those issues, but let’s wait for the official release :)

Conclusion

For your next React data loading usecase, I hope you will consider handling race conditions properly.

I can also recommend hardcoding some small delays into your API requests in the development environment. Potential race conditions and bad loading experiences will be easier to notice. I also think it’s safer to make this delay mandatory, instead of expecting each developer to turn on the slow network option in devtools.
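For example, a small wrapper could enforce that delay (a hypothetical helper: the name and delay value are mine):

```js
// Delay applied to every API call outside production,
// so slow-network behavior is visible to every developer
const DEV_DELAY_MS = 200;

const withDevDelay = asyncFn => async (...args) => {
  const result = await asyncFn(...args);
  if (process.env.NODE_ENV !== 'production') {
    await new Promise(resolve => setTimeout(resolve, DEV_DELAY_MS));
  }
  return result;
};
```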

I hope you’ve found this post interesting and that you learned something: it was my first technical blog post ever :)


If you like it, spread the word with a Retweet

Browse the demos code or correct my post typos on the blog repo

For more content like this, subscribe to my mailing list and follow me on Twitter.

Thanks to my reviewers: Shawn Wang, Mateusz Burzyński, Andrei Calazans, Adrian Carolli, Clément Oriol, Thibaud Duthoit, Bernard Pratz
