Changing data via a REST API is easy: just call `axios.post` or `axios.patch` in a React click handler. Done.
Unfortunately, production apps need more than that: you have to handle loading and error states, invalidate the cache, or refetch data. As a result, your code can easily turn into messy spaghetti.
Luckily, libraries like react-query come to our help. They not only give us a lot of these features out of the box but also let us build advanced data-driven features with a snappy user experience without much effort.
On this page, you can see how `react-query` helps us build such a snappy data-driven component. As an example, we'll build a paginated table component that allows the user to remove single rows by clicking a button. This doesn't sound too hard at first. But the combination of pagination and removing items quickly leads us to a handful of problems and edge cases.
As always, the devil is in the details. But the end result (using techniques like cache invalidation, request cancellation, and optimistic updates) speaks for itself. Just look how fast the app is even though each click on the button sends a request:
Note: this is the second part of a series on React and REST APIs. If you want to learn more about fetching data (instead of mutating it) based on an advanced example, read the first part.
Nobody wants to read through the setup of a new React app, I assume. So I prepared a realistic project that we can use as a slightly more advanced example.
It’s an error-tracking tool similar to Sentry that I created for the React Job Simulator. It includes a React / Next.js frontend and a REST API which we will connect to.
Here's what it looks like.
The application fetches a list of issues from an API and renders them as a table. On the right of each row, you can see a button that “resolves” an issue. The user can click it when they have fixed a bug in their application and want to remove the corresponding issue from the list.
You can see that this list has several pages (note the “Previous” and “Next” buttons at the bottom). This will give us some headaches later.
In this example, we use two REST endpoints:

- `prolog-api.profy.dev/v2/issue?status=open` to fetch the paginated list of open issues
- `prolog-api.profy.dev/v2/issue/{id}` to update a single issue
You can find more details about this REST API in its Swagger API documentation.
To start on the same page, here is the whole component without the data fetching logic. We have a table that shows the issues in its `tbody` element. At the bottom, you can see the pagination, which isn't relevant to this article. But if you're interested in building a paginated table, have a look at the previous part of this series.
```jsx
import { useState } from "react";

export function IssueList() {
  // state variable used for pagination
  const [page, setPage] = useState(1);

  // this is where our REST API connections go
  const issuePage = ...
  const onClickResolve = ...

  return (
    <Container>
      <Table>
        <thead>
          <HeaderRow>
            <HeaderCell>Issue</HeaderCell>
            <HeaderCell>Level</HeaderCell>
            <HeaderCell>Events</HeaderCell>
            <HeaderCell>Users</HeaderCell>
          </HeaderRow>
        </thead>
        <tbody>
          {(issuePage.items || []).map((issue) => (
            <IssueRow
              key={issue.id}
              issue={issue}
              onClickResolve={() => onClickResolve(issue.id)}
            />
          ))}
        </tbody>
      </Table>
      <Pagination page={page} setPage={setPage} />
    </Container>
  );
}
```
Sending requests in a React app isn’t difficult. In our case, we can use a `useEffect` to fetch data as soon as the component renders. And to send a request when the user clicks the “Resolve” button, we can use a simple click handler.
Easy peasy.
```jsx
import axios from "axios";
import { useEffect, useState } from "react";

const requestOptions = { headers: { Authorization: "tutorial-access-token" } };

export function IssueList() {
  const [page, setPage] = useState(1);

  // fetch the data and store it in a state variable
  const [issuePage, setIssuePage] = useState({ items: [], meta: undefined });
  useEffect(() => {
    axios
      .get("https://prolog-api.profy.dev/v2/issue?status=open", requestOptions)
      .then(({ data }) => setIssuePage(data));
  }, []);

  // update the issue status to resolved when clicking the button
  const onClickResolve = (issueId) => {
    axios.patch(
      `https://prolog-api.profy.dev/v2/issue/${issueId}`,
      { status: "resolved" },
      requestOptions,
    );
  };

  return (
    <Container>
      <Table>
        <thead>...</thead>
        <tbody>
          {(issuePage.items || []).map((issue) => (
            <IssueRow
              key={issue.id}
              issue={issue}
              onClickResolve={() => onClickResolve(issue.id)}
            />
          ))}
        </tbody>
      </Table>
      <Pagination page={page} setPage={setPage} />
    </Container>
  );
}
```
And unsurprisingly, this works. We see the data in the table. And when we click the button we can see in the dev tools’ network tab that a PATCH request has been sent.
The problem is that the UX isn’t that great. The row the user clicked should disappear from the table. And it does… once we reload the page.
We should be able to improve this behavior. Let’s try to refetch the data after the patch request.
```jsx
export function IssueList() {
  // fetch the data and store it in a state variable
  const [issuePage, setIssuePage] = useState({ items: [], meta: undefined });

  // incrementing this counter triggers a refetch
  const [invalidated, setInvalidated] = useState(0);
  useEffect(() => {
    axios
      .get("https://prolog-api.profy.dev/v2/issue?status=open", requestOptions)
      .then(({ data }) => setIssuePage(data));
  }, [invalidated]);

  // update the issue status to resolved when clicking the button
  const onClickResolve = (issueId) => {
    axios
      .patch(
        `https://prolog-api.profy.dev/v2/issue/${issueId}`,
        { status: "resolved" },
        requestOptions,
      )
      .then(() => {
        setInvalidated((count) => count + 1);
      });
  };

  ...
```
Cringe… that looks a bit hacky. But it works.
As soon as we click the “Resolve” button, the PATCH request is sent. And once that request resolves the table data is refetched. As a result, the row disappears from the table.
The UX is better but as you can see it’s not great yet. There’s a significant delay between clicking the button and the row being removed. Either we need to show a loading indicator (boring!) or make this snappier.
A common technique to achieve a snappier UX is called “optimistic updates”: the app pretends the request was successful right away.
In our case, that means the issue is removed from the table as soon as we click the “Resolve” button. That’s not hard to do. We simply remove the issue to be resolved from the state before we send the request.
```jsx
export function IssueList() {
  const [issuePage, setIssuePage] = useState({ items: [], meta: undefined });
  ...

  const onClickResolve = (issueId) => {
    // optimistic update: remove issue from the list
    setIssuePage((data) => ({
      ...data,
      items: data.items.filter((issue) => issue.id !== issueId),
    }));

    axios
      .patch(
        `https://prolog-api.profy.dev/v2/issue/${issueId}`,
        { status: "resolved" },
        requestOptions,
      )
      .then(() => {
        setInvalidated((count) => count + 1);
      });
  };

  ...
```
Again, this works. But the code gets messier and, in fact, buggy as well.
The initial implementation with react-query isn’t much shorter than our “simple” approach above. But as shown in the previous article we get a lot of things for free (like loading and error states, caching, and so on). For simplicity, we won’t go deeper into these topics here though.
```jsx
import axios from "axios";
import { useMutation, useQuery } from "@tanstack/react-query";

const requestOptions = { headers: { Authorization: "tutorial-access-token" } };

export function IssueList() {
  const issuePage = useQuery(["issues"], async () => {
    const { data } = await axios.get(
      "https://prolog-api.profy.dev/v2/issue?status=open",
      requestOptions,
    );
    return data;
  });

  const resolveIssueMutation = useMutation((issueId) =>
    axios.patch(
      `https://prolog-api.profy.dev/v2/issue/${issueId}`,
      { status: "resolved" },
      requestOptions,
    ),
  );

  const { items, meta } = issuePage.data || {};

  ...
}
```
To fetch the issues from our GET endpoint, we can use the `useQuery` hook. The first parameter `["issues"]` is the identifier for this query in the cache. The second parameter is the function responsible for fetching the data.
To send the PATCH request that updates an issue, we can use the `useMutation` hook. In our case, we can simply pass the function responsible for sending the PATCH request.
Sending the PATCH request is now easy. We simply call the `mutate` function that is returned by the `useMutation` hook.
```jsx
<IssueRow
  key={issue.id}
  issue={issue}
  resolveIssue={() => resolveIssueMutation.mutate(issue.id)}
/>
```
As in the previous example, this sends the request but doesn’t update the table data until we refresh the page.
To refetch the table data we luckily don’t need any hacky workarounds as before. We can use one of the callbacks that react-query offers in the mutation options.
The first one that we'll use is `onSettled`. This callback is fired once a mutation is finished, no matter if it was a success or an error. It works kind of like the `finally` method of a promise.
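To make the analogy concrete, here is a plain-promise sketch (no react-query involved) showing that `finally`, just like `onSettled`, runs in both the success and the error case:

```javascript
// Plain-promise sketch of the onSettled analogy: .finally() runs
// regardless of whether the promise resolves or rejects.
async function demo() {
  const calls = [];

  // success case: finally still runs
  await Promise.resolve("ok").finally(() => calls.push("settled after success"));

  // error case: finally runs as well, after the error was handled
  await Promise.reject(new Error("request failed"))
    .catch(() => calls.push("error handled"))
    .finally(() => calls.push("settled after error"));

  return calls;
}
```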
To refetch the table data after the patch request we flag it as invalidated.
```jsx
export function IssueList() {
  const issuePage = useQuery(["issues"], async () => ...);

  const queryClient = useQueryClient();
  const resolveIssueMutation = useMutation(
    (issueId) =>
      axios.patch(
        `https://prolog-api.profy.dev/v2/issue/${issueId}`,
        { status: "resolved" },
        requestOptions,
      ),
    {
      onSettled: () => {
        // flag the query with key ["issues"] as invalidated
        // this causes a refetch of the issues data
        queryClient.invalidateQueries(["issues"]);
      },
    },
  );

  ...
}
```
This invalidates all queries containing the `issues` key (even if additional keys are set). We can now see that the data is refetched automatically after we click the button.
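Under the hood this works because `invalidateQueries` does partial (prefix) matching on query keys. A minimal sketch of that matching rule in plain JavaScript (the helper name is mine, not part of react-query's API):

```javascript
// Sketch of react-query's partial key matching: a query is affected when
// the key passed to invalidateQueries is a prefix of the query's key.
function matchesKeyPrefix(queryKey, filterKey) {
  return filterKey.every((part, index) => queryKey[index] === part);
}

// ["issues"] matches both the plain list query and the paginated ones
matchesKeyPrefix(["issues"], ["issues"]);    // true
matchesKeyPrefix(["issues", 2], ["issues"]); // true
matchesKeyPrefix(["projects"], ["issues"]);  // false
```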
As in the previous “simple” approach, we see a delay between the button click and the row being removed from the table. Let’s deal with that in a bit.
First, there’s another issue that we can fix easily: when a user quickly clicks to resolve multiple issues, we can see concurrent GET requests being sent to the REST API.
In this video, we first see the two PATCH requests. These are followed by two GET requests. Depending on the timing of the button clicks, we can end up with different scenarios. In the worst case, the response to an earlier GET request arrives last and overwrites the cache with outdated data.
To get around this problem, we can cancel any GET request that’s still pending when a new mutation is triggered.
First, we need to set up our GET request to support cancellation. This is typically done by passing the AbortSignal from an AbortController to axios (or fetch). And this again means some additional code.
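For illustration, here is a rough sketch of that manual setup with plain `fetch` (simplified, no react-query involved; this is the boilerplate we get to skip):

```javascript
// Manual cancellation sketch: create an AbortController, pass its signal
// to the request, and call abort() when the request becomes obsolete.
const controller = new AbortController();

const request = fetch("https://prolog-api.profy.dev/v2/issue?status=open", {
  signal: controller.signal,
}).catch((err) => {
  // an aborted request rejects with an AbortError, not a network failure
  if (err.name === "AbortError") return "canceled";
  throw err;
});

// cancel the pending request, e.g. when a new mutation is triggered
controller.abort();
```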
react-query makes it easier: It already provides an abort signal in the first parameter of the query function.
```jsx
export function IssueList() {
  // use the AbortSignal that useQuery provides
  const issuePage = useQuery(["issues"], async ({ signal }) => {
    const { data } = await axios.get(
      "https://prolog-api.profy.dev/v2/issue?status=open",
      // pass the abort signal to axios
      { ...requestOptions, signal },
    );
    return data;
  });

  const resolveIssueMutation = useMutation(...);

  ...
}
```
Now that the query is set up for cancellation we can simply call queryClient.cancelQueries(…) at the right time and we’re done.
The right time to cancel pending GET requests is whenever a new mutation is triggered. Again, react-query has our backs: we can use the `onMutate` callback (a sibling of `onSettled`):
```jsx
export function IssueList() {
  const issuePage = useQuery(...);

  // get the query client
  const queryClient = useQueryClient();
  const resolveIssueMutation = useMutation(
    (issueId) =>
      axios.patch(
        `https://prolog-api.profy.dev/v2/issue/${issueId}`,
        { status: "resolved" },
        requestOptions,
      ),
    {
      onMutate: async (issueId) => {
        // cancel all queries that contain the key "issues"
        await queryClient.cancelQueries(["issues"]);
      },
      onSettled: () => {
        queryClient.invalidateQueries(["issues"]);
      },
    },
  );

  ...
}
```
Let’s try that out.
When we quickly click on two of the issues in our table we can again see two PATCH requests followed by two GET requests. But this time, the first GET request is canceled.
Cool, that was easy to achieve. Didn’t even take a lot of code.
But as mentioned, we still see a delay between clicking the “Resolve” button and the corresponding issue being removed from the table.
As mentioned before, to update the table immediately after the user clicks the “Resolve” button we can “optimistically update” the data on our frontend. This gives the user the illusion that the action they triggered (resolving the issue) happens instantaneously.
The plan is simple: As soon as the mutation starts we remove the issue from the data. When we control the data ourselves that's easy. But how does it work with react-query?
```jsx
export function IssueList() {
  const issuePage = useQuery(["issues"], async () => ...);

  const queryClient = useQueryClient();
  const resolveIssueMutation = useMutation(
    (issueId) => axios.patch(...),
    {
      // optimistically remove the to-be-resolved issue from the list
      onMutate: async (issueId) => {
        await queryClient.cancelQueries(["issues"]);

        // get the current issues from the cache
        const currentPage = queryClient.getQueryData(["issues"]);
        if (!currentPage) {
          return;
        }

        // remove resolved issue from the cache so it immediately
        // disappears from the UI
        queryClient.setQueryData(["issues"], {
          ...currentPage,
          items: currentPage.items.filter(({ id }) => id !== issueId),
        });

        // save the current data in the mutation context to be able to
        // restore the previous state in case of an error
        return { currentPage };
      },
      onSettled: () => {
        queryClient.invalidateQueries(["issues"]);
      },
    },
  );

  ...
}
```
OK, that’s a bit more code than we had in the “simple” approach at the beginning of this page. Still, not very complicated though.
But what if the request fails? With the optimistic update we created the illusion that everything went fine. But we shouldn’t keep the user in the dark if we get an error. We have to restore the previous state when the request fails.
That’s easy with the `onError` callback. Note that the return value of `onMutate` is passed to `onError` as the `context` parameter. How handy is that?
```jsx
export function IssueList() {
  const issuePage = useQuery(["issues"], async () => ...);

  const queryClient = useQueryClient();
  const resolveIssueMutation = useMutation(
    (issueId) => axios.patch(...),
    {
      onMutate: async (issueId) => {
        // optimistically remove the to-be-resolved issue from the list
        ...

        // save the current data in the mutation context to be able to
        // restore the previous state in case of an error
        return { currentPage };
      },
      // restore the previous data in case the request failed
      onError: (err, issueId, context) => {
        if (context?.currentPage) {
          queryClient.setQueryData(["issues"], context.currentPage);
        }
      },
      onSettled: () => {
        queryClient.invalidateQueries(["issues"]);
      },
    },
  );

  ...
}
```
Now the resolved issue is removed immediately from the table and the data is updated in the background. You have to trust me with the error handling though.
This is all nice, but still not great. For example, we see that the table has one less row while its data is being refetched. So the height of the table changes and the pagination at the bottom jumps around.
On top of that, the changing height also lets the scroll bar disappear. That creates a wiggly user experience as the table width changes.
Can we make this experience nicer and maybe even snappier?
This is the point where the pagination starts to become a headache.
The API endpoint for our GET requests is paginated and only returns 10 issues at a time. So when we click the “Resolve” button to remove an issue from the table there are only 9 issues left in the UI.
But in fact, the backend has more data for us. So once we refetch the issues we again see 10 rows in the table. And that creates the wiggly UX as discussed above.
Now, what if the frontend already had the data for the second page of issues? We could fill the missing row at the bottom with the first issue of the next page. The number of rows in the table would stay constant and we’d have a much cleaner UX.
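This list surgery is easy to reason about in isolation. Here's a pure sketch of the update (the helper name and data shapes are mine, for illustration only): drop the resolved issue and backfill the gap with the first next-page issue that isn't shown yet.

```javascript
// Pure sketch of the optimistic page update: remove the resolved issue
// and append the next not-yet-visible issue from the prefetched page.
function removeAndBackfill(currentItems, nextPageItems, resolvedId) {
  const newItems = currentItems.filter(({ id }) => id !== resolvedId);

  if (nextPageItems?.length && currentItems.length) {
    // look for the last currently shown issue on the next page; earlier
    // optimistic updates may already have pulled next-page issues in
    const lastIssue = currentItems[currentItems.length - 1];
    const indexOnNextPage = nextPageItems.findIndex(
      ({ id }) => id === lastIssue.id,
    );
    // -1 + 1 === 0: with no overlap we take the next page's first issue
    const nextIssue = nextPageItems[indexOnNextPage + 1];
    if (nextIssue) newItems.push(nextIssue);
  }

  return newItems;
}
```

With a current page `[1, 2, 3]` and a prefetched next page `[4, 5]`, resolving issue `2` yields `[1, 3, 4]`: the row count stays constant.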
In the previous article, we already implemented the prefetching logic to create a snappy experience while navigating through the table pages. We extracted the code related to the GET request in a custom hook that looks like this (sorry, I’m just gonna throw this at you without much explanation here):
```jsx
async function getIssues(page, options) {
  const { data } = await axios.get("https://prolog-api.profy.dev/v2/issue", {
    params: { page, status: "open" },
    signal: options?.signal,
    ...requestOptions,
  });
  return data;
}

export function useIssues(page) {
  const query = useQuery(
    // note that we added the "page" parameter to the query key
    ["issues", page],
    ({ signal }) => getIssues(page, { signal }),
  );

  // Prefetch the next page!
  const queryClient = useQueryClient();
  useEffect(() => {
    if (query.data?.meta.hasNextPage) {
      queryClient.prefetchQuery(
        ["issues", page + 1],
        async ({ signal }) => getIssues(page + 1, { signal }),
      );
    }
  }, [query.data, page, queryClient]);

  return query;
}
```
We can now use the hook and connect it to a `page` state.
```jsx
export function IssueList() {
  // state variable used for pagination
  const [page, setPage] = useState(1);
  const issuePage = useIssues(page);

  const queryClient = useQueryClient();
  const resolveIssueMutation = useMutation(...);

  ...
}
```
Note: The `page` state variable and its setter are also connected to the pagination component, which is not shown here (if you're curious, you can see it in the very first code snippet at the top of this page).
Now we can add the first issue from the next page to the current page during the optimistic update.
```jsx
export function IssueList() {
  const [page, setPage] = useState(1);
  const issuePage = useIssues(page);

  const queryClient = useQueryClient();
  const resolveIssueMutation = useMutation(
    (issueId) => axios.patch(...),
    {
      onMutate: async (issueId) => {
        await queryClient.cancelQueries(["issues"]);

        // note that we have to add the page to the query key now
        const currentPage = queryClient.getQueryData(["issues", page]);
        // get the prefetched data for the next page
        const nextPage = queryClient.getQueryData(["issues", page + 1]);
        if (!currentPage) {
          return;
        }

        const newItems = currentPage.items.filter(({ id }) => id !== issueId);

        // add the first issue from the next page to the current page
        if (nextPage?.items.length) {
          const lastIssueOnPage =
            currentPage.items[currentPage.items.length - 1];
          // get the first issue on the next page that isn't yet added to the
          // current page (in case a user clicks on multiple issues quickly)
          const indexOnNextPage = nextPage.items.findIndex(
            (issue) => issue.id === lastIssueOnPage.id,
          );
          const nextIssue = nextPage.items[indexOnNextPage + 1];
          // there might not be any issues left to add if a user clicks fast
          // and/or the internet connection is slow
          if (nextIssue) {
            newItems.push(nextIssue);
          }
        }

        queryClient.setQueryData(["issues", page], {
          ...currentPage,
          items: newItems,
        });

        return { currentPage };
      },
      onError: (err, issueId, context) => {
        if (context?.currentPage) {
          queryClient.setQueryData(["issues", page], context.currentPage);
        }
      },
      onSettled: () => {
        // we don't have to add the page to the query key here
        // this invalidates all queries containing the key "issues"
        queryClient.invalidateQueries(["issues"]);
      },
    },
  );

  ...
}
```
Yes, the code is getting more complex. But the user experience is worth it.
Look at this: when a user clicks the “Resolve” button the row is not only removed but a new row is appended at the bottom to fill the otherwise empty spot. The table layout is stable and we have a super snappy experience.
Looks so simple but took some effort to build. Unfortunately, there’s still one problem left.
When a user wants to resolve multiple issues very quickly after one another we can run into a tricky situation. In the video below the user clicks twice removing two rows from the table.
It looks like there’s some sort of race condition. Both rows disappear as expected. But then we can see one of the removed rows reappear shortly before it disappears again.
What happened?
It seems that concurrent GET requests cause this problem, even though pending requests should be canceled. According to my tests, this happens quite frequently and becomes really annoying and confusing.
So the goal is to prevent parallel GET requests as much as possible.
One way to achieve this is to invalidate the “issues” query only when there’s no pending mutation (aka PATCH request). That again means we need to keep track of the number of pending mutations.
This might sound like another state variable at first. But we don't want to trigger a re-render of the component when each mutation starts. So instead, we'd better use a ref.
```jsx
export function IssueList() {
  ...

  // keep track of the number of pending mutations
  const pendingMutationCount = useRef(0);

  const resolveIssueMutation = useMutation(
    (issueId) => axios.patch(...),
    {
      onMutate: async (issueId) => {
        // increment number of pending mutations
        pendingMutationCount.current += 1;

        ...

        return { currentPage };
      },
      onError: (err, issueId, context) => { ... },
      onSettled: () => {
        // only invalidate queries if there's no pending mutation
        // this makes it unlikely that a previous request updates
        // the cache with outdated data
        pendingMutationCount.current -= 1;
        if (pendingMutationCount.current === 0) {
          queryClient.invalidateQueries(["issues"]);
        }
      },
    },
  );

  ...
}
```
Not sure if that’s hacky or not but it does the job. Look how snappy this table has become even when a user goes into “click rage”.