Can I Use Reactive Variables Instead of fetchMore() and refetchQueries()?

For a more detailed explanation, you can view my question on Stack Overflow.

I’m using Apollo with Next.js to connect to a headless CMS (Craft CMS). I need to fetch a list of entries from the CMS, paginate those entries using offset pagination, and filter them using a search query and/or categories. The offset, search, and category filtering are all done by the CMS; I just need to send updated query variables via the Apollo query whenever a filtering event occurs. An example query:

query GetRecipeEntries(
        $section: [String]
        $limit: Int 
        $offset: Int 
        $search: String
        $relatedTo: [String]
    ) {
        recipeList: entries(
            section: $section
            limit: $limit
            offset: $offset
            search: $search
            relatedTo: $relatedTo
        ) {
            id
            title
            uri
        }
    }

For pagination I’ve been using fetchMore() along with offsetLimitPagination(), but as my app gets more complex I need some way to manage offset and the other filtering variables from a single location. So I figured Reactive Variables would be the solution: I store offset as a reactive variable and pass it to my Apollo useQuery, and when I update offset somewhere else, Apollo should detect the change and re-fetch the query without me having to call fetchMore() explicitly:

    const queryLimit = makeVar(12);
    const pagerOffset = makeVar(0);

    const { data } = useQuery(GET_RECIPE_ENTRIES, {
        variables: {
            section: ["recipes"],
            limit: queryLimit(),
            offset: pagerOffset()
        }
    });

    const handlePagerClick = () => {
        pagerOffset(pagerOffset() + queryLimit());
    }

Trouble is, it doesn’t seem to be working that way. The reactive variable updates, but it doesn’t trigger a refetch by Apollo.

Is this the correct way to use reactive variables? Should I be using a different approach altogether?

Hey @kmgdev 👋

The issue with reactive variables is that you have no way to tell React to rerender when their values change. Calling a reactive variable as a function, like you are here, just returns a primitive value, so useQuery has no idea the value has changed until your component rerenders again.
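
In other words (a minimal illustration, not your actual code):

const pagerOffset = makeVar(0);

pagerOffset();   // returns 0: just a one-time snapshot of the current value
pagerOffset(12); // updates the variable, but nothing tells React to rerender
pagerOffset();   // returns 12 on the next call, but useQuery never re-ran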

Instead I’d recommend using the useReactiveVar hook which will read the value from your reactive variable and rerender your component when the value changes. Use the value returned from that hook and your useQuery hook should work as expected:

const queryLimitValue = useReactiveVar(queryLimit);
const pagerOffsetValue = useReactiveVar(pagerOffset);

const { data } = useQuery(GET_RECIPE_ENTRIES, {
  variables: {
    section: ['recipes'],
    limit: queryLimitValue,
    offset: pagerOffsetValue,
  }
});

const handlePagerClick = () => {
  pagerOffset(pagerOffset() + queryLimit());
}

See if that works for you!


Thanks for the response, @jerelmiller !

I updated my test application with useReactiveVar() like you suggested, but I’m still getting no result on the pager click event. Just to confirm, this is what the final code should look like, right?

    const queryLimit = makeVar(12);
    const pagerOffset = makeVar(0);

    const queryLimitValue = useReactiveVar(queryLimit);
    const pagerOffsetValue = useReactiveVar(pagerOffset);

    const { data } = useQuery(GET_RECIPE_ENTRIES, {
        variables: {
            section: ["recipes"],
            limit: queryLimitValue,
            offset: pagerOffsetValue
        }
    });

Where are you initializing those reactive vars? Are they initialized outside your component? If not, move them outside the component: calling makeVar inside a component creates brand-new reactive variables on every render, and they need to persist across renders.

const queryLimit = makeVar(12);
const pagerOffset = makeVar(0);

function MyComponent() {
  const queryLimitValue = useReactiveVar(queryLimit);
  const pagerOffsetValue = useReactiveVar(pagerOffset);

  const { data } = useQuery(GET_RECIPE_ENTRIES, {
    variables: {
      section: ["recipes"],
      limit: queryLimitValue,
      offset: pagerOffsetValue
    }
  });
  
  // ...
}

Ah ha! Yes, that was it! I was thinking of reactive variables like React state, which is typically declared inside the component. The Apollo docs really should do a better job of explaining that.

Thank you so much for the help!

@jerelmiller Follow-up: do you know why offsetLimitPagination() stops working when I switch to this method from fetchMore()? If I don’t set any cache policies at all, I see 2 sets of cache data when I click the “Load More” button, which is expected. But when I try to set a merge function or keyArgs, no new data is loaded at all.

@kmgdev glad to hear that helped!

Would you mind going into detail on what you mean by “not working”? Does it not refresh? Are the values cached incorrectly?

By chance do you have the Apollo Client Devtools installed (Chrome extension, Firefox add-on)? I’d recommend installing that extension if you don’t have it yet, as it will let you inspect the cache and see live updates. That should give you a hint about how the data is written to the cache after executing those queries, and you might be able to spot something from there. If not, a runnable reproduction or more code samples of what your queries, type policies, and cache data look like would be helpful. Thanks!

If I have an empty cache policy like this:

const queryCache = new InMemoryCache();

And I click the “load more” pager button, then a new set of recipes is loaded into the cache as a separate block (this is expected behavior).

But if I try to do a custom cache policy, say using the offsetLimitPagination() function, like this:

const queryCache = new InMemoryCache({
    typePolicies: {
        Query: {
            fields: {
                entries: offsetLimitPagination(["section"])
            }
        }
    }
});

And when I click the “load more” pager button, no updated query is executed, no data is fetched, and nothing new is added to the cache beyond the initial set of data from the component mount.

But in Apollo Dev Tools the “offset” variable is shown as updated.

Hmmmm. Here is the implementation of offsetLimitPagination. Can you copy this into your type policy?

const queryCache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        entries: {
          keyArgs: ["section"],
          merge(existing, incoming, { args }) {
            const merged = existing ? existing.slice(0) : [];
      
            if (incoming) {
              if (args) {
                // Assume an offset of 0 if args.offset omitted.
                const { offset = 0 } = args;
                for (let i = 0; i < incoming.length; ++i) {
                  merged[offset + i] = incoming[i];
                }
              } else {
                // It's unusual (probably a mistake) for a paginated field not
                // to receive any arguments, so you might prefer to throw an
                // exception here, instead of recovering by appending incoming
                // onto the existing array.
                merged.push(...incoming);
              }
            }
      
            return merged;
          },
        }
      }
    }
  }
});

It might be helpful to log args, existing, incoming and merged here to see what might be happening.

That said, did you say that using the exact same cache code, but fetchMore instead of new variables on useQuery, does work? These two go through slightly different code paths (fetchMore temporarily sets a no-cache fetch policy, then writes to the cache after the request is finished), but they are still fairly similar. I’m very curious!
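
For reference, the fetchMore() flow being compared here looks roughly like this (a sketch, assuming the query and variables from earlier in the thread):

const { data, fetchMore } = useQuery(GET_RECIPE_ENTRIES, {
  variables: { section: ['recipes'], limit: 12, offset: 0 },
});

const handlePagerClick = () => {
  // fetchMore temporarily bypasses the hook's fetch policy for the request
  // itself, then writes the result through the field's merge function
  // (offsetLimitPagination in this case).
  fetchMore({
    variables: { offset: data.recipeList.length },
  });
};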

I threw some files up on GitHub so you can see the two approaches side by side. The fetchMore() approach works great: new entries are loaded and merged into the existing cache.

The cache policy you provided along with reactive vars had the same result as my other attempts: no new query request. I couldn’t even get any console.log data for the merged info because it looks like the function didn’t even run (which I guess it wouldn’t if the query didn’t execute).

Oh gosh, I see the issue now. You said something that made it obvious, and I should have noticed this earlier. So sorry, long day!

When using fetchMore, the fetchPolicy is temporarily set to no-cache to guarantee that a network request is made, regardless of the fetch policy set by the hook. If you’re just passing variables to the hook, the hook will abide by its fetchPolicy when new variables come in. In your case you’re using the default fetchPolicy, which is cache-first. This means that changing variables will first try to read from the cache, and if the cache can fulfill the request, the hook returns data from the cache without making a network request.

Since you’re setting keyArgs on that field to just section, offset and limit aren’t part of the cache key, so the cache thinks it can fulfill the data required for that query, which means you won’t see a network request made. That is why you’re only getting those first 3 results: changing variables on the hook essentially just rereads the data from the cache for the same section, so it pulls those 3 records from the cache.
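
To visualize it, here’s roughly what the normalized cache looks like with keyArgs: ["section"] (the field key format is abbreviated and the record refs are illustrative):

{
  ROOT_QUERY: {
    // offset and limit are not part of the field key, so every
    // offset/limit combination reads from this single entry:
    'entries:{"section":["recipes"]}': [
      { __ref: 'Entry:1' },
      { __ref: 'Entry:2' },
      { __ref: 'Entry:3' },
    ],
  },
}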

What I’d recommend is also adding a read function that returns undefined when the array doesn’t contain enough data to fulfill the limit/offset. As you can see, the offsetLimitPagination helper doesn’t include a read function, just a merge function; the reason being that we don’t know whether you want page-based pagination or an infinite-scroll type UX where you return all records fetched so far.

Try adding a read function along these lines and this should help ensure the query is fetched properly just by passing new variables to the hook without the need for fetchMore:

fields: {
  entries: {
    // use the `keyArgs` and `merge` function returned from the helper
    ...offsetLimitPagination(["section"]),
    read(entries, { args }) {
      // Best to add defaults in case args aren't provided in your query
      // (note the `args ?? {}` guard, since `args` can be absent entirely).
      // Choose defaults that make sense in your app; these are examples.
      const { offset = 0, limit = 10 } = args ?? {};
      // Handle when `entries` is `undefined` in case this field hasn't been 
      // fetched yet
      const records = entries?.slice(offset, offset + limit) ?? [];

      if (records.length < limit) {
        // Force a fetch with an early return if there aren't enough 
        // records in the array to fulfill the data required by `args`.
        return;
      }

      return records;
    }
  }
}

Obviously tweak this logic to suit your needs, but this should demonstrate the sort of thing you’ll want to do. Again, the key is returning undefined to force a fetch from the server to ensure that network request is actually made.

The only tricky thing here is determining when you’ve reached the end of the list (assuming you want to stop the query from refetching the last page over and over, if that’s a concern of yours). Since you’re just returning a simple array with no page information, you’ll probably want to store some additional meta information that detects when you’ve reached the end of the list. This will be useful in the read function to ensure you don’t return undefined once you’ve reached the end of the list.

I’d recommend using the storage util passed to both the merge and read functions. This is a plain object that allows you to store your own meta information and is scoped only to this field. It would look something like this:

read(entries, { storage, args }) {
  // ...
  
  // default `hasMore` to true in case we haven't fetched anything yet
  // (i.e. the `merge` function hasn't been run yet)
  const { hasMore = true } = storage;

  // Make `hasMore` part of the condition that causes a fetch.
  // `merge` will set this to `false` once it detects the end of the list
  if (records.length < limit && hasMore) {
    return;
  }

  return records;
},
merge(existing, incoming, { args, storage }) {
  // ...

  // Replace with your own logic, but detecting if we fetched less than
  // `limit` might be enough to detect the end of the list.
  storage.hasMore = incoming.length === args.limit;

  // ...
}

Hopefully this gives you an idea of how to use storage. Apologies that the storage utility isn’t documented very well, but it can definitely be super useful here!
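
Putting it all together, the whole type policy might look something like this (a sketch combining the snippets above; the end-of-list heuristic is just one possible approach):

import { InMemoryCache } from '@apollo/client';

const queryCache = new InMemoryCache({
  typePolicies: {
    Query: {
      fields: {
        entries: {
          keyArgs: ['section'],
          merge(existing, incoming, { args, storage }) {
            const merged = existing ? existing.slice(0) : [];
            const { offset = 0, limit } = args ?? {};

            for (let i = 0; i < incoming.length; ++i) {
              merged[offset + i] = incoming[i];
            }

            // Assumes limit is always passed in the query: a short page
            // means we've reached the end of the list.
            storage.hasMore = incoming.length === limit;

            return merged;
          },
          read(entries, { args, storage }) {
            const { offset = 0, limit = 10 } = args ?? {};
            const records = entries?.slice(offset, offset + limit) ?? [];
            const { hasMore = true } = storage;

            // Return undefined to force a network request, but only if we
            // haven't already reached the end of the list.
            if (records.length < limit && hasMore) {
              return;
            }

            return records;
          },
        },
      },
    },
  },
});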

Even if you don’t need storage to keep track of the end of the list, definitely try at the very least implementing a read function and that should do the trick. Let me know how it goes!

I’m not able to figure out how the initial read function works. Let’s assume we’ve run the query once with limit: 10 and offset: 0, and now we’re ready to load the next page by updating to offset: 10. See my comments:

fields: {
  entries: {
    ...offsetLimitPagination(["section"]),
    read(entries, { args }) {
      // "entries" is the list of records currently in the cache,
      // which is currently at a length of 10

      // "args" is the arguments when the cache was updated,
      // in this case limit: 10, offset: 0

      const { offset = 0, limit = 10 } = args ?? {};
      
      // we slice a chunk off of the entries list from 0 to 10,
      // for a total length of 10
      // this will always have a length of 10 because offset always increments
      // in chunks of 10?
      const records = entries?.slice(offset, offset + limit) ?? [];

      // if the chunk we just sliced is less than 10, force a refetch
      // but this will never happen because the sliced chunk will always be 10?
      if (records.length < limit) {
        return;
      }

      return records;
    }
  }
}

Just to confirm, are the args values correct when you run the 2nd query?

slice should return a copy of the array starting at offset, so if offset is 10 and you’ve only written 10 items to the cache, it would be slice(10, 20) (assuming limit is 10), which gives you back an empty array, since you wouldn’t have any records in entries at index 10 or greater.

Can you log some of the values here to see if you’re getting what you expect? The code looks correct to me at a glance, but I’m not sure what values you’re getting for each of those variables.
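
Something like this, for example (the same read function from above with temporary logging added):

read(entries, { args }) {
  const { offset = 0, limit = 10 } = args ?? {};
  const records = entries?.slice(offset, offset + limit) ?? [];

  // Temporary debugging output; remove once you've confirmed the values
  console.log({ args, cachedLength: entries?.length, sliced: records.length });

  if (records.length < limit) {
    return;
  }

  return records;
}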

Sorry for taking so long to get back to you on this! Your code works perfectly; I was just trying to understand how it works, since I’ll probably need to tweak it for my more advanced setup. Logging the args and output from read() and merge() was helpful.

I originally thought the args property of the read() function showed the arguments from when the cache was last merged, rather than the current arguments updated by the pager click event. I see now that the first read() uses the updated offset value, which results in an empty sliced array, which triggers a fetch. Then a merge() runs, followed by another read() with the same arguments, this time returning an array of 10, so no additional fetch.
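
So the full sequence on a pager click (offset 0 → 10, limit 10) is roughly:

// 1. read({ offset: 10, limit: 10 })
//      entries.slice(10, 20) over the 10 cached items returns []
//      records.length < limit, so read returns undefined: cache miss, fetch
// 2. merge(existing /* 10 items */, incoming /* 10 items */)
//      writes a merged array of 20 items to the cache
// 3. read({ offset: 10, limit: 10 })
//      entries.slice(10, 20) over the 20 cached items returns 10 records
//      returned as data; no further fetch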

As for determining the ultimate number of pages, Craft CMS has a way to return the total result count in a separate field called entryCount:

query GetRecipeEntries(
        $section: [String]
        $limit: Int 
        $offset: Int 
    ) {
        recipeList: entries(
            section: $section
            limit: $limit
            offset: $offset
        ) {
            id
            title
            uri
            ... on recipes_recipes_Entry {
                recipeImg {
                    url
                }
            }
        }
        entryCount(
            section: $section
            limit: $limit
            offset: $offset
        )
    }

Is it possible to access the value of this other field within the read() function of the entries field?

Good to hear!

I believe you’d be able to access that other value with the readField helper passed to your read function. Can you try this?

read(entries, { readField }) {
  const entryCount = readField('entryCount');
}

If this doesn’t work, I’ll need to dig a bit deeper into how readField works, as I’m a bit rusty on its implementation, but I hope that one works for you!

I’m getting undefined when attempting to use readField:

read(entries, { readField }) {
    const totalEntries = readField('entryCount');
    console.log(totalEntries); // logs undefined before and after merge
}

merge(existing, incoming, { readField }) {
    const totalEntries = readField('entryCount');
    console.log(totalEntries); // logs undefined, along with an error
}

When using it in merge I get an Apollo error in the console:

Undefined ‘from’ passed to readField with arguments [“entryCount”]

File: @apollo/client/cache/inmemory/policies.js

I’ve confirmed entryCount is returning a value in the useQuery data variable.