We have strapi-middleware-cache configured and enabled in the middleware config. The load order looks as follows:
load: {
before: [
"errorHandling",
"responseTime",
"logger",
"cors",
"responses",
"gzip",
],
order: [
"errorHandling",
"responseTime",
"logger",
"cors",
"responses",
"gzip",
],
after: ["parser", "router", "audit", "<my-middleware>"],
}
Just to bring it to your notice, the cache middleware is not specified in the load order, so it must be running at some default position. My middleware is written like this:
async (ctx, next) => {
await next();
// my code here
}
My intention is to execute this middleware as the last part of the execution chain, where I will be modifying the response body a bit, and I don't want the cache to be updated with this modification.
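To make the intent concrete, here is a self-contained sketch of the Koa-style "onion" model Strapi's middleware chain uses (the `compose` helper and the middleware names are illustrative, not Strapi internals): a middleware that awaits `next()` can modify `ctx.body` only after every downstream middleware, including a cache, has already produced and stored the response.

```javascript
// Minimal re-implementation of a koa-compose style dispatcher, for demonstration.
async function compose(middlewares, ctx) {
  async function dispatch(i) {
    if (i === middlewares.length) return;
    await middlewares[i](ctx, () => dispatch(i + 1));
  }
  await dispatch(0);
}

// "My" middleware: its code runs only after the rest of the chain has finished.
const rewriteMiddleware = async (ctx, next) => {
  await next();
  // Mutating ctx.body here does not touch whatever a downstream cache stored,
  // because any cache write already happened before control returned here.
  ctx.body = { ...ctx.body, filtered: true };
};

// Stand-in for the cache/controller middleware that produces the response.
const fakeCache = async (ctx, next) => {
  ctx.body = { data: [1, 2, 3] }; // pretend this came from the cache
  await next();
};

async function demo() {
  const ctx = {};
  await compose([rewriteMiddleware, fakeCache], ctx);
  return ctx.body;
}

demo().then((body) => console.log(body)); // logs { data: [1, 2, 3], filtered: true }
```

The catch described below is that strapi-middleware-cache can respond and short-circuit the chain, so whether `rewriteMiddleware` ever sees the body depends on where it sits relative to the cache.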
As per my recent interaction on Slack with @DMehaffy, the above solution should yield the intended behaviour.
But that's not what is happening. I would appreciate your help. Thank you!
Ah, now that I see you are using strapi-middleware-cache: yes, in that case the middleware itself responds and stops the request chain.
For reference, the cache middleware is located here: GitHub - patrixr/strapi-middleware-cache: A cache middleware for https://strapi.io
Poking @alexandrebodin to see what his thoughts are on alternatives; my only thought is having something to bypass the cache (as the entire purpose of the cache is to skip doing any DB queries and return a fast response).
Interestingly, I don't want to bypass the cache; the idea is to get the response (either from the cache or the controller), do some updates on it, and return it.
Can you provide some more context as to the use-case?
Sure. We have some custom endpoints where some properties on the response object contain information about the authorisation required to access the object. For example:
[
{
...
property1:"<value for this object>",
property2:"<value for this object>"
},
{
...
property1:"<value for this object>",
property2:"<value for this object>"
}
]
We want to validate these values against the auth info we derive from the token. The idea is to filter and provide only authorised info to the user. That's why we want to keep the entire response list cached but deliver a filtered response.
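The filtering step itself could look something like this (a sketch only: `property1` mirrors the example above, and `allowedValues` stands in for whatever is derived from the user's token):

```javascript
// Keep the full list cached, but return only the entries the caller
// is authorised to see. `allowedValues` is hypothetical: in practice it
// would be derived from the auth token on each request.
function filterByAuth(items, allowedValues) {
  return items.filter((item) => allowedValues.includes(item.property1));
}

// Pretend this full list came back from the cache untouched.
const cachedResponse = [
  { property1: "admin-only", property2: "x" },
  { property1: "public", property2: "y" },
];

console.log(filterByAuth(cachedResponse, ["public"]));
// logs [ { property1: 'public', property2: 'y' } ]
```

Because the filter runs per request on the already-cached list, every user shares one cache entry while receiving only their authorised slice.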
Thank you. I will wait and see what our backend engineers think, but I am leaning more towards a custom caching solution for this particular use-case (LRU caching isn't really designed for this purpose; it's more for extremely commonly accessed information that doesn't require modification).
For clarification, LRU stands for "Least Recently Used", meaning the cache evicts whichever entries have gone unused the longest; it's best suited to data that is read often and updated infrequently.
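For readers unfamiliar with the eviction policy being described, here is a tiny illustrative LRU cache built on `Map` insertion order (this is not the implementation strapi-middleware-cache uses, just a sketch of the concept):

```javascript
// Minimal LRU cache: Map preserves insertion order, so the first key
// is always the least recently used entry.
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key); // re-insert to mark as most recently used
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // Evict the least recently used entry (first in insertion order).
      this.map.delete(this.map.keys().next().value);
    }
  }
}

const cache = new LRUCache(2);
cache.set("a", 1);
cache.set("b", 2);
cache.get("a");    // touch "a", so "b" is now least recently used
cache.set("c", 3); // capacity exceeded: "b" is evicted
console.log(cache.get("b")); // undefined
console.log(cache.get("a")); // 1
```

The point of the remark above is that this policy only decides *which entry to drop* when the cache is full; it has no notion of per-user filtering, which is why a custom layer on top is being suggested.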