
Some sort of middleware support #1122


Closed
1 task done
drwpow opened this issue May 22, 2023 · 45 comments
Labels
enhancement (New feature or request), openapi-fetch (Relevant to the openapi-fetch library)

Comments

@drwpow
Contributor

drwpow commented May 22, 2023

Original message by @djMax on drwpow/openapi-fetch#41


Description

The request interception, response interception, etc. are quite useful and I wouldn't think they would add much to the library. Is there a way to simulate this on top of the library? (I don't see an obvious one.)

Proposal

Add a single pre-flight and post-flight hook argument to the methods.
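
For illustration, a rough sketch of what such per-method hooks could look like (the preFlight/postFlight option names and the paths import are hypothetical, not an existing API):

import createClient from "openapi-fetch";
import type { paths } from "./my-schema"; // hypothetical generated types

const client = createClient<paths>({ baseUrl: "https://example.com" });

// Hypothetical API: hooks passed per call, run before and after the underlying fetch
const { data, error } = await client.GET("/some/path", {
  preFlight: (req: Request) => {
    req.headers.set("x-request-id", crypto.randomUUID());
    return req;
  },
  postFlight: (res: Response) => {
    console.log(res.status);
    return res;
  },
});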

drwpow added the enhancement (New feature or request) and openapi-fetch (Relevant to the openapi-fetch library) labels on May 22, 2023
@djMax

djMax commented May 24, 2023

I started messing with this, and I now think the answer is "don't do it here." ClientOptions already supports a fetch argument, and you can basically do all you want with that, I think.

@adarshaacharya

@djMax so did you find any solution for this using the fetch argument?

@djMax

djMax commented Jun 7, 2023

For collecting basic metrics? Yes, you just wrap the fetch function with an observer. You can use something like this:

https://www.npmjs.com/package/fetch-hooks

I've been meaning to make a more purpose-built module for this particular library but haven't had time. The one thing I think should change in this lib is the direct call to response.json(). Instead there should probably be some way to set which method is called to handle the response.
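
A minimal hand-rolled version of that observer idea, in case it helps (not using fetch-hooks; recordMetric is a placeholder for whatever metrics sink you use):

// Placeholder metrics sink; swap in your own reporter
const recordMetric = (m: { url: string; status: number; ms: number }) =>
  console.log(`${m.url} -> ${m.status} in ${m.ms.toFixed(1)}ms`);

const observedFetch: typeof fetch = async (input, init) => {
  const url =
    typeof input === "string" ? input : input instanceof URL ? input.href : input.url;
  const start = performance.now();
  const res = await fetch(input, init);
  recordMetric({ url, status: res.status, ms: performance.now() - start });
  return res;
};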

@adarshaacharya

@djMax I'm trying to implement refresh tokens using openapi-typescript and am running out of ideas for implementing that.
I used to do it with an axios interceptor checking the status code. How can we implement that here?

@djMax

djMax commented Jun 7, 2023

You would just make a fetch function something like this:

async function customFetch(...args) {
  // Modify the request as necessary, e.g. with a Bearer token auth header
  const result = await fetch(...args);
  if (result.status === 403) {
    // refresh the token, call customFetch again, do something to protect against repeated calls
    return customFetch(...args);
  }
  return result;
}
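
One possible way to handle the "protect against repeated calls" part is to bound the retry, e.g. (a sketch, not from the original comment; the name customFetchOnce and refreshToken are placeholders):

declare function refreshToken(): Promise<void>; // your token-refresh call

async function customFetchOnce(
  input: RequestInfo | URL,
  init?: RequestInit,
  isRetry = false
): Promise<Response> {
  const result = await fetch(input, init);
  if (result.status === 403 && !isRetry) {
    await refreshToken();
    // retry exactly once with the refreshed token
    return customFetchOnce(input, init, true);
  }
  return result;
}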

@marcomuser
Contributor

marcomuser commented Jul 20, 2023

Something simple like this should also do, I suppose, for most pre-flight cases?

const customFetchWithInterceptor = async (url, options) => {
  // Get your accessToken and add your custom headers or any other modifications to the options object here, e.g.:
  const accessToken = await getAccessToken();
  const modifiedOptions = {
    ...options,
    headers: {
      ...options?.headers,
      Authorization: `Bearer ${accessToken}`,
    },
  };

  // Make the actual fetch call with the modified options
  return fetch(url, modifiedOptions);
};

And then use this function as the custom fetch in the ClientOptions.
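
For example (assuming paths is your generated schema type):

import createClient from "openapi-fetch";
import type { paths } from "./my-schema"; // generated by openapi-typescript

const client = createClient<paths>({
  baseUrl: "https://example.com",
  fetch: customFetchWithInterceptor,
});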

@WickyNilliams
Contributor

It's slightly off topic, but I ended up on this issue when investigating how to throw on error instead of returning a data structure like { data, error }. I am using TanStack Query, so throwing on error is a must. I didn't want to manually do this every time.

I struggled to write an object that wraps each GET/POST/etc. function without breaking the types, or having to restate all the generics, which is error-prone.

I have landed on exposing the client like this:

const client = createApiClient<paths>({ ... })
export type Client = typeof client
export type ApiCall<T> = (client: Client) => Promise<FetchResponse<T>>

export function useApi() {
  return async function <T>(apiCall: ApiCall<T>) {
    const response = await apiCall(client)

    if (response.error) {
      throw new ApiError(response)
    }

    return response.data!
  }
}

// later
const api = useApi()
const data = await api(client => client.GET("/some/path"))

This allowed me to have a small wrapper (at the cost of some indirection) that leverages TS inference with minimal type-level magic, and means it'll always throw on error.

Just posting here in case it helps anyone in the future! Or in case supporting this use case can be a consideration for some future middleware/extensibility points.

@ddx32

ddx32 commented Nov 2, 2023

@WickyNilliams thanks for the snippet! Would you be able to share how the FetchResponse type is structured? I'm trying to implement a tanstack-query wrapper, but struggling so far.

@WickyNilliams
Contributor

@ddx32 It's exported from openapi-fetch :)

import type { FetchResponse } from "openapi-fetch"

@tonyxiao

What if we created something like tRPC links? It has a tiny internal observables implementation but provides the ultimate flexibility to end users.

@tonyxiao

tonyxiao commented Nov 17, 2023

This is my current solution. I still want to add something that allows you to refresh tokens and such before the request, but it solves the problem of removing the error possibility from the response typing (you can throw your own error instead).

import type { FetchResponse } from 'openapi-fetch';
import createClient from 'openapi-fetch';

type MaybePromise<T> = T | Promise<T>;

type _ClientOptions = NonNullable<Parameters<typeof createClient>[0]>;
// Workaround for https://github.com/drwpow/openapi-typescript/issues/1122
interface ClientOptions extends _ClientOptions {
  preRequest?: (...args: Parameters<typeof fetch>) => MaybePromise<Parameters<typeof fetch>>;
  postRequest?: (
    res: Awaited<ReturnType<typeof fetch>>,
    requestArgs: Parameters<typeof fetch>
  ) => ReturnType<typeof fetch>;
}

// eslint-disable-next-line @typescript-eslint/ban-types
export function createOpenapiClient<Paths extends {}>({
  preRequest = (url, init) => [url, init],
  postRequest = (res) => Promise.resolve(res),
  ...clientOptions
}: ClientOptions = {}) {
  const baseFetch = clientOptions?.fetch ?? globalThis.fetch;
  const customFetch: typeof baseFetch = async (...args) => {
    const requestArgs = await preRequest(...args);
    const res = await baseFetch(...requestArgs);
    return postRequest(res, requestArgs);
  };
  const client = createClient<Paths>({ ...clientOptions, fetch: customFetch });

  return {
    GET: (...args: Parameters<typeof client.GET>) => client.GET(...args).then(throwIfNotOk),
    PUT: (...args: Parameters<typeof client.PUT>) => client.PUT(...args).then(throwIfNotOk),
    POST: (...args: Parameters<typeof client.POST>) => client.POST(...args).then(throwIfNotOk),
    DELETE: (...args: Parameters<typeof client.DELETE>) => client.DELETE(...args).then(throwIfNotOk),
    OPTIONS: (...args: Parameters<typeof client.OPTIONS>) => client.OPTIONS(...args).then(throwIfNotOk),
    HEAD: (...args: Parameters<typeof client.HEAD>) => client.HEAD(...args).then(throwIfNotOk),
    PATCH: (...args: Parameters<typeof client.PATCH>) => client.PATCH(...args).then(throwIfNotOk),
    TRACE: (...args: Parameters<typeof client.TRACE>) => client.TRACE(...args).then(throwIfNotOk),
  };
}

function throwIfNotOk<T>(res: FetchResponse<T>) {
  if (res.error) {
    // TODO: Return more detailed error here...
    res.error;
    throw new Error('HTTPError');
  }
  // You can further modify response as desired...
  return res;
}

@drwpow
Contributor Author

drwpow commented Nov 18, 2023

@tonyxiao I like the API design of tRPC links! I think I’d probably lean in this direction too because tRPC and openapi-typescript do have a lot of similar ideas. While this library is lower-level, lighter-weight, and more flexible than tRPC and will be used in different ways, I think it only helps users to lean into the similarities to make migrating from one to the other simpler.

@drwpow
Contributor Author

drwpow commented Nov 19, 2023

Also, this is a side comment, but things like BigInt support or Date support should be possible to support via middleware. I think that’s a good use case for needing to bypass the default JSON deserialization when needed (I know a lot of middleware is specifically designed for requests, but it should provide hooks for responses, too).
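
To make the response-side use case concrete, here's the kind of custom deserialization such a hook would enable (just a sketch of reviving ISO-8601 date strings; not an existing openapi-fetch API):

// Sketch: custom JSON deserialization that a response hook could apply
// instead of the default response.json()
const isoDate = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?(Z|[+-]\d{2}:\d{2})$/;

function parseWithDates(text: string): unknown {
  return JSON.parse(text, (_key, value) =>
    typeof value === "string" && isoDate.test(value) ? new Date(value) : value
  );
}

// With a response hook, the client could run parseWithDates(await res.text())
// for routes that need Date objects, bypassing the default JSON handling.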

@tonyxiao

Yea, so tRPC links can modify both the request and the response, and even do things like batching multiple requests into one! So pretty powerful stuff. I think with some proper pipe typing we should be able to get response typing correct, including through middleware, too.

@madi-tracksuit

Just wanted to add that as a heavy tRPC user, some kind of way to add middleware-like stuff is a must, even for the very basic use case of throwing errors rather than returning them (so that they can be passed through to tRPC's error handling, in our case).

@denisw

denisw commented Dec 7, 2023

Middleware would be great, especially for intercepting responses (our use case is access token refresh). A custom fetch wrapper works well enough for enhancing the request, but not so well for response processing, as I cannot inspect the response body without consuming it and breaking openapi-fetch's response processing.
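
(For anyone in the same spot today: Response.clone() at least lets a fetch wrapper peek at the body without consuming the stream openapi-fetch will read, though it doesn't replace a real response hook. A sketch:)

const inspectingFetch: typeof fetch = async (input, init) => {
  const res = await fetch(input, init);
  // clone() yields an independent copy of the body stream, so reading it here
  // does not break the client's own res.json() call later
  const text = await res.clone().text();
  console.log("response body preview:", text.slice(0, 200));
  return res;
};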

@drwpow
Contributor Author

drwpow commented Dec 13, 2023

So I think we’re all in agreement this seems like a good thing to add, and isn’t out of scope for this package. I’d love to match tRPC’s links API & behavior in almost every way (e.g. executing in array order for request, and reverse-array order for response).

I haven’t been able to make any meaningful progress in the last couple weeks, but I’d welcome a PR that essentially implements tRPC links support & adds tests 🙂 (if someone beats me to it—I may have time near end-of-year but not before then).

Some general opinions:

  • I’m open to the actual name; perhaps links would be best to make it clear how the API works (and we can still call it “middleware” in docs so it’s searchable)
  • Client weight is a major consideration, so this shouldn’t require any dependencies
    • I believe this is a simple enough feature that it can be implemented manually; however, I’m open to arguments that an external library is necessary if the weight is justified (e.g. it’s a no-dependency library, and/or is more performant than a manual implementation would be)
  • Any deviation from how tRPC links work is undesired. But if there are any, they should be pointed out & justified

@WickyNilliams
Contributor

WickyNilliams commented Dec 13, 2023

Is there some minimum set of changes that can be made to openapi-fetch to allow the links stuff to be built on top independently?

I ask because it feels like a safer bet. I expect many people's needs fall short of needing a full links setup. It would be nice to add just enough extensibility to allow for links to be built by anyone keen enough, whilst also allowing for simpler use cases.

@drwpow
Contributor Author

drwpow commented Dec 15, 2023

@WickyNilliams After saying I wouldn’t, I unexpectedly had some time this week and started diving into middleware. And though I initially wanted to implement it as closely as possible, after diving in I realized it’s not only overkill for openapi-fetch, it also doesn’t make sense because tRPC links are built around handling streaming server responses. So I’m starting down a path of “making it as inspired as possible by tRPC links,” but the end result will be much simpler, because openapi-fetch is a simple library.

Will post a link to a PR when it’s further along, and when that comes would welcome any/all critical feedback.

As a sidenote, I’m currently stuck between middleware and bodySerializer() since the two basically do the same thing. And can’t be used together since one has to take priority over the other. I’m contemplating introducing a breaking change, and removing bodySerializer() in favor of middleware since middleware can accomplish all that and then some. However, querySerializer(), only being concerned with the URL, is handy to keep, I think, and doesn’t interfere as much (just saves middleware from having to parse URL + params and build the URL, which is tedious/annoying)

@drwpow
Contributor Author

drwpow commented Dec 19, 2023

Still a little WIP, but a proposed API is up here: https://middleware.openapi-ts.pages.dev/openapi-fetch/api#middleware. Would love feedback 🙏

Still have a few tests failing, but after those are fixed, we can do a beta release of this to get feedback before committing to the API (treat everything as changeable at this point!)

@denisw

denisw commented Dec 19, 2023

@drwpow Thank you for working on this! 👍

Regarding the API: the if (req) { ... } else if (res) { ... } requirement for middleware functions makes me think that a piece of middleware might be better represented as an object with two separate functions, one for request and the other for response transformation. For example:

createClient({
  middleware: [
    {
      onRequest(req, options) {
        const headers = new Headers(req.headers);
        headers.set("foo", "bar");
        return new Request(req, { headers });
      },
      onResponse(res, options) {
        return new Response(res.body, {
          headers: res.headers,
          statusText: res.statusText,
          status: 200,
        });
      },
    },
  ],
});

This approach is a little less awkward. It also makes it more straightforward to express middleware that transforms only requests or only responses, if we allow only one of the two functions to be specified.

Alternatively, one could adopt a middleware-as-async-function approach similar to Koa:

createClient({
  middleware: [
    async function myMiddleware(req, next, options) {
      const headers = new Headers(req.headers);
      headers.set("foo", "bar");
      const reqWithHeader = new Request(req, { headers });

      const res = await next(reqWithHeader);

      return new Response(res.body, {
        headers: res.headers,
        statusText: res.statusText,
        status: 200,
      });
    },
  ],
});

@WickyNilliams
Contributor

I was going to suggest something like Koa also; it's a nice pattern. Very clear, and it makes it easy to keep some state around for use after the request.

@tonyxiao

@drwpow Actually I started working on a library (github, npm @opensdks/fetch-links) built on top of openapi-fetch and started introducing a middleware abstraction there. After some iteration, what I arrived at for middleware is the following interface (I used the word Link, but it is identical in meaning to Middleware).

/** aka `Middleware` */
type Link = (
  req: Request,
  next: (req: Request) => Promise<Response>,
) => Promise<Response>

Link is a function that accepts two arguments, a standard web Request and a next parameter, and it must asynchronously return a standard web Response. It may do so by calling the next link in the chain or returning a response directly (in which case it is called a terminating link).

The simplest link is one that does nothing and simply passes the request along to the next one in the chain.

const noopLink: Link = (req, next) => next(req)

If you have used axios before, you may know about requestInterceptor and responseInterceptor. They can be easily expressed as links (kind of like that concept in mathematics where a more general theorem simplifies down to a more specific one in specific situations):

const requestInterceptorLink: Link = (req, next) => {
  // do something with req, such as setting a header
  req.headers.set('x-api-key', 'my-super-secret-key')
  return next(req)
}

const responseInterceptorLink: Link = async (req, next) => {
  const res = await next(req)
  // do something with res, such as checking for errors
  if (res.status !== 200) {
    throw new Error(`Response failed with status ${res.status}`)
  }
  return res
}

Because Links simply use the web-standard Request & Response objects, the terminating fetchLink is also a trivial one-liner.

const fetchLink: Link = (req) => fetch(req)

If one wants to use a custom termination mechanism (for example, axios to support older platforms), it can be done by using a custom axios link:

export function axiosLink(): Link {
  return async (req) => {
    const res = await axios.request({
      method: req.method,
      url: req.url,
      data: await req.blob(),
      headers: Object.fromEntries(req.headers.entries()),
      responseType: 'stream',
    })
    return new Response(res.data, {
      headers: res.headers as HeadersInit,
      status: res.status,
      statusText: res.statusText,
    })
  }
}

Here's a table of links I think would be super useful, all of which can be implemented using the Link interface:

| Link | Description | Accepted options |
| --- | --- | --- |
| logLink | Print request / response to a configurable destination for debugging / monitoring | `log?: typeof console.log` |
| retryLink | Retry requests that have failed due to a 5xx error, with exponential backoff | `maxRetries?: number` |
| throwLink | Great for simulating error conditions and writing tests. Customize the error that gets thrown in getError, or return null to let the request through to the next link in the chain. | `getError: (req) => Error` |
| fetchTermLink | Terminating link that uses fetch to send the request to the backend server. You may provide a… | `fetch?: typeof globalThis.fetch` |
| axiosTermLink | Terminating link that uses axios to send the request to the backend server. It does not include its… | `axios: typeof import('axios').default` |
| oauthLink | Adds an `Authorization: Bearer $accessToken` header to the request. Optionally checks the validity of the accessToken and proactively refreshes the accessToken before making the actual request. | `tokens: OauthTokens`, `refreshConfig: OauthConfig`, `onRefresh?: (tokens: OauthTokens) => void` |
| runtimeValidateLink | Validate the actual request & response payload using the provided OpenAPI spec at runtime. This is especially helpful when interacting with flaky 3rd-party APIs that don't provide or stick with their spec. It can also catch the inevitable bugs in your own client code, because JavaScript is ultimately a dynamic language (don't tell me you have never used an `as any` 😈). | `oas: oas30.OpenAPIObject` |
| rateLimitLink | Most APIs have rate limits that need to be respected. This link can either delay the request to stay below the rate limit or throw a RateLimited error so you can handle it in your client / UI. It would be even better if it could be backed by something like Redis so you don't blow past your limit in a multi-process / serverless environment. | TBD |

One of the nice things about standardizing on web Request & Response is that links can be used anywhere the Fetch API exists, even without using openapi-fetch:

function applyLinks(
  request: Request,
  links: Link[],
): Promise<Response> {
  const [link, ...rest] = links
  if (!link) {
    throw new Error('Terminating link missing in links chain')
  }
  return link(request, (req) => applyLinks(req, rest))
}

const myFetch: typeof globalThis.fetch = (url, init) =>
  applyLinks(new Request(url, init), [
    // add your own custom links, just don't forget to terminate
    logLink(),
    fetchLink(),
  ])

await myFetch('https://httpbin.org/get')

If you are creative, you can even find other crazy places for links. For example, you can use links inside Next.js' new app directory route handlers:

// @file: app/my-endpoint/route.ts

import { NextResponse, type NextRequest } from 'next/server'
import { applyLinks, logLink } from '@opensdks/fetch-links'

const handler = (req: Request) => {
  // do something with the incoming request
  return NextResponse.json({ some: 'data' })
}

export function GET(req: NextRequest) {
  return applyLinks(req, [logLink(), handler])
}

My hope is that by making links depend on nothing except the web Fetch API, we will get a whole bunch of people implementing all sorts of cool links because they're useful to them, and then it will benefit everybody who wants to use links.

Current status

I'm already using this approach in production on top of openapi-fetch by using a customFetch. It looks like this:

https://github.com/useVenice/openSDKs/blob/17bd1b1806b5b54a4a1f77c3294021038e2492c2/examples/example-openapi-fetch-links.js#L3-L18

import createClient from 'openapi-fetch'
import {applyLinks, fetchLink, logLink} from '@opensdks/fetch-links'

const customFetch = (url, init) =>
  applyLinks(new Request(url, init), [
    logLink(),
    // other links as desired...
    fetchLink(),
  ])

const client = createClient({
  fetch: customFetch,
  baseUrl: 'https://httpbin.org',
})

client.GET('/anything', {}).then((r) => console.log(r.data))

If you run this, you get:

❯ node example-openapi-fetch-links.js
[#1 Request] GET https://httpbin.org/anything
[#1 Response] OK 200
{
  args: {},
  data: '',
  files: {},
  form: {},
  headers: {
    Accept: '*/*',
    'Accept-Encoding': 'br, gzip, deflate',
    'Accept-Language': '*',
    'Content-Type': 'application/json',
    Host: 'httpbin.org',
    'Sec-Fetch-Mode': 'cors',
    'User-Agent': 'node',
    'X-Amzn-Trace-Id': 'Root=1-65828476-7dc057731fca88d70412d29d'
  },
  json: null,
  method: 'GET',
  origin: '81.193.212.57',
  url: 'https://httpbin.org/anything'
}

We don't need to depend on @opensdks/fetch-links at runtime (or even at build time) if you want to keep the lib zero-dependency and lightweight, as the needed code is small enough to copy-paste. The applyLinks function earlier is literally the only piece of runtime code, and the only type signature that is required is the Link signature. If we can adopt that same interface in openapi-fetch, I think we'll be able to get a whole lot more out-of-the-box middlewares, because everyone who uses the Fetch API can benefit from and therefore contribute to them.

Happy to submit a PR for this or work off your branch. Lmk.

@djMax

djMax commented Dec 20, 2023

I'm not a big fan of the "overload" either, even though I get why it's in the minimal spirit. The link syntax seems fine and occurs in other modules like vitest. But it can be a bit confusing at first sight, so I'm also fine with onRequest/onResponse in an object for each middleware entry.

@jmpecx

jmpecx commented Dec 22, 2023

+1
BTW, is 0.9.0 published?

@Noahkoole

@drwpow Where can I download the 0.9.0 version? I wouldn't care if it is a beta, but I really need the middleware for my project.

@drwpow
Contributor Author

drwpow commented Jan 27, 2024

@Noahkoole The API for the middleware is still baking; I received some fantastic feedback on the approach from @tonyxiao and others, and wanted to release an RC with some improvements. I had also been working on a project with Hono, which has a great middleware API IMO, and wanted to put some ideas from that in here as well. Am planning to release an RC very soon; within the next week or so.

@WickyNilliams
Contributor

No need to rush it out. Better to get it right than to get it quick imo. Sounds promising either way 👌

@ootkin

ootkin commented Jan 28, 2024

I'm interested too. I would like to help if possible.

drwpow mentioned this issue Jan 28, 2024
@drwpow
Contributor Author

drwpow commented Jan 28, 2024

Put up a PR that’s also an RFC for the API. Please see the attached docs for the API and intended usage (thanks @denisw for the great suggestion 🙂). And if you’re interested, please kick the tires and report any bugs 🐞

@djMax

djMax commented Jan 29, 2024

Typo - "MergedOptiosn"

@drwpow
Contributor Author

drwpow commented Feb 14, 2024

Kicked the tires a little more, made some minor tweaks, and published 0.9.0-rc.0 (under the openapi-fetch@next tag). Last chance to voice any final suggestions before it’s released!

@drwpow
Contributor Author

drwpow commented Feb 15, 2024

Just merged #1521! 🎉 Will release 0.9.0 shortly along with a better querySerializer and full support for Path serialization

@drwpow drwpow closed this as completed Feb 15, 2024
@jmpecx

jmpecx commented Feb 19, 2024

Since onRequest and onResponse are two independent functions, they are not related to each other, which makes it difficult to perform backtracking operations.

Consider the following common scenario: we determine that the status code returned by the server is 401, then we refresh the token, and then we resend the request. At this point, we cannot find out in onResponse which request was sent, so how can we resend it?

Therefore, I think middleware cannot replace the role of a custom fetch for the time being, although it is much prettier than the ugly fetch patch.
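
For completeness, the custom-fetch version of that 401/refresh/resend flow keeps the request in the calling context, roughly like this (a sketch; refreshToken is a placeholder):

declare function refreshToken(): Promise<void>; // your refresh logic

const fetchWithRefresh: typeof fetch = async (input, init) => {
  // Build the Request up front so it can be cloned and resent after a refresh
  const original = new Request(input, init);
  let res = await fetch(original.clone());
  if (res.status === 401) {
    await refreshToken();
    res = await fetch(original.clone());
  }
  return res;
};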

@drwpow
Contributor Author

drwpow commented Feb 19, 2024

@jmpecx many other middleware APIs have a similar separation. MSW, for instance, sends a UUID to mark individual requests that you can track across event types. That would add a little weight to the library, but would something like that solve your use case?

@jmpecx

jmpecx commented Feb 20, 2024

@drwpow I considered this scenario, but then the client would need to store all in-flight requests for retrieval. This will result in additional logic complexity and memory consumption.

But if I use a custom fetch it can be solved easily (the request is stored in the invocation context, so we don't need to recover it).

For middleware, maybe we can invoke onResponse with the original Request object? Then we can resend it.

@ohroy

ohroy commented Feb 20, 2024

I made a PR for this feature.

#1556

@drwpow
Contributor Author

drwpow commented Feb 20, 2024

@rozbo thanks for your suggestion, but I’d prefer changes (especially breaking ones) to the API be proposed (as a separate issue) before a PR is opened; that way no work is wasted. But there are some great ideas in your PR I’d like to work in if possible.

@tonyxiao

@rozbo Looks like you have similar ideas to what I have here: #1122 (comment). Perhaps we can collaborate? I think you can wrap openapi-fetch to accomplish your desired functionality without needing any API changes.

@drwpow speaking of which, I would like to hear your feedback on my comments too. Not sure if you took anything from them for this PR.

@drwpow
Contributor Author

drwpow commented Feb 23, 2024

@tonyxiao Ah, sorry, I never gave feedback on all your wonderful ideas! I’m impressed with the thoroughness of your research and implementations, and did read every word you wrote and looked at the examples. I should have taken more time to weigh options with you. My major concerns with that particular API were:

  • More code for the user: it seemed like in order to modify the outgoing request or response, it would involve a little more creative thinking to implement, assuming they wanted to do one or the other (but not both).
  • More code internally: I imagined that the internal implementation with callbacks could get tricky and have unintended bugs, whereas a simple serial promise array is simple, predictable, and tiny.
  • Required too much community support: If Links have more overhead composing than a simpler “hook”-type callback, it does make sense to create, document, and share them as a way to reuse them. But I didn’t have the time (at least in this release) to create a litany of well-documented Links that would then get issues opened on them. I love the theory; it was just a little bit too much time commitment. But also, perhaps I underestimate the people that do contribute here and would be interested in helping 🙂

Of course, any of my points may just be misunderstanding! So feel free to correct me on any of those. I also am fully aware that openapi-fetch is still 0.x (pre-stable), so if we don’t like the middleware API as-is, I am still fully leaving the door open to the Links idea as a rewrite for 1.0, if it could meet the needs of the current API and then some. This was a heavily-requested feature, so I felt that just unblocking people, then learning how people do/don’t use it, would be more valuable than spinning on the API further; again, major versions are free, and we can revisit it at any time if the majority of users feel it should be a certain way.

@christoph-fricke
Contributor

@drwpow After using the current middleware implementation to attach an authorization header to every request, I wanted to add a new logging middleware today. I am using openapi-fetch on the server, so the goal is to get insights into outgoing requests and their responses through small log entries. Something like:

[15:05:34.488] INFO (ApiClient): GET http://localhost:3503/api/todos - 200 in 10ms

Creating this entry requires information about the request method and url, as well as response status and additional timing information.

I fail to see how this can be built with the current middleware implementation, because the onResponse handler, where the log has to be written, does not reliably have access to the request object. Even if the original request is provided to the onResponse handler, how should one go about tracking additional information (like timings) across the whole request/response cycle?

Using the widely popular (connect) middleware approach in Express made it easy to implement logging for incoming requests. I love how @tonyxiao's middleware proposal follows the same flexible philosophy, but makes it work for fetch Request/Response objects and asynchronous operations. This middleware model would have made my logging implementation easy in openapi-fetch as well.

Sadly, my only solution has been to create a wrapper around fetch. Interestingly, with a bit of generalization and abstraction, I would easily end up with an implementation of @tonyxiao's middleware proposal. Kinda shows how flexible and minimal it is. This is my current implementation for logging (and attaching auth headers) in openapi-fetch:

import createClient from 'openapi-fetch';

const client = createClient({
  baseUrl,
  fetch: async (input, init) => {
    const req = new Request(input, init);
    req.headers.set('Authorization', `${token.tokenType} ${token.accessToken}`);

    let timing = Date.now();
    const res = await fetch(req);
    timing = Date.now() - timing;

    logger.info(`${req.method} ${decodeURIComponent(req.url)} - ${res.status} in ${timing}ms`);
    return res;
  },
});

@drwpow
Contributor Author

drwpow commented Mar 16, 2024

@christoph-fricke That’s great feedback, thank you. Agree that @tonyxiao’s implementation is robust and elegant, and perhaps I was too quick to go another route. I’m still intentionally keeping openapi-fetch at 0.x for a while longer, and am still open to going in another direction for middleware for 1.x. The more pieces of feedback I get that point in one direction, the better 🙂

Even if the original request is provided to the onResponse handler, how should one go about tracking additional information (like timings) across the whole request/response cycle?

Short-term, one idea that’s been discussed is passing in a request UUID to the middleware to track unique requests (MSW has a similar concept). That way you could profile fetches even if multiple are happening in parallel. It’d require some state outside of the middleware, of course, but that would probably be recommended to do anyway so it doesn’t block the request itself. While the request UUID doesn’t exist now, it could be a nonbreaking change added today (and if someone is willing to open a PR for it before I can, I’d happily accept it).
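
To sketch what that might look like from userland (purely hypothetical signatures; the per-request id does not exist in the current API, the point is only that timing state lives outside the middleware):

// Hypothetical: assumes the middleware receives a per-request `id`
const timings = new Map<string, number>();

const timingMiddleware = {
  onRequest(id: string, req: Request) {
    timings.set(id, Date.now());
    return req;
  },
  onResponse(id: string, res: Response) {
    const start = timings.get(id);
    if (start !== undefined) {
      console.log(`${res.url} - ${res.status} in ${Date.now() - start}ms`);
      timings.delete(id);
    }
    return res;
  },
};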

@drwpow
Contributor Author

drwpow commented Mar 16, 2024

Consider the following common scenario. We determine that the status code returned by the server is 401, then we refresh the token, and then we resend the request. At this point, we cannot find out in onReponse which request sent. and now how can we resend it?

Sorry; realized I didn’t address this specifically. That’s a great question, and one I realized I hadn’t explicated anywhere before. At least for 0.x I feel strongly openapi-fetch won’t retry requests. When you open that up, you inevitably start getting into caching (and cache invalidation), orchestration, request deduplication, and all the bloat that comes with more advanced clients. If you’re using openapi-fetch wrapped in a caching layer (e.g. React Query), it already has retry mechanisms built-in. For now, openapi-fetch is still a low-level typed fetch library that can work in any stack and any setup. In addition to time cost, adding more features limits how it can/can’t be used.

Perhaps we can revisit this in 1.x and beyond (especially if others are willing to help contribute more!), but I haven’t started thinking about that just yet. Still collecting good feedback like yours to inform that 🙂

@crutch12
Contributor

crutch12 commented Sep 4, 2024

Hi.

We have written an openapi-typescript wrapper over axios: https://github.com/web-bee-ru/openapi-axios

Hope it helps ❤️

@drwpow
Contributor Author

drwpow commented Sep 12, 2024

@crutch12 would be more than happy to add this to the docs as a suggested fetch client. Could you open a PR?

crutch12 added a commit to crutch12/openapi-typescript that referenced this issue Sep 16, 2024