
Question: Sync getter? #21

Open
yaacovCR opened this issue May 31, 2024 · 3 comments
Labels
enhancement New feature or request

Comments

@yaacovCR

What kind of improvement might you envision for a synchronous API with the in-memory option? Or is that already an option? Or is it not worth the complexity of returning either a promise or a value and forcing callers to check which they got?
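For readers unfamiliar with the "promise or value" pattern being asked about, here is a minimal sketch of what such a getter could look like. The names (`TieredCache`, `fetchFromL2`) are illustrative only and are not bentocache's actual API:

```typescript
// Hypothetical sketch of a "sync-or-async" getter: the caller must check
// whether it got a value or a Promise. Not bentocache's real API.
type MaybePromise<T> = T | Promise<T>

class TieredCache {
  private l1 = new Map<string, string>()

  // Returns the value synchronously on an L1 hit; otherwise returns a
  // Promise that resolves after consulting a (simulated) async L2 backend.
  get(key: string): MaybePromise<string | undefined> {
    if (this.l1.has(key)) return this.l1.get(key)
    return this.fetchFromL2(key)
  }

  set(key: string, value: string): void {
    this.l1.set(key, value)
  }

  private async fetchFromL2(key: string): Promise<string | undefined> {
    // Stand-in for a real async driver call (Redis, etc.).
    return undefined
  }
}

const cache = new TieredCache()
cache.set('plan:123', 'cached-plan')
const result = cache.get('plan:123')
if (result instanceof Promise) {
  // async path: L1 miss, must await
} else {
  // sync path: L1 hit, no Promise was allocated
}
```

The cost this question alludes to is exactly the `instanceof Promise` branch: every caller has to handle both shapes, which is the complexity being weighed against the per-call promise allocation.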

@Julien-R44
Owner

Julien-R44 commented Jun 1, 2024

Honestly, yes: my naive guess is that it would require quite a bit of work and add complexity to the codebase, so I'm not sure it's worth it.

What's your use case? Are you worried about the performance hit of creating promises, or do you just need to use bentocache in a synchronous context?

@yaacovCR
Author

The performance hit. But not per se as a user of this library; I'm mostly asking as a learning opportunity and to compare approaches.

My context is that I work here and there on open-source development within the graphql server reference implementation. In that codebase, a lot of effort goes into staying synchronous when we can, despite the more complex return types, so to be honest I was surprised that this library didn't do the same.

One exception, of course, is DataLoader, which will stay async so that one synchronous hit amidst a batch does not cause the batch to split. If one is using bentocache inside a DataLoader, all will be well.

But considering the general purpose of this library, I was a bit surprised. I guess it also comes down to that tradeoff of performance vs. usability...

Sorry for the late reply and thanks for taking a look at my question!

@yaacovCR
Author

Let me put a practical spin on it. I want to create a graphql executor that uses query planning, and I want to cache the plans without making every operation async. I see now that you have L1 drivers that are synchronous and L2 drivers that are async, but I can't use bentocache for plan caching, because even when the plan is in L1 the call to bentocache is async. I could use my own L1 cache and only call bentocache when it's empty, but then I am recreating the superb functionality of this library. :(
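The workaround described above can be sketched roughly as follows. This is a hypothetical illustration: `asyncCacheGet` is a stand-in for bentocache's async lookup, and `getPlan` is an invented helper, not part of either library:

```typescript
// Sketch of the workaround: a plain Map as a hand-rolled synchronous L1
// sitting in front of an async cache. `asyncCacheGet` stands in for an
// async bentocache lookup; none of these names are real APIs.
const planCache = new Map<string, string>()

async function asyncCacheGet(key: string): Promise<string | undefined> {
  return undefined // stand-in for the async L2 lookup (always a miss here)
}

function getPlan(key: string): string | Promise<string> {
  const hit = planCache.get(key)
  if (hit !== undefined) return hit // sync fast path: no Promise allocated

  return asyncCacheGet(key).then((value) => {
    const plan = value ?? `planned:${key}` // build the plan on a full miss
    planCache.set(key, plan) // populate the sync L1 for next time
    return plan
  })
}
```

This keeps repeated lookups synchronous, but, as the comment notes, it duplicates the L1 tier that bentocache already manages internally, which is exactly the objection being raised.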

@Julien-R44 Julien-R44 added the enhancement New feature or request label Oct 2, 2024