
Otel support #520

Open
oldbettie opened this issue Sep 21, 2024 · 3 comments

@oldbettie
This is a copy of my conversation in discord but I believe this is a big enough deal to create an issue.

Has anyone got Otel working based on this https://nextjs.org/docs/app/building-your-application/optimizing/open-telemetry

I have wasted days on this now. I have it working locally with SST, but the moment I deploy it I get nothing. I even made a minimal repro to test it with the latest version of SST, and it still does not work in any SST remote deployment. The same minimal repro works on Vercel, Railway, and even self-hosted in AWS ECS/ECR.
I love SST, but this is a deal breaker for me. I have tried both manual and auto instrumentation. I am not running a collector; I am just using Honeycomb's traces endpoint, as it doesn't require one.
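For context, a typical auto-instrumentation setup following the Next.js guide linked above looks roughly like this (a sketch only; the service name is a placeholder, and the Honeycomb endpoint/header values shown in the comments are assumptions based on Honeycomb's OTLP docs, not taken from my actual setup):

```typescript
// instrumentation.ts (project root)
import { registerOTel } from "@vercel/otel";

export function register() {
  // Exporter configuration is picked up from standard OTel env vars, e.g.:
  //   OTEL_EXPORTER_OTLP_ENDPOINT=https://api.honeycomb.io
  //   OTEL_EXPORTER_OTLP_HEADERS="x-honeycomb-team=<your-api-key>"
  registerOTel({ serviceName: "my-next-app" });
}
```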

I have load tested it in railway and it performs exactly as expected.

Is this something open-next plans to sort out, or should I just switch to Railway or Vercel? How do people in production currently deal with Otel? Surely this has been raised before, but I have searched high and low and could not find anything. This tells me either no one deploys to prod with SST...

@conico974
Contributor

conico974 commented Sep 21, 2024

This tells me either no one deploys to prod with SST...

Not everyone uses Otel. Otel is considered EXPERIMENTAL by Next themselves; that's not exactly something I would consider mandatory for deploying to prod.

Here is a link to the discord thread for context https://discord.com/channels/983865673656705025/1286983198819090445

This is the context used by vercel in their runtime https://github.com/vercel/otel/blob/main/packages/otel/src/vercel-request-context/api.ts

One other option (probably a better one, now that I think about it) would be to use a custom wrapper and provide a fake Vercel context yourself: https://open-next.js.org/config/custom_overrides
Something like this (not tested) should do the trick, but be aware that it is a bit risky: it relies on Vercel internals which might change in the future:

// customWrapper.ts
import defaultWrapper from "open-next/wrappers/aws-lambda.js";
import { WrapperHandler } from "open-next/types/open-next.js";

const handler: WrapperHandler = async (openNextHandler, converter) => {
  // Reuse the standard (non-streaming) aws-lambda wrapper
  const defaultHandler = await defaultWrapper.wrapper(openNextHandler, converter);
  return async (event, context) => {
    // @vercel/otel looks up this well-known symbol on globalThis
    // to find the request context (see the api.ts link above)
    const symbol = Symbol.for("@vercel/request-context");
    const promisesToAwait: Promise<unknown>[] = [];

    //@ts-ignore
    globalThis[symbol] = {
      get: () => ({
        // Collect background work so it can be flushed before the lambda freezes
        waitUntil: (promiseOrFunc: Promise<unknown> | (() => Promise<unknown>)) => {
          const promise = "then" in promiseOrFunc ? promiseOrFunc : promiseOrFunc();
          promisesToAwait.push(promise);
        },
        headers: event.headers,
        url: event.rawPath,
      }),
    };
    const response = await defaultHandler(event, context);
    // Make sure pending telemetry exports finish before returning
    await Promise.all(promisesToAwait);
    return response;
  };
};

export default {
  wrapper: handler,
  name: "aws-lambda",
  supportStreaming: false,
};
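To actually use the wrapper you'd point open-next at it from your config, along the lines of the custom_overrides page linked above. A rough sketch (untested; the exact config shape should be checked against that page for your open-next version):

```typescript
// open-next.config.ts
const config = {
  default: {
    override: {
      // Lazy-load the custom wrapper defined in customWrapper.ts
      wrapper: () => import("./customWrapper").then((m) => m.default),
    },
  },
};

export default config;
```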

@khuezy
Contributor

khuezy commented Sep 21, 2024

This tells me either no one deploys to prod with SST...

dozens

@jonlambert

Not everyone uses Otel. Otel is considered EXPERIMENTAL by Next themselves; that's not exactly something I would consider mandatory for deploying to prod.

Disagree here FWIW - it's no longer regarded as experimental by Next.js, and OpenTelemetry is definitely something a lot of companies will expect to be supported in a deployment.
