Segmentation Fault with Bun >= 1.2.6 and Datadog Tracer #18595

Closed
ranchodeluxe opened this issue Mar 28, 2025 · 3 comments · Fixed by #18711
Labels: bug, crash, node:http, regression

ranchodeluxe commented Mar 28, 2025

What version of Bun is running?

Happens on 1.2.6 and 1.2.7 (not on 1.2.5).

What platform is your computer?

macOS

What steps can reproduce the bug?

Overview

The example below sets up a minimal API solely to reproduce the segmentation fault. The trigger is dd-trace@5.35.0.

Reproduce

  1. Save the script below as buggered.ts
  2. Run bun run --watch buggered.ts (this assumes dd-trace@5.35.0 is installed, e.g. via bun add dd-trace)
  3. Once the API is up, hit it with curl -X POST http://localhost:3000/api/test
  4. The response confirms the crash (empty reply from the server):
$ curl -X POST http://localhost:3000/api/test
curl: (52) Empty reply from server
// buggered.ts
import ddTrace from "dd-trace";
import cluster from "node:cluster";
import { availableParallelism } from "node:os";
import { createServer } from "node:http";

// Initialize tracer
ddTrace.init({
  logInjection: true,
  env: process.env.NODE_ENV || 'development'
});

(() => {
  if (cluster.isPrimary) {
    const desiredWorkers = Math.min(availableParallelism(), 1);
    console.info("Starting server API workers", { desiredWorkers });

    // API server workers
    for (let i = 0; i < desiredWorkers; i++) {
      cluster.fork({ WORKER_TYPE: "api" });
    }

    cluster.on("exit", (worker, code, signal) => {
      console.info("Worker died; starting new one", {
        deadWorkerPid: worker.process.pid,
        code,
        signal,
      });
      cluster.fork({ WORKER_TYPE: "api" });
    });
  } else {
    try {
      const server = createServer((req, res) => {
        // Set common headers
        res.setHeader("X-Worker-Info", `${process.env.WORKER_TYPE}:${process.pid}`);
        res.setHeader("Content-Type", "application/json");

        // Handle routes
        if (req.url === "/health" && req.method === "GET") {
          res.writeHead(200);
          res.end(JSON.stringify({
            status: "healthy",
            pid: process.pid,
            workerType: process.env.WORKER_TYPE
          }));
        } else if (req.url === "/api/test" && req.method === "POST") {
          res.writeHead(200);
          res.end(JSON.stringify({
            message: "Test endpoint hit successfully",
            pid: process.pid,
            workerType: process.env.WORKER_TYPE
          }));
        } else {
          // 404 handler
          res.writeHead(404);
          res.end(JSON.stringify({ error: "Not found" }));
        }
      });

      const PORT = process.env.PORT || 3000;
      server.listen(PORT, () => {
        console.info(`Worker server listening on port ${PORT}`, {
          pid: process.pid,
          workerType: process.env.WORKER_TYPE
        });
      });

      // Handle graceful shutdown
      process.on("SIGTERM", () => {
        console.info("SIGTERM received, shutting down...", {
          pid: process.pid,
          workerType: process.env.WORKER_TYPE
        });
        
        server.close(() => {
          process.exit(0);
        });
      });

    } catch (err) {
      console.error("Worker failed to start", {
        error: err instanceof Error ? err.message : String(err),
        workerType: process.env.WORKER_TYPE,
      });
      process.exit(1);
    }
  }
})(); 

What is the expected behavior?

No segfault; the request completes and the /api/test handler returns its JSON body.
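
Based on the handler in the script above, the successful response would look roughly like this (the pid value is illustrative):

$ curl -X POST http://localhost:3000/api/test
{"message":"Test endpoint hit successfully","pid":12345,"workerType":"api"}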

What do you see instead?

Segfault

Bun v1.2.7 (5c0fa6d) macOS x64
macOS v14.6.1
CPU: sse42 popcnt avx avx2
Args: "/Users/ranchodeluxe/.bun/bin/bun" "/Users/ranchodeluxe/apps/app/src/bun-bug-api.ts"
Features: Bun.stderr(2) dotenv fetch(6) http_server jsc transpiler_cache tsconfig(2) tsconfig_paths process_dlopen
Builtins: "bun:main" "node:async_hooks" "node:child_process" "node:crypto" "node:dns" "node:events" "node:fs" "node:http" "node:https" "node:module" "node:net" "node:os" "node:path" "node:perf_hooks" "node:stream" "node:url" "node:util" "node:zlib" "node:worker_threads" "node:v8" "node:diagnostics_channel" "node:dgram" "node:cluster"
Elapsed: 3014ms | User: 263ms | Sys: 223ms
RSS: 52.95MB | Peak: 52.95MB | Commit: 1.06GB | Faults: 227

panic(main thread): Segmentation fault at address 0x0
oh no: Bun has crashed. This indicates a bug in Bun, not your code.

To send a redacted crash report to Bun's team,
please file a GitHub issue using the link below:

https://bun.report/1.2.7/ma15c0fa6duIiwkU___u+pvhCmjxtnCygqklByg1ybonwkjB0426pB

Additional information

How about a haiku?

sometimes you want to
go where someone knows your name
avoid segfaults friend

ranchodeluxe (Author) commented

I've upgraded to dd-trace@5.45.0 and still see the same issue.

Electroid added the crash and regression labels and removed the needs triage label on Mar 28, 2025
pfgithub (Contributor) commented

Smaller reproduction:

import ddTrace from "dd-trace";
import { createServer } from "node:http";

ddTrace.init({
  logInjection: true,
  env: process.env.NODE_ENV || "development",
});

const server = createServer((req, res) => {});

const PORT = process.env.PORT || 3000;
server.listen(PORT, () => {});

await fetch("http://localhost:" + PORT + "/api/test", {
  method: "POST",
});

Jarred-Sumner (Collaborator) commented

This will be fixed in Bun v1.2.9
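
Until a release with the fix is available, one unverified workaround sketch is to disable dd-trace's http integration, assuming the crash comes from that plugin's interaction with node:http under Bun. The plugin name and the enabled option come from dd-trace's documented plugin configuration; whether this actually avoids the segfault has not been confirmed here.

// workaround-sketch.ts -- unverified; assumes the crash originates in
// dd-trace's http plugin patching node:http under Bun
import ddTrace from "dd-trace";
import { createServer } from "node:http";

ddTrace.init({
  logInjection: true,
  env: process.env.NODE_ENV || "development",
});

// Disable the http integration via dd-trace's per-plugin config.
ddTrace.use("http", { enabled: false });

const server = createServer((req, res) => {
  res.writeHead(200, { "Content-Type": "application/json" });
  res.end(JSON.stringify({ ok: true }));
});

const PORT = process.env.PORT || 3000;
server.listen(PORT, () => {
  console.info(`Listening on port ${PORT}`);
});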
