Conversation

@alii alii (Member) commented Feb 5, 2025

No description provided.

alii added 2 commits February 5, 2025 13:21
…les, make getContext and onError optional, move server options into the router to require less boilerplate, add openapi example
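
For illustration, a minimal sketch of the reduced boilerplate this commit describes, assuming a `createRouter`-style factory. Every name below (`createRouter`, `RouterOptions`, `port`) is a stand-in for the example, not the library's confirmed API; the point is only that server options move onto the router and that `getContext`/`onError` become optional.

```ts
// Hypothetical stand-ins so the example is self-contained; the real API may differ.
type RouterOptions = {
	port?: number;                              // server options live on the router itself
	getContext?: (request: Request) => unknown; // now optional
	onError?: (error: unknown) => Response;     // now optional
};

function createRouter(options: RouterOptions) {
	// Placeholder implementation, just to make the shape concrete.
	return { options };
}

// With getContext and onError omitted, the minimal setup is a single call:
const router = createRouter({ port: 3000 });
```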
@vercel vercel bot commented Feb 5, 2025

The latest updates on your projects.

Project   Deployment   Preview   Updated (UTC)
docs      Error        Error     Nov 24, 2025 4:23am

…t to Response, rather than just instance check
* feat: add workflow for running perf test

* fix: branch

* fix: workflow

* fix: workflow + cache

* fix: server

* fix: server

* fix: server

* fix: server part 500

* test: comments

* fix: oha

* feat: add status code distribution and hopefully runner info

* feat: performance tests

* fix: pr

* fix: perms?

* fix: god help us all

* fix: always cache cargo

* bruh

* chore: disable workflow permissions

* test

@github-actions github-actions bot commented Feb 5, 2025

📊 Performance Test Results

Runner Specifications:

  • CPU: AMD EPYC 7763 64-Core Processor
  • CPU Cores: 4
  • Memory: 15Gi
Metric            Value
Average Latency   0.003ms
P95 Latency       0.006ms
P99 Latency       0.008ms
Requests/sec      12524
Success Rate      100%

Status Code Distribution:

  • 200: 62591 requests

@alii alii (Member, Author) commented Feb 5, 2025

12k rps is awful, but still useful for comparison between PRs.

To do a 201/202/204, developers should return `new Response(null, {status})`
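
As a sketch of that pattern: only the `new Response(null, { status })` call comes from the comment above; the surrounding handler type is assumed for the example and is not the library's confirmed API.

```ts
// The Handler type is an assumption for the example; the Response usage itself
// is the standard Web API pattern referenced above.
type Handler = (request: Request) => Response | Promise<Response>;

// 204 No Content: construct the Response explicitly instead of returning a body.
const deleteItem: Handler = async () => {
	// ...perform the deletion...
	return new Response(null, { status: 204 });
};

// 201 Created works the same way, with a body and an explicit status.
const createItem: Handler = async (request) => {
	const input = await request.json();
	// ...persist the item...
	return new Response(JSON.stringify(input), {
		status: 201,
		headers: { 'content-type': 'application/json' },
	});
};
```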