Why does this exist?

I've found myself checking the official OpenAI status dashboard whenever a model is slow or returning errors, but that dashboard usually shows all green even while I'm hitting API, Playground, or ChatGPT issues. In other words, the official status doesn't seem to capture non-catastrophic but still elevated error rates or slowness. This unofficial OpenAI status page fixes that.

At a Glance

Compare current model performance with historical data. Colors reflect two-day percentiles: lower percentile values and greener shades indicate better performance.
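The page's own implementation isn't shown here, but the idea of ranking a current reading against a two-day history and mapping the percentile to a color could be sketched as follows (the thresholds and function names are hypothetical, not taken from the actual dashboard):

```python
from bisect import bisect_left

def percentile_rank(history, current):
    """Percentile (0-100) of `current` within a window of past samples,
    e.g. the last two days of error-rate readings."""
    ordered = sorted(history)
    return 100.0 * bisect_left(ordered, current) / len(ordered)

def color_for(pct):
    """Map a percentile to a rough color bucket; lower values
    (e.g. a lower error-rate percentile) shade greener.
    Bucket boundaries here are illustrative."""
    if pct < 33:
        return "green"
    if pct < 66:
        return "yellow"
    return "red"
```

For example, an error rate sitting at the 10th percentile of the two-day window would render green, while one at the 90th percentile would render red.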

7/25/2024, 6:00:15 PM UTC
Model | Tokens / Second (Perf, %ile) | Difference | 2 Day Average | 3 Hour Error Rate (Status)

Tokens Per Second

The number of tokens per second the model is generating. Indicates model speed.
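A minimal sketch of how such a throughput number could be measured from a streaming response: consume the stream, count tokens, and divide by wall-clock time. The function and the fake stream below are illustrative; the real dashboard's measurement code is not shown in this document.

```python
import time

def measure_tokens_per_second(stream):
    """Consume an iterable of tokens (e.g. chunks from a streaming
    completion API) and return tokens generated per second."""
    start = time.perf_counter()
    count = sum(1 for _ in stream)  # exhaust the stream, counting tokens
    elapsed = time.perf_counter() - start
    return count / elapsed if elapsed > 0 else float("inf")
```

In practice the iterable would be a streamed model response; here any iterator of token chunks works.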

API Response Time

The response time of a call to the OpenAI API's models endpoint, as made from US-West. This indicates base API latency, independent of any model's generation time. Shorter durations are preferable.
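Measuring base latency like this amounts to timing a single lightweight request. A generic timer could look like the sketch below; the commented usage against the models endpoint is an assumption about how such a probe might be wired up, not the dashboard's actual code.

```python
import time

def time_call(fn):
    """Return wall-clock seconds taken by a single call to `fn`,
    a rough proxy for base request latency."""
    start = time.perf_counter()
    fn()
    return time.perf_counter() - start

# Hypothetical usage against the OpenAI models endpoint:
# import urllib.request
# latency = time_call(lambda: urllib.request.urlopen(
#     urllib.request.Request(
#         "https://api.openai.com/v1/models",
#         headers={"Authorization": "Bearer <API_KEY>"},
#     )
# ).read())
```

Timing the cheap list-models call rather than a completion isolates network and API-gateway latency from model generation time.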