ChatGPT: Model Unavailable / Overloaded
Current Status: Operational
What We're Seeing Right Now
No recent issues reported. If you're experiencing problems with ChatGPT, report them below to help the community.
What is this error?
When ChatGPT reports a model as unavailable or overloaded, it means the specific model you're trying to use cannot process requests at the moment. This is usually temporary and tends to happen during peak usage, model updates, or capacity shortfalls.
Error Signatures
- Model not found (model_not_found)
- The model is currently overloaded (model_overloaded)
- Model is at capacity (capacity_exceeded)
- This model is not available
- The model does not exist
- Access denied to model
- Model temporarily unavailable
Common Causes
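The signatures above fall into two groups: capacity errors worth retrying, and missing/denied-model errors that won't fix themselves. A minimal sketch of that triage, using substring matching against the messages listed here (the signature strings and the `classify` helper are illustrative, not part of any official SDK):

```python
# Capacity-related signatures: waiting and retrying is likely to help.
RETRYABLE = (
    "overloaded",
    "at capacity",
    "capacity_exceeded",
    "temporarily unavailable",
)

# Missing or denied model: retrying won't help; switch models instead.
FATAL = (
    "model_not_found",
    "not found",
    "does not exist",
    "access denied",
)

def classify(message: str) -> str:
    """Map an error message to a suggested next step."""
    msg = message.lower()
    if any(sig in msg for sig in RETRYABLE):
        return "retry"
    if any(sig in msg for sig in FATAL):
        return "switch-model"
    return "unknown"
```

In a real client you would also check the HTTP status code (429/503 for capacity, 404/403 for missing or denied models) rather than relying on message text alone.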
- The model is overloaded with too many concurrent users
- The model is being updated or undergoing maintenance
- The model has been deprecated or renamed
- Your plan doesn't include access to this specific model
- Regional capacity issue: the model is available in some regions but not others
✓ How to Fix It
- Check if ChatGPT has announced any model changes or deprecations
- Try a different model version (e.g., gpt-4 instead of gpt-4-turbo)
- Wait 5-10 minutes and retry — overload is often temporary
- Check your plan's model access — some models are limited to higher tiers
- Verify the exact model name/ID — typos or outdated names cause this error
- Check this page for community reports on model availability
Service Components
ChatGPT Web
Operational
Recent Incidents
No incidents in the past 30 days
Frequently Asked Questions
Why is a ChatGPT model not available?
The model may be overloaded, under maintenance, deprecated, or not included in your plan. Check ChatGPT's announcements and try an alternative model.
Is the ChatGPT model overloaded right now?
Check the live status and community reports on this page. A spike in reports usually indicates widespread capacity issues.
What alternative models can I use on ChatGPT?
Most providers offer multiple models. Try a smaller or newer version — these often have more available capacity.