LLM provider overloaded

Incident Report for Patlytics

Resolved

This incident has been resolved.
Posted Jul 03, 2025 - 17:13 EDT

Monitoring

Performance is recovering from the earlier overload at our LLM provider. Most AI-powered features should be working as expected again.

We’ll continue to monitor the situation closely and will take further action if needed.

Thanks for your patience!

— The Engineering Team
Posted Jul 03, 2025 - 16:07 EDT

Identified

The issue has been identified and a fix is being implemented.
Posted Jul 03, 2025 - 13:46 EDT

Investigating

We’re currently experiencing degraded performance for features powered by our AI assistant. This is due to our underlying LLM provider being temporarily overloaded.

Our team is actively monitoring the situation and working on mitigation steps, including load balancing and fallback strategies.
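For readers curious what a fallback strategy can look like, here is a minimal sketch of falling back across providers when one is overloaded. All names here are hypothetical illustrations, not our actual implementation.

```python
class ProviderOverloaded(Exception):
    """Raised when a provider reports an overload / rate-limit error."""

def call_with_fallback(prompt, providers):
    """Try each provider in order; return the first successful response.

    `providers` is a list of callables taking a prompt string. If every
    provider is overloaded, re-raise the last error.
    """
    last_error = None
    for provider in providers:
        try:
            return provider(prompt)
        except ProviderOverloaded as err:
            last_error = err  # this provider is overloaded; try the next
    raise last_error

# Example: the primary provider is overloaded, so the backup answers.
def primary(prompt):
    raise ProviderOverloaded("primary overloaded")

def backup(prompt):
    return f"response to: {prompt}"

print(call_with_fallback("hello", [primary, backup]))
```

Real deployments layer retries, timeouts, and load balancing on top of this, but the core idea is the same: detect the overload error and route the request to a healthy provider.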

We’ll keep you updated as soon as service stabilizes. Thanks for your patience!

— The Engineering Team
Posted Jul 03, 2025 - 13:46 EDT