This month’s release is all about refining the core experience with meaningful updates that make working within the CalypsoAI Platform more efficient, reliable, and predictable. While there are no major feature drops this cycle, the improvements shipped in June help reduce friction, enhance clarity, and support more consistent performance across environments.
Inference Red-Team: More Control When You Need It
Connection Rate Limiting
Inference Red-Team users can now apply a max request rate to individual model or application connections. This is useful when targeting models that aren’t able to process attacks as quickly as CalypsoAI can send them (especially in performance-sensitive or resource-constrained environments).
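To make the idea concrete, here is a minimal token-bucket sketch of what capping a connection's request rate looks like. This is illustrative only: the `RateLimiter` class, `max_rps` parameter, and `send_attack` placeholder are assumptions for the example, not the CalypsoAI API.

```python
import time

class RateLimiter:
    """Token-bucket limiter capping outbound requests per second.

    Illustrative sketch only -- names and structure are assumptions,
    not the CalypsoAI Platform API.
    """
    def __init__(self, max_rps: float):
        self.max_rps = max_rps
        self.tokens = max_rps       # start with a full bucket
        self.last = time.monotonic()

    def acquire(self) -> None:
        """Block until a request slot is available."""
        while True:
            now = time.monotonic()
            # Refill tokens in proportion to elapsed time, up to the cap.
            self.tokens = min(self.max_rps,
                              self.tokens + (now - self.last) * self.max_rps)
            self.last = now
            if self.tokens >= 1:
                self.tokens -= 1
                return
            time.sleep((1 - self.tokens) / self.max_rps)

limiter = RateLimiter(max_rps=5)    # cap the target connection at 5 req/s
start = time.monotonic()
for _ in range(10):
    limiter.acquire()
    # send_attack(connection)      # placeholder for the actual request
elapsed = time.monotonic() - start
```

With a full starting bucket, the first five requests go out immediately and the remaining five are spread over roughly one second, so a slower target model is never overwhelmed.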
Intent Category Descriptions
We added short explanatory text to the intent category chart in Inference Red-Team reports to help clarify what each section of the data represents.
Inference Defend & Inference Observe
Performance Enhancements
We’ve improved the performance of the scans API for customers using the verbose parameter in Inference Defend. In earlier releases, using this parameter at scale caused noticeable latency; this release eliminates that bottleneck.
Dashboard Performance Tuning
Customers with heavier workloads should notice faster dashboard load times in Inference Observe.
Platform-Wide Improvements
Unified Filters for Tables
We’ve consolidated and cleaned up table filters across CalypsoAI’s Inference Platform. Instead of several top-level buttons, filters now live in a single dropdown, with a small badge indicating when one is active. This update applies to:
- The Dashboard
- Logs
- Projects
The goal here was simplicity: not adding new filter types, but making existing ones easier to access and manage.
Additionally, we’ve made backend changes to improve responsiveness in high-data environments and resolved bugs that smooth out the user experience and eliminate edge-case errors.