fix: truncate excess scenarios instead of raising error in MCP generation#556

Open
saurabhbikram wants to merge 1 commit into OpenPipe:main from nansen-ai:truncate-excess-scenarios
Conversation

@saurabhbikram
Summary

When using generate_scenarios(), the LLM occasionally produces more scenarios than the requested num_scenarios count. Previously this raised a ValueError and forced a retry. This PR changes the behavior to:

  • Too few scenarios: Still raises ValueError (generation failed)
  • Too many scenarios: Truncates to the requested count and logs an info message

This makes scenario generation more robust, especially with models that don't strictly respect JSON schema maxItems constraints.

Files changed

  • src/art/mcp/generate_scenarios.py — split the != check into < (error) and > (truncate with log); see the sketch below
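
A minimal sketch of the split check, assuming the validation step receives the parsed scenario list and the requested count (the names `validate_scenario_count`, `scenarios`, `num_scenarios`, and the `logger` instance are illustrative, not the file's actual identifiers):

```python
import logging

logger = logging.getLogger(__name__)


def validate_scenario_count(scenarios: list, num_scenarios: int) -> list:
    """Illustrative sketch: too few scenarios is fatal, too many are truncated."""
    if len(scenarios) < num_scenarios:
        # Generation genuinely failed: fewer scenarios than requested.
        raise ValueError(
            f"Expected {num_scenarios} scenarios, got {len(scenarios)}"
        )
    if len(scenarios) > num_scenarios:
        # LLMs occasionally overshoot the maxItems constraint; keep the first
        # num_scenarios and log rather than fail the whole generation call.
        logger.info(
            "Generated %d scenarios, truncating to requested %d",
            len(scenarios),
            num_scenarios,
        )
        scenarios = scenarios[:num_scenarios]
    return scenarios
```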

@saurabhbikram marked this pull request as ready for review on February 12, 2026 08:45
When the LLM generates more scenarios than requested, truncate to the
requested count instead of raising a ValueError. This makes scenario
generation more robust since LLMs occasionally overshoot the count.
Still raises an error when too few scenarios are generated.

https://claude.ai/code/session_017Y9KNNQX2RyVWnqpj3A4hh
@saurabhbikram force-pushed the truncate-excess-scenarios branch from a4907ed to ef2b0fb on February 12, 2026 08:45