Is your feature request related to a problem? Please describe.
We have several reasoning family configurations in the router config; it would be great for LLM-Katan to support them in e2e tests.
```yaml
# Reasoning family configurations
reasoning_families:
  deepseek:
    type: "chat_template_kwargs"
    parameter: "thinking"
  qwen3:
    type: "chat_template_kwargs"
    parameter: "enable_thinking"
  gpt-oss:
    type: "reasoning_effort"
    parameter: "reasoning_effort"
  gpt:
    type: "reasoning_effort"
    parameter: "reasoning_effort"
```
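For context, a minimal sketch of how these two family types might map onto an OpenAI-style request body: `chat_template_kwargs` families toggle a boolean template flag, while `reasoning_effort` families set a top-level effort string. The `reasoning_fields` helper below is hypothetical and only illustrates the mapping; it is not part of LLM-Katan or the router.

```python
# Hypothetical illustration of the two reasoning family types above.
# Mirrors the router config entries; the helper itself is an assumption.
REASONING_FAMILIES = {
    "deepseek": {"type": "chat_template_kwargs", "parameter": "thinking"},
    "qwen3": {"type": "chat_template_kwargs", "parameter": "enable_thinking"},
    "gpt-oss": {"type": "reasoning_effort", "parameter": "reasoning_effort"},
    "gpt": {"type": "reasoning_effort", "parameter": "reasoning_effort"},
}

def reasoning_fields(family: str, enabled: bool = True, effort: str = "high") -> dict:
    """Return the extra request-body fields implied by a reasoning family."""
    cfg = REASONING_FAMILIES[family]
    if cfg["type"] == "chat_template_kwargs":
        # Boolean flag passed through to the chat template,
        # e.g. {"chat_template_kwargs": {"enable_thinking": True}}
        return {"chat_template_kwargs": {cfg["parameter"]: enabled}}
    # Effort level set as a top-level field, e.g. {"reasoning_effort": "high"}
    return {cfg["parameter"]: effort}

print(reasoning_fields("qwen3"))
print(reasoning_fields("gpt"))
```

An e2e test could iterate over all four families and assert that LLM-Katan accepts each generated payload shape.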