LLM decision and verification layer
Compare responses from multiple models
Maximum prompt length: 4,000 characters • Responses stream in real time
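A minimal sketch of what this layer does behind the blurb, assuming nothing about the actual backend: enforce the 4,000-character prompt limit, fan the same prompt out to several models, and stream their chunks back concurrently for side-by-side comparison. The model names and the stream_model() generator are hypothetical placeholders, not the product's real API.

```python
import asyncio
from typing import AsyncIterator

MAX_PROMPT_CHARS = 4_000  # matches the stated prompt limit


async def stream_model(model: str, prompt: str) -> AsyncIterator[str]:
    """Placeholder: yield response chunks for one model as they arrive."""
    for i in range(3):
        await asyncio.sleep(0.1)  # simulate network latency
        yield f"[{model}] chunk {i}"


async def compare(prompt: str, models: list[str]) -> None:
    # Reject prompts over the documented limit before calling any model.
    if len(prompt) > MAX_PROMPT_CHARS:
        raise ValueError(f"Prompt exceeds {MAX_PROMPT_CHARS} characters")

    async def consume(model: str) -> None:
        # Print chunks as they stream in, tagged by model for comparison.
        async for chunk in stream_model(model, prompt):
            print(chunk)

    # Run all model streams concurrently so responses arrive side by side.
    await asyncio.gather(*(consume(m) for m in models))


if __name__ == "__main__":
    asyncio.run(compare("Is 2 + 2 equal to 4?", ["model-a", "model-b"]))
```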