Using the `Inference` class, you can also generate JSON output from model inference. This is useful, for example, when you need to process the response in a structured way or when you want to store elements of the response in a database.
The `Inference` class supports multiple inference modes, such as `Tools`, `Json`, `JsonSchema`, or `MdJson`, which give you the flexibility to choose the best approach for your use case.
NOTE: Some model providers allow you to specify a JSON schema for the model to follow via the `schema` parameter of `response_format`. OpenAI does not support this feature in JSON mode (only in JSON Schema mode).
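To make the note concrete, here is a hedged sketch of the two `response_format` variants, with field names following the OpenAI chat-completions API: in JSON mode the provider only guarantees syntactically valid JSON, while in JSON Schema mode the response is constrained to the supplied schema (the schema contents below are illustrative).

```python
# JSON mode: valid JSON is guaranteed, but no schema is enforced.
json_mode = {"response_format": {"type": "json_object"}}

# JSON Schema mode: the response must conform to the given schema.
json_schema_mode = {
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "user",   # hypothetical schema name
            "strict": True,   # enforce the schema exactly
            "schema": {
                "type": "object",
                "properties": {"name": {"type": "string"}},
                "required": ["name"],
                "additionalProperties": False,
            },
        },
    },
}
```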