How to use LLM inference API
The `Inference` class offers access to LLM APIs and convenient methods to execute model inference, including chat completions, tool calling, and JSON output generation.
LLM provider access details can be found and modified via `/config/llm.php`.
The `Inference::text()` method provides a simplified inference API that uses the default connection for convenient ad-hoc calls.
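A minimal sketch of such an ad-hoc call. The namespace and the prompt-string argument are assumptions and may differ between Instructor versions:

```php
<?php
use Cognesy\Polyglot\LLM\Inference; // assumed namespace; check your installed version

// One-off call using the default connection from /config/llm.php
$answer = Inference::text('What is the capital of France?');

echo $answer;
```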
For more control over the request, instantiate the `Inference` class and call its `create()` method. The `toText()` method returns the text completion from the LLM response.
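For illustration, a hedged sketch of a full request/response round trip; the `messages` array follows the common chat-completion shape, and the `options` parameter with its `max_tokens` key is an assumption:

```php
<?php
use Cognesy\Polyglot\LLM\Inference; // assumed namespace

// Build the request explicitly, then extract plain text from the response
$answer = (new Inference)
    ->create(
        messages: [['role' => 'user', 'content' => 'What is the capital of France?']],
        options: ['max_tokens' => 64], // assumed option key
    )
    ->toText();

echo $answer;
```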
You can connect to various LLM API providers via connection presets defined in the `llm.php` file. This is useful when you want to use different LLMs or API providers in your application.
The default configuration is located in `/config/llm.php` in the root directory of the Instructor codebase. It contains a set of predefined connections to all LLM APIs supported out of the box by Instructor.
The config file defines connections to LLM APIs and their parameters. It also sets the default connection, used when Instructor is called without an explicit client connection.
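To make the structure concrete, here is a hypothetical sketch of what such a config file could look like; the key names (`defaultConnection`, `connections`, and the per-provider fields) are illustrative, not the library's actual schema -- consult the shipped `/config/llm.php` for the real keys:

```php
<?php
// /config/llm.php -- illustrative sketch only
return [
    'defaultConnection' => 'openai',   // used when no connection is specified
    'connections' => [
        'openai' => [
            'apiUrl'       => 'https://api.openai.com/v1',
            'apiKey'       => getenv('OPENAI_API_KEY') ?: '',
            'defaultModel' => 'gpt-4o-mini',
        ],
        // ...presets for other providers supported out of the box
    ],
];
```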
To use a specific connection, call the `withPreset()` method with the connection preset name.
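A sketch of selecting a non-default connection; the preset name `'anthropic'` is an assumption and must match an entry in your `llm.php`:

```php
<?php
use Cognesy\Polyglot\LLM\Inference; // assumed namespace

$answer = (new Inference)
    ->withPreset('anthropic') // hypothetical preset name defined in llm.php
    ->create(
        messages: [['role' => 'user', 'content' => 'Hello!']],
    )
    ->toText();

echo $answer;
```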
To use your own configuration files, set the `INSTRUCTOR_CONFIG_PATHS` environment variable. You can use copies of the default configuration files as a starting point.
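For example, you could point Instructor at a custom config directory before running your script; the directory path below is hypothetical:

```sh
# Hypothetical path; Instructor will look for config files (e.g. llm.php) there
export INSTRUCTOR_CONFIG_PATHS=./config-custom
```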