Using custom LLM driver
Overview
You can register and use your own LLM driver, either under a new driver name or by overriding one of the drivers bundled with Polyglot. The driver class has to implement the CanHandleInference interface; extending a bundled driver, as in the example below, already satisfies it.
Example
<?php
require 'examples/boot.php';

use Cognesy\Config\Env;
use Cognesy\Http\Contracts\HttpResponse;
use Cognesy\Polyglot\Inference\Config\LLMConfig;
use Cognesy\Polyglot\Inference\Data\InferenceRequest;
use Cognesy\Polyglot\Inference\Drivers\OpenAI\OpenAIDriver;
use Cognesy\Polyglot\Inference\Inference;
use Cognesy\Utils\Str;

// We use an existing, bundled driver as an example, but you can provide any class
// that implements the required interface (CanHandleInference).
Inference::registerDriver(
    name: 'custom-driver',
    driver: fn($config, $httpClient, $events) => new class($config, $httpClient, $events) extends OpenAIDriver {
        public function handle(InferenceRequest $request): HttpResponse {
            // extra output to demonstrate that our driver is being used
            echo ">>> Handling request...\n";
            return parent::handle($request);
        }
    }
);

// Create an instance of the LLM client initialized with custom parameters,
// pointing at the driver registered above.
$config = new LLMConfig(
    apiUrl: 'https://api.openai.com/v1',
    apiKey: Env::get('OPENAI_API_KEY'),
    endpoint: '/chat/completions',
    defaultModel: 'gpt-4o-mini',
    defaultMaxTokens: 128,
    httpClientPreset: 'guzzle',
    driver: 'custom-driver',
);

$answer = (new Inference)
    ->withConfig($config)
    ->withMessages([['role' => 'user', 'content' => 'What is the capital of France']])
    ->withOptions(['max_tokens' => 64])
    ->get();

echo "USER: What is the capital of France\n";
echo "ASSISTANT: $answer\n";
assert(Str::contains($answer, 'Paris'));
?>
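The example above registers the driver under a new name. As the overview notes, you can also override a bundled driver by re-registering an existing name. Below is a minimal sketch of that variant; the name 'openai' is an assumption about how the bundled OpenAI driver is registered, so verify the actual driver names in your Polyglot version before relying on it.

// Hedged sketch: re-register the name assumed to be used by the bundled OpenAI
// driver, so every request routed to it passes through our wrapper. The name
// 'openai' is an assumption - check the registered driver names in your setup.
Inference::registerDriver(
    name: 'openai',
    driver: fn($config, $httpClient, $events) => new class($config, $httpClient, $events) extends OpenAIDriver {
        public function handle(InferenceRequest $request): HttpResponse {
            // add cross-cutting behavior (logging, metrics, request mutation) here
            echo ">>> [openai override] handling request...\n";
            return parent::handle($request);
        }
    }
);

Overriding an existing name is useful when you want to inject cross-cutting behavior without changing any LLMConfig or connection preset that already points at that driver.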