# Claude for PHP Quick Reference
A concise, printable reference for working with Claude AI in PHP applications.
💡 Tip: Print this page (Ctrl/Cmd+P) or save it as a PDF for offline reference. Use your browser's "Print to PDF" option for best results.
📚 Need More Details?
- Full Series Index - Complete 40-chapter series
- Learning Roadmap - Structured learning paths
## 📋 Table of Contents
- Basic Setup
- Common API Calls
- Model Selection
- Prompt Patterns
- Tool Use
- Error Handling
- Prompt Caching
- Batch Processing
- Cost Optimization
- Vision API
- Structured Outputs
- Common Patterns
- Environment Setup
- Debugging
- Common Gotchas
- Troubleshooting
## Basic Setup

### Installation

```bash
composer require anthropic-ai/sdk
```

### Initialize Client

```php
use Anthropic\Anthropic;

$client = Anthropic::factory()
    ->withApiKey(getenv('ANTHROPIC_API_KEY'))
    ->make();
```

### Environment Variables

```
ANTHROPIC_API_KEY=sk-ant-your-key-here
ANTHROPIC_MODEL=claude-sonnet-4-20250514
ANTHROPIC_MAX_TOKENS=4096
```

## Common API Calls
### Basic Request

```php
$response = $client->messages()->create([
    'model' => 'claude-sonnet-4-20250514',
    'max_tokens' => 1024,
    'messages' => [
        ['role' => 'user', 'content' => 'Your prompt here']
    ]
]);

$text = $response->content[0]->text;
```

### With System Prompt
```php
$response = $client->messages()->create([
    'model' => 'claude-sonnet-4-20250514',
    'max_tokens' => 1024,
    'system' => 'You are a helpful PHP expert.',
    'messages' => [
        ['role' => 'user', 'content' => 'Explain dependency injection']
    ]
]);
```

### Multi-turn Conversation
```php
$messages = [
    ['role' => 'user', 'content' => 'What is Laravel?'],
    ['role' => 'assistant', 'content' => 'Laravel is a PHP framework...'],
    ['role' => 'user', 'content' => 'How do I install it?']
];

$response = $client->messages()->create([
    'model' => 'claude-sonnet-4-20250514',
    'max_tokens' => 1024,
    'messages' => $messages
]);
```

### Streaming Response
```php
$stream = $client->messages()->createStreamed([
    'model' => 'claude-sonnet-4-20250514',
    'max_tokens' => 1024,
    'messages' => [
        ['role' => 'user', 'content' => 'Tell me a story']
    ]
]);

foreach ($stream as $event) {
    if ($event->type === 'content_block_delta') {
        echo $event->delta->text;
    }
}
```

## Model Selection
| Model | Speed | Cost (Input/Output per 1M tokens) | Best For |
|---|---|---|---|
| claude-haiku-4-20250514 | Fastest | $0.25 / $1.25 | Simple tasks, high volume |
| claude-sonnet-4-20250514 | Balanced | $3.00 / $15.00 | Most use cases (recommended) |
| claude-opus-4-20250514 | Slowest | $15.00 / $75.00 | Complex reasoning, critical tasks |
### When to Use Each
Haiku (Fastest, Cheapest):
- Simple data extraction
- Classification tasks
- High-volume processing
- Speed is critical
Sonnet (Balanced):
- General-purpose tasks
- Code generation
- Content creation
- Most production use cases
Opus (Most Capable):
- Complex analysis
- Critical decisions
- Legal/medical content
- Research and reasoning
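If your app routes requests by task type, it can help to keep that choice in one place. A minimal sketch, assuming the task labels and the `pickModel()` helper are your own conventions (they are not part of the SDK):

```php
/**
 * Map a task category to a model ID.
 * Illustrative helper - the categories and defaults are assumptions to adapt.
 */
function pickModel(string $task): string
{
    return match ($task) {
        'classification', 'extraction' => 'claude-haiku-4-20250514',  // simple, high volume
        'analysis', 'legal', 'research' => 'claude-opus-4-20250514',  // complex reasoning
        default => 'claude-sonnet-4-20250514',                        // general-purpose
    };
}

// Usage: 'model' => pickModel('classification'),
```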
## Prompt Patterns

### Code Generation
```php
$prompt = "Write a PHP function that:
1. [Specific requirement]
2. [Another requirement]
3. [Edge case handling]

Requirements:
- Use PHP 8.4+ features
- Include type hints
- Add PHPDoc comments
- Handle errors gracefully";
```

### Code Review
```php
$prompt = "Review this PHP code for:
1. Security vulnerabilities
2. Performance issues
3. Best practices violations
4. Potential bugs

{$code}

Provide specific, actionable feedback.";
```
### Data Extraction (JSON)
```php
$prompt = "Extract structured data from this text and return as JSON:

{$text}

Return JSON with these fields:
- name (string)
- email (string or null)
- phone (string or null)
- company (string or null)

Return ONLY valid JSON, no explanation.";
```

### Classification
```php
$prompt = "Classify this text into ONE category:

Categories: [Bug, Feature Request, Question, Spam]

Text: {$text}

Return ONLY the category name, nothing else.";
```

### Content Moderation
```php
$prompt = "Analyze this content for policy violations:

Content: {$content}

Check for:
- Toxic language
- Personal information (PII)
- Spam
- Offensive content

Return JSON:
{
    \"is_safe\": boolean,
    \"violations\": [array of violation types],
    \"confidence\": number (0-1)
}";
```

## Tool Use
### Define Tool
```php
$tools = [[
    'name' => 'get_weather',
    'description' => 'Get current weather for a city',
    'input_schema' => [
        'type' => 'object',
        'properties' => [
            'city' => [
                'type' => 'string',
                'description' => 'City name'
            ]
        ],
        'required' => ['city']
    ]
]];
```

### Use Tool
```php
$response = $client->messages()->create([
    'model' => 'claude-sonnet-4-20250514',
    'max_tokens' => 1024,
    'tools' => $tools,
    'messages' => [
        ['role' => 'user', 'content' => 'What is the weather in Paris?']
    ]
]);

// Check if Claude wants to use a tool
if ($response->stopReason === 'tool_use') {
    // Find the tool_use block (its position in the content array can vary)
    $toolUse = null;
    foreach ($response->content as $block) {
        if ($block->type === 'tool_use') {
            $toolUse = $block;
            break;
        }
    }

    if ($toolUse !== null && $toolUse->name === 'get_weather') {
        $city = $toolUse->input['city'];
        $weather = getWeather($city); // Your function

        // Send result back to Claude
        $finalResponse = $client->messages()->create([
            'model' => 'claude-sonnet-4-20250514',
            'max_tokens' => 1024,
            'tools' => $tools,
            'messages' => [
                ['role' => 'user', 'content' => 'What is the weather in Paris?'],
                ['role' => 'assistant', 'content' => $response->content],
                [
                    'role' => 'user',
                    'content' => [[
                        'type' => 'tool_result',
                        'tool_use_id' => $toolUse->id,
                        'content' => json_encode($weather)
                    ]]
                ]
            ]
        ]);
    }
}
```

## Error Handling
### Basic Try-Catch
```php
use Anthropic\Exceptions\ErrorException;
use Anthropic\Exceptions\RateLimitException;

try {
    $response = $client->messages()->create([...]);
} catch (RateLimitException $e) {
    // Rate limit hit - wait and retry
    sleep(60);
    $response = $client->messages()->create([...]);
} catch (ErrorException $e) {
    // API error
    logger()->error('Claude API error', [
        'message' => $e->getMessage(),
        'code' => $e->getCode()
    ]);
} catch (\Exception $e) {
    // Unexpected error
    logger()->error('Unexpected error', [
        'message' => $e->getMessage()
    ]);
}
```

### Retry with Exponential Backoff
```php
function callClaudeWithRetry(callable $fn, int $maxRetries = 3): mixed
{
    $attempt = 0;

    while ($attempt < $maxRetries) {
        try {
            return $fn();
        } catch (RateLimitException $e) {
            $attempt++;

            if ($attempt >= $maxRetries) {
                throw $e;
            }

            // Exponential backoff with jitter (seconds)
            $delay = (2 ** $attempt) + rand(0, 1000) / 1000;
            usleep((int) ($delay * 1_000_000)); // usleep handles the fractional delay
        }
    }
}

$response = callClaudeWithRetry(fn() =>
    $client->messages()->create([...])
);
```

## Prompt Caching
### Use Prompt Caching (5-minute cache)
```php
$response = $client->messages()->create([
    'model' => 'claude-sonnet-4-20250514',
    'max_tokens' => 1024,
    'system' => [
        [
            'type' => 'text',
            'text' => 'You are a helpful assistant.',
            'cache_control' => ['type' => 'ephemeral'] // 5-minute cache is the default TTL
        ]
    ],
    'messages' => [
        ['role' => 'user', 'content' => 'Hello!']
    ]
]);
```

### Use Prompt Caching (1-hour cache)
```php
'system' => [
    [
        'type' => 'text',
        'text' => 'Long documentation that repeats...',
        'cache_control' => ['type' => 'ephemeral', 'ttl' => '1h'] // 1-hour cache
    ]
]
```

## Batch Processing
### Create Batch Request
```php
// Batch processing saves 50% on costs for async workloads
$batch = $client->batches()->create([
    'requests' => [
        [
            'custom_id' => 'request-1',
            'params' => ['model' => 'claude-sonnet-4-20250514', 'max_tokens' => 1024, 'messages' => [...]],
        ],
        [
            'custom_id' => 'request-2',
            'params' => ['model' => 'claude-sonnet-4-20250514', 'max_tokens' => 1024, 'messages' => [...]],
        ],
        // ... more requests, up to the batch size limit
    ]
]);

// Check status
$status = $client->batches()->retrieve($batch->id);

// Get results when complete
if ($status->status === 'completed') {
    foreach ($status->results as $result) {
        // Process each result
    }
}
```

## Cost Optimization
### Calculate Cost
```php
function calculateCost(object $response, string $model): float
{
    $pricing = [
        'claude-haiku-4-20250514'  => ['input' => 0.25,  'output' => 1.25],
        'claude-sonnet-4-20250514' => ['input' => 3.00,  'output' => 15.00],
        'claude-opus-4-20250514'   => ['input' => 15.00, 'output' => 75.00],
    ];

    $inputCost = ($response->usage->inputTokens / 1_000_000)
        * $pricing[$model]['input'];

    $outputCost = ($response->usage->outputTokens / 1_000_000)
        * $pricing[$model]['output'];

    return $inputCost + $outputCost;
}

$cost = calculateCost($response, 'claude-sonnet-4-20250514');
echo "Cost: $" . number_format($cost, 6);
```

### Cost-Saving Tips
**Use Haiku for simple tasks**

```php
// For classification, simple extraction
'model' => 'claude-haiku-4-20250514'
```

**Compress prompts**

```php
// Instead of verbose prompts, use abbreviations
$prompt = "Classify: [categories]\nText: {$text}\nReturn: category only";
```

**Cache responses**

```php
$cacheKey = 'claude:' . md5($prompt);

if ($cached = cache()->get($cacheKey)) {
    return $cached;
}

$response = $client->messages()->create([...]);
cache()->put($cacheKey, $response, 3600);
```

**Limit max_tokens**

```php
'max_tokens' => 500 // Instead of 4096 when you only need short responses
```

**Use batch processing API**

```php
// Use Anthropic's batch API for 50% cost savings on async workloads
$batch = $client->batches()->create(['requests' => [...]]);
```

**Use prompt caching**

```php
// Cache repeated system prompts (5-minute default, or 'ttl' => '1h' for longer)
'cache_control' => ['type' => 'ephemeral']
```
## Vision API (Images)

### Analyze Image
```php
$response = $client->messages()->create([
    'model' => 'claude-sonnet-4-20250514',
    'max_tokens' => 1024,
    'messages' => [
        [
            'role' => 'user',
            'content' => [
                [
                    'type' => 'image',
                    'source' => [
                        'type' => 'base64',
                        'media_type' => 'image/jpeg',
                        'data' => base64_encode(file_get_contents('image.jpg'))
                    ]
                ],
                ['type' => 'text', 'text' => 'What is in this image?']
            ]
        ]
    ]
]);
```

### Extract Text from Image
```php
$prompt = 'Extract all text from this image and return as structured JSON.';
// Use the same image message format as above
```

## Structured Outputs
### With JSON Schema
```php
$response = $client->messages()->create([
    'model' => 'claude-sonnet-4-20250514',
    'max_tokens' => 1024,
    'messages' => [
        ['role' => 'user', 'content' => 'Extract user data from: John Doe, john@example.com']
    ],
    'response_format' => [
        'type' => 'json_schema',
        'json_schema' => [
            'name' => 'user_data',
            'strict' => true,
            'schema' => [
                'type' => 'object',
                'properties' => [
                    'name' => ['type' => 'string'],
                    'email' => ['type' => 'string']
                ],
                'required' => ['name', 'email']
            ]
        ]
    ]
]);

$data = json_decode($response->content[0]->text, true);
```

## Common Patterns
### Laravel Service
```php
namespace App\Services;

use Anthropic\Anthropic;

class ClaudeService
{
    public function __construct(
        private readonly Anthropic $client
    ) {}

    public function generate(string $prompt, array $options = []): string
    {
        $response = $this->client->messages()->create([
            'model' => $options['model'] ?? config('claude.model'),
            'max_tokens' => $options['max_tokens'] ?? 1024,
            'messages' => [
                ['role' => 'user', 'content' => $prompt]
            ]
        ]);

        return $response->content[0]->text;
    }
}
```
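The service reads `config('claude.model')`, which is not shown elsewhere in this reference. A minimal sketch of a matching `config/claude.php`, assuming this key layout fits your app:

```php
<?php

// config/claude.php - assumed layout to back the ClaudeService above
return [
    'model' => env('ANTHROPIC_MODEL', 'claude-sonnet-4-20250514'),
    'max_tokens' => (int) env('ANTHROPIC_MAX_TOKENS', 1024),
    'api_key' => env('ANTHROPIC_API_KEY'),
];
```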
### Conversation Manager

```php
class ConversationManager
{
    private array $messages = [];

    public function addUserMessage(string $content): self
    {
        $this->messages[] = ['role' => 'user', 'content' => $content];
        return $this;
    }

    public function addAssistantMessage(string $content): self
    {
        $this->messages[] = ['role' => 'assistant', 'content' => $content];
        return $this;
    }

    public function getMessages(): array
    {
        return $this->messages;
    }

    public function trimToTokenLimit(int $limit = 100000): self
    {
        // Keep last N messages that fit within token limit
        // Simplified - you'd need actual token counting
        while (count($this->messages) > 10) {
            array_shift($this->messages);
        }

        return $this;
    }
}
```
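One turn of a chat loop using the manager above might look like this (a sketch; it reuses the `$client` from Basic Setup and omits error handling):

```php
$conversation = new ConversationManager();
$conversation->addUserMessage('What is Laravel?');

$response = $client->messages()->create([
    'model' => 'claude-sonnet-4-20250514',
    'max_tokens' => 1024,
    'messages' => $conversation->getMessages()
]);

// Record the reply so the next turn carries the full context
$conversation->addAssistantMessage($response->content[0]->text)
    ->addUserMessage('How do I install it?')
    ->trimToTokenLimit();
```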
### Structured Output Parser

```php
function extractJSON(string $response): ?array
{
    // Try to extract JSON from the response (may be wrapped in a markdown code fence)
    if (preg_match('/```json\s*(\{.*?\})\s*```/s', $response, $matches)) {
        return json_decode($matches[1], true);
    }

    // Fall back to the outermost {...} block (greedy, so nested objects are kept)
    if (preg_match('/(\{.*\})/s', $response, $matches)) {
        return json_decode($matches[1], true);
    }

    return null;
}
```

## Environment Setup
### .env Example
```
ANTHROPIC_API_KEY=sk-ant-api03-xxx
ANTHROPIC_MODEL=claude-sonnet-4-20250514
ANTHROPIC_MAX_TOKENS=4096
ANTHROPIC_TEMPERATURE=1.0

# Optional
CLAUDE_CACHE_TTL=3600
CLAUDE_RETRY_ATTEMPTS=3
CLAUDE_TIMEOUT=120
```

### composer.json
```json
{
    "require": {
        "php": "^8.4",
        "anthropic-ai/sdk": "^1.0",
        "vlucas/phpdotenv": "^5.5",
        "predis/predis": "^2.0"
    }
}
```
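composer.json includes `vlucas/phpdotenv`, so a small bootstrap can load the `.env` values before the client is built. A minimal sketch, assuming the `.env` file sits next to `vendor/`:

```php
<?php

use Anthropic\Anthropic;

// bootstrap.php (assumed filename) - register the autoloader, load .env, build the client
require __DIR__ . '/vendor/autoload.php';

$dotenv = Dotenv\Dotenv::createImmutable(__DIR__);
$dotenv->load();

$client = Anthropic::factory()
    ->withApiKey($_ENV['ANTHROPIC_API_KEY'])
    ->make();
```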
## Debugging

### Enable Logging
```php
use Monolog\Logger;
use Monolog\Handler\StreamHandler;

$log = new Logger('claude');
$log->pushHandler(new StreamHandler('claude.log', Logger::DEBUG));

// Log request
$log->debug('Claude request', ['prompt' => $prompt]);

// Log response
$log->debug('Claude response', [
    'text' => $response->content[0]->text,
    'tokens' => $response->usage
]);
```

### Inspect Response
```php
var_dump([
    'id' => $response->id,
    'model' => $response->model,
    'role' => $response->role,
    'content' => $response->content,
    'stop_reason' => $response->stopReason,
    'usage' => [
        'input_tokens' => $response->usage->inputTokens,
        'output_tokens' => $response->usage->outputTokens
    ]
]);
```

## Common Gotchas
### Response Structure
```php
// ✅ CORRECT - Access text content
$text = $response->content[0]->text;

// ❌ WRONG - content is an array, not a string
$text = $response->content; // This is an array!
```

### Tool Use Response Handling
```php
// ✅ CORRECT - Check stop_reason first
if ($response->stopReason === 'tool_use') {
    // Find tool_use block in content array
    foreach ($response->content as $block) {
        if ($block->type === 'tool_use') {
            // Handle tool call
        }
    }
}
```

### Token Counting
```php
// ✅ CORRECT - Use usage object
$inputTokens = $response->usage->inputTokens;
$outputTokens = $response->usage->outputTokens;

// ❌ WRONG - Don't estimate manually
$estimatedTokens = strlen($prompt) / 4; // Inaccurate!
```

### Error Handling
```php
// ✅ CORRECT - Catch specific exceptions
use Anthropic\Exceptions\RateLimitException;
use Anthropic\Exceptions\ErrorException;

try {
    $response = $client->messages()->create([...]);
} catch (RateLimitException $e) {
    // Handle rate limit specifically
} catch (ErrorException $e) {
    // Handle API errors
}
```

## Troubleshooting
**Issue: "Class 'Anthropic\Anthropic' not found"**
Solution: Run `composer require anthropic-ai/sdk` and ensure `vendor/autoload.php` is included.

**Issue: "Invalid API key"**
Solution: Verify your API key starts with `sk-ant-` and is set in environment variables.

**Issue: "Rate limit exceeded"**
Solution: Implement exponential backoff (see the Error Handling section) or upgrade your API tier.

**Issue: "Content is empty"**
Solution: Check that `$response->content` is an array. Access the text with `$response->content[0]->text`.

**Issue: "Tool use not working"**
Solution: Ensure the `tools` array is included in the request and `stopReason` is checked before accessing content.
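When the fixes above don't reveal the problem, a quick end-to-end smoke test narrows it down. A minimal sketch that reuses the setup shown in this reference (the filename and the cheap test model are assumptions):

```php
<?php

use Anthropic\Anthropic;

// smoke-test.php (assumed filename) - verify key, autoloading, and a tiny live request
require __DIR__ . '/vendor/autoload.php';

$apiKey = getenv('ANTHROPIC_API_KEY');

if ($apiKey === false || !str_starts_with($apiKey, 'sk-ant-')) {
    exit("ANTHROPIC_API_KEY is missing or does not look like an Anthropic key.\n");
}

try {
    $client = Anthropic::factory()->withApiKey($apiKey)->make();

    $response = $client->messages()->create([
        'model' => 'claude-haiku-4-20250514', // cheapest model for a throwaway request
        'max_tokens' => 16,
        'messages' => [
            ['role' => 'user', 'content' => 'Reply with the single word: ok']
        ]
    ]);

    echo "API reachable: " . $response->content[0]->text . "\n";
} catch (\Throwable $e) {
    echo "Request failed: " . $e->getMessage() . "\n";
}
```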
## Links
- Full Series: Claude for PHP Developers
- Learning Roadmap: Choose Your Path
- API Docs: Anthropic Documentation
- SDK: Anthropic PHP SDK
- Console: console.anthropic.com
Print this reference and keep it handy while developing! 📄
Last Updated: 2025