Introduction
Prompt Deck provides first-class, optional integration with the Laravel AI SDK. When the AI SDK is installed, you get:

- Automatic prompt scaffolding — running `make:agent` automatically creates a matching prompt directory.
- The `HasPromptTemplate` trait — provides `instructions()` and `promptMessages()` methods that load versioned prompts directly into your AI agents.
- The `TrackPromptMiddleware` — automatically records prompt executions (tokens, latency, model, etc.) using Prompt Deck's tracking system.
Installation
Prompt Deck does not require the AI SDK — it's listed as a `suggest` dependency. Install it when you're ready with `composer require laravel/ai`.
Once `laravel/ai` is installed, Prompt Deck's AI SDK features activate automatically. No additional configuration is needed.
Automatic prompt scaffolding
When the Laravel AI SDK is installed, Prompt Deck automatically hooks into the `make:agent` command. Whenever you create a new agent, a matching prompt directory is scaffolded and ready to use with the `HasPromptTemplate` trait — zero extra setup required.
How it works
Prompt Deck registers a listener (`AfterMakeAgent`) on Laravel's `CommandFinished` event. When `make:agent` completes successfully, the listener:

- Extracts the agent name from the command input.
- Converts it to kebab-case (`SalesCoach` → `sales-coach`).
- Strips any namespace prefix (`App\Ai\Agents\SalesCoach` → `sales-coach`).
- Checks whether the prompt already exists (skips if it does).
- Runs `make:prompt` with the derived name.
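Under those rules, the name derivation can be sketched in plain PHP (the function name is illustrative, not part of the package's API):

```php
<?php

// Hypothetical sketch of how the listener derives the prompt name from
// the agent class name given on the command line.
function derivePromptName(string $agent): string
{
    // Strip any namespace prefix: App\Ai\Agents\SalesCoach → SalesCoach
    $base = basename(str_replace('\\', '/', $agent));

    // Convert StudlyCase to kebab-case: SalesCoach → sales-coach
    return strtolower(preg_replace('/(?<!^)[A-Z]/', '-$0', $base));
}

echo derivePromptName('App\Ai\Agents\SalesCoach'); // sales-coach
```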
The listener only runs when `laravel/ai` is installed. If prompt creation fails for any reason, it does not break the `make:agent` workflow — the agent is still created successfully.
Example output
Disabling auto-scaffolding
To disable automatic prompt scaffolding, turn off the corresponding option in Prompt Deck's configuration file.

Quick start
Use the `HasPromptTemplate` trait on any agent class:
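A minimal sketch of such an agent (the trait's import path and the agent base class are assumptions, not the package's documented namespaces):

```php
<?php

namespace App\Ai\Agents;

use PromptDeck\Concerns\HasPromptTemplate; // namespace illustrative

class SalesCoach // extends your AI SDK agent base class
{
    use HasPromptTemplate;

    // instructions() and promptMessages() now come from the trait,
    // loading prompts/sales-coach/ from your Prompt Deck files.
}
```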
The `HasPromptTemplate` trait provides the `instructions()` method required by the `Agent` contract, loading the system prompt from your Prompt Deck files.
The HasPromptTemplate trait
The `HasPromptTemplate` trait bridges Prompt Deck's file-based templates with the Laravel AI SDK's agent contracts.
How it maps to AI SDK contracts
| Prompt Deck | AI SDK | Description |
|---|---|---|
| `system.md` role file | `instructions()` | Agent's system prompt. |
| `user.md`, `assistant.md`, etc. | `messages()` via `promptMessages()` | Conversation context. |
| `metadata.json` | — | Prompt metadata (description, variables, etc.). |
| `v1/`, `v2/`, etc. | — | Version management. |
Mapping diagram
Customising the prompt
Prompt name
By default, the prompt name is derived from the class name in kebab-case:

- `SalesCoach` → `sales-coach`
- `DocumentAnalyzer` → `document-analyzer`
Override `promptName()` to use a custom name:
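For example, inside the agent class (a sketch; the chosen name is illustrative):

```php
// Load prompts/coaching-prompt/ instead of the derived sales-coach name.
public function promptName(): string
{
    return 'coaching-prompt'; // illustrative custom name
}
```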
Pinning a version
By default, the active version is loaded. Pin to a specific version by overriding `promptVersion()`:
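A sketch of the override (whether the version is an int or a string like `'v2'` is an assumption; adjust to the actual signature):

```php
// Always load v2 of this agent's prompt, regardless of the active version.
public function promptVersion(): ?int
{
    return 2;
}
```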
Return `null` (the default) to always load the active version — useful for A/B testing and gradual rollouts.
Variable interpolation
Pass dynamic values into your prompt templates by overriding `promptVariables()`:
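A sketch of the override (the variable names and values are illustrative):

```php
// Values made available to the placeholders in the template files.
public function promptVariables(): array
{
    return [
        'name'  => 'Ada',
        'topic' => 'closing deals',
    ];
}
```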
These values fill the placeholders in `system.md` whenever the template is rendered via `instructions()` or `promptMessages()`.
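The rendering step can be sketched in plain PHP. Note that the `{{name}}` placeholder syntax here is an assumption, not confirmed by Prompt Deck's docs:

```php
<?php

// Illustrative sketch of placeholder interpolation. Unknown placeholders
// are left untouched rather than replaced with empty strings.
function interpolate(string $template, array $variables): string
{
    return preg_replace_callback(
        '/\{\{\s*(\w+)\s*\}\}/',
        fn (array $m) => (string) ($variables[$m[1]] ?? $m[0]),
        $template
    );
}

echo interpolate('You are coaching {{name}} on {{topic}}.', [
    'name'  => 'Ada',
    'topic' => 'closing deals',
]);
// You are coaching Ada on closing deals.
```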
Full agent example
Here’s a complete agent using all Prompt Deck features with the AI SDK, alongside its template at `prompts/sales-coach/v1/system.md`:
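A sketch combining the customisation hooks documented above (the namespaces are assumptions, and the AI SDK base class is omitted because its import path depends on your SDK version):

```php
<?php

namespace App\Ai\Agents; // namespace illustrative

use PromptDeck\Concerns\HasPromptTemplate; // trait path is an assumption

class SalesCoach
{
    use HasPromptTemplate;

    public function promptName(): string
    {
        return 'sales-coach';
    }

    public function promptVersion(): ?int
    {
        return 2; // pin to v2; null follows the active version
    }

    public function promptVariables(): array
    {
        return [
            'region' => 'EMEA', // illustrative
        ];
    }
}
```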
Conversation context
If your agent implements `Conversational`, you can load pre-defined conversation context from Prompt Deck role files using the `promptMessages()` method.
Loading all non-system roles
By default, `promptMessages()` returns all roles except `system` (which goes through `instructions()`):
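Inside a `Conversational` agent, this could look like the following sketch (the `messages()` method name follows the mapping table above):

```php
// Seed the conversation with every non-system role file
// (user.md, assistant.md, ...).
public function messages(): array
{
    return $this->promptMessages();
}
```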
Limiting to specific roles
Pass an array of role names to limit which roles are included:

Merging with database history
Combine template messages with conversation history from your database:

Performance tracking middleware
The `TrackPromptMiddleware` automatically records prompt executions via Prompt Deck's tracking system.
Setting up the middleware
What gets tracked
The middleware automatically records the following fields to the `prompt_executions` table:
| Field | Source |
|---|---|
| `prompt_name` | Agent’s `promptName()` method. |
| `prompt_version` | Resolved template version number. |
| `input` | The user’s prompt text from the `AgentPrompt`. |
| `output` | The AI response text. |
| `tokens` | Total token usage from the response. |
| `latency_ms` | Round-trip time in milliseconds (measured via `hrtime()`). |
| `model` | Model used (e.g. `gpt-4o`, `claude-3-sonnet`). |
| `provider` | Provider name (e.g. `openai`, `anthropic`). |
How it works internally
The middleware:

- Records the start time before the request using `hrtime(true)`.
- Passes the prompt to the next middleware in the pipeline.
- Uses the response’s `then()` hook to record execution data after the response completes.
- Calls `PromptManager::track()` with the collected data.
Tracking only applies to agents using the `HasPromptTemplate` trait. If the agent doesn’t have a `promptName()` method, tracking is silently skipped.
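The latency measurement described above can be sketched in plain PHP: `hrtime(true)` returns nanoseconds, which are converted to whole milliseconds (the function name here is illustrative):

```php
<?php

// Sketch of the middleware's latency measurement around the model call.
function measureLatencyMs(callable $next): array
{
    $start = hrtime(true);
    $response = $next(); // the AI round-trip
    $latencyMs = intdiv(hrtime(true) - $start, 1_000_000);

    return [$response, $latencyMs];
}

[$response, $ms] = measureLatencyMs(function () {
    usleep(5_000); // stand-in for the model round-trip (~5 ms)
    return 'ok';
});
```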
Accessing the template directly
You can access the full `PromptTemplate` object from within your agent for advanced use cases:
The template is cached after the first load, so repeated calls to `promptTemplate()` don’t incur additional filesystem or cache lookups.
Clearing the cached template
Clear the cached template to force a fresh load on next access. This is useful for:

- Long-running processes (queue workers, daemons) where prompts might change between jobs.
- Tests where you switch prompt versions between assertions.
The method returns `$this` for fluent chaining:
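The cache-and-clear pattern can be demonstrated with a self-contained stand-in (class and method names are illustrative, not Prompt Deck's actual API):

```php
<?php

// Stand-in sketch of template caching with a fluent clear method.
class CachesTemplate
{
    private ?string $cached = null;
    public int $loads = 0;

    public function template(): string
    {
        // Load once, then serve the cached copy on repeated calls.
        return $this->cached ??= $this->loadFromDisk();
    }

    public function clearTemplate(): static
    {
        $this->cached = null;

        return $this; // fluent chaining
    }

    private function loadFromDisk(): string
    {
        $this->loads++;

        return 'system prompt, load #' . $this->loads;
    }
}

$agent = new CachesTemplate();
$agent->template();
$agent->template();                  // served from the cache
$agent->clearTemplate()->template(); // fluent clear forces a fresh load
```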
Without the AI SDK
The `HasPromptTemplate` trait works even without `laravel/ai` installed. The `instructions()` method simply returns a string, and `promptMessages()` falls back to returning raw arrays instead of AI SDK `Message` objects:
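As a sketch, the fallback value might look like plain role/content pairs (the exact array shape is an assumption):

```php
// Without laravel/ai installed, hypothetical shape of the raw arrays:
$messages = $this->promptMessages();
// e.g.
// [
//     ['role' => 'user',      'content' => '...'],
//     ['role' => 'assistant', 'content' => '...'],
// ]
```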