feat: add Fabric patterns support #365

Open · wants to merge 10 commits into `main`
76 changes: 76 additions & 0 deletions README.md
@@ -769,6 +769,82 @@ final class MyProcessor implements OutputProcessorInterface, ChainAwareInterface
}
```

### Fabric Patterns

LLM Chain supports [Fabric](https://github.com/danielmiessler/fabric), a popular open-source collection of AI system prompts. Fabric patterns provide pre-built, community-tested prompts for common tasks such as summarization, analysis, and content creation.

> [!NOTE]
> Using Fabric patterns requires the `php-llm/fabric-pattern` package to be installed separately.

#### Installation

```bash
composer require php-llm/fabric-pattern
```

#### Usage with Message Factory

The simplest way to use Fabric patterns is through the `Message::fabric()` factory method:

```php
use PhpLlm\LlmChain\Platform\Message\Message;
use PhpLlm\LlmChain\Platform\Message\MessageBag;

$messages = new MessageBag(
    Message::fabric('create_summary'),
    Message::ofUser($articleContent)
);

$response = $chain->call($messages);
```

#### Usage with Input Processor

For more flexibility, you can use the `FabricInputProcessor` to dynamically load patterns:

```php
use PhpLlm\LlmChain\Chain\Chain;
use PhpLlm\LlmChain\Platform\Fabric\FabricInputProcessor;
use PhpLlm\LlmChain\Platform\Message\Message;
use PhpLlm\LlmChain\Platform\Message\MessageBag;

// Initialize Platform and LLM

$processor = new FabricInputProcessor();
$chain = new Chain($platform, $model, [$processor]);

$messages = new MessageBag(
    Message::ofUser('Analyze this article for potential security issues: ...')
);

// Use any Fabric pattern via options
$response = $chain->call($messages, ['fabric_pattern' => 'analyze_threat_report']);
```

#### Custom Pattern Locations

If you have your own collection of patterns or want to use a local copy:

```php
use PhpLlm\LlmChain\Platform\Fabric\FabricRepository;

// Use custom pattern directory
$repository = new FabricRepository('/path/to/custom/patterns');
$processor = new FabricInputProcessor($repository);

// Or with the factory method
$message = Message::fabric('my_custom_pattern', '/path/to/custom/patterns');
```

#### Available Patterns

Some popular Fabric patterns include:
- `create_summary` - Create a comprehensive summary
- `analyze_claims` - Analyze and fact-check claims
- `extract_wisdom` - Extract key insights and wisdom
- `improve_writing` - Improve writing quality and clarity
- `create_quiz` - Generate quiz questions from content

For a full list of available patterns, visit the [Fabric patterns directory](https://github.com/danielmiessler/fabric/tree/main/patterns).
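Patterns can also be loaded directly, for example to inspect a prompt before sending anything to a model. A minimal sketch based on the classes in this PR (`FabricRepository::load()` returns a `FabricPromptInterface`; a `PatternNotFoundException` is thrown for unknown pattern names):

```php
<?php

use PhpLlm\LlmChain\Platform\Fabric\FabricRepository;

// Load a pattern and inspect its system prompt content
$repository = new FabricRepository();
$prompt = $repository->load('create_summary');

echo $prompt->getPattern().\PHP_EOL; // the pattern name
echo $prompt->getContent().\PHP_EOL; // the raw system prompt text
```

This can be useful for debugging which prompt text a pattern name actually resolves to before wiring it into a chain.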

## HuggingFace

LLM Chain comes out of the box with an integration for [HuggingFace](https://huggingface.co/) which is a platform for
2 changes: 2 additions & 0 deletions composer.json
@@ -46,6 +46,7 @@
"mrmysql/youtube-transcript": "^v0.0.5",
"php-cs-fixer/shim": "^3.70",
"php-http/discovery": "^1.20",
"php-llm/fabric-pattern": "^0.1",
"phpstan/phpstan": "^2.0",
"phpstan/phpstan-symfony": "^2.0",
"phpstan/phpstan-webmozart-assert": "^2.0",
@@ -70,6 +71,7 @@
"doctrine/dbal": "For using MariaDB via Doctrine as retrieval vector store",
"mongodb/mongodb": "For using MongoDB Atlas as retrieval vector store.",
"mrmysql/youtube-transcript": "For using the YouTube transcription tool.",
"php-llm/fabric-pattern": "For using Fabric patterns - a collection of pre-built system prompts.",
"probots-io/pinecone-php": "For using the Pinecone as retrieval vector store.",
"symfony/dom-crawler": "For using the Crawler tool."
},
55 changes: 55 additions & 0 deletions examples/fabric/summarize.php
@@ -0,0 +1,55 @@
<?php

declare(strict_types=1);

use PhpLlm\LlmChain\Chain\Chain;
use PhpLlm\LlmChain\Platform\Bridge\OpenAI\GPT;
use PhpLlm\LlmChain\Platform\Bridge\OpenAI\PlatformFactory;
use PhpLlm\LlmChain\Platform\Message\Message;
use PhpLlm\LlmChain\Platform\Message\MessageBag;

require_once dirname(__DIR__).'/../vendor/autoload.php';

if (empty($_ENV['OPENAI_API_KEY'])) {
    echo 'Please set the OPENAI_API_KEY environment variable.'.\PHP_EOL;
    exit(1);
}

// Check if Fabric patterns package is installed
if (!is_dir(dirname(__DIR__, 2).'/vendor/php-llm/fabric-pattern')) {
    echo 'Fabric patterns are not installed.'.\PHP_EOL;
    echo 'Please install them with: composer require php-llm/fabric-pattern'.\PHP_EOL;
    exit(1);
}

// Initialize platform and model
$platform = PlatformFactory::create($_ENV['OPENAI_API_KEY']);
$model = new GPT(GPT::GPT_4O_MINI);
$chain = new Chain($platform, $model);

// Example article to summarize
$article = <<<'ARTICLE'
The field of artificial intelligence has undergone dramatic transformations in recent years,
with large language models (LLMs) emerging as one of the most significant breakthroughs.
These models, trained on vast amounts of text data, have demonstrated remarkable capabilities
in understanding and generating human-like text. The implications for software development,
content creation, and human-computer interaction are profound.

However, with these advances come important considerations regarding ethics, bias, and the
responsible deployment of AI systems. Researchers and practitioners must work together to
ensure that these powerful tools are used in ways that benefit society while minimizing
potential harms.
ARTICLE;

// Create messages using Fabric pattern
$messages = new MessageBag(
    Message::fabric('create_summary'),
    Message::ofUser($article)
);

// Call the chain
$response = $chain->call($messages);

echo 'Summary using Fabric pattern "create_summary":'.\PHP_EOL;
echo '=============================================='.\PHP_EOL;
echo $response->getContent().\PHP_EOL;
56 changes: 56 additions & 0 deletions examples/fabric/with-processor.php
@@ -0,0 +1,56 @@
<?php

declare(strict_types=1);

use PhpLlm\LlmChain\Chain\Chain;
use PhpLlm\LlmChain\Platform\Bridge\OpenAI\GPT;
use PhpLlm\LlmChain\Platform\Bridge\OpenAI\PlatformFactory;
use PhpLlm\LlmChain\Platform\Fabric\FabricInputProcessor;
use PhpLlm\LlmChain\Platform\Message\Message;
use PhpLlm\LlmChain\Platform\Message\MessageBag;

require_once dirname(__DIR__).'/../vendor/autoload.php';

if (empty($_ENV['OPENAI_API_KEY'])) {
    echo 'Please set the OPENAI_API_KEY environment variable.'.\PHP_EOL;
    exit(1);
}

// Check if Fabric patterns package is installed
if (!is_dir(dirname(__DIR__, 2).'/vendor/php-llm/fabric-pattern')) {
    echo 'Fabric patterns are not installed.'.\PHP_EOL;
    echo 'Please install them with: composer require php-llm/fabric-pattern'.\PHP_EOL;
    exit(1);
}

// Initialize platform and model
$platform = PlatformFactory::create($_ENV['OPENAI_API_KEY']);
$model = new GPT(GPT::GPT_4O_MINI);

// Create chain with Fabric processor
$processor = new FabricInputProcessor();
$chain = new Chain($platform, $model, [$processor]);

// Example code to analyze
$code = <<<'CODE'
function processUserData($data) {
    $sql = "SELECT * FROM users WHERE id = " . $data['id'];
    $result = mysql_query($sql);

    while ($row = mysql_fetch_array($result)) {
        echo $row['name'] . " - " . $row['email'];
    }
}
CODE;

// Create messages
$messages = new MessageBag(
    Message::ofUser("Analyze this PHP code for security issues:\n\n".$code)
);

// Call with Fabric pattern
$response = $chain->call($messages, ['fabric_pattern' => 'analyze_code']);

echo 'Code Analysis using Fabric pattern "analyze_code":'.\PHP_EOL;
echo '=================================================='.\PHP_EOL;
echo $response->getContent().\PHP_EOL;
12 changes: 12 additions & 0 deletions src/Platform/Fabric/Exception/PatternNotFoundException.php
@@ -0,0 +1,12 @@
<?php

declare(strict_types=1);

namespace PhpLlm\LlmChain\Platform\Fabric\Exception;

/**
 * Exception thrown when a Fabric pattern is not found.
 */
final class PatternNotFoundException extends \RuntimeException
{
}
50 changes: 50 additions & 0 deletions src/Platform/Fabric/FabricInputProcessor.php
@@ -0,0 +1,50 @@
<?php

declare(strict_types=1);

namespace PhpLlm\LlmChain\Platform\Fabric;

use PhpLlm\LlmChain\Chain\Input;
use PhpLlm\LlmChain\Chain\InputProcessorInterface;
use PhpLlm\LlmChain\Platform\Message\SystemMessage;

/**
 * Input processor for Fabric patterns.
 *
 * Requires the "php-llm/fabric-pattern" package to be installed.
 *
 * This processor allows adding Fabric patterns through options:
 * - fabric_pattern: string - the pattern name to load
 */
final readonly class FabricInputProcessor implements InputProcessorInterface
{
    public function __construct(
        private FabricRepository $repository = new FabricRepository(),
    ) {
    }

    public function processInput(Input $input): void
    {
        $options = $input->getOptions();

        if (!\array_key_exists('fabric_pattern', $options)) {
            return;
        }

        $pattern = $options['fabric_pattern'];
        if (!\is_string($pattern)) {
            throw new \InvalidArgumentException('The "fabric_pattern" option must be a string');
        }

        // Load the pattern content
        $fabricPrompt = $this->repository->load($pattern);
        $systemMessage = new SystemMessage($fabricPrompt->getContent());

        // Prepend the system message
        $input->messages = $input->messages->prepend($systemMessage);
> **Review comment (Contributor):** What happens if the MessageBag already contains a SystemMessage? Most models can handle multiple system messages, and some reportedly treat all messages the same regardless of role, but are we clear about the possible problems of adding a system message to a bag that already contains one? 🤔
>
> **Reply (Contributor Author):** Right now, as with the SystemInputProcessor, only one SystemMessage is supported by llm-chain.

        // Remove the fabric option from the chain options
        unset($options['fabric_pattern']);
        $input->setOptions($options);
    }
}
40 changes: 40 additions & 0 deletions src/Platform/Fabric/FabricPrompt.php
@@ -0,0 +1,40 @@
<?php

declare(strict_types=1);

namespace PhpLlm\LlmChain\Platform\Fabric;

/**
 * Represents a Fabric prompt pattern.
 */
final readonly class FabricPrompt implements FabricPromptInterface
{
    /**
     * @param non-empty-string     $pattern
     * @param array<string, mixed> $metadata
     */
    public function __construct(
        private string $pattern,
        private string $content,
        private array $metadata = [],
    ) {
    }

    /**
     * @return non-empty-string
     */
    public function getPattern(): string
    {
        return $this->pattern;
    }

    public function getContent(): string
    {
        return $this->content;
    }

    public function getMetadata(): array
    {
        return $this->metadata;
    }
}
30 changes: 30 additions & 0 deletions src/Platform/Fabric/FabricPromptInterface.php
@@ -0,0 +1,30 @@
<?php

declare(strict_types=1);

namespace PhpLlm\LlmChain\Platform\Fabric;

/**
 * Interface for Fabric prompt patterns.
 */
interface FabricPromptInterface
{
    /**
     * Get the pattern name (e.g., 'create_summary').
     *
     * @return non-empty-string
     */
    public function getPattern(): string;

    /**
     * Get the system prompt content.
     */
    public function getContent(): string;

    /**
     * Get metadata about the pattern.
     *
     * @return array<string, mixed>
     */
    public function getMetadata(): array;
}