
Commit e2c8ca2

Merge pull request #297 from FullStackWithLawrence/next
refactor: rename modules to reflect plugin model
2 parents 7a5a042 + 99d8149 commit e2c8ca2

7 files changed: +39 additions, −27 deletions

README.md

Lines changed: 8 additions & 2 deletions
@@ -39,9 +39,15 @@ make run # run the web app locally in your dev environment
 
 ## Features
 
-- **Prompting**: Uses [Terraform templates](./api/terraform/apigateway_endpoints.tf) to create 30 different ChatBots, each with its own customized UX and api endpoint.
+- **Complete OpenAI API**: Deploys a production-ready API for integrating with OpenAI's complete suite of services, including ChatGPT, DALL·E, Whisper, and TTS.
 
-- **Function Calling**: Uses [Yaml templates](./api/terraform/python/openai_api/lambda_openai_function/config/) stored locally or in an AWS S3 bucket to easily configure highly customized ChatGPT prompting behavior that uses both dynamic prompting as well as [OpenAI Python Function Calling](https://platform.openai.com/docs/guides/function-calling) to integrate your own custom Python functions into chat response processing. Refer to the [Python source code](./api/terraform/python/openai_api/lambda_openai_function/) for additional documentation and examples including the fully implemented "[get_current_weather()](./api/terraform/python/openai_api/lambda_openai_function/function_weather.py)" from The official [OpenAI API documentation](https://platform.openai.com/docs/guides/function-calling/common-use-cases), and, a much more interesting example, [get_additional_info()](./api/terraform/python/openai_api/lambda_openai_function/function_refers_to.py) which implements yaml template custom configurations.
+- **LangChain Integration**: A simple API endpoint for building context-aware, reasoning applications with LangChain’s flexible abstractions and AI-first toolkit. Use this endpoint to develop a wide range of applications, from chatbots to question-answering systems.
+
+- **Dynamic ChatGPT Prompting**: Simple [Terraform templates](./api/terraform/apigateway_endpoints.tf) to create highly personalized ChatBots. Program and skin your own custom chat apps in minutes.
+
+- **Function Calling**: OpenAI's most advanced integration feature to date. OpenAI API Function Calling enables developers to integrate their own custom Python functions into the processing of chat responses. For example, while a chatbot powered by an OpenAI GPT model is generating responses, it can call these custom Python functions to perform specific tasks or computations and include the results in its responses. This powerful feature can be used to create more dynamic and interactive chatbots that fetch real-time data, perform calculations, or interact with other APIs and services. See the [Python source code](./api/terraform/python/openai_api/lambda_openai_function/) for additional documentation and examples, including "[get_current_weather()](./api/terraform/python/openai_api/lambda_openai_function/function_weather.py)" from the official [OpenAI API documentation](https://platform.openai.com/docs/guides/function-calling/common-use-cases).
+
+- **Function Calling Plugins**: We created our own yaml-based "plugin" model, [function_calling_plugin()](./api/terraform/python/openai_api/lambda_openai_function/plugin.py). See this [example plugin](./api/terraform/python/openai_api/lambda_openai_function/config/example-configuration.yaml) and this [documentation](./api/terraform/python/openai_api/lambda_openai_function/README.md) for details, or try it out on this [live site](https://openai.lawrencemcdaniel.com/). Yaml templates can be stored locally or served from a secure AWS S3 bucket. You'll find a set of fun example plugins [here](./api/terraform/python/openai_api/lambda_openai_function/config/).
 
 ![Marv](https://cdn.lawrencemcdaniel.com/marv.gif)
 
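To make the **Function Calling** bullet above concrete: the pattern pairs a plain Python function with a JSON-schema "tool" definition that tells the model when and how to call it. The sketch below is a minimal, hypothetical illustration following OpenAI's public documentation; the function body and parameter set are placeholders, not this repository's actual `get_current_weather()` implementation.

```python
# Minimal, hypothetical sketch of OpenAI Function Calling: a custom Python
# function plus the JSON-schema tool definition that exposes it to the model.
import json


def get_current_weather(location: str, unit: str = "celsius") -> str:
    """Toy stand-in: a real implementation would call a weather API."""
    return json.dumps({"location": location, "unit": unit, "temperature": "22"})


weather_tool = {
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a location",
        "parameters": {
            "type": "object",
            "properties": {
                "location": {"type": "string", "description": "City and country"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["location"],
        },
    },
}
# Passed as tools=[weather_tool] on a chat completions request; if the model
# decides to call it, the response contains JSON arguments for the function.
```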

api/terraform/python/openai_api/lambda_openai_function/README.md

Lines changed: 3 additions & 3 deletions
@@ -12,9 +12,9 @@ The following screenshots demonstrate the two Function Calling Python functions
 
 Fully implements the "[get_current_weather()](https://platform.openai.com/docs/guides/function-calling)" from The official OpenAI API documentation. OpenAI's documentation provides scaffolding for this feature, but falls short of actually providing code that retrieves location-based current weather forecasts.
 
-## function_refers_to.py
+## plugin.py
 
-This module demonstrates an alternative implementation of prompt behavior modification involving both Function Calling, plus, dynamic modifications to the system prompt. The module passes a customized configuration object to `get_additional_info()` based on a configurable set of search terms that it looks for in the user prompt. The function works with multiple customized configurations. That is, it maintains a list of custom configurations, and user prompts including search terms associated with multiple custom configurations will result in prompt configuration multiple "Function Calling" apis. The custom configurations are persisted both inside this repository in the [config](./config/) folder as well as via a remote AWS S3 bucket that Terraform creates and configures for you automatically. Custom configurations are data-driven via a standardized yaml format. Use [example-configuration.yaml](./config/example-configuration.yaml) as a template to create your own custom configurations. Storing these in the AWS S3 bucket is preferable to keeping these inside your repo.
+This module demonstrates an alternative implementation of prompt behavior modification that combines Function Calling with dynamic modifications to the system prompt. The module passes a customized configuration object to `function_calling_plugin()` based on a configurable set of search terms that it looks for in the user prompt. The function works with multiple customized configurations; that is, it maintains a list of custom configurations, and a user prompt containing search terms associated with several of them will result in multiple "Function Calling" tools being added to the chat request. The custom configurations are persisted both inside this repository, in the [config](./config/) folder, and in a remote AWS S3 bucket that Terraform creates and configures for you automatically. Custom configurations are data-driven via a standardized yaml format. Use [example-configuration.yaml](./config/example-configuration.yaml) as a template to create your own custom configurations. Storing these in the AWS S3 bucket is preferable to keeping them inside your repo.
 
 ### Example custom configurations
 
@@ -48,7 +48,7 @@ function_calling:
   function_description: an example custom configuration to integrate with OpenAI API Function Calling additional information function, in this module.
   additional_information:
     about: >
-      This is some sample text that will be returned ChatGPT if it opts to invoke the get_additional_info() function.
+      This is some sample text that will be returned to ChatGPT if it opts to invoke the function_calling_plugin() function.
       In an API call, you can describe functions and have the model intelligently choose to output a JSON object containing arguments to call one or many functions. The Chat Completions API does not call the function; instead, the model generates JSON that you can use to call the function in your code.
       The latest models (gpt-3.5-turbo-1106 and gpt-4-1106-preview) have been trained to both detect when a function should be called (depending on the input) and to respond with JSON that adheres to the function signature more closely than previous models. With this capability also comes potential risks. We strongly recommend building in user confirmation flows before taking actions that impact the world on behalf of users (sending an email, posting something online, making a purchase, etc).
     links:
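The selection flow described above (match search terms in the user prompt, then attach the corresponding plugin tools) can be sketched roughly as follows. This is a hypothetical outline, not the repository's actual implementation; the `search_terms` attribute and the message-scanning logic are assumptions, while `plugin_tool_factory()` and `custom_configs` do exist in the renamed modules.

```python
# Hypothetical sketch of the plugin-selection flow described in this README.
# The repo's real helpers are search_terms_are_in_messages() and
# plugin_tool_factory(); the search_terms attribute below is an assumption.
from openai_api.lambda_openai_function.custom_config import custom_configs
from openai_api.lambda_openai_function.plugin import plugin_tool_factory


def select_plugin_tools(messages: list) -> list:
    """Return one chat-completion tool per matching custom configuration."""
    user_text = " ".join(
        m.get("content", "") for m in messages if m.get("role") == "user"
    ).lower()
    tools = []
    for config in custom_configs:
        # assumption: each yaml plugin carries a list of search terms
        terms = getattr(config, "search_terms", [])
        if any(term.lower() in user_text for term in terms):
            tools.append(plugin_tool_factory(config=config))
    return tools
```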

api/terraform/python/openai_api/lambda_openai_function/custom_config.py

Lines changed: 2 additions & 2 deletions
@@ -2,7 +2,7 @@
 # pylint: disable=E1101
 """
 This module contains the CustomConfig class, which is used to parse YAML config objects for
-function_refers_to.get_additional_info().
+plugin.function_calling_plugin().
 """
 import json
 import logging
@@ -274,7 +274,7 @@ def to_json(self) -> json:
 
 
 class CustomConfig(CustomConfigBase):
-    """A json object that contains the config for a function_refers_to.get_additional_info() function"""
+    """A json object that contains the config for a plugin.function_calling_plugin() function"""
 
     index: int = Field(0, description="Index of the config object")
     config_json: dict = Field(..., description="Config object")
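The `Field(...)` declarations above suggest a pydantic-style model behind `CustomConfig`. As a rough orientation only, loading a yaml plugin template into such a model might look like the sketch below; the class and loader names are illustrative, not the repo's actual API.

```python
# Illustrative sketch only: parse a yaml plugin template into a pydantic-style
# model in the spirit of CustomConfig. Names here are hypothetical.
import json

import yaml  # assumes PyYAML is available
from pydantic import BaseModel, Field


class PluginConfig(BaseModel):
    """Stand-in for a yaml-driven plugin configuration object."""

    index: int = Field(0, description="Index of the config object")
    config_json: dict = Field(..., description="Parsed yaml config")

    def to_json(self) -> str:
        """Serialize the parsed config back to JSON text."""
        return json.dumps(self.config_json)


def load_plugin_config(path: str, index: int = 0) -> PluginConfig:
    """Read a yaml template from disk and wrap it in the model above."""
    with open(path, encoding="utf-8") as file_handle:
        data = yaml.safe_load(file_handle)
    return PluginConfig(index=index, config_json=data)
```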

api/terraform/python/openai_api/lambda_openai_function/lambda_handler.py

Lines changed: 9 additions & 9 deletions
@@ -39,18 +39,18 @@
     validate_item,
 )
 from openai_api.lambda_openai_function.custom_config import custom_configs
-from openai_api.lambda_openai_function.function_refers_to import (
-    customized_prompt,
-    get_additional_info,
-    info_tool_factory,
-    search_terms_are_in_messages,
-)
 
 # OpenAI functions
 from openai_api.lambda_openai_function.function_weather import (
     get_current_weather,
     weather_tool_factory,
 )
+from openai_api.lambda_openai_function.plugin import (
+    customized_prompt,
+    function_calling_plugin,
+    plugin_tool_factory,
+    search_terms_are_in_messages,
+)
 
 
 openai.organization = settings.openai_api_organization
@@ -85,7 +85,7 @@ def handler(event, context):
         ):
             model = "gpt-3.5-turbo-1106"
             messages = customized_prompt(config=config, messages=messages)
-            custom_tool = info_tool_factory(config=config)
+            custom_tool = plugin_tool_factory(config=config)
             tools.append(custom_tool)
             print(
                 f"Adding custom configuration: {config.name} {config.meta_data.version} created by {config.meta_data.author}"
@@ -113,7 +113,7 @@ def handler(event, context):
         # Note: the JSON response may not always be valid; be sure to handle errors
         available_functions = {
             "get_current_weather": get_current_weather,
-            "get_additional_info": get_additional_info,
+            "function_calling_plugin": function_calling_plugin,
         }  # only one function in this example, but you can have multiple
         messages.append(response_message)  # extend conversation with assistant's reply
         # Step 4: send the info for each function call and function response to the model
@@ -127,7 +127,7 @@ def handler(event, context):
                     location=function_args.get("location"),
                     unit=function_args.get("unit"),
                 )
-            elif function_name == "get_additional_info":
+            elif function_name == "function_calling_plugin":
                 function_response = function_to_call(inquiry_type=function_args.get("inquiry_type"))
             messages.append(
                 {
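For readers following the handler changes, the hunks above are fragments of the standard OpenAI tool-call dispatch loop. Below is a simplified, self-contained sketch of that pattern; the Lambda plumbing is omitted, and the shape of the appended "tool" message follows OpenAI's documented example rather than code visible in this diff.

```python
# Simplified sketch of the tool-call dispatch pattern fragmented across the
# hunks above. The "tool" message shape follows OpenAI's documented example.
import json


def dispatch_tool_calls(response_message, tool_calls, messages,
                        get_current_weather, function_calling_plugin):
    """Run each function the model requested and append the results."""
    available_functions = {
        "get_current_weather": get_current_weather,
        "function_calling_plugin": function_calling_plugin,
    }
    messages.append(response_message)  # extend conversation with assistant's reply
    for tool_call in tool_calls:
        function_name = tool_call.function.name
        function_to_call = available_functions.get(function_name)
        if function_to_call is None:
            raise KeyError(f"Unknown tool: {function_name}")
        function_args = json.loads(tool_call.function.arguments)
        if function_name == "get_current_weather":
            function_response = function_to_call(
                location=function_args.get("location"),
                unit=function_args.get("unit"),
            )
        else:  # function_calling_plugin
            function_response = function_to_call(
                inquiry_type=function_args.get("inquiry_type")
            )
        messages.append(
            {
                "tool_call_id": tool_call.id,
                "role": "tool",
                "name": function_name,
                "content": function_response,
            }
        )
    return messages
```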

api/terraform/python/openai_api/lambda_openai_function/function_refers_to.py renamed to api/terraform/python/openai_api/lambda_openai_function/plugin.py

Lines changed: 3 additions & 3 deletions
@@ -50,7 +50,7 @@ def customized_prompt(config: CustomConfig, messages: list) -> list:
 
 
 # pylint: disable=too-many-return-statements
-def get_additional_info(inquiry_type: str) -> str:
+def function_calling_plugin(inquiry_type: str) -> str:
     """Return select info from custom config object"""
 
     for config in custom_configs:
@@ -64,14 +64,14 @@ def get_additional_info(inquiry_type: str) -> str:
     raise KeyError(f"Invalid inquiry_type: {inquiry_type}")
 
 
-def info_tool_factory(config: CustomConfig):
+def plugin_tool_factory(config: CustomConfig):
     """
     Return a dictionary of chat completion tools.
     """
     tool = {
         "type": "function",
         "function": {
-            "name": "get_additional_info",
+            "name": "function_calling_plugin",
             "description": config.function_calling.function_description,
             "parameters": {
                 "type": "object",

api/terraform/python/openai_api/lambda_openai_function/tests/test_custom_config.py

Lines changed: 6 additions & 0 deletions
@@ -171,12 +171,18 @@ def test_function_calling(self):
 
     def test_aws_s3_bucket(self):
         """Test aws_s3_bucket."""
+
+        # If the aws_s3_bucket_name is example.com, then we don't need to test it.
+        if settings.aws_apigateway_root_domain == "example.com":
+            return
+
         aws_s3_bucket_name = settings.aws_s3_bucket_name
         s3 = settings.aws_s3_client
 
         folder_name = "test_folder/"
         file_name = folder_name + "test_file"
 
+        print("Testing aws_s3_bucket_name: ", aws_s3_bucket_name)
         # Connect to the aws_s3_bucket_name
         try:
             s3.head_bucket(Bucket=aws_s3_bucket_name)
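One design note on the guard added above: an early `return` makes the test pass silently when the placeholder `example.com` domain is configured. A common alternative, sketched below under the assumption that the same `settings` object is importable, is to mark the case as skipped so it shows up as such in test reports.

```python
# Sketch of an alternative guard: report "skipped" instead of silently passing.
# The settings import path is an assumption for illustration only.
import unittest

from openai_api.common.conf import settings  # assumed location of settings


class TestAwsS3Bucket(unittest.TestCase):
    """Illustrative fragment; the real assertions from the diff would follow."""

    def test_aws_s3_bucket(self):
        """Test aws_s3_bucket."""
        if settings.aws_apigateway_root_domain == "example.com":
            self.skipTest("example.com is a placeholder domain; nothing to test")
        # ... head_bucket / upload / cleanup steps from the diff go here ...
```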

api/terraform/python/openai_api/lambda_openai_function/tests/test_lambda_openai_custom_config.py

Lines changed: 8 additions & 8 deletions
@@ -23,9 +23,9 @@
 from openai_api.lambda_openai_function.custom_config import CustomConfig
 
 # pylint: disable=no-name-in-module
-from openai_api.lambda_openai_function.function_refers_to import (
-    get_additional_info,
-    info_tool_factory,
+from openai_api.lambda_openai_function.plugin import (
+    function_calling_plugin,
+    plugin_tool_factory,
 )
 from openai_api.lambda_openai_function.tests.test_setup import get_test_file_path
 
@@ -42,20 +42,20 @@ def setUp(self):
 
     # pylint: disable=broad-exception-caught
     def test_get_additional_info(self):
-        """Test default return value of get_additional_info()"""
+        """Test default return value of function_calling_plugin()"""
         try:
             # pylint: disable=no-value-for-parameter
-            additional_information = get_additional_info(
+            additional_information = function_calling_plugin(
                 inquiry_type=self.config.function_calling.additional_information.keys[0]
             )
         except Exception:
-            self.fail("get_additional_info() raised ExceptionType")
+            self.fail("function_calling_plugin() raised ExceptionType")
 
         self.assertTrue(additional_information is not None)
 
     def test_info_tool_factory(self):
-        """Test integrity info_tool_factory()"""
-        itf = info_tool_factory(config=self.config)
+        """Test integrity plugin_tool_factory()"""
+        itf = plugin_tool_factory(config=self.config)
         self.assertIsInstance(itf, dict)
 
         self.assertIsInstance(itf, dict)
