Commit b5ccd0e
chore(version): bump version to 0.10.4
1 parent: be3d48f

File tree: 6 files changed, +32 −33 lines

README.md (2 additions, 2 deletions)

@@ -4,7 +4,7 @@

 <p align="center">
 <a href="https://pypi.org/project/reme-ai/"><img src="https://img.shields.io/badge/python-3.12+-blue" alt="Python Version"></a>
-<a href="https://pypi.org/project/reme-ai/"><img src="https://img.shields.io/badge/pypi-v0.1.10.3-blue?logo=pypi" alt="PyPI Version"></a>
+<a href="https://pypi.org/project/reme-ai/"><img src="https://img.shields.io/badge/pypi-v0.1.10.4-blue?logo=pypi" alt="PyPI Version"></a>
 <a href="./LICENSE"><img src="https://img.shields.io/badge/license-Apache--2.0-black" alt="License"></a>
 <a href="https://github.com/modelscope/ReMe"><img src="https://img.shields.io/github/stars/modelscope/ReMe?style=social" alt="GitHub Stars"></a>
 </p>

@@ -28,7 +28,7 @@ Personal memory helps "**understand user preferences**", task memory helps agent

 ## 📰 Latest Updates

-- **[2025-10]** 🚀 ReMe v0.1.10.3 released! Core enhancement: direct Python import support. You can now use ReMe without starting an HTTP or MCP service - simply `from reme_ai import ReMeApp` and call methods directly in your Python code.
+- **[2025-10]** 🚀 ReMe v0.1.10.4 released! Core enhancement: direct Python import support. You can now use ReMe without starting an HTTP or MCP service - simply `from reme_ai import ReMeApp` and call methods directly in your Python code.
 - **[2025-10]** 🔧 Tool Memory support is now available! Enables data-driven tool selection and parameter optimization through historical performance tracking. Check out the [Tool Memory Guide](docs/tool_memory/tool_memory.md) and [benchmark results](docs/tool_memory/tool_bench.md).
 - **[2025-09]** 🎉 ReMe v0.1.9 has been officially released, adding support for asynchronous operations. It has also been
   integrated into the memory service of agentscope-runtime.
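The "direct Python import" usage described in the release bullet can be sketched as follows. This is a minimal stand-in, not the real app: the stand-in class only mirrors the `async_execute(self, name: str, **kwargs) -> dict` signature from `reme_ai/app.py` so the pattern runs without API keys, and the flow name `"summary_memory"` is a hypothetical example. In real code you would `from reme_ai import ReMeApp` and construct it with `llm_api_key`, `embedding_api_key`, etc.

```python
import asyncio


class StandInApp:
    """Stand-in mirroring ReMeApp's async_execute signature (hypothetical)."""

    async def async_execute(self, name: str, **kwargs) -> dict:
        # The real app would run the named flow; here we just echo the inputs.
        return {"flow": name, **kwargs}


async def main() -> dict:
    # Real code: app = ReMeApp(llm_api_key="...", embedding_api_key="...")
    app = StandInApp()
    return await app.async_execute("summary_memory", query="user preferences")


result = asyncio.run(main())
print(result)  # {'flow': 'summary_memory', 'query': 'user preferences'}
```

The point of the feature is exactly this shape: no HTTP or MCP server process, just an object whose async methods you await from your own event loop.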

docs/index.md (1 addition, 1 deletion)

@@ -17,7 +17,7 @@ kernelspec:

 <div class="flex justify-center space-x-3">
 <a href="https://pypi.org/project/reme-ai/"><img src="https://img.shields.io/badge/python-3.12+-blue" alt="Python Version"></a>
-<a href="https://pypi.org/project/reme-ai/"><img src="https://img.shields.io/badge/pypi-v0.1.10.3-blue?logo=pypi" alt="PyPI Version"></a>
+<a href="https://pypi.org/project/reme-ai/"><img src="https://img.shields.io/badge/pypi-v0.1.10.4-blue?logo=pypi" alt="PyPI Version"></a>
 <a href="./LICENSE"><img src="https://img.shields.io/badge/license-Apache--2.0-black" alt="License"></a>
 <a href="https://github.com/modelscope/ReMe"><img src="https://img.shields.io/github/stars/modelscope/ReMe?style=social" alt="GitHub Stars"></a>
 </div>

docs/vector_store_api_guide.md (1 addition, 1 deletion)

@@ -12,7 +12,7 @@ kernelspec:
   name: python3
 ---

-# 🚀 Vector Store API Guide
+# Vector Store API Guide

 This guide covers the vector store implementations available in ReMe, their APIs, and how to use them effectively.


pyproject.toml (2 additions, 2 deletions)

@@ -4,7 +4,7 @@ build-backend = "setuptools.build_meta"

 [project]
 name = "reme_ai"
-version = "0.1.10.3"
+version = "0.1.10.4"
 description = "Remember me"
 authors = [
     { name = "jinli.yl", email = "[email protected]" },

@@ -24,7 +24,7 @@ classifiers = [
 keywords = ["llm", "memory", "experience", "memoryscope", "ai", "mcp", "http"]

 dependencies = [
-    "flowllm[reme]>=0.1.11.3",
+    "flowllm[reme]>=0.1.11.4",
 ]

 [project.optional-dependencies]

reme_ai/__init__.py (1 addition, 1 deletion)

@@ -2,7 +2,7 @@

 os.environ["FLOW_APP_NAME"] = "ReMe"

-__version__ = "0.1.10.3"
+__version__ = "0.1.10.4"

 from reme_ai.app import ReMeApp
 from . import agent

reme_ai/app.py (25 additions, 26 deletions)

@@ -28,12 +28,12 @@ class ReMeApp(FlowLLMApp):
     """

     def __init__(self,
+                 *args,
                  llm_api_key: str = None,
                  llm_api_base: str = None,
                  embedding_api_key: str = None,
                  embedding_api_base: str = None,
                  config_path: str = None,
-                 *args,
                  **kwargs):
         """
         Initialize ReMeApp with configuration for LLM, embeddings, and vector stores.

@@ -59,47 +59,46 @@ def __init__(self,
         Both approaches accept the same configuration parameters and produce identical results.

         Args:
-            llm_api_key: API key for LLM service (e.g., OpenAI, Claude).
-                If provided, this will override the FLOW_LLM_API_KEY environment variable.
-                Environment variable: FLOW_LLM_API_KEY
-            llm_api_base: Base URL for LLM API. Use this for custom or self-hosted endpoints.
-                If provided, this will override the FLOW_LLM_BASE_URL environment variable.
-                Example: "https://api.openai.com/v1"
-                Environment variable: FLOW_LLM_BASE_URL
-            embedding_api_key: API key for embedding service. Can be different from llm_api_key
-                if using separate services for embeddings.
-                If provided, this will override the FLOW_EMBEDDING_API_KEY environment variable.
-                Environment variable: FLOW_EMBEDDING_API_KEY
-            embedding_api_base: Base URL for embedding API. For custom embedding endpoints.
-                If provided, this will override the FLOW_EMBEDDING_BASE_URL environment variable.
-                Environment variable: FLOW_EMBEDDING_BASE_URL
-            config_path: Path to custom configuration YAML file. If provided, loads configuration from this file.
-                Example: "path/to/my_config.yaml"
-                This overrides the default configuration with your custom settings.
-            *args: Additional command-line style arguments passed to parser.
+            *args: Additional command-line style arguments passed to parser.
                 These parameters are identical to the command-line startup parameters in README.
-
+
                 Common configuration examples:
                 For complete configuration reference, see: reme_ai/config/default.yaml
-
+
                 LLM Configuration:
                 - "llm.default.model_name=qwen3-30b-a3b-thinking-2507" - Set LLM model
                 - "llm.default.backend=openai_compatible" - Set LLM backend type
                 - "llm.default.params={'temperature': '0.6'}" - Set model parameters
-
+
                 Embedding Configuration:
                 - "embedding_model.default.model_name=text-embedding-v4" - Set embedding model
                 - "embedding_model.default.backend=openai_compatible" - Set embedding backend
                 - "embedding_model.default.params={'dimensions': 1024}" - Embedding parameters
-
+
                 Vector Store Configuration:
                 - "vector_store.default.backend=local" - Use local vector store
                 - "vector_store.default.backend=memory" - Use memory vector store
                 - "vector_store.default.backend=qdrant" - Use Qdrant vector store
                 - "vector_store.default.backend=elasticsearch" - Use Elasticsearch
                 - "vector_store.default.embedding_model=default" - Link vector store to embedding model
                 - "vector_store.default.params={'collection_name': 'my_memories'}" - Vector store parameters
-
+            llm_api_key: API key for LLM service (e.g., OpenAI, Claude).
+                If provided, this will override the FLOW_LLM_API_KEY environment variable.
+                Environment variable: FLOW_LLM_API_KEY
+            llm_api_base: Base URL for LLM API. Use this for custom or self-hosted endpoints.
+                If provided, this will override the FLOW_LLM_BASE_URL environment variable.
+                Example: "https://api.openai.com/v1"
+                Environment variable: FLOW_LLM_BASE_URL
+            embedding_api_key: API key for embedding service. Can be different from llm_api_key
+                if using separate services for embeddings.
+                If provided, this will override the FLOW_EMBEDDING_API_KEY environment variable.
+                Environment variable: FLOW_EMBEDDING_API_KEY
+            embedding_api_base: Base URL for embedding API. For custom embedding endpoints.
+                If provided, this will override the FLOW_EMBEDDING_BASE_URL environment variable.
+                Environment variable: FLOW_EMBEDDING_BASE_URL
+            config_path: Path to custom configuration YAML file. If provided, loads configuration from this file.
+                Example: "path/to/my_config.yaml"
+                This overrides the default configuration with your custom settings.
             **kwargs: Additional keyword arguments passed to parser. Same format as args but as key-value pairs.
                 Example: model_name="gpt-4", temperature=0.7

@@ -118,15 +117,15 @@ def __init__(self,
             - README.md "Environment Configuration" for environment variable setup
             - example.env for all available environment variables
         """
-        super().__init__(llm_api_key=llm_api_key,
+        super().__init__(*args,
+                         llm_api_key=llm_api_key,
                          llm_api_base=llm_api_base,
                          embedding_api_key=embedding_api_key,
                          embedding_api_base=embedding_api_base,
                          service_config=None,
                          parser=ConfigParser,
                          config_path=config_path,
                          load_default_config=True,
-                         args=args,
                          **kwargs)

     async def async_execute(self, name: str, **kwargs) -> dict:
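The signature change above is more than cosmetic: with `*args` placed after the keyword parameters (the old form), a positional argument fills `llm_api_key` first; moving `*args` to the front makes the named parameters keyword-only and lets positional flow arguments land in `*args`. A minimal, self-contained sketch of the two shapes (these classes are illustrative stand-ins, not the real `ReMeApp`):

```python
class OldStyle:
    # Before: *args came after the keyword parameters, so a positional
    # argument was silently captured by llm_api_key.
    def __init__(self, llm_api_key=None, config_path=None, *args, **kwargs):
        self.llm_api_key = llm_api_key
        self.extra = args


class NewStyle:
    # After: *args comes first, making llm_api_key/config_path keyword-only.
    def __init__(self, *args, llm_api_key=None, config_path=None, **kwargs):
        self.llm_api_key = llm_api_key
        self.extra = args


old = OldStyle("llm.default.model_name=gpt-4")
new = NewStyle("llm.default.model_name=gpt-4")

print(old.llm_api_key)  # 'llm.default.model_name=gpt-4' -- surprising
print(new.llm_api_key)  # None
print(new.extra)        # ('llm.default.model_name=gpt-4',)
```

This also explains the matching change in the `super().__init__` call: the collected `*args` are now forwarded positionally rather than as an `args=args` keyword.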
