Commit 422c38d: set prompt
Endle committed Sep 7, 2024 (1 parent: a65e842)
Showing 2 changed files with 15 additions and 4 deletions.
2 changes: 0 additions & 2 deletions in fire_seq_search_server/src/http_client/endpoints.rs

@@ -33,8 +33,6 @@ pub async fn summarize(
 pub async fn get_llm_done_list(
     State(engine_arc): State<Arc<QueryEngine>>
 ) -> Html<String>{
-
-    info!("get list endpoint called");
     let r = engine_arc.get_llm_done_list();
     Html(r.await)
 }
17 changes: 15 additions & 2 deletions in fire_seq_search_server/src/local_llm/mod.rs

@@ -10,6 +10,18 @@ use serde_derive::Deserialize;
 use serde_derive::Serialize;
 use serde;

+// TODO Allow user to set prompt, instead of hard-coded in code
+const prompt_string: &'static str = r##"
+You are a seasoned summary expert, capable of condensing and summarizing given articles, papers, or posts, accurately conveying the main idea to make the content easier to understand.
+You place great emphasis on user experience, never adding irrelevant content like "Summary," "The summary is as follows," "Original text," "You can check the original text if interested," or "Original link." Your summaries always convey the core information directly.
+You are adept at handling various large, small, and even chaotic text content, always accurately extracting key information and summarizing the core content globally to make it easier to understand.
+=== Below is the article ===
+"##;
+
 #[derive(Debug, Serialize, Deserialize)]
 pub struct OpenAiData {
     pub model: String,
@@ -165,9 +177,11 @@ impl LlmEngine {
         }
     }
     let mut msgs = Vec::new();
-    let mut chat_text = String::default(); // TODO
+
+    let mut chat_text = prompt_string.to_owned();
+    chat_text += &full_text;
     msgs.push( build_message(chat_text) );
+
     OpenAiData {
         model: "model".to_owned(),
         messages: msgs,
@@ -247,7 +261,6 @@ impl LlmEngine{
         let mut r = Vec::new();
         let jcache = self.job_cache.lock().await;
         for (title, _text) in &jcache.done_job {
-            info!("already done : {}", &title);
             r.push(title.to_owned());
         }
         return r;
