This is a small executable written in Rust, part of the Refact Agent project. Its main job is to live quietly inside your IDE and keep the AST and VecDB indexes up to date. It is built to be robust: it will not break if you edit your files quickly or switch branches; it caches vectorization model responses so you don't have to wait for VecDB to finish indexing; and the AST supports a connection graph between definitions and usages in many popular programming languages.
Yes, it looks like an LSP server to the IDE, hence the name. It can also work inside a Python program: check out the Text UI below, and you can talk about your project from the command line!
- Installation
- Things to Try
- Telemetry
- Caps File
- AST
- CLI
- Progress and Future Plans
- Architecture
- Contributing
- Follow Us and FAQ
- License
- Integrates with the IDE you are already using, like VSCode or JetBrains
- Offers assistant functionality: code completion and chat
- Keeps track of your source files and keeps the AST and vector database up to date
- Integrates a browser, databases, and debuggers for the model to use
- Ask it anything! It will use the tools available to make changes to your project
Installable by the end user:
- JetBrains IDEs: https://github.com/smallcloudai/refact-intellij
- VS Classic: https://github.com/smallcloudai/refact-vs-classic/
- Sublime Text: https://github.com/smallcloudai/refact-sublime/
- Refact Self-Hosting Server: https://github.com/smallcloudai/refact/
- Code completion with RAG
- Chat with tool usage
- definition() references() tools
- vecdb search() with scope
- @file @tree @web @definition @references @search mentions in chat
- locate() uses test-time compute to find a good project cross-section
- Latest gpt-4o and gpt-4o-mini
- Claude-3-5-sonnet
- Llama-3.1 (passthrough)
- Llama-3.2 (passthrough)
- Llama-3.2 (scratchpad)
- Bring-your-own-key
- Memory (--experimental)
- Docker integration (--experimental)
- git integration (--experimental)
- pdb python debugger integration (--experimental)
- More debuggers
- GitHub integration (--experimental)
- GitLab integration
- Jira integration
It will automatically pick up OPENAI_API_KEY; alternatively, you might have a Refact cloud key or a Refact Self-Hosting Server:
cargo build
target/debug/refact-lsp --http-port 8001 --logs-stderr
target/debug/refact-lsp --address-url Refact --api-key $REFACT_API_KEY --http-port 8001 --logs-stderr
target/debug/refact-lsp --address-url http://my-refact-self-hosting/ --api-key $REFACT_API_KEY --http-port 8001 --logs-stderr
Try --help for more options.
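Once the server is running, you can also script against it from Python. Here is a minimal sketch (assuming port 8001, as in the commands above) that polls the /v1/rag-status endpoint until the server answers:

import time
import urllib.request

def wait_for_server(port=8001, timeout=30.0):
    # Poll /v1/rag-status until refact-lsp answers or the timeout expires.
    url = "http://127.0.0.1:%d/v1/rag-status" % port
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                return resp.read().decode()  # JSON with the current RAG/indexing status
        except OSError:
            time.sleep(0.5)  # not listening yet, retry
    raise TimeoutError("refact-lsp did not answer on port %d" % port)

print(wait_for_server())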
Code completion:
curl http://127.0.0.1:8001/v1/code-completion -k \
-H 'Content-Type: application/json' \
-d '{
"inputs": {
"sources": {"hello.py": "def hello_world():"},
"cursor": {
"file": "hello.py",
"line": 0,
"character": 18
},
"multiline": true
},
"stream": false,
"parameters": {
"temperature": 0.1,
"max_new_tokens": 20
}
}'
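The same completion request from Python, if you prefer scripting over curl; a minimal sketch assuming the server is listening on port 8001 as above:

import json
import urllib.request

payload = {
    "inputs": {
        "sources": {"hello.py": "def hello_world():"},
        "cursor": {"file": "hello.py", "line": 0, "character": 18},
        "multiline": True,
    },
    "stream": False,
    "parameters": {"temperature": 0.1, "max_new_tokens": 20},
}
req = urllib.request.Request(
    "http://127.0.0.1:8001/v1/code-completion",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))  # completion suggestions for the cursor position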
RAG status:
curl http://127.0.0.1:8001/v1/rag-status
Chat, the not-very-standard version: it has deterministic_messages in the response for all your @-mentions. The more standard version is at /v1/chat/completions.
curl http://127.0.0.1:8001/v1/chat -k \
-H 'Content-Type: application/json' \
-d '{
"messages": [
{"role": "user", "content": "Who is Bill Clinton? What is his favorite programming language?"}
],
"stream": false,
"temperature": 0.1,
"max_tokens": 20
}'
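The more standard endpoint works from Python the same way; a minimal sketch, assuming /v1/chat/completions accepts the usual OpenAI-style payload (depending on your caps, you may also need to pass a "model" field):

import json
import urllib.request

payload = {
    "messages": [
        {"role": "user", "content": "Who is Bill Clinton? What is his favorite programming language?"}
    ],
    "stream": False,
    "temperature": 0.1,
    "max_tokens": 20,
    # "model": "gpt-4o-mini",  # assumption: add if your setup requires an explicit model
}
req = urllib.request.Request(
    "http://127.0.0.1:8001/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp))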
The --basic-telemetry flag means send counters and error messages. Records are "compressed" into the .cache/refact/telemetry/compressed folder, then from time to time they are sent and moved to the .cache/refact/telemetry/sent folder.
To be clear: without these flags, no telemetry is sent. At no point does it send your code.
"Compressed" means similar records are joined together, increasing the counter. "Sent" means the rust binary
communicates with a HTTP endpoint specified in caps (see Caps section below) and sends .json file exactly how
you see it in .cache/refact/telemetry
. The files are human-readable.
When using Refact self-hosted server, telemetry goes to the self-hosted server, not to the cloud.
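Since the files are human-readable, you can inspect exactly what would be sent before it leaves your machine. A minimal sketch, assuming .cache/refact/telemetry sits under your home directory:

import json
from pathlib import Path

compressed = Path.home() / ".cache" / "refact" / "telemetry" / "compressed"
for path in sorted(compressed.glob("*.json")):
    print(path.name)
    print(json.dumps(json.loads(path.read_text()), indent=2))  # pretty-print each record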
The capabilities file stores the same things as bring-your-own-key.yaml: it describes how to access AI models. The --address-url parameter controls where to get this file; it defaults to ~/.config/refact/bring-your-own-key.yaml. If it's a URL, the executable fetches $URL/refact-caps to know what to do. This is especially useful for connecting to a Refact Self-Hosting Server, because the configuration does not need to be copy-pasted among the engineers who use the server.
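For illustration, this is roughly what the executable does when --address-url is a URL; a minimal sketch, with the self-hosting address from the startup examples as a placeholder:

import urllib.request

address_url = "http://my-refact-self-hosting/"  # placeholder, same as in the startup examples
with urllib.request.urlopen(address_url.rstrip("/") + "/refact-caps") as resp:
    print(resp.read().decode())  # the capabilities file: how to access AI models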
Supported languages:
- Java
- JavaScript
- TypeScript
- Python
- Rust
- C#
You can still use Refact for other languages; it's just that the AST capabilities will be missing.
You can compile and use Refact Agent from the command line with this repo alone, and it's not an afterthought: it works great!
cargo build --release
cp target/release/refact-lsp python_binding_and_cmdline/refact/bin/
pip install -e python_binding_and_cmdline/
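If you want to drive the binary from your own Python program rather than the bundled Text UI, one option is to spawn it as a subprocess and talk to it over HTTP; a minimal sketch, assuming the binary was copied to the path shown above:

import subprocess
import time
import urllib.request

proc = subprocess.Popen([
    "python_binding_and_cmdline/refact/bin/refact-lsp",
    "--http-port", "8001", "--logs-stderr",
])
try:
    for _ in range(60):  # give the HTTP port time to come up
        try:
            with urllib.request.urlopen("http://127.0.0.1:8001/v1/rag-status", timeout=2) as resp:
                print(resp.read().decode())
                break
        except OSError:
            time.sleep(0.5)
finally:
    proc.terminate()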
- Contributing: CONTRIBUTING.md
- GitHub issues for bugs and errors
- Community forum for community support and discussions

If you wish to contribute to this project, feel free to explore our current issues or open new issues related to bugs or features, following our CONTRIBUTING.md.
- Contributing
- Refact Docs
- GitHub Issues for bugs and errors
- Community Forum for community support and discussions
- Discord for chatting with community members
- Twitter for product news and updates
Refact is free to use for individuals and small teams under the BSD-3-Clause license. If you wish to use Refact for Enterprise, please contact us.