#### Changing a model

See [here](pr_agent/algo/__init__.py) for the list of available models.
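The list lives in the `MAX_TOKENS` dictionary in that file. As a quick sanity check, a minimal sketch for printing which models your checkout supports (assuming the repository root is on your `PYTHONPATH`):
```
# List the models pr-agent knows a token limit for.
# MAX_TOKENS is the dictionary defined in pr_agent/algo/__init__.py.
from pr_agent.algo import MAX_TOKENS

for model_name, max_tokens in sorted(MAX_TOKENS.items()):
    print(f"{model_name}: {max_tokens} tokens")
```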
#### Azure

To use Azure, set the following in your .secrets.toml:
```
api_key = ""  # your azure api key
api_type = "azure"
api_version = "2023-05-15"  # check Azure documentation for the current API version
api_base = ""  # the base URL for your Azure OpenAI resource, e.g. "https://<your resource name>.openai.azure.com"
deployment_id = ""  # the deployment name you chose when you deployed the engine
```
and in configuration.toml:
```
[config]
model = ""  # the OpenAI model you've deployed on Azure (e.g. gpt-3.5-turbo)
```
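These are the standard Azure OpenAI connection fields. A minimal sketch of how they map onto the pre-1.0 `openai` Python package (an illustration only; pr-agent wires the values up for you, so you don't need to run this):
```
import openai

# Assumed mapping of the .secrets.toml values above onto the pre-1.0 openai client.
openai.api_type = "azure"
openai.api_version = "2023-05-15"
openai.api_base = "https://<your resource name>.openai.azure.com"
openai.api_key = "<your azure api key>"

# With Azure, requests target a deployment name rather than a raw model name.
response = openai.ChatCompletion.create(
    engine="<your deployment_id>",  # the deployment name you chose
    messages=[{"role": "user", "content": "Hello"}],
)
print(response["choices"][0]["message"]["content"])
```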
#### Huggingface

To use a new model with Huggingface Inference Endpoints, for example, set:
```
[__init__.py]  # in pr_agent/algo/__init__.py
MAX_TOKENS = {
    "model-name-on-huggingface": <max_tokens>
}
e.g.
MAX_TOKENS = {
    ...,
    "meta-llama/Llama-2-7b-chat-hf": 4096
}

[config]  # in configuration.toml
model = "huggingface/meta-llama/Llama-2-7b-chat-hf"

[huggingface]  # in .secrets.toml
key = ...  # your huggingface api key
api_base = ...  # the base url for your huggingface inference endpoint
```
(the key is your Huggingface API token, and api_base is the URL of your Inference Endpoint)
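The `huggingface/...` prefix is LiteLLM-style provider/model naming, which pr-agent builds on for non-OpenAI models. To sanity-check your Inference Endpoint outside of pr-agent, a rough sketch using the `litellm` package (the prompt and placeholders are illustrative):
```
import litellm

# Call the Huggingface Inference Endpoint directly, with the same values as above.
response = litellm.completion(
    model="huggingface/meta-llama/Llama-2-7b-chat-hf",
    messages=[{"role": "user", "content": "Say hello"}],
    api_base="<your inference endpoint url>",  # same value as [huggingface].api_base
    api_key="<your huggingface api key>",      # same value as [huggingface].key
)
print(response["choices"][0]["message"]["content"])
```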
#### Replicate

To use the Llama2 model with Replicate, for example, set:
```
[config]  # in configuration.toml
model = "replicate/llama-2-70b-chat:2c1608e18606fad2812020dc541930f2d0495ce32eee50074220b87300bc16e1"

[replicate]  # in .secrets.toml
key = ...  # your replicate api key
```
(you can obtain a Llama2 key from [here](https://replicate.com/replicate/llama-2-70b-chat/api))
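To verify the Replicate key and model version independently of pr-agent, a minimal sketch with the official `replicate` Python client (an optional package; the prompt is illustrative):
```
import os

# Same key as [replicate].key in .secrets.toml; the client reads it from this env var.
os.environ["REPLICATE_API_TOKEN"] = "<your replicate key>"

import replicate  # pip install replicate

# "replicate/llama-2-70b-chat" is the owner/model name on replicate.com;
# the suffix after the colon pins a specific model version.
output = replicate.run(
    "replicate/llama-2-70b-chat:2c1608e18606fad2812020dc541930f2d0495ce32eee50074220b87300bc16e1",
    input={"prompt": "Say hello"},
)

# This model streams text chunks, so join them into a single string.
print("".join(output))
```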
Also review the [AiHandler](pr_agent/algo/ai_handler.py) file for instructions on how to set keys for other models.
#### Extra instructions