Merge branch 'qodo-ai:main' into main
@@ -43,36 +43,47 @@ Note that if your base branches are not protected, don't set the variables as `protected`

 ## Run a GitLab webhook server

-1. From the GitLab workspace or group, create an access token with "Reporter" role ("Developer" if using Pro version of the agent) and "api" scope.
+1. In GitLab create a new user and give it "Reporter" role ("Developer" if using Pro version of the agent) for the intended group or project.

-2. Generate a random secret for your app, and save it for later. For example, you can use:
+2. For the user from step 1. generate a `personal_access_token` with `api` access.
+
+3. Generate a random secret for your app, and save it for later (`shared_secret`). For example, you can use:

 ```
-WEBHOOK_SECRET=$(python -c "import secrets; print(secrets.token_hex(10))")
+SHARED_SECRET=$(python -c "import secrets; print(secrets.token_hex(10))")
 ```

-3. Clone this repository:
+4. Clone this repository:

 ```
-git clone https://github.com/Codium-ai/pr-agent.git
+git clone https://github.com/qodo-ai/pr-agent.git
 ```

-4. Prepare variables and secrets. Skip this step if you plan on settings these as environment variables when running the agent:
+5. Prepare variables and secrets. Skip this step if you plan on setting these as environment variables when running the agent:
    1. In the configuration file/variables:
-      - Set `deployment_type` to "gitlab"
+      - Set `config.git_provider` to "gitlab"

    2. In the secrets file/variables:
       - Set your AI model key in the respective section
-      - In the [gitlab] section, set `personal_access_token` (with token from step 1) and `shared_secret` (with secret from step 2)
+      - In the [gitlab] section, set `personal_access_token` (with token from step 2) and `shared_secret` (with secret from step 3)

-5. Build a Docker image for the app and optionally push it to a Docker repository. We'll use Dockerhub as an example:
+6. Build a Docker image for the app and optionally push it to a Docker repository. We'll use Dockerhub as an example:

 ```
 docker build . -t gitlab_pr_agent --target gitlab_webhook -f docker/Dockerfile
 docker push codiumai/pr-agent:gitlab_webhook # Push to your Docker repository
 ```

-6. Create a webhook in GitLab. Set the URL to ```http[s]://<PR_AGENT_HOSTNAME>/webhook```, the secret token to the generated secret from step 2, and enable the triggers `push`, `comments` and `merge request events`.
-
-7. Test your installation by opening a merge request or commenting on a merge request using one of CodiumAI's commands.
+7. Set the environmental variables, the method depends on your docker runtime. Skip this step if you included your secrets/configuration directly in the Docker image.
+
+ ```
+ "CONFIG.GIT_PROVIDER": "gitlab"
+ "GITLAB.PERSONAL_ACCESS_TOKEN": "<personal_access_token>"
+ "GITLAB.SHARED_SECRET": "<shared_secret>"
+ "GITLAB.URL": "https://gitlab.com"
+ "OPENAI.KEY": "<your_openai_api_key>"
+ ```
+
+8. Create a webhook in your GitLab project. Set the URL to ```http[s]://<PR_AGENT_HOSTNAME>/webhook```, the secret token to the generated secret from step 3, and enable the triggers `push`, `comments` and `merge request events`.
+
+9. Test your installation by opening a merge request or commenting on a merge request using one of PR Agent's commands.
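
To make step 5 of the updated instructions concrete, here is a minimal sketch of the corresponding TOML entries. The key names follow the environment-variable block shown in step 7; all values are placeholders, and the exact layout of your local settings/secrets files may differ:

```
[config]
git_provider = "gitlab"

[gitlab]
personal_access_token = "<personal_access_token>"  # token generated in step 2
shared_secret = "<shared_secret>"                  # secret generated in step 3
url = "https://gitlab.com"

[openai]
key = "<your_openai_api_key>"
```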
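If your runtime for step 7 is plain Docker, the same variables can be passed with `-e` flags. The image tag below comes from the build command in step 6, while the published port is illustrative and should match however you expose the webhook server:

```
docker run --rm -p 3000:3000 \
  -e CONFIG.GIT_PROVIDER="gitlab" \
  -e GITLAB.PERSONAL_ACCESS_TOKEN="<personal_access_token>" \
  -e GITLAB.SHARED_SECRET="<shared_secret>" \
  -e GITLAB.URL="https://gitlab.com" \
  -e OPENAI.KEY="<your_openai_api_key>" \
  gitlab_pr_agent
```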
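Before creating the webhook in step 8, a quick reachability check can help. GitLab delivers the secret token in the `X-Gitlab-Token` header, so a hand-rolled request looks roughly like the following; the empty payload is only meant to verify the server answers on `/webhook`, not to trigger a review:

```
curl -i -X POST "https://<PR_AGENT_HOSTNAME>/webhook" \
  -H "Content-Type: application/json" \
  -H "X-Gitlab-Token: <shared_secret>" \
  -d '{}'
```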
@@ -69,3 +69,28 @@ For example, in the GitHub organization `Codium-ai`:

 - The file [`https://github.com/Codium-ai/pr-agent-settings/.pr_agent.toml`](https://github.com/Codium-ai/pr-agent-settings/blob/main/.pr_agent.toml) serves as a global configuration file for all the repos in the GitHub organization `Codium-ai`.

 - The repo [`https://github.com/Codium-ai/pr-agent`](https://github.com/Codium-ai/pr-agent/blob/main/.pr_agent.toml) inherits the global configuration file from `pr-agent-settings`.
+
+### Bitbucket Organization level configuration file 💎
+`Relevant platforms: Bitbucket Cloud, Bitbucket Data Center`
+
+In Bitbucket, there are two levels where you can define a global configuration file:
+
+* Project-level global configuration:
+
+  Create a repository named `pr-agent-settings` within a specific project. The configuration file in this repository will apply to all repositories under the same project.
+
+* Organization-level global configuration:
+
+  Create a dedicated project to hold a global configuration file that affects all repositories across all projects in your organization.
+
+**Setting up organization-level global configuration:**
+
+1. Create a new project with both the name and key: PR_AGENT_SETTINGS.
+2. Inside the PR_AGENT_SETTINGS project, create a repository named pr-agent-settings.
+3. In this repository, add a .pr_agent.toml configuration file—structured similarly to the global configuration file described above.
+
+Repositories across your entire Bitbucket organization will inherit the configuration from this file.
+
+!!! note "Note"
+    If both organization-level and project-level global settings are defined, the project-level settings will take precedence over the organization-level configuration. Additionally, parameters from a repository’s local .pr_agent.toml file will always override both global settings.
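
For illustration, the `.pr_agent.toml` in the `pr-agent-settings` repository is an ordinary configuration file; a small sketch with a couple of commonly used options (the specific keys here are examples, not a required set):

```
[pr_reviewer]
extra_instructions = "Focus on error handling and on changes to public APIs."

[pr_description]
generate_ai_title = true
```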
@@ -41,11 +41,6 @@ class LiteLLMAIHandler(BaseAiHandler):
 os.environ["AWS_ACCESS_KEY_ID"] = get_settings().aws.AWS_ACCESS_KEY_ID
 os.environ["AWS_SECRET_ACCESS_KEY"] = get_settings().aws.AWS_SECRET_ACCESS_KEY
 os.environ["AWS_REGION_NAME"] = get_settings().aws.AWS_REGION_NAME
-if get_settings().get("litellm.use_client"):
-    litellm_token = get_settings().get("litellm.LITELLM_TOKEN")
-    assert litellm_token, "LITELLM_TOKEN is required"
-    os.environ["LITELLM_TOKEN"] = litellm_token
-    litellm.use_client = True
 if get_settings().get("LITELLM.DROP_PARAMS", None):
     litellm.drop_params = get_settings().litellm.drop_params
 if get_settings().get("LITELLM.SUCCESS_CALLBACK", None):
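
For context, the `LITELLM.DROP_PARAMS` setting read in the remaining code corresponds to a `[litellm]` table in the TOML configuration; a sketch, assuming the usual section/key naming used elsewhere in the settings files:

```
[litellm]
drop_params = true  # let litellm drop parameters a given provider does not support
```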
@@ -1,4 +1,4 @@
-from base64 import b64decode, encode
+from base64 import b64decode, encode, b64encode
 import hashlib


 class CliArgs:
@@ -9,7 +9,9 @@ class CliArgs:
         return True, ""

         # decode forbidden args
-        _encoded_args = 'ZW5hYmxlX2NvbW1lbnRfYXBwcm92YWw=:ZW5hYmxlX21hbnVhbF9hcHByb3ZhbA==:ZW5hYmxlX2F1dG9fYXBwcm92YWw=:YXBwcm92ZV9wcl9vbl9zZWxmX3Jldmlldw==:YmFzZV91cmw=:dXJs:YXBwX25hbWU=:c2VjcmV0X3Byb3ZpZGVy:Z2l0X3Byb3ZpZGVy:c2tpcF9rZXlz:b3BlbmFpLmtleQ==:QU5BTFlUSUNTX0ZPTERFUg==:dXJp:YXBwX2lk:d2ViaG9va19zZWNyZXQ=:YmVhcmVyX3Rva2Vu:UEVSU09OQUxfQUNDRVNTX1RPS0VO:b3ZlcnJpZGVfZGVwbG95bWVudF90eXBl:cHJpdmF0ZV9rZXk=:bG9jYWxfY2FjaGVfcGF0aA==:ZW5hYmxlX2xvY2FsX2NhY2hl:amlyYV9iYXNlX3VybA==:YXBpX2Jhc2U=:YXBpX3R5cGU=:YXBpX3ZlcnNpb24=:c2tpcF9rZXlz'
+        # b64encode('word'.encode()).decode()
+        _encoded_args = 'c2hhcmVkX3NlY3JldA==:dXNlcg==:c3lzdGVt:ZW5hYmxlX2NvbW1lbnRfYXBwcm92YWw=:ZW5hYmxlX21hbnVhbF9hcHByb3ZhbA==:ZW5hYmxlX2F1dG9fYXBwcm92YWw=:YXBwcm92ZV9wcl9vbl9zZWxmX3Jldmlldw==:YmFzZV91cmw=:dXJs:YXBwX25hbWU=:c2VjcmV0X3Byb3ZpZGVy:Z2l0X3Byb3ZpZGVy:c2tpcF9rZXlz:b3BlbmFpLmtleQ==:QU5BTFlUSUNTX0ZPTERFUg==:dXJp:YXBwX2lk:d2ViaG9va19zZWNyZXQ=:YmVhcmVyX3Rva2Vu:UEVSU09OQUxfQUNDRVNTX1RPS0VO:b3ZlcnJpZGVfZGVwbG95bWVudF90eXBl:cHJpdmF0ZV9rZXk=:bG9jYWxfY2FjaGVfcGF0aA==:ZW5hYmxlX2xvY2FsX2NhY2hl:amlyYV9iYXNlX3VybA==:YXBpX2Jhc2U=:YXBpX3R5cGU=:YXBpX3ZlcnNpb24=:c2tpcF9rZXlz'
+
         forbidden_cli_args = []
         for e in _encoded_args.split(':'):
             forbidden_cli_args.append(b64decode(e).decode())
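
The `# b64encode('word'.encode()).decode()` comment describes how entries are added to `_encoded_args`; a small standalone sketch of both directions, using the first three entries from the updated string:

```
from base64 import b64decode, b64encode

# encode a new argument name so it can be appended to the packed string
print(b64encode('shared_secret'.encode()).decode())  # -> c2hhcmVkX3NlY3JldA==

# decode the colon-separated string back into plain names, as CliArgs does
_encoded_args = 'c2hhcmVkX3NlY3JldA==:dXNlcg==:c3lzdGVt'
forbidden_cli_args = [b64decode(e).decode() for e in _encoded_args.split(':')]
print(forbidden_cli_args)  # -> ['shared_secret', 'user', 'system']
```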
@@ -78,9 +78,6 @@ webhook_secret = ""
 app_key = ""
 base_url = ""

-[litellm]
-LITELLM_TOKEN = "" # see https://docs.litellm.ai/docs/debugging/hosted_debugging for details and instructions on how to get a token
-
 [azure_devops]
 # For Azure devops personal access token
 org = ""