Merge remote-tracking branch 'origin/main' into tr/issue_tool
# Conflicts:
#	pr_agent/settings/configuration.toml
INSTALL.md (22 changed lines)
@@ -15,6 +15,7 @@ There are several ways to use PR-Agent:
- [Method 5: Run as a GitHub App](INSTALL.md#method-5-run-as-a-github-app)
- [Method 6: Deploy as a Lambda Function](INSTALL.md#method-6---deploy-as-a-lambda-function)
- [Method 7: AWS CodeCommit](INSTALL.md#method-7---aws-codecommit-setup)
- [Method 8: Run a GitLab webhook server](INSTALL.md#method-8---run-a-gitlab-webhook-server)

---

### Method 1: Use Docker image (no installation required)
@@ -343,6 +344,27 @@ PYTHONPATH="/PATH/TO/PROJECTS/pr-agent" python pr_agent/cli.py \
review
```

---

### Method 8 - Run a GitLab webhook server

1. From the GitLab workspace or group, create an access token. Enable the "api" scope only.
2. Generate a random secret for your app, and save it for later. For example, you can use:

```
WEBHOOK_SECRET=$(python -c "import secrets; print(secrets.token_hex(10))")
```

3. Follow the instructions from [Method 5](#method-5-run-as-a-github-app) to build the Docker image, set up a secrets file, and deploy on your own server.
4. In the secrets file, fill in the following:
   - Your OpenAI key.
   - In the [gitlab] section, fill in personal_access_token and shared_secret. The access token can be a personal access token, or a group or project access token.
   - Set deployment_type to 'gitlab' in [configuration.toml](./pr_agent/settings/configuration.toml).
5. Create a webhook in GitLab. Set the URL to your app server's URL, and set the secret token to the secret generated in step 2.
   In the "Trigger" section, check the ‘comments’ and ‘merge request events’ boxes.
6. Test your installation by opening a merge request, or by commenting on a merge request using one of CodiumAI's commands. A scripted check is sketched below.
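For a scripted check, a minimal sketch like the following can simulate the merge-request event GitLab would send. It assumes the webhook server from this commit's `gitlab_webhook.py` is listening on localhost:3000 with shared-secret auth; the merge-request URL is a placeholder.

```
# Smoke-test the GitLab webhook server (assumes localhost:3000 and
# shared-secret auth; the merge-request URL below is a placeholder).
import requests

WEBHOOK_SECRET = "paste-the-secret-from-step-2"

payload = {
    "object_kind": "merge_request",
    "object_attributes": {
        "action": "open",
        "title": "Test MR",
        "url": "https://gitlab.example.com/group/project/-/merge_requests/1",
    },
}

resp = requests.post(
    "http://localhost:3000/webhook",
    json=payload,
    headers={"X-Gitlab-Token": WEBHOOK_SECRET},
)
print(resp.status_code, resp.json())  # expect 200 and {"message": "success"}
```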

---

### Appendix - **Debugging LLM API Calls**

If you're testing your codium/pr-agent server and need to see whether calls were made successfully, along with the exact call logs, you can use the [LiteLLM Debugger tool](https://docs.litellm.ai/docs/debugging/hosted_debugging).
README.md (30 changed lines)
@@ -96,26 +96,27 @@ See the [usage guide](./Usage.md) for instructions how to run the different tool

## Overview

`PR-Agent` offers extensive pull request functionalities across various git providers:

| | | GitHub | Gitlab | Bitbucket | CodeCommit | Azure DevOps |
|-------|---------------------------------------------|:------:|:------:|:---------:|:----------:|:----------:|
| TOOLS | Review | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| | Ask | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| | Auto-Description | :white_check_mark: | :white_check_mark: | | :white_check_mark: | :white_check_mark: |
| | Improve Code | :white_check_mark: | :white_check_mark: | | :white_check_mark: | |
| | ⮑ Extended | :white_check_mark: | :white_check_mark: | | :white_check_mark: | |
| | Reflect and Review | :white_check_mark: | | | | :white_check_mark: |
| | Update CHANGELOG.md | :white_check_mark: | | | | |
| | | GitHub | Gitlab | Bitbucket | CodeCommit | Azure DevOps | Gerrit |
|-------|---------------------------------------------|:------:|:------:|:---------:|:----------:|:----------:|:----------:|
| TOOLS | Review | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| | Ask | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| | Auto-Description | :white_check_mark: | :white_check_mark: | | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| | Improve Code | :white_check_mark: | :white_check_mark: | | :white_check_mark: | | :white_check_mark: |
| | ⮑ Extended | :white_check_mark: | :white_check_mark: | | :white_check_mark: | | :white_check_mark: |
| | Reflect and Review | :white_check_mark: | | | | :white_check_mark: | :white_check_mark: |
| | Update CHANGELOG.md | :white_check_mark: | | | | | |
| | | | | | | |
| USAGE | CLI | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| | App / webhook | :white_check_mark: | :white_check_mark: | | | |
| | Tagging bot | :white_check_mark: | | | | |
| | Actions | :white_check_mark: | | | | |
| | Web server | | | | | | :white_check_mark: |
| | | | | | | |
| CORE | PR compression | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| | Repo language prioritization | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| | Adaptive and token-aware<br />file patch fitting | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| | Multiple models support | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| | Incremental PR Review | :white_check_mark: | | | | |
| CORE | PR compression | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| | Repo language prioritization | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| | Adaptive and token-aware<br />file patch fitting | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| | Multiple models support | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
| | Incremental PR Review | :white_check_mark: | | | | | |

Review the **[usage guide](./Usage.md)** section for detailed instructions on how to use the different tools, select the relevant git provider (GitHub, Gitlab, Bitbucket, ...), and adjust the configuration file to your needs.
@@ -153,6 +154,7 @@ There are several ways to use PR-Agent:
- Allowing you to automate the review process on your private or public repositories
- [Method 6: Deploy as a Lambda Function](INSTALL.md#method-6---deploy-as-a-lambda-function)
- [Method 7: AWS CodeCommit](INSTALL.md#method-7---aws-codecommit-setup)
- [Method 8: Run a GitLab webhook server](INSTALL.md#method-8---run-a-gitlab-webhook-server)

## How it works
docker/Dockerfile

@@ -18,6 +18,10 @@ FROM base as github_polling
ADD pr_agent pr_agent
CMD ["python", "pr_agent/servers/github_polling.py"]

FROM base as gitlab_webhook
ADD pr_agent pr_agent
CMD ["python", "pr_agent/servers/gitlab_webhook.py"]

FROM base as test
ADD requirements-dev.txt .
RUN pip install -r requirements-dev.txt && rm requirements-dev.txt
pr_agent/git_providers/__init__.py

@@ -5,6 +5,8 @@ from pr_agent.git_providers.github_provider import GithubProvider
from pr_agent.git_providers.gitlab_provider import GitLabProvider
from pr_agent.git_providers.local_git_provider import LocalGitProvider
from pr_agent.git_providers.azuredevops_provider import AzureDevopsProvider
from pr_agent.git_providers.gerrit_provider import GerritProvider


_GIT_PROVIDERS = {
    'github': GithubProvider,
@@ -12,7 +14,8 @@ _GIT_PROVIDERS = {
    'bitbucket': BitbucketProvider,
    'azure': AzureDevopsProvider,
    'codecommit': CodeCommitProvider,
    'local' : LocalGitProvider
    'local' : LocalGitProvider,
    'gerrit': GerritProvider,
}


def get_git_provider():
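For orientation, here is a minimal sketch of how the new registry entry is exercised. It assumes `get_git_provider()` resolves `config.git_provider` against `_GIT_PROVIDERS`, as in the existing implementation; the project name and refspec are placeholders in the `"<project>:<refspec>"` format that `GerritProvider.__init__` expects.

```
# Sketch: selecting and instantiating the new Gerrit provider
# (assumes get_git_provider() looks up config.git_provider in
# _GIT_PROVIDERS; the project name and refspec below are placeholders).
from pr_agent.config_loader import get_settings
from pr_agent.git_providers import get_git_provider

get_settings().config.git_provider = "gerrit"
provider_class = get_git_provider()  # -> GerritProvider
provider = provider_class("my-project:refs/changes/34/1234/1")
print(provider.get_pr_title())
```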
pr_agent/git_providers/gerrit_provider.py (new file, 393 lines)

@@ -0,0 +1,393 @@
import json
import logging
import os
import pathlib
import shutil
import subprocess
import uuid
from collections import Counter, namedtuple
from pathlib import Path
from tempfile import mkdtemp, NamedTemporaryFile

import requests
import urllib3.util
from git import Repo

from pr_agent.config_loader import get_settings
from pr_agent.git_providers.git_provider import GitProvider, FilePatchInfo, \
    EDIT_TYPE
from pr_agent.git_providers.local_git_provider import PullRequestMimic

logger = logging.getLogger(__name__)

def _call(*command, **kwargs) -> str:
    # check=True raises CalledProcessError on a non-zero exit status
    res = subprocess.run(
        command,
        stdout=subprocess.PIPE,
        stderr=subprocess.PIPE,
        check=True,
        **kwargs,
    )
    return res.stdout.decode()

def clone(url, directory):
    logger.info("Cloning %s to %s", url, directory)
    stdout = _call('git', 'clone', "--depth", "1", url, directory)
    logger.info(stdout)


def fetch(url, refspec, cwd):
    logger.info("Fetching %s %s", url, refspec)
    stdout = _call(
        'git', 'fetch', '--depth', '2', url, refspec,
        cwd=cwd
    )
    logger.info(stdout)


def checkout(cwd):
    logger.info("Checking out")
    stdout = _call('git', 'checkout', "FETCH_HEAD", cwd=cwd)
    logger.info(stdout)


def show(*args, cwd=None):
    logger.info("Show")
    return _call('git', 'show', *args, cwd=cwd)


def diff(*args, cwd=None):
    logger.info("Diff")
    patch = _call('git', 'diff', *args, cwd=cwd)
    if not patch:
        logger.warning("No changes found")
        return
    return patch


def reset_local_changes(cwd):
    logger.info("Reset local changes")
    _call('git', 'checkout', "--force", cwd=cwd)

def add_comment(url: urllib3.util.Url, refspec, message):
    # A change refspec looks like refs/changes/<NN>/<change>/<patchset>;
    # `gerrit review` expects its target as "<change>,<patchset>".
    *_, change, patchset = refspec.rsplit("/")
    message = "'" + message.replace("'", "'\"'\"'") + "'"
    return _call(
        "ssh",
        "-p", str(url.port),
        f"{url.auth}@{url.host}",
        "gerrit", "review",
        "--message", message,
        # "--code-review", score,
        f"{change},{patchset}",
    )


def list_comments(url: urllib3.util.Url, refspec):
    *_, change, _ = refspec.rsplit("/")
    stdout = _call(
        "ssh",
        "-p", str(url.port),
        f"{url.auth}@{url.host}",
        "gerrit", "query",
        "--comments",
        "--current-patch-set", change,
        "--format", "JSON",
    )
    change_set, *_ = stdout.splitlines()
    return json.loads(change_set)["currentPatchSet"]["comments"]

def prepare_repo(url: urllib3.util.Url, project, refspec):
    repo_url = f"{url.scheme}://{url.auth}@{url.host}:{url.port}/{project}"

    directory = pathlib.Path(mkdtemp())
    clone(repo_url, directory)
    fetch(repo_url, refspec, cwd=directory)
    checkout(cwd=directory)
    return directory

def adopt_to_gerrit_message(message):
    lines = message.splitlines()
    buf = []
    for line in lines:
        line = line.replace("*", "").replace("``", "`")
        line = line.strip()
        if line.startswith('#'):
            buf.append("\n" +
                       line.replace('#', '').removesuffix(":").strip() +
                       ":")
            continue
        elif line.startswith('-'):
            buf.append(line.removeprefix('-').strip())
            continue
        else:
            buf.append(line)
    return "\n".join(buf).strip()

def add_suggestion(src_filename, context: str, start, end: int):
    with (
        NamedTemporaryFile("w", delete=False) as tmp,
        open(src_filename, "r") as src
    ):
        lines = src.readlines()
        tmp.writelines(lines[:start - 1])
        if context:
            tmp.write(context)
        tmp.writelines(lines[end:])

    shutil.copy(tmp.name, src_filename)
    os.remove(tmp.name)

def upload_patch(patch, path):
    patch_server_endpoint = get_settings().get(
        'gerrit.patch_server_endpoint')
    patch_server_token = get_settings().get(
        'gerrit.patch_server_token')

    response = requests.post(
        patch_server_endpoint,
        json={
            "content": patch,
            "path": path,
        },
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {patch_server_token}",
        }
    )
    response.raise_for_status()
    patch_server_endpoint = patch_server_endpoint.rstrip("/")
    return patch_server_endpoint + "/" + path

class GerritProvider(GitProvider):

    def __init__(self, key: str, incremental=False):
        self.project, self.refspec = key.split(':')
        assert self.project, "Project name is required"
        assert self.refspec, "Refspec is required"
        base_url = get_settings().get('gerrit.url')
        assert base_url, "Gerrit URL is required"
        user = get_settings().get('gerrit.user')
        assert user, "Gerrit user is required"

        parsed = urllib3.util.parse_url(base_url)
        self.parsed_url = urllib3.util.parse_url(
            f"{parsed.scheme}://{user}@{parsed.host}:{parsed.port}"
        )

        self.repo_path = prepare_repo(
            self.parsed_url, self.project, self.refspec
        )
        self.repo = Repo(self.repo_path)
        assert self.repo

        self.pr = PullRequestMimic(self.get_pr_title(), self.get_diff_files())

    def get_pr_title(self):
        """
        Substitutes the branch name for the PR-mimic title.
        """
        return self.repo.branches[0].name

    def get_issue_comments(self):
        comments = list_comments(self.parsed_url, self.refspec)
        Comments = namedtuple('Comments', ['reversed'])
        Comment = namedtuple('Comment', ['body'])
        return Comments([Comment(c['message']) for c in reversed(comments)])
    def get_labels(self):
        raise NotImplementedError(
            'Getting labels is not implemented for the gerrit provider')

    def add_eyes_reaction(self, issue_comment_id: int):
        raise NotImplementedError(
            'Adding reactions is not implemented for the gerrit provider')

    def remove_reaction(self, issue_comment_id: int, reaction_id: int):
        raise NotImplementedError(
            'Removing reactions is not implemented for the gerrit provider')

    def get_commit_messages(self):
        return [self.repo.head.commit.message]

    def get_repo_settings(self):
        """
        TODO: Implement support of .pr_agent.toml
        """
        return ""
    def get_diff_files(self) -> list[FilePatchInfo]:
        diffs = self.repo.head.commit.diff(
            self.repo.head.commit.parents[0],  # previous commit
            create_patch=True,
            R=True
        )

        diff_files = []
        for diff_item in diffs:
            if diff_item.a_blob is not None:
                original_file_content_str = (
                    diff_item.a_blob.data_stream.read().decode('utf-8')
                )
            else:
                original_file_content_str = ""  # empty file
            if diff_item.b_blob is not None:
                new_file_content_str = diff_item.b_blob.data_stream.read(). \
                    decode('utf-8')
            else:
                new_file_content_str = ""  # empty file
            edit_type = EDIT_TYPE.MODIFIED
            if diff_item.new_file:
                edit_type = EDIT_TYPE.ADDED
            elif diff_item.deleted_file:
                edit_type = EDIT_TYPE.DELETED
            elif diff_item.renamed_file:
                edit_type = EDIT_TYPE.RENAMED
            diff_files.append(
                FilePatchInfo(
                    original_file_content_str,
                    new_file_content_str,
                    diff_item.diff.decode('utf-8'),
                    diff_item.b_path,
                    edit_type=edit_type,
                    old_filename=None
                    if diff_item.a_path == diff_item.b_path
                    else diff_item.a_path
                )
            )
        self.diff_files = diff_files
        return diff_files
    def get_files(self):
        diff_index = self.repo.head.commit.diff(
            self.repo.head.commit.parents[0],  # previous commit
            R=True
        )
        # Get the list of changed files
        diff_files = [item.a_path for item in diff_index]
        return diff_files

    def get_languages(self):
        """
        Calculate the percentage of each language in the repository.
        Used for hunk prioritisation.
        """
        # Get all files in the repository
        filepaths = [Path(item.path) for item in
                     self.repo.tree().traverse() if item.type == 'blob']
        # Identify language by file extension and count
        lang_count = Counter(
            ext.lstrip('.') for filepath in filepaths for ext in
            [filepath.suffix.lower()])
        # Convert counts to percentages
        total_files = len(filepaths)
        lang_percentage = {lang: count / total_files * 100 for lang, count
                           in lang_count.items()}
        return lang_percentage

    def get_pr_description_full(self):
        return self.repo.head.commit.message

    def get_user_id(self):
        return self.repo.head.commit.author.email

    def is_supported(self, capability: str) -> bool:
        if capability in [
            # 'get_issue_comments',
            'create_inline_comment',
            'publish_inline_comments',
            'get_labels'
        ]:
            return False
        return True
    def split_suggestion(self, msg) -> tuple[str, str]:
        is_code_context = False
        description = []
        context = []
        for line in msg.splitlines():
            if line.startswith('```suggestion'):
                is_code_context = True
                continue
            if line.startswith('```'):
                is_code_context = False
                continue
            if is_code_context:
                context.append(line)
            else:
                description.append(
                    line.replace('*', '')
                )

        return (
            '\n'.join(description),
            '\n'.join(context) + '\n' if context else ''
        )
    def publish_code_suggestions(self, code_suggestions: list):
        msg = []
        for suggestion in code_suggestions:
            description, code = self.split_suggestion(suggestion['body'])
            add_suggestion(
                pathlib.Path(self.repo_path) / suggestion["relevant_file"],
                code,
                suggestion["relevant_lines_start"],
                suggestion["relevant_lines_end"],
            )
            patch = diff(cwd=self.repo_path)
            patch_id = uuid.uuid4().hex[0:4]
            path = "/".join(["codium-ai", self.refspec, patch_id])
            full_path = upload_patch(patch, path)
            reset_local_changes(self.repo_path)
            msg.append(f'* {description}\n{full_path}')

        if msg:
            add_comment(self.parsed_url, self.refspec, "\n".join(msg))
            return True
    def publish_comment(self, pr_comment: str, is_temporary: bool = False):
        if not is_temporary:
            msg = adopt_to_gerrit_message(pr_comment)
            add_comment(self.parsed_url, self.refspec, msg)

    def publish_description(self, pr_title: str, pr_body: str):
        msg = adopt_to_gerrit_message(pr_body)
        add_comment(self.parsed_url, self.refspec, pr_title + '\n' + msg)

    def publish_inline_comments(self, comments: list[dict]):
        raise NotImplementedError(
            'Publishing inline comments is not implemented for the gerrit '
            'provider')

    def publish_inline_comment(self, body: str, relevant_file: str,
                               relevant_line_in_file: str):
        raise NotImplementedError(
            'Publishing inline comments is not implemented for the gerrit '
            'provider')

    def create_inline_comment(self, body: str, relevant_file: str,
                              relevant_line_in_file: str):
        raise NotImplementedError(
            'Creating inline comments is not implemented for the gerrit '
            'provider')

    def publish_labels(self, labels):
        # Not applicable to the gerrit provider,
        # but required by the interface
        pass

    def remove_initial_comment(self):
        # remove the repo cloned in previous steps
        # shutil.rmtree(self.repo_path)
        pass

    def get_pr_branch(self):
        return self.repo.head
pr_agent/servers/gerrit_server.py (new file, 78 lines)

@@ -0,0 +1,78 @@
import copy
import logging
import sys
from enum import Enum
from json import JSONDecodeError

import uvicorn
from fastapi import APIRouter, FastAPI, HTTPException
from pydantic import BaseModel
from starlette.middleware import Middleware
from starlette_context import context
from starlette_context.middleware import RawContextMiddleware

from pr_agent.agent.pr_agent import PRAgent
from pr_agent.config_loader import global_settings, get_settings

logging.basicConfig(stream=sys.stdout, level=logging.DEBUG)
router = APIRouter()


class Action(str, Enum):
    review = "review"
    describe = "describe"
    ask = "ask"
    improve = "improve"
    reflect = "reflect"
    answer = "answer"


class Item(BaseModel):
    refspec: str
    project: str
    msg: str


@router.post("/api/v1/gerrit/{action}")
async def handle_gerrit_request(action: Action, item: Item):
    logging.debug("Received a Gerrit request")
    context["settings"] = copy.deepcopy(global_settings)

    if action == Action.ask:
        if not item.msg:
            # raise (not return) so FastAPI turns this into a 400 response
            raise HTTPException(
                status_code=400,
                detail="msg is required for ask command"
            )
    await PRAgent().handle_request(
        f"{item.project}:{item.refspec}",
        f"/{item.msg.strip()}"
    )


async def get_body(request):
    try:
        body = await request.json()
    except JSONDecodeError as e:
        logging.error("Error parsing request body: %s", e)
        return {}
    return body


@router.get("/")
async def root():
    return {"status": "ok"}


def start():
    # to prevent adding help messages to the output
    get_settings().set("CONFIG.CLI_MODE", True)
    middleware = [Middleware(RawContextMiddleware)]
    app = FastAPI(middleware=middleware)
    app.include_router(router)

    uvicorn.run(app, host="0.0.0.0", port=3000)


if __name__ == '__main__':
    start()
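For reference, a request against this endpoint could look like the sketch below. Host and port follow the `uvicorn.run` call above; project and refspec are placeholders, and `msg` carries the agent command that `handle_gerrit_request` forwards as `"/<msg>"`.

```
# Sketch: asking the Gerrit server to review a change. The msg field is
# forwarded to the agent as "/<msg>", so "review" becomes "/review".
import requests

resp = requests.post(
    "http://localhost:3000/api/v1/gerrit/review",
    json={
        "project": "my-project",              # placeholder
        "refspec": "refs/changes/34/1234/1",  # placeholder
        "msg": "review",
    },
)
print(resp.status_code)
```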
pr_agent/servers/github_app.py

@@ -98,6 +98,7 @@ async def handle_request(body: Dict[str, Any], event: str):
        api_url = body["comment"]["pull_request_url"]
    else:
        return {}
    logging.info(body)
    logging.info(f"Handling comment because of event={event} and action={action}")
    comment_id = body.get("comment", {}).get("id")
    provider = get_git_provider()(pr_url=api_url)
@@ -129,6 +130,7 @@ async def handle_request(body: Dict[str, Any], event: str):
    args = split_command[1:]
    other_args = update_settings_from_args(args)
    new_command = ' '.join([command] + other_args)
    logging.info(body)
    logging.info(f"Performing command: {new_command}")
    await agent.handle_request(api_url, new_command)
pr_agent/servers/gitlab_webhook.py

@@ -1,21 +1,51 @@
import copy
import json
import logging
import sys

import uvicorn
from fastapi import APIRouter, FastAPI, Request, status
from fastapi.encoders import jsonable_encoder
from fastapi.responses import JSONResponse
from starlette.background import BackgroundTasks
from starlette.middleware import Middleware
from starlette_context import context
from starlette_context.middleware import RawContextMiddleware

from pr_agent.agent.pr_agent import PRAgent
from pr_agent.config_loader import get_settings
from pr_agent.config_loader import get_settings, global_settings
from pr_agent.secret_providers import get_secret_provider

app = FastAPI()
logging.basicConfig(stream=sys.stdout, level=logging.INFO)
router = APIRouter()

secret_provider = get_secret_provider() if get_settings().get("CONFIG.SECRET_PROVIDER") else None


@router.post("/webhook")
async def gitlab_webhook(background_tasks: BackgroundTasks, request: Request):
    if request.headers.get("X-Gitlab-Token") and secret_provider:
        request_token = request.headers.get("X-Gitlab-Token")
        secret = secret_provider.get_secret(request_token)
        try:
            secret_dict = json.loads(secret)
            gitlab_token = secret_dict["gitlab_token"]
            context["settings"] = copy.deepcopy(global_settings)
            context["settings"].gitlab.personal_access_token = gitlab_token
        except Exception as e:
            logging.error(f"Failed to validate secret {request_token}: {e}")
            return JSONResponse(status_code=status.HTTP_401_UNAUTHORIZED, content=jsonable_encoder({"message": "unauthorized"}))
    elif get_settings().get("GITLAB.SHARED_SECRET"):
        secret = get_settings().get("GITLAB.SHARED_SECRET")
        if not request.headers.get("X-Gitlab-Token") == secret:
            return JSONResponse(status_code=status.HTTP_401_UNAUTHORIZED, content=jsonable_encoder({"message": "unauthorized"}))
    else:
        return JSONResponse(status_code=status.HTTP_401_UNAUTHORIZED, content=jsonable_encoder({"message": "unauthorized"}))
    gitlab_token = get_settings().get("GITLAB.PERSONAL_ACCESS_TOKEN", None)
    if not gitlab_token:
        return JSONResponse(status_code=status.HTTP_401_UNAUTHORIZED, content=jsonable_encoder({"message": "unauthorized"}))
    data = await request.json()
    logging.info(json.dumps(data))
    if data.get('object_kind') == 'merge_request' and data['object_attributes'].get('action') in ['open', 'reopen']:
        logging.info(f"A merge request has been opened: {data['object_attributes'].get('title')}")
        url = data['object_attributes'].get('url')
@@ -28,16 +58,18 @@ async def gitlab_webhook(background_tasks: BackgroundTasks, request: Request):
        background_tasks.add_task(PRAgent().handle_request, url, body)
    return JSONResponse(status_code=status.HTTP_200_OK, content=jsonable_encoder({"message": "success"}))


@router.get("/")
async def root():
    return {"status": "ok"}


def start():
    gitlab_url = get_settings().get("GITLAB.URL", None)
    if not gitlab_url:
        raise ValueError("GITLAB.URL is not set")
    gitlab_token = get_settings().get("GITLAB.PERSONAL_ACCESS_TOKEN", None)
    if not gitlab_token:
        raise ValueError("GITLAB.PERSONAL_ACCESS_TOKEN is not set")
    get_settings().config.git_provider = "gitlab"

    app = FastAPI()
    middleware = [Middleware(RawContextMiddleware)]
    app = FastAPI(middleware=middleware)
    app.include_router(router)

    uvicorn.run(app, host="0.0.0.0", port=3000)
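Note the contract the secret-provider path imposes: `get_secret(request_token)` must return a JSON string containing a `gitlab_token` key. A sketch of the expected shape (how the blob gets stored is provider-specific and not shown in this commit):

```
# Shape of the secret the webhook handler expects when
# CONFIG.SECRET_PROVIDER is set: JSON with a "gitlab_token" key,
# retrievable via secret_provider.get_secret(<X-Gitlab-Token value>).
import json

secret_blob = json.dumps({"gitlab_token": "glpat-XXXXXXXXXXXXXXXXXXXX"})
assert json.loads(secret_blob)["gitlab_token"]
```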
pr_agent/settings/configuration.toml

@@ -86,6 +86,17 @@ polling_interval_seconds = 30
# description_path= "path/to/description.md"
# review_path= "path/to/review.md"

[gerrit]
# endpoint of the gerrit service
# url = "ssh://gerrit.example.com:29418"
# user for gerrit authentication
# user = "ai-reviewer"
# patch server where patches will be saved
# patch_server_endpoint = "http://127.0.0.1:5000/patch"
# token to authenticate with the patch server
# patch_server_token = ""
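The patch server itself is not part of this commit; the only contract visible here is `upload_patch`'s POST of `{"content", "path"}` with a Bearer token, with the patch later reachable at `<endpoint>/<path>`. A minimal, illustrative FastAPI sketch that satisfies that contract:

```
# Illustrative patch server matching upload_patch's contract; not part
# of this commit. POST /patch stores a patch, GET /patch/<path> serves it.
from fastapi import FastAPI, Header, HTTPException
from fastapi.responses import PlainTextResponse
from pydantic import BaseModel

app = FastAPI()
PATCHES: dict[str, str] = {}
TOKEN = "change-me"  # must match gerrit.patch_server_token

class Patch(BaseModel):
    content: str
    path: str

@app.post("/patch")
def save_patch(patch: Patch, authorization: str = Header(None)):
    if authorization != f"Bearer {TOKEN}":
        raise HTTPException(status_code=401)
    PATCHES[patch.path] = patch.content
    return {"ok": True}

@app.get("/patch/{path:path}", response_class=PlainTextResponse)
def get_patch(path: str):
    if path not in PATCHES:
        raise HTTPException(status_code=404)
    return PATCHES[path]
```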

[pr_similar_issue]
skip_comments = false
force_update_dataset = false