Merge remote-tracking branch 'origin/main' into tr/final_fixes

mrT23
2023-10-29 14:38:33 +02:00
16 changed files with 290 additions and 67 deletions

View File

@@ -31,6 +31,8 @@ CodiumAI `PR-Agent` is an open-source tool aiming to help developers review pull
 **Find Similar Issue ([`/similar_issue`](./docs/SIMILAR_ISSUE.md))**: Automatically retrieves and presents similar issues
 \
 **Add Documentation ([`/add_docs`](./docs/ADD_DOCUMENTATION.md))**: Automatically adds documentation to un-documented functions/classes in the PR.
+\
+**Generate Custom Labels ([`/generate_labels`](./docs/GENERATE_CUSTOM_LABELS.md))**: Automatically suggests custom labels based on the PR code changes.
 
 See the [Installation Guide](./INSTALL.md) for instructions how to install and run the tool on different platforms.
@@ -115,6 +117,7 @@ See the [Tools Guide](./docs/TOOLS_GUIDE.md) for detailed description of the dif
 | | Update CHANGELOG.md | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | | |
 | | Find similar issue | :white_check_mark: | | | | | |
 | | Add Documentation | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | | :white_check_mark: |
+| | Generate Labels | :white_check_mark: | :white_check_mark: | | | | |
 | | | | | | | | |
 | USAGE | CLI | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: | :white_check_mark: |
 | | App / webhook | :white_check_mark: | :white_check_mark: | | | |

View File

@@ -26,7 +26,7 @@ Under the section 'pr_description', the [configuration file](./../pr_agent/setti
 - `keep_original_user_title`: if set to true, the tool will keep the original PR title, and won't change it. Default is false.
 - `extra_instructions`: Optional extra instructions to the tool. For example: "focus on the changes in the file X. Ignore change in ...".
+- To enable `custom labels`, apply the configuration changes described [here](./GENERATE_CUSTOM_LABELS.md#configuration-changes)
 
 ### Markers template
 markers enable to easily integrate user's content and auto-generated content, with a template-like mechanism.

View File

@@ -1,5 +1,5 @@
 # Generate Custom Labels
-The `generte_labels` tool scans the PR code changes, and given a list of labels and their descriptions, it automatically suggests labels that match the PR code changes.
+The `generate_labels` tool scans the PR code changes, and given a list of labels and their descriptions, it automatically suggests labels that match the PR code changes.
 It can be invoked manually by commenting on any PR:
 ```
@@ -10,26 +10,27 @@ For example:
 If we wish to add detect changes to SQL queries in a given PR, we can add the following custom label along with its description:
 <kbd><img src=./../pics/custom_labels_list.png width="768"></kbd>
-When running the `generte_labels` tool on a PR that includes changes in SQL queries, it will automatically suggest the custom label:
+When running the `generate_labels` tool on a PR that includes changes in SQL queries, it will automatically suggest the custom label:
 <kbd><img src=./../pics/custom_label_published.png width="768"></kbd>
-### Configuration options
-To enable custom labels, you need to add the following configuration to the [custom_labels file](./../pr_agent/settings/custom_labels.toml):
+### How to enable custom labels
+#### CLI
+To enable custom labels, you need to apply the [configuration changes](#configuration-changes) to the [custom_labels file](./../pr_agent/settings/custom_labels.toml):
+#### Github Action and Github App
+To enable custom labels, you need to apply the [configuration changes](#configuration-changes) to the `.pr_agent.toml` file in your repository.
+#### Configuration changes
 - Change `enable_custom_labels` to True: This will turn off the default labels and enable the custom labels provided in the custom_labels.toml file.
-- Add the custom labels to the custom_labels.toml file. It should be formatted as follows:
+- Add the custom labels. It should be formatted as follows:
 ```
+[config]
+enable_custom_labels=true
+
 [custom_labels."Custom Label Name"]
 description = "Description of when AI should suggest this label"
+
+[custom_labels."Custom Label 2"]
+description = "Description of when AI should suggest this label 2"
 ```
-- You can add modify the list to include all the custom labels you wish to use in your repository.
-#### Github Action
-To use the `generte_labels` tool with Github Action:
-- Add the following file to your repository under `env` section in `.github/workflows/pr_agent.yml`
-- Comma separated list of custom labels and their descriptions
-- The number of labels and descriptions should be the same and in the same order (empty descriptions are allowed):
-```
-CUSTOM_LABELS: "label1, label2, ..."
-CUSTOM_LABELS_DESCRIPTION: "label1 description, label2 description, ..."
-```
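
Illustrative only (pr-agent's actual settings loading is Dynaconf-based and not shown in this diff): a minimal Python sketch of how a `custom_labels.toml` of the shape above parses into a label-to-description map.

```
# Hypothetical sketch: parse a custom_labels.toml of the documented shape into
# a {label name: description} map. pr-agent's real loader may differ.
import tomllib  # Python 3.11+; use the third-party `tomli` package on older versions

CONFIG_TEXT = """
[config]
enable_custom_labels=true

[custom_labels."Custom Label Name"]
description = "Description of when AI should suggest this label"

[custom_labels."Custom Label 2"]
description = "Description of when AI should suggest this label 2"
"""

data = tomllib.loads(CONFIG_TEXT)
if data["config"]["enable_custom_labels"]:
    labels = {name: entry.get("description", "") for name, entry in data["custom_labels"].items()}
    print(labels)
    # {'Custom Label Name': 'Description of when AI should suggest this label',
    #  'Custom Label 2': 'Description of when AI should suggest this label 2'}
```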

View File

@@ -25,6 +25,7 @@ Under the section 'pr_reviewer', the [configuration file](./../pr_agent/settings
 - `inline_code_comments`: if set to true, the tool will publish the code suggestions as comments on the code diff. Default is false.
 - `automatic_review`: if set to false, no automatic reviews will be done. Default is true.
 - `extra_instructions`: Optional extra instructions to the tool. For example: "focus on the changes in the file X. Ignore change in ...".
+- To enable `custom labels`, apply the configuration changes described [here](./GENERATE_CUSTOM_LABELS.md#configuration-changes)
 
 #### Incremental Mode
 For an incremental review, which only considers changes since the last PR-Agent review, this can be useful when working on the PR in an iterative manner, and you want to focus on the changes since the last review instead of reviewing the entire PR again, the following command can be used:
 ```

View File

@@ -6,5 +6,6 @@
 - [SIMILAR_ISSUE](./SIMILAR_ISSUE.md)
 - [UPDATE CHANGELOG](./UPDATE_CHANGELOG.md)
 - [ADD DOCUMENTATION](./ADD_DOCUMENTATION.md)
+- [GENERATE CUSTOM LABELS](./GENERATE_CUSTOM_LABELS.md)
 
 See the **[installation guide](/INSTALL.md)** for instructions on how to setup PR-Agent.

View File

@@ -142,10 +142,15 @@ class BitbucketProvider(GitProvider):
     def remove_initial_comment(self):
         try:
             for comment in self.temp_comments:
-                self.pr.delete(f"comments/{comment}")
+                self.remove_comment(comment)
         except Exception as e:
             get_logger().exception(f"Failed to remove temp comments, error: {e}")
 
+    def remove_comment(self, comment):
+        try:
+            self.pr.delete(f"comments/{comment}")
+        except Exception as e:
+            get_logger().exception(f"Failed to remove comment, error: {e}")
+
     # funtion to create_inline_comment
     def create_inline_comment(self, body: str, relevant_file: str, relevant_line_in_file: str):

View File

@@ -221,6 +221,9 @@ class CodeCommitProvider(GitProvider):
     def remove_initial_comment(self):
         return "" # not implemented yet
 
+    def remove_comment(self, comment):
+        return "" # not implemented yet
+
     def publish_inline_comment(self, body: str, relevant_file: str, relevant_line_in_file: str):
         # https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/codecommit/client/post_comment_for_compared_commit.html
         raise NotImplementedError("CodeCommit provider does not support publishing inline comments yet")

View File

@@ -396,5 +396,8 @@ class GerritProvider(GitProvider):
         # shutil.rmtree(self.repo_path)
         pass
 
+    def remove_comment(self, comment):
+        pass
+
     def get_pr_branch(self):
         return self.repo.head

View File

@@ -71,6 +71,10 @@ class GitProvider(ABC):
     def remove_initial_comment(self):
         pass
 
+    @abstractmethod
+    def remove_comment(self, comment):
+        pass
+
     @abstractmethod
     def get_languages(self):
         pass
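
For clarity, a minimal sketch of the contract the new abstract method introduces (`EchoProvider` is hypothetical): every concrete provider must now implement `remove_comment`, as each provider file in this commit does.

```
# Sketch of the new GitProvider contract; EchoProvider is hypothetical.
from abc import ABC, abstractmethod

class GitProvider(ABC):
    @abstractmethod
    def remove_comment(self, comment):
        pass

class EchoProvider(GitProvider):
    def remove_comment(self, comment):
        # a no-op implementation, like the Gerrit provider above
        pass

EchoProvider().remove_comment(None)  # fine
# A subclass that omits the override raises TypeError on instantiation.
```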

View File

@@ -50,7 +50,7 @@ class GithubProvider(GitProvider):
     def get_incremental_commits(self):
         self.commits = list(self.pr.get_commits())
 
-        self.get_previous_review()
+        self.previous_review = self.get_previous_review(full=True, incremental=True)
         if self.previous_review:
             self.incremental.commits_range = self.get_commit_range()
             # Get all files changed during the commit range
@@ -63,7 +63,7 @@ class GithubProvider(GitProvider):
     def get_commit_range(self):
         last_review_time = self.previous_review.created_at
-        first_new_commit_index = 0
+        first_new_commit_index = None
         for index in range(len(self.commits) - 1, -1, -1):
             if self.commits[index].commit.author.date > last_review_time:
                 self.incremental.first_new_commit_sha = self.commits[index].sha
@@ -71,15 +71,21 @@ class GithubProvider(GitProvider):
             else:
                 self.incremental.last_seen_commit_sha = self.commits[index].sha
                 break
-        return self.commits[first_new_commit_index:]
+        return self.commits[first_new_commit_index:] if first_new_commit_index is not None else []
 
-    def get_previous_review(self):
-        self.previous_review = None
-        self.comments = list(self.pr.get_issue_comments())
+    def get_previous_review(self, *, full: bool, incremental: bool):
+        if not (full or incremental):
+            raise ValueError("At least one of full or incremental must be True")
+        if not getattr(self, "comments", None):
+            self.comments = list(self.pr.get_issue_comments())
+        prefixes = []
+        if full:
+            prefixes.append("## PR Analysis")
+        if incremental:
+            prefixes.append("## Incremental PR Review")
         for index in range(len(self.comments) - 1, -1, -1):
-            if self.comments[index].body.startswith("## PR Analysis") or self.comments[index].body.startswith("## Incremental PR Review"):
-                self.previous_review = self.comments[index]
-                break
+            if any(self.comments[index].body.startswith(prefix) for prefix in prefixes):
+                return self.comments[index]
 
     def get_files(self):
         if self.incremental.is_incremental and self.file_set:
@@ -218,10 +224,16 @@ class GithubProvider(GitProvider):
         try:
             for comment in getattr(self.pr, 'comments_list', []):
                 if comment.is_temporary:
-                    comment.delete()
+                    self.remove_comment(comment)
         except Exception as e:
             get_logger().exception(f"Failed to remove initial comment, error: {e}")
 
+    def remove_comment(self, comment):
+        try:
+            comment.delete()
+        except Exception as e:
+            get_logger().exception(f"Failed to remove comment, error: {e}")
+
     def get_title(self):
         return self.pr.title
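
A runnable miniature of the reworked lookup, to show the new keyword-only API and return-instead-of-stash behavior (`FakeComment`/`FakeProvider` are hypothetical stand-ins for the PyGithub objects):

```
# Hypothetical miniature of the new get_previous_review logic.
class FakeComment:
    def __init__(self, body):
        self.body = body

class FakeProvider:
    comments = [FakeComment("## PR Analysis ..."), FakeComment("## Incremental PR Review ...")]

    def get_previous_review(self, *, full: bool, incremental: bool):
        if not (full or incremental):
            raise ValueError("At least one of full or incremental must be True")
        prefixes = (["## PR Analysis"] if full else []) + (["## Incremental PR Review"] if incremental else [])
        for comment in reversed(self.comments):  # newest first
            if any(comment.body.startswith(p) for p in prefixes):
                return comment  # returns None implicitly when nothing matches

provider = FakeProvider()
print(provider.get_previous_review(full=True, incremental=True).body)   # latest review of either kind
print(provider.get_previous_review(full=True, incremental=False).body)  # most recent full review only
```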

View File

@@ -287,10 +287,16 @@ class GitLabProvider(GitProvider):
     def remove_initial_comment(self):
         try:
             for comment in self.temp_comments:
-                comment.delete()
+                self.remove_comment(comment)
         except Exception as e:
             get_logger().exception(f"Failed to remove temp comments, error: {e}")
 
+    def remove_comment(self, comment):
+        try:
+            comment.delete()
+        except Exception as e:
+            get_logger().exception(f"Failed to remove comment, error: {e}")
+
     def get_title(self):
         return self.mr.title

View File

@@ -140,6 +140,9 @@ class LocalGitProvider(GitProvider):
     def remove_initial_comment(self):
         pass # Not applicable to the local git provider, but required by the interface
 
+    def remove_comment(self, comment):
+        pass # Not applicable to the local git provider, but required by the interface
+
     def get_languages(self):
         """
         Calculate percentage of languages in repository. Used for hunk prioritisation.

View File

@@ -1,7 +1,7 @@
 import copy
 import os
-import time
+import asyncio.locks
-from typing import Any, Dict
+from typing import Any, Dict, List, Tuple
 
 import uvicorn
 from fastapi import APIRouter, FastAPI, HTTPException, Request, Response
@@ -14,8 +14,9 @@ from pr_agent.algo.utils import update_settings_from_args
 from pr_agent.config_loader import get_settings, global_settings
 from pr_agent.git_providers import get_git_provider
 from pr_agent.git_providers.utils import apply_repo_settings
+from pr_agent.git_providers.git_provider import IncrementalPR
 from pr_agent.log import LoggingFormat, get_logger, setup_logger
-from pr_agent.servers.utils import verify_signature
+from pr_agent.servers.utils import verify_signature, DefaultDictWithTimeout
 
 setup_logger(fmt=LoggingFormat.JSON)
@@ -47,6 +48,7 @@ async def handle_marketplace_webhooks(request: Request, response: Response):
     body = await get_body(request)
     get_logger().info(f'Request body:\n{body}')
 
 async def get_body(request):
     try:
         body = await request.json()
@@ -61,7 +63,9 @@ async def get_body(request):
     return body
 
-_duplicate_requests_cache = {}
+_duplicate_requests_cache = DefaultDictWithTimeout(ttl=get_settings().github_app.duplicate_requests_cache_ttl)
+_duplicate_push_triggers = DefaultDictWithTimeout(ttl=get_settings().github_app.push_trigger_pending_tasks_ttl)
+_pending_task_duplicate_push_conditions = DefaultDictWithTimeout(asyncio.locks.Condition, ttl=get_settings().github_app.push_trigger_pending_tasks_ttl)
 
 async def handle_request(body: Dict[str, Any], event: str):
@@ -109,26 +113,99 @@ async def handle_request(body: Dict[str, Any], event: str):
     # handle pull_request event:
     # automatically review opened/reopened/ready_for_review PRs as long as they're not in draft,
     # as well as direct review requests from the bot
-    elif event == 'pull_request':
-        pull_request = body.get("pull_request")
-        if not pull_request:
-            return {}
-        api_url = pull_request.get("url")
-        if not api_url:
-            return {}
-        log_context["api_url"] = api_url
-        if pull_request.get("draft", True) or pull_request.get("state") != "open" or pull_request.get("user", {}).get("login", "") == bot_user:
+    elif event == 'pull_request' and action != 'synchronize':
+        pull_request, api_url = _check_pull_request_event(action, body, log_context, bot_user)
+        if not (pull_request and api_url):
             return {}
         if action in get_settings().github_app.handle_pr_actions:
             if action == "review_requested":
                 if body.get("requested_reviewer", {}).get("login", "") != bot_user:
                     return {}
-                if pull_request.get("created_at") == pull_request.get("updated_at"):
-                    # avoid double reviews when opening a PR for the first time
-                    return {}
-            get_logger().info(f"Performing review because of event={event} and action={action}")
+            get_logger().info(f"Performing review for {api_url=} because of {event=} and {action=}")
+            await _perform_commands(get_settings().github_app.pr_commands, agent, body, api_url, log_context)
+
+    # handle pull_request event with synchronize action - "push trigger" for new commits
+    elif event == 'pull_request' and action == 'synchronize' and get_settings().github_app.handle_push_trigger:
+        pull_request, api_url = _check_pull_request_event(action, body, log_context, bot_user)
+        if not (pull_request and api_url):
+            return {}
+
+        # TODO: do we still want to get the list of commits to filter bot/merge commits?
+        before_sha = body.get("before")
+        after_sha = body.get("after")
+        merge_commit_sha = pull_request.get("merge_commit_sha")
+        if before_sha == after_sha:
+            return {}
+        if get_settings().github_app.push_trigger_ignore_merge_commits and after_sha == merge_commit_sha:
+            return {}
+        if get_settings().github_app.push_trigger_ignore_bot_commits and body.get("sender", {}).get("login", "") == bot_user:
+            return {}
+
+        # Prevent triggering multiple times for subsequent push triggers when one is enough:
+        # The first push will trigger the processing, and if there's a second push in the meanwhile it will wait.
+        # Any more events will be discarded, because they will all trigger the exact same processing on the PR.
+        # We let the second event wait instead of discarding it because while the first event was being processed,
+        # more commits may have been pushed that led to the subsequent events,
+        # so we keep just one waiting as a delegate to trigger the processing for the new commits when done waiting.
+        current_active_tasks = _duplicate_push_triggers.setdefault(api_url, 0)
+        max_active_tasks = 2 if get_settings().github_app.push_trigger_pending_tasks_backlog else 1
+        if current_active_tasks < max_active_tasks:
+            # first task can enter, and second tasks too if backlog is enabled
+            get_logger().info(
+                f"Continue processing push trigger for {api_url=} because there are {current_active_tasks} active tasks"
+            )
+            _duplicate_push_triggers[api_url] += 1
+        else:
+            get_logger().info(
+                f"Skipping push trigger for {api_url=} because another event already triggered the same processing"
+            )
+            return {}
+        async with _pending_task_duplicate_push_conditions[api_url]:
+            if current_active_tasks == 1:
+                # second task waits
+                get_logger().info(
+                    f"Waiting to process push trigger for {api_url=} because the first task is still in progress"
+                )
+                await _pending_task_duplicate_push_conditions[api_url].wait()
+                get_logger().info(f"Finished waiting to process push trigger for {api_url=} - continue with flow")
+
+        try:
+            if get_settings().github_app.push_trigger_wait_for_initial_review and not get_git_provider()(api_url, incremental=IncrementalPR(True)).previous_review:
+                get_logger().info(f"Skipping incremental review because there was no initial review for {api_url=} yet")
+                return {}
+            get_logger().info(f"Performing incremental review for {api_url=} because of {event=} and {action=}")
+            await _perform_commands(get_settings().github_app.push_commands, agent, body, api_url, log_context)
+        finally:
+            # release the waiting task block
+            async with _pending_task_duplicate_push_conditions[api_url]:
+                _pending_task_duplicate_push_conditions[api_url].notify(1)
+                _duplicate_push_triggers[api_url] -= 1
+
+    get_logger().info("event or action does not require handling")
+    return {}
+
+
+def _check_pull_request_event(action: str, body: dict, log_context: dict, bot_user: str) -> Tuple[Dict[str, Any], str]:
+    invalid_result = {}, ""
+    pull_request = body.get("pull_request")
+    if not pull_request:
+        return invalid_result
+    api_url = pull_request.get("url")
+    if not api_url:
+        return invalid_result
+    log_context["api_url"] = api_url
+    if pull_request.get("draft", True) or pull_request.get("state") != "open" or pull_request.get("user", {}).get("login", "") == bot_user:
+        return invalid_result
+    if action in ("review_requested", "synchronize") and pull_request.get("created_at") == pull_request.get("updated_at"):
+        # avoid double reviews when opening a PR for the first time
+        return invalid_result
+    return pull_request, api_url
+
+
+async def _perform_commands(commands: List[str], agent: PRAgent, body: dict, api_url: str, log_context: dict):
     apply_repo_settings(api_url)
-    for command in get_settings().github_app.pr_commands:
+    for command in commands:
         split_command = command.split(" ")
         command = split_command[0]
         args = split_command[1:]
@@ -139,9 +216,6 @@ async def handle_request(body: Dict[str, Any], event: str):
         with get_logger().contextualize(**log_context):
             await agent.handle_request(api_url, new_command)
 
-    get_logger().info("event or action does not require handling")
-    return {}
-
 def _is_duplicate_request(body: Dict[str, Any]) -> bool:
     """
@@ -150,13 +224,8 @@ def _is_duplicate_request(body: Dict[str, Any]) -> bool:
     """
     request_hash = hash(str(body))
     get_logger().info(f"request_hash: {request_hash}")
-    request_time = time.monotonic()
-    ttl = get_settings().github_app.duplicate_requests_cache_ttl # in seconds
-    to_delete = [key for key, key_time in _duplicate_requests_cache.items() if request_time - key_time > ttl]
-    for key in to_delete:
-        del _duplicate_requests_cache[key]
-    is_duplicate = request_hash in _duplicate_requests_cache
-    _duplicate_requests_cache[request_hash] = request_time
+    is_duplicate = _duplicate_requests_cache.get(request_hash, False)
+    _duplicate_requests_cache[request_hash] = True
     if is_duplicate:
         get_logger().info(f"Ignoring duplicate request {request_hash}")
     return is_duplicate
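
The push-trigger throttling above is easier to see in isolation. A standalone sketch of the same pattern (all names are illustrative, and state is simplified to one PR key): the first event processes immediately, at most one later event waits as a delegate for newer commits, and everything else is discarded.

```
import asyncio

_active_tasks = 0                 # tasks running or waiting for this PR
_condition = asyncio.Condition()

async def handle_push(name: str) -> None:
    global _active_tasks
    if _active_tasks >= 2:        # one running + one pending is enough
        print(f"{name}: discarded, identical processing already queued")
        return
    arrived_while_busy = _active_tasks == 1
    _active_tasks += 1
    async with _condition:
        if arrived_while_busy:    # second event waits for the first to finish
            await _condition.wait()
    try:
        print(f"{name}: processing latest commits")
        await asyncio.sleep(0.1)  # stand-in for the incremental review work
    finally:
        async with _condition:
            _condition.notify(1)  # release the waiting delegate, if any
        _active_tasks -= 1

async def main() -> None:
    await asyncio.gather(*(handle_push(f"push-{i}") for i in range(4)))

asyncio.run(main())
# push-0: processing latest commits
# push-2: discarded, identical processing already queued
# push-3: discarded, identical processing already queued
# push-1: processing latest commits
```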

View File

@@ -1,5 +1,8 @@
 import hashlib
 import hmac
+import time
+from collections import defaultdict
+from typing import Callable, Any
 
 from fastapi import HTTPException
@@ -25,3 +28,59 @@ def verify_signature(payload_body, secret_token, signature_header):
 class RateLimitExceeded(Exception):
     """Raised when the git provider API rate limit has been exceeded."""
     pass
+
+
+class DefaultDictWithTimeout(defaultdict):
+    """A defaultdict with a time-to-live (TTL)."""
+
+    def __init__(
+        self,
+        default_factory: Callable[[], Any] = None,
+        ttl: int = None,
+        refresh_interval: int = 60,
+        update_key_time_on_get: bool = True,
+        *args,
+        **kwargs,
+    ):
+        """
+        Args:
+            default_factory: The default factory to use for keys that are not in the dictionary.
+            ttl: The time-to-live (TTL) in seconds.
+            refresh_interval: How often to refresh the dict and delete items older than the TTL.
+            update_key_time_on_get: Whether to update the access time of a key also on get (or only when set).
+        """
+        super().__init__(default_factory, *args, **kwargs)
+        self.__key_times = dict()
+        self.__ttl = ttl
+        self.__refresh_interval = refresh_interval
+        self.__update_key_time_on_get = update_key_time_on_get
+        self.__last_refresh = self.__time() - self.__refresh_interval
+
+    @staticmethod
+    def __time():
+        return time.monotonic()
+
+    def __refresh(self):
+        if self.__ttl is None:
+            return
+        request_time = self.__time()
+        if request_time - self.__last_refresh > self.__refresh_interval:
+            return
+        to_delete = [key for key, key_time in self.__key_times.items() if request_time - key_time > self.__ttl]
+        for key in to_delete:
+            del self[key]
+        self.__last_refresh = request_time
+
+    def __getitem__(self, __key):
+        if self.__update_key_time_on_get:
+            self.__key_times[__key] = self.__time()
+        self.__refresh()
+        return super().__getitem__(__key)
+
+    def __setitem__(self, __key, __value):
+        self.__key_times[__key] = self.__time()
+        return super().__setitem__(__key, __value)
+
+    def __delitem__(self, __key):
+        del self.__key_times[__key]
+        return super().__delitem__(__key)
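
A short usage sketch for `DefaultDictWithTimeout`, mirroring how `github_app.py` above wires it up as the duplicate-request cache (the TTL value here is arbitrary):

```
from pr_agent.servers.utils import DefaultDictWithTimeout

cache = DefaultDictWithTimeout(ttl=10)  # entries expire ~10s after their last recorded access

request_hash = hash("webhook body")
print(cache.get(request_hash, False))   # False: first time this request is seen
cache[request_hash] = True              # records the value and the key's timestamp
print(cache.get(request_hash, False))   # True: a retry within the TTL is a duplicate
# Note: dict.get() bypasses __getitem__, so only [] reads and writes touch the
# key timestamps; stale keys are swept on [] reads, subject to refresh_interval.
```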

View File

@@ -24,6 +24,7 @@ num_code_suggestions=4
 inline_code_comments = false
 ask_and_reflect=false
 automatic_review=true
+remove_previous_review_comment=false
 extra_instructions = ""
 
 [pr_description] # /describe #
@@ -86,6 +87,27 @@ pr_commands = [
     "/describe --pr_description.add_original_user_description=true --pr_description.keep_original_user_title=true",
     "/auto_review",
 ]
+# settings for "pull_request" event with "synchronize" action - used to detect and handle push triggers for new commits
+handle_push_trigger = false
+push_trigger_ignore_bot_commits = true
+push_trigger_ignore_merge_commits = true
+push_trigger_wait_for_initial_review = true
+push_trigger_pending_tasks_backlog = true
+push_trigger_pending_tasks_ttl = 300
+push_commands = [
+    "/describe --pr_description.add_original_user_description=true --pr_description.keep_original_user_title=true",
+    """/auto_review -i \
+    --pr_reviewer.require_focused_review=false \
+    --pr_reviewer.require_score_review=false \
+    --pr_reviewer.require_tests_review=false \
+    --pr_reviewer.require_security_review=false \
+    --pr_reviewer.require_estimate_effort_to_review=false \
+    --pr_reviewer.num_code_suggestions=0 \
+    --pr_reviewer.inline_code_comments=false \
+    --pr_reviewer.remove_previous_review_comment=true \
+    --pr_reviewer.extra_instructions='' \
+    """
+]
 
 [gitlab]
 # URL to the gitlab service

View File

@@ -100,6 +100,9 @@ class PRReviewer:
         if self.is_auto and not get_settings().pr_reviewer.automatic_review:
             get_logger().info(f'Automatic review is disabled {self.pr_url}')
             return None
+        if self.is_auto and self.incremental.is_incremental and not self.incremental.first_new_commit_sha:
+            get_logger().info(f"Incremental review is enabled for {self.pr_url} but there are no new commits")
+            return None
 
         get_logger().info(f'Reviewing PR: {self.pr_url} ...')
@@ -113,9 +116,10 @@ class PRReviewer:
             if get_settings().config.publish_output:
                 get_logger().info('Pushing PR review...')
+                previous_review_comment = self._get_previous_review_comment()
                 self.git_provider.publish_comment(pr_comment)
                 self.git_provider.remove_initial_comment()
+                self._remove_previous_review_comment(previous_review_comment)
                 if get_settings().pr_reviewer.inline_code_comments:
                     get_logger().info('Pushing inline code comments...')
                     self._publish_inline_code_comments()
@@ -231,9 +235,13 @@ class PRReviewer:
         if self.incremental.is_incremental:
             last_commit_url = f"{self.git_provider.get_pr_url()}/commits/" \
                               f"{self.git_provider.incremental.first_new_commit_sha}"
+            last_commit_msg = self.incremental.commits_range[0].commit.message if self.incremental.commits_range else ""
+            incremental_review_markdown_text = f"Starting from commit {last_commit_url}"
+            if last_commit_msg:
+                incremental_review_markdown_text += f" \n_({last_commit_msg.splitlines(keepends=False)[0]})_"
             data = OrderedDict(data)
             data.update({'Incremental PR Review': {
-                "⏮️ Review for commits since previous PR-Agent review": f"Starting from commit {last_commit_url}"}})
+                "⏮️ Review for commits since previous PR-Agent review": incremental_review_markdown_text}})
             data.move_to_end('Incremental PR Review', last=False)
 
         markdown_text = convert_to_markdown(data, self.git_provider.is_supported("gfm_markdown"))
@@ -314,3 +322,26 @@ class PRReviewer:
                 break
 
         return question_str, answer_str
+
+    def _get_previous_review_comment(self):
+        """
+        Get the previous review comment if it exists.
+        """
+        try:
+            if get_settings().pr_reviewer.remove_previous_review_comment and hasattr(self.git_provider, "get_previous_review"):
+                return self.git_provider.get_previous_review(
+                    full=not self.incremental.is_incremental,
+                    incremental=self.incremental.is_incremental,
+                )
+        except Exception as e:
+            get_logger().exception(f"Failed to get previous review comment, error: {e}")
+
+    def _remove_previous_review_comment(self, comment):
+        """
+        Remove the previous review comment if it exists.
+        """
+        try:
+            if get_settings().pr_reviewer.remove_previous_review_comment and comment:
+                self.git_provider.remove_comment(comment)
+        except Exception as e:
+            get_logger().exception(f"Failed to remove previous review comment, error: {e}")