Merge remote-tracking branch 'origin/main' into ok/update_readme

Ori Kotek
2023-07-19 11:14:44 +03:00
7 changed files with 58 additions and 10 deletions


@@ -149,16 +149,15 @@ git clone https://github.com/Codium-ai/pr-agent.git
```
5. Copy the secrets template file and fill in the following:
```
cp pr_agent/settings/.secrets_template.toml pr_agent/settings/.secrets.toml
# Edit .secrets.toml file
```
- Your OpenAI key.
- Copy your app's private key to the private_key field.
- Copy your app's ID to the app_id field.
- Copy your app's webhook secret to the webhook_secret field.
- Set deployment_type to 'app' in [configuration.toml](./pr_agent/settings/configuration.toml)
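
For orientation, a hypothetical filled-in excerpt of `.secrets.toml` is sketched below. The field names come from the bullets above, while the section names are assumptions; `.secrets_template.toml` is the authoritative layout:

```toml
# Hypothetical example values; section names are assumed. Copy the real layout
# from .secrets_template.toml and the values from your GitHub App settings.
[openai]
key = "sk-..."

[github]
private_key = """
-----BEGIN RSA PRIVATE KEY-----
...
-----END RSA PRIVATE KEY-----
"""
app_id = 123456
webhook_secret = "<your-webhook-secret>"
```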
> The .secrets.toml file is not copied to the Docker image by default and is only used for local development.
> If you want to use the .secrets.toml file in your Docker image, remove it from the .dockerignore file.
@@ -189,6 +188,7 @@ docker push codiumai/pr-agent:github_app # Push to your Docker repository

7. Host the app using a server, serverless function, or container environment. Alternatively, for development and
debugging, you may use tools like smee.io to forward webhooks to your local machine.
See [Deploy as a Lambda Function](#deploy-as-a-lambda-function) below for a serverless setup.
8. Go back to your app's settings, and set the following:
@@ -198,3 +198,20 @@ docker push codiumai/pr-agent:github_app # Push to your Docker repository
9. Install the app by navigating to the "Install App" tab and selecting your desired repositories.
---
#### Deploy as a Lambda Function
1. Follow steps 1-5 of [Method 5](#method-5-run-as-a-github-app).
2. Build a Docker image that can be used as a Lambda function:
```shell
docker buildx build --platform=linux/amd64 . -t codiumai/pr-agent:serverless -f docker/Dockerfile.lambda
```
3. Push the image to AWS ECR (you may need to authenticate Docker to your registry first, e.g. with `aws ecr get-login-password`):
```shell
docker tag codiumai/pr-agent:serverless <AWS_ACCOUNT>.dkr.ecr.<AWS_REGION>.amazonaws.com/codiumai/pr-agent:serverless
docker push <AWS_ACCOUNT>.dkr.ecr.<AWS_REGION>.amazonaws.com/codiumai/pr-agent:serverless
```
4. Create a Lambda function that uses the uploaded image. Set the Lambda timeout to at least 3 minutes (a scripted sketch of steps 4-6 follows this list).
5. Configure the Lambda function to have a Function URL.
6. Go back to steps 8-9 of [Method 5](#method-5-run-as-a-github-app) with the Function URL as your Webhook URL.
The Webhook URL would look like `https://<LAMBDA_FUNCTION_URL>/api/v1/github_webhooks`.
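
Steps 4-6 can also be scripted. Here is a minimal sketch using boto3; the function name `pr-agent`, the role ARN placeholder, and the add_permission note are assumptions to adapt to your environment:

```python
# Hypothetical sketch of steps 4-6 with boto3; placeholders match the shell
# commands above, and "pr-agent" is an assumed function name.
import boto3

lambda_client = boto3.client("lambda", region_name="<AWS_REGION>")

# Step 4: create the function from the ECR image, with a >= 3 minute timeout.
lambda_client.create_function(
    FunctionName="pr-agent",
    PackageType="Image",
    Code={"ImageUri": "<AWS_ACCOUNT>.dkr.ecr.<AWS_REGION>.amazonaws.com/codiumai/pr-agent:serverless"},
    Role="arn:aws:iam::<AWS_ACCOUNT>:role/<LAMBDA_EXECUTION_ROLE>",
    Timeout=180,
)

# Step 5: expose the function via a Function URL. With AuthType="NONE" you may
# also need add_permission(Action="lambda:InvokeFunctionUrl", Principal="*",
# FunctionUrlAuthType="NONE", ...) so GitHub can reach the endpoint.
url_config = lambda_client.create_function_url_config(
    FunctionName="pr-agent",
    AuthType="NONE",
)

# Step 6: this is the base of the Webhook URL for steps 8-9 of Method 5.
print(url_config["FunctionUrl"] + "api/v1/github_webhooks")
```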


@@ -132,7 +132,7 @@ Here are several ways to install and run PR-Agent:

## How it works

![PR-Agent Tools](https://www.codium.ai/wp-content/uploads/2023/07/codiumai-diagram-v4.jpg)

Check out the [PR Compression strategy](./PR_COMPRESSION.md) page for more details on how we convert a code diff to a manageable LLM prompt.

docker/Dockerfile.lambda (new file)

@@ -0,0 +1,12 @@
FROM public.ecr.aws/lambda/python:3.10

# Build tools needed to compile native Python dependencies
RUN yum update -y && \
    yum install -y gcc python3-devel && \
    yum clean all

ADD requirements.txt .
RUN pip install -r requirements.txt && rm requirements.txt
RUN pip install mangum==0.16.0

COPY pr_agent/ ${LAMBDA_TASK_ROOT}/pr_agent/

# Lambda entrypoint: module path of the handler function
CMD ["pr_agent.servers.serverless.serverless"]


@@ -17,6 +17,7 @@ def convert_to_markdown(output_data: dict) -> str:
"Focused PR": "",
"Security concerns": "🔒",
"General PR suggestions": "💡",
"Insights from user's answers": "📝",
"Code suggestions": "🤖"
}
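
The hunk above only extends the emoji lookup table. As a simplified, hypothetical illustration of how such a table is typically applied when rendering the review sections (the real convert_to_markdown in pr_agent handles more cases):

```python
# Simplified illustration only; not the actual pr_agent implementation.
def convert_to_markdown(output_data: dict) -> str:
    emojis = {
        "Focused PR": "",
        "Security concerns": "🔒",
        "General PR suggestions": "💡",
        "Insights from user's answers": "📝",
        "Code suggestions": "🤖",
    }
    lines = []
    for key, value in output_data.items():
        emoji = emojis.get(key, "")  # fall back to no emoji for unknown keys
        lines.append(f"- {emoji} **{key}:** {value}")
    return "\n".join(lines)


print(convert_to_markdown({"Insights from user's answers": "The PR adds a Lambda entrypoint."}))
```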


@@ -10,7 +10,7 @@ from pr_agent.tools.pr_questions import PRQuestions
from pr_agent.tools.pr_reviewer import PRReviewer
def run(args=None):
parser = argparse.ArgumentParser(description='AI based pull request analyzer', usage="""\
Usage: cli.py --pr-url <URL on supported git hosting service> <command> [<args>].
For example:
@@ -35,7 +35,7 @@ reflect - Ask the PR author questions about the PR.
'reflect', 'review_after_reflect'],
default='review')
parser.add_argument('rest', nargs=argparse.REMAINDER, default=[])
args = parser.parse_args(args)
logging.basicConfig(level=os.environ.get("LOGLEVEL", "INFO"))
command = args.command.lower()
if command in ['ask', 'ask_question']:
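
This hunk changes run() to accept an optional argument list (previously it always parsed sys.argv), so the parser can be driven programmatically, e.g. from the serverless entrypoint. A hypothetical invocation; the flag spelling follows the usage string above:

```python
from pr_agent.cli import run

# Explicit argument list, as a wrapper (e.g. a Lambda handler) might pass it.
run(["--pr-url", "https://github.com/<owner>/<repo>/pull/<number>", "review"])

# Calling run() with no arguments still parses sys.argv, so CLI behavior is unchanged.
```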


@@ -0,0 +1,18 @@
import logging

from fastapi import FastAPI
from mangum import Mangum

from pr_agent.servers.github_app import router

logger = logging.getLogger()
logger.setLevel(logging.DEBUG)

app = FastAPI()
app.include_router(router)

# Mangum adapts the ASGI app to the Lambda event/response format;
# lifespan="off" skips ASGI startup/shutdown events on each invocation.
handler = Mangum(app, lifespan="off")


def serverless(event, context):
    return handler(event, context)
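
A rough in-process smoke test of the handler is sketched below; the event is a hand-built, minimal approximation of the HTTP API v2 payload a Function URL sends, so treat it as an assumption rather than a real payload:

```python
# Hypothetical local check; assumes pr_agent and its settings are importable.
from pr_agent.servers.serverless import serverless

event = {
    "version": "2.0",
    "routeKey": "$default",
    "rawPath": "/",
    "rawQueryString": "",
    "headers": {"host": "localhost"},
    "requestContext": {
        "http": {"method": "GET", "path": "/", "sourceIp": "127.0.0.1", "protocol": "HTTP/1.1"},
        "stage": "$default",
    },
    "isBase64Encoded": False,
}

response = serverless(event, context=None)
print(response["statusCode"])  # expect 404 here, since no route is mounted at "/"
```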


@@ -24,7 +24,7 @@ class PRReviewer:
self.is_answer = is_answer
if self.is_answer and not self.git_provider.is_supported("get_issue_comments"):
raise Exception(f"Answer mode is not supported for {settings.config.git_provider} for now")
answer_str, question_str = self._get_user_answers()
self.ai_handler = AiHandler()
self.patches_diff = None
self.prediction = None
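
This hunk replaces a chained assignment (`answer_str = question_str = self._get_user_answers()`) with tuple unpacking. A generic illustration of the difference, not pr-agent code:

```python
def get_user_answers():
    return "answer", "question"

# Chained assignment binds BOTH names to the whole tuple:
a = b = get_user_answers()
assert a == ("answer", "question") and b == ("answer", "question")

# Tuple unpacking binds each name to one element, which is what the fix intends:
answer_str, question_str = get_user_answers()
assert answer_str == "answer" and question_str == "question"
```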