diff --git a/docker/Dockerfile.lambda b/docker/Dockerfile.lambda
index 46e777ad..7ac6879a 100644
--- a/docker/Dockerfile.lambda
+++ b/docker/Dockerfile.lambda
@@ -1,4 +1,4 @@
-FROM public.ecr.aws/lambda/python:3.12
+FROM public.ecr.aws/lambda/python:3.12 AS base
 
 RUN dnf update -y && \
     dnf install -y gcc python3-devel git && \
@@ -9,4 +9,10 @@ RUN pip install --no-cache-dir . && rm pyproject.toml
 RUN pip install --no-cache-dir mangum==0.17.0
 COPY pr_agent/ ${LAMBDA_TASK_ROOT}/pr_agent/
 
-CMD ["pr_agent.servers.serverless.serverless"]
+FROM base AS github_lambda
+CMD ["pr_agent.servers.github_lambda_webhook.lambda_handler"]
+
+FROM base AS gitlab_lambda
+CMD ["pr_agent.servers.gitlab_lambda_webhook.lambda_handler"]
+
+FROM github_lambda
diff --git a/docs/docs/installation/github.md b/docs/docs/installation/github.md
index 69b34b8a..7ca97a75 100644
--- a/docs/docs/installation/github.md
+++ b/docs/docs/installation/github.md
@@ -187,14 +187,15 @@ For example: `GITHUB.WEBHOOK_SECRET` --> `GITHUB__WEBHOOK_SECRET`
 2. Build a docker image that can be used as a lambda function
 
     ```shell
-    docker buildx build --platform=linux/amd64 . -t codiumai/pr-agent:serverless -f docker/Dockerfile.lambda
+    # Note: --target github_lambda is optional as it's the default target
+    docker buildx build --platform=linux/amd64 . -t codiumai/pr-agent:github_lambda --target github_lambda -f docker/Dockerfile.lambda
     ```
 
 3. Push image to ECR
 
     ```shell
-    docker tag codiumai/pr-agent:serverless <AWS_ACCOUNT>.dkr.ecr.<AWS_REGION>.amazonaws.com/codiumai/pr-agent:serverless
-    docker push <AWS_ACCOUNT>.dkr.ecr.<AWS_REGION>.amazonaws.com/codiumai/pr-agent:serverless
+    docker tag codiumai/pr-agent:github_lambda <AWS_ACCOUNT>.dkr.ecr.<AWS_REGION>.amazonaws.com/codiumai/pr-agent:github_lambda
+    docker push <AWS_ACCOUNT>.dkr.ecr.<AWS_REGION>.amazonaws.com/codiumai/pr-agent:github_lambda
     ```
 
 4. Create a lambda function that uses the uploaded image. Set the lambda timeout to be at least 3m.
diff --git a/docs/docs/installation/gitlab.md b/docs/docs/installation/gitlab.md
index bbcf4027..9fa0291d 100644
--- a/docs/docs/installation/gitlab.md
+++ b/docs/docs/installation/gitlab.md
@@ -61,12 +61,12 @@ git clone https://github.com/qodo-ai/pr-agent.git
    ```
 
 5. Prepare variables and secrets. Skip this step if you plan on setting these as environment variables when running the agent:
-1. In the configuration file/variables:
-   - Set `config.git_provider` to "gitlab"
+    1. In the configuration file/variables:
+        - Set `config.git_provider` to "gitlab"
 
-2. In the secrets file/variables:
-   - Set your AI model key in the respective section
-   - In the [gitlab] section, set `personal_access_token` (with token from step 2) and `shared_secret` (with secret from step 3)
+    2. In the secrets file/variables:
+        - Set your AI model key in the respective section
+        - In the [gitlab] section, set `personal_access_token` (with token from step 2) and `shared_secret` (with secret from step 3)
 
 6. Build a Docker image for the app and optionally push it to a Docker repository. We'll use Dockerhub as an example:
 
@@ -88,3 +88,63 @@ OPENAI__KEY=
 8. Create a webhook in your GitLab project. Set the URL to `http[s]://<PR_AGENT_HOSTNAME>/webhook`, the secret token to the generated secret from step 3, and enable the triggers `push`, `comments` and `merge request events`.
 
 9. Test your installation by opening a merge request or commenting on a merge request using one of PR Agent's commands.
+
+## Deploy as a Lambda Function
+
+Note that since AWS Lambda env vars cannot have "." in the name, you can replace each "." in an env variable with "__".
+For example: `GITLAB.PERSONAL_ACCESS_TOKEN` --> `GITLAB__PERSONAL_ACCESS_TOKEN`
+
+1. Follow steps 1-5 from [Run a GitLab webhook server](#run-a-gitlab-webhook-server).
+2. Build a docker image that can be used as a lambda function
+
+    ```shell
+    docker buildx build --platform=linux/amd64 . -t codiumai/pr-agent:gitlab_lambda --target gitlab_lambda -f docker/Dockerfile.lambda
+    ```
+
+3. Push image to ECR
+
+    ```shell
+    docker tag codiumai/pr-agent:gitlab_lambda <AWS_ACCOUNT>.dkr.ecr.<AWS_REGION>.amazonaws.com/codiumai/pr-agent:gitlab_lambda
+    docker push <AWS_ACCOUNT>.dkr.ecr.<AWS_REGION>.amazonaws.com/codiumai/pr-agent:gitlab_lambda
+    ```
+
+4. Create a lambda function that uses the uploaded image. Set the lambda timeout to be at least 3m.
+5. Configure the lambda function to have a Function URL.
+6. In the environment variables of the Lambda function, specify `AZURE_DEVOPS_CACHE_DIR` to a writable location such as /tmp. (see [link](https://github.com/Codium-ai/pr-agent/pull/450#issuecomment-1840242269))
+7. Go back to steps 8-9 of [Run a GitLab webhook server](#run-a-gitlab-webhook-server) with the function url as your Webhook URL.
+    The Webhook URL would look like `https://<LAMBDA_FUNCTION_URL>/webhook`
+
+### Using AWS Secrets Manager
+
+For production Lambda deployments, use AWS Secrets Manager instead of environment variables:
+
+1. Create individual secrets for each GitLab webhook with this JSON format (e.g., secret name: `project-webhook-secret-001`)
+
+```json
+{
+  "gitlab_token": "glpat-xxxxxxxxxxxxxxxxxxxxxxxx",
+  "token_name": "project-webhook-001"
+}
+```
+
+2. Create a main configuration secret for common settings (e.g., secret name: `pr-agent-main-config`)
+
+```json
+{
+  "openai.key": "sk-proj-xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"
+}
+```
+
+3. Set these environment variables in your Lambda:
+
+```bash
+CONFIG__SECRET_PROVIDER=aws_secrets_manager
+AWS_SECRETS_MANAGER__SECRET_ARN=arn:aws:secretsmanager:us-east-1:123456789012:secret:pr-agent-main-config-AbCdEf
+```
+
+4. In your GitLab webhook configuration, set the **Secret Token** to the **Secret name** created in step 1:
+    - Example: `project-webhook-secret-001`
+
+**Important**: When using Secrets Manager, GitLab's webhook secret must be the Secrets Manager secret name.
+
+5. Add IAM permission `secretsmanager:GetSecretValue` to your Lambda execution role
\ No newline at end of file
diff --git a/pr_agent/servers/serverless.py b/pr_agent/servers/github_lambda_webhook.py
similarity index 91%
rename from pr_agent/servers/serverless.py
rename to pr_agent/servers/github_lambda_webhook.py
index 938be31b..57b5df56 100644
--- a/pr_agent/servers/serverless.py
+++ b/pr_agent/servers/github_lambda_webhook.py
@@ -23,5 +23,5 @@ app.include_router(router)
 handler = Mangum(app, lifespan="off")
 
 
-def serverless(event, context):
-    return handler(event, context)
+def lambda_handler(event, context):
+    return handler(event, context)
\ No newline at end of file
diff --git a/pr_agent/servers/gitlab_lambda_webhook.py b/pr_agent/servers/gitlab_lambda_webhook.py
new file mode 100644
index 00000000..30ca3426
--- /dev/null
+++ b/pr_agent/servers/gitlab_lambda_webhook.py
@@ -0,0 +1,27 @@
+from fastapi import FastAPI
+from mangum import Mangum
+from starlette.middleware import Middleware
+from starlette_context.middleware import RawContextMiddleware
+
+from pr_agent.servers.gitlab_webhook import router
+
+try:
+    from pr_agent.config_loader import apply_secrets_manager_config
+    apply_secrets_manager_config()
+except Exception as e:
+    try:
+        from pr_agent.log import get_logger
+        get_logger().debug(f"AWS Secrets Manager initialization failed, falling back to environment variables: {e}")
+    except:
+        # Fail completely silently if log module is not available
+        pass
+
+middleware = [Middleware(RawContextMiddleware)]
+app = FastAPI(middleware=middleware)
+app.include_router(router)
+
+handler = Mangum(app, lifespan="off")
+
+
+def lambda_handler(event, context):
+    return handler(event, context)
\ No newline at end of file
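
A quick way to sanity-check the new entry points before pushing the image to ECR is to invoke them locally with a hand-built Lambda Function URL event. The sketch below is not part of the change: it assumes `pr_agent`, `fastapi`, and `mangum` are installed locally and that steps 1-5 of the GitLab setup (configuration and secrets) are in place; without valid credentials the app is expected to reject the request, which is still enough to confirm that `lambda_handler` is wired to Mangum and that the router answers on `/webhook` (the path used in the docs above).

```python
# Hypothetical local smoke test -- not part of this PR.
# Assumes pr_agent, fastapi and mangum are importable and a local
# configuration/secrets file exists (steps 1-5 of the GitLab docs above).
import json

from pr_agent.servers.gitlab_lambda_webhook import lambda_handler

# Minimal Lambda Function URL (API Gateway HTTP API v2) event; Mangum
# translates this into an ASGI request for the FastAPI app.
event = {
    "version": "2.0",
    "routeKey": "$default",
    "rawPath": "/webhook",
    "rawQueryString": "",
    "headers": {
        "content-type": "application/json",
        # Webhook secret, or the Secrets Manager secret name when
        # CONFIG__SECRET_PROVIDER=aws_secrets_manager is used.
        "x-gitlab-token": "project-webhook-secret-001",
    },
    "requestContext": {
        "http": {"method": "POST", "path": "/webhook", "sourceIp": "127.0.0.1"}
    },
    "body": json.dumps({"object_kind": "merge_request"}),
    "isBase64Encoded": False,
}

# Mangum does not need a real LambdaContext for a local call.
response = lambda_handler(event, None)
print(response["statusCode"], response.get("body"))
```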
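The Secrets Manager layout documented above can also be checked from a workstation before wiring up the Lambda. This is a hypothetical helper, not something the PR ships: it assumes boto3 with working AWS credentials, and it only verifies that a per-webhook secret contains the two keys shown in the docs (`gitlab_token`, `token_name`); how pr-agent actually consumes them is determined by `apply_secrets_manager_config`.

```python
# Hypothetical helper -- not part of this PR.
# Verifies that a per-webhook secret matches the JSON shape documented above.
# Assumes boto3 is installed and AWS credentials/region are configured.
import json

import boto3

secretsmanager = boto3.client("secretsmanager")


def check_webhook_secret(secret_name: str) -> None:
    """Fetch a secret by name and check it has the documented keys."""
    raw = secretsmanager.get_secret_value(SecretId=secret_name)["SecretString"]
    payload = json.loads(raw)
    missing = {"gitlab_token", "token_name"} - payload.keys()
    if missing:
        raise ValueError(f"{secret_name} is missing keys: {sorted(missing)}")
    print(f"{secret_name}: ok (token_name={payload['token_name']})")


# Example with the illustrative secret name from the docs:
check_webhook_secret("project-webhook-secret-001")
```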