51 changes: 51 additions & 0 deletions .github/workflows/docker-build.yml
@@ -0,0 +1,51 @@
name: Docker Build and Test

on:
  push:
    branches: [ main, develop ]
  pull_request:
    branches: [ main, develop ]

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3

      - name: Build Backend
        run: |
          cd Backend
          docker build -t inpactai-backend:test .
      - name: Build Frontend
        run: |
          cd Frontend
          docker build -t inpactai-frontend:test .
      - name: Start services
        run: |
          docker compose up -d
          sleep 30
Comment on lines +30 to +33
Copilot AI Dec 13, 2025

The CI workflow will fail because it tries to start services without environment files. The docker compose up command on line 32 requires Backend/.env and Frontend/.env files, but these are not created in the workflow. Add steps to create dummy .env files from .env.example before starting services.
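A minimal sketch of what such steps could look like, placed before the "Start services" step; this assumes a Frontend/.env.example exists alongside the Backend/.env.example added in this PR:

      - name: Create dummy env files
        run: |
          cp Backend/.env.example Backend/.env
          cp Frontend/.env.example Frontend/.env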

      - name: Check backend health
        run: |
          curl -f http://localhost:8000/ || exit 1
      - name: Check frontend health
        run: |
          curl -f http://localhost:5173/ || exit 1
Comment on lines +35 to +41
Copilot AI Dec 13, 2025

The health checks will fail because the backend requires valid Supabase credentials to start, which won't be available in the CI environment. Consider adding a skip-database or test mode for CI environments, or mock the Supabase connection for health checks.
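One hedged sketch of what a CI test mode could look like on the backend side; SKIP_DB_INIT is a hypothetical variable, not something this PR defines, and create_tables() is assumed to be the existing startup helper in app/main.py:

import os
from contextlib import asynccontextmanager

from fastapi import FastAPI


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Skip database setup when CI sets SKIP_DB_INIT, so the container can boot
    # and answer health checks without real Supabase/Postgres credentials.
    if os.getenv("SKIP_DB_INIT", "").lower() in ("1", "true", "yes"):
        print("SKIP_DB_INIT set; skipping database initialization.")
    else:
        await create_tables()  # existing helper in app/main.py (assumed)
    yield
    print("App is shutting down...")

The workflow (or docker-compose) would then pass something like SKIP_DB_INIT=1 into the backend container so the health checks only exercise the HTTP layer.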

      - name: Show logs on failure
        if: failure()
        run: |
          docker compose logs
      - name: Cleanup
        if: always()
        run: |
          docker compose down -v
21 changes: 21 additions & 0 deletions Backend/.dockerignore
@@ -0,0 +1,21 @@
__pycache__
*.pyc
*.pyo
*.pyd
.Python
*.so
.env
.venv
env/
venv/
ENV/
.git
.gitignore
.pytest_cache
.coverage
htmlcov/
dist/
build/
*.egg-info/
.DS_Store
*.log
12 changes: 12 additions & 0 deletions Backend/.env.example
@@ -0,0 +1,12 @@
user=postgres
password=your_postgres_password
host=your_postgres_host
port=5432
dbname=postgres
GROQ_API_KEY=your_groq_api_key
SUPABASE_URL=your_supabase_url
SUPABASE_KEY=your_supabase_key
GEMINI_API_KEY=your_gemini_api_key
YOUTUBE_API_KEY=your_youtube_api_key
REDIS_HOST=redis
REDIS_PORT=6379
18 changes: 18 additions & 0 deletions Backend/Dockerfile
@@ -0,0 +1,18 @@
FROM python:3.10-slim

WORKDIR /app

RUN apt-get update && apt-get install -y --no-install-recommends \
gcc \
libpq-dev \
curl \
&& rm -rf /var/lib/apt/lists/*

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

EXPOSE 8000

CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]
33 changes: 33 additions & 0 deletions Backend/Dockerfile.prod
@@ -0,0 +1,33 @@
FROM python:3.10-slim AS builder

WORKDIR /app

RUN apt-get update && apt-get install -y --no-install-recommends \
gcc \
libpq-dev \
&& rm -rf /var/lib/apt/lists/*

COPY requirements.txt .
RUN pip install --no-cache-dir --user -r requirements.txt

FROM python:3.10-slim

WORKDIR /app

RUN apt-get update && apt-get install -y --no-install-recommends \
libpq5 \
Copilot AI Dec 13, 2025

The production Dockerfile doesn't install curl, but the health check in docker-compose.yml requires it. This will cause the health check to fail in production deployments. Add curl to the runtime dependencies or use a different health check method.

Suggested change
-    libpq5 \
+    libpq5 \
+    curl \

&& rm -rf /var/lib/apt/lists/* \
&& groupadd -r appuser && useradd -r -g appuser appuser

COPY --from=builder /root/.local /root/.local
COPY . .

RUN chown -R appuser:appuser /app

USER appuser

ENV PATH=/root/.local/bin:$PATH
Comment on lines +22 to +29
Contributor

⚠️ Potential issue | 🔴 Critical

Package installation path incompatible with non-root user.

The builder stage installs packages to /root/.local (line 22), but the runtime switches to appuser (line 27) who cannot access /root/. The PATH also references /root/.local/bin which will be inaccessible to appuser, causing the application to fail at startup.

Apply this diff to install packages to a shared location:

 FROM python:3.10-slim AS builder
 
 WORKDIR /app
 
 RUN apt-get update && apt-get install -y --no-install-recommends \
     gcc \
     libpq-dev \
     && rm -rf /var/lib/apt/lists/*
 
 COPY requirements.txt .
-RUN pip install --no-cache-dir --user -r requirements.txt
+RUN pip install --no-cache-dir --prefix=/install -r requirements.txt
 
 FROM python:3.10-slim
 
 WORKDIR /app
 
 RUN apt-get update && apt-get install -y --no-install-recommends \
     libpq5 \
     && rm -rf /var/lib/apt/lists/* \
     && groupadd -r appuser && useradd -r -g appuser appuser
 
-COPY --from=builder /root/.local /root/.local
+COPY --from=builder /install /usr/local
 COPY . .
 
 RUN chown -R appuser:appuser /app
 
 USER appuser
 
-ENV PATH=/root/.local/bin:$PATH
-
 EXPOSE 8000
 
 CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]


🤖 Prompt for AI Agents
In Backend/Dockerfile.prod around lines 22 to 29, the runtime copies packages
from /root/.local and then switches to appuser, making /root/.local/bin
inaccessible; change to a shared install location and update PATH: copy or
install build artifacts into a non-root location such as /opt/.local (e.g. COPY
--from=builder /root/.local /opt/.local or update the builder to install
directly to /opt/.local), set ENV PATH=/opt/.local/bin:$PATH, and run chown -R
appuser:appuser /opt/.local so the appuser can access the binaries before
switching USER to appuser.

Comment on lines +22 to +29
Copilot AI Dec 13, 2025

The PATH environment variable points to /root/.local/bin but the application runs as the 'appuser' user (non-root). This means the installed packages in /root/.local won't be accessible. The COPY command should use --chown flag and copy to a location accessible by appuser, or the PATH should be updated accordingly.


EXPOSE 8000

CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
42 changes: 40 additions & 2 deletions Backend/app/main.py
@@ -1,5 +1,6 @@
from fastapi import FastAPI
from fastapi import FastAPI, Request
from fastapi.middleware.cors import CORSMiddleware
from starlette.middleware.base import BaseHTTPMiddleware
from .db.db import engine
from .db.seed import seed_db
from .models import models, chat
@@ -9,13 +10,21 @@
from sqlalchemy.exc import SQLAlchemyError
import logging
import os
import time
from dotenv import load_dotenv
from contextlib import asynccontextmanager
from app.routes import ai

# Load environment variables
load_dotenv()

# Configure logging
logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)


# Async function to create database tables with exception handling
async def create_tables():
@@ -38,13 +47,42 @@ async def lifespan(app: FastAPI):
    print("App is shutting down...")


# Custom middleware for logging and timing
class RequestMiddleware(BaseHTTPMiddleware):
    async def dispatch(self, request: Request, call_next):
        start_time = time.time()

        logger.info(f"Incoming: {request.method} {request.url.path}")

        response = await call_next(request)

        process_time = time.time() - start_time
        response.headers["X-Process-Time"] = str(process_time)
        response.headers["X-Content-Type-Options"] = "nosniff"
        response.headers["X-Frame-Options"] = "DENY"
        response.headers["X-XSS-Protection"] = "1; mode=block"

        logger.info(f"Completed: {request.method} {request.url.path} - {response.status_code} ({process_time:.3f}s)")

        return response
Comment on lines +50 to +67
Contributor

⚠️ Potential issue | 🟠 Major

Add error handling to capture exceptions.

The middleware doesn't handle exceptions that may occur in call_next(). If an unhandled exception is raised in downstream handlers, the completion log and timing headers won't be recorded, making debugging more difficult.

Apply this diff to add error handling:

 class RequestMiddleware(BaseHTTPMiddleware):
     async def dispatch(self, request: Request, call_next):
         start_time = time.time()
         
         logger.info(f"Incoming: {request.method} {request.url.path}")
         
-        response = await call_next(request)
-        
-        process_time = time.time() - start_time
-        response.headers["X-Process-Time"] = str(process_time)
-        response.headers["X-Content-Type-Options"] = "nosniff"
-        response.headers["X-Frame-Options"] = "DENY"
-        response.headers["X-XSS-Protection"] = "1; mode=block"
-        
-        logger.info(f"Completed: {request.method} {request.url.path} - {response.status_code} ({process_time:.3f}s)")
-        
-        return response
+        try:
+            response = await call_next(request)
+            
+            process_time = time.time() - start_time
+            response.headers["X-Process-Time"] = str(process_time)
+            response.headers["X-Content-Type-Options"] = "nosniff"
+            response.headers["X-Frame-Options"] = "DENY"
+            response.headers["X-XSS-Protection"] = "1; mode=block"
+            
+            logger.info(f"Completed: {request.method} {request.url.path} - {response.status_code} ({process_time:.3f}s)")
+            
+            return response
+        except Exception as e:
+            process_time = time.time() - start_time
+            logger.error(f"Failed: {request.method} {request.url.path} - {type(e).__name__}: {e} ({process_time:.3f}s)")
+            raise

Note: The X-XSS-Protection header (line 63) is deprecated in modern browsers and no longer recommended. Consider removing it in a future update.


🤖 Prompt for AI Agents
In Backend/app/main.py around lines 50 to 67, the RequestMiddleware does not
handle exceptions from call_next(), so if downstream handlers raise an error the
completion log and timing/security headers are not recorded; wrap the
call_next(request) in try/except/finally: call call_next inside try, on
exception log the full exception (with stacktrace) and create/return an HTTP 500
response (or re-raise if you prefer) while ensuring headers (X-Process-Time and
security headers) are set before returning, and use finally to compute
process_time and emit the completion log so headers and logs are always recorded
even on errors.


# Initialize FastAPI
app = FastAPI(lifespan=lifespan)

# Add custom middleware
app.add_middleware(RequestMiddleware)

# Add CORS middleware
app.add_middleware(
    CORSMiddleware,
    allow_origins=["http://localhost:5173"],
    allow_origins=[
        "http://localhost:5173",
        "http://localhost:5174",
        "http://localhost:5175",
        "http://localhost:5176",
        "http://frontend:5173",
        "http://127.0.0.1:5173"
    ],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
24 changes: 18 additions & 6 deletions Backend/app/routes/post.py
@@ -18,25 +18,37 @@
import uuid
from datetime import datetime, timezone

# Load environment variables
load_dotenv()
url: str = os.getenv("SUPABASE_URL")
key: str = os.getenv("SUPABASE_KEY")
supabase: Client = create_client(url, key)

url: str = os.getenv("SUPABASE_URL", "")
key: str = os.getenv("SUPABASE_KEY", "")

if not url or not key or "your-" in url:
print("⚠️ Supabase credentials not configured. Some features will be limited.")
Copilot AI Dec 13, 2025

The warning message uses emoji that might not render correctly on Windows terminals. Consider using plain text markers like [WARNING] instead of the ⚠ symbol for better cross-platform compatibility.

Suggested change
-    print("⚠️ Supabase credentials not configured. Some features will be limited.")
+    print("[WARNING] Supabase credentials not configured. Some features will be limited.")

Copilot AI Dec 13, 2025

Print statement may execute during import.
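If import-time side effects are a concern, one hedged alternative (not part of this PR) is to defer client creation to a cached helper so nothing prints or connects when the module is imported; get_supabase_client is a hypothetical name:

from functools import lru_cache
import logging
import os

from supabase import Client, create_client

logger = logging.getLogger(__name__)


@lru_cache(maxsize=1)
def get_supabase_client() -> Client | None:
    # Hypothetical lazy initializer: reads credentials and connects on first call.
    url = os.getenv("SUPABASE_URL", "")
    key = os.getenv("SUPABASE_KEY", "")
    if not url or not key or "your-" in url:
        logger.warning("Supabase credentials not configured. Some features will be limited.")
        return None
    try:
        return create_client(url, key)
    except Exception as exc:
        logger.error("Supabase connection failed: %s", exc)
        return None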

    supabase = None
else:
    try:
        supabase: Client = create_client(url, key)
    except Exception as e:
        print(f"❌ Supabase connection failed: {e}")
Copilot AI Dec 13, 2025

Print statement may execute during import.

        supabase = None
Comment on lines +23 to +34
Copilot AI Dec 13, 2025

The supabase variable type annotation is lost when set to None. The variable is initially typed as 'Client' but then conditionally set to None, which will cause type checking issues. Consider using 'Optional[Client]' type annotation or handle this more explicitly.
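A small self-contained sketch of how the annotation could be kept accurate: declare the module-level client as Optional up front and only assign a real Client on success.

import os
from typing import Optional

from supabase import Client, create_client

url: str = os.getenv("SUPABASE_URL", "")
key: str = os.getenv("SUPABASE_KEY", "")

# Annotate as Optional once, so type checkers accept the None fallback.
supabase: Optional[Client] = None
if url and key and "your-" not in url:
    try:
        supabase = create_client(url, key)
    except Exception as e:
        print(f"Supabase connection failed: {e}")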


# Define Router
router = APIRouter()

# Helper Functions
def generate_uuid():
    return str(uuid.uuid4())

def current_timestamp():
    return datetime.now(timezone.utc).isoformat()

# ========== USER ROUTES ==========
def check_supabase():
    if not supabase:
        raise HTTPException(status_code=503, detail="Database service unavailable. Please configure Supabase credentials.")

@router.post("/users/")
async def create_user(user: UserCreate):
    check_supabase()
Comment on lines +45 to +51
Contributor

⚠️ Potential issue | 🔴 Critical

Critical: Incomplete Supabase availability check across endpoints.

While create_user correctly calls check_supabase() at Line 51, all other endpoints (lines 69, 94, 119, 143, 167, 188, 210) directly access the supabase client without checking if it's None. This will cause AttributeError exceptions when Supabase credentials are missing or invalid.

Apply this pattern to all endpoints that use the Supabase client:

 @router.get("/users/")
 async def get_users():
+    check_supabase()
     result = supabase.table("users").select("*").execute()
     return result

 @router.post("/audience-insights/")
 async def create_audience_insights(insights: AudienceInsightsCreate):
+    check_supabase()
     insight_id = generate_uuid()
     ...

Alternatively, create a dependency that can be injected into all endpoints:

from fastapi import Depends

def get_supabase() -> Client:
    if not supabase:
        raise HTTPException(status_code=503, detail="Database service unavailable. Please configure Supabase credentials.")
    return supabase

@router.get("/users/")
async def get_users(db: Client = Depends(get_supabase)):
    result = db.table("users").select("*").execute()
    return result
🤖 Prompt for AI Agents
In Backend/app/routes/post.py around lines 45 to 51 (and for all endpoints that
use the supabase client at lines 69, 94, 119, 143, 167, 188, 210), the Supabase
client is used without verifying availability which can raise AttributeError
when credentials are missing; either call the existing check_supabase() at the
start of each endpoint that uses supabase or implement and inject a FastAPI
dependency (e.g., get_supabase using Depends) that performs the same None check
and raises HTTPException(503) if unavailable, then replace direct uses of the
global supabase in those endpoints with the validated client returned by the
dependency.

    user_id = generate_uuid()
    t = current_timestamp()
