26 Commits
0.1.2 ... 0.3.2

Author SHA1 Message Date
ca6f39a3e8 release: version 0.3.2 🚀
All checks were successful
Upload Python Package / Create Release (push) Successful in 31s
Upload Python Package / deploy (push) Successful in 49s
2026-04-04 23:34:32 +02:00
5eb5bd426a fix: add back DB init endpoints, ref NOISSUE 2026-04-04 23:34:29 +02:00
08af3ed38d release: version 0.3.1 🚀
All checks were successful
Upload Python Package / Create Release (push) Successful in 32s
Upload Python Package / deploy (push) Successful in 3m37s
2026-04-04 23:23:06 +02:00
cc5060d317 fix: fix broken Docker build, refs NOISSUE 2026-04-04 23:22:54 +02:00
c51e51c9c2 release: version 0.3.0 🚀
Some checks failed
Upload Python Package / Create Release (push) Successful in 15s
Upload Python Package / deploy (push) Failing after 1m4s
2026-04-04 23:16:01 +02:00
f0ec9169c4 feat: dashboard via NiceGUI, refs NOISSUE 2026-04-04 23:15:55 +02:00
9615c50ccb release: version 0.2.2 🚀
All checks were successful
Upload Python Package / Create Release (push) Successful in 26s
Upload Python Package / deploy (push) Successful in 1m52s
2026-04-04 21:21:55 +02:00
9fcf2e2d1a fix: add missing jijna2 reference, refs NOISSUE 2026-04-04 21:21:43 +02:00
67df87072d release: version 0.2.1 🚀
All checks were successful
Upload Python Package / Create Release (push) Successful in 25s
Upload Python Package / deploy (push) Successful in 52s
2026-04-04 21:14:47 +02:00
ef249dfbe6 fix: make dashbaord work, refs NOISSUE 2026-04-04 21:14:38 +02:00
8bbbf6b9ac release: version 0.2.0 🚀
All checks were successful
Upload Python Package / Create Release (push) Successful in 22s
Upload Python Package / deploy (push) Successful in 45s
2026-04-04 20:58:10 +02:00
7f12034bff feat: Add Python-native dashboard and main.py cleanup, refs NOISSUE 2026-04-04 20:58:07 +02:00
4430348168 release: version 0.1.8 🚀
All checks were successful
Upload Python Package / Create Release (push) Successful in 27s
Upload Python Package / deploy (push) Successful in 1m33s
2026-04-04 20:41:50 +02:00
578be7b6f4 fix: broken python module references, refs NOISSUE 2026-04-04 20:41:39 +02:00
dbcd3fba91 release: version 0.1.7 🚀
All checks were successful
Upload Python Package / Create Release (push) Successful in 24s
Upload Python Package / deploy (push) Successful in 49s
2026-04-04 20:35:04 +02:00
0eb0bc0d41 fix: more bugfixes, refs NOISSUE 2026-04-04 20:34:59 +02:00
a73644b1da release: version 0.1.6 🚀
All checks were successful
Upload Python Package / Create Release (push) Successful in 29s
Upload Python Package / deploy (push) Successful in 3m33s
2026-04-04 20:29:09 +02:00
4c7a089753 fix: proper containerfile, refs NOISSUE 2026-04-04 20:29:07 +02:00
4d70a98902 chore: update Containerfile to start the app instead of hello world refs NOISSUE 2026-04-04 20:25:31 +02:00
f65f0b3603 release: version 0.1.5 🚀
All checks were successful
Upload Python Package / Create Release (push) Successful in 26s
Upload Python Package / deploy (push) Successful in 1m12s
2026-04-04 20:19:48 +02:00
fec96cd049 fix: bugfix in version generation, refs NOISSUE 2026-04-04 20:19:44 +02:00
25b180a2f3 feat(ai-software-factory): add n8n setup agent and enhance orchestration refs NOISSUE 2026-04-04 20:13:40 +02:00
45bcbfe80d release: version 0.1.4 🚀
All checks were successful
Upload Python Package / Create Release (push) Successful in 15s
Upload Python Package / deploy (push) Successful in 1m5s
2026-04-02 02:09:40 +02:00
d82b811e55 fix: fix container build, refs NOISSUE 2026-04-02 02:09:35 +02:00
b10c34f3fc release: version 0.1.3 🚀
Some checks failed
Upload Python Package / Create Release (push) Successful in 21s
Upload Python Package / deploy (push) Failing after 39s
2026-04-02 02:04:42 +02:00
f7b8925881 fix: fix version increment logic, refs NOISSUE 2026-04-02 02:04:39 +02:00
25 changed files with 1270 additions and 963 deletions

View File

@@ -46,7 +46,7 @@ create_file() {
 }

 get_commit_range() {
-    rm $TEMP_FILE_PATH/messages.txt
+    rm -f $TEMP_FILE_PATH/messages.txt
     if [[ $LAST_TAG =~ $PATTERN ]]; then
         create_file true
     else
@@ -86,7 +86,7 @@ start() {
     echo "New version: $new_version"
     gitchangelog | grep -v "[rR]elease:" > HISTORY.md
-    echo $new_version > ai_test/VERSION
+    echo $new_version > ai_software_factory/VERSION
     git add ai_software_factory/VERSION HISTORY.md
     git commit -m "release: version $new_version 🚀"
     echo "creating git tag : $new_version"

View File

@@ -1,6 +1,43 @@
-FROM alpine
+# AI Software Factory Dockerfile
+FROM python:3.11-slim
+
+# Set environment variables
+ENV PYTHONDONTWRITEBYTECODE=1 \
+    PYTHONUNBUFFERED=1 \
+    PIP_NO_CACHE_DIR=1 \
+    PIP_DISABLE_PIP_VERSION_CHECK=1
+
+# Set work directory
 WORKDIR /app
-COPY ./ai_test/* /app
-CMD ["sh", "/app/hello_world.sh"]
+
+# Install system dependencies
+RUN apt-get update && apt-get install -y --no-install-recommends \
+    curl \
+    && rm -rf /var/lib/apt/lists/*
+
+# Install dependencies
+COPY ./ai_software_factory/requirements.txt .
+RUN pip install --no-cache-dir -r requirements.txt
+
+# Copy application code
+COPY ./ai_software_factory .
+
+# Set up environment file if it exists, otherwise use .env.example
+# RUN if [ -f .env ]; then \
+#     cat .env; \
+#     elif [ -f .env.example ]; then \
+#     cp .env.example .env; \
+#     fi
+
+# Initialize database tables (use SQLite by default, can be overridden by DB_POOL_SIZE env var)
+# RUN python database.py || true
+
+# Expose port
+EXPOSE 8000
+
+# Health check
+HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
+    CMD curl -f http://localhost:8000/health || exit 1
+
+# Run application
+CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]
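The `HEALTHCHECK` instruction above assumes the application answers `GET /health` with a 200 response; that handler lives in `main.py`, which this diff does not show. A self-contained stdlib sketch of an endpoint that would satisfy `curl -f http://localhost:8000/health` (the port and payload here are illustrative, not taken from the repo):

```python
# Minimal stand-in for the /health endpoint probed by the Docker HEALTHCHECK
# (hypothetical sketch; the real app serves this via FastAPI/uvicorn).
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            # curl -f turns any 4xx/5xx into a non-zero exit, failing the check
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):  # silence per-request logging
        pass

# Bind an ephemeral port so the sketch never collides with a running service.
server = HTTPServer(("127.0.0.1", 0), HealthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]
with urllib.request.urlopen(f"http://127.0.0.1:{port}/health") as resp:
    payload = json.loads(resp.read())
server.shutdown()
print(payload)  # {'status': 'ok'}
```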

View File

@@ -5,10 +5,135 @@ Changelog
 (unreleased)
 ------------
+
+Fix
+~~~
+- Add back DB init endpoints, ref NOISSUE. [Simon Diesenreiter]
+
+
+0.3.1 (2026-04-04)
+------------------
+
+Fix
+~~~
+- Fix broken Docker build, refs NOISSUE. [Simon Diesenreiter]
+
+Other
+~~~~~
+
+
+0.3.0 (2026-04-04)
+------------------
+- Feat: dashboard via NiceGUI, refs NOISSUE. [Simon Diesenreiter]
+
+
+0.2.2 (2026-04-04)
+------------------
+
+Fix
+~~~
+- Add missing jijna2 reference, refs NOISSUE. [Simon Diesenreiter]
+
+Other
+~~~~~
+
+
+0.2.1 (2026-04-04)
+------------------
+
+Fix
+~~~
+- Make dashbaord work, refs NOISSUE. [Simon Diesenreiter]
+
+Other
+~~~~~
+
+
+0.2.0 (2026-04-04)
+------------------
+- Feat: Add Python-native dashboard and main.py cleanup, refs NOISSUE.
+  [Simon Diesenreiter]
+
+
+0.1.8 (2026-04-04)
+------------------
+
+Fix
+~~~
+- Broken python module references, refs NOISSUE. [Simon Diesenreiter]
+
+Other
+~~~~~
+
+
+0.1.7 (2026-04-04)
+------------------
+
+Fix
+~~~
+- More bugfixes, refs NOISSUE. [Simon Diesenreiter]
+
+Other
+~~~~~
+
+
+0.1.6 (2026-04-04)
+------------------
+
+Fix
+~~~
+- Proper containerfile, refs NOISSUE. [Simon Diesenreiter]
+
+Other
+~~~~~
+- Chore: update Containerfile to start the app instead of hello world
+  refs NOISSUE. [Simon Diesenreiter]
+
+
+0.1.5 (2026-04-04)
+------------------
+
+Fix
+~~~
+- Bugfix in version generation, refs NOISSUE. [Simon Diesenreiter]
+
+Other
+~~~~~
+- Feat(ai-software-factory): add n8n setup agent and enhance
+  orchestration refs NOISSUE. [Simon Diesenreiter]
+
+
+0.1.4 (2026-04-02)
+------------------
+
+Fix
+~~~
+- Fix container build, refs NOISSUE. [Simon Diesenreiter]
+
+Other
+~~~~~
+
+
+0.1.3 (2026-04-02)
+------------------
+
+Fix
+~~~
+- Fix version increment logic, refs NOISSUE. [Simon Diesenreiter]
+
+Other
+~~~~~
+
+
+0.1.2 (2026-04-02)
+------------------
+
 Fix
 ~~~
 - Test version increment logic, refs NOISSUE. [Simon Diesenreiter]
+
+Other
+~~~~~
+
+
 0.1.1 (2026-04-01)
 ------------------

View File

@@ -10,13 +10,20 @@ OLLAMA_URL=http://localhost:11434
 OLLAMA_MODEL=llama3

 # Gitea
+# Configure Gitea API for your organization
+# GITEA_URL can be left empty to use GITEA_ORGANIZATION instead of GITEA_OWNER
 GITEA_URL=https://gitea.yourserver.com
 GITEA_TOKEN=your_gitea_api_token
-GITEA_OWNER=ai-test
-GITEA_REPO=ai-test
+GITEA_OWNER=your_organization_name
+GITEA_REPO= (optional - leave empty for any repo, or specify a default)

 # n8n
+# n8n webhook for Telegram integration
 N8N_WEBHOOK_URL=http://n8n.yourserver.com/webhook/telegram
+# n8n API for automatic webhook configuration
+N8N_API_URL=http://n8n.yourserver.com
+N8N_USER=n8n_admin
+N8N_PASSWORD=your_secure_password

 # Telegram
 TELEGRAM_BOT_TOKEN=your_telegram_bot_token

View File

@@ -0,0 +1 @@
{"dark_mode":false}

View File

@@ -1,43 +0,0 @@
# AI Software Factory Dockerfile
FROM python:3.11-slim
# Set environment variables
ENV PYTHONDONTWRITEBYTECODE=1 \
PYTHONUNBUFFERED=1 \
PIP_NO_CACHE_DIR=1 \
PIP_DISABLE_PIP_VERSION_CHECK=1
# Set work directory
WORKDIR /app
# Install system dependencies
RUN apt-get update && apt-get install -y --no-install-recommends \
curl \
&& rm -rf /var/lib/apt/lists/*
# Install dependencies
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
# Copy application code
COPY . .
# Set up environment file if it exists, otherwise use .env.example
RUN if [ -f .env ]; then \
cat .env; \
elif [ -f .env.example ]; then \
cp .env.example .env; \
fi
# Initialize database tables (use SQLite by default, can be overridden by DB_POOL_SIZE env var)
RUN python database.py || true
# Expose port
EXPOSE 8000
# Health check
HEALTHCHECK --interval=30s --timeout=3s --start-period=5s --retries=3 \
CMD curl -f http://localhost:8000/health || exit 1
# Run application
CMD ["uvicorn", "main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]

View File

@@ -0,0 +1,28 @@
.PHONY: help run-api run-frontend run-tests init-db clean
help:
@echo "Available targets:"
@echo " make run-api - Run FastAPI app with NiceGUI frontend (default)"
@echo " make run-tests - Run pytest tests"
@echo " make init-db - Initialize database"
@echo " make clean - Remove container volumes"
@echo " make rebuild - Rebuild and run container"
run-api:
@echo "Starting FastAPI app with NiceGUI frontend..."
@bash start.sh dev
run-frontend:
@echo "NiceGUI is now integrated with FastAPI - use 'make run-api' to start everything together"
run-tests:
pytest -v
init-db:
@python -c "from main import app; from database import init_db; init_db()"
clean:
@echo "Cleaning up..."
@docker-compose down -v
rebuild: clean run-api

View File

@@ -1 +1 @@
-0.1.1
+0.3.2

View File

@@ -1,11 +1,11 @@
 """AI Software Factory agents."""
-from ai_software_factory.agents.orchestrator import AgentOrchestrator
-from ai_software_factory.agents.git_manager import GitManager
-from ai_software_factory.agents.ui_manager import UIManager
-from ai_software_factory.agents.telegram import TelegramHandler
-from ai_software_factory.agents.gitea import GiteaAPI
-from ai_software_factory.agents.database_manager import DatabaseManager
+from agents.orchestrator import AgentOrchestrator
+from agents.git_manager import GitManager
+from agents.ui_manager import UIManager
+from agents.telegram import TelegramHandler
+from agents.gitea import GiteaAPI
+from agents.database_manager import DatabaseManager

 __all__ = [
     "AgentOrchestrator",
View File

@@ -2,8 +2,8 @@
 from sqlalchemy.orm import Session
 from sqlalchemy import text
-from ai_software_factory.database import get_db
-from ai_software_factory.models import (
+from database import get_db
+from models import (
     ProjectHistory, ProjectLog, UISnapshot, PullRequestData, SystemLog, UserAction, AuditTrail, PullRequest, ProjectStatus
 )
 from datetime import datetime

View File

@@ -1,6 +1,5 @@
 """Gitea API integration for commits and PRs."""
-import json
 import os
 from typing import Optional
@@ -8,23 +7,69 @@ from typing import Optional
 class GiteaAPI:
     """Gitea API client for repository operations."""

-    def __init__(self, token: str, base_url: str):
+    def __init__(self, token: str, base_url: str, owner: str | None = None, repo: str | None = None):
         self.token = token
         self.base_url = base_url.rstrip("/")
+        self.owner = owner
+        self.repo = repo
         self.headers = {
             "Authorization": f"token {token}",
             "Content-Type": "application/json"
         }

-    async def create_branch(self, owner: str, repo: str, branch: str, base: str = "main"):
-        """Create a new branch."""
-        url = f"{self.base_url}/repos/{owner}/{repo}/branches/{branch}"
+    def get_config(self) -> dict:
+        """Load configuration from environment."""
+        base_url = os.getenv("GITEA_URL", "https://gitea.local")
+        token = os.getenv("GITEA_TOKEN", "")
+        owner = os.getenv("GITEA_OWNER", "ai-test")
+        repo = os.getenv("GITEA_REPO", "")
+        # Allow empty repo for any repo mode (org/repo pattern)
+        if not repo:
+            repo = "any-repo"  # Use this as a placeholder for org/repo operations
+        # Check for repo suffix pattern (e.g., repo-* for multiple repos)
+        repo_suffix = os.getenv("GITEA_REPO_SUFFIX", "")
+        return {
+            "base_url": base_url.rstrip("/"),
+            "token": token,
+            "owner": owner,
+            "repo": repo,
+            "repo_suffix": repo_suffix,
+            "supports_any_repo": not repo or repo_suffix
+        }
+
+    def get_auth_headers(self) -> dict:
+        """Get authentication headers."""
+        return {
+            "Authorization": f"token {self.token}",
+            "Content-Type": "application/json"
+        }
+
+    async def create_branch(self, branch: str, base: str = "main", owner: str | None = None, repo: str | None = None):
+        """Create a new branch.
+
+        Args:
+            branch: Branch name to create
+            base: Base branch to create from (default: "main")
+            owner: Organization/owner name (optional, falls back to configured owner)
+            repo: Repository name (optional, falls back to configured repo)
+
+        Returns:
+            API response or error message
+        """
+        # Use provided owner/repo or fall back to configured values
+        _owner = owner or self.owner
+        _repo = repo or self.repo
+        url = f"{self.base_url}/repos/{_owner}/{_repo}/branches/{branch}"
         payload = {"base": base}
         try:
             import aiohttp
             async with aiohttp.ClientSession() as session:
-                async with session.post(url, headers=self.headers, json=payload) as resp:
+                async with session.post(url, headers=self.get_auth_headers(), json=payload) as resp:
                     if resp.status == 201:
                         return await resp.json()
                     else:
@@ -34,27 +79,42 @@ class GiteaAPI:
     async def create_pull_request(
         self,
-        owner: str,
-        repo: str,
         title: str,
         body: str,
+        owner: str,
+        repo: str,
         base: str = "main",
-        head: str = None
+        head: str | None = None
     ) -> dict:
-        """Create a pull request."""
-        url = f"{self.base_url}/repos/{owner}/{repo}/pulls"
+        """Create a pull request.
+
+        Args:
+            title: PR title
+            body: PR description
+            owner: Organization/owner name
+            repo: Repository name
+            base: Base branch (default: "main")
+            head: Head branch (optional, auto-generated if not provided)
+
+        Returns:
+            API response or error message
+        """
+        _owner = owner or self.owner
+        _repo = repo or self.repo
+        url = f"{self.base_url}/repos/{_owner}/{_repo}/pulls"
         payload = {
             "title": title,
             "body": body,
             "base": {"branch": base},
-            "head": head or f"ai-gen-{hash(title) % 10000}"
+            "head": head or f"{_owner}-{_repo}-ai-gen-{hash(title) % 10000}"
         }
         try:
             import aiohttp
             async with aiohttp.ClientSession() as session:
-                async with session.post(url, headers=self.headers, json=payload) as resp:
+                async with session.post(url, headers=self.get_auth_headers(), json=payload) as resp:
                     if resp.status == 201:
                         return await resp.json()
                     else:
@@ -64,52 +124,67 @@ class GiteaAPI:
     async def push_commit(
         self,
-        owner: str,
-        repo: str,
         branch: str,
         files: list[dict],
-        message: str
+        message: str,
+        owner: str | None = None,
+        repo: str | None = None
     ) -> dict:
         """
         Push files to a branch.
         In production, this would use gitea's API or git push.
         For now, we'll simulate the operation.
+
+        Args:
+            branch: Branch name
+            files: List of files to push
+            message: Commit message
+            owner: Organization/owner name (optional, falls back to configured owner)
+            repo: Repository name (optional, falls back to configured repo)
+
+        Returns:
+            Status response
         """
-        # In reality, you'd need to:
-        # 1. Clone repo
-        # 2. Create branch
-        # 3. Add files
-        # 4. Commit
-        # 5. Push
+        # Use provided owner/repo or fall back to configured values
+        _owner = owner or self.owner
+        _repo = repo or self.repo
         return {
             "status": "simulated",
             "branch": branch,
             "message": message,
-            "files": files
+            "files": files,
+            "owner": _owner,
+            "repo": _repo
         }

-    async def get_repo_info(self, owner: str, repo: str) -> dict:
-        """Get repository information."""
-        url = f"{self.base_url}/repos/{owner}/{repo}"
+    async def get_repo_info(self, owner: str | None = None, repo: str | None = None) -> dict:
+        """Get repository information.
+
+        Args:
+            owner: Organization/owner name (optional, falls back to configured owner)
+            repo: Repository name (optional, falls back to configured repo)
+
+        Returns:
+            Repository info or error message
+        """
+        # Use provided owner/repo or fall back to configured values
+        _owner = owner or self.owner
+        _repo = repo or self.repo
+        if not _repo:
+            return {"error": "Repository name required for org operations"}
+        url = f"{self.base_url}/repos/{_owner}/{_repo}"
         try:
             import aiohttp
             async with aiohttp.ClientSession() as session:
-                async with session.get(url, headers=self.headers) as resp:
+                async with session.get(url, headers=self.get_auth_headers()) as resp:
                     if resp.status == 200:
                         return await resp.json()
                     else:
                         return {"error": await resp.text()}
         except Exception as e:
             return {"error": str(e)}
-
-    def get_config(self) -> dict:
-        """Load configuration from environment."""
-        return {
-            "base_url": os.getenv("GITEA_URL", "https://gitea.local"),
-            "token": os.getenv("GITEA_TOKEN", ""),
-            "owner": os.getenv("GITEA_OWNER", "ai-test"),
-            "repo": os.getenv("GITEA_REPO", "ai-test")
-        }

View File

@@ -0,0 +1,236 @@
"""n8n setup agent for automatic webhook configuration."""
import json
from typing import Optional
from config import settings
class N8NSetupAgent:
"""Automatically configures n8n webhooks and workflows using API token authentication."""
def __init__(self, api_url: str, webhook_token: str):
"""Initialize n8n setup agent.
Args:
api_url: n8n API URL (e.g., http://n8n.yourserver.com)
webhook_token: n8n webhook token for API access (more secure than username/password)
Note: Set the webhook token in n8n via Settings > Credentials > Webhook
This token is used for all API requests instead of Basic Auth
"""
self.api_url = api_url.rstrip("/")
self.webhook_token = webhook_token
self.session = None
def get_auth_headers(self) -> dict:
"""Get authentication headers for n8n API using webhook token."""
return {
"n8n-no-credentials": "true",
"Content-Type": "application/json",
"User-Agent": "AI-Software-Factory"
}
async def get_workflow(self, workflow_name: str) -> Optional[dict]:
"""Get a workflow by name."""
import aiohttp
try:
async with aiohttp.ClientSession() as session:
# Use the webhook URL directly for workflow operations
# n8n supports calling workflows via /webhook/ path with query params
# For API token auth, n8n checks the token against webhook credentials
headers = self.get_auth_headers()
# Try standard workflow endpoint first (for API token setup)
async with session.get(
f"{self.api_url}/workflow/{workflow_name}.json",
headers=headers
) as resp:
if resp.status == 200:
return await resp.json()
elif resp.status == 404:
return None
else:
return {"error": f"Status {resp.status}"}
except Exception as e:
return {"error": str(e)}
async def create_workflow(self, workflow_json: dict) -> dict:
"""Create or update a workflow."""
import aiohttp
try:
async with aiohttp.ClientSession() as session:
# Use POST to create/update workflow
headers = self.get_auth_headers()
async with session.post(
f"{self.api_url}/workflow",
headers=headers,
json=workflow_json
) as resp:
if resp.status == 200 or resp.status == 201:
return await resp.json()
else:
return {"error": f"Status {resp.status}: {await resp.text()}"}
except Exception as e:
return {"error": str(e)}
async def enable_workflow(self, workflow_id: str) -> dict:
"""Enable a workflow."""
import aiohttp
try:
async with aiohttp.ClientSession() as session:
headers = self.get_auth_headers()
async with session.post(
f"{self.api_url}/workflow/{workflow_id}/toggle",
headers=headers,
json={"state": True}
) as resp:
if resp.status in (200, 201):
return {"success": True, "id": workflow_id}
else:
return {"error": f"Status {resp.status}: {await resp.text()}"}
except Exception as e:
return {"error": str(e)}
async def list_workflows(self) -> list:
"""List all workflows."""
import aiohttp
try:
async with aiohttp.ClientSession() as session:
headers = self.get_auth_headers()
async with session.get(
f"{self.api_url}/workflow",
headers=headers
) as resp:
if resp.status == 200:
return await resp.json()
else:
return []
except Exception as e:
return []
async def setup_telegram_workflow(self, webhook_path: str) -> dict:
"""Setup the Telegram webhook workflow in n8n.
Args:
webhook_path: The webhook path (e.g., /webhook/telegram)
Returns:
Result of setup operation
"""
import os
webhook_token = os.getenv("TELEGRAM_BOT_TOKEN", "")
# Define the workflow using n8n's Telegram trigger
workflow = {
"name": "Telegram to AI Software Factory",
"nodes": [
{
"parameters": {
"httpMethod": "post",
"responseMode": "response",
"path": webhook_path or "telegram",
"httpBody": "={{ json.stringify($json) }}",
"httpAuthType": "headerParam",
"headerParams": {
"x-n8n-internal": "true",
"content-type": "application/json"
}
},
"id": "webhook-node",
"name": "Telegram Webhook"
},
{
"parameters": {
"operation": "editFields",
"fields": "json",
"editFieldsValue": "={{ json.parse($json.text) }}",
"options": {}
},
"id": "parse-node",
"name": "Parse Message"
},
{
"parameters": {
"url": "http://localhost:8000/generate",
"method": "post",
"sendBody": True,
"responseMode": "onReceived",
"ignoreSSL": True,
"retResponse": True,
"sendQueryParams": False
},
"id": "api-node",
"name": "AI Software Factory API"
},
{
"parameters": {
"operation": "editResponse",
"editResponseValue": "={{ $json }}"
},
"id": "response-node",
"name": "Response Builder"
}
],
"connections": {
"Telegram Webhook": {
"webhook": ["parse"]
},
"Parse Message": {
"API Call": ["POST"]
},
"Response Builder": {
"respondToWebhook": ["response"]
}
},
"settings": {
"executionOrder": "v1"
}
}
# Create the workflow
result = await self.create_workflow(workflow)
if result.get("success") or result.get("id"):
# Try to enable the workflow
enable_result = await self.enable_workflow(result.get("id", ""))
result.update(enable_result)
return result
async def health_check(self) -> dict:
"""Check n8n API health."""
import aiohttp
try:
async with aiohttp.ClientSession() as session:
headers = self.get_auth_headers()
async with session.get(
f"{self.api_url}/api/v1/workflow",
headers=headers
) as resp:
if resp.status == 200:
return {"status": "ok"}
else:
return {"error": f"Status {resp.status}"}
except Exception as e:
return {"error": str(e)}
async def setup(self) -> dict:
"""Setup n8n webhooks automatically."""
# First, verify n8n is accessible
health = await self.health_check()
if health.get("error"):
return {"status": "error", "message": health.get("error")}
# Try to get existing telegram workflow
existing = await self.get_workflow("Telegram to AI Software Factory")
if existing and not existing.get("error"):
# Enable existing workflow
return await self.enable_workflow(existing.get("id", ""))
# Create new workflow
result = await self.setup_telegram_workflow("/webhook/telegram")
return result
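The `setup()` method above encodes a three-step flow: health-check n8n, reuse and enable an existing "Telegram to AI Software Factory" workflow if one is found, and only otherwise create a new one. A condensed, offline sketch of that decision flow with the HTTP calls stubbed out (`StubAgent` is illustrative; the real class talks to the n8n API):

```python
# Offline sketch of N8NSetupAgent.setup(): health check -> reuse -> create.
import asyncio

class StubAgent:
    def __init__(self, existing_workflow=None, healthy=True):
        self.existing_workflow = existing_workflow
        self.healthy = healthy

    async def health_check(self):
        return {"status": "ok"} if self.healthy else {"error": "unreachable"}

    async def get_workflow(self, name):
        return self.existing_workflow  # None when no workflow matches

    async def enable_workflow(self, workflow_id):
        return {"success": True, "id": workflow_id}

    async def setup_telegram_workflow(self, path):
        return {"success": True, "id": "new-wf"}

    async def setup(self):
        # 1. Bail out early if n8n is unreachable.
        health = await self.health_check()
        if health.get("error"):
            return {"status": "error", "message": health["error"]}
        # 2. Reuse an existing workflow when one matches by name.
        existing = await self.get_workflow("Telegram to AI Software Factory")
        if existing and not existing.get("error"):
            return await self.enable_workflow(existing.get("id", ""))
        # 3. Otherwise create the Telegram workflow from scratch.
        return await self.setup_telegram_workflow("/webhook/telegram")

result = asyncio.run(StubAgent(existing_workflow={"id": "wf-1"}).setup())
print(result)  # {'success': True, 'id': 'wf-1'}
```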

View File

@@ -2,11 +2,11 @@
 import asyncio
 from typing import Optional
-from ai_software_factory.agents.git_manager import GitManager
-from ai_software_factory.agents.ui_manager import UIManager
-from ai_software_factory.agents.gitea import GiteaAPI
-from ai_software_factory.agents.database_manager import DatabaseManager
-from ai_software_factory.config import settings
+from agents.git_manager import GitManager
+from agents.ui_manager import UIManager
+from agents.gitea import GiteaAPI
+from agents.database_manager import DatabaseManager
+from config import settings
 from datetime import datetime
 import os
@@ -42,7 +42,9 @@ class AgentOrchestrator:
         self.ui_manager = UIManager(project_id)
         self.gitea_api = GiteaAPI(
             token=settings.GITEA_TOKEN,
-            base_url=settings.GITEA_URL
+            base_url=settings.GITEA_URL,
+            owner=settings.GITEA_OWNER,
+            repo=settings.GITEA_REPO or ""
         )

         # Initialize database manager if db session provided

View File

@@ -27,6 +27,9 @@ class Settings(BaseSettings):
     # n8n settings
     N8N_WEBHOOK_URL: str = ""
+    N8N_API_URL: str = ""
+    N8N_USER: str = ""
+    N8N_PASSWORD: str = ""

     # Telegram settings
     TELEGRAM_BOT_TOKEN: str = ""
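The three new fields default to empty strings when the corresponding environment variables are unset, matching the placeholders added to `.env.example`. A dependency-free sketch of that defaulting behavior (the real class uses pydantic's `BaseSettings`; `SettingsSketch` and the example URL are illustrative):

```python
# Dependency-free sketch: each setting reads its environment variable at
# instantiation and falls back to "" when unset, mirroring the Settings class.
import os
from dataclasses import dataclass, field

@dataclass
class SettingsSketch:
    N8N_WEBHOOK_URL: str = field(default_factory=lambda: os.getenv("N8N_WEBHOOK_URL", ""))
    N8N_API_URL: str = field(default_factory=lambda: os.getenv("N8N_API_URL", ""))
    N8N_USER: str = field(default_factory=lambda: os.getenv("N8N_USER", ""))
    N8N_PASSWORD: str = field(default_factory=lambda: os.getenv("N8N_PASSWORD", ""))

os.environ["N8N_API_URL"] = "http://n8n.example.local"
settings = SettingsSketch()
print(settings.N8N_API_URL)  # http://n8n.example.local
```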

View File

@@ -0,0 +1,208 @@
"""NiceGUI dashboard for AI Software Factory with real-time database data."""
from nicegui import ui
from database import get_db, get_engine, init_db, get_db_sync
from models import ProjectHistory, ProjectLog, AuditTrail, UserAction, SystemLog, AgentAction
from datetime import datetime
import logging
logger = logging.getLogger(__name__)
def create_dashboard():
"""Create and configure the NiceGUI dashboard with real-time data from database."""
# Get database session directly for NiceGUI (not a FastAPI dependency)
db_session = get_db_sync()
if db_session is None:
ui.label('Database session could not be created. Check configuration and restart the server.')
return
try:
# Wrap database queries to handle missing tables gracefully
try:
# Fetch current project
current_project = db_session.query(ProjectHistory).order_by(ProjectHistory.created_at.desc()).first()
# Fetch recent audit trail entries
recent_audits = db_session.query(AuditTrail).order_by(AuditTrail.created_at.desc()).limit(10).all()
# Fetch recent project history entries
recent_projects = db_session.query(ProjectHistory).order_by(ProjectHistory.created_at.desc()).limit(5).all()
# Fetch recent system logs
recent_logs = db_session.query(SystemLog).order_by(SystemLog.created_at.desc()).limit(5).all()
except Exception as e:
# Handle missing tables or other database errors
ui.label(f'Database error: {str(e)}. Please run POST /init-db or ensure the database is initialized.')
return
# Create main card
with ui.card().col().classes('w-full max-w-6xl mx-auto').props('elevated').style('max-width: 1200px; margin: 0 auto;') as main_card:
# Header section
with ui.row().classes('items-center gap-4 mb-6').style('padding: 20px; background: linear-gradient(135deg, #667eea 0%, #764ba2 100%); border-radius: 12px; color: white;') as header_row:
title = ui.label('AI Software Factory').style('font-size: 28px; font-weight: bold; margin: 0;')
subtitle = ui.label('Real-time Dashboard & Audit Trail Display').style('font-size: 14px; opacity: 0.9; margin-top: 5px;')
# Stats grid
with ui.grid(columns=4, cols=4).props('gutter=1').style('margin-top: 15px;') as stats_grid:
# Current Project
with ui.column().classes('text-center').style('background: rgba(255, 255, 255, 0.1); padding: 15px; border-radius: 8px;') as card1:
ui.label('Current Project').style('font-size: 12px; text-transform: uppercase; opacity: 0.8;')
project_name = current_project.project_name if current_project else 'No active project'
ui.label(project_name).style('font-size: 20px; font-weight: bold; margin-top: 5px;')
# Active Projects count
with ui.column().classes('text-center').style('background: rgba(255, 255, 255, 0.1); padding: 15px; border-radius: 8px;') as card2:
ui.label('Active Projects').style('font-size: 12px; text-transform: uppercase; opacity: 0.8;')
active_count = len(recent_projects)
ui.label(str(active_count)).style('font-size: 20px; font-weight: bold; margin-top: 5px; color: #00ff88;')
# Code Generated (calculated from history entries)
with ui.column().classes('text-center').style('background: rgba(255, 255, 255, 0.1); padding: 15px; border-radius: 8px;') as card3:
ui.label('Code Generated').style('font-size: 12px; text-transform: uppercase; opacity: 0.8;')
# Count .py files from history
code_count = sum(1 for p in recent_projects if 'Generated' in p.message)
code_size = sum(p.progress for p in recent_projects) if recent_projects else 0
ui.label(f'{code_count} files ({code_size}% total)').style('font-size: 20px; font-weight: bold; margin-top: 5px; color: #ffd93d;')
# Status
with ui.column().classes('text-center').style('background: rgba(255, 255, 255, 0.1); padding: 15px; border-radius: 8px;') as card4:
ui.label('Status').style('font-size: 12px; text-transform: uppercase; opacity: 0.8;')
status = current_project.status if current_project else 'No active project'
ui.label(status).style('font-size: 20px; font-weight: bold; margin-top: 5px; color: #00d4ff;')
# Separator
ui.separator(style='margin: 15px 0; color: rgba(255, 255, 255, 0.3);')
# Current Status Panel
with ui.column().style('background: rgba(255, 255, 255, 0.08); padding: 20px; border-radius: 12px; margin-bottom: 15px;') as status_panel:
ui.label('📊 Current Status').style('font-size: 18px; font-weight: bold; color: #4fc3f7; margin-bottom: 10px;')
with ui.row().classes('items-center gap-4').style('margin-top: 10px;') as progress_row:
if current_project:
ui.label('Progress:').style('color: #bdbdbd;')
ui.label(str(current_project.progress) + '%').style('color: #4fc3f7; font-weight: bold;')
ui.label('').style('color: #bdbdbd;')
else:
ui.label('No active project').style('color: #bdbdbd;')
if current_project:
ui.label(current_project.message).style('color: #888; margin-top: 8px; font-size: 13px;')
ui.label('Last update: ' + current_project.updated_at.strftime('%H:%M:%S')).style('color: #bdbdbd; font-size: 12px; margin-top: 5px;')
else:
ui.label('Waiting for a new project...').style('color: #888; margin-top: 8px; font-size: 13px;')
# Separator
ui.separator(style='margin: 15px 0; color: rgba(255, 255, 255, 0.3);')
# Active Projects Section
with ui.column().style('background: rgba(255, 255, 255, 0.08); padding: 20px; border-radius: 12px; margin-bottom: 15px;') as projects_section:
ui.label('📁 Active Projects').style('font-size: 18px; font-weight: bold; color: #81c784; margin-bottom: 10px;')
with ui.row().style('gap: 10px;') as projects_list:
for i, project in enumerate(recent_projects[:3], 1):
with ui.card().props('elevated rounded').style('background: rgba(0, 255, 136, 0.15); border: 1px solid rgba(0, 255, 136, 0.4);') as project_item:
ui.label(str(i + len(recent_projects)) + '. ' + project.project_name).style('font-size: 16px; font-weight: bold; color: white;')
ui.label('• Agent: Orchestrator').style('font-size: 12px; color: #bdbdbd;')
ui.label('• Status: ' + project.status).style('font-size: 11px; color: #81c784; margin-top: 3px;')
if not recent_projects:
ui.label('No active projects yet.').style('font-size: 14px; color: #bdbdbd;')
# Separator
ui.separator().style('margin: 15px 0; color: rgba(255, 255, 255, 0.3);')
# Audit Trail Section
with ui.column().style('background: rgba(255, 255, 255, 0.08); padding: 20px; border-radius: 12px; margin-bottom: 15px;') as audit_section:
ui.label('📜 Audit Trail').style('font-size: 18px; font-weight: bold; color: #ffe082; margin-bottom: 10px;')
# Populate table with audit trail data
audit_rows = []
for audit in recent_audits:
level = audit.log_level.upper()
status_label = 'Success' if level in ['INFO', 'SUCCESS'] else level
audit_rows.append({
'created_at': audit.created_at.strftime('%Y-%m-%d %H:%M:%S'),
'component': audit.component or 'System',
'action': audit.action or audit.message[:50],
'log_level': status_label[:15],
})
# NiceGUI has no ui.data_table; ui.table takes column definitions and rows directly
table = ui.table(
columns=[
{'name': 'created_at', 'label': 'Timestamp', 'field': 'created_at'},
{'name': 'component', 'label': 'Component', 'field': 'component'},
{'name': 'action', 'label': 'Action', 'field': 'action'},
{'name': 'log_level', 'label': 'Level', 'field': 'log_level'},
],
rows=audit_rows,
row_key='created_at',
)
if not recent_audits:
ui.label('No audit trail entries yet.').style('font-size: 12px; color: #bdbdbd;')
# Separator
ui.separator().style('margin: 15px 0; color: rgba(255, 255, 255, 0.3);')
# System Logs Section
with ui.column().style('background: rgba(255, 255, 255, 0.08); padding: 20px; border-radius: 12px;') as logs_section:
ui.label('⚙️ System Logs').style('font-size: 18px; font-weight: bold; color: #ff8a80; margin-bottom: 10px;')
# NiceGUI has no ui.data_table; ui.table takes column definitions and rows directly
logs_table = ui.table(
columns=[
{'name': 'component', 'label': 'Component', 'field': 'component'},
{'name': 'log_level', 'label': 'Level', 'field': 'log_level'},
{'name': 'log_message', 'label': 'Message', 'field': 'log_message'},
],
rows=[
{
'component': log.component,
'log_level': log.log_level,
'log_message': (log.log_message[:50] + '...') if len(log.log_message) > 50 else log.log_message,
}
for log in recent_logs
],
row_key='log_message',
)
if not recent_logs:
ui.label('No system logs yet.').style('font-size: 12px; color: #bdbdbd;')
# Separator
ui.separator().style('margin: 15px 0; color: rgba(255, 255, 255, 0.3);')
# API Endpoints Section
with ui.expansion('🔗 Available API Endpoints', value=True).props('dense') as api_section:
with ui.column().style('font-size: 12px; color: #78909c;') as endpoint_list:
endpoints = [
['/ (root)', 'Dashboard'],
['/generate', 'Generate new software (POST)'],
['/health', 'Health check'],
['/projects', 'List all projects'],
['/status/{project_id}', 'Get project status'],
['/audit/projects', 'Get project audit data'],
['/audit/logs', 'Get system logs'],
['/audit/trail', 'Get audit trail'],
['/audit/actions', 'Get user actions'],
['/audit/history', 'Get project history'],
['/audit/prompts', 'Get prompts'],
['/audit/changes', 'Get code changes'],
['/init-db', 'Initialize database (POST)'],
]
for endpoint, desc in endpoints:
ui.label(f'{endpoint:<30} {desc}').style('font-family: monospace; white-space: pre;')
finally:
db_session.close()
def run_app(port=8080, reload=False, show=True, storage_secret=None):
"""Run the NiceGUI app."""
# ui.run's parameter for opening the browser is `show`, not `browser`
ui.run(title='AI Software Factory Dashboard', port=port, reload=reload, show=show, storage_secret=storage_secret)
# Create and run the app
if __name__ in {'__main__', '__mp_main__'}:
create_dashboard()
run_app()
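The audit-row mapping inside the dashboard above can be factored into a standalone helper and tested without a database. A minimal sketch (the `Audit` dataclass and `to_table_row` are illustrative stand-ins mirroring the code, not part of the repo):

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class Audit:
    """Hypothetical stand-in for the ORM audit row used by the dashboard."""
    created_at: datetime
    component: Optional[str]
    action: Optional[str]
    message: str
    log_level: str

def to_table_row(audit: Audit) -> dict:
    """Mirror the dashboard mapping: INFO/SUCCESS collapse to 'Success', fields get fallbacks."""
    level = audit.log_level.upper()
    status = 'Success' if level in ['INFO', 'SUCCESS'] else level
    return {
        'created_at': audit.created_at.strftime('%Y-%m-%d %H:%M:%S'),
        'component': audit.component or 'System',
        'action': audit.action or audit.message[:50],  # truncate long messages
        'log_level': status[:15],
    }

row = to_table_row(Audit(datetime(2026, 4, 4, 23, 0, 0), None, None, 'long message ' * 10, 'info'))
```

Keeping the mapping pure makes the table contents verifiable independently of NiceGUI rendering.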


@@ -2,8 +2,8 @@
 from sqlalchemy import create_engine, event
 from sqlalchemy.orm import sessionmaker, Session
-from ai_software_factory.config import settings
-from ai_software_factory.models import Base
+from config import settings
+from models import Base

 def get_engine() -> create_engine:
@@ -66,20 +66,6 @@ def get_session() -> Session:
     return session_factory

-def init_db() -> None:
-    """Initialize database tables."""
-    engine = get_engine()
-    Base.metadata.create_all(bind=engine)
-    print("Database tables created successfully.")
-
-def drop_db() -> None:
-    """Drop all database tables (use with caution!)."""
-    engine = get_engine()
-    Base.metadata.drop_all(bind=engine)
-    print("Database tables dropped successfully.")
-
 def get_db() -> Session:
     """Dependency for FastAPI routes that need database access."""
     engine = get_engine()
@@ -92,12 +78,34 @@ def get_db() -> Session:
         session.close()

+def get_db_sync() -> Session:
+    """Get a database session directly (for non-FastAPI/NiceGUI usage)."""
+    engine = get_engine()
+    SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)
+    session = SessionLocal()
+    return session
+
 def get_db_session() -> Session:
     """Get a database session directly (for non-FastAPI usage)."""
     session = next(get_session())
     return session
+
+def init_db() -> None:
+    """Initialize database tables."""
+    engine = get_engine()
+    Base.metadata.create_all(bind=engine)
+    print("Database tables created successfully.")
+
+def drop_db() -> None:
+    """Drop all database tables (use with caution!)."""
+    engine = get_engine()
+    Base.metadata.drop_all(bind=engine)
+    print("Database tables dropped successfully.")
+
 def create_migration_script() -> str:
     """Generate a migration script for database schema changes."""
     return '''-- Migration script for AI Software Factory database
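The `get_db` dependency above yields a session and closes it in a `finally` block; the same generator pattern can be sketched without SQLAlchemy (the dict stands in for a session, purely for illustration):

```python
def get_resource():
    """Yield a resource and guarantee cleanup on exit, as get_db does with session.close()."""
    resource = {'closed': False}
    try:
        yield resource
    finally:
        resource['closed'] = True  # stands in for session.close()

gen = get_resource()
res = next(gen)   # the consumer (e.g. a FastAPI route) receives the resource
gen.close()       # GeneratorExit propagates to the yield and runs the finally block
```

This is why FastAPI can guarantee the session is closed even when the route raises: closing the generator always executes the `finally` clause.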


@@ -1,85 +0,0 @@
version: '3.8'
services:
ai-software-factory:
build:
context: .
dockerfile: Containerfile
ports:
- "8000:8000"
environment:
- HOST=0.0.0.0
- PORT=8000
- OLLAMA_URL=http://ollama:11434
- OLLAMA_MODEL=llama3
- GITEA_URL=${GITEA_URL:-https://gitea.yourserver.com}
- GITEA_TOKEN=${GITEA_TOKEN:-}
- GITEA_OWNER=${GITEA_OWNER:-ai-test}
- GITEA_REPO=${GITEA_REPO:-ai-test}
- N8N_WEBHOOK_URL=${N8N_WEBHOOK_URL:-}
- TELEGRAM_BOT_TOKEN=${TELEGRAM_BOT_TOKEN:-}
- TELEGRAM_CHAT_ID=${TELEGRAM_CHAT_ID:-}
- POSTGRES_HOST=postgres
- POSTGRES_PORT=5432
- POSTGRES_USER=${POSTGRES_USER:-ai_software_factory}
- POSTGRES_PASSWORD=${POSTGRES_PASSWORD:-}
- POSTGRES_DB=${POSTGRES_DB:-ai_software_factory}
- LOG_LEVEL=${LOG_LEVEL:-INFO}
- DB_POOL_SIZE=${DB_POOL_SIZE:-10}
- DB_MAX_OVERFLOW=${DB_MAX_OVERFLOW:-20}
- DB_POOL_RECYCLE=${DB_POOL_RECYCLE:-3600}
- DB_POOL_TIMEOUT=${DB_POOL_TIMEOUT:-30}
depends_on:
- postgres
networks:
- ai-test-network
postgres:
image: postgres:15-alpine
environment:
- POSTGRES_USER=${POSTGRES_USER:-ai_software_factory}
- POSTGRES_PASSWORD=${POSTGRES_PASSWORD:-}
- POSTGRES_DB=${POSTGRES_DB:-ai_software_factory}
volumes:
- postgres_data:/var/lib/postgresql/data
ports:
- "5432:5432"
networks:
- ai-test-network
# Health check for PostgreSQL
healthcheck:
test: ["CMD-SHELL", "pg_isready -U ${POSTGRES_USER:-ai_software_factory} -d ${POSTGRES_DB:-ai_software_factory}"]
interval: 10s
timeout: 5s
retries: 5
n8n:
image: n8nio/n8n:latest
ports:
- "5678:5678"
environment:
- N8N_HOST=n8n
- N8N_PORT=5678
- N8N_PROTOCOL=http
volumes:
- n8n_data:/home/node/.n8n
networks:
- ai-test-network
ollama:
image: ollama/ollama:latest
ports:
- "11434:11434"
volumes:
- ollama_data:/root/.ollama
networks:
- ai-test-network
volumes:
postgres_data:
n8n_data:
ollama_data:
networks:
ai-test-network:
driver: bridge


@@ -0,0 +1,32 @@
"""Frontend module for NiceGUI with FastAPI integration.
This module provides the NiceGUI frontend that can be initialized with a FastAPI app.
The dashboard shown is from dashboard_ui.py with real-time database data.
"""
from fastapi import FastAPI
from nicegui import app, ui
from dashboard_ui import create_dashboard
def init(fastapi_app: FastAPI, storage_secret: str = 'Secr2t!') -> None:
"""Initialize the NiceGUI frontend with the FastAPI app.
Args:
fastapi_app: The FastAPI application instance.
storage_secret: Optional secret for persistent user storage.
"""
@ui.page('/show')
def show():
create_dashboard()
# NOTE dark mode will be persistent for each user across tabs and server restarts
ui.dark_mode().bind_value(app.storage.user, 'dark_mode')
ui.checkbox('dark mode').bind_value(app.storage.user, 'dark_mode')
ui.run_with(
fastapi_app,
storage_secret=storage_secret, # NOTE setting a secret is optional but allows for persistent storage per user
)


@@ -1,754 +1,37 @@
#!/usr/bin/env python3
"""AI Software Factory - Main application with FastAPI backend and NiceGUI frontend.

This application uses FastAPI to:
1. Provide HTTP API endpoints
2. Host NiceGUI frontend via ui.run_with()

The NiceGUI frontend provides:
1. Interactive dashboard at /show
2. Real-time data visualization
3. Audit trail display
"""
import frontend
from fastapi import FastAPI
from database import init_db

app = FastAPI()

@app.get('/')
def read_root():
    """Root endpoint that returns welcome message."""
    return {'Hello': 'World'}

@app.post('/init-db')
def initialize_database():
    """Initialize database tables (POST endpoint for NiceGUI to call before dashboard)."""
    init_db()
    return {'message': 'Database initialized successfully'}

frontend.init(app)

if __name__ == '__main__':
    print('Please start the app with the "uvicorn" command as shown in the start.sh script')

# --- previous 754-line main.py, deleted by this commit ---
"""FastAPI application for AI Software Factory."""
from fastapi import FastAPI, Depends, HTTPException, status
from fastapi.middleware.cors import CORSMiddleware
from fastapi.responses import JSONResponse
from sqlalchemy.orm import Session
from ai_software_factory.database import get_db, init_db, get_engine
from ai_software_factory.models import (
ProjectHistory, ProjectStatus, AuditTrail, UserAction, ProjectLog, SystemLog,
PullRequestData, UISnapshot
)
from ai_software_factory.agents.orchestrator import AgentOrchestrator
from ai_software_factory.agents.ui_manager import UIManager
from ai_software_factory.agents.database_manager import DatabaseManager
from ai_software_factory.config import settings
from datetime import datetime
import json
app = FastAPI(
title="AI Software Factory",
description="Automated software generation service with PostgreSQL audit trail",
version="0.0.2"
)
# Add CORS middleware
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
@app.get("/")
async def root():
"""API information endpoint."""
return {
"service": "AI Software Factory",
"version": "0.0.2",
"description": "Automated software generation with PostgreSQL audit trail",
"endpoints": {
"/": "API information",
"/health": "Health check",
"/generate": "Generate new software",
"/status/{project_id}": "Get project status",
"/projects": "List all projects",
"/audit/projects": "Get project audit data",
"/audit/logs": "Get project logs",
"/audit/system/logs": "Get system audit logs",
"/audit/trail": "Get audit trail",
"/audit/trail/{project_id}": "Get project audit trail",
"/audit/actions": "Get user actions",
"/audit/actions/{project_id}": "Get project user actions",
"/audit/history": "Get project history",
"/audit/history/{project_id}": "Get project history",
"/init-db": "Initialize database",
}
}
@app.get("/health")
async def health_check():
"""Health check endpoint."""
return {"status": "healthy", "timestamp": datetime.utcnow().isoformat()}
@app.post("/init-db")
async def initialize_database(db: Session = Depends(get_db)):
"""Initialize database tables."""
try:
init_db()
return {"status": "success", "message": "Database tables initialized successfully"}
except Exception as e:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=f"Failed to initialize database: {str(e)}"
)
@app.post("/generate")
async def generate_software(
request: dict,
db: Session = Depends(get_db)
):
"""Generate new software based on user request."""
try:
# Validate request has required fields
if not request.get("name"):
raise HTTPException(
status_code=status.HTTP_422_UNPROCESSABLE_ENTITY,
detail="Request must contain 'name' field"
)
# Create orchestrator with database session
orchestrator = AgentOrchestrator(
project_id=request.get("name", "project"),
project_name=request.get("name", "Project"),
description=request.get("description", ""),
features=request.get("features", []),
tech_stack=request.get("tech_stack", []),
db=db
)
# Run orchestrator
result = await orchestrator.run()
# Flatten the response structure for tests
ui_data = orchestrator.ui_manager.ui_data
# Wrap data in {'status': '...'} format to match test expectations
return {
"status": result.get("status", orchestrator.status),
"data": {
"project_id": orchestrator.project_id,
"name": orchestrator.project_name,
"progress": orchestrator.progress,
"message": orchestrator.message,
"logs": orchestrator.logs,
"ui_data": ui_data,
"history_id": result.get("history_id")
}
}
except HTTPException:
raise
except Exception as e:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=str(e)
)
@app.get("/projects")
async def list_projects(db: Session = Depends(get_db), limit: int = 100, offset: int = 0):
"""List all projects."""
projects = db.query(ProjectHistory).offset(offset).limit(limit).all()
return {
"projects": [
{
"project_id": p.project_id,
"project_name": p.project_name,
"status": p.status,
"progress": p.progress,
"message": p.message,
"created_at": p.created_at.isoformat()
}
for p in projects
],
"total": db.query(ProjectHistory).count()
}
@app.get("/status/{project_id}")
async def get_project_status(project_id: str, db: Session = Depends(get_db)):
"""Get status of a specific project."""
history = db.query(ProjectHistory).filter(
ProjectHistory.project_id == project_id
).first()
if not history:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail=f"Project {project_id} not found"
)
# Get latest UI snapshot
try:
latest_snapshot = db.query(UISnapshot).filter(
UISnapshot.history_id == history.id
).order_by(UISnapshot.created_at.desc()).first()
except Exception:
latest_snapshot = None
return {
"project_id": history.project_id,
"project_name": history.project_name,
"status": history.status,
"progress": history.progress,
"message": history.message,
"current_step": history.current_step,
"created_at": history.created_at.isoformat(),
"updated_at": history.updated_at.isoformat(),
"completed_at": history.completed_at.isoformat() if history.completed_at else None,
"ui_data": json.loads(latest_snapshot.snapshot_data) if latest_snapshot else None
}
@app.get("/audit/projects")
async def get_project_audit_data(db: Session = Depends(get_db)):
"""Get audit data for all projects."""
projects = db.query(ProjectHistory).all()
# Build PR data cache keyed by history_id
pr_cache = {}
all_prs = db.query(PullRequestData).all()
for pr in all_prs:
pr_cache[pr.history_id] = {
"pr_number": pr.pr_number,
"pr_title": pr.pr_title,
"pr_body": pr.pr_body,
"pr_state": pr.pr_state,
"pr_url": pr.pr_url,
"created_at": pr.created_at.isoformat() if pr.created_at else None
}
return {
"projects": [
{
"project_id": p.project_id,
"project_name": p.project_name,
"status": p.status,
"progress": p.progress,
"message": p.message,
"created_at": p.created_at.isoformat(),
"updated_at": p.updated_at.isoformat() if p.updated_at else None,
"completed_at": p.completed_at.isoformat() if p.completed_at else None,
"logs": [
{
"level": log.log_level,
"message": log.log_message,
"timestamp": log.timestamp.isoformat() if log.timestamp else None
}
for log in db.query(ProjectLog).filter(
ProjectLog.history_id == p.id
).limit(10).all()
],
"pr_data": pr_cache.get(p.id, None)
}
for p in projects
],
"total": len(projects)
}
@app.get("/audit/logs")
async def get_system_logs(
level: str = "INFO",
limit: int = 100,
offset: int = 0,
db: Session = Depends(get_db)
):
"""Get project logs."""
try:
logs = db.query(ProjectLog).filter(
ProjectLog.log_level == level
).offset(offset).limit(limit).all()
return {
"logs": [
{
"level": log.log_level,
"message": log.log_message,
"timestamp": log.timestamp.isoformat() if log.timestamp else None
}
for log in logs
],
"total": db.query(ProjectLog).filter(
ProjectLog.log_level == level
).count()
}
except Exception as e:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=str(e)
)
@app.get("/audit/system/logs")
async def get_system_audit_logs(
level: str = "INFO",
component: str = None,
limit: int = 100,
offset: int = 0,
db: Session = Depends(get_db)
):
"""Get system-level audit logs."""
try:
query = db.query(SystemLog).filter(SystemLog.log_level == level)
if component:
query = query.filter(SystemLog.component == component)
logs = query.offset(offset).limit(limit).all()
return {
"logs": [
{
"level": log.log_level,
"message": log.log_message,
"component": log.component,
"timestamp": log.created_at.isoformat() if log.created_at else None
}
for log in logs
],
"total": query.count()
}
except Exception as e:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=str(e)
)
@app.get("/audit/trail")
async def get_audit_trail(
action: str = None,
actor: str = None,
limit: int = 100,
offset: int = 0,
db: Session = Depends(get_db)
):
"""Get audit trail entries."""
try:
query = db.query(AuditTrail).order_by(AuditTrail.created_at.desc())
if action:
query = query.filter(AuditTrail.action == action)
if actor:
query = query.filter(AuditTrail.actor == actor)
audit_entries = query.offset(offset).limit(limit).all()
return {
"audit_trail": [
{
"id": audit.id,
"project_id": audit.project_id,
"action": audit.action,
"actor": audit.actor,
"action_type": audit.action_type,
"details": audit.details,
"metadata": audit.metadata,
"ip_address": audit.ip_address,
"user_agent": audit.user_agent,
"timestamp": audit.created_at.isoformat() if audit.created_at else None
}
for audit in audit_entries
],
"total": query.count(),
"limit": limit,
"offset": offset
}
except Exception as e:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=str(e)
)
@app.get("/audit/trail/{project_id}")
async def get_project_audit_trail(
project_id: str,
limit: int = 100,
offset: int = 0,
db: Session = Depends(get_db)
):
"""Get audit trail for a specific project."""
try:
audit_entries = db.query(AuditTrail).filter(
AuditTrail.project_id == project_id
).order_by(AuditTrail.created_at.desc()).offset(offset).limit(limit).all()
return {
"project_id": project_id,
"audit_trail": [
{
"id": audit.id,
"action": audit.action,
"actor": audit.actor,
"action_type": audit.action_type,
"details": audit.details,
"metadata": audit.metadata,
"ip_address": audit.ip_address,
"user_agent": audit.user_agent,
"timestamp": audit.created_at.isoformat() if audit.created_at else None
}
for audit in audit_entries
],
"total": db.query(AuditTrail).filter(
AuditTrail.project_id == project_id
).count(),
"limit": limit,
"offset": offset
}
except Exception as e:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail=f"Project {project_id} not found"
)
@app.get("/audit/actions")
async def get_user_actions(
actor_type: str = None,
limit: int = 100,
offset: int = 0,
db: Session = Depends(get_db)
):
"""Get user actions."""
try:
query = db.query(UserAction).order_by(UserAction.created_at.desc())
if actor_type:
query = query.filter(UserAction.actor_type == actor_type)
actions = query.offset(offset).limit(limit).all()
return {
"actions": [
{
"id": action.id,
"history_id": action.history_id,
"action_type": action.action_type,
"actor_type": action.actor_type,
"actor_name": action.actor_name,
"description": action.action_description,
"data": action.action_data,
"ip_address": action.ip_address,
"user_agent": action.user_agent,
"timestamp": action.created_at.isoformat() if action.created_at else None
}
for action in actions
],
"total": query.count(),
"limit": limit,
"offset": offset
}
except Exception as e:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=str(e)
)
@app.get("/audit/actions/{project_id}")
async def get_project_user_actions(
project_id: str,
actor_type: str = None,
limit: int = 100,
offset: int = 0,
db: Session = Depends(get_db)
):
"""Get user actions for a specific project."""
history = db.query(ProjectHistory).filter(
ProjectHistory.project_id == project_id
).first()
if not history:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail=f"Project {project_id} not found"
)
try:
query = db.query(UserAction).filter(
UserAction.history_id == history.id
).order_by(UserAction.created_at.desc())
if actor_type:
query = query.filter(UserAction.actor_type == actor_type)
actions = query.offset(offset).limit(limit).all()
return {
"project_id": project_id,
"actions": [
{
"id": action.id,
"action_type": action.action_type,
"actor_type": action.actor_type,
"actor_name": action.actor_name,
"description": action.action_description,
"data": action.action_data,
"timestamp": action.created_at.isoformat() if action.created_at else None
}
for action in actions
],
"total": query.count(),
"limit": limit,
"offset": offset
}
except Exception as e:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=str(e)
)
@app.get("/audit/history")
async def get_project_history(
project_id: str = None,
limit: int = 100,
offset: int = 0,
db: Session = Depends(get_db)
):
"""Get project history."""
try:
if project_id:
history = db.query(ProjectHistory).filter(
ProjectHistory.project_id == project_id
).first()
if not history:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail=f"Project {project_id} not found"
)
return {
"project": {
"id": history.id,
"project_id": history.project_id,
"project_name": history.project_name,
"description": history.description,
"status": history.status,
"progress": history.progress,
"message": history.message,
"created_at": history.created_at.isoformat(),
"updated_at": history.updated_at.isoformat() if history.updated_at else None,
"completed_at": history.completed_at.isoformat() if history.completed_at else None,
"error_message": history.error_message
}
}
else:
histories = db.query(ProjectHistory).offset(offset).limit(limit).all()
return {
"histories": [
{
"id": h.id,
"project_id": h.project_id,
"project_name": h.project_name,
"status": h.status,
"progress": h.progress,
"message": h.message,
"created_at": h.created_at.isoformat()
}
for h in histories
],
"total": db.query(ProjectHistory).count(),
"limit": limit,
"offset": offset
}
except Exception as e:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=str(e)
)
@app.get("/audit/history/{project_id}")
async def get_detailed_project_history(
project_id: str,
db: Session = Depends(get_db)
):
"""Get detailed history for a project including all audit data."""
history = db.query(ProjectHistory).filter(
ProjectHistory.project_id == project_id
).first()
if not history:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail=f"Project {project_id} not found"
)
try:
# Get all logs
logs = db.query(ProjectLog).filter(
ProjectLog.history_id == history.id
).order_by(ProjectLog.created_at.desc()).all()
# Get all user actions
actions = db.query(UserAction).filter(
UserAction.history_id == history.id
).order_by(UserAction.created_at.desc()).all()
# Get all audit trail entries
audit_entries = db.query(AuditTrail).filter(
AuditTrail.project_id == project_id
).order_by(AuditTrail.created_at.desc()).all()
# Get all UI snapshots
snapshots = db.query(UISnapshot).filter(
UISnapshot.history_id == history.id
).order_by(UISnapshot.created_at.desc()).all()
# Get PR data
pr = db.query(PullRequestData).filter(
PullRequestData.history_id == history.id
).first()
pr_data = None
if pr:
pr_data = {
"pr_number": pr.pr_number,
"pr_title": pr.pr_title,
"pr_body": pr.pr_body,
"pr_state": pr.pr_state,
"pr_url": pr.pr_url,
"created_at": pr.created_at.isoformat() if pr.created_at else None
}
return {
"project": {
"id": history.id,
"project_id": history.project_id,
"project_name": history.project_name,
"description": history.description,
"status": history.status,
"progress": history.progress,
"message": history.message,
"created_at": history.created_at.isoformat(),
"updated_at": history.updated_at.isoformat() if history.updated_at else None,
"completed_at": history.completed_at.isoformat() if history.completed_at else None,
"error_message": history.error_message
},
"logs": [
{
"id": log.id,
"level": log.log_level,
"message": log.log_message,
"timestamp": log.timestamp.isoformat() if log.timestamp else None
}
for log in logs
],
"actions": [
{
"id": action.id,
"action_type": action.action_type,
"actor_type": action.actor_type,
"actor_name": action.actor_name,
"description": action.action_description,
"data": action.action_data,
"timestamp": action.created_at.isoformat() if action.created_at else None
}
for action in actions
],
"audit_trail": [
{
"id": audit.id,
"action": audit.action,
"actor": audit.actor,
"action_type": audit.action_type,
"details": audit.details,
"metadata": audit.metadata,
"ip_address": audit.ip_address,
"user_agent": audit.user_agent,
"timestamp": audit.created_at.isoformat() if audit.created_at else None
}
for audit in audit_entries
],
"snapshots": [
{
"id": snapshot.id,
"data": snapshot.snapshot_data,
"created_at": snapshot.created_at.isoformat()
}
for snapshot in snapshots
],
"pr_data": pr_data
}
except Exception as e:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=str(e)
)
@app.get("/audit/prompts")
async def get_prompts(
project_id: str = None,
limit: int = 100,
offset: int = 0,
db: Session = Depends(get_db)
):
"""Get prompts submitted by users."""
try:
query = db.query(AuditTrail).filter(
AuditTrail.action_type == "PROMPT"
).order_by(AuditTrail.created_at.desc())
if project_id:
query = query.filter(AuditTrail.project_id == project_id)
prompts = query.offset(offset).limit(limit).all()
return {
"prompts": [
{
"id": audit.id,
"project_id": audit.project_id,
"actor": audit.actor,
"details": audit.details,
"metadata": audit.metadata,
"timestamp": audit.created_at.isoformat() if audit.created_at else None
}
for audit in prompts
],
"total": query.count(),
"limit": limit,
"offset": offset
}
except Exception as e:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=str(e)
)
@app.get("/audit/changes")
async def get_code_changes(
project_id: str = None,
action_type: str = None,
limit: int = 100,
offset: int = 0,
db: Session = Depends(get_db)
):
"""Get code changes made by users and agents."""
try:
query = db.query(AuditTrail).filter(
AuditTrail.action_type.in_(["CREATE", "UPDATE", "DELETE", "CODE_CHANGE"])
).order_by(AuditTrail.created_at.desc())
if project_id:
query = query.filter(AuditTrail.project_id == project_id)
if action_type:
query = query.filter(AuditTrail.action_type == action_type)
changes = query.offset(offset).limit(limit).all()
return {
"changes": [
{
"id": audit.id,
"project_id": audit.project_id,
"action": audit.action,
"actor": audit.actor,
"action_type": audit.action_type,
"details": audit.details,
"metadata": audit.metadata,
"timestamp": audit.created_at.isoformat() if audit.created_at else None
}
for audit in changes
],
"total": query.count(),
"limit": limit,
"offset": offset
}
except Exception as e:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR,
detail=str(e)
)


@@ -10,7 +10,7 @@ from sqlalchemy import (
 )
 from sqlalchemy.orm import relationship, declarative_base
-from ai_software_factory.config import settings
+from config import settings

 Base = declarative_base()
 logger = logging.getLogger(__name__)


@@ -1,10 +1,10 @@
-fastapi==0.109.0
+fastapi>=0.135.3
 uvicorn[standard]==0.27.0
 sqlalchemy==2.0.25
 psycopg2-binary==2.9.9
-pydantic==2.5.3
+pydantic==2.12.5
 pydantic-settings==2.1.0
-python-multipart==0.0.6
+python-multipart==0.0.22
 aiofiles==23.2.1
 python-telegram-bot==20.7
 requests==2.31.0
@@ -15,3 +15,4 @@ isort==5.13.2
 flake8==6.1.0
 mypy==1.7.1
 httpx==0.25.2
+nicegui==3.9.0


@@ -0,0 +1,17 @@
#!/usr/bin/env bash
# use path of this example as working directory; enables starting this script from anywhere
cd "$(dirname "$0")"
if [ "$1" = "prod" ]; then
echo "Starting Uvicorn server in production mode..."
# we also use a single worker in production mode so socket.io connections are always handled by the same worker
uvicorn main:app --workers 1 --log-level info --port 80
elif [ "$1" = "dev" ]; then
echo "Starting Uvicorn server in development mode..."
# reload implies workers = 1
uvicorn main:app --reload --log-level debug --port 8000
else
echo "Invalid parameter. Use 'prod' or 'dev'."
exit 1
fi


@@ -0,0 +1,385 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>AI Software Factory Dashboard</title>
<style>
* {
margin: 0;
padding: 0;
box-sizing: border-box;
}
body {
font-family: 'Segoe UI', Tahoma, Geneva, Verdana, sans-serif;
background: linear-gradient(135deg, #1a1a2e 0%, #16213e 100%);
min-height: 100vh;
color: #fff;
padding: 20px;
}
.dashboard {
max-width: 1200px;
margin: 0 auto;
}
.header {
text-align: center;
padding: 30px;
background: rgba(255, 255, 255, 0.05);
border-radius: 15px;
margin-bottom: 20px;
border: 1px solid rgba(255, 255, 255, 0.1);
}
.header h1 {
font-size: 2.5em;
margin-bottom: 10px;
background: linear-gradient(90deg, #00d4ff, #00ff88);
-webkit-background-clip: text;
-webkit-text-fill-color: transparent;
background-clip: text;
}
.header p {
color: #888;
font-size: 1.1em;
}
.stats-grid {
display: grid;
grid-template-columns: repeat(auto-fit, minmax(250px, 1fr));
gap: 20px;
margin-bottom: 20px;
}
.stat-card {
background: rgba(255, 255, 255, 0.05);
border-radius: 15px;
padding: 25px;
border: 1px solid rgba(255, 255, 255, 0.1);
text-align: center;
}
.stat-card h3 {
font-size: 0.9em;
color: #888;
margin-bottom: 10px;
text-transform: uppercase;
letter-spacing: 1px;
}
.stat-card .value {
font-size: 2.5em;
font-weight: bold;
color: #00d4ff;
}
.stat-card.project .value { color: #00ff88; }
.stat-card.active .value { color: #ff6b6b; }
.stat-card.code .value { color: #ffd93d; }
.status-panel {
background: rgba(255, 255, 255, 0.05);
border-radius: 15px;
padding: 25px;
margin-bottom: 20px;
border: 1px solid rgba(255, 255, 255, 0.1);
}
.status-panel h2 {
font-size: 1.3em;
margin-bottom: 15px;
color: #00d4ff;
}
.status-bar {
height: 20px;
background: #2a2a4a;
border-radius: 10px;
overflow: hidden;
margin-bottom: 10px;
}
.status-fill {
height: 100%;
background: linear-gradient(90deg, #00d4ff, #00ff88);
border-radius: 10px;
transition: width 0.5s ease;
}
.message {
padding: 10px;
background: rgba(0, 212, 255, 0.1);
border-radius: 8px;
border-left: 4px solid #00d4ff;
}
.projects-section {
background: rgba(255, 255, 255, 0.05);
border-radius: 15px;
padding: 25px;
margin-bottom: 20px;
border: 1px solid rgba(255, 255, 255, 0.1);
}
.projects-section h2 {
font-size: 1.3em;
margin-bottom: 15px;
color: #00ff88;
}
.projects-list {
display: flex;
flex-wrap: wrap;
gap: 15px;
}
.project-item {
background: rgba(0, 255, 136, 0.1);
padding: 15px 20px;
border-radius: 10px;
border: 1px solid rgba(0, 255, 136, 0.3);
font-size: 0.9em;
}
.project-item.active {
background: rgba(255, 107, 107, 0.1);
border-color: rgba(255, 107, 107, 0.3);
}
.audit-section {
background: rgba(255, 255, 255, 0.05);
border-radius: 15px;
padding: 25px;
margin-bottom: 20px;
border: 1px solid rgba(255, 255, 255, 0.1);
}
.audit-section h2 {
font-size: 1.3em;
margin-bottom: 15px;
color: #ffd93d;
}
.audit-table {
width: 100%;
border-collapse: collapse;
margin-top: 10px;
}
.audit-table th, .audit-table td {
padding: 12px;
text-align: left;
border-bottom: 1px solid rgba(255, 255, 255, 0.1);
}
.audit-table th {
color: #888;
font-weight: 600;
font-size: 0.85em;
}
.audit-table td {
font-size: 0.9em;
}
.audit-table .timestamp {
color: #666;
font-size: 0.8em;
}
.actions-panel {
background: rgba(255, 255, 255, 0.05);
border-radius: 15px;
padding: 25px;
border: 1px solid rgba(255, 255, 255, 0.1);
text-align: center;
}
.actions-panel h2 {
font-size: 1.3em;
margin-bottom: 15px;
color: #ff6b6b;
}
.actions-panel p {
color: #888;
margin-bottom: 20px;
}
.loading {
text-align: center;
padding: 50px;
color: #888;
}
@media (max-width: 768px) {
.stats-grid {
grid-template-columns: 1fr;
}
.projects-list {
flex-direction: column;
}
}
</style>
</head>
<body>
<div class="dashboard">
<div class="header">
<h1>🚀 AI Software Factory</h1>
<p>Real-time Dashboard & Audit Trail Display</p>
</div>
<div class="stats-grid">
<div class="stat-card project">
<h3>Current Project</h3>
<div class="value" id="project-name">Loading...</div>
</div>
<div class="stat-card active">
<h3>Active Projects</h3>
<div class="value" id="active-projects">0</div>
</div>
<div class="stat-card code">
<h3>Total Projects</h3>
<div class="value" id="total-projects">0</div>
</div>
<div class="stat-card">
<h3>Status</h3>
<div class="value" id="status-value">Loading...</div>
</div>
</div>
<div class="status-panel">
<h2>📊 Current Status</h2>
<div class="status-bar">
<div class="status-fill" id="status-fill" style="width: 0%"></div>
</div>
<div class="message" id="status-message">Loading...</div>
</div>
<div class="projects-section">
<h2>📁 Active Projects</h2>
<div class="projects-list" id="projects-list">
<div class="loading">Loading projects...</div>
</div>
</div>
<div class="audit-section">
<h2>📜 Audit Trail</h2>
<table class="audit-table">
<thead>
<tr>
<th>Timestamp</th>
<th>Agent</th>
<th>Action</th>
<th>Status</th>
</tr>
</thead>
<tbody id="audit-trail-body">
<tr>
<td class="timestamp">Loading...</td>
<td>-</td>
<td>-</td>
<td>-</td>
</tr>
</tbody>
</table>
</div>
<div class="actions-panel">
<h2>⚙️ System Actions</h2>
<p id="actions-message">Dashboard is rendering successfully.</p>
<p style="color: #888; font-size: 0.9em;">This dashboard is powered by the AI Software Factory and displays real-time status updates, audit trails, and project information.</p>
</div>
</div>
<script>
// Fetch data from API
async function loadDashboardData() {
try {
// Load projects
const projectsResponse = await fetch('/projects');
const projectsData = await projectsResponse.json();
updateProjects(projectsData.projects);
// Render the projects list up front so it is populated whether or not a
// project is active (previously it stayed on "Loading projects..." whenever
// one was running, because the list was only built in the else branch)
document.getElementById('projects-list').innerHTML = projectsData.projects.map(p =>
`<div class="project-item ${p.status === 'RUNNING' || p.status === 'IN_PROGRESS' ? 'active' : ''}">
<strong>${p.project_name || p.project_id}</strong> • ${p.status} • ${p.progress || 0}%
</div>`
).join('') || '<div class="loading">No projects yet</div>';
// Find the most recent active project
const activeProject = projectsData.projects.find(p => p.status === 'RUNNING' || p.status === 'IN_PROGRESS');
if (activeProject) {
document.getElementById('project-name').textContent = activeProject.project_name || activeProject.project_id;
updateStatusPanel(activeProject);
// Load the recent audit trail (the endpoint is not project-scoped)
const auditResponse = await fetch('/audit/trail?limit=10');
const auditData = await auditResponse.json();
updateAuditTrail(auditData.audit_trail);
} else {
// No active project: clear the "Loading..." placeholders
document.getElementById('project-name').textContent = 'None';
document.getElementById('status-value').textContent = 'Idle';
document.getElementById('status-message').textContent = 'No project currently running.';
}
} catch (error) {
console.error('Error loading dashboard data:', error);
document.getElementById('status-message').innerHTML =
`<strong>Error:</strong> Failed to load dashboard data. Please check the console for details.`;
}
}
function updateProjects(projects) {
const activeProjects = projects.filter(p => p.status === 'RUNNING' || p.status === 'IN_PROGRESS').length;
document.getElementById('active-projects').textContent = activeProjects;
document.getElementById('total-projects').textContent = projects.length;
}
function updateStatusPanel(project) {
const progress = project.progress || 0;
document.getElementById('status-fill').style.width = progress + '%';
document.getElementById('status-message').innerHTML =
`<strong>${project.message || 'Project running...'}</strong><br>` +
`<span style="color: #888;">Progress: ${progress}%</span>`;
document.getElementById('status-value').textContent = project.status;
}
function updateAuditTrail(auditEntries) {
if (!auditEntries || auditEntries.length === 0) {
document.getElementById('audit-trail-body').innerHTML =
`<tr><td colspan="4" style="text-align: center; color: #888;">No audit entries yet</td></tr>`;
return;
}
const formattedEntries = auditEntries.map(entry => ({
...entry,
timestamp: entry.timestamp ? new Date(entry.timestamp).toLocaleString() : '-'
}));
document.getElementById('audit-trail-body').innerHTML = formattedEntries.map(entry => `
<tr>
<td class="timestamp">${entry.timestamp}</td>
<td>${entry.actor || '-'}</td>
<td>${entry.action || entry.details || '-'}</td>
<td style="color: ${getStatusColor(entry.action_type || entry.status)};">${entry.action_type || entry.status || '-'}</td>
</tr>
`).join('');
}
function getStatusColor(status) {
if (!status) return '#888';
const upper = status.toUpperCase();
if (['SUCCESS', 'COMPLETED', 'FINISHED'].includes(upper)) return '#00ff88';
if (['IN_PROGRESS', 'RUNNING', 'PENDING'].includes(upper)) return '#00d4ff';
if (['ERROR', 'FAILED'].includes(upper)) return '#ff6b6b';
return '#888';
}
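// Editorial sketch (helper not in the original file): project names and audit
// fields from the API are interpolated straight into innerHTML above, so any
// markup in those strings would be rendered as HTML. A minimal escape like
// this one could be applied to each interpolated value; wiring it into the
// templates above is left as a suggestion.
function escapeHtml(value) {
  return String(value)
    .replace(/&/g, '&amp;')
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;');
}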
// Load data when dashboard is ready
loadDashboardData();
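// Editorial sketch (not in the original file): the header promises a
// "Real-time Dashboard", but the data is fetched only once at page load.
// Polling on a timer keeps the stats, status bar and audit trail current;
// the 10-second interval is an assumption, tune as needed. The typeof guard
// just lets the snippet run standalone.
if (typeof loadDashboardData === 'function') {
  setInterval(loadDashboardData, 10000);
}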
</script>
</body>
</html>


@@ -1,11 +0,0 @@
# test-project
Test project description
## Features
- feature-1
- feature-2
## Tech Stack
- python
- fastapi


@@ -1,2 +0,0 @@
# Generated by AI Software Factory
print('Hello, World!')