Merged
14 changes: 7 additions & 7 deletions .pre-commit-config.yaml
@@ -17,16 +17,16 @@ repos:
- id: mixed-line-ending
- repo: https://github.com/astral-sh/ruff-pre-commit
# Ruff version.
rev: v0.14.1
rev: v0.14.11
hooks:
# Run the linter.
- id: ruff
args: [ --fix ]
# Run the formatter.
- id: ruff-format
- repo: local
hooks:
- id: ty
name: ty check
entry: uv run ty check .
language: python
# - repo: local
# hooks:
# - id: ty
# name: ty check
# entry: uv run ty check .
# language: python
5 changes: 5 additions & 0 deletions CHANGELOG.md
@@ -6,6 +6,11 @@ The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),

## [Unreleased]

### Added

- Timestamps recording when an application is created and when it is concluded.
- Timestamps recording the exact time each individual vote is cast.

## [0.3.0] - 2025-11-12

### Added
4 changes: 4 additions & 0 deletions Dockerfile
@@ -37,6 +37,10 @@ COPY --from=builder /app/.venv /app/.venv
# Copy the application source code
COPY src/ ./src/

# Copy alembic configuration and migration scripts
COPY alembic/ ./alembic/
COPY alembic.ini .

# Add the virtual environment's executables to the PATH
ENV PATH="/app/.venv/bin:$PATH"
ENV PYTHONPATH=/app/src
1 change: 1 addition & 0 deletions ai_docs/masterplan.md
@@ -39,6 +39,7 @@ This document outlines the high-level development phases for ProjectVote. Each s
### 6.1 Unified Docker-based Deployment
### 6.2 Database Setup & Management
### 6.3 Continuous Integration (GitHub Actions)
### 6.4 Database Migrations (Alembic)

## 7. Enhancements & Future Features
### 7.1 User Authentication & Authorization
3 changes: 3 additions & 0 deletions ai_docs/task_1_1.md
@@ -17,18 +17,21 @@ Design and implement the initial database schema to support the core entities of
* Applicant details (e.g., first name, last name, email)
* Project details (e.g., title, description, costs, department)
* Application status (e.g., pending, approved, rejected)
* Timestamps: `created_at` (for submission time), `concluded_at` (for final decision time)
* [x] Define essential attributes for the `VoteRecord` entity, including:
* Associated application
* Voter identification (e.g., email)
* Unique voting token
* Vote decision (e.g., approve, reject)
* Vote status (e.g., pending, cast)
* Timestamp: `voted_at` (for when the vote was cast)
* [x] Establish the one-to-many relationship: one `Application` can have multiple `VoteRecord`s.

### Phase 2: ORM Model Implementation
* [x] Choose an appropriate ORM (e.g., SQLAlchemy) for Python backend.
* [x] Create Python classes (models) representing the `Application` and `VoteRecord` entities, mapping their attributes to database columns.
* [x] Define primary keys, foreign keys, data types, and constraints for all columns.
* [x] Add timestamp fields (`created_at`, `concluded_at` for `Application`, `voted_at` for `VoteRecord`) to the ORM models.
* [x] Implement the relationship between the `Application` and `VoteRecord` models within the ORM.
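Condensed to just the timestamp-related pieces, the ORM changes above can be sketched as follows. This is illustrative, not the actual `models.py`: the table names and timestamp column definitions are taken from this diff's migration notes, while the `id` and foreign-key columns are assumptions.

```python
from sqlalchemy import Column, DateTime, ForeignKey, Integer, func
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()


class Application(Base):
    __tablename__ = "applications"

    id = Column(Integer, primary_key=True)
    # Filled in by the database at insert time.
    created_at = Column(DateTime, nullable=False, server_default=func.now())
    # Stays NULL until a final decision is recorded.
    concluded_at = Column(DateTime, nullable=True)

    # One Application has many VoteRecords.
    votes = relationship("VoteRecord", back_populates="application")


class VoteRecord(Base):
    __tablename__ = "votes"

    id = Column(Integer, primary_key=True)
    application_id = Column(Integer, ForeignKey("applications.id"), nullable=False)
    # Stays NULL while the vote is pending; set when the vote is cast.
    voted_at = Column(DateTime, nullable=True)

    application = relationship("Application", back_populates="votes")
```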

### Phase 3: Database Initialization
23 changes: 23 additions & 0 deletions ai_docs/task_2_7.md
@@ -0,0 +1,23 @@
# Task 2.7: Display Timestamps in Application Archive

## Goal
Update the frontend application archive to display the `created_at`, `voted_at`, and `concluded_at` timestamps, providing more detailed information about the application lifecycle.

## Plan
1. Update the backend API to include the new timestamp fields in the data transfer objects (DTOs).
2. Modify the frontend archive component to display the `created_at` and `concluded_at` timestamps for each application.
3. Update the expanded view in the archive to show the `voted_at` timestamp for each individual vote.
4. Format the timestamps for user-friendly display.

## Tasks

### Phase 1: Backend API Update
* [x] Add `created_at` and `concluded_at` fields to the `ApplicationOut` Pydantic model in `src/projectvote/backend/main.py`.
* [x] Add `voted_at` field to the `VoteOut` Pydantic model in `src/projectvote/backend/main.py`.
* [x] Ensure the new fields are correctly populated and returned by the `/applications/archive` endpoint.

### Phase 2: Frontend Archive Component
* [x] Update the `Archive.tsx` component to fetch the new timestamp fields.
* [x] Display `created_at` and `concluded_at` for each application in the main archive view.
* [x] In the expanded section for each application, display the `voted_at` timestamp next to each vote.
* [x] Implement a function to format the timestamp strings into a more readable format (e.g., "Jan 7, 2026, 10:30 AM").
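The actual formatting happens client-side in `Archive.tsx`; purely as an illustration of the target format, a Python equivalent of such a helper might look like this (`%b` is locale-dependent, so the output assumes an English locale):

```python
from datetime import datetime


def format_timestamp(ts: datetime) -> str:
    # Produces strings like "Jan 07, 2026, 10:30 AM".
    return ts.strftime("%b %d, %Y, %I:%M %p")
```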
38 changes: 38 additions & 0 deletions ai_docs/task_6_4.md
@@ -0,0 +1,38 @@
# Task 6.4: Database Migrations (Alembic)

## Goal
Implement a robust database migration system using Alembic to manage schema changes, ensuring smooth upgrades and downgrades for the application's database.

## Plan
1. Integrate Alembic into the project.
2. Configure Alembic to work with SQLAlchemy models.
3. Generate and apply initial migration script.
4. Define a process for creating and applying future migrations.
5. Automate migration application in the Docker container startup.

## Tasks

### Phase 1: Alembic Integration and Configuration
Refer to the latest `alembic` documentation at https://alembic.sqlalchemy.org/en/latest/index.html
* [x] Add `alembic` to `pyproject.toml` dependencies.
* [x] Initialize Alembic environment in the project root (`uv run alembic init --template async --template pyproject alembic`).
* [x] Configure `alembic.ini` to point to the correct database URL and SQLAlchemy models.
* [x] Modify `migrations/env.py` to import the base metadata from `src/projectvote/backend/models.py`.

### Phase 2: Initial Migration Creation and Application
* [x] Generate the first migration script to reflect database schema changes:
* Added columns to table `applications`:
1. `created_at = Column(DateTime, nullable=False, server_default=func.now())`
2. `concluded_at = Column(DateTime, nullable=True)`
* Added columns to table `votes`:
1. `voted_at = Column(DateTime, nullable=True)`
* [x] Manually inspect and adjust the generated migration script to include default values for non-nullable columns being added to existing data (e.g., `created_at`).
* [x] Apply the migration to the development database (`alembic upgrade head`).

### Phase 3: Docker Integration
* [x] Update the Dockerfile or `entrypoint.sh` to automatically run `alembic upgrade head` during container startup to apply any pending migrations.
* [x] Ensure the migration step runs before the application server starts.

### Verification
* [x] After applying migrations, verify that the database schema is updated correctly.
* [x] Test the application with the migrated database to ensure data integrity and functionality.
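One way to verify the schema after `alembic upgrade head` is to inspect the SQLite tables directly. A sketch, assuming the database path from `alembic.ini` (`./data/applications.db`); the helper name is hypothetical:

```python
import sqlite3

# Columns the migration is expected to add.
EXPECTED = {
    "applications": {"created_at", "concluded_at"},
    "votes": {"voted_at"},
}


def missing_columns(db_path: str) -> dict[str, set[str]]:
    """Return, per table, any expected columns absent from the live schema."""
    conn = sqlite3.connect(db_path)
    try:
        missing = {}
        for table, expected in EXPECTED.items():
            rows = conn.execute(f"PRAGMA table_info({table})").fetchall()
            present = {row[1] for row in rows}  # row[1] is the column name
            gap = expected - present
            if gap:
                missing[table] = gap
        return missing
    finally:
        conn.close()
```

After a successful migration, `missing_columns("./data/applications.db")` should return an empty dict.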
49 changes: 49 additions & 0 deletions alembic.ini
@@ -0,0 +1,49 @@
# A generic, single database configuration.

[alembic]

# path to migration scripts
# Note: This is also defined in pyproject.toml [tool.alembic] section,
# but Alembic CLI requires it here as well.
script_location = alembic

# database URL. This is consumed by the user-maintained env.py script only.
# other means of configuring database URLs may be customized within the env.py
# file.
sqlalchemy.url = sqlite:///./data/applications.db


# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARNING
handlers = console
qualname =

[logger_sqlalchemy]
level = WARNING
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
1 change: 1 addition & 0 deletions alembic/README
@@ -0,0 +1 @@
pyproject configuration, based on the generic configuration.
78 changes: 78 additions & 0 deletions alembic/env.py
@@ -0,0 +1,78 @@
"""Alembic migration environment configuration."""

from logging.config import fileConfig

from sqlalchemy import engine_from_config, pool

from alembic import context
from projectvote.backend.models import Base

# this is the Alembic Config object, which provides
# access to the values within the .ini file in use.
config = context.config

# Interpret the config file for Python logging.
# This sets up the loggers defined in the .ini file.
if config.config_file_name is not None:
fileConfig(config.config_file_name)

# add your model's MetaData object here
# for 'autogenerate' support

target_metadata = Base.metadata


# other values from the config, defined by the needs of env.py,
# can be acquired:
# my_important_option = config.get_main_option("my_important_option") # noqa: ERA001
# ... etc.


def run_migrations_offline() -> None:
"""Run migrations in 'offline' mode.

This configures the context with just a URL
and not an Engine, though an Engine is acceptable
here as well. By skipping the Engine creation
we don't even need a DBAPI to be available.

Calls to context.execute() here emit the given string to the
script output.

"""
url = config.get_main_option("sqlalchemy.url")
context.configure(
url=url,
target_metadata=target_metadata,
literal_binds=True,
dialect_opts={"paramstyle": "named"},
)

with context.begin_transaction():
context.run_migrations()


def run_migrations_online() -> None:
"""Run migrations in 'online' mode.

In this scenario we need to create an Engine
and associate a connection with the context.

"""
connectable = engine_from_config(
config.get_section(config.config_ini_section, {}),
prefix="sqlalchemy.",
poolclass=pool.NullPool,
)

with connectable.connect() as connection:
context.configure(connection=connection, target_metadata=target_metadata)

with context.begin_transaction():
context.run_migrations()


if context.is_offline_mode():
run_migrations_offline()
else:
run_migrations_online()
28 changes: 28 additions & 0 deletions alembic/script.py.mako
@@ -0,0 +1,28 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from typing import Sequence, Union

from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision: str = ${repr(up_revision)}
down_revision: Union[str, Sequence[str], None] = ${repr(down_revision)}
branch_labels: Union[str, Sequence[str], None] = ${repr(branch_labels)}
depends_on: Union[str, Sequence[str], None] = ${repr(depends_on)}


def upgrade() -> None:
"""Upgrade schema."""
${upgrades if upgrades else "pass"}


def downgrade() -> None:
"""Downgrade schema."""
${downgrades if downgrades else "pass"}
47 changes: 47 additions & 0 deletions alembic/versions/f07b511c58dc_added_date_columns.py
@@ -0,0 +1,47 @@
"""Added date columns.

Revision ID: f07b511c58dc
Revises:
Create Date: 2026-01-20 22:07:59.314701

"""

from collections.abc import Sequence

import sqlalchemy as sa

from alembic import op

# revision identifiers, used by Alembic.
revision: str = "f07b511c58dc"
down_revision: str | Sequence[str] | None = None
branch_labels: str | Sequence[str] | None = None
depends_on: str | Sequence[str] | None = None


def upgrade() -> None:
"""Upgrade schema."""
# Use batch_alter_table to work around SQLite's ALTER TABLE limitations
with op.batch_alter_table("applications", schema=None) as batch_op:
batch_op.add_column(
sa.Column(
"created_at",
sa.DateTime(),
server_default=sa.text("datetime('now', 'localtime')"),
nullable=False,
)
)
batch_op.add_column(sa.Column("concluded_at", sa.DateTime(), nullable=True))

with op.batch_alter_table("votes", schema=None) as batch_op:
batch_op.add_column(sa.Column("voted_at", sa.DateTime(), nullable=True))
# ### end Alembic commands ###


def downgrade() -> None:
"""Downgrade schema."""
# ### commands auto generated by Alembic - please adjust! ###
op.drop_column("votes", "voted_at")
op.drop_column("applications", "concluded_at")
op.drop_column("applications", "created_at")
# ### end Alembic commands ###
14 changes: 13 additions & 1 deletion entrypoint.sh
@@ -9,7 +9,19 @@
# regardless of the host user's UID/GID.
chown -R appuser:appuser /app/data

# 2. Execute the main command (CMD) passed to the container.
# 2. Run database migrations before starting the application.
# This ensures the database schema is up-to-date with the current code.
# Migrations are run as root first (before dropping privileges) to ensure
# permissions are correct.
echo "Running database migrations..."
if ! alembic -c alembic.ini upgrade head; then
    echo "Failed to run database migrations" >&2
    exit 1
fi
echo "Database migrations completed successfully"

# 3. Execute the main command (CMD) passed to the container.
# `gosu` is a lightweight tool for dropping privileges.
# This command runs the uvicorn server as the `appuser`.
exec gosu appuser "$@"