2 changes: 2 additions & 0 deletions .gitignore
Expand Up @@ -33,3 +33,5 @@ build/

### VS Code ###
.vscode/

python-producer/data/credit_card_transactions.csv
68 changes: 68 additions & 0 deletions ACTIVITY_LOG.md
@@ -0,0 +1,68 @@
# Activity Log: Project Architecture Restructuring

## Goal

Restructure the project to have:

1. **Python producer**: Keep as-is, reads CSV and produces to Kafka
2. **Java Spring Boot**: Revert to original Transaction format (simple senderId, recipientId, amount)
3. **Go microservice**: New service to consume from Kafka and process AnalyticsTransaction format

## Changes Made

### 1. ✅ Switched to Flow Branch and Restored Original Spring Boot State

- **Git Operation**: `git checkout flow`

- Switched to the flow branch which contains the original Spring Boot implementation
- Restored TransactionListener.java to original state (listens to simple Transaction format)
- Restored TransactionService.java to original state (processes simple Transaction format)

- **Current State**:
- Java Spring Boot now accepts simple Transaction format (senderId, recipientId, amount)
- No AnalyticsTransaction processing in Java service
- Python producer directory is available and ready to use

### 2. ✅ Python Producer Status

- **File**: `python-producer/data/kafka_prod.py`
- Already implemented and working
- Reads from CSV file: `credit_card_transactions.csv`
- Produces to Kafka topic: `transactions`
- Sends data in chunks of 100 with 10-second delays

### 3. 🔄 Next Steps: Create Go Microservice

- **File**: `go-analytics/main.go`

- Create main Go application with Kafka consumer
- Consume AnalyticsTransaction format from Kafka
- Process credit card transaction data

- **File**: `go-analytics/go.mod`

- Go module configuration
- Dependencies for Kafka, JSON handling, logging

- **File**: `go-analytics/kafka/consumer.go`

- Kafka consumer implementation
- Handle AnalyticsTransaction message deserialization

- **File**: `go-analytics/database/models.go`
- Database models for storing transaction analytics
- GORM models for credit card transaction data

## Architecture Flow

1. **Python Producer** → Reads CSV → Produces AnalyticsTransaction to Kafka
2. **Go Microservice** → Consumes from Kafka → Processes AnalyticsTransaction data
3. **Java Spring Boot** → Consumes simple Transaction messages from Kafka (via `TransactionListener`) → Processes basic transactions

## Benefits

- Clear separation of concerns
- Python handles data ingestion
- Go handles analytics processing
- Java handles core transaction processing
- Microservices architecture with event-driven communication
99 changes: 97 additions & 2 deletions README.md
@@ -1,2 +1,97 @@
# Midas
Project repo for the JPMC Advanced Software Engineering Forage program
# JPMorgan Chase Midas Analytics Service

## Overview

This repository contains the codebase for the Midas Analytics Service, a project that demonstrates a modern microservice architecture built on Java Spring Boot, with planned extensions in Go, Cassandra, and Terraform. The Java backend is implemented today; a Go-based analytics microservice and infrastructure as code are planned.

---

## Project Structure

```
jpmc-Midas/
forage-midas/
go-analytics/
main.go # (Go microservice - planned, not yet functional)
# (other Go files: e.g., consumer.go, database/, kafka/)
src/
main/
java/
com/jpmc/midascore/
# Java Spring Boot backend (implemented)
terraform/ # (Infrastructure as code - planned)
README.md
# ...other files
go.mod
```

---

## Currently Implemented

### 1. **Java Spring Boot Backend (`forage-midas/src/main/java/com/jpmc/midascore/`)**

- **Entities:** `TransactionRecord`, `UserRecord`, etc.
- **Controllers:** `BalanceController`
- **Repositories:** `TransactionRepository`, `UserRepository`
- **Services:** `TransactionService`
- **Kafka Integration:** `TransactionListener`
- **Database Integration:** `DatabaseConduit`
- **Testing:** Multiple test classes for core functionality.

> **Note:** The Go microservice and Terraform infrastructure are not yet functional. Current Go code is experimental and not production-ready.

---

## Planned Additions

### 1. **Go Analytics Microservice (Planned)**

- **Kafka Producer & Consumer:**
- Implement robust Kafka producer and consumer in Go to process and analyze messages.
- **Cassandra Integration:**
- Use a Go Cassandra driver (e.g., `gocql`) to persist and query analytics data.
- **RESTful API:**
- Expose analytics endpoints via HTTP using `gorilla/mux`.

### 2. **Infrastructure as Code (Terraform) (Planned)**

- **Automated Provisioning:**
- Use Terraform to provision Kafka, Cassandra, and networking resources.
- **Environment Management:**
- Separate configurations for development, staging, and production.

### 3. **CI/CD Integration (Planned)**

- Automated testing and deployment for both Java and Go services.
- Terraform plan/apply automation.

### 4. **Documentation and Developer Experience (Planned)**

- Comprehensive API documentation (Swagger/OpenAPI).
- Sample data and usage examples.
- Learning guides for running and extending the system.

---

## Getting Started


### Quick Start (Java Spring Boot Backend)

1. **Navigate to the project directory:**
```sh
cd forage-midas
```
2. **Build and run the Spring Boot application:**
```sh
./mvnw spring-boot:run
```
3. **Run tests:**
```sh
./mvnw test
```

> **Note:** Go microservice and Terraform scripts are not yet functional. Instructions will be updated as these components are implemented.

---
31 changes: 31 additions & 0 deletions application.yml
@@ -0,0 +1,31 @@
general:
kafka-topic: test-topic

spring:
kafka:
    bootstrap-servers: localhost:9092
producer:
key-serializer: org.apache.kafka.common.serialization.StringSerializer
value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
consumer:
group-id: midas-core-group
auto-offset-reset: earliest
key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
properties:
spring.json.trusted.packages: com.jpmc.midascore.foundation

datasource:
url: jdbc:h2:mem:testdb
driver-class-name: org.h2.Driver
username: sa
password: password
h2:
console:
enabled: true
path: /h2-console
jpa:
database-platform: org.hibernate.dialect.H2Dialect
hibernate:
ddl-auto: update
show-sql: true
1 change: 1 addition & 0 deletions go-analytics/analytics/behvior.go
@@ -0,0 +1 @@
package analytics
1 change: 1 addition & 0 deletions go-analytics/analytics/metrics.go
@@ -0,0 +1 @@
package analytics