
Commit cbd4a2a

Docs: Enhanced README with architecture summary and .env instructions

1 parent b456ad9 commit cbd4a2a

File tree

1 file changed: +25 -3 lines changed

README.md

Lines changed: 25 additions & 3 deletions
@@ -1,11 +1,11 @@
 # Competitor Analysis
 
-This repository contains a system for analyzing competitors using data from various sources like Google, Wikipedia, and Reddit. It dynamically generates SWOT analyses and PDF reports using advanced AI-driven techniques.
+This repository contains a system for analyzing competitors using data from sources such as Google, Wikipedia, LinkedIn, and Reddit. It dynamically generates SWOT analyses and PDF reports using AI-driven techniques.
 
 ## Features
 
 - **Multi-Source Data Integration**:
-  Aggregates data from Google Search, Wikipedia, and Reddit.
+  Aggregates data from Google Search, Wikipedia, LinkedIn, and Reddit.
 - **AI-Driven SWOT Analysis**:
   Uses state-of-the-art LLMs (e.g., [Llama-2](https://github.com/FardinHash/competitor-analysis/tree/llama2), [Gemma 2](https://github.com/FardinHash/competitor-analysis/tree/gemma-27b)) to extract insights and generate SWOT analyses.
 - **Automated PDF Reporting**:
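The AI-driven SWOT step in the feature list above can be illustrated with a small prompt-construction sketch. This is a hedged example only: `build_swot_prompt` is a hypothetical helper, not a function from this repository, and the real system would send the resulting prompt to an LLM such as Llama-2 or Gemma 2.

```python
def build_swot_prompt(company: str, snippets: list[str]) -> str:
    """Assemble an LLM prompt asking for a SWOT analysis (illustrative only)."""
    notes = "\n".join(f"- {s}" for s in snippets)
    return (
        f"Using the research notes below, write a SWOT analysis of {company}.\n"
        "Return four sections: Strengths, Weaknesses, Opportunities, Threats.\n\n"
        f"Research notes:\n{notes}"
    )

# Example: notes gathered from the multi-source retrieval step.
prompt = build_swot_prompt(
    "ExampleCorp",
    ["Strong brand recognition", "Limited presence in emerging markets"],
)
```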
@@ -15,6 +15,17 @@ This repository contains a system for analyzing competitors using data from vari
 
 ---
 
+## Architecture
+
+The system follows a modular, multi-agent architecture:
+1. **Data Retrieval Agent**: Aggregates data from multiple external sources.
+2. **NLP Processing Agent**: Normalizes and preprocesses raw data for downstream tasks.
+3. **SWOT Analysis Agent**: Uses an LLM to generate Strengths, Weaknesses, Opportunities, and Threats.
+4. **Report Generator**: Creates a polished PDF report for the final analysis.
+5. **Orchestrator**: Manages the workflow between agents.
+
+---
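The five numbered stages above can be sketched as a simple sequential pipeline. All class and method names below are illustrative assumptions, not the repository's actual API; network calls, LLM inference, and PDF rendering are replaced with stand-in stubs.

```python
# Minimal sketch of the multi-agent pipeline; all names are illustrative.

class DataRetrievalAgent:
    def run(self, data: dict) -> dict:
        # Stand-in for aggregating Google Search, Wikipedia, LinkedIn, Reddit.
        data["raw"] = [f"  Stub snippet about {data['company']}  "]
        return data

class NLPProcessingAgent:
    def run(self, data: dict) -> dict:
        # Normalize raw snippets for downstream tasks.
        data["clean"] = [s.strip().lower() for s in data["raw"]]
        return data

class SWOTAnalysisAgent:
    def run(self, data: dict) -> dict:
        # Stand-in for an LLM call that fills the four SWOT categories.
        data["swot"] = {k: [] for k in ("strengths", "weaknesses", "opportunities", "threats")}
        return data

class ReportGenerator:
    def run(self, data: dict) -> str:
        # Stand-in for PDF rendering; returns the would-be report path.
        return f"reports/{data['company']}_swot.pdf"

class Orchestrator:
    """Runs each agent in order, passing the shared state downstream."""

    def __init__(self):
        self.agents = [DataRetrievalAgent(), NLPProcessingAgent(), SWOTAnalysisAgent()]
        self.reporter = ReportGenerator()

    def analyze(self, company: str) -> str:
        data = {"company": company}
        for agent in self.agents:
            data = agent.run(data)
        return self.reporter.run(data)

report_path = Orchestrator().analyze("ExampleCorp")
```

The design choice sketched here is a shared `data` dict handed from agent to agent, which keeps each stage independently testable.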
+
 ## Setup
 
 ### 1. Clone the Repository
@@ -33,6 +44,17 @@ source venv/bin/activate  # On Windows: .\venv\Scripts\activate
 ```bash
 pip install -r requirements.txt
 ```
+
+### 4. Set Up Environment Variables
+Create a `.env` file in the root directory and populate it with the following variables:
+```env
+OPENAI_API_KEY=your-openai-api-key
+CRUNCHBASE_API_KEY=your-crunchbase-api-key
+REDDIT_CLIENT_ID=your-reddit-client-id
+REDDIT_CLIENT_SECRET=your-reddit-client-secret
+REDDIT_USER_AGENT=your-reddit-user-agent
+HUGGINGFACE_HUB_TOKEN=your-huggingface-hub-token
+```
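One way to consume the `.env` file above, sketched with the standard library only. This is an assumption about usage, not the repository's actual loading code (it may use a package such as python-dotenv instead); `parse_env` and `missing_keys` are hypothetical helpers.

```python
REQUIRED_KEYS = [
    "OPENAI_API_KEY",
    "CRUNCHBASE_API_KEY",
    "REDDIT_CLIENT_ID",
    "REDDIT_CLIENT_SECRET",
    "REDDIT_USER_AGENT",
    "HUGGINGFACE_HUB_TOKEN",
]

def parse_env(text: str) -> dict:
    """Parse simple KEY=value lines, skipping blanks and # comments."""
    values = {}
    for line in text.splitlines():
        line = line.strip()
        if line and not line.startswith("#") and "=" in line:
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
    return values

def missing_keys(env: dict) -> list:
    """Return the required keys that are absent or empty."""
    return [k for k in REQUIRED_KEYS if not env.get(k)]

# Example: a partially filled .env would be flagged at startup.
env = parse_env("OPENAI_API_KEY=sk-test\nREDDIT_CLIENT_ID=abc123\n")
unset = missing_keys(env)
```

Failing fast on missing keys (e.g., raising an error before any agent runs) avoids opaque API errors mid-pipeline.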
 ---
 
 ## Usage
@@ -50,7 +72,7 @@ python src/main.py
 
 ---
 
-## Test Cases
+## Test Cases (Development)
 
 To ensure the system works as expected, use the following example test cases:
 
