
Conversation

DoniLite
Member

@DoniLite DoniLite commented Jan 3, 2025

Summary by Sourcery

Update Go version, introduce Postgres as the database, implement secure algorithm storage, and add a CI/CD pipeline.

New Features:

  • Implement secure algorithm storage using Docker containers and encryption.

Tests:

  • Add tests for compression and decompression services, and RSA key generation.


sourcery-ai bot commented Jan 3, 2025

Reviewer's Guide by Sourcery

This pull request initializes the encryption storage algorithm and sets up the necessary infrastructure for secure algorithm storage and execution. It introduces a secure encryption service that handles algorithm storage, validation, encryption, and decryption operations within isolated Docker containers. The service uses AES encryption for secure storage of algorithms and supports various programming languages through Docker images. The update also includes database migration to PostgreSQL, directory creation for key and algorithm storage, and compression/decompression utilities.
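The AES storage flow described above can be sketched as follows. This is an illustrative stand-alone example assuming AES-256-GCM with the nonce prepended to the ciphertext; the PR's actual cipher mode, key handling, and function names may differ.

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"fmt"
	"io"
)

// encryptAlgorithm seals plaintext with a 32-byte storage key using AES-GCM.
func encryptAlgorithm(key, plaintext []byte) ([]byte, error) {
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := io.ReadFull(rand.Reader, nonce); err != nil {
		return nil, err
	}
	// Prepend the nonce so decryption can recover it.
	return gcm.Seal(nonce, nonce, plaintext, nil), nil
}

// decryptAlgorithm reverses encryptAlgorithm.
func decryptAlgorithm(key, ciphertext []byte) ([]byte, error) {
	block, err := aes.NewCipher(key)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	if len(ciphertext) < gcm.NonceSize() {
		return nil, fmt.Errorf("ciphertext too short")
	}
	nonce, sealed := ciphertext[:gcm.NonceSize()], ciphertext[gcm.NonceSize():]
	return gcm.Open(nil, nonce, sealed, nil)
}

func main() {
	key := make([]byte, 32)
	if _, err := io.ReadFull(rand.Reader, key); err != nil {
		panic(err)
	}
	ct, err := encryptAlgorithm(key, []byte("secret algorithm source"))
	if err != nil {
		panic(err)
	}
	pt, err := decryptAlgorithm(key, ct)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(pt))
}
```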

Sequence diagram for algorithm storage and execution

sequenceDiagram
    participant Client
    participant Service as SecureEncryptionService
    participant Docker
    participant Storage

    Client->>Service: Store Algorithm(files, metadata)
    Service->>Service: Validate Algorithm
    Service->>Docker: Create Container
    Service->>Docker: Copy Files
    Service->>Docker: Run Tests
    Docker-->>Service: Test Results
    Service->>Service: Encrypt Algorithm
    Service->>Storage: Store Encrypted Files
    Service-->>Client: Success Response

    Note over Client,Storage: Later: Algorithm Execution
    Client->>Service: Execute Algorithm(data)
    Service->>Storage: Load Algorithm
    Service->>Service: Decrypt Algorithm
    Service->>Docker: Run in Container
    Docker-->>Service: Results
    Service-->>Client: Processed Data

Entity relationship diagram for database schema

erDiagram
    USERS {
        bigint id PK
        timestamp created_at
        timestamp updated_at
        timestamp deleted_at
        text name
        text email
        smallint age
        timestamp birthday
        text member_number
        timestamp activated_at
    }

    PRODUCTS {
        bigint id PK
        timestamp created_at
        timestamp updated_at
        timestamp deleted_at
        text code
        bigint price
    }

Class diagram for secure encryption service

classDiagram
    class SecureEncryptionService {
        -docker *client.Client
        -config SecurityConfig
        -storageDir string
        -keyStorage *KeyStorage
        +StoreAlgorithm(ctx, userID, files, metadata)
        +ValidateAlgorithm(ctx, files, metadata)
        +Encrypt(ctx, userID, data)
        +Decrypt(ctx, userID, data)
    }

    class SecurityConfig {
        +MaxCPUs int64
        +MaxMemoryMB int64
        +MaxExecTime int
        +WorkingDir string
        +StorageKey []byte
    }

    class KeyStorage {
        -storageKey []byte
        -storageDir string
    }

    class AlgorithmMetadata {
        +Language string
        +BuildCmd string
        +RunCmd string
        +EntryPoints map[string]string
    }

    SecureEncryptionService --> SecurityConfig
    SecureEncryptionService --> KeyStorage
    SecureEncryptionService ..> AlgorithmMetadata

File-Level Changes

Change Details Files
Adds a secure encryption service implementation.
  • Implements algorithm storage, validation, encryption, and decryption operations.
  • Uses Docker containers for isolation and supports multiple programming languages.
  • Integrates AES encryption for secure storage of algorithms.
  • Introduces a key storage mechanism for managing encryption keys.
services/secureAlgo_service.go
Initializes the encryption service and sets up configurations.
  • Sets up default service options, including Docker settings, resource limits, and storage configurations.
  • Initializes the secure encryption service with specified options.
  • Configures logging and metrics collection.
services/init_service.go
Migrates the database to PostgreSQL.
  • Updates database connection to use PostgreSQL.
  • Modifies Atlas HCL configuration for PostgreSQL.
  • Updates schema loader to generate PostgreSQL schema.
  • Adds SQL migration file for PostgreSQL database setup.
db/db.go
atlas.hcl
cmd/loader/main.go
migrations/20241231112547.sql
Creates necessary directories for key and algorithm storage.
  • Adds a function to create directories for RSA keys, server, user, client, and crypto storage.
  • Calls the directory creation function during application initialization.
utils/helpers.go
cmd/main.go
Adds compression and decompression utilities.
  • Implements data compression and decompression functions using gzip.
  • Adds tests for compression and decompression functionalities.
services/compression_service.go
services/services_test.go
Updates Go version and dependencies.
  • Upgrades Go version to 1.22.7.
  • Adds new dependencies for Docker, PostgreSQL, and other utilities.
  • Updates existing dependencies to newer versions.
go.mod
go.sum
Adds RSA key generation and storage.
  • Implements RSA key generation function.
  • Adds a function to create directories for storing RSA keys.
  • Includes tests for RSA key generation.
utils/helpers.go
utils/helpers_test.go
Sets up CI/CD pipeline using GitHub Actions.
  • Defines a workflow for build, test, and deploy stages.
  • Sets up Go environment, installs dependencies, and runs tests.
  • Builds the application and uploads artifacts.
  • Includes linting and code analysis steps.
.github/workflows/go.yml
Adds a test Python script for encryption and decryption.
  • Implements encryption and decryption functions using Fernet library.
  • Adds a metadata file for the Python script.
tests/python/test.py
tests/python/metadata.json
Adds and updates various test files.
  • Adds tests for compression and decompression services.
  • Adds tests for RSA key generation utility.
  • Updates existing test files.
services/services_test.go
utils/helpers_test.go
Updates .gitignore file.
  • Adds patterns to ignore specific files and directories related to the new features and dependencies.
.gitignore
Updates migration files and Atlas configuration.
  • Updates Atlas checksum file.
  • Adds a new SQL migration file for database schema changes.
  • Modifies Atlas HCL configuration for database settings.
migrations/atlas.sum
migrations/20241231112547.sql
atlas.hcl
Adds golangci-lint configuration.
  • Configures linters for code analysis and style checking.
  • Sets up linting rules and options.
golangci.yml
Adds a service log file.
  • Creates a log file for recording service activities and events.
.encryption-service/service.log

Tips and commands

Interacting with Sourcery

  • Trigger a new review: Comment @sourcery-ai review on the pull request.
  • Continue discussions: Reply directly to Sourcery's review comments.
  • Generate a GitHub issue from a review comment: Ask Sourcery to create an
    issue from a review comment by replying to it.
  • Generate a pull request title: Write @sourcery-ai anywhere in the pull
    request title to generate a title at any time.
  • Generate a pull request summary: Write @sourcery-ai summary anywhere in
    the pull request body to generate a PR summary at any time. You can also use
    this command to specify where the summary should be inserted.

Customizing Your Experience

Access your dashboard to:

  • Enable or disable review features such as the Sourcery-generated pull request
    summary, the reviewer's guide, and others.
  • Change the review language.
  • Add, remove or edit custom review instructions.
  • Adjust other review settings.

Getting Help

@DoniLite DoniLite merged commit f13fd52 into main Jan 3, 2025
1 of 3 checks passed

@sourcery-ai sourcery-ai bot left a comment


Hey @DoniLite - I've reviewed your changes and found some issues that need to be addressed.

Blocking issues:

  • Avoid hardcoding database credentials in source code (link)
  • Hardcoded database credentials in atlas.hcl. (link)
  • Potentially sensitive path in services/init_service.go. (link)

Overall Comments:

  • Database credentials should not be hardcoded in db.go. Use environment variables or a secure configuration management system instead.
  • Consider moving Docker configuration (MaxCPUs, MaxMemoryMB, etc.) to environment variables rather than hardcoding in DefaultServiceOptions()
Here's what I looked at during the review
  • 🟢 General issues: all looks good
  • 🔴 Security: 3 blocking issues, 4 other issues
  • 🟡 Testing: 2 issues found
  • 🟢 Complexity: all looks good
  • 🟢 Documentation: all looks good

Sourcery is free for open source - if you like our reviews please consider sharing them ✨
Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.

}
// Saving the keys
- privateFile, err := os.Create(user + "-private.pem")
+ privateFile, err := os.Create(path.Join(keyDir, user+"-private.pem"))


🚨 suggestion (security): Consider setting restrictive file permissions for the private key file

Use os.OpenFile with mode 0600 to ensure the private key file has appropriate permissions. The public key file should also have controlled permissions, though they can be more permissive (e.g. 0644).

Suggested implementation:

    privateFile, err := os.OpenFile(path.Join(keyDir, user + "-private.pem"), os.O_WRONLY|os.O_CREATE|os.O_TRUNC, 0600)

If there is also code creating a public key file in this same function (which I can't see in the provided snippet), it should also be modified to use os.OpenFile with 0644 permissions like this:

publicFile, err := os.OpenFile(path.Join(keyDir, user + "-public.pem"), os.O_WRONLY|os.O_CREATE|os.O_TRUNC, 0644)

}


func CreateDirectories(dirs []string) {


🚨 suggestion (security): Add path validation to prevent directory traversal

Consider validating that all directory paths are within an expected root directory to prevent potential directory traversal issues.

Suggested implementation:

// rootDir defines the base directory for all created directories
const rootDir = "."

// isValidPath checks if a directory path is within the root directory
func isValidPath(dir string) bool {
    // Clean the path to resolve any .. or . components
    cleanPath := path.Clean(dir)

    // Check if the path tries to escape root directory
    if path.IsAbs(cleanPath) || strings.HasPrefix(cleanPath, "..") {
        return false
    }

    return true
}

func CreateDirectories(dirs []string) error {
    defaultDirs := []string{
        "rsa",
        "rsa/server",
        "rsa/user",
        "client",
        "crypto",
    }

    if len(dirs) == 0 {
        dirs = defaultDirs
    }

    for _, dir := range dirs {
        if !isValidPath(dir) {
            return fmt.Errorf("invalid directory path: %s", dir)
        }

        fullPath := path.Join(rootDir, dir)
        if _, err := os.Stat(fullPath); os.IsNotExist(err) {
            if err := os.MkdirAll(fullPath, os.ModePerm); err != nil {
                return fmt.Errorf("failed to create directory %s: %v", fullPath, err)
            }
        }
    }

    return nil
}

You'll need to:

  1. Add "fmt" and "strings" to the imports if they're not already present
  2. Update any code that calls CreateDirectories to handle the returned error
  3. Update any code that depends on the old relative paths (../rsa, etc) to use the new paths relative to rootDir

}


func DecompressData(data []byte) ([]byte, error) {


🚨 suggestion (security): Add size limit for decompressed data to prevent memory exhaustion

Consider implementing a maximum size limit for decompressed data and using io.LimitReader to prevent potential denial of service attacks through zip bombs.

Suggested implementation:

	"bytes"
	"compress/gzip"
	"errors"
	"io"
)

// MaxDecompressedSize defines the maximum allowed size for decompressed data (100MB)
const MaxDecompressedSize = 100 * 1024 * 1024

// ErrDecompressedSizeExceeded is returned when decompressed data exceeds the size limit
var ErrDecompressedSizeExceeded = errors.New("decompressed data exceeds maximum allowed size")

func DecompressData(data []byte) ([]byte, error) {
	gz, err := gzip.NewReader(bytes.NewReader(data))
	if err != nil {
		return nil, err
	}
	defer gz.Close()

	// Read at most MaxDecompressedSize+1 bytes so oversized payloads are detected
	limitReader := io.LimitReader(gz, MaxDecompressedSize+1)
	decompressed, err := io.ReadAll(limitReader)
	if err != nil {
		return nil, err
	}

	if len(decompressed) > MaxDecompressedSize {
		return nil, ErrDecompressedSizeExceeded
	}
	return decompressed, nil
}
func init() {

- db, err := gorm.Open(sqlite.Open("./db/test.db"), &gorm.Config{})
+ dsn := "host=localhost user=doni password=DoniLite13 dbname=anexis port=5432 sslmode=disable"


🚨 issue (security): Avoid hardcoding database credentials in source code

Consider using environment variables or a secure configuration management system to handle sensitive credentials.

}

// Encrypt encrypts the data securely
func (s *SecureEncryptionService) Encrypt(ctx context.Context, userID string, data []byte) ([]byte, error) {


🚨 suggestion (security): Consider implementing rate limiting for encryption operations

To prevent potential abuse, consider adding rate limiting per user or IP address for encryption and decryption operations.

Suggested implementation:

// Encrypt encrypts the data securely
func (s *SecureEncryptionService) Encrypt(ctx context.Context, userID string, data []byte) ([]byte, error) {
    // Check rate limit for the user
    if err := s.rateLimiter.Allow(userID); err != nil {
        return nil, fmt.Errorf("rate limit exceeded for user %s: %w", userID, err)
    }

    return s.runSecureOperation(ctx, userID, "encrypt", data)
}
// RateLimiter defines the interface for rate limiting operations
type RateLimiter interface {
    Allow(key string) error
}

type SecureEncryptionService struct {
    rateLimiter RateLimiter
    // ... other fields
}

type AlgorithmMetadata struct {

To complete this implementation, you'll also need to:

  1. Create a rate limiter implementation (e.g., using golang.org/x/time/rate or a similar package)
  2. Update the SecureEncryptionService constructor to accept and initialize the rate limiter
  3. Apply the same rate limiting to the Decrypt method if it exists
  4. Consider adding rate limit configuration parameters (like requests per minute) to the service configuration

Example rate limiter implementation could use:

import "golang.org/x/time/rate"

The constructor would need to be updated like:

func NewSecureEncryptionService(config Config) *SecureEncryptionService {
    return &SecureEncryptionService{
        rateLimiter: NewRateLimiter(config.RateLimit),
        // ... other fields
    }
}

@@ -0,0 +1,50 @@
package services_test


issue (testing): Missing tests for SecureEncryptionService

This PR introduces the SecureEncryptionService but doesn't include any corresponding tests. It's crucial to add tests for StoreAlgorithm, ValidateAlgorithm, Encrypt, and Decrypt to ensure the service functions correctly and securely. Consider testing various scenarios, including valid and invalid algorithm inputs, different data sizes, and edge cases like empty data or invalid user IDs.

var privateKey *rsa.PrivateKey
const USER = "doni"

func TestGenerator(t *testing.T) {


suggestion (testing): Add edge case tests for GenerateRSAKeys

The test for GenerateRSAKeys should cover edge cases, such as invalid user inputs (e.g., empty string, special characters) and potential errors during file creation. This ensures the function is robust and handles unexpected situations gracefully.

Suggested implementation:

import (
	"crypto/rsa"
	"strings"
	"testing"

	"cloudbeast.doni/m/utils"
)

var privateKey *rsa.PrivateKey
const USER = "doni"

func TestGenerator(t *testing.T) {
	tests := []struct {
		name    string
		user    string
		wantErr bool
	}{
		{
			name:    "Valid username",
			user:    USER,
			wantErr: false,
		},
		{
			name:    "Empty username",
			user:    "",
			wantErr: true,
		},
		{
			name:    "Username with special characters",
			user:    "user@#$%^",
			wantErr: true,
		},
		{
			name:    "Very long username",
			user:    strings.Repeat("a", 256),
			wantErr: true,
		},
	}

	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			key, err := utils.GenerateRSAKeys(tt.user)
			if (err != nil) != tt.wantErr {
				t.Errorf("GenerateRSAKeys() error = %v, wantErr %v", err, tt.wantErr)
				return
			}
			if !tt.wantErr {
				if key == nil {
					t.Error("GenerateRSAKeys() returned nil key for valid input")
				}
				privateKey = key
			}
		})
	}
}

Note that these changes assume that the GenerateRSAKeys function in the utils package should validate its input and return appropriate errors for invalid cases. You may need to update the GenerateRSAKeys implementation to:

  1. Add input validation for empty usernames
  2. Add validation for usernames with special characters
  3. Add validation for maximum username length
  4. Handle file creation errors appropriately

The exact error conditions should match your application's requirements for valid usernames.

env "gorm" {
  src = data.external_schema.gorm.url
-  dev = "sqlite://db/test.db"
+  dev = "postgresql://doni:DoniLite13@localhost:5432/anexis"


🚨 issue (security): Hardcoded database credentials in atlas.hcl.

The database credentials doni:DoniLite13 are hardcoded in atlas.hcl. This is a security risk and should be stored securely, such as in environment variables or a secrets management service.
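One way to remove the credentials from atlas.hcl is to pull the dev URL from the environment; this sketch assumes Atlas's `getenv` function is available and uses an illustrative variable name:

```hcl
env "gorm" {
  src = data.external_schema.gorm.url
  dev = getenv("DEV_DATABASE_URL")
}
```

The URL (including user and password) would then be set once in the shell or CI secrets rather than committed to the repository.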

Opts = DefaultServiceOptions()

// Customize the working directory
workDir := "../.encryption-service"


🚨 issue (security): Potentially sensitive path in services/init_service.go.

The path ../.encryption-service is hardcoded. Ensure this doesn't expose sensitive information and consider using environment variables or configuration files for flexibility.
