Merged
18 changes: 12 additions & 6 deletions .3rd-party/README.md
Original file line number Diff line number Diff line change
@@ -1,10 +1,16 @@
# Third-Party Dependencies
Third-Party Dependencies
===

This folder provides listings of all 3rd-party dependencies incl. their licenses. There is a dedicated subfolder for
each release (and milestone) holding the release-specific information.
This folder contains the DEPENDENCIES file, which lists all 3rd-party dependencies (including transitive ones) together with their licenses and approval status. Each release (and milestone) holds the release-specific information.

The DEPENDENCIES file could be generated manually using [Eclipse Dash License Tool](https://github.com/eclipse/dash-licenses) maven plugin by running in root folder:
The DEPENDENCIES file is generated (automatically, and committed) by [../.github/workflows/reusable_workflow_license-scan.yaml](../.github/workflows/reusable_workflow_license-scan.yaml) during the release process ([../.github/workflows/release.yaml](../.github/workflows/release.yaml)) and on a daily basis ([../.github/workflows/license-scan.yaml](../.github/workflows/license-scan.yaml)).

The DEPENDENCIES file can also be generated manually using the [Eclipse Dash License Tool](https://github.com/eclipse/dash-licenses) maven plugin by running:
```shell
$ cd .. && mvn license-tool:license-check -Ddash.fail=false -PcheckLicense
```

Note: Some projects (e.g. test artifacts) can be excluded with the *--projects* parameter, e.g.:
```shell
mvn clean install -PcheckLicense -DskipTests \
--projects '!org.eclipse.hawkbit:hawkbit-repository-test,!org.eclipse.hawkbit:hawkbit-dmf-rabbitmq-test'
$ cd .. && mvn license-tool:license-check -Ddash.fail=false -PcheckLicense \
  --projects '!org.eclipse.hawkbit:hawkbit-repository-test,!org.eclipse.hawkbit:hawkbit-dmf-rabbitmq-test'
```
2 changes: 1 addition & 1 deletion .github/workflows/reusable_workflow_license-scan.yaml
@@ -50,7 +50,7 @@ jobs:
- name: Check dependency licenses with dash tool (and open issues to Dash IP lab, doesn't fail)
if: ${{ inputs.open_tickets }}
run: |
mvn license-tool:license-check -Ddash.fail=false -PcheckLicense -Ddash.iplab.token=${GITLAB_API_TOKEN} --projects '!org.eclipse.hawkbit:hawkbit-repository-test,!org.eclipse.hawkbit:hawkbit-dmf-rabbitmq-test'
mvn license-tool:license-check -Ddash.fail=false -PcheckLicense -Ddash.iplab.token=${GITLAB_API_TOKEN}
CHANGED_FILES_COUNT=$(git status --short | wc -l)
CHANGED_FILES_COUNT=${CHANGED_FILES_COUNT//[[:space:]]/}
echo "Number of changed files: ${CHANGED_FILES_COUNT}"
48 changes: 31 additions & 17 deletions docker/README.md
@@ -1,24 +1,33 @@
hawkBit Docker
===

# Setup
## Overview
This folder contains example Docker build and Docker Compose files to build and start hawkBit as a monolith or as microservices.

## A: Docker Container
Start the hawkBit Update Server as a single container (requires Docker to be installed and all dependencies to be available)
## Build
You can build the hawkbit Docker images by following the [README.md](build/README.md) instructions.

## Start
You can start hawkbit as a single Docker container (monolith only) or with Docker Compose.

### A: Docker Container (monolith only)
_Note: You need to have Docker installed on your machine._

Start the hawkBit Update Server (monolith) as a single container (with embedded H2; if you configure a different database, e.g. MySQL or PostgreSQL, you need to start it separately):

```bash
$ docker run -d -p 8080:8080 hawkbit/hawkbit-update-server:latest
```

## B: Docker Compose
Start the hawkBit Update Server together with an MySQL and RabbitMQ instance as containers (Requires Docker Compose to be installed)
### B: Docker Compose
_Note: You need to have Docker Compose installed on your machine._

Start the hawkBit Update Server (monolith) together with a MySQL and a RabbitMQ instance as containers:

```bash
$ docker compose -f mysql/docker-compose-monolith-mysql.yml up
```
You can also start it in different flavours, with UI or in microservices mode.

Note: Whit the upper command CTRL+C shuts down all services. Add '-d' at the end to start all into detached mode:
With the above command, CTRL+C shuts down all services. Add `-d` at the end to start in detached mode:
```bash
$ docker compose -f mysql/docker-compose-monolith-mysql.yml up -d
```
@@ -27,15 +36,20 @@ Then stop all services with:
$ docker compose -f mysql/docker-compose-monolith-mysql.yml down
```

# Access
| Service / Container | URL | Login | A | B | C |
|--------------------------|--------------------------------------------------|-------------|----------|----------|----------|
| hawkBit Update Server | [http://localhost:8080/](http://localhost:8080/) | admin:admin | ✓ | ✓ | ✓ |
| MySQL | localhost:3306/hawkbit | root | | ✓ | ✓ |
| RabbitMQ | [http://localhost:15672](http://localhost:15672) | guest:guest | | ✓ | ✓ |
You can also start it in different flavours, with UI or in microservices mode (see the Docker Compose files in the [mysql](./mysql) and [postgres](./postgres) folders). For instance, to start in microservices mode with PostgreSQL, RabbitMQ and the simple UI, use:
```bash
$ docker compose -f postgres/docker-compose-micro-services-with-simple-ui-postgres.yml up
```

# Configuration
You can override application.properties by setting an environment variable SPRING_APPLICATION_JSON for hawkbit container.
### Access
| Service / Container   | URL                                              | Login       | A | B |
|-----------------------|--------------------------------------------------|-------------|---|---|
| hawkBit Update Server | [http://localhost:8080/](http://localhost:8080/) | admin:admin | ✓ | ✓ |
| MySQL                 | localhost:3306/hawkbit                           | root        |   | ✓ |
| RabbitMQ              | [http://localhost:15672](http://localhost:15672) | guest:guest |   | ✓ |

### Configuration
You can override _application.properties_ by setting the environment variable _SPRING_APPLICATION_JSON_ on the hawkbit container, e.g.:

```yaml
hawkbit:
@@ -55,4 +69,4 @@ hawkbit:
"hawkbit.security.user.hawkbit.password": "{noop}isAwesome!",
"hawkbit.security.user.hawkbit.roles": "TENANT_ADMIN"
}'
```
```
2 changes: 0 additions & 2 deletions docker/build/README.md
@@ -28,9 +28,7 @@ Docker image can be built, for example, with (fixed version 0.4.1 is just an example):
```shell
docker build --build-arg HAWKBIT_APP=hawkbit-update-server --build-arg HAWKBIT_VERSION=0.4.1 -t hawkbit_update_server:0.4.1 . -f Dockerfile
```

or just by:

```shell
docker build --build-arg HAWKBIT_VERSION=0.4.1 -t hawkbit_update_server:0.4.1 .
```
5 changes: 5 additions & 0 deletions hawkbit-artifact/README.md
@@ -0,0 +1,5 @@
hawkBit Artifact
===
The module contains internal modules for artifact storage and encryption:
* [hawkbit-artifact-api](hawkbit-artifact-api/README.md) - the artifact API module
* [hawkbit-artifact-fs](hawkbit-artifact-fs/README.md) - the file-system based artifact storage implementation
9 changes: 6 additions & 3 deletions hawkbit-artifact/hawkbit-artifact-api/README.md
@@ -1,3 +1,6 @@
# hawkBit Artifact API

Various internal interfaces artifact API classes.
hawkBit Artifact API
===
The module contains artifact API classes supporting the following main concepts:
* Artifact Storage - represented by the [ArtifactStorage](src/main/java/org/eclipse/hawkbit/artifact/ArtifactStorage.java) interface. It provides artifact binary store operations.
* Artifact Encryption - represented by the [ArtifactEncryptionService](src/main/java/org/eclipse/hawkbit/artifact/encryption/ArtifactEncryptionService.java). It is a pluggable implementation of artifact encryption operations.
* Artifact URL handling - represented by the [ArtifactUrlResolver](src/main/java/org/eclipse/hawkbit/artifact/urlresolver/ArtifactUrlResolver.java) interface. It resolves URLs to the artifacts. The module provides a simple property-based implementation ([PropertyBasedArtifactUrlResolver](src/main/java/org/eclipse/hawkbit/artifact/urlresolver/PropertyBasedArtifactUrlResolver.java)).
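To make the storage concept concrete, here is a minimal, hypothetical in-memory sketch of an `ArtifactStorage`-style store keyed by SHA-1. The class and method shapes are illustrative only; the real hawkBit signatures differ (filename, content type, provided hashes), and this shows just the "store, deduplicate and look up by SHA-1" idea:

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.UncheckedIOException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.HashMap;
import java.util.HexFormat;
import java.util.Map;

// Hypothetical, simplified in-memory sketch of an ArtifactStorage-style store.
public class InMemoryArtifactStorage {

    private final Map<String, byte[]> store = new HashMap<>();

    // stores the content and returns its lower-case hex SHA-1 hash
    public String store(final String tenant, final InputStream content) {
        try {
            final byte[] bytes = content.readAllBytes();
            final MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
            final String sha1Hash = HexFormat.of().withLowerCase().formatHex(sha1.digest(bytes));
            store.put(tenant + "/" + sha1Hash, bytes); // same content is stored only once
            return sha1Hash;
        } catch (final IOException e) {
            throw new UncheckedIOException(e);
        } catch (final NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    public InputStream getBySha1(final String tenant, final String sha1Hash) {
        final byte[] bytes = store.get(tenant + "/" + sha1Hash);
        if (bytes == null) {
            throw new IllegalStateException("Artifact binary not found: " + sha1Hash);
        }
        return new ByteArrayInputStream(bytes);
    }

    public boolean existsBySha1(final String tenant, final String sha1Hash) {
        return store.containsKey(tenant + "/" + sha1Hash);
    }
}
```

Content-addressing by SHA-1 is what enables the deduplication described above: two uploads of identical bytes resolve to the same key.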
Expand Up @@ -7,7 +7,7 @@
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.eclipse.hawkbit.repository.artifact;
package org.eclipse.hawkbit.artifact;

import java.io.BufferedOutputStream;
import java.io.File;
@@ -22,27 +22,27 @@
import java.util.HexFormat;

import lombok.extern.slf4j.Slf4j;
import org.eclipse.hawkbit.repository.artifact.exception.ArtifactStoreException;
import org.eclipse.hawkbit.repository.artifact.exception.HashNotMatchException;
import org.eclipse.hawkbit.repository.artifact.model.AbstractDbArtifact;
import org.eclipse.hawkbit.repository.artifact.model.DbArtifactHash;
import org.eclipse.hawkbit.artifact.exception.ArtifactStoreException;
import org.eclipse.hawkbit.artifact.exception.HashNotMatchException;
import org.eclipse.hawkbit.artifact.model.ArtifactHashes;
import org.eclipse.hawkbit.artifact.model.StoredArtifactInfo;
import org.springframework.util.ObjectUtils;

/**
* Abstract utility class for ArtifactRepository implementations with common functionality, e.g. computation of hashes.
*/
@Slf4j
public abstract class AbstractArtifactRepository implements ArtifactRepository {
public abstract class AbstractArtifactStorage implements ArtifactStorage {

private static final String TEMP_FILE_PREFIX = "tmp";
private static final String TEMP_FILE_SUFFIX = "artifactrepo";

// suppress warning about weak hashing algorithms; SHA-1 and MD5 are not used in a security-related context here
@SuppressWarnings("squid:S2070")
@Override
public AbstractDbArtifact store(
public StoredArtifactInfo store(
final String tenant, final InputStream content, final String filename, final String contentType,
final DbArtifactHash providedHashes) {
final ArtifactHashes providedHashes) {
final MessageDigest mdSHA1;
final MessageDigest mdMD5;
final MessageDigest mdSHA256;
@@ -60,19 +60,19 @@ public AbstractDbArtifact store(

final HexFormat hexFormat = HexFormat.of().withLowerCase();

final String sha1Hash16 = hexFormat.formatHex(mdSHA1.digest());
final String md5Hash16 = hexFormat.formatHex(mdMD5.digest());
final String sha256Hash16 = hexFormat.formatHex(mdSHA256.digest());
final String sha1Hash = hexFormat.formatHex(mdSHA1.digest());
final String md5Hash = hexFormat.formatHex(mdMD5.digest());
final String sha256Hash = hexFormat.formatHex(mdSHA256.digest());

checkHashes(providedHashes, sha1Hash16, md5Hash16, sha256Hash16);
checkHashes(providedHashes, sha1Hash, md5Hash, sha256Hash);

// Check if file with same sha1 hash exists and if so return it
if (existsBySha1(tenant, sha1Hash16)) {
if (existsBySha1(tenant, sha1Hash)) {
// TODO - shall check if the file is really the same as bytes or just sha1 hash is the same
return addMissingHashes(getBySha1(tenant, sha1Hash16), sha1Hash16, md5Hash16, sha256Hash16);
return new StoredArtifactInfo(contentType, tempFile.length(), new ArtifactHashes(sha1Hash, md5Hash, sha256Hash));
}

return store(sanitizeTenant(tenant), new DbArtifactHash(sha1Hash16, md5Hash16, sha256Hash16), contentType, tempFile);
return store(sanitizeTenant(tenant), new ArtifactHashes(sha1Hash, md5Hash, sha256Hash), contentType, tempFile);
} catch (final IOException e) {
throw new ArtifactStoreException(e.getMessage(), e);
} finally {
@@ -104,13 +104,13 @@ protected String storeTempFile(final InputStream content) throws IOException {
return file.getPath();
}

protected abstract AbstractDbArtifact store(final String tenant, final DbArtifactHash base16Hashes,
final String contentType, final String tempFile) throws IOException;
protected abstract StoredArtifactInfo store(
final String tenant, final ArtifactHashes base16Hashes, final String contentType, final String tempFile) throws IOException;

// java:S1066 - more readable with separate "if" statements
// java:S4042 - delete reason is not needed
@SuppressWarnings({ "java:S1066", "java:S4042" })
static File createTempFile(final boolean directory) {
public static File createTempFile(final boolean directory) {
try {
final File file = (directory
? Files.createTempDirectory(TEMP_FILE_PREFIX)
@@ -138,22 +138,22 @@ static File createTempFile(final boolean directory) {
}
}

private static void checkHashes(final DbArtifactHash providedHashes,
final String sha1Hash16, final String md5Hash16, final String sha256Hash16) {
private static void checkHashes(
final ArtifactHashes providedHashes, final String sha1Hash16, final String md5Hash16, final String sha256Hash16) {
if (providedHashes == null) {
return;
}

if (areHashesNotMatching(providedHashes.getSha1(), sha1Hash16)) {
throw new HashNotMatchException("The given sha1 hash " + providedHashes.getSha1() +
if (areHashesNotMatching(providedHashes.sha1(), sha1Hash16)) {
throw new HashNotMatchException("The given sha1 hash " + providedHashes.sha1() +
" does not match the calculated sha1 hash " + sha1Hash16, HashNotMatchException.SHA1);
}
if (areHashesNotMatching(providedHashes.getMd5(), md5Hash16)) {
throw new HashNotMatchException("The given md5 hash " + providedHashes.getMd5() +
if (areHashesNotMatching(providedHashes.md5(), md5Hash16)) {
throw new HashNotMatchException("The given md5 hash " + providedHashes.md5() +
" does not match the calculated md5 hash " + md5Hash16, HashNotMatchException.MD5);
}
if (areHashesNotMatching(providedHashes.getSha256(), sha256Hash16)) {
throw new HashNotMatchException("The given sha256 hash " + providedHashes.getSha256() +
if (areHashesNotMatching(providedHashes.sha256(), sha256Hash16)) {
throw new HashNotMatchException("The given sha256 hash " + providedHashes.sha256() +
" does not match the calculated sha256 hash " + sha256Hash16, HashNotMatchException.SHA256);
}
}
@@ -167,16 +167,6 @@ private static DigestInputStream wrapInDigestInputStream(final InputStream input
return new DigestInputStream(new DigestInputStream(new DigestInputStream(input, mdSHA256), mdMD5), mdSHA1);
}

private AbstractDbArtifact addMissingHashes(final AbstractDbArtifact existing,
final String calculatedSha1, final String calculatedMd5, final String calculatedSha256) {
final String sha1 = checkEmpty(existing.getHashes().getSha1(), calculatedSha1);
final String md5 = checkEmpty(existing.getHashes().getMd5(), calculatedMd5);
final String sha256 = checkEmpty(existing.getHashes().getSha256(), calculatedSha256);

existing.setHashes(new DbArtifactHash(sha1, md5, sha256));
return existing;
}

private String checkEmpty(final String value, final String fallback) {
return ObjectUtils.isEmpty(value) ? fallback : value;
}
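The digest chaining used by `wrapInDigestInputStream` above can be illustrated with a self-contained sketch: nesting `DigestInputStream`s means every read of the content updates all three digests, so SHA-1, MD5 and SHA-256 are computed in a single pass. (The helper class below is illustrative, not part of the module.)

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.UncheckedIOException;
import java.security.DigestInputStream;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.HexFormat;

public class MultiHashExample {

    /** Computes SHA-1, MD5 and SHA-256 of the stream in one pass; returns lower-case hex. */
    public static String[] hashAll(final InputStream input) {
        try {
            final MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
            final MessageDigest md5 = MessageDigest.getInstance("MD5");
            final MessageDigest sha256 = MessageDigest.getInstance("SHA-256");
            // nest the streams so a single read updates all three digests
            try (InputStream in = new DigestInputStream(
                    new DigestInputStream(new DigestInputStream(input, sha256), md5), sha1)) {
                in.transferTo(OutputStream.nullOutputStream()); // drain the content
            }
            final HexFormat hex = HexFormat.of().withLowerCase();
            return new String[] {
                    hex.formatHex(sha1.digest()), hex.formatHex(md5.digest()), hex.formatHex(sha256.digest()) };
        } catch (final IOException e) {
            throw new UncheckedIOException(e);
        } catch (final NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }
}
```

This single-pass design matters for large artifacts: the content is streamed to the temp file once instead of being re-read three times.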
@@ -7,22 +7,23 @@
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.eclipse.hawkbit.repository.artifact;
package org.eclipse.hawkbit.artifact;

import java.io.InputStream;

import jakarta.validation.constraints.NotEmpty;
import jakarta.validation.constraints.NotNull;

import org.eclipse.hawkbit.repository.artifact.exception.ArtifactStoreException;
import org.eclipse.hawkbit.repository.artifact.exception.HashNotMatchException;
import org.eclipse.hawkbit.repository.artifact.model.AbstractDbArtifact;
import org.eclipse.hawkbit.repository.artifact.model.DbArtifactHash;
import org.eclipse.hawkbit.artifact.exception.ArtifactBinaryNotFoundException;
import org.eclipse.hawkbit.artifact.exception.ArtifactStoreException;
import org.eclipse.hawkbit.artifact.exception.HashNotMatchException;
import org.eclipse.hawkbit.artifact.model.ArtifactHashes;
import org.eclipse.hawkbit.artifact.model.StoredArtifactInfo;

/**
* ArtifactRepository service interface.
* Artifact Store service interface.
*/
public interface ArtifactRepository {
public interface ArtifactStorage {

/**
* Stores an artifact into the repository.
@@ -37,20 +38,20 @@ public interface ArtifactRepository {
* @throws ArtifactStoreException in case storing of the artifact was not successful
* @throws HashNotMatchException in case {@code hash} is provided and not matching to the calculated hashes during storing
*/
AbstractDbArtifact store(
StoredArtifactInfo store(
@NotEmpty String tenant, @NotNull InputStream content, @NotEmpty String filename,
String contentType, DbArtifactHash hash);
String contentType, ArtifactHashes hash);

/**
* Retrieves a {@link AbstractDbArtifact} from the store by its SHA1 hash. Throws {@link org.eclipse.hawkbit.repository.artifact.exception.ArtifactBinaryNotFoundException} if not found.
* Retrieves a {@link StoredArtifactInfo} from the store by its SHA1 hash. Throws {@link ArtifactBinaryNotFoundException} if not found.
* The caller is responsible to close the InputStream.
*
* @param tenant the tenant to store the artifact
* @param sha1Hash the sha1-hash of the file to lookup.
* @return an input stream over the artifact content
* @throws UnsupportedOperationException if implementation does not support the operation
*/
AbstractDbArtifact getBySha1(@NotEmpty String tenant, @NotEmpty String sha1Hash);

InputStream getBySha1(@NotEmpty String tenant, @NotEmpty String sha1Hash);

/**
* Checks if an artifact exists for a given tenant by its sha1 hash
@@ -7,13 +7,13 @@
*
* SPDX-License-Identifier: EPL-2.0
*/
package org.eclipse.hawkbit.repository.artifact.encryption;
package org.eclipse.hawkbit.artifact.encryption;

import java.io.InputStream;
import java.util.Map;
import java.util.Set;

import org.eclipse.hawkbit.repository.artifact.exception.ArtifactEncryptionFailedException;
import org.eclipse.hawkbit.artifact.exception.ArtifactEncryptionFailedException;

/**
* Interface definition for artifact encryption.
@@ -23,7 +23,7 @@ public interface ArtifactEncryption {
/**
* Defines the required secret keys for particular encryption algorithm.
*
* @return list of required secret keys
* @return set of required secret keys
*/
Set<String> requiredSecretKeys();

@@ -61,4 +61,4 @@
* @return encryption overhead in byte
*/
int encryptionSizeOverhead();
}
}
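As an illustration of the `ArtifactEncryption` contract (encrypt, decrypt, fixed size overhead), here is a hypothetical AES-GCM based sketch. The signatures are simplified — the real interface works with secret-key maps and throws `ArtifactEncryptionFailedException` — but it shows where the size overhead reported by `encryptionSizeOverhead()` comes from:

```java
import java.io.IOException;
import java.io.InputStream;
import java.security.GeneralSecurityException;
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.spec.GCMParameterSpec;
import javax.crypto.spec.SecretKeySpec;

// Hypothetical AES-GCM sketch of an ArtifactEncryption-style implementation.
public class AesGcmArtifactEncryption {

    private static final int IV_LENGTH = 12;  // bytes, recommended IV size for GCM
    private static final int TAG_LENGTH = 16; // bytes (128-bit authentication tag)

    public static byte[] encrypt(final byte[] key, final InputStream plain) {
        try {
            final byte[] iv = new byte[IV_LENGTH];
            new SecureRandom().nextBytes(iv);
            final Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
            cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(key, "AES"),
                    new GCMParameterSpec(TAG_LENGTH * 8, iv));
            final byte[] ciphertext = cipher.doFinal(plain.readAllBytes());
            // prepend the IV so decryption is self-contained
            final byte[] out = new byte[IV_LENGTH + ciphertext.length];
            System.arraycopy(iv, 0, out, 0, IV_LENGTH);
            System.arraycopy(ciphertext, 0, out, IV_LENGTH, ciphertext.length);
            return out;
        } catch (final GeneralSecurityException | IOException e) {
            throw new IllegalStateException("encryption failed", e);
        }
    }

    public static byte[] decrypt(final byte[] key, final byte[] encrypted) {
        try {
            final Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
            cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(key, "AES"),
                    new GCMParameterSpec(TAG_LENGTH * 8, encrypted, 0, IV_LENGTH));
            return cipher.doFinal(encrypted, IV_LENGTH, encrypted.length - IV_LENGTH);
        } catch (final GeneralSecurityException e) {
            throw new IllegalStateException("decryption failed", e);
        }
    }

    public static int encryptionSizeOverhead() {
        return IV_LENGTH + TAG_LENGTH; // prepended IV + GCM auth tag
    }
}
```

A fixed, predictable overhead is what allows callers to reason about the encrypted artifact size up front, which is why the contract exposes it as a separate method.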