Support for large storage sizes #191

Open
p-drift opened this issue Apr 22, 2025 · 1 comment
Comments

@p-drift

p-drift commented Apr 22, 2025

Hello!

On one of the nodes in my PVE cluster, there is a Directory-type storage that is a mount point for an S3 bucket served by MinIO. The fuse.geesefs driver can't detect the actual bucket size, so it reports a placeholder of 1.13 PB:

sudo pvesh get /nodes/$(hostname)/storage --noborder --human-readable 0
content   storage        type active avail           enabled shared total           used used_fraction
backup    minio_backup   dir  1      1125899906842624 1       0      1125899906842624 0    0

When calling func (*Node) Storages for this node, I get the following error:

json: cannot unmarshal number 1.12589990684262e+15 into Go struct field Storage.Total of type uint64

Is there any plan to support such large storage sizes in the future?
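For anyone reproducing this: the Proxmox API serializes the large value in scientific notation, which encoding/json refuses to decode into a uint64 field. A minimal sketch (the struct below is a stand-in for illustration, not the library's exact definition):

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Storage is a hypothetical stand-in for the library's struct;
// only the failing field is shown.
type Storage struct {
	Total uint64 `json:"total"`
}

func decode(data []byte) (Storage, error) {
	var s Storage
	err := json.Unmarshal(data, &s)
	return s, err
}

func main() {
	// Scientific notation, as returned by the API for very large sizes.
	_, err := decode([]byte(`{"total": 1.12589990684262e+15}`))

	// encoding/json rejects a float-formatted number for an integer field:
	// "json: cannot unmarshal number 1.12589990684262e+15 into Go struct
	// field Storage.Total of type uint64"
	fmt.Println(err)
}
```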

@luthermonson
Owner

the fix is probably to migrate to a big.Int
