Describe the bug
Hi developers,
I am trying to use E3TB to train on my own dataset, and I run into the following error:
DEEPTB INFO ------------------------------------------------------------------
DEEPTB INFO Cutoff options:
DEEPTB INFO
DEEPTB INFO r_max : {'Nb': 8.0, 'O': 7.0, 'Cl': 7.0}
DEEPTB INFO er_max : None
DEEPTB INFO oer_max : None
DEEPTB INFO ------------------------------------------------------------------
DEEPTB INFO A public `info.json` file is provided, and will be used by the subfolders who do not have their own `info.json` file.
Processing dataset...
Loading data: 0%| | 0/1 [00:00<?, ?it/s]/miniconda3/envs/dptb/lib/python3.10/site-packages/dptb/data/AtomicData.py:963:
UserWarning: The given NumPy array is not writable, and PyTorch does not support non-writable tensors.
This means writing to this tensor will result in undefined behavior.
You may want to copy the array to protect its data or make it writable before converting it to a tensor.
This type of warning will be suppressed for the rest of this program.
(Triggered internally at ../torch/csrc/utils/tensor_numpy.cpp:199.)
cell_tensor = torch.as_tensor(temp_cell, device=out_device, dtype=out_dtype)
Loading data: 0%| | 0/1 [00:00<?, ?it/s]
Traceback (most recent call last):
File "/miniconda3/envs/dptb/bin/dptb", line 8, in <module>
sys.exit(main())
............................ (a few intermediate traceback lines are not copied here) ............................
File "/miniconda3/envs/dptb/lib/python3.10/site-packages/dptb/utils/torch_geometric/dataset.py", line 175, in _process
self.process()
File "/miniconda3/envs/dptb/lib/python3.10/site-packages/dptb/data/dataset/_base_datasets.py", line 209, in process
data = self.get_data() ## get data returns either a list of AtomicData class or a data dict
File "/miniconda3/envs/dptb/lib/python3.10/site-packages/dptb/data/dataset/_default_dataset.py", line 384, in get_data
subdata_list = subdata.toAtomicDataList(self.transform)
File "/miniconda3/envs/dptb/lib/python3.10/site-packages/dptb/data/dataset/_default_dataset.py", line 294, in toAtomicDataList
block_to_feature(atomic_data, idp, features, overlaps)
File "/miniconda3/envs/dptb/lib/python3.10/site-packages/dptb/data/interfaces/ham_to_feature.py", line 95, in block_to_feature
onsite_out[feature_slice] = block_ij.flatten()
ValueError: could not broadcast input array from shape (4,) into shape (5,)
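If I read the error correctly, the flattened onsite block provides 4 values while the destination slice expects 5. A tiny NumPy sketch (just an illustration of the shape mismatch, not dptb code) reproduces the same failure:

```python
import numpy as np

# Toy reproduction of the broadcast failure:
# the slice expects 5 values (from the declared orbital basis),
# but the flattened block parsed from the dataset only has 4.
onsite_out = np.zeros(10)
feature_slice = slice(0, 5)
block_ij = np.arange(4.0)

onsite_out[feature_slice] = block_ij.flatten()
# ValueError: could not broadcast input array from shape (4,) into shape (5,)
```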
Expected behavior
Can you help me with this? I don't know where to start.
Looking forward to your reply.
To Reproduce
I used dftio to parse the dataset; I am not sure whether there is a problem in that step. A small check I can run on the parsed output is shown below.
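This is the kind of check I can run to print the shapes of a few parsed Hamiltonian blocks. I am assuming dftio wrote a hamiltonians.h5 file with one group per frame mapping block keys to matrices; the file name and layout are my guess from the dptb source paths in the traceback, please correct me if that is wrong:

```python
import h5py

# Assumed dftio output file; adjust the path to the actual dataset folder.
with h5py.File("dataset/hamiltonians.h5", "r") as f:
    first_frame = f[sorted(f.keys())[0]]        # first frame/group
    for key in list(first_frame.keys())[:10]:   # a few block keys and their shapes
        print(key, first_frame[key].shape)
```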
Environment
No response
Additional Context
No response