Releases · aimat-lab/gcnn_keras
kgcnn v2.2.1
- HOTFIX: Removed `tensorflow_gpu` from `setup.py`.
- Added `HDNNP4th.py` to literature.
- Fixed error in `ChangeTensorType` config for model save.
- Merged pull request #103 for `kgcnn.xai`.
kgcnn 2.2.0
- Removed deprecated modules in `kgcnn.utils`.
- Moved `kgcnn.utils.model` to `kgcnn.model.utils`.
- Fixed behaviour for `kgcnn.data.base.MemoryGraphDataset.get_train_test_indices` to return a list of train-test index tuples (see the sketch after this list).
- Updated `kgcnn.hyper.hyper.HyperParameter` to deserialize metrics and loss for multi-output models.
- Added `trajectory_name` in `summary.py` and `history.py`.
- Fixed `kgcnn.layers.geom.PositionEncodingBasisLayer`.
- Removed deprecated `kgcnn.layers.conv.attention` and `kgcnn.layers.conv.message`.
- Updated `setup.py` for requirements.
- HOTFIX: Error in `kgcnn.scaler.mol.ExtensiveMolecularScaler`, where the scale was not properly applied. However, this was only present in the development (master) version, not in a release.
- Added `kgcnn.layers.relational` with a relational dense layer.
- Added first draft of `kgcnn.literature.HDNNP2nd`.
- Added `append`, `update` and `add` to `MemoryGraphList`.
- Fixed behaviour of `GraphDict`, which now does not make a copy of arrays.
- Added explainable GNN from `visual_graph_datasets`.
- Ran training for `train_force.py`.
- Changed backend to RDKit for `QMDatasets`.
- Added `kgcnn.md` and `kgcnn.xai`.
- Added further documentation.
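A minimal sketch of the changed `get_train_test_indices` behaviour, assuming a dataset whose graphs already carry train/test split assignments (the dataset construction is elided, and indexing the dataset with an index array is assumed to return the corresponding sub-list):

```python
from kgcnn.data.base import MemoryGraphDataset

# Hypothetical dataset that has already been read into memory and whose graphs
# have been assigned "train"/"test" split properties.
dataset = MemoryGraphDataset(dataset_name="ExampleDataset")
# ... load graphs and assign split properties here ...

# Since 2.2.0 a list of (train_index, test_index) tuples is returned,
# one tuple per split, instead of separate index arrays.
for train_index, test_index in dataset.get_train_test_indices():
    train_graphs = dataset[train_index]
    test_graphs = dataset[test_index]
```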
kgcnn 2.1.1
- Removed `kgcnn.graph.adapter` and switched to completed `kgcnn.graph.preprocessor`. The interface to `MemoryGraphList` and datasets does not change. How to update:

```python
from kgcnn.data.base import GraphDict
GraphDict().apply_preprocessor("sort_edge_indices")  # Instead of GraphDict().sort_edge_indices()
GraphDict().apply_preprocessor("set_edge_weights_uniform", value=0.0)  # Instead of GraphDict().set_edge_weights_uniform(value=0.0)
# Or directly using class.
from kgcnn.graph.preprocessor import SortEdgeIndices
SortEdgeIndices(in_place=True)(GraphDict())
```
- Add `kgcnn.literature.MEGAN` model.
- Add `kgcnn.literature.MXMNet` model.
- Fixed error in `ClinToxDataset` label index.
- Reworked `kgcnn.graph.adj.get_angle_index` with additional function arguments. Default behaviour remains identical. For periodic systems an additional `allow_reverse_edges=True` is now required.
- Added input embedding for edges in `kgcnn.literature.EGNN`. Debugged model.
- Reworked `kgcnn.literature.PAiNN` to simplify the normalization option and add an equivariant initialization method.
- Refactored `kgcnn.data.qm` including all QM7-QM9 datasets. Improved documentation. If an error occurs, please run `QM9Dataset(reload=True)`.
- Refactored `kgcnn.data.moleculenet`. Interface and behaviour do not change.
- Renamed `kgcnn.mol.graph_babel` and `kgcnn.mol.graph_rdkit` and moved conversion into `kgcnn.mol.convert`.
- Added `kgcnn.data.datasets.MD17Dataset` and `kgcnn.data.datasets.MD17RevisedDataset`.
- Refactored the `kgcnn.scaler` module to follow sklearn definitions. Changed input naming and order for scalers. Added config and weights functionality.
- Changed `kgcnn.training.scheduler.LinearWarmupExponentialLearningRateScheduler` to correctly take the lifetime parameter.
- Reworked `kgcnn.data.datasets.QM9Dataset` to offer atomization energies and uncharacterized molecules. Please rerun with `reload=True` (see the sketch after this list).
- Improved docs for `kgcnn.training`.
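Both QM9 items above suggest a reload after upgrading; a minimal sketch (the import path follows kgcnn's one-module-per-dataset layout, and the initial download and processing can take a while):

```python
from kgcnn.data.datasets.QM9Dataset import QM9Dataset

# Re-download and re-process the cached QM9 files after upgrading, as suggested above.
dataset = QM9Dataset(reload=True)
print(len(dataset))  # number of molecules held in memory
```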
kgcnn 2.1.0
- Removed reserved properties from `MemoryGraphList`; please use set/get methods.
- Removed deprecated `kgcnn.selection` module.
- Added history score summary in `kgcnn.training.history`.
- Reworked training. Having plots takes up more memory; prefer the summary table of benchmarks.
- Changed `kgcnn.data.datasets.PROTEINSDataset` to binary graph labels.
- Add `kgcnn.literature.CMPNN` model.
- Add `kgcnn.literature.EGNN` model.
- Merged `set_attributes` into `read_in_memory` for `MoleculeNetDataset` and made `set_attributes` an alias of `read_in_memory`.
- Fix error of node updates in `kgcnn.literature.GAT`. Rerunning training.
- Fix learning rate scheduler `kgcnn.training.scheduler.LinearLearningRateScheduler` minimum learning rate if trained beyond the epoch argument.
- Removed `kgcnn.layers.casting.ChangeIndexing` as it was not used.
- Added `kgcnn.layers.casting.CastEdgeIndicesToDenseAdjacency`.
- Merged `kgcnn.layers.mlp.MLP` and `kgcnn.layers.mlp.GraphMLP`, but kept `GraphMLP` as an alias. Change in kwargs for "normalization_technique" (see the sketch after this list).
- Moved `kgcnn.layers.conv.message` to `kgcnn.layers.message`.
- Refactored `kgcnn.layers.conv.attention` into `kgcnn.layers.conv.gat_conv` and `kgcnn.layers.conv.attentivefp_conv`.
- In `MoleculeNetDataset` and `QMDataset` changed the shape of 'edge_number' to be `(N, )` instead of `(N, 1)`, to agree with the 'node_number' shape.
- Removed `kgcnn.layers.conv.sparse` as it was not used and added its content to `kgcnn.layers.conv.gcn_conv` and `kgcnn.layers.casting`.
- Started with `kgcnn.graph.preprocessor`.
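A minimal sketch of the merged MLP interface on ragged node features; the shown normalization kwargs (`use_normalization`, `normalization_technique="graph_batch"`) are assumptions based on common kgcnn configs and may differ in detail:

```python
import tensorflow as tf
from kgcnn.layers.mlp import GraphMLP

# Per-node MLP with graph-aware normalization (kwargs are assumptions, see above).
mlp = GraphMLP(
    units=[64, 32, 1],
    activation=["relu", "relu", "linear"],
    use_normalization=True,
    normalization_technique="graph_batch",
)

# Ragged node features for two graphs with 2 and 3 nodes, respectively.
nodes = tf.ragged.constant(
    [[[0.1, 0.2], [0.3, 0.4]],
     [[0.5, 0.6], [0.7, 0.8], [0.9, 1.0]]],
    ragged_rank=1
)
out = mlp(nodes)  # per-node output, shape (2, None, 1)
```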
kgcnn 2.0.4
- Add `kgcnn.crystal` module, which is still in development.
- Add `get_weights` and `get_config` to `kgcnn.scaler`.
- Add `get` and `set` aliases to `GraphDict` and `MemoryGraphList`, which now can be used to assign and obtain graph properties (see the sketch after this list).
- Refactored `GraphDict` and `adj` into `kgcnn.graph`.
- Add a `set_range_periodic` function to `GraphDict`.
- Add `make_crystal_model` functions to SchNet, Megnet, DimeNetPP.
- Add `custom_transform` to `MoleculeNetDataset`.
- Removed `add_hydrogen`, `make_conformer`, and `optimize_conformer` from the constructor of `MolGraphInterface`.
- Added `add_hs`, `make_conformer` and `optimize_conformer` to `MolGraphInterface`.
- Add normalization option to PAiNN and add `make_crystal_model`.
- Add `kgcnn.literature.CGCNN` model with docs.
- Add more list-functionality to `MemoryGraphList`.
- Add `tutorial_model_loading_options.ipynb` to notebooks, showing different ways to load ragged data.
- Add `tutorial_hyper_optuna.ipynb` to notebooks.
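A minimal sketch of the new `get`/`set` aliases on `GraphDict` (the property names and values are just illustrative):

```python
import numpy as np
from kgcnn.data.base import GraphDict

graph = GraphDict()
# `set` assigns a named graph property, `get` retrieves it.
graph.set("node_number", np.array([6, 1, 1, 1, 1]))
graph.set("edge_indices", np.array([[0, 1], [1, 0], [0, 2], [2, 0]]))
print(graph.get("node_number"))
```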
kgcnn 2.0.3
- Fix typo to read `kgcnn.mol.encoder`.
- Fix bug in `GraphDict.from_networkx()` for edge attributes.
- Improved docs overall.
- Added ragged node/edge embedding output for TF > 2.8 via "output_to_tensor" model config.
- Added `make_function` option to training scripts.
- Refactored GraphDict methods into `kgcnn.data.adapter.GraphMethodsAdapter`.
- Removed `kgcnn.layers.modules.ReduceSum` as it has not been used and may be problematic.
- Moved `kgcnn.utils.data` to `kgcnn.data.utils`.
- Refactored SMILES-to-mol generation into `kgcnn.mol.convert` and renamed `kgcnn.mol.gen` to `kgcnn.mol.external`.
- Fixed bug for `GatherEmbedding` to have the correct concat axis if the index tensor happens to be of rank > 3 but ragged_rank = 1.
- Refactored `kgcnn.mol` methods into modules and renamed `graphRD` and `graphBabel`.
- Continued to work on `kgcnn.data.crystal.CrystalDataset`.
- Added `MatBenchDataset2020` dataset to `kgcnn.data.datasets`.
kgcnn 2.0.2
Changes, updates and bug fixes
- Continue with dataset serialization. Changed training hyperparameter for dataset serialization and ran training examples.
- Refactored scaler and training utilities with serialization but kept old references.
- Added first draft of HamNet.
- Removed Haste layers, since integration did not work smoothly.
- Simplified MLPBase code.
- Changed `MemoryGraphList.clean()` to return the list of removed indices (see the sketch after this list).
- Replaced explicit keras (python) import with `ks = tf.keras`.
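A minimal sketch of the changed `clean()` return value, assuming `clean` accepts a list of required property names (the property names here are illustrative):

```python
import numpy as np
from kgcnn.data.base import GraphDict, MemoryGraphList

graphs = MemoryGraphList([
    GraphDict({"node_number": np.array([6, 1, 1, 1])}),
    GraphDict({}),  # missing "node_number"; will be dropped by clean()
])
# clean() removes graphs that lack the listed properties and, since 2.0.2,
# returns the indices of the removed graphs.
removed = graphs.clean(["node_number"])
print(removed, len(graphs))
```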
kgcnn 2.0.1
Changes, updates and bug fixes
- Fixed error for `MoleculeNetDataset` for invalid SMILES which stopped at graph size in #44.
- Changed name `GraphNumpyContainer` to `GraphDict` but kept it as an alias.
- Added `to_networkx` and `from_networkx` to `GraphDict` (see the sketch after this list).
- Changed the name of functional arguments for methods in `GraphDict`, i.e. removed the "name_" prefix. This does change previous usage if no default arguments are used!
- Added notebook to show usage of `GraphDict`.
- Fix bug in `set_edge_indices_reverse` for `GraphDict`.
- Refactored internal structure of `MoleculeNetDataset` to support a `custom_callbacks` parameter for attributes. The interface did not change otherwise. Accepted #45.
- Added list to initialization of `MemoryGraphList`.
- Updated "README.md".
- Started with serialization of datasets.
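A minimal sketch of the new networkx conversion, using the same `GraphDict` import as in the 2.1.1 example above (attribute handling is left at defaults; the exact keyword options may differ):

```python
import networkx as nx
from kgcnn.data.base import GraphDict

nx_graph = nx.cycle_graph(4)   # simple undirected 4-cycle as a toy example
graph = GraphDict()
graph.from_networkx(nx_graph)  # fill the GraphDict from a networkx graph
nx_back = graph.to_networkx()  # and convert back
print(list(graph.keys()))
```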
kgcnn 2.0.0
Release version 2.0
New version 2 of kgcnn including refactored layers, datasets and training scripts. Changes are not fully compatible with version <2.0.0 of kgcnn. Future updates of version 2.0 are not supposed to introduce breaking changes to the API but add additional functionality and bug fixes. More extensive documentation and tutorials are planned. Some notable changes of version 2.0:
Changes and new features
- More general training scripts for each overall dataset type, like QM, TUDataset. Changes in the hyperparameter config to incorporate future extensions, e.g. cross-validation or scaling and encoding of targets with keras-like serialization.
- Renamed dataset classes to also dynamically load the dataset module in training scripts.
- Dataset base class is now a list of individual graphs instead of a collection of lists.
- A graph is represented by a python dictionary of named arrays for indices, node attributes etc.
- Every literature model now has an MLP at the output which defines the model output activation and units.
- Added normalization layers for graphs following the keras naming convention.
- The `MLP` class now also has dropout and normalization options and therefore had to be split into `GraphMLP` and `MLP`, since normalization on graphs is not identical to feed-forward networks. Without normalization they are interchangeable.
- Renaming of base layers, since in tf.version > 2.7 it is not possible to have a module named keras.py and identical layer names. Most standard keras layers now fully support ragged tensors, as for example `Dense`. We therefore renamed `keras.py` into `modules.py` and layers into e.g. `LazyAdd`, `LazyConcatenate` and `DenseEmbedding` for using keras-corresponding layers compatible with tf.version < 2.7 (a short sketch follows after this list).
- `MoleculeNetDataset` now uses both RDKit and OpenBabel to generate molecular structures from SMILES more reliably and optionally generate conformers. Conformer generation is parallelized for multiple molecules to greatly speed up SDF file generation. Note that node/edge or atom/bond attributes still require RDKit, and molecules that fail the RDKit valence test can not be used if specific attributes are desired. We also started to add the option for external conformer generation programs such as balloon in place of RDKit/OpenBabel. OpenBabel is now also a necessary dependency for `MoleculeNetDataset`.
- Started to add a periodic crystal graph plug-in for the standard GNNs in kgcnn, but that is not working yet. It will be possible in future releases.
- Warning and info notifications in modules now use the `logging` package instead of standard print.
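To illustrate the renamed module, a minimal sketch using the ragged-compatible layers from `kgcnn.layers.modules` (the feature values are made up):

```python
import tensorflow as tf
from kgcnn.layers.modules import DenseEmbedding, LazyAdd

# Ragged node features for two graphs with 2 and 3 nodes, respectively.
nodes = tf.ragged.constant(
    [[[0.1, 0.2], [0.3, 0.4]],
     [[0.5, 0.6], [0.7, 0.8], [0.9, 1.0]]],
    ragged_rank=1
)

embedded = DenseEmbedding(units=16, activation="relu")(nodes)  # Dense applied per node
out = LazyAdd()([embedded, embedded])                          # element-wise add on ragged tensors
print(out.shape)  # (2, None, 16)
```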
kgcnn 1.1.1
Changes, updates and bug fixes
- Refactored training hyper-parameters to be more closely matched to keras.
- Training parameters are now loaded from python or json/yaml and are visible in training. Removed `hyper.datasets`.
- Fixed multiple bugs in `kgcnn.data.moleculenet`.
- Added first version of `DMPNN` to literature.
- Added examples for how to use MoleculeNet and training parameters in jupyter notebooks.