Ruff format and lint #52
Commits: 758bc40, 47c963d, 9088ac8, d96a768, 4004fad, 595d95c
```diff
@@ -1,14 +1,13 @@
 """
-Transformers (aka data mappers) are used to transform objects from one class to another
-using a transformation specification.
+Transformers (aka data mappers) are used to transform objects from one class to another using a transformation specification.
 """
 import logging
 from abc import ABC
 from copy import deepcopy
 from dataclasses import dataclass, field
 from pathlib import Path
-from typing import Any, Dict, Optional, Union
+from typing import Any, Optional, Union
 import yaml
 from curies import Converter
```

**Review comment:** I fixed various linter errors in this file whilst renaming the variable that codespell kept flagging as a typo.
@@ -30,7 +29,7 @@ | |
| logger = logging.getLogger(__name__) | ||
|
|
||
|
|
||
| OBJECT_TYPE = Union[Dict[str, Any], BaseModel, YAMLRoot] | ||
| OBJECT_TYPE = Union[dict[str, Any], BaseModel, YAMLRoot] | ||
| """An object can be a plain python dict, a pydantic object, or a linkml YAMLRoot""" | ||
|
|
||
|
|
||
|
|
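The `Dict[str, Any]` → `dict[str, Any]` rewrite is the PEP 585 builtin-generics migration, which Ruff's pyupgrade rules automate on Python 3.9+. A minimal before/after sketch (here `str` stands in for the `BaseModel`/`YAMLRoot` members so the snippet is self-contained):

```python
from typing import Any, Union

# Before (pre-PEP 585): required `from typing import Dict`
#   OBJECT_TYPE = Union[Dict[str, Any], BaseModel, YAMLRoot]

# After (Python 3.9+): subscript the builtin type directly
OBJECT_TYPE = Union[dict[str, Any], str]  # str stands in for the model types

def roundtrip(obj: OBJECT_TYPE) -> OBJECT_TYPE:
    # The annotation is purely static; runtime behavior is unchanged
    return obj
```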
```diff
@@ -63,7 +62,9 @@ class Transformer(ABC):
     _curie_converter: Converter = None

-    def map_object(self, obj: OBJECT_TYPE, source_type: str = None, **kwargs) -> OBJECT_TYPE:
+    def map_object(
+        self, obj: OBJECT_TYPE, source_type: Optional[str] = None, **kwargs: dict[str, Any]
+    ) -> OBJECT_TYPE:
         """
         Transform source object into an instance of the target class.
```

**Review comment** (on the new signature): add typing to kwargs
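One subtlety when typing `**kwargs`: per PEP 484 the annotation describes each keyword *value*, not the mapping itself, so `**kwargs: dict[str, Any]` declares every value to be a dict. A small sketch of the semantics (function name invented for illustration):

```python
from typing import get_type_hints

def total(**kwargs: int) -> int:
    # The annotation applies to each value: every keyword argument is an
    # int, and kwargs itself therefore has type dict[str, int].
    return sum(kwargs.values())

# The stored hint is the per-value type, not dict[str, int]:
hints = get_type_hints(total)
```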
```diff
@@ -74,7 +75,7 @@ def map_object(self, obj: OBJECT_TYPE, source_type: str = None, **kwargs) -> OBJ
         raise NotImplementedError

     def map_database(
-        self, source_database: Any, target_database: Optional[Any] = None, **kwargs
+        self, source_database: Any, target_database: Optional[Any] = None, **kwargs: dict[str, Any]
     ) -> OBJECT_TYPE:
         """
         Transform source resource.
```

**Review comment:** add typing
```diff
@@ -86,25 +87,23 @@ def map_database(
         """
         raise NotImplementedError

-    def load_source_schema(self, path: Union[str, Path, dict]):
+    def load_source_schema(self, path: Union[str, Path, dict]) -> None:
         """
-        Sets source_schemaview from a schema path.
+        Set source_schemaview from a schema path.

         :param path:
         :return:
         """
         if isinstance(path, Path):
             path = str(path)
         self.source_schemaview = SchemaView(path)

-    def load_transformer_specification(self, path: Union[str, Path]):
+    def load_transformer_specification(self, path: Union[str, Path]) -> None:
         """
-        Sets specification from a schema path.
+        Set specification from a schema path.

         :param path:
         :return:
         """
         # self.specification = yaml_loader.load(str(path), TransformationSpecification)
         with open(path) as f:
             obj = yaml.safe_load(f)
         # necessary to expand first
```

**Review comment** (on the docstring change): According to pydocstyle, function/method documentation should be in the imperative mood.
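The docstring edits address pydocstyle's D401 check ("First line should be in imperative mood"). A minimal before/after sketch:

```python
def sets_schema(path):
    """Sets source_schemaview from a schema path."""  # D401 flags "Sets"
    return path

def set_schema(path):
    """Set source_schemaview from a schema path."""  # imperative mood: passes
    return path
```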
```diff
@@ -115,9 +114,9 @@ def load_transformer_specification(self, path: Union[str, Path]):
             obj = normalizer.normalize(obj)
         self.specification = TransformationSpecification(**obj)

-    def create_transformer_specification(self, obj: Dict[str, Any]):
+    def create_transformer_specification(self, obj: dict[str, Any]) -> None:
         """
-        Creates specification from a dict.
+        Create specification from a dict.

         TODO: this will no longer be necessary when pydantic supports inlined as dict
```
```diff
@@ -157,22 +156,22 @@ def _get_class_derivation(self, target_class_name: str) -> ClassDerivation:
         ancmap = self._class_derivation_ancestors(cd)
         if ancmap:
             cd = deepcopy(cd)
-            for anc in ancmap.values():
-                for k, v in anc.__dict__.items():
+            for ancestor in ancmap.values():
+                for k, v in ancestor.__dict__.items():
                     if v is not None and v != []:
                         curr_v = getattr(cd, k, None)
                         if isinstance(curr_v, list):
                             curr_v.extend(v)
                         elif isinstance(curr_v, dict):
                             curr_v.update({**v, **curr_v})
-                        else:
-                            if curr_v is None:
-                                setattr(cd, k, v)
+                        elif curr_v is None:
+                            setattr(cd, k, v)
         return cd

-    def _class_derivation_ancestors(self, cd: ClassDerivation) -> Dict[str, ClassDerivation]:
+    def _class_derivation_ancestors(self, cd: ClassDerivation) -> dict[str, ClassDerivation]:
         """
-        Returns a map of all class derivations that are ancestors of the given class derivation.
+        Return a map of all class derivations that are ancestors of the given class derivation.

         :param cd:
         :return:
         """
```

**Review thread** (on lines +159 to +160, the `anc` → `ancestor` rename):
- codespell was convinced this should be
- I have made the same changes myself in other linkml repos :D
- I suppose at least it does enforce good variable naming... even if it is a bit of a nuisance!

**Review comment** (on lines +167 to +168): `else:` followed by `if ...` is equivalent to `elif ...`
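The loop being touched here merges inherited `ClassDerivation` attributes into the current derivation. A sketch of its merge semantics using plain dicts in place of the linkml-map objects (the data below is invented for illustration): lists are extended with ancestor items, dict merges let the current derivation's entries win via `{**v, **curr_v}`, and unset attributes are inherited outright.

```python
ancestor = {"slots": ["a"], "settings": {"x": 1, "y": 2}, "name": "Base"}
current = {"slots": ["b"], "settings": {"y": 9}, "name": None}

for k, v in ancestor.items():
    if v is not None and v != []:
        curr_v = current.get(k)
        if isinstance(curr_v, list):
            curr_v.extend(v)  # ancestor items appended after current ones
        elif isinstance(curr_v, dict):
            # {**v, **curr_v}: on key clashes, the current value wins
            curr_v.update({**v, **curr_v})
        elif curr_v is None:
            current[k] = v  # unset attribute: inherit from the ancestor

# current is now {"slots": ["b", "a"], "settings": {"y": 9, "x": 1}, "name": "Base"}
```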
```diff
@@ -199,7 +198,7 @@ def _get_enum_derivation(self, target_enum_name: str) -> EnumDerivation:

     def _is_coerce_to_multivalued(
         self, slot_derivation: SlotDerivation, class_derivation: ClassDerivation
-    ):
+    ) -> bool:
         cast_as = slot_derivation.cast_collection_as
         if cast_as and cast_as in [
             CollectionType.MultiValued,
```
```diff
@@ -218,7 +217,7 @@ def _is_coerce_to_multivalued(

     def _is_coerce_to_singlevalued(
         self, slot_derivation: SlotDerivation, class_derivation: ClassDerivation
-    ):
+    ) -> bool:
         cast_as = slot_derivation.cast_collection_as
         if cast_as and cast_as == CollectionType(CollectionType.SingleValued):
             return True
```
```diff
@@ -244,7 +243,7 @@ def _coerce_datatype(self, v: Any, target_range: Optional[str]) -> Any:
             "string": str,
             "boolean": bool,
         }
-        cls = cmap.get(target_range, None)
+        cls = cmap.get(target_range)
         if not cls:
             logger.warning(f"Unknown target range {target_range}")
             return v
```
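The `cmap.get(target_range, None)` → `cmap.get(target_range)` change removes a redundant default, since `dict.get` already returns `None` for missing keys (the flake8-simplify rule Ruff exposes for this, SIM910 if I recall the code correctly). A sketch of the coercion pattern around it (helper name invented):

```python
cmap = {"integer": int, "float": float, "string": str, "boolean": bool}

def coerce(v, target_range):
    # dict.get defaults to None, so an explicit None default is redundant
    cls = cmap.get(target_range)
    if not cls:
        return v  # unknown target range: pass the value through unchanged
    return cls(v)

# coerce("5", "integer") -> 5; coerce("5", "mystery") -> "5"
```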
**Review thread** (on the CI workflow):
- Below, also inject poetry dynamic versioning into pipx-installed poetry:
- Versioning isn't really needed to run the tests, is it?
- From the run logs a version 0.0.0 is installed. While it does not break tests, you will also not notice if dynamic versioning breaks or is configured wrong, if version 0.0.0 is "good enough".
- IMO it'd be better to have an explicit test that ensures that versioning is working as expected, which will error out if the versioning plugin hasn't done its job. It can be very simple -- e.g. grepping the version info from one of the places where the plugin automatically updates it, and ensuring that it isn't 0.0.0. I don't want to have to check run logs unless I'm looking for reasons why tests failed.
- I don't think it's worth adding such a workflow or test because it's something that can easily be checked the next time a release is made, and having a workflow that tests the functionality as part of the linkml-map test suite is essentially testing other peoples' code.
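The explicit check proposed above could be as small as a predicate over the installed version string. A sketch, not part of the PR; the distribution name `linkml-map` is assumed from context:

```python
import re
from importlib.metadata import version, PackageNotFoundError

def looks_released(v: str) -> bool:
    """Return True if v is a real release, not the 0.0.0 fallback."""
    return bool(re.fullmatch(r"\d+\.\d+\.\d+([.+-].*)?", v)) and not v.startswith("0.0.0")

def test_dynamic_versioning_applied():
    # In the real suite this would query the installed distribution:
    try:
        v = version("linkml-map")  # distribution name assumed
    except PackageNotFoundError:
        return  # not installed in this environment; nothing to check
    assert looks_released(v)
```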