Release 0.17.0 #176


Merged · 22 commits · Jun 18, 2025
Commits (22)
aad9813  Merge pull request #160 from casework/release-0.16.0  (kchason, Jan 25, 2024)
972bf74  Add UUID generator incorporating dictionary entry keys  (sldouglas-nist, Aug 21, 2024)
bf4da9a  Merge pull request #164 from sldouglas-nist/add_dictionary_entry_inhe…  (kchason, Aug 21, 2024)
8896c73  Bump GitHub Actions  (ajnelson-nist, Aug 26, 2024)
528029b  Merge pull request #165 from casework/bump_github_actions  (kchason, Aug 26, 2024)
eeb1b96  Use build module  (ajnelson-nist, Aug 26, 2024)
917ad51  Merge pull request #166 from casework/use_build  (kchason, Aug 26, 2024)
06f676e  Suffix upload to avoid conflict  (kchason, Aug 27, 2024)
02c844f  Merge pull request #168 from casework/fix-ci-develop  (ajnelson-nist, Aug 27, 2024)
a5a2428  Update pySHACL import to recognize typing  (ajnelson-nist, Nov 13, 2024)
bdc5d31  Regenerate Make-managed files  (ajnelson-nist, Nov 13, 2024)
0843d4d  Document incongruous file contents  (ajnelson-nist, Nov 13, 2024)
d5c8299  Merge pull request #170 from casework/update_pyshacl_effects  (kchason, Nov 14, 2024)
87d2d6e  Run and apply pre-commit autoupdate  (ajnelson-nist, Nov 14, 2024)
908dd36  Run pre-commit autoupdate  (ajnelson-nist, Jun 4, 2025)
9be3f6b  Bump latter tested Python version to 3.13  (ajnelson-nist, Jun 4, 2025)
ad8a860  Build CASE 1.4.0 monolithic .ttl files  (ajnelson-nist, Jun 4, 2025)
c5befed  Bump CASE pointer to 1.4.0 release; update inherited tests; remove Vo…  (ajnelson-nist, Jun 4, 2025)
e3a7823  Regenerate Make-managed files  (ajnelson-nist, Jun 4, 2025)
6775123  Merge pull request #174 from casework/build_case_1.4.0  (kchason, Jun 11, 2025)
c86dd91  Merge pull request #173 from casework/bump_python  (vulnmaster, Jun 11, 2025)
63f40b9  Bump versions  (ajnelson-nist, Jun 17, 2025)
21 changes: 13 additions & 8 deletions .github/workflows/cicd.yml

@@ -39,32 +39,37 @@ jobs:
       matrix:
         python-version:
           - '3.9'
-          - '3.12'
+          - '3.13'

     steps:
-      - uses: actions/checkout@v3
-      - uses: actions/setup-java@v3
+      - uses: actions/checkout@v4
+
+      - uses: actions/setup-java@v4
         with:
           distribution: 'temurin'
           java-version: '11'

       - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v4
+        uses: actions/setup-python@v5
         with:
           python-version: ${{ matrix.python-version }}

       - name: Pre-commit Checks
         run: |
           pip -q install pre-commit
           pre-commit run --all-files

+      - name: Start from clean state
+        run: make clean
+
       - name: Run tests
         run: make PYTHON3=python check

       # Build the binary wheel as well as the source tar
       - name: Build Objects
         run: |
-          pip install -q twine wheel
-          python setup.py sdist bdist_wheel
+          pip install -q twine build
+          python -m build

       # Ensure the objects were packaged correctly and there wasn't an issue with
       # the compilation or packaging process.
@@ -73,9 +78,9 @@ jobs:

       # Upload the packages on all develop and main pipleines for test consumption
       - name: Upload HTML Docs
-        uses: actions/upload-artifact@v3
+        uses: actions/upload-artifact@v4
         with:
-          name: packages
+          name: packages-${{ matrix.python-version }}
           path: ./dist/

       # If this commit is the result of a Git tag, push the wheel and tar packages
6 changes: 3 additions & 3 deletions .github/workflows/prerelease.yml

@@ -30,12 +30,12 @@ jobs:
       matrix:
         python-version:
           - '3.9'
-          - '3.12'
+          - '3.13'

     steps:
-      - uses: actions/checkout@v3
+      - uses: actions/checkout@v4
       - name: Set up Python ${{ matrix.python-version }}
-        uses: actions/setup-python@v4
+        uses: actions/setup-python@v5
         with:
           python-version: ${{ matrix.python-version }}
       - name: Review dependencies
6 changes: 3 additions & 3 deletions .pre-commit-config.yaml

@@ -1,14 +1,14 @@
 repos:
   - repo: https://github.com/psf/black
-    rev: 23.12.1
+    rev: 25.1.0
     hooks:
       - id: black
   - repo: https://github.com/pycqa/flake8
-    rev: 7.0.0
+    rev: 7.2.0
     hooks:
       - id: flake8
   - repo: https://github.com/pycqa/isort
-    rev: 5.13.2
+    rev: 6.0.1
     hooks:
       - id: isort
         name: isort (python)
13 changes: 3 additions & 10 deletions case_utils/case_file/__init__.py

@@ -18,7 +18,7 @@
 This module creates a graph object that provides a basic UCO characterization of a single file. The gathered metadata is among the more "durable" file characteristics, i.e. characteristics that would remain consistent when transferring a file between locations.
 """

-__version__ = "0.6.0"
+__version__ = "0.7.0"

 import argparse
 import datetime
@@ -38,7 +38,6 @@
     NS_UCO_CORE,
     NS_UCO_OBSERVABLE,
     NS_UCO_TYPES,
-    NS_UCO_VOCABULARY,
     NS_XSD,
 )
@@ -228,14 +227,9 @@ def create_file_node(

     l_hash_method: rdflib.Literal
     if key in ("sha3_256", "sha3_512"):
-        l_hash_method = rdflib.Literal(
-            key.replace("_", "-").upper(),
-            datatype=NS_UCO_VOCABULARY.HashNameVocab,
-        )
+        l_hash_method = rdflib.Literal(key.replace("_", "-").upper())
     else:
-        l_hash_method = rdflib.Literal(
-            key.upper(), datatype=NS_UCO_VOCABULARY.HashNameVocab
-        )
+        l_hash_method = rdflib.Literal(key.upper())

     hash_value: str = getattr(successful_hashdict, key)
     l_hash_value = rdflib.Literal(hash_value.upper(), datatype=NS_XSD.hexBinary)
@@ -300,7 +294,6 @@ def main() -> None:
     graph.namespace_manager.bind("uco-core", NS_UCO_CORE)
     graph.namespace_manager.bind("uco-observable", NS_UCO_OBSERVABLE)
     graph.namespace_manager.bind("uco-types", NS_UCO_TYPES)
-    graph.namespace_manager.bind("uco-vocabulary", NS_UCO_VOCABULARY)
     graph.namespace_manager.bind("xsd", NS_XSD)

     output_format = None
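The hunk above drops the `uco-vocabulary:HashNameVocab` datatype from the hash-method literals, but the branch that derives the method label from the hashlib attribute key is unchanged. A minimal stdlib-only sketch of that label logic (the `rdflib.Literal` wrapping from the diff is omitted here):

```python
import hashlib


def hash_method_label(key: str) -> str:
    """Mirror the diff's branch: SHA-3 keys regain their hyphen
    (sha3_256 -> SHA3-256); every other key is simply upper-cased."""
    if key in ("sha3_256", "sha3_512"):
        return key.replace("_", "-").upper()
    return key.upper()


# As in create_file_node, the label pairs with an upper-cased hex digest.
digest = hashlib.sha3_256(b"example").hexdigest().upper()
print(hash_method_label("sha3_256"), hash_method_label("sha256"))
```

This keeps the on-graph hash names identical to the previous release; only the literal's datatype annotation is removed.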
10 changes: 6 additions & 4 deletions case_utils/case_sparql_construct/__init__.py

@@ -18,7 +18,7 @@
 This script executes a SPARQL CONSTRUCT query, returning a graph of the generated triples.
 """

-__version__ = "0.2.6"
+__version__ = "0.2.7"

 import argparse
 import logging
@@ -42,9 +42,11 @@ def main() -> None:

     # Configure debug logging before running parse_args, because there could be an error raised before the construction of the argument parser.
     logging.basicConfig(
-        level=logging.DEBUG
-        if ("--debug" in sys.argv or "-d" in sys.argv)
-        else logging.INFO
+        level=(
+            logging.DEBUG
+            if ("--debug" in sys.argv or "-d" in sys.argv)
+            else logging.INFO
+        )
     )

     parser.add_argument("-d", "--debug", action="store_true")
10 changes: 6 additions & 4 deletions case_utils/case_sparql_select/__init__.py

@@ -29,7 +29,7 @@
 Should a more complex query be necessary, an outer, wrapping SELECT query would let this script continue to function.
 """

-__version__ = "0.5.2"
+__version__ = "0.5.3"

 import argparse
 import binascii
@@ -197,9 +197,11 @@ def main() -> None:

     # Configure debug logging before running parse_args, because there could be an error raised before the construction of the argument parser.
     logging.basicConfig(
-        level=logging.DEBUG
-        if ("--debug" in sys.argv or "-d" in sys.argv)
-        else logging.INFO
+        level=(
+            logging.DEBUG
+            if ("--debug" in sys.argv or "-d" in sys.argv)
+            else logging.INFO
+        )
     )

     parser.add_argument("-d", "--debug", action="store_true")
28 changes: 15 additions & 13 deletions case_utils/case_validate/__init__.py

@@ -32,7 +32,7 @@
 details.)
 """

-__version__ = "0.5.0"
+__version__ = "0.6.0"

 import argparse
 import logging
@@ -41,7 +41,7 @@
 import warnings
 from typing import Any, Dict, List, Optional, Tuple, Union

-import pyshacl  # type: ignore
+import pyshacl
 import rdflib
 from rdflib import Graph
@@ -120,14 +120,14 @@ def validate(
     )

     # Validate data graph against ontology graph.
-    validate_result: Tuple[
-        bool, Union[Exception, bytes, str, rdflib.Graph], str
-    ] = pyshacl.validate(
-        data_graph,
-        *args,
-        ont_graph=ontology_graph,
-        shacl_graph=ontology_graph,
-        **kwargs,
+    validate_result: Tuple[bool, Union[Exception, bytes, str, rdflib.Graph], str] = (
+        pyshacl.validate(
+            data_graph,
+            *args,
+            ont_graph=ontology_graph,
+            shacl_graph=ontology_graph,
+            **kwargs,
+        )
     )

     # Relieve RAM of the data graph after validation has run.
@@ -159,9 +159,11 @@ def main() -> None:
     # could be an error raised before the construction of the argument
     # parser.
     logging.basicConfig(
-        level=logging.DEBUG
-        if ("--debug" in sys.argv or "-d" in sys.argv)
-        else logging.INFO
+        level=(
+            logging.DEBUG
+            if ("--debug" in sys.argv or "-d" in sys.argv)
+            else logging.INFO
+        )
     )

     # Add arguments specific to case_validate.
70 changes: 60 additions & 10 deletions case_utils/inherent_uuid.py

@@ -57,7 +57,7 @@
 >>> assert str(n_file_facet)[-36:] == str(n_file_facet_2)[-36:]
 """

-__version__ = "0.1.2"
+__version__ = "0.2.0"

 import binascii
 import re
@@ -66,16 +66,16 @@

 from rdflib import Literal, Namespace, URIRef

-from case_utils.namespace import NS_UCO_CORE, NS_UCO_VOCABULARY, NS_XSD
+from case_utils.namespace import NS_UCO_CORE, NS_XSD

-L_MD5 = Literal("MD5", datatype=NS_UCO_VOCABULARY.HashNameVocab)
-L_SHA1 = Literal("SHA1", datatype=NS_UCO_VOCABULARY.HashNameVocab)
-L_SHA256 = Literal("SHA256", datatype=NS_UCO_VOCABULARY.HashNameVocab)
-L_SHA3_256 = Literal("SHA3-256", datatype=NS_UCO_VOCABULARY.HashNameVocab)
-L_SHA3_512 = Literal("SHA3-512", datatype=NS_UCO_VOCABULARY.HashNameVocab)
-L_SHA384 = Literal("SHA384", datatype=NS_UCO_VOCABULARY.HashNameVocab)
-L_SHA512 = Literal("SHA512", datatype=NS_UCO_VOCABULARY.HashNameVocab)
-L_SSDEEP = Literal("SSDEEP", datatype=NS_UCO_VOCABULARY.HashNameVocab)
+L_MD5 = Literal("MD5")
+L_SHA1 = Literal("SHA1")
+L_SHA256 = Literal("SHA256")
+L_SHA3_256 = Literal("SHA3-256")
+L_SHA3_512 = Literal("SHA3-512")
+L_SHA384 = Literal("SHA384")
+L_SHA512 = Literal("SHA512")
+L_SSDEEP = Literal("SSDEEP")

 # Key: hashMethod literal.
 # Value: Tuple.
@@ -96,6 +96,15 @@
 )


+def dictionary_entry_inherence_uuid(
+    uco_object_uuid_namespace: uuid.UUID, key_name: str, *args: Any, **kwargs: Any
+) -> uuid.UUID:
+    """
+    This function returns a UUIDv5 for dictionary entries, incorporating the key string's value.
+    """
+    return uuid.uuid5(uco_object_uuid_namespace, key_name)
+
+
 def inherence_uuid(n_thing: URIRef, *args: Any, **kwargs: Any) -> uuid.UUID:
     """
     This function returns a UUIDv5 for any OWL Thing, that can be used as a UUID Namespace in further `uuid.uuidv5` calls.
@@ -152,6 +161,47 @@ def facet_inherence_uuid(
     return uuid.uuid5(uco_object_inherence_uuid, str(n_facet_class))


+def get_dictionary_entry_uriref(
+    n_dictionary: URIRef,
+    n_dictionary_entry_class: URIRef,
+    key_name: str,
+    *args: Any,
+    namespace: Namespace,
+    **kwargs: Any
+) -> URIRef:
+    """
+    :param namespace: An RDFLib Namespace object to use for prefixing the Dictionary IRI with a knowledge base prefix IRI.
+    :type namespace rdflib.Namespace:
+
+    :param n_dictionary_entry_class: Assumed to be a "Proper Dictionary", as defined in UCO Issue 602.
+
+    References
+    ==========
+    * https://github.com/ucoProject/UCO/issues/602
+
+    Examples
+    ========
+    A dictionary has to have an entry with key "foo". What is the IRI of the dictionary entry?
+
+    >>> from case_utils.namespace import NS_UCO_TYPES
+    >>> ns_kb = Namespace("http://example.org/kb/")
+    >>> n_dictionary = ns_kb["Dictionary-eb7e68d8-94db-4071-86fa-a51a33dc4a97"]
+    >>> n_dictionary_entry = get_dictionary_entry_uriref(n_dictionary, NS_UCO_TYPES.DictionaryEntry, "foo", namespace=ns_kb)
+    >>> n_dictionary_entry
+    rdflib.term.URIRef('http://example.org/kb/DictionaryEntry-6ce6b412-6a3a-5ebf-993a-9df2c80d2107')
+    """
+    uco_object_uuid_namespace: uuid.UUID = inherence_uuid(n_dictionary)
+    dictionary_entry_uuid = dictionary_entry_inherence_uuid(
+        uco_object_uuid_namespace, key_name
+    )
+
+    dictionary_entry_class_local_name = str(n_dictionary_entry_class).rsplit("/")[-1]
+
+    return namespace[
+        dictionary_entry_class_local_name + "-" + str(dictionary_entry_uuid)
+    ]
+
+
 def get_facet_uriref(
     n_uco_object: URIRef,
     n_facet_class: URIRef,
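The new dictionary-entry helpers chain two name-based (version 5) UUIDs: the dictionary IRI is hashed into a namespace UUID, and the entry key is hashed within that namespace, so equal (dictionary, key) pairs always yield the same entry IRI. A stdlib-only sketch of the chaining; note that `uuid.NAMESPACE_URL` is used here only as an illustrative seed, since the actual seed for `inherence_uuid` is defined inside `case_utils` and is not shown in this diff:

```python
import uuid

# Step 1: derive a namespace UUID from the dictionary's IRI.
# Assumption: seeded from uuid.NAMESPACE_URL for illustration only.
dictionary_iri = "http://example.org/kb/Dictionary-eb7e68d8-94db-4071-86fa-a51a33dc4a97"
dictionary_ns = uuid.uuid5(uuid.NAMESPACE_URL, dictionary_iri)

# Step 2: fold the entry key into a second UUIDv5, as
# dictionary_entry_inherence_uuid does.
entry_uuid = uuid.uuid5(dictionary_ns, "foo")

# Step 3: mint the entry IRI from the class local name plus the UUID.
entry_iri = "http://example.org/kb/DictionaryEntry-" + str(entry_uuid)
print(entry_iri)
```

Because UUIDv5 is deterministic, regenerating a graph never changes the identifiers of existing dictionary entries, which is the point of the inherent-UUID scheme this module implements.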