Sacred logger #656

Closed

30 commits
5e1869d
Setup sacred_logger
expectopatronum Dec 4, 2019
688818c
Rename sacred_logger -> sacred
expectopatronum Jan 2, 2020
62845e1
Add property experiment
expectopatronum Jan 2, 2020
2d662c6
Setup MongoObserver
expectopatronum Jan 2, 2020
a4f709f
Update sacred.py
expectopatronum Jan 2, 2020
8bceb01
Update log_metrics to work with sacred
expectopatronum Jan 2, 2020
26386e7
Absolute import
expectopatronum Jan 2, 2020
e65defd
Make ImportError more informative
expectopatronum Jan 2, 2020
a63ae05
Update __init__.py
expectopatronum Jan 3, 2020
30eb667
Let user pass list of observers
expectopatronum Jan 3, 2020
bfd6a80
Fix error when accessing name/id
expectopatronum Jan 3, 2020
7bbf188
Let observers default to []
expectopatronum Jan 3, 2020
64fda62
Raise exception in log_hyperparams
expectopatronum Jan 3, 2020
d45b554
Fix access to run_id
expectopatronum Jan 3, 2020
40343a2
Replace mutable default argument for observers
expectopatronum Jan 3, 2020
fff9231
Fixes step_num->step in log_metrics
expectopatronum Jan 3, 2020
969c58c
Fixes log_scalar call in log_metrics
expectopatronum Jan 3, 2020
91e8b9a
Update sacred.py
expectopatronum Jan 3, 2020
6b8b1f3
Add documentation for SacredLogger
expectopatronum Jan 5, 2020
f6259e7
Update SacredLogger
expectopatronum Jan 5, 2020
0cb66bb
Update docs/source/conf.py
expectopatronum Jan 5, 2020
8bf8402
Remove optional functions
expectopatronum Jan 5, 2020
a20f670
Add test_sacred_logger
expectopatronum Jan 6, 2020
c743ea5
Add test_sacred_pickle
expectopatronum Jan 6, 2020
6b3702c
Cleanup sacred.py
expectopatronum Jan 6, 2020
36c72f6
Fix flake findings
expectopatronum Jan 7, 2020
e84cbd9
Fix flake8 complaints
expectopatronum Jan 15, 2020
630ddec
Fix test_sacred_logger
expectopatronum Jan 15, 2020
4974847
Remove test_sacred_pickle
expectopatronum Jan 15, 2020
9fcb238
Update test_logging.py
williamFalcon Jan 21, 2020
3 changes: 2 additions & 1 deletion docs/source/conf.py
@@ -295,7 +295,8 @@ def setup(app):
MOCK_REQUIRE_PACKAGES.append(pkg.rstrip())

# TODO: better parse from package since the import name and package name may differ
MOCK_MANUAL_PACKAGES = ['torch', 'torchvision', 'sklearn', 'test_tube', 'mlflow', 'comet_ml', 'wandb', 'neptune']
MOCK_MANUAL_PACKAGES = ['torch', 'torchvision', 'sklearn', 'test_tube', 'mlflow', 'comet_ml', 'wandb', 'neptune',
'sacred']
autodoc_mock_imports = MOCK_REQUIRE_PACKAGES + MOCK_MANUAL_PACKAGES
# for mod_name in MOCK_REQUIRE_PACKAGES:
# sys.modules[mod_name] = mock.Mock()
6 changes: 6 additions & 0 deletions pytorch_lightning/logging/__init__.py
@@ -114,4 +114,10 @@ def any_lightning_module_function_or_hook(...):
except ImportError:
    pass

try:
    from .sacred import SacredLogger
    all.append("SacredLogger")
except ImportError:
    pass

__all__ = all
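The `try`/`except` blocks in `__init__.py` implement an optional-export pattern: a logger name is appended to the module's export list only when its backend package imports cleanly. A minimal, self-contained sketch of the same pattern (the missing module name below is hypothetical, chosen to trigger the `ImportError` branch):

```python
# Optional-export pattern: register a logger only when its backend is installed.
available = []

try:
    import sacred_backend_that_does_not_exist  # hypothetical optional dependency
    available.append("SacredLogger")
except ImportError:
    # Backend missing: the logger name is simply not exported.
    pass

try:
    import logging  # stdlib, always present: stands in for an installed backend
    available.append("StdlibLogger")
except ImportError:
    pass

print(available)  # ['StdlibLogger']
```

Callers can then check membership in the export list instead of wrapping every import themselves.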
78 changes: 78 additions & 0 deletions pytorch_lightning/logging/sacred.py
@@ -0,0 +1,78 @@
"""
Log using `sacred <https://sacred.readthedocs.io/en/stable/index.html>`_.

[Review comment (Member)]: pls check the docs usage

.. code-block:: python

    from pytorch_lightning.logging import SacredLogger

    ex = Experiment()  # initialize however you like
    ex.main(your_main_fct)
    ex.observers.append(
        # add any observer you like
    )
    sacred_logger = SacredLogger(ex)
    trainer = Trainer(logger=sacred_logger)

Use the logger anywhere in your LightningModule as follows:

.. code-block:: python

    def train_step(...):
        # example
        self.logger.experiment.whatever_sacred_supports(...)

    def any_lightning_module_function_or_hook(...):
        self.logger.experiment.whatever_sacred_supports(...)
"""

from logging import getLogger

try:
    import sacred
except ImportError:
    raise ImportError('Missing sacred package. Run `pip install sacred`')

from pytorch_lightning.logging.base import LightningLoggerBase, rank_zero_only

logger = getLogger(__name__)


class SacredLogger(LightningLoggerBase):
    def __init__(self, sacred_experiment):
        """Initialize a sacred logger.

        :param sacred.experiment.Experiment sacred_experiment: Required. Experiment object with desired observers
[Review comment (Member)]: use the updated docs formatting: ``:param sacred_experiment (sacred.experiment.Experiment): ...``
[Reply]: already appended.
        """
        super().__init__()
        self.sacred_experiment = sacred_experiment
        self.experiment_name = sacred_experiment.path
        self._run_id = None

    @property
    def experiment(self):
        return self.sacred_experiment

    @property
    def run_id(self):
        if self._run_id is not None:
            return self._run_id

        self._run_id = self.sacred_experiment.current_run._id
        return self._run_id

    @rank_zero_only
    def log_hyperparams(self, params):
[Review comment (@ozen, Jan 24, 2020)]: I just wanted to share my workflow for your consideration. I use Sacred for general configuration, then Ax for hyperparameter optimization. There is one Sacred experiment and one logger during the whole process, but trainer.fit() is called many times by Ax with different hyperparameters. Logging these in log_hyperparams somehow may be beneficial.
        # probably not needed because it is handled by sacred
        pass

    @rank_zero_only
    def log_metrics(self, metrics, step=None):
        for k, v in metrics.items():
            if isinstance(v, str):
                logger.warning(
                    f"Discarding metric with string value {k}={v}"
                )
                continue
            self.experiment.log_scalar(k, v, step)

    @property
    def name(self):
        return self.experiment_name

    @property
    def version(self):
        return self.run_id
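The `log_metrics` method in `sacred.py` warns on and drops string-valued metrics, since sacred's `log_scalar` expects numbers. A stand-alone sketch of that filtering behavior, using a minimal stand-in for the sacred experiment (`FakeExperiment` is hypothetical, not part of sacred's API):

```python
import logging

logger = logging.getLogger("sacred_demo")


class FakeExperiment:
    """Stand-in for sacred's Experiment; records what log_scalar receives."""

    def __init__(self):
        self.scalars = []

    def log_scalar(self, name, value, step=None):
        self.scalars.append((name, value, step))


def log_metrics(experiment, metrics, step=None):
    # Mirrors SacredLogger.log_metrics: warn on and skip string values,
    # forward everything else to the experiment's scalar log.
    for k, v in metrics.items():
        if isinstance(v, str):
            logger.warning(f"Discarding metric with string value {k}={v}")
            continue
        experiment.log_scalar(k, v, step)


exp = FakeExperiment()
log_metrics(exp, {"loss": 0.25, "acc": 0.9, "stage": "train"}, step=1)
print(exp.scalars)  # [('loss', 0.25, 1), ('acc', 0.9, 1)]
```

The string metric is only warned about, never raised on, so a stray non-numeric value cannot abort a training run.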
42 changes: 42 additions & 0 deletions tests/test_logging.py
@@ -386,3 +386,45 @@ def version(self):
    assert logger.hparams_logged == hparams
    assert logger.metrics_logged != {}
    assert logger.finalized_status == "success"


def test_sacred_logger(tmpdir):
    """Verify that basic functionality of sacred logger works."""
    tutils.reset_seed()

    try:
[Review comment (Member)]: this should not be here, the logger shall be tested
        from pytorch_lightning.logging import SacredLogger
    except ModuleNotFoundError:
        return

    try:
        from sacred import Experiment
    except ModuleNotFoundError:
        return

    hparams = tutils.get_hparams()
    model = LightningTestModel(hparams)
    sacred_dir = os.path.join(tmpdir, "sacredruns")

    ex = Experiment()
    ex_config = vars(hparams)
    ex.add_config(ex_config)

    @ex.main
    def run_fct():
        logger = SacredLogger(ex)

        trainer_options = dict(
            default_save_path=sacred_dir,
            max_epochs=1,
            train_percent_check=0.01,
            logger=logger
        )
        trainer = Trainer(**trainer_options)
        result = trainer.fit(model)
        return result

    result = ex.run()

    print('result finished')
    assert result.status == "COMPLETED", "Training failed"
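The `run_id` property in `sacred.py` has to be lazy: sacred only assigns `current_run` once `ex.run()` starts, which is after the logger is constructed, so the id is fetched on first access and cached. The same memoization shape in isolation (`LazyRunLogger` and `_fetch_id` are illustrative names, not from the PR):

```python
class LazyRunLogger:
    """Caches a late-bound run id on first access, like SacredLogger.run_id."""

    def __init__(self):
        self._run_id = None
        self.fetches = 0  # counts backend lookups, to demonstrate caching

    def _fetch_id(self):
        # Stands in for sacred_experiment.current_run._id, which only
        # exists once the run has actually started.
        self.fetches += 1
        return "run-42"

    @property
    def run_id(self):
        if self._run_id is not None:
            return self._run_id
        self._run_id = self._fetch_id()
        return self._run_id


lg = LazyRunLogger()
print(lg.run_id, lg.run_id, lg.fetches)  # run-42 run-42 1
```

Repeated reads hit the cached value, so the backend is consulted exactly once per logger instance.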