Log non-global metrics

Sometimes, it’s helpful to log non-global metrics to an experiment. To do so, create a lazyscribe.test.Test using lazyscribe.experiment.Experiment.log_test():

from lazyscribe import Project

project = Project(fpath="project.json", mode="w")

with project.log(name="My experiment") as exp:
    with exp.log_test(name="My test", description="Demo test") as test:
        test.log_metric("metric", 0.3)
        test.log_parameter("param", "value")

The test’s parameter is stored alongside its metric.

The lazyscribe.experiment.Experiment.log_test() context manager creates a lazyscribe.test.Test object and logs it back to the experiment when the manager exits. If you prefer not to use the context manager, instantiate a test yourself and append it to the lazyscribe.experiment.Experiment.tests list:

from lazyscribe import Test

with project.log(name="My experiment") as exp:
    test = Test(name="My test", description="Demo test")
    test.log_metric("metric", 0.3)
    exp.tests.append(test)

Logging artifacts to a test

Tests support the same artifact system as experiments. Use lazyscribe.test.Test.log_artifact() to associate an artifact with a test:

from lazyscribe import Project

project = Project(fpath="project.json", mode="w")

with project.log(name="My experiment") as exp:
    with exp.log_test(name="My test") as test:
        test.log_metric("metric", 0.9)
        test.log_artifact(name="predictions", value=[0, 1, 1, 0], handler="json")

project.save()

Artifacts are not written to disk when you call lazyscribe.test.Test.log_artifact(); they are only persisted when you call lazyscribe.project.Project.save(). Test artifact files are stored inside a subdirectory of the experiment’s folder, named after the slugified test name (e.g. my-experiment-YYYYMMDDHHMMSS/my-test/).
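The slugified directory name can be approximated with a small stdlib-only helper. This is an illustrative sketch of the naming convention (lowercase, runs of non-alphanumeric characters collapsed to hyphens), not lazyscribe’s exact slugification routine:

```python
import re


def slugify(name: str) -> str:
    """Approximate slug: lowercase, non-alphanumeric runs become hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", name.lower()).strip("-")


# A test named "My test" would land in a subdirectory named "my-test".
print(slugify("My test"))  # my-test
```

Under this scheme, the artifact logged above would be written beneath the experiment’s folder at a path like my-experiment-YYYYMMDDHHMMSS/my-test/.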

See Save and examine experiment artifacts for the full list of built-in artifact handlers.

Loading artifacts from a test

To load a test artifact, open the project and call lazyscribe.test.Test.load_artifact() on the test object:

from lazyscribe import Project

project = Project(fpath="project.json", mode="r")
exp = project["my-experiment"]
test = exp.tests[0]
predictions = test.load_artifact(name="predictions")

As with experiment artifacts, you can disable runtime environment validation with validate=False:

predictions = test.load_artifact(name="predictions", validate=False)