BUG: Running tests/tensor/test_basic.py etc. after tests/test_printing.py fails #591

@carlsmedstad

Description


Describe the issue:

It seems that something in tests/test_printing.py pollutes the global environment and, as a consequence, makes several unrelated tests fail, specifically tests in tests/tensor/test_basic.py, tests/tensor/test_variable.py, and tests/tensor/rewriting/test_elemwise.py.
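To illustrate the failure mode, here is a minimal, self-contained sketch (with hypothetical test names and a dict standing in for `pytensor.config`) of how a test that mutates shared state without restoring it makes a later, unrelated test order-dependent:

```python
# Hypothetical sketch of order-dependent test pollution: an earlier
# test mutates shared state and fails to restore it, so a later,
# unrelated test sees the wrong value.

settings = {"profile_optimizer": False}  # stand-in for pytensor.config

def test_printing_style():
    # BUG: chained assignment binds `old` to True, not the prior value.
    old = settings["profile_optimizer"] = True
    # ... exercise printing with profiling enabled ...
    settings["profile_optimizer"] = old  # "restores" True, not False

def unrelated_check():
    return settings["profile_optimizer"] is False

ran_alone = unrelated_check()   # True: passes in isolation
test_printing_style()
ran_after = unrelated_check()   # False: polluted by the earlier test
```

Run alone, `unrelated_check` passes; run after `test_printing_style`, it fails, mirroring the pytest behavior reproduced below.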

To reproduce, perform the following steps:

Create virtualenv and install dependencies:

$ python -m venv .venv
$ source .venv/bin/activate
$ pip install -e .
Obtaining file:///home/carsme/repos/github.com/pymc-devs/pytensor
  Installing build dependencies ... done
  Checking if build backend supports build_editable ... done
  Getting requirements to build editable ... done
  Installing backend dependencies ... done
  Preparing editable metadata (pyproject.toml) ... done
...

Run tests/tensor/test_basic.py - it passes:

$ pytest tests/tensor/test_basic.py
========================================================== test session starts ===========================================================
platform linux -- Python 3.11.6, pytest-7.4.4, pluggy-1.3.0
benchmark: 4.0.0 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /home/carsme/repos/github.com/pymc-devs/pytensor
plugins: httpserver-1.0.8, asyncio-0.23.3, anyio-4.2.0, mypy-0.10.3, rerunfailures-13.0, benchmark-4.0.0, json-report-1.5.0, cov-4.1.0, metadata-3.0.0, hypothesis-6.92.2, mock-3.12.0
asyncio: mode=Mode.STRICT
collected 447 items

tests/tensor/test_basic.py ....................................................................................................... [ 23%]
.................................................................................................................................. [ 52%]
...........................................................................................................s.......x.............. [ 81%]
..................................................................s.................                                               [100%]

=============================================== 444 passed, 2 skipped, 1 xfailed in 52.11s ===============================================

Run tests/tensor/test_basic.py after tests/test_printing.py - it fails:

$ pytest tests/test_printing.py tests/tensor/test_basic.py
========================================================== test session starts ===========================================================
platform linux -- Python 3.11.6, pytest-7.4.4, pluggy-1.3.0
benchmark: 4.0.0 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /home/carsme/repos/github.com/pymc-devs/pytensor
plugins: httpserver-1.0.8, asyncio-0.23.3, anyio-4.2.0, mypy-0.10.3, rerunfailures-13.0, benchmark-4.0.0, json-report-1.5.0, cov-4.1.0, metadata-3.0.0, hypothesis-6.92.2, mock-3.12.0
asyncio: mode=Mode.STRICT
collected 460 items

tests/test_printing.py .............                                                                                               [  2%]
tests/tensor/test_basic.py F.FF.F.F..F.F..F.F.....F....F....F....F....F....F....F....F.F..F.F..F.FFFFFFFF.F.....................FF [ 25%]
...
tests/tensor/test_basic.py:4537:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
pytensor/compile/function/__init__.py:315: in function
    fn = pfunc(
pytensor/compile/function/pfunc.py:469: in pfunc
    return orig_function(
pytensor/compile/function/types.py:1750: in orig_function
    m = Maker(
pytensor/compile/function/types.py:1523: in __init__
    self.prepare_fgraph(inputs, outputs, found_updates, fgraph, mode, profile)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

inputs = [In(<Tensor4(float64, shape=(5, 5, 5, 5))>)], outputs = [Out(ExtractDiag{offset=1, axis1=1, axis2=3, view=False}.0,False)]
additional_outputs = []
fgraph = FunctionGraph(ExtractDiag{offset=1, axis1=1, axis2=3, view=False}(<Tensor4(float64, shape=(5, 5, 5, 5))>))
mode = <pytensor.compile.mode.Mode object at 0x7fa25374d210>, profile = None

    @staticmethod
    def prepare_fgraph(
        inputs,
        outputs,
        additional_outputs,
        fgraph: FunctionGraph,
        mode: "Mode",
        profile,
    ):
        rewriter = mode.optimizer

        try:
            start_rewriter = time.perf_counter()

            rewriter_profile = None
            rewrite_time = None

            with config.change_flags(
                mode=mode,
                compute_test_value=config.compute_test_value_opt,
                traceback__limit=config.traceback__compile_limit,
            ):
                rewriter_profile = rewriter(fgraph)

                end_rewriter = time.perf_counter()
                rewrite_time = end_rewriter - start_rewriter
                _logger.debug(f"Rewriting took {rewrite_time:f} seconds")

                # Add deep copy to respect the memory interface
                insert_deepcopy(fgraph, inputs, outputs + additional_outputs)
        finally:
            # If the rewriter got interrupted
            if rewrite_time is None:
                end_rewriter = time.perf_counter()
                rewrite_time = end_rewriter - start_rewriter

            pytensor.compile.profiling.total_graph_rewrite_time += rewrite_time

            if profile:
                if rewriter_profile is None and hasattr(rewriter, "pre_profile"):
                    rewriter_profile = rewriter.pre_profile

                profile.rewriting_time += rewrite_time

                if config.profile_optimizer:
                    profile.rewriter_profile = (rewriter, rewriter_profile)
            elif config.profile_optimizer and profile is not False:
                # If False, it means the profiling for that function was
                # explicitly disabled
>               warnings.warn(
                    (
                        "config.profile_optimizer requires config.profile to "
                        " be set to True as well"
                    ),
                    stacklevel=3,
                )
E               UserWarning: config.profile_optimizer requires config.profile to  be set to True as well

pytensor/compile/function/types.py:1438: UserWarning
======================================================== short test summary info =========================================================
...
FAILED tests/tensor/test_basic.py::test_full_like[scalar-shape0] - UserWarning: config.profile_optimizer requires config.profile to  be set to True as well
FAILED tests/tensor/test_basic.py::test_full_like[vector-3] - UserWarning: config.profile_optimizer requires config.profile to  be set to True as well
FAILED tests/tensor/test_basic.py::test_full_like[matrix-shape2] - UserWarning: config.profile_optimizer requires config.profile to  be set to True as well
FAILED tests/tensor/test_basic.py::test_trace - UserWarning: config.profile_optimizer requires config.profile to  be set to True as well
FAILED tests/tensor/test_basic.py::test_vectorize_extract_diag - UserWarning: config.profile_optimizer requires config.profile to  be set to True as well
==================================== 248 failed, 209 passed, 2 skipped, 1 xfailed in 71.37s (0:01:11) ====================================

I believe it has something to do with the following lines:

$ rg --no-ignore-vcs 'config.profile' tests/
tests/test_printing.py
468:    old_profile_optimizer_config_value = pytensor.config.profile_optimizer = True
471:    pytensor.config.profile_optimizer = old_profile_optimizer_config_value
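The chained assignment on line 468 is the culprit: `a = b = True` assigns `True` to both targets, so `old_profile_optimizer_config_value` is bound to `True` rather than to the previous config value, and the "restore" on line 471 leaves `profile_optimizer` enabled for every subsequent test. A minimal sketch of the pitfall and a correct save/restore, with a plain object standing in for `pytensor.config`:

```python
class Config:
    """Stand-in for pytensor.config."""
    profile_optimizer = False

config = Config()

# Buggy pattern from test_printing.py: the chained assignment binds
# BOTH names to True, so the pre-test value (False) is lost.
old_value = config.profile_optimizer = True
buggy_saved = old_value               # True, not the original False
config.profile_optimizer = old_value  # "restore" keeps it True

# Correct pattern: read the old value *before* mutating.
config.profile_optimizer = False      # reset for the demonstration
old_value = config.profile_optimizer  # save: False
config.profile_optimizer = True       # enable for the test body
# ... test body ...
config.profile_optimizer = old_value  # restore: back to False
```

In PyTensor itself, `with pytensor.config.change_flags(profile_optimizer=True): ...` (the same context manager used in `prepare_fgraph` above) is likely the cleanest fix, since it restores the flag even if the test raises.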

Thanks!

Reproducible code example:

Not applicable.

Error message:

No response

PyTensor version information:

Confirmed on 2.18.6 and on main.

Context for the issue:

I maintain the AUR packages for PyTensor and PyMC.

Labels

bug (Something isn't working), tests
