@vmiller987 vmiller987 commented Nov 13, 2025

Hello all!

With nnUNet, we currently receive the following warning using PyTorch 2.9:

/home/vmiller/Work/caml/.venv/lib/python3.13/site-packages/fft_conv_pytorch/fft_conv.py:139: UserWarning: Using a non-tuple sequence for multidimensional indexing is deprecated and will be changed in pytorch 2.9; use x[tuple(seq)] instead of x[seq]. In pytorch 2.9 this will be interpreted as tensor index, x[torch.tensor(seq)], which will result either in an error or a different result (Triggered internally at /pytorch/torch/csrc/autograd/python_variable_indexing.cpp:345.)
  output = output[crop_slices].contiguous()

The warning originates in fft-conv-pytorch, a dependency of batchgeneratorsv2. Unfortunately, fft-conv-pytorch no longer appears to be maintained; a PR fixing this issue has been open since July: fkodom/fft-conv-pytorch#26

The offending code (fft-conv-pytorch/fft_conv_pytorch/fft_conv.py, lines 134-139):

    # Remove extra padded values
    crop_slices = [slice(None), slice(None)] + [
        slice(0, (signal_size[i] - kernel.size(i) + 1), stride_[i - 2])
        for i in range(2, signal.ndim)
    ]
    output = output[crop_slices].contiguous()
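For reference, the one-line fix the upstream PR applies is simply wrapping the slice list in `tuple()` before indexing. The sketch below illustrates this with NumPy (which enforces the same indexing rule PyTorch 2.9 adopts: a list of slices is treated as an advanced index, a tuple of slices as plain multidimensional indexing); the array shape, kernel size, and stride are illustrative, not taken from nnUNet:

```python
import numpy as np

# Illustrative stand-in for the conv output: (batch, channels, length).
output = np.arange(2 * 2 * 8).reshape(2, 2, 8)
signal_size = output.shape
kernel_size = 3
stride_ = (2,)

# Same construction as the snippet above, for a 1D signal.
crop_slices = [slice(None), slice(None)] + [
    slice(0, signal_size[i] - kernel_size + 1, stride_[i - 2])
    for i in range(2, output.ndim)
]

# output[crop_slices] is the deprecated form; tuple() makes it
# unambiguous multidimensional indexing.
cropped = output[tuple(crop_slices)]
assert cropped.shape == (2, 2, 3)
```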

With the help of a coworker, we found an actively maintained fork with PyPI releases in which the issue has been fixed:

The fix:
fft-conv-pytorch/torch_fftconv/functional.py lines 242-245

    # Remove extra padded values
    index = (slice(None),) * 2 + tuple(slice(p, o + p)
                                       for p, o in zip(padding, output_size))
    output = output[index].contiguous()
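The fork builds a tuple of slices from the start, computing the crop from the padding and the expected output size rather than from the kernel size and stride. A small NumPy sketch (the padding and output size values are illustrative, not from the PR) shows that this selects the central, unpadded region:

```python
import numpy as np

# Illustrative values: a (batch, channel, length) array padded by 1 on
# each side, with a true output length of 3.
output = np.arange(1 * 1 * 5).reshape(1, 1, 5)
padding = (1,)
output_size = (3,)

# Same construction as the fork: a tuple of slices, never a list.
index = (slice(None),) * 2 + tuple(
    slice(p, o + p) for p, o in zip(padding, output_size)
)
cropped = output[index]
assert cropped.shape == (1, 1, 3)
assert cropped.tolist() == [[[1, 2, 3]]]  # the padded edges are dropped
```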

I have updated and tested the two locations in batchgeneratorsv2 where this library is used, and updated pyproject.toml to replace the out-of-date dependency.
