Failing to import mne_connectivity

  • MNE version: 1.1.1 or 1.0.0
  • Operating system: Linux

Hi guys,

I installed mne_connectivity in my MNE environment (version 1.0.0), and the installation completed successfully. But I cannot import mne_connectivity; here is the error message:

In [61]: import mne_connectivity
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Input In [61], in <cell line: 1>()
----> 1 import mne_connectivity

File ~/.conda/envs/mne/lib/python3.9/site-packages/mne_connectivity/__init__.py:11, in <module>
      3 # Authors: Adam Li <ali39@jhu.edu>
      4 #          Eric Larson <larson.eric.d@gmail.com>
      5 #          Britta Westner <britta.wstnr@gmail.com>
      6 #
      7 # License: BSD (3-clause)
      9 __version__ = '0.3'
---> 11 from .base import (Connectivity, EpochConnectivity, EpochSpectralConnectivity,
     12                    EpochSpectroTemporalConnectivity, EpochTemporalConnectivity,
     13                    SpectralConnectivity, SpectroTemporalConnectivity,
     14                    TemporalConnectivity)
     15 from .effective import phase_slope_index
     16 from .envelope import envelope_correlation, symmetric_orth

File ~/.conda/envs/mne/lib/python3.9/site-packages/mne_connectivity/base.py:4, in <module>
      1 from copy import copy, deepcopy
      3 import numpy as np
----> 4 import xarray as xr
      5 import pandas as pd
      6 from mne.utils import (_check_combine, _check_option, _validate_type,
      7                        copy_function_doc_to_method_doc, object_size,
      8                        sizeof_fmt, _check_event_id, _ensure_events,
      9                        _on_missing, warn, check_random_state)

File ~/.conda/envs/mne/lib/python3.9/site-packages/xarray/__init__.py:1, in <module>
----> 1 from . import testing, tutorial
      2 from .backends.api import (
      3     load_dataarray,
      4     load_dataset,
   (...)
      8     save_mfdataset,
      9 )
     10 from .backends.rasterio_ import open_rasterio

File ~/.conda/envs/mne/lib/python3.9/site-packages/xarray/testing.py:9, in <module>
      6 import numpy as np
      7 import pandas as pd
----> 9 from xarray.core import duck_array_ops, formatting, utils
     10 from xarray.core.dataarray import DataArray
     11 from xarray.core.dataset import Dataset

File ~/.conda/envs/mne/lib/python3.9/site-packages/xarray/core/duck_array_ops.py:26, in <module>
     23 from numpy import take, tensordot, transpose, unravel_index  # noqa
     24 from numpy import where as _where
---> 26 from . import dask_array_compat, dask_array_ops, dtypes, npcompat, nputils
     27 from .nputils import nanfirst, nanlast
     28 from .pycompat import cupy_array_type, dask_array_type, is_duck_dask_array

File ~/.conda/envs/mne/lib/python3.9/site-packages/xarray/core/npcompat.py:72, in <module>
     49     from numpy.typing._dtype_like import _DTypeLikeNested, _ShapeLike, _SupportsDType
     51     # Xarray requires a Mapping[Hashable, dtype] in many places which
     52     # conflics with numpys own DTypeLike (with dtypes for fields).
     53     # https://numpy.org/devdocs/reference/typing.html#numpy.typing.DTypeLike
     54     # This is a copy of this DTypeLike that allows only non-Mapping dtypes.
     55     DTypeLikeSave = Union[
     56         np.dtype,
     57         # default data type (float64)
     58         None,
     59         # array-scalar types and generic types
     60         Type[Any],
     61         # character codes, type strings or comma-separated fields, e.g., 'float64'
     62         str,
     63         # (flexible_dtype, itemsize)
     64         Tuple[_DTypeLikeNested, int],
     65         # (fixed_dtype, shape)
     66         Tuple[_DTypeLikeNested, _ShapeLike],
     67         # (base_dtype, new_dtype)
     68         Tuple[_DTypeLikeNested, _DTypeLikeNested],
     69         # because numpy does the same?
     70         List[Any],
     71         # anything with a dtype attribute
---> 72         _SupportsDType[np.dtype],
     73     ]
     74 except ImportError:
     75     # fall back for numpy < 1.20, ArrayLike adapted from numpy.typing._array_like
     76     from typing import Protocol

File ~/.conda/envs/mne/lib/python3.9/typing.py:277, in _tp_cache.<locals>.decorator.<locals>.inner(*args, **kwds)
    275 except TypeError:
    276     pass  # All real errors (not unhashable args) are raised below.
--> 277 return func(*args, **kwds)

File ~/.conda/envs/mne/lib/python3.9/typing.py:1004, in Generic.__class_getitem__(cls, params)
   1000         raise TypeError(
   1001             f"Parameters to {cls.__name__}[...] must all be unique")
   1002 else:
   1003     # Subscripting a regular Generic subclass.
-> 1004     _check_generic(cls, params, len(cls.__parameters__))
   1005 return _GenericAlias(cls, params)

File ~/.conda/envs/mne/lib/python3.9/site-packages/typing_extensions.py:101, in _check_generic(cls, parameters, elen)
     97 """Check correct count for parameters of a generic cls (internal helper).
     98 This gives a nice error message in case of count mismatch.
     99 """
    100 if not elen:
--> 101     raise TypeError(f"{cls} is not a generic class")
    102 if elen is _marker:
    103     if not hasattr(cls, "__parameters__") or not cls.__parameters__:

TypeError: <class 'numpy.typing._dtype_like._SupportsDType'> is not a generic class

I also updated MNE to 1.1.1, but mne_connectivity still cannot be imported.
I don’t know whether something is wrong with the xarray package, so I tried reinstalling it via conda install -c conda-forge xarray dask netCDF4 bottleneck. Any suggestions on how to solve this problem? Thanks a lot! :sob:
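
In case it helps with diagnosing this, here is a minimal sketch (standard library only; the distribution names are my guess at the usual pip/conda names) that reports the installed versions of the packages appearing in the traceback, without importing xarray itself:

from importlib.metadata import PackageNotFoundError, version

# Importing xarray fails, so read the installed metadata instead of the modules.
for pkg in ("numpy", "xarray", "typing_extensions", "mne_connectivity", "mne"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")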

Looks like a package version compatibility problem. This works for me:

$ conda create -c conda-forge -n test_env dask bottleneck mne-connectivity python=3.9
$ conda activate test_env
$ python
>>> import mne_connectivity

(You don’t have to restrict Python to 3.9 as I’ve done here; I just did it because I saw that version in your output.)
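
To double-check that the fresh environment resolved a mutually compatible set of packages (the exact version numbers will depend on when you create it), printing the versions it picked should work without raising the TypeError above:

>>> import mne_connectivity, xarray, numpy
>>> print(mne_connectivity.__version__, xarray.__version__, numpy.__version__)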


Thank you so much! It truly worked! :heart: :heart: :heart: :+1: :+1: :+1: