ICA, hyperbolic tangent (tanh) and component filtering

Dear colleagues,
I'm working on independent component analysis (ICA). I have two problems
I can't solve on my own.

   1. Is it possible to implement a custom G function within ICA? In this
   case I would like to implement a hyperbolic tangent (tanh).
   2. Is it possible to filter specific ICA components? I would like to
   pass specific components (e.g. "ICA001") through a notch filter and
   then reconstruct the signal from the filtered components (see the
   sketch after this list).
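
What I have in mind for the filtering half could look roughly like this
(a sketch, assuming ica has already been fitted to raw; the 50 Hz notch
frequency and the component name are only examples):

sources = ica.get_sources(raw)                     # IC time courses as a Raw object
sources.notch_filter(freqs=50., picks=['ICA001'])  # notch-filter only that component

Rebuilding the sensor-space signal from the modified sources is the part
I'm unsure about; as far as I can tell it means back-projecting the
filtered sources through the fitted mixing matrix by hand.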

Thanks in advance for your time.
Greetings,
Francesco Mattioli

I'm trying to work without distractions. If I do not respond immediately,
drink a cup of tea and be patient. Great ideas need deep work!

hi,

see the fit_params argument of the ICA object. With FastICA you can change
the non-linearity, as supported by
https://scikit-learn.org/stable/modules/generated/sklearn.decomposition.FastICA.html
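
For example (a sketch, assuming raw is already loaded; note that, if I read
the sklearn code right, the built-in 'logcosh' contrast already uses tanh as
its g non-linearity):

from mne.preprocessing import ICA

ica = ICA(n_components=20, method='fastica', random_state=0,
          fit_params=dict(fun='logcosh'))  # G(u) = log cosh(u), so g(u) = tanh(u)
ica.fit(raw)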

HTH
Alex

If I'm not making syntax errors, it should be:

import numpy as np
from mne.preprocessing import ICA

ica = ICA(n_components=64, random_state=10, method="fastica",
          max_iter=1000, fit_params=dict(fun=np.tanh))
ica.fit(raw)

Unfortunately, when I run the fit, I get this error:

Fitting ICA to data using 64 channels (please be patient, this may take a while)
Inferring max_pca_components from picks
Selecting by number: 64 components
Traceback (most recent call last):
  File "C:\Users\franc_pyl533c\Anaconda3\envs\eeg\lib\site-packages\IPython\core\interactiveshell.py", line 3319, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-18-2eabae644822>", line 4, in <module>
    ica.fit(raw)
  File "<decorator-gen-349>", line 21, in fit
  File "C:\Users\franc_pyl533c\Anaconda3\envs\eeg\lib\site-packages\mne\preprocessing\ica.py", line 487, in fit
    tstep, reject_by_annotation, verbose)
  File "C:\Users\franc_pyl533c\Anaconda3\envs\eeg\lib\site-packages\mne\preprocessing\ica.py", line 553, in _fit_raw
    self._fit(data, self.max_pca_components, 'raw')
  File "C:\Users\franc_pyl533c\Anaconda3\envs\eeg\lib\site-packages\mne\preprocessing\ica.py", line 684, in _fit
    ica.fit(data[:, sel])
  File "C:\Users\franc_pyl533c\Anaconda3\envs\eeg\lib\site-packages\sklearn\decomposition\_fastica.py", line 576, in fit
    self._fit(X, compute_sources=False)
  File "C:\Users\franc_pyl533c\Anaconda3\envs\eeg\lib\site-packages\sklearn\decomposition\_fastica.py", line 511, in _fit
    W, n_iter = _ica_par(X1, **kwargs)
  File "C:\Users\franc_pyl533c\Anaconda3\envs\eeg\lib\site-packages\sklearn\decomposition\_fastica.py", line 107, in _ica_par
    gwtx, g_wtx = g(np.dot(W, X), fun_args)
ValueError: too many values to unpack (expected 2)

Do you have any suggestions?
Thanks,
Francesco Mattioli


ica = ICA(n_components=20, method='fastica', fit_params=dict(fun='cube'),
          random_state=0)
ica.fit(raw, picks=picks, reject=reject)

works fine. The error above comes from passing np.tanh directly: FastICA
expects the callable to return a (value, derivative) tuple rather than just
the value. To use a callable, please read the doc at:

https://scikit-learn.org/stable/modules/generated/sklearn.decomposition.FastICA.html

"""
You can also provide your own function. It should return a tuple containing
the value of the function, and of its derivative, in the point. Example:

def my_g(x):
    return x ** 3, (3 * x ** 2).mean(axis=-1)
"""

Alex
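
Adapting that doc example to the tanh case from the original question, a
sketch of such a callable (the derivative term mirrors what the built-in
'logcosh' non-linearity does with alpha=1) could be:

import numpy as np
from mne.preprocessing import ICA

def tanh_g(x):
    gx = np.tanh(x)                          # g(x) = tanh(x)
    return gx, (1 - gx ** 2).mean(axis=-1)   # plus the mean of its derivative

ica = ICA(n_components=64, random_state=10, method='fastica',
          max_iter=1000, fit_params=dict(fun=tanh_g))
ica.fit(raw)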