Thank you! I attached a piece of code that does (I hope!) what I explained before, with some comments.
Thank you again,
Clément
On 9 Apr 2020, at 09:17, Alexandre Gramfort <alexandre.gramfort at inria.fr> wrote:
Hi Clément,

What is very possible is that the omission of certain dipoles in the src at the forward stage, due to the min_dist parameter being > 0, is not taken into account in use_tris, which is left untouched.

It would be easier for me to think about the issue if you share a code snippet I can play with.
Alex
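For what it's worth, the numbers quoted in the question are consistent with that hypothesis: with fixed orientation, each kept source vertex contributes one column to the gain matrix, so the gap between the vertex count covered by use_tris and the number of gain columns gives the number of omitted dipoles. A quick check with the figures from this thread:

```python
# Figures quoted in the question: the gain matrix has 7498 columns (fixed
# orientation, so one column per source vertex), while 'use_tris' covers
# 4098 vertices per hemisphere.
n_gain_cols = 7498
n_use_tris_verts = 2 * 4098       # 8196 vertices referenced by use_tris

n_dropped = n_use_tris_verts - n_gain_cols
print(n_dropped)                  # 698 dipoles omitted, e.g. by min_dist > 0

# This matches the per-hemisphere 'nuse' values reported in the thread
# (3732 on the left, 3766 on the right):
print((4098 - 3732) + (4098 - 3766))  # 366 + 332 = 698
```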
Hello everyone,
I have a question about the SourceSpaces object. More precisely: I am following this tutorial (with MNE 0.20.0, https://mne.tools/stable/auto_examples/inverse/plot_custom_inverse_solver.html), with the parameters loose and depth set to 0.0 and 1.0 respectively (so I am working with fixed orientation) in the _prepare_gain function. The latter returns a new forward object which contains a SourceSpaces, named src.
One of my goals is to deal only with the vertices (and associated triangles) that are involved in the gain matrix in forward['sol']['data'] (whose shape is 305x7498), so I assumed these could be easily accessed using the 'use_tris' key. Unfortunately, it seems that this is not the case, because if we proceed like this, we get 8196 vertices (4098 per hemisphere) and not the expected 7498. Could someone explain to me why?
I was told that, for good reasons, some of these vertices can be deleted by certain MNE-Python functions in order to make things work correctly. For example, if we consider the left hemisphere (i.e. src[0]), we can see via the 'nuse' key that there are only 3732 used vertices. I looked through all the keys of src[0] and found that 'patch_inds' seems to answer my question (at least it has a size of 3732), provided you renumber the vertices of src[0]['use_tris'] first (using, for example, the numpy.searchsorted function). My approach to getting the triangles associated with these vertices is to check whether each triangle has all of its vertices in the 'patch_inds' array. If so, we keep that triangle; if not, we delete it. This procedure gives a new triangles array, but when I try something like np.unique(new_triangles).size, it returns 3729 (expected 3732) for the left hemisphere and 3760 (expected 3766) for the right hemisphere. Maybe I am completely wrong about all of this, but after many days of research, I still do not know how to solve my problem.
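For illustration, the filtering procedure described above can be sketched as follows, with tiny made-up arrays standing in for src[0]['use_tris'] (already renumbered with numpy.searchsorted) and 'patch_inds'. It also shows one possible reason why np.unique(new_triangles).size can come out smaller than nuse: a kept vertex whose every triangle also touches a removed vertex disappears from the filtered triangle list.

```python
import numpy as np

# Tiny made-up stand-ins for the real arrays: 'kept' plays the role of
# patch_inds (vertices kept in the forward), and 'use_tris' is already
# renumbered so its entries index into the decimated mesh.
kept = np.array([0, 1, 2, 3])
use_tris = np.array([[0, 1, 2],
                     [1, 2, 4],   # touches removed vertex 4
                     [2, 3, 4]])  # touches removed vertex 4

# Keep only triangles whose three vertices are all kept
mask = np.isin(use_tris, kept).all(axis=1)
new_triangles = use_tris[mask]

print(new_triangles)                  # [[0 1 2]]
print(np.unique(new_triangles).size)  # 3, although 4 vertices were kept:
# vertex 3 appears only in triangles that also touch removed vertex 4,
# so it vanishes from the filtered triangle list.
```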
I hope I have been clear enough in my explanations above.
Thanks in advance,
Clément
_______________________________________________
Mne_analysis mailing list
Mne_analysis at nmr.mgh.harvard.edu
Really you should look at `src[0]['vertno']` or `np.where(src[0]['inuse'])[0]` (they should be the same), which give the vertices from the original mesh that are actually used.
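As a small illustration of that equivalence (with a made-up inuse mask, not real MNE data):

```python
import numpy as np

# 'inuse' is a 0/1 mask over the full mesh; 'vertno' lists the indices of
# the used vertices, so the two representations should agree.
inuse = np.array([1, 0, 1, 1, 0, 1])
vertno = np.where(inuse)[0]
print(vertno)  # [0 2 3 5]
```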
To know which triangles these points correspond to, you can go back to the original source space (before vertices are removed from the forward) and look at src[0]['vertno'] there. From the difference, you can figure out which ones have been dropped (~np.in1d(orig_src[0]['vertno'], fwd['src'][0]['vertno'])), and then include whichever of the `orig_src[0]['use_tris']` you want, e.g., those for which all vertices are present in the forward source space, or those for which at least one vertex is present in the source space, depending on your needs.
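A minimal sketch of that recipe, using small made-up vertno arrays in place of real source spaces, and assuming use_tris has already been renumbered (e.g. with np.searchsorted, as in the question) so that its entries index into positions of orig_vertno:

```python
import numpy as np

# Made-up stand-ins: vertex numbers in the original source space, and the
# subset that survived forward computation (e.g. after min_dist pruning).
orig_vertno = np.array([2, 5, 7, 11, 13, 17])
fwd_vertno = np.array([2, 7, 11, 17])   # vertices 5 and 13 were dropped

# Here use_tris entries index into positions 0..5 of orig_vertno
use_tris = np.array([[0, 2, 3],
                     [2, 3, 5],
                     [1, 4, 5]])  # touches dropped positions 1 and 4

# Mark which positions of orig_vertno were dropped from the forward
dropped = ~np.in1d(orig_vertno, fwd_vertno)

# Option 1: triangles whose vertices are all present in the forward src
tris_all = use_tris[~dropped[use_tris].any(axis=1)]

# Option 2: triangles with at least one vertex present in the forward src
tris_any = use_tris[~dropped[use_tris].all(axis=1)]

print(len(tris_all), len(tris_any))  # 2 3
```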
Thank you for your help. It seems there are some extra vertices in the final mesh information, but I found a way to solve this in the context of my problem.
Thank you again,
Clément
On 9 Apr 2020, at 20:20, Eric Larson <larson.eric.d at gmail.com> wrote: