Merge lp:~nipy-developers/nipy/trunk-mb into lp:~nipy-developers/nipy/obsolete-do-not-use
| Field | Value |
|---|---|
| Status | Merged |
| Merged at revision | not available |
| Proposed branch | lp:~nipy-developers/nipy/trunk-mb |
| Merge into | lp:~nipy-developers/nipy/obsolete-do-not-use |
| Diff against target | 2793 lines (+1385/-636), 39 files modified |
| To merge this branch | bzr merge lp:~nipy-developers/nipy/trunk-mb |
| Related bugs | |
| Review via email | mp+17287@code.launchpad.net |

| Reviewer | Review Type | Date Requested | Status |
|---|---|---|---|
| Matthew Brett | Needs Information | | |

Files modified:

- doc/devel/guidelines/coverage_testing.rst (+10/-10)
- doc/devel/index.rst (+0/-2)
- doc/devel/install/ubuntu.rst (+1/-1)
- doc/devel/pynifti.rst (+0/-167)
- doc/faq/documentation_faq.rst (+2/-2)
- doc/users/coordinate_map.rst (+1/-1)
- doc/users/installation.rst (+1/-4)
- doc/www/conf.py (+4/-4)
- nipy/core/api.py (+20/-12)
- nipy/core/image/image.py (+1/-1)
- nipy/core/reference/array_coords.py (+20/-8)
- nipy/core/reference/tests/test_array_coords.py (+88/-0)
- nipy/io/files.py (+1/-1)
- nipy/io/imageformats/__init__.py (+9/-5)
- nipy/io/imageformats/analyze.py (+42/-31)
- nipy/io/imageformats/funcs.py (+54/-0)
- nipy/io/imageformats/images.py (+0/-124)
- nipy/io/imageformats/ioimps.py (+0/-64)
- nipy/io/imageformats/loadsave.py (+16/-16)
- nipy/io/imageformats/minc.py (+0/-16)
- nipy/io/imageformats/nifti1.py (+3/-9)
- nipy/io/imageformats/orientations.py (+254/-0)
- nipy/io/imageformats/spatialimages.py (+134/-29)
- nipy/io/imageformats/spm2analyze.py (+1/-1)
- nipy/io/imageformats/spm99analyze.py (+3/-3)
- nipy/io/imageformats/testing/__init__.py (+18/-15)
- nipy/io/imageformats/testing/_paramtestpy2.py (+89/-0)
- nipy/io/imageformats/testing/_paramtestpy3.py (+62/-0)
- nipy/io/imageformats/testing/lightunit.py (+55/-0)
- nipy/io/imageformats/testing/nosepatch.py (+53/-0)
- nipy/io/imageformats/tests/test_funcs.py (+32/-4)
- nipy/io/imageformats/tests/test_nifti1.py (+1/-1)
- nipy/io/imageformats/tests/test_orientations.py (+219/-0)
- nipy/io/nifti_ref.py (+1/-1)
- nipy/modalities/fmri/fmristat/tests/test_FIAC.py (+16/-19)
- nipy/modalities/fmri/pca.py (+58/-40)
- nipy/modalities/fmri/tests/test_pca.py (+55/-29)
- nipy/neurospin/register/benchmarks/bench_register.py (+1/-1)
- setup.py (+60/-15)
Matthew Brett (matthew-brett) wrote:
I think Gael's approved this one on the list...
Fernando Perez (fdo.perez) wrote:
Very minor nits
27 +from nipy.core.
28 + drop_io_dim, append_io_dim
I'd use
from ... import (...
...)
now that this syntax is allowed.
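Fernando is referring to parenthesised import lists (PEP 328, available since Python 2.4), which avoid backslash line continuations. A small stdlib illustration of the style:

```python
# Parenthesised multi-line imports - the style suggested for nipy.core.api,
# shown here with a stdlib module so the snippet runs anywhere:
from math import (pi, sqrt,
                  floor, ceil)

# No trailing backslashes needed; the parentheses carry the continuation.
assert sqrt(4.0) == 2.0
assert floor(4.5) == 4 and ceil(4.5) == 5
```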
In
129 +@parametric
130 +def test_array_
Maybe a few comments? If one of those tests ever breaks, someone who understands the code less could be in a tight spot to fix them back.
In:
345 +def load(filename, *args, **kwargs):
346 + ''' Load file given filename, guessing at file type
347
348 Parameters
349 ----------
350 - filespec : string or file-like
351 + filename : string or file-like
I actually found 'filespec' to be clearer, because when I see 'filename' I immediately think "it's already a string", whereas filespec to me conveys the idea of name-or-open-handle better.
Looks great otherwise, and these are very minor points. Thanks!
Matthew Brett (matthew-brett) wrote:
Hi,
Thanks very much for the review, it's very good to have some feedback
on the code.
On Wed, Jan 13, 2010 at 4:51 PM, Fernando Perez <email address hidden> wrote:
> Very minor nits
>
> 27 +from nipy.core.
> 28 + drop_io_dim, append_io_dim
>
> I'd use
>
> from ... import (...
> ...)
>
> now that this syntax is allowed.
Yes, we should move that way now, I agree.
> 129 +@parametric
> 130 +def test_array_
>
> Maybe a few comments? If one of those tests ever breaks, someone who understands the code less could be in a tight spot to fix them back.
Yes, fair point.
> In:
>
> 345 +def load(filename, *args, **kwargs):
> 346 + ''' Load file given filename, guessing at file type
> 347
> 348 Parameters
> 349 ----------
> 350 - filespec : string or file-like
> 351 + filename : string or file-like
>
> I actually found 'filespec' to be clearer, because when I see 'filename' I immediately think "it's already a string", whereas filespec to me conveys the idea of name-or-open-handle better.
Ah - yes - actually it's much worse than that. In your IPython
coding frenzy you may have missed that:
img = Spm99AnalyzeIma
img.to_
actually saves to 'test.img', 'test.hdr', and 'test.mat'. I think
that filespec is better too, but Gael was finding the name a little
confusing, because you usually use it just with a filename.
See you,
Matthew
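The multi-file behaviour Matthew describes can be sketched with a hypothetical class (the class name and placeholder payloads are mine, not nipy's API): a single base filename fans out to three on-disk files, which is why passing an open file object in its place could not work.

```python
import os
import tempfile

class FakeSpm99Image(object):
    """Hypothetical sketch, not nipy's actual API: SPM99 Analyze-style
    images span three files, so saving needs a *name*, not an open file."""
    def to_filename(self, filename):
        # One base name fans out to .img (binary data), .hdr (header)
        # and .mat (orientation matrix).
        base, _ = os.path.splitext(filename)
        paths = [base + ext for ext in ('.img', '.hdr', '.mat')]
        for path in paths:
            with open(path, 'wb') as f:
                f.write(b'')  # placeholder payload for the sketch
        return paths

tmpdir = tempfile.mkdtemp()
written = FakeSpm99Image().to_filename(os.path.join(tmpdir, 'test.img'))
# 'test.img' in; test.img, test.hdr and test.mat out
```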
Matthew Brett (matthew-brett) wrote:
Hi,
>> In:
>>
>> 345 +def load(filename, *args, **kwargs):
>> 346 + ''' Load file given filename, guessing at file type
>> 347
>> 348 Parameters
>> 349 ----------
>> 350 - filespec : string or file-like
>> 351 + filename : string or file-like
>>
>> I actually found 'filespec' to be clearer, because when I see 'filename' I immediately think "it's already a string", whereas filespec to me conveys the idea of name-or-open-handle better.
>
> Ah - yes - actually it's much worse than that. In your Ipython
> coding frenzy you may have missed that:
>
> img = Spm99AnalyzeIma
> img.to_
>
> actually saves to 'test.img', 'test.hdr', and 'test.mat'. I think
> that filespec is better too, but Gael was finding the name a little
> confusing, because you usually use it just with a filename.
Thinking about this further, I am less sure that the change is bad,
that is, 'to_filename' may indeed be preferable. 'to_filename' does
in fact require a string, because for most formats, more than one file
is needed to save the image, so file object would not work for these
formats. The reason I initially chose 'to_filespec' was to try and
signal the oddness above, but, given that 'to_filespec' made you think
you could use a file object, perhaps it's best left as 'to_filename',
and we'll hope the oddness doesn't show up too much...
See you,
Matthew
- 1816. By Matthew Brett <email address hidden>: PCA standardize with np.std instead of error sum of squares
- 1817. By Matthew Brett <email address hidden>: Maybe more obvious standardize pattern
- 1818. By Matthew Brett <email address hidden>: Cosmetic PCA doc edits
- 1819. By Matthew Brett <email address hidden>: Applied upstream patch continuing to_filespec -> to_filename rename
- 1820. By Matthew Brett <email address hidden>: Moved from deprecated to_filespec method
- 1821. By Matthew Brett <email address hidden>: Removed some remaining references to neuroimaging package
- 1822. By Matthew Brett <email address hidden>: Commenting on array_coords test
- 1823. By Matthew Brett <email address hidden>: Removed outdated pynifti docs
- 1824. By Matthew Brett <email address hidden>: Removed old references to pynifti; added affine orientation calculation
Gael Varoquaux (gael-varoquaux) wrote:
A few remarks:
In pca.py, you are using numpy linalg. I tend to frown on that, because in my experience numpy linalg is quite often compiled against the included mini-blas, which is not terribly good (that is the case at Neurospin, for instance, and I've seen it elsewhere). As we have a dependency on scipy, if you use scipy.linalg you know that you are relying on good-quality linear algebra libraries.
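Gael's recommendation, sketched below; the import-with-fallback keeps the snippet runnable without SciPy installed, and the SVD call is illustrative rather than nipy's actual pca.py code:

```python
import numpy as np

# scipy.linalg is built against a full LAPACK/BLAS, whereas numpy.linalg
# can end up on numpy's bundled mini-blas (lapack_lite) on some installs.
# With SciPy already a hard dependency, preferring scipy.linalg is free.
try:
    from scipy.linalg import svd
except ImportError:
    from numpy.linalg import svd  # fallback so the sketch still runs

X = np.random.RandomState(0).normal(size=(20, 5))
U, s, Vt = svd(X, full_matrices=False)  # thin SVD, as PCA code would use
# Sanity check: U * diag(s) * Vt reconstructs X
assert np.allclose((U * s) @ Vt, X)
```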
Also, I see that you use np.std. I remember moving away from it, because it had very bad performance on big arrays:
{{{
In [6]: t = np.random.
In [7]: %timeit np.sqrt(
1 loops, best of 3: 1.19 s per loop
In [8]: %timeit np.sqrt(
1 loops, best of 3: 587 ms per loop
In [9]: %timeit t.std(axis=1)
1 loops, best of 3: 1.19 s per loop
In [10]: %timeit t.std(axis=0)
1 loops, best of 3: 2.33 s per loop
}}}
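The memory blow-up Gael hit comes from the full-array temporaries that `ndarray.std` materialises (the demeaned copy and its square). A two-pass, chunked accumulation (my sketch, not the code nipy adopted) keeps peak memory proportional to the chunk size while matching `t.std(axis=0)`:

```python
import numpy as np

def chunked_std(a, chunk=1000):
    """Two-pass population std over axis 0, visiting `chunk` rows at a
    time so temporaries stay chunk-sized instead of array-sized."""
    n = a.shape[0]
    # Pass 1: accumulate the mean
    mean = np.zeros(a.shape[1:], dtype=np.float64)
    for i in range(0, n, chunk):
        mean += a[i:i + chunk].sum(axis=0)
    mean /= n
    # Pass 2: accumulate the sum of squared deviations
    ssq = np.zeros_like(mean)
    for i in range(0, n, chunk):
        d = a[i:i + chunk] - mean  # only a chunk-sized temporary
        ssq += (d * d).sum(axis=0)
    return np.sqrt(ssq / n)       # ddof=0, like ndarray.std's default

t = np.random.RandomState(42).normal(size=(5000, 7))
assert np.allclose(chunked_std(t, chunk=512), t.std(axis=0))
```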
Gael Varoquaux (gael-varoquaux) wrote:
I just remembered, about std, the killer was not the speed, but the memory usage:
{{{
In [1]: import numpy as np
In [2]: t = np.random.
In [3]: %timeit t.std(axis=0)
-------
MemoryError Traceback (most recent call last)
}}}
{{{
In [1]: import numpy as np
In [2]: t = np.random.
In [3]: %timeit np.sqrt(
1 loops, best of 3: 5.03 s per loop
}}}
With my data, std simply wouldn't work.
Matthew Brett (matthew-brett) wrote:
Hi,
Thanks - both the linalg and the std experience is useful, I'll change
the code...
Matthew
On Sun, Jan 17, 2010 at 9:31 AM, Gael Varoquaux
<email address hidden> wrote:
> I just remembered, about std, the killer was not the speed, but the memory usage:
>
> {{{
> In [1]: import numpy as np
>
> In [2]: t = np.random.
>
> In [3]: %timeit t.std(axis=0)
>
> -------
> MemoryError Traceback (most recent call last)
> }}}
>
> {{{
> In [1]: import numpy as np
>
> In [2]: t = np.random.
>
> In [3]: %timeit np.sqrt(
> 1 loops, best of 3: 5.03 s per loop
> }}}
>
> With my data, std simply wouldn't work.
> --
> https:/
> You proposed lp:~nipy-developers/nipy/trunk-mb for merging.
>
Matthew Brett (matthew-brett) wrote:
Ah, except of course std(t, axis=0) is not the same as
np.sqrt(
== 0) - which might explain the speed difference.
On Sun, Jan 17, 2010 at 9:31 AM, Gael Varoquaux
<email address hidden> wrote:
> I just remembered, about std, the killer was not the speed, but the memory usage:
>
> {{{
> In [1]: import numpy as np
>
> In [2]: t = np.random.
>
> In [3]: %timeit t.std(axis=0)
>
> -------
> MemoryError Traceback (most recent call last)
> }}}
>
> {{{
> In [1]: import numpy as np
>
> In [2]: t = np.random.
>
> In [3]: %timeit np.sqrt(
> 1 loops, best of 3: 5.03 s per loop
> }}}
>
> With my data, std simply wouldn't work.
> --
> https:/
> You proposed lp:~nipy-developers/nipy/trunk-mb for merging.
>
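Matthew's caveat is easy to check directly: the root mean square only matches the standard deviation when the mean is zero, since mean(t**2) = var(t) + mean(t)**2. A small demonstration (array sizes and seed are my own):

```python
import numpy as np

rng = np.random.RandomState(0)
t = rng.normal(loc=3.0, size=(1000, 4))  # nonzero mean on purpose

rms = np.sqrt((t ** 2).mean(axis=0))  # root mean square: no mean removal
std = t.std(axis=0)                   # subtracts the per-column mean

# They differ whenever the mean is nonzero...
assert not np.allclose(rms, std)
# ...and agree once the columns are demeaned, because
# mean(t**2) == var(t) + mean(t)**2 term by term:
tc = t - t.mean(axis=0)
assert np.allclose(np.sqrt((tc ** 2).mean(axis=0)), tc.std(axis=0))
assert np.allclose(rms ** 2, std ** 2 + t.mean(axis=0) ** 2)
```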
- 1825. By Matthew Brett <email address hidden>: Use scipy.linalg; do use root MSE instead of standard deviation to standardize - thanks to GV
Fernando Perez (fdo.perez) wrote:
Looks fine from here, just one question: if we now use 'filename' everywhere, do we still accept open file handles? If so, then we should say somewhere clearly (probably in every docstring) something like:
filename : string or open file-like object
If we accept name-or-object in a variable called 'name', this should be made clear at least in the docstrings (since we've chosen to use a deliberately misleading naming convention, we should at least give users a chance of understanding the trick and undoing the confusion in their heads via documentation).
Matthew Brett (matthew-brett) wrote:
Hi,
On Mon, Jan 18, 2010 at 1:45 AM, Fernando Perez <email address hidden> wrote:
> Looks fine from here, just one question: if we now use 'filename' everywhere, do we still accept open file handles? If so, then we should say somewhere clearly (probably in every docstring) something like:
>
>
> filename : string or open file-like object
At the moment, there's a different routine for writing to filenames
and / or file objects - called - perhaps confusingly - img.to_files
The reason there's a different routine is that several image formats
need more than one file-object to write to. For example, you have to
write SPM analyze images by writing filename.img for the binary data
and filename.hdr for the header. If you wanted to do that, passing a
single file object can't work. I could see other ways round this -
like - raising an error with a single file-handle for img.to_filename
when the format needs two files, but it would make the interface a bit
difficult to explain - at least - that was my thought...
See you,
Matthew
Fernando Perez (fdo.perez) wrote:
On Sun, Jan 17, 2010 at 10:33 PM, Matthew Brett <email address hidden> wrote:
> The reason there's a different routine is that several image formats
> need more than one file-object to write to. For example, you have to
> write SPM analyze images by writing filename.img for the binary data
> and filename.hdr for the header. If you wanted to do that, passing a
> single file object can't work. I could see other ways round this -
> like - raising an error with a single file-handle for img.to_filename
> when the format needs two files, but it would make the interface a bit
> difficult to explain - at least - that was my thought...
OK, then it makes sense to allow *only* actual filenames, and never
open handles on these functions, I see.
I don't think this is worth spending much more time on, as long as
this behavior is explained in the docstrings, since there's a
perfectly valid reason for it, we're good.
Cheers,
f
- 1826. By Matthew Brett <email address hidden>: Applied upstream patch completing image reorientation
- 1827. By Matthew Brett <email address hidden>: Jonathan's generalization of affine orientation
- 1828. By Matthew Brett <email address hidden>: Upstream rejigging of orientations
- 1829. By Matthew Brett <email address hidden>: Fix reordering - I was doing it the wrong way round
- 1830. By Matthew Brett <email address hidden>: Montages draft
- 1831. By Matthew Brett <email address hidden>: Rot90 for more obvious slice display
- 1832. By Matthew Brett <email address hidden>: Yoked viz
- 1833. By Matthew Brett <email address hidden>: Reconsidered viz; a bit too drafty, needs proper thought
Matthew Brett (matthew-brett) wrote:
Any more reviews here? I'll merge tomorrow unless I hear otherwise...
Preview Diff
1 | === modified file 'doc/devel/guidelines/coverage_testing.rst' |
2 | --- doc/devel/guidelines/coverage_testing.rst 2009-03-09 19:13:28 +0000 |
3 | +++ doc/devel/guidelines/coverage_testing.rst 2010-01-21 22:28:12 +0000 |
4 | @@ -28,15 +28,15 @@ |
5 | |
6 | Name Stmts Exec Cover Missing |
7 | ----------------------------------------------------------------------------- |
8 | - neuroimaging 21 14 66% 70-74, 88-89 |
9 | - neuroimaging.core 4 4 100% |
10 | - neuroimaging.core.reference 8 8 100% |
11 | - neuroimaging.core.reference.array_coords 100 90 90% 133-134, 148-151, 220, 222, 235, 242 |
12 | - neuroimaging.core.reference.coordinate_map 188 187 99% 738 |
13 | - neuroimaging.core.reference.coordinate_system 61 61 100% |
14 | - neuroimaging.core.reference.slices 34 34 100% |
15 | - neuroimaging.core.transforms 0 0 100% |
16 | - neuroimaging.core.transforms.affines 14 14 100% |
17 | + nipy 21 14 66% 70-74, 88-89 |
18 | + nipy.core 4 4 100% |
19 | + nipy.core.reference 8 8 100% |
20 | + nipy.core.reference.array_coords 100 90 90% 133-134, 148-151, 220, 222, 235, 242 |
21 | + nipy.core.reference.coordinate_map 188 187 99% 738 |
22 | + nipy.core.reference.coordinate_system 61 61 100% |
23 | + nipy.core.reference.slices 34 34 100% |
24 | + nipy.core.transforms 0 0 100% |
25 | + nipy.core.transforms.affines 14 14 100% |
26 | |
27 | |
28 | The coverage report will cover any python source module imported after |
29 | @@ -48,7 +48,7 @@ |
30 | ``--cover-package``. For example, in writing tests for the |
31 | coordinate_map module:: |
32 | |
33 | - nosetests --with-coverage --cover-package=neuroimaging.core.reference.coordinate_map test_coordinate_map.py |
34 | + nosetests --with-coverage --cover-package=nipy.core.reference.coordinate_map test_coordinate_map.py |
35 | |
36 | Since that's a lot to type, I wrote a tool called ``sneeze`` to that |
37 | simplifies coverage testing with nose. |
38 | |
39 | === modified file 'doc/devel/index.rst' |
40 | --- doc/devel/index.rst 2009-02-13 02:17:31 +0000 |
41 | +++ doc/devel/index.rst 2010-01-21 22:28:12 +0000 |
42 | @@ -19,5 +19,3 @@ |
43 | software_design/index |
44 | software_design/usecases/index |
45 | tools/index |
46 | - |
47 | - pynifti |
48 | |
49 | === modified file 'doc/devel/install/ubuntu.rst' |
50 | --- doc/devel/install/ubuntu.rst 2009-08-05 02:03:11 +0000 |
51 | +++ doc/devel/install/ubuntu.rst 2010-01-21 22:28:12 +0000 |
52 | @@ -13,12 +13,12 @@ |
53 | sudo apt-get install python-numpy python-numpy-dev python-scipy |
54 | sudo apt-get install liblapack-dev |
55 | sudo apt-get install python-sympy |
56 | - sudo apt-get install python-nifti |
57 | |
58 | Options:: |
59 | |
60 | sudo apt-get install ipython |
61 | sudo apt-get install python-matplotlib |
62 | + sudo apt-get install mayavi2 |
63 | |
64 | For getting the code via version control:: |
65 | |
66 | |
67 | === removed file 'doc/devel/pynifti.rst' |
68 | --- doc/devel/pynifti.rst 2009-02-03 08:08:16 +0000 |
69 | +++ doc/devel/pynifti.rst 1970-01-01 00:00:00 +0000 |
70 | @@ -1,167 +0,0 @@ |
71 | -.. _pynifti-io: |
72 | - |
73 | -============ |
74 | - Pynifti IO |
75 | -============ |
76 | - |
77 | -We are using pynifti_ for the underlying nifti file I/O support. |
78 | -Pynifti is built upon the nifticlibs_ so this will provide us will |
79 | -*full* nifti support. We are working closely with the author of |
80 | -pynifti_, Michael Hanke, and pushing any bug fixes or feature |
81 | -improvements upstream to the git repository. |
82 | - |
83 | -Developers should read through Michael's documentation on the pynifti_ |
84 | -site for some details on the project. The source is checked out from |
85 | -the `git repository. |
86 | -<http://git.debian.org/?p=pkg-exppsy/pynifti.git>`_ |
87 | - |
88 | -Using the command:: |
89 | - |
90 | - git clone http://git.debian.org/git/pkg-exppsy/pynifti.git |
91 | - |
92 | -Git |
93 | ---- |
94 | - |
95 | -Pynifti uses git_ for it's version control system. Git is very |
96 | -different from `svn <http://subversion.tigris.org/>`_, developers |
97 | -should read some documentation on git before doing any work with the |
98 | -git repository. |
99 | - |
100 | -The git_ website has several tutorials and full documentation. A good |
101 | -starting point may be the `Git for SVN Users |
102 | -<http://git.or.cz/course/svn.html>`_ |
103 | - |
104 | -Git has a unique mechanism for storing multiple branches on your |
105 | -machine. Instead of having separate file directories, git will store |
106 | -all branches in one directory and store *branch diffs* in an internal |
107 | -database. When you switch branches (``git checkout branch-name``), |
108 | -git will apply the branch diff to the directory, updating any files to |
109 | -match the new branch. |
110 | - |
111 | -Git uses man pages for it's installed documentation. As with all man |
112 | -pages, these contain a lot of useful information, so you should know |
113 | -how to access them. All git commands can be called in two forms: |
114 | - |
115 | -1. git add <filename> |
116 | - |
117 | -2. git-add <filename> |
118 | - |
119 | -The first form is the one you will probably use most and is what is |
120 | -often shown in the documentation. The second form, however, is what |
121 | -you need to access the man page. To see the man page on how to add a |
122 | -file to a git repository:: |
123 | - |
124 | - man git-add |
125 | - |
126 | -To see a list of all git commands look at the main git man page:: |
127 | - |
128 | - man git |
129 | - |
130 | -As with Bazaar, you should identify yourself to git so the repository |
131 | -keeps track of who made your edits:: |
132 | - |
133 | - git config --global user.name "Your Name Comes Here" |
134 | - git config --global user.email you@yourdomain.example.com |
135 | - |
136 | -To list your git configuration:: |
137 | - |
138 | - cburns@pynifti 13:32:31 $ git config -l |
139 | - user.name=Christopher Burns |
140 | - user.email=cburns[at]berkeley[dot]edu |
141 | - color.diff=auto |
142 | - color.status=auto |
143 | - core.repositoryformatversion=0 |
144 | - core.filemode=true |
145 | - core.bare=false |
146 | - core.logallrefupdates=true |
147 | - remote.origin.url=ssh://git.debian.org/git/pkg-exppsy/pynifti.git |
148 | - remote.origin.fetch=+refs/heads/*:refs/remotes/origin/* |
149 | - branch.master.remote=origin |
150 | - branch.master.merge=refs/heads/master |
151 | - |
152 | -We can also see the remote origin branch from which we have cloned. |
153 | - |
154 | -Development Cycle |
155 | ------------------ |
156 | - |
157 | -There are 3 development branches that the nipy developers need to |
158 | -interact with in the git repository: |
159 | - |
160 | -* master - the main pynifti repository |
161 | - |
162 | -* cb/master - Chris' pynifti developer repository |
163 | - |
164 | -* cb/nipy - Chris' pynifti developer repository with nipy specific code |
165 | - |
166 | -A nipy development path would look like this: |
167 | - |
168 | -#. In the master branch, merge with the server to get any updates from |
169 | - other pynifti developers. |
170 | - |
171 | -#. Checkout the cb/master branch and merge from the local master branch. |
172 | - |
173 | -#. Make any code edits that should be pushed upstream in the cb/master |
174 | - branch. Michael cherry-picks changes into the master branch. |
175 | - |
176 | -#. Checkout the cb/nipy branch and merge from the cb/master. The |
177 | - cb/nipy branch is used as the source for nipy. |
178 | - |
179 | -Example: |
180 | -^^^^^^^^ |
181 | - |
182 | -I'm in my pynifti source directory:: |
183 | - |
184 | - cburns@pynifti 13:20:35 $ pwd |
185 | - /Users/cburns/src/pynifti |
186 | - |
187 | -Use the ``git branch`` command without arguments to see all of your |
188 | -local branches. Below we can see that I'm in my ``cb/master`` |
189 | -branch:: |
190 | - |
191 | - cburns@pynifti 13:20:39 $ git branch |
192 | - * cb/master |
193 | - cb/nipy |
194 | - master |
195 | - |
196 | -I want to switch to the ``master`` branch and update it with the |
197 | -server:: |
198 | - |
199 | - cburns@pynifti 13:26:08 $ git checkout master |
200 | - Switched to branch "master" |
201 | - cburns@pynifti 13:26:16 $ git branch |
202 | - cb/master |
203 | - cb/nipy |
204 | - * master |
205 | - |
206 | -Pull from the server to update our master branch:: |
207 | - |
208 | - cburns@pynifti 13:35:52 $ git pull |
209 | - Password: |
210 | - Already up-to-date. |
211 | - |
212 | -Switch into ``cb/master`` and merge with the ``master`` branch. |
213 | -Remember, these are not separate directories, git *knows* about the |
214 | -other branch by name, so we do not specify a path, we specify a branch |
215 | -name.:: |
216 | - |
217 | - cburns@pynifti 13:36:18 $ git checkout cb/master |
218 | - Switched to branch "cb/master" |
219 | - |
220 | - cburns@pynifti 13:38:38 $ git merge master |
221 | - Already up-to-date. |
222 | - |
223 | -To update the nipy source: |
224 | -^^^^^^^^^^^^^^^^^^^^^^^^^^ |
225 | - |
226 | -#. Change to the pynifti directory in the nipy developer trunk:: |
227 | - |
228 | - cd trunk-dev/neuroimaging/externals/pynifti/utils |
229 | - |
230 | -#. Run the ``update_source.py`` script to update the source. This |
231 | - assumes a directory structure the pynifti sources are in the |
232 | - directory ``$HOME/src/pynifti``. Run the script:: |
233 | - |
234 | - python update_source.py |
235 | - |
236 | - |
237 | -.. include:: ../links_names.txt |
238 | |
239 | === modified file 'doc/faq/documentation_faq.rst' |
240 | --- doc/faq/documentation_faq.rst 2009-03-19 23:17:13 +0000 |
241 | +++ doc/faq/documentation_faq.rst 2010-01-21 22:28:12 +0000 |
242 | @@ -25,8 +25,8 @@ |
243 | install. The error may look something like:: |
244 | |
245 | **writing output...** about api/generated/gen |
246 | - api/generated/neuroimaging |
247 | - api/generated/neuroimaging.algorithms.fwhm Format: "png" not |
248 | + api/generated/nipy |
249 | + api/generated/nipy.algorithms.fwhm Format: "png" not |
250 | recognized. Use one of: canon cmap cmapx cmapx_np dia dot eps fig |
251 | hpgl imap imap_np ismap mif mp pcl pic plain plain-ext ps ps2 svg |
252 | svgz tk vml vmlz vtx xdot |
253 | |
254 | === modified file 'doc/users/coordinate_map.rst' |
255 | --- doc/users/coordinate_map.rst 2009-07-03 07:17:10 +0000 |
256 | +++ doc/users/coordinate_map.rst 2010-01-21 22:28:12 +0000 |
257 | @@ -22,7 +22,7 @@ |
258 | |
259 | |
260 | For more on Coordinate Systems and thier properties |
261 | -:mod:`neuroimaging.core.reference.coordinate_system` |
262 | +:mod:`nipy.core.reference.coordinate_system` |
263 | |
264 | You can introspect a coordinate map |
265 | |
266 | |
267 | === modified file 'doc/users/installation.rst' |
268 | --- doc/users/installation.rst 2009-10-15 18:28:57 +0000 |
269 | +++ doc/users/installation.rst 2010-01-21 22:28:12 +0000 |
270 | @@ -18,16 +18,13 @@ |
271 | Must Have |
272 | ^^^^^^^^^ |
273 | |
274 | - Python_ 2.4 or later |
275 | + Python_ 2.5 or later |
276 | |
277 | NumPy_ 1.2 or later |
278 | |
279 | SciPy_ 0.7 or later |
280 | Numpy and Scipy are high-level, optimized scientific computing libraries. |
281 | |
282 | - PyNifti_ |
283 | - We are using pynifti for the underlying file IO for nifti files. |
284 | - |
285 | gcc_ |
286 | NIPY does contain a few C extensions for optimized |
287 | routines. Therefore, you must have a compiler to build from |
288 | |
289 | === modified file 'doc/www/conf.py' |
290 | --- doc/www/conf.py 2009-02-12 23:26:12 +0000 |
291 | +++ doc/www/conf.py 2010-01-21 22:28:12 +0000 |
292 | @@ -28,15 +28,15 @@ |
293 | # The objects.inv file has this info for each mapped object: |
294 | # label-name classifier path-to-html |
295 | # Examples: |
296 | -# neuroimaging.core.image.image.Image class api/generated/neuroimaging.core.image.image.html |
297 | -# neuroimaging.core.image.generators mod api/generated/neuroimaging.core.image.generators.html |
298 | +# nipy.core.image.image.Image class api/generated/nipy.core.image.image.html |
299 | +# nipy.core.image.generators mod api/generated/nipy.core.image.generators.html |
300 | # |
301 | #intersphinx_mapping = {'../html/doc/manual/html': '../build/html/objects.inv'} |
302 | # |
303 | # In reST documents, I can then link to Python objects in the API like this: |
304 | # |
305 | -#This is the image class: :class:`neuroimaging.core.image.image` |
306 | -#This is the Image.affine method: :meth:`neuroimaging.core.image.image.Image.affine` |
307 | +#This is the image class: :class:`nipy.core.image.image` |
308 | +#This is the Image.affine method: :meth:`nipy.core.image.image.Image.affine` |
309 | |
310 | # Add any paths that contain templates here, relative to this directory. |
311 | templates_path = ['_templates'] |
312 | |
313 | === modified file 'nipy/core/api.py' |
314 | --- nipy/core/api.py 2009-12-17 20:42:37 +0000 |
315 | +++ nipy/core/api.py 2010-01-21 22:28:12 +0000 |
316 | @@ -1,16 +1,24 @@ |
317 | """ |
318 | -Pseudo-package for all of the core symbols from the image object and its reference |
319 | -system. Use this module for importing core names into your namespace. For example: |
320 | - from neuorimaging.core.api import Image |
321 | +Pseudo-package for all of the core symbols from the image object and its |
322 | +reference system. Use this module for importing core names into your |
323 | +namespace. |
324 | + |
325 | +For example: |
326 | + |
327 | +>>> from nipy.core.api import Image |
328 | """ |
329 | |
330 | # Note: The order of imports is important here. |
331 | -from nipy.core.reference.coordinate_system import CoordinateSystem |
332 | -from nipy.core.reference.coordinate_map import CoordinateMap, Affine, compose |
333 | -from nipy.core.reference.array_coords import Grid, ArrayCoordMap |
334 | - |
335 | -from nipy.core.image.image import Image, merge_images, fromarray, is_image |
336 | - |
337 | -from nipy.core.image.image_list import ImageList |
338 | - |
339 | -from nipy.core.image.generators import parcels, data_generator, write_data, slice_generator, f_generator, matrix_generator |
340 | +from .reference.coordinate_system import CoordinateSystem |
341 | +from .reference.coordinate_map import (CoordinateMap, Affine, compose, |
342 | + drop_io_dim, append_io_dim) |
343 | +from .reference.array_coords import Grid, ArrayCoordMap |
344 | + |
345 | +from .image.image import Image, merge_images, fromarray, is_image |
346 | + |
347 | +from .image.image_list import ImageList |
348 | + |
349 | +from .image.generators import (parcels, data_generator, write_data, |
350 | + slice_generator, f_generator, |
351 | + matrix_generator) |
352 | + |
353 | |
354 | === modified file 'nipy/core/image/image.py' |
355 | --- nipy/core/image/image.py 2009-12-17 20:42:37 +0000 |
356 | +++ nipy/core/image/image.py 2010-01-21 22:28:12 +0000 |
357 | @@ -76,7 +76,7 @@ |
358 | # ii) the shapes are consistent |
359 | |
360 | # self._data is an array-like object. It must implement a subset of |
361 | - # array methods (Need to specify these, for now implied in pyniftio) |
362 | + # array methods (Need to specify these) |
363 | self._data = data |
364 | self._coordmap = coordmap |
365 | |
366 | |
367 | === modified file 'nipy/core/reference/array_coords.py' |
368 | --- nipy/core/reference/array_coords.py 2009-02-25 04:54:32 +0000 |
369 | +++ nipy/core/reference/array_coords.py 2010-01-21 22:28:12 +0000 |
370 | @@ -57,11 +57,9 @@ |
371 | values : array |
372 | Values of self.coordmap evaluated at np.indices(self.shape). |
373 | """ |
374 | - |
375 | indices = np.indices(self.shape).astype( |
376 | self.coordmap.input_coords.coord_dtype) |
377 | tmp_shape = indices.shape |
378 | - |
379 | # reshape indices to be a sequence of coordinates |
380 | indices.shape = (self.coordmap.ndim[0], np.product(self.shape)) |
381 | _range = self.coordmap(indices.T) |
382 | @@ -87,13 +85,27 @@ |
383 | |
384 | Parameters |
385 | ---------- |
386 | - index : ``int`` or ``slice`` |
387 | - sequence of integers or slices |
388 | - |
389 | + index : int or tuple |
390 | + int, or sequence of any combination of integers, slices. The |
391 | + sequence can also contain one Ellipsis. |
392 | """ |
393 | - |
394 | + # index can be single int or tuple |
395 | if type(index) != type(()): |
396 | index = (index,) |
397 | + # allow slicing of form [...,1] |
398 | + if Ellipsis in index: |
399 | + if sum([i == Ellipsis for i in index]) > 1: |
400 | + raise ValueError("only one Ellipsis (...) allowed in slice") |
401 | + # convert ellipsis to series of slice(None) objects. For |
402 | + # example, if the coordmap is length 3, we convert (...,1) |
403 | + # to (slice(None), slice(None), 1) - equivalent to [:,:,1] |
404 | + ellipsis_start = list(index).index(Ellipsis) |
405 | + inds_after_ellipsis = index[(ellipsis_start+1):] |
406 | + # the ellipsis continues until any remaining slice specification |
407 | + n_ellipses = len(self.shape) - ellipsis_start - len(inds_after_ellipsis) |
408 | + index = (index[:ellipsis_start] |
409 | + + n_ellipses * (slice(None),) |
410 | + + inds_after_ellipsis) |
411 | return _slice(self.coordmap, self.shape, *index) |
412 | |
413 | @staticmethod |
414 | @@ -106,16 +118,15 @@ |
415 | slices = tuple([slice(0,s,1) for s in shape]) |
416 | return Grid(coordmap)[slices] |
417 | |
418 | + |
419 | def _slice(coordmap, shape, *slices): |
420 | """ |
421 | Slice a 'voxel' CoordinateMap's input_coords with slices. A |
422 | 'voxel' CoordinateMap is interpreted as a coordmap having a shape. |
423 | """ |
424 | - |
425 | if len(slices) < coordmap.ndim[0]: |
426 | slices = (list(slices) + |
427 | [slice(None,None,None)] * (coordmap.ndim[0] - len(slices))) |
428 | - |
429 | ranges = [np.arange(s) for s in shape] |
430 | cmaps = [] |
431 | keep_in_output = [] |
432 | @@ -177,6 +188,7 @@ |
433 | slice_cmap = Affine(A, input_coords, coordmap.input_coords) |
434 | return ArrayCoordMap(compose(coordmap, slice_cmap), tuple(newshape)) |
435 | |
436 | + |
437 | class Grid(object): |
438 | """ |
439 | Simple class to construct Affine instances with slice notation |
440 | |
441 | === added file 'nipy/core/reference/tests/test_array_coords.py' |
442 | --- nipy/core/reference/tests/test_array_coords.py 1970-01-01 00:00:00 +0000 |
443 | +++ nipy/core/reference/tests/test_array_coords.py 2010-01-21 22:28:12 +0000 |
444 | @@ -0,0 +1,88 @@ |
445 | +""" Testing array coords |
446 | + |
447 | +""" |
448 | + |
449 | +import numpy as np |
450 | + |
451 | +from nose.tools import assert_true, assert_false, \ |
452 | + assert_equal, assert_raises |
453 | + |
454 | +from numpy.testing import assert_array_equal, assert_array_almost_equal |
455 | + |
456 | +from nipy.testing import parametric |
457 | + |
458 | +from nipy.core.api import Affine |
459 | + |
460 | +import nipy.core.reference.array_coords as acs |
461 | + |
462 | + |
463 | +@parametric |
464 | +def test_array_coord_map(): |
465 | + # array coord map recreates the affine when you slice an image. In |
466 | + # general, if you take an integer slice in some dimension, the |
467 | + # corresponding column of the affine will go, leaving a row for the |
468 | + # lost dimension, with all zeros, except for the translation in the |
469 | + # now-removed dimension, encoding the position of that particular |
470 | + # slice |
471 | + xz = 1.1; yz = 2.3; zz = 3.5 |
472 | + xt = 10.0; yt = 11; zt = 12 |
473 | + aff = np.diag([xz, yz, zz, 1]) |
474 | + aff[:3,3] = [xt, yt, zt] |
475 | + shape = (2,3,4) |
476 | + cmap = Affine.from_params('ijk', 'xyz', aff) |
477 | + acm = acs.ArrayCoordMap(cmap, shape) |
478 | + # slice the coordinate map for the first axis |
479 | + sacm = acm[1] |
480 | + # The affine has lost the first column, but has a remaining row (the |
481 | + # first) encoding the translation to get to this slice |
482 | + yield assert_array_almost_equal(sacm.coordmap.affine, |
483 | + np.array([ |
484 | + [0, 0, xz+xt], |
485 | + [yz, 0, yt], |
486 | + [0, zz, zt], |
487 | + [0, 0, 1]])) |
488 | + sacm = acm[:,1] |
489 | + # lost second column, remaining second row with translation |
490 | + yield assert_array_almost_equal(sacm.coordmap.affine, |
491 | + np.array([ |
492 | + [xz, 0, xt], |
493 | + [0, 0, yz+yt], |
494 | + [0, zz, zt], |
495 | + [0, 0, 1]])) |
496 | + sacm = acm[:,:,2] |
497 | + # ditto third column and row |
498 | + yield assert_array_almost_equal(sacm.coordmap.affine, |
499 | + np.array([ |
500 | + [xz, 0, xt], |
501 | + [0, yz, yt], |
502 | + [0, 0, 2*zz+zt], |
503 | + [0, 0, 1]])) |
504 | + # check ellipsis slicing is the same as [:,: ... |
505 | + sacm = acm[...,2] |
506 | + yield assert_array_almost_equal(sacm.coordmap.affine, |
507 | + np.array([ |
508 | + [xz, 0, xt], |
509 | + [0, yz, yt], |
510 | + [0, 0, 2*zz+zt], |
511 | + [0, 0, 1]])) |
512 | + # that ellipsis can follow other slice types |
513 | + sacm = acm[:,...,2] |
514 | + yield assert_array_almost_equal(sacm.coordmap.affine, |
515 | + np.array([ |
516 | + [xz, 0, xt], |
517 | + [0, yz, yt], |
518 | + [0, 0, 2*zz+zt], |
519 | + [0, 0, 1]])) |
520 | + # that there can be only one ellipsis |
521 | + yield assert_raises(ValueError, acm.__getitem__, ( |
522 | + (Ellipsis,Ellipsis,2))) |
523 | + # that you can integer slice in all three dimensions, leaving only |
524 | + # the translation column |
525 | + sacm = acm[1,0,2] |
526 | + yield assert_array_almost_equal(sacm.coordmap.affine, |
527 | + np.array([ |
528 | + [xz+xt], |
529 | + [yt], |
530 | + [2*zz+zt], |
531 | + [1]])) |
532 | + |
533 | |
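The affines the new tests expect can be computed directly: integer-slicing at index ``i`` along an axis drops that column from the affine and folds ``i * zoom`` into the translation column, keeping a row that records the slice position. A standalone numpy sketch (``slice_affine`` is a hypothetical helper, not the nipy implementation):

```python
import numpy as np

def slice_affine(aff, axis, i):
    """Drop input `axis` from a 4x4 affine after fixing it at integer
    index `i`; the row for the lost dimension survives, carrying the
    position of the slice in its translation entry."""
    n = aff.shape[1] - 1
    keep = [c for c in range(n) if c != axis] + [n]
    out = aff[:, keep].copy()
    # fold the fixed index into the translation column
    out[:, -1] += i * aff[:, axis]
    return out

# same affine as the test above: zooms (1.1, 2.3, 3.5), translation (10, 11, 12)
aff = np.diag([1.1, 2.3, 3.5, 1.0])
aff[:3, 3] = [10, 11, 12]
print(slice_affine(aff, 0, 1))  # first row becomes [0, 0, xz + xt]
```

Applying this for ``acm[1]``, ``acm[:,1]`` and ``acm[:,:,2]`` reproduces the matrices asserted in the test file.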
534 | === modified file 'nipy/io/files.py' |
535 | --- nipy/io/files.py 2009-12-17 20:42:37 +0000 |
536 | +++ nipy/io/files.py 2010-01-21 22:28:12 +0000 |
537 | @@ -207,7 +207,7 @@ |
538 | # Set zooms |
539 | hdr.set_zooms(zooms) |
540 | # save to disk |
541 | - out_img.to_filespec(filename) |
542 | + out_img.to_filename(filename) |
543 | return Fimg |
544 | |
545 | |
546 | |
547 | === modified file 'nipy/io/imageformats/__init__.py' |
548 | --- nipy/io/imageformats/__init__.py 2009-12-16 22:40:01 +0000 |
549 | +++ nipy/io/imageformats/__init__.py 2010-01-21 22:28:12 +0000 |
550 | @@ -2,20 +2,20 @@ |
551 | #ex: set sts=4 ts=4 sw=4 et: |
552 | ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ## |
553 | # |
554 | -# See COPYING file distributed along with the PyNIfTI package for the |
555 | +# See COPYING file distributed along with the NiBabel package for the |
556 | # copyright and license terms. |
557 | # |
558 | ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ### ## |
559 | """This module provides Python bindings to the NIfTI data format. |
560 | |
561 | -The PyNIfTI module is a Python interface to the NIfTI I/O libraries. Using |
562 | -PyNIfTI, one can easily read and write NIfTI and ANALYZE images from within |
563 | +The NiBabel module is a Python interface to the NIfTI I/O libraries. Using |
564 | +NiBabel, one can easily read and write NIfTI and ANALYZE images from within |
565 | Python. The :class:`~nifti.image.NiftiImage` class provides pythonic |
566 | access to the full header information and for a maximum of interoperability the |
567 | image data is made available via NumPy arrays. |
568 | |
569 | =============================== |
570 | - pynifti python implementation |
571 | + nibabel python implementation |
572 | =============================== |
573 | |
574 | Quickstart:: |
575 | @@ -59,4 +59,8 @@ |
576 | from nipy.io.imageformats.spm2analyze import Spm2AnalyzeHeader, Spm2AnalyzeImage |
577 | from nipy.io.imageformats.nifti1 import Nifti1Header, Nifti1Image |
578 | from nipy.io.imageformats.minc import MincHeader, MincImage |
579 | -from nipy.io.imageformats.funcs import squeeze_image, concat_images, four_to_three |
580 | +from nipy.io.imageformats.funcs import (squeeze_image, concat_images, four_to_three, |
581 | + as_closest_canonical) |
582 | +from nipy.io.imageformats.orientations import (io_orientation, orientation_affine, |
583 | + flip_axis, OrientationError, |
584 | + apply_orientation) |
585 | |
586 | === modified file 'nipy/io/imageformats/analyze.py' |
587 | --- nipy/io/imageformats/analyze.py 2009-12-12 19:52:25 +0000 |
588 | +++ nipy/io/imageformats/analyze.py 2010-01-21 22:28:12 +0000 |
589 | @@ -19,7 +19,8 @@ |
590 | therefore must know how to predict image filenames from header |
591 | filenames, whether these are different, and so on. |
592 | |
593 | -You can access and set fields of a particular header type using standard __getitem__ / __setitem__ syntax: |
594 | +You can access and set fields of a particular header type using standard |
595 | +__getitem__ / __setitem__ syntax: |
596 | |
597 | hdr['field'] = 10 |
598 | |
599 | @@ -94,13 +95,16 @@ |
600 | |
601 | ''' |
602 | |
603 | +import warnings |
604 | + |
605 | import numpy as np |
606 | |
607 | from nipy.io.imageformats.volumeutils import pretty_mapping, endian_codes, \ |
608 | native_code, swapped_code, hdr_getterfunc, \ |
609 | make_dt_codes, HeaderDataError, HeaderTypeError, allopen |
610 | |
611 | -from nipy.io.imageformats.header_ufuncs import read_data, write_data, adapt_header |
612 | +from nipy.io.imageformats.header_ufuncs import read_data, write_data, \ |
613 | + adapt_header, read_unscaled_data |
614 | |
615 | from nipy.io.imageformats import imageglobals as imageglobals |
616 | from nipy.io.imageformats.spatialimages import SpatialImage |
617 | @@ -1071,6 +1075,41 @@ |
618 | self._data = read_data(self._header, allopen(fname)) |
619 | return self._data |
620 | |
621 | + def get_unscaled_data(self): |
622 | + """ Return image data without image scaling applied |
623 | + |
624 | + Summary: please use the ``get_data`` method instead of this |
625 | + method unless you are sure what you are doing, and that you will |
626 | + only be using image formats for which this method exists and |
627 | + returns sensible results. |
628 | + |
629 | + Use this method with care; the modified Analyze-type formats |
630 | + such as the SPM formats and nifti1, specify that the image data |
631 | + array, as they expect to return it, is given by the raw |
632 | + data on disk, multiplied by a scalefactor and maybe with the |
633 | + addition of a constant. This method returns the data on the |
634 | + disk, without these format-specific scalings applied. Please |
635 | + use this method only if you absolutely need the unscaled data, |
636 | + and the magnitude of the data, as given by the scalefactor, is |
637 | + not relevant to your application. The Analyze-type formats have |
638 | + a single scalefactor +/- offset per image on disk. If you do not |
639 | + care about the absolute values, and will be removing the mean |
640 | + from the data, then the unscaled values will have preserved |
641 | + intensity ratios compared to the mean-centered scaled data. |
642 | + However, this is not necessarily true of other formats with more |
643 | + complicated scaling - such as MINC. |
644 | + |
645 | + Note that, unlike the scaled ``get_data`` method, we do not |
646 | + cache the array, to minimize the memory taken by the object. |
647 | + """ |
648 | + if not self._files: |
649 | + return None |
650 | + try: |
651 | + fname = self._files['image'] |
652 | + except KeyError: |
653 | + return None |
654 | + return read_unscaled_data(self._header, allopen(fname)) |
655 | + |
656 | def get_header(self): |
657 | ''' Return header |
658 | |
659 | @@ -1099,11 +1138,6 @@ |
660 | self._header.set_data_dtype(dtype) |
661 | |
662 | @classmethod |
663 | - def from_filespec(klass, filespec): |
664 | - files = klass.filespec_to_files(filespec) |
665 | - return klass.from_files(files) |
666 | - |
667 | - @classmethod |
668 | def from_files(klass, files): |
669 | fname = files['header'] |
670 | header = klass._header_maker.from_fileobj(allopen(fname)) |
671 | @@ -1112,14 +1146,6 @@ |
672 | ret._files = files |
673 | return ret |
674 | |
675 | - @classmethod |
676 | - def from_image(klass, img): |
677 | - orig_hdr = img.get_header() |
678 | - return klass(img.get_data(), |
679 | - img.get_affine(), |
680 | - img.get_header(), |
681 | - img.extra) |
682 | - |
683 | @staticmethod |
684 | def filespec_to_files(filespec): |
685 | ftups = filetuples.FileTuples( |
686 | @@ -1133,12 +1159,6 @@ |
687 | files = dict(zip(('header', 'image'), ftups.get_filenames())) |
688 | return files |
689 | |
690 | - def to_filespec(self, filespec): |
691 | - ''' Write image to files given by filespec |
692 | - ''' |
693 | - files = self.filespec_to_files(filespec) |
694 | - self.to_files(files) |
695 | - |
696 | def to_files(self, files=None): |
697 | ''' Write image to files passed, or self._files |
698 | ''' |
699 | @@ -1196,16 +1216,7 @@ |
700 | RZS = self._affine[:3,:3] |
701 | vox = np.sqrt(np.sum(RZS * RZS, axis=0)) |
702 | hdr['pixdim'][1:4] = vox |
703 | - |
704 | - @classmethod |
705 | - def load(klass, filespec): |
706 | - return klass.from_filespec(filespec) |
707 | - |
708 | - @classmethod |
709 | - def save(klass, img, filespec): |
710 | - img = klass.from_image(img) |
711 | - img.to_filespec(filespec) |
712 | |
713 | |
714 | load = AnalyzeImage.load |
715 | -save = AnalyzeImage.save |
716 | +save = AnalyzeImage.instance_to_filename |
717 | |
718 | === modified file 'nipy/io/imageformats/funcs.py' |
719 | --- nipy/io/imageformats/funcs.py 2009-12-16 22:40:01 +0000 |
720 | +++ nipy/io/imageformats/funcs.py 2010-01-21 22:28:12 +0000 |
721 | @@ -1,6 +1,10 @@ |
722 | ''' Processor functions for images ''' |
723 | import numpy as np |
724 | |
725 | +from .orientations import (io_orientation, orientation_affine, flip_axis, |
726 | + apply_orientation, OrientationError) |
727 | + |
728 | + |
729 | def squeeze_image(img): |
730 | ''' Return image, remove axes length 1 at end of image shape |
731 | |
732 | @@ -119,3 +123,53 @@ |
733 | img3d = image_maker(arr3d, affine, header) |
734 | imgs.append(img3d) |
735 | return imgs |
736 | + |
737 | + |
738 | +def as_closest_canonical(img, enforce_diag=False): |
739 | + ''' Return `img` with data reordered to be closest to canonical |
740 | + |
741 | + Canonical order is the ordering of the output axes. |
742 | + |
743 | + Parameters |
744 | + ---------- |
745 | + img : ``spatialimage`` |
746 | + enforce_diag : {False, True}, optional |
747 | + If True, before transforming image, check if the resulting image |
748 | + affine will be close to diagonal, and if not, raise an error |
749 | + |
750 | + Returns |
751 | + ------- |
752 | + canonical_img : ``spatialimage`` |
753 | + Version of `img` where the underlying array may have been |
754 | + reordered and / or flipped so that axes 0,1,2 are those axes in |
755 | + the input data that are, respectively, closest to the output axis |
756 | + orientation. We modify the affine accordingly. If `img` |
757 | + already has the correct data ordering, we just return `img` |
758 | + unmodified. |
759 | + ''' |
760 | + aff = img.get_affine() |
761 | + ornt = io_orientation(aff) |
762 | + if np.all(ornt == [[0,1], |
763 | + [1,1], |
764 | + [2,1]]): # canonical already |
765 | + # however, the affine may not be diagonal |
766 | + if enforce_diag and not _aff_is_diag(aff): |
767 | + raise OrientationError('Transformed affine is not diagonal') |
768 | + return img |
769 | + shape = img.get_shape() |
770 | + t_aff = orientation_affine(ornt, shape) |
771 | + out_aff = np.dot(aff, t_aff) |
772 | + # check if we are going to end up with something diagonal |
773 | + if enforce_diag and not _aff_is_diag(aff): |
774 | + raise OrientationError('Transformed affine is not diagonal') |
775 | + # we need to transform the data |
776 | + arr = img.get_data() |
777 | + t_arr = apply_orientation(arr, ornt) |
778 | + return img.__class__(t_arr, out_aff, img.get_header()) |
779 | + |
780 | + |
781 | +def _aff_is_diag(aff): |
782 | + ''' Utility function returning True if affine is nearly diagonal ''' |
783 | + rzs_aff = aff[:3,:3] |
784 | + return np.allclose(rzs_aff, np.diag(np.diag(rzs_aff))) |
785 | + |
786 | |
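The new ``_aff_is_diag`` helper in this hunk is easy to exercise on its own; a standalone numpy sketch of the same check:

```python
import numpy as np

def aff_is_diag(aff):
    """True if the 3x3 rotation/zoom block of `aff` is (nearly)
    diagonal, mirroring the _aff_is_diag helper added above."""
    rzs = np.asarray(aff)[:3, :3]
    return np.allclose(rzs, np.diag(np.diag(rzs)))

diag_aff = np.diag([2.0, 3.0, 4.0, 1.0])
sheared = np.eye(4)
sheared[0, 1] = 0.5  # off-diagonal term: not diagonal any more
print(aff_is_diag(diag_aff), aff_is_diag(sheared))
```

This is the test ``as_closest_canonical`` applies when ``enforce_diag=True``: reordering and flipping axes can only make the affine diagonal if the original affine had no rotation or shear.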
787 | === removed file 'nipy/io/imageformats/images.py' |
788 | --- nipy/io/imageformats/images.py 2009-06-05 03:13:27 +0000 |
789 | +++ nipy/io/imageformats/images.py 1970-01-01 00:00:00 +0000 |
790 | @@ -1,124 +0,0 @@ |
791 | -#emacs: -*- mode: python-mode; py-indent-offset: 4; indent-tabs-mode: nil -*- |
792 | -#ex: set sts=4 ts=4 sw=4 et: |
793 | -''' Base class for images |
794 | - |
795 | -A draft. |
796 | -''' |
797 | - |
798 | -import nipy.io.imageformats.ioimps as ioimps |
799 | - |
800 | -class ImageError(Exception): |
801 | - pass |
802 | - |
803 | -class Image(object): |
804 | - ''' Base class for images |
805 | - |
806 | - Attributes: |
807 | - |
808 | - * data : array-like; image data |
809 | - * affine : array; mapping from image voxels to output space |
810 | - coordinates |
811 | - * output_space : string; name for affine output space |
812 | - * meta : dict-like; image metadata |
813 | - * io : image io implementation object |
814 | - |
815 | - Properties (sorry guys): |
816 | - |
817 | - * filename : string : read only, filename or None if none |
818 | - * mode : string; 'r'ead, 'w'rite, or 'rw' |
819 | - |
820 | - Methods: |
821 | - |
822 | - * save(filespec=None) |
823 | - * get_filespec() |
824 | - * set_filespec(filespec) |
825 | - * same_as_filename() |
826 | - |
827 | - Class attributes: |
828 | - |
829 | - * default_io_class |
830 | - |
831 | - ''' |
832 | - |
833 | - default_io_class = ioimps.default_io |
834 | - |
835 | - def __init__(self, data, |
836 | - affine=None, |
837 | - output_space=None, |
838 | - meta=None, |
839 | - filespec=None, |
840 | - io=None, |
841 | - mode='rw'): |
842 | - self.data = data |
843 | - self.affine = affine |
844 | - self.output_space = output_space |
845 | - if meta is None: |
846 | - meta = {} |
847 | - self.meta = meta |
848 | - if not filespec is None: |
849 | - if io is None: |
850 | - io = ioimps.guessed_imp(filespec) |
851 | - else: |
852 | - io.set_filespec(filespec) |
853 | - if io is None: |
854 | - io = self.default_io_class() |
855 | - self.io = io |
856 | - # lay down flag to show changes in IO |
857 | - self._io_hash = io.get_hash() |
858 | - self._mode = None |
859 | - self.mode = mode |
860 | - |
861 | - @property |
862 | - def filename(self): |
863 | - filespec = self.get_filespec() |
864 | - return filespec.filename |
865 | - |
866 | - def mode(): |
867 | - def fget(self): |
868 | - return self._mode |
869 | - def fset(self, mode): |
870 | - if not set('rw').issuperset(set(mode)): |
871 | - raise ImageError('Invalid mode "%s"' % mode) |
872 | - self._mode = mode |
873 | - doc = 'image read / write mode' |
874 | - return locals() |
875 | - mode = property(**mode()) |
876 | - |
877 | - def get_filespec(self): |
878 | - return self.io.get_filespec() |
879 | - |
880 | - def set_filespec(self, filespec): |
881 | - self.io.set_filespec(filespec) |
882 | - |
883 | - def save(self, filespec=None, io=None): |
884 | - io.save_image(self, filespec, io) |
885 | - |
886 | - @classmethod |
887 | - def from_io(klass, io): |
888 | - return io.as_image(klass) |
889 | - |
890 | - def same_as_filename(self, filename=None): |
891 | - ''' True if image in memory is same as image on disk ''' |
892 | - fname = self.filename |
893 | - if fname is None: |
894 | - return False |
895 | - if filename: |
896 | - if filename != fname: |
897 | - return False |
898 | - # first, check if io has changed |
899 | - if self._io_hash != self.io.get_hash(): |
900 | - return False |
901 | - # Then get io to check image against itself |
902 | - return self.io.same_as_image(self) |
903 | - |
904 | - |
905 | -def load(filespec, maker=Image, io=None): |
906 | - if io is None: |
907 | - io = ioimps.guessed_implementation(filespec) |
908 | - else: |
909 | - io.set_filespec(filespec) |
910 | - return maker.from_io(io) |
911 | - |
912 | - |
913 | -def save(img, filespec=None, io=None): |
914 | - img.save(filespec, io) |
915 | |
916 | === removed file 'nipy/io/imageformats/ioimps.py' |
917 | --- nipy/io/imageformats/ioimps.py 2009-06-05 03:13:27 +0000 |
918 | +++ nipy/io/imageformats/ioimps.py 1970-01-01 00:00:00 +0000 |
919 | @@ -1,64 +0,0 @@ |
920 | -''' IO implementatations ''' |
921 | - |
922 | -def guessed_imp(filespec): |
923 | - ''' Return implementation guessed from filespec ''' |
924 | - raise NotImplementedError |
925 | - |
926 | -class IOImplementation(object): |
927 | - def __init__(self, filespec = None): |
928 | - self._filespec = None |
929 | - self.set_filespec(filespec) |
930 | - |
931 | - def get_filespec(self): |
932 | - return self._filespec |
933 | - |
934 | - def set_filespec(self, filespec): |
935 | - self._filespec = filespec |
936 | - |
937 | - def to_filespec(self, filespec=None): |
938 | - raise NotImplementedError |
939 | - |
940 | - def copy(self): |
941 | - raise NotImplementedError |
942 | - |
943 | - def get_affine(self): |
944 | - raise NotImplementedError |
945 | - |
946 | - def set_affine(self, affine): |
947 | - raise NotImplementedError |
948 | - |
949 | - def get_output_space(self): |
950 | - raise NotImplementedError |
951 | - |
952 | - def set_output_space(self, output_space): |
953 | - raise NotImplementedError |
954 | - |
955 | - def get_data_shape(self): |
956 | - raise NotImplementedError |
957 | - |
958 | - def set_data_shape(self, shape): |
959 | - raise NotImplementedError |
960 | - |
961 | - def get_data_dtype(self): |
962 | - raise NotImplementedError |
963 | - |
964 | - def set_data_dtype(self, dtype): |
965 | - raise NotImplementedError |
966 | - |
967 | - def write_slice(data, slicedef, outfile = None): |
968 | - raise NotImplementedError |
969 | - |
970 | - def as_image(self, image_maker): |
971 | - raise NotImplementedError |
972 | - |
973 | - def save_image(self, image, filespec=None, io=None): |
974 | - raise NotImplementedError |
975 | - |
976 | - def same_as_image(self, image): |
977 | - raise NotImplementedError |
978 | - |
979 | - def get_hash(self): |
980 | - raise NotImplementedError |
981 | - |
982 | -default_io = IOImplementation |
983 | - |
984 | |
985 | === modified file 'nipy/io/imageformats/loadsave.py' |
986 | --- nipy/io/imageformats/loadsave.py 2009-06-05 03:13:27 +0000 |
987 | +++ nipy/io/imageformats/loadsave.py 2010-01-21 22:28:12 +0000 |
988 | @@ -5,12 +5,12 @@ |
989 | from nipy.io.imageformats import minc |
990 | |
991 | |
992 | -def load(filespec, *args, **kwargs): |
993 | - ''' Load file given filespec, guessing at file type |
994 | +def load(filename, *args, **kwargs): |
995 | + ''' Load file given filename, guessing at file type |
996 | |
997 | Parameters |
998 | ---------- |
999 | - filespec : string or file-like |
1000 | + filename : string or file-like |
1001 | specification of filename or file to load |
1002 | *args |
1003 | **kwargs |
1004 | @@ -23,31 +23,31 @@ |
1005 | |
1006 | ''' |
1007 | # Try and guess file type from filename |
1008 | - if isinstance(filespec, basestring): |
1009 | - fname = filespec |
1010 | + if isinstance(filename, basestring): |
1011 | + fname = filename |
1012 | for ending in ('.gz', '.bz2'): |
1013 | - if filespec.endswith(ending): |
1014 | + if filename.endswith(ending): |
1015 | fname = fname[:-len(ending)] |
1016 | break |
1017 | if fname.endswith('.nii'): |
1018 | - return nifti1.load(filespec, *args, **kwargs) |
1019 | + return nifti1.load(filename, *args, **kwargs) |
1020 | if fname.endswith('.mnc'): |
1021 | - return minc.load(filespec, *args, **kwargs) |
1022 | + return minc.load(filename, *args, **kwargs) |
1023 | # Not a string, or not recognized as nii or mnc |
1024 | try: |
1025 | - files = nifti1.Nifti1Image.filespec_to_files(filespec) |
1026 | + files = nifti1.Nifti1Image.filespec_to_files(filename) |
1027 | except ValueError: |
1028 | raise RuntimeError('Cannot work out file type of "%s"' % |
1029 | - filespec) |
1030 | + filename) |
1031 | hdr = nifti1.Nifti1Header.from_fileobj( |
1032 | vu.allopen(files['header']), |
1033 | check=False) |
1034 | magic = hdr['magic'] |
1035 | if magic in ('ni1', 'n+1'): |
1036 | - return nifti1.load(filespec, *args, **kwargs) |
1037 | - return spm2.load(filespec, *args, **kwargs) |
1038 | - |
1039 | - |
1040 | -def save(img, filespec): |
1041 | + return nifti1.load(filename, *args, **kwargs) |
1042 | + return spm2.load(filename, *args, **kwargs) |
1043 | + |
1044 | + |
1045 | +def save(img, filename): |
1046 | ''' Save an image to file without changing format''' |
1047 | - img.to_filespec(filespec) |
1048 | + img.to_filename(filename) |
1049 | |
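The extension guessing at the top of ``load`` first strips any compression suffix, then tests the format extension. In isolation (``guess_format`` is a hypothetical name, not part of the module):

```python
def guess_format(filename):
    """Strip a .gz / .bz2 compression suffix, then guess the image
    format from the remaining extension, as load() above does."""
    fname = filename
    for ending in ('.gz', '.bz2'):
        if fname.endswith(ending):
            fname = fname[:-len(ending)]
            break
    if fname.endswith('.nii'):
        return 'nifti1'
    if fname.endswith('.mnc'):
        return 'minc'
    return 'unknown'

print(guess_format('scan.nii.gz'))  # nifti1
```

Anything falling through to ``'unknown'`` is handled in ``load`` by reading the header and inspecting the nifti magic, falling back to SPM Analyze.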
1050 | === modified file 'nipy/io/imageformats/minc.py' |
1051 | --- nipy/io/imageformats/minc.py 2009-09-11 01:35:49 +0000 |
1052 | +++ nipy/io/imageformats/minc.py 2010-01-21 22:28:12 +0000 |
1053 | @@ -256,11 +256,6 @@ |
1054 | return self._header.get_data_dtype() |
1055 | |
1056 | @classmethod |
1057 | - def from_filespec(klass, filespec): |
1058 | - files = klass.filespec_to_files(filespec) |
1059 | - return klass.from_files(files) |
1060 | - |
1061 | - @classmethod |
1062 | def from_files(klass, files): |
1063 | fname = files['image'] |
1064 | header = klass._header_maker.from_fileobj(allopen(fname)) |
1065 | @@ -269,20 +264,9 @@ |
1066 | ret._files = files |
1067 | return ret |
1068 | |
1069 | - @classmethod |
1070 | - def from_image(klass, img): |
1071 | - return klass(img.get_data(), |
1072 | - img.get_affine(), |
1073 | - img.get_header(), |
1074 | - img.extra) |
1075 | - |
1076 | @staticmethod |
1077 | def filespec_to_files(filespec): |
1078 | return {'image':filespec} |
1079 | |
1080 | - @classmethod |
1081 | - def load(klass, filespec): |
1082 | - return klass.from_filespec(filespec) |
1083 | - |
1084 | |
1085 | load = MincImage.load |
1086 | |
1087 | === modified file 'nipy/io/imageformats/nifti1.py' |
1088 | --- nipy/io/imageformats/nifti1.py 2009-12-12 19:52:25 +0000 |
1089 | +++ nipy/io/imageformats/nifti1.py 2010-01-21 22:28:12 +0000 |
1090 | @@ -82,7 +82,7 @@ |
1091 | (512, 'uint16', np.uint16), |
1092 | (768, 'uint32', np.uint32), |
1093 | (1024,'int64', np.int64), |
1094 | - (1280, 'int64', np.uint64), |
1095 | + (1280, 'uint64', np.uint64), |
1096 | (1536, 'float128', _float128t), # Only numpy defined on 64 bit |
1097 | (1792, 'complex128', np.complex128), |
1098 | (2048, 'complex256', _complex256t), # 64 bit again |
1099 | @@ -1458,19 +1458,15 @@ |
1100 | data = self.get_data() |
1101 | # Adapt header to possible two<->one file difference |
1102 | is_pair = files['header'] != files['image'] |
1103 | - |
1104 | hdr = self.get_header().for_file_pair(is_pair) |
1105 | - |
1106 | # if any extensions, figure out necessary vox_offset for extensions to |
1107 | # fit |
1108 | if self.extra.has_key('extensions') and len(self.extra['extensions']): |
1109 | hdr['vox_offset'] = len(hdr.binaryblock) \ |
1110 | + self.extra['extensions'].get_sizeondisk() |
1111 | - |
1112 | slope, inter, mn, mx = adapt_header(hdr, data) |
1113 | hdrf = allopen(files['header'], 'wb') |
1114 | hdr.write_to(hdrf) |
1115 | - |
1116 | # write all extensions to file |
1117 | # assumes that the file ptr is right after the magic string |
1118 | if not self.extra.has_key('extensions'): |
1119 | @@ -1478,13 +1474,11 @@ |
1120 | hdrf.write(np.array((0,0,0,0), dtype=np.int8).tostring()) |
1121 | else: |
1122 | self.extra['extensions'].write_to(hdrf) |
1123 | - |
1124 | - |
1125 | if is_pair: |
1126 | imgf = allopen(files['image'], 'wb') |
1127 | else: # single file for header and image |
1128 | imgf = hdrf |
1129 | - # streams like bz2 do not allow seeks, even forward. We |
1130 | + # streams like bz2 do not allow seeks, even forward. We |
1131 | # check where to go, and write zeros up until the data part |
1132 | # of the file |
1133 | offset = hdr.get_data_offset() |
1134 | @@ -1520,4 +1514,4 @@ |
1135 | |
1136 | |
1137 | load = Nifti1Image.load |
1138 | -save = Nifti1Image.save |
1139 | +save = Nifti1Image.instance_to_filename |
1140 | |
1141 | === added file 'nipy/io/imageformats/orientations.py' |
1142 | --- nipy/io/imageformats/orientations.py 1970-01-01 00:00:00 +0000 |
1143 | +++ nipy/io/imageformats/orientations.py 2010-01-21 22:28:12 +0000 |
1144 | @@ -0,0 +1,254 @@ |
1145 | +''' Utilities for calculating and applying affine orientations ''' |
1146 | + |
1147 | +import numpy as np |
1148 | +import numpy.linalg as npl |
1149 | + |
1150 | + |
1151 | +class OrientationError(Exception): |
1152 | + pass |
1153 | + |
1154 | + |
1155 | +def io_orientation(affine, tol=None): |
1156 | + ''' Orientation of input axes in terms of output axes for `affine` |
1157 | + |
1158 | + Valid for an affine transformation from ``m`` dimensions to ``n`` |
1159 | + dimensions (``affine.shape == (n+1, m+1)``). |
1160 | + |
1161 | + The calculated orientations can be used to transform associated |
1162 | + arrays to best match the output orientations. If ``n`` > ``m``, then |
1163 | + some of the output axes should be considered dropped in this |
1164 | + orientation. |
1165 | + |
1166 | + Parameters |
1167 | + ---------- |
1168 | + affine : (n+1,m+1) ndarray-like |
1169 | + Transformation affine from ``m`` inputs to ``n`` outputs. |
1170 | + Usually this will be a shape (4,4) matrix, transforming 3 inputs |
1171 | + to 3 outputs, but the code also handles the more general case |
1172 | + tol : {None, float}, optional |
1173 | + threshold below which SVD values of the affine are considered |
1174 | + zero. If `tol` is None, and ``S`` is an array with singular |
1175 | + values for `affine`, and ``eps`` is the epsilon value for |
1176 | + datatype of ``S``, then `tol` is set to ``S.max() * eps``. |
1177 | + |
1178 | + Returns |
1179 | + ------- |
1180 | + orientations : (n,2) ndarray |
1181 | + one row per input axis, where the first value in each row is the |
1182 | + closest corresponding output axis, and the second value is 1 if |
1183 | + the input axis is in the same direction as the corresponding |
1184 | + output axis, and -1 if it is in the opposite direction to the |
1185 | + corresponding output axis. If a row is [np.nan, np.nan], which |
1186 | + can happen when n > m, then this row should be considered |
1187 | + dropped. |
1188 | + ''' |
1189 | + affine = np.asarray(affine) |
1190 | + n, m = affine.shape[0]-1, affine.shape[1]-1 |
1191 | + # extract the underlying rotation matrix |
1192 | + RZS = affine[:n,:m] |
1193 | + zooms = np.sqrt(np.sum(RZS * RZS, axis=0)) |
1194 | + RS = RZS / zooms |
1195 | + # Transform below is polar decomposition, returning the closest |
1196 | + # shearless matrix R to RS |
1197 | + P, S, Qs = npl.svd(RS) |
1198 | + # Threshold the singular values to determine the rank. |
1199 | + if tol is None: |
1200 | + tol = S.max() * np.finfo(S.dtype).eps |
1201 | + keep = (S > tol) |
1202 | + R = np.dot(P[:,keep], Qs[keep]) |
1203 | + # the matrix R is such that np.dot(R,R.T) is projection onto the |
1204 | + # columns of P[:,keep] and np.dot(R.T,R) is projection onto the rows |
1205 | + # of Qs[keep]. R (== np.dot(R, np.eye(m))) gives rotation of the |
1206 | + # unit input vectors to output coordinates. Therefore, the row |
1207 | + # index of abs max R[:,N] is the output axis changing most as input |
1208 | + # axis N changes. In case there are ties, we choose the axes |
1209 | + # iteratively, removing used axes from consideration as we go |
1210 | + ornt = np.ones((n,2), dtype=np.int8)*np.nan |
1211 | + for in_ax in range(m): |
1212 | + col = R[:,in_ax] |
1213 | + if not np.alltrue(np.equal(col, 0)): |
1214 | + out_ax = np.argmax(np.abs(col)) |
1215 | + ornt[in_ax,0] = out_ax |
1216 | + assert col[out_ax] != 0 |
1217 | + if col[out_ax] < 0: |
1218 | + ornt[in_ax,1] = -1 |
1219 | + else: |
1220 | + ornt[in_ax,1] = 1 |
1221 | + # remove the identified axis from further consideration, by |
1222 | + # zeroing out the corresponding row in R |
1223 | + R[out_ax,:] = 0 |
1224 | + return ornt |
1225 | + |
1226 | + |
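For affines without shear, the axis-matching loop above reduces to taking, for each input axis, the output axis with the largest absolute coefficient and its sign. A simplified sketch of that reduced case (it skips the SVD-based polar decomposition and rank handling of the full routine):

```python
import numpy as np

def simple_io_orientation(affine):
    """One (output_axis, flip) row per input axis of a shear-free
    4x4 affine -- a simplified sketch of io_orientation above."""
    rzs = np.asarray(affine)[:3, :3]
    ornt = np.zeros((3, 2), dtype=int)
    for in_ax in range(3):
        col = rzs[:, in_ax]
        out_ax = int(np.argmax(np.abs(col)))
        ornt[in_ax] = (out_ax, 1 if col[out_ax] > 0 else -1)
    return ornt

# x axis flipped, y and z already in canonical order
print(simple_io_orientation(np.diag([-2.0, 3.0, 4.0, 1.0])))
```

The full routine orthogonalizes the affine first so that ties and shears are resolved consistently, and marks dropped axes with ``np.nan`` when there are more output than input dimensions.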
1227 | +def _ornt_to_affine(orientations): |
1228 | + ''' Create affine transformation matrix determined by orientations. |
1229 | + |
1230 | + This transformation will simply flip, transpose, and possibly drop some |
1231 | + coordinates. |
1232 | + |
1233 | + Parameters |
1234 | + ---------- |
1235 | + orientations : (n,2) ndarray |
1236 | + one row per input axis, where the first value in each row is the |
1237 | + closest corresponding output axis, and the second value is 1 if |
1238 | + the input axis is in the same direction as the corresponding |
1239 | + output axis, and -1 if it is in the opposite direction to the |
1240 | + corresponding output axis. If a row has first entry np.nan, then |
1241 | + this axis is dropped from the output. |
1242 | + |
1243 | + Returns |
1244 | + ------- |
1245 | + affine : (m+1,n+1) ndarray |
1246 | + matrix representing flipping / dropping axes. m is equal to the |
1247 | + number of rows of orientations[:,0] that are not np.nan |
1248 | + ''' |
1249 | + ornt = np.asarray(orientations) |
1250 | + n = ornt.shape[0] |
1251 | + keep = ~np.isnan(ornt[:,1]) |
1252 | + # These are the input coordinate axes that do have a matching output |
1253 | + # column in the orientation. That is, if the 2nd row is [np.nan, |
1254 | + # np.nan] then the orientation indicates that no output axes of an |
1255 | + # affine with this orientation matches the 2nd input coordinate |
1256 | + # axis. This would happen if the 2nd row of the affine that |
1257 | + # generated ornt was [0,0,0,*] |
1258 | + axes_kept = np.arange(n)[keep] |
1259 | + m = keep.sum(0) |
1260 | + # the matrix P represents the affine transform implied by ornt. If |
1261 | + # all entries of ornt are not np.nan, then P is square otherwise it |
1262 | + # has more columns than rows indicating some coordinates were |
1263 | + # dropped |
1264 | + P = np.zeros((m+1,n+1)) |
1265 | + P[-1,-1] = 1 |
1266 | + for idx in range(m): |
1267 | + axs, flip = ornt[axes_kept[idx]] |
1268 | + P[idx, axs] = flip |
1269 | + return P |
1270 | + |
1271 | + |
1272 | +def apply_orientation(arr, ornt): |
1273 | + ''' Apply transformations implied by `ornt` to the first |
1274 | + n axes of the array `arr` |
1275 | + |
1276 | + Parameters |
1277 | + ---------- |
1278 | + arr : array-like of data with ndim >= n |
1279 | + ornt : (n,2) orientation array |
1280 | + orientation transform. ``ornt[N,1]`` is flip of axis N of the |
1281 | + array implied by `shape`, where 1 means no flip and -1 means |
1282 | + flip. For example, if ``N==0`` and ``ornt[0,1] == -1``, and |
1283 | + there's an array ``arr`` of shape `shape`, the flip would |
1284 | + correspond to the effect of ``np.flipud(arr)``. ``ornt[:,0]`` is |
1285 | + the transpose that needs to be done to the implied array, as in |
1286 | + ``arr.transpose(ornt[:,0])`` |
1287 | + |
1288 | + Returns |
1289 | + ------- |
1290 | + t_arr : ndarray |
1291 | + data array `arr` transformed according to ornt |
1292 | + ''' |
1293 | + t_arr = np.asarray(arr) |
1294 | + ornt = np.asarray(ornt) |
1295 | + n = ornt.shape[0] |
1296 | + if t_arr.ndim < n: |
1297 | + raise OrientationError('Data array has fewer dimensions than ' |
1298 | + 'orientation') |
1299 | + # no coordinates can be dropped for applying the orientations |
1300 | + if np.any(np.isnan(ornt[:,0])): |
1301 | + raise OrientationError('Cannot drop coordinates when ' |
1302 | + 'applying orientation to data') |
1303 | + shape = t_arr.shape |
1304 | + # apply ornt transformations |
1305 | + for ax, flip in enumerate(ornt[:,1]): |
1306 | + if flip == -1: |
1307 | + t_arr = flip_axis(t_arr, axis=ax) |
1308 | + full_transpose = np.arange(t_arr.ndim) |
1309 | + # ornt indicates the transpose that has occurred - we reverse it |
1310 | + full_transpose[:n] = np.argsort(ornt[:,0]) |
1311 | + t_arr = t_arr.transpose(full_transpose) |
1312 | + return t_arr |
1313 | + |
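The flip-then-untranspose logic of ``apply_orientation`` can be boiled down to a self-contained sketch (``np.flip`` stands in for the module's ``flip_axis``; the function name here is illustrative):

```python
import numpy as np

def apply_orientation_sketch(arr, ornt):
    """Flip axes marked -1 in ornt[:,1], then undo the transpose in ornt[:,0]."""
    t_arr = np.asarray(arr)
    ornt = np.asarray(ornt)
    n = ornt.shape[0]
    # flip each axis whose second column is -1
    for ax, flip in enumerate(ornt[:, 1]):
        if flip == -1:
            t_arr = np.flip(t_arr, axis=ax)
    # ornt[:,0] records the transpose that has occurred; argsort reverses it
    full_transpose = np.arange(t_arr.ndim)
    full_transpose[:n] = np.argsort(ornt[:, 0])
    return t_arr.transpose(full_transpose)
```

With ``ornt = [[0,-1],[1,1]]`` this reduces to ``np.flipud``; with ``[[1,1],[0,1]]`` it reduces to a plain transpose.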
1314 | + |
1315 | +def orientation_affine(ornt, shape): |
1316 | + ''' Affine transform resulting from transforms implied in `ornt` |
1317 | + |
1318 | + Imagine you have an array ``arr`` of shape `shape`, and you apply the |
1319 | + transforms implied by `ornt` (more below), to get ``tarr``. |
1320 | + ``tarr`` may have a different shape ``shape_prime``. This routine |
1321 | + returns the affine that will take an array coordinate for ``tarr``
1322 | + and give you the corresponding array coordinate in ``arr``. |
1323 | + |
1324 | + Parameters |
1325 | + ---------- |
1326 | + ornt : (n,2) ndarray |
1327 | + orientation transform. ``ornt[N,1]`` is flip of axis N of the
1328 | + array implied by `shape`, where 1 means no flip and -1 means |
1329 | + flip. For example, if ``N==0`` and ``ornt[0,1] == -1``, and |
1330 | + there's an array ``arr`` of shape `shape`, the flip would |
1331 | + correspond to the effect of ``np.flipud(arr)``. ``ornt[:,0]`` is |
1332 | + the transpose that needs to be done to the implied array, as in |
1333 | + ``arr.transpose(ornt[:,0])`` |
1334 | + |
1335 | + shape : length n sequence |
1336 | + shape of array you may transform with `ornt` |
1337 | + |
1338 | + Returns |
1339 | + ------- |
1340 | + transformed_affine : (n+1,n+1) ndarray |
1341 | + An array ``arr`` (shape `shape`) might be transformed according |
1342 | + to `ornt`, resulting in a transformed array ``tarr``. |
1343 | + `transformed_affine` is the transform that takes you from array |
1344 | + coordinates in ``tarr`` to array coordinates in ``arr``. |
1345 | + ''' |
1346 | + ornt = np.asarray(ornt) |
1347 | + n = ornt.shape[0] |
1348 | + shape = np.array(shape)[:n] |
1349 | + # ornt implies a flip, followed by a transpose. We need the affine |
1350 | + # that inverts these. Thus we need the affine that first undoes the |
1351 | + # effect of the transpose, then undoes the effects of the flip. |
1352 | + # ornt indicates the transpose that has occurred to get the current |
1353 | + # ordering, relative to canonical, so we just use that |
1354 | + undo_reorder = np.eye(n+1)[list(ornt[:,0]) + [n],:] |
1355 | + undo_flip = np.diag(list(ornt[:,1]) + [1.0]) |
1356 | + center_trans = -(shape-1) / 2.0 |
1357 | + undo_flip[:n,n] = (ornt[:,1] * center_trans) - center_trans |
1358 | + return np.dot(undo_flip, undo_reorder) |
1359 | + |
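The inverse-affine construction above (undo the transpose, then undo the flips about the array center) can be sketched on its own; ``orientation_affine_sketch`` is an illustrative stand-in assuming an integer-valued ``ornt``:

```python
import numpy as np

def orientation_affine_sketch(ornt, shape):
    """Affine taking array coordinates in the transformed array back to the original."""
    ornt = np.asarray(ornt)
    n = ornt.shape[0]
    shape = np.array(shape)[:n]
    # undo the transpose recorded in ornt[:,0] ...
    undo_reorder = np.eye(n + 1)[list(ornt[:, 0]) + [n], :]
    # ... then undo the flips, with a translation so each flip is about the center
    undo_flip = np.diag(list(ornt[:, 1]) + [1.0])
    center = -(shape - 1) / 2.0
    undo_flip[:n, n] = (ornt[:, 1] * center) - center
    return np.dot(undo_flip, undo_reorder)
```

For a single flip on an axis of length 2, the affine maps transformed index 0 back to original index 1, as expected.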
1360 | + |
1361 | +def flip_axis(arr, axis=0): |
1362 | + ''' Flip contents of `axis` in array `arr` |
1363 | + |
1364 | + ``flip_axis`` is the same transform as ``np.flipud``, but for any |
1365 | + axis. For example ``flip_axis(arr, axis=0)`` is the same transform |
1366 | + as ``np.flipud(arr)``, and ``flip_axis(arr, axis=1)`` is the same |
1367 | + transform as ``np.fliplr(arr)`` |
1368 | + |
1369 | + Parameters |
1370 | + ---------- |
1371 | + arr : array-like |
1372 | + axis : int, optional |
1373 | + axis to flip. Default `axis` == 0 |
1374 | + |
1375 | + Returns |
1376 | + ------- |
1377 | + farr : array |
1378 | + Array with axis `axis` flipped |
1379 | + |
1380 | + Examples |
1381 | + -------- |
1382 | + >>> a = np.arange(6).reshape((2,3)) |
1383 | + >>> a |
1384 | + array([[0, 1, 2], |
1385 | + [3, 4, 5]]) |
1386 | + >>> flip_axis(a, axis=0) |
1387 | + array([[3, 4, 5], |
1388 | + [0, 1, 2]]) |
1389 | + >>> flip_axis(a, axis=1) |
1390 | + array([[2, 1, 0], |
1391 | + [5, 4, 3]]) |
1392 | + ''' |
1393 | + arr = np.asanyarray(arr) |
1394 | + arr = arr.swapaxes(0, axis) |
1395 | + arr = np.flipud(arr) |
1396 | + return arr.swapaxes(axis, 0) |
1397 | + |
1398 | + |
1399 | |
1400 | === modified file 'nipy/io/imageformats/spatialimages.py' |
1401 | --- nipy/io/imageformats/spatialimages.py 2009-12-12 19:52:25 +0000 |
1402 | +++ nipy/io/imageformats/spatialimages.py 2010-01-21 22:28:12 +0000 |
1403 | @@ -6,57 +6,79 @@ |
1404 | that is specific to the image format - and ``extra`` - a dictionary |
1405 | container for any other metadata. |
1406 | |
1407 | -It has attributes:: |
1408 | +It has attributes: |
1409 | |
1410 | - extra |
1411 | - filename (read only) |
1412 | + * extra |
1413 | |
1414 | -and methods:: |
1415 | - |
1416 | - .get_data() |
1417 | - .get_affine() |
1418 | - .get_header() |
1419 | - .to_files() # writes image out to passed or |
1420 | - .get_raw_data() |
1421 | - .write_data(fileobj) |
1422 | - .write_raw_data(fileobj) |
1423 | +methods: |
1424 | + |
1425 | + * .get_data() |
1426 | + * .get_affine() |
1427 | + * .get_header() |
1428 | + * .get_shape() |
1429 | + * .set_shape(shape) |
1430 | + * .to_filename(fname) - writes data to filename(s) derived from |
1431 | + ``fname``, where the derivation may differ between formats. |
1432 | + * .to_files() - save image to files with which the image is already
1433 | + associated. Or ``img.to_files(files)`` saves to the files passed. |
1434 | + |
1435 | +classmethods: |
1436 | + |
1437 | + * from_filename(fname) - make instance by loading from filename |
1438 | + * instance_to_filename(img, fname) - save ``img`` instance to |
1439 | + filename ``fname``. |
1440 | |
1441 | There are several ways of writing data. |
1442 | ======================================= |
1443 | |
1444 | There is the usual way, which is the default:: |
1445 | |
1446 | - img.write_data(data, fileobj) |
1447 | + img.to_filename(fname) |
1448 | |
1449 | -and that is, to take the data array, ``data``, and cast it to the |
1450 | -datatype the header expects, setting any available header scaling |
1451 | +and that is, to take the data encapsulated by the image and cast it to |
1452 | +the datatype the header expects, setting any available header scaling |
1453 | into the header to help the data match. |
1454 | |
1455 | +You can load the data into an image from file with:: |
1456 | + |
1457 | + img.from_filename(fname) |
1458 | + |
1459 | +The image stores its associated files in a rather secretive way. In |
1460 | +order to just save an image, for which you know there is an associated |
1461 | +filename, or other storage, you can do:: |
1462 | + |
1463 | + img.to_files() |
1464 | + |
1465 | +Alternatively, you can pass the needed files yourself as an argument to
1466 | +this method.
1467 | + |
1468 | You can get the data out again with::
1469 | |
1470 | img.get_data(fileobj) |
1471 | |
1472 | -Less commonly, you might want to fetch out the unscaled array via |
1473 | -the header:: |
1474 | - |
1475 | - unscaled_data = img.get_raw_data(fileobj) |
1476 | - |
1477 | -then do something with it. Then put it back again:: |
1478 | - |
1479 | - img.write_raw_data(modifed_unscaled_data, fileobj) |
1480 | +Less commonly, for some image types that support it, you might want to |
1481 | +fetch out the unscaled array via the header:: |
1482 | + |
1483 | + unscaled_data = img.get_unscaled_data() |
1484 | + |
1485 | +Analyze-type images (including nifti) support this, but others may not |
1486 | +(MINC, for example). |
1487 | |
1488 | Sometimes you might want to avoid any loss of precision by making the
1489 | data type the same as the input:: |
1490 | |
1491 | hdr = img.get_header() |
1492 | hdr.set_data_dtype(data.dtype) |
1493 | - img.write_data(data, fileobj) |
1494 | + img.to_filename(fname) |
1495 | |
1496 | ''' |
1497 | |
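The save API described in the docstring above can be illustrated with a toy stand-in class (all names, roles, and extensions here are made up for illustration; real formats derive their own header/image filenames): ``to_filename`` parses the filename with ``filespec_to_files`` and delegates to ``to_files``.

```python
import numpy as np

class ToySpatialImage:
    """Toy stand-in showing how to_filename delegates to filespec_to_files."""
    def __init__(self, data, affine):
        self._data = np.asarray(data)
        self._affine = np.asarray(affine)
        self.saved = {}  # stands in for real files on disk

    def get_data(self):
        return self._data

    def get_affine(self):
        return self._affine

    @staticmethod
    def filespec_to_files(filespec):
        # a real format works out header and image names from one filename
        return {'image': filespec + '.img', 'header': filespec + '.hdr'}

    def to_files(self, files):
        # record where each part of the image would be written
        for role, name in files.items():
            self.saved[role] = name

    def to_filename(self, filename):
        self.to_files(self.filespec_to_files(filename))
```

Calling ``img.to_filename('vol')`` on this toy class records an image and a header destination, mirroring the two-step flow the docstring describes.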
1498 | +import warnings |
1499 | + |
1500 | + |
1501 | class SpatialImage(object): |
1502 | _header_maker = dict |
1503 | - ''' Template class for lightweight image ''' |
1504 | + ''' Template class for images ''' |
1505 | def __init__(self, data, affine, header=None, extra=None): |
1506 | if extra is None: |
1507 | extra = {} |
1508 | @@ -112,8 +134,17 @@ |
1509 | self.extra[key] = value |
1510 | |
1511 | @classmethod |
1512 | - def from_filespec(klass, filespec): |
1513 | - raise NotImplementedError |
1514 | + def from_filename(klass, filename): |
1515 | + files = klass.filespec_to_files(filename) |
1516 | + return klass.from_files(files) |
1517 | + |
1518 | + @classmethod |
1519 | + def from_filespec(klass, filespec):
1520 | + warnings.warn('``from_filespec`` class method is deprecated\n'
1521 | + 'Please use the ``from_filename`` class method '
1522 | + 'instead',
1523 | + DeprecationWarning)
1524 | + return klass.from_filename(filespec)
1525 | |
1526 | def from_files(klass, files): |
1527 | raise NotImplementedError |
1528 | @@ -125,8 +156,82 @@ |
1529 | def filespec_to_files(filespec): |
1530 | raise NotImplementedError |
1531 | |
1532 | - def to_filespec(self, filespec): |
1533 | - raise NotImplementedError |
1534 | + def to_filename(self, filename): |
1535 | + ''' Write image to files implied by filename string |
1536 | + |
1537 | + Parameters
1538 | + ----------
1539 | + filename : str |
1540 | + filename to which to save image. We will parse `filename` |
1541 | + with ``filespec_to_files`` to work out names for image, |
1542 | + header etc. |
1543 | + |
1544 | + Returns |
1545 | + ------- |
1546 | + None |
1547 | + ''' |
1548 | + files = self.filespec_to_files(filename) |
1549 | + self.to_files(files) |
1550 | + |
1551 | + def to_filespec(self, filename): |
1552 | + warnings.warn('``to_filespec`` is deprecated, please ' |
1553 | + 'use ``to_filename`` instead', |
1554 | + DeprecationWarning) |
1555 | + self.to_filename(filename) |
1556 | |
1557 | def to_files(self, files=None): |
1558 | raise NotImplementedError |
1559 | + |
1560 | + @classmethod |
1561 | + def load(klass, filename): |
1562 | + return klass.from_filename(filename) |
1563 | + |
1564 | + @classmethod |
1565 | + def save(klass, img, filename): |
1566 | + warnings.warn('``save`` class method is deprecated\n' |
1567 | + 'You probably want the ``to_filename`` instance ' |
1568 | + 'method, or the module-level ``save`` function', |
1569 | + DeprecationWarning) |
1570 | + klass.instance_to_filename(img, filename) |
1571 | + |
1572 | + @classmethod |
1573 | + def instance_to_filename(klass, img, filename): |
1574 | + ''' Save `img` in our own format, to name implied by `filename` |
1575 | + |
1576 | + This is a class method |
1577 | + |
1578 | + Parameters |
1579 | + ---------- |
1580 | + img : ``spatialimage`` instance |
1581 | + In fact, an object with the API of ``spatialimage`` - |
1582 | + specifically ``get_data``, ``get_affine``, ``get_header`` and |
1583 | + ``extra``. |
1584 | + filename : str |
1585 | + Filename, implying name to which to save image. |
1586 | + ''' |
1587 | + img = klass.from_image(img) |
1588 | + img.to_filename(filename) |
1589 | + |
1590 | + @classmethod |
1591 | + def from_image(klass, img): |
1592 | + ''' Create new instance of own class from `img` |
1593 | + |
1594 | + This is a class method |
1595 | + |
1596 | + Parameters |
1597 | + ---------- |
1598 | + img : ``spatialimage`` instance |
1599 | + In fact, an object with the API of ``spatialimage`` - |
1600 | + specifically ``get_data``, ``get_affine``, ``get_header`` and |
1601 | + ``extra``. |
1602 | + |
1603 | + Returns |
1604 | + ------- |
1605 | + cimg : ``spatialimage`` instance |
1606 | + Image, of our own class |
1607 | + ''' |
1608 | + return klass(img.get_data(), |
1609 | + img.get_affine(), |
1610 | + img.get_header(), |
1611 | + img.extra) |
1612 | + |
1613 | |
1614 | === modified file 'nipy/io/imageformats/spm2analyze.py' |
1615 | --- nipy/io/imageformats/spm2analyze.py 2009-12-12 19:52:25 +0000 |
1616 | +++ nipy/io/imageformats/spm2analyze.py 2010-01-21 22:28:13 +0000 |
1617 | @@ -125,4 +125,4 @@ |
1618 | |
1619 | |
1620 | load = Spm2AnalyzeImage.load |
1621 | -save = Spm2AnalyzeImage.save |
1622 | +save = Spm2AnalyzeImage.instance_to_filename |
1623 | |
1624 | === modified file 'nipy/io/imageformats/spm99analyze.py' |
1625 | --- nipy/io/imageformats/spm99analyze.py 2009-12-12 19:52:25 +0000 |
1626 | +++ nipy/io/imageformats/spm99analyze.py 2010-01-21 22:28:13 +0000 |
1627 | @@ -214,8 +214,8 @@ |
1628 | class Spm99AnalyzeImage(analyze.AnalyzeImage): |
1629 | _header_maker = Spm99AnalyzeHeader |
1630 | @classmethod |
1631 | - def from_filespec(klass, filespec): |
1632 | - ret = super(Spm99AnalyzeImage, klass).from_filespec(filespec) |
1633 | + def from_filename(klass, filename): |
1634 | + ret = super(Spm99AnalyzeImage, klass).from_filename(filename) |
1635 | import scipy.io as sio |
1636 | matf = ret._files['mat'] |
1637 | try: |
1638 | @@ -274,4 +274,4 @@ |
1639 | |
1640 | |
1641 | load = Spm99AnalyzeImage.load |
1642 | -save = Spm99AnalyzeImage.save |
1643 | +save = Spm99AnalyzeImage.instance_to_filename |
1644 | |
1645 | === modified file 'nipy/io/imageformats/testing/__init__.py' |
1646 | --- nipy/io/imageformats/testing/__init__.py 2009-06-05 03:13:27 +0000 |
1647 | +++ nipy/io/imageformats/testing/__init__.py 2010-01-21 22:28:13 +0000 |
1648 | @@ -1,15 +1,18 @@ |
1649 | -''' Utilities to change context for nose.tools tests ''' |
1650 | - |
1651 | -from os.path import join as pjoin, split as psplit, abspath |
1652 | - |
1653 | -import nose.tools as nt |
1654 | - |
1655 | -example_data_path = abspath(pjoin(psplit(__file__)[0], '..', 'tests', 'data')) |
1656 | - |
1657 | -assert_equal = lambda x, y: nt.assert_equal(x, y) |
1658 | -assert_not_equal = lambda x, y: nt.assert_not_equal(x, y) |
1659 | -assert_true = lambda x: nt.assert_true(x) |
1660 | -assert_false = lambda x: nt.assert_false(x) |
1661 | - |
1662 | -def assert_raises(error, func, *args, **kwargs): |
1663 | - return nt.assert_raises(error, func, *args, **kwargs) |
1664 | +''' Utilities for testing ''' |
1665 | + |
1666 | +# Allow failed import of nose if not now running tests |
1667 | +try: |
1668 | + import nose.tools as nt |
1669 | +except ImportError: |
1670 | + pass |
1671 | +else: |
1672 | + from lightunit import ParametricTestCase, parametric |
1673 | + # wrappers to change context for nose.tools tests
1674 | + assert_equal = lambda x, y: nt.assert_equal(x, y) |
1675 | + assert_not_equal = lambda x, y: nt.assert_not_equal(x, y) |
1676 | + assert_true = lambda x: nt.assert_true(x) |
1677 | + assert_false = lambda x: nt.assert_false(x) |
1678 | + |
1679 | + def assert_raises(error, func, *args, **kwargs): |
1680 | + return nt.assert_raises(error, func, *args, **kwargs) |
1681 | + |
1682 | |
1683 | === added file 'nipy/io/imageformats/testing/_paramtestpy2.py' |
1684 | --- nipy/io/imageformats/testing/_paramtestpy2.py 1970-01-01 00:00:00 +0000 |
1685 | +++ nipy/io/imageformats/testing/_paramtestpy2.py 2010-01-21 22:28:13 +0000 |
1686 | @@ -0,0 +1,89 @@ |
1687 | +"""Implementation of the parametric test support for Python 2.x |
1688 | +""" |
1689 | +#----------------------------------------------------------------------------- |
1690 | +# Imports |
1691 | +#----------------------------------------------------------------------------- |
1692 | + |
1693 | +# Stdlib |
1694 | +import unittest |
1695 | +from compiler.consts import CO_GENERATOR |
1696 | + |
1697 | +#----------------------------------------------------------------------------- |
1698 | +# Classes and functions |
1699 | +#----------------------------------------------------------------------------- |
1700 | + |
1701 | +def isgenerator(func): |
1702 | + try: |
1703 | + return func.func_code.co_flags & CO_GENERATOR != 0 |
1704 | + except AttributeError: |
1705 | + return False |
1706 | + |
1707 | +class ParametricTestCase(unittest.TestCase): |
1708 | + """Write parametric tests in normal unittest testcase form. |
1709 | + |
1710 | + Limitations: the last iteration misses printing out a newline when running |
1711 | + in verbose mode. |
1712 | + """ |
1713 | + def run_parametric(self, result, testMethod): |
1714 | + # But if we have a test generator, we iterate it ourselves |
1715 | + testgen = testMethod() |
1716 | + while True: |
1717 | + try: |
1718 | + # Initialize test |
1719 | + result.startTest(self) |
1720 | + |
1721 | + # SetUp |
1722 | + try: |
1723 | + self.setUp() |
1724 | + except KeyboardInterrupt: |
1725 | + raise |
1726 | + except: |
1727 | + result.addError(self, self._exc_info()) |
1728 | + return |
1729 | + # Test execution |
1730 | + ok = False |
1731 | + try: |
1732 | + testgen.next() |
1733 | + ok = True |
1734 | + except StopIteration: |
1735 | + # We stop the loop |
1736 | + break |
1737 | + except self.failureException: |
1738 | + result.addFailure(self, self._exc_info()) |
1739 | + except KeyboardInterrupt: |
1740 | + raise |
1741 | + except: |
1742 | + result.addError(self, self._exc_info()) |
1743 | + # TearDown |
1744 | + try: |
1745 | + self.tearDown() |
1746 | + except KeyboardInterrupt: |
1747 | + raise |
1748 | + except: |
1749 | + result.addError(self, self._exc_info()) |
1750 | + ok = False |
1751 | + if ok: result.addSuccess(self) |
1752 | + |
1753 | + finally: |
1754 | + result.stopTest(self) |
1755 | + |
1756 | + def run(self, result=None): |
1757 | + if result is None: |
1758 | + result = self.defaultTestResult() |
1759 | + testMethod = getattr(self, self._testMethodName) |
1760 | + # For normal tests, we just call the base class and return that |
1761 | + if isgenerator(testMethod): |
1762 | + return self.run_parametric(result, testMethod) |
1763 | + else: |
1764 | + return super(ParametricTestCase, self).run(result) |
1765 | + |
1766 | + |
1767 | +def parametric(func): |
1768 | + """Decorator to make a simple function into a normal test via unittest.""" |
1769 | + |
1770 | + class Tester(ParametricTestCase): |
1771 | + test = staticmethod(func) |
1772 | + |
1773 | + Tester.__name__ = func.__name__ |
1774 | + |
1775 | + return Tester |
1776 | |
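The generator-iteration idea in ``run_parametric`` above reduces to a small sketch: each ``yield`` from a test generator is one parametric case, and an assertion failure is recorded rather than aborting the runner. ``run_generator_test`` is an illustrative helper, not part of the module.

```python
def run_generator_test(gen_func):
    """Drive a generator-style test, recording one result per yielded step."""
    results = []
    gen = gen_func()
    while True:
        try:
            next(gen)
            results.append('ok')
        except StopIteration:
            # generator exhausted (or dead after a failure): we stop the loop
            break
        except AssertionError:
            results.append('fail')
    return results

def my_parametric_test():
    # three parametric cases, one per yield
    for x in (1, 2, 3):
        assert x > 0
        yield
```

A failed assertion kills the generator, so subsequent cases are skipped — the same limitation nose's own generator tests have.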
1777 | === added file 'nipy/io/imageformats/testing/_paramtestpy3.py' |
1778 | --- nipy/io/imageformats/testing/_paramtestpy3.py 1970-01-01 00:00:00 +0000 |
1779 | +++ nipy/io/imageformats/testing/_paramtestpy3.py 2010-01-21 22:28:13 +0000 |
1780 | @@ -0,0 +1,62 @@ |
1781 | +"""Implementation of the parametric test support for Python 3.x. |
1782 | + |
1783 | +Thanks for the py3 version to Robert Collins, from the Testing in Python |
1784 | +mailing list. |
1785 | +""" |
1786 | +#----------------------------------------------------------------------------- |
1787 | +# Imports |
1788 | +#----------------------------------------------------------------------------- |
1789 | + |
1790 | +# Stdlib |
1791 | +import unittest |
1792 | +from unittest import TestSuite |
1793 | + |
1794 | +#----------------------------------------------------------------------------- |
1795 | +# Classes and functions |
1796 | +#----------------------------------------------------------------------------- |
1797 | + |
1798 | + |
1799 | +def isgenerator(func): |
1800 | + return hasattr(func,'_generator') |
1801 | + |
1802 | + |
1803 | +class IterCallableSuite(TestSuite): |
1804 | + def __init__(self, iterator, adapter): |
1805 | + self._iter = iterator |
1806 | + self._adapter = adapter |
1807 | + def __iter__(self): |
1808 | + yield self._adapter(self._iter.__next__) |
1809 | + |
1810 | +class ParametricTestCase(unittest.TestCase): |
1811 | + """Write parametric tests in normal unittest testcase form. |
1812 | + |
1813 | + Limitations: the last iteration misses printing out a newline when |
1814 | + running in verbose mode. |
1815 | + """ |
1816 | + |
1817 | + def run(self, result=None): |
1818 | + testMethod = getattr(self, self._testMethodName) |
1819 | + # For normal tests, we just call the base class and return that |
1820 | + if isgenerator(testMethod): |
1821 | + def adapter(next_test): |
1822 | + return unittest.FunctionTestCase(next_test, |
1823 | + self.setUp, |
1824 | + self.tearDown) |
1825 | + |
1826 | + return IterCallableSuite(testMethod(),adapter).run(result) |
1827 | + else: |
1828 | + return super(ParametricTestCase, self).run(result) |
1829 | + |
1830 | + |
1831 | +def parametric(func): |
1832 | + """Decorator to make a simple function into a normal test via unittest."""
1834 | + # Hack, until I figure out how to write isgenerator() for python3!! |
1835 | + func._generator = True |
1836 | + |
1837 | + class Tester(ParametricTestCase): |
1838 | + test = staticmethod(func) |
1839 | + |
1840 | + Tester.__name__ = func.__name__ |
1841 | + |
1842 | + return Tester |
1843 | |
1844 | === added file 'nipy/io/imageformats/testing/lightunit.py' |
1845 | --- nipy/io/imageformats/testing/lightunit.py 1970-01-01 00:00:00 +0000 |
1846 | +++ nipy/io/imageformats/testing/lightunit.py 2010-01-21 22:28:13 +0000 |
1847 | @@ -0,0 +1,55 @@ |
1848 | +"""Lightweight testing that remains unittest-compatible. |
1849 | + |
1850 | +This module exposes decorators and classes to make lightweight testing possible |
1851 | +in a manner similar to what nose allows, where standalone functions can be |
1852 | +tests. It also provides parametric test support that is vastly easier to use |
1853 | +than nose's for debugging, because if a test fails, the stack under inspection |
1854 | +is that of the test and not that of the test framework. |
1855 | + |
1856 | +- An @as_unittest decorator can be used to tag any normal parameter-less |
1857 | + function as a unittest TestCase. Then, both nose and normal unittest will |
1858 | + recognize it as such. |
1859 | + |
1860 | +Authors |
1861 | +------- |
1862 | + |
1863 | +- Fernando Perez <Fernando.Perez@berkeley.edu> |
1864 | +""" |
1865 | + |
1866 | +#----------------------------------------------------------------------------- |
1867 | +# Copyright (C) 2009 The IPython Development Team |
1868 | +# |
1869 | +# Distributed under the terms of the BSD License. The full license is in |
1870 | +# the file COPYING, distributed as part of this software. |
1871 | +#----------------------------------------------------------------------------- |
1872 | + |
1873 | +#----------------------------------------------------------------------------- |
1874 | +# Imports |
1875 | +#----------------------------------------------------------------------------- |
1876 | + |
1877 | +# Stdlib |
1878 | +import sys |
1879 | +import unittest |
1880 | + |
1881 | +# Our own |
1882 | +import nosepatch |
1883 | + |
1884 | +if sys.version_info[0] == 2:
1885 | + from _paramtestpy2 import ParametricTestCase, parametric |
1886 | +else: |
1887 | + from _paramtestpy3 import ParametricTestCase, parametric |
1888 | + |
1889 | +#----------------------------------------------------------------------------- |
1890 | +# Classes and functions |
1891 | +#----------------------------------------------------------------------------- |
1892 | + |
1893 | +# Simple example of the basic idea |
1894 | +def as_unittest(func): |
1895 | + """Decorator to make a simple function into a normal test via unittest.""" |
1896 | + class Tester(unittest.TestCase): |
1897 | + def test(self): |
1898 | + func() |
1899 | + |
1900 | + Tester.__name__ = func.__name__ |
1901 | + |
1902 | + return Tester |
1903 | |
1904 | === added file 'nipy/io/imageformats/testing/nosepatch.py' |
1905 | --- nipy/io/imageformats/testing/nosepatch.py 1970-01-01 00:00:00 +0000 |
1906 | +++ nipy/io/imageformats/testing/nosepatch.py 2010-01-21 22:28:13 +0000 |
1907 | @@ -0,0 +1,53 @@ |
1908 | +"""Monkeypatch nose to accept any callable as a method. |
1909 | + |
1910 | +By default, nose's ismethod() fails for static methods. |
1911 | +Once this is fixed in upstream nose we can disable it. |
1912 | + |
1913 | +Note: merely importing this module causes the monkeypatch to be applied.""" |
1914 | + |
1915 | +import unittest |
1916 | +import nose.loader |
1917 | +from inspect import ismethod, isfunction |
1918 | + |
1919 | +def getTestCaseNames(self, testCaseClass): |
1920 | + """Override to select with selector, unless |
1921 | + config.getTestCaseNamesCompat is True |
1922 | + """ |
1923 | + if self.config.getTestCaseNamesCompat: |
1924 | + return unittest.TestLoader.getTestCaseNames(self, testCaseClass) |
1925 | + |
1926 | + def wanted(attr, cls=testCaseClass, sel=self.selector): |
1927 | + item = getattr(cls, attr, None) |
1928 | + # MONKEYPATCH: replace this: |
1929 | + #if not ismethod(item): |
1930 | + # return False |
1931 | + # return sel.wantMethod(item) |
1932 | + # With: |
1933 | + if ismethod(item): |
1934 | + return sel.wantMethod(item) |
1935 | + # static method or something. If this is a static method, we |
1936 | + # can't get the class information, and we have to treat it |
1937 | + # as a function. Thus, we will miss things like class |
1938 | + # attributes for test selection |
1939 | + if isfunction(item): |
1940 | + return sel.wantFunction(item) |
1941 | + return False |
1942 | + # END MONKEYPATCH |
1943 | + |
1944 | + cases = filter(wanted, dir(testCaseClass)) |
1945 | + for base in testCaseClass.__bases__: |
1946 | + for case in self.getTestCaseNames(base): |
1947 | + if case not in cases: |
1948 | + cases.append(case) |
1949 | + # add runTest if nothing else picked |
1950 | + if not cases and hasattr(testCaseClass, 'runTest'): |
1951 | + cases = ['runTest'] |
1952 | + if self.sortTestMethodsUsing: |
1953 | + cases.sort(self.sortTestMethodsUsing) |
1954 | + return cases |
1955 | + |
1956 | + |
1957 | +########################################################################## |
1958 | +# Apply monkeypatch here |
1959 | +nose.loader.TestLoader.getTestCaseNames = getTestCaseNames |
1960 | +########################################################################## |
1961 | |
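The import-time monkeypatch applied above — rebinding a method on an already-defined class so every later caller sees the new behavior — reduces to this pattern (the ``Loader`` class here is a hypothetical stand-in for ``nose.loader.TestLoader``):

```python
class Loader:
    """Stand-in for the class being patched."""
    def get_test_names(self):
        return ['test_a']

def patched_get_test_names(self):
    # extended selection logic, as the nosepatch above adds for static methods
    return ['test_a', 'test_b']

# merely executing this line applies the patch,
# for all existing and future instances of Loader
Loader.get_test_names = patched_get_test_names
```

Because the rebinding happens at module top level, simply importing the patch module is enough to change behavior everywhere, which is why the docstring warns about it.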
1962 | === modified file 'nipy/io/imageformats/tests/test_funcs.py' |
1963 | --- nipy/io/imageformats/tests/test_funcs.py 2009-12-12 19:52:25 +0000 |
1964 | +++ nipy/io/imageformats/tests/test_funcs.py 2010-01-21 22:28:13 +0000 |
1965 | @@ -6,10 +6,15 @@ |
1966 | |
1967 | import nipy.io.imageformats as nf |
1968 | |
1969 | -from nipy.io.imageformats.funcs import concat_images |
1970 | +from nipy.io.imageformats.funcs import concat_images, as_closest_canonical, OrientationError |
1971 | +from nipy.io.imageformats.nifti1 import Nifti1Image |
1972 | + |
1973 | +from nipy.testing import parametric |
1974 | |
1975 | from numpy.testing import assert_array_equal |
1976 | -from nose.tools import assert_true, assert_equal, assert_raises |
1977 | +from nose.tools import (assert_true, assert_false, |
1978 | + assert_equal, assert_raises) |
1979 | + |
1980 | |
1981 | def test_concat(): |
1982 | shape = (1,2,5) |
1983 | @@ -28,5 +33,28 @@ |
1984 | img2 = nf.Nifti1Image(data1.T, affine) |
1985 | yield assert_raises, ValueError, concat_images, [img0, img2] |
1986 | |
1987 | - |
1988 | - |
1989 | + |
1990 | +@parametric |
1991 | +def test_closest_canonical(): |
1992 | + arr = np.arange(24).reshape((2,3,4,1)) |
1993 | + # no funky stuff, returns same thing |
1994 | + img = Nifti1Image(arr, np.eye(4)) |
1995 | + xyz_img = as_closest_canonical(img) |
1996 | + yield assert_true(img is xyz_img) |
1997 | + # an axis flip
1998 | + img = Nifti1Image(arr, np.diag([-1,1,1,1])) |
1999 | + xyz_img = as_closest_canonical(img) |
2000 | + yield assert_false(img is xyz_img) |
2001 | + out_arr = xyz_img.get_data() |
2002 | + yield assert_array_equal(out_arr, np.flipud(arr)) |
2003 | + # no error for enforce_diag in this case |
2004 | + xyz_img = as_closest_canonical(img, True) |
2005 | + # but there is if the affine is not diagonal |
2006 | + aff = np.eye(4) |
2007 | + aff[0,1] = 0.1 |
2008 | + # although it's more or less canonical already |
2009 | + img = Nifti1Image(arr, aff) |
2010 | + xyz_img = as_closest_canonical(img) |
2011 | + yield assert_true(img is xyz_img) |
2012 | + # it's still not diagonal
2013 | + yield assert_raises(OrientationError, as_closest_canonical, img, True) |
2014 | |
2015 | === modified file 'nipy/io/imageformats/tests/test_nifti1.py' |
2016 | --- nipy/io/imageformats/tests/test_nifti1.py 2009-09-11 01:35:49 +0000 |
2017 | +++ nipy/io/imageformats/tests/test_nifti1.py 2010-01-21 22:28:13 +0000 |
2018 | @@ -270,7 +270,7 @@ |
2019 | for ext in ('.gz', '.bz2'): |
2020 | try: |
2021 | _, fname = tempfile.mkstemp('.nii' + ext) |
2022 | - img.to_filespec(fname) |
2023 | + img.to_filename(fname) |
2024 | img3 = Nifti1Image.load(fname) |
2025 | yield assert_true, isinstance(img3, img.__class__) |
2026 | yield assert_array_equal, img3.get_data(), data |
2027 | |
2028 | === added file 'nipy/io/imageformats/tests/test_orientations.py' |
2029 | --- nipy/io/imageformats/tests/test_orientations.py 1970-01-01 00:00:00 +0000 |
2030 | +++ nipy/io/imageformats/tests/test_orientations.py 2010-01-21 22:28:13 +0000 |
2031 | @@ -0,0 +1,219 @@ |
2032 | +''' Testing for orientations module ''' |
2033 | + |
2034 | +import numpy as np |
2035 | + |
2036 | +from nose.tools import assert_true, assert_equal, assert_raises |
2037 | + |
2038 | +from numpy.testing import assert_array_equal, assert_array_almost_equal |
2039 | + |
2040 | +from nipy.io.imageformats.orientations import (io_orientation, orientation_affine, |
2041 | + flip_axis, _ornt_to_affine, |
2042 | + apply_orientation, OrientationError) |
2043 | + |
2044 | +from nipy.io.imageformats.testing import parametric |
2045 | + |
2046 | + |
2047 | +IN_ARRS = [np.eye(4), |
2048 | + [[0,0,1,0], |
2049 | + [0,1,0,0], |
2050 | + [1,0,0,0], |
2051 | + [0,0,0,1]], |
2052 | + [[0,1,0,0], |
2053 | + [0,0,1,0], |
2054 | + [1,0,0,0], |
2055 | + [0,0,0,1]], |
2056 | + [[3,1,0,0], |
2057 | + [1,3,0,0], |
2058 | + [0,0,1,0], |
2059 | + [0,0,0,1]], |
2060 | + [[1,3,0,0], |
2061 | + [3,1,0,0], |
2062 | + [0,0,1,0], |
2063 | + [0,0,0,1]], |
2064 | + ] |
2065 | + |
2066 | +OUT_ORNTS = [[[0,1], |
2067 | + [1,1], |
2068 | + [2,1]], |
2069 | + [[2,1], |
2070 | + [1,1], |
2071 | + [0,1]], |
2072 | + [[2,1], |
2073 | + [0,1], |
2074 | + [1,1]], |
2075 | + [[0,1], |
2076 | + [1,1], |
2077 | + [2,1]], |
2078 | + [[1,1], |
2079 | + [0,1], |
2080 | + [2,1]], |
2081 | + ] |
2082 | + |
2083 | +IN_ARRS = IN_ARRS + [[[np.cos(np.pi/6+i*np.pi/2),np.sin(np.pi/6+i*np.pi/2),0,0], |
2084 | + [-np.sin(np.pi/6+i*np.pi/2),np.cos(np.pi/6+i*np.pi/2),0,0], |
2085 | + [0,0,1,0], |
2086 | + [0,0,0,1]] for i in range(4)] |
2087 | + |
2088 | +OUT_ORNTS = OUT_ORNTS + [[[0,1], |
2089 | + [1,1], |
2090 | + [2,1]], |
2091 | + [[1,-1], |
2092 | + [0,1], |
2093 | + [2,1]], |
2094 | + [[0,-1], |
2095 | + [1,-1], |
2096 | + [2,1]], |
2097 | + [[1,1], |
2098 | + [0,-1], |
2099 | + [2,1]] |
2100 | + ] |
2101 | + |
2102 | + |
2103 | +IN_ARRS = [np.array(arr) for arr in IN_ARRS] |
2104 | +OUT_ORNTS = [np.array(ornt) for ornt in OUT_ORNTS] |
2105 | + |
2106 | + |
2107 | +def same_transform(taff, ornt, shape): |
2108 | + # Applying transformations implied by `ornt` to a made-up array |
2109 | + # ``arr`` of shape `shape`, results in ``t_arr``. When the point |
2110 | + # indices from ``arr`` are transformed by (the inverse of) `taff`, |
2111 | + # and we index into ``t_arr`` with these transformed points, then we |
2112 | + # should get the same values as we would from indexing into arr with |
2113 | + # the untransformed points. |
2114 | + shape = np.array(shape) |
2115 | + size = np.prod(shape) |
2116 | + arr = np.arange(size).reshape(shape) |
2117 | + # apply ornt transformations |
2118 | + t_arr = apply_orientation(arr, ornt) |
2119 | + # get all point indices in arr |
2120 | + i,j,k = shape |
2121 | + arr_pts = np.mgrid[:i,:j,:k].reshape((3,-1)) |
2122 | + # inverse of taff takes us from point index in arr to point index in |
2123 | + # t_arr |
2124 | + itaff = np.linalg.inv(taff) |
2125 | + # apply itaff so that points indexed in t_arr should correspond |
2126 | + o2t_pts = np.dot(itaff[:3,:3], arr_pts) + itaff[:3,3][:,None] |
2127 | + assert np.allclose(np.round(o2t_pts), o2t_pts) |
2128 | + # fancy index out the t_arr values |
2129 | + vals = t_arr[list(o2t_pts.astype('i'))] |
2130 | + return np.all(vals == arr.ravel()) |
2131 | + |
2132 | + |
2133 | +@parametric |
2134 | +def test_apply(): |
2135 | + # most of the testing happens in ``same_transform`` above, via |
2136 | + # test_io_orientation. |
2137 | + a = np.arange(24).reshape((2,3,4)) |
2138 | + # Test 4D |
2139 | + t_arr = apply_orientation(a[:,:,:,None], ornt) |
2140 | + yield assert_equal(t_arr.ndim, 4) |
2141 | + # Orientation errors |
2142 | + yield assert_raises(OrientationError, |
2143 | + apply_orientation, |
2144 | + a[:,:,1], ornt) |
2145 | + yield assert_raises(OrientationError, |
2146 | + apply_orientation, |
2147 | + a, |
2148 | + [[0,1],[np.nan,np.nan],[2,1]]) |
2149 | + |
2150 | + |
2151 | +@parametric |
2152 | +def test_flip_axis(): |
2153 | + a = np.arange(24).reshape((2,3,4)) |
2154 | + yield assert_array_equal( |
2155 | + flip_axis(a), |
2156 | + np.flipud(a)) |
2157 | + yield assert_array_equal( |
2158 | + flip_axis(a, axis=0), |
2159 | + np.flipud(a)) |
2160 | + yield assert_array_equal( |
2161 | + flip_axis(a, axis=1), |
2162 | + np.fliplr(a)) |
2163 | + # check accepts array-like |
2164 | + yield assert_array_equal( |
2165 | + flip_axis(a.tolist(), axis=0), |
2166 | + np.flipud(a)) |
2167 | + # third dimension |
2168 | + b = a.transpose() |
2169 | + b = np.flipud(b) |
2170 | + b = b.transpose() |
2171 | + yield assert_array_equal(flip_axis(a, axis=2), b) |
2172 | + |
2173 | + |
2174 | +@parametric |
2175 | +def test_io_orientation(): |
2176 | + shape = (2,3,4) |
2177 | + for in_arr, out_ornt in zip(IN_ARRS, OUT_ORNTS): |
2178 | + ornt = io_orientation(in_arr) |
2179 | + yield assert_array_equal(ornt, out_ornt) |
2180 | + taff = orientation_affine(ornt, shape) |
2181 | + yield assert_true(same_transform(taff, ornt, shape)) |
2182 | + for axno in range(3): |
2183 | + arr = in_arr.copy() |
2184 | + ex_ornt = out_ornt.copy() |
2185 | + # flip the input axis in affine |
2186 | + arr[:,axno] *= -1 |
2187 | + # check that result shows flip |
2188 | + ex_ornt[axno, 1] *= -1 |
2189 | + ornt = io_orientation(arr) |
2190 | + yield assert_array_equal(ornt, ex_ornt) |
2191 | + taff = orientation_affine(ornt, shape) |
2192 | + yield assert_true(same_transform(taff, ornt, shape)) |
2193 | + |
2194 | + |
2195 | +def test_drop_coord(): |
2196 | + # given a 5x4 affine from slicing an fmri, |
2197 | + # the orientations code should easily reorder and drop the t |
2198 | + # axis |
2199 | + |
2200 | + # this affine has output coordinates '-y','z','x' and is at t=16 |
2201 | + sliced_fmri_affine = np.array([[0,-1,0,3], |
2202 | + [0,0,2,5], |
2203 | + [3,0,0,4], |
2204 | + [0,0,0,16], |
2205 | + [0,0,0,1]]) |
2206 | + ornt = io_orientation(sliced_fmri_affine) |
2207 | + affine_that_drops_t_reorders_and_flips = _ornt_to_affine(ornt) |
2208 | + final_affine = np.dot(affine_that_drops_t_reorders_and_flips, |
2209 | + sliced_fmri_affine) |
2210 | + # the output will be diagonal |
2211 | + # with the 'y' row having been flipped and the 't' row dropped |
2212 | + assert_array_equal(final_affine, |
2213 | + np.array([[3,0,0,4], |
2214 | + [0,1,0,-3], |
2215 | + [0,0,2,5], |
2216 | + [0,0,0,1]])) |
2217 | + |
2218 | + |
2219 | +@parametric |
2220 | +def test_ornt_to_affine(): |
2221 | + # this orientation indicates that the first output |
2222 | + # axis of the affine is closest to the vector [0,0,-1], |
2223 | + # the last is closest to [1,0,0] and |
2224 | + # that the y coordinate ([0,1,0]) is dropped |
2225 | + ornt = [[2,-1], |
2226 | + [np.nan,np.nan], |
2227 | + [0,1]] |
2228 | + # the reordering/flipping is represented by an affine that |
2229 | + # takes the 3rd output coordinate, maps it to the first |
2230 | + # and flips it, and maps the 1st output coordinate to the second |
2231 | + A = np.array([[0,0,-1,0], |
2232 | + [1,0,0,0], |
2233 | + [0,0,0,1]]) |
2234 | + yield assert_array_equal(A, _ornt_to_affine(ornt)) |
2235 | + # a more complicated example. only the 1st, 3rd and 6th |
2236 | + # coordinates appear in the output |
2237 | + ornt = [[3,-1], |
2238 | + [np.nan,np.nan], |
2239 | + [0,1], |
2240 | + [np.nan,np.nan], |
2241 | + [np.nan,np.nan], |
2242 | + [1,-1]] |
2243 | + B = np.array([[0,0,0,-1,0,0,0], |
2244 | + [1,0,0,0,0,0,0], |
2245 | + [0,-1,0,0,0,0,0], |
2246 | + [0,0,0,0,0,0,1]]) |
2247 | + yield assert_array_equal(B, _ornt_to_affine(ornt)) |
2248 | + |
2249 | + |
2250 | + |
2251 | |
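The orientation convention exercised by the tests above can be sketched in plain NumPy. This is a simplified, hypothetical reimplementation for illustration only (the real `io_orientation` also handles non-axis-aligned, extra-row and degenerate affines); each row of the result records which output axis an input axis maps to, and with which sign:

```python
import numpy as np

def simple_io_orientation(affine):
    # Simplified sketch of an io_orientation-style computation: for each
    # input axis, take the rotation part of the 4x4 affine, find the
    # output axis with the largest absolute weight, and record its sign.
    # (Assumes a well-conditioned, 3D, axis-dominant affine.)
    R = np.asarray(affine)[:3, :3]
    ornt = np.zeros((3, 2), dtype=int)
    for in_ax in range(3):
        col = R[:, in_ax]
        out_ax = int(np.argmax(np.abs(col)))
        ornt[in_ax, 0] = out_ax
        ornt[in_ax, 1] = 1 if col[out_ax] > 0 else -1
    return ornt
```

For the identity affine this gives `[[0,1],[1,1],[2,1]]`, matching the first entry of `OUT_ORNTS`; negating a column of the affine flips the corresponding sign, which is the effect checked by the flip loop in `test_io_orientation`.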
2252 | === modified file 'nipy/io/nifti_ref.py' |
2253 | --- nipy/io/nifti_ref.py 2009-11-25 17:08:03 +0000 |
2254 | +++ nipy/io/nifti_ref.py 2010-01-21 22:28:12 +0000 |
2255 | @@ -49,7 +49,7 @@ |
2256 | |
2257 | FIXME: (finish this thought.... Are we going to open NIFTI files with |
2258 | NIFTI input coordinates?) For this reason, we have to consider whether |
2259 | -we should transpose the memmap from pynifti. |
2260 | +we should transpose the memmap from nifti input. |
2261 | """ |
2262 | import warnings |
2263 | |
2264 | |
2265 | === modified file 'nipy/modalities/fmri/fmristat/tests/test_FIAC.py' |
2266 | --- nipy/modalities/fmri/fmristat/tests/test_FIAC.py 2009-12-03 21:12:30 +0000 |
2267 | +++ nipy/modalities/fmri/fmristat/tests/test_FIAC.py 2010-01-21 22:28:13 +0000 |
2268 | @@ -306,10 +306,9 @@ |
2269 | i = interp1d(t, formula.vectorize(col)(t)) |
2270 | return formula.aliased_function(name, i)(formula.t) |
2271 | |
2272 | + |
2273 | def test_agreement(): |
2274 | - """ |
2275 | - The test: does Protocol manage to recreate the design of fMRIstat? |
2276 | - """ |
2277 | + # The test: does Protocol manage to recreate the design of fMRIstat? |
2278 | for design_type in ['event', 'block']: |
2279 | dd = D[design_type] |
2280 | |
2281 | @@ -319,20 +318,18 @@ |
2282 | yield niptest.assert_true, np.greater(cmax, 0.999) |
2283 | |
2284 | |
2285 | - |
2286 | def test_event_design(): |
2287 | - |
2288 | - block = csv2rec(StringIO(altdescr['block'])) |
2289 | - event = csv2rec(StringIO(altdescr['event'])) |
2290 | - t = np.arange(191)*2.5+1.25 |
2291 | - |
2292 | - bkeep = np.not_equal((np.arange(block.time.shape[0])) % 6, 0) |
2293 | - ekeep = np.greater(np.arange(event.time.shape[0]), 0) |
2294 | - |
2295 | - # Even though there is a FIAC block experiment |
2296 | - # the design is represented as an event design |
2297 | - # with the same event repeated several times in a row... |
2298 | - |
2299 | - Xblock, cblock = design.event_design(block[bkeep], t, hrfs=delay.spectral) |
2300 | - Xevent, cevent = design.event_design(event[ekeep], t, hrfs=delay.spectral) |
2301 | - |
2302 | + block = csv2rec(StringIO(altdescr['block'])) |
2303 | + event = csv2rec(StringIO(altdescr['event'])) |
2304 | + t = np.arange(191)*2.5+1.25 |
2305 | + |
2306 | + bkeep = np.not_equal((np.arange(block.time.shape[0])) % 6, 0) |
2307 | + ekeep = np.greater(np.arange(event.time.shape[0]), 0) |
2308 | + |
2309 | + # Even though there is a FIAC block experiment |
2310 | + # the design is represented as an event design |
2311 | + # with the same event repeated several times in a row... |
2312 | + |
2313 | + Xblock, cblock = design.event_design(block[bkeep], t, hrfs=delay.spectral) |
2314 | + Xevent, cevent = design.event_design(event[ekeep], t, hrfs=delay.spectral) |
2315 | + |
2316 | |
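The comment in the hunk above — a block experiment represented as an event design with the same event repeated several times in a row — can be illustrated with a toy sketch. Everything here is hypothetical (the real `design.event_design` builds its columns from symbolic formulae and the spectral delay HRFs, not this helper):

```python
import numpy as np

def event_regressor(onsets, t, hrf):
    # Hypothetical illustration of what one event-design column amounts
    # to: an HRF shifted to each event onset and summed, sampled at the
    # scan times t. A "block" is then just many closely spaced events.
    return sum(hrf(t - onset) for onset in onsets)

def toy_hrf(t):
    # A toy gamma-like response, zero before the event (not the
    # canonical HRF used by the package).
    t = np.asarray(t, dtype=float)
    return np.where(t > 0, t ** 2 * np.exp(-t), 0.0)

t = np.arange(191) * 2.5 + 1.25   # the sampling grid from the test
col = event_regressor([0.0, 10.0, 20.0], t, toy_hrf)
```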
2317 | === modified file 'nipy/modalities/fmri/pca.py' |
2318 | --- nipy/modalities/fmri/pca.py 2009-12-16 18:35:13 +0000 |
2319 | +++ nipy/modalities/fmri/pca.py 2010-01-21 22:28:13 +0000 |
2320 | @@ -13,7 +13,7 @@ |
2321 | """ |
2322 | |
2323 | import numpy as np |
2324 | -import numpy.linalg as npl |
2325 | +import scipy.linalg as spl |
2326 | from nipy.fixes.scipy.stats.models.utils import pos_recipr |
2327 | |
2328 | |
2329 | @@ -25,36 +25,39 @@ |
2330 | ---------- |
2331 | data : ndarray-like (np.float) |
2332 | The array on which to perform PCA over axis `axis` (below) |
2333 | - axis : int |
2334 | + axis : int, optional |
2335 | The axis over which to perform PCA (axis identifying |
2336 | observations). Default is 0 (first) |
2337 | - mask : ndarray-like (np.bool) |
2338 | + mask : ndarray-like (np.bool), optional |
2339 | An optional mask, should have shape given by data axes, with |
2340 | - `axis` removed, i.e.: ``s = data.shape; s.pop(axis); msk_shape=s`` |
2341 | - ncomp : {None, int} |
2342 | + `axis` removed, i.e.: ``s = data.shape; s.pop(axis); |
2343 | + msk_shape=s`` |
2344 | + ncomp : {None, int}, optional |
2345 | How many component basis projections to return. If ncomp is None |
2346 | (the default) then the number of components is given by the |
2347 | calculated rank of the data, after applying `design_keep`, |
2348 | `design_resid` and `tol_ratio` below. We always return all the |
2349 | basis vectors and percent variance for each component; `ncomp` |
2350 | refers only to the number of basis_projections returned. |
2351 | - standardize : bool |
2352 | - Standardize so each time series has same error-sum-of-squares? |
2353 | - design_keep : None or ndarray |
2354 | + standardize : bool, optional |
2355 | + If True, standardize so each time series (after application of |
2356 | + `design_keep` and `design_resid`) has the same standard |
2357 | + deviation, as calculated by the ``np.std`` function. |
2358 | + design_keep : None or ndarray, optional |
2359 | Data is projected onto the column span of design_keep. |
2360 | None (default) equivalent to ``np.identity(data.shape[axis])`` |
2361 | - design_resid : str or None or ndarray |
2362 | + design_resid : str or None or ndarray, optional |
2363 | After projecting onto the column span of design_keep, data is |
2364 | projected perpendicular to the column span of this matrix. If |
2365 | None, we do no such second projection. If a string 'mean', then |
2366 | the mean of the data is removed, equivalent to passing a column |
2367 | vector matrix of 1s. |
2368 | - tol_ratio : float |
2369 | + tol_ratio : float, optional |
2370 | If ``XZ`` is the vector of singular values of the projection |
2371 | matrix from `design_keep` and `design_resid`, and S are the |
2372 | singular values of ``XZ``, then `tol_ratio` is the value used to |
2373 | - calculate the effective rank of the projection, as in ``rank = ((S |
2374 | - / S.max) > tol_ratio).sum()`` |
2375 | + calculate the effective rank of the projection of the design, as |
2376 | + in ``rank = ((S / S.max) > tol_ratio).sum()`` |
2377 | |
2378 | Returns |
2379 | ------- |
2380 | @@ -65,23 +68,24 @@ |
2381 | |
2382 | `results` has keys: |
2383 | |
2384 | - * ``basis_vectors``: time series, shape (data.shape[axis], L) |
2385 | + * ``basis_vectors``: series over `axis`, shape (data.shape[axis], L) - |
2386 | + the eigenvectors of the PCA |
2387 | * ``pcnt_var``: percent variance explained by component, shape |
2388 | (L,) |
2389 | - * ``basis_projections``: PCA components, with components varying over axis |
2390 | - `axis`; thus shape given by: ``s = list(data.shape); s[axis] = |
2391 | - ncomp`` |
2392 | + * ``basis_projections``: PCA components, with components varying |
2393 | + over axis `axis`; thus shape given by: ``s = list(data.shape); |
2394 | + s[axis] = ncomp`` |
2395 | * ``axis``: axis over which PCA has been performed. |
2396 | |
2397 | - Notes |
2398 | - ----- |
2399 | - See ``pca_image.m`` from fmristat for Keith Worsley's code on which |
2400 | - some of this is based. |
2401 | + Notes |
2402 | + ----- |
2403 | + See ``pca_image.m`` from ``fmristat`` for Keith Worsley's code on |
2404 | + which some of this is based. |
2405 | |
2406 | - See: http://en.wikipedia.org/wiki/Principal_component_analysis for |
2407 | - some inspiration for naming - particularly 'basis_vectors' and |
2408 | - 'basis_projections' |
2409 | - """ |
2410 | + See: http://en.wikipedia.org/wiki/Principal_component_analysis for |
2411 | + some inspiration for naming - particularly 'basis_vectors' and |
2412 | + 'basis_projections' |
2413 | + """ |
2414 | data = np.asarray(data) |
2415 | # We roll the PCA axis to be first, for convenience |
2416 | if axis is None: |
2417 | @@ -89,7 +93,6 @@ |
2418 | data = np.rollaxis(data, axis) |
2419 | if mask is not None: |
2420 | mask = np.asarray(mask) |
2421 | - nbasis_projections = data.shape[0] |
2422 | if design_resid == 'mean': |
2423 | # equivalent to: design_resid = np.ones((data.shape[0], 1)) |
2424 | def project_resid(Y): |
2425 | @@ -97,9 +100,20 @@ |
2426 | elif design_resid is None: |
2427 | def project_resid(Y): return Y |
2428 | else: # matrix passed, we hope |
2429 | - pinv_design_resid = npl.pinv(design_resid) |
2430 | + projector = np.dot(design_resid, spl.pinv(design_resid)) |
2431 | def project_resid(Y): |
2432 | - return Y - np.dot(np.dot(design_resid, pinv_design_resid), Y) |
2433 | + return Y - np.dot(projector, Y) |
2434 | + if standardize: |
2435 | + def standardize_from(arr, std_source): |
2436 | + # modifies array in place |
2437 | + resid = project_resid(std_source) |
2438 | + rstd = np.sqrt(np.square(resid).sum(axis=0) / resid.shape[0]) |
2439 | + # positive 1/rstd |
2440 | + rstd_half = np.where(rstd<=0, 0, 1. / rstd) |
2441 | + arr *= rstd_half |
2442 | + return arr |
2443 | + else: |
2444 | + standardize_from = None |
2445 | """ |
2446 | Perform the computations needed for the PCA. This stores the |
2447 | covariance/correlation matrix of the data in the attribute 'C'. The |
2448 | @@ -111,20 +125,20 @@ |
2449 | to column space of design_resid. |
2450 | """ |
2451 | if design_keep is None: |
2452 | - X = np.eye(nbasis_projections) |
2453 | + X = np.eye(data.shape[0]) |
2454 | else: |
2455 | - X = np.dot(design_keep, npl.pinv(design_keep)) |
2456 | + X = np.dot(design_keep, spl.pinv(design_keep)) |
2457 | XZ = project_resid(X) |
2458 | - UX, SX, VX = npl.svd(XZ, full_matrices=0) |
2459 | + UX, SX, VX = spl.svd(XZ, full_matrices=0) |
2460 | # The matrix UX has orthonormal columns and represents the |
2461 | # final "column space" that the data will be projected onto. |
2462 | rank = (SX/SX.max() > tol_ratio).sum() |
2463 | UX = UX[:,range(rank)].T |
2464 | # calculate covariance matrix |
2465 | - C = _get_covariance(data, UX, standardize, project_resid, mask) |
2466 | + C = _get_covariance(data, UX, standardize_from, mask) |
2467 | # find the eigenvalues D and eigenvectors Vs of the covariance |
2468 | # matrix |
2469 | - D, Vs = npl.eigh(C) |
2470 | + D, Vs = spl.eigh(C) |
2471 | # sort both in descending order of eigenvalues |
2472 | order = np.argsort(-D) |
2473 | D = D[order] |
2474 | @@ -136,7 +150,7 @@ |
2475 | if ncomp is None: |
2476 | ncomp = rank |
2477 | subVX = basis_vectors[:ncomp] |
2478 | - out = _get_basis_projections(data, subVX) |
2479 | + out = _get_basis_projections(data, subVX, standardize_from) |
2480 | # Roll PCA image axis back to original position in data array |
2481 | if axis < 0: |
2482 | axis += data.ndim |
2483 | @@ -147,29 +161,33 @@ |
2484 | 'axis': axis} |
2485 | |
2486 | |
2487 | -def _get_covariance(data, UX, standardize, project_resid, mask): |
2488 | +def _get_covariance(data, UX, standardize_from, mask): |
2489 | + # number of points in PCA dimension |
2490 | rank = UX.shape[0] |
2491 | + n_pts = data.shape[0] |
2492 | C = np.zeros((rank, rank)) |
2493 | + # loop over next dimension to save memory |
2494 | for i in range(data.shape[1]): |
2495 | - Y = data[:,i].reshape((data.shape[0], -1)) |
2496 | + Y = data[:,i].reshape((n_pts, -1)) |
2497 | # project data into required space |
2498 | YX = np.dot(UX, Y) |
2499 | - if standardize: |
2500 | - S2 = (project_resid(Y)**2).sum(0) |
2501 | - Smhalf = pos_recipr(np.sqrt(S2)); del(S2) |
2502 | - YX *= Smhalf |
2503 | + if standardize_from is not None: |
2504 | + YX = standardize_from(YX, Y) |
2505 | if mask is not None: |
2506 | + # weight data with mask. Usually the weights will be 0,1 |
2507 | YX = YX * np.nan_to_num(mask[i].reshape(Y.shape[1])) |
2508 | C += np.dot(YX, YX.T) |
2509 | return C |
2510 | |
2511 | |
2512 | -def _get_basis_projections(data, subVX): |
2513 | +def _get_basis_projections(data, subVX, standardize_from): |
2514 | ncomp = subVX.shape[0] |
2515 | out = np.empty((ncomp,) + data.shape[1:], np.float) |
2516 | for i in range(data.shape[1]): |
2517 | Y = data[:,i].reshape((data.shape[0], -1)) |
2518 | U = np.dot(subVX, Y) |
2519 | + if standardize_from is not None: |
2520 | + U = standardize_from(U, Y) |
2521 | U.shape = (U.shape[0],) + data.shape[2:] |
2522 | out[:,i] = U |
2523 | return out |
2524 | |
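The `basis_vectors` / `basis_projections` naming used by the refactored `pca` above can be illustrated with a minimal 2D sketch via the SVD, assuming only mean removal (the `design_resid='mean'` case), no mask and no standardization. This is not the nipy implementation, just the core decomposition:

```python
import numpy as np

def simple_pca(data):
    # Minimal 2D PCA sketch: rows are observations, columns variables.
    # Demean (equivalent to design_resid='mean'), then decompose.
    X = data - data.mean(axis=0)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    basis_vectors = U                          # shape (n_obs, rank)
    pcnt_var = 100 * S ** 2 / np.sum(S ** 2)   # percent variance explained
    basis_projections = np.dot(U.T, X)         # components over the data axis
    return basis_vectors, pcnt_var, basis_projections
```

Projecting the data back through the basis vectors and adding the mean recovers the original array, which is the property the `reconstruct` helper in the tests relies on.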
2525 | === modified file 'nipy/modalities/fmri/tests/test_pca.py' |
2526 | --- nipy/modalities/fmri/tests/test_pca.py 2009-12-14 22:01:46 +0000 |
2527 | +++ nipy/modalities/fmri/tests/test_pca.py 2010-01-21 22:28:13 +0000 |
2528 | @@ -36,13 +36,22 @@ |
2529 | return recond |
2530 | |
2531 | |
2532 | +def root_mse(arr, axis=0): |
2533 | + return np.sqrt(np.square(arr).sum(axis=axis) / arr.shape[axis]) |
2534 | + |
2535 | + |
2536 | +@parametric |
2537 | def test_input_effects(): |
2538 | ntotal = data['nimages'] - 1 |
2539 | # return full rank - mean PCA over last axis |
2540 | p = pca(data['fmridata'], -1) |
2541 | - yield assert_equal, p['basis_vectors'].shape, (data['nimages'], ntotal) |
2542 | - yield assert_equal, p['basis_projections'].shape, data['mask'].shape + (ntotal,) |
2543 | - yield assert_equal, p['pcnt_var'].shape, (ntotal,) |
2544 | + yield assert_equal( |
2545 | + p['basis_vectors'].shape, |
2546 | + (data['nimages'], ntotal)) |
2547 | + yield assert_equal( |
2548 | + p['basis_projections'].shape, |
2549 | + data['mask'].shape + (ntotal,)) |
2550 | + yield assert_equal(p['pcnt_var'].shape, (ntotal,)) |
2551 | # Reconstructed data lacks only mean |
2552 | rarr = reconstruct(p['basis_vectors'], p['basis_projections'], -1) |
2553 | rarr = rarr + data['fmridata'].mean(-1)[...,None] |
2554 | @@ -51,19 +60,19 @@ |
2555 | arr = np.rollaxis(arr, -1) |
2556 | pr = pca(arr) |
2557 | out_arr = np.rollaxis(pr['basis_projections'], 0, 4) |
2558 | - yield assert_array_equal, out_arr, p['basis_projections'] |
2559 | - yield assert_array_equal, p['basis_vectors'], pr['basis_vectors'] |
2560 | - yield assert_array_equal, p['pcnt_var'], pr['pcnt_var'] |
2561 | + yield assert_array_equal(out_arr, p['basis_projections']) |
2562 | + yield assert_array_equal(p['basis_vectors'], pr['basis_vectors']) |
2563 | + yield assert_array_equal(p['pcnt_var'], pr['pcnt_var']) |
2564 | # Check axis None raises error |
2565 | - yield assert_raises, ValueError, pca, data['fmridata'], None |
2566 | + yield assert_raises(ValueError, pca, data['fmridata'], None) |
2567 | |
2568 | |
2569 | @parametric |
2570 | def test_diagonality(): |
2571 | - # basis_projections are not diagonal, unless not standarized |
2572 | - p = pca(data['fmridata'], -1) |
2573 | - yield assert_false(diagonal_covariance(p['basis_projections'], -1)) |
2574 | - pns = pca(data['fmridata'], -1, standardize=False) |
2575 | + # basis_projections are diagonal, whether standardized or not |
2576 | + p = pca(data['fmridata'], -1) # standardized |
2577 | + yield assert_true(diagonal_covariance(p['basis_projections'], -1)) |
2578 | + pns = pca(data['fmridata'], -1, standardize=False) # not |
2579 | yield assert_true(diagonal_covariance(pns['basis_projections'], -1)) |
2580 | |
2581 | |
2582 | @@ -74,6 +83,7 @@ |
2583 | return np.allclose(aTa, np.diag(np.diag(aTa)), atol=1e-6) |
2584 | |
2585 | |
2586 | +@parametric |
2587 | def test_2D(): |
2588 | # check that a standard 2D PCA works too |
2589 | M = 100 |
2590 | @@ -83,26 +93,35 @@ |
2591 | p = pca(data) |
2592 | ts = p['basis_vectors'] |
2593 | imgs = p['basis_projections'] |
2594 | - yield assert_equal, ts.shape, (M, L) |
2595 | - yield assert_equal, imgs.shape, (L, N) |
2596 | - rimgs = reconstruct(ts, imgs) + data.mean(0)[None,...] |
2597 | - yield assert_array_almost_equal, rimgs, data |
2598 | - # if standardize is set, projections will not be covariance diagonal |
2599 | - yield assert_false, diagonal_covariance(imgs) |
2600 | - # but will if not |
2601 | + yield assert_equal(ts.shape, (M, L)) |
2602 | + yield assert_equal(imgs.shape, (L, N)) |
2603 | + rimgs = reconstruct(ts, imgs) |
2604 | + # add back the sqrt MSE, because we standardized |
2605 | + data_mean = data.mean(0)[None,...] |
2606 | + demeaned = data - data_mean |
2607 | + rmse = root_mse(demeaned, axis=0)[None,...] |
2608 | + # also add back the mean |
2609 | + yield assert_array_almost_equal((rimgs * rmse) + data_mean, data) |
2610 | + # whether or not standardize is set, covariance is diagonal |
2611 | + yield assert_true(diagonal_covariance(imgs)) |
2612 | p = pca(data, standardize=False) |
2613 | imgs = p['basis_projections'] |
2614 | - yield assert_true, diagonal_covariance(imgs) |
2615 | + yield assert_true(diagonal_covariance(imgs)) |
2616 | |
2617 | |
2618 | +@parametric |
2619 | def test_PCAMask(): |
2620 | ntotal = data['nimages'] - 1 |
2621 | ncomp = 5 |
2622 | p = pca(data['fmridata'], -1, data['mask'], ncomp=ncomp) |
2623 | - yield assert_equal, p['basis_vectors'].shape, (data['nimages'], ntotal) |
2624 | - yield assert_equal, p['basis_projections'].shape, data['mask'].shape + (ncomp,) |
2625 | - yield assert_equal, p['pcnt_var'].shape, (ntotal,) |
2626 | - yield assert_almost_equal, p['pcnt_var'].sum(), 100. |
2627 | + yield assert_equal( |
2628 | + p['basis_vectors'].shape, |
2629 | + (data['nimages'], ntotal)) |
2630 | + yield assert_equal( |
2631 | + p['basis_projections'].shape, |
2632 | + data['mask'].shape + (ncomp,)) |
2633 | + yield assert_equal(p['pcnt_var'].shape, (ntotal,)) |
2634 | + yield assert_almost_equal(p['pcnt_var'].sum(), 100.) |
2635 | |
2636 | |
2637 | def test_PCAMask_nostandardize(): |
2638 | @@ -151,6 +170,7 @@ |
2639 | yield assert_almost_equal, p['pcnt_var'].sum(), 100. |
2640 | |
2641 | |
2642 | +@parametric |
2643 | def test_resid(): |
2644 | # Data is projected onto k=10 dimensional subspace then has its mean |
2645 | # removed. Should still have rank 10. |
2646 | @@ -159,17 +179,23 @@ |
2647 | ntotal = k |
2648 | X = np.random.standard_normal((data['nimages'], k)) |
2649 | p = pca(data['fmridata'], -1, ncomp=ncomp, design_resid=X) |
2650 | - yield assert_equal, p['basis_vectors'].shape, (data['nimages'], ntotal) |
2651 | - yield assert_equal, p['basis_projections'].shape, data['mask'].shape + (ncomp,) |
2652 | - yield assert_equal, p['pcnt_var'].shape, (ntotal,) |
2653 | - yield assert_almost_equal, p['pcnt_var'].sum(), 100. |
2654 | + yield assert_equal( |
2655 | + p['basis_vectors'].shape, |
2656 | + (data['nimages'], ntotal)) |
2657 | + yield assert_equal( |
2658 | + p['basis_projections'].shape, |
2659 | + data['mask'].shape + (ncomp,)) |
2660 | + yield assert_equal(p['pcnt_var'].shape, (ntotal,)) |
2661 | + yield assert_almost_equal(p['pcnt_var'].sum(), 100.) |
2662 | # if design_resid is None, we do not remove the mean, and we get |
2663 | # full rank from our data |
2664 | p = pca(data['fmridata'], -1, design_resid=None) |
2665 | rank = p['basis_vectors'].shape[1] |
2666 | - yield assert_equal, rank, data['nimages'] |
2667 | + yield assert_equal(rank, data['nimages']) |
2668 | rarr = reconstruct(p['basis_vectors'], p['basis_projections'], -1) |
2669 | - yield assert_array_almost_equal, rarr, data['fmridata'] |
2670 | + # add back the sqrt MSE, because we standardized |
2671 | + rmse = root_mse(data['fmridata'], axis=-1)[...,None] |
2672 | + yield assert_array_almost_equal(rarr * rmse, data['fmridata']) |
2673 | |
2674 | |
2675 | def test_both(): |
2676 | |
2677 | === modified file 'nipy/neurospin/register/benchmarks/bench_register.py' |
2678 | --- nipy/neurospin/register/benchmarks/bench_register.py 2009-07-11 17:35:34 +0000 |
2679 | +++ nipy/neurospin/register/benchmarks/bench_register.py 2010-01-21 22:28:13 +0000 |
2680 | @@ -39,7 +39,7 @@ |
2681 | dt2 = time.clock()-t0 |
2682 | assert_almost_equal(I1, I2) |
2683 | print('3d array resampling') |
2684 | - print(' using neuroimaging.neurospin: %f sec' % dt1) |
2685 | + print(' using nipy.neurospin: %f sec' % dt1) |
2686 | print(' using scipy.ndimage: %f sec' % dt2) |
2687 | |
2688 | |
2689 | |
2690 | === modified file 'setup.py' |
2691 | --- setup.py 2009-12-17 23:36:20 +0000 |
2692 | +++ setup.py 2010-01-21 22:28:12 +0000 |
2693 | @@ -18,11 +18,8 @@ |
2694 | # 'nipy.core.image') |
2695 | # Robert Kern recommends setting quiet=True on the numpy list, stating |
2696 | # these messages are probably only used in debugging numpy distutils. |
2697 | - |
2698 | config.get_version('nipy/version.py') # sets config.version |
2699 | - |
2700 | config.add_subpackage('nipy', 'nipy') |
2701 | - |
2702 | return config |
2703 | |
2704 | ################################################################################ |
2705 | @@ -41,28 +38,76 @@ |
2706 | |
2707 | |
2708 | # Dependency checks |
2709 | -def package_check(pkg_name, version=None, checker=LooseVersion): |
2710 | - msg = 'Missing package: %s, you might get run-time errors' % pkg_name |
2711 | +def package_check(pkg_name, version=None, |
2712 | + optional=False, |
2713 | + checker=LooseVersion, |
2714 | + version_getter=None, |
2715 | + ): |
2716 | + ''' Check if package `pkg_name` is present, and correct version |
2717 | + |
2718 | + Parameters |
2719 | + ---------- |
2720 | + pkg_name : str |
2721 | + name of package as imported into python |
2722 | + version : {None, str}, optional |
2723 | + minimum version of the package that we require. If None, we don't |
2724 | + check the version. Default is None |
2725 | + optional : {False, True}, optional |
2726 | + If False, raise error for absent package or wrong version; |
2727 | + otherwise warn |
2728 | + checker : callable, optional |
2729 | + callable with which to return comparable thing from version |
2730 | + string. Default is ``distutils.version.LooseVersion`` |
2731 | + version_getter : {None, callable}, optional |
2732 | + Callable that takes `pkg_name` as argument, and returns the |
2733 | + package version string - as in:: |
2734 | + |
2735 | + version = version_getter(pkg_name) |
2736 | + |
2737 | + If None, equivalent to:: |
2738 | + |
2739 | + mod = __import__(pkg_name); version = mod.__version__ |
2740 | + ''' |
2741 | + if version_getter is None: |
2742 | + def version_getter(pkg_name): |
2743 | + mod = __import__(pkg_name) |
2744 | + return mod.__version__ |
2745 | try: |
2746 | mod = __import__(pkg_name) |
2747 | except ImportError: |
2748 | - log.warn(msg) |
2749 | - return |
2750 | + if not optional: |
2751 | + raise RuntimeError('Cannot import package "%s" ' |
2752 | + '- is it installed?' % pkg_name) |
2753 | + log.warn('Missing optional package "%s"; ' |
2754 | + 'you may get run-time errors' % pkg_name) |
2755 | + return |
2756 | if not version: |
2757 | return |
2758 | - msg += ' >= %s' % version |
2759 | try: |
2760 | - have_version = mod.__version__ |
2761 | + have_version = version_getter(pkg_name) |
2762 | except AttributeError: |
2763 | raise RuntimeError('Cannot find version for %s' % pkg_name) |
2764 | if checker(have_version) < checker(version): |
2765 | - raise RuntimeError(msg) |
2766 | - |
2767 | - |
2768 | -# Soft dependency checking |
2769 | -package_check('sympy', '0.6.4') |
2770 | + v_msg = 'You have version %s of package "%s"' \ |
2771 | + ' but we need version >= %s' % ( |
2772 | + have_version, |
2773 | + pkg_name, |
2774 | + version, |
2775 | + ) |
2776 | + if optional: |
2777 | + log.warn(v_msg + '; you may get run-time errors') |
2778 | + else: |
2779 | + raise RuntimeError(v_msg) |
2780 | + |
2781 | + |
2782 | +# Hard and soft dependency checking |
2783 | package_check('scipy', '0.5') |
2784 | - |
2785 | +package_check('sympy', '0.6.6') |
2786 | +def _mayavi_version(pkg_name): |
2787 | + from enthought.mayavi import version |
2788 | + return version.version |
2789 | +package_check('mayavi', '3.0', optional=True, |
2790 | + version_getter=_mayavi_version) |
2791 | |
2792 | ################################################################################ |
2793 | # Import the documentation building classes. |
Some updates to io.imageformats from upstream; restored hard dependency checks; fixed missing ellipsis coordmap slicing.
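The version comparison at the heart of the restored `package_check` can be sketched without `distutils` (whose `LooseVersion` the real code uses). The `split_version` helper here is a hypothetical stand-in that mimics why `'0.10'` correctly sorts after `'0.6.6'` under loose-version ordering:

```python
def split_version(vstr):
    # Rough sketch of LooseVersion-style ordering: split on dots and
    # compare numeric components as integers, so [0, 10] > [0, 6, 6].
    return [int(p) if p.isdigit() else p for p in vstr.split('.')]

def check_version(have_version, min_version, optional=False):
    # Mirrors the logic in package_check: raise for a required package
    # that is too old, merely signal (here: return False) for an
    # optional one.
    if split_version(have_version) < split_version(min_version):
        if optional:
            return False
        raise RuntimeError('have %s, need >= %s'
                           % (have_version, min_version))
    return True
```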