Merge lp://qastaging/~nipy-developers/nipy/trunk-neurospin into lp://qastaging/~nipy-developers/nipy/obsolete-do-not-use

Proposed by Alexis Roche
Status: Merged
Merge reported by: Matthew Brett
Merged at revision: not available
Proposed branch: lp://qastaging/~nipy-developers/nipy/trunk-neurospin
Merge into: lp://qastaging/~nipy-developers/nipy/obsolete-do-not-use
Diff against target: none
To merge this branch: bzr merge lp://qastaging/~nipy-developers/nipy/trunk-neurospin
Reviewer Review Type Date Requested Status
Matthew Brett Abstain
Review via email: mp+8472@code.qastaging.launchpad.net
Revision history for this message
Alexis Roche (alexis-roche) wrote :

trunk-neurospin is now three months old. The external C library has been lightened and renamed, the image registration module has been made independent of it, and many other developments have been going on.

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

When I try to run the tests on my computer, the process terminates
without finishing the test suite. No crash, just a termination. I suspect
there might be an 'exit' call at the C level.

Can you run the test suite on your computer?

Gaël

Revision history for this message
Bertrand Thirion (bertrand-thirion) wrote :

I'm having a look at it.
On the station, I get into trouble with funny things such as
ImportError: /usr/lib/python2.5/lib-dynload/cPickle.so: undefined
symbol: PyUnicodeUCS2_DecodeUTF8
which I fail to understand, but could be a neurospinian curiosity
(don't laugh). There must exist a fix, since I have already used it,
but I simply don't remember at the moment.

On the laptop I get (after 12 minutes...):

from trunk-neurospin/nipy:

Ran 4370 tests in 673.157s

FAILED (failures=2, errors=7)

Not perfect, but it runs
Bertrand

Gael Varoquaux wrote:
> When I try to run the tests on my computer, the process terminates
> without finishing the test suite. No crash, just a termination. I suspect
> there might be an 'exit' call at the C level.
>
> Can you run the test suite on your computer?
>
> Gaël
>

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Thu, Jul 09, 2009 at 06:15:19PM -0000, Bertrand Thirion wrote:
> I'm having a look at it.
> On the station, I get into trouble with funny things such as
> ImportError: /usr/lib/python2.5/lib-dynload/cPickle.so: undefined
> symbol: PyUnicodeUCS2_DecodeUTF8
> which I fail to understand, but could be a neurospinian curiosity

Yes, that's exactly it. These are unicode errors due to the fact that we
have two different versions of Python, compiled with different unicode
settings (UCS2 vs. UCS4 builds).

> On the laptop I get (after 12 minutes...):

> from trunk-neurospin/nipy:

> Ran 4370 tests in 673.157s

> FAILED (failures=2, errors=7)

> Not perfect, but it runs

Well, 7 errors, and 2 failures out of 4370 tests is not bad!

OK, so it is something that happens only on my box. Darn! I need to track
it down :)

Gaël

Revision history for this message
Bertrand Thirion (bertrand-thirion) wrote :

NB: when I launch the tests from trunk-neurospin/nipy/neurospin, the
outcome is:

======================================================================
ERROR: test_model_selection_mfx_spatial_rand_walk
(nipy.neurospin.group.tests.test_spatial_relaxation_onesample.test_multivariate_stat_saem)
----------------------------------------------------------------------
Traceback (most recent call last):
  File
"/data/home/thirion/mybzr/trunk-neurospin/nipy/neurospin/group/tests/test_spatial_relaxation_onesample.py",
line 111, in test_model_selection_mfx_spatial_rand_walk
    self.assertTrue(L0 > L00)
  File "/usr/lib/python2.5/unittest.py", line 309, in failUnless
    if not expr: raise self.failureException, msg
ValueError: The truth value of an array with more than one element is
ambiguous. Use a.any() or a.all()

======================================================================
ERROR: Failure: ImportError (No module named neuroimaging.neurospin.utils)
----------------------------------------------------------------------
Traceback (most recent call last):
  File
"/usr/lib/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/loader.py",
line 364, in loadTestsFromName
    addr.filename, addr.module)
  File
"/usr/lib/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/importer.py",
line 39, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File
"/usr/lib/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/importer.py",
line 84, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
  File
"/data/home/thirion/mybzr/trunk-neurospin/neuroimaging/neurospin/neuro/__init__.py",
line 1, in <module>
    """\
  File
"/data/home/thirion/mybzr/trunk-neurospin/neuroimaging/neurospin/neuro/image_classes.py",
line 10, in <module>
ImportError: No module named neuroimaging.neurospin.utils

----------------------------------------------------------------------
Ran 214 tests in 440.831s

FAILED (errors=2)

Bertrand Thirion wrote:
> I'm having a look at it.
> On the station, I get into trouble with funny things such as
> ImportError: /usr/lib/python2.5/lib-dynload/cPickle.so: undefined
> symbol: PyUnicodeUCS2_DecodeUTF8
> which I fail to understand, but could be a neurospinian curiosity
> (don't laugh). There must exist a fix, since I have already used it,
> but I simply don't remember at the moment.
>
> On the laptop I get (after 12 minutes...):
>
> from trunk-neurospin/nipy:
>
> Ran 4370 tests in 673.157s
>
> FAILED (failures=2, errors=7)
>
> Not perfect, but it runs
> Bertrand
>
>
> Gael Varoquaux wrote:
>
>> When I try to run the tests on my computer, the process terminates
>> without finishing the test suite. No crash, just a termination. I suspect
>> there might be an 'exit' call at the C level.
>>
>> Can you run the test suite on your computer?
>>
>> Gaël
>>
>>
>
>
>
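
The ValueError in the first test is a standard NumPy pitfall: 'L0 > L00'
compares the arrays elementwise and returns a boolean array, whose truth
value is ambiguous, so unittest's assertTrue chokes on it. A minimal
sketch of the failure and of the fix the test later adopted (array
values hypothetical):

    import numpy as np

    L0 = np.array([1.0, 2.0])
    L00 = np.array([0.5, 0.5])

    # bool() on a multi-element array raises the ValueError seen above:
    # self.assertTrue(L0 > L00)

    # reducing to a single boolean first works:
    assert np.all(L0 > L00)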

Revision history for this message
Alexis Roche (alexis-roche) wrote :

Bertrand,

The second error suggests that you should do a fresh install: Python
is trying to import a module that no longer exists
(neuroimaging.neurospin.utils).

Alexis

On Thu, Jul 9, 2009 at 8:21 PM, Bertrand
Thirion<email address hidden> wrote:
> NB: when I launch the tests from trunk-neurospin/nipy/neurospin, the
> outcome is:
>
> ======================================================================
> ERROR: test_model_selection_mfx_spatial_rand_walk
> (nipy.neurospin.group.tests.test_spatial_relaxation_onesample.test_multivariate_stat_saem)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>  File
> "/data/home/thirion/mybzr/trunk-neurospin/nipy/neurospin/group/tests/test_spatial_relaxation_onesample.py",
> line 111, in test_model_selection_mfx_spatial_rand_walk
>    self.assertTrue(L0 > L00)
>  File "/usr/lib/python2.5/unittest.py", line 309, in failUnless
>    if not expr: raise self.failureException, msg
> ValueError: The truth value of an array with more than one element is
> ambiguous. Use a.any() or a.all()
>
>
> ======================================================================
> ERROR: Failure: ImportError (No module named neuroimaging.neurospin.utils)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>  File
> "/usr/lib/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/loader.py",
> line 364, in loadTestsFromName
>    addr.filename, addr.module)
>  File
> "/usr/lib/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/importer.py",
> line 39, in importFromPath
>    return self.importFromDir(dir_path, fqname)
>  File
> "/usr/lib/python2.5/site-packages/nose-0.10.1-py2.5.egg/nose/importer.py",
> line 84, in importFromDir
>    mod = load_module(part_fqname, fh, filename, desc)
>  File
> "/data/home/thirion/mybzr/trunk-neurospin/neuroimaging/neurospin/neuro/__init__.py",
> line 1, in <module>
>    """\
>  File
> "/data/home/thirion/mybzr/trunk-neurospin/neuroimaging/neurospin/neuro/image_classes.py",
> line 10, in <module>
> ImportError: No module named neuroimaging.neurospin.utils
>
> ----------------------------------------------------------------------
> Ran 214 tests in 440.831s
>
> FAILED (errors=2)
>
>
>
>
> Bertrand Thirion wrote:
>> I'm having a look at it.
>> On the station, I get into trouble with funny things such as
>> ImportError: /usr/lib/python2.5/lib-dynload/cPickle.so: undefined
>> symbol: PyUnicodeUCS2_DecodeUTF8
>> which I fail to understand, but could be a neurospinian curiosity
>> (don't laugh). There must exist a fix, since I have already used it,
>> but I simply don't remember at the moment.
>>
>> On the laptop I get (after 12 minutes...):
>>
>> from trunk-neurospin/nipy:
>>
>> Ran 4370 tests in 673.157s
>>
>> FAILED (failures=2, errors=7)
>>
>> Not perfect, but it runs
>> Bertrand
>>
>>
>> Gael Varoquaux wrote:
>>
>>> When I try to run the tests on my computer, the process terminates
>>> without finishing the test suite. No crash, just a termination. I suspect
>>> there might be an 'exit' call at the C level.
>>>
>>> Can you run the test suite on your computer?
>>>
>...


Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Thu, Jul 09, 2009 at 05:30:25PM -0000, Gael Varoquaux wrote:
> When I try to run the tests on my computer, the process terminates
> without finishing the test suite. No crash, just a termination. I suspect
> there might be an 'exit' call at the C level.

So, it was a sys.exit call in a module that nose imported while
looking for the tests:

http://bazaar.launchpad.net/~nipy-developers/nipy/trunk-neurospin/revision/1801

Just as a reminder for everybody: please protect all the
execution-related code in your scripts with:

if __name__ == '__main__':
    ...

All files should be importable, so that they can be introspected. Nose
is only one example of where this is important.

Test suite is now running on my box.

Gaël
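
A minimal sketch of the pattern (module and function names hypothetical):
everything below the guard runs only when the file is executed as a
script, so importing the module -- for instance during nose test
collection -- has no side effects.

    # my_analysis.py -- hypothetical example script
    import sys

    def main():
        print 'doing the actual work'
        sys.exit(0)

    if __name__ == '__main__':
        main()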

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

OK, remarks on the merge, as I go:

Examples
============

affine_matching example: can't run outside of neurospin, hardcoded path
and data. We will need to fix that in the long run.

Same thing for demo_blob_from_nifti.py, demo_histo_fit_nifti.py, erp.py,
onesample_group.py, script_parcel_multisubj.py, and
space_time_realign.py.

The glm.py example is dead: importing Image from nipy.neurospin fails.

Main code
==========

* nipy/neurospin/register/transform.py

    class Affine should inherit from object:

    class Affine(object):
        ...

* nipy/neurospin/spatial_models/hroi.py

    Deprecated code at the end? Could it be removed?

* nipy/neurospin/utils/mini_designmatrix.py

    Maybe the comments could be rewritten a bit. For instance, translated
    from French to English?

* nipy/neurospin/utils/roi.py

    The set of tests at the end cannot be run by other people. Could we
    use artificial data instead?

* nipy/neurospin/utils/simul_2d_multisubject_fmri_dataset.py

    The 'non_random' keyword argument should be called 'seed', and should
    default to False.
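
    A sketch of the suggested signature (function name and body
    hypothetical):

        import numpy as np

        def simulate_dataset(shape, seed=False):
            # seed=False -> fresh random draws on every call; any other
            # value seeds the generator, making the output reproducible
            rng = np.random.RandomState(None if seed is False else seed)
            return rng.randn(*shape)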

Documentation
==============

Most of this code is not documented: it does not appear in the
sphinx-generated documentation. As a result, the chances that it
gets reused by people outside of neurospin are smaller: it is hard to know
what you are looking for when you are an outsider.

In the long run, it would be nice to start building documentation.

Gaël

Revision history for this message
Bertrand Thirion (bertrand-thirion) wrote :

Thank you Gael. The fixes are not that large. I would not say that
documentation is a long-run matter:
I'm currently improving it, and I take it as a relatively short-term
priority.

Bertrand

Gael Varoquaux wrote:
> OK, remarks on the merge, as I go:
>
> Examples
> ============
>
> affine_matching example: can't run outside of neurospin, hardcoded path
> and data. We will need to fix that in the long run.
>
> Same thing for demo_blob_from_nifti.py, demo_histo_fit_nifti.py, erp.py,
> onesample_group.py, script_parcel_multisubj.py, and
> space_time_realign.py.
>
> The glm.py example is dead: importing Image from nipy.neurospin fails.
>
> Main code
> ==========
>
> * nipy/neurospin/register/transform.py
>
> class Affine should inherit from object:
>
> class Affine(object):
> ...
>
> * nipy/neurospin/spatial_models/hroi.py
>
> Deprecated code at the end? Could it be removed?
>
> * nipy/neurospin/utils/mini_designmatrix.py
>
> Maybe the comments could be rewritten a bit. For instance, translated
> from French to English?
>
> * nipy/neurospin/utils/roi.py
>
> The set of tests at the end cannot be run by other people. Could we
> use artificial data instead?
>
> * nipy/neurospin/utils/simul_2d_multisubject_fmri_dataset.py
>
> The 'non_random' keyword argument should be called 'seed', and should
> default to False.
>
> Documentation
> ==============
>
> Most of this code is not documented: it does not appear in the
> sphinx-generated documentation. As a result, the chances that it
> gets reused by people outside of neurospin are smaller: it is hard to know
> what you are looking for when you are an outsider.
>
> In the long run, it would be nice to start building documentation.
>
> Gaël
>

Revision history for this message
Alexis Roche (alexis-roche) wrote :

Hi Gaël,

Thanks a lot for your commitment.

I know about the hard-coded path issue -- at some point, I'll rewrite
my examples so that they use downloadable nipy data. Also, glm.py
uses Edouard's datamind package, which is obviously not an official
nipy dependency. In my mind, examples were just there to help
developers, but I agree it's better if they run.

Just a question:

> * nipy/neurospin/register/transform.py
>
>    class Affine should inherit from object:
>
>    class Affine(object):
>        ...

I do now:

class Affine():
    ...

Is that correct? Not sure I understand why any class should inherit
from something. As far as I remember, Python 2.4 did not accept
declarations like the one above.

Cheers,

Alexis

--
Alexis Roche, PhD
Researcher, CEA, Neurospin, Paris, France
Academic Guest, ETHZ, Computer Vision Lab, Zurich, Switzerland
http://alexis.roche.googlepages.com

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Fri, Jul 10, 2009 at 08:18:08AM -0000, Alexis Roche wrote:
> Hi Gaël,

> Thanks a lot for your commitment.

> I know about the hard-coded path issue -- at some point, I'll rewrite
> my examples so that they use downloadable nipy data. Also, the glm.py
> uses Edouard's datamind package, which is obviously not an official
> nipy dependency. In my mind, examples were just there to help
> developers, but I agree it's better if they run.

> Just a question:

> > * nipy/neurospin/register/transform.py

> >    class Affine should inherit from object:

> >    class Affine(object):
> >        ...

> I do now:

> class Affine():
> ...

> Is that correct?

No, you need:

class Affine(object):

With 'object' really in there.

> Not sure I understand why any class should inherit from something.

That's historical: there used to be 'old-style' classes; later on,
'new-style' classes, which implement a cleverer MRO (method resolution
order), were added, and with them the base class 'object'.

Gaël

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

Also:

nipy/neurospin/statistical_mapping.py:23: undefined name 'zmap'
nipy/neurospin/statistical_mapping.py:25: undefined name 'nvoxels'

Gaël

Revision history for this message
Alexis Roche (alexis-roche) wrote :

> No, you need:
>
> class Affine(object):
>
> With 'object' really in there.

Done. I am a little confused here: is that a recommendation? Obviously
both 'class Affine:' and 'class Affine():' work.

> Also:
>
> nipy/neurospin/statistical_mapping.py:23: undefined name 'zmap'
> nipy/neurospin/statistical_mapping.py:25: undefined name 'nvoxels'

Done.

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Fri, Jul 10, 2009 at 08:45:18AM -0000, Alexis Roche wrote:
> > No, you need:

> > class Affine(object):

> > With 'object' really in there.

> Done. I am a little confused here: is that a recommendation? Obviously
> both 'class Affine:' and 'class Affine():' work.

They do work but they are not the same:

In [1]: class A:
   ...:     pass
   ...:

In [2]: a = A()

In [3]: a.
a.__class__   a.__doc__     a.__module__

In [3]: class A(object):
   ...:     pass
   ...:

In [4]: a = A()

In [5]: a.
a.__class__         a.__hash__          a.__repr__
a.__delattr__       a.__init__          a.__setattr__
a.__dict__          a.__module__        a.__sizeof__
a.__doc__           a.__new__           a.__str__
a.__format__        a.__reduce__        a.__subclasshook__
a.__getattribute__  a.__reduce_ex__     a.__weakref__

The latter is a 'new-style' class. People nowadays expect new-style
classes, so if you give them old-style objects, things will break (for
instance, a subclass calling the parent class's __repr__). This is
why the recommendation is to always use new-style classes.

Gaël
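
A minimal sketch of that kind of breakage (class names hypothetical): in
Python 2, super() accepts only new-style classes, so cooperative calls
to a parent's __repr__ fail as soon as an old-style class is involved.

    class OldBase:             # old-style: no 'object' base
        def __repr__(self):
            return 'OldBase'

    class NewBase(object):     # new-style
        def __repr__(self):
            return 'NewBase'

    class Sub(NewBase):
        def __repr__(self):
            # works because NewBase is new-style; deriving from OldBase
            # instead would make super() raise TypeError: super()
            # argument 1 must be type, not classobj
            return 'Sub of %s' % super(Sub, self).__repr__()

    print repr(Sub())          # prints 'Sub of NewBase'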

Revision history for this message
Alexis Roche (alexis-roche) wrote :

Got it! Thanks.

Alexis

On Fri, Jul 10, 2009 at 4:36 PM, Gael
Varoquaux<email address hidden> wrote:
> On Fri, Jul 10, 2009 at 08:45:18AM -0000, Alexis Roche wrote:
>> > No, you need:
>
>> > class Affine(object):
>
>> > With 'object' really in there.
>
>> Done. I am a little confused here: is that a recommendation? Obviously
>> both 'class Affine:' and 'class Affine():' work.
>
> They do work but they are not the same:
>
> In [1]: class A:
>   ...:     pass
>   ...:
>
> In [2]: a = A()
>
> In [3]: a.
> a.__class__   a.__doc__     a.__module__
>
> In [3]: class A(object):
>   ...:     pass
>   ...:
>
> In [4]: a = A()
>
> In [5]: a.
> a.__class__         a.__hash__          a.__repr__
> a.__delattr__       a.__init__          a.__setattr__
> a.__dict__          a.__module__        a.__sizeof__
> a.__doc__           a.__new__           a.__str__
> a.__format__        a.__reduce__        a.__subclasshook__
> a.__getattribute__  a.__reduce_ex__     a.__weakref__
>
> The latter is a 'new-style' class. People nowadays expect new-style
> classes, so if you give them old-style objects, things will break (for
> instance, a subclass calling the parent class's __repr__). This is
> why the recommendation is to always use new-style classes.
>
> Gaël
> --
> https://code.launchpad.net/~nipy-developers/nipy/trunk-neurospin/+merge/8472
> Your team nipy-developers is subscribed to branch lp:~nipy-developers/nipy/trunk-neurospin.
>

--
Alexis Roche, PhD
Researcher, CEA, Neurospin, Paris, France
Academic Guest, ETHZ, Computer Vision Lab, Zurich, Switzerland
http://alexis.roche.googlepages.com

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

More comments/questions, as I go through the code:

http://bazaar.launchpad.net/~nipy-developers/nipy/trunk-neurospin/annotate/head%3A/nipy/neurospin/utils/roi.py#L250

The two method __add__ and __multiply__ as well as the WeightedROI seem
quite wrong to me. Is there a reason for having them in, and not simply
deleting them?

Cheers,

Gaël

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

Also, after fixing a few trivial problems, I have some test failures:

resting ~/dev/nipy-neurospin/nipy/neurospin $ nosetests --with-coverage
--cover-package=nipy.neurospin
.........................................................................................F...........................................................................................F...............FF..............
======================================================================
FAIL: testvarious
(nipy.neurospin.clustering.tests.test_clustering.TestTypeProof)
----------------------------------------------------------------------
Traceback (most recent call last):
  File
"/home/varoquau/dev/nipy-neurospin/nipy/neurospin/clustering/tests/test_clustering.py",
line 190, in testvarious
    self.assert_(sys.getrefcount(X) == 2)
AssertionError

======================================================================
FAIL: test_model_selection_mfx_spatial_rand_walk
(nipy.neurospin.group.tests.test_spatial_relaxation_onesample.TestMultivariateStatSaem)
----------------------------------------------------------------------
Traceback (most recent call last):
  File
"/home/varoquau/dev/nipy-neurospin/nipy/neurospin/group/tests/test_spatial_relaxation_onesample.py",
line 131, in test_model_selection_mfx_spatial_rand_walk
    self.assertTrue(np.all(L0 > L00))
AssertionError

======================================================================
FAIL: test_bsa.test_bsa_null_dev
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/varoquau/usr/lib/python2.6/site-packages/nose/case.py",
line 182, in runTest
    self.test(*self.arg)
  File
"/home/varoquau/dev/nipy-neurospin/nipy/neurospin/spatial_models/tests/test_bsa.py",
line 116, in test_bsa_null_dev
    assert(AF==None)
AssertionError:
-------------------- >> begin captured stdout << ---------------------
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position definition, assuming it is the
identity
warning: no sform found for position ...


Revision history for this message
Alexis Roche (alexis-roche) wrote :

All right. I forward the failure report for
test_model_selection_mfx_spatial_rand_walk to Merlin.

Alexis

On Sat, Jul 11, 2009 at 10:15 PM, Gael
Varoquaux<email address hidden> wrote:
> Also, after fixing a few trivial problems, I have some test failures:
>
> resting ~/dev/nipy-neurospin/nipy/neurospin $ nosetests --with-coverage
> --cover-package=nipy.neurospin
> .........................................................................................F...........................................................................................F...............FF..............
> ======================================================================
> FAIL: testvarious
> (nipy.neurospin.clustering.tests.test_clustering.TestTypeProof)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>  File
> "/home/varoquau/dev/nipy-neurospin/nipy/neurospin/clustering/tests/test_clustering.py",
> line 190, in testvarious
>    self.assert_(sys.getrefcount(X) == 2)
> AssertionError
>
> ======================================================================
> FAIL: test_model_selection_mfx_spatial_rand_walk
> (nipy.neurospin.group.tests.test_spatial_relaxation_onesample.TestMultivariateStatSaem)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>  File
> "/home/varoquau/dev/nipy-neurospin/nipy/neurospin/group/tests/test_spatial_relaxation_onesample.py",
> line 131, in test_model_selection_mfx_spatial_rand_walk
>    self.assertTrue(np.all(L0 > L00))
> AssertionError
>
> ======================================================================
> FAIL: test_bsa.test_bsa_null_dev
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>  File "/home/varoquau/usr/lib/python2.6/site-packages/nose/case.py",
> line 182, in runTest
>    self.test(*self.arg)
>  File
> "/home/varoquau/dev/nipy-neurospin/nipy/neurospin/spatial_models/tests/test_bsa.py",
> line 116, in test_bsa_null_dev
>    assert(AF==None)
> AssertionError:
> -------------------- >> begin captured stdout << ---------------------
> warning: no sform found for position definition,  assuming it is the
> identity
> warning: no sform found for position definition,  assuming it is the
> identity
> warning: no sform found for position definition,  assuming it is the
> identity
> warning: no sform found for position definition,  assuming it is the
> identity
> warning: no sform found for position definition,  assuming it is the
> identity
> warning: no sform found for position definition,  assuming it is the
> identity
> warning: no sform found for position definition,  assuming it is the
> identity
> warning: no sform found for position definition,  assuming it is the
> identity
> warning: no sform found for position definition,  assuming it is the
> identity
> warning: no sform found for position definition,  assuming it is the
> identity
> warning: no sform found for position definition,  assuming it is the
> identity
> warning: no sform found for position definition,  assuming it is the
> identity
> warning: no sform found for position ...


1761. By Matthew Brett

Merged Gael's slight generalization of getting test data

Revision history for this message
Merlin Keller (merlin-keller) wrote :

I just pushed the corrections for
test_model_selection_mfx_spatial_rand_walk.
It should work fine now.

Merlin

-------- Original message --------
From: <email address hidden> on behalf of Alexis Roche
Date: Sat 11/07/2009 23:09
To: KELLER Merlin INRIA
Subject: Re: [Nipy-devel] [Merge]

All right. I forward the failure report for
test_model_selection_mfx_spatial_rand_walk to Merlin.

Alexis

On Sat, Jul 11, 2009 at 10:15 PM, Gael
Varoquaux<email address hidden> wrote:
> Also, after fixing a few trivial problems, I have some test failures:
>
> resting ~/dev/nipy-neurospin/nipy/neurospin $ nosetests --with-coverage
> --cover-package=nipy.neurospin
> .........................................................................................F...........................................................................................F...............FF..............
> ======================================================================
> FAIL: testvarious
> (nipy.neurospin.clustering.tests.test_clustering.TestTypeProof)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
> File
> "/home/varoquau/dev/nipy-neurospin/nipy/neurospin/clustering/tests/test_clustering.py",
> line 190, in testvarious
> self.assert_(sys.getrefcount(X) == 2)
> AssertionError
>
> ======================================================================
> FAIL: test_model_selection_mfx_spatial_rand_walk
> (nipy.neurospin.group.tests.test_spatial_relaxation_onesample.TestMultivariateStatSaem)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
> File
> "/home/varoquau/dev/nipy-neurospin/nipy/neurospin/group/tests/test_spatial_relaxation_onesample.py",
> line 131, in test_model_selection_mfx_spatial_rand_walk
> self.assertTrue(np.all(L0 > L00))
> AssertionError
>
> ======================================================================
> FAIL: test_bsa.test_bsa_null_dev
> ----------------------------------------------------------------------
> Traceback (most recent call last):
> File "/home/varoquau/usr/lib/python2.6/site-packages/nose/case.py",
> line 182, in runTest
> self.test(*self.arg)
> File
> "/home/varoquau/dev/nipy-neurospin/nipy/neurospin/spatial_models/tests/test_bsa.py",
> line 116, in test_bsa_null_dev
> assert(AF==None)
> AssertionError:
> -------------------- >> begin captured stdout << ---------------------
> warning: no sform found for position definition, assuming it is the
> identity
> warning: no sform found for position definition, assuming it is the
> identity
> warning: no sform found for position definition, assuming it is the
> identity
> warning: no sform found for position definition, assuming it is the
> identity
> warning: no sform found for position definition, assuming it is the
> identity
> warning: no sform found for position definition, assuming it is the
> identity
> warning: no sform found for position definition, assuming it is the
> identity
> warning: no sform found for position definition, assuming it is the
> identity
> warning: no sform found for position definition, assuming it is the
> identity...


Revision history for this message
Merlin Keller (merlin-keller) wrote :

I just pushed another version of test_spatial_relaxation_onesample,
which takes much less time to run.

Merlin

1762. By Matthew Brett

Merge from branch updating test data

Revision history for this message
Bertrand Thirion (bertrand-thirion) wrote :

Gael Varoquaux wrote:
> Also, after fixing a few trivial problems, I have some test failures:
>
> resting ~/dev/nipy-neurospin/nipy/neurospin $ nosetests --with-coverage
> --cover-package=nipy.neurospin
> .........................................................................................F...........................................................................................F...............FF..............
> ======================================================================
> FAIL: testvarious
> (nipy.neurospin.clustering.tests.test_clustering.TestTypeProof)
> ----------------------------------------------------------------------
> Traceback (most recent call last):
> File
> "/home/varoquau/dev/nipy-neurospin/nipy/neurospin/clustering/tests/test_clustering.py",
> line 190, in testvarious
> self.assert_(sys.getrefcount(X) == 2)
> AssertionError
>
>
For that one, I am not sure I understand: depending on the case, one
finds 2 or 3 references...

Bertrand
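
One reason the count can vary, as a minimal sketch: sys.getrefcount
includes the temporary reference held by its own argument, and any extra
binding (a name in the test runner, a cached traceback, ...) bumps the
number further.

    import sys

    x = []
    print sys.getrefcount(x)   # 2: the name 'x' plus the call argument

    y = x
    print sys.getrefcount(x)   # 3: the extra binding 'y' adds one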

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Wed, Jul 15, 2009 at 05:51:09PM -0000, Bertrand Thirion wrote:
> > line 190, in testvarious
> > self.assert_(sys.getrefcount(X) == 2)
> > AssertionError

> For that one, I am not sure I understand: depending on the case, one
> finds 2 or 3 references...

Let's ignore this failure for now. I want to focus on other things before
I leave Berkeley. We can merge despite this problem, I believe.

Just a heads up on this merge: I would like to get the data-refactor
branch in first, pull from it in the neurospin branch, rework the data
access in the neurospin branch, and then merge the neurospin branch.
Hopefully this should all be done before the end of the week.

How does that sound?

Gaël

Revision history for this message
Bertrand Thirion (bertrand-thirion) wrote :

> Let's ignore this failure for now. I want to focus on other things before
> I leave Berkeley. We can merge despite this problem, I believe.
>
> Just a heads up on this merge: I would like to get the data-refactor
> branch in first, pull from it in the neurospin branch, rework the data
> access in the neurospin branch, and then merge the neurospin branch.
> Hopefully this should all be done before the end of the week.
>
> How does that sound?
>
OK.

Bertrand

Revision history for this message
Matthew Brett (matthew-brett) wrote :

Hi,

> Just a heads up on this merge: I would like to get the data-refactor
> branch in first, pull from it in the neurospin branch, rework the data
> access in the neurospin branch, and then merge the neurospin branch.
> Hopefully this should all be done before the end of the week.
>
> How does that sound?

Sounds good to me too.

Oh, sorry I mean

+1

luv,

Matthew

Revision history for this message
Alexis Roche (alexis-roche) wrote :

> Oh, sorry I mean
>
> +1

This is funny.

I mean +1 too.

Alexis

Revision history for this message
Alexis Roche (alexis-roche) wrote :

The doc of trunk-neurospin now seems to build fine... even on Windows XP!

I had to fix a number of issues in Bertrand's docstrings, though. The
system seems fairly strict: it does not like colons in section titles,
nor does it like dashes that do not match title lengths... Bertrand,
you should check that your docstrings are formatted the way you want
them to be. A suggested approach would be to build the doc once in a
while :-)

Cheers,

Alexis
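
As a minimal sketch of what that machinery expects in a docstring
(function and field names hypothetical): section titles carry no colons,
and the dashed underline matches the title length exactly.

    def fit(data):
        """Fit the model to the data.

        Parameters
        ----------
        data : array
            The input dataset.

        Returns
        -------
        params : array
            The fitted parameters.
        """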

Revision history for this message
Christopher Burns (cburns) wrote :

For editing ReST in my docstrings I enable rst-mode in Emacs. Then
for section headings (titles) you enter the heading, hit
Ctrl-= (Control Key + Equal Key), and it fills in the dashes on the
line below for you. Repeatedly hit Ctrl-= and it cycles through the
different 'dashes' for different sized headings. This is handy also
for when you change the heading later: you just Ctrl-= and it resizes
the dashes. Way easier than trying to manually adjust them.

I added this to my ~/.emacs file to enable rst-mode on all my text files.

;; rst-mode
(require 'rst)
(setq auto-mode-alist
      (append '(("\\.rst$" . rst-mode)
                ("\\.rest$" . rst-mode)
  ("\\.txt$" . rst-mode)) auto-mode-alist))

Then have a key binding to switch it on:

(global-set-key [(f11)] 'rst-mode)

I'm sure Vi has something similar.
Chris

On Thu, Jul 16, 2009 at 4:36 PM, Alexis Roche<email address hidden> wrote:
> nor does it like dashes that do not match title lengths...

1763. By Matthew Brett

Add Gael's patch to add inherited methods to API docs

Revision history for this message
Bertrand Thirion (bertrand-thirion) wrote :

Thanks for this.

B.

Christopher Burns wrote:
> For editing ReST in my docstrings I enable ReST-mode in emacs. Then
> for section headings (titles) you enter in the heading, and hit
> Ctrl-=, (Control Key + Equal Key) and it fills in the dashes on the
> line below for you. Repeatedly hit Ctrl-= and it cycles through the
> different 'dashes' for different sized headings. This is handy also
> for when you change the heading later, you just Ctrl-= and it resizes
> the dashes. Way easier than trying to manually adjust them.
>
> I added this to my ~/.emacs file to enable rst-mode on all my text files.
>
> ;; rst-mode
> (require 'rst)
> (setq auto-mode-alist
>       (append '(("\\.rst$" . rst-mode)
>                 ("\\.rest$" . rst-mode)
>                 ("\\.txt$" . rst-mode)) auto-mode-alist))
>
> Then have a key binding to switch it on:
>
> (global-set-key [(f11)] 'rst-mode)
>
> I'm sure Vi has something similar.
> Chris
>
> On Thu, Jul 16, 2009 at 4:36 PM, Alexis Roche<email address hidden> wrote:
>
>> nor does it like dashes that do not match title lengths...
>>

Revision history for this message
Alexis Roche (alexis-roche) wrote :

Thanks Chris, it's useful.

Alexis

On Fri, Jul 17, 2009 at 4:30 AM, Christopher Burns<email address hidden> wrote:
> For editing ReST in my docstrings I enable ReST-mode in emacs.  Then
> for section headings (titles) you enter in the heading, and hit
> Ctrl-=, (Control Key + Equal Key) and it fills in the dashes on the
> line below for you.  Repeatedly hit Ctrl-= and it cycles through the
> different 'dashes' for different sized headings. This is handy also
> for when you change the heading later, you just Ctrl-= and it resizes
> the dashes.  Way easier than trying to manually adjust them.
>
> I added this to my ~/.emacs file to enable rst-mode on all my text files.
>
> ;; rst-mode
> (require 'rst)
> (setq auto-mode-alist
>      (append '(("\\.rst$" . rst-mode)
>                ("\\.rest$" . rst-mode)
>                ("\\.txt$" . rst-mode)) auto-mode-alist))
>
> Then have a key binding to switch it on:
>
> (global-set-key [(f11)] 'rst-mode)
>
> I'm sure Vi has something similar.
> Chris
>
> On Thu, Jul 16, 2009 at 4:36 PM, Alexis Roche<email address hidden> wrote:
>> nor does it like dashes that do not match title lengths...
> --
> https://code.launchpad.net/~nipy-developers/nipy/trunk-neurospin/+merge/8472
> Your team nipy-developers is subscribed to branch lp:~nipy-developers/nipy/trunk-neurospin.
>

--
Alexis Roche, PhD
Researcher, CEA, Neurospin, Paris, France
Academic Guest, ETHZ, Computer Vision Lab, Zurich, Switzerland
http://alexis.roche.googlepages.com

1764. By Christopher Burns

Update copyright date in README.

1765. By Christopher Burns

Add Readme for nipy tools.

1766. By Christopher Burns

Cleanup documentation for perlpie script

1767. By Christopher Burns

Add dry-run option to perlpie

1768. By Matthew Brett

Merge of data refactor by Gael, Fernando, Matthew

1769. By Matthew Brett

Removed merge cruft

1770. By Matthew Brett

Red logging for data warnings

1771. By Matthew Brett

Update data_files doc to reflect behavior of code

1772. By Matthew Brett

Merged datarefactor branch

1773. By Matthew Brett <mb312@angela>

Updated install instructions, especially for windows

1774. By Alexis Roche

Temporary fix for bug 409269

1775. By Alexis Roche

Fix to bug fix.

Revision history for this message
Matthew Brett (matthew-brett) wrote :

This is just to ask - where are we with this proposed merge? Is there any work that we can help with?

review: Abstain
Revision history for this message
Alexis Roche (alexis-roche) wrote :

I am wondering the same thing. Gaël, can you update us on the merge?

Best,

Alexis

1776. By Matthew Brett <mb312@angela>

Fix remaining data import error; example of templates data testing

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Tue, Aug 11, 2009 at 12:21:08AM -0000, Alexis Roche wrote:
> I am wondering the same thing. Gaël, can you update us on the merge?

We should update the downloading code to use the data framework that
Matthew has set up recently. I have been wanting to do this, but haven't
found time.

That's the only thing I see. Last time I looked, all the tests in our
code were passing (which is really good work, by the way).

Gaël

Revision history for this message
Alexis Roche (alexis-roche) wrote :

Hi Gaël,

> We should update the downloading code to use the data framework that
> Matthew has set up recently. I have been wanting to do this, but haven't
> found time.

Does that entail updating the nipy-data tarball to include additional
files? I am interested in adding some more data files for my example
scripts to be able to run outside neurospin.

> That's the only thing I see. Last time I looked, all the tests in our
> code were passing (which is really good work, by the way).

You mean "which is really professional work", don't you? :-)

Cheers,

Alexis

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Thu, Aug 13, 2009 at 11:48:08PM -0000, Alexis Roche wrote:
> > We should update the downloading code to use the data framework that
> > Matthew has set up recently. I have been wanting to do this, but haven't
> > found time.

> Does that entail updating the nipy-data tarball to include additional
> files? I am interested in adding some more data files for my example
> scripts to be able to run outside neurospin.

Possibly; and if not, posting these files in a download location where
everybody can access them, and using the right download mechanism to get
them. The goal is exactly that: example scripts should run outside of neurospin.

> > That's the only thing I see. Last time I looked, all the tests in our
> > code were passing (which is really good work, by the way).

> You mean "which is really professional work", don't you? :-)

Correct :).

Gaël

Revision history for this message
Matthew Brett (matthew-brett) wrote :

>> Does that entail updating the nipy-data tarball to include additional
>> files? I am interested in adding some more data files for my example
>> scripts to be able to run outside neurospin.
>
> Possibly, and if not, posting these files on a download location where
> everybody can access them, and using the right download mechanism to get
> them. The goal is exactly that example scripts run outside of neurospin.

You should, I hope, have got a Dropbox invite - if you put files up
there for now, I can put them into the package and upload it.
Alternatively, the design of the package structure was meant to allow
you to:

download the current package
unpack
add some images
bump the version number
repackage
upload
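
A rough sketch of those steps with Python's tarfile module (package and
version names hypothetical):

    import tarfile

    # download the current package, then unpack it
    pkg = tarfile.open('nipy-data-0.1.tar.gz', 'r:gz')
    pkg.extractall('.')
    pkg.close()

    # ... add the new images, bump the version number, rename the
    # unpacked directory to nipy-data-0.2/ ...

    # repackage, ready for upload
    out = tarfile.open('nipy-data-0.2.tar.gz', 'w:gz')
    out.add('nipy-data-0.2')
    out.close()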

We're going to change hosting soon, so let's keep it 'lightweight' for
now (I mean I'll do it by hand, if you send me the files).

>> You mean "which is really professional work", don't you? :-)
>
> Correct :).

Ah the hum of happy enterprise... ;)

Matthew

Revision history for this message
Alexis Roche (alexis-roche) wrote :

> You should I hope have got a Dropbox invite - if you put files up
> there for now, I can put them into the package and upload it.

OK, I just added two MR-T1 image files for the affine registration
example script, and one NPZ file for the onesample fMRI group analysis
script.

I have other examples based on the FIAC dataset. My understanding is
that there is a plan to include FIAC, or at least part of it, in
nipy-data. Is that correct?

Cheers,

Alexis

Revision history for this message
Matthew Brett (matthew-brett) wrote :

On Fri, Aug 14, 2009 at 7:36 AM, Alexis Roche<email address hidden> wrote:
>> You should I hope have got a Dropbox invite - if you put files up
>> there for now, I can put them into the package and upload it.
>
> OK, I just added two MR-T1 image files for the affine registration
> example script, and one NPZ file for the onesample fMRI group analysis
> script.

Thanks - I see them.

> I have other examples based on the FIAC dataset. My understanding is
> that there is a plan to include FIAC, or at least part of it, in
> nipy-data. Is that correct?

No - not really. We were going to post FIAC separately as a huge
tarball somewhere, but with an explicit: 'please download
http://some.place.org/fiac/fiac_huge_tarball.bz2 and unpack' for any
scripts that needed it. We thought it was just too large for
distribution (Debian, Windows etc) packaging, which was the intention
of the current structure.

We wanted to keep the current packages from getting very large too -
certainly less than 500M, I hope less than 50.

No problem to include example FIAC files in the packages if they are useful...

see you,

Matthew

Revision history for this message
Alexis Roche (alexis-roche) wrote :

> No problem to include example FIAC files in the packages if they are useful...

I just updated my Dropbox so as to include all the necessary files for
my example scripts to run. Even though I tried to keep storage as low
as possible, I am adding a total of around 78MB to the example_data
package. Please let me know if you think it's too much.

Best,

Alexis

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Mon, Aug 17, 2009 at 02:36:10PM -0000, Alexis Roche wrote:
> > No problem to include example FIAC files in the packages if they are useful...

> I just updated my Dropbox so as to include all the necessary files for
> my example scripts to run. Even though I tried to keep storage as low
> as possible, I am adding a total of around 78MB to the example_data
> package. Please let me know if you think it's too much.

I haven't had time to look at that, but is there a possibility to reuse
these files in other examples? I think that's what we need to be careful
about: having a rather small set of examples that we can use in many
places.

Gaël

Revision history for this message
Alexis Roche (alexis-roche) wrote :

> I haven't had time to look at that, but is their a possibility to reuse
> these files in other examples?

Those are fairly standard types of neuroimaging data, so I'm sure they
can be reused. Specifically, I included:
- two T1 images from different subjects
- two fMRI sessions of a given FIAC subject, with the corresponding
design matrices (in NPZ format)
- one second-level fMRI group dataset taken from another protocol

Those files are either in NIFTI or in NPZ format, so they can be read
without additional packages.

Cheers,

Alexis

Revision history for this message
Matthew Brett (matthew-brett) wrote :

On Mon, Aug 17, 2009 at 7:36 AM, Alexis Roche<email address hidden> wrote:
>> No problem to include example FIAC files in the packages if they are useful...
>
> I just updated my Dropbox so as to include all the necessary files for
> my example scripts to run. Even though I tried to keep storage as low
> as possible, I am adding a total of around 78MB to the example_data
> package. Please let me know if you think it's too much.

I think that's OK - let's just keep an eye on it.

I've put a new data package up here:

https://cirl.berkeley.edu/mb312/nipy-data/

You'll see a Makefile in the dropbox folder now.

I've made a package validator, now here:

It would be rather easy to set y'all up with the ability to modify the
packages directly if you like - let me know.

Otherwise, it's a trivial build step for me, happy to do it.

Revision history for this message
Matthew Brett (matthew-brett) wrote :

Sorry - here:

http://bazaar.launchpad.net/~nipy-developers/nipy/nipy-datarefactor/annotate/head%3A/tools/validata_data_pkg.py

1777. By JB <jb@jb-laptop>

Tests were not copied because install_data was imported from distutils instead of numpy.distutils

Revision history for this message
Alexis Roche (alexis-roche) wrote :

Hi,

Where are we now regarding the merge? Gaël, do you need help on specific tasks?

Best,

Alexis

On Tue, Aug 18, 2009 at 5:18 AM, Matthew Brett<email address hidden> wrote:
>> I've made a package validator, now here:
>
> Sorry - here:
>
> http://bazaar.launchpad.net/~nipy-developers/nipy/nipy-datarefactor/annotate/head%3A/tools/validata_data_pkg.py
> --
> https://code.launchpad.net/~nipy-developers/nipy/trunk-neurospin/+merge/8472
> Your team nipy-developers is subscribed to branch lp:~nipy-developers/nipy/trunk-neurospin.
>

--
Alexis Roche, PhD
Researcher, CEA, Neurospin, Paris, France
Academic Guest, ETHZ, Computer Vision Lab, Zurich, Switzerland
http://alexis.roche.googlepages.com

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Mon, Aug 31, 2009 at 07:36:10AM -0000, Alexis Roche wrote:
> Where are we now regarding the merge? Gaël, do you need help on
> specific tasks?

I was just thinking this weekend that I don't have time to polish the
things I wanted to polish. So I think that either you have time to move
the data downloading and examples to use the new API, or we should merge
anyhow and worry about this later.

Tell me what you want to do. If we decide to merge, I'll merge trunk into
trunk-neurospin, and Matthew can merge trunk-neurospin into trunk.

In any case, I'll just run the tests on 32 bits and 64 bits today to
check all is fine.

Sorry for taking that long. I am convinced that we should merge more
often, and I don't even respect my own convictions :).

Gaël

Revision history for this message
Alexis Roche (alexis-roche) wrote :

Hi Gaël,

> So I think that either you have time to move
> the data downloading and examples to use the new API, or we should merge
> anyhow and worry about this later.

I already did part of the job, but some examples require files that
Bertrand (I suppose) put together on
ftp://ftp.cea.fr/pub/dsv/madic/download/nipy/

It should be up to Bertrand to decide on the future of that small
database. The right approach would probably be to push it into
nipy-data, but I won't do it myself without his consent. Either way, I
don't see this as an urgent matter. This can be done after the merge.

> Tell me what you want to do?

I guess I support the quick merge option.

> Sorry for taking that long. I am convinced that we should merge more
> often, and I don't even respect my own convictions :).

Please do not apologize. trunk-neurospin turns out to be a fairly big
chunk and we should maybe consider splitting it into several
"lightweight" branches in order to ease merging in the future. I
believe it makes sense to have a subpackage called "neurospin", but
having a single branch for all the projects going on in neurospin
might not be ideal.

Cheers,

Alexis

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Mon, Aug 31, 2009 at 09:51:57AM +0200, Gael Varoquaux wrote:
> In any case, I'll just run the tests on 32 bits and 64 bits today to
> check all is fine.

OK, I have 6 test failures in the neurospin subpackage that look like
they are fairly easy to correct and non-specific to 64 bit. I believe
that you know this code well. Could you have a look, please? The test log
is attached.

Thanks,

Gaël

..................................................................................................................................................................................................................EEEEEE...................................
======================================================================
ERROR: test_iconic_matcher.test_clamping_uint8
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/volatile/varoquau/usr/lib/python2.5/site-packages/nose/case.py",
line 182, in runTest
    self.test(*self.arg)
  File
"/volatile/varoquau/dev/nipy-neurospin/nipy/neurospin/register/tests/test_iconic_matcher.py",
line 46, in test_clamping_uint8
    _test_clamping(I)
  File
"/volatile/varoquau/dev/nipy-neurospin/nipy/neurospin/register/tests/test_iconic_matcher.py",
line 34, in _test_clamping
    IM = IconicMatcher(I.array, I.array, I.toworld, I.toworld, thI, thI,
bins=clI)
TypeError: __init__() got an unexpected keyword argument 'bins'

======================================================================
ERROR: test_iconic_matcher.test_clamping_uint8_nonstd
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/volatile/varoquau/usr/lib/python2.5/site-packages/nose/case.py",
line 182, in runTest
    self.test(*self.arg)
  File
"/volatile/varoquau/dev/nipy-neurospin/nipy/neurospin/register/tests/test_iconic_matcher.py",
line 50, in test_clamping_uint8_nonstd
    _test_clamping(I, 10, 165)
  File
"/volatile/varoquau/dev/nipy-neurospin/nipy/neurospin/register/tests/test_iconic_matcher.py",
line 34, in _test_clamping
    IM = IconicMatcher(I.array, I.array, I.toworld, I.toworld, thI, thI,
bins=clI)
TypeError: __init__() got an unexpected keyword argument 'bins'

======================================================================
ERROR: test_iconic_matcher.test_clamping_int16
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/volatile/varoquau/usr/lib/python2.5/site-packages/nose/case.py",
line 182, in runTest
    self.test(*self.arg)
  File
"/volatile/varoquau/dev/nipy-neurospin/nipy/neurospin/register/tests/test_iconic_matcher.py",
line 54, in test_clamping_int16
    _test_clamping(I)
  File
"/volatile/varoquau/dev/nipy-neurospin/nipy/neurospin/register/tests/test_iconic_matcher.py",
line 34, in _test_clamping
    IM = IconicMatcher(I.array, I.array, I.toworld, I.toworld, thI, thI,
bins=clI)
TypeError: __init__() got an unexpected keyword argument 'bins'

======================================================================
ERROR: test_iconic_matcher.test_clamping_int16_nonstd
----------------------------------------------------------------------
Trace...


Revision history for this message
Alexis Roche (alexis-roche) wrote :

Yes, it's trivial. I'm doing it, and testing everything. Should be
done within an hour or so.

On Mon, Aug 31, 2009 at 10:36 AM, Gael
Varoquaux<email address hidden> wrote:
> On Mon, Aug 31, 2009 at 09:51:57AM +0200, Gael Varoquaux wrote:
>> In any case, I'll just run the tests on 32 bits and 64 bits today to
>> check all is fine.
>
> OK, I have 6 test failures in the neurospin subpackage that look like
> they are fairly easy to correct and non-specific to 64 bit. I believe
> that you know this code well. Could you have a look please. The test log
> is attached.
>
> Thanks,
>
> Gaël
>
> ..................................................................................................................................................................................................................EEEEEE...................................
> ======================================================================
> ERROR: test_iconic_matcher.test_clamping_uint8
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>  File "/volatile/varoquau/usr/lib/python2.5/site-packages/nose/case.py",
> line 182, in runTest
>    self.test(*self.arg)
>  File
> "/volatile/varoquau/dev/nipy-neurospin/nipy/neurospin/register/tests/test_iconic_matcher.py",
> line 46, in test_clamping_uint8
>    _test_clamping(I)
>  File
> "/volatile/varoquau/dev/nipy-neurospin/nipy/neurospin/register/tests/test_iconic_matcher.py",
> line 34, in _test_clamping
>    IM = IconicMatcher(I.array, I.array, I.toworld, I.toworld, thI, thI,
> bins=clI)
> TypeError: __init__() got an unexpected keyword argument 'bins'
>
> ======================================================================
> ERROR: test_iconic_matcher.test_clamping_uint8_nonstd
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>  File "/volatile/varoquau/usr/lib/python2.5/site-packages/nose/case.py",
> line 182, in runTest
>    self.test(*self.arg)
>  File
> "/volatile/varoquau/dev/nipy-neurospin/nipy/neurospin/register/tests/test_iconic_matcher.py",
> line 50, in test_clamping_uint8_nonstd
>    _test_clamping(I, 10, 165)
>  File
> "/volatile/varoquau/dev/nipy-neurospin/nipy/neurospin/register/tests/test_iconic_matcher.py",
> line 34, in _test_clamping
>    IM = IconicMatcher(I.array, I.array, I.toworld, I.toworld, thI, thI,
> bins=clI)
> TypeError: __init__() got an unexpected keyword argument 'bins'
>
> ======================================================================
> ERROR: test_iconic_matcher.test_clamping_int16
> ----------------------------------------------------------------------
> Traceback (most recent call last):
>  File "/volatile/varoquau/usr/lib/python2.5/site-packages/nose/case.py",
> line 182, in runTest
>    self.test(*self.arg)
>  File
> "/volatile/varoquau/dev/nipy-neurospin/nipy/neurospin/register/tests/test_iconic_matcher.py",
> line 54, in test_clamping_int16
>    _test_clamping(I)
>  File
> "/volatile/varoquau/dev/nipy-neurospin/nipy/neurospin/register/tests/test_iconic_matcher.py",
> line 34, in _test_clamping
>    IM = IconicMatcher(I.array, I.array, I.toworld, I.toworld, ...


Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Mon, Aug 31, 2009 at 10:39:47AM +0200, Alexis Roche wrote:
> Yes, it's trivial. I'm doing it, and testing everything. Should be
> done within an hour or so.

OK, once this is done, we are good to merge. I tested on 64 bits; there
is no problem there.

For what it's worth, we have a 54% test coverage of the neurospin
codebase, with the worst offenders in terms of coverage being:

    nipy.neurospin.utils.mask (I need to take care of that)
    nipy.neurospin.statistical_mapping
    nipy.neurospin.spatial_models.structural_bfls
    nipy.neurospin.image_registration
    nipy.neurospin.graph.BPmatch
    nipy.neurospin.eda.dimension_reduction
    nipy.neurospin.viz (I need to completely rewrite this)

Gaël

Revision history for this message
Alexis Roche (alexis-roche) wrote :

I just committed a revision (1872) with fixes for various issues,
including the testing errors you spotted.

Running nipy.neurospin.test() on my MinGW system yields a fantastic
zero errors, zero failures.

Ran 251 tests in 401.532s

OK

<nose.result.TextTestResult run=251 errors=0 failures=0>

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Mon, Aug 31, 2009 at 10:21:09AM -0000, Alexis Roche wrote:
> I just committed a revision (1872) with fixes for various issues,
> including the testing errors you spotted.

> Running nipy.neurospin.test() on my MinGW system yields a fantastic
> zero errors, zero failures.

> Ran 251 tests in 401.532s

Same thing here.

Matthew, you can merge revision 1872 when you're ready.

Gaël

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Mon, Aug 31, 2009 at 01:11:16PM +0200, Gael Varoquaux wrote:
> On Mon, Aug 31, 2009 at 10:21:09AM -0000, Alexis Roche wrote:
> > I just committed a revision (1872) with fixes for various issues,
> > including the testing errors you spotted.

> > Running nipy.neurospin.test() on my minGW system yields a fantastic
> > zero errors, zero failures.

> > Ran 251 tests in 401.532s

> Same thing here.

> Matthew, you can merge revision 1872 when you're ready.

Actually, make that 1869[*], Bertrand has fixed a few examples, and I
just realized that I had unpushed fixes on my laptop.

We've just had history folding; someone will have to tell me how one can
avoid that with long-running branches that are continuously synced.

Gaël

Revision history for this message
Fernando Perez (fdo.perez) wrote :

Hey,

On Tue, Sep 1, 2009 at 12:27 AM, Gael
Varoquaux<email address hidden> wrote:
>
> We've just had history folding, someone will have to tell me how one can
> avoid that with long running branches continuously synced.

As best as I've been able to learn from our similar pains early on
with ipython, the only way to avoid this problem is to use the 'two
branch dance', as described summarily in the ipython guide and in more
detail by Chris in the nipy one:

http://ipython.scipy.org/doc/stable/html/development/overview.html

http://neuroimaging.scipy.org/doc/manual/html/devel/guidelines/bzr_workflow.html

The take-home message is:

"You shall never do pull and merge in the same branch checkout, EVER"

Because of this, the safest solution is to have two branches: one
which you use to track the remote one, and where you only do
push/pull, and one where you work locally and where you do all merging
work.
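
In command form, the dance looks roughly like this (the branch names are
just an example):

    bzr branch lp:nipy trunk     # mirror: only ever pull/push here
    bzr branch trunk trunk-dev   # work branch: commit and merge here
    cd trunk-dev
    # ... hack, 'bzr ci' as you go ...
    cd ../trunk
    bzr pull                     # sync the mirror with launchpad
    bzr merge ../trunk-dev       # fold the work in when it is ready
    bzr ci -m 'Merge trunk-dev'
    bzr push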

I'm interested in learning if there's a better way to do this, of
course, but that's the best of my current knowledge.

Cheers,

f

Revision history for this message
Timmie (timmie) wrote :

Hello,
I am lurking here because I am just monitoring the branches. I was
pointed to it because of the stats part. I think it became a scikit now.
> I'm interested in learning if there's a better way to do this, of
> course, but that's the best of my current knowledge.
Have you tried to ask at the bzr mailing list?

People there are eager to help and may point you to some tricks...

Best,
Timmie

Revision history for this message
Fernando Perez (fdo.perez) wrote :

On Tue, Sep 1, 2009 at 12:15 PM, Tim<email address hidden> wrote:
> Have you tried to ask at the bzr mailing list?
>
> People there are eager to help and may point you to some tricks...

Not really, no. We found a pretty reasonable working solution and I
just moved on. By now I have the pattern fairly cleanly embedded in
my brain, and I use it fine; it doesn't really bother me. I'm already
overstretched with mailing list activity as it is, so for things like
this I'll just settle for the 'works for me' solution :)

Cheers,

f

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Tue, Sep 01, 2009 at 06:51:12PM -0000, Fernando Perez wrote:
> As best as I've been able to learn from our similar pains early on
> with ipython, the only way to avoid this problem is to use the 'two
> branch dance', as described summarily in the ipython guide and in more
> detail by Chris in the nipy one:

> http://ipython.scipy.org/doc/stable/html/development/overview.html

> http://neuroimaging.scipy.org/doc/manual/html/devel/guidelines/bzr_workflow.html

> The take-home message is:

> "You shall never do pull and merge in the same branch checkout, EVER"

> Because of this, the safest solution is to have two branches: one
> which you use to track the remote one, and where you only do
> push/pull, and one where you work locally and where you do all merging
> work.

A while ago I used to think this 'dance' solved the problems. When I was
doing a lot of development on IPython it worked really well for me.
However, I haven't been able to get it working with the development
pattern in nipy. Let me explain the problem.

1) I branch out trunk to trunk-gael

2) I work on trunk-gael, 'bzr ci' as I work

3) A bug fix is committed to trunk

4) I need the bug fix. This is where I break the golden rule, and merge
from trunk to trunk-gael

5) I work on trunk-gael, 'bzr ci' as I work

6) I am ready for merging in trunk, but how do I avoid history folding?

The problem for me lies in 4). And it happens very often. Now I may be
missing something, and I'd love to be enlightened.

Gaël

1778. By Matthew Brett <mb312@angela>

Fixed 0 in time dimension for example functional image; removed remnant of old data framework

Revision history for this message
Fernando Perez (fdo.perez) wrote :

Hey,

On Tue, Sep 1, 2009 at 1:06 PM, Gael
Varoquaux<email address hidden> wrote:
> 1) I branch out trunk to trunk-gael
>
> 2) I work on trunk-gael, 'bzr ci' as I work
>
> 3) A bug fix is committed to trunk
>
> 4) I need the bug fix. This is where I break the golden rule, and merge
> from trunk to trunk-gael
>
> 5) I work on trunk-gael, 'bzr ci' as I work
>
> 6) I am ready for merging in trunk, but how do I avoid history folding?

This should be OK, as long as you do:

cd trunk/
bzr merge ../trunk-gael
bzr push

because at that point, bzr will recognize that the revisions you'd
merged in (4) are already in trunk. If you look graphically at the
history it will look funny, but you shouldn't get any 'dropped
revisions' messages or any other problem.

The issue arises in the following scenario:

cd trunk # the one meant to track launchpad 1-to-1
work; commit; work
# At the same time, someone else pushes something to the same trunk
# now, you try to push:
bzr push
# boom

At this point, the two histories have diverged, and doing a forced
push or a merge is likely to cause problems.

But in your original scenario, I don't really think your #4 is a
breakage of the golden rule, as long as you don't have anyone else who
can push to trunk-gael to cause true history divergence. The problem
is a divergence between local and remote history, and I think that's
not happening in your case.

Cheers,

f

Revision history for this message
Matthew Brett (matthew-brett) wrote :

Hi,

>> Matthew, you can merge revision 1872 when you're ready.
>
> Actually, make that 1869[*], Bertrand has fixed a few examples, and I
> just realized that I had unpushed fixes on my laptop.

Guys - I'm a bit confused

Current revision of trunk-neurospin seems to be 1871

Do you really want me to merge 1869?

Matthew

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Wed, Sep 02, 2009 at 11:24:08PM -0000, Matthew Brett wrote:
> Hi,

> >> Matthew, you can merge revision 1872 when you're ready.

> > Actually, make that 1869[*], Bertrand has fixed a few examples, and I
> > just realized that I had unpushed fixes on my laptop.

> Guys - I'm a bit confused

> Current revision of trunk-neurospin seems to be 1871

I pushed a bug fix yesterday. 1871 is good.

Thanks,

Gaël

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Wed, Sep 02, 2009 at 11:18:08PM -0000, Fernando Perez wrote:
> Hey,

> On Tue, Sep 1, 2009 at 1:06 PM, Gael
> Varoquaux<email address hidden> wrote:
> > 1) I branch out trunk to trunk-gael

> > 2) I work on trunk-gael, 'bzr ci' as I work

> > 3) A bug fix is committed to trunk

> > 4) I need the bug fix. This is where I break the golden rule, and merge
> > from trunk to trunk-gael

> > 5) I work on trunk-gael, 'bzr ci' as I work

> > 6) I am ready for merging in trunk, but how do I avoid history folding?

> This should be OK, as long as you do:

> cd trunk/
> bzr merge ../trunk-gael
> bzr push

Yes, this is what I realised yesterday. 4) should be:

cd trunk/
bzr pull lp:nipy
bzr merge ../trunk-gael
bzr ci -m 'MRG: Sync trunk-gael with trunk'
cd ../trunk-gael
bzr pull ../trunk

It worked on a simple example.

The 'merge in one direction to be able to pull in the other' wasn't clear
to me. It is a directed graph, and the order in which the edges are
created matters!

Gaël

Revision history for this message
Fernando Perez (fdo.perez) wrote :

On Wed, Sep 2, 2009 at 10:21 PM, Gael Varoquaux
<email address hidden> wrote:
> Yes, this is what I realised yesterday. 4) should be:
>
> cd trunk/
> bzr pull lp:nipy
> bzr merge ../trunk-gael
> bzr ci -m 'MRG: Sync trunk-gael with trunk'

!!!
At this point, you should immediately do a push! Since you are using
trunk as a local mirror of lp:nipy (you are 'pulling' from it, not
merging from it), you should then push up so that your local commit is
publicly available to others. Basically your local history should
match the one seen on the public site at lp:nipy.

> cd ../trunk-gael
> bzr pull ../trunk
>
> It worked on a simple example.

But it won't in others. In general, since trunk-gael is NOT a mirror
image of trunk, here you should do

cd ../trunk-gael
bzr merge ../trunk

That will always work. The pull operation is dangerous because for
bzr, push/pull mean: "make these two trees IDENTICAL, down to every
commit and revision number". That's why in this case you should use
merge and not pull, and why in the one above you should push after
your local commit, so your local history and the public one remain
identical.
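
To make the difference concrete (hypothetical layout; the exact error text
is from memory):

    cd trunk-gael
    bzr pull ../trunk    # works only while every commit in trunk-gael is
                         # already in trunk; otherwise bzr refuses with a
                         # "branches have diverged" error
    bzr merge ../trunk   # safe either way: records a merge instead
    bzr ci -m 'MRG: merge trunk'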

Makes sense?

> The 'merge in one direction to be able to pull in the other' wasn't clear
> to me. It is a directed graph, and the order in which the edges are
> created matters!

Yup :)

Cheers,

f

1779. By Jarrod Millman

Adding initial sphinx structure for nipy package

1780. By Jarrod Millman

Removing mistakenly committed directory:wq

1781. By Jarrod Millman

Fixing banner:wq

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Thu, Sep 03, 2009 at 07:18:09PM -0000, Fernando Perez wrote:
> On Wed, Sep 2, 2009 at 10:21 PM, Gael Varoquaux
> <email address hidden> wrote:
> > Yes, this is what I realised yesterday. 4) should be:

> > cd trunk/
> > bzr pull lp:nipy
> > bzr merge ../trunk-gael
> > bzr ci -m 'MRG: Sync trunk-gael with trunk'

> !!!
> At this point, you should immediately do a push! Since you are using
> trunk as a local mirror of lp:nipy (you are 'pulling' from it, not
> merging from it), you should then push up so that your local commit is
> publicly available to others. Basically your local history should
> match the one seen on the public site at lp:nipy.

Well, no. I am not ready to share my changes. They are buggy, and may not
be fully functional. I am going to break other people's code, and get in
trouble if I do.

> Makes sense?

Nope. Your explanation does make sense, but it simply shows me that I
haven't solved my real problem. Or am I wrong?

Gaël

Revision history for this message
joep (josef-pktd) wrote :

On Thu, Sep 3, 2009 at 3:39 PM, Gael
Varoquaux<email address hidden> wrote:
> On Thu, Sep 03, 2009 at 07:18:09PM -0000, Fernando Perez wrote:
>> On Wed, Sep 2, 2009 at 10:21 PM, Gael Varoquaux
>> <email address hidden> wrote:
>> > Yes, this is what I realised yesterday. 4) should be:
>
>> > cd trunk/
>> > bzr pull lp:nipy
>> > bzr merge ../trunk-gael
>> > bzr ci -m 'MRG: Sync trunk-gael with trunk'
>
>> !!!
>> At this point, you should immediately do a push!  Since you are using
>> trunk as a local mirror of lp:nipy (you are 'pulling' from it, not
>> merging from it), you should then push up so that your local commit is
>> publicly available to others.  Basically your local history should
>> match the one seen on the public site at lp:nipy.
>
> Well, no. I am not ready to share my changes. They are buggy, and may not
> be fully functional. I am going to break other people's code, and get in
> trouble if I do.
>
>> Makes sense?
>
> > Nope. Your explanation does make sense, but it simply shows me that I
> haven't solved my real problem. Or am I wrong?

I don't think you are supposed to merge into your local copy of trunk if you are
not ready to push to trunk.
Keep working in trunk-gael and merging trunk into it, instead of the
other direction.

That's my interpretation: keep local copies only as mirrors of the
corresponding remote branch, and merge from other branches and push only
to the corresponding remote, never to others.
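
In other words (with hypothetical paths and branch names):

    ~/src/trunk       <->  lp:nipy         pull/push only; no local commits
    ~/src/trunk-gael  <->  lp:~gael/...    commit and merge here; push only
                                           to your own branch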

Josef

>
> Gaël
> --
> https://code.launchpad.net/~nipy-developers/nipy/trunk-neurospin/+merge/8472
> Your team nipy-developers is subscribed to branch lp:~nipy-developers/nipy/trunk-neurospin.
>

Revision history for this message
Fernando Perez (fdo.perez) wrote :

On Thu, Sep 3, 2009 at 1:24 PM, joep <email address hidden> wrote:
> I don't think you are supposed to merge into your local copy of trunk if you are
> not ready to push to trunk.
> Keep working in trunk-gael and merging trunk into it, instead of the
> other direction.
>
> That's my interpretation: keep local copies only as mirrors of the
> corresponding remote branch, and merge from other branches and push only
> to the corresponding remote, never to others.
>

That's exactly how I see it: if you aren't ready to push, do not touch
your local mirror of public trunk with anything other than

bzr pull

to sync with other's changes. You *only* merge into your local mirror
of lp:nipy when you are ready to make those changes public. Basically
think of your local copy of trunk as if it were an svn-style checkout
of lp:nipy, where *any* commit will be immediately seen by others.
For changes you aren't ready to share yet, just work in your own
branch, pulling *from* the public trunk as much as you need/want to,
and you should be OK.

Does that make sense? That's how I work and so far I've had no problems.

This discussion actually makes me think it might even be useful to
*really* do a

bzr checkout lp:nipy my-local-trunk

instead of a 'bzr branch', so that the local copy of trunk is really,
really a mirror, and I can't even commit into it without pushing.
This would be a way to safely enforce that constraint.
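
For reference, the bound-branch behaviour would look something like this
(from memory, so double-check the details):

    bzr checkout lp:nipy my-local-trunk
    cd my-local-trunk
    bzr ci -m 'fix'    # in a checkout this commits straight to lp:nipy
    bzr unbind         # turn it back into an ordinary branch if needed
    bzr bind lp:nipy   # ...and rebind later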

Mmh, I hadn't thought of this before, but it makes sense now. Am I
missing something?

Cheers,

f

Revision history for this message
joep (josef-pktd) wrote :

On Thu, Sep 3, 2009 at 8:39 PM, Fernando Perez<email address hidden> wrote:
> On Thu, Sep 3, 2009 at 1:24 PM, joep <email address hidden> wrote:
>> I don't think you are supposed to merge into your local copy of trunk if you are
>> not ready to push to trunk.
>> Keep working in trunk-gael and merging trunk into it, instead of the
>> other direction.
>>
>> That's my interpretation: keep local copies only as mirrors of the
>> corresponding remote branch, and merge from other branches and push only
>> to the corresponding remote, never to others.
>>
>
> That's exactly how I see it: if you aren't ready to push, do not touch
> your local mirror of public trunk with anything other than
>
> bzr pull
>
> to sync with other's changes.  You *only* merge into your local mirror
> of lp:nipy when you are ready to make those changes public.  Basically
> think of your local copy of trunk as if it were an svn-style checkout
> of lp:nipy, where *any* commit will be immediately seen by others.
> For changes you aren't ready to share yet, just work in your own
> branch, pulling *from* the public trunk as much as you need/want to,
> and you should be OK.
>
> Does that make sense?  That's how I work and so far I've had no problems.
>
> This discussion actually makes me think it might even be useful to
> *really* do a
>
> bzr checkout lp:nipy my-local-trunk
>
> instead of a 'bzr branch', so that the local copy of trunk is really,
> really a mirror, and I can't even commit into it without pushing.
> This would be a way to safely enforce that constraint.
>
> Mmh, I hadn't thought of this before, but it makes sense now.  Am I
> missing something?

Very dangerous: when I got started with bzr, I managed to screw up
svn.scipy this way. I thought my local branch was isolated, I
force-pushed (not really knowing what I was doing), and it pushed
straight through to svn.scipy (the bzr branch was a checkout of svn
scipy).
Now I prefer to have an extra layer of protection for the remote
repository and to make remote pushes a careful choice; I can always get a
new copy from the remote branch if I did something wrong with my local
trunk copy.

The first rule I learned: if bzr tells me I'm doing something stupid,
don't override it.

Josef

>
> Cheers,
>
> f
> --
> https://code.launchpad.net/~nipy-developers/nipy/trunk-neurospin/+merge/8472
> Your team nipy-developers is subscribed to branch lp:~nipy-developers/nipy/trunk-neurospin.
>

Revision history for this message
Christopher Burns (cburns) wrote :

Actually, you don't even need a local copy of the trunk. All you need
is a local developer branch that you commit to, push to your own
Launchpad account, and merge lp:nipy into in order to keep in sync with
changes in the trunk.

Matthew's diagram explains it well:

http://neuroimaging.scipy.org/site/doc/manual/html/devel/guidelines/bzr_workflow.html#overview

That is, after doing a:
bzr branch lp:nipy ~/src/trunk-dev

On Thu, Sep 3, 2009 at 5:39 PM, Fernando Perez<email address hidden> wrote:
> This discussion actually makes me think it might even be useful to
> *really* do a
>
> bzr checkout lp:nipy my-local-trunk
>
> instead of a 'bzr branch', so that the local copy of trunk is really,
> really a mirror, and I can't even commit into it without pushing.
> This would be a way to safely enforce that constraint.
>
> Mmh, I hadn't thought of this before, but it makes sense now.  Am I
> missing something?

Revision history for this message
Fernando Perez (fdo.perez) wrote :

On Thu, Sep 3, 2009 at 7:00 PM, Christopher Burns <email address hidden> wrote:
> Actually, you don't even need a local copy of the trunk.  All you need
> is a local developer branch that you commit to, push to your local
> launchpad account and merge lp:nipy into to keep in sync with changes
> in the trunk.

No, I meant for pushing back upstream, for the case of a developer who
does eventually make direct commits to the trunk. If you want to push
back up, you do need a local copy of the trunk, where you can apply
changes and then push them back, right? I think Matthew's diagram is
for the general case of developers who are not tasked with making the
actual trunk merge/push operations.

My idea of using a checkout instead of a branch does have the problem
that Josef outlined, of lacking any layer of protection between your
local mistakes and the public server. Since one of the features of a
DVCS is that you tend to be more liberal with committing than with SVN
(where you're always very much on guard that a commit has significant
public consequences), I think Josef's point is a very valid one.

Cheers,

f

Revision history for this message
joep (josef-pktd) wrote :

On Fri, Sep 4, 2009 at 1:27 AM, Fernando Perez<email address hidden> wrote:
> On Thu, Sep 3, 2009 at 7:00 PM, Christopher Burns <email address hidden> wrote:
>> Actually, you don't even need a local copy of the trunk.  All you need
>> is a local developer branch that you commit to, push to your local
>> launchpad account and merge lp:nipy into to keep in sync with changes
>> in the trunk.
>
> No, I meant for pushing back upstream, for the case of a developer who
> does eventually make direct commits to the trunk.  If you want to push
> back up, you do need a local copy of the trunk, where you can apply
> changes and then push them back, right?  I think Matthew's diagram is
> for the general case of developers who are not tasked with making the
> actual trunk merge/push operations.
>

Just to confirm this, here is the hidden part:
http://neuroimaging.scipy.org/site/doc/manual/html/devel/guidelines/bzr_administration.html#bzr-administration

I didn't see this before, even though it's linked from below the other diagram.
It would have been useful to see this when I was trying to figure out
how to handle "statsmodels trunk".

Josef

> My idea of using a checkout instead of a branch does have the problem
> that Josef outlined, of lacking any layer of protection between your
> local  mistakes and the public server.  Since one of the features of a
> DVCS is that you tend to be more liberal with committing than with SVN
> (where you're always  very much on guard that a commit has significant
> public consequences), I think Josef's point is a very valid one.
>
> Cheers,
>
> f
> --
> https://code.launchpad.net/~nipy-developers/nipy/trunk-neurospin/+merge/8472
> Your team nipy-developers is subscribed to branch lp:~nipy-developers/nipy/trunk-neurospin.
>

Revision history for this message
Fernando Perez (fdo.perez) wrote :

On Thu, Sep 3, 2009 at 10:39 PM, joep <email address hidden> wrote:
>> No, I meant for pushing back upstream, for the case of a developer who
>> does eventually make direct commits to the trunk.  If you want to push
>> back up, you do need a local copy of the trunk, where you can apply
>> changes and then push them back, right?  I think Matthew's diagram is
>> for the general case of developers who are not tasked with making the
>> actual trunk merge/push operations.
>>
>
> Just to confirm this, here is the hidden part:
> http://neuroimaging.scipy.org/site/doc/manual/html/devel/guidelines/bzr_administration.html#bzr-administration
>
> I didn't see this before, even though it's linked from below the other diagram.
> It would have been useful to see this when I was trying to figure out
> how to handle "statsmodels trunk"

Yup! I'd forgotten that diagram had also been made; that's indeed the
one that applies directly to Gael's original question. Thanks for
bringing it back up to my attention.

Cheers,

f

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Thu, Sep 03, 2009 at 08:24:09PM -0000, joep wrote:
> > Well, no. I am not ready to share my changes. They are buggy, and may not
> > be fully functional. I am going to break other people's code, and get in
> > trouble if I do.

> >> Makes sense?

> > Nope. Your explanation does make sense, but it simply shows me that I
> > haven't solved my real problem. Or am I wrong?

> I don't think you are supposed to merge into your local copy of trunk
> if you are not ready to push to trunk. Keep working in trunk-gael and
> merging trunk into it, instead of the other direction.

So, sorry for being thick, but I am a bit lost with regards to my use
case:

1) I branch out trunk to trunk-gael

2) I work on trunk-gael, 'bzr ci' as I work

3) A bug fix is committed to trunk

4) I need the bug fix, in trunk-gael

5) I work on trunk-gael, 'bzr ci' as I work

6) I am ready for merging in trunk, but how do I avoid history folding?

The question is how to implement 4), without creating history folding and
without affecting trunk.

From the sentence above, I have the impression that I am supposed to do a
'cd trunk-gael ; bzr merge lp:nipy'. I believe this is what I had been
doing, and it creates history folding.

I am sorry, I am really thick.

Gaël

Revision history for this message
Fernando Perez (fdo.perez) wrote :

Hey,

On Thu, Sep 3, 2009 at 11:18 PM, Gael Varoquaux
<email address hidden> wrote:
> So, sorry for being thick, but I am a bit lost with regards to my use
> case:
>
> 1) I branch out trunk to trunk-gael
>
> 2) I work on trunk-gael, 'bzr ci' as I work
>
> 3) A bug fix is committed to trunk
>
> 4) I need the bug fix, in trunk-gael
>
> 5) I work on trunk-gael, 'bzr ci' as I work
>
> 6) I am ready for merging in trunk, but how do I avoid history folding?
>
> The question is how to implement 4), without creating history folding and
> without affecting trunk.
>
> From the sentence above, I have the impression that I am supposed to do a
> 'cd trunk-gael ; bzr merge lp:nipy'. I believe this is what I had been
> doing, and it creates history folding.

Yes, and it does make the history a little messy with a lot of nested
merge messages, but it does not *drop revisions*. The major danger
with the other scenario (doing a local merge in a branch that's meant
to only track a public one and not committing) is that if you later
try to commit, you'll get a "histories have diverged" message from
bzr. If you then say "what the heck, do as I say, --force", you'll
get the really bad case Josef alluded to, where you seriously do mess up
the upstream repository. We did make that mistake last year early in
the transition to lp in ipython, and we learned as Josef says, "when
bzr tells you not to do something stupid, listen" :)

So I think you are only worrying about the convoluted history and
non-linear revision numbers that will result in *your* branch. For that
one, there's really no fix that I know of, but there is no loss
of information, just a nested log and multi-dot revision numbers.
For the trunk, this pattern keeps a fully clean, linearly increasing
revno, like you can see recently for ipython:

https://code.launchpad.net/~ipython-dev/ipython/trunk

Does this help clarify things?

Cheers,

f

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Fri, Sep 04, 2009 at 06:27:08AM -0000, Fernando Perez wrote:
> Does this help clarify things?

Yes, it does. So it seems that my working pattern was the right one, but it
is just giving results that I am not terribly happy with.

Sorry for the long discussion.

Gaël

Revision history for this message
Fernando Perez (fdo.perez) wrote :

On Thu, Sep 3, 2009 at 11:36 PM, Gael Varoquaux
<email address hidden> wrote:
>> Does this help clarify things?
>
> Yes, it does. So it seems that my working pattern was the right one, but it
> is just giving results that I am not terribly happy with.
>
> Sorry for the long discussion.

No worries, no need to apologize. It took us a while to figure this
out and I was myself very frustrated and somewhat angry last year
after a series of mis-steps of my own in ipython, so I totally
understand. And I have to admit I don't really like the shape this
leaves the history in either, but I don't know of a solution for that
part (I'd love to hear if there is one).

From my initial, but still limited, experiences with git, I get the
impression that it produces a 'flatter' history by default due to how
it handles merging. That is in addition to the fact that it has
extensive rebase/rewrite capabilities. But for bzr use, I've just
accepted the above 'messy' history in local branches as a fact of life
for now. I'll be happy to learn of better approaches if an expert
shows me one...

Best,

f

1782. By Matthew Brett <mb312@angela>

Merge from trunk-neurospin r1871

Revision history for this message
Gael Varoquaux (gael-varoquaux) wrote :

On Fri, Sep 04, 2009 at 07:55:23PM -0000, Matthew Brett wrote:
> The proposal to merge lp:~nipy-developers/nipy/trunk-neurospin into lp:nipy has been updated.

> Status: Needs review => Merged

Thank you Matthew,

Gaël

Revision history for this message
Alexis Roche (alexis-roche) wrote :

Preview Diff

1=== modified file 'examples/neurospin/affine_matching.py'
2--- examples/neurospin/affine_matching.py 2009-03-28 01:56:11 +0000
3+++ examples/neurospin/affine_matching.py 2009-07-09 10:04:36 +0000
4@@ -1,30 +1,30 @@
5 #!/usr/bin/env python
6
7-import fff2
8+"""
9+Example of running inter-subject affine matching on the sulcal2000
10+database acquired at SHFJ, Orsay, France.
11+"""
12+
13+from nipy.neurospin.image_registration import affine_register, affine_resample
14+from nipy.io.imageformats import load as load_image, save as save_image
15
16 from os.path import join
17 import sys
18 import time
19
20-
21-"""
22-Example of running affine matching on the 'sulcal2000' database
23-"""
24-
25-# Dirty hack for me to be able to access data from my XP environment
26+rootpath = '/neurospin/lnao/Panabase/roche/sulcal2000'
27+# Unimportant hack...
28 from os import name
29 if name == 'nt':
30- rootpath = 'D:\\data\\sulcal2000'
31-else:
32- rootpath = '/neurospin/lnao/Panabase/roche/sulcal2000'
33+ rootpath = 'D:\\home\\AR203069\\data\\sulcal2000'
34
35 print('Scanning data directory...')
36 source = sys.argv[1]
37 target = sys.argv[2]
38-similarity = 'correlation ratio'
39+similarity = 'cr'
40 if len(sys.argv)>3:
41 similarity = sys.argv[3]
42-interp = 'partial volume'
43+interp = 'pv'
44 if len(sys.argv)>4:
45 interp = sys.argv[4]
46 normalize = None
47@@ -34,10 +34,7 @@
48 if len(sys.argv)>6:
49 optimizer = sys.argv[6]
50
51-# Change this to use another I/O package
52-iolib = 'pynifti'
53-
54-## Info
55+# Print messages
56 print ('Source brain: %s' % source)
57 print ('Target brain: %s' % target)
58 print ('Similarity measure: %s' % similarity)
59@@ -45,20 +42,32 @@
60
61 # Get data
62 print('Fetching image data...')
63-I = fff2.neuro.image(join(rootpath,'nobias_'+source+'.nii'), iolib=iolib)
64-J = fff2.neuro.image(join(rootpath,'nobias_'+target+'.nii'), iolib=iolib)
65+I = load_image(join(rootpath,'nobias_'+source+'.nii'))
66+J = load_image(join(rootpath,'nobias_'+target+'.nii'))
67
68 # Perform affine normalization
69-T, It = fff2.neuro.affine_registration(I, J, similarity=similarity, interp=interp,
70- normalize=normalize, optimizer=optimizer, resample=True)
71-
72+print('Setting up registration...')
73+tic = time.time()
74+T = affine_register(I, J,
75+ similarity=similarity, interp=interp,
76+ normalize=normalize, optimizer=optimizer)
77+toc = time.time()
78+print(' Registration time: %f sec' % (toc-tic))
79+
80+
81+# Resample source image
82+print('Resampling source image...')
83+tic = time.time()
84+It = affine_resample(I, J, T)
85+toc = time.time()
86+print(' Resampling time: %f sec' % (toc-tic))
87
88 # Save resampled source
89 outfile = source+'_TO_'+target+'.nii'
90 print ('Saving resampled source in: %s' % outfile)
91-It.save(outfile)
92+save_image(It, outfile)
93
94 # Save transformation matrix
95 import numpy as np
96-np.save(outfile, T)
97+np.save(outfile, np.asarray(T))
98
99
100=== modified file 'examples/neurospin/cluster_stats.py'
101--- examples/neurospin/cluster_stats.py 2009-04-14 22:52:04 +0000
102+++ examples/neurospin/cluster_stats.py 2009-04-21 06:18:23 +0000
103@@ -1,19 +1,17 @@
104-import nipy.neurospin as fff2
105-
106-import nipy.neurospin.neuro
107-from nipy.neurospin.neuro.statistical_test import cluster_stats
108 import numpy as np
109
110+from nipy.neurospin import Image
111+from nipy.neurospin.statistical_mapping import cluster_stats
112+
113 dx = 5
114 dy = 5
115 dz = 4
116
117-zimg = fff2.neuro.image(3.*(np.random.rand(dx,dy,dz)-.5))
118-mask = fff2.neuro.image(np.random.randint(2, size=[dx,dy,dz]))
119-
120-null_smax = 10*np.random.rand(1000)
121-
122-
123-clusters, info = cluster_stats(zimg, mask, 0.5, null_smax=null_smax)
124+zimg = Image(3.*(np.random.rand(dx,dy,dz)-.5))
125+mask = Image(np.random.randint(2, size=[dx,dy,dz]))
126+
127+nulls = {'smax': 10*np.random.rand(1000)}
128+
129+clusters, info = cluster_stats(zimg, mask, 0.5, nulls=nulls)
130
131
132
133=== added file 'examples/neurospin/demo_bgmm.py'
134--- examples/neurospin/demo_bgmm.py 1970-01-01 00:00:00 +0000
135+++ examples/neurospin/demo_bgmm.py 2009-07-02 10:56:38 +0000
136@@ -0,0 +1,60 @@
137+"""
138+Example of a demo that fits a Bayesian GMM to a dataset
139+
140+
141+Author : Bertrand Thirion, 2008-2009
142+"""
143+
144+import numpy as np
145+import numpy.random as nr
146+import nipy.neurospin.clustering.bgmm as bgmm
147+from nipy.neurospin.clustering.gmm import plot2D
148+
149+
150+dim = 2
151+# 1. generate a 3-components mixture
152+x1 = nr.randn(100,dim)
153+x2 = 3+2*nr.randn(50,dim)
154+x3 = np.repeat(np.array([-2,2],ndmin=2),30,0)+0.5*nr.randn(30,dim)
155+x = np.concatenate((x1,x2,x3))
156+
157+#2. fit the mixture with a bunch of possible models
158+krange = range(1,10)
159+be = -np.infty
160+for k in krange:
161+ b = bgmm.VBGMM(k,dim)
162+ b.guess_priors(x)
163+ b.initialize(x)
164+ b.estimate(x)
165+ ek = b.evidence(x)
166+ if ek>be:
167+ be = ek
168+ bestb = b
169+
170+ print k,'classes, free energy:',b.evidence(x)
171+
172+# 3, plot the result
173+z = bestb.map_label(x)
174+plot2D(x,bestb,z,show=1,verbose=0)
175+
176+# the same, with the Gibbs GMM algo
177+niter = 1000
178+krange = range(2,5)
179+bbf = -np.infty
180+for k in range(1,4):
181+ b = bgmm.BGMM(k,dim)
182+ b.guess_priors(x)
183+ b.initialize(x)
184+ b.sample(x,100)
185+ w,cent,prec,pz = b.sample(x,niter=niter,mem=1)
186+ bplugin = bgmm.BGMM(k,dim,cent,prec,w)
187+ bplugin.guess_priors(x)
188+ bfk = bplugin.Bfactor(x,pz.astype(np.int),1)
189+ print k, 'classes, evidence:',bfk
190+ if bfk>bbf:
191+ bestk = k
192+ bbf = bfk
193+
194+z = bplugin.map_label(x)
195+plot2D(x,bplugin,z,show=1,verbose=0)
196+
197
198=== modified file 'examples/neurospin/demo_blob_extraction.py'
199--- examples/neurospin/demo_blob_extraction.py 2009-03-28 01:56:11 +0000
200+++ examples/neurospin/demo_blob_extraction.py 2009-06-05 13:39:32 +0000
201@@ -10,9 +10,9 @@
202 import pylab as pl
203 import matplotlib
204
205-import fff2.graph.field as ff
206-import fff2.utils.simul_2d_multisubject_fmri_dataset as simul
207-
208+import nipy.neurospin.graph.field as ff
209+import nipy.neurospin.utils.simul_2d_multisubject_fmri_dataset as simul
210+import nipy.neurospin.spatial_models.hroi as hroi
211
212 dimx=60
213 dimy=60
214@@ -36,20 +36,21 @@
215 # compute the blobs
216 th = 2.36
217 smin = 5
218-nroi = F.generate_blobs(refdim=0, th=th, smin=smin)
219-
220-# compute the average signal within each blob
221-idx = nroi.get_seed()
222-parent = nroi.get_parents()
223-label = nroi.get_label()
224-nroi.make_feature(beta, 'height', 'mean')
225-bfm = nroi.get_ROI_feature('height')
226-
227-# plot the input image
228+nroi = hroi.NROI_from_field(F,None,xyz.T,refdim=0,th=th,smin = smin)
229+
230 bmap = np.zeros(nbvox)
231-if nroi.k>0:
232- bmap[label>-1]= bfm[label[label>-1]]
233-
234+label = -np.ones(nbvox)
235+
236+if nroi!=None:
237+ # compute the average signal within each blob
238+ bfm = nroi.discrete_to_roi_features('activation')
239+
240+ # plot the input image
241+ idx = nroi.discrete_features['masked_index']
242+ for k in range(nroi.k):
243+ bmap[idx[k]] = bfm[k]
244+ label[idx[k]] = k
245+
246 label = np.reshape(label,(dimx,dimy))
247 bmap = np.reshape(bmap,(dimx,dimy))
248
249@@ -59,14 +60,14 @@
250 (aux1, 0.7, 0.7),
251 (aux2, 1.0, 1.0),
252 (1.0, 1.0, 1.0)),
253- 'green': ((0.0, 0.0, 0.7),
254- (aux1, 0.7, 0.0),
255- (aux2, 1.0, 1.0),
256- (1.0, 1.0, 1.0)),
257- 'blue': ((0.0, 0.0, 0.7),
258- (aux1, 0.7, 0.0),
259- (aux2, 0.5, 0.5),
260- (1.0, 1.0, 1.0))}
261+ 'green': ((0.0, 0.0, 0.7),
262+ (aux1, 0.7, 0.0),
263+ (aux2, 1.0, 1.0),
264+ (1.0, 1.0, 1.0)),
265+ 'blue': ((0.0, 0.0, 0.7),
266+ (aux1, 0.7, 0.0),
267+ (aux2, 0.5, 0.5),
268+ (1.0, 1.0, 1.0))}
269 my_cmap = matplotlib.colors.LinearSegmentedColormap('my_colormap',cdict,256)
270
271 pl.figure(figsize=(12, 3))
272@@ -74,7 +75,8 @@
273 pl.imshow(np.squeeze(x), interpolation='nearest', cmap=my_cmap)
274 cb = pl.colorbar()
275 for t in cb.ax.get_yticklabels():
276- t.set_fontsize(16)
277+ t.set_fontsize(16)
278+
279 pl.axis('off')
280 pl.title('Thresholded data')
281
282@@ -87,15 +89,15 @@
283 # plot the blob-averaged signal image
284 aux = 0.01#(th-bmap.min())/(bmap.max()-bmap.min())
285 cdict = {'red': ((0.0, 0.0, 0.7), (aux, 0.7, 0.7), (1.0, 1.0, 1.0)),
286- 'green': ((0.0, 0.0, 0.7), (aux, 0.7, 0.0), (1.0, 1.0, 1.0)),
287- 'blue': ((0.0, 0.0, 0.7), (aux, 0.7, 0.0), (1.0, 0.5, 1.0))}
288+ 'green': ((0.0, 0.0, 0.7), (aux, 0.7, 0.0), (1.0, 1.0, 1.0)),
289+ 'blue': ((0.0, 0.0, 0.7), (aux, 0.7, 0.0), (1.0, 0.5, 1.0))}
290 my_cmap = matplotlib.colors.LinearSegmentedColormap('my_colormap',cdict, 256)
291
292 pl.subplot(1, 3, 3)
293 pl.imshow(bmap, interpolation='nearest', cmap=my_cmap)
294 cb = pl.colorbar()
295 for t in cb.ax.get_yticklabels():
296- t.set_fontsize(16)
297+ t.set_fontsize(16)
298 pl.axis('off')
299 pl.title('Blob average')
300 pl.show()
301
302=== added file 'examples/neurospin/demo_blob_from_nifti.py'
303--- examples/neurospin/demo_blob_from_nifti.py 1970-01-01 00:00:00 +0000
304+++ examples/neurospin/demo_blob_from_nifti.py 2009-06-30 17:16:53 +0000
305@@ -0,0 +1,37 @@
306+"""
307+This scipt generates a noisy activation image image
308+and extracts the blob from it.
309+
310+Author : Bertrand Thirion, 2009
311+"""
312+#autoindent
313+
314+import numpy as np
315+import nifti
316+import nipy.neurospin.graph.field as ff
317+import nipy.neurospin.spatial_models.hroi as hroi
318+
319+inputImage = "/volatile/thirion/python/spmT_0024.img"
320+nim = nifti.NiftiImage(inputImage)
321+header = nim.header
322+data = nim.asarray().T
323+xyz = np.array(np.where(data)).T
324+F = ff.Field(xyz.shape[0])
325+F.from_3d_grid(xyz)
326+F.set_field(data[data!=0])
327+
328+#idx,height,parents,label = F.threshold_bifurcations(0,3.0)
329+
330+label = -np.ones(F.V)
331+nroi = hroi.NROI_from_field(F,header,xyz,0,3.0,smin=5)
332+if nroi!=None:
333+ idx = nroi.discrete_features['masked_index']
334+ for k in range(nroi.k):
335+ label[idx[k]] = k
336+
337+wlabel = -2*np.ones(nim.getVolumeExtent())
338+wlabel[data!=0]=label
339+wim = nifti.NiftiImage(wlabel.T,nim.header)
340+wim.description='blob image extracted from %s'%inputImage
341+wim.save("/tmp/blob.nii")
342+
343
344=== modified file 'examples/neurospin/demo_bsa.py'
345--- examples/neurospin/demo_bsa.py 2009-03-28 01:56:11 +0000
346+++ examples/neurospin/demo_bsa.py 2009-06-05 13:39:32 +0000
347@@ -9,12 +9,12 @@
348 import numpy as np
349 import scipy.stats as st
350 import matplotlib.pylab as mp
351-import fff2.graph.field as ff
352-import fff2.utils.simul_2d_multisubject_fmri_dataset as simul
353-import fff2.spatial_models.bayesian_structural_analysis as bsa
354+import nipy.neurospin.graph.field as ff
355+import nipy.neurospin.utils.simul_2d_multisubject_fmri_dataset as simul
356+import nipy.neurospin.spatial_models.bayesian_structural_analysis as bsa
357
358 def make_bsa_2d(betas, theta=3., dmax=5., ths=0, thq=0.5, smin=0,
359- nbeta=[0]):
360+ nbeta=[0],method='simple',verbose = 0):
361 """ Function for performing bayesian structural analysis on a set of images.
362 """
363 ref_dim = np.shape(betas[0])
364@@ -37,36 +37,53 @@
365 g0 = 1.0/(1.0*nbvox)
366 bdensity = 1
367
368- group_map, AF, BF, labels, likelyhood = \
369- bsa.compute_BSA_simple(Fbeta, lbeta, tal, dmax, thq,
370- smin, ths, theta, g0, bdensity)
371+ if method=='ipmi':
372+ group_map, AF, BF, likelihood = \
373+ bsa.compute_BSA_ipmi(Fbeta, lbeta, tal, dmax,xyz, None, thq,
374+ smin, ths, theta, g0, bdensity)
375+ if method=='simple':
376+ group_map, AF, BF, likelihood = \
377+ bsa.compute_BSA_simple(Fbeta, lbeta, tal, dmax,xyz,
378+ None, thq, smin, ths, theta, g0)
379+ if method=='dev':
380+ group_map, AF, BF, likelihood = \
381+ bsa.compute_BSA_dev(Fbeta, lbeta, tal, dmax,xyz, None, thq,
382+ smin, ths, theta, g0, bdensity)
383+
384+ if method not in['dev','simple','ipmi']:
385+ raise ValueError,'method is not ocrreactly defined'
386+
387+ if verbose==0:
388+ return AF,BF
389+
390+ if AF != None:
391+ lmax = AF.k+2
392+ AF.show()
393
394- labels[labels==-1] = np.size(AF)+2
395-
396 group_map.shape = ref_dim
397 mp.figure()
398- mp.imshow(group_map, interpolation='nearest', vmin=-1, vmax=labels.max())
399+ mp.imshow(group_map, interpolation='nearest', vmin=-1, vmax=lmax)
400 mp.title('Group-level label map')
401 mp.colorbar()
402-
403- likelyhood.shape = ref_dim
404+
405+ likelihood.shape = ref_dim
406 mp.figure()
407- mp.imshow(likelyhood, interpolation='nearest')
408- mp.title('Data likelyhood')
409+ mp.imshow(likelihood, interpolation='nearest')
410+ mp.title('Data likelihood')
411 mp.colorbar()
412
413- sub = np.concatenate([s*np.ones(BF[s].k) for s in range(nbsubj)])
414- qq = 0
415 mp.figure()
416 if nbsubj==10:
417 for s in range(nbsubj):
418 mp.subplot(2, 5, s+1)
419- lw = BF[s].label.astype(np.int)
420- us = labels[sub==s]
421- lw[lw>-1]= us[lw[lw>-1]]
422- lw = np.reshape(lw, ref_dim)
423- mp.imshow(lw, interpolation='nearest', vmin=-1, vmax=labels.max())
424- qq = qq + BF[s].get_k()
425+ lw = -np.ones(ref_dim)
426+ nls = BF[s].get_roi_feature('label')
427+ nls[nls==-1] = np.size(AF)+2
428+ for k in range(BF[s].k):
429+ xyzk = BF[s].discrete[k].T
430+ lw[xyzk[1],xyzk[2]] = nls[k]
431+
432+ mp.imshow(lw, interpolation='nearest', vmin=-1, vmax=lmax)
433 mp.axis('off')
434
435 mp.figure()
436@@ -100,12 +117,13 @@
437 # set various parameters
438 theta = float(st.t.isf(0.01, 100))
439 dmax = 5./1.5
440-ths = 0#nbsubj/2-1
441+ths = 1#nbsubj/2
442 thq = 0.9
443 verbose = 1
444 smin = 5
445+method = 'ipmi'
446
447 # run the algo
448-AF, BF = make_bsa_2d(betas, theta, dmax, ths, thq, smin)
449+AF, BF = make_bsa_2d(betas, theta, dmax, ths, thq, smin,method,verbose=verbose)
450 mp.show()
451
452
453=== added file 'examples/neurospin/demo_field.py'
454--- examples/neurospin/demo_field.py 1970-01-01 00:00:00 +0000
455+++ examples/neurospin/demo_field.py 2009-06-30 17:16:53 +0000
456@@ -0,0 +1,36 @@
457+import numpy as np
458+import numpy.random as nr
459+import nipy.neurospin.graph.field as ff
460+
461+
462+dx = 50
463+dy = 50
464+dz = 1
465+nbseeds=10
466+F = ff.Field(dx*dy*dz)
467+xyz = np.reshape(np.indices((dx,dy,dz)),(3,dx*dy*dz)).T.astype(np.int)
468+F.from_3d_grid(xyz,18)
469+#data = 3*nr.randn(dx*dy*dz) + np.sum((xyz-xyz.mean(0))**2,1)
470+#F.set_field(np.reshape(data,(dx*dy*dz,1)))
471+data = nr.randn(dx*dy*dz,1)
472+F.set_weights(F.get_weights()/18)
473+F.set_field(data)
474+F.diffusion(5)
475+data = F.get_field()
476+
477+seeds = np.argsort(nr.rand(F.V))[:nbseeds]
478+seeds, label, J0 = F.geodesic_kmeans(seeds)
479+wlabel, J1 = F.ward(nbseeds)
480+seeds, label, J2 = F.geodesic_kmeans(seeds,label=wlabel.copy(), eps = 1.e-7)
481+
482+print 'inertia values for the 3 algorithms: ',J0,J1,J2
483+
484+import matplotlib.pylab as mp
485+mp.figure()
486+mp.subplot(1,3,1)
487+mp.imshow(np.reshape(data,(dx,dy)),interpolation='nearest' )
488+mp.subplot(1,3,2)
489+mp.imshow(np.reshape(wlabel,(dx,dy)),interpolation='nearest' )
490+mp.subplot(1,3,3)
491+mp.imshow(np.reshape(label,(dx,dy)),interpolation='nearest' )
492+mp.show()
493
494=== added file 'examples/neurospin/demo_gmm.py'
495--- examples/neurospin/demo_gmm.py 1970-01-01 00:00:00 +0000
496+++ examples/neurospin/demo_gmm.py 2009-07-02 10:56:38 +0000
497@@ -0,0 +1,26 @@
498+"""
499+Example of a demo that fits a GMM to a dataset
500+
501+
502+Author : Bertrand Thirion, 2008-2009
503+"""
504+
505+import numpy as np
506+import numpy.random as nr
507+import nipy.neurospin.clustering.gmm as gmm
508+
509+
510+dim = 2
511+# 1. generate a 3-components mixture
512+x1 = nr.randn(100,dim)
513+x2 = 3+2*nr.randn(50,dim)
514+x3 = np.repeat(np.array([-2,2],ndmin=2),30,0)+0.5*nr.randn(30,dim)
515+x = np.concatenate((x1,x2,x3))
516+
517+# 2. fit the mixture with a bunch of possible models
518+krange = range(1,10)
519+lgmm = gmm.best_fitting_GMM(x,krange,prec_type='diag',niter=100,delta = 1.e-4,ninit=1,verbose=0)
520+
521+# 3, plot the result
522+z = lgmm.map_label(x)
523+gmm.plot2D(x,lgmm,z,show = 1,verbose=0)
524
525=== modified file 'examples/neurospin/demo_histo_fit.py'
526--- examples/neurospin/demo_histo_fit.py 2009-03-28 01:56:11 +0000
527+++ examples/neurospin/demo_histo_fit.py 2009-07-02 17:40:14 +0000
528@@ -8,15 +8,20 @@
529
530 This example is based on a (simplistic) simulated image.
531
532+Note : We do not want a 'zscore', which does mean anything
533+(except with the fdr) but probability
534+that each voxel is in the active class
535+
536+
537 """
538-# Author : Bertrand Thirion, 2008-2009
539+# Author : Bertrand Thirion, Gael Varoquaux 2008-2009
540
541 import numpy as np
542 import scipy.stats as st
543 import os.path as op
544-import fff2.spatial_models.bayesian_structural_analysis as bsa
545-import fff2.utils.simul_2d_multisubject_fmri_dataset as simul
546-from fff2.utils.zscore import zscore
547+import nipy.neurospin.utils.simul_2d_multisubject_fmri_dataset as simul
548+from nipy.neurospin.utils.zscore import zscore
549+import nipy.neurospin.utils.emp_null as en
550
551 ################################################################################
552 # simulate the data
553@@ -39,60 +44,58 @@
554
555 ################################################################################
556 # fit Beta's histogram with a Gamma-Gaussian mixture
557-gam_gaus_zscore = zscore(bsa._GGM_priors_(Beta, Beta))
558-gam_gaus_zscore = np.reshape(gam_gaus_zscore, (dimx, dimy, 3))
559+gam_gaus_pp = en.Gamma_Gaussian_fit(Beta, Beta)
560+gam_gaus_pp = np.reshape(gam_gaus_pp, (dimx, dimy, 3))
561
562 pl.figure(fig.number)
563 pl.subplot(3, 3, 4)
564-pl.imshow(gam_gaus_zscore[..., 0], cmap=pl.cm.hot)
565-pl.title('Gamme-Gaussian mixture,\n first component Z-score')
566+pl.imshow(gam_gaus_pp[..., 0], cmap=pl.cm.hot)
567+pl.title('Gamma-Gaussian mixture,\n first component posterior proba.')
568 pl.colorbar()
569 pl.subplot(3, 3, 5)
570-pl.imshow(gam_gaus_zscore[..., 1], cmap=pl.cm.hot)
571-pl.title('Gamme-Gaussian mixture,\n second component Z-score')
572+pl.imshow(gam_gaus_pp[..., 1], cmap=pl.cm.hot)
573+pl.title('Gamma-Gaussian mixture,\n second component posterior proba.')
574 pl.colorbar()
575 pl.subplot(3, 3, 6)
576-pl.imshow(gam_gaus_zscore[..., 2], cmap=pl.cm.hot)
577-pl.title('Gamme-Gaussian mixture,\n third component Z-score')
578+pl.imshow(gam_gaus_pp[..., 2], cmap=pl.cm.hot)
579+pl.title('Gamma-Gaussian mixture,\n third component posterior proba.')
580 pl.colorbar()
581
582 ################################################################################
583 # fit Beta's histogram with a mixture of Gaussians
584 alpha = 0.01
585-theta = float(st.t.isf(0.01, 100))
586-# FIXME: Ugly crasher if the second Beta is not reshaped
587-gaus_mix_zscore = zscore(bsa._GMM_priors_(Beta, Beta.reshape(-1, 1), theta,
588- alpha,
589- prior_strength=100))
590-gaus_mix_zscore = np.reshape(gaus_mix_zscore, (dimx, dimy, 3))
591+gaus_mix_pp = en.three_classes_GMM_fit(Beta, None,
592+ alpha, prior_strength=100)
593+gaus_mix_pp = np.reshape(gaus_mix_pp, (dimx, dimy, 3))
594+
595
596 pl.figure(fig.number)
597 pl.subplot(3, 3, 7)
598-pl.imshow(gaus_mix_zscore[..., 0], cmap=pl.cm.hot)
599-pl.title('Gaussian mixture,\n first component Z-score')
600+pl.imshow(gaus_mix_pp[..., 0], cmap=pl.cm.hot)
601+pl.title('Gaussian mixture,\n first component posterior proba.')
602 pl.colorbar()
603 pl.subplot(3, 3, 8)
604-pl.imshow(gaus_mix_zscore[..., 1], cmap=pl.cm.hot)
605-pl.title('Gaussian mixture,\n second component Z-score')
606+pl.imshow(gaus_mix_pp[..., 1], cmap=pl.cm.hot)
607+pl.title('Gaussian mixture,\n second component posterior proba.')
608 pl.colorbar()
609 pl.subplot(3, 3, 9)
610-pl.imshow(gaus_mix_zscore[..., 2], cmap=pl.cm.hot)
611-pl.title('Gamme-Gaussian mixture,\n third component Z-score')
612+pl.imshow(gaus_mix_pp[..., 2], cmap=pl.cm.hot)
613+pl.title('Gamma-Gaussian mixture,\n third component posterior proba.')
614 pl.colorbar()
615
616 ################################################################################
617 # Fit the null mode of Beta with an empirical normal null
618-import fff2.utils.emp_null as en
619+
620 efdr = en.ENN(Beta)
621-emp_null_zcore = zscore(efdr.fdr(Beta))
622-emp_null_zcore = emp_null_zcore.reshape((dimx, dimy))
623+emp_null_fdr = efdr.fdr(Beta)
624+emp_null_fdr = emp_null_fdr.reshape((dimx, dimy))
625
626 pl.subplot(3, 3, 3)
627-pl.imshow(emp_null_zcore, cmap=pl.cm.hot)
628+pl.imshow(1-emp_null_fdr, cmap=pl.cm.hot)
629 pl.colorbar()
630-pl.title('Empirical normal null\n Z-score')
631+pl.title('Empirical FDR\n ')
632
633-efdr.plot()
634-pl.title('Empirical normal null fit')
635+#efdr.plot()
636+#pl.title('Empirical FDR fit')
637
638 pl.show()
639
640=== modified file 'examples/neurospin/demo_histo_fit_nifti.py'
641--- examples/neurospin/demo_histo_fit_nifti.py 2009-03-28 01:56:11 +0000
642+++ examples/neurospin/demo_histo_fit_nifti.py 2009-07-02 17:40:14 +0000
643@@ -11,9 +11,10 @@
644 import numpy as np
645 import scipy.stats as st
646 import os.path as op
647-import fff2.spatial_models.bayesian_structural_analysis as bsa
648 import nifti
649
650+import nipy.neurospin.utils.emp_null as en
651+
652 nbru = range(1,13)
653
654 nbeta = [29]
655@@ -38,31 +39,25 @@
656 voxsize = nim.getVoxDims()
657
658 # Read the masks and compute the "intersection"
659-mask = np.transpose(nim.asarray())
660+mask = nim.asarray().T
661 xyz = np.array(np.where(mask))
662 nbvox = np.size(xyz,1)
663
664-# read the functional images
665-Beta = []
666+# read the functional image
667 rbeta = nifti.NiftiImage(betas[s][0])
668-beta = np.transpose(rbeta.asarray())
669+beta = rbeta.asarray().T
670 beta = beta[mask>0]
671-Beta.append(beta)
672-Beta = np.transpose(np.array(Beta))
673-
674-# fit Beta's histogram with a Gamma-Gaussian mixture
675-Bfm = np.array([2.5,3.0,3.5,4.0,4.5])
676-Bfp = bsa._GGM_priors_(np.squeeze(Beta),Bfm,verbose=2)
677-
678-# fit Beta's histogram with a mixture of Gaussians
679+
680+# fit beta's histogram with a Gamma-Gaussian mixture
681+bfm = np.array([2.5,3.0,3.5,4.0,4.5])
682+bfp = en.Gamma_Gaussian_fit(np.squeeze(beta),bfm,verbose=2)
683+
684+# fit beta's histogram with a mixture of Gaussians
685 alpha = 0.01
686 prior_strength = 100
687-Bfm = np.reshape(Bfm,(np.size(Bfm),1))
688-Bfq = bsa._GMM_priors_(np.squeeze(Beta),Bfm,theta,alpha,prior_strength,verbose=2)
689+bfq = en.three_classes_GMM_fit(beta, bfm, alpha, prior_strength,verbose=2)
690
691-# fit the null mode of Beta with the robust method
692-import fff2.utils.emp_null as en
693-efdr = en.ENN(Beta)
694+# fit the null mode of beta with the robust method
695+efdr = en.ENN(beta)
696 efdr.learn()
697-#Bfr = efdr.fdr(Bfm)
698 efdr.plot(bar=0)
699
700=== added file 'examples/neurospin/demo_hroi.py'
701--- examples/neurospin/demo_hroi.py 1970-01-01 00:00:00 +0000
702+++ examples/neurospin/demo_hroi.py 2009-04-16 20:53:11 +0000
703@@ -0,0 +1,50 @@
704+"""
705+Example of a script that crates a 'hierarchical roi' structure
706+from the blob model of an image
707+
708+Used mainly for debugging at the moment (befiore unittests are created)
709+
710+This example is based on a (simplistic) simulated image.
711+
712+"""
713+# Author : Bertrand Thirion, 2008-2009
714+
715+import numpy as np
716+import scipy.stats as st
717+import os.path as op
718+import nipy.neurospin.spatial_models.hroi as hroi
719+import nipy.neurospin.utils.simul_2d_multisubject_fmri_dataset as simul
720+import nipy.neurospin.graph.field as ff
721+
722+################################################################################
723+# simulate the data
724+dimx = 60
725+dimy = 60
726+pos = 2*np.array([[6,7],[10,10],[15,10]])
727+ampli = np.array([3,4,4])
728+
729+dataset = simul.make_surrogate_array(nbsubj=1, dimx=dimx, dimy=dimy, pos=pos, ampli=ampli, width=10.0).squeeze()
730+
731+dataset = np.reshape(dataset, (dimx, dimy,1))
732+ref_dim = (dimx,dimy,1)
733+xyz = np.array(np.where(dataset)).T
734+nbvox = np.size(xyz, 0)
735+
736+# create the field strcture that encodes image topology
737+Fbeta = ff.Field(nbvox)
738+Fbeta.from_3d_grid(xyz.astype(np.int), 18)
739+beta = np.reshape(dataset,(nbvox,1))
740+Fbeta.set_field(beta)
741+nroi = hroi.NROI_from_field(Fbeta,None,xyz,th=2.0,smin = 5)
742+if nroi != None:
743+ n1 = nroi.copy()
744+ n2 = nroi.reduce_to_leaves()
745+
746+td = n1.depth_from_leaves()
747+a = np.argmax(td)
748+lv = n1.rooted_subtree(a)
749+u = nroi.cc()
750+u = np.nonzero(u == u[0])[0]
751+err = np.sum((u-lv)**2)
752+nroi.feature_argmax('activation')
753+nroi.discrete_to_roi_features('activation')
754
755=== added file 'examples/neurospin/demo_sbf.py'
756--- examples/neurospin/demo_sbf.py 1970-01-01 00:00:00 +0000
757+++ examples/neurospin/demo_sbf.py 2009-06-05 13:39:32 +0000
758@@ -0,0 +1,99 @@
759+"""
760+This scipt generates a noisy activation image image
761+and applies the bayesian structural analysis on it
762+
763+Author : Bertrand Thirion, 2009
764+"""
765+#autoindent
766+
767+import numpy as np
768+import scipy.stats as st
769+import matplotlib.pylab as mp
770+import nipy.neurospin.graph.field as ff
771+import nipy.neurospin.utils.simul_2d_multisubject_fmri_dataset as simul
772+import nipy.neurospin.spatial_models.structural_bfls as sbf
773+
774+def make_bsa_2d(betas, theta=3., dmax=5., ths=0, pval=0.2):
775+ """ Function for performing bayesian structural analysis on a set of images.
776+ """
777+ ref_dim = np.shape(betas[0])
778+ nbsubj = betas.shape[0]
779+ xyz = np.array(np.where(betas[:1]))
780+ nbvox = np.size(xyz, 1)
781+
782+ # create the field strcture that encodes image topology
783+ Fbeta = ff.Field(nbvox)
784+ Fbeta.from_3d_grid(xyz.astype(np.int).T, 18)
785+
786+ # Get coordinates in mm
787+ xyz = np.transpose(xyz)
788+ tal = xyz.astype(np.float)
789+
790+ # get the functional information
791+ lbeta = np.array([np.ravel(betas[k]) for k in range(nbsubj)]).T
792+
793+ header = None
794+ group_map, AF, BF = sbf.Compute_Amers (Fbeta,lbeta,xyz,header, tal,dmax = dmax, thr=theta, ths = ths,pval=pval)
795+
796+ lmax = AF.k+2
797+ AF.show()
798+
799+ group_map.shape = ref_dim
800+ mp.figure()
801+ mp.imshow(group_map, interpolation='nearest', vmin=-1, vmax=lmax)
802+ mp.title('Group-level label map')
803+ mp.colorbar()
804+
805+ sub = np.concatenate([s*np.ones(BF[s].k) for s in range(nbsubj)])
806+ mp.figure()
807+ if nbsubj==10:
808+ for s in range(nbsubj):
809+ mp.subplot(2, 5, s+1)
810+ lw = -np.ones(ref_dim)
811+ nls = BF[s].get_roi_feature('label')
812+ nls[nls==-1] = np.size(AF)+2
813+ for k in range(BF[s].k):
814+ xyzk = BF[s].discrete[k].T
815+ lw[xyzk[1],xyzk[2]] = nls[k]
816+
817+ mp.imshow(lw, interpolation='nearest', vmin=-1, vmax=lmax)
818+ mp.axis('off')
819+
820+ mp.figure()
821+ if nbsubj==10:
822+ for s in range(nbsubj):
823+ mp.subplot(2,5,s+1)
824+ mp.imshow(betas[s],interpolation='nearest',vmin=betas.min(),vmax=betas.max())
825+ mp.axis('off')
826+
827+ return AF, BF
828+
829+
830+################################################################################
831+# Main script
832+################################################################################
833+
834+# generate the data
835+nbsubj=10
836+
837+dimx=60
838+dimy=60
839+pos = 2*np.array([[ 6, 7],
840+ [10, 10],
841+ [15, 10]])
842+ampli = np.array([5, 7, 6])
843+sjitter = 1.0
844+dataset = simul.make_surrogate_array(nbsubj=nbsubj, dimx=dimx, dimy=dimy,
845+ pos=pos, ampli=ampli, width=5.0)
846+betas = np.reshape(dataset, (nbsubj, dimx, dimy))
847+
848+# set various parameters
849+theta = float(st.t.isf(0.01, 100))
850+dmax = 5./1.5
851+ths = nbsubj/2-1
852+pval = 0.2
853+
854+# run the algo
855+AF, BF = make_bsa_2d(betas, theta, dmax, ths,pval)
856+mp.show()
857+
858
859=== modified file 'examples/neurospin/demo_watershed.py'
860--- examples/neurospin/demo_watershed.py 2009-03-28 01:56:11 +0000
861+++ examples/neurospin/demo_watershed.py 2009-06-05 13:39:32 +0000
862@@ -7,8 +7,8 @@
863 #autoindent
864
865 import numpy as np
866-import fff2.graph.field as ff
867-import fff2.utils.simul_2d_multisubject_fmri_dataset as simul
868+import nipy.neurospin.graph.field as ff
869+import nipy.neurospin.utils.simul_2d_multisubject_fmri_dataset as simul
870 import matplotlib
871 import matplotlib.pylab as mp
872
873
874=== modified file 'examples/neurospin/glm.py'
875--- examples/neurospin/glm.py 2009-03-28 01:56:11 +0000
876+++ examples/neurospin/glm.py 2009-04-21 06:18:23 +0000
877@@ -1,6 +1,8 @@
878-import fff2.neuro
879 import numpy as np
880
881+from nipy.neurospin import Image
882+from nipy.neurospin.statistical_mapping import LinearModel
883+
884 from datamind.core import DF
885
886 fmri_dataset_path = '/neurospin/lnao/Panabase/data_fiac/fiac_fsl/fiac0/fMRI/acquisition/fonc1/afonc1.nii.gz'
887@@ -13,11 +15,11 @@
888
889 # Get fMRI data as numpy array
890 print('loading fmri data...')
891-Y = fff2.neuro.image(fmri_dataset_path)
892+Y = Image(fmri_dataset_path)
893
894 # Get the mask
895 print('loading mask...')
896-##Mask = fff2.neuro.image(mask_image_path)
897+##Mask = Image(mask_image_path)
898 Mask = None
899
900 # GLM options
901@@ -26,7 +28,7 @@
902
903 # Fit
904 print('starting fit...')
905-glm = fff2.neuro.linear_model(Y, X, Mask, model=model)
906+glm = LinearModel(Y, X, Mask, model=model)
907
908 # Compute aribtrary contrast image
909 print('computing test contrast image...')
910
911=== removed file 'examples/neurospin/irina.py'
912--- examples/neurospin/irina.py 2009-03-28 01:56:11 +0000
913+++ examples/neurospin/irina.py 1970-01-01 00:00:00 +0000
914@@ -1,101 +0,0 @@
915-#!/usr/bin/env python
916-
917-import fff2
918-from os.path import join
919-from os import system
920-import sys
921-import time
922-import numpy as np
923-
924-"""
925-Script for image registration using fff on Irina's data.
926-"""
927-
928-subject = '/neurospin/lrmn/database_nmr/sujet11/anatomy/nobias_sujet11.ima'
929-template = '/neurospin/lnao/Panabase/vincent/dataBase_brainvisa/nmr/normalization/template/T1.ima'
930-
931-# Registration params
932-source = subject
933-target = template
934-toresample = 'source'
935-iolib = 'aims'
936-similarity = 'correlation ratio'
937-method = 'powell'
938-search = 'affine 3D'
939-
940-print ('Source brain: %s' % source)
941-print ('Target brain: %s' % target)
942-print ('I/O library: %s' % iolib)
943-print ('Similarity measure: %s' % similarity)
944-
945-
946-# Get data
947-print('Fetching image data...')
948-I = fff2.neuro.image(source, iolib=iolib)
949-J = fff2.neuro.image(target, iolib=iolib)
950-
951-## Info
952-print 'source dimensions: ', I.array.shape
953-print 'source voxel size: ', I.voxsize
954-print 'target dimensions: ', J.array.shape
955-print 'target voxel size: ', J.voxsize
956-
957-# Setup registration algorithm
958-print('Setting up registration...')
959-matcher = fff2.registration.iconic(I, J) ## I: source, J: target
960-matcher.set(subsampling=[4,4,4], similarity=similarity)
961-
962-# Register
963-print('Starting registration...')
964-tic = time.time()
965-##T, t = matcher.optimize(method=method, search='rigid 3D')
966-##T, t = matcher.optimize(method=method, search='similarity 3D', start=t)
967-t = None
968-T, t = matcher.optimize(method=method, search='affine 3D', start=t)
969-toc = time.time()
970-print(' Optimization time: %f sec' % (toc-tic))
971-
972-# Resample image
973-print('Resampling image...')
974-tic = time.time()
975-if toresample=='target':
976- It = fff2.neuro.image(I)
977- It.set_array(matcher.resample(T), toresample='target')
978-else:
979- It = fff2.neuro.image(J)
980- It.set_array(matcher.resample(T))
981-toc = time.time()
982-print(' Resampling time: %f sec' % (toc-tic))
983-
984-# Save resampled source
985-print('Saving resampled image...')
986-outfile = 'toto'
987-print ('Saving resampled source in: %s' % outfile + '.ima')
988-It.save(outfile + '.ima')
989-
990-# Convert to AIMS format and save
991-# DIRTY HACK
992-Tv = matcher.voxel_transform(T) ## canonic voxel coordinate systems
993-Dsrc_inv = np.diag(1/np.diag(matcher.source_transform))
994-Dtgt = np.diag(1/np.diag(matcher.target_transform_inv))
995-Ta = np.dot(np.dot(Dtgt, Tv), Dsrc_inv)
996-
997-f = open(outfile+'.trm', 'w')
998-f.write(Ta[0,3].__str__()+'\t'+ Ta[1,3].__str__()+'\t'+Ta[2,3].__str__()+'\n')
999-f.write(Ta[0,0].__str__()+'\t'+ Ta[0,1].__str__()+'\t'+Ta[0,2].__str__()+'\n')
1000-f.write(Ta[1,0].__str__()+'\t'+ Ta[1,1].__str__()+'\t'+Ta[1,2].__str__()+'\n')
1001-f.write(Ta[2,0].__str__()+'\t'+ Ta[2,1].__str__()+'\t'+Ta[2,2].__str__()+'\n')
1002-f.close()
1003-
1004-#
1005-cmd = 'AimsResample -m '+outfile+'.trm -i '+source+' -o '+outfile+'_aims.ima -r '+target
1006-system(cmd)
1007-
1008-
1009-# Save transfo
1010-##np.savez(outfile, Ta, T, matcher.source_transform, matcher.target_transform_inv)
1011-
1012-
1013-
1014-
1015-
1016
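For the record, the removed script serialized the affine in the AIMS .trm layout: the translation on the first line, then the three rows of the linear part. The hand-rolled f.write block above is equivalent to this numpy sketch (write_trm is a hypothetical helper, Ta a 4x4 homogeneous matrix):

    import numpy as np

    def write_trm(path, Ta):
        # .trm layout: translation first, then the 3x3 linear part row by row
        out = np.vstack((Ta[:3, 3], Ta[:3, :3]))
        np.savetxt(path, out, delimiter='\t')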
1017=== modified file 'examples/neurospin/neurospy/Contrast.py'
1018--- examples/neurospin/neurospy/Contrast.py 2009-03-28 01:56:11 +0000
1019+++ examples/neurospin/neurospy/Contrast.py 2009-07-07 18:03:46 +0000
1020@@ -2,12 +2,12 @@
1021 from configobj import ConfigObj
1022
1023 class Contrast(dict):
1024- def __init__(self, indict=None):
1025+ def __init__(self, indict=None,verbose=0):
1026 dict.__init__(self)
1027 if indict != None:
1028 for entry in indict.keys():
1029 if entry != "Type" and entry != "Dimension":
1030- print indict[entry]
1031+ if verbose: print indict[entry]
1032 self[entry] = array(indict[entry]).astype('f')
1033
1034 def __add__(self, con):
1035@@ -40,48 +40,53 @@
1036 return res
1037
1038 class ContrastList():
1039- def __init__ (self, misc_info_path = None, contrast_path = None, model = "default"):
1040- if misc_info_path != None:
1041- misc= ConfigObj(misc_info_path)
1042- self.dic = {}
1043- base_cond = Contrast()
1044- sessions = []
1045- for reg in misc[model].keys():
1046- if reg[:11] == "regressors_":
1047- base_cond[reg[11:]] = zeros(len(misc[model][reg]))
1048- sessions.append(reg[11:])
1049- print sessions
1050- for sess in sessions:
1051- reg = "regressors_%s" % sess
1052- for i, cond in enumerate(misc[model][reg]):
1053- if not self.dic.has_key(cond):
1054- self.dic[cond] = Contrast(base_cond)
1055- self.dic[cond][sess][i] = 1
1056- effect_cond = Contrast()
1057- ndrift = 0
1058- nderiv = 1
1059- print misc[model]["regressors_%s" % sessions[0]]
1060- for cond in misc[model]["regressors_%s" % sessions[0]]:
1061- if cond[:6] == "(drift":
1062- ndrift += 1
1063- elif cond[-6:] == "_deriv":
1064- nderiv = 2
1065- elif cond.split("_")[-1][0] == "d" and cond.split("_")[-1][1:].isdigit():
1066- nderiv += 1
1067- for sess in sessions:
1068- effect_cond[sess] = zeros(((len(base_cond[sess]) - ndrift) / nderiv, len(base_cond[sess])))
1069- for i in range(0, effect_cond[sess].shape[0]):
1070- effect_cond[sess][i,i * nderiv] = 1
1071- self.dic["effect_of_interest"] = effect_cond
1072- if contrast_path != None:
1073- con = ConfigObj(contrast_path)
1074- for c in con["contrast"]:
1075- self.dic[c] = Contrast(con[c])
1076+ def __init__ (self, misc_info_path=None, contrast_path=None, model="default", verbose=0):
1077+ if misc_info_path == None:
1078+ raise ValueError, "Need a misc_info path"
1079+ misc= ConfigObj(misc_info_path)
1080+ self.dic = {}
1081+ base_cond = Contrast()
1082+ sessions = []
1083+ for reg in misc[model].keys():
1084+ if reg[:11] == "regressors_":
1085+ base_cond[reg[11:]] = zeros(len(misc[model][reg]))
1086+ sessions.append(reg[11:])
1087+
1088+ if verbose: print sessions
1089+ for sess in sessions:
1090+ reg = "regressors_%s" % sess
1091+ for i, cond in enumerate(misc[model][reg]):
1092+ if not self.dic.has_key(cond):
1093+ self.dic[cond] = Contrast(base_cond)
1094+ self.dic[cond][sess][i] = 1
1095+
1096+ effect_cond = Contrast()
1097+ ndrift = 0
1098+ nderiv = 1
1099+ if verbose: print misc[model]["regressors_%s" % sessions[0]]
1100+ for cond in misc[model]["regressors_%s" % sessions[0]]:
1101+ if cond[:6] == "(drift":
1102+ ndrift += 1
1103+ elif cond[-6:] == "_deriv":
1104+ nderiv = 2
1105+ elif cond.split("_")[-1][0] == "d" and cond.split("_")[-1][1:].isdigit():
1106+ nderiv += 1
1107+
1108+ for sess in sessions:
1109+ effect_cond[sess] = zeros(((len(base_cond[sess])-ndrift)/nderiv,
1110+ len(base_cond[sess])))
1111+ for i in range(0, effect_cond[sess].shape[0]):
1112+ effect_cond[sess][i,i * nderiv] = 1
1113+ self.dic["effect_of_interest"] = effect_cond
1114+ if contrast_path != None:
1115+ con = ConfigObj(contrast_path)
1116+ for c in con["contrast"]:
1117+ self.dic[c] = Contrast(con[c])
1118
1119 def get_dictionnary(self):
1120 return self.dic
1121
1122- def save_dic(self, contrast_file):
1123+ def save_dic(self, contrast_file,verbose=0):
1124 contrast = ConfigObj(contrast_file)
1125 contrast["contrast"] = []
1126 for key in self.dic.keys():
1127@@ -114,6 +119,6 @@
1128 for i, row in enumerate(v):
1129 contrast[key]["%s_row%i" % (k, i)] = [int(j) for j in row]
1130 contrast[key]["Dimension"] = dim
1131- print contrast[key]
1132+ if verbose: print contrast[key]
1133 contrast["contrast"].append(key)
1134 contrast.write()
1135
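As the constructor above shows, a Contrast is a dict of per-session float vectors, and the "Type" and "Dimension" entries of an input dict are skipped; a minimal construction (session name hypothetical):

    con = Contrast({"fonc1": [1, -1, 0], "Type": "t", "Dimension": "1"})
    # only the "fonc1" entry is stored, as a float array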
1136=== removed file 'examples/neurospin/neurospy/CorticalDesignMatrix.py'
1137--- examples/neurospin/neurospy/CorticalDesignMatrix.py 2009-04-14 22:46:14 +0000
1138+++ examples/neurospin/neurospy/CorticalDesignMatrix.py 1970-01-01 00:00:00 +0000
1139@@ -1,241 +0,0 @@
1140-__doc__ = """fMRI-specific classes.
1141-
1142-WORK IN PROGRESS: please run (don't import) this file. Example of use in the end.
1143-
1144-"""
1145-
1146-from fff import glm
1147-
1148-import os, urllib, time, string
1149-import numpy as N
1150-import pylab
1151-from configobj import ConfigObj
1152-from soma import aims
1153-from neurospy.bvfunc import tio
1154-
1155-from nipy.modalities.fmri.protocol import ExperimentalFactor
1156-from nipy.modalities.fmri import protocol, hrf
1157-
1158-
1159-def _loadProtocol(x, session, names = None):
1160- """
1161- Read a paradigm file consisting of a list of pairs (occurence time, (duration), event ID)
1162- and instantiate a NiPy ExperimentalFactor object.
1163- """
1164- #paradigm = [i.split()[::-1] for i in open(x) if i != "\n"]
1165- paradigm = pylab.load(x)
1166- if paradigm[paradigm[:,-1] == session].tolist() == []:
1167- return None
1168- paradigm = paradigm[paradigm[:,-1] == session]
1169- if paradigm.shape[1] == 4:
1170- paradigm = paradigm[:,:3]
1171- else:
1172- paradigm[:,2] = 0.5
1173- paradigm[:,2] = paradigm[:,1] + paradigm[:,2]
1174-# if paradigm[-1,1] > 1000:
1175-# paradigm[:,1] /= 1000.0
1176-# paradigm[:,2] /= 1000.0
1177- if names != None:
1178- name_col = [names[int(i)] for i in paradigm[:,0]]
1179- p = protocol.ExperimentalFactor("protocol", zip(name_col, paradigm[:,1].tolist(), paradigm[:,2].tolist()),
1180- delta = False)
1181- else:
1182- p = protocol.ExperimentalFactor("protocol", paradigm[:,:3], delta = False)
1183- p.design_type = "block"
1184- return p
1185-
1186-order = 2
1187-
1188-def _drift(time):
1189- v = N.ones([order+1, time.shape[0]], dtype="f")
1190- tmax = N.abs(time.max()) * 1.0
1191- time = time * 1.0
1192- for i in range(order):
1193- v[i+1] = (time/tmax)**(i+1)
1194- return v
1195-
1196-canonical_drift = protocol.ExperimentalQuantitative('drift', _drift)
1197-
1198-HF = 128
1199-def cosine_matrix(time):
1200- M = time.max()
1201- numreg = int(N.floor(2 * float(M) / float(HF)) + 1)
1202- return N.array([N.sqrt(2.0/M) * N.cos(N.pi*(time.astype(float)/M + 0.5/len(time))*k ) for k in range(numreg)]) * 100.0
1203-
1204-cosine_drift = protocol.ExperimentalQuantitative('drift', cosine_matrix)
1205-
1206-def _fullRank(X, cmax=1e15):
1207- """ X is assumed to be a 2d numpy array. This function possibly adds a scalar matrix to X
1208- to guarantee that the condition number is smaller than a given threshold. """
1209- U, s, V = N.linalg.svd(X,0)
1210- sM = s.max()
1211- sm = s.min()
1212- c = sM/sm
1213- if c < cmax:
1214- return X, c
1215- print 'Warning: matrix is singular at working precision, regularizing...'
1216- lda = (sM-cmax*sm)/(cmax-1)
1217- s = s + lda
1218- X = N.dot(U, N.dot(N.diag(s), V))
1219- return X, cmax
1220-
1221-
1222-class DesignMatrix():
1223- def __init__(self, fmri_filename, protocol_filename, session = 0, misc_file = None):
1224- self.protocol_filename = protocol_filename
1225- self.fmri_filename = fmri_filename
1226- self.session = session
1227- self.misc_file = misc_file
1228-
1229- def load(self):
1230- """
1231- Load data from files and apply mask.
1232- """
1233-
1234- # fMRI data + binary mask
1235- if self.session.isdigit():
1236- self.session = int(self.session)
1237- else:
1238- misc = ConfigObj(self.misc_file)
1239- self.session = misc["sessions"].keys().index(self.session)
1240- self.fmri = aims.read(self.fmri_filename)
1241- self.frametimes = N.arange(self.fmri.size())
1242- self.misc = ConfigObj(self.misc_file)
1243- self.session_name = self.misc["sessions"].keys()[self.session]
1244- self.protocol = _loadProtocol(self.protocol_filename, self.session, self.misc["sessions"][self.session_name])
1245-
1246- def timing(self, tr, t0=0.0, trSlices=None, slice_idx=None):
1247- """
1248- tr : inter-scan repetition time, i.e. the time elapsed between two consecutive scans
1249-
1250-
1251- t0 : time elapsed from the paradigm time origin to the first scan acquisition (different
1252- from zero if the paradigm was not synchronized with the acquisition, or dummy scans have
1253- been removed)
1254-
1255- trSlices : inter-slice repetition time, same concept as tr for slices
1256-
1257- slice_idx : either a string or an array of integers.
1258- When input as an array of integers, slice_idx is the slice acquisition order that
1259- maps each slice number to its corresponding rank (be careful, indexes are counted from
1260- zero instead of one, as it is the standard practice in Python). By convention, slices
1261- are numbered from the bottom to the top of the head. Alternatively, keywords describing
1262- usual sequences can be used:
1263- 'ascending' : equivalent to [0,1,2,...,N-1]
1264- 'descending' : equivalent to [N-1,N-2,...,0]
1265- 'interleaved bottom-up' : equivalent to [0,N/2,1,N/2+1,2,N/2+2,...]
1266- 'interleaved top-down' : reverted interleaved bottom-up
1267-
1268- """
1269- tr = float(tr)
1270- t0 = float(t0)
1271- self.frametimes *= tr
1272- self.frametimes += t0
1273- ## TODO: account for slice timing in case data is not already corrected...
1274-
1275-
1276- def compute_design(self, hrf=hrf.canonical, drift=canonical_drift, name = ""):
1277- """
1278- Use e.g. hrf=hrf.glover_deriv to use HRF derivatives as additional regressors.
1279- self._glm is an ExperimentalFormula with terms 'drift' (ExperimentalQuantitative)
1280- and 'protocol' (ExperimentalFactor), these respective objects being accessible
1281- through the list self._glm.terms or via self._glm['drift'] and similarly
1282- for 'protocol'.
1283- """
1284- if self.protocol == None:
1285- print "The selected session does not exists"
1286- return None
1287- self._glm = self.protocol.convolve(hrf)
1288- misc = ConfigObj(self.misc_file)
1289- ## Force the design matrix to be full rank at working precision
1290- temp = self._glm(time=self.frametimes)
1291- temp = temp.transpose()
1292- self._design, self._design_cond = _fullRank(temp)
1293- drift_ind=[]
1294- proto_ind=[]
1295- proto_name=[]
1296- dproto_ind=[]
1297- dproto_name=[]
1298- for i,n in enumerate(self._glm.names()):
1299- if (n[:6] == "(drift"):
1300- drift_ind.append(i)
1301- elif (n[:19] == "(glover%(protocol=="):
1302- proto_ind.append(i)
1303- proto_name.append(n[19:-2])
1304- elif (n[:20] == "(dglover%(protocol=="):
1305- dproto_ind.append(i)
1306- dproto_name.append("%s_deriv"%n[20:-2])
1307- order1=[proto_name.index(n) for n in misc["sessions"][self.session_name]]
1308- if len(dproto_name) > 0:
1309- order2=[dproto_name.index("%s_deriv" % n) for n in misc["sessions"][self.session_name]]
1310- ind = range(len(proto_ind) + len(dproto_ind))
1311- ind[::2]=N.array(proto_ind)[order1]
1312- ind[1::2]=N.array(dproto_ind)[order2]
1313- else:
1314- ind = proto_ind
1315- new_order = N.concatenate((ind, drift_ind))
1316- self._design = self._design[:, new_order]
1317- names = self._glm.names()
1318- self.names=[]
1319- for n in misc["sessions"][self.session_name]:
1320- self.names.append(n)
1321- if len(dproto_name) > 0:
1322- self.names.append("%s_deriv" % n)
1323- for i in drift_ind:
1324- self.names.append(names[i])
1325- if drift == 0:
1326- drm = N.ones((self._design.shape[0],1))
1327- elif drift == cosine_drift:
1328- drm = cosine_matrix(self.frametimes).T
1329- elif drift == canonical_drift:
1330- drm = _drift(self.frametimes).T
1331- else:
1332- drm = drift
1333- drml = drm.shape[1]
1334- for i in range(drml):
1335- self.names.append('(drift:%i)' % i)
1336- self._design = N.column_stack((self._design, drm))
1337- #self.names = [names[i] for i in new_order]
1338- misc["regressors_%s" % name] = self.names
1339- misc["design matrix cond"] = self._design_cond
1340- misc.write()
1341- """From now on, self.protocol.convolved==True. Don't know whether another call to convolve
1342- results in a double convolution or replaces the first convolution. ???
1343- """
1344-
1345- def compute_fir_design(self, drift=canonical_drift, o=1, l=1, name=""):
1346- if self.protocol == None:
1347- print "The selected session does not exists"
1348- return None
1349- misc = ConfigObj(self.misc_file)
1350- temp = N.zeros((len(self.frametimes), (o * len(self.protocol.events))))
1351- diff = l / o
1352- self.names = []
1353- i = 0
1354- for event in misc["sessions"][self.session_name]:
1355- if self.protocol.events.has_key(event):
1356- for j in range(o):
1357- if j == 0:
1358- self.names.append("%s" % (event))
1359- else:
1360- self.names.append("%s_d%i" % (event, j))
1361- for t in self.protocol.events[event].times:
1362- base = N.argmax(self.frametimes > t)
1363- for k in range(diff):
1364- temp[base + (k + j * diff), j + i * o] = 1
1365- i += 1
1366- self._design, self._design_cond = _fullRank(temp)
1367- if drift == 0:
1368- drm = N.ones((self._design.shape[0],1))
1369- elif drift == cosine_drift:
1370- drm = cosine_matrix(self.frametimes).T
1371- elif drift == canonical_drift:
1372- drm = _drift(self.frametimes).T
1373- else:
1374- drm = drift
1375- drml = drm.shape[1]
1376- for i in range(drml):
1377- self.names.append('(drift:%i)' % i)
1378- self._design = N.column_stack((self._design, drm))
1379- misc["regressors_%s" % name] = self.names
1380- misc.write()
1381
1382=== modified file 'examples/neurospin/neurospy/DesignMatrix.py'
1383--- examples/neurospin/neurospy/DesignMatrix.py 2009-04-14 22:46:14 +0000
1384+++ examples/neurospin/neurospy/DesignMatrix.py 2009-07-07 18:03:46 +0000
1385@@ -4,8 +4,6 @@
1386
1387 """
1388
1389-#from fff2 import glm
1390-
1391 import os, urllib, time, string
1392 import numpy as np
1393 import pylab
1394@@ -14,54 +12,62 @@
1395 from nipy.modalities.fmri.protocol import ExperimentalFactor
1396 from nipy.modalities.fmri import protocol, hrf
1397
1398+order = 2
1399+HF = 128
1400+
1401+def _drift(time):
1402+ """
1403+ Create a drift matrix
1404+ """
1405+ v = np.ones([order+1, time.shape[0]], dtype="f")
1406+ tmax = np.abs(time.max()) * 1.0
1407+ time = time * 1.0
1408+ for i in range(order):
1409+ v[i+1] = (time/tmax)**(i+1)
1410+ return v
1411+
1412+canonical_drift = protocol.ExperimentalQuantitative('drift', _drift)
1413+
1414+def cosine_matrix(time):
1415+ """
1416+    Create a cosine drift matrix
1417+ """
1418+ M = time.max()
1419+ numreg = int(np.floor(2 * float(M) / float(HF)) + 1)
1420+ return np.array([np.sqrt(2.0/M) * np.cos(np.pi*(time.astype(float)/M + 0.5/len(time))*k ) for k in range(numreg)]) * 100.0
1421+
1422+cosine_drift = protocol.ExperimentalQuantitative('drift', cosine_matrix)
1423
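A quick sanity check on cosine_matrix: the number of regressors is floor(2*M/HF) + 1, so a 100-scan run at TR = 2 s (M = 198 s) with the default HF = 128 s gives 4 basis functions, the first being the (scaled) constant term:

    import numpy as np
    frametimes = np.arange(100) * 2.0    # 100 scans, TR = 2 s
    drift = cosine_matrix(frametimes)
    print(drift.shape)                   # (4, 100): floor(2*198/128) + 1 rows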
1424 def _loadProtocol(x, session, names = None):
1425 """
1426- Read a paradigm file consisting of a list of pairs (occurence time, (duration), event ID)
1427+ Read a paradigm file consisting of a list of pairs
1428+    (occurrence time, (duration), event ID)
1429 and instantiate a NiPy ExperimentalFactor object.
1430+
1431+ INPUT:
1432+ x: a path to a .csv file
1433+
1434 """
1435- #paradigm = [i.split()[::-1] for i in open(x) if i != "\n"]
1436- paradigm = pylab.load(x)
1437+ paradigm = pylab.loadtxt(x)
1438 if paradigm[paradigm[:,0] == session].tolist() == []:
1439 return None
1440 paradigm = paradigm[paradigm[:,0] == session]
1441 if paradigm.shape[1] == 4:
1442- paradigm = paradigm[:,1:] ### ? !!!
1443+ paradigm = paradigm[:,1:]
1444 else:
1445 paradigm[:,0] = 0.5
1446 paradigm = paradigm[:,[1,2,0]]
1447 paradigm[:,2] = paradigm[:,1] + paradigm[:,2]
1448-# if paradigm[-1,1] > 1000:
1449-# paradigm[:,1] /= 1000.0
1450-# paradigm[:,2] /= 1000.0
1451+
1452 if names != None:
1453 name_col = [names[int(i)] for i in paradigm[:,0]]
1454- p = protocol.ExperimentalFactor("protocol", zip(name_col, paradigm[:,1].tolist(), paradigm[:,2].tolist()),
1455- delta = False)
1456+ p = protocol.ExperimentalFactor("protocol", zip(name_col, paradigm[:,1].tolist(), paradigm[:,2].tolist()),delta = False)
1457 else:
1458 p = protocol.ExperimentalFactor("protocol", paradigm[:,:3], delta = False)
1459 p.design_type = "block"
1460 return p
1461-
1462-order = 2
1463-
1464-def _drift(time):
1465- v = np.ones([order+1, time.shape[0]], dtype="f")
1466- tmax = np.abs(time.max()) * 1.0
1467- time = time * 1.0
1468- for i in range(order):
1469- v[i+1] = (time/tmax)**(i+1)
1470- return v
1471-
1472-canonical_drift = protocol.ExperimentalQuantitative('drift', _drift)
1473-
1474-HF = 128
1475-def cosine_matrix(time):
1476- M = time.max()
1477- numreg = int(np.floor(2 * float(M) / float(HF)) + 1)
1478- return np.array([np.sqrt(2.0/M) * np.cos(np.pi*(time.astype(float)/M + 0.5/len(time))*k ) for k in range(numreg)]) * 100.0
1479-
1480-cosine_drift = protocol.ExperimentalQuantitative('drift', cosine_matrix)
1481+
1482+
1483
1484 def _fullRank(X, cmax=1e15):
1485 """ X is assumed to be a 2d numpy array. This function possibly adds a scalar matrix to X
1486@@ -71,7 +77,7 @@
1487 sm = s.min()
1488 c = sM/sm
1489 if c < cmax:
1490- return X, c
1491+ return X, c
1492 print 'Warning: matrix is singular at working precision, regularizing...'
1493 lda = (sM-cmax*sm)/(cmax-1)
1494 s = s + lda
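The regularizer above is the closed-form shift that pins the condition number exactly at cmax: solving (sM + lda)/(sm + lda) = cmax for lda gives lda = (sM - cmax*sm)/(cmax - 1), which is what the code computes before rebuilding X from its SVD.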
1495@@ -97,20 +103,19 @@
1496 else:
1497 misc = ConfigObj(self.misc_file)
1498 self.session = misc["sessions"].index(self.session)
1499- #self.grid = self.fmri.grid
1500+
1501 self.frametimes = np.arange(self.nbframes)
1502 self.misc = ConfigObj(self.misc_file)
1503 if not self.misc.has_key(self.model):
1504 misc[self.model] = {}
1505 misc.write()
1506- #self.session_name = self.misc["sessions"][self.session]
1507+
1508 self.protocol = _loadProtocol(self.protocol_filename, self.session, self.misc["tasks"])
1509
1510 def timing(self, tr, t0=0.0, trSlices=None, slice_idx=None):
1511 """
1512 tr : inter-scan repetition time, i.e. the time elapsed between two consecutive scans
1513
1514-
1515 t0 : time elapsed from the paradigm time origin to the first scan acquisition (different
1516 from zero if the paradigm was not synchronized with the acquisition, or dummy scans have
1517 been removed)
1518@@ -133,9 +138,7 @@
1519 t0 = float(t0)
1520 self.frametimes *= tr
1521 self.frametimes += t0
1522- ## TODO: account for slice timing in case data is not already corrected...
1523
1524-
1525 def compute_design(self, hrf=hrf.canonical, drift=canonical_drift, name = ""):
1526 """
1527 Use e.g. hrf=hrf.glover_deriv to use HRF derivatives as additional regressors.
1528
1529=== renamed file 'examples/neurospin/neurospy/ScriptBVFunc.py' => 'examples/neurospin/neurospy/GLMTools.py'
1530--- examples/neurospin/neurospy/ScriptBVFunc.py 2009-03-28 01:56:11 +0000
1531+++ examples/neurospin/neurospy/GLMTools.py 2009-07-07 18:03:46 +0000
1532@@ -1,252 +1,375 @@
1533+"""
1534+General tools to analyse fMRI datasets (FSL pre-processing and GLM fit)
1535+using nipy.neurospin tools
1536+
1537+Author : Lise Favre, Bertrand Thirion, 2008-2009
1538+"""
1539+
1540 from numpy import *
1541 import commands
1542 import nifti
1543-## For Smoothing
1544-import scipy.ndimage as SN
1545-## For GLM
1546+import os
1547+
1548+from configobj import ConfigObj
1549+import scipy.ndimage as sn
1550+
1551 from vba import VBA
1552-## For Contrast Computation
1553-from configobj import ConfigObj
1554 import Results
1555-## For Mask Computation
1556-from fff2.utils.mask import compute_mask_intra
1557-## For the tools
1558-import os
1559+from nipy.neurospin.utils.mask import compute_mask_intra
1560
1561-#########
1562-# Tools #
1563-#########
1564+# ----------------------------------------------
1565+# -------- Ancillary functions -----------------
1566+# ----------------------------------------------
1567
1568 def save_volume(volume, file, header, mask=None, data=None):
1569- if mask != None and data != None:
1570- if size(data.shape) == 1:
1571- volume[mask > 0] = data
1572- else:
1573- for i in range(data.shape[0]):
1574- volume[i][mask[0] > 0] = data[i]
1575- nifti.NiftiImage(volume,header).save(file)
1576+ """
1577+ niftilib-based save volume utility
1578+
1579+ fixme : very low-level and naive
1580+ """
1581+ if mask != None and data != None:
1582+ if size(data.shape) == 1:
1583+ volume[mask > 0] = data
1584+ else:
1585+ for i in range(data.shape[0]):
1586+ volume[i][mask[0] > 0] = data[i]
1587+ nifti.NiftiImage(volume,header).save(file)
1588
1589 def saveall(contrast, design, ContrastId, dim, kargs):
1590- if kargs.has_key("paths"):
1591- paths = kargs["paths"]
1592- else:
1593- print "Cannot save contrast files. Missing argument : paths"
1594- return
1595- mask = nifti.NiftiImage(design.mask_url)
1596- mask_arr = mask.asarray()
1597- header = mask.header
1598- contrasts_path = paths["Contrasts_path"]
1599- if size(mask_arr.shape) == 3:
1600- mask_arr= mask_arr.reshape(1, mask_arr.shape[0], mask_arr.shape[1], mask_arr.shape[2])
1601- shape = mask_arr.shape
1602- t = contrast.stat()
1603- z = contrast.zscore()
1604- results = "Z map"
1605- z_file = os.sep.join((contrasts_path, "%s_%s.nii"% (str(ContrastId), paths[results])))
1606- save_volume(zeros(shape), z_file, header, mask_arr, z)
1607- if contrast.type == "t":
1608- results = "Student-t tests"
1609- elif contrast.type == "F":
1610- results = "Fisher tests"
1611- t_file = os.sep.join((contrasts_path, "%s_%s.nii" % (str(ContrastId), paths[results])))
1612- save_volume(zeros(shape), t_file, header, mask_arr, t)
1613- if int(dim) != 1:
1614- shape = (int(dim) * int(dim), shape[1], shape[2], shape[3])
1615- contrast.variance = contrast.variance.reshape(int(dim) * int(dim), -1)
1616- results = "Residual variance"
1617- res_file = os.sep.join((contrasts_path, "%s_%s.nii" % (str(ContrastId), paths[results])))
1618- save_volume(zeros(shape), res_file, header, mask_arr, contrast.variance)
1619- if int(dim) != 1:
1620- shape = (int(dim), shape[1], shape[2], shape[3])
1621- results = "contrast definition"
1622- con_file = os.sep.join((contrasts_path, "%s_%s.nii" % (str(ContrastId), paths[results])))
1623- save_volume(zeros(shape), con_file, header, mask_arr, contrast.effect)
1624- if kargs.has_key("method"):
1625- method = kargs["method"]
1626- else:
1627- print "Cannot save HTML results. Missing argument : method"
1628- return
1629- if kargs.has_key("threshold"):
1630- threshold = kargs["threshold"]
1631- else:
1632- print "Cannot save HTML results. Missing argument : threshold"
1633- return
1634- if kargs.has_key("cluster"):
1635- cluster = kargs["cluster"]
1636- else:
1637- cluster = 0
1638- results = "HTML Results"
1639- html_file = os.sep.join((contrasts_path, "%s_%s.html" % (str(ContrastId), paths[results])))
1640- Results.ComputeResultsContents(z_file, design.mask_url, html_file, threshold = threshold, method = method, cluster = cluster)
1641-
1642-
1643-def ComputeMask(fmriFiles, outputFile, infT = 0.2, supT = 0.9):
1644+ """
1645+ Save all the outputs of a GLM analysis + contrast definition
1646+ fixme : restructure it
1647+ """
1648+    # prepare the paths (?)
1649+ if kargs.has_key("paths"):
1650+ paths = kargs["paths"]
1651+ else:
1652+ print "Cannot save contrast files. Missing argument : paths"
1653+ return
1654+ mask = nifti.NiftiImage(design.mask_url)
1655+ mask_arr = mask.asarray()
1656+ header = mask.header
1657+ contrasts_path = paths["Contrasts_path"]
1658+ if size(mask_arr.shape) == 3:
1659+ mask_arr= mask_arr.reshape(1, mask_arr.shape[0],
1660+ mask_arr.shape[1], mask_arr.shape[2])
1661+ shape = mask_arr.shape
1662+ t = contrast.stat()
1663+ z = contrast.zscore()
1664+
1665+    # saving the Z statistics map
1666+ results = "Z map"
1667+ z_file = os.sep.join((contrasts_path, "%s_%s.nii"% (str(ContrastId), paths[results])))
1668+ save_volume(zeros(shape), z_file, header, mask_arr, z)
1669+
1670+ # Saving the t/F statistics map
1671+ if contrast.type == "t":
1672+ results = "Student-t tests"
1673+ elif contrast.type == "F":
1674+ results = "Fisher tests"
1675+ t_file = os.sep.join((contrasts_path, "%s_%s.nii" %
1676+ (str(ContrastId), paths[results])))
1677+ save_volume(zeros(shape), t_file, header, mask_arr, t)
1678+ if int(dim) != 1:
1679+ shape = (int(dim) * int(dim), shape[1], shape[2], shape[3])
1680+ contrast.variance = contrast.variance.reshape(int(dim) * int(dim), -1)
1681+
1682+ # saving the associated variance map
1683+ results = "Residual variance"
1684+ res_file = os.sep.join((contrasts_path, "%s_%s.nii" %
1685+ (str(ContrastId), paths[results])))
1686+ save_volume(zeros(shape), res_file, header, mask_arr, contrast.variance)
1687+ if int(dim) != 1:
1688+ shape = (int(dim), shape[1], shape[2], shape[3])
1689+
1690+ # writing the associated contrast structure
1691+ results = "contrast definition"
1692+ con_file = os.sep.join((contrasts_path, "%s_%s.nii" %
1693+ (str(ContrastId), paths[results])))
1694+ save_volume(zeros(shape), con_file, header, mask_arr, contrast.effect)
1695+
1696+ # writing the results as an html page
1697+ if kargs.has_key("method"):
1698+ method = kargs["method"]
1699+ else:
1700+ print "Cannot save HTML results. Missing argument : method"
1701+ return
1702+
1703+ if kargs.has_key("threshold"):
1704+ threshold = kargs["threshold"]
1705+ else:
1706+ print "Cannot save HTML results. Missing argument : threshold"
1707+ return
1708+
1709+ if kargs.has_key("cluster"):
1710+ cluster = kargs["cluster"]
1711+ else:
1712+ cluster = 0
1713+
1714+ results = "HTML Results"
1715+ html_file = os.sep.join((contrasts_path, "%s_%s.html" % (str(ContrastId), paths[results])))
1716+ Results.ComputeResultsContents(z_file, design.mask_url, html_file,
1717+ threshold=threshold, method=method,
1718+ cluster=cluster)
1719+
1720+
1721+def ComputeMask(fmriFiles, outputFile, infT=0.4, supT=0.9):
1722+ """
1723+ Perform the mask computation
1724+ """
1725 compute_mask_intra(fmriFiles, outputFile, False,None, infT, supT)
1726
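Note that the default lower threshold moved from 0.2 to 0.4 in this branch; a minimal call with hypothetical paths:

    ComputeMask("fonc1.nii", "mask.nii", infT=0.4, supT=0.9)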
1727-###################
1728-# Pre processings #
1729-###################
1730+
1731+# ---------------------------------------------
1732+# various FSL-based Pre processings functions -
1733+# ---------------------------------------------
1734
1735 def SliceTiming(file, tr, outputFile, interleaved = False, ascending = True):
1736- so = " "
1737- inter = " "
1738- if interleaved:
1739- inter = "--odd"
1740- if not ascending:
1741- so = "--down"
1742- print "slicetimer -i '%s' -o '%s' %s %s -r %s" % (file, outputFile, so, inter, str(tr))
1743- print commands.getoutput("slicetimer -i '%s' -o '%s' %s %s -r %s" % (file, outputFile, so, inter, str(tr)))
1744+ """
1745+ Perform slice timing using FSL
1746+ """
1747+ so = " "
1748+ inter = " "
1749+ if interleaved:
1750+ inter = "--odd"
1751+ if not ascending:
1752+ so = "--down"
1753+ print "slicetimer -i '%s' -o '%s' %s %s -r %s" % (file, outputFile, so, inter, str(tr))
1754+ print commands.getoutput("slicetimer -i '%s' -o '%s' %s %s -r %s" % (file, outputFile, so, inter, str(tr)))
1755
1756 def Realign(file, refFile, outputFile):
1757- print commands.getoutput("mcflirt -in '%s' -out '%s' -reffile '%s' -mats" % (file, outputFile, refFile))
1758+ """
1759+ Perform realignment using FSL
1760+ """
1761+ print commands.getoutput("mcflirt -in '%s' -out '%s' -reffile '%s' -mats" % (file, outputFile, refFile))
1762
1763 def NormalizeAnat(anat, templatet1, normAnat, norm_matrix, searcht1 = "NASO"):
1764- if searcht1 == "AVA":
1765- s1 = "-searchrx -0 0 -searchry -0 0 -searchrz -0 0"
1766- elif (searcht1 == "NASO"):
1767- s1 = "-searchrx -90 90 -searchry -90 90 -searchrz -90 90"
1768- elif (searcht1 == "IO"):
1769- s1 = "-searchrx -180 180 -searchry -180 180 -searchrz -180 180"
1770- print "T1 MRI on Template\n"
1771- print commands.getoutput("flirt -in '%s' -ref '%s' -omat '%s' -out '%s' -bins 1024 -cost corratio %s -dof 12" % (anat, templatet1, norm_matrix, normAnat, s1) )
1772- print "Finished"
1773+ """
1774+    Perform the normalization of anatomical images using FSL
1775+ """
1776+ if searcht1 == "AVA":
1777+ s1 = "-searchrx -0 0 -searchry -0 0 -searchrz -0 0"
1778+ elif (searcht1 == "NASO"):
1779+ s1 = "-searchrx -90 90 -searchry -90 90 -searchrz -90 90"
1780+ elif (searcht1 == "IO"):
1781+ s1 = "-searchrx -180 180 -searchry -180 180 -searchrz -180 180"
1782+ print "T1 MRI on Template\n"
1783+ print commands.getoutput("flirt -in '%s' -ref '%s' -omat '%s' -out '%s' -bins 1024 -cost corratio %s -dof 12" % (anat, templatet1, norm_matrix, normAnat, s1) )
1784+ print "Finished"
1785
1786 def NormalizeFMRI(file, anat, outputFile, normAnat, norm_matrix, searchfmri = "AVA"):
1787- if searchfmri == "AVA":
1788- s2 = "-searchrx -0 0 -searchry -0 0 -searchrz -0 0"
1789- elif (searchfmri == "NASO"):
1790- s2 = "-searchrx -90 90 -searchry -90 90 -searchrz -90 90"
1791- elif (searchfmri == "IO"):
1792- s2 = "-searchrx -180 180 -searchry -180 180 -searchrz -180 180"
1793- print "fMRI on T1 MRI\n"
1794- print commands.getoutput("flirt -in '%s' -ref '%s' -omat /tmp/fmri1.mat -bins 1024 -cost corratio %s -dof 6" % (file, anat, s2))
1795- print "fMRI on Template\n"
1796- print commands.getoutput("convert_xfm -omat /tmp/fmri.mat -concat '%s' /tmp/fmri1.mat" % norm_matrix)
1797- print commands.getoutput("flirt -in '%s' -ref '%s' -out '%s' -applyxfm -init /tmp/fmri.mat -interp trilinear" % (file, normAnat, outputFile))
1798- print "Finished\n"
1799+ """
1800+ Perform the normalization of fMRI data using FSL
1801+ """
1802+ if searchfmri == "AVA":
1803+ s2 = "-searchrx -0 0 -searchry -0 0 -searchrz -0 0"
1804+ elif (searchfmri == "NASO"):
1805+ s2 = "-searchrx -90 90 -searchry -90 90 -searchrz -90 90"
1806+ elif (searchfmri == "IO"):
1807+ s2 = "-searchrx -180 180 -searchry -180 180 -searchrz -180 180"
1808+ print "fMRI on T1 MRI\n"
1809+ print commands.getoutput("flirt -in '%s' -ref '%s' -omat /tmp/fmri1.mat -bins 1024 -cost corratio %s -dof 6" % (file, anat, s2))
1810+ print "fMRI on Template\n"
1811+ print commands.getoutput("convert_xfm -omat /tmp/fmri.mat -concat '%s' /tmp/fmri1.mat" % norm_matrix)
1812+ print commands.getoutput("flirt -in '%s' -ref '%s' -out '%s' -applyxfm -init /tmp/fmri.mat -interp trilinear" % (file, normAnat, outputFile))
1813+ print "Finished\n"
1814
1815 def Smooth(file, outputFile, fwhm):
1816- # voxel_width = 3
1817- fmri = nifti.NiftiImage(file)
1818- #voxel_width = fmri.header['voxel_size'][2]
1819- voxel_width = fmri.header['pixdim'][2]
1820- sigma = fwhm/(voxel_width*2*sqrt(2*log(2)))
1821- for i in fmri.data:
1822- SN.gaussian_filter(i, sigma, order=0, output=None, mode='reflect', cval=0.0)
1823- fmri.save(outputFile)
1824-
1825-
1826-########################
1827-# First Level analysis #
1828-########################
1829-
1830-def DesignMatrix(nbFrames, paradigm, miscFile, tr, outputFile, session, hrf = "Canonical", drift = "Blank", driftMatrix = None, poly_order = 2, cos_FreqCut = 128, FIR_order = 1, FIR_length = 1, model = "default"):
1831- ## For DesignMatrix
1832- import DesignMatrix as DM
1833- from dataFrame import DF
1834-
1835- design = DM.DesignMatrix(nbFrames, paradigm, session, miscFile, model)
1836- design.load()
1837- design.timing(tr)
1838- if driftMatrix != None:
1839- drift = pylab.load(driftMatrix)
1840- elif drift == "Blank":
1841- drift = 0
1842- elif drift == "Cosine":
1843- DesignMatrix.HF = cos_FreqCut
1844- drift = DM.cosine_drift
1845- elif drift == "Polynomial":
1846- DesignMatrix.order = poly_order
1847- drift = DM.canonical_drift
1848- if hrf == "Canonical":
1849- hrf = DM.hrf.glover
1850- elif hrf == "Canonical With Derivative":
1851- hrf = DM.hrf.glover_deriv
1852- elif hrf == "FIR Model":
1853- design.compute_fir_design(drift = drift, name = session, o = FIR_order, l = FIR_length)
1854- output = DF(colnames=design.names, data=design._design)
1855- output.write(outputFile)
1856- return 0
1857- else:
1858- print "Not HRF model passed. Aborting process."
1859- return
1860- design.compute_design(hrf = hrf, drift = drift, name = session)
1861- if hasattr(design, "names"):
1862- output = DF(colnames=design.names, data=design._design)
1863- print design.names
1864- output.write(outputFile)
1865-
1866-def GLMFit(file, designMatrix, mask, outputVBA, outputCon, fit = "Kalman_AR1"):
1867- from dataFrame import DF
1868- tab = DF.read(designMatrix)
1869- if fit == "Kalman_AR1":
1870- model = "ar1"
1871- method = "kalman"
1872- elif fit == "Ordinary Least Squares":
1873- method = "ols"
1874- model="spherical"
1875- elif fit == "Kalman":
1876- method = "kalman"
1877- model = "spherical"
1878- glm = VBA(tab, mask_url=mask, create_design_mat = False, mri_names = file, model = model, method = method)
1879- glm.fit()
1880- s=dict()
1881- s["HDF5FilePath"] = outputVBA
1882- s["ConfigFilePath"] = outputCon
1883- s["DesignFilePath"] = designMatrix
1884- glm.save(s)
1885- return glm
1886-
1887-
1888-#def ComputeContrasts(contrastFile, miscFile, designs, paths, save_mode="Contrast Name"):
1889-def ComputeContrasts(contrastFile, miscFile, glms, savefunc, save_mode="Contrast Name", model = "default", **kargs):
1890- misc = ConfigObj(miscFile)
1891- if not misc.has_key(model):
1892- misc[model] = {}
1893- if not misc[model].has_key("con_dofs"):
1894- misc[model]["con_dofs"] = {}
1895- contrasts = ConfigObj(contrastFile)
1896- contrasts_names = contrasts["contrast"]
1897- designs = {}
1898- for i, contrast in enumerate(contrasts_names):
1899- contrast_type = contrasts[contrast]["Type"]
1900- contrast_dimension = contrasts[contrast]["Dimension"]
1901- final_contrast = []
1902- k = i + 1
1903- multicon = dict()
1904- if save_mode == "Contrast Name":
1905- ContrastId = contrast
1906- elif save_mode == "Contrast Number":
1907- ContrastId = "%04i" % k
1908- for key, value in contrasts[contrast].items():
1909- if key != "Type" and key != "Dimension":
1910- session = "_".join(key.split("_")[:-1])
1911- if not designs.has_key(session):
1912- print "Loading session : %s" % session
1913- designs[session] = VBA(glms[session])
1914- if contrast_type == "t" and sum([int(j) != 0 for j in value]) != 0:
1915- designs[session].contrast([int(i) for i in value])
1916- final_contrast.append(designs[session]._con)
1917+ """
1918+    fixme : this might smooth each slice independently ?
1919+ """
1920+ # voxel_width = 3
1921+ fmri = nifti.NiftiImage(file)
1922+ #voxel_width = fmri.header['voxel_size'][2]
1923+ voxel_width = fmri.header['pixdim'][2]
1924+ sigma = fwhm/(voxel_width*2*sqrt(2*log(2)))
1925+ for i in fmri.data:
1926+        i[...] = sn.gaussian_filter(i, sigma, order=0,
1927+                 mode='reflect', cval=0.0)  # write the result back; it was being discarded
1928+ fmri.save(outputFile)
1929+
1930+
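The sigma computed in Smooth is the standard FWHM-to-standard-deviation conversion, FWHM = 2*sqrt(2*ln 2)*sigma, expressed in voxel units; for example with a 5 mm kernel on 3 mm voxels:

    import numpy as np
    fwhm, voxel_width = 5.0, 3.0
    sigma = fwhm / (voxel_width * 2 * np.sqrt(2 * np.log(2)))
    print(sigma)   # ~0.708 voxels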
1931+#-----------------------------------------------------
1932+#------- First Level analysis ------------------------
1933+#-----------------------------------------------------
1934+
1935+def CheckDmtxParam(DmtxParam):
1936+ """
1937+ check that Dmtx parameters are OK
1938+ """
1939+ pass
1940+
1941+def DesignMatrix(nbFrames, paradigm, miscFile, tr, outputFile,
1942+ session, DmtxParam):
1943+ """
1944+ Higher level function to define design matrices
1945+    This function simply unfolds the DmtxParam dictionary
1946+ and calls the _DesignMatrix function
1947+
1948+ Parameters:
1949+ -----------
1950+    - nbFrames: number of scans in the session
1951+    - paradigm: path of the paradigm file
1952+    - miscFile: path of the misc info file
1953+    - tr: inter-scan repetition time
1954+    - outputFile: path where the design matrix is written
1955+    - session: session identifier
1956+    - DmtxParam: dictionary of design-matrix parameters (a sketch follows the function)
1957+
1958+ """
1959+ hrfType = DmtxParam["hrfType"]
1960+ drift = DmtxParam["drift"]
1961+ poly_order = DmtxParam["poly_order"]
1962+ cos_FreqCut = DmtxParam["cos_FreqCut"]
1963+ FIR_order = DmtxParam["FIR_order"]
1964+ FIR_length = DmtxParam["FIR_length"]
1965+ driftMatrix = DmtxParam["drift_matrix"]
1966+ model = 'default' # fixme: I don't understand this
1967+ _DesignMatrix(nbFrames, paradigm, miscFile, tr, outputFile,
1968+ session, hrfType, drift, driftMatrix, poly_order,
1969+ cos_FreqCut, FIR_order, FIR_length, model)
1970+
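The DmtxParam dictionary must provide exactly the keys unfolded above; a minimal instance (values illustrative):

    DmtxParam = {"hrfType": "Canonical", "drift": "Cosine",
                 "poly_order": 2, "cos_FreqCut": 128,
                 "FIR_order": 1, "FIR_length": 1,
                 "drift_matrix": None}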
1971+def _DesignMatrix(nbFrames, paradigm, miscFile, tr, outputFile,
1972+ session, hrf="Canonical", drift="Blank",
1973+ driftMatrix=None, poly_order=2, cos_FreqCut=128,
1974+ FIR_order=1, FIR_length=1, model="default", verbose=0):
1975+ """
1976+ Base function to define design matrices
1977+
1978+ """
1979+ ## For DesignMatrix
1980+ import DesignMatrix as dm
1981+ from dataFrame import DF
1982+
1983+ design = dm.DesignMatrix(nbFrames, paradigm, session, miscFile, model)
1984+ design.load()
1985+ design.timing(tr)
1986+ if driftMatrix != None:
1987+ drift = pylab.load(driftMatrix)
1988+ elif drift == "Blank":
1989+ drift = 0
1990+ elif drift == "Cosine":
1991+ DesignMatrix.HF = cos_FreqCut
1992+ drift = dm.cosine_drift
1993+ elif drift == "Polynomial":
1994+ DesignMatrix.order = poly_order
1995+ drift = dm.canonical_drift
1996+
1997+ if hrf == "Canonical":
1998+ hrf = dm.hrf.glover
1999+ elif hrf == "Canonical With Derivative":
2000+ hrf = dm.hrf.glover_deriv
2001+ elif hrf == "FIR Model":
2002+ design.compute_fir_design(drift = drift, name = session,
2003+ o = FIR_order, l = FIR_length)
2004+ output = DF(colnames=design.names, data=design._design)
2005+ output.write(outputFile)
2006+ return 0
2007+ else:
2008+        print "No HRF model passed. Aborting process."
2009+ return
2010+
2011+ design.compute_design(hrf = hrf, drift = drift, name = session)
2012+ if hasattr(design, "names"):
2013+ output = DF(colnames=design.names, data=design._design)
2014+ if verbose : print design.names
2015+ output.write(outputFile)
2016+
2017+def GLMFit(file, designMatrix, mask, outputVBA, outputCon, fit="Kalman_AR1"):
2018+ """
2019+    Call the GLM Fit function with appropriate arguments
2020+
2021+ Parameters:
2022+ -----------
2023+    - file: path of the fMRI data
2024+    - designMatrix: path of the design matrix file
2025+    - mask: path of the mask image
2026+    - outputVBA: path of the GLM dump file
2027+    - outputCon: path of the config file
2028+    - fit: fitting method, 'Kalman_AR1' (default), 'Ordinary Least Squares' or 'Kalman'
2029+
2030+ Output:
2031+ -------
2032+ - glm
2033+
2034+ """
2035+ from dataFrame import DF
2036+ tab = DF.read(designMatrix)
2037+ if fit == "Kalman_AR1":
2038+ model = "ar1"
2039+ method = "kalman"
2040+ elif fit == "Ordinary Least Squares":
2041+ method = "ols"
2042+ model="spherical"
2043+ elif fit == "Kalman":
2044+ method = "kalman"
2045+ model = "spherical"
2046+
2047+ glm = VBA(tab, mask_url=mask, create_design_mat = False, mri_names = file, model = model, method = method)
2048+ glm.fit()
2049+ s=dict()
2050+ s["GlmDumpFile"] = outputVBA
2051+ s["ConfigFilePath"] = outputCon
2052+ s["DesignFilePath"] = designMatrix
2053+ glm.save(s)
2054+ return glm
2055+
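A call sketch with hypothetical paths; fit is mapped to the model/method pairs above:

    glm = GLMFit("fonc1.nii", "design_matrix.csv", "mask.nii",
                 "glm_dump.h5", "glm_config.con", fit="Kalman_AR1")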
2056+
2057+def ComputeContrasts(contrastFile, miscFile, glms, save_mode="Contrast Name",
2058+ model = "default", **kargs):
2059+    """
2060+    Compute the contrasts defined in contrastFile and save their outputs """
2061+    verbose = 0 # fixme: put in the kwargs
2062+ misc = ConfigObj(miscFile)
2063+ if not misc.has_key(model):
2064+ misc[model] = {}
2065+
2066+ if not misc[model].has_key("con_dofs"):
2067+ misc[model]["con_dofs"] = {}
2068+
2069+ contrasts = ConfigObj(contrastFile)
2070+ contrasts_names = contrasts["contrast"]
2071+ designs = {}
2072+ for i, contrast in enumerate(contrasts_names):
2073+ contrast_type = contrasts[contrast]["Type"]
2074+ contrast_dimension = contrasts[contrast]["Dimension"]
2075+ final_contrast = []
2076+ k = i + 1
2077+ multicon = dict()
2078+ if save_mode == "Contrast Name":
2079+ ContrastId = contrast
2080+ elif save_mode == "Contrast Number":
2081+ ContrastId = "%04i" % k
2082+
2083+ for key, value in contrasts[contrast].items():
2084+ if verbose: print key,value
2085+ if key != "Type" and key != "Dimension":
2086+ session = "_".join(key.split("_")[:-1])
2087+ if not designs.has_key(session):
2088+ print "Loading session : %s" % session
2089+ designs[session] = VBA(glms[session])
2090+
2091+ if contrast_type == "t" and sum([int(j) != 0 for j in value]) != 0:
2092+ designs[session].contrast([int(i) for i in value])
2093+ final_contrast.append(designs[session]._con)
2094+
2095+ if contrast_type == "F":
2096+ if not multicon.has_key(session):
2097+ multicon[session] = array([int(i) for i in value])
2098+ else:
2099+ multicon[session] = vstack((multicon[session], [int(i) for i in value]))
2100 if contrast_type == "F":
2101- if not multicon.has_key(session):
2102- multicon[session] = array([int(i) for i in value])
2103- else:
2104- multicon[session] = vstack((multicon[session], [int(i) for i in value]))
2105- if contrast_type == "F":
2106- for key, value in multicon.items():
2107- if sum([j != 0 for j in value.reshape(-1)]) != 0:
2108- designs[key].contrast(value)
2109- final_contrast.append(designs[key]._con)
2110- design = designs[session]
2111- res_contrast = final_contrast[0]
2112- for c in final_contrast[1:]:
2113- res_contrast = res_contrast + c
2114- res_contrast.type = contrast_type
2115- savefunc(res_contrast, design, ContrastId, contrast_dimension, kargs)
2116- misc[model]["con_dofs"][contrast] = res_contrast.dof
2117- misc["Contrast Save Mode"] = save_mode
2118- misc.write()
2119+ for key, value in multicon.items():
2120+ if sum([j != 0 for j in value.reshape(-1)]) != 0:
2121+ designs[key].contrast(value)
2122+ final_contrast.append(designs[key]._con)
2123
2124-##################
2125-# Group Analysis #
2126-##################
2127+ design = designs[session]
2128+ res_contrast = final_contrast[0]
2129+ for c in final_contrast[1:]:
2130+ res_contrast = res_contrast + c
2131+ res_contrast.type = contrast_type
2132+ saveall(res_contrast, design, ContrastId, contrast_dimension, kargs)
2133+ misc[model]["con_dofs"][contrast] = res_contrast.dof
2134+ misc["Contrast Save Mode"] = save_mode
2135+ misc.write()
2136
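ComputeContrasts reads the contrast file through ConfigObj: a top-level "contrast" list of names, then one section per contrast holding "Type", "Dimension" and per-session rows whose trailing "_<suffix>" is stripped to recover the session name. A sketched file (names hypothetical):

    contrast = audio-video
    [audio-video]
    Type = t
    Dimension = 1
    fonc1_row0 = 1, -1, 0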
2137=== modified file 'examples/neurospin/neurospy/Results.py'
2138--- examples/neurospin/neurospy/Results.py 2009-03-28 01:56:11 +0000
2139+++ examples/neurospin/neurospy/Results.py 2009-07-07 18:03:46 +0000
2140@@ -1,43 +1,47 @@
2141-#from soma import aims
2142-import fff2.neuro as neurospy
2143-
2144-def ComputeResultsContents(zmap_file_path, mask_file_path, output_html_path, threshold=0.001, method='fpr', cluster=0,
2145- null_zmax='bonferroni', null_smax=None, null_s=None, nmaxima=4):
2146-
2147- # Read data: z-map and mask
2148- zmap = neurospy.image(zmap_file_path)
2149- mask = neurospy.image(mask_file_path)
2150-
2151- # Compute cluster statistics
2152- clusters, info = neurospy.cluster_stats(zmap, mask, height_th=threshold, height_control=method, cluster_th=cluster,
2153- null_zmax=null_zmax, null_smax=null_smax, null_s=null_s)
2154-
2155- # Make HTML page
2156- output = open(output_html_path, mode = "w")
2157- output.write("<html><head><title> Result Sheet for %s </title></head><body><center>\n" % zmap_file_path)
2158- output.write("<h2> Results for %s</h2>\n" % zmap_file_path)
2159- output.write("<table border = 1>\n")
2160- output.write("<tr><th colspan=4> Voxel significance</th><th colspan=3> Coordinates in MNI referential</th><th>Cluster Size</th></tr>\n")
2161- output.write("<tr><th>p FWE corr<br>(Bonferroni)</th><th>p FDR corr</th><th>Z</th><th>p uncorr</th>")
2162- output.write("<th> x (mm) </th><th> y (mm) </th><th> z (mm) </th><th>(voxels)</th></tr>\n")
2163-
2164- for cluster in clusters:
2165- maxima = cluster['maxima']
2166- for j in range(min(len(maxima), nmaxima)):
2167- temp = ["%f" % cluster['fwer_pvalue'][j]]
2168- temp.append("%f" % cluster['fdr_pvalue'][j])
2169- temp.append("%f" % cluster['zscore'][j])
2170- temp.append("%f" % cluster['pvalue'][j])
2171- for it in range(3):
2172- temp.append("%f" % maxima[j][it])
2173- if j == 0: ## Main local maximum
2174- output.write('<tr><th align="center">' + '</th><th align="center">'.join(temp) + '</th></tr>\n')
2175- else: ## Secondary local maxima
2176- output.write('<tr><td align="center">' + '</td><td align="center">'.join(temp) + '</td><td></td></tr>\n')
2177-
2178- output.write("</table>\n")
2179- output.write("Number of voxels : %i<br>\n" % len(mask.array > 0))
2180- output.write("Threshold Z=%f (%s control at %f)<br>\n" % (info['threshold_z'], method, threshold))
2181- output.write("</center></body></html>\n")
2182- output.close()
2183+import nipy.neurospin.image as fff2image
2184+
2185+def ComputeResultsContents(zmap_file_path, mask_file_path, output_html_path,
2186+ threshold=0.001, method='fpr', cluster=0,
2187+ null_zmax='bonferroni', null_smax=None,
2188+ null_s=None, nmaxima=4):
2189+
2190+ """
2191+    Write the output of the GLM as an html page
2192+ """
2193+ # Read data: z-map and mask
2194+ zmap = fff2image.image(zmap_file_path)
2195+ mask = fff2image.image(mask_file_path)
2196+
2197+ # Compute cluster statistics
2198+ clusters, info = fff2image.cluster_stats(zmap, mask, height_th=threshold, height_control=method, cluster_th=cluster,
2199+ null_zmax=null_zmax, null_smax=null_smax, null_s=null_s)
2200+
2201+ # Make HTML page
2202+ output = open(output_html_path, mode = "w")
2203+ output.write("<html><head><title> Result Sheet for %s </title></head><body><center>\n" % zmap_file_path)
2204+ output.write("<h2> Results for %s</h2>\n" % zmap_file_path)
2205+ output.write("<table border = 1>\n")
2206+ output.write("<tr><th colspan=4> Voxel significance</th><th colspan=3> Coordinates in MNI referential</th><th>Cluster Size</th></tr>\n")
2207+ output.write("<tr><th>p FWE corr<br>(Bonferroni)</th><th>p FDR corr</th><th>Z</th><th>p uncorr</th>")
2208+ output.write("<th> x (mm) </th><th> y (mm) </th><th> z (mm) </th><th>(voxels)</th></tr>\n")
2209+
2210+ for cluster in clusters:
2211+ maxima = cluster['maxima']
2212+ for j in range(min(len(maxima), nmaxima)):
2213+ temp = ["%f" % cluster['fwer_pvalue'][j]]
2214+ temp.append("%f" % cluster['fdr_pvalue'][j])
2215+ temp.append("%f" % cluster['zscore'][j])
2216+ temp.append("%f" % cluster['pvalue'][j])
2217+ for it in range(3):
2218+ temp.append("%f" % maxima[j][it])
2219+ if j == 0: ## Main local maximum
2220+ output.write('<tr><th align="center">' + '</th><th align="center">'.join(temp) + '</th></tr>\n')
2221+ else: ## Secondary local maxima
2222+ output.write('<tr><td align="center">' + '</td><td align="center">'.join(temp) + '</td><td></td></tr>\n')
2223+
2224+ output.write("</table>\n")
2225+    output.write("Number of voxels : %i<br>\n" % (mask.array > 0).sum())
2226+ output.write("Threshold Z=%f (%s control at %f)<br>\n" % (info['threshold_z'], method, threshold))
2227+ output.write("</center></body></html>\n")
2228+ output.close()
2229
2230
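A usage sketch with hypothetical paths; thresholding at p = 0.001 uncorrected is the default shown in the signature:

    ComputeResultsContents("zmap.nii", "mask.nii", "report.html",
                           threshold=0.001, method='fpr', cluster=0)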
2231=== removed file 'examples/neurospin/neurospy/configobj.py'
2232--- examples/neurospin/neurospy/configobj.py 2009-03-28 01:56:11 +0000
2233+++ examples/neurospin/neurospy/configobj.py 1970-01-01 00:00:00 +0000
2234@@ -1,2279 +0,0 @@
2235-# configobj.py
2236-# A config file reader/writer that supports nested sections in config files.
2237-# Copyright (C) 2005-2006 Michael Foord, Nicola Larosa
2238-# E-mail: fuzzyman AT voidspace DOT org DOT uk
2239-# nico AT tekNico DOT net
2240-
2241-# ConfigObj 4
2242-# http://www.voidspace.org.uk/python/configobj.html
2243-
2244-# Released subject to the BSD License
2245-# Please see http://www.voidspace.org.uk/python/license.shtml
2246-
2247-# Scripts maintained at http://www.voidspace.org.uk/python/index.shtml
2248-# For information about bugfixes, updates and support, please join the
2249-# ConfigObj mailing list:
2250-# http://lists.sourceforge.net/lists/listinfo/configobj-develop
2251-# Comments, suggestions and bug reports welcome.
2252-
2253-from __future__ import generators
2254-
2255-import sys
2256-INTP_VER = sys.version_info[:2]
2257-if INTP_VER < (2, 2):
2258- raise RuntimeError("Python v.2.2 or later needed")
2259-
2260-import os, re
2261-compiler = None
2262-try:
2263- import compiler
2264-except ImportError:
2265- # for IronPython
2266- pass
2267-from types import StringTypes
2268-from warnings import warn
2269-try:
2270- from codecs import BOM_UTF8, BOM_UTF16, BOM_UTF16_BE, BOM_UTF16_LE
2271-except ImportError:
2272- # Python 2.2 does not have these
2273- # UTF-8
2274- BOM_UTF8 = '\xef\xbb\xbf'
2275- # UTF-16, little endian
2276- BOM_UTF16_LE = '\xff\xfe'
2277- # UTF-16, big endian
2278- BOM_UTF16_BE = '\xfe\xff'
2279- if sys.byteorder == 'little':
2280- # UTF-16, native endianness
2281- BOM_UTF16 = BOM_UTF16_LE
2282- else:
2283- # UTF-16, native endianness
2284- BOM_UTF16 = BOM_UTF16_BE
2285-
2286-# A dictionary mapping BOM to
2287-# the encoding to decode with, and what to set the
2288-# encoding attribute to.
2289-BOMS = {
2290- BOM_UTF8: ('utf_8', None),
2291- BOM_UTF16_BE: ('utf16_be', 'utf_16'),
2292- BOM_UTF16_LE: ('utf16_le', 'utf_16'),
2293- BOM_UTF16: ('utf_16', 'utf_16'),
2294- }
2295-# All legal variants of the BOM codecs.
2296-# TODO: the list of aliases is not meant to be exhaustive, is there a
2297-# better way ?
2298-BOM_LIST = {
2299- 'utf_16': 'utf_16',
2300- 'u16': 'utf_16',
2301- 'utf16': 'utf_16',
2302- 'utf-16': 'utf_16',
2303- 'utf16_be': 'utf16_be',
2304- 'utf_16_be': 'utf16_be',
2305- 'utf-16be': 'utf16_be',
2306- 'utf16_le': 'utf16_le',
2307- 'utf_16_le': 'utf16_le',
2308- 'utf-16le': 'utf16_le',
2309- 'utf_8': 'utf_8',
2310- 'u8': 'utf_8',
2311- 'utf': 'utf_8',
2312- 'utf8': 'utf_8',
2313- 'utf-8': 'utf_8',
2314- }
2315-
2316-# Map of encodings to the BOM to write.
2317-BOM_SET = {
2318- 'utf_8': BOM_UTF8,
2319- 'utf_16': BOM_UTF16,
2320- 'utf16_be': BOM_UTF16_BE,
2321- 'utf16_le': BOM_UTF16_LE,
2322- None: BOM_UTF8
2323- }
2324-
2325-try:
2326- from validate import VdtMissingValue
2327-except ImportError:
2328- VdtMissingValue = None
2329-
2330-try:
2331- enumerate
2332-except NameError:
2333- def enumerate(obj):
2334- """enumerate for Python 2.2."""
2335- i = -1
2336- for item in obj:
2337- i += 1
2338- yield i, item
2339-
2340-try:
2341- True, False
2342-except NameError:
2343- True, False = 1, 0
2344-
2345-
2346-__version__ = '4.4.0'
2347-
2348-__revision__ = '$Id: configobj.py 156 2006-01-31 14:57:08Z fuzzyman $'
2349-
2350-__docformat__ = "restructuredtext en"
2351-
2352-__all__ = (
2353- '__version__',
2354- 'DEFAULT_INDENT_TYPE',
2355- 'DEFAULT_INTERPOLATION',
2356- 'ConfigObjError',
2357- 'NestingError',
2358- 'ParseError',
2359- 'DuplicateError',
2360- 'ConfigspecError',
2361- 'ConfigObj',
2362- 'SimpleVal',
2363- 'InterpolationError',
2364- 'InterpolationLoopError',
2365- 'MissingInterpolationOption',
2366- 'RepeatSectionError',
2367- 'UnreprError',
2368- 'UnknownType',
2369- '__docformat__',
2370- 'flatten_errors',
2371-)
2372-
2373-DEFAULT_INTERPOLATION = 'configparser'
2374-DEFAULT_INDENT_TYPE = ' '
2375-MAX_INTERPOL_DEPTH = 10
2376-
2377-OPTION_DEFAULTS = {
2378- 'interpolation': True,
2379- 'raise_errors': False,
2380- 'list_values': True,
2381- 'create_empty': False,
2382- 'file_error': False,
2383- 'configspec': None,
2384- 'stringify': True,
2385- # option may be set to one of ('', ' ', '\t')
2386- 'indent_type': None,
2387- 'encoding': None,
2388- 'default_encoding': None,
2389- 'unrepr': False,
2390- 'write_empty_values': False,
2391-}
2392-
2393-
2394-def getObj(s):
2395- s = "a=" + s
2396- if compiler is None:
2397- raise ImportError('compiler module not available')
2398- p = compiler.parse(s)
2399- return p.getChildren()[1].getChildren()[0].getChildren()[1]
2400-
2401-class UnknownType(Exception):
2402- pass
2403-
2404-class Builder:
2405-
2406- def build(self, o):
2407- m = getattr(self, 'build_' + o.__class__.__name__, None)
2408- if m is None:
2409- raise UnknownType(o.__class__.__name__)
2410- return m(o)
2411-
2412- def build_List(self, o):
2413- return map(self.build, o.getChildren())
2414-
2415- def build_Const(self, o):
2416- return o.value
2417-
2418- def build_Dict(self, o):
2419- d = {}
2420- i = iter(map(self.build, o.getChildren()))
2421- for el in i:
2422- d[el] = i.next()
2423- return d
2424-
2425- def build_Tuple(self, o):
2426- return tuple(self.build_List(o))
2427-
2428- def build_Name(self, o):
2429- if o.name == 'None':
2430- return None
2431- if o.name == 'True':
2432- return True
2433- if o.name == 'False':
2434- return False
2435-
2436- # An undefinted Name
2437- raise UnknownType('Undefined Name')
2438-
2439- def build_Add(self, o):
2440- real, imag = map(self.build_Const, o.getChildren())
2441- try:
2442- real = float(real)
2443- except TypeError:
2444- raise UnknownType('Add')
2445- if not isinstance(imag, complex) or imag.real != 0.0:
2446- raise UnknownType('Add')
2447- return real+imag
2448-
2449- def build_Getattr(self, o):
2450- parent = self.build(o.expr)
2451- return getattr(parent, o.attrname)
2452-
2453- def build_UnarySub(self, o):
2454- return -self.build_Const(o.getChildren()[0])
2455-
2456- def build_UnaryAdd(self, o):
2457- return self.build_Const(o.getChildren()[0])
2458-
2459-def unrepr(s):
2460- if not s:
2461- return s
2462- return Builder().build(getObj(s))
2463-
2464-def _splitlines(instring):
2465- """Split a string on lines, without losing line endings or truncating."""
2466-
2467-
2468-class ConfigObjError(SyntaxError):
2469- """
2470- This is the base class for all errors that ConfigObj raises.
2471- It is a subclass of SyntaxError.
2472- """
2473- def __init__(self, message='', line_number=None, line=''):
2474- self.line = line
2475- self.line_number = line_number
2476- self.message = message
2477- SyntaxError.__init__(self, message)
2478-
2479-class NestingError(ConfigObjError):
2480- """
2481- This error indicates a level of nesting that doesn't match.
2482- """
2483-
2484-class ParseError(ConfigObjError):
2485- """
2486- This error indicates that a line is badly written.
2487- It is neither a valid ``key = value`` line,
2488- nor a valid section marker line.
2489- """
2490-
2491-class DuplicateError(ConfigObjError):
2492- """
2493- The keyword or section specified already exists.
2494- """
2495-
2496-class ConfigspecError(ConfigObjError):
2497- """
2498- An error occured whilst parsing a configspec.
2499- """
2500-
2501-class InterpolationError(ConfigObjError):
2502- """Base class for the two interpolation errors."""
2503-
2504-class InterpolationLoopError(InterpolationError):
2505- """Maximum interpolation depth exceeded in string interpolation."""
2506-
2507- def __init__(self, option):
2508- InterpolationError.__init__(
2509- self,
2510- 'interpolation loop detected in value "%s".' % option)
2511-
2512-class RepeatSectionError(ConfigObjError):
2513- """
2514- This error indicates additional sections in a section with a
2515- ``__many__`` (repeated) section.
2516- """
2517-
2518-class MissingInterpolationOption(InterpolationError):
2519- """A value specified for interpolation was missing."""
2520-
2521- def __init__(self, option):
2522- InterpolationError.__init__(
2523- self,
2524- 'missing option "%s" in interpolation.' % option)
2525-
2526-class UnreprError(ConfigObjError):
2527- """An error parsing in unrepr mode."""
2528-
2529-
2530-class InterpolationEngine(object):
2531- """
2532-    A helper class to perform string interpolation.
2533-
2534- This class is an abstract base class; its descendants perform
2535- the actual work.
2536- """
2537-
2538- # compiled regexp to use in self.interpolate()
2539- _KEYCRE = re.compile(r"%\(([^)]*)\)s")
2540-
2541- def __init__(self, section):
2542- # the Section instance that "owns" this engine
2543- self.section = section
2544-
2545- def interpolate(self, key, value):
2546- def recursive_interpolate(key, value, section, backtrail):
2547- """The function that does the actual work.
2548-
2549- ``value``: the string we're trying to interpolate.
2550- ``section``: the section in which that string was found
2551- ``backtrail``: a dict to keep track of where we've been,
2552- to detect and prevent infinite recursion loops
2553-
2554- This is similar to a depth-first-search algorithm.
2555- """
2556- # Have we been here already?
2557- if backtrail.has_key((key, section.name)):
2558- # Yes - infinite loop detected
2559- raise InterpolationLoopError(key)
2560- # Place a marker on our backtrail so we won't come back here again
2561- backtrail[(key, section.name)] = 1
2562-
2563- # Now start the actual work
2564- match = self._KEYCRE.search(value)
2565- while match:
2566- # The actual parsing of the match is implementation-dependent,
2567- # so delegate to our helper function
2568- k, v, s = self._parse_match(match)
2569- if k is None:
2570- # That's the signal that no further interpolation is needed
2571- replacement = v
2572- else:
2573- # Further interpolation may be needed to obtain final value
2574- replacement = recursive_interpolate(k, v, s, backtrail)
2575- # Replace the matched string with its final value
2576- start, end = match.span()
2577- value = ''.join((value[:start], replacement, value[end:]))
2578- new_search_start = start + len(replacement)
2579- # Pick up the next interpolation key, if any, for next time
2580- # through the while loop
2581- match = self._KEYCRE.search(value, new_search_start)
2582-
2583- # Now safe to come back here again; remove marker from backtrail
2584- del backtrail[(key, section.name)]
2585-
2586- return value
2587-
2588- # Back in interpolate(), all we have to do is kick off the recursive
2589- # function with appropriate starting values
2590- value = recursive_interpolate(key, value, self.section, {})
2591- return value
2592-
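    # A sketch of the engine at work (assuming the default
    # ``interpolation=True``, i.e. ConfigParser-style ``%(key)s`` syntax):
    #
    #     >>> cfg = ConfigObj(['dir = /tmp', 'log = %(dir)s/log'])
    #     >>> cfg['log']
    #     '/tmp/log'
    #
    # A key that refers back to itself, directly or via other keys,
    # raises ``InterpolationLoopError``.
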
2593- def _fetch(self, key):
2594- """Helper function to fetch values from owning section.
2595-
2596- Returns a 2-tuple: the value, and the section where it was found.
2597- """
2598- # switch off interpolation before we try and fetch anything !
2599- save_interp = self.section.main.interpolation
2600- self.section.main.interpolation = False
2601-
2602- # Start at section that "owns" this InterpolationEngine
2603- current_section = self.section
2604- while True:
2605- # try the current section first
2606- val = current_section.get(key)
2607- if val is not None:
2608- break
2609- # try "DEFAULT" next
2610- val = current_section.get('DEFAULT', {}).get(key)
2611- if val is not None:
2612- break
2613- # move up to parent and try again
2614- # top-level's parent is itself
2615- if current_section.parent is current_section:
2616- # reached top level, time to give up
2617- break
2618- current_section = current_section.parent
2619-
2620- # restore interpolation to previous value before returning
2621- self.section.main.interpolation = save_interp
2622- if val is None:
2623- raise MissingInterpolationOption(key)
2624- return val, current_section
2625-
2626- def _parse_match(self, match):
2627- """Implementation-dependent helper function.
2628-
2629- Will be passed a match object corresponding to the interpolation
2630- key we just found (e.g., "%(foo)s" or "$foo"). Should look up that
2631- key in the appropriate config file section (using the ``_fetch()``
2632- helper function) and return a 3-tuple: (key, value, section)
2633-
2634- ``key`` is the name of the key we're looking for
2635- ``value`` is the value found for that key
2636- ``section`` is a reference to the section where it was found
2637-
2638- ``key`` and ``section`` should be None if no further
2639- interpolation should be performed on the resulting value
2640- (e.g., if we interpolated "$$" and returned "$").
2641- """
2642- raise NotImplementedError
2643-
2644-
2645-class ConfigParserInterpolation(InterpolationEngine):
2646- """Behaves like ConfigParser."""
2647- _KEYCRE = re.compile(r"%\(([^)]*)\)s")
2648-
2649- def _parse_match(self, match):
2650- key = match.group(1)
2651- value, section = self._fetch(key)
2652- return key, value, section
2653-
2654-
2655-class TemplateInterpolation(InterpolationEngine):
2656- """Behaves like string.Template."""
2657- _delimiter = '$'
2658- _KEYCRE = re.compile(r"""
2659- \$(?:
2660- (?P<escaped>\$) | # Two $ signs
2661- (?P<named>[_a-z][_a-z0-9]*) | # $name format
2662- {(?P<braced>[^}]*)} # ${name} format
2663- )
2664- """, re.IGNORECASE | re.VERBOSE)
2665-
2666- def _parse_match(self, match):
2667- # Valid name (in or out of braces): fetch value from section
2668- key = match.group('named') or match.group('braced')
2669- if key is not None:
2670- value, section = self._fetch(key)
2671- return key, value, section
2672- # Escaped delimiter (e.g., $$): return single delimiter
2673- if match.group('escaped') is not None:
2674- # Return None for key and section to indicate it's time to stop
2675- return None, self._delimiter, None
2676- # Anything else: ignore completely, just return it unchanged
2677- return None, match.group(), None
2678-
2679-interpolation_engines = {
2680- 'configparser': ConfigParserInterpolation,
2681- 'template': TemplateInterpolation,
2682-}
2683-
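# The engine is chosen through the ``interpolation`` option; a sketch:
#
#     >>> cfg = ConfigObj(['name = world', 'greeting = hello $name'],
#     ...                 interpolation='template')
#     >>> cfg['greeting']
#     'hello world'
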
2684-class Section(dict):
2685- """
2686- A dictionary-like object that represents a section in a config file.
2687-
2688- It does string interpolation if the 'interpolation' attribute
2689- of the 'main' object is set to True.
2690-
2691- Interpolation is tried first from this object, then from the 'DEFAULT'
2692- section of this object, next from the parent and its 'DEFAULT' section,
2693- and so on until the main object is reached.
2694-
2695- A Section will behave like an ordered dictionary - following the
2696- order of the ``scalars`` and ``sections`` attributes.
2697- You can use this to change the order of members.
2698-
2699- Iteration follows the order: scalars, then sections.
2700- """
2701-
2702- def __init__(self, parent, depth, main, indict=None, name=None):
2703- """
2704- * parent is the section above
2705- * depth is the depth level of this section
2706- * main is the main ConfigObj
2707- * indict is a dictionary to initialise the section with
2708- """
2709- if indict is None:
2710- indict = {}
2711- dict.__init__(self)
2712- # used for nesting level *and* interpolation
2713- self.parent = parent
2714- # used for the interpolation attribute
2715- self.main = main
2716- # level of nesting depth of this Section
2717- self.depth = depth
2718- # the sequence of scalar values in this Section
2719- self.scalars = []
2720- # the sequence of sections in this Section
2721- self.sections = []
2722- # purely for information
2723- self.name = name
2724- # for comments :-)
2725- self.comments = {}
2726- self.inline_comments = {}
2727- # for the configspec
2728- self.configspec = {}
2729- self._order = []
2730- self._configspec_comments = {}
2731- self._configspec_inline_comments = {}
2732- self._cs_section_comments = {}
2733- self._cs_section_inline_comments = {}
2734- # for defaults
2735- self.defaults = []
2736- #
2737- # we do this explicitly so that __setitem__ is used properly
2738- # (rather than just passing to ``dict.__init__``)
2739- for entry in indict:
2740- self[entry] = indict[entry]
2741-
2742- def _interpolate(self, key, value):
2743- try:
2744- # do we already have an interpolation engine?
2745- engine = self._interpolation_engine
2746- except AttributeError:
2747- # not yet: first time running _interpolate(), so pick the engine
2748- name = self.main.interpolation
2749- if name == True: # note that "if name:" would be incorrect here
2750- # backwards-compatibility: interpolation=True means use default
2751- name = DEFAULT_INTERPOLATION
2752- name = name.lower() # so that "Template", "template", etc. all work
2753- class_ = interpolation_engines.get(name, None)
2754- if class_ is None:
2755- # invalid value for self.main.interpolation
2756- self.main.interpolation = False
2757- return value
2758- else:
2759- # save reference to engine so we don't have to do this again
2760- engine = self._interpolation_engine = class_(self)
2761- # let the engine do the actual work
2762- return engine.interpolate(key, value)
2763-
2764- def __getitem__(self, key):
2765- """Fetch the item and do string interpolation."""
2766- val = dict.__getitem__(self, key)
2767- if self.main.interpolation and isinstance(val, StringTypes):
2768- return self._interpolate(key, val)
2769- return val
2770-
2771- def __setitem__(self, key, value, unrepr=False):
2772- """
2773- Correctly set a value.
2774-
2775-        Makes dictionary values ``Section`` instances.
2776-        (We have to special-case ``Section`` instances, which are also dicts.)
2777-
2778- Keys must be strings.
2779- Values need only be strings (or lists of strings) if
2780- ``main.stringify`` is set.
2781-
2782-        ``unrepr`` must be set when setting a value to a dictionary, without
2783- creating a new sub-section.
2784- """
2785- if not isinstance(key, StringTypes):
2786- raise ValueError, 'The key "%s" is not a string.' % key
2787- # add the comment
2788- if not self.comments.has_key(key):
2789- self.comments[key] = []
2790- self.inline_comments[key] = ''
2791- # remove the entry from defaults
2792- if key in self.defaults:
2793- self.defaults.remove(key)
2794- #
2795- if isinstance(value, Section):
2796- if not self.has_key(key):
2797- self.sections.append(key)
2798- dict.__setitem__(self, key, value)
2799- elif isinstance(value, dict) and not unrepr:
2800- # First create the new depth level,
2801- # then create the section
2802- if not self.has_key(key):
2803- self.sections.append(key)
2804- new_depth = self.depth + 1
2805- dict.__setitem__(
2806- self,
2807- key,
2808- Section(
2809- self,
2810- new_depth,
2811- self.main,
2812- indict=value,
2813- name=key))
2814- else:
2815- if not self.has_key(key):
2816- self.scalars.append(key)
2817- if not self.main.stringify:
2818- if isinstance(value, StringTypes):
2819- pass
2820- elif isinstance(value, (list, tuple)):
2821- for entry in value:
2822- if not isinstance(entry, StringTypes):
2823- raise TypeError, (
2824- 'Value is not a string "%s".' % entry)
2825- else:
2826- raise TypeError, 'Value is not a string "%s".' % value
2827- dict.__setitem__(self, key, value)
2828-
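    # A sketch of the special-casing above: assigning a plain dict creates
    # a new Section one level deeper, while scalar values are stored as-is:
    #
    #     >>> c = ConfigObj()
    #     >>> c['server'] = {'host': 'localhost'}
    #     >>> isinstance(c['server'], Section), c['server'].depth
    #     (True, 1)
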
2829- def __delitem__(self, key):
2830- """Remove items from the sequence when deleting."""
2831-        dict.__delitem__(self, key)
2832- if key in self.scalars:
2833- self.scalars.remove(key)
2834- else:
2835- self.sections.remove(key)
2836- del self.comments[key]
2837- del self.inline_comments[key]
2838-
2839- def get(self, key, default=None):
2840- """A version of ``get`` that doesn't bypass string interpolation."""
2841- try:
2842- return self[key]
2843- except KeyError:
2844- return default
2845-
2846- def update(self, indict):
2847- """
2848- A version of update that uses our ``__setitem__``.
2849- """
2850- for entry in indict:
2851- self[entry] = indict[entry]
2852-
2853- def pop(self, key, *args):
2854-        """A version of ``pop`` that keeps comments and sequences in sync."""
2855- val = dict.pop(self, key, *args)
2856- if key in self.scalars:
2857- del self.comments[key]
2858- del self.inline_comments[key]
2859- self.scalars.remove(key)
2860- elif key in self.sections:
2861- del self.comments[key]
2862- del self.inline_comments[key]
2863- self.sections.remove(key)
2864- if self.main.interpolation and isinstance(val, StringTypes):
2865- return self._interpolate(key, val)
2866- return val
2867-
2868- def popitem(self):
2869-        """Pop and return the first (key, value) pair."""
2870- sequence = (self.scalars + self.sections)
2871- if not sequence:
2872- raise KeyError, ": 'popitem(): dictionary is empty'"
2873- key = sequence[0]
2874- val = self[key]
2875- del self[key]
2876- return key, val
2877-
2878- def clear(self):
2879- """
2880-        A version of clear that also affects scalars/sections.
2881-        Also clears comments and configspec.
2882-
2883-        Leaves other attributes alone:
2884-        depth/main/parent are not affected.
2885- """
2886- dict.clear(self)
2887- self.scalars = []
2888- self.sections = []
2889- self.comments = {}
2890- self.inline_comments = {}
2891- self.configspec = {}
2892-
2893- def setdefault(self, key, default=None):
2894- """A version of setdefault that sets sequence if appropriate."""
2895- try:
2896- return self[key]
2897- except KeyError:
2898- self[key] = default
2899- return self[key]
2900-
2901- def items(self):
2902-        """Return the (key, value) pairs in order: scalars, then sections."""
2903- return zip((self.scalars + self.sections), self.values())
2904-
2905- def keys(self):
2906-        """Return the keys in order: scalars, then sections."""
2907- return (self.scalars + self.sections)
2908-
2909- def values(self):
2910-        """Return the values in order: scalars, then sections."""
2911- return [self[key] for key in (self.scalars + self.sections)]
2912-
2913- def iteritems(self):
2914-        """Return an iterator over the (key, value) pairs."""
2915- return iter(self.items())
2916-
2917- def iterkeys(self):
2918-        """Return an iterator over the keys."""
2919- return iter((self.scalars + self.sections))
2920-
2921- __iter__ = iterkeys
2922-
2923- def itervalues(self):
2924-        """Return an iterator over the values."""
2925- return iter(self.values())
2926-
2927- def __repr__(self):
2928- return '{%s}' % ', '.join([('%s: %s' % (repr(key), repr(self[key])))
2929- for key in (self.scalars + self.sections)])
2930-
2931- __str__ = __repr__
2932-
2933- # Extra methods - not in a normal dictionary
2934-
2935- def dict(self):
2936- """
2937- Return a deepcopy of self as a dictionary.
2938-
2939- All members that are ``Section`` instances are recursively turned to
2940- ordinary dictionaries - by calling their ``dict`` method.
2941-
2942- >>> n = a.dict()
2943- >>> n == a
2944- 1
2945- >>> n is a
2946- 0
2947- """
2948- newdict = {}
2949- for entry in self:
2950- this_entry = self[entry]
2951- if isinstance(this_entry, Section):
2952- this_entry = this_entry.dict()
2953- elif isinstance(this_entry, list):
2954- # create a copy rather than a reference
2955- this_entry = list(this_entry)
2956- elif isinstance(this_entry, tuple):
2957- # create a copy rather than a reference
2958- this_entry = tuple(this_entry)
2959- newdict[entry] = this_entry
2960- return newdict
2961-
2962- def merge(self, indict):
2963- """
2964- A recursive update - useful for merging config files.
2965-
2966- >>> a = '''[section1]
2967- ... option1 = True
2968- ... [[subsection]]
2969- ... more_options = False
2970- ... # end of file'''.splitlines()
2971- >>> b = '''# File is user.ini
2972- ... [section1]
2973- ... option1 = False
2974- ... # end of file'''.splitlines()
2975- >>> c1 = ConfigObj(b)
2976- >>> c2 = ConfigObj(a)
2977- >>> c2.merge(c1)
2978- >>> c2
2979- {'section1': {'option1': 'False', 'subsection': {'more_options': 'False'}}}
2980- """
2981- for key, val in indict.items():
2982- if (key in self and isinstance(self[key], dict) and
2983- isinstance(val, dict)):
2984- self[key].merge(val)
2985- else:
2986- self[key] = val
2987-
2988- def rename(self, oldkey, newkey):
2989- """
2990- Change a keyname to another, without changing position in sequence.
2991-
2992- Implemented so that transformations can be made on keys,
2993- as well as on values. (used by encode and decode)
2994-
2995- Also renames comments.
2996- """
2997- if oldkey in self.scalars:
2998- the_list = self.scalars
2999- elif oldkey in self.sections:
3000- the_list = self.sections
3001- else:
3002- raise KeyError, 'Key "%s" not found.' % oldkey
3003- pos = the_list.index(oldkey)
3004- #
3005- val = self[oldkey]
3006- dict.__delitem__(self, oldkey)
3007- dict.__setitem__(self, newkey, val)
3008- the_list.remove(oldkey)
3009- the_list.insert(pos, newkey)
3010- comm = self.comments[oldkey]
3011- inline_comment = self.inline_comments[oldkey]
3012- del self.comments[oldkey]
3013- del self.inline_comments[oldkey]
3014- self.comments[newkey] = comm
3015- self.inline_comments[newkey] = inline_comment
3016-
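    # A sketch: ``rename`` keeps the key's position in the iteration order:
    #
    #     >>> c = ConfigObj(['a = 1', 'b = 2'])
    #     >>> c.rename('a', 'z')
    #     >>> c.keys()
    #     ['z', 'b']
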
3017- def walk(self, function, raise_errors=True,
3018- call_on_sections=False, **keywargs):
3019- """
3020- Walk every member and call a function on the keyword and value.
3021-
3022- Return a dictionary of the return values
3023-
3024-        If the function raises an exception, raise the error
3025- unless ``raise_errors=False``, in which case set the return value to
3026- ``False``.
3027-
3028-        Any unrecognised keyword arguments you pass to walk will be passed on
3029- to the function you pass in.
3030-
3031-        Note: if ``call_on_sections`` is ``True`` then, on encountering a
3032-        subsection, the function is *first* called for the *whole* subsection
3033-        and then ``walk`` recurses into its members. This means your function
3034-        must be able to handle strings, dictionaries and lists. It also lets
3035-        you change the keys of subsections as well as of ordinary members. The
3036-        return value from the call on the whole subsection is discarded.
3037-
3038- See the encode and decode methods for examples, including functions.
3039-
3040- .. caution::
3041-
3042- You can use ``walk`` to transform the names of members of a section
3043- but you mustn't add or delete members.
3044-
3045- >>> config = '''[XXXXsection]
3046- ... XXXXkey = XXXXvalue'''.splitlines()
3047- >>> cfg = ConfigObj(config)
3048- >>> cfg
3049- {'XXXXsection': {'XXXXkey': 'XXXXvalue'}}
3050- >>> def transform(section, key):
3051- ... val = section[key]
3052- ... newkey = key.replace('XXXX', 'CLIENT1')
3053- ... section.rename(key, newkey)
3054- ... if isinstance(val, (tuple, list, dict)):
3055- ... pass
3056- ... else:
3057- ... val = val.replace('XXXX', 'CLIENT1')
3058- ... section[newkey] = val
3059- >>> cfg.walk(transform, call_on_sections=True)
3060- {'CLIENT1section': {'CLIENT1key': None}}
3061- >>> cfg
3062- {'CLIENT1section': {'CLIENT1key': 'CLIENT1value'}}
3063- """
3064- out = {}
3065- # scalars first
3066- for i in range(len(self.scalars)):
3067- entry = self.scalars[i]
3068- try:
3069- val = function(self, entry, **keywargs)
3070- # bound again in case name has changed
3071- entry = self.scalars[i]
3072- out[entry] = val
3073- except Exception:
3074- if raise_errors:
3075- raise
3076- else:
3077- entry = self.scalars[i]
3078- out[entry] = False
3079- # then sections
3080- for i in range(len(self.sections)):
3081- entry = self.sections[i]
3082- if call_on_sections:
3083- try:
3084- function(self, entry, **keywargs)
3085- except Exception:
3086- if raise_errors:
3087- raise
3088- else:
3089- entry = self.sections[i]
3090- out[entry] = False
3091- # bound again in case name has changed
3092- entry = self.sections[i]
3093- # previous result is discarded
3094- out[entry] = self[entry].walk(
3095- function,
3096- raise_errors=raise_errors,
3097- call_on_sections=call_on_sections,
3098- **keywargs)
3099- return out
3100-
3101- def decode(self, encoding):
3102- """
3103- Decode all strings and values to unicode, using the specified encoding.
3104-
3105- Works with subsections and list values.
3106-
3107- Uses the ``walk`` method.
3108-
3109- Testing ``encode`` and ``decode``.
3110- >>> m = ConfigObj(a)
3111- >>> m.decode('ascii')
3112- >>> def testuni(val):
3113- ... for entry in val:
3114- ... if not isinstance(entry, unicode):
3115- ... print >> sys.stderr, type(entry)
3116- ... raise AssertionError, 'decode failed.'
3117- ... if isinstance(val[entry], dict):
3118- ... testuni(val[entry])
3119- ... elif not isinstance(val[entry], unicode):
3120- ... raise AssertionError, 'decode failed.'
3121- >>> testuni(m)
3122- >>> m.encode('ascii')
3123- >>> a == m
3124- 1
3125- """
3126- warn('use of ``decode`` is deprecated.', DeprecationWarning)
3127- def decode(section, key, encoding=encoding, warn=True):
3128-            """Decode one key and its value."""
3129- val = section[key]
3130- if isinstance(val, (list, tuple)):
3131- newval = []
3132- for entry in val:
3133- newval.append(entry.decode(encoding))
3134- elif isinstance(val, dict):
3135- newval = val
3136- else:
3137- newval = val.decode(encoding)
3138- newkey = key.decode(encoding)
3139- section.rename(key, newkey)
3140- section[newkey] = newval
3141- # using ``call_on_sections`` allows us to modify section names
3142- self.walk(decode, call_on_sections=True)
3143-
3144- def encode(self, encoding):
3145- """
3146- Encode all strings and values from unicode,
3147- using the specified encoding.
3148-
3149- Works with subsections and list values.
3150- Uses the ``walk`` method.
3151- """
3152- warn('use of ``encode`` is deprecated.', DeprecationWarning)
3153- def encode(section, key, encoding=encoding):
3154-            """Encode one key and its value."""
3155- val = section[key]
3156- if isinstance(val, (list, tuple)):
3157- newval = []
3158- for entry in val:
3159- newval.append(entry.encode(encoding))
3160- elif isinstance(val, dict):
3161- newval = val
3162- else:
3163- newval = val.encode(encoding)
3164- newkey = key.encode(encoding)
3165- section.rename(key, newkey)
3166- section[newkey] = newval
3167- self.walk(encode, call_on_sections=True)
3168-
3169- def istrue(self, key):
3170- """A deprecated version of ``as_bool``."""
3171- warn('use of ``istrue`` is deprecated. Use ``as_bool`` method '
3172- 'instead.', DeprecationWarning)
3173- return self.as_bool(key)
3174-
3175- def as_bool(self, key):
3176- """
3177- Accepts a key as input. The corresponding value must be a string or
3178- the objects (``True`` or 1) or (``False`` or 0). We allow 0 and 1 to
3179- retain compatibility with Python 2.2.
3180-
3181- If the string is one of ``True``, ``On``, ``Yes``, or ``1`` it returns
3182- ``True``.
3183-
3184- If the string is one of ``False``, ``Off``, ``No``, or ``0`` it returns
3185- ``False``.
3186-
3187- ``as_bool`` is not case sensitive.
3188-
3189- Any other input will raise a ``ValueError``.
3190-
3191- >>> a = ConfigObj()
3192- >>> a['a'] = 'fish'
3193- >>> a.as_bool('a')
3194- Traceback (most recent call last):
3195- ValueError: Value "fish" is neither True nor False
3196- >>> a['b'] = 'True'
3197- >>> a.as_bool('b')
3198- 1
3199- >>> a['b'] = 'off'
3200- >>> a.as_bool('b')
3201- 0
3202- """
3203- val = self[key]
3204- if val == True:
3205- return True
3206- elif val == False:
3207- return False
3208- else:
3209- try:
3210- if not isinstance(val, StringTypes):
3211- raise KeyError
3212- else:
3213- return self.main._bools[val.lower()]
3214- except KeyError:
3215- raise ValueError('Value "%s" is neither True nor False' % val)
3216-
3217- def as_int(self, key):
3218- """
3219- A convenience method which coerces the specified value to an integer.
3220-
3221- If the value is an invalid literal for ``int``, a ``ValueError`` will
3222- be raised.
3223-
3224- >>> a = ConfigObj()
3225- >>> a['a'] = 'fish'
3226- >>> a.as_int('a')
3227- Traceback (most recent call last):
3228- ValueError: invalid literal for int(): fish
3229- >>> a['b'] = '1'
3230- >>> a.as_int('b')
3231- 1
3232- >>> a['b'] = '3.2'
3233- >>> a.as_int('b')
3234- Traceback (most recent call last):
3235- ValueError: invalid literal for int(): 3.2
3236- """
3237- return int(self[key])
3238-
3239- def as_float(self, key):
3240- """
3241- A convenience method which coerces the specified value to a float.
3242-
3243- If the value is an invalid literal for ``float``, a ``ValueError`` will
3244- be raised.
3245-
3246- >>> a = ConfigObj()
3247- >>> a['a'] = 'fish'
3248- >>> a.as_float('a')
3249- Traceback (most recent call last):
3250- ValueError: invalid literal for float(): fish
3251- >>> a['b'] = '1'
3252- >>> a.as_float('b')
3253- 1.0
3254- >>> a['b'] = '3.2'
3255- >>> a.as_float('b')
3256- 3.2000000000000002
3257- """
3258- return float(self[key])
3259-
3260-
3261-class ConfigObj(Section):
3262- """An object to read, create, and write config files."""
3263-
3264- _keyword = re.compile(r'''^ # line start
3265- (\s*) # indentation
3266- ( # keyword
3267- (?:".*?")| # double quotes
3268- (?:'.*?')| # single quotes
3269- (?:[^'"=].*?) # no quotes
3270- )
3271- \s*=\s* # divider
3272- (.*) # value (including list values and comments)
3273- $ # line end
3274- ''',
3275- re.VERBOSE)
3276-
3277- _sectionmarker = re.compile(r'''^
3278- (\s*) # 1: indentation
3279- ((?:\[\s*)+) # 2: section marker open
3280- ( # 3: section name open
3281- (?:"\s*\S.*?\s*")| # at least one non-space with double quotes
3282- (?:'\s*\S.*?\s*')| # at least one non-space with single quotes
3283- (?:[^'"\s].*?) # at least one non-space unquoted
3284- ) # section name close
3285- ((?:\s*\])+) # 4: section marker close
3286- \s*(\#.*)? # 5: optional comment
3287- $''',
3288- re.VERBOSE)
3289-
3290- # this regexp pulls list values out as a single string
3291- # or single values and comments
3292- # FIXME: this regex adds a '' to the end of comma terminated lists
3293- # workaround in ``_handle_value``
3294- _valueexp = re.compile(r'''^
3295- (?:
3296- (?:
3297- (
3298- (?:
3299- (?:
3300- (?:".*?")| # double quotes
3301- (?:'.*?')| # single quotes
3302- (?:[^'",\#][^,\#]*?) # unquoted
3303- )
3304- \s*,\s* # comma
3305- )* # match all list items ending in a comma (if any)
3306- )
3307- (
3308- (?:".*?")| # double quotes
3309- (?:'.*?')| # single quotes
3310- (?:[^'",\#\s][^,]*?)| # unquoted
3311- (?:(?<!,)) # Empty value
3312- )? # last item in a list - or string value
3313- )|
3314- (,) # alternatively a single comma - empty list
3315- )
3316- \s*(\#.*)? # optional comment
3317- $''',
3318- re.VERBOSE)
3319-
3320- # use findall to get the members of a list value
3321- _listvalueexp = re.compile(r'''
3322- (
3323- (?:".*?")| # double quotes
3324- (?:'.*?')| # single quotes
3325- (?:[^'",\#].*?) # unquoted
3326- )
3327- \s*,\s* # comma
3328- ''',
3329- re.VERBOSE)
3330-
3331- # this regexp is used for the value
3332- # when lists are switched off
3333- _nolistvalue = re.compile(r'''^
3334- (
3335- (?:".*?")| # double quotes
3336- (?:'.*?')| # single quotes
3337- (?:[^'"\#].*?)| # unquoted
3338- (?:) # Empty value
3339- )
3340- \s*(\#.*)? # optional comment
3341- $''',
3342- re.VERBOSE)
3343-
3344- # regexes for finding triple quoted values on one line
3345- _single_line_single = re.compile(r"^'''(.*?)'''\s*(#.*)?$")
3346- _single_line_double = re.compile(r'^"""(.*?)"""\s*(#.*)?$')
3347- _multi_line_single = re.compile(r"^(.*?)'''\s*(#.*)?$")
3348- _multi_line_double = re.compile(r'^(.*?)"""\s*(#.*)?$')
3349-
3350- _triple_quote = {
3351- "'''": (_single_line_single, _multi_line_single),
3352- '"""': (_single_line_double, _multi_line_double),
3353- }
3354-
3355- # Used by the ``istrue`` Section method
3356- _bools = {
3357- 'yes': True, 'no': False,
3358- 'on': True, 'off': False,
3359- '1': True, '0': False,
3360- 'true': True, 'false': False,
3361- }
3362-
3363- def __init__(self, infile=None, options=None, **kwargs):
3364- """
3365- Parse or create a config file object.
3366-
3367- ``ConfigObj(infile=None, options=None, **kwargs)``
3368- """
3369- if infile is None:
3370- infile = []
3371- if options is None:
3372- options = {}
3373- else:
3374- options = dict(options)
3375- # keyword arguments take precedence over an options dictionary
3376- options.update(kwargs)
3377- # init the superclass
3378- Section.__init__(self, self, 0, self)
3379- #
3380- defaults = OPTION_DEFAULTS.copy()
3381- for entry in options.keys():
3382- if entry not in defaults.keys():
3383- raise TypeError, 'Unrecognised option "%s".' % entry
3384- # TODO: check the values too.
3385- #
3386- # Add any explicit options to the defaults
3387- defaults.update(options)
3388- #
3389- # initialise a few variables
3390- self.filename = None
3391- self._errors = []
3392- self.raise_errors = defaults['raise_errors']
3393- self.interpolation = defaults['interpolation']
3394- self.list_values = defaults['list_values']
3395- self.create_empty = defaults['create_empty']
3396- self.file_error = defaults['file_error']
3397- self.stringify = defaults['stringify']
3398- self.indent_type = defaults['indent_type']
3399- self.encoding = defaults['encoding']
3400- self.default_encoding = defaults['default_encoding']
3401- self.BOM = False
3402- self.newlines = None
3403- self.write_empty_values = defaults['write_empty_values']
3404- self.unrepr = defaults['unrepr']
3405- #
3406- self.initial_comment = []
3407- self.final_comment = []
3408- #
3409- self._terminated = False
3410- #
3411- if isinstance(infile, StringTypes):
3412- self.filename = infile
3413- if os.path.isfile(infile):
3414- infile = open(infile).read() or []
3415- elif self.file_error:
3416- # raise an error if the file doesn't exist
3417- raise IOError, 'Config file not found: "%s".' % self.filename
3418- else:
3419- # file doesn't already exist
3420- if self.create_empty:
3421- # this is a good test that the filename specified
3422- # isn't impossible - like on a non existent device
3423- h = open(infile, 'w')
3424- h.write('')
3425- h.close()
3426- infile = []
3427- elif isinstance(infile, (list, tuple)):
3428- infile = list(infile)
3429- elif isinstance(infile, dict):
3430- # initialise self
3431- # the Section class handles creating subsections
3432- if isinstance(infile, ConfigObj):
3433- # get a copy of our ConfigObj
3434- infile = infile.dict()
3435- for entry in infile:
3436- self[entry] = infile[entry]
3437- del self._errors
3438- if defaults['configspec'] is not None:
3439- self._handle_configspec(defaults['configspec'])
3440- else:
3441- self.configspec = None
3442- return
3443- elif hasattr(infile, 'read'):
3444- # This supports file like objects
3445- infile = infile.read() or []
3446- # needs splitting into lines - but needs doing *after* decoding
3447- # in case it's not an 8 bit encoding
3448- else:
3449- raise TypeError, ('infile must be a filename,'
3450- ' file like object, or list of lines.')
3451- #
3452- if infile:
3453- # don't do it for the empty ConfigObj
3454- infile = self._handle_bom(infile)
3455- # infile is now *always* a list
3456- #
3457- # Set the newlines attribute (first line ending it finds)
3458- # and strip trailing '\n' or '\r' from lines
3459- for line in infile:
3460- if (not line) or (line[-1] not in ('\r', '\n', '\r\n')):
3461- continue
3462- for end in ('\r\n', '\n', '\r'):
3463- if line.endswith(end):
3464- self.newlines = end
3465- break
3466- break
3467- if infile[-1] and infile[-1] in ('\r', '\n', '\r\n'):
3468- self._terminated = True
3469- infile = [line.rstrip('\r\n') for line in infile]
3470- #
3471- self._parse(infile)
3472- # if we had any errors, now is the time to raise them
3473- if self._errors:
3474- info = "at line %s." % self._errors[0].line_number
3475- if len(self._errors) > 1:
3476- msg = ("Parsing failed with several errors.\nFirst error %s" %
3477- info)
3478- error = ConfigObjError(msg)
3479- else:
3480- error = self._errors[0]
3481- # set the errors attribute; it's a list of tuples:
3482- # (error_type, message, line_number)
3483- error.errors = self._errors
3484- # set the config attribute
3485- error.config = self
3486- raise error
3487- # delete private attributes
3488- del self._errors
3489- #
3490- if defaults['configspec'] is None:
3491- self.configspec = None
3492- else:
3493- self._handle_configspec(defaults['configspec'])
3494-
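    # A sketch of the accepted ``infile`` types (illustrative): a filename,
    # a list of lines, a dict (or another ConfigObj), or any object with a
    # ``read`` method:
    #
    #     >>> cfg = ConfigObj(['x = 1', '[section]', 'y = 2'])
    #     >>> cfg['x'], cfg['section']['y']
    #     ('1', '2')
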
3495- def __repr__(self):
3496- return 'ConfigObj({%s})' % ', '.join(
3497- [('%s: %s' % (repr(key), repr(self[key]))) for key in
3498- (self.scalars + self.sections)])
3499-
3500- def _handle_bom(self, infile):
3501- """
3502- Handle any BOM, and decode if necessary.
3503-
3504- If an encoding is specified, that *must* be used - but the BOM should
3505- still be removed (and the BOM attribute set).
3506-
3507- (If the encoding is wrongly specified, then a BOM for an alternative
3508- encoding won't be discovered or removed.)
3509-
3510- If an encoding is not specified, UTF8 or UTF16 BOM will be detected and
3511- removed. The BOM attribute will be set. UTF16 will be decoded to
3512- unicode.
3513-
3514- NOTE: This method must not be called with an empty ``infile``.
3515-
3516- Specifying the *wrong* encoding is likely to cause a
3517- ``UnicodeDecodeError``.
3518-
3519- ``infile`` must always be returned as a list of lines, but may be
3520- passed in as a single string.
3521- """
3522- if ((self.encoding is not None) and
3523- (self.encoding.lower() not in BOM_LIST)):
3524- # No need to check for a BOM
3525- # the encoding specified doesn't have one
3526- # just decode
3527- return self._decode(infile, self.encoding)
3528- #
3529- if isinstance(infile, (list, tuple)):
3530- line = infile[0]
3531- else:
3532- line = infile
3533- if self.encoding is not None:
3534- # encoding explicitly supplied
3535- # And it could have an associated BOM
3536- # TODO: if encoding is just UTF16 - we ought to check for both
3537- # TODO: big endian and little endian versions.
3538- enc = BOM_LIST[self.encoding.lower()]
3539- if enc == 'utf_16':
3540- # For UTF16 we try big endian and little endian
3541- for BOM, (encoding, final_encoding) in BOMS.items():
3542- if not final_encoding:
3543- # skip UTF8
3544- continue
3545-                    if line.startswith(BOM):
3546- ### BOM discovered
3547- ##self.BOM = True
3548- # Don't need to remove BOM
3549- return self._decode(infile, encoding)
3550- #
3551-                # If we get this far, we will *probably* raise a decode error,
3552-                # as the input doesn't appear to start with a BOM
3553- return self._decode(infile, self.encoding)
3554- #
3555- # Must be UTF8
3556- BOM = BOM_SET[enc]
3557- if not line.startswith(BOM):
3558- return self._decode(infile, self.encoding)
3559- #
3560- newline = line[len(BOM):]
3561- #
3562- # BOM removed
3563- if isinstance(infile, (list, tuple)):
3564- infile[0] = newline
3565- else:
3566- infile = newline
3567- self.BOM = True
3568- return self._decode(infile, self.encoding)
3569- #
3570- # No encoding specified - so we need to check for UTF8/UTF16
3571- for BOM, (encoding, final_encoding) in BOMS.items():
3572- if not line.startswith(BOM):
3573- continue
3574- else:
3575- # BOM discovered
3576- self.encoding = final_encoding
3577- if not final_encoding:
3578- self.BOM = True
3579- # UTF8
3580- # remove BOM
3581- newline = line[len(BOM):]
3582- if isinstance(infile, (list, tuple)):
3583- infile[0] = newline
3584- else:
3585- infile = newline
3586- # UTF8 - don't decode
3587- if isinstance(infile, StringTypes):
3588- return infile.splitlines(True)
3589- else:
3590- return infile
3591- # UTF16 - have to decode
3592- return self._decode(infile, encoding)
3593- #
3594- # No BOM discovered and no encoding specified, just return
3595- if isinstance(infile, StringTypes):
3596- # infile read from a file will be a single string
3597- return infile.splitlines(True)
3598- else:
3599- return infile
3600-
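    # A sketch of the BOM handling above (Python 2 byte strings assumed):
    # a UTF-8 BOM on the first line is stripped and recorded so that
    # ``write`` can restore it:
    #
    #     >>> cfg = ConfigObj(['\xef\xbb\xbfkey = value'])
    #     >>> cfg.BOM, cfg['key']
    #     (True, 'value')
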
3601- def _a_to_u(self, aString):
3602- """Decode ASCII strings to unicode if a self.encoding is specified."""
3603- if self.encoding:
3604- return aString.decode('ascii')
3605- else:
3606- return aString
3607-
3608- def _decode(self, infile, encoding):
3609- """
3610-        Decode infile to unicode, using the specified encoding.
3611-
3612-        If it is a string, it also needs converting to a list.
3613- """
3614- if isinstance(infile, StringTypes):
3615- # can't be unicode
3616- # NOTE: Could raise a ``UnicodeDecodeError``
3617- return infile.decode(encoding).splitlines(True)
3618- for i, line in enumerate(infile):
3619- if not isinstance(line, unicode):
3620- # NOTE: The isinstance test here handles mixed lists of unicode/string
3621- # NOTE: But the decode will break on any non-string values
3622- # NOTE: Or could raise a ``UnicodeDecodeError``
3623- infile[i] = line.decode(encoding)
3624- return infile
3625-
3626- def _decode_element(self, line):
3627- """Decode element to unicode if necessary."""
3628- if not self.encoding:
3629- return line
3630- if isinstance(line, str) and self.default_encoding:
3631- return line.decode(self.default_encoding)
3632- return line
3633-
3634- def _str(self, value):
3635- """
3636- Used by ``stringify`` within validate, to turn non-string values
3637- into strings.
3638- """
3639- if not isinstance(value, StringTypes):
3640- return str(value)
3641- else:
3642- return value
3643-
3644- def _parse(self, infile):
3645- """Actually parse the config file."""
3646- temp_list_values = self.list_values
3647- if self.unrepr:
3648- self.list_values = False
3649- comment_list = []
3650- done_start = False
3651- this_section = self
3652- maxline = len(infile) - 1
3653- cur_index = -1
3654- reset_comment = False
3655- while cur_index < maxline:
3656- if reset_comment:
3657- comment_list = []
3658- cur_index += 1
3659- line = infile[cur_index]
3660- sline = line.strip()
3661- # do we have anything on the line ?
3662- if not sline or sline.startswith('#'):
3663- reset_comment = False
3664- comment_list.append(line)
3665- continue
3666- if not done_start:
3667- # preserve initial comment
3668- self.initial_comment = comment_list
3669- comment_list = []
3670- done_start = True
3671- reset_comment = True
3672- # first we check if it's a section marker
3673- mat = self._sectionmarker.match(line)
3674- if mat is not None:
3675- # is a section line
3676- (indent, sect_open, sect_name, sect_close, comment) = (
3677- mat.groups())
3678- if indent and (self.indent_type is None):
3679- self.indent_type = indent
3680- cur_depth = sect_open.count('[')
3681- if cur_depth != sect_close.count(']'):
3682- self._handle_error(
3683- "Cannot compute the section depth at line %s.",
3684- NestingError, infile, cur_index)
3685- continue
3686- #
3687- if cur_depth < this_section.depth:
3688- # the new section is dropping back to a previous level
3689- try:
3690- parent = self._match_depth(
3691- this_section,
3692- cur_depth).parent
3693- except SyntaxError:
3694- self._handle_error(
3695- "Cannot compute nesting level at line %s.",
3696- NestingError, infile, cur_index)
3697- continue
3698- elif cur_depth == this_section.depth:
3699- # the new section is a sibling of the current section
3700- parent = this_section.parent
3701- elif cur_depth == this_section.depth + 1:
3702-                    # the new section is a child of the current section
3703- parent = this_section
3704- else:
3705- self._handle_error(
3706- "Section too nested at line %s.",
3707- NestingError, infile, cur_index)
3708- #
3709- sect_name = self._unquote(sect_name)
3710- if parent.has_key(sect_name):
3711- self._handle_error(
3712- 'Duplicate section name at line %s.',
3713- DuplicateError, infile, cur_index)
3714- continue
3715- # create the new section
3716- this_section = Section(
3717- parent,
3718- cur_depth,
3719- self,
3720- name=sect_name)
3721- parent[sect_name] = this_section
3722- parent.inline_comments[sect_name] = comment
3723- parent.comments[sect_name] = comment_list
3724- continue
3725- #
3726- # it's not a section marker,
3727- # so it should be a valid ``key = value`` line
3728- mat = self._keyword.match(line)
3729- if mat is None:
3730-                # it matched neither as a keyword
3731-                # nor as a section marker
3732- self._handle_error(
3733- 'Invalid line at line "%s".',
3734- ParseError, infile, cur_index)
3735- else:
3736- # is a keyword value
3737- # value will include any inline comment
3738- (indent, key, value) = mat.groups()
3739- if indent and (self.indent_type is None):
3740- self.indent_type = indent
3741- # check for a multiline value
3742- if value[:3] in ['"""', "'''"]:
3743- try:
3744- (value, comment, cur_index) = self._multiline(
3745- value, infile, cur_index, maxline)
3746- except SyntaxError:
3747- self._handle_error(
3748- 'Parse error in value at line %s.',
3749- ParseError, infile, cur_index)
3750- continue
3751- else:
3752- if self.unrepr:
3753- comment = ''
3754- try:
3755- value = unrepr(value)
3756- except Exception, e:
3757- if type(e) == UnknownType:
3758- msg = 'Unknown name or type in value at line %s.'
3759- else:
3760- msg = 'Parse error in value at line %s.'
3761- self._handle_error(msg, UnreprError, infile,
3762- cur_index)
3763- continue
3764- else:
3765- if self.unrepr:
3766- comment = ''
3767- try:
3768- value = unrepr(value)
3769- except Exception, e:
3770- if isinstance(e, UnknownType):
3771- msg = 'Unknown name or type in value at line %s.'
3772- else:
3773- msg = 'Parse error in value at line %s.'
3774- self._handle_error(msg, UnreprError, infile,
3775- cur_index)
3776- continue
3777- else:
3778- # extract comment and lists
3779- try:
3780- (value, comment) = self._handle_value(value)
3781- except SyntaxError:
3782- self._handle_error(
3783- 'Parse error in value at line %s.',
3784- ParseError, infile, cur_index)
3785- continue
3786- #
3787- key = self._unquote(key)
3788- if this_section.has_key(key):
3789- self._handle_error(
3790- 'Duplicate keyword name at line %s.',
3791- DuplicateError, infile, cur_index)
3792- continue
3793- # add the key.
3794- # we set unrepr because if we have got this far we will never
3795- # be creating a new section
3796- this_section.__setitem__(key, value, unrepr=True)
3797- this_section.inline_comments[key] = comment
3798- this_section.comments[key] = comment_list
3799- continue
3800- #
3801- if self.indent_type is None:
3802- # no indentation used, set the type accordingly
3803- self.indent_type = ''
3804- #
3805- if self._terminated:
3806- comment_list.append('')
3807- # preserve the final comment
3808- if not self and not self.initial_comment:
3809- self.initial_comment = comment_list
3810- elif not reset_comment:
3811- self.final_comment = comment_list
3812- self.list_values = temp_list_values
3813-
3814- def _match_depth(self, sect, depth):
3815- """
3816- Given a section and a depth level, walk back through the sections
3817- parents to see if the depth level matches a previous section.
3818-
3819- Return a reference to the right section,
3820- or raise a SyntaxError.
3821- """
3822- while depth < sect.depth:
3823- if sect is sect.parent:
3824- # we've reached the top level already
3825- raise SyntaxError
3826- sect = sect.parent
3827- if sect.depth == depth:
3828- return sect
3829- # shouldn't get here
3830- raise SyntaxError
3831-
3832- def _handle_error(self, text, ErrorClass, infile, cur_index):
3833- """
3834- Handle an error according to the error settings.
3835-
3836- Either raise the error or store it.
3837-        The error will have occurred at ``cur_index``.
3838- """
3839- line = infile[cur_index]
3840- cur_index += 1
3841- message = text % cur_index
3842- error = ErrorClass(message, cur_index, line)
3843- if self.raise_errors:
3844- # raise the error - parsing stops here
3845- raise error
3846- # store the error
3847- # reraise when parsing has finished
3848- self._errors.append(error)
3849-
3850- def _unquote(self, value):
3851- """Return an unquoted version of a value"""
3852- if (value[0] == value[-1]) and (value[0] in ('"', "'")):
3853- value = value[1:-1]
3854- return value
3855-
3856- def _quote(self, value, multiline=True):
3857- """
3858- Return a safely quoted version of a value.
3859-
3860- Raise a ConfigObjError if the value cannot be safely quoted.
3861- If multiline is ``True`` (default) then use triple quotes
3862- if necessary.
3863-
3864- Don't quote values that don't need it.
3865- Recursively quote members of a list and return a comma joined list.
3866- Multiline is ``False`` for lists.
3867- Obey list syntax for empty and single member lists.
3868-
3869- If ``list_values=False`` then the value is only quoted if it contains
3870- a ``\n`` (is multiline).
3871-
3872- If ``write_empty_values`` is set, and the value is an empty string, it
3873- won't be quoted.
3874- """
3875- if multiline and self.write_empty_values and value == '':
3876- # Only if multiline is set, so that it is used for values not
3877- # keys, and not values that are part of a list
3878- return ''
3879- if multiline and isinstance(value, (list, tuple)):
3880- if not value:
3881- return ','
3882- elif len(value) == 1:
3883- return self._quote(value[0], multiline=False) + ','
3884- return ', '.join([self._quote(val, multiline=False)
3885- for val in value])
3886- if not isinstance(value, StringTypes):
3887- if self.stringify:
3888- value = str(value)
3889- else:
3890- raise TypeError, 'Value "%s" is not a string.' % value
3891- squot = "'%s'"
3892- dquot = '"%s"'
3893- noquot = "%s"
3894- wspace_plus = ' \r\t\n\v\t\'"'
3895-        tsquot = "'''%s'''"   # triple single quotes
3896-        tdquot = '"""%s"""'   # triple double quotes
3897- if not value:
3898- return '""'
3899- if (not self.list_values and '\n' not in value) or not (multiline and
3900- ((("'" in value) and ('"' in value)) or ('\n' in value))):
3901- if not self.list_values:
3902- # we don't quote if ``list_values=False``
3903- quot = noquot
3904- # for normal values either single or double quotes will do
3905- elif '\n' in value:
3906- # will only happen if multiline is off - e.g. '\n' in key
3907- raise ConfigObjError, ('Value "%s" cannot be safely quoted.' %
3908- value)
3909- elif ((value[0] not in wspace_plus) and
3910- (value[-1] not in wspace_plus) and
3911- (',' not in value)):
3912- quot = noquot
3913- else:
3914- if ("'" in value) and ('"' in value):
3915- raise ConfigObjError, (
3916- 'Value "%s" cannot be safely quoted.' % value)
3917- elif '"' in value:
3918- quot = squot
3919- else:
3920- quot = dquot
3921- else:
3922- # if value has '\n' or "'" *and* '"', it will need triple quotes
3923- if (value.find('"""') != -1) and (value.find("'''") != -1):
3924- raise ConfigObjError, (
3925- 'Value "%s" cannot be safely quoted.' % value)
3926- if value.find('"""') == -1:
3927- quot = tdquot
3928- else:
3929- quot = tsquot
3930- return quot % value
3931-
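    # A sketch of the quoting rules above: plain values pass through, values
    # with quotes or whitespace at the edges gain quotes, and lists are
    # written comma-joined (a trailing comma marks a single-member list):
    #
    #     >>> c = ConfigObj()
    #     >>> c._quote('simple')
    #     'simple'
    #     >>> print c._quote('needs "quoting"')
    #     'needs "quoting"'
    #     >>> c._quote(['a', 'b']), c._quote(['only']), c._quote([])
    #     ('a, b', 'only,', ',')
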
3932- def _handle_value(self, value):
3933- """
3934-        Given a value string, unquote it, remove any comment,
3935-        and handle lists (including empty and single member lists).
3936- """
3937- # do we look for lists in values ?
3938- if not self.list_values:
3939- mat = self._nolistvalue.match(value)
3940- if mat is None:
3941- raise SyntaxError
3942- # NOTE: we don't unquote here
3943- return mat.groups()
3944- #
3945- mat = self._valueexp.match(value)
3946- if mat is None:
3947- # the value is badly constructed, probably badly quoted,
3948- # or an invalid list
3949- raise SyntaxError
3950- (list_values, single, empty_list, comment) = mat.groups()
3951- if (list_values == '') and (single is None):
3952- # change this if you want to accept empty values
3953- raise SyntaxError
3954-        # NOTE: there is no error handling from here on if the regex
3955-        # is wrong: incorrect values will slip through
3956- if empty_list is not None:
3957- # the single comma - meaning an empty list
3958- return ([], comment)
3959- if single is not None:
3960- # handle empty values
3961- if list_values and not single:
3962- # FIXME: the '' is a workaround because our regex now matches
3963- # '' at the end of a list if it has a trailing comma
3964- single = None
3965- else:
3966- single = single or '""'
3967- single = self._unquote(single)
3968- if list_values == '':
3969- # not a list value
3970- return (single, comment)
3971- the_list = self._listvalueexp.findall(list_values)
3972- the_list = [self._unquote(val) for val in the_list]
3973- if single is not None:
3974- the_list += [single]
3975- return (the_list, comment)
3976-
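    # A sketch of the parsing above: inline comments are split off, commas
    # produce lists, and a lone comma is the empty list:
    #
    #     >>> c = ConfigObj()
    #     >>> c._handle_value('1, 2, 3 # counts')
    #     (['1', '2', '3'], '# counts')
    #     >>> c._handle_value(',')
    #     ([], None)
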
3977- def _multiline(self, value, infile, cur_index, maxline):
3978- """Extract the value, where we are in a multiline situation."""
3979- quot = value[:3]
3980- newvalue = value[3:]
3981- single_line = self._triple_quote[quot][0]
3982- multi_line = self._triple_quote[quot][1]
3983- mat = single_line.match(value)
3984- if mat is not None:
3985- retval = list(mat.groups())
3986- retval.append(cur_index)
3987- return retval
3988- elif newvalue.find(quot) != -1:
3989- # somehow the triple quote is missing
3990- raise SyntaxError
3991- #
3992- while cur_index < maxline:
3993- cur_index += 1
3994- newvalue += '\n'
3995- line = infile[cur_index]
3996- if line.find(quot) == -1:
3997- newvalue += line
3998- else:
3999- # end of multiline, process it
4000- break
4001- else:
4002- # we've got to the end of the config, oops...
4003- raise SyntaxError
4004- mat = multi_line.match(line)
4005- if mat is None:
4006- # a badly formed line
4007- raise SyntaxError
4008- (value, comment) = mat.groups()
4009- return (newvalue + value, comment, cur_index)
4010-
4011- def _handle_configspec(self, configspec):
4012- """Parse the configspec."""
4013- # FIXME: Should we check that the configspec was created with the
4014- # correct settings ? (i.e. ``list_values=False``)
4015- if not isinstance(configspec, ConfigObj):
4016- try:
4017- configspec = ConfigObj(
4018- configspec,
4019- raise_errors=True,
4020- file_error=True,
4021- list_values=False)
4022- except ConfigObjError, e:
4023- # FIXME: Should these errors have a reference
4024- # to the already parsed ConfigObj ?
4025- raise ConfigspecError('Parsing configspec failed: %s' % e)
4026- except IOError, e:
4027- raise IOError('Reading configspec failed: %s' % e)
4028- self._set_configspec_value(configspec, self)
4029-
4030- def _set_configspec_value(self, configspec, section):
4031- """Used to recursively set configspec values."""
4032- if '__many__' in configspec.sections:
4033- section.configspec['__many__'] = configspec['__many__']
4034- if len(configspec.sections) > 1:
4035- # FIXME: can we supply any useful information here ?
4036- raise RepeatSectionError
4037- if hasattr(configspec, 'initial_comment'):
4038- section._configspec_initial_comment = configspec.initial_comment
4039- section._configspec_final_comment = configspec.final_comment
4040- section._configspec_encoding = configspec.encoding
4041- section._configspec_BOM = configspec.BOM
4042- section._configspec_newlines = configspec.newlines
4043- section._configspec_indent_type = configspec.indent_type
4044- for entry in configspec.scalars:
4045- section._configspec_comments[entry] = configspec.comments[entry]
4046- section._configspec_inline_comments[entry] = (
4047- configspec.inline_comments[entry])
4048- section.configspec[entry] = configspec[entry]
4049- section._order.append(entry)
4050- for entry in configspec.sections:
4051- if entry == '__many__':
4052- continue
4053- section._cs_section_comments[entry] = configspec.comments[entry]
4054- section._cs_section_inline_comments[entry] = (
4055- configspec.inline_comments[entry])
4056- if not section.has_key(entry):
4057- section[entry] = {}
4058- self._set_configspec_value(configspec[entry], section[entry])
4059-
4060- def _handle_repeat(self, section, configspec):
4061- """Dynamically assign configspec for repeated section."""
4062- try:
4063- section_keys = configspec.sections
4064- scalar_keys = configspec.scalars
4065- except AttributeError:
4066- section_keys = [entry for entry in configspec
4067- if isinstance(configspec[entry], dict)]
4068- scalar_keys = [entry for entry in configspec
4069- if not isinstance(configspec[entry], dict)]
4070- if '__many__' in section_keys and len(section_keys) > 1:
4071- # FIXME: can we supply any useful information here ?
4072- raise RepeatSectionError
4073- scalars = {}
4074- sections = {}
4075- for entry in scalar_keys:
4076- val = configspec[entry]
4077- scalars[entry] = val
4078- for entry in section_keys:
4079- val = configspec[entry]
4080- if entry == '__many__':
4081- scalars[entry] = val
4082- continue
4083- sections[entry] = val
4084- #
4085- section.configspec = scalars
4086- for entry in sections:
4087- if not section.has_key(entry):
4088- section[entry] = {}
4089- self._handle_repeat(section[entry], sections[entry])
4090-
4091- def _write_line(self, indent_string, entry, this_entry, comment):
4092- """Write an individual line, for the write method"""
4093-        # NOTE: the calls to self._quote here handle non-StringType values.
4094- if not self.unrepr:
4095- val = self._decode_element(self._quote(this_entry))
4096- else:
4097- val = repr(this_entry)
4098- return '%s%s%s%s%s' % (
4099- indent_string,
4100- self._decode_element(self._quote(entry, multiline=False)),
4101- self._a_to_u(' = '),
4102- val,
4103- self._decode_element(comment))
4104-
4105- def _write_marker(self, indent_string, depth, entry, comment):
4106- """Write a section marker line"""
4107- return '%s%s%s%s%s' % (
4108- indent_string,
4109- self._a_to_u('[' * depth),
4110- self._quote(self._decode_element(entry), multiline=False),
4111- self._a_to_u(']' * depth),
4112- self._decode_element(comment))
4113-
4114- def _handle_comment(self, comment):
4115- """Deal with a comment."""
4116- if not comment:
4117- return ''
4118- start = self.indent_type
4119- if not comment.startswith('#'):
4120- start += self._a_to_u(' # ')
4121- return (start + comment)
4122-
4123- # Public methods
4124-
4125- def write(self, outfile=None, section=None):
4126- """
4127- Write the current ConfigObj as a file
4128-
4129- tekNico: FIXME: use StringIO instead of real files
4130-
4131- >>> filename = a.filename
4132- >>> a.filename = 'test.ini'
4133- >>> a.write()
4134- >>> a.filename = filename
4135- >>> a == ConfigObj('test.ini', raise_errors=True)
4136- 1
4137- """
4138- if self.indent_type is None:
4139- # this can be true if initialised from a dictionary
4140- self.indent_type = DEFAULT_INDENT_TYPE
4141- #
4142- out = []
4143- cs = self._a_to_u('#')
4144- csp = self._a_to_u('# ')
4145- if section is None:
4146- int_val = self.interpolation
4147- self.interpolation = False
4148- section = self
4149- for line in self.initial_comment:
4150- line = self._decode_element(line)
4151- stripped_line = line.strip()
4152- if stripped_line and not stripped_line.startswith(cs):
4153- line = csp + line
4154- out.append(line)
4155- #
4156- indent_string = self.indent_type * section.depth
4157- for entry in (section.scalars + section.sections):
4158- if entry in section.defaults:
4159- # don't write out default values
4160- continue
4161- for comment_line in section.comments[entry]:
4162- comment_line = self._decode_element(comment_line.lstrip())
4163- if comment_line and not comment_line.startswith(cs):
4164- comment_line = csp + comment_line
4165- out.append(indent_string + comment_line)
4166- this_entry = section[entry]
4167- comment = self._handle_comment(section.inline_comments[entry])
4168- #
4169- if isinstance(this_entry, dict):
4170- # a section
4171- out.append(self._write_marker(
4172- indent_string,
4173- this_entry.depth,
4174- entry,
4175- comment))
4176- out.extend(self.write(section=this_entry))
4177- else:
4178- out.append(self._write_line(
4179- indent_string,
4180- entry,
4181- this_entry,
4182- comment))
4183- #
4184- if section is self:
4185- for line in self.final_comment:
4186- line = self._decode_element(line)
4187- stripped_line = line.strip()
4188- if stripped_line and not stripped_line.startswith(cs):
4189- line = csp + line
4190- out.append(line)
4191- self.interpolation = int_val
4192- #
4193- if section is not self:
4194- return out
4195- #
4196- if (self.filename is None) and (outfile is None):
4197- # output a list of lines
4198- # might need to encode
4199- # NOTE: This will *screw* UTF16, each line will start with the BOM
4200- if self.encoding:
4201- out = [l.encode(self.encoding) for l in out]
4202- if (self.BOM and ((self.encoding is None) or
4203- (BOM_LIST.get(self.encoding.lower()) == 'utf_8'))):
4204- # Add the UTF8 BOM
4205- if not out:
4206- out.append('')
4207- out[0] = BOM_UTF8 + out[0]
4208- return out
4209- #
4210- # Turn the list to a string, joined with correct newlines
4211- output = (self._a_to_u(self.newlines or os.linesep)
4212- ).join(out)
4213- if self.encoding:
4214- output = output.encode(self.encoding)
4215- if (self.BOM and ((self.encoding is None) or
4216- (BOM_LIST.get(self.encoding.lower()) == 'utf_8'))):
4217- # Add the UTF8 BOM
4218- output = BOM_UTF8 + output
4219- if outfile is not None:
4220- outfile.write(output)
4221- else:
4222- h = open(self.filename, 'wb')
4223- h.write(output)
4224- h.close()
4225-
4226- def validate(self, validator, preserve_errors=False, copy=False,
4227- section=None):
4228- """
4229- Test the ConfigObj against a configspec.
4230-
4231- It uses the ``validator`` object from *validate.py*.
4232-
4233- To run ``validate`` on the current ConfigObj, call: ::
4234-
4235- test = config.validate(validator)
4236-
4237-        (Normally the configspec will have been passed in when the ConfigObj
4238-        was created; you can, however, dynamically assign a dictionary of
4239-        checks to the ``configspec`` attribute of a section.)
4240-
4241- It returns ``True`` if everything passes, or a dictionary of
4242- pass/fails (True/False). If every member of a subsection passes, it
4243- will just have the value ``True``. (It also returns ``False`` if all
4244- members fail).
4245-
4246- In addition, it converts the values from strings to their native
4247- types if their checks pass (and ``stringify`` is set).
4248-
4249-        If ``preserve_errors`` is ``True`` (``False`` is the default) then
4250-        instead of marking a fail with ``False``, it will preserve the actual
4251-        exception object. This can contain info about the reason for failure.
4252-        For example the ``VdtValueTooSmallError`` indicates that the value
4253-        supplied was too small. If a value (or section) is missing it will
4254-        still be marked as ``False``.
4255-
4256- You must have the validate module to use ``preserve_errors=True``.
4257-
4258- You can then use the ``flatten_errors`` function to turn your nested
4259- results dictionary into a flattened list of failures - useful for
4260- displaying meaningful error messages.
4261- """
4262- if section is None:
4263- if self.configspec is None:
4264-                raise ValueError('No configspec supplied.')
4265- if preserve_errors:
4266- if VdtMissingValue is None:
4267- raise ImportError('Missing validate module.')
4268- section = self
4269- #
4270- spec_section = section.configspec
4271- if copy and hasattr(section, '_configspec_initial_comment'):
4272- section.initial_comment = section._configspec_initial_comment
4273- section.final_comment = section._configspec_final_comment
4274- section.encoding = section._configspec_encoding
4275- section.BOM = section._configspec_BOM
4276- section.newlines = section._configspec_newlines
4277- section.indent_type = section._configspec_indent_type
4278- if '__many__' in section.configspec:
4279- many = spec_section['__many__']
4280- # dynamically assign the configspecs
4281- # for the sections below
4282- for entry in section.sections:
4283- self._handle_repeat(section[entry], many)
4284- #
4285- out = {}
4286- ret_true = True
4287- ret_false = True
4288- order = [k for k in section._order if k in spec_section]
4289- order += [k for k in spec_section if k not in order]
4290- for entry in order:
4291- if entry == '__many__':
4292- continue
4293- if (not entry in section.scalars) or (entry in section.defaults):
4294- # missing entries
4295- # or entries from defaults
4296- missing = True
4297- val = None
4298- if copy and not entry in section.scalars:
4299- # copy comments
4300- section.comments[entry] = (
4301- section._configspec_comments.get(entry, []))
4302- section.inline_comments[entry] = (
4303- section._configspec_inline_comments.get(entry, ''))
4304- #
4305- else:
4306- missing = False
4307- val = section[entry]
4308- try:
4309- check = validator.check(spec_section[entry],
4310- val,
4311- missing=missing
4312- )
4313- except validator.baseErrorClass, e:
4314- if not preserve_errors or isinstance(e, VdtMissingValue):
4315- out[entry] = False
4316- else:
4317- # preserve the error
4318- out[entry] = e
4319- ret_false = False
4320- ret_true = False
4321- else:
4322- ret_false = False
4323- out[entry] = True
4324- if self.stringify or missing:
4325- # if we are doing type conversion
4326- # or the value is a supplied default
4327- if not self.stringify:
4328- if isinstance(check, (list, tuple)):
4329- # preserve lists
4330- check = [self._str(item) for item in check]
4331- elif missing and check is None:
4332- # convert the None from a default to a ''
4333- check = ''
4334- else:
4335- check = self._str(check)
4336- if (check != val) or missing:
4337- section[entry] = check
4338- if not copy and missing and entry not in section.defaults:
4339- section.defaults.append(entry)
4340- #
4341- # Missing sections will have been created as empty ones when the
4342- # configspec was read.
4343- for entry in section.sections:
4344- # FIXME: this means DEFAULT is not copied in copy mode
4345- if section is self and entry == 'DEFAULT':
4346- continue
4347- if copy:
4348- section.comments[entry] = section._cs_section_comments[entry]
4349- section.inline_comments[entry] = (
4350- section._cs_section_inline_comments[entry])
4351- check = self.validate(validator, preserve_errors=preserve_errors,
4352- copy=copy, section=section[entry])
4353- out[entry] = check
4354- if check == False:
4355- ret_true = False
4356- elif check == True:
4357- ret_false = False
4358- else:
4359- ret_true = False
4360- ret_false = False
4361- #
4362- if ret_true:
4363- return True
4364- elif ret_false:
4365- return False
4366- else:
4367- return out
4368-
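A minimal usage sketch of the flow the docstring above describes, assuming hypothetical 'app.ini' and 'app.spec' files and the companion validate module; flatten_errors is the helper defined further down in this file:

    from configobj import ConfigObj, flatten_errors
    from validate import Validator

    config = ConfigObj('app.ini', configspec='app.spec')
    results = config.validate(Validator(), preserve_errors=True)

    if results is not True:
        for section_list, key, error in flatten_errors(config, results):
            if key is None:
                print 'missing section: %s' % ', '.join(section_list)
            elif error is False:
                print 'missing value: %s' % key
            else:
                # a preserved exception object; printing it gives the reason
                print '%s failed: %s' % (key, error)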
4369-class SimpleVal(object):
4370- """
4371- A simple validator.
4372-    Can be used to check that all expected members are present.
4373-
4374- To use it, provide a configspec with all your members in (the value given
4375- will be ignored). Pass an instance of ``SimpleVal`` to the ``validate``
4376- method of your ``ConfigObj``. ``validate`` will return ``True`` if all
4377- members are present, or a dictionary with True/False meaning
4378- present/missing. (Whole missing sections will be replaced with ``False``)
4379- """
4380-
4381- def __init__(self):
4382- self.baseErrorClass = ConfigObjError
4383-
4384- def check(self, check, member, missing=False):
4385-        """A dummy check method; raises if the member is missing, otherwise returns the value unchanged."""
4386- if missing:
4387- raise self.baseErrorClass
4388- return member
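A minimal sketch of SimpleVal in use, assuming a hypothetical 'required.spec' that merely lists the members that must be present (its values are ignored):

    from configobj import ConfigObj, SimpleVal

    config = ConfigObj('app.ini', configspec='required.spec')
    result = config.validate(SimpleVal())

    if result is True:
        print 'all required members present'
    else:
        # a dict mapping each member to True (present) or False (missing)
        print result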
4389-
4390-# Check / processing functions for options
4391-def flatten_errors(cfg, res, levels=None, results=None):
4392- """
4393- An example function that will turn a nested dictionary of results
4394- (as returned by ``ConfigObj.validate``) into a flat list.
4395-
4396- ``cfg`` is the ConfigObj instance being checked, ``res`` is the results
4397- dictionary returned by ``validate``.
4398-
4399-    (This is a recursive function, so you shouldn't use the ``levels`` or
4400-    ``results`` arguments - they are used by the function.)
4401-
4402- Returns a list of keys that failed. Each member of the list is a tuple :
4403- ::
4404-
4405- ([list of sections...], key, result)
4406-
4407- If ``validate`` was called with ``preserve_errors=False`` (the default)
4408- then ``result`` will always be ``False``.
4409-
4410- *list of sections* is a flattened list of sections that the key was found
4411- in.
4412-
4413- If the section was missing then key will be ``None``.
4414-
4415- If the value (or section) was missing then ``result`` will be ``False``.
4416-
4417- If ``validate`` was called with ``preserve_errors=True`` and a value
4418- was present, but failed the check, then ``result`` will be the exception
4419- object returned. You can use this as a string that describes the failure.
4420-
4421- For example *The value "3" is of the wrong type*.
4422-
4423- >>> import validate
4424- >>> vtor = validate.Validator()
4425- >>> my_ini = '''
4426- ... option1 = True
4427- ... [section1]
4428- ... option1 = True
4429- ... [section2]
4430- ... another_option = Probably
4431- ... [section3]
4432- ... another_option = True
4433- ... [[section3b]]
4434- ... value = 3
4435- ... value2 = a
4436- ... value3 = 11
4437- ... '''
4438- >>> my_cfg = '''
4439- ... option1 = boolean()
4440- ... option2 = boolean()
4441- ... option3 = boolean(default=Bad_value)
4442- ... [section1]
4443- ... option1 = boolean()
4444- ... option2 = boolean()
4445- ... option3 = boolean(default=Bad_value)
4446- ... [section2]
4447- ... another_option = boolean()
4448- ... [section3]
4449- ... another_option = boolean()
4450- ... [[section3b]]
4451- ... value = integer
4452- ... value2 = integer
4453- ... value3 = integer(0, 10)
4454- ... [[[section3b-sub]]]
4455- ... value = string
4456- ... [section4]
4457- ... another_option = boolean()
4458- ... '''
4459- >>> cs = my_cfg.split('\\n')
4460- >>> ini = my_ini.split('\\n')
4461- >>> cfg = ConfigObj(ini, configspec=cs)
4462- >>> res = cfg.validate(vtor, preserve_errors=True)
4463- >>> errors = []
4464- >>> for entry in flatten_errors(cfg, res):
4465- ... section_list, key, error = entry
4466- ... section_list.insert(0, '[root]')
4467- ... if key is not None:
4468- ... section_list.append(key)
4469- ... else:
4470- ... section_list.append('[missing]')
4471- ... section_string = ', '.join(section_list)
4472- ... errors.append((section_string, ' = ', error))
4473- >>> errors.sort()
4474- >>> for entry in errors:
4475- ... print entry[0], entry[1], (entry[2] or 0)
4476- [root], option2 = 0
4477- [root], option3 = the value "Bad_value" is of the wrong type.
4478- [root], section1, option2 = 0
4479- [root], section1, option3 = the value "Bad_value" is of the wrong type.
4480- [root], section2, another_option = the value "Probably" is of the wrong type.
4481- [root], section3, section3b, section3b-sub, [missing] = 0
4482- [root], section3, section3b, value2 = the value "a" is of the wrong type.
4483- [root], section3, section3b, value3 = the value "11" is too big.
4484- [root], section4, [missing] = 0
4485- """
4486- if levels is None:
4487- # first time called
4488- levels = []
4489- results = []
4490- if res is True:
4491- return results
4492- if res is False:
4493- results.append((levels[:], None, False))
4494- if levels:
4495- levels.pop()
4496- return results
4497- for (key, val) in res.items():
4498- if val == True:
4499- continue
4500- if isinstance(cfg.get(key), dict):
4501- # Go down one level
4502- levels.append(key)
4503- flatten_errors(cfg[key], val, levels, results)
4504- continue
4505- results.append((levels[:], key, val))
4506- #
4507- # Go up one level
4508- if levels:
4509- levels.pop()
4510- #
4511- return results
4512-
4513-"""*A programming language is a medium of expression.* - Paul Graham"""
4514
4515=== removed file 'examples/neurospin/neurospy/dataEngine.py'
4516--- examples/neurospin/neurospy/dataEngine.py 2009-03-28 01:56:11 +0000
4517+++ examples/neurospin/neurospy/dataEngine.py 1970-01-01 00:00:00 +0000
4518@@ -1,571 +0,0 @@
4519-"""
4520-DataEngine - An engine handling paths or collections of paths.
4521-    - file and directory search
4522-    - file and directory tagging
4523-    - applying actions to collections of files and directories
4524-    - piping the actions
4525-"""
4526-
4527-
4528-#from PyQt4 import QtCore
4529-import tarfile, bz2, os
4530-
4531-from filesystem import path
4532-
4533-
4534-
4535-class DataEngine:
4536- """
4537- The DataEngine Class
4538- """
4539- def __init__(self):
4540-
4541- self.catalog = {}
4542- self.groups = {}
4543- self.tags = []
4544-
4545-
4546- def newPath(self, itemPath):
4547- """
4548- Creates a path item.
4549-
4550- @param itemPath: a path
4551- @return: a L{PathItem} aware of this dataEngine
4552- """
4553-
4554- return PathItem(itemPath, self)
4555-
4556- def newCollection(self, items):
4557- """
4558- Creates a path collection.
4559-
4560-        @param items: a list of paths
4561- @return: a L{ItemCollection} aware of this dataEngine
4562- """
4563-
4564- return ItemCollection(items, self)
4565-
4566- #---------------------------------------------------------------------------------------------------------------
4567- # search methods
4568-
4569- def __find(self, itemGenerator, maxResults=0):
4570- results = ItemCollection([], self)
4571- for item in itemGenerator:
4572- results.append(PathItem(item, self))
4573- if len(results) == maxResults:
4574- itemGenerator.close()
4575-
4576- if maxResults == 1 and len(results) == 0:
4577- return None
4578- elif maxResults == 1:
4579- return results[0]
4580- else:
4581- return results
4582-
4583- def findDirectories(self, srcDir, dirPattern='*', dirDepth=0, maxResults=0):
4584- """
4585- Find all the directories matching the pattern.
4586-
4587- @param srcDir: directory to search from
4588- @param dirPattern: directory pattern to find
4589-        @param dirDepth: max directory depth to search in (0 for no limit)
4590-        @param maxResults: max results to return (0 for no limit)
4591- @return:
4592- - directories matching the pattern
4593- - an L{ItemCollection} if maxResults is not equal to 1
4594- - a L{PathItem} if maxResults is equal to 1
4595- """
4596-
4597- srcDir = PathItem(srcDir, self)
4598- if not srcDir.isdir():
4599- raise Exception('DataEngine: -findDirectories- source directory %s is not a valid directory'%srcDir)
4600-
4601- childDirs = srcDir.walkdirs(dirPattern, dirDepth, 0, 'ignore')
4602-
4603- return self.__find(childDirs, maxResults)
4604-
4605- def findFiles(self, srcDir, filePattern='*', dirDepth=0, maxResults=0):
4606- """
4607- Find all the files matching the pattern.
4608-
4609- @param srcDir: directory to search from
4610- @param filePattern: file pattern to find
4611-        @param dirDepth: max directory depth to search in (0 for no limit)
4612-        @param maxResults: max results to return (0 for no limit)
4613- @return:
4614- - files matching the pattern
4615- - an L{ItemCollection} if maxResults is not equal to 1
4616- - a L{PathItem} if maxResults is equal to 1
4617- """
4618-
4619- srcDir = PathItem(srcDir, self)
4620- if not srcDir.isdir():
4621-            raise Exception('DataEngine: -findFiles- source directory %s is not a valid directory'%srcDir)
4622-
4623- dirFiles = srcDir.walkfiles(filePattern, dirDepth, 0, 'ignore')
4624-
4625- return self.__find(dirFiles, maxResults)
4626-
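A minimal usage sketch of the two search methods above; the module name follows this file, and the paths and patterns are illustrative:

    from dataEngine import DataEngine

    engine = DataEngine()

    # every NIfTI file below /data/study (hypothetical path), any depth
    images = engine.findFiles('/data/study', '*.nii')

    # at most one subject directory, searching two levels deep;
    # maxResults=1 yields a single PathItem (or None) instead of a collection
    subject = engine.findDirectories('/data/study', 'subject01',
                                     dirDepth=2, maxResults=1)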
4627- def getItem(self, *search_tags):
4628- """
4629- Finds one collected L{PathItem} that has the given tags.
4630-        Equivalent to L{getSomeItems}(1, *search_tags)
4631-
4632- @param search_tags: the search tags
4633- @return: a L{PathItem}
4634- """
4635-
4636- return self.getSomeItems(1, *search_tags)
4637-
4638- def getItems(self, *search_tags):
4639- """
4640-        Finds all the collected L{PathItem} that have the given tags.
4641-        Equivalent to L{getSomeItems}(0, *search_tags)
4642-
4643- @param search_tags: the search tags
4644- @return: an L{ItemCollection}
4645- """
4646-
4647- return self.getSomeItems(0, *search_tags)
4648-
4649- def getSomeItems(self, maxResults, *search_tags):
4650- """
4651-        Finds at most a specified number of L{PathItem} that have the given tags.
4652-        @param maxResults: max results to return (0 for no limit)
4653-        @param search_tags: the search tags
4654-        @return: an L{ItemCollection}
4655- """
4656-
4657- results = ItemCollection([], self)
4658-
4659- for item in self.catalog.keys():
4660- isResult = True
4661- for search_tag in search_tags:
4662- if not search_tag in self.catalog[item]:
4663- isResult = False
4664- break
4665-
4666- if isResult:
4667- results.append(PathItem(item, self))
4668- if len(results) == maxResults:
4669- break
4670-
4671- if maxResults == 1 and len(results) == 0:
4672- return None
4673- elif maxResults == 1:
4674- return results[0]
4675- else:
4676- return results
4677-
4678- def getGroup(self, name):
4679-        if not self.groups.has_key(name):
4680-            raise Exception('DataEngine: -getGroup- group %s does not exist'%name)
4681-        else:
4682-            return self.getItems(*self.groups[name])
4683-
4684-
4685- #---------------------------------------------------------------------------------------------------------------
4686- # tag management methods
4687-
4688- def collect(self, item, *tags):
4689- """
4690- Tags a L{PathItem} or an L{ItemCollection}
4691-
4692- @param item: the L{PathItem} or the L{ItemCollection}
4693- @param tags: the item(s) tags
4694- @return: the input parameter item
4695- """
4696-
4697- if isinstance(item, list):
4698- collected = ItemCollection([], self)
4699- for el in item:
4700- el = PathItem(el, self)
4701- if not self.catalog.has_key(el):
4702- self.catalog[el] = ()
4703- self.addItemTags(el, *tags)
4704- collected.append(el)
4705-
4706- return collected
4707-
4708- else:
4709- item = PathItem(item, self)
4710- if self.catalog.has_key(item):
4711- self.addItemTags(item, *tags)
4712- return item
4713- else:
4714- self.catalog[item] = tags
4715- return item
4716-
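A minimal sketch of tagging and retrieval with the methods above; file paths and tag names are illustrative:

    from dataEngine import DataEngine

    engine = DataEngine()
    engine.collect('/data/study/sub01_anat.nii', 'sub01', 'anat')
    engine.collect('/data/study/sub01_func.nii', 'sub01', 'func')

    anat = engine.getItem('sub01', 'anat')    # a single PathItem, or None
    sub01_files = engine.getItems('sub01')    # ItemCollection with both items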
4717- def uncollect(self, item):
4718- """
4719- Untags a L{PathItem} or an L{ItemCollection}
4720-
4721- @param item: the L{PathItem} or the L{ItemCollection}
4722- @return: the input parameter item
4723- """
4724-
4725- if isinstance(item, list):
4726- uncollected = ItemCollection([], self)
4727- for el in item:
4728- el = PathItem(el, self)
4729- if not self.catalog.has_key(el):
4730- print 'DataEngine: item %s does not exist'%el
4731- continue
4732- else:
4733- self.catalog.pop(el)
4734- uncollected.append(el)
4735-
4736- return uncollected
4737- else:
4738- item = PathItem(item, self)
4739- if self.catalog.has_key(item):
4740- self.catalog.pop(item)
4741- return item
4742- else:
4743- raise Exception('DataEngine: item %s does not exist'%item)
4744-
4745- def reset(self):
4746- """
4747- Resets all data structures.
4748- """
4749-
4750- self.catalog = {}
4751- self.groups = {}
4752- self.tags = []
4753-
4754- def addItemTags(self, item, *tags):
4755- return self.__addTags(self.catalog, item, *tags)
4756-
4757- def rmvItemTags(self, item, *tags):
4758- return self.__rmvTags(self.catalog, item, *tags)
4759-
4760- def addGroupTags(self, name, *tags):
4761- self.__addTags(self.groups, name, *tags)
4762-
4763- def rmvGroupTags(self, name, *tags):
4764- self.__rmvTags(self.groups, name, *tags)
4765-
4766- def __addTags(self, tagDict, key, *tags):
4767- if isinstance(key, list):
4768- for el in key:
4769- self.__rmvTags(tagDict, el, *tags)
4770- tagDict[el] += tags
4771- else:
4772- self.__rmvTags(tagDict, key, *tags)
4773- tagDict[key] += tags
4774-
4775- for tag in tags:
4776- self.tags.append(tag)
4777-
4778- return key
4779-
4780- def __rmvTags(self, tagDict, key, *tags):
4781- for tag in tags:
4782- if tag in self.tags:
4783- self.tags.remove(tag)
4784-
4785- if isinstance(key, list):
4786- for el in key:
4787- if not tagDict.has_key(el):
4788- print 'DataEngine: %s unknown in %s'%(el, tagDict)
4789- continue
4790-
4791- remaining_tags = ()
4792- for tag in tagDict[el]:
4793- if not tag in tags:
4794- remaining_tags += (tag,)
4795-
4796- tagDict[el] = remaining_tags
4797-
4798- else:
4799- if not tagDict.has_key(key):
4800- raise Exception('DataEngine: %s unknown in %s'%(key, tagDict))
4801-
4802- remaining_tags = ()
4803- for tag in tagDict[key]:
4804- if not tag in tags:
4805- remaining_tags += (tag,)
4806-
4807- tagDict[key] = remaining_tags
4808-
4809- return key
4810-
4811- #---------------------------------------------------------------------------------------------------------------
4812- # group management methods
4813-
4814- def setGroup(self, name, *tags):
4815- if self.groups.has_key(name):
4816- return False
4817-
4818- self.groups[name] = tags
4819- return True
4820-
4821- def rmvGroup(self, name):
4822- if self.groups.has_key(name):
4823- return self.groups.pop(name)
4824- else:
4825- raise Exception('DataEngine: -rmvGroup- group %s does not exist'%name)
4826-
4827- #---------------------------------------------------------------------------------------------------------------
4828- # collection methods
4829-
4830- def delete(self, collection):
4831- """
4832-        Deletes the items in the collection. If they were collected, they are uncollected.
4833-
4834- @todo: if a directory is deleted, there is no uncollection of items within it
4835- @param collection: the L{ItemCollection}
4836- @return: None
4837- """
4838-
4839- for item in collection:
4840- item = path(item)
4841- if item.exists():
4842- if item.isdir():
4843- item.rmtree()
4844- elif item.isfile():
4845- item.remove()
4846- if self.catalog.has_key(item):
4847- self.uncollect(item)
4848-
4849- def move(self, collection, base_dest):
4850- """
4851-        Moves the collection items to the specified destination. Collected items are automatically updated, as are the items within a moved directory.
4852-
4853-        @param collection: the L{ItemCollection}
4854-        @param base_dest: the directory path where the items are moved to
4855- @return: an L{ItemCollection} of the moved items
4856- """
4857-
4858- if not os.path.isdir(base_dest):
4859- raise Exception('DataEngine: -move- destination must be a valid directory')
4860-
4861- movedItems = ItemCollection([], self)
4862-
4863- for item in collection:
4864- item = path(item)
4865- if item.isdir():
4866- movedItems.append(self.__moveDir(item, base_dest))
4867- elif item.isfile():
4868- movedItems.append(self.__moveFile(item, base_dest))
4869-
4870- return movedItems
4871-
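A minimal sketch of moving a tagged collection with the method above; it assumes items were previously collect()-ed under a 'raw' tag, and the destination path is illustrative:

    from dataEngine import DataEngine

    engine = DataEngine()
    raw_items = engine.getItems('raw')

    # tags follow the items to their new locations via the catalog updates
    archived = engine.move(raw_items, '/data/archive')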
4872- def __moveFile(self, item, base_dest):
4873- item.move(base_dest)
4874- newItem = PathItem(os.path.join(base_dest, item.basename()), self)
4875-
4876- if self.catalog.has_key(item):
4877- tags = self.catalog[item]
4878- self.uncollect(item)
4879- self.collect(newItem, *iter(tags))
4880-
4881- return newItem
4882-
4883-    # TODO: improve the update of catalog elements within the moved dir: look for catalog elements whose prefix matches the original dir path
4884- def __moveDir(self, directory, base_dest):
4885- dest = os.path.join(base_dest, directory.basename())
4886-
4887- directory.copytree(dest)
4888- for item in self.findFiles(directory, '*', 0, 0):
4889- if self.catalog.has_key(item):
4890- tags = self.catalog[item]
4891- self.uncollect(item)
4892-                newItem = PathItem(os.path.join(dest, item[len(directory):].lstrip(os.sep)), self)
4893- self.collect(newItem, *iter(tags))
4894-
4895- if self.catalog.has_key(directory):
4896- tags = self.catalog[directory]
4897- self.uncollect(directory)
4898- self.collect(dest, *iter(tags))
4899-
4900- directory.rmtree()
4901-
4902- return PathItem(dest, self)
4903-
4904- def copy(self, collection, dest):
4905- """
4906-        Copies the collection items to the specified destination.
4907-
4908-        @param collection: the L{ItemCollection}
4909-        @param dest: the directory path where the items are copied to
4910- @return: an L{ItemCollection} of the copied items
4911- """
4912-
4913- if not os.path.isdir(dest):
4914- raise Exception('DataEngine: -copy- destination must be a valid directory')
4915-
4916- copies = ItemCollection([], self)
4917-
4918- for item in collection:
4919- item = path(item)
4920- if item.isdir():
4921- copies.append(self.__copyDir(item, dest))
4922- elif item.isfile():
4923- copies.append(self.__copyFile(item, dest))
4924-
4925- return copies
4926-
4927- def prefix(self, collection, prefix):
4928- newCollection = ItemCollection([], self)
4929- for item in collection:
4930- item = PathItem(item, self)
4931- newCollection.append(item.rename(prefix+item.basename()))
4932-
4933- return newCollection
4934-
4935- def suffix(self, collection, suffix):
4936- newCollection = ItemCollection([], self)
4937- for item in collection:
4938- item = PathItem(item, self)
4939-
4940- tmpItem = PathItem(item, self)
4941- item_ext = ''
4942-
4943- while tmpItem.splitext()[1] != '':
4944- item_ext = tmpItem.splitext()[1]+item_ext
4945- tmpItem = PathItem(tmpItem.splitext()[0])
4946-
4947- newCollection.append(item.rename(tmpItem+suffix+item_ext))
4948-
4949- return newCollection
4950-
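The while loop in suffix above peels off compound extensions, so the suffix lands before the full extension chain (e.g. '.nii.gz') rather than inside it. A minimal sketch with an illustrative file name:

    from dataEngine import DataEngine

    engine = DataEngine()
    files = engine.newCollection(['/data/sub01_anat.nii.gz'])

    # renames the file on disk:
    #   /data/sub01_anat.nii.gz -> /data/sub01_anat_smooth.nii.gz
    smoothed = engine.suffix(files, '_smooth')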
4951- def __copyFile(self, item, dest):
4952- item.copy(dest)
4953- return PathItem(os.path.join(dest, item.basename()), self)
4954-
4955- def __copyDir(self, directory, base_dest):
4956- dest = os.path.join(base_dest, directory.basename())
4957- directory.copytree(dest)
4958- return PathItem(dest, self)
4959-
4960- def __updateCatalog(self, oldItem, newItem):
4961- if self.catalog.has_key(oldItem):
4962- tags = self.catalog[oldItem]
4963- self.uncollect(oldItem)
4964- self.collect(newItem, *iter(tags))
4965-
4966-#FIXME: I can't get the object to update itself when moving or renaming it
4967-class PathItem(path):
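    # Note: the default dataEngine below is evaluated once, when the class
    # body is executed, so every PathItem created without an explicit engine
    # shares that single DataEngine instance.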
4968- def __new__(cls, itemPath, dataEngine=DataEngine()):
4969- self = path.__new__(cls, itemPath)
4970- self.dataEngine = dataEngine
4971- return self
4972-
4973- def findDirectories(self, dirPattern='*', dirDepth=0, maxResults=0):
4974- return self.dataEngine.findDirectories(self, dirPattern, dirDepth, maxResults)
4975-
4976- def findFiles(self, filePattern='*', dirDepth=0, maxResults=0):
4977- return self.dataEngine.findFiles(self, filePattern, dirDepth, maxResults)
4978-
4979- def copy(self, dest):
4980- return self.dataEngine.copy([self], dest)[0]
4981-
4982-    def delete(self):
4983-        self.dataEngine.delete([self])
4984-
4985- def move(self, base_dest):
4986- return self.dataEngine.move([self], base_dest)[0]
4987-
4988-    def addTags(self, *tags):
4989-        self.dataEngine.addItemTags(self, *tags)
4990-
4991-    def rmvTags(self, *tags):
4992-        self.dataEngine.rmvItemTags(self, *tags)
4993-
4994- def collect(self, *tags):
4995- return self.dataEngine.collect(self, *tags)
4996-
4997- def uncollect(self):
4998- return self.dataEngine.uncollect(self)
4999-
5000- def rename(self, basename):
The diff has been truncated for viewing.
