Merge lp://qastaging/~klmitch/nova/os_int_tests into lp://qastaging/~hudson-openstack/nova/trunk

Proposed by Kevin L. Mitchell
Status: Work in progress
Proposed branch: lp://qastaging/~klmitch/nova/os_int_tests
Merge into: lp://qastaging/~hudson-openstack/nova/trunk
Diff against target: 1774 lines (+1648/-0)
17 files modified
Authors (+1/-0)
integration/README (+83/-0)
integration/adaptor.py (+38/-0)
integration/adaptor_dtest.py (+114/-0)
integration/adaptor_nose.py (+466/-0)
integration/base.py (+60/-0)
integration/flags.py (+59/-0)
integration/run_tests.py (+48/-0)
integration/test_flavors.py (+63/-0)
integration/test_images.py (+102/-0)
integration/test_ipgroups.py (+67/-0)
integration/test_server_actions.py (+146/-0)
integration/test_server_images.py (+70/-0)
integration/test_server_ips.py (+58/-0)
integration/test_server_meta.py (+37/-0)
integration/test_servers.py (+146/-0)
integration/utils.py (+90/-0)
To merge this branch: bzr merge lp://qastaging/~klmitch/nova/os_int_tests
Reviewer Review Type Date Requested Status
Brian Lamar (community) Disapprove
Vish Ishaya (community) Needs Fixing
termie Pending
Review via email: mp+61474@code.qastaging.launchpad.net

This proposal supersedes a proposal from 2011-04-19.

Description of the change

Adds an integration (or perhaps functional?) test suite that runs against the OpenStack API (v1.0). The test suite can use either the nose (single-threaded) or DTest (multi-threaded) framework; DTest 0.3.2 or later is available from PyPI. The default is nose; to use DTest, pass --use_dtest to integration/run_tests.py.

DTest is a threaded (using eventlet) test framework which automatically determines parallelization using the dependencies between the various tests and test fixtures. It is available on PyPI, and the source is at:

  http://github.com/klmitch/dtest

Revision history for this message
termie (termie) wrote : Posted in a previous version of this proposal

Wow, bunch of nice stuff. I'm a little worried it may be duplication of effort / code however, have you looked in nova/tests/integrated? It is doing much of the same stuff you are doing. It doesn't use DTest but it may be interesting for you to look into porting them to it, or adding your tests alongside them if the overlap in testing is small.

review: Needs Information
Revision history for this message
Kevin L. Mitchell (klmitch) wrote : Posted in a previous version of this proposal

On Wed, 2011-04-20 at 21:48 +0000, termie wrote:
> Wow, bunch of nice stuff.

Thank you.

> I'm a little worried it may be duplication of effort / code however,
> have you looked in nova/tests/integrated?

No, I hadn't been aware of it.

> It is doing much of the same stuff you are doing.

The thing that worries me about it is this, from
nova/tests/integrated/__init__.py:

        """
        :mod:`integrated` -- Tests whole systems, using mock services where needed
        =================================
        """

The purpose of my integrated test suite is to test the OpenStack API of
an existing nova cluster (or huddle, or whatever term you want to use),
which is why mine uses gflags in the first place to provide connection
and login information. This is also only a base; we eventually intend
to add stress tests, to really stress out clusters and make sure they
can handle the load and do things properly.

(Incidentally, nova isn't yet quite to that state, unfortunately :/ )

> It doesn't use DTest but it may be interesting for you to look into
> porting them to it, or adding your tests alongside them if the overlap
> in testing is small.

I think it doesn't make sense to merge these two test suites, as the
purpose of the existing one seems to be testing nova during the test
run, whereas mine is more intended to exercise a nova cluster. That
said, I have certainly thought it might be a good idea to port nova's
existing unit test suite over to DTest--the parallelization would have
to be a big win. (And I'm rather proud of DTest ;)
--
Kevin L. Mitchell <email address hidden>

Revision history for this message
termie (termie) wrote : Posted in a previous version of this proposal

Alright, if you are planning on doing tests against live clusters, perhaps then look at the smoketests directory. I don't like the name of it but it is doing exactly that. I think we'd probably love to get your tests in alongside those then, as the existing ones are only really testing the ec2 api.

Regarding DTest, it looks nice and we certainly want parallelization of tests, but we also are pretty happy with most of what nose gets us. What do you think the options for integration with nose are?

Revision history for this message
Kevin L. Mitchell (klmitch) wrote : Posted in a previous version of this proposal

On Thu, 2011-04-21 at 17:47 +0000, termie wrote:
> Alright, if you are planning on doing tests against live clusters,
> perhaps then look at the smoketests directory. I don't like the name
> of it but it is doing exactly that. I think we'd probably love to get
> your tests in alongside those then, as the existing ones are only
> really testing the ec2 api.

Yeah, that was the one I was worried about when I started this branch.
Because I use a different test infrastructure--for parallelization--I
decided to make it a different directory, though.

> Regarding DTest, it looks nice and we certainly want parallelization
> of tests, but we also are pretty happy with most of what nose gets us,
> what do you think the options for integration with nose are?

Well, I used nose as one of my sources for ideas and features for DTest.
I know there are some features that don't have equivalents in DTest,
like nose's plug-ins. As for integrating with nose, I haven't looked,
but I suspect it's not going to be possible; even though I've tried to
have similar concepts as unittest and nose, I've deliberately gone with
a completely separate implementation. This allowed me to omit concepts
that didn't make sense (e.g., unittest.TestSuite has no true equivalent
in DTest, and the suite() functions are never called; trying to support
TestSuite or a variant would have complicated DTest too much with too
little benefit, imo). That said, I think most things you may want to be
able to do in nose you can do in DTest, with perhaps a few exceptions (I
don't have support for generators yet, for instance).

(BTW, if you see bugs, or think of things that should be in DTest, log
an issue at https://github.com/klmitch/dtest/issues and I'll be happy to
address it :)
--
Kevin L. Mitchell <email address hidden>

Revision history for this message
Gabe Westmaas (westmaas) wrote : Posted in a previous version of this proposal

Good to see us making progress on this level of testing on the OS API, great stuff! I have a couple comments, and I'm interested in what you and others think.

First, I think the name for this type of test is a little off, especially given the set of tests termie pointed out. I see (at least) 5 different types of tests:

Unit Tests
Integration Tests
*****
Functional Tests
Smoke Tests
Performance Tests

Those tests above the asterisks can use mocks/fakes, etc and those below should not. It seems what you have here is in the Functional/Smoke tests range rather than integration. What do you think? Obviously we just have to pick a name and go with it, but I think "integration" is typically above the asterisk line that I drew above.

Second, I think there is a question about whether we need to use another testing framework to support our tests, or if we use nose tests to write tests and something else to manage parallelization and dependencies.

Revision history for this message
Kevin L. Mitchell (klmitch) wrote : Posted in a previous version of this proposal
Download full text (5.2 KiB)

On Tue, 2011-04-26 at 18:44 +0000, Gabe Westmaas wrote:
> First, the name of the type of tests I think is a little off,
> especially given the set of tests termie pointed out. [...] It seems
> what you have here is in the Functional/Smoke tests range rather than
> integration. What do you think? Obviously we just have to pick a
> name and go with it, but I think "integration" is typically above the
> asterisk line that I drew above.

When the project was presented to me, it was described as an
"integration test suite." That was the name I chose for the directory,
and it wasn't until later, when we involved Rackspace QA guys, that we
started discussing the differences between what constitutes an
integration test vs. a functional test. We came to the conclusion that
these were actually functional tests, but I decided to hold off on
renaming things until the code was written. I'm certainly happy to
rename the suite, now that we have it pretty much complete (at least to
this point; there's talk of adding further tests to perform stressing,
like trying to create 1000 instances simultaneously and that sort of
thing). The only reason I haven't gone ahead and renamed it is just
because I wanted to see some discussion on the merge prop to discover a
consensus.

> Second, I think there is a question about whether we need to use
> another testing framework to support our tests, or if we use nose
> tests to write tests and something else to manage parallelization and
> dependencies.

The problem is that adding parallelization after the fact is as
difficult as adding security after the fact. You often have to think
about a problem differently if you're trying to solve that problem in a
parallel fashion rather than a sequential fashion. This came up in the
course of developing this test suite, in fact: we started with a simple
adaptation of unittest that incorporated threading at the suite level,
and which involved only a single extra file (orbitals.py). However,
when we then needed to add support for class-level test fixtures
(setUpClass/tearDownClass), Trey and I spent 2 or 3 hours discussing how
to do it, and the solution we came up with still wasn't optimal.
Then there was the question of what to do if we need module-level or,
heaven forbid, package-level test fixtures. Another problem was that
the question of which tests could be run in parallel had to be managed
directly by the user in a more-or-less sideways fashion; orbitals
required every test to be explicitly listed as part of a suite() method
for all the test classes.
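The dependency-driven scheduling Kevin describes can be made concrete with a small sketch. This is NOT DTest's actual API (DTest uses eventlet; here the standard library's thread pool stands in, and all test and parameter names are invented): tests run in parallel as soon as their dependencies pass, and tests whose dependencies failed are skipped, mirroring the DEPFAIL results seen later in this thread.

```python
"""Illustrative sketch of dependency-aware parallel test scheduling.

Not DTest's real API: the `run_parallel` name, the `tests`/`deps`
arguments, and the use of ThreadPoolExecutor are hypothetical, shown
only to make the scheduling idea concrete.
"""
from concurrent.futures import ThreadPoolExecutor


def run_parallel(tests, deps):
    """Run each test as soon as all of its dependencies have passed.

    tests: dict mapping name -> zero-argument callable
    deps:  dict mapping name -> set of names it depends on
    Returns the set of tests skipped because a dependency failed.
    """
    passed, failed, skipped = set(), set(), set()
    pending = dict(deps)
    with ThreadPoolExecutor() as pool:
        while pending:
            # Tests whose dependencies have all passed are ready to run.
            ready = [n for n, d in pending.items() if d <= passed]
            # A failed dependency means the test cannot meaningfully run.
            depfail = [n for n, d in pending.items() if d & failed]
            for n in depfail:
                skipped.add(n)
                del pending[n]
            if not ready:
                if pending and not depfail:
                    raise RuntimeError("dependency cycle")
                continue
            # Independent ready tests execute concurrently.
            futures = {n: pool.submit(tests[n]) for n in ready}
            for n, fut in futures.items():
                try:
                    fut.result()
                    passed.add(n)
                except Exception:
                    failed.add(n)
                del pending[n]
    return skipped
```

The key property is that the user only declares dependencies; which tests may run concurrently falls out of the graph rather than being listed sideways in suite() methods, which is the orbitals problem described above.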

As it happens, though, I had already solved this problem for a C unit
test framework I had written for some of the C libraries I've worked on.
I had designed that framework (I called it "test-harness") to run test
programs, but I'd incorporated the idea of dependencies ("If program A
fails, it makes no sense to run tests that require the interfaces A
tests to work"); I had then added some support to run several test
programs in parallel. That dependency-based paradigm translated easily
into the Python DTest framework that I then wrote. I should point out
that I did try to keep several of the unittest paradigms a...

Read more...

Revision history for this message
termie (termie) wrote : Posted in a previous version of this proposal

Unrelated to gabe's response... posting this with kevin sitting behind me but....

I'm hesitant to bring in a new and, no offense, largely untested test framework into the mix, especially one that has a rather different style from our existing tests; doing so would make dealing with any issues in it largely gated on a single person (you), and that's not usually a very good practice.

I would be much happier if DTest were additive to the existing system rather than completely parallel. I've done a decent amount of hacking on nose and unittest, and have even ported unittest to other languages once or twice, so I am aware of how tangled a bunch of the code in there is. However, I do think that the goals of DTest (threaded execution and dependencies) can be built as a new set of tools layered on top of those systems rather than as a completely separate entity, and I think in general the project would get a lot more traction that way.

I have some example POC code I can show you as well; I'll update this bug again when I upload it.

Revision history for this message
termie (termie) wrote : Posted in a previous version of this proposal

Here you go... sort of messy but I can walk you through it: https://github.com/termie/nova/tree/parallel_tests

Revision history for this message
Kevin L. Mitchell (klmitch) wrote : Posted in a previous version of this proposal

On Wed, 2011-04-27 at 17:34 +0000, termie wrote:
> Here you go... sort of messy but I can walk you through it:
> https://github.com/termie/nova/tree/parallel_tests

I'd appreciate it if you did; this looks like a copy of the nova tree,
and I don't immediately see the nose plugin you were talking about.
--
Kevin L. Mitchell <email address hidden>

Revision history for this message
Kevin L. Mitchell (klmitch) wrote : Posted in a previous version of this proposal

On Wed, 2011-04-27 at 17:32 +0000, termie wrote:
> Unrelated to gabe's response... posting this with kevin sitting behind
> me but....

Sorry for the delay in response, but my ADD and the distraction of the
summit conspired together...

> I'm hesitant to bring in a new and, no offense, largely untested test
> framework into the mix, especially one that has a rather different
> style from our existing tests, doing so would make dealing with any
> issues in it largely gated on a single person (you) and that's not
> usually a very good practice.

I understand your objections. However, I should point out that the
copyright on DTest is assigned to OpenStack LLC and that the license is
the Apache 2 license. I'm happy to give access to DTest away to other
Nova developers, and I've tried to document the code such that it's
clear what it's trying to do. That should address the "gated on a
single person" objection (tell me what else I can offer/do if it isn't
enough). As for "largely untested"--that's going to be a factor
whatever we do, I think, be it a nose plugin or DTest, and I have made
an effort to test DTest to the best of my (admittedly somewhat limited)
test-writing abilities. I'm fairly confident at this point that DTest
covers the vast majority of the subtleties inherent to a threaded test
framework, and every run I've made on the two existing DTest-based
suites (this integration merge prop and DTest's own test suite) has
worked perfectly.

> I would be much happier if DTest was additive to the existing system
> rather than completely parallel. I've done a decent amount of hacking
> on nose and unittest and have even ported unittest to other languages
> once or twice, so I am aware of how tangled a bunch of the code is in
> there, however I do think that the goals of DTest (threaded execution
> and dependencies) can be built as a new set of tools layered on top of
> those systems rather than as a completely separate entity, and I think
> in general the project would get a lot more traction that way.

Perhaps, and I'd be happy to work with you toward that end, but I know
that DTest, as it stands now, works perfectly well, and that there are
subtleties, like output capturing, that could easily be missed if we
have to do it as a plugin. That's not to say it can't be done; just to
say that I have working code now :)

I'll also point out that layering threading on top of unittest, at least
using the approach of orbitals (code for which should be available in an
earlier version of my branch) turned out to be somewhat of a headache,
and we weren't even trying to do the output capturing that I manage to
incorporate in DTest.
--
Kevin L. Mitchell <email address hidden>

Revision history for this message
Kevin L. Mitchell (klmitch) wrote : Posted in a previous version of this proposal

I'm currently working on refactoring this branch a little bit so it can be used with either nose or DTest. I'm not going to try to do the threading in nose, and I may or may not be able to get dependencies to work; I'll leave the former exclusively to DTest, and the latter may involve some deep surgery to nose.

Revision history for this message
Kevin L. Mitchell (klmitch) wrote : Posted in a previous version of this proposal

The integration suite can now be run under either DTest (multi-threaded) or nose (single-threaded). By default, it runs under nose. Support has been added to make nose honor dependencies. To run integration tests under DTest, pass the --use_dtest option to run_tests.py.
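The adaptor arrangement this describes — one module presenting a common surface while conditionally importing whichever backend is in use — can be sketched as follows. This is a hypothetical illustration, not the branch's actual adaptor.py (the `load_backend` name and `use_dtest` parameter are invented, though the flag name matches the run_tests.py option above):

```python
"""Hypothetical sketch of a test-framework adaptor; not the branch's
actual adaptor.py. Try the optional DTest backend when requested and
fall back to the single-threaded default when it is unavailable."""
import importlib


def load_backend(use_dtest=False):
    """Return (backend_name, module) for the selected test framework.

    Prefers DTest when requested and importable; otherwise falls back
    to unittest (standing in here for the nose-based default).
    """
    if use_dtest:
        try:
            return "dtest", importlib.import_module("dtest")
        except ImportError:
            pass  # DTest not installed; fall through to the default
    return "unittest", importlib.import_module("unittest")
```

This is also the shape of conditional import that, later in the thread, Vish agrees is a reasonable place to keep "import *"-style flexibility.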

Revision history for this message
Vish Ishaya (vishvananda) wrote :

looks like you are missing an author...

Traceback (most recent call last):
  File "/home/vishvananda/os_int_tests/nova/tests/test_misc.py", line 80, in test_authors_up_to_date
    '%r not listed in Authors' % missing)
AssertionError: set([u'<email address hidden>']) not listed in Authors

Also, can you give some instructions on how these are supposed to be used? I set up my normal testing environment (using nova.sh) and tried running the integration tests, and I get quite a few errors:

FAILED (DEPFAIL=4, TIMEOUT=4, errors=5)

Minor nits per HACKING:
your docstrings should be converted to one line:
"""This."""
vs.
"""
This.
"""

We try to avoid import *

We import modules instead of classes
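The style points above can be illustrated side by side (the `leaf` function is invented for illustration; the conventions — one-line docstrings, importing modules rather than names — are the HACKING rules as stated):

```python
"""Illustration of the HACKING style nits above; names are invented.

Discouraged forms, per the review comments:

    from os.path import *          # wildcard import
    from os.path import basename   # importing a name, not a module

    def leaf(path):
        '''
        Return the final path component.
        '''
"""
import os.path  # preferred: import the module, reference names through it


def leaf(path):
    """Return the final path component."""  # preferred: one-line docstring
    return os.path.basename(path)
```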

Otherwise, this looks pretty cool. It would be great to get this integrated with Jenkins along with the smoketests and SmokeStack stuff.

review: Needs Fixing
Revision history for this message
Kevin L. Mitchell (klmitch) wrote :

D'oh, I'll fix the Authors thing shortly; sorry about that.

As for instructions, the basic instructions are to set --username, --api_key, --auth_url, and --glance_host; I'll add a README or something to document that. Should I also try to get that information from environment variables? What would be standard?

(FYI, I should note that in my own testing, some of the tests involving instance creation do not pass 100% of the time. That could be problems with nova or problems specifically with the xen support. I've also seen instance deletions that fail to complete.)

In my own Python hacking, I tend to prefer the multi-line docstring style, but I'll adjust to Nova's; sorry for the style mismatch.

I understand the reluctance for "import *"; I'll look into converting the imports to your advised import style. Is it OK if I leave these alone for the adaptor modules (adaptor.py, adaptor_nose.py, and adaptor_dtest.py), however?

Revision history for this message
Vish Ishaya (vishvananda) wrote :

On May 20, 2011, at 8:07 AM, Kevin L. Mitchell wrote:

>
> I understand the reluctance for "import *"; I'll look into converting the imports to your advised import style. Is it OK if I leave these alone for the adaptor modules (adaptor.py, adaptor_nose.py, and adaptor_dtest.py), however?
>

If there is a good reason to use import * i think it is probably fine.

Revision history for this message
Kevin L. Mitchell (klmitch) wrote :

Vish: I believe I've now addressed all your concerns, including adding the README I suggested and removing all cases of importing objects instead of modules except for the conditional imports in adaptor.py. Is there anything else I need to address?

Thanks!

Revision history for this message
Kevin L. Mitchell (klmitch) wrote :

It's been a while, so I'm going to merge in trunk to keep this branch from getting too far out of sync...

Revision history for this message
Kevin L. Mitchell (klmitch) wrote :

Good thing I did so; there was a conflict in Authors. This branch is now ready for merging again. Please review...

Revision history for this message
Brian Lamar (blamar) wrote :

I'm very much for this code to be part of an official OpenStack project. That being said, it deals with testing OpenStack as a whole, as it might be deployed in a live environment. Not only are there tests for Nova, but also for Glance and potentially other OpenStack projects.

This would be a great candidate for openstack-common, if such a project existed. While I don't want to 'disapprove' this, I'm really not sure it belongs in this project...but it sure as heck belongs somewhere because we need this bad.

Revision history for this message
Brian Lamar (blamar) wrote :

Also, any idea why I'm getting this:

(nova): TRACE: Traceback (most recent call last):
(nova): TRACE: File "/usr/lib/pymodules/python2.6/nova/rpc.py", line 198, in _receive
(nova): TRACE: rval = node_func(context=ctxt, **node_args)
(nova): TRACE: File "/usr/lib/pymodules/python2.6/nova/exception.py", line 87, in _wrap
(nova): TRACE: return f(*args, **kw)
(nova): TRACE: File "/usr/lib/pymodules/python2.6/nova/compute/manager.py", line 105, in decorated_function
(nova): TRACE: function(self, context, instance_id, *args, **kwargs)
(nova): TRACE: File "/usr/lib/pymodules/python2.6/nova/compute/manager.py", line 564, in prep_resize
(nova): TRACE: 'Migration error: destination same as source!'))
(nova): TRACE: Error: Migration error: destination same as source!
(nova): TRACE:

during testing? It might be that I'm using libvirt/raw images on CloudServers, so snapshotting should fail because raw images can't be snapshotted?

review: Needs Information
Revision history for this message
Kevin L. Mitchell (klmitch) wrote :

On Tue, 2011-05-31 at 18:08 +0000, Brian Lamar wrote:
> Review: Needs Information
> Also, any idea why I'm getting this:
>
> (nova): TRACE: Traceback (most recent call last):
> (nova): TRACE: File "/usr/lib/pymodules/python2.6/nova/rpc.py", line 198, in _receive
> (nova): TRACE: rval = node_func(context=ctxt, **node_args)
> (nova): TRACE: File "/usr/lib/pymodules/python2.6/nova/exception.py", line 87, in _wrap
> (nova): TRACE: return f(*args, **kw)
> (nova): TRACE: File "/usr/lib/pymodules/python2.6/nova/compute/manager.py", line 105, in decorated_function
> (nova): TRACE: function(self, context, instance_id, *args, **kwargs)
> (nova): TRACE: File "/usr/lib/pymodules/python2.6/nova/compute/manager.py", line 564, in prep_resize
> (nova): TRACE: 'Migration error: destination same as source!'))
> (nova): TRACE: Error: Migration error: destination same as source!
> (nova): TRACE:
>
> during testing?[snip]

This turns out to be a known issue with resizing--resizes require a
second host to migrate the instance to, since both instances have to
exist for a period of time. So, if you're using a single dev box, like
I do, or if the simple scheduler picks the same host as the instance
you're trying to resize, you get this migration error.

(At least it says "Migration error" now; when I first saw this, the
error looked like an invalid entity...)
--
Kevin L. Mitchell <email address hidden>

Revision history for this message
Kevin L. Mitchell (klmitch) wrote :

On Tue, 2011-05-31 at 18:07 +0000, Brian Lamar wrote:
> I'm very much for this code to be part of an official OpenStack
> project. That being said, it deals with testing OpenStack as a whole,
> as it might be deployed in a live environment. Not only are there
> tests for Nova, but also for Glance and potentially other OpenStack
> projects.

Actually, there are no tests for glance in this; it assumes glance is
present and functioning properly, and uses it as part of the testing
it's doing for Nova.

> This would be a great candidate for openstack-common, if such a
> project existed. While I don't want to 'disapprove' this, I'm really
> not sure it belongs in this project...but it sure as heck belongs
> somewhere because we need this bad.

This test suite is targeted specifically for nova. While it may not be
a bad idea to have one that focuses on the entire OpenStack
infrastructure, it is worth remembering that people may not use every
OpenStack component.
--
Kevin L. Mitchell <email address hidden>

Revision history for this message
Brian Lamar (blamar) wrote :

Unfortunately I don't believe there is such thing as a functional test which just involves Nova. You're getting and creating images through the /images interface which for all intents and purposes is Glance. Nova doesn't work without an image service, and that image service is Glance. Sure, right now there is the LocalImageService in Nova, but I'm pretty sure that's being targeted for removal sooner rather than later. When that happens, Glance is going to be a hard dependency of Nova. When Keystone is complete and integrated, it will be a hard dependency of Nova.

Integration tests will mock out the image service, auth service, and any other services which Nova depends on. These tests should be in the nova code base.

Functional tests will not mock out any services and will rely on a complete (or minimally complete) deployment. These tests should be in a common project.
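The distinction drawn here — integration tests mock out the image service, functional tests hit a real deployment — can be sketched with a toy example. The `ServerBooter` class and `show` interface below are invented for illustration, not Nova's actual code:

```python
"""Sketch of the integration-vs-functional distinction above.
The ServerBooter class and image-service interface are invented;
only the mocking pattern is the point."""
import unittest
from unittest import mock


class ServerBooter:
    """Toy component that needs an image service to boot a server."""

    def __init__(self, image_service):
        self.image_service = image_service

    def boot(self, image_id):
        # In a functional test this call would reach a live Glance;
        # in an integration test the service is a fake.
        image = self.image_service.show(image_id)
        return {"status": "BUILD", "image": image["name"]}


class IntegrationStyleTest(unittest.TestCase):
    """Integration style: the image service is mocked out."""

    def test_boot_uses_image_service(self):
        fake_glance = mock.Mock()
        fake_glance.show.return_value = {"name": "ubuntu-10.04"}
        server = ServerBooter(fake_glance).boot("42")
        self.assertEqual(server["image"], "ubuntu-10.04")
        fake_glance.show.assert_called_once_with("42")
```

A functional suite like the one in this branch would instead construct the real client from connection flags and make the same assertions against a live cluster.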

Revision history for this message
Matt Dietz (cerberus) wrote :

Brian: my take is that, given your own admission of need, we should have this sooner rather than later. I can see your side of the argument regarding the utility of another openstack project for entities such as this.

However, there's no particular reason in my mind that we should keep this out of Nova if it would drive value. I'm sure there are hesitant would-be adopters out there that would feel much more secure in trying Nova if they saw that it passes a suite of real functional tests. Additionally, it would personally make *me* feel a lot better, as I can reliably fire off a build and know that it at least continues to function at a basic level. Lastly, I'd like to have a working model out there of what functional tests should look like, as I suspect a lot of one-off functional suites are being written as we speak. I think everyone would feel a lot happier if we had a scaffold to build on.

Given that it *could* exist in an external project, is there any reason to not accept this patch, with the idea that we might pull it out later?

Revision history for this message
Brian Lamar (blamar) wrote :

Yes, my objection to this going in is that the creation of a functional test project for OpenStack should be a priority for the project. I'm getting very frustrated with the amount of code that is put in Nova which doesn't have anything to do with "compute". Putting the right code in the right place is very important to me.

All of this being said, I'd love to work with you to get this out there and integrated with SmokeStack.

review: Disapprove
Revision history for this message
Steve Brown (jpgeek) wrote :

Sorry to butt in here, but we are starting on functional tests for Nova via the openstack/rackspace API as well, and trying to figure out how to proceed.

Let me see if I've got the current situation right:
Currently, there are functional tests in Nova (smoketests), which are technically outside the scope of Nova because they test (directly or indirectly) other components such as Glance. Adding this branch would exacerbate that problem, so we are now looking for where it should go.

The current proposal for where this branch should live is to integrate it with SmokeStack.

I want to review this with Dan Prince, but after talking with Vish yesterday I believe that SmokeStack is itself calling the smoketests in (yup) Nova, and might be running its own Ruby tests as well. This would put us back at the starting point for keeping the code in the right place, and also gets back to Termie's concerns above about mixing in other new frameworks (termie wrote on 2011-04-27). I apologize if I have any of this wrong.

Could we just start an openstack-common as Brian suggested? It would be an awful shame to let this die for lack of a place to put it.

Revision history for this message
Matt Dietz (cerberus) wrote :

Steve: we're definitely not letting this branch die. The hope was indeed to eventually create an openstack project out of these tests. With that said, for now we're going to take the approach of simply putting them on Github. But first we need to pull the tests out of nova and create a new project, and we're unfortunately tied up at the moment. I do hope that we can get to it by next week.

929. By Kevin L. Mitchell

pull-up from cerberus

930. By Kevin L. Mitchell

having the server go into the deleted state is just fine

931. By Kevin L. Mitchell

pull-up from cerberus's branch

932. By Kevin L. Mitchell

merge from cerberus

933. By Kevin L. Mitchell

Add a dependency to test_snap_and_restore; move test_rebuild_server to test_server_actions.py

Revision history for this message
Steve Brown (jpgeek) wrote :

Matt: That is great news! I would also be happy to set up a project for it on Github (or lp) if that would help. Is pulling the tests out of nova just the one directory (nova/smoketests)? Presumably the next step would be to modify nova/smoketests to use Kevin's DTest framework so we can run both EC2 and OpenStack/Rackspace API tests?

Revision history for this message
Steve Brown (jpgeek) wrote :

Any updates on this?

Revision history for this message
Kevin L. Mitchell (klmitch) wrote :

Sorry, I forgot about letting you know. This has been pulled out into a project I called backfire: https://github.com/klmitch/backfire

Revision history for this message
Steve Brown (jpgeek) wrote :

Thanks! I will fork it.

Unmerged revisions

933. By Kevin L. Mitchell

Add a dependency to test_snap_and_restore; move test_rebuild_server to test_server_actions.py

932. By Kevin L. Mitchell

merge from cerberus

931. By Kevin L. Mitchell

pull-up from cerberus's branch

930. By Kevin L. Mitchell

having the server go into the deleted state is just fine

929. By Kevin L. Mitchell

pull-up from cerberus

928. By Kevin L. Mitchell

Pull-up from trunk

927. By Kevin L. Mitchell

Add a README for the integration suite

926. By Kevin L. Mitchell

Also need run_tests to be exported

925. By Kevin L. Mitchell

Add 'Daryl Walleck <email address hidden>' to Authors

924. By Kevin L. Mitchell

Reduce reliance on 'import *' for the adaptors
