
Bug 278319

Summary: add tests for Discovery
Product: z_Archived
Component: Mylyn
Reporter: David Green <greensopinion>
Assignee: David Green <greensopinion>
Status: RESOLVED FIXED
QA Contact:
Severity: enhancement
Priority: P3
CC: mik.kersten, steffen.pingel
Version: unspecified
Target Milestone: 3.3
Hardware: All
OS: All
Whiteboard:

Attachments:
mylyn/context/zip (flags: none)
mylyn/context/zip (flags: none)

Description David Green CLA 2009-05-28 20:46:47 EDT
* network-based unit test
* system test
Comment 1 David Green CLA 2009-06-02 11:49:23 EDT
Mik, I'd like to create a new project for SWTBot tests.  It will make it easier to maintain if we keep the dependencies separate from other tests.  I suggest @org.eclipse.mylyn.discovery.tests.ui@ for a bundle id.  Can I go ahead with this?
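Keeping the SWTBot dependencies in their own bundle could be expressed in that bundle's manifest along these lines. This is only a sketch: the `Require-Bundle` entries are assumptions based on the SWTBot and JUnit 4 bundles of that era, and only the `org.eclipse.mylyn.discovery.tests.ui` symbolic name comes from the thread.

```
Manifest-Version: 1.0
Bundle-ManifestVersion: 2
Bundle-SymbolicName: org.eclipse.mylyn.discovery.tests.ui
Bundle-Version: 1.0.0.qualifier
Require-Bundle: org.junit4,
 org.eclipse.swtbot.swt.finder,
 org.eclipse.swtbot.eclipse.finder,
 org.eclipse.mylyn.discovery.ui
```

Because these requirements live in a separate bundle, the other `*.tests` plug-ins build without SWTBot on the target platform.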
Comment 2 David Green CLA 2009-06-02 11:49:25 EDT
Created attachment 138019 [details]
mylyn/context/zip
Comment 3 Mik Kersten CLA 2009-06-04 13:24:55 EDT
I'm not sure that it makes sense to create a new project, especially if the idea is that we will make more use of SWTBot in the future.  It seems to me that as soon as one test project in our main project set has a dependency on SWTBot, the other test projects should feel free to have that dependency as well?  I assume that we can add the SWTBot check-out to the project set.
Comment 4 David Green CLA 2009-06-04 13:45:49 EDT
There are some reasons why it might be favorable to have it as a separate project, at least for starters:

* the JUnit tests are run using a different test runner/launcher
** it makes it clear to developers what kinds of tests to create (traditional JUnit test runner runs in the UI thread, which is not the case for SWTBot tests)
* it makes it easier to toggle running of SWTBot tests (we may decide not to run them with every build, depending on complexity/resources machine profile etc.)
* it minimizes the risk of introducing SWTBot should we have to drop it (new technology adoption risk)
Comment 5 Steffen Pingel CLA 2009-06-04 15:58:42 EDT
We had a similar discussion when performance tests were added and decided to keep all tests in a single plug-in. I wouldn't worry about dependencies too much as long as we are able to add SWTBot to our map files.

For running tests selectively, test suites have worked well in the past; performance tests and other tests are already run separately as part of the nightly integration build.

I'm +1 for adding SWTBot to existing test plug-ins, unless it adds significant overhead for everyone building from source due to new dependencies.
Comment 6 David Green CLA 2009-06-04 16:15:40 EDT
Keep in mind: to run regular plug-in tests from the Eclipse UI, we use *Run As -> JUnit Plug-in Test*, whereas for SWTBot tests we use *Run As -> SWTBot Test*.  The Ant build will also have to run a different launcher, and SWTBot relies on JUnit 4.  I've done this before, and trying to fit them in the same project is more of a headache than it's worth.
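The "different launcher" point can be illustrated with a build-script fragment. This is purely illustrative: the target names, properties, and suite class names below are assumptions, not taken from Mylyn's actual releng scripts, and the SWTBot headless application id varied between releases.

```xml
<!-- Regular JUnit plug-in tests run via the standard Eclipse test framework -->
<ant target="ui-test" antfile="${eclipse-test-library}">
  <property name="classname" value="org.eclipse.mylyn.tests.AllTests"/>
</ant>

<!-- SWTBot tests need a separate launch: a different headless test
     application supplied by SWTBot, driving a JUnit 4 runner -->
<ant target="swtbot-test" antfile="${swtbot-test-library}">
  <property name="classname"
            value="org.eclipse.mylyn.discovery.tests.ui.AllDiscoveryUiTests"/>
</ant>
```

Keeping the SWTBot suites in one bundle means only the second invocation needs the SWTBot runtime on the test platform.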
Comment 7 Steffen Pingel CLA 2009-06-04 16:52:15 EDT
What about the compile time dependencies? How easy is it to consume those?
Comment 8 David Green CLA 2009-06-04 18:48:19 EDT
(In reply to comment #7)
> What about the compile time dependencies? How easy is it to consume those?

Everything's on an update site: "SWTBot Downloads":http://www.eclipse.org/swtbot/downloads.php
Comment 9 Mik Kersten CLA 2009-06-10 14:19:16 EDT
David: So it sounds like we've converged on adding the SWTBot-coupled tests to any .tests plug-in that needs them?  If so go ahead and add the dependency for the discovery tests and update the map/psf files as specified by Steffen.
Comment 10 David Green CLA 2009-06-10 14:22:53 EDT
(In reply to comment #9)
> David: So it sounds like we've converged on adding the SWTBot-coupled tests to
> any .tests plug-in that needs them? 

No convergence here... it's far better to keep them in their own project.
Comment 11 Steffen Pingel CLA 2009-06-10 21:11:17 EDT
I think we should stick to the current policy of only requiring an Eclipse SDK + team project set for Mylyn development. Considering the overhead for every contributor and committer and the implications for the releng process, David has convinced me that a separate test plug-in for SWTBot tests makes sense. We can still merge the test plug-ins later if needed.

+1 for a separate plug-in
Comment 12 Mik Kersten CLA 2009-06-11 11:23:50 EDT
I may be misunderstanding, but to be clear I'm opposed to the idea of having one additional project per component.  We could end up with two dozen new projects and go against the Eclipse naming conventions of a single test project per component: http://wiki.eclipse.org/Naming_Conventions  I could see having one new project that was free to couple to all Mylyn bundles, e.g., org.eclipse.mylyn.tests.swtbot, org.eclipse.mylyn.tests.integration or org.eclipse.mylyn.tests.ui.

I also don't yet understand how this is different from the case of our performance tests, where we have additional dependencies.  Wouldn't we just add the required SWTBot projects to the team project set?
Comment 13 David Green CLA 2009-06-11 12:06:25 EDT
To keep it simple:

* it's beneficial to keep SWTBot tests in a project separate from other JUnit tests
* there's no need to have multiple SWTBot test projects

What I'm looking for here is a 'nod of approval' to create a test project for SWTBot tests.  I think it's a great idea to have one of these for all of Mylyn.
Comment 14 Mik Kersten CLA 2009-06-11 13:07:02 EDT
From today's call:
* Call the project: org.eclipse.mylyn.tests.ui
* Add all of its dependencies to our team project set
Comment 15 Steffen Pingel CLA 2009-06-12 18:29:19 EDT
David, I think the most valuable test would be to automatically verify installability of all features listed on the site. Do you think it would be feasible to create a unit test that does that, even if it does not click through the discovery UI but drives the code programmatically?
Comment 16 David Green CLA 2009-06-16 00:08:05 EDT
(In reply to comment #15)
> David, I think the most valuable test would be to automatically verify
> installability of all features listed on the site. Do you think it would be
> feasible to create a unit test that does that, even if it does not click through
> the discovery UI but drives the code programmatically?

It's definitely feasible.  I can see the value in verifying that discoverable update sites are available.

There's also a lot of value in verifying that the p2 installation works as expected.  p2 is something that we have little control over and it may change from one release to the next.
Comment 17 David Green CLA 2009-06-16 01:30:04 EDT
I've committed two initial tests:

# a test that selects the Trac connector and finishes the wizard, confirming that the p2 installer UI appears with Trac selected, and
# a test that verifies that all directory listings are enabled

So far the sources are in CVS however the plug-in has not been added to the build.

There's a potential problem with the team project set idea: some of the dependencies (e.g., SWTBot) are in SVN.  If we add this new plug-in to the team project set along with its dependencies, people using it will have to install an SVN Eclipse plug-in such as Subclipse or Subversive.
Comment 18 David Green CLA 2009-06-16 01:30:35 EDT
Created attachment 139246 [details]
mylyn/context/zip
Comment 19 Steffen Pingel CLA 2009-06-16 05:26:05 EDT
That's really cool! It's fun to watch the test execute. 

At the moment my main concern is not so much the UI testing or breakage in future Eclipse versions, since we can verify that manually. What would help most is to know whether the provisioning plan resolves when all available connectors are selected and whether the installation succeeds, i.e. all plug-ins are downloadable from the sites. We are currently making frequent changes to the directory listing and I would like to ensure that the published version actually works.
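The kind of non-UI check described here can be sketched outside the Eclipse test harness. The sketch below is in Python rather than the project's Java, and the directory-listing format (a `<directory>` of `<entry url="..."/>` elements) and the URLs are invented for illustration; the real discovery directory format may differ.

```python
import xml.etree.ElementTree as ET

# Hypothetical discovery directory listing -- format and URLs are
# invented for this sketch, not taken from the real Mylyn directory.
SAMPLE_DIRECTORY = """<directory>
  <entry url="http://example.org/connectors/trac-directory.jar"/>
  <entry url="http://example.org/connectors/bugzilla-directory.jar"/>
</directory>"""

def listed_entry_urls(xml_text):
    """Return the url attribute of every <entry> in the directory listing."""
    root = ET.fromstring(xml_text)
    return [entry.get("url") for entry in root.findall("entry")]

def verify_entries(urls):
    """Return the entries that fail a basic sanity check.

    A nightly build would go further and actually fetch each URL; here
    we only check that every entry is present and uses HTTP(S).
    """
    return [u for u in urls
            if not u or not u.startswith(("http://", "https://"))]

urls = listed_entry_urls(SAMPLE_DIRECTORY)
print(len(urls))             # 2
print(verify_entries(urls))  # []
```

A build-time test in this spirit would fetch the published listing, then attempt to resolve a provisioning plan for each connector, which is the step that catches missing or broken update sites.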
Comment 20 Steffen Pingel CLA 2009-06-16 23:30:52 EDT
I have moved this to the next milestone but feel free to keep committing test cases to head. Tests are not released so we don't have to worry about regressions etc.
Comment 21 David Green CLA 2009-06-17 14:21:35 EDT
Thanks!
Comment 22 Steffen Pingel CLA 2009-09-19 16:26:52 EDT
Committed a fix for the failing tests.
Comment 23 David Green CLA 2009-10-01 13:49:56 EDT
Tests have been created.  Nothing left to do here.