Build Identifier: Some of my build models result in final aggregations that are around 800MB and can take around 45 minutes. If I re-run a successful build, this too can take a significant amount of time — longer than I would expect, given that everything should have already been downloaded. So I am wondering if the Aggregator can be further optimized so that it only downloads things it has not already downloaded (metadata and artifacts). It is quite common to run a build a number of times for different reasons, so it would save a lot of time if the build could be made smarter in this respect, if at all possible. Reproducible: Always
It's an excellent idea. You might recall that we discussed this in detail last spring: a common artifact repository and tagged metadata repositories that can be used as fall-backs when connections are bad, etc.
If we could do something to improve what we already have, that would be good enough for now. For example:
* I do not expect a build to download metadata or artifacts if these have already been downloaded to one of the staging areas. Can we tune p2 in some way to ensure this?
* I do not expect a build to alter the final output directory unless it has a viable plan that it can commit to this area. For example, if I run a successful build and then see I have not added IU A to Custom Category B, then the only thing in the final output that should get updated is the metadata — right at the end.
* I need to be able to run and re-run builds without breaking what is already built. Right now I am having to do clean builds quite often in order to recover from other issues.
* Could we add a validation step at the end of a build that will check the result? (A basic parse to see that metadata and directories are all in the right place. I have this in my publishing build and it catches issues, like missing directories, on occasion.)
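The validation step suggested above could be as simple as checking that the expected entries of a p2 repository exist in the final output directory. The sketch below is a minimal, hypothetical version of such a check: the entry names assume a standard p2 layout with compressed metadata (`content.jar`/`artifacts.jar`); a real repository might use `content.xml`/`artifacts.xml` or a composite layout instead.

```java
import java.io.File;
import java.util.ArrayList;
import java.util.List;

/**
 * Minimal post-build sanity check for a p2 repository layout (sketch only;
 * the entry names below are assumptions, not the Aggregator's actual rules).
 */
public class RepoValidator {

    // Assumed standard p2 layout with compressed metadata.
    private static final String[] REQUIRED = {
        "artifacts.jar", "content.jar", "plugins", "features"
    };

    /** Returns the names of required entries missing under repoRoot. */
    public static List<String> missingEntries(File repoRoot) {
        List<String> missing = new ArrayList<>();
        for (String name : REQUIRED) {
            if (!new File(repoRoot, name).exists()) {
                missing.add(name);
            }
        }
        return missing;
    }

    public static void main(String[] args) {
        File root = new File(args.length > 0 ? args[0] : ".");
        List<String> missing = missingEntries(root);
        if (missing.isEmpty()) {
            System.out.println("repository layout OK");
        } else {
            System.out.println("MISSING: " + missing);
        }
    }
}
```

Run against the final output directory, this would catch the "missing directories" class of problem mentioned above before the result is published.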
One other reason for needing to re-run builds is that things fail. If I apply a fix for something that has failed to mirror, or the site that was down magically comes back up, I need to be able to re-run the build and have it fill in the gaps. I don't really want to start from scratch each time (which I am having to do at the moment, because it seems the final output directory is getting trampled). I will do some more research on this and add detail to this issue about what is happening and when.
It seems that much of the work necessary to speed things up has been done in the latest Aggregator release, which is pleasing: b3 Aggregator Editor (Incubation) / 0.2.0.v20111124-1605 (org.eclipse.b3.aggregator.editor.feature.feature.group). I will do some more testing in December and report back.
[Bookkeeping change only. Moving bugs to the new "home" of aggregator, CBI.]
The latest version of the Aggregator mirrors artifacts on multiple threads, so for SimRel the time is reduced from ~45 minutes to ~7 minutes.
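The speedup comes from overlapping many network transfers instead of downloading artifacts one at a time. The following is a simplified sketch of that idea, not the Aggregator's actual code: `mirrorOne` is a hypothetical stand-in for the real p2 transfer, and the pool size is arbitrary.

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

/** Sketch of mirroring artifacts on a fixed-size thread pool. */
public class ParallelMirror {

    /** Hypothetical stand-in for the actual p2 artifact transfer. */
    static void mirrorOne(String artifactId) {
        // A real implementation would download the artifact here.
    }

    /**
     * Submits one transfer per artifact to a shared pool and waits for all
     * of them to finish; returns the number of completed transfers.
     */
    public static int mirrorAll(List<String> artifactIds, int threads)
            throws InterruptedException {
        AtomicInteger completed = new AtomicInteger();
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        for (String id : artifactIds) {
            pool.submit(() -> {
                mirrorOne(id);
                completed.incrementAndGet();
            });
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
        return completed.get();
    }
}
```

With network-bound transfers, the wall-clock time approaches the longest single download plus queuing, rather than the sum of all downloads — consistent with the ~45 minute to ~7 minute improvement reported for SimRel.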