There is a regression in OpenCloseViewTest#showView:BookmarkView(). On one Windows machine it is 20%, and on the other Windows machine it is more than 100%. There is no regression on the Linux machine. This seems to have been introduced sometime around March 5th.
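For clarity, the percentages above are relative to a baseline run of the same scenario. As a minimal sketch (the timings below are made up for illustration; the actual baseline numbers are not in this report), a "20%" or ">100%" regression would be derived like this:

```java
// Illustrates how a regression percentage relates baseline and current
// elapsed times. All figures here are hypothetical.
public class RegressionPercent {

    // Percent increase of the current elapsed time over the baseline.
    static double regressionPercent(double baselineMs, double currentMs) {
        return (currentMs - baselineMs) / baselineMs * 100.0;
    }

    public static void main(String[] args) {
        // A view-open scenario that took 300 ms in the baseline build:
        System.out.println(regressionPercent(300, 360)); // 20.0  -> a 20% regression
        System.out.println(regressionPercent(300, 650)); // ~116.7 -> "more than 100%"
    }
}
```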
Looking at this.
From the test history of I-builds: the 0301 build is OK, the 0308 build is bad. This makes the following bugs suspect:

+ Bug 331992. Workspace lock dialog not brought to front
+ Bug 231081. [Markers] Polish Problems view's columns preferences (REOPENED)
+ Bug 283820. [Contexts] NPE on Keys preference page in ContextModel.filterContexts (FIXED)
+ Bug 318914. [WorkingSets] Provide a preference to set size of the list of most recently used working sets (FIXED)
+ Bug 327396. [WorkingSets] updating of working sets during workbench restore can cause loss of working sets (NEW)
+ Bug 333417. [KeyBindings] Rename "Workbench" context and hide if not supported (FIXED)
+ Bug 335308. [JFace] JavaDoc of ControlDecoration#setDescriptionText is wrong (FIXED)
+ Bug 335960. [IDE] Update BuildAction to use new Workspace Build Configurations API (FIXED)
+ Bug 338056. SourceProviders through plugin.xml do not work (FIXED)
+ Bug 338843. Update Display#getAppMenuBar() calls (FIXED)
Kim, how do I get the hardware configurations of the machines on which the performance tests are running?
The machines are all 2 x 3.00GHz machines with 3.00GB of memory. http://wiki.eclipse.org/Platform-releng-faq#What_hardware_comprises_the_platform-releng_build_infrastructure.3F
(In reply to comment #4)
> The machines are all 2 x 3.00GHz machines with 3.00GB of memory.
>
> http://wiki.eclipse.org/Platform-releng-faq#What_hardware_comprises_the_platform-releng_build_infrastructure.3F

Hmm, so they are identical machines? The link says:

# epwin2 (winxp with 1.5 vm) G on large KVM in rack
# epwin3 (winxp with 1.6 vm) 7 on large KVM in rack

The test results say:

epwin2: Win XP Sun 1.6.0_17 (2 x 3.00GHz - 3GB RAM)
epwin3: Win XP Sun 1.6.0_17 (2 x 3.00GHz - 3GB RAM)

I am puzzled as to why, on two supposedly identical machines (epwin2 and epwin3), the same test consistently runs about 10 times faster on epwin2 than on epwin3 (~300ms on epwin2 vs. ~3.8s on epwin3).
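As a quick sanity check on the "10 times faster" claim, the gap between the two reported timings can be computed directly (using the ~300 ms and ~3.8 s figures quoted above):

```java
// Checks how large the epwin2-vs-epwin3 gap actually is, using the
// approximate timings reported in this bug.
public class MachineGap {

    // How many times slower the slow machine is relative to the fast one.
    static double speedup(double slowMs, double fastMs) {
        return slowMs / fastMs;
    }

    public static void main(String[] args) {
        double epwin2Ms = 300;   // ~300 ms observed on epwin2
        double epwin3Ms = 3800;  // ~3.8 s observed on epwin3
        System.out.println(speedup(epwin3Ms, epwin2Ms)); // ~12.7x, roughly an order of magnitude
    }
}
```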
By the way, I cannot duplicate the drop in performance on a Windows 7 64-bit machine.
(In reply to comment #6)
> By the way, I can not duplicate drop in performance on Windows 7 64bit machine.

Satyam tried it on his Win XP machine and couldn't duplicate that >100% drop either. What's puzzling is that the Bookmarks view is just another instance of the MarkersView (like the Problems view, Tasks view, etc.), and there is no regression in those. The regression started sometime between March 3rd and March 5th, and nothing significant was checked in during that window: http://download.eclipse.org/eclipse/downloads/drops/I20110329-0800/performance/epwin3/raw/Scenario356.html
Sometimes the machines just need to be rebooted. I can do that later this week, I'm out of the office right now.
I still can't reproduce it. On my Windows 7 machine the timing using the current build is about the same as in 3.6.2. If there is any difference it is less than about 10%, and that gets hidden by the test's variability. Prakash, Satyam, have you been able to reproduce this?
(In reply to comment #9)
> I still can't reproduce it. On my Windows 7 machine the timing using the
> current build is about the same as in 3.6.2. If there is any difference it is
> less than about 10% and that gets hidden by the test's variability.
>
> Prakash, Satyam, have you been able to reproduce this?

I could also see only around a 10% regression on a Windows XP box.
Created attachment 193738 [details]
Add degradation comment to the test

Nobody seems to be able to reproduce the results from the epwin3 test machine. The actual difference between 3.6.2 and the current code is less than 10%, if there is any. (For me, at least 1 test group out of 3 is faster on 3.7.) It is possible that we had two slight performance degradations in 3.7, one on the order of 3% and another on the order of 5%. Due to the high test variability, tracing those changes back to their source is not practical. I'll add a degradation comment to the test, as we are fairly sure those numbers do not indicate a real problem. I'll keep this bug open so that, time permitting, we can revisit it in 3.8.
See also bug 343297.
This bug hasn't had any activity in quite some time. Maybe the problem got resolved, was a duplicate of something else, or became less pressing for some reason - or maybe it's still relevant but just hasn't been looked at yet. If you have further information on the current state of the bug, please add it. The information can be, for example, that the problem still occurs, that you still want the feature, that more information is needed, or that the bug is (for whatever reason) no longer relevant. If the bug is still relevant, please remove the stalebug whiteboard tag.