ReleaseEngineering/Buildduty/SVMeetings/Oct26-Oct30


Upcoming vacation/PTO: vlad - nov 3; alin - dec 24, dec 28-31

Meetings every Tuesday and Thursday

https://wiki.mozilla.org/ReleaseEngineering/Buildduty/SVMeetings/Sept28-Oct2
https://wiki.mozilla.org/ReleaseEngineering/Buildduty/SVMeetings/Sept28-Oct2-coop-visit
https://wiki.mozilla.org/ReleaseEngineering/Buildduty/SVMeetings/Oct5-Oct9
https://wiki.mozilla.org/ReleaseEngineering/Buildduty/SVMeetings/Oct12-Oct16
https://wiki.mozilla.org/ReleaseEngineering/Buildduty/SVMeetings/Oct19-Oct23


[vlad] https://bugzilla.mozilla.org/show_bug.cgi?id=1203128

[alin] https://bugzilla.mozilla.org/show_bug.cgi?id=1204970
updated the bug with the current version of the script; example of the resulting output:

   CRITICAL Pending Builds: 9764
   Top Builds by Platform:
      win32 --> 3859 builds
      win64 --> 2267 builds
      linux64 --> 896 builds
      linux --> 668 builds
      macosx64 --> 522 builds

when running the script, I noticed that sometimes it outputs an error like:
“Unterminated string starting at: line 130929 column 9 (char 4940149)”
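For context, a minimal sketch of the kind of per-platform pending-build count described above. The buildapi URL and the JSON layout ({"pending": {branch: {revision: [build, ...]}}}) are assumptions, not taken from the script attached to the bug; a truncated download would surface as exactly the kind of JSON parse error quoted above.

   # Hypothetical sketch, not the script from bug 1204970: count pending
   # builds per platform from buildapi.  URL and JSON layout are assumptions.
   import json
   from collections import Counter
   from urllib.request import urlopen

   PENDING_URL = "https://secure.pub.build.mozilla.org/buildapi/pending?format=json"

   def pending_by_platform():
       try:
           data = json.load(urlopen(PENDING_URL))
       except ValueError as err:
           # a truncated response shows up as e.g.
           # "Unterminated string starting at: line ... column ..."
           print("WARNING: could not parse pending builds: %s" % err)
           return None
       counts = Counter()
       for revisions in data.get("pending", {}).values():
           for builds in revisions.values():
               for build in builds:
                   # the "platform" key is an assumption about the payload;
                   # the real script may derive it from the builder name
                   counts[build.get("platform", "unknown")] += 1
       print("Pending Builds: %d" % sum(counts.values()))
       for platform, n in counts.most_common(5):
           print("%s --> %d builds" % (platform, n))
       return counts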

     2. re-imaged several Windows slaves and restarted some others yesterday
        marked the bugs as resolved for t-w864-ix-192 and t-w864-ix-193
        currently monitoring the evolution of: t-w864-ix-043, t-w864-ix-164, t-w732-ix-117, t-xp32-ix-030

kmoir: Looks like https://bugzilla.mozilla.org/show_bug.cgi?id=1210395 has only two dependent test failures, and then we can enable 10.10.5 on trunk. Have you gone through the list of loaners recently and asked if people still need their loans? Our pending counts have been so high lately, it would be good to get as many machines in pools as possible.


Reallocating 30 linux64 machines to Windows testing: https://bugzilla.mozilla.org/show_bug.cgi?id=1217494

Has this bug been a problem lately? tst-emulator64 running out of space: https://bugzilla.mozilla.org/show_bug.cgi?id=1217863. nthomas provided a patch, I landed it and created a new golden AMI; will keep an eye on it.

[Vlad] I will start looking over it to check if the patch resolves the problem. Looked over some tst-emulator64 spot instances and the space seems to be OK.

Linux64 machines https://bugzilla.mozilla.org/show_bug.cgi?id=1217494: open a bug with relops to re-image these machines as Windows machines; create a patch to add the new machines to slavealloc and graphserver.

new bug: decommission more pandas/foopies and mobile imaging servers once bug 1183877 lands (https://bugzilla.mozilla.org/show_bug.cgi?id=1193002). Could you go through the list of pandas that are showing up as broken in slave health (https://secure.pub.build.mozilla.org/builddata/reports/slave_health/) and reboot them to try to get them to work? Then we can move forward with patches to disable some more panda racks.

[Vlad] Amy suggested on the bug to use Mozpool; I think this documentation is OK: https://wiki.mozilla.org/ReleaseEngineering/Mozpool/How_To_Use_the_Mozpool_Web_UI ?
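(For illustration only: a rough sketch of power-cycling broken pandas through Mozpool's HTTP interface rather than clicking through the Web UI. The server URL and the per-device power-cycle endpoint below are assumptions about the Mozpool API and should be checked against the documentation linked above before use.)

   # Hypothetical sketch: ask Mozpool to power-cycle a list of broken pandas.
   # The server URL and the "/api/device/<name>/power-cycle/" path are
   # assumptions -- verify them against the Mozpool docs.
   import requests

   MOZPOOL_URL = "http://mozpool-server.example.com"   # placeholder, not a real host

   def power_cycle(device):
       resp = requests.post(
           "%s/api/device/%s/power-cycle/" % (MOZPOOL_URL, device),
           json={},            # assumed: empty body means a plain reboot
           timeout=30)
       resp.raise_for_status()

   for panda in ["panda-0001", "panda-0002"]:   # hypothetical device names
       power_cycle(panda)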


https://bugzilla.mozilla.org/show_bug.cgi?id=1218406


[alin] working with Vlad on the patch to disable r5 on trunk and enable r7 on trunk; will upload the patch after the call. Do we have a solution for the high pending counts at the moment, or should we wait until the talos-linux32 slaves get re-imaged and enabled as win732 slaves?

   → 12247 atm


29.10.2015 - 30.10.2015

[alin] https://bugzilla.mozilla.org/show_bug.cgi?id=1203128
@Kim: if you have time, please take a look:
path to test master: /builds/buildbot/alin.selagea/test_trunk
issue: I don’t see why this script does not work correctly; it seems that it adds talos jobs for non-trunk branches in the case of yosemite_r7, and fails to delete the talos jobs for trunk branches in the case of yosemite (r5).
I tried to manually determine the trunk and non-trunk branches on my local PC:

   ride_trains_branches = []
   for name, branch in items_at_least(BRANCHES, 'gecko_version', 45):
       ride_trains_branches.append(name)

   not_ride_trains_branches = []
   for name, branch in items_before(BRANCHES, 'gecko_version', 45):
       not_ride_trains_branches.append(name)

ride_trains_branches: ['b2g-inbound', 'jamun', 'elm', 'mozilla-central', 'oak', 'alder', 'try', 'larch', 'cedar', 'date', 'ash', 'mozilla-inbound', 'fx-team']

not_ride_trains_branches: ['mozilla-b2g37_v2_2', 'mozilla-release', 'cypress', 'mozilla-esr38', 'mozilla-aurora', 'mozilla-beta']

if I use something like:

   delete_slave_platform(BRANCHES, PLATFORMS, {'macosx64': 'yosemite_r7'},
                         branch_exclusions=['b2g-inbound', 'jamun', 'elm', 'mozilla-central', 'oak',
                                            'alder', 'try', 'larch', 'cedar', 'date', 'ash',
                                            'mozilla-inbound', 'fx-team'])

→ it will work fine and won’t generate the extra talos jobs
→ I initially thought that ride_trains_branches was not being generated correctly, but I don’t see extra jobs other than talos, so I really don’t know what the issue is here
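(A minimal sketch, not a tested fix: since the exclusion list above is exactly the set of trunk branches, it could be built from the same items_at_least() helper used earlier instead of being hard-coded, so that new trunk branches are picked up automatically.)

   # sketch only: derive the exclusions from items_at_least() rather than
   # hard-coding the branch names
   trunk_branches = [name for name, branch in items_at_least(BRANCHES, 'gecko_version', 45)]
   delete_slave_platform(BRANCHES, PLATFORMS, {'macosx64': 'yosemite_r7'},
                         branch_exclusions=trunk_branches)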

kmoir> not sure what is happening here, will have to look further