Auto-tools/Automation Development/Meetings/WorkWeek July2011/Notes:ashughes

Summary

The following is a summary of the notes I took during the QA Automation Services Work Week, Europe, July 2011.

United Kingdom Talks

Test Development Process

  • Juan & I to work with devs to identify features needing testing
  • AutoSvc & I work to identify which can be automated easily
  • Waverley & I work to identify requirements to get that automated
  • I file bugs for API requirements
  • AutoSvc works to get API in place
  • Waverley develops tests once API in place
  • Waverley directly contacts AutoSvc with blockers via bugzilla (with me cc'd)
  • Minor modifications to API can be submitted via patch
  • Any issues greater than a single simple blocker to be liaised through me
  • Patches reviewed and checked-in by me, super-reviewed by Geo & Henrik
  • Process should begin 1 month before end of quarter so API in place near beginning of quarter
  • Backlog automated in downtime based on state of APIs

Liaison Role

  • work as a go between during the planning process
  • work as a go between for larger issues than simple blockers
  • if blocked or needing reviews, leverage AutoSvc in Europe
  • file bugs and ask for help in #mozmill channel
  • use daily stand-up to increase awareness of issues, even if resolved

Dynamics

  • collaborating more as a team, more personal investment in other people's projects
  • need to develop skills/knowledge to be able to collaborate
  • one way is to include contribution to another project in your goals
  • when there is a red flag issue, team needs to pull together to correct it
  • should we treat Mozmill as a product outside of our control: use the shared modules to provide functionality Mozmill cannot, and migrate that functionality into Mozmill when it becomes available
  • Selenium and Mozmill should not define the team; the team should be about problem solving with automation, not isolated to automation projects and QA
  • succeed and fail as a team; if someone is falling behind on a project, rest of the team helps; spread the load, shift responsibilities

Identity

  • consultants for Mozilla as problem solvers
  • communicated as a team, if one person fails on a goal the team fails on a goal, always framed in terms of "we"
  • treat every goal as a learning experience, discovering processes which work
  • if something comes up in an individual meeting, add it to the backlog for discussion with the team

P2PU

  • create courses around the things we do, Selenium, Mozmill, processes, etc
  • can be any topic which relates to the team
  • good resource for new hires

Responsibilities

Automation Services

  • API Module creation
  • Fix failing APIs
  • Code reviews for APIs, frameworks, etc
  • Framework extension, subclassing, etc
  • Tool development (ie. scripts to ease automation execution)
  • Engage with Product team on Triaging tests for automation
  • Engage with Tools team on Framework development
  • Engage with Tools team on Framework maintenance
  • Engage with Development team on Product fixes
  • Evangelising/Consulting
  • Engage with Product team on bug management
  • Engage with Product team on Product testdays
  • Run automation testdays
  • Community fostering, engagement, growth, blogging, etc for APIs, Frameworks, etc
  • Determining requirements for infrastructure
  • QA for Tools, Frameworks, etc
  • Proactively address test execution infrastructure needs
  • Engage with Al for AST infrastructure needs
  • Recruiting for their team
  • Engagement:
    • Developers -- best practices, tools, frameworks, etc
    • Internal toolsets
    • External toolsets
    • Product teams -- how can AST help facilitate automation strategies
    • Recruiting on Product, Development
  • Knowledge of Firefox/Platform (whitebox)


Product

  • Triage tests for automation
  • Debugging broken tests
  • Fixing broken tests
  • Code reviews for tests
  • Maintaining infrastructure software
  • Maintaining infrastructure hardware
  • Automated test execution
  • Manual test duties (releases, feature ownership) -- AST will help out for all-hands-on-deck, testdays, etc where available
  • Bug management
  • Running Product testdays
  • Community fostering, engagement, growth, mentorship for test development and in general
  • Result monitoring, reporting
  • Test development
  • Testing patches for integration in testruns for all platforms (needs a Try server)
  • Engage with Al and AST for infrastructure needs
  • Recruiting for test development, Product

Tools

  • Framework development
  • Framework maintenance

Code Reviews

  • make sure a patch is code complete, covers the manual test first
  • single style review -- not a dogmatic approach, as long as it's clear
  • if there are one or two nits, committer can change during check-in
  • Google JS Styleguide?
  • Better Bugzilla integration with Pivotal -- how can the two play nicer and reduce the overhead of duplicating work between them
  • Review Models -- tests should be complete, functional, and clear:
    • 1) Review for completeness against the Litmus test
    • 2) Review for functionality, ie. the test does not fail when integrated on all platforms
    • 3) Review for clarity, ie. follows style closely and is easy to understand
  • Alex/Vlad can review while I am not online -- spot-check each other's work
  • Anthony reviews all patches before checking them in
  • Need a 2nd for reviews/check-in in my absence

Community

  • no altering "message" for community vs external teams vs internal
  • community members usually only stick around for a task or two
  • can we utilize Mozilla Reps? Contributor Engagement team?
  • active community increases visibility into process, documentation
  • need to lower barrier to entry
  • meetup, conference, testday attendance
  • ramp up to committing code is extremely long
  • need to communicate more publicly as a default, advertise our public channels more

Community Education

  • Wiki, QMO Docs, MDN, P2PU, Blog, Screencast
  • P2PU:
    • How to write an automated test?
    • How to write a Selenium test?
  • Reach out to community to find out what they want to learn

Testdays

  • Testdays need to communicate benefit to testers:
    • transferable skills development
    • visibility
    • part of a "cool" community
  • when a testday fails, figure out what went wrong and learn how to fix it, and try again
  • contributions can be included on resume, linked
  • Push easy starter tasks to other platforms like Mechanical Turk
  • Having interns try to run events supported by us when they go back to their school
  • "offline" testdays
  • Need a bot to welcome people and ping the moderator, link people to appropriate tasks (a rough sketch follows this list)
  • allow for day-long testdays with certain unmoderated windows, ie. "testday for X on July 16th"
  • testdays with unmoderated windows, Mozilla office hours
  • alleviate timezone confusion
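
The welcome bot above is only an idea at this point. As a rough illustration (not anything we have built), something along these lines could greet people joining a testday channel and point them at the tasks and the moderator -- the server, channel, nick, moderator, and URL below are all placeholders:

    import socket

    # Placeholder values -- a real bot would use the actual testday channel,
    # a proper nick, the moderator's nick, and the current tasks page.
    SERVER, PORT = "irc.mozilla.org", 6667
    NICK = "testday-greeter"
    CHANNEL = "#testday"
    MODERATOR = "moderator-nick"
    TASKS_URL = "https://quality.mozilla.org/"

    def send(sock, line):
        # IRC is a line-based protocol; every command ends with CRLF.
        sock.sendall((line + "\r\n").encode("utf-8"))

    sock = socket.create_connection((SERVER, PORT))
    send(sock, "NICK " + NICK)
    send(sock, "USER %s 0 * :Testday greeter bot" % NICK)

    buf = ""
    while True:
        buf += sock.recv(4096).decode("utf-8", "replace")
        lines = buf.split("\r\n")
        buf = lines.pop()  # keep any partial line for the next read
        for line in lines:
            if line.startswith("PING"):
                send(sock, "PONG " + line.split(" ", 1)[1])  # keep-alive
            elif " 001 " in line:
                send(sock, "JOIN " + CHANNEL)  # registered, now join the channel
            elif " JOIN " in line and CHANNEL in line:
                nick = line.split("!", 1)[0].lstrip(":")
                if nick != NICK:
                    send(sock, "PRIVMSG %s :Welcome %s! Current tasks: %s "
                               "(ping %s if you get stuck)"
                               % (CHANNEL, nick, TASKS_URL, MODERATOR))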

Work Flow

Failures

  • Product teams are the first line of defence
  • Failure discovery via dashboards
  • Failure debugging and fixing done via bugzilla by Waverley
  • If API problem, goes to Automation Services team via dependency bug
  • If product problem, goes to Development team via dependency bug

Bugzilla

  • Index whiteboard tags on Wiki/QMO
  • Use for bugs for test development, fixing

Pivotal

  • Use for work flow tracking
  • Needs documenting for work flow
  • Problem with reporting on efforts which span multiple Pivotal projects (ie. Endurance)
  • Need tighter integration with bugzilla to reduce overhead (middleware? -- see the sketch after this list)
  • Using Pivotal to create new bugs, review queues, dependencies
  • Automation Services will reach out to Pivotal team to fact-find
  • Discoverability:
    • link from wiki, repository readme, scrumpad
    • QMO project page?
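
Nothing has been decided about middleware between Bugzilla and Pivotal; purely as an illustration, a small sync script could pull bugs carrying a given whiteboard tag out of Bugzilla and mirror them into a Pivotal project. The whiteboard tag, project id, token, endpoints, and field names below are assumptions that would need to be verified against the current Bugzilla and Pivotal Tracker APIs:

    import json
    import requests

    # Placeholders -- none of these values come from the notes.
    BUGZILLA_SEARCH = "https://bugzilla.mozilla.org/rest/bug"
    WHITEBOARD_TAG = "[mozmill-test]"
    PIVOTAL_PROJECT_ID = "123456"
    PIVOTAL_TOKEN = "your-pivotal-api-token"
    PIVOTAL_STORIES = ("https://www.pivotaltracker.com/services/v5/projects/"
                       "%s/stories" % PIVOTAL_PROJECT_ID)

    def fetch_tagged_bugs():
        """Fetch bugs whose status whiteboard carries our tracking tag."""
        params = {"quicksearch": "whiteboard:%s" % WHITEBOARD_TAG}
        response = requests.get(BUGZILLA_SEARCH, params=params)
        response.raise_for_status()
        return response.json().get("bugs", [])

    def mirror_to_pivotal(bug):
        """Create one Pivotal story per bug (de-duplication is left out)."""
        story = {
            "name": "Bug %s - %s" % (bug["id"], bug["summary"]),
            "description": "https://bugzilla.mozilla.org/show_bug.cgi?id=%s" % bug["id"],
        }
        response = requests.post(PIVOTAL_STORIES,
                                 data=json.dumps(story),
                                 headers={"X-TrackerToken": PIVOTAL_TOKEN,
                                          "Content-Type": "application/json"})
        response.raise_for_status()

    if __name__ == "__main__":
        for bug in fetch_tagged_bugs():
            mirror_to_pivotal(bug)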

Goals

  • we should have a "learn from each other" goal
  • we should work on fewer projects more collaboratively, fewer silos
  • feeling little freedom for personal development goals
  • old projects (backlog) lose out to new projects coming down the pipe
  • percentage of goals should be devoted to backlog projects
  • personal goals which have dependencies should have a sister goal
  • are we working on projects we shouldn't?
  • how well informed is Matt about the work we are doing?
  • main goal should be supportive of the team
  • should have conversations about cost/priority to backlog projects when a new project is injected
  • Matt needs to make the call on shifting priorities vs hiring more hands
  • 3-month preallocated goals are hard to predict because we are a reactive team, moving to a new model is hard to defend at this point
  • tie to bonus structure causes some issues that mean we can't move away from quarterly goals completely -- need to interface with other reactive teams (ie. IT)
  • review goals every 2 weeks to stay on track

Internal Communication

Meetings

  • bi-weekly team meeting
  • twice-weekly stand-up meeting (hobnob):
    • 15 minute chat
    • during overlap period
    • not a status meeting
    • raising blockers, visibility for issues where you need assistance
    • not a discussion, but a monologue -- raise issues, take them online after the meeting
  • if a tool ceases to be useful, the team will make a call to stop using it
  • what works for one team may not work for other teams
  • team needs to decide policy, procedure, tools changes for themselves
  • 1-on-1s as needed, unscheduled
  • Raise issues in bi-weekly meeting
  • if 1-on-1 discussion is needed, contact that person directly
  • if group discussion is needed, contact the team via email/mailing-list and break out chats from that
  • can approach team with issues following hobnobs

External Communication

Product Teams

  • Automation Services team opens the door for feedback after providing a particular tool or service
  • Liaisons should feel free to communicate feedback and criticisms as soon as necessary
  • Automation Services team reaches out to get feedback on what was "good"
  • Projects:
    • inform AST at project inception and start reaching out for answers
    • prioritize projects before approaching AST; don't dump a lot of projects on them at once
    • AST needs to maintain a priority list
    • What is the open-shop date for taking on new projects? Need to build out infrastructure first

Newsletter

  • bi-weekly in line with team meetings, every Monday, on QMO
  • cross post to mailing lists with a reminder that it's been posted

QMO

  • add meetings, talks, presentations to event calendar
  • newsletter posted every 2 weeks on Mondays to summarize team goings on
  • meeting notes posted following the meeting as a blog post
  • do we want an intermediary user so community members can create content? ie. Aleksej
  • need to commit to blogging
  • Forums:
    • Usenet plugin?
    • being used as a SUMO entry point
  • need to streamline down to 1 realtime (IRC), 1 persisted (QMO, newsgroup, etc)
  • concerns about spammers and toxic members are a problem we don't need to solve right now
  • need stickies, categories list on top of recent posts from main forums page

Infrastructure

Current

  • WebQA:
    • 6 Mac Minis: 3 Win VMs, 2 Linux VMs
  • Mozmill:
    • 2 Mac Pros (Release Testing): 5 Win VMs, 2 Linux VMs
    • 1 Mac Pro (Daily Testruns): 5 Win VMs, 2 Linux VMs
    • 1 Mac Mini (Sandbox): OSX 10.7, some VMs
  • ESX
  • MozQA.com: Rackable server

Future

  • 8 rackable servers, 2 Mac Pros incoming
  • WebQA needs a Try server so they can isolate issues on experimental branches, grid issues
  • Mozmill needs a Try server so Waverley can test their patches across all platforms before asking for review
  • Automated creation of virtual-env to encapsulate dependencies so they aren't lost on major updates
  • On-demand delivery of VMs/snapshots via ESX (start a VM based on a template); lifecycle sketched below:
    • testrun requested > VM created based on template > VM starts > testrun executed > reports published > VM shuts down > VM destroyed
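
The lifecycle above could be driven by a small orchestration script. The sketch below only shows the shape of it -- the ESX class is a stub that prints instead of talking to a real server, and the template name, testrun script, and report URL are placeholders for whatever the infrastructure team and our automation scripts actually provide:

    import subprocess
    import uuid

    class StubESX(object):
        """Stand-in for the ESX/vSphere integration -- it only prints what it
        would do; the real class would wrap whatever tooling Al provides."""

        def clone_from_template(self, template):
            name = "testrun-%s" % uuid.uuid4().hex[:8]
            print("cloning %s from template %s" % (name, template))
            return name

        def start(self, vm):
            print("starting %s" % vm)

        def shutdown(self, vm):
            print("shutting down %s" % vm)

        def destroy(self, vm):
            print("destroying %s" % vm)

    def run_testrun(esx, template, testrun_script, report_url):
        """Provision a throwaway VM, run one testrun, then clean it up."""
        vm = esx.clone_from_template(template)  # VM created based on template
        esx.start(vm)                           # VM starts
        try:
            # Testrun executed -- in reality this would be kicked off inside
            # the VM (eg. over ssh); shelling out locally is only a stand-in,
            # and the script name/report flag are illustrative. Reports are
            # published by the testrun script itself.
            subprocess.call(["python", testrun_script, "--report=%s" % report_url])
        finally:
            esx.shutdown(vm)                    # VM shuts down
            esx.destroy(vm)                     # VM destroyed

    if __name__ == "__main__":
        run_testrun(StubESX(), "win7-32-template",
                    "testrun_daily.py", "http://example.com/reports")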

Concerns

  • too many machines will be hard to maintain unless there is some sort of automated lockstep infrastructure (ie. Puppet)
  • need checklist documentation for updates -- ie. the OSX 10.7 update upgraded python 2.6 to 2.7, resulting in missing site libraries and broken update automation (a minimal post-update check is sketched after this list):
    • what's running on what server and what's needed for it to run correctly
    • reviewing release notes
    • checking Google for potential issues
    • smoketest to validate -- ie. Mozmill servers can run a daily testrun and an update testrun
  • resource contention from other teams
  • costing requests for resources
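
As a starting point for the update checklist above, a minimal post-update environment check could catch the kind of breakage we hit with the OSX 10.7 upgrade (python 2.6 to 2.7, missing site libraries). The expected python version and module list below are examples only; each host's checklist would name its own dependencies:

    import sys

    EXPECTED_PYTHON = (2, 7)
    REQUIRED_MODULES = ["mozmill", "jsbridge", "mozrunner"]  # example set

    def check_environment():
        """Return a list of problems found with the host's python setup."""
        problems = []
        if sys.version_info[:2] != EXPECTED_PYTHON:
            problems.append("python is %s.%s, expected %s.%s"
                            % (sys.version_info[0], sys.version_info[1],
                               EXPECTED_PYTHON[0], EXPECTED_PYTHON[1]))
        for name in REQUIRED_MODULES:
            try:
                __import__(name)
            except ImportError:
                problems.append("missing site library: %s" % name)
        return problems

    if __name__ == "__main__":
        issues = check_environment()
        for issue in issues:
            print("WARNING: %s" % issue)
        # Exit non-zero so the update checklist (or a cron job) can flag the host.
        sys.exit(1 if issues else 0)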

Responsibilities

  • AST provides new/updated tools/infrastructure, Al installs/updates those tools (supported by AST as needed)
  • New hardware, software requests go through Al
  • Support of on-demand ESX snapshots done by AST; Support for the ESX Server itself done by Al
  • Automation Services Team:
    • Support and maintenance of APIs, scripts, and tools
    • Consulting on and creation of new APIs, scripts, and tools
  • Infrastructure Team:
    • Support and maintenance of hosts, VMs, and installed software
    • Consulting on and provisioning of new hosts, VMs, and installed software
  • Product Teams:
    • Maintenance and execution of tests
    • Maintain testrun configurations
    • Development of new tests
    • Escalate API, script, and tool issues to Automation Services

Group Docs

Public

  • mentorship - QMO
  • new members - QMO
  • meeting calendar - syndicated Google Calendar
  • meeting minutes - QMO
  • calendar of working hours - syndicated Google Calendar
  • developer tutorials - MDN, linked from QMO
  • reference documentation - MDN
  • design documentation - WIKI
  • project documentation/roadmaps - WIKI

Private

  • new hires - INTRANET
  • interview questions - INTRANET
  • calendar of working hours - private Google Calendar

Other Notes

  • need to find a way for community to report what they are working on for us
  • how do we handle reporting private status vs public status
  • is there a status reporting tool which can serve both of these roles

Events

Goals

  • want more contributors
  • increase awareness of team and projects
  • skill growth for both team and attendees
  • visibility of members and what they do
  • networking and innovation
  • interfacing with existing community

Branding

  • wearing mozilla clothing when attending events, get the brand out there
  • "i do this at mozilla"

Types of Events

  • local meetup events: hosted, sponsored, attended
  • speaking/presenting/teaching
  • pure social meetups
  • hackathons
  • testdays: local and virtual
  • school events
  • gtac
  • selenium meetup, london
  • mozilla festival berlin
  • mozilla festival london
  • use Mozilla Spaces to hold events, invite user groups to hold events
  • Drumbeat
  • Brownbags
  • MozPub

Other Notes

  • More contributors are a side-effect of these efforts being successful, not necessarily the end goal
  • top 20% in profession attend events -- sometimes if they say "yes" they tell themselves "no"
  • students attend events to build skills, experience, and gain visibility (both to themselves and different technologies)
  • should identify events which align with both personal passion and core values
  • try to engage/know your audience, tweak the message/delivery for each specific audience

TShirt

  • Meetup Attire: "ask me about automated testing"
  • Team: something which identifies team identity: community of problem solvers

Retrospective

What Went Well?

  • hard problems solved
  • team gelling
  • comprehensive discussion
  • agility in adjusting schedule to the needs of the discussions
  • distraction-free environment (quiet location, lack of internet)
  • got a good feeling for setting a precedent for good teamwork
  • outsiders' opinions were well considered
  • group ownership -- sharing lead responsibilities
  • etherpad for itinerary planning

What Did Not Go Well?

  • disconnected from home-base
  • hotels -- who is responsible for payment? travel agency snafu?
  • failure to recognize/involve Cameron -- monopolized by TCM?

What Can We Do Better?

  • not a lot of concrete actionable items
  • location and logistics
  • pre-planning and organization
  • pre-planned budget for event
  • collaboration on organizing itineraries
  • expenses and communication of what is expected
  • better communication between team members when issues arise and are resolved
  • mozilla provided sim cards for upcoming trips?


Action Items

  • Doc Platforms
    • QMO: newcomers -- discoverability portal for getting involved, what's going on, tutorials
    • Wiki: contributors -- project plans, design docs
    • MDN: developers -- reference docs for development
    • Intranet: internal -- docs relevant to MoCo
    • QMO > WIKI > MDN distillation of information
  • links to bugzilla tags, Pivotal, scrumpads, forums, etc
  • Waverley agenda -- automation, firefox, needs, unconference style
  • Work week talking points
  • Work Week wrap-up, Newsletter
  • Henrik 1:1s
    • is it necessary? weekly standup?
    • communicate team issues during team meeting? post-hobnob?
  • Talk to Matt

Liaisons

  • Henrik needs a 2nd for Mozmill dashboard
  • I need a 2nd on Mozmill Test reviews
  • "A go-between QA Product team members and QA Automation Services members" -- There will be situations where this will be a bottleneck (ie. blocker bugs, timezone)
  • assist in implementation of QA Automation Services projects -- creates a bottleneck for both myself (ie. not enough time) and AST (ie. relying on me for projects/pace I'm not used to)

Goals

  • "improve the efficiency and end-to-end timing of our update testing automation"
  • "Develop a set of base-line automated endurance tests that measure Firefox resource (memory, cpu load) utilization over repeated real world use-case scenarios to measure the effect of changes and new features across releases of the Firefox browser."
  • "Create a test framework that supports the development and execution of Web Apps unit and functional tests and collaborate with the Web Apps team and Mozilla test community to write and develop automated test cases for Web Apps infrastructure and applications."

Other

  • update automation needs to detect if the updated version of Firefox is not the expected one (ie. 6.0b3 vs 6.0b4) -- a check is sketched below
  • transfer Addons Manager to someone at Softvision (Vlad?)
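
One way to implement that check from outside the browser is to read the version Firefox reports in its application.ini and compare it against the version the testrun was asked to update to. The real update tests would more likely read the version through Mozmill, so treat the paths and versions below as illustrative:

    import ConfigParser  # "configparser" on Python 3
    import os

    def installed_version(install_dir):
        """Return the Version field from <install_dir>/application.ini."""
        parser = ConfigParser.ConfigParser()
        parser.read(os.path.join(install_dir, "application.ini"))
        return parser.get("App", "Version")

    def assert_expected_update(install_dir, expected_version):
        """Fail loudly if the build we ended up on is not the one we expected."""
        found = installed_version(install_dir)
        if found != expected_version:
            raise AssertionError("updated build reports %s but %s was expected"
                                 % (found, expected_version))

    # Example (paths and versions are illustrative only):
    # assert_expected_update("/Applications/Firefox.app/Contents/MacOS", "6.0b4")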

Softvision

Day 0

Softvision teams will be broken down into Desktop, Mobile, and WebQA. The Desktop team will be led from Mozilla by Juan and Anthony, and from Softvision by Vlad; this team is inclusive of manual and automated testing.

The daily standups in the Sync Pads have been working quite well; we should continue to do so. However, test automation will be rolled into the moz-firefox sync-pad as Automation Services will be taking over moz-auto.

We need to do a better job of real-time communication. The popular suggestion is to create some overlapping hours, at least a couple of days a week; the proposal is for us to start at 8am (Softvision's 6pm). Most people on the Softvision Desktop team have been working until 7pm anyway. Personally, I will be adjusting my office hours from 9-5 to 7-3.

People feel they are reporting, and re-reporting status far too often. Softvision would love to see meetings focus more on solving problems than reporting status. Personally, I'd love to see meetings take on the un-con sticky-note style. Status can always be conveyed via meeting notes.

People at Softvision want to take on more responsibility. They would like to see people taking on more feature ownership and doing less task-based work. They also see ownership of a feature as something that lasts for the life of the feature, not something that gets handed off once it is released.

People want to see Pivotal Tracker being used more often; not as a clone of Bugzilla but as a place to track projects. For example, we could be using the icebox to create a repository of backlogged tasks which team members with free cycles could jump in on.

Softvision has numerous complaints about lack of visibility of feature status and planning from the developers. Issues are hidden in bugs and not communicated to the wiki pages. Use cases are often inaccurate or undefined. Juan and Anthony need to do a better job of calling out developers and management; Vlad and Alex need to do a better job bringing those issues to our attention.

Not having work in the queue is a blocker. Softvision needs to speak up when they are running out of work. Mozilla needs to be more proactive in making work available. Perhaps a project backlog would help mitigate this issue.

Ownership of a feature is defined as writing test plans, writing test cases, running test plans, triaging bugs, running tests, engaging the community, engaging the project team, and reporting to the QA release driver.

Day 1

Softvision feels that the relationship has been good so far. Tasks are well written, Mozmill is decreasing manual test load, feature ownership gives them a sense of responsibility, it's easy to communicate questions and issues to Mozilla, our standardized processes make things run smoothly, and verifying bugs is "enjoyable".

Softvision feels they are unclear on the overall direction of the project and how their tasks contribute to the end-goal. For example, they thought the goal of bug triage was to get to 0 bugs. We need to do a better job of communicating this information.

When testrun results are late it causes a delay in activity. Ultimately we need to find a way to transfer automation to Softvision so that they can do everything on their time.

Softvision would like to have a greater role in the release-driver process -- perhaps as a junior member.

We need a process for communicating issues. A clearer indication of what the methods of communication are and when it is appropriate to use them is desired.

The wiki pages are poorly maintained and lack ownership. The biggest issue is status, which tends to constantly be out of date. This causes a loss of time due to testing areas which may or may not be landed. Additionally, the wiki pages are not organized, lack discoverability, and lack a site map. Something needs to be done to improve the state of affairs.

Developers tend to be slow to respond, if at all. I've advised Softvision to get in touch with me or Juan when a developer is not responding.

Softvision has requested a repository of release drivers; who they are, what they do, when they are available, how do you contact them, etc. I will work with Juan to provide them with this information.

Softvision is unclear on the priority for bug triage of toolkit, core, firefox, and feature bugs. I need to confer with Juan about what the priority should be.

Softvision is frustrated by the lack of follow-up process for bugs they mark as NEW and bugs they file. They also would like some sort of follow-up process for enhancement bugs. I propose they start using the [softvision-need-followup] whiteboard tag so QA can easily triage and escalate these issues.

Softvision is frustrated by features with no ETA (or out of date ETA). Anthony and Juan will need to push harder on release drivers and project owners to communicate status. Softvision needs to push on QA more so we know when issues need to be escalated.

Softvision is unclear about test automation priority (BFT, Feature, Endurance). I need to confer with other stakeholders for advice on what is the ideal compromise.

The generally accepted test development strategy is:

    Manual testing (BFTs) occurs as a sign-off for Nightly merge.
    Manual feature testing begins in the Nightly and throughout the release cycle.
    Automation planning begins in the Nightly.
    Automation requirements are developed in Aurora.
    Automation tests are developed in Beta (when features are stable).
    Any development downtime should be used to automate the BFTs

Softvision wants swag to hand out at events.

Daily standups will occur as an internal meeting at Softvision's convenience. Issues will be raised to QA via the sync-pads. The Waverley status call will be an opportunity to communicate status and issues externally.

Day 2

The Project Manager role at Softvision will be one of facilitating resources and prioritizing work internal to Softvision. She will not be a blocker to ongoing work. She will conduct daily standups timeboxed to 15 minutes with the teams. Status will be communicated via the Mozilla team sync-pads and the weekly status meeting. Status reporting will include progress, issues, blockers, priorities, and risks. She is undecided on Pivotal Tracker as the ideal tool for her needs but will evaluate. She has taken on the task of better organizing the docs on the wiki which are pertinent to Softvision.

Softvision has a genuine desire to build out the Romanian community and is highly motivated to do so. They would like Mozilla.RO to syndicate to both Planet and QMO. They've also requested WebDev provide support for Mozilla.RO by way of code and security reviews (perhaps webQA can help).

Softvision has requested Mozilla presence at hosted launch parties. On the flip side, we should organize a community event every time Mozilla visits the Softvision office.

Softvision has requested that remote communities in attendance on Air.Mozilla get a shout out.

Softvision have requested swag to hand out at events.

Softvision needs to have business support for events (days off, costs).

Softvision plans to reach out to and collaborate with satellite communities (ie. Bulgaria).

Mozilla should do everything they can to support and provide representation at as many of these events as possible.

The Romanian community is largely involved with localization; they need only a nudge and support to start getting involved in QA.

Softvision would love to see a Mozilla.RO community t-shirt -- we should do what we can to support it (probably a good community project)

Softvision would love to see our how-to docs translated on Mozilla.RO (probably a good community project)

Contributors in Romania don't use IRC because it is blocked from corporate networks (most contribute from the office) -- what can we do to make ourselves more accessible?

Softvision would like to see students working on Mozilla bugs instead of made-up projects in school.

Many schools in Romania use Yahoo Groups -- this is an untapped resource for MozillaQA

We need to do more to evangelize the CV aspect of contributions.

Bugzilla feature voting is only accessible to those with accounts. Can we do something to allow those in the community, outside of bugzilla, to vote on features and enhancements?

Softvision does not have a clear understanding of new technologies in terms of why we are pursuing them and how they fit into the goals. We need to do a better job of communicating the big picture.

Softvision teams want to attend QA work weeks. What can we do to provide support?

Softvision wants to align their personal goals with our team goals.

Automation Services will be the ultimate authority on development of APIs and Frameworks. Softvision should work with team leads to determine test requirements. The project leads will work directly with Automation Services to get those developed. If Softvision can provide a quick patch to a bug for their needs, they should feel free to do so.

Softvision new-hires should be owning a feature from day 1. They have on-boarding documentation and should be able to take on more features within 3 weeks. Softvision will handle the physical on-boarding with Mozilla only providing support and projects.

The automation priority is Test Failures > API Failures > New Tests.

There was some confusion about support for XP, 2000, and 64-bit platforms. XP gets full coverage, 2000 gets spotchecked until we decide to drop support, 64-bit gets spotchecked until we decide to start support.

Our current infrastructure for running automation was communicated. It's unrealistic for us to expect Softvision to provide the same resources. This will require considerable business support. Softvision have 2 computers with dual booting platforms (no VMs). What can we do to provide support for transitioning test execution onto their plate?

Code reviews will be conducted by Vlad and Alex for style guide and functionality. It is assumed that Vlad and Alex will test patches on all platforms before submitting them for review by me. I will review patches for completeness and correctness before checking them in.

We need a 2nd on test automation; someone to handle check-ins in my absence. Do we have a MozillaQA resource outside of Automation Services? Can we entrust Vlad or Alex with this responsibility? I'd like to avoid burdening Automation Services.

We need to figure out and make a call on priority for development of automated tests (BFTs, Features, or Endurance). Each has pros and cons. We need to give a voice to all stakeholders on this decision.

The process of test development will start with release drivers determining features we want developed. Softvision will then identify requirements of those features. Anthony will work with Automation Services to get those implemented. Softvision will develop the tests once the API is ready. Anthony will then check in the tests.

If a review goes unchecked for a couple of days, Softvision will ping the reviewer.

Add-ons tests should test installing the add-on themselves rather than installing it from the command line.

Anthony needs to identify a priority backlog for the BFT tests.