QA/Browser Technologies/JulyWorkWeek2011
Overview
| Date: | July 18-22, 2011 |
| Attendees: | TBD |
| What: | Browser Technologies QA Workweek |
| Main Scrumpad: | http://mozqa.sync.in/bt-work-week-july-2011 |
Meeting Space
Offsite: 171 Coronado Ave, Half Moon Bay (contact: Leslie, (650) 703-8993)
Onsite: Zombocom
Agenda
The workweek will consist of project planning, discussions, lightning talks, and work sessions. Use the offsite for peer discussions, but also use the onsite to catch up with devs, PMs, and other colleagues.
| Schedule | Meeting | Topic |
| Monday | Planning | topics |
| Tuesday | Offsite | topics |
| Wednesday, morning | Offsite | topics |
| Thursday | Sessions | topics |
| Friday | None Scheduled | topics |
Monday
- 2pm: Breakdown for the week. Idea scrumpad: http://mozqa.sync.in/bt-work-week-july-2011
- 5pm: Head to the offsite. See the scrumpad for details
- 8:30pm: Dinner at Sam's Chowder House, reservations under Tony
Tuesday
- 8am: Mobile Waverley Call (Notes)
- 9am: Large Group Introduction & Discussions (Vision Casting)
- 10am: Mobile Discussion / Services Work Session (Community Discussion)
- 11am: Mobile Discussion / Services Work Session (Mobile Automation)
- 12pm - 12:32pm: Mobile Discussion (Waverley Mobile)
- 12:32pm - 1:30pm: Lunch
- 1:30pm - 2:30pm: Team Building Exercises
- 2:30pm - 4pm: Mobile Work Session (Mobile Coffee Talk) / Services Discussion
- 4pm: Services Discussion / Mobile Work Session (Lightning Talk Notes)
- 5pm: Half Moon Bay fun, Group Picture
- 7pm: Dinner at Miramar Beach Restaurant, reservations under Tony
Wednesday
- 9am: Continued Discussions
- 10am: Continued Discussions
- 11am: Pack up, clean, head out
- 12pm: Lunch someplace, head back to office
Thursday
- 11am: Sync Bug Triage
- 12pm: Farewell Lunch for Aakash. Pho Garden on Castro
- 2pm: BT Project demo (video recorded on Air Mozilla, 10 Fwd)
- 3pm: Sync Server Unit, Load, Automated Testing: Dev/Ops/QA Discussion
- 3pm: Mobile release test planning (Aaron, Kevin)
- TBD: TPS vs. Funkload for Sync Server API automation/smokes: James, Owen, Jonathon, other interested parties
Friday
- TBD: TPS vs. Funkload for Sync Server API automation/smokes: James, Owen, Jonathon, other interested parties
- TBD: other stuff
- Tracy returns to KC (7:10 am flight)
- Aaron returns to TO (noon flight)
Action Items
Collaborative Notes from the Workweek (7/25)
1. Vision & Goals
Summary:
Responsible for emerging technologies and environments like Sync, Mobile, experimental Labs projects (Identity, Share, WebApps), and server environments
More resources to help with building out automation and supporting new projects, plus more time on tasks like project investigation, exploratory testing, interacting with other teams, and defining processes
Less focus on routine tasks like regression testing; beef up automation while leveraging outsourced tools
Takeaways:
(Tony) Defining process: what is the future of Sync and Mobile with respect to Browser Technologies?
(Tony) Come up with a BT service agreement for emerging projects that can largely be copied and handed off to teams
(Tony) Define our goals for other teams (marketing, support, devs, l10n)
Sandboxed for Future:
What are our responsibilities as a team, current vs. future?
What's our hiring and resourcing plan?
2. Community
Summary:
Community engagement is a challenge for Mobile and Services. Both have a very small user base now.
Provide a clear list of tasks and a regular schedule:
QMO cleanup (Owen and rbillings are asking around)
Testday posts (send these earlier! They also need a direct list of tasks for those who want it, plus exploratory testing for others)
There should be assistance from devs; ideally they would attend events
Have a playground environment for experimenting with client/server testing and automation
For Mobile this can be Device Anywhere; create scripts for easier and faster setup
For Services it would be a QA infrastructure
Takeaways:
(Aaron, Tracy) Use Mozilla Reps as an additional means of communication
Other focus channels (e.g. Reddit, Selenium, Android)
Provide a test environment that's easily accessible from the outside
(Kevin, Aaron) Device Anywhere for Mobile
(Tracy, James) A Sync server with outside access for Services
(All) Clearer testday posts, sent earlier, with a list of tangible tasks to execute
(All) Include devs in testday channels, and ask them to be strategic about what to do
(James, Owen, Tracy) Set up VM environments and documentation for Services. Video for Sync help
Sandboxed for Future:
Have a forum, surveys, a summary of bugs, feedback, and a list of tasks outside of IRC; not limited to one day
Introduce folks to emerging technologies and projects, so early adopters can play with them (e.g. Labs)
Have physical meetups. Need to expand on the ideas listed and drive a purpose
How to handle feedback from outside, and incorporate it into testing in the most effective way
3. Waverley
Summary:
Waverley's feedback revolved around:
How can they help more?
Litmus tests - creating, updating, maintaining?
More regular interaction with us?
What documents are the source of truth?
Mozilla to provide more guidance, attention, and answers when needed
Have special office hours, more regular interaction outside of Tuesday mornings, and keep them informed on projects and bugs
Encourage Waverley to interact with developers directly in Bugzilla, IRC, and other avenues more often. They're doing a great job now, but more of it is good.
Include Waverley on our Monday 9am triage calls?
Need to provide better feedback on litmus test case writing. Also, writing good feature testplans
Takeaways:
(Tony) Flag bugs with in-litmus and ask them/us to create tests
(Aaron, Kevin) Provide our knowledge of bug and feature status in our test plans
(Aaron, Kevin) Follow up on feature sign-off/status. Look through their wiki page and also get in touch directly. They have a PM now, so we can work through that channel.
(Aaron, Kevin) Provide better feedback on Litmus test case writing, and on writing good feature test plans
Create a template for a test plan
(Tony, Tracy) Introduce Services projects (Sync triaging, client-side testing for now) to Waverley.
(All) regression-wanted; build config
Sandboxed for Future:
Future projects and expectations for Waverley
4. Services Goals
Summary:
Client-side automation: explore Mozmill and TPS
Server-side automation: explore FunkLoad and TPS
Continuous integration is needed
Full QA against the load cluster (combined staging)
Pull/package a Firefox build with it. Build out a VM that can be used by anyone
Takeaways:
(Tracy, Owen) What sets of tests do we automate?
(Tracy) Develop Mozmill automation for client-side Sync smoketests
(Owen) FunkLoad tests with load; add reporting (see the sketch after this list)
(James) Get production reports so we can analyze baselines and customized load tests
(James) Get two servers up and running so we can snapshot them as a reference system
(Tracy, Owen, James) Have proper documentation for all of the above
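As a rough illustration of the FunkLoad approach discussed above, here is a minimal smoke test sketch in Python. It only assumes FunkLoad's standard test-case API; the /__heartbeat__ endpoint, the class name, and the config keys are hypothetical placeholders, not the actual Sync server API or our real test suite.

 # Minimal FunkLoad smoke test sketch for a Sync server.
 # The endpoint path, class name, and config keys are hypothetical placeholders.
 import unittest
 from funkload.FunkLoadTestCase import FunkLoadTestCase

 class SyncServerSmoke(FunkLoadTestCase):
     """Read-only availability check; FunkLoad records timings for reports."""

     def setUp(self):
         # FunkLoad reads settings from a matching SyncServerSmoke.conf file
         self.server_url = self.conf_get('main', 'url')

     def test_heartbeat(self):
         # Hypothetical health-check endpoint; a 200 response counts as a pass
         self.get(self.server_url + '/__heartbeat__', description='heartbeat')

 if __name__ == '__main__':
     unittest.main()

A test like this could be driven with fl-run-test for a single smoke run or fl-run-bench for a load run, with fl-build-report turning the bench results into the reporting mentioned in the takeaway above.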
Sandboxed for Future:
Sync Server distribution and support + Reference Setup
Create VMs with the reference setup openly
Build up all components for distribution (nginx, gunicorn, Sync server, etc.)
Make it available for the community to contribute
5. Mobile Automation
Summary:
Android is a top-tier platform focus, so we should be adding automation to the mix
No client-side mobile framework is on our plate right now
QA can work with the A-Team and RelEng on fixing broken mochitests
Crowdsource addon: get it out there
We know the dev team lacks automation right now, and they are aware of it
Discussed pros and cons of
Takeaways:
(Martijn) Continue to work with Joel on getting the number of failed mochitests on Tinderbox down (Q3 goal)
(Martijn, Aaron) Continue to get the crowdsource addon into the hands of testers quickly (Q3 goal; see wiki page)
(tchung, Aaron) Work with the QA Automation team to create a wish list of what a client tool should look like
In addition, talk to Clint's team to see what they have in mind
Sandboxed for Future:
Gain a better understanding of where dev automation is today, and their roadmap
What other tools can be used? (e.g. something more test-harness-like; what about performance testing?)
Investigate the use of third-party automation tools in Device Anywhere, and going down the private cloud route
Continue an in-house client automation solution
L10n coverage has nothing. What about memory usage, telemetry, performance?
Who can write more browser-chrome tests?
Ask developers for their input as well: mochitests are being written by them
6. Sync Discussion
Summary:
Takeaways:
Sandboxed for Future:
7. Mobile Coffee Talk
Summary:
Takeaways:
Sandboxed for Future:
8. Beta Environment
Summary:
Takeaways:
Sandboxed for Future: