Firefox:2.0 QA Activities:L10n Test Plan
- 1 General strategy
- 2 Trademark testing
- 3 3rd Party Locale review
- 4 Special regional distributions
- 5 Milestone testing
- 6 In-Product Page Testing
- 7 Schedule
- 8 Results
- 9 Reference Info
- Verify trademark issues with info from product management
- Thorough translation, context, and layout testing with a 3rd party service for top locales, supplemented with simple in-house spot checks as needed for last-minute checks, etc.
- Test special regional distributions.
- Overall Status of Locales
This testing will be based on the L10n Requirements Doc. This contains the general and default requirements. Additional specific search engine requirements are in bugs and separate documents; bug 347931 is an example. We plan to attach rolled-up requirements to these locale-specific trademark tracking bugs.
It is critical that the specific requirements be clear and accurate. They will be handed to testers that have not been privy to the various discussions about the trademark review. It is also critical that any mismatches get resolved quickly and that decisions about judgment calls also be made quickly.
Search Engines (engines and order)
We will make a first pass using the L10n Search Verifier tool. This tool checks the search engine list and the search engine order. One pass has already been made and this looks good. Rob is working with Axel to extend this tool to other areas of trademark testing.
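The kind of check the Search Verifier tool performs can be sketched roughly as follows. This is an illustrative sketch only, not the real tool: the engine names are made up, and the actual tool reads the requirements and the build's search plugins directly.

```python
def check_engine_order(expected, actual):
    """Compare the required search engine list/order against what a build ships.

    Returns a list of human-readable problems; an empty list means the
    locale passes. (Sketch only -- not the real L10n Search Verifier.)
    """
    problems = []
    # Engines required by the locale's spec but absent from the build.
    for engine in expected:
        if engine not in actual:
            problems.append("missing engine: %s" % engine)
    # Engines present in the build but not in the spec.
    for engine in actual:
        if engine not in expected:
            problems.append("unexpected engine: %s" % engine)
    # Order matters: compare the relative order of engines present in both.
    common_expected = [e for e in expected if e in actual]
    common_actual = [e for e in actual if e in expected]
    if common_expected != common_actual:
        problems.append("engine order differs: expected %r, got %r"
                        % (common_expected, common_actual))
    return problems
```

For example, a build that ships the right engines in the wrong order would report an order mismatch but no missing or unexpected engines.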
In addition to using the tool, we will be doing a manual verification. This ensures the search engine logos are right, the redirection is working, and the search results are "professional". There will have to be at least one end-to-end test to qualify this. This is true for all of these trademark areas. Later builds, with or without changes, can be checked with Rob's tool or the MetaDiff Tool, and only the specific changes (if any) need to be tested.
The Professional Test: this ensures that search engines provide results that are not obviously spamming or providing offensive information. This is a simple user test of each engine where we provide a few search terms and see if the results look professional.
Third Party Reviews: Some of this will be accomplished by third party reviewers. We have some community members (individuals and partner companies) that have agreed to help with this. These people have the language skills to work effectively with the specific locale. Locales that are not covered by outside community members with language skills will be covered by in-house testers. These will not benefit from having language skills but will still be able to provide a very coarse 3rd party review of the trademarking issues.
Suggest Feature of search engines
There may be some way to automate this, but for now we plan to test without a tool. We will confirm that the suggest feature is turned on or off. For those locales with it on, we will supply a few search terms and check that the results are professional. See details in the Search Engine section about what the professional test is and how we are going to use Third Party reviews.
Key URLs - first run page, default home page
This is a quick test that the redirection is working and that the resultant page is professional. The tester checks each of the key URLs live.
The Start Page is automatically generated, so localizer changes are not an issue there. We will not test that.
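The redirection part of this check could in principle be automated. The sketch below models the logic of following a redirect chain to its final page; the `responses` mapping stands in for live HTTP requests (in real testing the key URLs would be fetched live, e.g. with `urllib.request`), and the URLs used are made up for illustration.

```python
def resolve_redirects(url, responses, max_hops=10):
    """Follow a redirect chain and return (final_url, final_status).

    `responses` maps each URL to an (http_status, location) pair, standing
    in for live HTTP fetches. Raises ValueError if the chain loops or is
    too long -- either would indicate a broken key URL.
    """
    seen = set()
    for _ in range(max_hops):
        if url in seen:
            raise ValueError("redirect loop at %s" % url)
        seen.add(url)
        status, location = responses[url]
        if status in (301, 302, 303, 307) and location:
            url = location  # follow the redirect
        else:
            return url, status
    raise ValueError("too many redirects from %s" % url)
```

A key URL passes this part of the test if the chain ends at a 200 page; whether that page is "professional" remains a manual judgment.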
The bookmarks in the bookmarks menu will be briefly checked live to see that they lead to professional sites. This is an area that would benefit from a tool to verify the locale has the proper URL that will be redirected correctly. There will also be the third party review to check these live, ensuring the bookmark title is reflective of the content, redirection is working, and the resultant site is in good shape.
Verification is to be performed by the l10n Bookmarks Verifier.
This test will ensure the pre-installed feed readers follow the L10n guidelines, are working properly and result in professional results. We hope that the tool will help verify this along with a manual end-to-end test.
This test will ensure the pre-installed feeds follow the L10n guidelines, are working properly and result in professional results. We hope that the tool will help verify this.
Approach to the Tiers
Tier 1 and 2 tend to have more specific customizations. These will be checked more thoroughly than the Tier 3 locales. Rob's tool will work equally well on all locales, so that will be used everywhere. Manual testing will be opportunistic based on who is available for which locales and when. For example, we are planning two rounds of testing with one of the partners. Some of the trademark testing will be covered in the first round and some in the second round. I will lay out a schedule elsewhere in this L10n Test Plan. We have good language-skilled coverage for all the Tier 1 locales and most of the Tier 2 locales. I am looking at options for the Tier 2 locales for which we don't have good language-skilled coverage.
Tier 3 will get very brief spot checks with language-skilled people, or possibly with non-skilled in-house engineers if that is the only option. All the trademark areas will be covered with the Professional Test described above. A few search terms or RSS feeds, for example, may be tested.
3rd Party Locale review
We are using 3rd party community reviews to provide some independent, objective testing of locales. This is similar to the way we test features in QA. For locales, there is the extra challenge of finding people with strong language skills. These reviews will be done by community members, including individuals and partners.
This testing will focus on Tier 1 and 2 locales. We are currently missing Swedish, Hungarian, Czech, and Finnish. We should be able to arrange at least smoke tests for these. This will include reviewing .dtd and .properties files for proper string translations, and reviewing ~200 dialogs for translation, context, linguistic, and layout issues.
We haven't been explicit about the In-Product Web Pages. We should revisit this with the reviewers.
The review will also include a quality assessment. Each reviewer will provide some subjective and quantifiable feedback about their locale. The questions asked are:
- Do you use a localized version of Firefox as your primary browser? If not, why not?
- How do you rate the overall user experience with the localized version of Firefox 2.0? (1-5 1=poor, 5=excellent)
- How do you rate the translated first-run page and the default start page? (1-5)
- How do you rate the translated menus? (1-5)
- How do you rate the help info? (1-5) - This is not available in many locales for Beta 1.
- How do you rate the selection and translation of default search engines and bookmarks for your locale? (1-5)
- How do you rate the translation of tool tips? (1-5)
Special regional distributions
This is an end-to-end test to ensure search codes are working correctly. It includes working through Basil with Google and Yahoo! to ensure tracking data gets all the way into their tracking databases.
We will ensure:
- Tracking codes go out properly
- Start pages are correct
- The homepage has the tracking code, including after start page migration
- No unexpected codes are present
- That updates to the standard version do not affect special regional tracking codes
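The first two checks above lend themselves to a simple script over the URLs a distribution actually emits. This sketch verifies that the required tracking parameters are present and that nothing unexpected rides along; the parameter names and URLs are purely illustrative, as the real codes live in the per-distribution test plans.

```python
from urllib.parse import urlparse, parse_qs

def check_tracking_codes(url, required, allowed):
    """Check a distribution's search/start-page URL for tracking codes.

    `required` maps parameter name -> expected value; `allowed` is the set
    of parameter names permitted to appear at all. Returns a list of
    problems (empty means the URL passes). Sketch only -- real codes come
    from the per-distribution test plans.
    """
    params = parse_qs(urlparse(url).query)
    problems = []
    for name, value in required.items():
        if params.get(name) != [value]:
            problems.append("missing or wrong %s (got %r)" % (name, params.get(name)))
    for name in params:
        if name not in allowed:
            problems.append("unexpected parameter: %s" % name)
    return problems
```

Run against a clean URL this returns an empty list; a wrong client code or a stray extra parameter each produce a distinct problem entry.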
There will be a specific test plan for each distribution. These are available separately.
Every milestone will require L10n verifications. We expect Build to run the MetaDiff automated tool; this is agreed to by Build. We now also have a L10n search engine verification tool from Rob we will run. Then we will do some end-to-end manual tests.
All of this will be done the first time a locale is complete. After that we will run the tools and focus on any changes that may have affected a locale. Every release we will spot check the Tier 1 locales, primarily on the Win32 platform, to ensure any underlying changes did not affect the functionality of the locales.
In-Product Page Testing
This is testing of the seven key web pages which are linked from the default tabs and Bookmarks menu. Here is the list:
Replace the locale code in the two places in each URL with the proper one for the locale you are testing.
These pages can be viewed by the people outside MoCo using this guest account: username: email@example.com Password: guest
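Building the per-locale URLs from the en-US templates can be scripted. The sketch below assumes a `%LOCALE%` placeholder appearing in two places, matching the note above; the placeholder syntax and the template URL are illustrative assumptions, not the real URL structure.

```python
def localize_url(template, locale):
    """Substitute the locale code into an in-product page URL template.

    The %LOCALE% placeholder is assumed for illustration; per the plan,
    the locale code appears in two places in each URL, so we require
    exactly two placeholders before substituting.
    """
    if template.count("%LOCALE%") != 2:
        raise ValueError("expected the locale code in two places: %s" % template)
    return template.replace("%LOCALE%", locale)
```

This gives testers a mechanical step for each locale under test rather than hand-editing URLs, which is where copy-paste mistakes (e.g. one of the two codes left as en-US) tend to creep in.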
- Verify Accurate string and contextual translation
- Detailed verification of fonts
- General verification of proper font or layout rendering
- Ensure no 404 errors for links on each page
- Ensure correct locales called in localized build of product
- i.e., fr doesn't return de in-product pages
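The 404 check in the list above could start from an automated pass that extracts every link on a page, along these lines. This is a sketch of only the first step: fetching each collected link and checking its status against the live server is omitted (that part would use `urllib.request`).

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collect href targets from a page so each can later be fetched
    and checked for a 404."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return every <a href> target in the given HTML, in document order."""
    collector = LinkCollector()
    collector.feed(html)
    return collector.links
```

The same list of extracted links could also feed the locale check (requirement about fr not returning de pages), by inspecting whether each in-product link carries the expected locale code.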
- Contract with our 3rd party L10n reviewers to review 7 of the tier 1 and tier 2 locales (Oct 12). Specifically:
- Tier 1:
- Iberian Spanish (es-ES)
- Japanese (ja)
- Tier 2:
- Latin American Spanish (es-AR)
- Italian (it)
- Korean (ko)
- Tier 1:
- Receive en-US versions of the product pages by Friday (Oct 13th)
- Provide to the 3rd party reviewers to ensure they can complete the testing on schedule
- This wasn't received until Monday, Oct 16th
- Receive the planned in-product pages (Oct 16th)
- The 3rd party reviewers will test from the evening of Oct 16th through Oct 19
- This includes testing items #1 - #5 of the requirements list above
- The MoCo QA and community will test all other locales between Oct 17th-20th
- This excludes the locales that are not likely to be ready by Oct 16th (Belarusian (be), Bulgarian (bg), Macedonian (mk))
- This testing primarily covers test requirements #3 - #5 unless we can find language skilled members of the community.
- In-Product Page Testing Results from SmartWare - for es-AR, es-ES, ja, it, ko, zh-CN, and zh-TW
- Results from tomcat - de
- Results from timr - fr, pl
- Results from tracy - el, fy-NL, ga-IE, gu-IN
- Results from juan - en-US, en-GB, fi, pa-IN, ca
- Results for MoCo testing
- this is a Google public spreadsheet
- just setting it up; results to be posted shortly
- Week of 8/21/06
- Pilot the trademark testing of a few locales (Tim)
- Week of 8/28/06
- 8/28/06 - Beta2 L10n Builds complete
- 8/28/06 - 1st round 3rd party testing with da, ko, nl, pt-BR, ru, tr. This includes both linguistic review and trademark review.
- In house smoke tests of other locales
- Late in the week Beta 2 is released
- Week of 9/4/06
- 2nd round 3rd party testing for da, ko, nl, pt-BR, ru, tr. This should be the final translation testing only.
- Week of 9/11/06
- 2nd round 3rd party testing for de, es-AR, es-ES, fr, it, ja/ja-JP-mac, pl, zh-CN, zh-TW. This will be both the final linguistic review and trademark review
- Week of 9/18/06
- RC1 Freeze
- Smoke test testing of remaining Tier 2 locales: cs, fi, hu, sv-SE. This is both a linguistic review and trademark review
- Week of 9/25/06
- Smoketesting of Tier 3 locales
- Week of 10/2/06
- RC 2 Lockdown (dates are rough-can't find google calendar)
- Week of 10/9/06
- Week of 10/16/06
- Week of 10/23/06
- Week of 10/30/06
- FF2 Final Release
- L10n Testing Results
- This includes results from automated tools and visual verification.
- In-Product Page Testing Results
- Trademark and BD requirements
- L10n Requirements Document
- Search Engine requirements - This is not final as of 8/28/06. There are about 30 custom plug-ins that are not included in this doc. See Chofmann's email. But this version will be used as is for Beta2 testing.
- L10n RSS Requirements - this is not finalized as of 8/27/06
- Mic's L10n Productization page
- Search engine requirements - see intranet doc
- Localization teams
- Locale status tracking page
- In-Product Web Pages - These need to be translated and reviewed
- Axel's L10n Locale Release Readiness page
- Axel's Search Engine per locale status page
- bug 348568 - Set up the infrastructure for the new URL structure for l10n
- bug 347931 - Sample trademark tracking bug. Note the format of the Summary line, with the locale prefixed to the front.