Compatibility/Meetings/Sync w Honza

From MozillaWiki


May 16 2023

WebRTC issues (SV)

Since WebRTC is getting more visibility lately, is there a way to see in Firefox (maybe in about:webrtc) whether a website is using it? Would it be useful to add a Trend label for WebRTC-specific issues, for future reference, if we can differentiate issues specific to WebRTC?

We have documented around it, and it seems to be used by web apps for voice/video chatting that use your webcam, headphones and microphone, or for P2P file transfer. We have an idea of how related issues should look, but maybe there's a better way to find out if WebRTC is used.
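One rough heuristic for spotting WebRTC usage (a sketch on our side, not an official detection method; about:webrtc remains the authoritative in-browser view) is to scan a page's JavaScript for the well-known WebRTC entry points:

```python
import re

# Hypothetical heuristic: flag script text that references common WebRTC
# entry points. A match only suggests WebRTC may be used; it is not proof.
WEBRTC_APIS = (
    "RTCPeerConnection",
    "RTCDataChannel",
    "getUserMedia",
    "getDisplayMedia",
)

def uses_webrtc(script_source: str) -> bool:
    """Return True if the script text mentions any WebRTC API name."""
    pattern = re.compile("|".join(re.escape(api) for api in WEBRTC_APIS))
    return bool(pattern.search(script_source))

print(uses_webrtc("pc = new RTCPeerConnection(cfg);"))  # True
print(uses_webrtc("console.log('no realtime here');"))  # False
```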

Paul: The mobile and desktop team are doing dedicated tests for webrtc.

Paul: We have an idea of what WebRTC is and what to look for, but we were wondering if there's a way to check exactly whether an issue is WebRTC-related or not?

Honza: I don't have immediate input regarding WebRTC, but let's put a topic in the meeting we have later with the team.

Top 100 testing (SV)

Paul: QA Desktop wants to reduce their workload regarding this topic. They will go down from their current number of sites tested.

Paul: We will allocate 1h per day for testing the top 100 sites because the main task is to triage bugs and that's more important.

Honza: My understanding is that you would spend roughly 1h per day testing this OKR, meaning that in one month you will cover a batch.

Paul: Testing is organized by us, since we took over this OKR from other teams, to ensure that the available time for testing is used to its maximum.

Honza: Something to take into consideration is whether the triage process time will increase.

Paul: As previously discussed we will test mainly on Android, Desktop-Windows and if the time allows it we will test on Mac as well.

Honza: Every month we should be able to do one subset from the document.

Paul: Initially it was planned to test this twice a year.

Honza: So that means roughly one site per working day.

Paul: We are aiming to verify one website in one hour per day.

Honza: Is that doable?

Raul/Calin: It looks doable. We will concentrate our efforts first on mobile and desktop Windows. If time permits, we will extend the desktop environments for testing.

Honza: I have seen that m.youtube.com is not included, just youtube.com in the list.

Paul: We have a list for both desktop and mobile, and if the site has the mobile version, when testing in mobile, we will cover the mobile version as well.

Honza: What are the next steps?

Paul: We will test the list accordingly, to avoid overlapping. It would be better to have a pause between lists, to ensure we avoid overlap.


May 2 2023

Interventions testing possible restrictions (SV)

After the last run of Intervention tests, both manual and automated, we have made notes in [this document](https://docs.google.com/spreadsheets/d/1b1K-Zk35uLCk_6n2I-aON-4yc0bkZWQQPme1Em59QOQ/edit#gid=2035890881) regarding which websites might not be testable using automation, due to restrictions (2FA, geolocation restrictions, environment).

These possible restrictions were observed mainly when running manual tests.

Honza: That looks good. Can you run the list by Tom as well, and keep him in the loop?

Raul: Will do.

Honza: How many are automated and how many are manual tests?

Raul: All the entries in the list are run manually. When running the automated tests, about 50+ are run.

Honza: Is there a reason for that?

Raul: Tom is working on setting up the mobile automated tests, so for now we are only running them on a desktop environment.

Honza: I see. Are the manual tests run per environment?

Raul: Some are run on all platforms, some just on mobile, others just on desktop.

State of Webcompat report (Honza)

Honza: There will be a summary there. James is mentioning QA, mainly from trends. We are thinking about a summary/conclusion about the mentioned subjects.

Do you think we should mention other issues?

Raul: Firefox not being supported, reports for a specific site like we have seen for Reddit, where we have a communication channel with them. We can also extract issues from the Trends OKRs by using the assigned labels.

Honza: That sounds great. Do you think we can compile a top list of issues based on Trends?

Raul: We can see which labels from Trends received the most issues and make a top 3.

Honza: Could you bring this up in our upcoming team meeting?

Raul: Will have it ready by then.

Updates- Top 100 (Honza)

Paul: We are still waiting for feedback from the team regarding what would be important in the webcompat area and what we should focus on. Maybe we can use the same list as the desktop team, or do another round of checks to see if the list entries have changed. We will take a look to see what the list looks like now vs. how it should look with the current top sites.

Honza: I like the categories as well.


April 21 2023

1. Top 100 testing(SV)

It seems that the Desktop QA team is also running tests for checking build compatibility with websites. I've discussed with them to see if they want to continue doing this, so we won't overlap. We'll have a meeting on Wednesday to discuss the coverage.


Paul: Initially I discussed this with the mobile team and we decided to move the task to the WebCompat team. We are still waiting for a response from the Desktop QA team on whether they would let us do the task instead.

Honza: So there's FF desktop QA doing similar things, like testing Firefox builds, and there's also the mobile team doing web compatibility testing.

Paul: The mobile team won't be doing that anymore since they are switching...

Honza: So mobile would give up on testing compatibility on those websites? From my understanding, different builds are available for different OS (mobile and desktop)

Paul: When this OKR was created, we did not consider that other teams were testing this, since this seems more like a webcompat team thing. Now we have to decide whether we will take over this testing, or leave this OKR to the other teams inside the organization.

Honza: Given the fact that 5 platforms would be tested, how do we see this from the workload point of view?

Paul: Other teams run the tests in one quarter for the Top 100 sites, 30 sites per round. Their estimations show that it takes them around 20 minutes per website, 11h for 32 websites. For about 100 websites that would be around the 33-hour mark, but maybe we should test more extensively, going deeper than they do, so that would mean double. The plan will look more doable after we talk with the desktop team.
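The workload figures above can be sanity-checked with a quick calculation (the 20-minutes-per-site figure is the other team's estimate, not a measurement):

```python
MINUTES_PER_SITE = 20  # estimate quoted from the other QA team

def hours_for(sites: int, minutes_per_site: int = MINUTES_PER_SITE) -> float:
    """Total testing hours for a batch of sites at a fixed per-site cost."""
    return sites * minutes_per_site / 60

print(round(hours_for(32), 1))       # 10.7 -> close to the ~11h quoted
print(round(hours_for(100), 1))      # 33.3 -> the ~33-hour mark
print(round(hours_for(100) * 2, 1))  # 66.7 -> "double" for deeper testing
```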

Honza: How much time would be left for other things, since this is not the only OKR? How much time should we spend on this? Unless they are testing different features.

Paul: I think it is general. I think we could test the top 100 websites twice a year.

Raul: I've seen from their previous tests that they test features such as saved logins but from our point that's not webcompat.

Honza: We will lay out a plan to see what is covered from our point of view.


Honza: Is mobile covered?

Paul: Both mobile and desktop.

Honza: How much time is needed for mobile?

Paul: About 20 minutes per website. We need to cover also mobile. We plan to run it twice a year. We will see what the stakeholder has to say.

UX Research (Honza)

Honza: We have the results that summarize all the responses given by people. James was looking at that document and provided a small summary. Since the feedback was related to social sites, I was curious if there was anything overlapping, like common ground, to identify the same set of priorities/conclusions. I did not see any overlaps there, but maybe there is something I am missing.

Paul: Which websites did the reports come from?

Honza: It was user research, no specific sites were targeted. The feedback given in the survey is related to webcompat. I was curious if there are any overlaps in this. But maybe we could coordinate a little bit more, maybe we can sync with these efforts and see if people are saying the same things as our findings from our testing.

Paul: We might not get to the same conclusions due to hardware availability, as some users have a ton of different configurations.

Honza: Maybe we can get the results once the survey is completed and adjust our testing accordingly, maybe concentrating and narrowing our efforts to the sites we pick and the testing we are doing.

Paul: That sounds like a plan, as we can concentrate on areas reported in the reports. We would know where to stress the application more.

Honza: I will try to keep you informed about the results.

Raul: As Paul said, we can either run tests for the top 10 or test a specific feature that fails on certain websites.

Honza: [This is the output](https://docs.google.com/document/d/1TQu-R95zkeF4rsEYRBMk8-Xf5WBhDs1J3qllKQ216Mc/edit) from James, and this is the original [document](https://docs.google.com/document/d/1xnq33IbwSjL7DV5pHhP83bVTnKND4LZMr2LG4uF-8qU/edit)


Interventions testing (Honza)

Honza: Are there any differences between the automated tests and the ones that are run manually?

Raul: I'm guessing the automated tests are the same as the manual testing. We have runs for manual and automated tests. Some test runs require 2FA authenticator and these will fail when running the automation suite. Also, geographical restrictions, environmental restrictions, and incomplete login credentials are also taken into consideration for automated runs, as these will fail if the correct setup is not available.

Paul: Could we mark which tests could be run manually and which ones could be run automatically?

Raul: We run the automated tests at the end of the manual run for interventions. Usually on the first automated run there is a high number of failed tests, which is lower on the second run.

Some tests need to be run manually because it requires authentications and/or VPN.

At the end of the runs, we have a clear view of why some automated tests fail.



Honza: Then as Paul says, we should make a list of which ones can be run manually and which ones can be automated.

Honza: Could you make a new column in the doc and classify which ones are which?

Raul: Sure, we could try to do that. Usually, Tom knows better which ones can be automated and which ones have to be tested manually.

Honza: Yes, Tom knows more about this, so please feel free to contact him and sync on this subject.


April 4 2023

Google offline feature (Honza)

Honza: This seems done. I've shared this with others, and now I am waiting for feedback from Joe and Maire. They are interested in a bigger analysis, to help us better understand what is broken from the webcompat point of view, like we've seen from the UI point of view, or whether we can reveal more from future testing.

They want to know how expensive it will be to fix these features. Maybe the next step falls under backend diagnosis. Could further testing help the diagnosis process, or even help find the root cause?

Raul: We can provide more testing, but we do not have the knowledge to pinpoint the root cause.

Paul: We'll probably need Engineering help to figure that out. Even for the initial testing Denis helped us a lot to figure out the problems.

Honza: Fair, then we shall consider this done from a QA perspective.


Testing Methodology (Honza)

Honza: I've created this folder so we can put other relevant documents here besides the ones we already have, so they will be easier to find.

Honza: I was looking at the document with the top 10 social media websites and I think we can improve the structure. I've created a guideline doc with what info would be useful to have in our reports.

Raul: Should this document help us with new OKRs when we are testing different websites?

Honza: Yes. Those kinds of documents could describe the methodology of how those features/websites were tested, so it would be easier for others to understand what's happening there, what the situation is.

Honza: So I'm looking at the results, and I can see there are a lot of links redirecting to different issues, but that's not too insightful. I think it would be more helpful if you make a summary for each site, covering which kinds of issues are more concerning for that specific website. We could make this summary better by focusing only on the P1s after the Engineering team has triaged the issues.

Raul: There's one issue here: we don't really assign a priority ourselves. That's normally done by the dev team. Not all the issues are truly webcompat, some are ETP issues. Should we categorize them?

Paul: No, just focus on the P1 issues after they are triaged by the team and the priority is added.

Honza: So this document is not exclusively for the webcompat team, it is higher-level than that. The goal is not helping with triage. This is for people outside the project to understand the problems occurring on those websites. I think what I was editing the most in the document is the context about those websites.

Paul: We'll provide a link with all the issues we are referring to and do a summary of them, maybe even categorize them if we see a pattern.


Q2 Proposal Review - Top 100 Sites (Beaucoup list) (SV)

As discussed, we are planning the following [OKR](https://github.com/mozilla/webcompat-team-okrs/issues/271)

Is there an up-to-date list that we can use? Also, we are thinking of running the tests in TestRail, for each domain, where we can group sites (News, E-learning, Shopping, etc.)

We have this test suite used in the past, which we can tailor to be up-to-date and relevant: [link](https://testrail.stage.mozaws.net/index.php?/suites/view/39006&group_by=cases:section_id&group_order=asc)

Note: The link requires logging in to an account first. If you open the link before logging in, you will need to log in and then open the link again to see the test cases.

Testing will be conducted on Desktop and mobile. Should we include iOS + Android on the mobile side, and Mac + Windows on Desktop?

[Paul] 1. I've discussed with the Mobile QA team and they only covered Android, so we can scratch iOS. 2. Regarding the Desktop side, I think it's the WebCompat team's role to decide which Desktop platforms to include in our testing, based on markers like user numbers, platforms with the most issues, etc. 3. The Mobile QA team was testing websites by region, but that is not mandatory either; they only did that because the Alexa top list they were using split websites by region. Also, I think it would only add complexity for us, so I don't think it's worth it.

Raul: In the past we've used Alexa for the top 100 sites; in the current Beaucoup list, last time we had only around 20.

Honza: We actually have 100 websites now from the Beaucoup list, but those 20 were curated.

Honza: We can use the current list from Beaucoup or we can make our own list.

Paul: The Alexa top sites list was based on how much the pages were accessed, but that is not available anymore. Maybe webcompat has another list that we might use?

Honza: We can also use the list made by Tranco and the spreadsheet from Beaucoup, and compile a list that reflects the Top 100 sites today. There is also the HTTP Archive.

Links: https://tranco-list.eu/

https://www.similarweb.com/

https://docs.google.com/spreadsheets/d/1HcafFKM_bv-2O6qad011mTgCNWX2lEjvBNxTYld1GYk/edit#gid=1838995098

Honza: Best would be to talk to Ksenia about this as well.

Paul: We will look over it and see if we can figure out an overlap; if we don't, we will ask for Ksenia's help. However, what would be helpful is to find out which Desktop platforms we should focus our testing on. Maybe we could pick the ones most used, or the ones most of the reported issues come from, or some other classification.

Honza: Next week maybe we should open a new topic with the team regarding this subject.

Honza: So we are making an analysis of which sites are not supported, and lately we are seeing an increase in them. This should also be one of our main goals: whether certain pages from the top 100 list are supported by Firefox.

Paul: We can also pinpoint regressions by doing this constantly. If possible, we are planning to do this twice a year.

Honza: Maybe we can also look at a trend in this case, to see how the webcompat part evolved from one run to another.

Paul: Yes, we can have a summary in the report, and we can see the differences between runs.

Honza: That sounds good, and sounds like a good reason to do this OKR.


March 21 2023

Google drive offline mode (SV)

Google Drive requires an `add-on`, specific to Chromium browsers, in order to be used offline, and since Docs/Sheets etc. rely on Google Drive we are not able to test those either.

Raul: Chrome does not need the `add-on`; it works without it. Edge needs it for the feature, which becomes available once it is installed. However, Firefox with the default UA does not have the "Offline" feature for Gdrive, following the above link. Changing the UA in Firefox to either Edge or Chrome (it is important to be signed in prior to changing the UA) shows the `feature` in the account settings. Once you try to enable it, a pop-up redirects to the add-on's installation page when you click the `Install` button, but there is no option available to install the add-on, as it is specific to Chromium-based browsers.

Honza: I see, so accessing Gdocs and Gsheets is not possible.

Honza: It would be nice to have a separate document capturing all the findings regarding this.

Raul: When should this document be ready?

Honza: If you can have it ready for the next meeting, that will be great.

Raul: Will do.


Social Media Top sites Exploratory Testing (SV)

Testing is on track and will be completed on time. We have substituted some pages that cannot be tested on Mobile and Desktop (Quora instead of Snapchat).

Link: https://docs.google.com/spreadsheets/d/1b1K-Zk35uLCk_6n2I-aON-4yc0bkZWQQPme1Em59QOQ/edit#gid=1488193316


Raul: We've replaced 2 websites we couldn't test from the list with pinterest.com and quora.com

Honza: You can not test Snapchat?

Calin: Mobile requires the app, and desktop is not supported (known issue): https://github.com/webcompat/web-bugs/issues/107613

Honza: I see. Is this the final document?

Raul: Once we are done with the testing, we will export all the data we found in a new document.

Honza: Alright, a summary of the findings would help explain the situation. You could insert screenshots, the different issues related to each webpage, and any other data that might help.

Raul: What about the issues that were previously reported before the test was done?

Honza: You could add/mention them in the report as well, especially for reddit.com. The reason why I am asking this is that we have a contact now for reddit.com, and a separate communication channel with them, so all the known issues will come in handy for this.

Raul: We will have the report ready in the next quarter. Basically, we finish the testing on the 31st of March. We can have a draft of the report ready by the first Monday of Q2, for the first team meeting of Q2.

Honza: Ok, sounds good.


Honza: I could create a structure for the report, to make sure we cover everything. Basically what we are looking for is: issues that are reproducible in all browsers, issues that are reproducible in Firefox but not in other browsers, and mobile issues where the page requires installing an app to use a feature. Like where we had `Can't test`.


February 21 2023

Q2 Proposal: Top 100 most visited sites

Could we test this in the upcoming Q2 period?

Things we had in mind for testing:

  • The page is rendered correctly, no visible artifacts or layout problems are present, without glitches, interruptions.
  • The user can:
sign in without issues,
use the search bar,
scroll the page and open different links - no visible artifacts or layout problems are present,
open a menu and select submenus,
navigate back and forward through the pages without issues.
  • Share an article/information.


Raul: We haven't tested the top 100 most visited websites for a while, and we were thinking of proposing this for the next Quarter.

Honza: Do you have specific strategies to test those websites?

Raul: Yes, we had in mind features like sign-in, search bar, menus/submenus, articles etc.

Honza: I see, how do you find the top 100 websites?

Raul: Well, before we had Alexa top 500 sites, and now at the moment we have an internal list, and the Beaucoup list.

Honza: Okay, we will have to look into it. Is this OKR doable? How long did it take last time we did such a task?

Raul: We might finish the whole OKR in one Quarter depending on the workload. In the past, we took this OKR from one Quarter and continued with it in the upcoming Quarter.

Honza: So when doing this OKR, if you find bugs do you file them as regular on github or Bugzilla?

Raul: We normally file the bugs on Github but we can do it also on Bugzilla, if needed.

Honza: I'll ask around for a list, and we can see what the plan is for the next Quarter.

Webcompat.com Analysis

Honza: This is the most used data source that we have. There are discussions about how we could get more, keeping in mind that we have limited resources. The action to report sites is available just on non-release channels. Enabling this on release versions might cause a high number of issues. We are searching for a specific way to get more reports. That would be a different channel, not linked to webcompat.com: it would use telemetry, and the reports would end up in a database. This is just an FYI.

Exploratory testing of Top 10 Social Media

Raul: Most websites we tried on the mobile version need the app to be installed.

Calin: The issue is with the websites where the core feature is messaging.

Honza: Is it impossible to test?

Calin: Main page loads, other features need the app. For some you can switch to the desktop version.

Honza: So desktop works, but mobile needs an app.

SV: Yes. That is a current workaround.

Honza: So how is that going so far? Anything interesting regarding the reports?

Raul: So far, just small UI defects.

Honza: The important part is that everything we find has a corresponding report; if not, a new report should be filed. Where testing is not possible, please make a note of that.

Raul: We can make notes of the current know issues for each site.

Honza: If testing is not possible, leave the N/A page in the list, and introduce a new relevant page to test in the list.

PayPal account (Honza)

Honza: I was also asking around for PayPal; it is not easy to get an account. I ended up talking with Dave Hunt. There is a PayPal email list, which we have not gotten a response from yet. This is still pending. We can refer to this pending email thread as a reply to the current issue.

Calin: The main issue for PayPal is that we do not have a linked credit card or a valid ID verification.

Raul: In the past we used a temporary credit card that was generated online but now it no longer works.

Move to Bugzilla Add-on(SV)

We have started to use this add-on, for issues that are non-compat, but could use an investigation on Bugzilla: https://addons.mozilla.org/en-US/firefox/addon/move-to-bugzilla/

We had user retention in mind when using this: instead of closing the issues as Non-Compat or Incomplete (where account creation is not possible), if users (even anonymous ones) can see that we are trying to help them and are moving the issues to other relevant projects, that might help us with user retention for 2023.

Raul: We used the addon where we couldn't test the issue ourselves such as special accounts/banking or non-compat issues (another app involved, special set-up, pref changes) for user retention.

E.g.: https://bugzilla.mozilla.org/show_bug.cgi?id=1816677

Honza: That sounds great, we want users and reporters to see that we are attempting to solve issues, not just closing the door on valid non-compat or incomplete issues.

Duplicates & repositories moved from Github to Bugzilla

The Fenix repository has moved from GitHub to Bugzilla and all the bugs are closed and archived now, but not every issue has a corresponding Core bug report on Bugzilla.

If we receive a reproducible bug report that is a duplicate of the one from the Fenix repository, how should we proceed in this situation? Should we just mark them as duplicates of the closed Fenix Reports, or file a new report on Bugzilla ?

Context: https://github.com/webcompat/web-bugs/issues/118438


Honza: I would not close them as duplicates of the GitHub Fenix issues; I would open a Bugzilla issue. The repository is read-only on GitHub, so it would be good practice to link an archived report to an active report on Bugzilla.

Calin: Should we file a new bug if there isn't one already?

Honza: Yeah, we could use the "See Also" field, and link the closed Fenix report there along with the current reproducible GitHub issue.

Calin: Is there someone handling the current archived Fenix Git issues?

Honza: We do some in our current webcompat triage meeting.

Honza: How do you search for core bugs in Bugzilla?

Calin: We perform different search queries, either with the current title, or we use relevant keywords.

Raul: We also search by domain or by duplicates, based on history.

Honza: Why don't you send the issue to needsdiagnosis?

Raul: Well, we use the current knowledge base for duplicate issues or similar issues reported in webcompat.com. If we see that a similar report received a resolution that states that the issue is either a Firefox issue or Bugzilla issue, we act based on history, if we have encountered similar bugs in the past. When we are not sure and nothing relevant has been found related to the reported bug, we will just move it to needsdiagnosis or ping someone to look into it, and give a resolution for it, that we might use in the future for similar reports.

Honza: That sounds like a good approach.

Firefox "Unsupported" banner

Is there a way to identify unsupported sites and see how widespread this problem is?

Honza: Do you have any idea how we could identify how many websites do not support Firefox?

Raul: At the moment, just the OKR Trend will help.

Calin: I think we should first identify the common issue across websites when it comes to the browser being unsupported, or, for example, why some websites from a specific geographical region are more prone to treating Firefox as an unsupported browser.

Honza: That makes sense. If we knew what the Core problem is, that might help. Any ideas how we should do that?

Calin: Maybe a user poll would help.

Raul: And maybe we can draw on our previous experience, like we saw with version 110 breaking sites via the UA string, where freezing the version number at 100 solved the issue of Firefox being unsupported on certain pages.
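As an illustration of this kind of breakage (a hypothetical sketch; the actual sniffing code varied from site to site), a version check that only captures two digits misreads three-digit Firefox versions:

```python
import re

# Hypothetical example of broken UA sniffing: the pattern captures at most
# two digits, so "Firefox/110" is read as version 11 and the site shows an
# "unsupported browser" banner. Real sites' broken checks differed in detail.
def broken_major_version(ua: str) -> int:
    match = re.search(r"Firefox/(\d{1,2})", ua)
    return int(match.group(1)) if match else 0

ua_110 = "Mozilla/5.0 (X11; Linux x86_64; rv:110.0) Gecko/20100101 Firefox/110.0"
print(broken_major_version(ua_110))  # 11, not 110
```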


February 7 2023

MS Teams

Honza: For this issue, we have mixed results. Now, the page fails in Nightly but works in Release. It seems that there are 2 versions of MS Teams, for regular users and for Enterprise users.

Raul: We might have one for corporate and one for regular users; the free one is the one that we kept receiving reports for and which we investigated.

Honza: We are not sure if they blocked only Nightly or other versions as well such as Release and Beta.

Raul: Normally only the meetings were not supported but now it seems the whole website is not supported on Linux. We could investigate it more since we had no new issues reporting this.

Honza: That will help, how do I try? Do I need a test account?

Raul: You can test with a regular account.

Honza: Great, could you have a look into it?

Raul: Will do.

  • Notes:

After the Team Meeting, Dennis concluded that team.microsoft.live is also affected. The reports came for teams.microsoft.com as well.


Webcompat.com Analysis

Analysis from Kate Hudson: https://docs.google.com/document/d/1AvULYoAAE4LqL20gUpeHzOmJFOi3scVOeQW-SmxD-nM/edit#

Questions:

  • Does it make sense to optimize and get more reports?
  • We have limited resources to test all reports, but perhaps we could get more quality reports? What could be better?

Honza: We looked into some data about how users file the bugs and most of them give up after completing the first 3-4 steps (7 total). Maybe we could make the report easier, reducing the steps.

Raul: We've noticed that when users get to the description part, they get frustrated and just fill the form with random words until they've met the minimum required to submit the report.

Honza: Is the screenshot optional?

Raul: Yes it is.

Honza: How many people skip the screenshot part? How important is it for you?

Raul: It is optional, but it is very important for us. It makes it easier for us to identify the issue.

Honza: If we somehow make the reporting much easier for the users, how will you handle the situation if we see an increase in the reports?

Raul: We are not sure if that will cause an increase in the reports we receive but in the past, we have received way more and we had a hard time triaging while also doing OKRs, as we saw when the "Report an issue" button was added and ready for usage.


Testing framework for shipped interventions

Honza: Tom was working on the infrastructure on how we test/manage interventions. How does that work?

Raul: At the moment we do Manual and Automatic testing and we do our testing on real devices. For now, these tests don't take that much time. The only issue is when we have to test payments or websites that require special accounts such as banking.

Honza: So you run those tests with on/off interventions, what do you do if they fail?

Raul: We keep them open and mark them as reproducible; if the issue is no longer reproducible, we create a task on Bugzilla to remove the intervention because it's no longer needed.

Honza: Are there any factors to take into consideration when running the Interventions Tests - Automated?

Raul: Besides having the environment set up; for example, bankofamerica.com requires a Mac device, and since the tests are run on a Windows machine, that will show as a fail.

Raul: Other factors to take into consideration are VPN Connections, UI update of the page, and a stable Internet connection.

Honza: Do you feel that these runs of the Interventions are still needed?

Raul: Yes. We can keep track of the issues, we can see if other issues are related to the Interventions, and running them once a month does not take extra resources.

Honza: What is the flow of creating the Run for each cycle?

Raul: We have a predefined list of Interventions that are in place, and following that list, we create the Run based on the environment and if the issues are CSS Injections or Overrides.

Honza: Is the list stable regarding the number of issues?

Raul: Yes, pretty much. The number of Interventions is roughly around 90 at each cycle. Some are new, some are old. It gets updated regularly.

Honza: Glad to hear that.


January 10 2023

OKRs for 2023 (SV)

Should we make new OKRs for Trends: https://github.com/mozilla/webcompat-team-okrs/issues/262, or can we transfer the old one, as we did not close it at the end of 2022. The same question is in place for: https://github.com/mozilla/webcompat-team-okrs/issues/259.

We are thinking of some OKRs to be added in Q1 for 2023. But given the RoadMap for 2023, are there any possible OKRs that we should focus on first?


Raul: Should we open a new OKR for the Untriaged/Triaged issues from 2022?

Honza: The ones that are in progress from the last year should be closed. I think a new project should be created as well.

Raul: Dennis created the project for 2023.

Honza: What about new OKRs? I've seen that Top streaming websites OKRs is already done. Maybe we should move to a different area now.

Raul: We should probably test Top social media websites.

Honza: Have you tested them already in the past?

Raul: We had some testing before.

Honza: Nice, if you have documents to review, please send them to me.

Raul: If we have any idea for new OKRs we will let you know or maybe, if you have any suggestions for the roadmap.

Honza: First, any other ideas for Q1 OKRS?

Raul: We had a cleanup on bugzilla before with bugs related to webcompat, some that were really old and kept piling up in the backlog.

Honza: That sounds like a candidate for a new OKR for Q1.

Raul: We could do that again.

Honza: I am wondering, when you are performing triage, do you find any other issues besides the reported issue? If you find any issues that reproduce, how do you proceed?

Raul: We ping Ksenia or Tom to look into it. There are issues that are Firefox issues but not necessarily WebCompat issues, such as features of the browser.

Honza: What is the number of bugs in our Product (WebCompat)?

Raul: We don't know exactly... We can perform a search query in the Webcompat Mobile and Desktop components, based on the NEW or UNASSIGNED status.
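The search Raul describes can be approximated with the Bugzilla REST API; below is a minimal sketch, assuming the product is named "Web Compatibility" with "Mobile" and "Desktop" components and using NEW/UNCONFIRMED as the open statuses (the notes do not confirm the exact names, so verify them against bugzilla.mozilla.org):

```python
from urllib.parse import urlencode

# Assumed endpoint and field names -- check Bugzilla's REST docs before relying on these.
BUGZILLA_REST = "https://bugzilla.mozilla.org/rest/bug"

def open_webcompat_query(components):
    """Build a Bugzilla REST query URL for open webcompat bugs in the given components."""
    params = [
        ("product", "Web Compatibility"),   # assumed product name
        ("bug_status", "NEW"),
        ("bug_status", "UNCONFIRMED"),
        ("include_fields", "id,summary"),
    ]
    params += [("component", c) for c in components]
    return BUGZILLA_REST + "?" + urlencode(params)

url = open_webcompat_query(["Mobile", "Desktop"])
print(url)
```

Fetching that URL returns JSON; counting the entries in its `bugs` array would answer Honza's question about the total.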

Honza: Cleanup on Bugzilla could be a great OKR as well. We should talk more about that at the upcoming meeting with the team.

Raul: For the top sites Social Media OKR, should we use the Bocoup list?

Honza: We have thousands of sites there; I don't think that category of top 10/20 social media sites is there. I will send you the sheet: https://docs.google.com/spreadsheets/d/128x7npL8THeiGbXsYoiSRN3W19Ha7VrGN3zBUH5w3ik/edit#gid=0. Also, Dennis might assist with another list.

Issues that are reproducible on Release version, but not on Nightly version (SV)

Usually we close such issues as WontFix, but we were wondering if we could discuss them with other team members before adding a resolution, depending on the importance of the page (users, popularity, etc.) or the number of reports received for one website. As seen here: https://github.com/webcompat/web-bugs/issues/115740#issuecomment-1361317178, an uplift to Beta/Release helps in these kinds of situations, especially if we are looking at user retention as well.

Raul: Based on the issues we've received, we've noticed a trend with Azure: a bug that reproduced on Firefox Release but not on Nightly.

Raul: Using the methodology of the Trends OKR, we've managed to identify the issue regarding Azure. This also shows how important it is to identify the trends that are happening. Even though the issue is a classic WontFix, Ksenia pitched in and we were able to successfully solve it without waiting for the 110 release, thus retaining users.

Honza: I saw the thread, good job there. This is what the Trends OKR should be about. Also, yes, based on that issue it is a good idea to prioritize websites that could use an uplift/patch.

Honza: While we are on the subject: since Trends seems to help us a lot, we are thinking of integrating Trends into the Knowledge Base - a real database which can be easily integrated with BigQuery. HTTPArchive and telemetry data are stored there, so the two can be merged together. The Knowledge Base should be built in the same system: read the data and learn from it. For example, for webcompat issues with failing CSS properties, we can query the archive and see how much that property is used across the web, which helps identify the impact of the issue. It would also help produce a State of WebCompat report. Having Trends integrated there would be something. My point is that the OKR is on GitHub, and we should somehow store it in BigQuery. Maybe we can do something similar for Trends: fetching data from web-bugs, fetching the trends and processing them. This is something to think about for a Q2 OKR.
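The HTTPArchive lookup Honza sketches could start out roughly like this; the dataset/table name, the column names, and the `zoom` property are illustrative assumptions only - HTTPArchive's BigQuery layout changes over time, so verify before running:

```python
# Sketch only: table and column names below are guesses for illustration,
# not confirmed HTTPArchive schema.
def css_usage_query(prop):
    """Return a BigQuery SQL string estimating how many pages use a CSS property."""
    return (
        "SELECT COUNT(DISTINCT page) AS pages\n"
        "FROM `httparchive.all.parsed_css`  -- assumed table name\n"
        f"WHERE css LIKE '%{prop}%'"
    )

sql = css_usage_query("zoom")
print(sql)

# Executing it would need the google-cloud-bigquery client, e.g.:
# from google.cloud import bigquery
# rows = bigquery.Client().query(sql).result()
```

The resulting page count, compared against the number of web-bugs reports mentioning the property, would give the impact signal discussed above.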

Verified Paypal account (SV)

We were unable to add Paypal as a payment option when ordering on ubereats (non-compat) and we believe it's because we do not have a verified account.

Context: https://github.com/webcompat/web-bugs/issues/116245 Other: https://github.com/webcompat/web-bugs/issues/116379

Honza: When you say verified account, who should verify the account and how?

Calin: The account needs to be verified by PayPal, with a valid ID.

Honza: I will ask around.



December 13 2022

Needsdiagnosis re-check issues Q4 - Done (SV)

We have finished the re-check of Needsdiagnosis issues. Below are the results: https://datastudio.google.com/u/0/reporting/a0034c41-70fd-4d7e-9dec-4275eb287b9f/page/p_ohx6mafb1c?pli=1

Honza: Awesome, thanks for that.

Raul: Exploratory testing is on track, and will be finished in the coming week.

Honza: Great news.

DevTools Workweek (Honza)

Honza: Most of next year will be about three subjects regarding webcompat (Webcompat Knowledge Base, Webcompat Issue Impact, Webcompat User Retention). We are aiming for a dashboard that will show us the top 20 webcompat issues, based on scoring. That is addressed in Webcompat Issue Impact: collecting data. Our current scoring logic is based on assumptions. All the data collected is stored in BigQuery. The canonical list of webcompat issues is stored in Bugzilla; for every webcompat issue, there will be a corresponding core Bugzilla report. Webcompat User Retention means finding the relation between fixing webcompat issues and user retention. In other words, could we say that fixing webcompat issues helps us retain users? Can we prove it? It is about producing a document on how we can do it, based on behavioral differences.

So our focus for the next year is to build webcompat knowledge base, collect data, and user retention -impact user retention.

Calin: Whenever a bug will be reported on Webcompat, there should be a corresponding Bugzilla Bug?

Honza: If there already is a Bugzilla core bug for that, there is no need. If not, there should be a Bugzilla bug for that (core issue). We will talk about this flow with the whole team as well, in the near future.

Honza: There will be some demos as well.

Honza: I was also thinking about how to improve our triage, based on the data received. Could we simplify the way we produce this data? The work you do is visible in the data, but are people looking at it? Can we make it easier for people outside the project to understand this data? So there are two things we should be working on: 1 - making the work more visible, 2 - learning from the data. This could be a goal for next year.

Honza: There is a company that works for Firefox (Bocoup), that concentrates on platform gaps. Like features available on Chrome, but not on Firefox. We had to come up with the top 20 sites, based on the top 100 sites that they are working on. So that fits together with the current OKRs we have done, as we used some of the data from there, to justify our picks.

Raul: Should we also think about future OKRs for 2023?

Honza: We will discuss more on this at our meeting happening in January, next year. I will be on PTO when our next meeting is due this month.



November 15 2022

Issues for pages where content is streamed (movies, tv shows) where we are not sure if the streaming is legal or not (pirating) (SV)

How should we check such pages to see if the streaming is legal? And if the streaming is not legal, but the issue is reproducible, how should we address it?

Context:

Raul: How do we check if those pages are legal or not? In some parts of the world they are legal, in other parts they are illegal.

Honza: Good question. Are those sites popular?

Raul: Some of them are popular.

Honza: We mostly care about web compatibility between browsers; we tend not to focus on the content that a website is providing, but we do focus on how the browser is behaving. Some content might not be available in our browser, but users will still access that website in other browsers, regardless of whether it is illegal or not. So we want to retain them on Firefox.

Calin: There are some sites that present pop-ups, suspicious content, redirects to spam sites, etc, close to malware content. Should we look into them?

Honza: This sounds like something that `NSFW` pages might present. What are you doing with adult sites?

Calin: We test them, and act accordingly. But some sites are trying to trick the users into clicking on some suspicious links/pop-ups.

Honza: That case is simple. We cannot learn from those; the whole goal for our team is to identify webcompat issues and recommend that the platform team fix them. If there is nothing we can learn from a site, we can ignore it.

Calin: Some users think it is a browser problem, which is not true.

Honza: It is very time-consuming to figure out if it is a webcompat issue when diagnosing. I think this is the same for testing.

Honza: Regarding whether the stream is illegal or not and how we can act on it, I'll get back to you once I have a clear answer.


NeedsDiagnosis Re-check issues (SV)

We have finished the list with reports submitted by users:

This week we have started the rest of the issues.

Honza: Is there a pattern among the issues that are fixed, or where nobody responds on the issue?

Calin: Those issues are very mixed, so a pattern can not be clearly pinpointed.

Honza: Are people responsive?

Calin: Some of them are.

Honza: How long do you wait for a response?

Calin: About 12-14 days. After this, we will start looking into anonymous user reports.

Honza: Because there you have nobody to ask for info, do you just close them?

Calin: We will treat them depending on the outcome (reproducible, worksforme, non-compat, etc).


Disney account (SV)

Disney account is up and running, we have checked some older reported issues, but the page loads as expected, and streaming presents no issues.

Honza: I think I found a way to get a paid account for pages behind a paywall, so if you need something similar in the future, let me know.


Work week in Berlin (Honza)

Honza: Regarding this, you can join the meeting via Zoom if that is something you are interested in.

SV: If the schedule is aligned, sure.

Honza: As you know, not every topic is of importance to you. Here is a link to the document: https://docs.google.com/document/d/1mDQcItNs1nZ6gN6C8OVApPDcf2w5gXbJXTcqY6-_h3g/edit

Honza: I think Tuesday would be the most interesting for you, to see how your work fits into the big puzzle.

Raul: If the policy allows us, sure. If so, we could join and do this instead of the meeting.

Honza: I'll double-check as well.

Honza: Could you also send me something to highlight 2022 - OKRs, numbers, charts, etc., whatever you are proud of? If you can, send it to me by the end of the week.

Raul: Sure thing.


[FYI] PTOs:

29th of November meeting will be canceled



November 1 2022

Web Renderer (SV)

We know that the Web Renderer functionality is no longer a thing, but we are curious if there are any updates on that topic, because we still receive bug reports on webcompat with the label "type-webrenderer-enabled".

E.g : https://github.com/webcompat/web-bugs/issues/113160, https://github.com/webcompat/web-bugs/issues/113089

Honza: No news regarding that. That label is appended by the BOT, let me talk with Ksenia about the BOT integration with bugs in GitHub repo. They seem obsolete for now, but maybe there is something I am missing.

Honza: Is there anything relevant in the reports about webrenderer?

Calin: Usually webrenderer was enabled in about:config, but now the label seems to be applied even for issues where about:config cannot be accessed (Firefox Release for Android).

Honza: Just ignore it, for now, I will talk to Ksenia to see if there is any reason to keep this label in the future.

Issues where a paid account is needed (SV)

We have seen an increase in issues for Disney Plus, mostly about the video not playing, but we are unable to test the issue because a paid account is needed: https://github.com/webcompat/web-bugs/issues?q=disneyplus.

Link for current test accounts: https://docs.google.com/document/d/1cdZG1_uw0rXjxYMmHvWR3dzeFNEtlI8imaOP1UN0BuQ/edit#

Honza: Who gave us access to the paid accounts in the past? I can see both paid and unpaid accounts in the list.

Raul: The "Media Top Sites" tab contains the paid accounts needed at the moment. Recently we have not made such a query for accounts, but from my knowledge, Oana would highlight the need for a paid account to Karl, and Karl would pass that on to the relevant team.


Needsdiagnosis issues (SV)

We have made a document with the provided open issues in the Needsdiagnosis milestone that need to be rechecked: https://docs.google.com/spreadsheets/d/1F9vcSpLQ_hNBeZinsytGXlfXpJLW6vh7C0BJYtd9hIY/edit?pli=1#gid=1519039493

As requested, we have also made the 2nd list with issues that are reported by users: https://docs.google.com/spreadsheets/d/1F9vcSpLQ_hNBeZinsytGXlfXpJLW6vh7C0BJYtd9hIY/edit?pli=1#gid=1634164584

Raul: The first list is with the general issues reported by anonymous users and regular users and the second one is with the issues reported by users.

Honza: We should go with the list of issues reported by users first.

Raul: Sure. We can make an OKR task. After the task is made, when should we start?

Honza: Sure, send me a link. We can start the task right away.

Honza: What's the strategy when testing those issues?

Raul: Regarding the reports received from `anonymous users`, if we cannot reproduce the issue, we will close it as FIXED. If more info is needed, we will ping the assignee of the report. Regarding the reports received from `users`, if we cannot reproduce the issue, we will confirm this with the reporter. If the user confirms, we can close the issue as FIXED. If the issue is said to not be fixed from the reporter's point of view, we will ask for info from the assignee and/or the reporter. In both cases, if the issue is still reproducible, we will leave a comment to highlight this.

Raul: If no answers are received from the users after 12-14 days, we will close the issue accordingly.

Honza: Sounds like a plan then.



October 17 2022

Webcompat.com repository edit rights (SV)

Oana: It seems that both Raul and Calin have no rights to close/reopen or add labels to the issues on Webcompat.com repository.

Oana: Also, it seems that you (Honza) are missing from the OKRs as an assignee: https://prnt.sc/jwkxzvrcfROT

Honza: It seems that I do not have privileges for that repository, I'll resolve this later.

Also, for Calin, could you invite him to the Webcompat internal Slack channel?

Slack ID: ctanase

Honza: All done, you should be there.

We've gathered a list of QA experimental labels for trends (SV)

 Link to the list: https://docs.google.com/spreadsheets/d/1F9vcSpLQ_hNBeZinsytGXlfXpJLW6vh7C0BJYtd9hIY/edit?pli=1#gid=824466205

Honza: It looks good. But the prefix for the label seems to be used for something else.

Raul: A lot of prefixes are already used, so maybe we can come up with a more relevant prefix.

Honza: We'll use this for now, and later we can edit it. I was looking at your notes, so all of these seem like good candidates. Do you want to add them immediately?

Oana: We can use them for now, and we can add issues to the Trends OKR based on these labels. Also, we should keep the same color for all related labels to trends.

Honza: That looks ok. We can start using them. The list should be up to date.

Oana: Yes, we update the list accordingly.

Oana: the experimental labels were added: https://github.com/webcompat/web-bugs/labels?q=trend
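A label search like the one linked above can also be scripted against the GitHub search API, e.g. for pulling trend-labeled issues into the OKR; a small sketch, where the helper name and the example label are assumptions for illustration:

```python
from urllib.parse import urlencode

# Hypothetical helper: builds a GitHub search-API URL for web-bugs issues
# carrying a given trend label (label name below is just an example).
def trend_search_url(label, state="open"):
    q = f'repo:webcompat/web-bugs is:issue is:{state} label:"{label}"'
    return "https://api.github.com/search/issues?" + urlencode({"q": q})

url = trend_search_url("type-unsupported")
print(url)
```

Fetching that URL returns JSON with a `total_count` field, which gives a quick per-label count for the trend dashboards.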

[Beta] State of WebCompat Report (SV)

We have made a draft for: https://docs.google.com/document/d/145jS-dMuTHHsJMCgS0lhvhfa64p2j3nnzSRs2cT3adk/edit#heading=h.8uq5fr5mdo7m regarding the Trends OKR on what we've learned and what we can use from Trends.

Honza: Can we highlight the most important issues from this list?

Raul: Based on the data gathered from Trends and from the number of issues related to a certain site, Youtube seems to be on top of the list.

Honza: It seems that the Print Preview feature can be added to the list, being reproducible on other pages.

Honza: Our next plan is to proactively test the site. Do you think we can test another category of sites as we have for Top Streaming Sites OKRs?

Oana: We can test sites that allow online meetings (Zoom, Google Meet, Microsoft Teams, etc).

Honza: Can we also test relating to Print Preview?

Oana: If the page presents this feature, sure. Or if it has content that is mainly used with Print Preview (PDF files, for example).

Honza: Are there any Trends in the mobile area?

Oana: Mostly for the unsupported features, which are present on the desktop.

Honza: Let me put together some text based on this document. The report is getting more reach now, so this is a way for you to have more visibility. We can also combine this OKR with the current OKR for Exploratory Testing on Top Streaming Sites; maybe there is something we can learn from both OKRs, or we can use them to our advantage to highlight something important. We can also use them as reference for a future OKR.

Honza: I'll summarize the table, and then you can review it and add your ideas as well.

Notes:
* Youtube - an important site and many reports were related to it (recently an issue with short videos, not being able to play videos). This is a major site and reported issues tend to have rather big impact. Mostly desktop.
* PDF Printing - Issues found in the produced PDF. Chrome works fine. Mostly desktop. Honza: double-check with Tom why this isn't a Webcompat issue.
* OKR - test top movies and streaming sites
* Next OKR - Testing online meetings sites

[FYI] PTOs:

Oana: 24th-31st October

Oana: Maternity leave starting from November


October 4 2022

[FYI] OKR tasks added to the dashboard (SV)

https://github.com/mozilla/webcompat-team-okrs/projects/16

Honza: I see 2 planned, and 3 in progress. These planned OKRs, will you be working on that list?

Oana: Calin gathered the list, and we will discuss the topic.

Honza: The in-progress ones make sense.

Oana: They are OKRs in progress, e.g. ending every week with 0 untriaged/unmoderated issues.

[FYI] Site Intervention/UA Override re-check (SV)

This task is being performed today so it can be part of this month's new OKR task.

Worldwide Streaming & Video Stream sites (SV)

A list of 10 Streaming sites and 10 Video Stream sites was gathered to perform exploratory testing on them in Q4:

Which approach would be the best to tackle?

1. Test 5 Streaming sites and 5 Video Stream sites in 2022 Q4
2. Test 10 Streaming sites in 2022 Q4 and 10 Video Stream sites in 2023 Q1 or vice versa

Calin: We have broken down this list into 2 categories, mainly movie streaming and gaming streaming. Can we merge this list or should we make 2 separate OKRs?

Oana: What would be the best approach? Having mixed content, or having 2 OKRs?

Honza: Do we have a link?

Calin: Only SV accounts have access, but I will update the permission.

Calin: Link added.

Honza: Interesting, you have gathered all the issues related to certain pages?

Calin: We have used the INCOMPAT ADD-on.

Honza: How does that work?

Calin: The add-on lists all the issues filed under a certain domain. We can share the screen now and have a walkthrough on how it works and how it is useful to our project.

Honza: Right, so it counts every issue (closed and opened). That looks very helpful.

Honza: Coming back to the OKR proposal, is there any plan to reflect the number of issues, the signal where most issues were?

Oana: We wanted to have worldwide coverage.

Honza: Seeing the number of issues per domain, should we concentrate our efforts first on the sites with the most reports? Netflix seems to have quite a lot. Do you also look at the popularity of the domain?

Calin: We have taken that into consideration as well, as we have picked, via different sources, domains that have a significant number of subscribers.

Honza: How will the testing work?

Calin: We have a checklist.

Oana: Mostly exploratory testing will be done.

Honza: Do we have the checklist somewhere, so I can see it?

Calin: https://docs.google.com/document/d/1mfazfJ03Zmf4zrFVQ7vrMFuk8HoRYLRK-6sJ0tziLSQ/edit

Honza: I will check the documents offline as well.

Oana: Sites where we do not have an account will be a problem.

Honza: If you do not have an account, that means that our options will be limited.

Oana: We have a list with Mozilla paid accounts from SV, so we will be using them.

Raul: Where we do not have an account, we can proceed to the next available domain where testing can be done.

Oana: We can also check the list to see where an account is needed or not.

Honza: You mentioned Mozilla accounts. Are those paid by Mozilla?

Oana: Yes.

Honza: What happens in the future when you need a paid account for a certain site - what is the procedure there?

Oana: We ask around for an account, and if there isn't one available, we can make a request based on the priority of the domain/issue.

Honza: If there are sites for which we do not have an account, it would be nice to gather a list of the sites to which we do not have access/an account.

Oana: Sure, we can gather data for a future document.

Honza: How do you collect data that we can use after these OKRs?

Oana: We add them to the OKR task with all the relevant data.

Raul: We also make Metrics and collect data via documents.

Honza: Any interesting data that we can collect from this, we can add to Trends. Testing these sites may offer us data that can provide insights, e.g. the results of the testing compared to the results gathered for that domain in webcompat - how Firefox is doing in that area, and whether it is supporting those sites properly. Using your checklist, you can score a URL based on the findings.

Hawaii AllHands (Honza)

Are there any updates/meeting notes? Any tasks for us?

Honza: Yes. Meeting notes from most of the meetings the team (DevTools + WebCompat) had at AllHands: https://docs.google.com/document/d/1r8zMNmjRsdCILn7H5nAUytYrkseCmFpkYeHzd2Jw8-c/edit#heading=h.q7wr5qbyddkd

Honza: Scrolling down in the document, Webcompat Tooling for Diagnosing was a topic of interest for our project.

  • Pretty Printing was one of the biggest topics.
  • Platform Data Glue shows an insight into how the future issues might look (what’s happening in other browsers and not in our browser)

Honza: If you can go over these notes, and give feedback, that would be very valuable.

Honza: State of `webcompat` report shows the goal of `webcompat`(how many issues, what is broken)

Honza: We are actively building a report which has a goal to teach others what is important from `webcompat`, so for example, the product team can gather data to improve or fix the browser based on our reports.

Honza: Everyone understands that `webcompat` is important, but not a lot of people know how helpful a `webcompat` report can be.

Honza: In the future, more context is planned to be added to our `webcompat` reports.

Honza: Feedback on this note would help us a lot.

Honza: For example, after our top 10 Streaming OKR, we can gather data that other teams might find useful, or they can see what can be improved. As you are doing the report, see it as teaching people about our product via `webcompat`, based on the irregularities found.

Honza: Patricia asked about a Trend that shows that something bad is happening, e.g. performance issues.

Honza: There is a list of performance bugs, with the label `bad performance` or `performance` label. https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+performance+label%3Atype-bad-performance+is%3Aclosed

Honza: Ksenia uses the unsupported label. https://github.com/webcompat/web-bugs/issues?q=is%3Aissue+is%3Aopen+firefox+is+unsupported+label%3Atype-unsupported

Honza: can we do something similar while triaging?

Oana: We also add the labels `unsupported`, `type-no-css` or `print`, but for performance issues we are not always sure if it really is a performance issue or something else, so we only record a performance profile and add a comment with it on the issue.

Oana: Labels for performance issues are added by our dev team after the issue has been diagnosed by the devs.

Honza: Can we think about a new list of labels for Trends? And how we can make our diagnosis process easier?

Oana, Raul: Sure.

September 20 2022

Onboarding Calin (SV)

  • add sv-calin to Slack channel #webcompat-internal
  • add Calin to webcompat internal - for Google Calendar

Honza: I will do it in two weeks after the ALL HANDS and his PTO, so I can properly introduce him to the team as well, and so he can introduce himself as well.

iOS issues approach (SV)

We have received an issue coming from the Firefox iOS repository, but issues that are reproducible on Firefox iOS are being moved by us to the iOS repository, not to our `needsdiagnosis` milestone:

https://github.com/webcompat/web-bugs/issues/109582#issuecomment-1238750705

Should we continue to move reproducible Firefox iOS issues to the Firefox iOS repository? If yes, can we also let the Firefox iOS team know about this, so they are aware of our flow, instead of them moving valid issues to our repository?

Oana: iOS was dropped from our team. We just move them to their repository. We have a Firefox iOS channel for webcompat on Slack (#webcompat-firefoxios ), but there is no activity there.

Link to iOS repository: https://github.com/mozilla-mobile/firefox-ios/issues/

Honza: Let's discuss this with the entire team at our meeting.

Tom: When the iOS team determines that an issue is with a website, not Firefox/iOS, they will move it back to us. And in those cases it will probably be non-compat, since the issue will also happen on Safari on iOS too.

Tom: So we can just leave those open as needscontact, but not specific to firefox (just iOS in general). That way if anyone ever wants to reach out to the site, they can.

Focus and Fenix repositories transition to Bugzilla (SV)

We have seen this on the Focus repository: https://github.com/mozilla-mobile/focus-android/issues/7621

Do we have any info regarding a date when this will happen?

How would this affect us and our workflow? And will specific bugs continue to be reported in the old/new repositories, or on Bugzilla?

Honza: I don't have much info on that either, but I will ask around. I think all reports related to Fenix and Focus will be reported in Bugzilla.

Oana: Will our repository be moved to Bugzilla as well?

Honza: So far, there are no plans regarding this.

Proposals for Q4 (SV)

We have gathered some proposals for Q4 https://docs.google.com/document/d/1qlI81okmrrdhvdxLeW337PG25_TRmNONio_bcob0vL8/edit#

Honza: Can you please explain more about the intervention OKR?

Oana: Before the release, we check if the Interventions or UA Overrides are still needed for the issue.

Honza: And what about the OKR regarding the top 10-20 sites?

Oana: This will be based on the reports received and also trends.

Honza: We can also use the Trends OKR to help us with this. Looking at the Trends, we can see what we can focus on. Collecting data from the Trends OKR and checking if the Trends are really Trends or not.

Oana: Based on the current Trends, we could consider top 10-20 Streaming sites.

Honza: What about the Contact ready? What is the outcome of this? Does it work?

Oana: Sometimes we get a response that the issue is not reproducible. Sometimes the website contact is not reachable, and we can close them as incomplete.

Honza: This sounds like we can do this more often. Who is setting the milestone for this?

Oana: The person who is assigned to the issue.

Honza: Did you do this using other labels/milestones?

Oana: We did it for most of the milestones (sitewait/needsdiagnosis/needscontact). We usually do this every quarter, iterating over the milestones.

Honza: How many issues are in these milestones?

Oana: Around 63, but for `needscontact` around 500.

Honza: Regarding rechecking the ETP issue, we can ask Tom at the meeting. Will you be checking all the ETP issues on Bugzilla?

Oana: There is a component in anti-tracking on Bugzilla. Some of them might be complicated or complex.

  • Update from TOM on ETP recheck OKR:

Tom: The anti-tracking team considers standard-mode issues important, but I believe strict-mode-only ones are generally assigned a severity quickly, and do not require a round of re-checking anytime soon after that. So we can skip non-standard issues if they have a severity of s2 or lower, and I don't think there are too many bugs left that are s1 or standard-mode-only. We have triage and diagnosis meetings every week, so I don't think there is too much value in going over them again.

Honza: Sounds good. Thanks for the list.

Oana: All the proposals depend on an estimate of the workload, so some might be dropped.

Trends (Honza)

 * YouTube videos don’t work ([bug](https://bugzilla.mozilla.org/show_bug.cgi?id=1785149))
 * Entrata platform not supported in Firefox ([gh](https://github.com/mozilla/webcompat-team-okrs/issues/262#issuecomment-1211521618))
 * Yahoo mail ([gh](https://github.com/mozilla/webcompat-team-okrs/issues/262#issuecomment-1202993381))
 * Google Calendar ([gh](https://github.com/mozilla/webcompat-team-okrs/issues/262#issuecomment-1246669138))

Honza: What should be the workflow? Spotting trends or potential risks and checking the Bugzilla bugs already reported; reporting a new one in Bugzilla is also part of the workflow.

Oana: For Firefox being unsupported, we can contact the site owner as well.

Honza: Should we go back and look in the comments if the issues are reproducible?

Oana: Yes, we already re-check the issues that have been signaled using the Trends if they are reproducible or fixed. We update them on the fly, with the corresponding Bugzilla bugs as well. We also follow up with users as well.

Honza: Do reporters show any interest? E.g asking questions, being willing to volunteer?

Oana: Not in our case; very low interest is shown in volunteering to do diagnosis - they are interested only in the issue getting fixed.

Oana: Dennis collaborates with other people (outreach) and part of the work they do is to learn how to diagnose issues.

Honza: I will talk to Dennis about it.


September 06 2022

Cases where the "lock icon" shows a message like "Parts of the page are not secure" (SV)

In the past, we have treated issues where the lock icon in the URL bar stated that parts of the page are not secure or not working as valid webcompat issues. How should we treat similar issues from now on?

Context: https://github.com/webcompat/web-bugs/issues/109640

honza: treat them as non-compat, but better ask dennis about it (for mixed content issues)

dennis: That happens when a site served via encrypted https:// is loading an asset, like an image, over unencrypted http://. This is a small security risk, which is why browsers warn you. However, this is just a warning; Firefox will still load and display the "insecure" image either way. So from our point of view, nothing is broken in most cases. It only becomes a WebCompat bug if something is actually broken on that site, for example if images are missing. In theory, Chrome will show the same warning. However, this became a lot more confusing recently, as Chrome is now shipping an automagic upgrade mechanism: if Chrome encounters mixed content, it will try to load the image via https://, and only warn if that fails. So you generally don't see this warning in Chrome anymore.

Accounts/access for Calin (SV)

As far as I know, an LDAP account was already requested for Calin. For webcompat, he already has an account, but he needs rights to be able to change the status of issues, add labels, etc.

Oana: is it possible for Calin to get access?

Honza: working on it

Oana: but for Github, he needs to be added to the repository.

Honza: I will do it.

Repositories:

- https://github.com/webcompat

- https://github.com/mozilla

Browser feature issues that are reproducible for one page only, working as expected on other pages (SV)

Are issues like this classed as valid compatibility issues, or valid browser issues? For example, several saved logins (3) are shown for walmart.com, instead of just the login info for Walmart. Facebook, for example, shows just the login info for Facebook, without showing saved logins for other pages.

Context: https://github.com/webcompat/web-bugs/issues/110278

honza: indeed it might be a browser feature, but we can't fully know for sure, so move them to Needsdiagnosis and they will be checked at the Fast Triage meeting

QA Trends (Honza)

Most encountered issues at triage:

- YouTube videos not playing - reproduced by Calin and Raul

- Twitch videos - videos are not displayed in dark mode - fix is available for Firefox Nightly

- Print preview issues - broken layout

honza: continue adding qa trends

August 23 2022

DevTool - Remote Debugging tablet (SV)

Unable to see the Inspect panel on a tablet device (empty screen shown), even if the connection to the device was established. (USB debugging is enabled both in Firefox and in the device system) https://prnt.sc/f4ubd-5H54QX

Sometimes an error is shown https://prnt.sc/lwn7JAju04t9

Honza: Let's keep this topic for the next meeting.

Oana: [Update] it works now

Notes:

  • If this happens again:
   * There is #devtools and #devtools-team slack channel
   * The best person to talk to: @jdescottes

ETP issues and caching (SV)

Some sites are broken due to ETP (usually Strict), but after disabling and enabling it again, the issue no longer reproduces.

E.g. https://github.com/webcompat/web-bugs/issues/109139 (https://www.mpecopark.co.uk/)

Workaround: after clearing the cache, the issue reproduces again with ETP - Strict.

This probably needs some investigation on the ETP side; the cache should probably be cleared when changing ETP states.

Should we continue reporting the issues in Bugzilla as we did before and add a note regarding the cache problem?

honza: talk to Tom, add the topic to the Webcompat meeting agenda, and file a Bugzilla bug if there is none (with details and examples)


July 26 2022

QA Documentation added to Mana page (SV)

https://mana.mozilla.org/wiki/display/QA/WebCompat+team+documents

Honza: Cool, thanks for that.

OKR Triage Trends (SV)

We've added some topics to the OKR https://github.com/mozilla/webcompat-team-okrs/issues/262

Raul: Observing duplicate issues that were not reproducible at the time (as per https://github.com/webcompat/web-bugs/issues/106716, where we noticed a pattern) helped us reproduce an issue that affected numerous users, following the Trend guideline.

What is a Trend? (SV)

We summarized some ideas:

- something that occurs on most used/popular sites
- duplicate issues 
- unsupported features on Firefox 
- unable to sign in with FB/Google/Twitter with ETP - Strict enabled
- embedded media content not displayed with ETP - Strict enabled

Honza: yes, these points summarize the idea of the Trend, and what to look for when observing a Trend. I would also add a page not working in certain parts of the world.

[FYI] Incompat add-on (SV)

https://addons.mozilla.org/en-US/firefox/addon/incompat/?utm_source=addons.mozilla.org&utm_medium=referral&utm_content=search

This addon is a companion for https://webcompat.com. It helps track compatibility bugs for sites and shows you the number of already reported bugs on that domain.

Honza: Does this add-on show Duplicate issues?

Raul: Yes, it creates a list of all the issues signaled for the submitted URL. All the reports are shown via a GitHub search query made by the add-on.
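For illustration, the kind of search query the add-on performs could be built like this; the exact query string and qualifiers the real add-on uses may differ:

```python
from urllib.parse import quote

def webcompat_search_url(domain: str) -> str:
    """Build a GitHub issue-search URL for web-bugs reports mentioning `domain`.

    Hypothetical helper mirroring the general idea of the add-on's lookup;
    the real add-on's query may use different qualifiers.
    """
    query = f"repo:webcompat/web-bugs {domain} in:title"
    return "https://github.com/search?type=issues&q=" + quote(query)

print(webcompat_search_url("walmart.com"))
```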

Honza: Does it help you in your daily triage?

Raul: Very much. It helps identify duplicates faster and easier, or other related issues with the reported issue.

DevTool - Remote Debugging tablet (SV)

Unable to see Inspect panel on a tablet device (empty screen shown), even if the connection to the device was established. (USB debugging is enabled both in Firefox and in the device system) https://prnt.sc/f4ubd-5H54QX

Sometimes an error is shown https://prnt.sc/lwn7JAju04t9

Honza: Let's keep this topic for the next meeting.

Priority vs severity labels (Honza)

Honza: I've seen that you use labels to set severity and priority. How does that work?

Raul: Once an issue is reproducible, we move it to the Needsdiagnosis milestone, where we set the default label for priority to normal, and the severity label accordingly.

Honza: How do you set the priority level via the label?

Raul: Based on the impact the signaled issue has, we have 3 levels for the severity label: minor, important, and critical.

- minor - cosmetic issues on the page
- important - a non-core piece of functionality is broken
- critical - the page or core functionality is unusable, or you would probably open another browser to use it
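The three levels above could be captured as a simple lookup for sorting triaged issues. The `severity-*` label names and numeric ranks below are assumptions for this sketch, not necessarily the exact labels used on web-bugs:

```python
# Illustrative mapping of the severity levels described above to a rank,
# e.g. for ordering triaged issues (label names and ranks are assumptions).
SEVERITY_RANK = {
    "severity-minor": 1,      # cosmetic issues on the page
    "severity-important": 2,  # a non-core piece of functionality is broken
    "severity-critical": 3,   # page/core functionality unusable
}

def most_severe(labels):
    """Return the highest-ranked severity label present, or None."""
    known = [l for l in labels if l in SEVERITY_RANK]
    return max(known, key=SEVERITY_RANK.get, default=None)

print(most_severe(["priority-normal", "severity-minor", "severity-critical"]))
```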

Honza: Who sets the labels?

Raul: The priority is set during the diagnosis process, while the severity is set in the triage process.

DevTools for QA guidelines document (SV):

https://docs.google.com/document/d/1LiRE6crZ4sSzzxvU3Xkjg410bQ6-5h2b7nivetMl4ek/edit

Honza: Thanks for that, that is very helpful.


July 12 2022

Q3 Planning done (SV)

We've created the tasks and added them to the 2022H2 dashboard. https://github.com/mozilla/webcompat-team-okrs/projects/16

Oana: If you consider adding other tasks, please let us know.

Honza: How does Site Interventions Release work? Is it like testing every Release intervention?

Oana: Yes, that is correct.

Issue reported with features unsupported on Firefox (SV)

We get issues where some features are not supported on Firefox but are supported on other browsers.

There are many times when we can't create a test account or the steps to reproduce (STR) are not clearly provided, so we are unable to verify them.

Based on the info/description/screenshot provided by the user, should we move them to Needsdiagnosis or maybe directly to Needscontact, so the team would contact the site owner to understand why those features are not supported in Firefox?

e.g https://github.com/webcompat/web-bugs/issues/107210

Honza: If it's the second case, we can move it to NeedsContact, and if we do not have an account, we should move it to NeedsDiagnosis, because we can ask around if we can get an account to test.

Interventions/UA Overrides QA verification guidelines manual+automation (SV)

Ref: https://docs.google.com/document/d/1WCKizeFjR_125FMmEySPAjjSSkEmimUrXvwnoyIhE9E/edit?usp=sharing

Honza: Do we have a list of the already made documents? I know this is not the first time you have made such a document, which is super-helpful.

Oana: We can make one, sure.

Honza: We also have the Mana Page: https://mana.mozilla.org/wiki/pages/viewpage.action?spaceKey=FIREFOX&title=WebCompat, can we add it here?

Oana: Sure, we can add it here.

Honza: Can you quickly summarize this document for me?

Oana: Of course. There is a bit of info about Interventions/UA Overrides: how to see them on different devices, how to enable or disable them, and a section about manual and automated testing.

Oana: If the webcompat issue is still reproducible, the Intervention/UA Override is still needed. If not, we create a Bugzilla task to remove the Intervention/UA Override.

Honza: When you test if an Intervention/UA Override is needed, do you go through a list?

Oana: Yes, Dennis created a dashboard where we have the necessary data.

Honza: I see you build geckodriver as well. Is that needed?

Oana: Yes, that is needed for the automation set-up in order to run the tests in Firefox.

Honza: Sounds cool, please share this document in the next Webcompat meeting.

Honza: What happens if there are too many failed automation tests?

Oana: We run them one by one.

Oana: For the moment, the automation tests are just for the desktop issues.

Trends/patterns (Honza)

One of our goals is to spot trends in the WebCompat landscape.

Honza: We have a list of tasks to be completed. This should help us recommend to the team which webcompat issues are the most important and which are not.

There are different perspectives regarding this (how many users are using the page, etc).

We make recommendations to the platform team, so that they can focus on them.

The second part is trends, and about spotting them. So, what is a trend?

A trend might be that a page does not work in Firefox in some parts of the world (unknown issues). Or other browsers are implementing APIs that Firefox is not (doing something without us - future issues).

Oana: The second example of the trend can be observed in Interventions/Overrides

Honza: Exactly. The third case of a trend is known webcompat issues.

So as you are doing triage, you might be able to spot trends. Maybe you can see some patterns. Is that something you can help us with?

Raul: Should we focus on reproducible issues reported by users, or Worksforme issues?

Oana: Usually we move issues that we can not reproduce to Worksforme.

Honza: Regardless of their status, as long as the issues present a pattern/trend, we can write them down.

Is it possible to summarize this kind of issue?

Oana: We can make a document to summarize this.

Honza: Great, let's try this.

Raul: Here we have an issue that might be classed as a trend. Should we mention future issues inside related issues, like we did here?

https://github.com/webcompat/web-bugs/issues/106716

Honza: Sure.

Oana: Can we make an OKR task out of this, eg. Triage Trends?

Honza: Sure.

Oana: I've created the OKR task and added a few items:

https://github.com/mozilla/webcompat-team-okrs/issues/262

DevTools (Honza)

How much DevTools is useful/needed for triage?

Oana: We use the Inspector to pinpoint the affected area of the code, or we play around with the CSS where possible to see if a fix can be applied, and we also use it for RDM.

Honza: Next time, I will put this item on the agenda again, so we can talk more about this and about what we can do to improve DevTools for everybody.

Oana: Also, I've seen something about the Compatibility panel. We hardly use this. Also the screenshot feature.

Honza: We will talk about this as well.

Oana: Reporting issues from the DevTools, how will this work? Will it be implemented?

Honza: This is a suggestion, we have not agreed on yet.

Oana: We also use Remote Debugging for Android. We have some ideas for improvements there, as we sometimes guide users to use it and we use it ourselves.

Honza: Sure, we can talk about what we can improve here as well.

Oana: Also, we use the performance tab of the DevTools, and the Network tab.

Honza: Cool, highlight this in a document regarding how, what and why you use DevTools in your triage process.


June 27 2022

Verification of shipped Intervention/UA Override (Honza)

honza: as discussed on Slack with Dennis, I was wondering about the process from your point of view: what happens around every cycle of shipped Interventions/Overrides?

oana: at each cycle, 2 weeks before the release date, we perform a verification, both manual and automated, of the list of Interventions/Overrides.

We check with both the Interventions/Overrides enabled/disabled, and based on the results, we conclude if the Interventions/Overrides are needed or not.

If the issue is no longer reproducible with the Interventions/Overrides disabled, we will submit a Bugzilla report to request the removal of the issue from the list.
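The keep-or-remove decision described above boils down to one check; the function name and return strings below are illustrative only, not part of any actual tooling:

```python
def intervention_verdict(reproducible_with_intervention_disabled: bool) -> str:
    """Decide the fate of a shipped Intervention/UA Override after retesting.

    If the original breakage still reproduces with the intervention disabled,
    the intervention is still needed; otherwise a Bugzilla report is filed to
    request its removal. (Illustrative sketch of the process in the notes.)
    """
    if reproducible_with_intervention_disabled:
        return "keep intervention"
    return "file Bugzilla removal request"

print(intervention_verdict(False))
```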

honza: I have seen that some Bugzilla reports for Interventions/Overrides have a corresponding GitHub issue.

oana: some of them have a corresponding GitHub issue because they were first signaled using Webcompat reporter, and they are added to the "See Also" field in order to give context when investigating.

honza: is there a list regarding the Interventions/Overrides?

oana: We have a list we use, created by Dennis https://webcompat-interventions-dashboard.0b101010.services/

but also in `about:compat`, we can see the active Interventions/Overrides that are in place and their corresponding Bugzilla task.

oana: at each run, we create our own list (containing both the Bugzilla and the Github issue, add status and comments) https://docs.google.com/spreadsheets/d/1F9vcSpLQ_hNBeZinsytGXlfXpJLW6vh7C0BJYtd9hIY/edit?pli=1#gid=477077755

oana: we'll create an Intervention/UA Override guidelines document asap

[FYI] Q3 Planning is in progress (SV)

June 14 2022

Firefox Release vs Firefox Nightly reproducibility (SV)

Currently, we close issues that are reproducible on Release but not on Nightly as Won't fix, with a message asking the user to test again on the next release.

We previously agreed on this approach with Karl. Should we continue doing so?

honza: we can keep this approach

[FYI] Bug reported QA flow (SV)

We've created a chart and some guidelines on the work performed by SV QA team after a bug is reported on webcompat.com platform.

 Chart with the QA flow: https://drive.google.com/file/d/16Y3XwBIZArdySUvZAzcql8AiHetu5Sql/view?usp=sharing
 Guidelines - flow explained: https://docs.google.com/document/d/190-mh_vNzrso2RaEQ5OULii-YsN3d5sUt8AZVSOUH5A/edit?usp=sharing  

honza: can this be shared with Webcompat team?

oana: yes, I've created a copy for the team using a Mozilla account; everyone should now have access to view and comment

[Honza] WebCompat Repos

What is the relation between them and what does the process look like?

honza: we discussed it, all good

Fast Response Triage details (SV)

Are there any updates on this topic?

honza: things are moving, work in progress, I'll keep you updated

paul: this will be helpful since all the team members are working on it


May 31 2022

Welcome Honza

Introduction to webcompat. QA tasks.

Previous Sync meetings:

- Sync meetings with Karl

- Sync meetings with Mike