- Web Compatibility Meeting - 2023-04-25
- Minutes: Previous meeting (2023-04-18)
- Scribe: James
- Chair: Raul
State of WebCompat Report (Honza)
Draft document [available](https://docs.google.com/document/d/1Bp50F9NrD4uiZ38bELCrBTkHcCxB9Ie-nWJrHvEY3qo/edit#)
- Are we ready to publish it?
Some comments that need to be addressed. Any feedback?
- jgraham: Writing up the social media sites study
- Honza: Seems like we're pretty much ready. Finish the edits this week and publish on Monday.
DevTools: JS Pretty Printing (Nicolas)
Previously, inline scripts couldn't be pretty printed; now the JS in inline scripts is prettified, though without contextual awareness of indent level etc. Also fixed column breakpoints when pretty printing, which required fixing a performance issue first. We also made pretty printing in general faster and are monitoring performance. Pretty printing could previously produce very long lines; we adjusted the heuristics to improve this. It's not prettier-level, but it should make things much more readable. Also some smaller fixes.
- Feedback (suggestions for improvements, next steps, etc.)
- Tom: Thanks! I've already noticed the improvements and am using column breakpoints. Still some bugs with weird regexp (maybe) breaking pretty printing (makes script not run).
- Nicolas: Please file a bug, we don't know about that.
- Ksenia: It looks great!
- Honza: One more area for improvement is source maps. Broken on reddit.
- Tom: Also broken in Chrome on reddit. Might be buggy source maps.
- Honza: Improvements in that area might improve debugging of some production sites. Please schedule peer feedback sessions, those are important sources of feedback.
- Nicolas: Alex has started working on source maps. I'm reviewing patches already. My next project is to add a compatibility tooltip. Shows an icon if a property has some compat issues. Seems like it could be helpful. Hopefully there will be a demo of that in a couple of months.
- Tom: Would like CSS pretty printing as well if possible, including in the style editor. Sometimes I end up removing parts of the stylesheet, and that would be easier with pretty printing.
- Nicolas: That's planned, but we haven't had time so far. Could be useful for cases where style rules don't come from CSS files but from JS code.
QA - Top 100 Websites testing (SV)
We want to discuss and decide on how to construct the list of websites that we should follow.
Angle of approach
We are considering two possible angles: 1. Based on the most visited websites, regardless of category. 2. Top visited websites in specific categories that are of interest to our team/Firefox.
Both approaches have pros and cons, so we would appreciate your input if possible. We would also appreciate a little help building the list after we decide on the approach. Honza already suggested that we should use sources like:
- https://www.similarweb.com/
- https://tranco-list.eu/
- [CrUX top 1000 doc](https://docs.google.com/spreadsheets/d/1HcafFKM_bv-2O6qad011mTgCNWX2lEjvBNxTYld1GYk/edit#gid=1838995098)
How deep do we want to check these websites?
We were thinking of checking:
- UI - making sure the websites load properly
- Functionality - checking basic functionality of the website
- Account creation - an account can be created, if available
- Account login/logout (where available)
- Saving logins - verify that account logins can be saved and used (not sure if this is of interest to us)
- Audio/Video - ensure that audio/video content can be played
1. A valid phone number from a certain country might be needed for the account creation process.
2. A credit card might be needed for account creation or for basic functionality of the page (for shopping pages, payment at checkout).
3. Geolocation restrictions - where a VPN does not work and a connection to the page cannot be made.
For mobile, testing will be conducted on Android. For desktop, should we focus on Windows, or should we add other operating systems as well (e.g. Mac, Linux)?
- Paul: This is an OKR for QA. We've previously used Alexa. Other QA teams have been using categories e.g. streaming, banking. Feedback on which approach would work best would be welcome. Need to decide how broad the testing will be.
- Honza: I'm working on the credit card problem, but it will take time.
- Honza: Should we use the sites that were identified in the user feedback?
- James: Yeah, a lot of the sites were from the top sites - video conferencing, social sites. One reason not to use Tranco is that it won't tell the difference between Google Meet and google.com. One option is to take the top 1000 from CrUX and from that pick a subset to fill categories (social, conferencing); it could be random. It makes sense to include sites with known breakage and sites that are harder to test.
- Paul: We'll generate a list and send it to the team to make sure we're covering the most important ones. We could do different selections at different times.
- Honza: You should also share the feature testing plan.
- Paul: We've shared an initial list. We want to make sure that sites render, that accounts can be created, that media works, but we can't go very deep / test edge case features.
- Raul: There might be other browser features that we want to test e.g. printing? We usually don't do this, but we could add it if it helps.
- Tom: Mobile testing?
- Paul: Yes, on Android.
- Tom: Form inputs would be good to test so we can see where there's e.g. fastclick problems.