anssiko opened this issue 8 years ago
thx @anssiko. I will take the responsibility to keep the IDL harness up to date with the spec. I will try to extract the IDL for each conformance class (controlling and receiving UA) programmatically. I'd also like to ask Presentation API implementers to share more information with the group about their implementations, e.g. which version of the controlling UA, which presentation devices, limitations, etc. This information may help testers while writing or reviewing tests. For testing activities, I'd like to start with basic tests around the following features:
If you'd like to open a new issue, please use the label testing.
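To give a flavour of what such a basic test could check, here is a sketch using a hand-written mock (`MockPresentationRequest` and `throwsNotSupported` are invented for illustration; real web-platform-tests run testharness.js in a browser against the actual `PresentationRequest`). Per the spec, constructing a request with an empty sequence of URLs should throw a `NotSupportedError`:

```javascript
// Illustrative sketch only: a minimal mock of PresentationRequest that
// follows the spec's constructor rule (an empty URL sequence throws
// NotSupportedError), plus a tiny stand-in for testharness.js's
// assert_throws pattern.
class MockPresentationRequest {
  constructor(urls) {
    if (!Array.isArray(urls)) urls = [urls];
    if (urls.length === 0) {
      // The spec requires a NotSupportedError DOMException here.
      const err = new Error("An empty sequence of URLs is not supported.");
      err.name = "NotSupportedError";
      throw err;
    }
    this.urls = urls.slice();
  }
}

// Returns true if fn throws a NotSupportedError, false otherwise.
function throwsNotSupported(fn) {
  try {
    fn();
    return false;
  } catch (e) {
    return e.name === "NotSupportedError";
  }
}

const emptyThrows = throwsNotSupported(() => new MockPresentationRequest([]));
const singleOk = new MockPresentationRequest("https://example.com/receiver.html");
console.log(emptyThrows, singleOk.urls.length); // → true 1
```

A browser version of this would use `promise_test`/`assert_throws_dom` from testharness.js instead of the ad-hoc helper.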
I will let you know when the initial web-platform-tests PR is merged.
Thanks @louaybassbouss. I updated the initial comment with the suggested tasks for basic tests, and will link to related issues when they appear.
The initial PR has landed to establish the test structure for the Presentation API, so you're now able to run the work-in-progress tests in-browser at:
http://w3c-test.org/presentation-api/
The Presentation API tests are located in the w3c/web-platform-tests repo at:
https://github.com/w3c/web-platform-tests/tree/master/presentation-api
(I added these links to the head of the spec and updated the first comment in this issue accordingly.)
w3c-test.org mirrors the master branch of the web-platform-tests repo. Whenever a new Pull Request is merged to master, the mirror should be updated.
thx @anssiko. I just submitted a new PR to the web-platform-tests repo containing an update of idlharness.html for controlling and receiving UAs according to the latest ED from today.
Volunteers are welcome to review this PR. Please add a comment to the PR after review (if everything is fine, just put LGTM).
@zqzhang Could you look into creating some of the basic tests enumerated in https://github.com/w3c/presentation-api/issues/266#issue-133190198?
@yhe39 will jump in on this after the International Labor Day holidays.
Thanks @zqzhang and @yhe39 -- please coordinate your test contributions with the Test Facilitator @louaybassbouss. He indicated he'd have new contributors to help with testing too.
@louaybassbouss feel free to share your plans and expected schedule. It'd be good if this meta issue were kept updated as the testing progresses, to allow more people to jump in and help.
Initial test results published (thanks @louaybassbouss and @zqzhang):
https://w3c.github.io/test-results/presentation-api/controlling-ua/all.html
https://w3c.github.io/test-results/presentation-api/receiving-ua/all.html
These test results contain idlharness.js tests only at the moment. For increased coverage, @louaybassbouss, @zqzhang, and @yhe39 will be adding basic tests (see the first comment) beyond the existing tests for WebIDL interfaces. Any ambiguities or issues in the spec that may be discovered while writing further test cases should be reported to the group so we can improve the spec accordingly.
I am happy to announce that the first test report of the Presentation API is published. It includes separate test results for Controlling and Receiving UAs. For now it covers only the WebIDL tests. Many thanks to @zqzhang for the support. Also welcome @yhe39 from Intel and @mariuswsl and @taff-franck from Fraunhofer FOKUS, who volunteered to support this activity. @mariuswsl and @taff-franck are working on the first 4 basic tests from the list in the first comment. Other volunteers are welcome to write, review or run tests. We plan to publish a second test report before the next F2F meeting and a third one by the end of June. The second report will cover a subset of the basic tests, and the third report all basic tests and hopefully more browsers. To Presentation API implementers: can you please share information with the group about current status, limitations, setup of presentation devices, etc.? This will help us a lot in writing and running tests.
Thanks for the update @louaybassbouss, and welcome @mariuswsl and @taff-franck to the growing Presentation API testing team.
I just submitted a PR for the second test report. @zqzhang, can you review and merge?
Thanks @louaybassbouss! Ping @Honry who can probably help review. @zqzhang may be unavailable currently.
Done. BTW, I also submitted a new PR with idlharness updates. @Honry, can you also review and merge this?
I would like to help review the PR, but I don't have the access rights to merge it, so we'll still need help from someone else.
Thanks for your help. Let us know when you have reviewed the PR and we'll find someone to merge the PR.
OK, I've added some comments on the PR, @louaybassbouss please help address them.
The pull request for the testing report has been merged. For the tests update pull request, please address the review comment and drop me a note. I can then help merge it.
The new Presentation API test reports are available here:
Thx @zqzhang, @Honry, @yhe39, @mariuswsl and @taff-franck for writing or reviewing the tests.
PS: I will be on vacation until the end of July. Please contact @zqzhang or @anssiko during this time if you want to submit or review tests or if you need any help regarding testing.
Thanks @louaybassbouss, @zqzhang, @Honry, @yhe39, @mariuswsl and @taff-franck!
(The test suite and implementation report are linked from the spec.)
Dependent issue filed: https://github.com/w3c/web-platform-tests/issues/3294
Hello all, I am back from vacation. Before I review all the mails: is there anything urgent to do first regarding testing?
To get a rough estimation of the test coverage of the spec so far, I prepared the following document, which @louaybassbouss and I will try to keep up to date as the group progresses on the test suite: https://tidoust.github.io/presentation-api-testcoverage/
This document is meant to be a (quick-and-dirty) work document: it links sections of the spec to test files that check them, and gives an estimation of each section's coverage, along with comments as to what still needs to be fixed, improved or done (comments are visible when you hover the coverage percentages, or the links to the test files). Tests may check more than one section. Tests that appear with "(PR)" are defined in a pull request and not merged yet.
The coverage estimation in percentage is rough and does not mean much beyond "section not tested", "some missing tests", "we could perhaps do better" and "should be good enough". In other words, it's more a way to quickly assess which sections still need some love; do not read too much into it. Ideally, the test suite will cover all sections. In practice, some tests are arguably more important than others, and some steps that depend on implementations may prove hard to test.
The document is on GitHub. Feel free to suggest updates and/or send pull requests: https://github.com/tidoust/presentation-api-testcoverage/
Thx @tidoust, all Presentation API-related PRs on web-platform-tests are now merged. I will update the test coverage document.
Dependent issue filed: https://github.com/w3c/web-platform-tests/issues/4168
@louaybassbouss @tidoust
Re https://github.com/w3c/presentation-api/issues/266#issuecomment-243449748:
Just checking in on the test plan and coverage status. Is the coverage document [1] up to date, or has work progressed since 11/21?
[1] https://tidoust.github.io/presentation-api-testcoverage/
For Chrome, are you blocked on a functional presentation.receiver API?
Reporting on what I'm aware of...
I think the coverage document is up-to-date, and we are late on enacting the test plan that it contains. Not "very very" late, but late nevertheless.
@tomoyukilabs prepared more complete getAvailability tests at the end of November:
https://github.com/w3c/web-platform-tests/pull/4257
Reviewing these tests triggered the issues I raised against the Presentation API. These tests were put on hold in the meantime (e.g. tests change depending on whether the same Promise is always returned, only if unsettled, or something else). I think we'll be done with getAvailability testing once these tests have been adjusted.
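For reference, the behaviour under discussion can be pinned down deterministically with a mock. The sketch below (`MockPresentationRequest` is invented for illustration, not the real API) implements the "reuse the promise only while unsettled" reading: repeated calls return the same pending promise, and a fresh one is created after it settles.

```javascript
// Mock sketch of one possible getAvailability() contract: cache the
// promise while it is pending, drop the cache once it settles.
class MockPresentationRequest {
  constructor(urls) {
    this.urls = urls;
    this._pending = null;
  }

  getAvailability() {
    if (this._pending === null) {
      this._pending = new Promise((resolve) => {
        // Simulate availability monitoring finishing asynchronously.
        setTimeout(() => {
          this._pending = null;       // settled, so stop reusing it
          resolve({ value: true });   // mock PresentationAvailability
        }, 0);
      });
    }
    return this._pending;
  }
}

const request = new MockPresentationRequest(["https://example.com/r.html"]);
const p1 = request.getAvailability();
const p2 = request.getAvailability();
console.log(p1 === p2); // → true (same promise while unsettled)
p1.then(() => {
  const p3 = request.getAvailability();
  console.log(p3 === p1); // → false (new promise after settling)
});
```

A test against a real implementation would assert one of these identities, which is exactly why the spec ambiguity put the tests on hold.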
Tests on "reconnect", "close" and "terminate" operations on the controlling side could still be improved, but they already check the main steps there, so we're already in good shape.
There are a couple of meta-issues to fix and improve existing tests. @obstschale is working on that.
Further tests (receiver, messaging) are somewhat blocked by the lack of a functional receiver implementation (although there may be one in Fennec now, I haven't checked), but also, and perhaps mainly, by the lack of time available to work on them.
The testing effort could use more resources!
@tidoust I tried to check the Presentation API on the latest version of Nightly for Android (Fennec), and confirmed that it seems to have both controller and receiver implementations of the Presentation API in 1-UA mode (which communicates with Chromecast).
If there isn't any problem, I would like to start implementing test cases which require a receiver implementation.
@tomoyukilabs that'd be great, thanks for helping us with the testing effort!
Pinging @schien to give us a brief update on the Fennec receiver implementation status.
Per the update in Issue #406 Chrome desktop (Windows, Mac, Linux) also has support for the Receiver API in 1-UA mode, currently available in dev channel releases. Target screens supported include Google Cast devices and Hangouts.
We are aware of some failures with existing test cases and are working to address them (don't have bug #s handy but can follow up). Let me know if you have any questions about using it or see anything that looks amiss.
@mfoltzgoogle Thanks a lot for your update! However, I still haven't been able to confirm the Receiver API in 1-UA mode (with arbitrary URLs) with Chrome Dev for Mac 58.0.3004.3; a receiver UA does not seem to have navigator.presentation.receiver. Could you find anything I might be missing?
The receiver property is only exposed in pages started as presentations. I can double check the status when I have access to a Mac.
Yes, I have checked the existence of navigator.presentation.receiver at the receiver with code like:
```html
<!DOCTYPE html>
<meta charset="utf-8">
<script>
window.onload = () => {
  document.body.innerHTML += navigator.presentation.receiver;
};
</script>
```
The result on my Mac and Chromecast was undefined, though.
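When debugging this kind of setup, it can help to report where the property chain first breaks rather than only the leaf value. The snippet below is a generic sketch (`describeChain` and the mock navigator objects are invented for illustration, not part of any API); in a real receiver page one would call it as `describeChain(navigator, ["presentation", "receiver"])`:

```javascript
// Hypothetical helper: walk a property path on an object and report, for
// each step, whether the value is present or undefined.
function describeChain(root, path) {
  let current = root;
  const seen = [];
  for (const key of path) {
    if (current === undefined || current === null) break;
    current = current[key];
    seen.push(`${key}: ${current === undefined ? "undefined" : "present"}`);
  }
  return seen.join(", ");
}

// Simulated receiver-page environment (mock objects, not real Chrome):
const receiverNavigator = { presentation: { receiver: {} } };
// Simulated ordinary tab, where receiver is only exposed to presentations:
const tabNavigator = { presentation: {} };

console.log(describeChain(receiverNavigator, ["presentation", "receiver"]));
// → "presentation: present, receiver: present"
console.log(describeChain(tabNavigator, ["presentation", "receiver"]));
// → "presentation: present, receiver: undefined"
```

The second output matches the symptom reported above: the presentation object exists but receiver is undefined, which is what a page not launched as a presentation would show.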
Okay, I'm able to reproduce this. We're investigating.
Update:
Thanks! I have confirmed that PresentationReceiver in 1-UA mode of Chrome Canary works.
@tidoust @tomoyukilabs Just pinging on the state of test coverage. Does the test coverage document [1] reflect the current status?
[1] https://tidoust.github.io/presentation-api-testcoverage/
@mfoltzgoogle Yes, if I understand correctly.
@mfoltzgoogle Yes, I maintain that document and it should roughly reflect the current status. In particular, thanks to @tomoyukilabs's thorough work on the test suite, it's almost all green now. The remaining work items are a bunch of minor things to fix or improve but the test suite now covers most algorithms.
Actually, with today's updates, I believe we're basically done. What we'll need to do now is:
Sounds good. We'll await the implementation report (or if I get a slice of time, can obviously attempt to run the current test suite myself).
@tidoust Could we help out with the remaining TODOs here?
https://github.com/tidoust/presentation-api-testcoverage/blob/gh-pages/coverage.js
Also it looks like some sections of the spec are missing entries, so they show up as question marks in the coverage document.
@mfoltzgoogle The remaining TODOs are around points that I'm not sure we can test in the end. Typically, I do not see any good reproducible way to force an error when the connection gets established. And I'm not sure we can include tests in our test suite that require more than one secondary display available. I turned the "TODOs" into "If possible" suggestions for now.
The question marks were for sections for which I did not really know how to report coverage because the concepts they define can only be indirectly tested. I added links to these indirect tests and report the coverage for these sections as "N/A".
Thanks for the clarification. I think the following two should be possible; it just requires there to be multiple presentation displays (possibly supporting different presentation URLs). I can send a PR if I can find a suitable solution.
```js
comments: [
  'If possible: test with multiple availability URLs in the set of availability objects',
  'If possible: test with multiple presentation displays (test user may not have multiple displays at hand though)'
]
```
I have a WIP to add a test for availability with multiple displays, but it needs a bit more work before I can send a PR.
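The rule under test in the multi-display case is that availability should be true as soon as at least one connected display is compatible with at least one of the request's URLs. A self-contained sketch of that rule (the display objects and the `isCompatible` predicate are invented stand-ins for the UA's real compatibility logic):

```javascript
// Invented compatibility check: a display "supports" a URL if the URL's
// scheme is in the display's supported list. Real UAs do far more.
function isCompatible(display, url) {
  return display.supportedSchemes.some((s) => url.startsWith(s + ":"));
}

// Availability is true if any display is compatible with any URL.
function computeAvailability(displays, urls) {
  return displays.some((d) => urls.some((u) => isCompatible(d, u)));
}

const displays = [
  { name: "Chromecast Ultra", supportedSchemes: ["https", "cast"] },
  { name: "HbbTV screen", supportedSchemes: ["https"] },
];

console.log(computeAvailability(displays, ["https://example.com/r.html"])); // → true
console.log(computeAvailability(displays, ["rtsp://example.com/stream"]));  // → false
```

A test with multiple (mock or real) displays would then check that availability flips as displays supporting different URLs come and go.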
Re: https://github.com/w3c/presentation-api/issues/266#issuecomment-300191905
@tidoust
Should we expect an implementation report soon?
My understanding is that @louaybassbouss is working on it. @louaybassbouss, any timeline?
Is there an issue filed to split the tests up? Or does that just mean splitting the results in the implementation report (versus splitting the test cases)?
I did not create a separate issue for that for now, because I do not know whether that will be needed at all.
It all depends on what the implementation report will reveal. The situation we should avoid is a report with lots of failures triggered by the same bug, that would not convey the fact that the implementation actually supports the feature under test. If the updated report looks bad, we should try to make the tests more atomic. If the updated report looks good, we don't need to split the tests up.
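The trade-off behind splitting tests up can be shown with a toy runner (not testharness.js; `runTests` and the sample tests are invented for illustration): in a monolithic test, a single bug aborts every later check, while atomic tests report each behaviour independently.

```javascript
// Toy test runner: run each named test, record PASS or FAIL.
function runTests(tests) {
  const results = {};
  for (const [name, fn] of Object.entries(tests)) {
    try { fn(); results[name] = "PASS"; }
    catch (e) { results[name] = "FAIL"; }
  }
  return results;
}

const buggyFeature = false; // pretend exactly one feature is broken

// Monolithic: the first failing assertion aborts the rest of the test,
// so the report says nothing about the checks that never ran.
const monolithic = runTests({
  everything: () => {
    if (!buggyFeature) throw new Error("feature broken");
    // further checks never run
  },
});

// Atomic: each behaviour is its own test, so only the broken one fails.
const atomic = runTests({
  brokenFeature: () => { if (!buggyFeature) throw new Error("broken"); },
  unrelatedFeature: () => { /* passes */ },
});

console.log(monolithic, atomic);
// → { everything: 'FAIL' } { brokenFeature: 'FAIL', unrelatedFeature: 'PASS' }
```

This is exactly the "lots of failures triggered by the same bug" concern in reverse: atomic tests make one bug show up as one failure, so the report reflects what the implementation actually supports.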
@tidoust Yes, we are working on this. There was an issue in the Test Runner that blocked us; @aleygues already reported it. After it was fixed, we were able to create the test report for Chrome Desktop, but for Chrome Android the Screen Selection Dialog was not displayed in any of the tests. @mfoltzgoogle, any idea? We use a Chromecast Ultra as the receiver.
@louaybassbouss Is there anything which I can help you with?
@louaybassbouss I'll take a look and see if I can repro. CC @avayvod
@mfoltzgoogle We were able to run the tests on Chrome Canary for Android; thx @avayvod for fixing it. We now have the reports for Chrome Canary for Android, Chrome Desktop and Firefox for Android. We then recognised that the results of the IDL tests are missing from the report due to an error in the idlharness lib, which seems to be fixed in the latest version. @aleygues wants to run the tests again so that the IDL results are also in the report, but the Test Runner is not working. This has happened many times in the last weeks. I hope the bug in the Test Runner will be fixed soon so that we can run the tests with the IDL harness. Otherwise I can publish the report with the current results and we can update it when we have the new report. @tidoust, what do you think?
When doing a larger change spanning across components it is often helpful to have a single issue for tracking the bigger change. This is such a "meta issue" for tracking the development of the test suite for the Presentation API.
Resources:
Tasks and related sub-issues:
- Establish the test structure in web-platform-tests/presentation-api (PR: w3c/web-platform-tests/pull/2432)
- Split the interface Presentation IDL into partials: #230
- Update web-platform-tests/presentation-api to match the ED spec (idl-controlling-ua and/or idl-receiving-ua): #268

All: @louaybassbouss is your test facilitator for the Presentation API spec and will be responsible for coordinating the test suite development. Please work with @louaybassbouss by contributing new test cases and reviewing existing tests. See the Work Mode > Testing wiki for more information on how to set up your own test environment.
Happy testing!