hochhaus opened 8 years ago
Making Closure Rules support Polymer is on the agenda. A webfiles rule is available at HEAD that was introduced by https://github.com/bazelbuild/rules_closure/commit/5a8d6dffb29346191631c6afee9785c8d826a7ce twelve days ago. Check out https://github.com/jart/tensorflow/commit/46574c979b986b5fbe7259cb0e0f2790ac422ccd for a sneak preview of how it's going to be used. Please be advised that I do not recommend webfiles for production quite yet (or Closure Rules at HEAD, for that matter). But it will be ready in the next few months. My primary assignment at Google right now is to bring order to TensorBoard builds, and TensorBoard is written in Polymer.
Polymer v2 is a big improvement over v1. I like that they're using real ES6 classes now, instead of that weird Polymer({...}) DSL that caused us to need --polymer_pass. I think there are nice things about Polymer. Once understood, it allows the developer to create contemporary-looking UIs with very little effort. It also abstracts many patterns which are common in business applications.
But Polymer makes a lot of tradeoffs. It radically departs from how web development has been done for decades. It turns many Google best practices on their head, e.g. not writing code in HTML files, not copying vendor components into your codebase, not launching an autonomous open source project for a three-line HTML file, etc. It requires you to use tools like vulcanize, which have thousands of transitive NPM dependencies written by random people. It's difficult and non-obvious how to augment the behaviors Polymer abstracts. It's very heavyweight and natively supported only by Chrome. It makes tooling such as the Chrome debugger less useful. If one has greater dev resources, and the developers in question are highly experienced, then there is simply no comparison between Polymer and the Closure stack in terms of performance, safety, and maintainability.
That said, the build work we're doing is going to make Polymer much better and help it deliver on its promises of componentization in a remarkable way. We're planning to rewrite tools like vulcanize from scratch in Java so we don't need node and can leverage the Closure JavaScript and Stylesheets compilers for compile-time safety and optimization. It's going to be great. The build graph is going to be intelligent so JS can be written in its own files or in HTML. But it's not going to be easy.
Thanks for sharing all the context and your experience! Your plans sound amazing. The java rewrite of vulcanize is much better than what I was thinking. Avoiding the npm dependency mess would be outstanding. :)
Thanks for all your work on this repository! This project keeps becoming more and more impressive. Please let me know if I can do anything to help.
One other question. As I've started playing with web components, I'm a big fan of shadow DOM and custom elements. However, I have a strong distaste for HTML templates. They seem like such a big step backwards from idom Soy templates. Therefore, for components that I write myself, I was planning on skipping HTML templates altogether and using shadow DOM + custom elements + idom Soy. Based on your experience, does this seem like a good or bad idea?
I think you have good judgement, and nothing about what you said sounds like a bad idea. Soy is pretty good, and it's the one place where Google has put the most thought into preventing things like XSS attacks. It's not worth abandoning if you understand how to use it.
Also, the Gerrit code review tool, as part of the PolyGerrit UI rewrite, has started adding Bazel support for Polymer build primitives (crisper and vulcanize).
Closure Compiler was integrated into the Gerrit code review Bazel build toolchain: [1]. However, only whitespace mode currently works, because of https://github.com/google/closure-compiler/issues/2042.
It turns out that with the whitespace optimization level, es5_runtime isn't injected by default. Addressed here: [1].
@jart Could you provide a status update on Polymer and rules_closure? The points you previously mentioned that are most interesting from my perspective are:
Any other points you are willing to share are appreciated as well. Lastly, do you have pointers to where portions of this code lives in open source repos (eg: tensorboard)?
I wasn't able to attract the support within Google to generalize this as much as I was hoping.
However, you can get pretty reasonable Polymer + TypeScript + Closure + Protobuf + Bazel support out of the box by depending on TensorBoard's build system and using tf_web_library.
This is an example of how to use TensorBoard's build as a dependency: https://github.com/tensorflow/tensorboard-plugin-example Please note that repo is pinned a few months behind. I'd greatly appreciate a PR that bumps it up to TensorBoard HEAD.
Here are the main pieces of code to focus on, which explain what I mean:
Your WORKSPACE file would look like that.
Your BUILD files would look like that.
You use tensorboard_html_binary to vulcanize.
You use tensorboard_zip_file to package / distribute / deploy your web server files.
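The pieces above can be sketched as a single hypothetical BUILD file. The load paths, workspace names, and attribute names below are assumptions modeled loosely on the tensorboard-plugin-example repo's general shape, not verified signatures; check them against that repo before use:

```python
# Hypothetical BUILD sketch -- load paths, workspace names, and attributes
# are illustrative assumptions; verify against tensorboard-plugin-example.
load("@org_tensorflow_tensorboard//tensorboard/defs:web.bzl", "tf_web_library")
load("@org_tensorflow_tensorboard//tensorboard/defs:vulcanize.bzl", "tensorboard_html_binary")
load("@org_tensorflow_tensorboard//tensorboard/defs:zipper.bzl", "tensorboard_zip_file")

# Declare your web sources as a web library served under a path.
tf_web_library(
    name = "my_plugin",
    srcs = ["my-plugin.html", "my-plugin.ts"],
    path = "/my-plugin",
    deps = ["@org_polymer"],  # assumed workspace name for Polymer
)

# Vulcanize everything reachable from the entry point into one HTML file.
tensorboard_html_binary(
    name = "index",
    input_path = "/my-plugin/my-plugin.html",
    output_path = "/index.html",
    deps = [":my_plugin"],
)

# Package the resulting web server files for distribution/deployment.
tensorboard_zip_file(
    name = "dist",
    deps = [":index"],
)
```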
We already arduously defined, within TensorBoard's build system, probably at least half of the WORKSPACE definitions you're going to need, e.g. Polymer, Node, common libraries, etc. They're all defined using highly available, geographically redundant mirrors, out of the box.
The Skylark code that makes these things possible is available under tensorboard/defs/.
To give another example, Facets is another Google Research project that uses Polymer. It was able to be open sourced rather quickly by depending on TensorBoard's build system. (Note: TensorBoard's build depends on rules_closure. There's no competition going on here.) They were even kind enough to give a shout-out in their blog announcement.
I'll also note that everything that's been baked into these builds has received some vetting from myself and possibly also security folks. Please note that this can't be interpreted as a promise. Please see the Apache 2.0 license to learn more. Also, motivational reading: https://hackernoon.com/im-harvesting-credit-card-numbers-and-passwords-from-your-site-here-s-how-9a8cb347c5b5
@jart Do you think Gerrit code review could use rules_closure with the TensorBoard rules to use the advanced optimization level on Polymer 1? We are still using WHITESPACE_ONLY mode, unfortunately.
https://github.com/GerritCodeReview/gerrit/blob/master/polygerrit-ui/app/rules.bzl#L17
Right now TensorBoard skips Closure Compiler entirely and expresses its JS dependency relationships as <script> tags in HTML imports, specifically because Polymer 1.0 is the way it is. A friend of mine on the Polymer team informed me a year ago that they do consider Closure Compiler support in Polymer 2 a priority. I haven't yet had time to delve into Polymer 2, so I'm unable to comment further.
One thing I will note is that, due to the flexibility of Bazel, you can write your Closure-heavy code using closure_js_library and closure_js_binary, then put the final resulting .min.js file in the srcs of your tf_web_library. But if you want to run Polymer({...}) class definitions through Closure, be prepared for pain.
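A minimal sketch of that arrangement follows. The rules_closure load path is the standard one; the tf_web_library load path and all target names are illustrative assumptions:

```python
# Sketch: compile Closure-heavy code separately with rules_closure, then
# feed the compiled output to tf_web_library as an ordinary source file.
# The tf_web_library load path below is an assumption; names are illustrative.
load("@io_bazel_rules_closure//closure:defs.bzl", "closure_js_binary", "closure_js_library")
load("@org_tensorflow_tensorboard//tensorboard/defs:web.bzl", "tf_web_library")

closure_js_library(
    name = "app_lib",
    srcs = ["app.js"],
)

closure_js_binary(
    name = "app",  # emits the compiled/minified JS
    deps = [":app_lib"],
)

tf_web_library(
    name = "webapp",
    srcs = [":app"],  # the .min.js is just another static file here
    path = "/webapp",
)
```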
However, please note, the TensorBoard toolchain does support using Closure Compiler in "simple" mode with Polymer and it's baked into the Vulcanization process, and it does indeed work on Polymer 1. It's just that "simple" mode doesn't give us the full benefits of the Closure Compiler, and it takes a while to compile. TensorBoard currently doesn't require that level of minimization, which is the only reason why we skip it right now.
Further note: you say tensorboard_html_binary(compiler = 1) to get it to run Closure. https://github.com/tensorflow/tensorboard/blob/master/tensorboard/defs/vulcanize.bzl#L110 Please note there's a lot of hard-coding in that Java file. It's pretty generalized, but not 100%.
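For illustration, a hedged sketch of how that flag might look in a BUILD file; the compiler attribute comes from the quote above, while every other attribute and name here is an assumed placeholder:

```python
# Sketch only: "compiler = 1" is from the discussion above; the remaining
# attributes and labels are hypothetical placeholders.
tensorboard_html_binary(
    name = "index",
    compiler = 1,  # opt in to Closure Compiler ("simple" mode) during vulcanization
    input_path = "/my-plugin/my-plugin.html",
    output_path = "/index.html",
    deps = [":my_plugin"],
)
```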
Oh and just one more thing: I recommend avoiding ES6 imports. I can't even begin to describe how badly we were burned by them. There doesn't appear to be any consensus on what they mean. Node defines dependencies quadratically. Chrome sorta supports them but not really.
If you use ES6 imports then (to the best of my knowledge) you've just guaranteed your codebase is incompatible with all web browsers and cannot be used, period, without either waiting 45 seconds for the Closure Compiler or relying on a tool like rollup. Friends don't let friends use ES6 imports. They're not ready for prime time.
Please keep in mind, the Closure Compiler is so slow because it was designed to create production release builds for google.com, not for developer iteration. The TensorBoard build system lets you do fast raw-sources iteration if you bazel run //some/tf_web_library:target. But that's not possible with ES6 imports. goog.require and possibly goog.module() should work out of the box, maybe with some finagling. Plain old <script> tag dependency expression is 100% guaranteed to work.
As of Chrome M70, HTML imports are deprecated in favor of ES modules, and are slated to be removed in M73, around April 2019:
https://www.chromestatus.com/features/5144752345317376
Does this influence our plans? (cc @nfelt)
With the pending release of Polymer 2.0 (built upon Shadow DOM v1 & Custom Elements v1), the Polymer project is looking interesting. The Closure Tools blog has a series of posts (1, 2, 3, 4) about using Closure and Polymer together. The Polymer ecosystem currently integrates with several other build systems; however, ideally first-class Bazel support would be provided in a central location. If I had to guess, Polymer, Closure, and Bazel all play very well together inside Google. Possibly that code can't be easily open sourced. I thought I would mention it, though, before multiple different implementations of partially baked Bazel support rules for Polymer start popping up.
@jart What are your thoughts about having rules_closure natively support Polymer? Is it out of scope?