ascott1 opened this issue 8 years ago
Perhaps the delivery team could help on the performance measurement side... maybe integrating webpagetest into our workflows, or something similar?
> Perhaps the delivery team could help on the performance measurement side... maybe integrating webpagetest into our workflows, or something similar?
Yes, for sure! @KimberlyMunoz has also mentioned the performance reporting that New Relic can provide.
> Let's also consider adopting a performance budget and integrating tests against that budget into Travis/Jenkins and the Software Advisory Group review.
I did a little bit of work on this with Chris and Irina earlier this fall, before all the atomic design stuff was ready. I had trouble getting webpagetest to test local/non-public pages. I looked into using sitespeed.io, but setting that up in Jenkins seemed to be more trouble than it was worth.
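For the CI piece, here's a rough sketch of the kind of budget check I have in mind, assuming the `webpagetest` npm package and a WebPageTest API key (the budget number is a placeholder, not an agreed target):

```js
// Sketch: fail a Travis/Jenkins build when the median Speed Index
// exceeds a budget. Assumes the `webpagetest` npm package and an
// API key in the environment; the budget value is illustrative.
var WebPageTest = require('webpagetest');

var BUDGET_SPEED_INDEX = 3000; // placeholder budget, not an agreed number
var wpt = new WebPageTest('www.webpagetest.org', process.env.WPT_API_KEY);

wpt.runTest('https://www.consumerfinance.gov/', { pollResults: 5 }, function (err, data) {
  if (err) { throw err; }
  var speedIndex = data.data.median.firstView.SpeedIndex;
  console.log('Median Speed Index:', speedIndex);
  process.exit(speedIndex > BUDGET_SPEED_INDEX ? 1 : 0);
});
```

This wouldn't solve the local/non-public page problem on its own, since the public WebPageTest instance can only reach public URLs, but a private WebPageTest instance might.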
> I'd also like to collect statistics on the current landscape of our testing efforts
I think this would be awesome. Most repos already have Coveralls tracking this, and we can start tracking that here: https://coveralls.io/github/cfpb. However, some repos, like cfgov-refresh, are having trouble tracking history. We think it has something to do with there being no "master" branch.
> Most repos already have Coveralls tracking this, and we can start tracking that here: https://coveralls.io/github/cfpb
Yes, good idea @KimberlyMunoz!
Great goals, and thanks for including mission focus, which is easy to lose sight of in development.
I spent some time talking to @virginiacc and she mentioned something that I thought was really interesting and worth adding. I offered to try to summarize, but @virginiacc please chime in with anything I miss:
We should aim to develop a default way of writing JavaScript at the Bureau and document it as part of the front-end standards. "Default" is the key word here: it wouldn't be the only way we write JavaScript, but a sensible approach that applies to most of the projects we do. The default could also differ between external and internal projects, which tend to require different approaches.
We may also, or alternatively, want to develop a decision matrix for choosing a JS approach for a project. 18F has some great guidelines on choosing a JavaScript framework. In doing so, they're not limiting their developers, but instead saying "here are a few choices that we support."
I see this falling both under the speed and code quality categories.
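To make "default" a little more concrete, here's a hedged sketch of the kind of pattern such a standard might document: a small CommonJS module that progressively enhances a bit of markup. The selector and class names are illustrative only, not an existing convention:

```js
// Sketch of a hypothetical "default" module: CommonJS, ES5, and
// behavior layered on top of working markup. All names here are
// illustrative, not part of any agreed standard.
'use strict';

function init() {
  var expandables = document.querySelectorAll('.expandable');
  Array.prototype.forEach.call(expandables, function (el) {
    el.addEventListener('click', function () {
      el.classList.toggle('expandable__expanded');
    });
  });
}

module.exports = { init: init };
```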
I don't have much to add to this discussion, other than my broad agreement with everything that's been said. Great work, @ascott1!
When I spoke with @virtix the other day, he asked me to write up some overall goals for the FEWD team. Rather than keep these thoughts to myself, I thought I'd post them here for everyone to see. I'm certain this is an imperfect list, so please feel free to pick it apart. What's missing? What did I get wrong? Is this too generic?
I'm lumping these goals into three overall themes: speed, accessibility, and quality. Most importantly, I want to emphasize that we are a mission-driven organization and that all of these goals are aimed at supporting that mission:
To be successful at the above, we need to create things that are widely accessible and highly performant, that work on a range of devices, and that serve the highest number of people possible.
Speed
Make it easier to get things onto the web
Right now the process of getting from idea to website is an arduous one. With the launch of V1 and the adoption of Wagtail, that should become simpler. In the interim, we need to define a process for getting new content onto cf.gov. Let's work with the back-end team to define that process and to create a plan for integrating it into V1 post-launch.
Improve the process of working with CF
We should continue our work on Capital Framework with the goal of making it both easier to use and easier to contribute to. This work has begun and has already made great progress.
The two initial outcomes will follow from those goals: a framework that is easier to use, and one that is easier to contribute to.
The end goal is that working with and contributing to Capital Framework will become an integral part of all front-end workflows rather than a separate task.
Focus on web performance
As a team we follow many performance best practices, but we haven't discussed performance holistically. I think it'd be worthwhile to document our performance best practices and expectations. Let's also consider adopting a performance budget and integrating tests against that budget into Travis/Jenkins and the Software Advisory Group review.
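As a conversation starter, a checked-in budget file could look something like this sketch (every number is a placeholder for discussion, not an agreed target):

```js
// Sketch of a hypothetical performance-budget.js that CI could read.
// All values are placeholders for discussion, not agreed targets.
module.exports = {
  maxRequests: 60,              // total HTTP requests per page
  maxPageWeight: 1024 * 1024,   // total bytes over the wire (1 MB)
  maxSpeedIndex: 3000,          // WebPageTest Speed Index
  maxTimeToInteractive: 5000    // milliseconds
};
```

Travis/Jenkins could then fail a build whenever a measured page exceeds any of these ceilings, and the Software Advisory Group review could check against the same file.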
Accessibility
Become device agnostic / improve our mobile-first approach
Up until now, we've used browser statistics to help guide which browsers we support. That's provided us with a great baseline, but I'd like to go beyond it and focus on building device-agnostic experiences that remain usable for all users regardless of their device. This is related to the following goal and will require coordination with our design and back-end dev teams.
Progressively enhance our sites (particularly cf.gov)
I think this also supports the above goal of device agnosticism. I love JavaScript. I mean, really love it, but I also think we should continue to double down on progressive enhancement. I've been guilty of building things for cf.gov that would be non-functional without JavaScript, often because I'm handed an API and asked to build something. In our current landscape, I'm not sure what the best way forward is. Since the back-end team is doubling down on Django, it may be that FEWDs should learn enough Django to do basic routing and form submissions. Let's make a goal of figuring this out, and at the very least let's make a conscious decision whenever we build something that depends on JavaScript.
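To illustrate the enhance-don't-require approach: a plain HTML form posts to a Django view and works with JavaScript disabled, and when JavaScript is available we intercept the submit for a smoother experience. This is a minimal sketch; the selector and success message are made up:

```js
// Sketch: progressive enhancement of a form that already works
// without JavaScript (it posts to a Django view). The #subscribe-form
// id and the success markup are illustrative only.
'use strict';

var form = document.querySelector('#subscribe-form');

if (form && window.FormData) {
  form.addEventListener('submit', function (event) {
    event.preventDefault();
    var xhr = new XMLHttpRequest();
    xhr.open('POST', form.action);
    xhr.onload = function () {
      form.innerHTML = '<p>Thanks! We got your submission.</p>';
    };
    xhr.send(new FormData(form));
  });
}
```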
Build accessibility into SAG
Let's ensure that our sites hit everything on this checklist and make that a requirement of our Software Advisory Group process. Let's also continue integrating automated accessibility checks into our workflow.
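On the automated side, a check along these lines could run inside our functional tests. This sketch assumes axe-core has been loaded on the page under test; that tooling choice is an assumption for illustration, not something we've settled on:

```js
// Sketch: fail a functional test when axe-core reports violations.
// Assumes axe-core is already loaded on the page under test.
axe.run(document, {}, function (err, results) {
  if (err) { throw err; }
  if (results.violations.length > 0) {
    results.violations.forEach(function (violation) {
      console.error(violation.id + ': ' + violation.help);
    });
    throw new Error(results.violations.length + ' accessibility violations found');
  }
  console.log('No accessibility violations detected.');
});
```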
Quality
Continue to increase code coverage
Over the last year, we have gone from an organization that sometimes writes front-end tests to one where writing them is expected. Let's continue that by writing both functional and unit tests. I'd also like to collect statistics on the current landscape of our testing efforts (% of projects covered and code coverage reports for those projects) and set goals for improvement.
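As a rough starting point for gathering those statistics, something like the sketch below could walk a list of repos and pull their coverage numbers. The repo list is an arbitrary example, and the `.json` endpoint shape is my assumption about Coveralls' API that would need verifying:

```js
// Sketch: print coverage percentages for a few repos. The repo list
// is an example, and the Coveralls `.json` endpoint shape is an
// unverified assumption.
var https = require('https');

var repos = ['cfgov-refresh', 'capital-framework'];

repos.forEach(function (repo) {
  https.get('https://coveralls.io/github/cfpb/' + repo + '.json', function (res) {
    var body = '';
    res.on('data', function (chunk) { body += chunk; });
    res.on('end', function () {
      var report = JSON.parse(body);
      console.log(repo + ': ' + report.covered_percent + '% covered');
    });
  });
});
```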
Focus on code quality
Continue to improve the review process of the Software Advisory Group and focus on developer education through both in-house and external training opportunities.
Other things to think about
Atomic design - how do we best leverage the work that the V1 team is doing with atomic design?