A splendid time was had by all! On Wednesday, December 12th we gathered at Google in Mountain View, California to work on Web Platform Docs. We had over 35 attendees who worked on the site for a whole day. Google provided snacks, beverages, lunch, and a wine/beer reception afterward with live music. There were, of course, the usual t-shirts and other cool swag – see below.
Premium Web Platform Doc Sprint Swag
Many of the attendees were new to Web Platform Docs – we registered 15 new users. So we spent some time early on walking them through the site and the Getting Started pages. Once they got going, they caught on pretty quickly. Having a group of folks doing a small task across a broad area really makes a big improvement. This is the power of a doc sprint, turning what would be a tedious and daunting chore into something doable; having company makes it fun and many hands make light work.
Matthew and Tony
We also had several people who just knew how to pick up a shovel and start digging. Many of these folks follow this forum, the e-mail list, the IRC channel, and other forums; they and you are lending your expertise where it really matters. One area of expertise you may have that we can really use is in the domain of user testing, heuristics, and user experience design. These doc sprints provide a perfect laboratory – albeit without the one-way glass. If you’ve been behind the glass, you probably would prefer the chummy doc sprint to that sterile, observer-effect-infected environment. The data is bound to be better, too.
As it is, we have only a limited amount of data from this doc sprint, and none of it scientifically sanitized. Here’s what we got:
- 375 (approx.)
- New WPD members onboarded: 15
- Pictures: see the G+ event.
Dilip, Dickson, Scott Eliot, and Ming Ming
- Renato developed a Web Platform Search Companion search extension for Chrome. Install this extension from the Chrome Web Store. Just type “wpd” plus a space in your Chrome omnibox (that box where you type URLs) and the extension will be activated. Then type whatever you like – a CSS property, for example – and you will get direct URLs to the corresponding webplatform.org pages.
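For the curious, Chrome’s omnibox API makes this kind of keyword extension pretty approachable. Here’s a minimal sketch of how such an extension *could* be wired up – the `wpd` keyword comes from the description above, but the search URL pattern and handler logic are assumptions, not taken from Renato’s actual extension:

```javascript
// background.js – minimal sketch of an omnibox keyword extension.
// Assumes manifest.json declares:  "omnibox": { "keyword": "wpd" }
// (so typing "wpd" + space in the omnibox activates the extension).

// Build a search URL for whatever the user typed after "wpd ".
// The URL pattern here is a guess at a plausible wiki search endpoint,
// not the one the real extension uses.
function wpdSearchUrl(query) {
  return "http://docs.webplatform.org/w/index.php?search=" +
         encodeURIComponent(query.trim());
}

// In the browser, open the search page when the user presses Enter.
// The guard lets this file load outside an extension context, too.
if (typeof chrome !== "undefined" && chrome.omnibox) {
  chrome.omnibox.onInputEntered.addListener(function (text) {
    chrome.tabs.update({ url: wpdSearchUrl(text) });
  });
}
```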
- Dan implemented a MediaWiki search extension, helping to resolve bug 19401. This, too, provides pop-up results, but in the Search field on the wiki page.
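MediaWiki actually ships a standard `opensearch` API action that returns title suggestions as JSON, which is the usual building block for this kind of pop-up search. A sketch of the request a client might issue as the user types – the endpoint path on docs.webplatform.org is an assumption, and this is not necessarily how Dan’s extension is implemented:

```javascript
// Build a MediaWiki opensearch (typeahead suggestion) URL.
// action=opensearch is a standard MediaWiki API action; the host and
// /w/api.php path are assumptions about the wiki's configuration.
function suggestUrl(term, limit) {
  limit = limit || 10; // default number of suggestions
  return "http://docs.webplatform.org/w/api.php" +
         "?action=opensearch&format=json" +
         "&limit=" + limit +
         "&search=" + encodeURIComponent(term);
}

// A client would fetch this URL on each keystroke and render the
// returned titles as a pop-up list under the Search field.
```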
- Tony created a Quick Start guide using a tool he developed that builds MediaWiki-formatted pages from HTML. The tool can be used to aggregate several HTML pages and import them to the wiki.
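The core of an HTML-to-MediaWiki converter like the one described above is a set of tag-to-markup mappings. This is a hypothetical sketch of that idea – the tag rules and function name here are illustrative, not documentation of Tony’s actual tool, which may handle far more:

```javascript
// Hypothetical sketch: convert a few common HTML tags to MediaWiki
// markup with simple regex substitutions. A real converter would need a
// proper HTML parser; this just shows the shape of the mapping.
function htmlToWiki(html) {
  return html
    .replace(/<h2>(.*?)<\/h2>/g, "== $1 ==")          // section headings
    .replace(/<h3>(.*?)<\/h3>/g, "=== $1 ===")        // subsections
    .replace(/<(?:b|strong)>(.*?)<\/(?:b|strong)>/g, "'''$1'''") // bold
    .replace(/<(?:i|em)>(.*?)<\/(?:i|em)>/g, "''$1''")           // italic
    .replace(/<li>(.*?)<\/li>/g, "* $1")              // list items
    .replace(/<\/?(?:ul|p)>/g, "")                    // drop wrappers
    .trim();
}
```

Run over each input page, the output could then be pasted (or pushed via the MediaWiki API) into the wiki.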
- Luz contributed several WebGL tutorials with the help of Noel.
- Jonathan contributed an HTML Lists tutorial.
- A dedicated group including Ruby, Dilip, Aysegul, Andrea, Mark, Kathy, Tim, Parker, Eliot, Tommi, Suman, and lots of other people whose names can’t be divined from their user IDs added summaries and otherwise fixed up articles in the css/selectors, css/properties, html/attributes, and html/elements spaces. This was a TON of great work!
Lacking any methodology whatsoever, and while at the same time juggling three questions at once – I know, you’ve been there, too – I gathered the following impressions from the session.
- Most of the attendees were familiar with web development concepts and technologies
- Some attendees were exploring web development for career opportunities
- Some attendees were attending the doc sprint looking for business opportunities
- Some attendees were here to help build the barn – pure generosity (yes, Virginia, there is a Santa Claus)
- Many people had trouble reading the Getting Started documentation
- Most people were forgiving of the site’s usability shortcomings (like the infamous session ID loss).
And so on. I have more, and I’m sure you do, too. The point is, we need to set out to prove or disprove some of this stuff and develop some metrics around our community: how much they know, why they’re in this, how well our documentation reads and works, and how well our user interface performs.
Part of our doc sprint methodology should include appointing someone (or several someones) to gather user feedback – going around asking pointed questions and challenging participants to solve specific problems, just like one of those highly paid consultants sitting with the (bribed) user-test subject in the room with the big mirror that everyone pretends isn’t there. This person should be dedicated to that task – gathering user feedback only – and to reporting the findings: not like this blog post, but much better. We could develop a standard questionnaire, assign points to ranges, the whole nine yards. The goal is to figure out how well our site works for contributors. We don’t need to be too data-centric to accomplish that, but if we could chart our progress against changes, that would be a bonus.
With each doc sprint we do, we’re getting better at running these, and @peterlubbers is developing a “Doc Sprint in a Box” that captures some best practices and provides tools to make it easier for any of our members to start a doc sprint. We welcome any pointers from attendees and others running doc sprints as well. We need to keep having an active conversation about how to best use our doc sprints to develop the site and its content.
Thanks to everyone for their dedication and contributions!