A month ago in a tweet related to my post about bringing people back to the open web, I casually proposed a resource that would score tools, services and other websites on their commitment to being a part of the open web. I'm back to flesh that idea out a little more.

[Image: crude mockup of a score badge]

I'm imagining a simple site that displays a score or grade for each major user-facing tool or service on the web.

The score would help users of the site know at a glance what to expect from the service when it comes to the practices and mechanics of maintaining openness on the web. A badge with the score on it could be voluntarily displayed by the sites themselves, or the score could be incorporated into a browser extension or similar tool that surfaces the information as users explore the web.

If a site has a high score, users could confidently invest time and energy in it, knowing that they'd benefit from clear ownership of their data, easy interoperability with other tools, and no proprietary lock-in. If a site has a low score, users would know that they are entering a walled garden where their data, and access to it, are the product.

The score or grade would be based on some easily digestible criteria. In my initial proposal these would look at the robustness of the site's API offering, the availability of standard feed options, the usefulness of export tools, the focus on user empowerment, and the level of transparency about how the service works and makes use of user data:

Robust API

  • High: The resource offers a robust, well-documented API that allows anyone to interact with all of the core data and transactions managed by the resource. The API is a high-performing, fully supported, well-maintained part of the overall offering. Developer-friendly tools such as an API testing console, example client libraries and example queries and responses are included (a rough sketch of such a call follows this list).
  • Medium: The resource offers an API that allows anyone to interact with most of the data and transactions managed by the resource. Basic access to the API may incur additional fees or be a part of a premium offering. Support and documentation may be limited.
  • Low: The resource does not offer an API, or the API is so limited as to not be practical for common use cases.
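To make the "High" criteria concrete, here is a minimal sketch of the kind of programmatic access they describe: a user fetching their own posts from a hypothetical service. The base URL, endpoint, parameters and response shape are all assumptions for illustration; the point is the pattern of documented, authenticated access to a user's core data.

    # Minimal sketch of documented, authenticated access to a user's core data.
    # The endpoint, parameters and response shape are hypothetical.
    import requests

    API_BASE = "https://api.example-service.com/v1"  # hypothetical service
    TOKEN = "your-api-token"                         # issued to the user

    def fetch_my_posts(page=1):
        """Fetch one page of the authenticated user's posts."""
        resp = requests.get(
            f"{API_BASE}/me/posts",
            headers={"Authorization": f"Bearer {TOKEN}"},
            params={"page": page, "per_page": 50},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()

    for post in fetch_my_posts().get("posts", []):
        print(post.get("published"), post.get("title"))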

Standard Feed Options

  • High: The resource offers one or more standard, structured feeds of a user’s core data and transactions managed by the resource. RSS, JSON and XML are common examples. The feeds are high-performing, updated in close to real time, and a fully supported, well-maintained part of the overall offering. An effort has been made to ensure high visibility and easy integration with tools that consume those feeds (see the consumption sketch after this list).
  • Medium: The resource offers a standard, structured feed that gives the user partial access to their data and transactions managed by the resource. Feed access may incur additional fees or be a part of a premium offering. Support and documentation may be limited. Feeds may not be highly visible, may not fully follow standards, and may experience delays in updating.
  • Low: The resource does not offer standard feeds or the information they contain is so limited as to not be practical for common use cases.
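As an illustration of what "easy integration with tools that consume those feeds" means in practice, here is a minimal sketch that reads a hypothetical RSS or Atom feed with the feedparser library. The feed URL is an assumption; any standards-following feed would work the same way.

    # Minimal sketch of consuming a standard feed. The URL is hypothetical;
    # feedparser handles RSS and Atom, and a JSON feed could be read with
    # the json module just as easily.
    import feedparser

    FEED_URL = "https://example-service.com/users/chris/feed.xml"  # hypothetical

    feed = feedparser.parse(FEED_URL)
    print(feed.feed.get("title", "(untitled feed)"))
    for entry in feed.entries[:10]:
        # Standard fields mean any reader or script can reuse this data.
        print(entry.get("published", "?"), "-", entry.get("title", "(no title)"))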

Exports

  • High: The resource offers comprehensive, structured exports of a user’s core data and transaction history managed by the resource. The export is available in a standard format that can easily and programmatically be converted to another format (see the conversion sketch after this list). The export tools are a fully supported, well-maintained part of the overall offering. An effort has been made to ensure high visibility, along with documentation about how to use the exported data in other tools.
  • Medium: The resource offers export options that give the user partial access to their data and transactions managed by the resource. Export access may incur additional fees or be a part of a premium offering. Support and documentation may be limited. Export tools may not be highly visible, may be difficult to use, or may require some additional effort by the user to apply elsewhere.
  • Low: The resource does not offer export tools, the information they contain is so limited as to not be practical for common use cases, or the format they are offered in is so non-standard as to require significant additional effort by the user to apply elsewhere.
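To show what "easily and programmatically converted to another format" can look like, here is a minimal sketch that turns a hypothetical JSON export of posts into a CSV any spreadsheet can open. The file name and field names are assumptions about the export's structure.

    # Minimal sketch: convert a hypothetical JSON export of posts to CSV.
    # The field names ("id", "published", "title", "url") are assumptions.
    import csv
    import json

    FIELDS = ["id", "published", "title", "url"]

    with open("export.json", encoding="utf-8") as f:
        posts = json.load(f).get("posts", [])

    with open("posts.csv", "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        for post in posts:
            writer.writerow({field: post.get(field, "") for field in FIELDS})

    print(f"Wrote {len(posts)} posts to posts.csv")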

User Empowerment

  • High: The resource is structured around empowering its user to advance their own particular goals and interests. The user experience is core to the design and implementation of the resource.
  • Medium: The resource offers some user-centric tools or services. Users may be expected to invest significant time to learn the basics of the resource before becoming efficient with it.
  • Low: The resource does not offer export tools, the information they contain is so limited as to not be practical for common use cases, or the format they are offered in is so non-standard so as to require significant additional effort by the user to import it elsewhere.

Transparency

  • High: The resource makes a substantial, high-quality, high-visibility effort to be transparent about how it works, how and where its users’ data is collected and used, and how users can manage or erase the storage of their personal data. Source code, architecture documents and technical specifications are shared whenever possible. Transparency questions are quickly answered.
  • Medium: The resource offers some transparency about how it works and how it collects and uses user data. Transparency information may not be highly visible and may be out of date. Sharing of source code and architecture documents is limited.
  • Low: The resource does not offer any kind of transparency information, or the information it offers is so generic, inaccurate or misleading as to not be useful to the average user.

(I've posted the above criteria in a Google Doc where anyone can comment, in case you want to offer suggestions on specific parts.)

How each of these would be weighted and incorporated into a final score remains to be seen.
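One possibility, just to make the idea concrete, is to map each High/Medium/Low rating to points, apply a weight per criterion, and normalize to a score out of ten. The point values and equal weights below are placeholders, not a settled formula.

    # One possible (placeholder) way to fold per-criterion ratings into a score.
    RATING_POINTS = {"high": 1.0, "medium": 0.5, "low": 0.0}

    WEIGHTS = {
        "api": 1.0,
        "feeds": 1.0,
        "exports": 1.0,
        "user_empowerment": 1.0,
        "transparency": 1.0,
    }

    def final_score(ratings):
        """Return a 0-10 score from per-criterion High/Medium/Low ratings."""
        total_weight = sum(WEIGHTS.values())
        earned = sum(WEIGHTS[c] * RATING_POINTS[r.lower()] for c, r in ratings.items())
        return round(10 * earned / total_weight, 1)

    # Example: strong API and feeds, middling exports and empowerment, weak transparency.
    print(final_score({
        "api": "High",
        "feeds": "High",
        "exports": "Medium",
        "user_empowerment": "Medium",
        "transparency": "Low",
    }))  # -> 6.0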

For a given site I'd like to make sure that it's a relatively objective process to look at the real, publicized offerings around each of these areas and pick a High/Medium/Low rating. But in cases where it's not clear or where there are differing interpretations, the scoring site would need a mechanism for discussion in an attempt to reach consensus.

In general, if I built this site I'd want it to be accessible for anyone to contribute to, but in a way that allows for scoring to be peer reviewed and improved over time. "The consensus of the open web advocacy community" holds a lot more weight than "some guy named Chris." So whether that's via issues or pull requests on a GitHub repo, a form on a WordPress site, or something more custom, anyone should be able to come along and add a resource, contribute to its scoring, and join in the discussion.

Of course, the scoring site itself should get high marks in all of the above areas. It should have a robust API for accessing sites and scores, a feed of newly scored sites and discussions, the ability to download all scoring data at once for further analysis and other uses, and strong transparency about how everything works.
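For the sake of illustration, a single record in that API or bulk download might look something like the sketch below. Every field name, the grading scale, and the URLs are placeholders; the real shape would come out of the discussion above.

    # Rough, hypothetical sketch of one record from the scoring site's own API
    # or bulk data download. All field names and values are placeholders.
    example_record = {
        "site": "example-service.com",
        "last_reviewed": "2021-01-01",
        "criteria": {
            "api": "High",
            "feeds": "Medium",
            "exports": "High",
            "user_empowerment": "Medium",
            "transparency": "Low",
        },
        "score": 6.0,
        "discussion_url": "https://open-web-scores.example/example-service.com/discussion",
    }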

Questions I could use feedback on (please comment below):

  1. Generally speaking, is this resource something worth building and having?
  2. Are the scoring criteria I've proposed complete and helpful?
  3. What kind of final score presentation seems best: a grade, a number, a star rating, something else?
  4. What else needs to be considered as a part of building and publicizing this kind of site?

3 thoughts on “Scoring sites on their commitment to the open web?”

  1. 1. Generally speaking, is this resource something worth building and having?

    Yes, I think it's a great idea!

    2. Are the scoring criteria I've proposed complete and helpful?

    I think they are all good criteria. I would add support for hyperlinks--one of my frustrations with Twitter and Instagram is that they don't support them well.

    3. What kind of final score presentation seems best: a grade, a number, a star rating, something else?

    I like letter grades, but they are probably too U.S.-centric. I'd vote for a score out of ten.

    4. What else needs to be considered as a part of building and publicizing this kind of site?

    This would be no small feat but it would be supremely cool if I could post my reviews to my own website using some standardized semantic markup, submit the URL to an aggregator, and have my review be factored into the average. Kind of like Webmentions: https://indieweb.org/Webmention

  2. Hmm, seems like the User Empowerment list isn't quite right. Says:

    > Low: The resource does not offer export tools, the information they contain is so limited as to not be practical for common use cases, or the format they are offered in is so non-standard so as to require significant additional effort by the user to import it elsewhere.

    Whereas this seems to be a point about export tools instead.

    Either way, it's a good list, and it's definitely worth pointing out sites and services that follow these guidelines.
