
Proposal for API Docs #772

Closed: richlander closed this issue 7 years ago

richlander commented 8 years ago

Proposal for API Docs

API documentation is important! We need to define the following:

See existing docs and content: https://docs.microsoft.com/en-us/dotnet/core/api/system.

Requirements

The APIs are the same across all products, including implemented interfaces and class inheritance. In some cases, this won't be true. For those exceptions, we'll treat one implementation as the default and describe the others in remarks.

Scope

The docs will document .NET APIs in:

Frameworks:

Libraries:

The user experience should bias towards seeing all APIs while providing soft-filters and cues to define which APIs are implemented in the various products. This is primarily an issue for the frameworks listed above, as opposed to the libraries.

Representation

URLs are easy to predict and hack.

By "hack", we mean that the following URLs should all go somewhere useful:

The default experience is viewing all .NET APIs, unfiltered by product or release. You can filter your view in terms of a product and version, such as .NET Framework 4.5, .NET Standard 1.5 or Mono 4.4.

Good example of version navigation: https://developer.android.com/reference/packages.html. The Android experience always shows all APIs and greys some of them out. That's a great experience. The "Platform" toggle on this page is not a good experience: https://docs.microsoft.com/en-us/active-directory/adal/microsoft.identitymodel.clients.activedirectory.

Describing Product Membership

Pages state all .NET products that support a given type/member, independent of the active filter. For example, when viewing pages in terms of Mono, you still see "Introduced in Mono 1.1, .NET Framework 2.0, ...".

Two options for the filtered view:

The "View source" should take you to the product source for a given type or member. It should take you to the specific product in the filtered view and a default product in the unfiltered view.

Product version is not important for this experience, only product.

Examples, for the Filtered case:

For the unfiltered view, use a precedence algorithm based on API availability information from reference assemblies:

Assumption: All docs are in the core-docs repo.

There are two ways to improve a document: update the .cs file or update the sidecar file. The "Improve this doc" button can only go to one place. Both files should contain a fully-qualified GitHub link to the other file (in a comment), so that it's easy to go from one file to the other.

Proposal: The button takes you to the .cs file. It's an open question whether it should open in "edit" or "view" mode on GitHub.
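For example, the .cs side of the cross-link could look something like this (the sidecar path is hypothetical; the sidecar Markdown file would carry the matching link back in a comment):

```csharp
// Sidecar file (remarks, examples) for this type -- hypothetical path:
// https://github.com/dotnet/core-docs/blob/master/docs/api/System.String.md
namespace System
{
    /// <summary>Represents text as a sequence of UTF-16 code units.</summary>
    public sealed class String
    {
    }
}
```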

Format

The repository is not versioned by branching. There will not be branches for given .NET product releases. It's certainly fine to branch as part of a release, but the branch will be temporary. Given that the repository covers multiple products, such branches would not be meaningful or intuitive.

Relationship to Product Source

/// comments will be copied to product source on some cadence. One option is to snapshot /// comments from core-docs into corefx (for example) once per release, at the start of the release. This means that /// comments in product source will always be one release behind, but also won't disrupt product source at an unfortunate time.

Adding new APIs

There needs to be a tool that can take a set of reference assemblies and add any APIs not already present in the .cs files as stubs (with empty /// comments).

Extra effort: A tool that prints out types and members that are missing /// comments.
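For illustration only (not the actual tool), a minimal Roslyn-based version of that checker for a single .cs file could look like this:

```csharp
// Minimal sketch: list public type/member declarations in one .cs file
// that have no /// documentation comment. Illustrative only.
using System;
using System.IO;
using System.Linq;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;

class MissingDocs
{
    static void Main(string[] args)
    {
        var tree = CSharpSyntaxTree.ParseText(File.ReadAllText(args[0]));

        foreach (var decl in tree.GetRoot().DescendantNodes().OfType<MemberDeclarationSyntax>())
        {
            bool isPublic = decl.Modifiers.Any(m => m.IsKind(SyntaxKind.PublicKeyword));
            bool hasDocComment = decl.GetLeadingTrivia().Any(t =>
                t.IsKind(SyntaxKind.SingleLineDocumentationCommentTrivia) ||
                t.IsKind(SyntaxKind.MultiLineDocumentationCommentTrivia));

            if (isPublic && !hasDocComment)
            {
                var line = decl.GetLocation().GetLineSpan().StartLinePosition.Line + 1;
                Console.WriteLine($"{decl.Kind()} at line {line} is missing /// docs");
            }
        }
    }
}
```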

Projects that already have /// comments

Projects (basically all the libraries) like ASP.NET Core already have /// comments within the product source. They don't have to change what they are doing.

In those cases, "View Source" and "Improve this Doc" will go to the same place, to the product source.

We need to decide for these projects where the sidecar files will go. Ideally, they'll go in the product repo, beside the files, potentially in a subdirectory. There isn't much value in splitting this content and requiring its contributors to clone two repos.

IntelliSense

IntelliSense will need to be generated from these same files. We'll need a similar tool that generates IntelliSense XML files from the .cs files plus a set of reference assemblies that defines a given platform.
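As a rough sketch of the shape of such a tool (illustrative only: it reads a single .cs file and leaves out the per-platform filtering against reference assemblies):

```csharp
// Sketch only: emit an IntelliSense-style XML file from the /// comments in a
// .cs file. Per-platform filtering against reference assemblies is left out.
using System;
using System.IO;
using System.Linq;
using System.Xml.Linq;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.CSharp.Syntax;

class IntelliSenseXmlSketch
{
    static void Main(string[] args)
    {
        var tree = CSharpSyntaxTree.ParseText(File.ReadAllText(args[0]));
        var compilation = CSharpCompilation.Create(
            "DocsOnly",
            new[] { tree },
            new[] { MetadataReference.CreateFromFile(typeof(object).Assembly.Location) });
        var model = compilation.GetSemanticModel(tree);

        var members = new XElement("members");
        foreach (var decl in tree.GetRoot().DescendantNodes().OfType<MemberDeclarationSyntax>())
        {
            var symbol = model.GetDeclaredSymbol(decl);
            var id = symbol?.GetDocumentationCommentId();    // e.g. "T:System.String"
            var xml = symbol?.GetDocumentationCommentXml();  // the processed /// content
            if (id == null || string.IsNullOrWhiteSpace(xml)) continue;

            // Wrap defensively so this parses whether or not Roslyn already
            // returned a <member name="..."> element around the content.
            var parsed = XElement.Parse("<wrapper>" + xml + "</wrapper>");
            var content = parsed.Element("member")?.Elements() ?? parsed.Elements();
            members.Add(new XElement("member", new XAttribute("name", id), content));
        }

        // The classic IntelliSense file shape: <doc><assembly/><members/></doc>
        var doc = new XDocument(
            new XElement("doc",
                new XElement("assembly", new XElement("name", "System.Example")),
                members));
        Console.WriteLine(doc);
    }
}
```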

joelmartinez commented 8 years ago

This versioning model is yet undefined.

I'd love to be a part of this conversation ... as this is an issue that we are looking at in the mono API documentation world at the moment as well. How can I get involved @richlander?

Can view docs in terms of a product version, such as .NET Framework or .NET Standard

🙌🏼 This is precisely what we want to enable for mono/xamarin's API documentation :D

BillWagner commented 8 years ago

/// comments will be copied to product source on some cadence. One option is to snapshot /// comments from core-docs into corefx (for example) once per release, at the start of the release. This means that /// comments in product source will always be one release behind, but also won't disrupt product source at an unfortunate time.

I want to think about this in relation to the "View source" button on our API docs. It goes to the product source in GitHub (including line number). We should make sure this process doesn't make that link go to the wrong location.

richlander commented 8 years ago

@joelmartinez, @BillWagner - Just made a big update to the proposal, intended to answer your questions. Tell me what you think.

joelmartinez commented 8 years ago

The default experience is viewing all .NET APIs, unfiltered by product or release. You can filter your view in terms of a product and version, such as .NET Framework 4.5, .NET Standard 1.5 or Mono 4.4.

Love the concept of the documentation UX being everything by default. The filtered view lets you change the context that you're viewing the documentation in.

There are a few more nuances to this context, though, that probably need to be captured. For example, the System.Web namespace is available only on certain platforms. So how can we indicate that System.Web is available when you're reading the documentation in the context of ASP.NET 4, but the story is different in the context of ASP.NET Core; and furthermore, that it's available in .NET 4.5 and Mono 4.x, but only on Windows and Mac, not on iOS and Android?

richlander commented 8 years ago

ASP.NET Core doesn't use System.Web.

This content seems like remarks, to me.

joelmartinez commented 8 years ago

sorry ... yes, that was a poor example :P but the rest stands ... how can we indicate platform differences that don't necessarily correlate to version or frameworks?

richlander commented 8 years ago

I'm suggesting remarks.

rowanmiller commented 8 years ago

Projects (basically all the libraries) like ASP.NET Core already have /// comments within the product source. They don't have to change what they are doing.

In those cases, "View Source" and "Improve this Doc" will go to the same place, to the product source.

👍 we do this on the EF project and writing docs is just part of the normal dev/code-review process

mairaw commented 8 years ago

Yes! That's great @rowanmiller. But .NET is a bit more complex, which is why we're starting this discussion. We don't have a single source that defines all the different pivots (Framework, Core, Mono, etc.), and we have lots of legacy code with no comments.

Also, are EF and ASP.NET tied to our release or are we going to be showing a different filter for libraries?

Some comments:

I think this one will be hard. The experience on docs.microsoft.com should be consistent across products, so we need to think about that too. The original intent was that you can go to the source and edit the /// comments there if you find an issue, and that clicking "Improve this doc" lets you add additional content, like remarks, in the sidecar Markdown file. I'm not sure whether the other products will have the same challenges as us. For ASP.NET and EF, where they have the source and can edit comments directly in it, the original design works. Maybe we should continue to open the Markdown file.

richlander commented 8 years ago

@mairaw Yes, it is going to be a challenge to pull off these docs and the various buttons. There is a lot of value in this single uniform experience. We're going to need some custom implementations to deliver it. The value is high enough that it's worth doing more to achieve it.

Certainly, it's possible to simplify it. I'm starting with the best (IMO) UX. It would be good to get some feedback on that UX, the value folks see in it, and whether they'd want something different.

I added some more content to accommodate your questions and comments.

migueldeicaza commented 8 years ago

I just realized that this discussion was taking place on a GitHub issue; I participated in a Slack discussion along these lines yesterday.

With Mono, we are using an enhanced version of the original XML file format that was used to produce the ECMA API documentation. The documentation code drop that we got a few years ago from Microsoft also came in a variation of this format, which we imported into Mono.

To give you an idea of what this looks like, you can see the MSCorlib documentation in Mono:

https://github.com/mono/mono/tree/master/mcs/class/corlib/Documentation

Or you can look at our documentation for Urho for example:

https://github.com/xamarin/urho/tree/master/Docs

The markup is derived from the C# markup that is used inside the /// comments.

I think that this should be the format that we adopt. Some of the benefits include:

Doing the work in a C# file and editing inline comments sounds like a world of pain. While not impossible (after all, we can put a man on the moon), it is a shortcut that ignores problems the existing XML tooling already solves, will require additional engineering to solve them again, and will still provide a poor editing experience.

XML is just a very convenient format to use to maintain these docs.

The above covers some of the good things about using the XML format. The C# format has merits, but compared to the above, they are measured in sub-atomic units.

[1] iOS doc population:

https://github.com/xamarin/maccore/blob/master/tools/docfixer-ios/populate.cs

iOS doc updates based on rich metadata:

https://github.com/xamarin/maccore/blob/master/tools/docfixer/document-generated-code.cs

See also [6]

[2] https://github.com/xamarin/maccore/blob/master/tools/doc-relcontent/doc-relcontent-merger.cs

[3] http://www.mono-project.com/docs/tools+libraries/tools/monodoc/generating-documentation/

[4] http://screencast.com/t/hB9LfwJD which alters an existing documentation set like this: https://gist.github.com/migueldeicaza/733164b0b19b107e941f2648e080cc68

[5] Caveat: because on the internet "anyone" can be a troll, we think that we should have some protective measures against vandalism. That is, turn anonymous editing into a git branch. A human would review, and merge those docs if they are good. Trusted accounts could just commit directly to the repository.

[6] Urho documentation for common idioms, a simple F# script showing what we do with it:

https://github.com/xamarin/urho/blob/master/Docs/fill.fs

MichaelNorman commented 8 years ago

My perspective, as a writer, is that we absolutely do not want developers writing API documentation. Ditto for Random J. Webuser. Drafts or notes, maybe, but I've never seen documentation written by a developer that would pass muster as the global face of Xamarin or Microsoft to developers. And I don't expect that developers will accept the level of training required to write localizable documentation, nor the level required to produce documentation for ESL speakers. Whatever technical solution is settled upon, it needs to take into consideration globalization and localization, and the concomitant requirement of professional writers and, even, professional editorial support.

This, to me, is one of the stronger reasons for removing support for triple-whack comments as a matter of policy. My fluids professor used to say that no layperson thinks that they have an intuitive understanding of brain surgery, but they're convinced they have an intuitive understanding of fluid dynamics, which is at least as complicated. There is a possibly weaker corollary for technical documentation.

richlander commented 8 years ago

@MichaelNorman That's a good perspective; however (IMO), I don't think it has to be a case of starkly defined roles. In fact, we've found in our work that hard lines between roles lead to lots of problems. We prefer significant overlap, which we find increases collaboration and shared accountability. It also increases the respect that folks have for people in other roles.

Couple thoughts to consider:

The best combination is when writers, devs and PMs collaborate on docs. A joke, but what I'd like to see in terms of collaboration:

That's obviously written from a PM perspective.

For .NET, the development team doesn't participate much in the API docs. That's for a set of reasons, none of them good. My hope is that we can make the API doc content and tooling much more accessible to those developers so that they can participate in the process. Once they do, I know that they will find technical inaccuracies in the well-written English that they read. With each of those fixed issues, the docs will become more accurate and the writers will learn a greater appreciation for the product, which will help generally.

On the /// comments as the source of truth, I'm very interested in the system that your team has been using. I just watched the screencast shared by @migueldeicaza above. That looks really good. We were using an XML editor before. A custom app like that makes a ton of sense to me, since API docs are a narrow content domain.

MichaelNorman commented 8 years ago

@richlander, with all due respect, I think you are missing the point. Software development is a profession, a specialization. Writing technical documentation is a profession, a specialization. To say, "We don't want developers writing documentation," is no more unreasonable than saying, "We don't want writers introducing regressions into our software." It's not about turf. It's about why we don't want heart surgeons doing brain surgery, and vice versa.

We can talk about how "pretty good" the EF docs are. They do look good. How localizable are they? How well do they conform to the style guide for the organization? How well does the documentation fit in with the larger product documentation? How consistent is the voice between different docs in the same product, as well as between the documentation for the product and for other MS products? In short, how much of how good they look is due to seeing them with the undiscriminating eye of the reader, as opposed to the experienced eye of the writer?

Far from engendering respect for writers, conflating the roles and assuming that developers can match the quality and efficiency of the writing staff in their spare time denigrates the writing profession and the contribution of the writers. Now, you are not the first to be so accused! Writing has always been viewed as a cost center inside software organizations. Developers make the thing we sell, and writers just support the user. This is still true in an organization like Xamarin, no matter how supportive Xamarin has been with regard to the value of documentation. (And, boy, have they been supportive!) But this is a trend that needs especially to be pointed out now, because it is so pernicious, and because it always affects the toolchain.

As far as your PM comment about logistical focus and emphasis, that is already performed by the writing team at Xamarin, and always was while I was writing for Microsoft. Sure, there was dev and PM input, and it was always eagerly solicited. But to suppose that this is some novel value-add of the PM role that is routinely disregarded by the writers,... Well, suffice it to say we don't see eye to eye on that at the moment.

To address your point about what developers know and what writers know, that too misses the point in a subtle way. It's more useful to talk about what developers do and what writers do. For example, writers do writing and interviewing. Developers do software development. An oft-unappreciated part of the practice of technical writing is soliciting not just correct information from the subject matter experts (SMEs), but teasing out the organizational structure of the product from the developer, and relating that to the (manifoldly diverse) customer (embedded within their cultural frameworks and our legal framework), not just more articulately than the developer can afford to do with the time allotted to her in her job, but systematically with a sensitivity to the issues to which I've alluded (and which you have not addressed, tellingly!) about ESL, localization, style, homogeneity of voice, and so on.

For my money, the "lots of problems" that organizations face in regard to interdisciplinary squabbles are not solved by softening role lines or by conflating roles, but rather by changing procedural habits and educating all parties in better ways to collaborate.

I have been, in this post, direct, shall we say? I hope it is received in a firm but respectful way.

richlander commented 8 years ago

Thanks for sharing your insight. I've not done your job before, so I'll take what you say at face value, and of course, respectfully, as you hope.

We've now traveled a ways away from the topic of this issue, so I think I'll not go any farther into this topic, as much as I do think it is an interesting one to discuss further. Given that we now work for the same company, I think we'll indeed have the option available to us.

lobrien commented 8 years ago

I view the technical issue as one of modularity: the XML format has an advantage in that everything's in one place but, since it's hierarchical, there's an impedance mismatch when it comes to, e.g., platforms. We aren't going to solve that (or, rather, we ought not to hare off after one of the bazillion overly-complex attempts to hide/mitigate XML's hierarchical nature).

Since I was raised in the relational age, I tend to prefer my data orthogonal. So personally, remarks in one place, summaries in another, platform info in a 3rd would probably appeal to me if this were a green field effort. (Which, of course, is the furthest thing from the truth.)

Even given my predilection, I worry that the idea of "summary-only" .cs triple-whack files may be problematic from a synchronization standpoint. That is, as the implementation and types change, we know that maintaining the map between implementation and docs is something more than trivial. It seems to me that you want the storage for that map to be the easiest-to-work-with format. For me that would be XML rather than triple-whacked-.cs (System.Xml.Linq FTW). (On the other hand, the synchronization code is essentially about types and signatures and .cs has the advantage of giving us actual type-safety: if the synchronization is wrong, we get a compile-time error. Hooray!)

At Xamarin, we have a more semi-monolithic approach: 1 XML file per type, a directory per namespace. That approach has served us very well, IMO. Without going into details, we have a lot of flexibility re. tooling: we have a visual editor, but on the other hand, scripting and automating against XML is trivial.
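As a quick illustration of that kind of scripting (not our real tooling; the element names follow the ECMA/mdoc layout, and the file content here is abridged and made up):

```csharp
// A quick sketch: find members whose <summary> is still the "To be added."
// placeholder in an mdoc/ECMA-style per-type XML file. Content is illustrative.
using System;
using System.Linq;
using System.Xml.Linq;

class FindEmptySummaries
{
    static void Main()
    {
        // Shape of a (heavily abridged) per-type file, e.g. System.String.xml
        var typeDoc = XDocument.Parse(@"
<Type Name='String' FullName='System.String'>
  <Members>
    <Member MemberName='IsNullOrEmpty'>
      <Docs>
        <summary>Indicates whether the specified string is null or an empty string.</summary>
      </Docs>
    </Member>
    <Member MemberName='Copy'>
      <Docs>
        <summary>To be added.</summary>
      </Docs>
    </Member>
  </Members>
</Type>");

        var missing =
            from member in typeDoc.Descendants("Member")
            let summary = member.Element("Docs")?.Element("summary")
            where summary == null || summary.Value.Trim() == "To be added."
            select (string)member.Attribute("MemberName");

        foreach (var name in missing)
            Console.WriteLine($"Missing summary: {name}");
    }
}
```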

If I had my druthers, I'd stay with the Xamarin approach. But is it true that the sidecar-file, at least for remarks, is a necessary pre-requisite for localization?

mairaw commented 8 years ago

For the .NET Framework API docs, we also use XML in our CMS, however the file structure is not visible to us and we can only edit one API at a time. Then the platform info is a different input from another set of XML files. We have the option of editing by hand or by using XMetal which understands our schema, etc. I wouldn't say that the sidecar-file is a prerequisite for localization since today our CMS exports the localizable segments in the XML format already. I can find some folks from our Loc team to comment on that aspect with more property than I can.

joelmartinez commented 8 years ago

URLs are easy to predict and hack.

Just wanted to add a bit of my experience with the Xamarin developer portal URLs. After I switched the url format for API docs to move the information from the query string, to being a part of the path itself, I came across a few limitations. For a very small subset of APIs, which had a large number of parameters, we encountered some errors due to the path being too long. After some fiddling with configurations (maxUrlLength, etc), I decided to break up the URL into multiple segments. So for an API that might have looked like this:

/api/Some.Long.Namespace.TypeName(OneParameter,TwoParameter,ThreeParameter)

It now looks like:

/api/Some.Long.Namespace.TypeName/p/OneParameter/TwoParameter/ThreeParameter/

I'm only bringing this up as something that should be tested ... I don't know if any APIs in the BCL would trigger this error, but it might be good to verify with the longest URLs in the set before going live :)
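For illustration, the segmentation described above amounts to something like the following sketch (not the portal's actual code; the helper name and rules are just a guess at the shape of it):

```csharp
// Illustrative only: split a "Type(Param1,Param2,...)" style identifier into
// path segments so that no single URL segment gets too long.
using System;

static class ApiUrl
{
    public static string ToPath(string memberId)
    {
        // e.g. "Some.Long.Namespace.TypeName(OneParameter,TwoParameter,ThreeParameter)"
        int open = memberId.IndexOf('(');
        if (open < 0 || !memberId.EndsWith(")"))
            return "/api/" + memberId;

        string typePart = memberId.Substring(0, open);
        string[] parameters = memberId
            .Substring(open + 1, memberId.Length - open - 2)
            .Split(new[] { ',' }, StringSplitOptions.RemoveEmptyEntries);

        // "/api/Some.Long.Namespace.TypeName/p/OneParameter/TwoParameter/ThreeParameter/"
        return "/api/" + typePart + "/p/" + string.Join("/", parameters) + "/";
    }
}
```

Calling ApiUrl.ToPath("Some.Long.Namespace.TypeName(OneParameter,TwoParameter,ThreeParameter)") returns the segmented form shown above.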

richlander commented 8 years ago

@MichaelNorman We use Roslyn to read/mutate .cs files. It's an API purpose-built for exactly that.

At Xamarin, we have a more semi-monolithic approach: 1 XML file per type, a directory per namespace. That approach has served us very well, IMO.

That's mostly the same as the proposal, in terms of the granularity.

richlander commented 8 years ago

@joelmartinez good point. We haven't gotten to that level of detail. I think we were largely hoping to skip that (one page per overload). We'll see.

mairaw commented 8 years ago

For the URLs, we are already live, so I don't think we ran into problems with the length.

BillWagner commented 7 years ago

I think we've addressed these in our current API docs work.

I propose we close this. @richlander @mairaw Can you vote on this proposal?

cartermp commented 7 years ago

I'll just close this. We've addressed what is discussed here, and the vision has basically been executed.