
The Utopia of API Documentation
by Jennifer Riggins

It's been proven time and again how much API documentation matters to your developer experience - in fact, it matters more than almost anything else to whether your API is adopted. And developer experience certainly matters to your overall bottom line. After all, in the world of the application programming interface, or API, developers are your users, and so their user experience matters most.

There's no doubt your API documentation has to be sexy, but, as sexiness is in the eye of the beholder, there's a lot of debate about just what that means. Today, SmartBear sits down with Arnaud Lauret of AXA Banque (a.k.a. the API Handyman) to talk about this idea of documentation utopia and his vision of an ideal world where APIs and documentation live in perfect harmony.

The API economy will collapse on poor documentation
Before we can talk about what great documentation is, we should probably talk about what bad documentation is. The worst documentation is the documentation that doesn't exist. But the second worst is API documentation that's never used. "Poor documentation leads to poor APIs and poor APIs lead to poor documentation - to the dark side where no one uses them," Star Wars fan Lauret said. And if your documentation isn't used, soon enough your API won't be either.

Why was your documentation overlooked? Some frequent culprits include:

  • It shouldn't have been written in the first place.
  • It's too long.
  • It's too hard to understand.
  • It's outdated.
  • It's not adapted to your audience.
  • It doesn't match the API's concepts.
  • It couldn't even be found.

"Without good documentation, you can't have an API and without a good API you can't have documentation," Lauret said, referring to the symbiotic relationship intrinsically linking the two.

What does documentation look like in a utopia?
According to Lauret, "API documentation should include all instructions, comments and information needed to build, maintain and use an API and [its] subsystems."

There are some more or less agreed-upon qualities of good API documentation. It must be:

  • adapted for audience - like all good marketing and customer support, perhaps multiple documentation depending on the audience's needs
  • DX-first - made for humans, by humans
  • machine-readable
  • Google-readable - search engine optimization matters when most people are typing "X API" into Google
  • well-organized like a reference guide or table of contents
  • intertwined with the API itself - dual-screen or opening in a new window, allowing users to try something out right away
  • not a burden to create
  • with pricing and usage policies
  • with contact information
  • adapted to the learner or user
  • riddled with use cases and code examples
  • made up of everything you could need to use the API
  • paired with a story - why you are doing this to achieve that
  • easy to produce, publish and maintain
  • adapted to what kind of software is being documented, like SaaS versus platform
  • adapted to the people who will use it - end users versus those inside your company
  • adapted to context - when in the discovery process and how people will use it
  • equipped with some sort of way to collect user feedback on how you can further improve it
  • easily found, whether within the developer portal or prominently placed on your website

We could go on and on but instead of just making a wish list, let's get into how your documentation can make your API shine and vice versa.

"Microservices is widely adopted and the concept has been applied to documentation for usable, maintainable, reusable, replaceable micro-documentation."

Lauret says to avoid the doom of unread documentation, everything should be micronized. "Microservices is widely adopted and the concept has been applied to documentation for usable, maintainable, reusable, replaceable micro-documentation." It all falls into the domain-driven, context-driven world we're re-approaching.

And the movement toward microservices also makes documenting seem more achievable in bite-sized increments. This falls right in line with a greater sense of ownership for each service - and who better than the one who wrote the code to explain its purpose?

Going full circle, microservices fit nicely with continuous delivery.

For continuous documentation delivery, there must be automation
While it seems like automation drives much of continuous delivery, continuous delivery has us going so fast that API documentation gets neglected. But automation can actually be the key to continually delivering great documentation.

"When you code, you don't reinvent the wheel; you reuse existing libraries and modules," Lauret said. While your docs still need a human touch, it stands to reason that you can ease the burden of creating them by automating and recycling them.

In Lauret's perfect APIverse - which we think we are getting nearer to every day - there will be continuous update, delivery and improvement of documentation, right alongside coding and testing. This means that at least part of the documentation can and should be automated.

This automation has to come with standardization, which means that companies should prioritize creating standards for API documentation.

To offer Lauret's example: "If all your APIs are true REST APIs and you always design them the same way, you lessen the need for documentation. If you write documentation using common and shared structures, templates, and common and shared vocabulary and concepts, they become easier to write, read, and understand."

Doc automation could even have version control to pair doc versions accurately with different releases.

"And don't forget that documentation should tell stories - and all API documentation should use them," Lauret said.

Lauret offers up probably the most popular way to automate: our very own Swagger.

"Documentation and its subjects are analyzed to check that they are consistent with each other. For example, if you have an API descriptor, the system checks that the API is conforming to that descriptor. This already exists with ReadyAPI from SmartBear. You can take an API descriptor in Swagger, and ReadyAPI will create the basic testing to check that the implementation for the API is correct compared to the API descriptor," he went on to explain.
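
The core of that descriptor-versus-implementation check can be sketched in a few lines. This is a hypothetical illustration of the idea, not ReadyAPI: it compares a response payload against the fields a Swagger-style descriptor promises, with a made-up endpoint and a deliberately simplified schema.

```python
# Hypothetical sketch of descriptor-driven contract checking: does a
# response payload match what the Swagger-style descriptor promises?
# The descriptor fragment below is invented for illustration.

def check_against_descriptor(descriptor: dict, path: str, payload: dict) -> list:
    """Return a list of mismatches between payload and descriptor."""
    schema = descriptor["paths"][path]["get"]["responses"]["200"]["schema"]
    problems = []
    for field, spec in schema.get("properties", {}).items():
        if field not in payload:
            problems.append(f"missing field: {field}")
        elif spec["type"] == "string" and not isinstance(payload[field], str):
            problems.append(f"wrong type for {field}: expected string")
    return problems

descriptor = {
    "paths": {
        "/users/{id}": {
            "get": {
                "responses": {
                    "200": {"schema": {"properties": {
                        "id": {"type": "integer"},
                        "name": {"type": "string"},
                    }}}
                }
            }
        }
    }
}

# A payload missing the promised "name" field is flagged as a violation.
print(check_against_descriptor(descriptor, "/users/{id}", {"id": 7}))
```

A real tool would of course walk every path, method and status code and validate types fully, but the principle is the same: the descriptor is the single source of truth, and both docs and tests are derived from it.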

But remember that Swagger isn't the final piece of the puzzle. It will capture your specs and outline the perimeter of your API, but Swagger alone does not make complete API documentation. Your API's story is a big part of it too. Stormpath offers a strong example of documentation enriched with a thorough introduction to its concepts and terminology.

While formats like Swagger and RAML can automate the raw specification, you can also try a tool like LucyBot to make Swagger more human-readable.

In his idyllic APIverse, Lauret sees everyone working with a documentation package manager, or DPM, both for people who write code and for those who write API descriptors, allowing automation based on certain dependencies. A DPM helps you write the documentation and search for requests for comments (RFCs), templates, and vocabulary. It creates a descriptor that describes the dependencies of your documentation. Node.js already does something similar with its Node Package Manager (NPM), but NPM works for people who code, not yet for people who write documentation. Every level of documentation is linked together via the DPM, allowing users to switch among them.

Finally, "Every single piece of documentation is written as code - human- and machine-readable. This structured document can be copied and transformed into any other format." From there, you can really start to make your API documentation - and by extension, your API - more accessible.

Approaching the world of documentation for everyone
Lauret offers the example that "if you have an API descriptor and the data are fairly simple, you can generate an API - all without coding anything." From there you can automate even more writer-friendly documentation, and build user-friendly design tools, including automatic mock-ups that let both your users and internal developers test whether a design is good.
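
Generating a working mock from a descriptor alone can be sketched roughly like this. It's a toy dispatcher under an invented assumption: that the descriptor carries example payloads for each endpoint, which the mock then serves without any hand-written handler code.

```python
# Toy sketch of "generate an API without coding anything": a descriptor
# that carries example payloads is turned into a callable mock handler.
# The descriptor format here is invented for illustration.

def build_mock(descriptor: dict):
    """Return a function that serves the descriptor's example responses."""
    def handle(method: str, path: str):
        endpoint = descriptor.get(path, {}).get(method.lower())
        if endpoint is None:
            return 404, {"error": "not found"}
        return 200, endpoint["example"]
    return handle

descriptor = {
    "/users/42": {
        "get": {"example": {"id": 42, "name": "Grace"}},
    },
}

mock_api = build_mock(descriptor)
print(mock_api("GET", "/users/42"))   # (200, {'id': 42, 'name': 'Grace'})
print(mock_api("GET", "/missing"))    # (404, {'error': 'not found'})
```

A mock like this is exactly what lets designers and users kick the tires on an API design before a line of real implementation exists.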

By treating documentation as code, it becomes structured and machine readable. This data then can build documentation APIs, "and, with this data, you can build outrageously beautiful visualizations," Lauret gushed.

With everything you do, you need to think about how developers are accessing your API. First-timers need an intro to put it all into perspective. Others will play around with your API first and use the documentation as a reference tool, wanting to easily find examples and step-by-step instructions for something specific.

Lauret reminds you that "When your API is your product, these people need to create really cool documentation."

Like how we are teaching kids to code visually with tools like Snap!, both of these API customers would benefit from a more visual map of the API documentation - a mind map or a flow chart, with links and ways to expand and visualize the functionalities and how they interrelate. FoxyCart's API comes with a great visualization of its documentation. Similarly, Lauret created a tree-like D3.js visualization of a Swagger spec to create a flow-chart view of the lengthy documentation. He's also thinking about pairing Mermaid sequence diagrams with Swagger.
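
Such a flow-chart view can even be generated from the spec itself, so it never drifts out of date. As a rough sketch, here is a walk over a Swagger-style `paths` object that emits Mermaid flowchart text (Mermaid then renders the diagram in the browser); the spec fragment is a made-up example.

```python
# Sketch: derive Mermaid flowchart text from a Swagger-style "paths"
# object, so the visual map of the API stays in sync with the spec.
# The spec fragment below is invented for illustration.

def paths_to_mermaid(paths: dict) -> str:
    lines = ["graph TD", "    API[My API]"]
    for i, (path, methods) in enumerate(sorted(paths.items())):
        node = f"P{i}"
        # One node per path, hanging off the API root...
        lines.append(f'    API --> {node}["{path}"]')
        # ...and one leaf per HTTP method on that path.
        for method in sorted(methods):
            lines.append(f"    {node} --> {node}_{method}[{method.upper()}]")
    return "\n".join(lines)

paths = {
    "/users": {"get": {}, "post": {}},
    "/users/{id}": {"get": {}},
}

print(paths_to_mermaid(paths))
```

Feed the output to any Mermaid renderer and you get a clickable tree of paths and methods - a tiny version of the spec-driven visualizations Lauret describes.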

All users would also benefit from search functionality built into the docs.

Once you establish the system, structure, framework and responsibility for creating and updating API documentation, you can use that machine-readability to increase human-readability. You can create a veritable "Document Factory," a module you could integrate into a continuous delivery system, automatically generating different types of renderings from the raw source documentation, like PDFs, static websites, or even whole books.
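
A minimal Document Factory along these lines takes one structured source and renders it to several formats. The source schema and renderers below are illustrative only; a real pipeline would plug renderers like these into the delivery system.

```python
# Sketch of a "Document Factory": one structured doc source, several
# renderings. The source schema is invented for illustration; a real
# factory would add renderers for PDF, static sites, and so on.

doc_source = {
    "title": "Users endpoint",
    "sections": [
        {"heading": "Purpose", "body": "Look up a user by id."},
        {"heading": "Errors", "body": "Returns 404 when the id is unknown."},
    ],
}

def render_markdown(doc: dict) -> str:
    """Render the structured source as Markdown."""
    out = [f"# {doc['title']}"]
    for s in doc["sections"]:
        out += [f"## {s['heading']}", s["body"]]
    return "\n\n".join(out)

def render_html(doc: dict) -> str:
    """Render the same source as an HTML fragment."""
    body = "".join(
        f"<h2>{s['heading']}</h2><p>{s['body']}</p>" for s in doc["sections"]
    )
    return f"<h1>{doc['title']}</h1>{body}"

print(render_markdown(doc_source))
print(render_html(doc_source))
```

Because every renderer reads the same structured source, the factory guarantees the PDF, the website and the book can never disagree with each other.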

Lauret even suggests that if your API documentation is good and machine-readable enough, it could easily be converted into an API itself, or serve as a source for partial or complete software generation. This part of utopia is already available with RESTlet and Nuclear-Powered Mushroom.

Imagine how much better you could reach your developers if you could appeal to all their different learning styles, from the person who still likes to read the code, to an auditory learner who could listen to an API-narrated walk-through, from someone who wants a static image to someone who wants a Prezi-like interactive experience. You could even mimic what Absolut Vodka did with APIs, automatically piecing together audio and shots to make hundreds of recipe videos. Or maybe one day there will be a way to automate the creation of a video game that walks devs through your docs!

And it's not just user experience but languages too. Lauret shares the dream of what he calls "Hyper-Documentation," where the hypermedia API meets content negotiation and the server sends back content in a given language or format to create truly contextual documentation.

Of course, nothing is more valuable than when documentation is integrated with its API, sort of a split-screen or hyperlinked reality where your users or potential users can jump to the part they need in the documentation and then easily toggle back to that part within the API, immediately putting to practice what they learned.

Who's in charge of writing API documentation?
The real problem with API documentation is that, while we increasingly acknowledge its value, it's the first casualty of "There's Never Any Time" syndrome. And when it's unclear who's in charge, you're short on volunteers too.

Lauret argues that in order to guarantee quality and audience, "only the right people write the documentation for the right audience in the right context." He contends that creating documentation is a real job for qualified people with real skills that should depend on context.

It doesn't seem to be debated that this should be a job; it more comes down to who is responsible for it. And until we decide that, I'm afraid we may find ourselves in a cycle of blame.

A trend growing along with the importance of documentation is to treat your documentation and client libraries as products themselves. Companies like SendGrid pair them with a product manager who has a keen understanding of the key stakeholders and her own team of people dedicated to the task.

Some would say marketing should get involved because it's a huge tool for selling your API, as well as the aforementioned fact that SEO does matter as people are still Googling to find you first. Plus, if you're putting all this work into well-maintained documentation, you'll also need to enlist marketing or your website team to make sure you're tracking all of the doc's user behavior, time on page, and other important analytics. Only then will you know if your documentation is any good, which is to say actually used.

Like in the not-for-profit grant writing world, others would argue to outsource it to a technical writer, but I worry that an external writer won't fully understand the needs of your API user and, really, they may err on the side of too technical, not enough human. And from this technical perspective, the engineers should be involved anyway. As well as the testers for that matter. Oh and the UX folks.

I for one argue that the developer evangelist should at least be in charge of coordinating this ongoing project. They're the ones who should best know what clients need and the different use cases that explain how to meet those needs. And while they don't need to write the whole thing, they should find the right people to do it, and they probably should draft the Getting Started guide. Then the whole team should occasionally dogfood their own starter docs to make sure they still make sense and are easy to dive into.

Or why not outsource it to a brand advocate or early adopter? That customer that's working with it every day, has a stake in your API and its documentation usability, and who could use some extra cash. Her motivation may be greater than most.

Of course, it could and probably should be different roles within each company, depending on the size of that company and perhaps even on each audience using the API. Add to that: is your API an ancillary product or your bread and butter? If it's the latter, the responsibility increases dramatically. What's clear is that every company has to define the person or persons in charge of this essential part of the developer experience.

But it's just the beginning
We don't doubt there's a need and a demand for better API documentation and doc automation. But we're still at the very early stages of creating something that's both machine-readable and for the different audiences that could be looking to consume your API. That's where you come in.


