How to Get Successful Content Review as a Technical Writer

Authors

SYDNEY JONES
HEAD OF MARKETING COMMUNICATIONS AT IXIASOFT

 

SHARON FIGUEIRA
PRE-SALES ENGINEER FOR NORTH AMERICA AT IXIASOFT

 

Tech writers work hard to understand the domain, terminology, and specifications of each product. Often, the most challenging part of the process is getting quality review from SMEs.

When doc bugs come in, it’s the tech writer’s neck on the line—no matter who is responsible for the error.

In highly technical domains like telecom and networking, or in specialist domains like legal or medical, review feedback is imperative for producing high-quality content. IXIAtalks: Episode 2 explores the impact of a bad review and the consequences it can have for the quality of technical documentation.

Defining Bad Review

First, it’s important to understand the different types of “bad review”:

 

Conversely, good review is rigorous, on time, and focused on the target content.

The Consequences of Bad Review

In some cases, bad review results in a typo that goes unnoticed for months, even years. In other cases, bad review can cost organizations hundreds of millions of dollars. To better understand the consequences of bad review, it’s worth diving into a case study or two.

Mariner Probe

On July 22, 1962, NASA launched the Mariner 1 spacecraft. Shortly after liftoff, a range safety officer ordered a destructive abort. NASA investigators traced the cause of the accident to an error in the guidance control software, which transmitted a series of incorrect course-correction signals and threw the vehicle off its flight trajectory.

The range safety officer ordered the intentional detonation of the spacecraft less than five minutes after liftoff to prevent the vehicle from crashing into a populated area. When calculating the adjusted costs of research, development, training and construction, the total losses connected to the accident are estimated to exceed $620 million USD. Ouch!

Mars Climate Orbiter

In 1999, a disaster investigation board reported that NASA’s Mars Climate Orbiter burned up in the Martian atmosphere because engineers failed to convert units from English to metric.

The ground software calculated the force the thrusters needed to exert in pound-force. A separate piece of software took in the data assuming it was in the metric unit, newtons. Had this error been caught by a keen-eyed technical writer, NASA would have been spared $125 million USD.
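The mismatch above can be sketched in a few lines. This is a hypothetical illustration (the function names and values are invented, not NASA's actual code); one component reports a value in pound-force while another interprets the same number as newtons, so the thrust is misjudged by the conversion factor:

```python
# Hypothetical sketch of the Mars Climate Orbiter unit mismatch.
LBF_TO_NEWTONS = 4.44822  # 1 pound-force in newtons

def report_thrust_lbf(thrust_lbf: float) -> float:
    """Ground software: reports thruster impulse in pound-force."""
    return thrust_lbf

def navigation_reads_newtons(value: float) -> float:
    """Flight software: silently assumes the incoming value is in newtons."""
    return value

reported = report_thrust_lbf(100.0)              # 100 lbf intended
interpreted = navigation_reads_newtons(reported)  # read as 100 N

correct_newtons = reported * LBF_TO_NEWTONS       # 444.822 N
error_factor = correct_newtons / interpreted
print(round(error_factor, 5))  # prints 4.44822
```

The number itself is never wrong; only the unspoken assumption about its unit is, which is exactly the kind of cross-team assumption a documentation review can surface.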

 

Why does bad review happen?

There are many factors that contribute to poor, incomplete, or altogether missing review. Here are four of the most prominent:

  • Collaborators struggle with the format, interface, or tools we provide.
  • Documentation is undervalued, collaborators don’t have enough time to do the review, and review is seen as a low-value task.
  • Reviewers give the wrong kind of input, or focus on old or out-of-scope content.
  • Collaborators are not held accountable for poor or missing review, because the cost of fixing errors is not understood, and root-cause analysis is difficult when you work with monolithic PDFs that offer no systematic tracking of who did what.

 

What barriers might collaborators be facing?

  • Lack of confidence about language skills
  • Lack of time, and sometimes an inability to plan review time due to last-minute communication around upcoming reviews
  • Struggling to find the right scope for review

 

How can organizations improve collaboration and alleviate bad review?

The first step to improving the quality of review is to make the review process as easy as possible. To do this, consider:

  • Providing low-barrier tools
  • Focusing reviewers’ limited time on key content (rather than asking them to hunt for new content in a PDF, review an out-of-scope subset, or accidentally re-review content they or someone else has already checked)
  • Supporting reviewers and thinking about their experience (using templates, providing coaching, doing one-on-one support, discussing what is NOT required)

Equally important is holding reviewers accountable. Rather than pointing fingers, the objective is to trace a doc bug back to its original source and ensure the issue is analyzed and resolved. This can be done by:

  • Tracking what was reviewed, by whom, and when
  • Analyzing doc bugs for root cause and asking the tough questions
  • Logging bad review as a bug cause and making this transparent to senior management (for example, listing the cause as “missing review,” “poor review,” or “incomplete review”)
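The tracking described above needs very little machinery to get started. Here is a minimal, hypothetical sketch (the field names and topic IDs are invented for illustration) of a review log that records what was reviewed, by whom, when, and with what outcome, so unreviewed content can be surfaced during root-cause analysis:

```python
# Hypothetical minimal review-tracking record for root-cause analysis.
from dataclasses import dataclass
from datetime import date

@dataclass
class ReviewRecord:
    topic_id: str     # the granular unit of content under review
    reviewer: str     # who was assigned
    due: date         # when review was expected
    outcome: str      # "approved", "changes requested", or "not reviewed"

log = [
    ReviewRecord("install-guide-01", "s.figueira", date(2021, 3, 4), "approved"),
    ReviewRecord("install-guide-02", "s.jones", date(2021, 3, 5), "not reviewed"),
]

# Root-cause query: which topics shipped without review?
unreviewed = [r.topic_id for r in log if r.outcome == "not reviewed"]
print(unreviewed)  # prints ['install-guide-02']
```

Even a simple log like this turns “the writer missed it” into an answerable question: was the topic reviewed at all, and by whom?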

 

Solutions

There are many ways to ensure successful document review within your organization. The new IXIASOFT CCMS Web interface offers a way to lower the barrier to entry and reduce the know-how a reviewer needs to interact with your world. For example, it allows you to deliver content as an HTML5 deliverable and to revise content using familiar features like track changes. The SME still works with the unique DITA object that you manage as the content expert, so there is no need for copy-and-paste, merging versions, or transcribing SME input.

Granularity

It is much easier to assign granular topics for review in a component-based information architecture like DITA than it is in a document-based one. However, if you are still in a document paradigm, think about chopping up content so reviewers only see the new content assigned to them.

Intuitive Interface

The mental model during the design phase at IXIASOFT was to make the interface so simple that users require little or no training, a bit like Word or a wiki. The basic principle is to go to where the reviewers are, rather than force them to navigate your world. It’s the principle of thinking about your audience.

Traceability/Transparency

Make sure you keep an accessible record of review comments and inputs in your repository, so you can perform root-cause analysis and make changes to a failing review process.

 

Summary: Overcoming the Barriers for Good Review

To conclude, there are many ways to encourage quality review within your team. Firstly, it’s crucial you support reviewers with easy-to-use tools, clear processes, task context, and even coaching. You must also ensure reviewers are assigned the right content to review. Analyzing bugs and errors for root cause is another key component of improving the review process; it’s important to hold reviewers accountable.

Finally, you must communicate the value of documentation and the cost of errors within your team.

To learn more about how to get good review, or to further explore IXIASOFT CCMS Web, check out our webinar IXIAtalks: Episode 2 “Getting the Review You Want.” 

This blog was originally presented as an IXIAtalks webinar by Sharon Figueira and Sydney Jones. Sharon, IXIASOFT pre-sales technical consultant for North America, explores what it means to get a bad review and the consequences it can have on technical documentation teams.

A bit about Sharon:

Sharon has 18 years’ experience as a technical communications professional, 15 of them as a manager, with two DITA migrations under her belt: one for Ericsson and one for Kodak.


