When your team members are separated by space or time, don’t abandon peer reviews. They can still be powerful contributors to product quality and team productivity. In this adaptation from his forthcoming book, Karl Wiegers spells out how to engage the team in distributed review meetings or asynchronous reviews. Although they aren’t the same as sitting down face-to-face, these techniques provide a valuable alternative mechanism for getting a little help from your friends.
The many benefits of software peer reviews include improved quality and productivity, sharing of technical knowledge, and gaining insights that lead to process improvements. Sometimes, though, it is hard for potential reviewers to put their heads together in real time.
Increasingly, software projects involve teams that collaborate across multiple corporations, time zones, continents, nationalities, organizational cultures, and native languages. Such projects must modify the traditional face-to-face peer review method. The review issues include both communication logistics and cultural factors; the latter usually pose the greater challenge. Even if cultural barriers are not an issue, you’ll need to deal with the difficulties of holding reviews with participants who cannot meet in person.
The two dimensions to consider are time and place. If review participants can assemble in the same location, you can hold a traditional review meeting. Geographically separated participants can hold distributed review meetings, and reviewers who cannot connect concurrently can practice asynchronous reviews. With either nontraditional method, however, your collaborations will be more effective if the participants meet in person at least once early on. Use this meeting to establish the team rapport and respect for the review moderator’s leadership that are necessary for effective reviews. Periodic face-to-face meetings throughout the project will help maintain the bond the team members established at the beginning.
Distributed Review Meeting
Today’s audio- and videoconferencing tools can facilitate communication if the participants are available at the same time but in different places, although sometimes "same time" is complicated when the participants reside in different time zones. My colleague Erik moderated several reviews that involved participants who spanned twelve time zones. You can manage this challenge by changing the time of day that you hold the reviews, to rotate the inconvenience of getting up in the middle of the night. This also avoids the perception that certain individuals or locations are subordinate to others.
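The rotation idea is simple arithmetic. As a sketch (the site names and UTC offsets below are hypothetical, chosen only for illustration), you can compute each site's local clock time for a proposed meeting, then pick the UTC hour that favors a different site each time:

```python
# Hypothetical sites and their UTC offsets -- assumed for illustration only.
SITES = {"Seattle": -8, "London": 0, "Bangalore": 5.5}

def local_times(meeting_utc_hour):
    """Each site's local clock hour when the review starts at the given UTC hour."""
    return {site: (meeting_utc_hour + offset) % 24 for site, offset in SITES.items()}

def host_hour(preferred_local_hour, favored_site):
    """The UTC hour that gives one site its preferred local time. Rotating
    the favored site across a series of reviews rotates the inconvenience."""
    return (preferred_local_hour - SITES[favored_site]) % 24
```

Cycling the favored site through the series of review meetings makes the rotation visible and fair, which also helps avoid any perception of subordination.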
A distributed review places special burdens on both the participants and the review moderator. When I participated in one distributed review meeting by telephone, I was struck by the absence of body language and facial expressions. I couldn’t tell what the other participants were doing or thinking. I couldn’t see when someone looked puzzled or seemed ready to speak. It is also difficult to detect sidebar conversations over the telephone or see when participants have left the room or are distracted. Use expert moderators for such long-distance reviews.
Establish some ground rules for taking turns speaking, identifying yourself before making a comment, relinquishing control to the moderator, and timeboxing discussions. For instance, a "round robin" approach to raising issues can keep all participants engaged when the moderator has difficulty knowing who is not contributing. Johanna Rothman described many conference call dos and don’ts for multicultural project meetings in her article "Managing Multicultural Projects with Complementary Practices" (Cutter IT Journal, April 2001).
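The round-robin ground rule amounts to cycling through the participant list so that every reviewer is explicitly invited to speak. A minimal sketch (participant names are hypothetical):

```python
def round_robin_turns(participants, rounds=1):
    """Yield (round_number, speaker) pairs so the moderator can invite
    each participant to raise issues in turn -- no one is silently skipped."""
    n = len(participants)
    for turn in range(rounds * n):
        yield turn // n + 1, participants[turn % n]

# The moderator would simply walk this list aloud during the call:
agenda = list(round_robin_turns(["Ana", "Ben", "Chen"], rounds=2))
```

A fixed speaking order sacrifices a little spontaneity, but over the telephone it is the most reliable way to detect that a remote participant has gone quiet.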
My colleague Chris used a moderator at each of the three locations participating in a series of conference-call review meetings. During the meeting, each moderator facilitated participation by the team members present in his room. The moderators conferred before and after each session to discuss which aspects worked well and which did not.
I know one moderator who uses a whistle when leading audioconference reviews. A short toot gains the attention of participants who can’t see when the moderator is trying to break into the discussion. Another moderator has used the dialing beeps on the telephone as an attention-getter. A simple tone sequence such as the opening notes of Beethoven’s Fifth Symphony (dial 3-3-3-7) is easily recognized.
Videoconferencing addresses some of the challenges of conference-call review meetings. However, the time lag in videoconference equipment can be distracting and makes it easy for multiple participants to begin speaking simultaneously. Then they all stop speaking, and the cycle begins anew. During a videoconference review, the moderator can hold up a colored piece of paper or wave a flag when he needs to get the group’s attention.
Distributed reviews benefit from Internet-based collaboration tools: Visit Coworking.com for some examples. Some of these tools display the product being reviewed in a browser-like display so all participants see the same image. The recorder captures items in an online issue log as the reviewers bring them up, perhaps displaying the log in the browser for remote participants to view. Hyperlinks between the product under review and supporting documents permit easy and convenient navigation during the distributed discussion. Some studies of such collaborative review approaches indicate that they can be as effective as face-to-face meetings (Vahid Mashayekhi et al., "Distributed, Collaborative Software Inspection," IEEE Software, September 1993).
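The shared issue log at the heart of these tools can be as simple as a list of structured records. The field names below are assumptions for illustration, not any particular tool's schema:

```python
from dataclasses import dataclass, field

@dataclass
class Issue:
    reporter: str      # who raised the issue
    location: str      # e.g., a section or line reference in the work product
    description: str
    status: str = "open"

@dataclass
class IssueLog:
    issues: list = field(default_factory=list)

    def record(self, reporter, location, description):
        """The recorder captures an issue as reviewers bring it up."""
        issue = Issue(reporter, location, description)
        self.issues.append(issue)
        return issue

    def open_issues(self):
        """What the author must still resolve after the meeting."""
        return [i for i in self.issues if i.status == "open"]
```

Displaying a log like this in every participant's browser keeps remote reviewers synchronized on exactly what has been raised so far.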
Asynchronous Review
If your reviewers can participate only at different times and places, or even at different times in the same location, use asynchronous review approaches. The simplest such method is a peer deskcheck, in which the author asks one colleague to look at a work product. A peer deskcheck depends entirely on the single reviewer’s knowledge, skill, and self-discipline, so expect wide variability in the results. A passaround is a multiple, concurrent peer deskcheck, with several reviewers invited to provide input. As an alternative to distributing physical copies of the document, you can place an electronic copy in a shared file. Reviewers can provide their feedback in the form of document annotations, such as Microsoft Word comments or PDF notes.
Asynchronous reviews address some of the potential shortcomings of traditional peer reviews. These include insufficient preparation prior to the meeting, personality conflicts, and meetings that segue into problem solving or veer off on other tangents. The author should expect to spend some time following up on comments made by specific reviewers. He can do this face-to-face if geography permits or by telephone if it does not.
Asynchronous reviews have their own shortcomings. Because participants contribute input over a period of time, asynchronous reviews can take several days to complete. Some volunteers won’t find the time or motivation to contribute to an asynchronous review. In addition, asynchronous reviews lack the physical meeting that focuses the participants’ attention and stimulates the synergy that enhances defect discovery. Some people don’t bother to contribute when they see that someone else has already responded. The initial contributors to the discussion can set its direction if their comments are visible to all participants from the beginning.
Philip Johnson and his colleagues developed the Collaborative Software Review System (CSRS), available under the GNU General Public License (see Johnson’s "Design for Instrumentation: High Quality Measurement of Formal Technical Review," Software Quality Journal, March 1996). Used in conjunction with a review approach called FTArm (Formal, Technical, Asynchronous Review Method), CSRS first allows reviewers to raise private issues about the item being reviewed. Next, the tool permits them to view, respond to, and vote on issues and action proposals contributed by other reviewers. Tools such as CSRS capture more details of discussions and the thought process behind them than a recorder can note during a traditional fast-moving review meeting.
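The two-phase flow described above — private issue raising, then publication and voting — can be sketched in a few lines. This is not CSRS's actual interface; the class and method names are illustrative only:

```python
from collections import defaultdict

class AsyncReview:
    """Sketch of an FTArm-style flow: reviewers raise issues privately,
    then all issues are published together for discussion and voting.
    (Illustrative names only -- not CSRS's real API.)"""

    def __init__(self):
        self.private = defaultdict(list)  # reviewer -> unpublished issues
        self.published = []
        self.votes = defaultdict(int)     # issue -> net agreement votes

    def raise_issue(self, reviewer, issue):
        """Phase 1: issues stay private, so early comments can't steer the group."""
        self.private[reviewer].append(issue)

    def publish(self):
        """Phase 2: reveal everyone's issues at once for review and voting."""
        for issues in self.private.values():
            self.published.extend(issues)
        self.private.clear()

    def vote(self, issue, agree=True):
        """Reviewers vote on published issues; net votes rank them."""
        if issue in self.published:
            self.votes[issue] += 1 if agree else -1
```

Keeping issues private until publication counters one of the asynchronous shortcomings noted earlier: early contributors setting the direction of the discussion.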
Engaging review participants in different locations or at different times is challenging. However, the benefits that distributed and asynchronous peer reviews provide to collaborative software projects make them worth trying when the reviewers cannot meet in person.