The Index Project
Index Award 2005 Finalists
InterPlay is an interactive evaluation system that enables spectators to participate in sports and games where judging is a critical factor. Using a simple hand-held scoring device and expert feedback relayed over public video monitors, the system stimulates involvement, learning and an appreciation for the athletic skills required to compete.

Functionality and use of design
InterPlay involves spectators in sport evaluation. In the time between viewing an event and official scoring, spectators transmit an initial score, view a composite spectator voting result, review expert commentary on freeze-frame images of the action, submit a final score, and compare the final spectator opinion with the official decision.

How did this design improve life?
As sports have become increasingly professionalized, the gap has widened between those who actually play and those who view from afar. Legions of couch potatoes argue sports in front of the TV set, but have no real interaction with the athletes, games or sports they watch.

The InterPlay system transforms the spectator participation experience from passive to active. The experience created is one of working together with those who manage the event -- referees, umpires, judges -- to form a collective opinion about what has transpired. The system uses iterative, visual feedback loops (based loosely on the Delphi technique for group decision making) to encourage consensus building through participatory learning about the rules of the sport, the skills of the athletes, and the actions of the competitors themselves. Explained here in the context of use for a sporting event, the system's procedures, mechanisms and concepts are valid for a much wider range of opinion generating, advice giving and decision-making activities.

InterPlay allows spectators to feel like they too are players -- part of the action in a way simultaneously both intimate and public.

Audience members participate in a continuous cycle of observing, judging and learning that forms an active information conduit between audience and event. The goal is a sophisticated, direct association with the event, using communication technologies precisely to strengthen perception, enhance learning, and establish connection.

Using a feedback-driven process, spectators take part in a form of group decision-making as a competition moves forward. For example, when InterPlay is used to judge diving at a swimming meet, a call for spectators to score a diver is issued before the dive takes place. When the dive is completed, scoring results are accumulated electronically and shown on monitors throughout the venue as the spectators' first judgment. While spectators are considering their composite judgment and breakdowns of it by country and other categories, professional insights about the dive, made by unofficial but highly knowledgeable experts appointed specifically for the task, are shown as visual annotations of freeze-frame images of the dive -- much as is done with "telestrators" to analyze instant replays in team sports. Spectators then submit final scores, having been able to compare their first opinions with those of the entire audience and the experts. Subtlety and nuance are learned over time. Contrasts between what audience members think and what the experts think -- as expressed by the experts' insights and the judges' final scoring -- provide the basis for continuous incremental learning over the course of a competition.
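The two-round flow described above can be sketched in miniature. The aggregation rule and the sample scores below are assumptions for illustration (drop-high/drop-low averaging in the style of diving judging); the text does not specify how InterPlay actually computes the composite spectator result:

```python
from statistics import mean

def composite(scores):
    """Trimmed composite in the style of diving judging: drop the
    single highest and lowest scores, then average the rest.
    (An assumed rule -- the actual InterPlay aggregation is not
    specified in the source.)"""
    if len(scores) <= 2:
        return mean(scores)
    trimmed = sorted(scores)[1:-1]
    return round(mean(trimmed), 2)

# Round 1: spectators transmit initial scores (0.0-10.0, hypothetical data).
initial = [7.0, 8.5, 6.5, 9.0, 7.5, 8.0]
first_judgment = composite(initial)   # shown on venue monitors

# Experts annotate freeze-frame images; spectators revise and resubmit.
revised = [7.5, 8.0, 7.0, 8.5, 8.0, 8.0]
final_judgment = composite(revised)

# Compare the final spectator opinion with the official decision.
official = 7.5
print(f"spectators: {first_judgment} -> {final_judgment}, official: {official}")
```

The two calls to `composite` correspond to the "first judgment" and "final score" steps; the gap between `final_judgment` and `official` is what the text describes as the basis for incremental learning over a competition.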

The process externalizes the drama of sports and play while providing a glimpse into the collective mind of an audience that can be narrowly homogeneous or widely diverse, highly knowledgeable or procedurally naive, internationally tolerant or regionally parochial. As a subtle, but important benefit of the evaluation process, InterPlay also provides an intrinsic check on slanted subjectivity or favoritism by judges, the source of major scandals at the Olympic games of recent years.

Designed originally for use in closed Olympic venues where events such as diving, figure skating, gymnastics and other sports are judged subjectively, InterPlay, from the beginning, was conceived for applications on a larger scale. The closed venue is artificially limiting. Large regional, national and international audiences follow teams in many sports, and the broadcast media bring those games and competitions into homes around the world in real time. These audiences are potentially active spectators. The same systems and procedures described for InterPlay can function across any distance through cable and satellite communications. "Instant replay" already provides expert annotation of events in many sports; spectator participation can quickly follow.

The InterPlay system incorporates concepts borrowed consciously from fields where group decision-making is important. In reciprocation, it also can contribute to other fields -- fields as diverse as participatory polling, interactive entertainment, market research and urban planning. Use within these fields would extend concepts for collecting opinions, providing feedback, and building informed consensus. Long-term prospects for a mature process enriched with additional capabilities may well include strategic planning at a political level and even direct political decision support.

Drawbacks of life improvement
InterPlay is a system, and as such, has within it all the inherent obstacles to implementation to be expected when people and equipment must work together in close coordination. It also must bridge cultures and levels of knowledge, often greater difficulties to overcome. Some of these potential obstacles and drawbacks are discussed below.

Learning versus Popularity
Evaluation is a refined process. Judges are chosen for their expert knowledge and ability to apply it. How can lay persons be expected to make evaluations anywhere near the level of sophistication of an experienced judge? A mass evaluation process could simply become a popularity contest.

Comment. Evaluation is also an essential component of the learning process. Mass audience judging will include judgments by highly sophisticated amateurs as well as first-time viewers. The iterative process incorporated in InterPlay will offer visual insights through annotations of video images by highly trained experts as well as summations and distributions of audience evaluations. Because the process is iterative, it will painlessly prod audience evaluators to learn and rethink their votes based on increasingly more refined information. The result, as a background response, will reveal audience opinion and biases -- as well as learning -- to all, but it will not be an official result unless ground rules of the event are changed to require it. The benefits of audience learning and involvement far outweigh the negative potential for a popularity referendum.

Integrity versus Entertainment
Sports that are subjectively judged (for example: diving, figure skating, gymnastics) usually have technical as well as aesthetic components. Audiences instinctively respond to the aesthetic aspects, often not even recognizing technical achievements. What is to prevent audience participation from turning sporting events into simplistic entertainment?

Comment. Because actual scoring will remain in the control of judges, technical quality will always be evaluated, and will be accorded the most expert judgment. Audience scoring may be expected naturally to reflect the aesthetic qualities of performances, but should also improve technically over each scoring opportunity, and should show notably increased sophistication over the course of an entire event. The iterative scoring process with its presentation of experts' visual insights and changing distributions of audience votes will influence audience learning as an event progresses. Audience participation will have an entertainment value, but technical competence will be preserved and entertainment should progress toward informed entertainment.

Cost versus Practicality
For closed venues, the cost of the system must be borne by the revenue produced by ticket sales and external resources provided by sponsors. Other than for venues like the Olympics, is the system only practical where outside audiences can participate and television revenues are a part of the income?

Comment. The InterPlay system actually is relatively inexpensive. Existing large screen monitors in most arenas are suitable for the presentation of the feedback showing expert insights and the results of spectator voting. Display screens today can also be rented, if necessary, for approximately $2,500 per day. The hand-held scoring device proposed is quite simple in design and has been priced at approximately $2.00 per unit in the numbers normal for Olympic events. Its cost could be included in the ticket price for any single event, and it can be used for multiple events. Base stations for relaying scores from spectators' wireless scoring devices are estimated to cost $1,200 per station today. Four to fourteen such stations would be necessary, depending on the size of the venue. Overall estimates, depending on the availability of existing displays and whether the cost of scoring devices would be passed on to spectators, show the system could be implemented for Olympic-sized games for as low as $21,500 to as high as $700,000.
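The quoted unit prices lend themselves to a rough parametric check. In the sketch below the unit costs come from the text, but every quantity (station count, rental days, device volume) is a hypothetical assumption chosen for illustration, not a reconstruction of the official estimates:

```python
def venue_cost(stations, displays_rented=0, days=1,
               devices=0, device_cost_to_venue=True):
    """Rough cost model using the unit figures quoted in the text:
    base stations ~$1,200 each, rented displays ~$2,500 per day,
    scoring devices ~$2.00 per unit. All quantities are assumptions."""
    cost = stations * 1200
    cost += displays_rented * 2500 * days
    if device_cost_to_venue:
        cost += devices * 2.00
    return cost

# Hypothetical low end: small venue, existing displays, device
# cost passed on to spectators through the ticket price.
low = venue_cost(stations=4, devices=15000, device_cost_to_venue=False)

# Hypothetical high end: large venue, rented displays over a
# multi-week event, venue absorbing the device costs.
high = venue_cost(stations=14, displays_rented=12, days=16,
                  devices=100000, device_cost_to_venue=True)
print(low, high)
```

With these invented quantities the model spans roughly the same order of magnitude as the range quoted in the text, and it makes visible that display rental and device volume, not base stations, dominate the high end.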

Research and need
The InterPlay system was created using Structured Planning methodology, user observation, and human factors testing. For a full description of Structured Planning, see papers by Charles L. Owen. The project proceeded as follows.

Project Definition
The project began with a Charter, a "brief" given to the five-person design and planning team. Using that, the team developed "White paper" Defining Statements from a study of issues affecting policy. Positions arrived at on these issues established how the team would proceed (for example, regarding favoritism, bias, scale, cost). With the Charter, the Defining Statements focused the direction of the project.

Action Analysis
Interviews with experts and videotapes of events were used to identify users, their roles and functions. This information was incorporated in a "Function Structure" that showed major modes of projected system operation, activities to occur within them, and the Functions to be performed. A concerted research effort was also devoted to uncovering problems that occur in analogous systems as functions are performed -- why things go wrong performing some functions, and why other functions are performed well. These insights were explored in detail and written up in documents called Design Factors. Activity analyses recorded information about users and Functions; Design Factors documented insights and suggested ideas.

Ideas of all kinds were recorded with key details on single-page Solution Element forms. Overall, three sets of critical information were the research product at this stage: a set of Functions the system must perform (organized in a Function Structure), a set of insights about them (Design Factors), and a set of hundreds of preliminary ideas (Solution Elements).

Information Structuring
A wealth of information creates its own problem -- how to organize it. Categorical lists, good products of analysis, are ill suited to the creative possibilities of synthesis. Whether two Functions should be considered together should depend not on whether they are categorically "related", but on whether a significant number of potential system solutions are of concern to both. Associations between solutions and Functions were established with an interaction procedure, and a computer program (RELATN) used the information to establish links between Functions. Another program, VTCON, then found clusters of highly interlinked Functions within this network and organized them into a hierarchical Information Structure, an optimized "road map" for innovative planning.
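The linking-and-clustering step can be illustrated with a toy model. Every Function and Solution Element name below is invented for illustration, and simple connected components over strong links stand in for what RELATN's association analysis and VTCON's hierarchical clustering actually do:

```python
from itertools import combinations

# Hypothetical associations: which candidate Solution Elements
# are of concern to which Functions (all names invented).
concerns = {
    "capture score":    {"handset", "wireless relay", "vote buffer"},
    "relay votes":      {"wireless relay", "vote buffer", "base station"},
    "display results":  {"venue monitor", "vote buffer"},
    "annotate replay":  {"venue monitor", "telestrator"},
    "show annotations": {"venue monitor", "telestrator"},
}

THRESHOLD = 2  # link two Functions sharing at least this many solutions

# Establish links between Functions (the role RELATN played).
links = {f: set() for f in concerns}
for a, b in combinations(concerns, 2):
    if len(concerns[a] & concerns[b]) >= THRESHOLD:
        links[a].add(b)
        links[b].add(a)

# Group highly interlinked Functions into clusters via connected
# components (a crude stand-in for VTCON's hierarchical clustering).
clusters, seen = [], set()
for start in links:
    if start in seen:
        continue
    stack, group = [start], set()
    while stack:
        f = stack.pop()
        if f in group:
            continue
        group.add(f)
        stack.extend(links[f] - group)
    seen |= group
    clusters.append(sorted(group))
print(clusters)
```

The point of the exercise is the one made in the text: "capture score" and "relay votes" end up clustered because they share solutions, not because they belong to the same analytical category.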

Working with the Information Structure as a guide, the team used a structured brainstorming technique called Ends/Means Synthesis to select, modify or invent final solutions suitable to the needs of the Functions now associated in clusters. Original Solution Elements were used as-is where appropriate. Others were modified to reflect new possibilities revealed by the associations of the clustered Functions. New ideas were developed to fill remaining needs.

To check for coverage, features of final ideas, now System Elements, were evaluated for their fulfillment of Functions. System Elements were also considered with each other in a search for additional synergies. Where new ways to work together were found, properties and features were refined. As final documents, System Elements were written up with succinct descriptions, relevant required properties and features, and extensive discussions and scenarios to explain the ideas fully, both conceptually and operationally.

Human Factors
InterPlay is a system designed for people. Among the many "people problems" it had to solve was communicating new behaviors to an international audience using as few words as possible, to avoid potential language problems and minimize bias. Concepts for this were tested at the detail level using prototyping methods.

Paper prototyping tested screen designs for the effective communication of scoring, over multiple iterations uncovering cognitive roadblocks encountered by international subjects. Animations of the screens simulated timing and content to be communicated within a 50-second constraint. Testing revealed that the process could be speeded up and, at the faster speed, the "feel" of the tasks became more sports-like and exciting to subjects -- an unexpected quality improvement for the system.

Rapid prototyping was used to develop a way for subjects to score athletes without being distracted by having to look at the scoring device. Controls and display also evolved under prototyping to a universal hand size and an appearance promoting the look and feel of sports.

The cumulative result of all research was a presentation and a report.

Integrating the systematic methods of Structured Planning with the grounded theory of user research allowed the team to conduct research and smoothly convert insights into ideas for a complex system product. The result is a low-cost, re-deployable, adaptive system that redefines the spectator experience while reinforcing existing social qualities and the excitement of sports culture. Through InterPlay, audiences can gain deeper appreciation of sports as well as a better understanding of the complexities of bias, sportsmanship, competition and fairness.

Designed by
Elizabeth Akers, Glenn Steinberg, Jenny Fan, Ric Edinberg & Yi Leng Lee (United States), 2003