12A.1
Crowdsourcing Competitions: Are They Applicable across the Weather Enterprise?

Thursday, 8 January 2015: 8:30 AM
132AB (Phoenix Convention Center - West and North Buildings)
Troy M. Anselmo, SAIC, Newport News, VA; and J. B. Frenzer

SAIC is assessing the feasibility and applicability of the crowdsourcing competition process to solve challenges in moving new capabilities and new research into NOAA's operational environment. “Crowdsourcing is ideally positioned for experimental work on systems of innovation because you can rapidly hire cutting-edge technical skills on an elastic basis,” states Eric Knipp, vice president of application platform strategies at Gartner, in his March 31, 2014 report, “Use Crowdsourcing as a Force Multiplier in Application Development.” His report also notes that crowdsourcing is particularly suitable for idea generation and for small, easy-to-scope tasks (or projects that can be broken down into small tasks). With crowdsourcing contests, the contest sponsors get the luxury of choosing among several winning solutions and can even combine multiple winners into an innovative new approach. Contest sponsors also benefit from built-in cost controls because they spend only what they plan upfront for the contest, similar to a fixed-price contract. Given the reported effectiveness of crowdsourcing for commercial enterprise applications, SAIC will test its effectiveness for weather-related applications.

SAIC will conduct this crowdsourcing experiment using a competition-oriented development platform from Appirio. As noted by Gartner, it is important for contest sponsors to use a crowdsourcing platform to create an arm's-length relationship with the contest competitors. This preserves the integrity of the contests and the reputations of the sponsors and contestants. Communication between the sponsor and contestants on specifications and requirements is conducted through the development platform. SAIC will evaluate the crowdsourcing competition process by sponsoring at least one competition targeted at college students. SAIC will fund the Appirio platform fees and the prizes for the contest winners. The student contest winners will be announced at the 2015 AMS Annual Meeting in Phoenix, AZ. The SAIC-sponsored contest(s) will involve improvements to weather-related algorithms or will target improvements in the use of big data analysis for predictive systems.

Appirio's crowdsourcing development platform, called [topcoder]™, is typically used to conduct competitions with a worldwide community of over 600,000 designers, developers, and data science experts. However, the platform can also target contests to specific communities defined by the contest sponsor. Appirio's clients benefit as competitions generate hundreds of ideas, the very best of which can be converted into working prototypes and community-tested operational applications. Multiple experts compete to solve each problem, with only the top solutions (as judged against very specific criteria) receiving rewards. These competitions provide Appirio's clients with a highly competitive, results-oriented, elastic workforce. Crowdsourcing competitions cover the entire software development lifecycle and are held to:
• create architectures, designs, and applications
• provide testing
• develop or improve algorithms
• use data science and big data analysis
• perform predictive analytics

Competitions are generally scaled to atomic pieces that can be solved within a few weeks. Appirio's clients design a set of competitions to solve their specific needs, whether creating a new system from scratch or solving a very narrow need that requires specific technologies, skills, or innovative approaches. One example of crowdsourcing contests in use is the NASA Tournament Lab, a collaboration between NASA and Harvard, which hosts NASA's Asteroid Grand Challenge Series. The goal of this challenge is to identify every asteroid that could threaten Earth and determine what action might be taken.

As the Gartner report points out, there can be challenges in conducting crowdsourcing competitions. These include:
• Constraining the scope of the individual contests to simplify the specifications and allow contestants enough response time
• Responding quickly to inquiries and questions to keep contestants engaged and on track
• Picking a crowdsourcing platform that can help guide sponsors on designing effective contests, evaluation criteria, and rewards

Following the SAIC-sponsored contest(s), we will report on the overall experience and effectiveness of the process and development platform. This will include an evaluation of the volume of contestant responses, the scope of the contest(s), the effectiveness of the specifications and requirements, the efficacy of the evaluation criteria, the overall quality of the winning responses, the quality and volume of contestant questions, and other lessons learned.