6.2 Using Cameras as Weather Sensors—Deriving Weather Data from Images

Tuesday, 8 January 2019: 1:45 PM
North 224B (Phoenix Convention Center - West and North Buildings)
Randall Bass, FAA, Washington, DC; and J. A. Colavito, G. Pokodner, M. Matthews, K. Kronfeld, and M. Pirone

“A picture is worth a thousand words” is especially true of weather conditions. Anyone reading this abstract has probably looked at a photograph and, rather than studying its primary subject, examined the background to see what the weather was like. With modern technology, imagery can now be used to derive quantifiable weather data that may support decision making or even be integrated into weather prediction models.

The Federal Aviation Administration’s (FAA) Aviation Weather Branch has undertaken several initiatives to obtain weather data from camera images. In Alaska, weather cameras provide valuable information to pilots in areas where traditional weather data are not available, such as mountain passes and remote airports. The images contain a wealth of information, but pilots and meteorologists must manually assess each one to integrate that information into flight planning decisions or weather products. This is a high-workload cognitive task and becomes quite time consuming when multiplied by the number of camera sites in Alaska. The FAA is therefore sponsoring research to use automation and commercial crowd-sourcing techniques to extract data from the cameras, in effect transforming weather cameras into weather sensors.

Visibility is one parameter that can be derived from weather cameras. One technique for estimating visibility is automated edge detection: the edge strength of an image is assessed and compared to a historical clear-day edge strength for that site. The edge detection technique has shown positive results in identifying low-visibility events and changing conditions. Another technique is crowd sourcing, in which human workers estimate the visibility apparent in an image using an annotated clear-day image for reference. Once a critical number of people weigh in, the system converges on a crowd solution. Testing of this capability has also yielded positive results. To take advantage of the strengths of each approach, the FAA has developed an initial prototype that combines the edge detection and crowd-sourcing techniques into a human-machine hybrid system. The hybrid will be tested for the first time in the summer of 2018.
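The edge-strength comparison described above can be sketched as follows. This is a minimal illustration rather than the FAA's actual algorithm: it assumes grayscale images supplied as NumPy arrays, uses a Sobel gradient magnitude as the edge-strength measure, and treats the ratio to the clear-day baseline as a crude visibility index.

```python
import numpy as np
from scipy import ndimage


def edge_strength(image: np.ndarray) -> float:
    """Mean gradient magnitude of a grayscale image (Sobel filters)."""
    img = image.astype(float)
    gx = ndimage.sobel(img, axis=1)  # horizontal gradient
    gy = ndimage.sobel(img, axis=0)  # vertical gradient
    return float(np.mean(np.hypot(gx, gy)))


def visibility_index(current: np.ndarray, clear_day: np.ndarray) -> float:
    """Ratio of current edge strength to the clear-day baseline for the
    same camera view, clipped to [0, 1]. Values near 1 suggest clear
    conditions; low values suggest fog or other obscuration.
    (Illustrative proxy only, not a calibrated visibility distance.)"""
    baseline = edge_strength(clear_day)
    if baseline == 0.0:
        return 0.0
    return float(np.clip(edge_strength(current) / baseline, 0.0, 1.0))
```

On a synthetic test, a heavily blurred copy of a sharp scene (standing in for a foggy image) scores well below the clear-day reference, which is the behavior the edge-detection technique exploits.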

A novel concept that has been tested is remotely determining wind speed and direction by pointing a camera at a windsock. The FAA sponsored research employing machine learning techniques to assess the feasibility and accuracy of a camera-based wind sensor. The results were exceedingly promising and yielded several recommendations for optimal camera placement relative to the windsock.
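To illustrate the geometry involved, the orientation of a windsock in an image can be estimated from the principal axis of its pixels. This is a simple geometric baseline sketched for illustration, not the machine-learning approach used in the FAA-sponsored research; it assumes the windsock has already been segmented from the scene into a binary mask.

```python
import numpy as np


def windsock_orientation(mask: np.ndarray) -> float:
    """Estimate the in-image orientation (degrees from the x-axis,
    folded into [0, 180)) of a segmented windsock, using the principal
    axis of its pixel coordinates. Illustrative baseline only."""
    ys, xs = np.nonzero(mask)
    coords = np.column_stack((xs, ys)).astype(float)
    coords -= coords.mean(axis=0)  # center the point cloud
    # The eigenvector with the largest eigenvalue of the covariance
    # matrix points along the windsock's long axis.
    eigvals, eigvecs = np.linalg.eigh(np.cov(coords, rowvar=False))
    vx, vy = eigvecs[:, np.argmax(eigvals)]
    return float(np.degrees(np.arctan2(vy, vx)) % 180.0)
```

Mapping this image-plane angle to a true wind direction (and inferring speed from how fully the sock is extended) depends on the camera's position relative to the windsock, which is why the research returned recommendations on optimal camera placement.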

This talk will provide an overview of these research initiatives to derive weather data from cameras and present the preliminary results of initial testing.
