The Federal Aviation Administration’s (FAA) Aviation Weather Branch has undertaken several initiatives to derive weather data from camera images. In Alaska, weather cameras provide valuable information to pilots in areas where traditional weather observations are not available, such as mountain passes and remote airports. The images contain a wealth of information, but pilots and meteorologists must manually assess each one to integrate that information into flight planning decisions or weather products. This is a high-cognitive-workload task and becomes quite time consuming when multiplied by the number of camera sites in Alaska. The FAA is therefore sponsoring research to use automation and commercial crowdsourcing techniques to extract data from the cameras, in effect transforming weather cameras into weather sensors.
Visibility is one parameter that can be derived from weather cameras. One technique used to estimate visibility is automated edge detection. In this technique, the edge strength of an image is assessed and compared to a historical clear-day edge strength for that site. The edge detection technique has shown positive results in identifying low-visibility events and changing conditions. Another technique used to estimate visibility is crowdsourcing. In this approach, human workers are asked to estimate the visibility apparent in an image, using an annotated clear-day image for reference. Once a critical number of workers weigh in, the system converges on a crowd solution. Testing of this capability has also produced positive results. To take advantage of the strengths of each approach, the FAA has developed an initial prototype that combines the edge detection and crowdsourcing techniques into a human-machine hybrid system. The hybrid will be tested for the first time in the summer of 2018.
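To make the two techniques concrete, the following is a minimal sketch of how an edge-strength comparison and a crowd estimate might be computed. It is not the FAA prototype: the use of OpenCV, the Sobel-based edge score, the threshold values, and the median aggregation of worker responses are all illustrative assumptions.

```python
# Illustrative sketch only; thresholds, file names, and aggregation rule are assumptions,
# not the FAA system's actual parameters.
import cv2
import numpy as np

def edge_strength(image_path: str) -> float:
    """Return a scalar edge-strength score: mean Sobel gradient magnitude."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    return float(np.mean(np.hypot(gx, gy)))

def edge_visibility_category(current_img: str, clear_day_img: str) -> str:
    """Compare the current image's edge strength to the clear-day baseline for the same site."""
    ratio = edge_strength(current_img) / edge_strength(clear_day_img)
    # Placeholder thresholds: washed-out edges (low ratio) suggest low visibility.
    if ratio < 0.3:
        return "low visibility"
    elif ratio < 0.7:
        return "reduced visibility"
    return "near clear-day visibility"

def crowd_visibility_miles(worker_estimates_miles: list[float]) -> float:
    """One simple way to converge on a crowd solution: take the median worker estimate."""
    return float(np.median(worker_estimates_miles))

print(edge_visibility_category("site_current.jpg", "site_clear_day.jpg"))
print(crowd_visibility_miles([2.0, 3.0, 2.5, 2.0, 4.0]))
```

A hybrid system could, for example, use the automated edge score to flag images for human review and then weigh the two estimates against each other, although the fusion logic in the FAA prototype is not detailed here.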
A novel concept that has also been tested is remotely determining wind speed and direction by pointing a camera at a windsock. The FAA sponsored research that applied machine learning techniques to assess the feasibility and accuracy of a camera-based wind sensor. The results were exceedingly promising and yielded several recommendations for optimal camera placement relative to the windsock.
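The study's actual model and features are not described here, but a generic supervised-learning sketch conveys the idea: a small network regresses wind speed and direction from a cropped windsock image, with direction predicted as sine and cosine to handle its circular nature. The PyTorch architecture, input size, and outputs below are assumptions for illustration only, and the model would need to be trained against co-located anemometer readings before its outputs mean anything.

```python
# Hypothetical sketch of a camera-based wind sensor model; not the FAA-sponsored study's method.
import torch
import torch.nn as nn

class WindsockNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        # Outputs: [wind speed, sin(direction), cos(direction)]
        self.head = nn.Linear(32, 3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.head(self.features(x).flatten(1))

model = WindsockNet()
dummy_crop = torch.randn(1, 3, 128, 128)  # windsock region cropped from a camera frame
speed, sin_d, cos_d = model(dummy_crop)[0]
direction_deg = torch.rad2deg(torch.atan2(sin_d, cos_d)) % 360
# Untrained weights, so these numbers are meaningless; shown only to illustrate the output format.
print(f"estimated speed: {speed.item():.1f}, direction: {direction_deg.item():.0f} deg")
```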
This talk will provide an overview of these research initiatives to derive weather data from cameras and present the preliminary results of initial testing.