When we feel ourselves coming down with something, we look it up. If you type the words “I think I’m getting” into Google’s search box, the suggested next-word options are “sick,” “the flu,” and “a cold,” though these suggestions and their order can vary slightly. In 2008, Google put that search data to use with Google Flu Trends, which maps where clusters of influenza-related search activity are happening. Flu Trends can spot outbreaks one or two weeks before official Centers for Disease Control and Prevention data simply because people tend to turn to the web before a doctor. Earlier this year, Google used the same method to launch Dengue Trends.

Crowdsourcing, whether through analyzing search data or collecting bits of information from many individuals, has become an incredibly useful and eclectic tool for public health reporting. It helps reporters spot trends, validate theories, and find stories that might otherwise be missed. The method is not flawless or even rigorously scientific, of course. For example, a couple of blogs have used Google analyses to show where “bed bugs” searches have spiked regionally, but those spikes haven’t proved to be reliable predictors of actual infestations. Still, it’s another tool in the public health reporter’s toolbox.

Looking not only at where people are searching for terms, but at how the searches are framed, can also generate insights. “How are Americans responding to their weight? The answer to this question may be found by analyzing Google’s search data,” Business Insider’s Greg Voakes recently posited. His analysis revealed that more than 50 percent of weight-related keywords referred to losing weight “fast” or “quick,” and he cautioned readers about the risks of crash dieting.
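For readers curious how a keyword-share analysis like that works in practice, a minimal sketch is below. It is not Voakes’s actual method; the phrases and monthly search volumes are invented, standing in for an export from a keyword-research tool.

```python
# A rough illustration of the kind of keyword-share analysis described above.
# The phrases and monthly search volumes are hypothetical.

weight_keywords = {
    "how to lose weight fast": 74000,
    "lose weight quick": 18000,
    "healthy weight loss plan": 22000,
    "how to gain weight": 15000,
    "lose belly fat fast": 33000,
}

urgent_terms = ("fast", "quick")

total_volume = sum(weight_keywords.values())
urgent_volume = sum(
    volume
    for phrase, volume in weight_keywords.items()
    if any(term in phrase for term in urgent_terms)
)

print(f"{urgent_volume / total_volume:.0%} of search volume mentions 'fast' or 'quick'")
```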

Other services, like Sickweather, use keyword data to track illnesses. Mining social media outlets like Facebook and Twitter, the company reportedly had encouraging results when the frequency of the word “cough” spiked in Algonquin, Illinois, and Milwaukee, Wisconsin, one month before whooping cough outbreaks hit both areas this fall. But the site has had to be careful with data collection. Use of the word “fever” had to be separated from any post that also contained the word “Bieber,” to distinguish actual fevers from those induced by the teen idol.
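The “Bieber” rule is, at bottom, keyword filtering with exclusion terms. The toy sketch below illustrates that idea; it is not Sickweather’s code, and the sample posts are made up, where a real system would pull posts from social media APIs and track location and time as well.

```python
# A toy illustration of symptom-keyword filtering with exclusion terms,
# in the spirit of the "fever"-minus-"Bieber" rule described above.
# The posts are invented for the example.

posts = [
    "Ugh, this cough will not go away",
    "I have Bieber fever after that concert!!",
    "Running a fever, staying home today",
    "Cough syrup does nothing, send help",
]

symptom_terms = {"cough", "fever"}
exclusion_terms = {"bieber"}  # words that signal a non-medical use

def is_symptom_post(text):
    # Lowercase, strip simple punctuation, and compare word sets.
    words = set(text.lower().replace(",", " ").replace("!", " ").split())
    return bool(words & symptom_terms) and not (words & exclusion_terms)

flagged = [p for p in posts if is_symptom_post(p)]
print(len(flagged), "posts flagged as possible symptom reports")
```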

Traditional methods of crowdsourcing, which require active collaboration, have been used to report on public health issues as well, particularly in cases of environmental dangers. Following the disaster at Japan’s Fukushima Daiichi nuclear plant in March, there was a demand for information about where dispersed pockets of radiation were located. Official information was scarce and slow to be released. A group called Safecast emerged to fill the void, handing out Geiger counters to Japanese citizens so they could measure radiation levels and submit them to a website. Safecast then used the data to construct a map of radiation levels across the country.
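One simple way to turn scattered, GPS-tagged readings into a map is to bin them into grid cells and average each cell. The sketch below shows that approach under those assumptions; it is not Safecast’s actual pipeline, the readings are invented, and a real map would use far finer cells and many more measurements.

```python
# A simplified sketch of aggregating citizen-submitted radiation readings
# into coarse map-grid cells. Readings are hypothetical.

from collections import defaultdict

# (latitude, longitude, microsieverts per hour) -- invented submissions
readings = [
    (37.42, 141.03, 1.8),
    (37.45, 141.01, 2.1),
    (37.76, 140.47, 0.4),
    (35.68, 139.69, 0.1),
]

CELL = 0.5  # grid cell size in degrees

cells = defaultdict(list)
for lat, lon, dose in readings:
    # Snap each reading to the lower-left corner of its grid cell.
    key = (round(lat // CELL * CELL, 2), round(lon // CELL * CELL, 2))
    cells[key].append(dose)

for (lat, lon), doses in sorted(cells.items()):
    avg = sum(doses) / len(doses)
    print(f"cell ({lat}, {lon}): avg {avg:.2f} uSv/h from {len(doses)} reading(s)")
```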

Such techniques are equally useful at the community level. Take southwest Detroit, where asthma rates are three times those in the rest of Michigan. Residents had long blamed the semi-trucks that frequented neighborhood streets. In the spring of 2010, community members and employees from Public Radio International and WDET Detroit Public Radio distributed flyers asking the community to text the word “TRUCK,” along with their location, when they saw the vehicles on neighborhood streets. Thirty people sent a total of seventy texts, and some members of the community documented the trucks in more traditional ways; one married couple logged each time a truck came down their street, counting 470 passes over four months. Another woman showed a reporter hundreds of photos she’d taken.

Alysia Santo is a former assistant editor at CJR.