Pollution monitoring? There's an app for that.

USC lab releases smartphone app that measures particulate air pollution

‘Visibility’ now available for download; developers hope users can help fill in the many blanks in existing air quality maps

University of Southern California computer scientists have combined the sensing and communication resources of modern smartphones in a novel application that lets phone users help monitor air quality.

The application, provisionally titled “Visibility,” is available for download at http://robotics.usc.edu/~mobilesensing/Projects/AirVisibilityMonitoring

The researchers, from the USC Viterbi School of Engineering, hope that as many users as possible download and try it in order to improve the software. Currently, the download works for smartphones running the Android system and soon will be widely available on Android app sources. An iPhone app is in the works.

The basic principle of the Visibility app is simple, according to the paper documenting the work by USC computer science professor Gaurav Sukhatme.

The user takes a picture of the sky while the sun is shining, which can be compared to established models of sky luminance to estimate visibility. Visibility is directly related to the concentration of harmful “haze aerosols,” tiny particles from dust, engine exhaust, mining or other sources in the air. Such aerosols turn the blue of a sunlit clear sky gray.

There is one caveat: It has to be the right picture. The visibility/pollution models are based on the viewing geometry of the image and the position of the sun.
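One common way to make this geometry concrete is the CIE standard clear-sky luminance model, which predicts how bright a clear sky should appear in a given viewing direction relative to the sun. The sketch below uses the standard CIE clear-sky coefficients, but `haze_index` is only a toy illustration of the comparison step, not the app's actual visibility estimate:

```python
import math

def scattering_angle(view_zenith, view_azimuth, sun_zenith, sun_azimuth):
    """Angle between the viewing direction and the sun (radians)."""
    c = (math.cos(view_zenith) * math.cos(sun_zenith)
         + math.sin(view_zenith) * math.sin(sun_zenith)
           * math.cos(view_azimuth - sun_azimuth))
    return math.acos(max(-1.0, min(1.0, c)))  # clamp against float drift

def cie_clear_sky_luminance(view_zenith, gamma):
    """Relative luminance of the CIE standard clear sky (type 12 coefficients:
    a=-1, b=-0.32, c=10, d=-3, e=0.45) in a given viewing direction."""
    gradation = 1 - math.exp(-0.32 / math.cos(view_zenith))
    indicatrix = (1 + 10 * (math.exp(-3 * gamma) - math.exp(-3 * math.pi / 2))
                  + 0.45 * math.cos(gamma) ** 2)
    return gradation * indicatrix

def haze_index(measured_relative_luminance, view_zenith, gamma):
    """Toy haze score: relative departure of the measured sky brightness
    from the clear-sky prediction (0 = matches a perfectly clear sky)."""
    predicted = cie_clear_sky_luminance(view_zenith, gamma)
    return abs(measured_relative_luminance - predicted) / predicted
```

The model captures the circumsolar brightening of a clear sky: predicted luminance is highest for small scattering angles near the sun, which is why the image's viewing geometry and the sun's position both matter.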

The Visibility app works because modern smartphones contain a rich set of sensors that include cameras, GPS systems, compasses and accelerometers, in addition to the powerful communication capabilities that are inspiring a slew of intelligent phone applications ranging from personal health monitoring to gaming and social networking.

Sameera Poduri, a postdoctoral researcher in Sukhatme’s lab, explained that the accelerometer in the phone – the sensor that tells how the user is holding the phone, determining whether it displays information vertically or horizontally – can “guide the user to point the camera in exactly the right direction.”

The picture must be all or mostly sky, which makes human judgment critical. As the research paper noted: “Several computer vision problems that are extremely challenging to automate are trivially solved by a human. In our system, segmenting sky pixels in an arbitrary image is one such problem. When the user captures an image, we ask him [or her] to select a part of the image that is sky.”
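In code, this human-in-the-loop step can be as simple as averaging the pixels inside the region the user marked as sky. The sketch below uses hypothetical names (`mean_sky_luminance`, a rectangular `sky_box`) for illustration; the paper only says the user selects the sky portion of the image:

```python
def mean_sky_luminance(gray_image, sky_box):
    """Average brightness of the user-selected sky region.

    gray_image: 2-D list of grayscale pixel values (0-255)
    sky_box: (top, left, bottom, right) rectangle the user marked as sky
    """
    top, left, bottom, right = sky_box
    pixels = [gray_image[r][c]
              for r in range(top, bottom)
              for c in range(left, right)]
    return sum(pixels) / len(pixels)
```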

The accelerometers and the compass on the phone capture its position in three dimensions while the GPS data and time are used to compute the exact position of the sun. The application automatically computes the camera and solar orientation, uploading this data along with the image — a small (100KB) black-and-white file — to a central computer. The central computer analyzes the image to estimate pollutant content and returns a message to the user, as well as registering the information. (User identities are anonymized.)
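Computing the sun's position from GPS coordinates and a timestamp is standard spherical astronomy. Below is a rough, low-precision sketch (Cooper's declination approximation, no equation-of-time correction), accurate to roughly a degree, which is plenty for orienting a sky photo; it is not the app's actual implementation:

```python
import math
from datetime import datetime, timezone

def sun_position(lat_deg, lon_deg, when_utc):
    """Approximate solar elevation and azimuth (degrees) from GPS
    coordinates and a UTC timestamp. Low-precision formulas (~1 deg),
    fine for orienting a sky photo, not for ephemeris work."""
    day = when_utc.timetuple().tm_yday
    frac_hour = when_utc.hour + when_utc.minute / 60 + when_utc.second / 3600

    # Solar declination (Cooper's approximation)
    decl = math.radians(23.45) * math.sin(math.radians(360 * (284 + day) / 365))

    # Hour angle: zero at local solar noon (equation of time ignored)
    solar_time = frac_hour + lon_deg / 15
    hour_angle = math.radians(15 * (solar_time - 12))

    lat = math.radians(lat_deg)
    sin_elev = (math.sin(lat) * math.sin(decl)
                + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    elev = math.degrees(math.asin(sin_elev))

    cos_az = ((math.sin(decl) - math.sin(lat) * sin_elev)
              / (math.cos(lat) * math.cos(math.radians(elev))))
    az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if hour_angle > 0:  # afternoon: sun is west of south
        az = 360 - az
    return elev, az
```

Combined with the compass and accelerometer readings that fix the camera's pointing direction, this gives the viewing geometry the luminance model needs.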

The system potentially can help fill in the many blanks in the existing maps of air pollution. Conventional air pollution monitors are expensive and thinly deployed, Sukhatme noted. Many California counties have no monitors at all. Even Los Angeles County has relatively few.

The system has been tested in several locations, including Los Angeles (on a rooftop at USC and in Altadena) and in Phoenix, Ariz. The USC rooftop camera has a built-in “ground truth” test – it is near a conventional air pollution monitoring station.

So far the results are promising, but they indicate that several improvements are possible.

As Sukhatme said, “We’re sure we can improve it if we get people trying it and testing it and sending data.”

###

16 thoughts on “Pollution monitoring? There's an app for that.”

  1. Let me guess – people are most likely to send images when conditions are extremely good or extremely bad. I’m not sure if that’s good or bad. 🙂 The existing network ought to be able to recognize the general conditions. Images taken during extremely good times might be able to disclose pollution sources that otherwise would be lost in the noise, images taken in extremely bad conditions will likely simply show that it’s bad out there.

  2. AGW aside, real pollution (not a trace gas that is beneficial to the environment) is an issue that needs to be addressed.

  3. Grrrr … Android only, near as I can tell, and me with an iPhone … love the idea, though, crowdsourcing climate data works for me.
    w.

  4. But where is the pollution? Certainly there is pollution in big cities, but I am concerned about all the pollution supposedly caused by big coal. I live in Pennsylvania, where we create >40% of our electricity with coal, and we are downwind of Indiana, Ohio and Illinois, which also use coal.
    http://www.sourcewatch.org/index.php?title=Existing_U.S._Coal_Plants
    Yet the pollution measurements don’t seem anything to be alarmed about.
    http://www.epa.gov/cgi-bin/broker?_service=data&_debug=0&_program=dataprog.maptest_07.sas&parm=81102&stfips=42
    http://www.epa.gov/airtrends/
    If you go here and download the last comprehensive Air Quality report (2007), you will find that most of the measurements are less than half the National Air Quality Standard levels!!! (page 8, Executive Summary). So if air pollution in my relatively population-dense state of Pa. is low, considering the high amount of coal burned and being downwind of other densely populated coal states, what is the worry about all this coal pollution? (Not talking CO2, I know we can always worry about that!!)

  5. From: Willis Eschenbach on September 20, 2010 at 3:58 pm

    Grrrr … Android only, near as I can tell, and me with an iPhone …

    Oh well, you can still get that neat “Drown a Polar Bear Cub” game. That’s so funny, watching his big scared eyes…
    Alright, technically it’s a “Save the Polar Bear Cub” educational game to promote “Climate Awareness.” It’ll give you a quiz question, you pick the answer involving the least energy used or the most CO2 saved or otherwise the most “green” answer, and the reward is the patch of sea ice he’s on grows a bit. It’s a timed game, you have to keep it going until his mother floats by on a larger chunk of ice to rescue him. Get three correct in a row and he does a happy dance.
    But he is so cute when you get them wrong! Like when it asks about going to the store for some soy milk, and you choose to take the SUV instead of the bicycle… It’s like “The Scream” but with big eyes larger than his head. Looks like Japanese anime, it’s so cool. And at the end when he’s standing on that little ice cube before going “Splunk!” into the water…!

  6. Willis Eschenbach says:
    September 20, 2010 at 3:58 pm
    Grrrr … Android only, near as I can tell, and me with an iPhone … love the idea, though, crowdsourcing climate data works for me.

    Yea, that kind of surprised me too. I thought new software came out on the iPhone first. I wonder if that is a trend?

  7. So it can differentiate between pollution and water vapor? Our sky here in Michigan is a lot whiter (or grayer) than it is in any of the southwestern dry states.

  8. Chad: “AGW aside, real pollution(not a trace gas that is beneficial to the environment) is an issue that needs to be addressed.”
    Chad, FWIW, the pollutants that injure and kill people, like ozone and particulate matter, are present at concentrations of parts per billion or micrograms per cubic meter. One could certainly think of these as trace constituents of the atmosphere as well. As a matter of fact, this is a few orders of magnitude *less* than CO2’s concentration. And don’t forget, the same sources that emit CO2 also emit these traditional pollutants. Also, several studies have suggested that the benefits to air quality of a CO2 control policy could offset most or all of the cost of the CO2 controls (http://iopscience.iop.org/1748-9326/5/1/014007).

  9. Ric Werme says:
    September 20, 2010 at 3:12 pm
    Let me guess – people are most likely to send images when conditions are extremely good or extremely bad. I’m not sure if that’s good or bad. 🙂 The existing network ought to be able to recognize the general conditions. Images taken during extremely good times might be able to disclose pollution sources that otherwise would be lost in the noise, images taken in extremely bad conditions will likely simply show that it’s bad out there.
    But isn’t that the essence of the whole matter?
    Isn’t IT REALLY all about ‘subjectivity?’
    THINK about that: Take a picture.
    But WHEN?
    And BY WHOM?
    WHO is to say that the picture taken hasn’t been PURPOSELY modified with a filter, or other artificial means (Photoshop), such as to produce an entirely manipulated rendering?
    Yeah: Get =JUST EVERYONE= involved, but then WHERE is the QUALITY ASSURANCE?
    THAT is NOT science.
    More it is yet another story of the boy who cried ‘WOLF!!!!’
    And you know what happens after that, don’t you?

  10. All very well and good, but it is ultimately less about clarity and more about what is reducing that clarity or luminosity, other than water vapor, that is of true importance. I have a real problem with the term “harmful” being a loaded adjective without any definition. For me, spruce pollen is very harmful if in sufficient concentration. It, however, is anything but harmful to spruce trees. If you can’t tell me what is causing the reduction in rather specific terms, the thing is just another cute trick.

  11. Being an electronic engineer, I’m “a little” doubtful about this method of collecting air pollutant data.
    What about the camera CCD sensitivity and spectral response biases?
    Who calibrated them to get the precision required for such measurements?
    But the big question for me is: why does it convert the image to B&W before sending it to the “central computer”? That is, why does it discard precious spectral information which could discriminate broadband absorbers from the chaotic air mixture?
    Uhmm…
    Ok, I know that climate scientists accustomed us to believe data collected from instruments such as the GRACE satellites, where (so to speak) 0.1% is due to the true measurement and 99.9% came from the result of their simulator used to give the measurement a meaning, but this app is too much for me.
    It smells of hoax.

  12. I want an app that actually collects an air sample and analyzes it. I feel a picture only shows how smoggy it is. I know I want a lot, but it would be awesome for the novel I’m working on.

  13. In college I took a 1-credit class called ‘light and color in the open air’ or something like that. Basically it was a ‘here are some neat things’ class.
    One of the things pointed out (and visible in the pics) is that the more ‘up’ you look, the darker it looks. Basically there is more scattering at the horizons, so the more direct the sunlight, the darker the sky looks – assuming you aren’t staring at the sun 🙂
    So how does that work for this app? Does a dark sky above my head mean that the sun is at 45 deg or lower (i.e. not straight up), or does it mean more pollution? We can make all sorts of measurements, but without some context or evaluation, do they mean anything?

Comments are closed.