I live in Northern California and it is wildfire season. Increasingly, this means poor air quality even in cities. Between COVID (don’t go indoors!) and wildfires (don’t go outdoors!), there was really nowhere to go besides my own apartment. That, of course, meant it was time for a new project! If my goal had just been to get an air quality monitor, I probably would have gone with a PurpleAir, which comes with a social component, but I wanted to make life hard on myself. Some features of the approach I took below:
— Data can be contributed to this open source community.
— Sensors: BME280 (temperature, humidity, & pressure), PMS5003 (a laser particle counter that basically tracks air quality), LTR-559 (light sensor), and a MEMS microphone (noise sensor)
Things I knew how to do: Write Python
Things I didn’t know how to do: Everything else described below
Below are my steps to create a Raspberry Pi driven air quality monitor.
Raspberry Pi and enviro+
This is my first Raspberry Pi adventure, and now I understand why more than 30 million have already been sold worldwide. I got a Raspberry Pi Zero W, which is literally a 10-dollar computer. It has its own operating system, Raspbian, and connects to wifi. (I’ve discovered it is not, however, fast enough to load YouTube.) A Raspberry Pi is basically a circuit board, and you can choose what to attach to it. I got the starter pack because it contains some extra cords that end up coming in handy. Individual sensors (humidity, temperature, etc.) are sold for a few dollars each, but since I’m a newbie, I went with the fancier enviro+ attachment. It worked right out of the box and has tons of sensors: temperature, pressure, humidity, light, etc. To monitor air quality, there is an additional particulate matter attachment that records PM1, PM2.5, and PM10; the Air Quality Index (AQI) is derived from these measurements. The enviro+ also has its own Python library, so it is super easy to get data from the sensors. After following this and this (minus the outdoor construction part), I successfully graphed sensor readings on the Raspberry Pi. Overall, the tutorials worked well. The most challenging part for me (a software, not hardware, person) was the soldering, but it is possible to buy a pre-soldered Raspberry Pi, which I’d recommend if you don’t have a father with a garage full of metal- and wood-working equipment.
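For concreteness, deriving an AQI value from a PM2.5 reading is a piecewise-linear interpolation over published breakpoints. Here is a minimal sketch using the US EPA’s 24-hour PM2.5 breakpoints (the function name and table are mine, not part of the enviro+ library):

```python
# US EPA PM2.5 (24-hour) breakpoints: (conc_lo, conc_hi, aqi_lo, aqi_hi)
PM25_BREAKPOINTS = [
    (0.0, 12.0, 0, 50),
    (12.1, 35.4, 51, 100),
    (35.5, 55.4, 101, 150),
    (55.5, 150.4, 151, 200),
    (150.5, 250.4, 201, 300),
    (250.5, 350.4, 301, 400),
    (350.5, 500.4, 401, 500),
]

def pm25_to_aqi(conc):
    """Linearly interpolate the AQI within the bracket containing conc (ug/m3)."""
    conc = round(conc, 1)  # EPA reports PM2.5 to one decimal place
    for c_lo, c_hi, i_lo, i_hi in PM25_BREAKPOINTS:
        if c_lo <= conc <= c_hi:
            return round((i_hi - i_lo) / (c_hi - c_lo) * (conc - c_lo) + i_lo)
    return 500  # off the top of the scale
```

The same interpolation with different breakpoint tables gives the PM10-based AQI; the reported AQI is the maximum across pollutants.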
Graphing sensor readings on the Raspberry Pi itself was a great first step, but I wanted to monitor air quality remotely. My eventual goals were to (1) build a dashboard website with metrics and analyses I can refer to, and (2) leverage this data to turn various other appliances on or off (not discussed in this post).
AWS: Sending data to a database
Code running on Raspberry Pi for this part: https://github.com/kimberlymcm/raspberrypi
To tackle this, I decided to use AWS Internet of Things (IoT) Core. AWS has great (and free!) tools for connecting devices to the internet and to each other. For storage, I chose DynamoDB, a fully managed NoSQL database. I’m not a database expert, so I mostly picked it because it looked (and was) easy to use and has a free tier. (Here is a very detailed video if you’d like to learn more.) In IoT Core, I registered a ‘thing’, keeping all the defaults. At the end, it asks you to download the policy and put it on the ‘thing’; I just scp’d the scripts to the Raspberry Pi. In DynamoDB, I set my primary key to the device_id and my sort key to time, and created a ‘rule’ (via this) to write the sensor data being sent to AWS to my new table. On the Pi, I have python3 read_and_send_to_aws.py (from the git repo above) set in crontab to run at startup.
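The sending side boils down to packaging each set of readings as JSON and publishing it over MQTT, so the IoT rule can route it into the table. A rough sketch of what the script does (the topic name, helper names, and payload fields here are illustrative, not the exact code in the repo; the real script uses an AWSIoTPythonSDK client configured with the downloaded certificates):

```python
import json
import time

def make_payload(device_id, readings):
    """Bundle one set of sensor readings into the JSON document the IoT rule
    writes to DynamoDB (device_id = partition key, time = sort key)."""
    payload = {"device_id": device_id, "time": int(time.time())}
    payload.update(readings)
    return json.dumps(payload)

def publish_forever(client, device_id, read_sensors, topic="enviro/readings"):
    """Publish readings once a minute. `client` is a connected MQTT client
    (e.g. AWSIoTPythonSDK's AWSIoTMQTTClient); `read_sensors` returns a dict
    of the latest sensor values."""
    while True:
        client.publish(topic, make_payload(device_id, read_sensors()), 1)  # QoS 1
        time.sleep(60)
```

The IoT rule then matches on the topic (something like `SELECT * FROM 'enviro/readings'`) and maps the JSON fields straight into table attributes.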
AWS: Displaying data on a web app or website
Code for this part (running first locally & then on AWS): https://github.com/kimberlymcm/flaskapp
Great: now I have my Raspberry Pi up and running and sensor readings being stored in my AWS DynamoDB table. The next step is to read from the table and create a dashboard. For this, I chose Flask + Plotly + AWS Elastic Beanstalk. Flask is a super simple Python micro web framework; it is lightweight and makes iterating very fast, and it is Python-based (I’m not super familiar with other frontend web frameworks). Plotly makes it super easy to create professional-looking Python-based dashboards and graphs. AWS Elastic Beanstalk is a service for deploying web applications that hides most of the complicated parts. I found it to be the most finicky part of the whole project.
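Reading from the table is a DynamoDB Query: exact match on the partition key (device_id) plus a range condition on the sort key (time). As a sketch, here are the low-level query parameters the dashboard effectively sends; in the app, a dict like this is what a boto3 DynamoDB client’s query call receives (the table name is an assumption for illustration, and `time` is aliased through ExpressionAttributeNames to stay clear of DynamoDB reserved words):

```python
def build_query(device_id, start_ts, end_ts):
    """Build low-level DynamoDB Query parameters for one device's readings
    in a time window; usable as client.query(**build_query(...)) with boto3."""
    return {
        "TableName": "air_quality",  # hypothetical table name
        "KeyConditionExpression": "device_id = :d AND #t BETWEEN :s AND :e",
        "ExpressionAttributeNames": {"#t": "time"},
        "ExpressionAttributeValues": {
            ":d": {"S": device_id},
            ":s": {"N": str(start_ts)},
            ":e": {"N": str(end_ts)},
        },
    }
```

Because device_id is the partition key and time the sort key, this is a cheap indexed read rather than a full table scan.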
First, the goal was to get a Flask app running on my own computer. I’m new to Flask, so I followed some tutorials (like this) to get the hang of how it worked, then edited the tutorial for my use case (see my git repo above). It was pretty quick to get this up and running; I’ve definitely jumped on the Flask bandwagon.
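To give a feel for how little Flask requires, here is a toy version of such an app: one route for the dashboard page and one JSON endpoint the Plotly graphs can read from. The routes and placeholder data are illustrative, not the actual app, which pulls live readings from DynamoDB:

```python
from flask import Flask, jsonify

app = Flask(__name__)

def latest_readings():
    # Placeholder data; the real app queries the DynamoDB table here.
    return [{"time": 1598000000, "pm2_5": 7.0, "temperature": 21.3}]

@app.route("/api/readings")
def readings():
    # Plotly on the frontend can consume this JSON directly as trace data.
    return jsonify(latest_readings())

@app.route("/")
def index():
    return "<h1>Air quality dashboard</h1>"
```

Running `flask run` serves this locally; deploying it is where Elastic Beanstalk comes in below.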
The next step, uploading the web app to Elastic Beanstalk, took forever. The crux of the problem appears to be that the Elastic Beanstalk CLI had some inconsistencies with the framework I was using, so I eventually just uploaded the app through the UI, which worked. I also turned off the automated load balancer because it was sending many requests to check the table health, which was costing dollars :). The last steps were configuring the website to use https (like this) and routing it to a subdomain of my personal website.
MVP is complete! See at http://airquality.kimberlymcmanus.com
[This blog is a general overview on how I went about the project — happy to provide more specifics if helpful!]
Next steps: Send notifications and automatically turn on air purifiers. Until next time…