This project fetches USGS earthquake data, cleans it, and displays it in one of three ways:
- Jupyter notebook.
`Quakes_all_in_one.ipynb` is the most instructive: an all-in-one solution, and what generated the image above. It displays the map with Folium because I wanted to recreate, as closely as possible, the USGS site, which uses Leaflet.js, the library Folium is built on. This notebook also pulls the data at the time of use, in the event you have an earthquake-watching emergency.
Fortunately, most folks don't have earthquake-watching emergencies, so I could opt for pulling data once a day. At midnight GMT, a GitHub Action creates an ephemeral runner that executes data_processing.py, which downloads the USGS earthquake data. The data arrives clean but needs to be transformed, which the script also handles. It then saves both a daily and an aggregated version (for future use). Parquet is used for speed and compactness. That brings us to the two other ways of displaying the data.
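The download-and-transform step can be sketched roughly like this. This is a minimal sketch, not the actual contents of data_processing.py: the feed URL is the real USGS GeoJSON summary feed, but the function names and column choices here are my assumptions.

```python
import json
from urllib.request import urlopen

import pandas as pd

# USGS GeoJSON summary feed: all quakes of magnitude 2.5+ in the last 24 hours
FEED_URL = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/2.5_day.geojson"


def fetch_feed(url: str = FEED_URL) -> dict:
    """Download the USGS feed and parse it as GeoJSON."""
    with urlopen(url, timeout=30) as resp:
        return json.load(resp)


def quakes_to_frame(geojson: dict) -> pd.DataFrame:
    """Flatten USGS GeoJSON features into a tidy DataFrame."""
    rows = []
    for feat in geojson["features"]:
        props = feat["properties"]
        # USGS stores coordinates as [longitude, latitude, depth-in-km]
        lon, lat, depth_km = feat["geometry"]["coordinates"]
        rows.append(
            {
                "id": feat["id"],
                "time": pd.to_datetime(props["time"], unit="ms", utc=True),
                "magnitude": props["mag"],
                "place": props["place"],
                "longitude": lon,
                "latitude": lat,
                "depth_km": depth_km,
            }
        )
    return pd.DataFrame(rows)


# Usage (writes the daily Parquet file):
# quakes_to_frame(fetch_feed()).to_parquet("quakes_daily.parquet", index=False)
```

Parquet keeps both the daily and aggregated files small and fast to load back into pandas.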
- Streamlit app.
`Quakes_app.py` is what Streamlit Cloud hosts and runs at https://quakes.streamlit.app/.
- Jupyter notebook.
`Quakes_display.ipynb` differs from the other Jupyter notebook in where it gets its data, and it's currently the only one that displays the aggregated data.
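The Folium rendering used by the notebooks and app can be sketched like this. A sketch under assumptions: the column names match the pipeline above, and the magnitude-based styling is my own choice, not necessarily what the notebooks do.

```python
import pandas as pd


def marker_style(mag: float) -> dict:
    """Size and color a quake marker by magnitude (styling choices are assumptions)."""
    return {
        "radius": max(3.0, 2.5 * mag),
        "color": "#d73027" if mag >= 5 else "#fc8d59" if mag >= 3.5 else "#fee08b",
    }


def build_map(frame: pd.DataFrame):
    """Plot each quake as a circle marker on a world map, like the USGS site."""
    import folium  # imported here so marker_style stays testable without folium

    fmap = folium.Map(location=[20, 0], zoom_start=2, tiles="cartodbpositron")
    for row in frame.itertuples():
        style = marker_style(row.magnitude)
        folium.CircleMarker(
            location=[row.latitude, row.longitude],
            radius=style["radius"],
            color=style["color"],
            fill=True,
            popup=f"M{row.magnitude} - {row.place}",
        ).add_to(fmap)
    return fmap


# Usage:
# build_map(pd.read_parquet("quakes_daily.parquet")).save("quakes_last_24_hours.html")
```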
Or, put graphically:
A few years ago I started a makerspace with a group of really good folks. Along the way, we met an artist and instructor, Christina Weisner, who was in the early stages of doing a project and consulting with one of our members, Kerry Krauss. Kerry was a professor of electronics technology at the local community college.
According to Kerry, the code was a bit of a kludge. It got the data from USGS somehow; I'm not sure if it was RSS, Atom, or JSON. From there, the data was processed and a signal was sent to a bunch of Arduino Uno boards as audio. Each Arduino was used to actuate one of the seismometers Christina bought. That might seem nuts, but Kerry's rationale was that, since they were having to troubleshoot at each location, audio was easier to troubleshoot than whipping out a multimeter every time. You can see Christina & Kerry and learn more about her project here.
What I liked about this project was how it blended art with technology. Even more interesting, one of the hydrophones was still functional, so Christina (with some help) was able to make the observers part of the installation. Another thing I found interesting was the artist as a sort of conductor rather than as the sole author. Christina had the inspiration and idea for the project, but almost all of the fabrication and technical aspects came from others.
- Pull JSON data from USGS of earthquakes greater than magnitude 2.5 over the last 24 hrs
- Pipeline formatting of data
- Display map of quakes >= 2.5 over last 24 hrs using Folium
- GitHub Actions to automate data updates.
- Deploy app
- Contact Eric Geist at Tsunami and Earthquake Research to see if there have been more tsunami occurrences
- Get & massage data for tsunami warnings and actual tsunamis reported
- Contact Lisa Wald of USGS
- Get tsunami warning/occurrence data (working)
- Develop model
- Twilio integration
- Find a way to export the map as a .png rather than as
`quakes_last_24_hours.html`, to update the repo image at top daily. This is tricky because doing this for Folium seems to require Selenium, and the formatting can be variable.
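One possible approach to that PNG export is to open the saved HTML in a headless browser and screenshot it. A sketch, assuming Selenium with headless Chrome and a chromedriver on the PATH; file names and the window size are placeholders.

```python
import time
from pathlib import Path


def to_file_uri(html_path: str) -> str:
    """Absolute file:// URI for a local HTML file, so the browser can load it."""
    return Path(html_path).resolve().as_uri()


def html_to_png(html_path: str, png_path: str, width: int = 1200, height: int = 800) -> None:
    """Open a saved Folium map in headless Chrome and screenshot it."""
    from selenium import webdriver  # assumption: selenium + chromedriver installed

    opts = webdriver.ChromeOptions()
    opts.add_argument("--headless=new")
    # Pinning the window size keeps the export's formatting consistent
    opts.add_argument(f"--window-size={width}x{height}")
    driver = webdriver.Chrome(options=opts)
    try:
        driver.get(to_file_uri(html_path))
        time.sleep(3)  # crude wait for the map tiles to finish rendering
        driver.save_screenshot(png_path)
    finally:
        driver.quit()


# Usage:
# html_to_png("quakes_last_24_hours.html", "quakes_last_24_hours.png")
```

Pinning the window size is one way to tame the variable formatting; the GitHub Actions runner would also need Chrome available.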

