OpenStreetMap Listing Scraper helps you collect structured location data directly from OpenStreetMap search results. It turns place-based searches into clean, usable datasets with addresses, coordinates, and URLs, saving hours of manual work.
Created by Bitbash, built to showcase our approach to Scraping and Automation!
If you are looking for openstreetmap-listing-scraper, you've just found your team. Let's Chat. 👆👆
This project extracts detailed geographic and address-level information from OpenStreetMap search queries. It solves the problem of manually gathering and normalizing location data for analysis, research, or product development. It’s built for developers, data analysts, and teams working with location-based datasets.
- Converts search keywords into structured geographic records
- Normalizes addresses and coordinates into consistent fields
- Designed for scalable data collection across many queries
- Outputs clean data ready for analytics, GIS tools, or databases
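The keyword-to-record pipeline above can be sketched against Nominatim, OpenStreetMap's public search API. The helper names and record shape below are illustrative assumptions, not this project's actual internals:

```python
import urllib.parse

# Nominatim is OpenStreetMap's public geocoding/search service.
NOMINATIM_SEARCH = "https://nominatim.openstreetmap.org/search"

def build_search_url(keyword, limit=10):
    """Build a Nominatim free-text search URL for one keyword."""
    params = {"q": keyword, "format": "jsonv2", "limit": limit}
    return f"{NOMINATIM_SEARCH}?{urllib.parse.urlencode(params)}"

def normalize_result(keyword, raw):
    """Flatten one raw Nominatim result into a consistent record.

    `display_name`, `osm_type`, `osm_id`, `lat`, and `lon` are real
    fields of a Nominatim JSON response; the output shape mirrors the
    fields documented in this README.
    """
    return {
        "keyword": keyword,
        "address": raw.get("display_name", ""),
        "url": f"https://www.openstreetmap.org/{raw.get('osm_type')}/{raw.get('osm_id')}",
        "lat": raw.get("lat", ""),
        "lon": raw.get("lon", ""),
    }
```

A real run would fetch `build_search_url(...)` over HTTP (respecting Nominatim's usage policy) and map each result through `normalize_result`.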
| Feature | Description |
|---|---|
| Keyword-based search | Extracts locations using natural language search queries |
| Full address extraction | Captures complete, human-readable address details |
| Coordinate precision | Provides accurate latitude and longitude values |
| Result limiting | Controls how many locations are returned per query |
| Structured output | Produces clean, well-formed JSON data |
| Efficient processing | Optimized for fast, repeatable data collection |
| Field Name | Field Description |
|---|---|
| keyword | The search term used to find the location |
| address | Full formatted address of the location |
| url | Direct OpenStreetMap URL for the place |
| lat | Latitude coordinate |
| lon | Longitude coordinate |
Example:
```json
[
  {
    "keyword": "Library in New York, US",
    "address": "New Amsterdam Library, 9, Murray Street, Tribeca, Manhattan, New York County, New York, 10007, United States",
    "url": "https://www.openstreetmap.org/node/2632446787",
    "lat": "40.7134916",
    "lon": "-74.0079247"
  }
]
```
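Because the output is plain JSON, records like the one above drop straight into downstream tools. A minimal sketch that parses the example and converts the string coordinates to floats for GIS use:

```python
import json

# The example record from this README, verbatim.
sample = """[
  {
    "keyword": "Library in New York, US",
    "address": "New Amsterdam Library, 9, Murray Street, Tribeca, Manhattan, New York County, New York, 10007, United States",
    "url": "https://www.openstreetmap.org/node/2632446787",
    "lat": "40.7134916",
    "lon": "-74.0079247"
  }
]"""

records = json.loads(sample)
# Coordinates arrive as strings; cast to float for spatial tooling.
points = [(r["address"], float(r["lat"]), float(r["lon"])) for r in records]
```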
```
OpenStreetMap Listing Scraper/
├── src/
│   ├── main.py
│   ├── scraper/
│   │   ├── search_handler.py
│   │   ├── location_parser.py
│   │   └── request_client.py
│   ├── utils/
│   │   └── validators.py
│   └── config/
│       └── settings.example.json
├── data/
│   ├── input.sample.json
│   └── output.sample.json
├── requirements.txt
└── README.md
```
- Data analysts use it to collect location datasets, so they can run geographic insights and reporting.
- Product teams use it to enrich apps with place data, improving location-based features.
- GIS developers use it to source coordinates, enabling accurate spatial mapping.
- Researchers use it to study urban or regional patterns without manual data gathering.
**What type of searches does this support?** It supports keyword-based location searches such as places, services, or points of interest in specific regions.

**Can I limit how many results are returned?** Yes, you can define a maximum number of items per query to control dataset size.

**Is the output easy to integrate with other tools?** The output is structured JSON, making it compatible with databases, analytics tools, and GIS platforms.

**Does it work for multiple keywords at once?** Yes, you can provide multiple search keywords in a single run.
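For a multi-keyword run with a per-query result cap, an input file along these lines would work; the field names here are illustrative and may differ from the actual `data/input.sample.json`:

```json
{
  "keywords": [
    "Library in New York, US",
    "Coffee shop in Seattle, US"
  ],
  "max_results_per_keyword": 20
}
```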
**Primary Metric:** Processes an average location query in under one second with consistent response times.

**Reliability Metric:** Maintains a high success rate across repeated runs with stable output structure.

**Efficiency Metric:** Handles multiple keyword searches with minimal overhead and controlled resource usage.

**Quality Metric:** Delivers complete address and coordinate data for the vast majority of results.
