Inspiration

Farmers markets often sell cheaper, healthier, and more sustainably produced groceries. However, many people opt to buy food from large grocery stores instead of local markets. To address this, we wanted to create an app that encourages people to buy their groceries at farmers markets, promoting a healthier and more sustainable lifestyle.

What it does

For our project, we wanted to create a web app where users can enter their grocery shopping list and their zip code. Having received these inputs, the website initiates a web spider that crawls the websites of farmers markets in or near the specified zip code. While crawling each webpage, the crawler matches items listed on the page against items in the user's grocery list. The website then displays, for each item on the list, its price and the farmers market where it can be bought.
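The matching step described above can be sketched roughly as follows. This is an illustrative simplification, not our exact code: `match_items` and the sample page text are hypothetical, and a real crawler would work on full HTML pages.

```python
# Hypothetical sketch: compare a user's grocery list against the
# text of a crawled farmers-market page and keep the items found.

def match_items(grocery_list, page_text):
    """Return the grocery items that appear in the page text."""
    text = page_text.lower()
    return [item for item in grocery_list if item.lower() in text]

page = "Fresh Eggs $4.50/dozen ... Heirloom Tomatoes $3/lb ... Honey $8"
print(match_items(["eggs", "tomatoes", "kale"], page))
# -> ['eggs', 'tomatoes']
```

A real version would also need to handle plurals and spelling variants, which plain substring matching misses.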

How we built it

We used AWS Lambda to initiate a web crawler. The web crawler then stores all the raw HTML it collects in an SQLite database. We used HTML and CSS to build the frontend.
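A minimal sketch of that pipeline, assuming a Lambda-style handler whose `event` carries the URL to crawl; the `pages` table, database path, and handler shape are illustrative, not our exact implementation:

```python
# Hypothetical sketch: fetch a page and store the raw HTML in SQLite.
import sqlite3
import urllib.request

def save_page(db_path, url, html):
    """Insert (or replace) one crawled page into the database."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS pages (url TEXT PRIMARY KEY, html TEXT)"
    )
    conn.execute(
        "INSERT OR REPLACE INTO pages (url, html) VALUES (?, ?)", (url, html)
    )
    conn.commit()
    conn.close()

def handler(event, context):
    # `event["url"]` is an assumed input shape for this sketch.
    url = event["url"]
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
    save_page("/tmp/pages.db", url, html)  # /tmp is Lambda's writable dir
    return {"stored": url}
```

Storing raw HTML first and parsing later keeps the crawl simple, at the cost of a second processing pass over the database.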

Challenges we ran into

Neither of us had ever used AWS, SQLite, or any kind of web-crawling software before. We faced difficulties bootstrapping our AWS account for CDK, since the original CSV file for our AWS user did not include the user name. Additionally, due to time and skill constraints, we were not able to parse the HTML and return the details for each specific item.

Accomplishments that we're proud of

We were able to use AWS to initiate a web crawler. We were also able to design a frontend for a website.

What we learned

We learned about UI/UX design, AWS, and the SQLite database.

What's next for FindAFarmer

Parse the crawled HTML to extract specific items and their details from each farmers market webpage, and display the results on the website.
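One possible shape for that parsing step, shown as a hedged sketch: real market pages vary widely, so the regular expression below assumes a simple `<li>Item - $price</li>` layout purely for illustration.

```python
# Hypothetical sketch: extract (item, price) pairs from simple HTML.
import re

def extract_items(html):
    """Return (name, price) pairs matching '<li>Name - $X.YY</li>'."""
    pattern = re.compile(r"<li>\s*([^<-]+?)\s*-\s*\$(\d+(?:\.\d{2})?)\s*</li>")
    return [(name, float(price)) for name, price in pattern.findall(html)]

sample = "<ul><li>Eggs - $4.50</li><li>Honey - $8</li></ul>"
print(extract_items(sample))
# -> [('Eggs', 4.5), ('Honey', 8.0)]
```

For real pages, a proper HTML parser (e.g. Python's built-in `html.parser`) would be more robust than a regex.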
