Scrape data from a regularly updated website (e.g. a cafeteria's weekly lunch menu, local bank interest rates, or Groupon/LivingSocial/etc. deals), save it to a database (Postgres), hook it up to a cron job, and use the Data Visualization tutorial to play with different visualization techniques.
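The "save to a database" piece can be as small as one SQLAlchemy model mapped to a Postgres table. Here is a minimal sketch; the table name, columns, and field meanings are placeholders to adapt, not the tutorial's actual schema:

```python
# Sketch of a SQLAlchemy model for one scraped record (a deal, a menu item,
# an interest rate...). Every name here is a placeholder.
from sqlalchemy import Column, DateTime, Integer, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Deal(Base):
    __tablename__ = "deals"

    id = Column(Integer, primary_key=True)
    title = Column(String)                  # headline of the scraped item
    link = Column(String)                   # URL the item was scraped from
    price = Column(String)
    end_date = Column(DateTime, nullable=True)
```

Once the spider writes rows like these, a cron entry that re-runs the crawl keeps the table current, and the Data Visualization tutorial can read straight from it.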
Within your terminal:
- Create a database for storing the scraped data

(WebScraperProj) $ cd new-coder/scrape/lib/full_source/tutorial/tutorial

- Edit settings.py and set your database settings (a sketch of what that can look like follows these steps)

(WebScraperProj) $ scrapy crawl livingsocial
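What goes into settings.py depends on how the project builds its database connection; a common pattern, and the one assumed in this sketch, is a DATABASE dict that later gets passed to SQLAlchemy's URL constructor. Treat every value below as a placeholder:

```python
# settings.py (sketch) -- connection details for the database created above.
# All values are placeholders; 'drivername' assumes a psycopg2/Postgres setup.
DATABASE = {
    'drivername': 'postgresql',
    'host': 'localhost',
    'port': '5432',
    'username': 'your_db_user',
    'password': 'your_db_password',
    'database': 'scraped_deals',
}
```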
Within your terminal:
(WebScraperProj) $ cd new-coder/scrape/lib/full_source/tutorial/tutorial
(WebScraperProj) $ scrapy check livingsocial
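`scrapy check` exercises Scrapy's spider contracts: annotations in a callback's docstring that name a URL to fetch and assert what the callback returns and which fields it scrapes. A sketch of a spider that such a check could run against (the URL, selectors, and field names are all placeholders, not the tutorial's spider):

```python
import scrapy

class LivingSocialSpider(scrapy.Spider):
    """Minimal spider sketch whose callback carries contracts for `scrapy check`."""
    name = "livingsocial"

    def parse(self, response):
        """
        @url https://www.livingsocial.com/cities/15-san-francisco
        @returns items 1
        @scrapes title link
        """
        for deal in response.css("li.deal"):              # placeholder selector
            yield {
                "title": deal.css("h2::text").get(),      # placeholder fields
                "link": deal.css("a::attr(href)").get(),
            }
```

When the check runs, Scrapy downloads the @url, calls parse(), and fails the contract if fewer items come back or the named fields are missing.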
- write code.
- write tests.
- write tutorial.
- be awesome.