Inspiration

Our inspiration for this project came from recognizing that the internet is filled with false information and unreliable sources. We wanted to help people determine whether the website they are reading can be trusted, and to simplify the process of citing sources. We also wanted to give people a quick summary of a website so they do not have to spend as much time scanning through articles.

What it is

LinkHack is a React web application served by a Flask backend and deployed on a CentOS server with Docker and NGINX.

Purpose

Its purpose is to help academic staff and students validate the reliability of their potential sources. LinkHack uses Natural Language Processing (NLP) to summarize the contents of any website. In addition, it uses Beautiful Soup to scrape the article for relevant metadata and format it as an MLA citation.
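
The two backend pieces can be sketched roughly as follows. This is a simplified illustration, not the project's actual code: the summarizer below uses a plain word-frequency heuristic with a tiny hand-written stopword set (the real project uses NLTK for tokenization and stopwords), and the citation builder assumes the page exposes its author in a `<meta name="author">` tag.

```python
import re
from collections import Counter

from bs4 import BeautifulSoup

# Tiny illustrative stopword list; the real project draws these from NLTK.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "it",
             "that", "this", "for", "on", "as", "are", "was", "with"}


def summarize(text: str, max_sentences: int = 2) -> str:
    """Score each sentence by the frequency of its non-stopword tokens,
    then return the top sentences in their original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    words = re.findall(r"[a-z']+", text.lower())
    freq = Counter(w for w in words if w not in STOPWORDS)

    scored = []
    for i, sentence in enumerate(sentences):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        score = sum(freq[t] for t in tokens if t not in STOPWORDS)
        scored.append((score, i, sentence))

    top = sorted(scored, reverse=True)[:max_sentences]
    # Re-sort the winners by position so the summary reads in order.
    return " ".join(s for _, _, s in sorted(top, key=lambda t: t[1]))


def mla_citation(html: str, url: str, accessed: str) -> str:
    """Scrape title/author from the page and format a basic MLA entry.
    Assumes an author <meta> tag; falls back gracefully when absent."""
    soup = BeautifulSoup(html, "html.parser")
    title = (soup.title.string.strip()
             if soup.title and soup.title.string else "Untitled")
    meta = soup.find("meta", attrs={"name": "author"})
    author = meta.get("content", "").strip() if meta else ""
    if author and " " in author:
        first, last = author.rsplit(" ", 1)
        author = f"{last}, {first}. "  # MLA inverts the author's name
    elif author:
        author = f"{author}. "
    return f'{author}"{title}." {url}. Accessed {accessed}.'
```

A real MLA entry also includes the container/site name and publication date; those fields would be scraped the same way when the page provides them.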

How we built it

LinkHack is built with NLTK for the NLP model, Beautiful Soup for web scraping, Flask for the backend, and ReactJS for the frontend. It runs on a CentOS server, containerized with Docker and served through NGINX. Deployment is automated with GitHub Actions.
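
The serving setup described above can be sketched as an NGINX site config: NGINX serves the compiled React bundle and proxies API calls to the Flask container. The domain, container name, ports, and paths here are assumptions, not the project's actual values.

```nginx
server {
    listen 80;
    server_name linkhack.example.com;   # hypothetical domain

    # Serve the compiled React bundle
    root /usr/share/nginx/html;
    index index.html;

    location / {
        try_files $uri /index.html;     # fall back for client-side routing
    }

    # Proxy API calls to the Flask container on the Docker network
    location /api/ {
        proxy_pass http://linkhack-api:5000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```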

Challenges we ran into

  • Ran into some issues regarding libraries in Docker containers
  • Had no prior NLP model experience
  • Trouble with CORS when calling the backend from the frontend
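
One common fix for the CORS issue above is to attach the appropriate headers to every Flask response. A minimal sketch follows; the allowed origin, route, and stub response are assumptions (the real project may instead have used the flask-cors extension or handled CORS at the NGINX layer).

```python
from flask import Flask, jsonify

app = Flask(__name__)


@app.after_request
def add_cors_headers(response):
    # Allow the React dev server (assumed origin) to call this API
    response.headers["Access-Control-Allow-Origin"] = "http://localhost:3000"
    response.headers["Access-Control-Allow-Headers"] = "Content-Type"
    response.headers["Access-Control-Allow-Methods"] = "GET, POST, OPTIONS"
    return response


@app.route("/api/summarize", methods=["POST"])
def summarize():
    # Placeholder response; the real endpoint would run the summarizer
    return jsonify({"summary": "stub"})
```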

Accomplishments that we're proud of

  • Creating a functional NLP summarizer
  • Deploying the application to a live and secure server
  • Designing a clean, user-friendly, and intuitive UI
  • Optimizing the algorithms in the backend

What we learned

  • How to develop an NLP model for summarizing text
  • How to connect the frontend and backend in a deployment environment
  • How to deploy a frontend application to a server with Docker
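
A common pattern for the Docker deployment mentioned above is a multi-stage Dockerfile: one stage builds the React bundle, and a second stage copies only the static output into an NGINX image. This is a generic sketch, not the project's actual file; the image tags are assumptions.

```dockerfile
# Stage 1: build the React bundle (node image tag is an assumption)
FROM node:16 AS build
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build

# Stage 2: serve the static files with NGINX
FROM nginx:stable
COPY --from=build /app/build /usr/share/nginx/html
```

Keeping the node toolchain out of the final image makes the served container much smaller and reduces its attack surface.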

What's next for LinkHack

  • Adding script functionality to the application
  • Vertically scaling the deployment environment
  • Adding other citation format options