Inspiration

While learning about the co:here toxicity filter, I thought about how cool it would be to browse the web without encountering any toxic behavior, even on websites like 4chan where such behavior is expected or tolerated.

It would also let kids browse the web safely, knowing that toxic behavior will be filtered out.

What it does

On page load, it replaces the page's HTML with a peaceful version so users can browse calmly.
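The core replacement step can be sketched as a pure function. This is a minimal illustration, not the actual implementation: the predicate and placeholder names are hypothetical, and in the real app the toxicity decision comes from the NLP model, not a regex.

```javascript
// Hedged sketch: given the page's text blocks and a toxicity predicate,
// return a "peaceful" copy with flagged blocks replaced by a placeholder.
function calmBlocks(blocks, isToxic, placeholder = "[filtered]") {
  return blocks.map((text) => (isToxic(text) ? placeholder : text));
}

// Stand-in predicate for demonstration only (the app uses an NLP model):
const demoIsToxic = (text) => /idiot|stupid/i.test(text);

console.log(calmBlocks(["have a nice day", "you idiot"], demoIsToxic));
// → ["have a nice day", "[filtered]"]
```

In the extension, the sanitized blocks would then be written back into the corresponding DOM nodes.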

How we built it

We built a Chrome browser extension with a backend that uses the co:here API to scan user interactions on popular websites for toxic behavior.
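A Chrome extension of this shape needs a manifest that injects a content script on the supported sites and grants access to the backend. The following is a hedged sketch under Manifest V3; the match patterns, file names, and backend host are hypothetical placeholders, not the project's actual values.

```json
{
  "manifest_version": 3,
  "name": "Chill Web",
  "version": "0.1",
  "content_scripts": [
    {
      "matches": ["https://*.reddit.com/*"],
      "js": ["content.js"],
      "run_at": "document_idle"
    }
  ],
  "host_permissions": ["https://backend.example/*"]
}
```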

Challenges we ran into

Latency was a huge issue for this app. Our first prototype had 20 seconds of latency: up to 10 seconds for processing and parsing the HTML and up to 10 seconds to run the NLP, on a set of around 1,500 blocks of text. Through some creative filtering we got that down to 700 blocks, but it was still far too many. To solve the problem, we reduced the scope to user interactions and made it easy for developers to extend the app for their favorite websites. Now we process only around 20 blocks of text with the NLP model at a time.
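The scope reduction above can be sketched as a per-site selector map that developers extend: instead of classifying every block on the page, the content script pulls only user-generated text. The site names and CSS selectors here are hypothetical examples, not the project's actual configuration.

```javascript
// Hedged sketch: per-site CSS selectors for user-generated content.
// Developers add an entry here to support a new site.
const SITE_SELECTORS = {
  "news.ycombinator.com": ".commtext",
  "old.reddit.com": ".usertext-body",
};

// Returns the comment selector for a hostname, or null when the site
// is not supported (in which case the extension does nothing).
function selectorFor(hostname) {
  return SITE_SELECTORS[hostname] ?? null;
}

// In the content script, this keeps the NLP batch to ~20 comment blocks:
// const blocks = document.querySelectorAll(selectorFor(location.hostname));
```

Limiting the extension to known selectors is what brought the per-page workload from hundreds of blocks down to roughly the number of visible comments.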

Accomplishments that we're proud of

We're proud of how fast we got the app running. It still takes about 4 seconds, but that's much better than 20.

One of our team members had never done JavaScript development before, so completing this app is a huge accomplishment.

What we learned

Sometimes a general solution to a large problem is impossible. Instead, we had to break the problem down and create a more specific solution for each popular website.

What's next for Chill Web

Chill Web needs to expand and become more efficient before it can become a product. The prototype built on GCP and co:here is too expensive to run regularly. NLP is the slowest and most expensive part of the app; a weaker but faster model that doesn't catch everything could reduce costs and make the app marketable as a subscription.
