CourseProject

Explain your model:

For the competition I used BERT (Bidirectional Encoder Representations from Transformers), a state-of-the-art model for text classification. BERT's main innovation is applying the bidirectional training of the Transformer, a popular attention-based architecture, to language modeling.

How I performed the training:

Since BERT training on a CPU was incredibly slow, I used a Google Cloud VM to train the model on a GPU. This enabled much faster training and allowed more experimentation with parameters and tuning.
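In PyTorch, switching training to the VM's GPU comes down to selecting a device and moving the model and batches onto it. A minimal sketch (assuming a CUDA-capable GPU on the VM):

```python
import torch

# Use the GPU when CUDA is available (as on a Google Cloud GPU VM),
# otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(f"training on: {device}")

# Model and every input batch must live on the same device, e.g.:
#   model.to(device)
#   batch = {k: v.to(device) for k, v in batch.items()}
```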

Experiments with other methods:

Before using BERT, I tried to build and tune my own model, but it failed to come close to the baseline, so I broadened my options and opted for BERT.

How to run:

Download the Data folder from this link: https://drive.google.com/file/d/1OqLtj9BTnob45huOsFN_fMN23hL_WrGi/view?usp=sharing along with the Jupyter notebook, then run all the cells.

Demo video: https://drive.google.com/file/d/1LIkJRzyLRYKK2roMzF5CP208Hlj24ehw/view?usp=sharing
