Inspiration

Siamese neural networks and encoder architectures. The machine learning literature also shows that keeping models small can be beneficial for easy tasks.

What it does

MANGOsta is a model trained to classify pairs of clothing items as compatible or not compatible. This can be used in many applications, such as recommendation and querying.

How we built it

We use a siamese architecture: the model is given a pair of clothing items and outputs an embedding for each. The loss is the mean squared error between the cosine similarity of the two embeddings and the true label. The true labels were derived from the list of outfits: a pair is labeled 1 if the two items appear together in some outfit, and 0 otherwise.
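The setup above can be sketched roughly as follows. This is a minimal illustration, not the actual MANGOsta code: the encoder shape, feature dimensions, and weight initialization are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W1, W2):
    # Tiny two-layer encoder; both items of a pair share these same weights,
    # which is what makes the architecture "siamese".
    return np.maximum(x @ W1, 0) @ W2

# Hypothetical dimensions; the real feature sizes are assumptions.
in_dim, hid, emb_dim, n_pairs = 8, 16, 4, 32
W1 = rng.normal(size=(in_dim, hid))
W2 = rng.normal(size=(hid, emb_dim))

a = rng.normal(size=(n_pairs, in_dim))      # features of item A in each pair
b = rng.normal(size=(n_pairs, in_dim))      # features of item B in each pair
labels = rng.integers(0, 2, size=n_pairs)   # 1 if both items share an outfit

ea, eb = encode(a, W1, W2), encode(b, W1, W2)

# Cosine similarity between the two embeddings of each pair, in [-1, 1].
cos = (ea * eb).sum(axis=1) / (
    np.linalg.norm(ea, axis=1) * np.linalg.norm(eb, axis=1)
)

# Mean squared error between the similarity and the 0/1 label.
loss = np.mean((cos - labels) ** 2)
```

In training, the gradient of this loss with respect to the shared weights pulls embeddings of co-occurring items together (similarity toward 1) and pushes non-matching pairs apart (similarity toward 0).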

Challenges we ran into

  • Data outliers.
  • Time.

Accomplishments that we're proud of

The ability to classify pairs of clothing items as compatible or not is very versatile and applicable to many contexts. The validation F1 of 80% shows us that the data treatment and model training were effective.

The model is also extremely fast to train and run: we needed less than one minute on CPU to train the full model.

What we learned

The power of small siamese models.

What's next for MANGOsta

More applications can be developed, and more data could improve the model even further.

Authors

Severino Da Dalt, Ferran Espuña, Roger Garcia Nevado, Liliu Martinez Xicoira
