Inspiration

Our core mission is to make the typical app development workflow more efficient by removing its redundant, time-consuming parts. We do this with an easily accessible, automated solution: the user provides a sketch of a UI, and the code is generated automatically. This is how Sketch 2 Code came into existence.

We also realised that drawing a UI sketch isn't convenient for everyone, so we added another feature called Text 2 Code: an even easier way where users simply type in their requirements and the code is generated.

What it does

  • Sketch to Code provides code snippets for UIElements in Swift & SwiftUI.
  • Text to Code generates code snippets based on the text provided.
  • Save Snippet saves code snippets to our backend.

In total, we have two types of code generators:

Sketch 2 Code✏️

Right now we have two types of Sketch 2 Code conversion:

1. Single Element Detection

  • This mode concentrates on detecting a single UIElement. There are times when developers need the code for one specific type of UIElement.
  • In such cases, users can go with Single Element Detection.

2. UI Screen Detection

  • As the project title suggests, our primary motive is to ease the work of coding UI.
  • This mode is suitable for getting the code of a complete UI screen.

3. History

  • A complete record of all recently used snippets is automatically saved in our database.

Text 2 Code 💬

  • Text 2 Code is an additional feature that gives users the freedom to type the UIElement they need instead of providing a sketch.
  • Using Text 2 Code is simple: start typing the element type, and the app's suggestion box shows the expected order of the keywords.
  • While writing the text you need to follow that order, which the app provides automatically.
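For illustration, a Text 2 Code request for a button might produce a SwiftUI snippet along these lines. The input phrasing and the exact output format here are assumptions, not the app's verbatim behavior:

```swift
// Hypothetical Text 2 Code output for an input like "Button, title: Save".
// The generated snippet's structure is an assumption for illustration.
import SwiftUI

struct GeneratedView: View {
    var body: some View {
        Button("Save") {
            // Action to perform on tap
        }
        .buttonStyle(.borderedProminent)
    }
}
```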

How we built it

Sketch 2 Code uses the following stack:


The iOS app is built with Swift in Xcode, using libraries such as VisionKit, CoreML, ImageIO, Loafjet & Alamofire.
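A minimal sketch of how a CoreML object-detection model can be driven through Vision to classify a sketched UI element. `ElementDetector` is a hypothetical model name standing in for the app's actual models, and the label strings are assumptions:

```swift
// Sketch: run a CoreML object-detection model via Vision to identify
// a UI element in a sketch. "ElementDetector" is a hypothetical model;
// the app's real models and labels may differ.
import Vision
import CoreML
import UIKit

func detectElement(in image: UIImage, completion: @escaping (String?) -> Void) {
    guard let cgImage = image.cgImage,
          let model = try? VNCoreMLModel(for: ElementDetector().model) else {
        completion(nil)
        return
    }
    let request = VNCoreMLRequest(model: model) { request, _ in
        // Take the highest-confidence detection, e.g. "UIButton"
        let best = (request.results as? [VNRecognizedObjectObservation])?
            .first?.labels.first?.identifier
        completion(best)
    }
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```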

For our backend, we used domain.com to create and launch a server with our current codebase. CockroachDB was used for storing and retrieving information in real time.
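The Save Snippet flow can be sketched with an Alamofire request like the one below. The endpoint URL and JSON schema are assumptions for illustration, not the real API:

```swift
// Sketch: save a generated snippet to the backend with Alamofire 5.
// The URL and payload shape are hypothetical.
import Alamofire

struct Snippet: Encodable {
    let element: String
    let code: String
}

func saveSnippet(_ snippet: Snippet) {
    AF.request("https://example.com/api/snippets",  // hypothetical endpoint
               method: .post,
               parameters: snippet,
               encoder: JSONParameterEncoder.default)
        .validate()
        .response { response in
            switch response.result {
            case .success:
                print("Snippet saved")
            case .failure(let error):
                print("Save failed: \(error)")
            }
        }
}
```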

Challenges we ran into

  • Adding multiple ML models for element detection was hard since none of us had extensive experience with CoreML
  • Choosing the catalogue of elements for initial support from a vast object library

Accomplishments that we're proud of

  • Implementing all the core features within the given time
  • Adding multiple ML models with CoreML
  • Integrating the database with the App

What we learned

  • Implementing Object detection
  • Performing Unit & UI Tests with Xcode
  • Deeper understanding of Libraries like CoreML, Alamofire
  • Text Recognition with Vision
  • Integrating SwiftUI into a UIKit app
  • API development
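One of the things above, text recognition with Vision, can be sketched as follows. This is an illustrative use of `VNRecognizeTextRequest`, not the app's exact pipeline:

```swift
// Sketch: recognize text in an image with Vision's VNRecognizeTextRequest,
// returning the top candidate string for each detected line.
import Vision
import UIKit

func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else { return completion([]) }
    let request = VNRecognizeTextRequest { request, _ in
        let lines = (request.results as? [VNRecognizedTextObservation])?
            .compactMap { $0.topCandidates(1).first?.string } ?? []
        completion(lines)
    }
    request.recognitionLevel = .accurate
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```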

What's next for Sketch 2 Code

For upcoming releases, we plan to bring various new features:

  • Multiple coding language support

  • More UIElements

  • Enhanced NLP in Text 2 Code, to make it more versatile

  • Improved object detection

  • A beta release of Sketch 2 Code to gather user feedback

Built With

  • alamofire
  • coreml
  • imageio
  • loafjet
  • swift
  • swiftui
  • uikit
  • visionkit