Inspiration

We saw many GAN models that can replace apples with oranges in an image and transform cartoons into real photos (https://arxiv.org/abs/2102.01143). So we decided to translate either Minecraft screenshots or Minecraft map views into real photos or satellite pictures. The second option was more doable in two days, so we chose it. We compared three methods: texture mapping, CycleGAN, and VGG style transfer, and we show all three on our website.

What it does

It converts Minecraft screenshots into satellite-style pictures using three methods, so users can compare the differences between them.
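Of the three methods, CycleGAN is the one that learns the translation without paired before/after examples; its key ingredient is a cycle-consistency loss that pushes an image translated to the satellite domain and back to match the original. Below is a minimal, framework-agnostic NumPy sketch of that loss term; the weight of 10 follows the CycleGAN paper, and the array shapes are placeholders, not our actual training setup.

```python
import numpy as np

def cycle_consistency_loss(real: np.ndarray, reconstructed: np.ndarray,
                           lam: float = 10.0) -> float:
    """L1 penalty between an input image batch and its round-trip
    reconstruction F(G(x)) ~ x, weighted by lam as in the CycleGAN paper."""
    return float(lam * np.mean(np.abs(reconstructed - real)))

# Placeholder batch: 4 fake "images" of 256x256 RGB pixels.
real = np.zeros((4, 256, 256, 3))
reconstructed = np.full((4, 256, 256, 3), 0.5)
print(cycle_consistency_loss(real, reconstructed))  # 5.0
```

In training, this term is added to the usual adversarial losses of the two generators; it is what keeps the translated satellite image structurally faithful to the Minecraft input.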

How we built it

We used Google Colab and TensorFlow to run our models, and Google Cloud to host the website. Since machine learning is data-hungry, we wrote scripts to collect data from the Internet.
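Our Minecraft collection script used pyautogui to fly through a world and save screenshots. The sketch below illustrates the idea; the key bindings, grid size, flight timing, and output paths are our assumptions for illustration, not the exact script we ran.

```python
import time
from pathlib import Path

def grid_waypoints(rows: int, cols: int):
    """Visit a rows x cols grid in boustrophedon (snake) order, so
    consecutive capture points are always adjacent."""
    path = []
    for r in range(rows):
        cs = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        path.extend((r, c) for c in cs)
    return path

def capture_biome(rows: int = 5, cols: int = 5,
                  fly_seconds: float = 2.0, out_dir: str = "shots"):
    # Imported lazily: pyautogui needs a display, which a headless
    # machine running only grid_waypoints does not.
    import pyautogui
    Path(out_dir).mkdir(exist_ok=True)
    for i, (r, c) in enumerate(grid_waypoints(rows, cols)):
        pyautogui.keyDown("w")   # hold "fly forward" (creative mode)
        time.sleep(fly_seconds)
        pyautogui.keyUp("w")
        pyautogui.screenshot(f"{out_dir}/shot_{i:04d}.png")

print(grid_waypoints(2, 3))  # [(0, 0), (0, 1), (0, 2), (1, 2), (1, 1), (1, 0)]
```

A similar loop over Google Earth views produced the real-satellite side of the dataset.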

Challenges we ran into

Our model did not run well at first because of a bug in a training-loop variable. And when we built the web server, we ran into problems with uploading videos while processing them, so we switched to supporting images only.
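Once video was cut from scope, the server only needs to accept still-image uploads. A minimal sketch of the kind of extension check involved (the whitelist and function name here are our own illustration, not the actual server code):

```python
from pathlib import Path

# Illustrative whitelist; the real server's accepted formats may differ.
ALLOWED_EXTENSIONS = {".png", ".jpg", ".jpeg"}

def is_supported_upload(filename: str) -> bool:
    """Accept still images only; video uploads were cut from scope."""
    return Path(filename).suffix.lower() in ALLOWED_EXTENSIONS

print(is_supported_upload("screenshot.PNG"))  # True
print(is_supported_upload("gameplay.mp4"))    # False
```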

Accomplishments that we're proud of

  • We chose one of the hardest fields in CS - machine learning - and made it work to some degree. During this hackathon we learned a lot about image processing, training AI models, using Python libraries, and using Google Cloud.
  • We collected images with a unique method. Our Python script for gathering Minecraft biome images used pyautogui to navigate the world and save screenshots of a selected area; for the satellite dataset, a similar script navigated Google Earth. Each set was then formatted with an additional Python script. It was challenging to find large enough datasets, so we are specifically proud of this image-collection method.

What we learned

  • How GAN and CycleGAN work
  • How to build a website with Google Cloud
  • How to apply VGG for style transfer
  • How to transfer files between a web client and server dynamically in a multi-user setting (we had transferred data before, but not files)
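The "VGG for style transfer" item above boils down to comparing Gram matrices of VGG feature maps: the style loss is the squared difference between the Gram matrix of the style image's activations and that of the generated image's. Here is a framework-agnostic NumPy sketch of those two pieces; the normalization by the number of spatial positions is one common convention, and implementations vary.

```python
import numpy as np

def gram_matrix(features: np.ndarray) -> np.ndarray:
    """Channel-by-channel correlations of one VGG activation map.

    features: (H, W, C) array from one convolutional layer."""
    h, w, c = features.shape
    flat = features.reshape(h * w, c)
    return flat.T @ flat / (h * w)

def style_loss(style_feats: np.ndarray, generated_feats: np.ndarray) -> float:
    """Mean squared difference between the two Gram matrices."""
    return float(np.mean((gram_matrix(style_feats)
                          - gram_matrix(generated_feats)) ** 2))

feats = np.ones((4, 4, 3))             # toy activation map
print(gram_matrix(feats))              # 3x3 matrix of ones
print(style_loss(feats, 0.0 * feats))  # 1.0
```

In full style transfer this loss is summed over several VGG layers and combined with a content loss on deeper activations, then minimized by gradient descent on the pixels of the generated image.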

What's next for Minecraft 2 real

Since two days is not enough to train a model and collect enough data, we will keep working on it: training for longer and gathering more data. We may also need to tweak our neural network along the way. Once the network works well, we could turn it into a streaming service that transforms users' Minecraft game screens into realistic imagery in real time.

Special Prizes

Our project connects to the climate and space categories because it translates satellite images, and the same method could also translate images between biomes such as grassland and desert.

(The filename contains "copy of" because we made a clean version for the final demo.) We said that the texture mapping would detect objects and map them; that is not what we did, but it is part of our future plan.
