Inspiration

It was Christmas. I didn't have a present. I needed a present.

My inspiration was probably Kellogg's in the '90s, with game CD-ROMs on the breakfast table.

What it does

On Christmas, every person gets a nice box with a nice ghost.

Userflow

The box is a jigsaw puzzle box, laser-cut from wood.
The person opens the box.
The person sees candy, cornflakes, toys, or a teddy bear inside the box.
The person sees a small letter inside the box.
The letter contains a link to boxgeist.com and a tiny secret.
The person enjoys the candy, cornflakes, toys, or teddy bear.
The person visits Boxgeist.
The person explores all the ghosts with limited access.
The person speaks to different ghosts.
The person mints one ghost with their tiny secret.
The person conjures their ghost with full access.
The person speaks to their ghost.

Every ghost is special. Every ghost has a unique personality and skill set.
For example:
Sentinel, a hacker from Greenhog. He helps you with your IT problems.
Lara, the nurse from Hospitz. She helps you with your health-related questions.

A little bit like Kellogg's in the 1990s—you buy cornflakes, you get a game.

How we built it

We built it incrementally:

  1. Created a Software Design Document to define scope.
  2. Laid out the website in plain HTML.
  3. Converted it into components via Vue.js.
  4. Worked on BlogView.
  5. Worked on HomeView.
  6. Worked on GhostsView.
  7. Worked on GhostView.
  8. Worked on VoiceButton.
  9. Worked on MaterialButton.
  10. Worked on ARButton.
  11. Worked on the NFT pipeline.
  12. Created the NFT collection.
  13. Worked on AssetView.
  14. Christmas: Showed the family their boxes.
  • The box itself is a laser-cut piece of wood. A simple SVG file is used to let the laser cutter cut all six sides.
  • The NFT pipeline generates a full NFT collection from a Blender file containing 135 animated low-poly characters. It takes the name of each mesh and converts it into a character sheet (Name, Background, Knowledge Base, Skill Set, etc.). See the result here: metadata.boxgeist.com
  • The AssetView has an embedded wallet. Grandma will definitely not download MetaMask. If the mint button is pressed, it sends an API request to a tiny backend, which calls Apillion (a Web3 SDK) and mints a ghost for her. If AssetView detects she has an NFT, it hides the mint button and shows a ghost certificate of her ghost. If she clicks on the owner, she is redirected to the block explorer, which displays her NFT nicely in her account.
  • The GhostView is a Three.js scene. It shows the 3D model of a ghost. The person clicks the voice button, which connects to OpenAI's real-time API. The system prompt is the character sheet of the ghost. The person clicks the material button, allowing them to switch the ghost's appearance. The person clicks the AR button, entering the XR space and mounting a fresh DOM overlay into it. The person has to place their ghost. Once the ghost is placed, it appears. Inside the DOM overlay, there is a leave button. Clicking it closes the XR space and returns to the GhostView.
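The metadata step of the NFT pipeline described above could look roughly like this. Both the mesh-name convention ("Name_Origin_Role") and the exact field names are assumptions for illustration, not the actual pipeline code:

```javascript
// Hypothetical sketch: turn a Blender mesh name into a character sheet.
// Assumed naming convention: "Name_Origin_Role", e.g. "Sentinel_Greenhog_Hacker".
function buildCharacterSheet(meshName) {
  const [name, origin, role] = meshName.split("_");
  return {
    name,
    background: `${name} comes from ${origin}.`,
    knowledge_base: `Everyday practice and lore of a ${role.toLowerCase()}.`,
    skill_set: [role.toLowerCase()],
  };
}

// buildCharacterSheet("Sentinel_Greenhog_Hacker").name → "Sentinel"
```

One such JSON file per mesh is all a block explorer needs to list the collection nicely.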

Challenges we ran into

Uff, multiple ones:

  • Defining scope and a way of working. Something had to be delivered, and a lot of unknowns were involved. I made sure that every three days, I had something that worked.
  • 3D models: Modeling, texturing, rigging, and animating them myself in a low-poly way. Later, I bought a model set from Imphenzia. He uses the same procedure as I do, so integration went smoothly.
  • VoiceButton: At the time, there was only a real-time voice beta and a repo where I could borrow some code. Tinkering with how it worked and setting it up was challenging. Later, OpenAI released their 1.0 version, which removed some headaches from the architecture. The biggest headache was hiding the API keys.
  • MaterialButton: I thought about making a cool shader to display the hologram in an ink-like way. After a full day, I dropped the idea.
  • ARButton: Entering, interacting with, leaving, and re-entering the XR space had multiple issues.
  • NFT pipeline: I started simple by writing a character sheet by hand. Based on that, I extracted an overall structure and instructed an AI to use this structure to generate a full character sheet based on the mesh name. Rendering an image and animation for each model for the block explorer to pick up was another struggle. Interacting with Apillion's Web3 SDK took quite some time to grasp. This turned out to be worth it because it removed the headache of creating my own smart contract, though I needed to understand their inner workings.
  • AssetView: Integrating the wallet was surprisingly painless. Getting the mint button to work took some tinkering, and deciding not to include a transfer button was a good call. Displaying a ghost certificate was solid because the metadata was well structured, which eliminated any listing headaches.
  • Laser-cutting the box: Initially, I thought about ordering a simple box via Vistaprint but was shocked by the pricing for a custom box. So, I reused an older layout for laser-cutting the box. Unfortunately, due to time constraints, it didn’t make it for Christmas. I ended up packing Kellogg's packages together—it worked okay but wasn’t ideal.
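The AssetView behavior described above boils down to two small pieces of logic. The `/api/mint` path and the request fields below are assumptions for illustration; only the overall flow (frontend → tiny backend → Apillion) comes from the project:

```javascript
// Decide what AssetView shows, based on the ghosts the embedded wallet owns.
function assetViewState(ownedGhosts) {
  if (ownedGhosts.length > 0) {
    // Already minted: hide the mint button, show the certificate instead.
    return { showMintButton: false, certificate: ownedGhosts[0] };
  }
  return { showMintButton: true, certificate: null };
}

// Shape the request the mint button would send to the tiny backend
// (which in turn calls Apillion). Endpoint and fields are hypothetical.
function buildMintRequest(walletAddress, secret) {
  return {
    url: "/api/mint",
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ wallet: walletAddress, secret }),
  };
}
```

Keeping this decision logic pure makes it easy to test without a wallet or a network connection.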

Accomplishments that we're proud of

It works.

Family and friends received a present.
They saw ghosts. They spoke to ghosts. They entered the blockchain world.

The younger generation loved it.
The older generation had questions about the functionality.

Looking back 3.5 years, oh man, it's amazing.
Understanding and acquiring all the pieces needed to build something like this from scratch makes me really proud. We're talking about UI/UX design, frontend development, backend development, 3D modeling, and crypto development—a "one-man army" right here.

3.5 years ago, I started this journey from 0 to Web3 AR dev. The AI revolution just threw oil on the fire. The fact that holograms can speak with just a few lines of code—pure heaven.

What we learned

Mostly integration.

I built multiple WebAR applications.
I built multiple backend applications.
I built multiple frontend applications.
I built multiple decentralized applications.
I built multiple 3D models.
I built multiple ...

But combining and seamlessly integrating all the pieces was definitely a key lesson.
One Lego brick is simple; a whole Lego set is complex. Kind of like that.

What's next for Boxgeist

I'll take what I learned and port it to the next project.

My overall goal is to create a Web3 AR shop. With the rise of Voice AI, this opens up a whole new scope of opportunities.

A Web3 AR shop is a simple online shop that allows you to buy products.
Every product is special. Every product has a hologram.


Updates

posted an update

Hola :wave:

Notes:

  • On some Google Chrome browsers on certain Android devices, the voice button will not activate.
  • iPhone does not display the holograms. To view them, the XR Viewer needs to be downloaded.
  • The WebXR API is not enabled on iPhone by default. Apple's Vision Pro has WebXR enabled in visionOS 2, and the WebXR community (myself included) speculates that Safari on iOS will enable it in a stable release sometime this year. There are workarounds to make it work, but they involve additional costs. For prototyping, or for gifts for family and friends, I'll avoid those costs. :see_no_evil:
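The capability check behind these notes fits in a few lines. It is written against a navigator-like argument so the logic can also run outside a browser; `isSessionSupported("immersive-ar")` is the standard WebXR way to probe AR support:

```javascript
// Returns true only when the WebXR API exists and immersive AR is supported.
// On iOS Safari today this returns false because navigator.xr is missing.
async function supportsImmersiveAR(nav) {
  if (!nav.xr || typeof nav.xr.isSessionSupported !== "function") {
    return false;
  }
  return nav.xr.isSessionSupported("immersive-ar");
}

// In the browser: const ok = await supportsImmersiveAR(navigator);
```

A check like this lets the app hide the AR button gracefully instead of failing mid-session.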

Yesterday, I showed the app to a friend over a beer. The most surprising discussion point: "talking to ghosts" worked flawlessly on my phone, but it didn't work on his Android device.

Cheers :beers:


posted an update

Hola :wave:

I got a hint, so a quick update. :)
In the video, I said, "100 Ghosts are yours to find." I meant it literally!

How to Get a Ghost

  1. Visit boxgeist.com/#/assets/.
  2. Log in with a simple email.
  3. Press "Mint" to get your ghost.
    • No Metamask or Blockchain knowledge required.
    • Even Grandma managed to get a ghost on Christmas :smile:

How to Talk to the Ghosts

  1. Click on any ghost.
  2. Ask, "Who are you?"
    • The ghost will tell you.

How Unique Responses Work

  • Visit the metadata.
  • Each metadata file has a prompt_url field.
    • This field links to markdown files that are fed into OpenAI Voice Chat to generate responses.
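Wiring this up could look like the sketch below. The `prompt_url` field really exists in the metadata; the function names and everything else are assumptions for illustration:

```javascript
// Pull the prompt link out of one metadata file.
function extractPromptUrl(metadata) {
  if (typeof metadata.prompt_url !== "string") {
    throw new Error("metadata has no prompt_url field");
  }
  return metadata.prompt_url;
}

// The markdown fetched from prompt_url becomes the voice chat's system prompt.
function buildSystemPrompt(characterMarkdown) {
  return characterMarkdown.trim();
}
```

Because each ghost's markdown differs, the same voice pipeline produces a different personality per ghost.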

How It Was Built

  1. Blog: Learn how it was built in the Blog.
  2. Podcast:
    • On the last day, I compiled all my diary entries into one file and made a podcast.
    • It’s more narrative and less technical.
  3. Source Code:
    • I kept the source code readable for AIs and humans.
    • Example: The workings of WebAR are here.

Want a Physical Box?

  • Email Me: I’ll cut another one for you. It's not much work.
  • DIY Option: Visit a nearby hackerspace/makerspace—they usually have laser cutters. The necessary SVG files are in the source code.

The Idea in a Nutshell

  • "Nice box, nice ghost."
    On Christmas, I threw some IT notions into the mix (Frontend, AR, Blockchain, Server). Ufff, that didn't help to "sell" the present—it just sparked confusion. "Nice box, nice ghost." This keeps the discussion lighthearted and avoids opening the bottle of IT blablabla. Lesson learned. :see_no_evil:

Cheers :coffee:
