Inspiration
The AI landscape is evolving faster than any course can keep up with. As computer science students, we constantly hear about new models, frameworks, and tools, but there is no clear way to learn what they do, how they differ, or when to use them. Reading lists of tools does not build real understanding. The real skill developers need is judgment: the ability to look at a project and know which tools are the right choice.
We wanted to build something that helps students learn the AI ecosystem in a hands-on way instead of passively reading about it. That idea led to Stack or Crack, a game where learning happens through real decisions.
What it does
Stack or Crack is a gamified learning platform that teaches the modern AI ecosystem by making players build real tech stacks.
Players are given a project scenario, such as building an AI assistant or a startup product, along with constraints like budget or required features. They must choose the tools they want to use, including language models, frameworks, databases, and APIs.
While building their stack, players can explore each tool to learn what it does, what it is best used for, and how it compares to other options. This lets players understand the ecosystem while making decisions, instead of memorizing tools from a list.
After submitting their stack, an AI judge evaluates their choices and gives feedback tool by tool. The judge explains which decisions were strong, which were unnecessary, and which requirements were missed.
By comparing the player’s stack with an expert stack, the game helps users build intuition about when and why to use different tools.
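As a rough illustration of the comparison idea, a naive baseline would be to measure how much of the expert stack the player also picked. This is only a sketch (the function name and scoring are ours, not the game's code); the actual evaluation is done by the AI judge and goes well beyond set overlap:

```typescript
// Hedged sketch: a naive overlap score between a player's stack and an
// expert reference stack. The real game uses an AI judge for nuanced,
// tool-by-tool feedback; this only illustrates the comparison step.
function stackOverlap(playerStack: string[], expertStack: string[]): number {
  const expert = new Set(expertStack.map((t) => t.toLowerCase()));
  const player = new Set(playerStack.map((t) => t.toLowerCase()));
  let matched = 0;
  for (const tool of player) {
    if (expert.has(tool)) matched++;
  }
  // Fraction of the expert stack the player also chose.
  return expert.size === 0 ? 1 : matched / expert.size;
}

// Example: the player matched 2 of the 3 expert picks.
const score = stackOverlap(
  ["next.js", "claude-haiku", "mongodb"],
  ["next.js", "claude-haiku", "pinecone"]
);
console.log(score.toFixed(2)); // → "0.67"
```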
How we built it
We built Stack or Crack using Next.js for both the frontend and backend, with Tailwind CSS for styling the interface.
The evaluation system is powered by Claude Haiku, which acts as the AI judge. It analyzes the user’s selected tools, compares them to an expert solution, and generates detailed feedback explaining why each choice does or does not fit the scenario.
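To picture how the judge is fed, the prompt might be assembled roughly like this (a simplified sketch: `buildJudgePrompt`, the `Scenario` shape, and the wording are our illustration, not the production prompts, which went through several tuning iterations):

```typescript
// Hedged sketch of assembling the AI judge prompt. The exact prompt
// wording in the real system was iterated on heavily; this shows shape only.
interface Scenario {
  title: string;
  constraints: string[]; // e.g. budget limits, required features
}

function buildJudgePrompt(
  scenario: Scenario,
  userStack: string[],
  expertStack: string[]
): string {
  return [
    "You are an expert software architect judging a tech stack.",
    `Scenario: ${scenario.title}`,
    `Constraints: ${scenario.constraints.join("; ")}`,
    `Player's stack: ${userStack.join(", ")}`,
    `Expert reference stack: ${expertStack.join(", ")}`,
    "For each tool in the player's stack, explain whether it fits the",
    "scenario, is unnecessary, or misses a requirement. Be specific.",
  ].join("\n");
}
```

A string like this would then be sent to Claude Haiku from the backend, and the model's response rendered as the tool-by-tool feedback.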
We designed the tool selection system to be modular so new AI tools can easily be added in the future. This allows the platform to stay relevant even as the AI ecosystem continues to grow.
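The modular design can be pictured as a small tool registry where keeping up with the ecosystem means appending one more entry (a sketch with made-up field names, not the actual schema):

```typescript
// Hedged sketch of a modular tool registry: each tool is a plain data
// record, so adding a new model or framework is one more entry and
// requires no code changes elsewhere in the platform.
interface AiTool {
  id: string;
  category: "model" | "framework" | "database" | "api";
  bestFor: string;
}

const toolRegistry = new Map<string, AiTool>();

function registerTool(tool: AiTool): void {
  toolRegistry.set(tool.id, tool);
}

// Seeding the registry; a future auto-updater could call registerTool
// with entries pulled from external sources.
registerTool({ id: "claude-haiku", category: "model", bestFor: "fast, low-cost evaluation" });
registerTool({ id: "next.js", category: "framework", bestFor: "full-stack React apps" });

console.log(toolRegistry.get("claude-haiku")?.bestFor);
```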
In the future, the system could pull trending tools automatically from sources like Hugging Face or Product Hunt to keep the game up to date.
Challenges we ran into
The biggest challenge was time: many planned features, including several quality-of-life improvements and additional gameplay scenarios, did not make it into the final build. One feature we especially wanted was automatic updating of the AI tool landscape, where the platform could pull new models and frameworks from external sources so the catalog always stays current.
Another challenge was designing the interface without a dedicated UI/UX designer. Because of this, a lot of time went into making the layout intuitive while also keeping the tool selection system clear and easy to use. Balancing functionality, usability, and speed of development was harder than expected.
We also spent significant time designing the AI judge prompts so that the feedback felt meaningful instead of random. Getting consistent, useful explanations from the model required multiple iterations and careful tuning.
Accomplishments that we're proud of
We are proud that Stack or Crack is a fully functioning platform, especially with an AI judge that gives meaningful, tool-by-tool feedback. The system evaluates user stacks and provides insights that feel realistic and helpful, which was a major technical achievement.
We are also proud that the project turned a complex, rapidly evolving AI ecosystem into something interactive and understandable. Players are not just memorizing tools. They are learning to make decisions and gaining intuition about when and why to use specific models and frameworks.
Finally, we are proud to have created a tool that we ourselves find genuinely useful. It reinforces our own learning while providing a fun, hands-on way to explore AI tools. This shows the idea works in practice, not just in theory.
What we learned
A lot of us were trying things for the first time, like integrating the backend and working with the AI judge powered by Claude Haiku. We also learned how tricky it can be to design a platform as you build it, especially without a dedicated UI/UX person on the team.
Working on the interface taught us how important it is to make things clear and intuitive, and how much iteration it takes to balance usability with technical complexity.
Another big takeaway was how helpful it was to communicate our ideas with mentors and sponsors. Getting outside perspectives helped us spot issues and improve the project in ways we might not have thought of on our own.
Overall, we learned that building a system that actually teaches decision-making is hard, but it was exciting to see our ideas take shape in a way that works in practice.
What's next for Stack or Crack
Next, we want to add more ways for players to learn and explore the AI ecosystem. One idea is to ground or fine-tune the AI judge so its feedback stays accurate, unbiased, and up to date as the ecosystem changes.
We also want to add a news section where players can learn about new tools and updates in the AI space, and a history feature that saves stacks and tracks past decisions. Another mode could let players submit their own projects and see what an expert stack for them would look like.
In the longer term, we hope to expand beyond just AI tools to include system architecture decisions and workflows. The goal is for the platform to stay current with the latest developments in the tech landscape while giving players a way to practice real decision-making.
FAQ
Q1. How do you determine the “expert stack”?
- For each project scenario, we put together a curated expert stack based on current best practices and common tools in the field. The AI judge then compares the player’s choices to this reference stack while also considering constraints like budget, features, and functionality.
Q2. Why gamify the experience instead of just listing tools?
- Just reading lists of AI tools doesn’t teach you how to make real decisions. By turning it into a game, players have to choose tools under constraints, which helps them build practical judgment and develop intuition about when and why to use certain models.
Q3. Who is Stack or Crack for?
- Our main audience is CS students and early-career developers trying to navigate the fast-changing AI ecosystem. It’s also useful for anyone wanting a hands-on way to learn new tools and build decision-making skills, not just memorize features.
Built With
- claude-haiku
- next.js
- tailwindcss