Inspiration

We believe memes will revolutionize future shopping: they are highly viral, engaging, and memorable. That belief led to "67", which turns a simple, seemingly innocent, and distinctive hand gesture into a compelling shopping interaction. Traditional screen-based interfaces are less engaging; we theorize that integrating hand movements via computer vision will significantly enhance memory recall, since research suggests that physical action boosts memory retention.

What It Does

The "67" store currently focuses on three product categories: fitness equipment, musical instruments, and apparel.

  • REWEAR: The interface is inspired by a dating app model, using swipes—left to dismiss, right to like—and the "67" gesture for a "Superlike." Product images are tagged with features (e.g., cotton, blue, baggy). The Superlike action triggers the system to provide highly personalized and accurate recommendations, creating a tailored shopping experience.

  • Music: Users employ hand gestures for controlling aspects of music, such as guitar chords, pitch modulation, and effects (like fade-out or volume adjustments). The system tracks finger positions via camera, eliminating the need for prior musical knowledge. This lowers the barrier to entry, encouraging exploration and purchase.

  • Fitness: The "67" gesture is integrated as a bicep curl motion. The system monitors the user's form during exercises (e.g., dumbbell lifts), offering real-time feedback on posture and timing to help the user improve efficiency.

All functionalities are powered by computer vision technology that tracks movement, resulting in a more dynamic and memorable shopping experience.
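To make the fitness form check concrete, here is a minimal Haskell sketch of one way it could work, assuming the camera pipeline delivers three 2D landmarks (shoulder, elbow, wrist) per frame. The names `elbowAngle` and `curlPhase`, and the 60° flexion threshold, are illustrative assumptions, not the project's actual implementation.

```haskell
type Point = (Double, Double)

-- Angle at the elbow, in degrees, via the dot product of the two limb vectors.
elbowAngle :: Point -> Point -> Point -> Double
elbowAngle (sx, sy) (ex, ey) (wx, wy) =
  let (ux, uy) = (sx - ex, sy - ey)   -- elbow -> shoulder
      (vx, vy) = (wx - ex, wy - ey)   -- elbow -> wrist
      dotP  = ux * vx + uy * vy
      mags  = sqrt (ux * ux + uy * uy) * sqrt (vx * vx + vy * vy)
  in acos (max (-1) (min 1 (dotP / mags))) * 180 / pi

-- A curl counts as "up" when the arm is tightly flexed.
data Phase = Up | Down deriving (Eq, Show)

curlPhase :: Double -> Phase
curlPhase angle = if angle < 60 then Up else Down
```

Counting transitions between `Up` and `Down` over successive frames then yields rep counts and timing feedback.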

How We Built It

Developing the real-time posture analysis required extensive data logging. We used Haskell's functional programming capabilities together with fuzzy control theory, which let us process and analyze this data efficiently despite noise and sudden spikes in the readings. The results were then fed into a trained Large Language Model (LLM) to generate product recommendations.
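The fuzzy-control idea can be sketched as follows, assuming the pipeline reduces each rep to an angular-velocity reading. Triangular membership functions grade a noisy reading by degree rather than hard-thresholding it, so borderline values degrade gracefully. All names and breakpoints here (`tri`, `tempoGrades`, `feedback`) are illustrative assumptions.

```haskell
-- Triangular membership function: peaks at b, zero outside (a, c).
tri :: Double -> Double -> Double -> Double -> Double
tri a b c x
  | x <= a || x >= c = 0
  | x <= b           = (x - a) / (b - a)
  | otherwise        = (c - x) / (c - b)

-- Fuzzy grades (too slow, good, too fast) for a rep's angular velocity
-- in degrees per second; the breakpoints are assumed values.
tempoGrades :: Double -> (Double, Double, Double)
tempoGrades v = (tri 0 30 60 v, tri 40 90 140 v, tri 120 180 240 v)

-- Pick the dominant label for user-facing feedback.
feedback :: Double -> String
feedback v =
  let (slow, ok, fast) = tempoGrades v
  in if ok >= slow && ok >= fast
       then "good tempo"
       else if slow > fast then "speed up" else "slow down"
```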

A major initial hurdle was the LLM's tendency to repeat the same suggestions after a single interaction, acting like a stateless, one-shot function. We overcame this by implementing temporary memory and refining the prompts to ensure the model retained conversational history and offered diverse recommendations.
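A simplified sketch of the "temporary memory" fix: track which products the model has already suggested and exclude them from the next round's candidates. The real system feeds full conversational history to the LLM; the names here (`Session`, `remember`, `nextCandidates`) are illustrative, not the project's actual API.

```haskell
import Data.List (nub, (\\))

-- Per-user session state: everything the LLM has already recommended.
newtype Session = Session { suggested :: [String] } deriving Show

emptySession :: Session
emptySession = Session []

-- Record what the LLM just recommended, deduplicating.
remember :: Session -> [String] -> Session
remember s items = s { suggested = nub (suggested s ++ items) }

-- Candidates allowed in the next prompt: everything not already shown.
nextCandidates :: Session -> [String] -> [String]
nextCandidates s catalog = catalog \\ suggested s
```

Threading this state through each interaction turns the stateless, one-shot call into a conversation that avoids repeating itself.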

Challenges We Encountered

  • AI Memory: The LLM was initially forgetful, repeatedly suggesting the same products.

  • Processing Power: Analyzing real-time gestures was computationally demanding; Haskell provided the necessary efficiency.

  • Vision Accuracy: Finger and pose tracking was inconsistent under poor lighting or with shaky hands, necessitating the development of robust filtering algorithms.
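One robust-filtering technique for the jitter problem above, sketched under the assumption that landmarks arrive as a stream of 2D points: an exponential moving average damps camera noise and shaky hands. The smoothing factor `alpha` and function names are assumptions, not the project's actual implementation.

```haskell
type Pt = (Double, Double)

-- Blend a new noisy sample with the running estimate;
-- alpha near 1 trusts new samples, alpha near 0 trusts history.
ema :: Double -> Pt -> Pt -> Pt
ema alpha (px, py) (x, y) =
  (alpha * x + (1 - alpha) * px, alpha * y + (1 - alpha) * py)

-- Smooth a whole stream of samples, seeded with the first reading.
smooth :: Double -> [Pt] -> [Pt]
smooth _ []         = []
smooth alpha (p:ps) = scanl (ema alpha) p ps
```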

Accomplishments We're Proud Of

We successfully integrated computer vision with an LLM to create a functional and helpful product.

What We Learned

Gestural Interaction: Physical gestures create a deeper, more tangible connection between the user and the AI.

LLM Tuning: Prompt engineering is often a more effective solution for improving LLM performance than simply increasing hardware resources.

UX/Sales: Integrating fun, memorable elements (like memes) creates an addictive user experience that naturally drives sales.

What's Next for 67 Store

Our plan is to scale the platform by:

Augmented Reality (AR): Integrating AR functionality to allow users to "try on" items, such as the baggy shirt, while performing the "67" gesture.

Collaborations: Partnering with key content creators and meme influencers for exclusive product drops.

Global Expansion and Accessibility: Introducing voice-command options for accessibility and exploring a blockchain-based reward system where users earn points for verified gestures.

Built With

Haskell, computer vision, and a Large Language Model (LLM).