Web Enclave (https://webenclave.com/): Building Apps for the Modern Age

# OpenFret: The All-in-One Platform I Built for Guitarists

Thu, 29 Jan 2026
As a guitarist and software developer, I’ve always been frustrated by the fragmented tools we use. One app for tracking your gear, another for practice logs, something else for learning scales—it’s messy. So I built OpenFret, and today I want to share what it does and why I think it fills a real gap for musicians.

## What is OpenFret?

OpenFret is an all-in-one platform designed specifically for solo musicians, particularly guitarists. It combines gear tracking, practice analytics, AI-powered lessons, real-time collaboration, and community features into a single, cohesive experience. Think of it as your personal guitar command center.

## Smart Guitar Inventory

One of the features I’m most proud of is the smart inventory system. You can track every guitar you own—from 6-strings to 8-strings—with auto-filled details from thousands of brands. The system monitors everything: woods, pickups, fingerboard radius, tunings, and even when you last changed your strings. No more forgetting which guitar has which setup.

## Practice Analytics That Actually Help

Practice is where growth happens, but most of us don’t track it well. OpenFret includes persistent timers, a built-in metronome, scale visualization tools, and fretboard maps. There’s also Last.fm integration for detailed song tracking, so you can see patterns in what you’re learning and playing.

## Session Mode: Git for Musicians

This is where things get interesting. I’ve basically built version control for music. Like GitHub for code, Session Mode lets you fork sessions, layer tracks, see version history, and collaborate with musicians worldwide. You can download stems, merge contributions, and actually see the evolution of a piece. It’s collaboration done right.

## Musical Helpers to Break the Rut

We’ve all been stuck in the E minor pentatonic box. The musical helpers feature includes scale references, chord progressions, fretboard visualization, key selection, and tempo control. It’s designed to push you out of your comfort zone and into new musical territory.

## A Community That Gets It

OpenFret isn’t just a tool—it’s a community. You can search for other musicians, discover gear combinations, leave comments, get notifications, and connect your Discord. The hearts and comments system lets you discover what setups other players are using and get inspired by their rigs.

## Built-in Guitar Tuner

Yeah, there are a thousand tuner apps. But having one built into your all-in-one platform means one less thing to switch between. The tuner includes frequency control (Hz adjustment), sharp/flat detection, and beginner-friendly instructions. Perfect for when you’re stretching new strings.

## The RPG Experience: Level Up Your Playing

Here’s something I’m really excited about—OpenFret includes a full RPG-style progression system to make practicing feel like a game. Earn XP for every practice session, song learned, and technique mastered. Level up your character, unlock achievements, and track your journey from Beginner Bard to Guitar God.

The skill tree system lets you specialize in different areas: fingerpicking, shredding, jazz improvisation, or rhythm playing. Complete daily quests and weekly challenges to earn rewards and keep your practice streak alive. There’s even a leaderboard if you’re feeling competitive.

It’s gamification done right—not gimmicky, but genuinely motivating. I built this because I know how hard it is to stay consistent with practice. When you can see your progress visually and unlock new abilities, it changes how you approach the guitar.

## Try It Out

OpenFret is free to start—no credit card required. If you’re a guitarist looking to level up your organization, practice routine, and connection with other musicians, I’d love for you to give it a shot.

Check it out at [openfret.com](https://openfret.com) and let me know what you think in the comments.

Debugging Python Code Like a Professional: My Journey and Tips

Fri, 14 Feb 2025
Debugging can feel like navigating a maze blindfolded. I mean, Python’s ecosystem is a blessing and a curse—on one hand, it’s insanely powerful, and on the other, tracking down those pesky bugs can sometimes drive you up the wall. Over the years, I’ve been through every possible debugging scenario, from staring at cryptic stack traces to trying to tame performance issues in a production environment. Today, I’m excited to share my personal journey and some battle-tested techniques for debugging Python code like a pro—whether you’re a grad student, a DevOps guru, or a seasoned software engineer.

The Basics of Debugging Python

Let’s kick things off with the fundamentals. I’ve spent countless hours wrestling with bugs that seemed to come out of nowhere. Here are some basics that have saved me more times than I can count:

  • Breakpoints: These are your best friends. They let you pause execution and take a good, hard look at what’s happening inside your program.
  • Stepping Through Code: Whether you’re stepping over, into, or out of functions, this technique helps you follow the program’s flow and understand exactly where things go awry.
  • Variable Inspection: Ever wished you could just see what your variables are up to at any given moment? Hover over them or use your IDE’s variables panel to get the scoop.
  • Stack Traces: When your program throws an exception, the stack trace is like a breadcrumb trail leading back to the source of the problem.

These basics are where every debugging journey starts. Now, let’s talk about one of my favorite tools—Visual Studio Code.

Getting Down with VS Code

If you’re not using VS Code for Python development yet, you’re in for a treat. This editor has been my go-to for years, and its debugging features are nothing short of stellar.

Setting Up Your Debug Environment

First things first: make sure you have the Python extension installed. Then, you’ll need a launch.json file in your .vscode folder to define your debug configurations. Here’s a simple setup I’ve used countless times:

{
  "version": "0.2.0",
  "configurations": [
    {
      "name": "Python: Current File",
      "type": "debugpy",
      "request": "launch",
      "program": "${file}",
      "console": "integratedTerminal"
    }
  ]
}

Also, don’t forget to pick the right Python interpreter for your project. Trust me—nothing ruins your debugging vibe faster than running the wrong version of Python.

Breakpoints, Stepping, and Variable Inspections

  • Setting Breakpoints: Just click next to the line numbers in your editor. It’s as simple as that, and VS Code will highlight the line for you.
  • Stepping Through: Use F10 to step over, F11 to step into functions, and Shift+F11 to step out. It’s like having a remote control for your code.
  • Variable Inspection: When you hit a breakpoint, hover over any variable to see its current value, or check out the Variables panel to get a broader view.

Conditional Breakpoints and Log Points

Sometimes, you only want to stop the execution when something specific happens. That’s where conditional breakpoints come in. Right-click on a breakpoint, select “Edit Breakpoint,” and enter a condition expression (an expression, not a full statement). For instance:

user.id == target_id

The breakpoint then pauses execution only when this expression evaluates to true.

And if you prefer not to halt your execution but still want to monitor what’s happening, use Log Points. They print out messages to the debug console without pausing your code—a lifesaver when you’re chasing a tricky bug in a tight loop.

Advanced Debugging Tricks

Once you’ve mastered the basics, it’s time to dive into some advanced techniques that have personally helped me tackle the nastiest of bugs.

Stack Traces and Exception Handling

Stack traces are like the crime scene photos of your code’s errors—they tell you exactly where things went wrong. Here’s an example:

Traceback (most recent call last):
  File "app.py", line 42, in <module>
    main()
  File "app.py", line 35, in main
    process_data(data)
  File "processor.py", line 15, in process_data
    result = data[undefined_key]
KeyError: 'undefined_key'

By carefully following the stack trace, you can pinpoint that missing key error. And of course, wrapping code in try-except blocks allows you to gracefully handle these errors and log them for later analysis:

try:
    result = data[undefined_key]
except KeyError as e:
    logging.error("Encountered a KeyError: %s", e)
    # Additional handling here

Profiling for Performance Bottlenecks

Sometimes the problem isn’t a crash—it’s that your code is just too slow. Profiling lets you see where your program is spending its time. I’ve found tools like cProfile and Py-Spy to be indispensable:

  • cProfile: A built-in Python profiler that’s great for getting a quick snapshot:

      import cProfile
      cProfile.run('my_function()')

  • Py-Spy: This tool attaches to a running process and samples what your code is doing without significant overhead.

Integrating these profilers into your debugging routine can help you spot and fix performance issues before they become a headache in production.
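To make the cProfile workflow concrete, here’s a minimal, self-contained sketch (the `slow_sum` function is just an invented stand-in for your own hot path): it profiles one call and prints the most expensive entries sorted by cumulative time.

```python
import cProfile
import io
import pstats

def slow_sum(n):
    # Deliberately naive: builds an intermediate list just to sum it.
    return sum([i * i for i in range(n)])

profiler = cProfile.Profile()
profiler.enable()
slow_sum(100_000)
profiler.disable()

# Sort by cumulative time and show the five most expensive calls.
stream = io.StringIO()
pstats.Stats(profiler, stream=stream).sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

In a real investigation you’d raise the entry count or filter `print_stats` by module name to narrow the report to your own code.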

Logging Like a Boss

Effective logging is like leaving breadcrumbs behind—you can trace back your steps when things go wrong. I used to underestimate logging until I found myself sifting through endless error reports. Here’s what I do:

  • Use Python’s Logging Module: Set up logging with different severity levels. A quick setup looks like this:

      import logging

      logging.basicConfig(
          level=logging.DEBUG,
          format='%(asctime)s - %(levelname)s - %(message)s'
      )
      logging.debug("Debugging message: Here’s what’s happening...")
  • Structured Logging: In an enterprise setting, structured logging (think JSON) makes it much easier to filter and analyze logs using centralized systems like ELK Stack.
  • Context is King: Always include as much contextual information as possible. Whether it’s user IDs, request IDs, or even just timestamps, these details can make all the difference when you’re trying to follow a trail of breadcrumbs.
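As a sketch of the structured-logging idea using only the standard library (the formatter class and field names here are my own choices, not a standard), each record becomes one JSON object per line that a centralized system like the ELK Stack can index directly:

```python
import io
import json
import logging

class JsonFormatter(logging.Formatter):
    """Emit each log record as one JSON object per line."""
    def format(self, record):
        payload = {
            "time": self.formatTime(record),
            "level": record.levelname,
            "message": record.getMessage(),
            # Context passed via the `extra` kwarg shows up as record attributes.
            "request_id": getattr(record, "request_id", None),
        }
        return json.dumps(payload)

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("app")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

logger.info("user login", extra={"request_id": "abc-123"})
print(stream.getvalue())
```

In production you’d attach the formatter to a file or stdout handler instead of an in-memory stream.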

Debugging in Production

Debugging in production is a whole different ballgame. Unlike local development, you’re working in an environment where you can’t just restart the whole system every time something goes wrong.

Tracing and Observability in Enterprise Settings

  • Distributed Tracing: In a microservices architecture, I rely on tools like Jaeger or Zipkin to track requests as they bounce from one service to another. It’s like having a GPS for your data.
  • Centralized Logging: Aggregating logs from multiple sources into one place (using services like Splunk, ELK, or even cloud-native solutions) is crucial. This way, you can correlate events across services.
  • Monitoring and Alerting: Tools like Prometheus and Grafana help me keep an eye on system health. If something goes off the rails, alerts notify me before it turns into a full-blown crisis.
  • Remote Debugging: When the situation gets critical, and I need to get inside a live process, remote debugging (done cautiously, of course) can be a game changer. It’s a delicate operation, but sometimes it’s the only way to see what’s really going on.

Real-World Scenarios

Over the years, I’ve encountered a few scenarios that really tested my debugging skills. Here are some quick stories and how I tackled them:

Runtime Exceptions and Errors

  • The Mystery KeyError: I once had a bug that only happened in production. By reproducing the issue locally, adding strategic breakpoints, and inspecting the variable states, I discovered that a particular API call was returning an unexpected format. A simple fix later, and the bug was history.

Memory Leaks

  • The Disappearing Memory: Debugging memory leaks can be maddening. I used tools like memory_profiler and tracemalloc to monitor memory consumption, eventually identifying a subtle bug in a caching mechanism that was hoarding memory. Fixing that leak made a world of difference.
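For reference, here is roughly how a tracemalloc hunt starts (the never-evicting `cache` below is a contrived stand-in for the real caching bug I hit): snapshot the heap and ask where the surviving allocations came from.

```python
import tracemalloc

tracemalloc.start()

# Simulate a cache that never evicts entries (a classic slow leak).
cache = {}
for i in range(10_000):
    cache[i] = "payload-" * 10

# Group still-live allocations by the source line that made them.
snapshot = tracemalloc.take_snapshot()
for stat in snapshot.statistics("lineno")[:3]:
    print(stat)

current, peak = tracemalloc.get_traced_memory()
print(f"current={current} bytes, peak={peak} bytes")
tracemalloc.stop()
```

Taking two snapshots and diffing them with `snapshot.compare_to` is the usual next step once you suspect a particular code path.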

Performance Bottlenecks

  • Slow as Molasses: Sometimes, code just doesn’t run fast enough. In one project, I used cProfile and later switched to Py-Spy to pinpoint a function that was eating up CPU time. Once I optimized that section, performance improved dramatically.

Distributed Systems

  • Lost in the Maze of Microservices: In a recent project, correlation IDs became my secret weapon. By passing unique IDs through each service, I could piece together the full journey of a request—even when it spanned multiple systems.
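A minimal sketch of the correlation-ID pattern inside a single Python process, using `contextvars` so the ID follows the request even across async boundaries (names like `handle_request` are illustrative, not from the real project):

```python
import contextvars
import io
import logging
import uuid

# Every log line emitted while handling a request carries that request's ID.
correlation_id = contextvars.ContextVar("correlation_id", default="-")

class CorrelationFilter(logging.Filter):
    def filter(self, record):
        record.correlation_id = correlation_id.get()
        return True

stream = io.StringIO()
handler = logging.StreamHandler(stream)
handler.setFormatter(logging.Formatter("%(correlation_id)s %(message)s"))
handler.addFilter(CorrelationFilter())

logger = logging.getLogger("service")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

def handle_request():
    # In a web service this ID would come from an incoming header instead.
    correlation_id.set(str(uuid.uuid4()))
    logger.info("processing request")

handle_request()
print(stream.getvalue())
```

In a microservices setup, each service would read the ID from a request header, set the context variable, and forward the same header downstream.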

Final Thoughts

Debugging might not always be glamorous, but it’s one of the most rewarding parts of being a developer. From those early days of staring blankly at stack traces to mastering remote debugging in production, I’ve learned that a bit of patience and the right tools can turn even the most cryptic bug into a solvable puzzle.

I hope these tips and stories help you tackle your own debugging adventures with more confidence and a smile on your face. Remember: every bug is an opportunity to learn something new (even if it sometimes feels like the universe is trying to drive you crazy).

A No-Nonsense Guide to Python Environments with uv

Fri, 31 Jan 2025
Python’s ecosystem is a blessing and a curse. On one hand, it’s ridiculously powerful, but on the other, managing environments and dependencies can be a pain. I’ve gone through just about every possible way to set up a Python environment—venv, pipenv, Poetry, pyenv, conda, even rolling my own Dockerized setups.

But recently, I came across uv, a new tool from the folks who built ruff (one of the fastest Python linters out there), and it immediately caught my interest.

After playing around with it, I have to say: I’m impressed. uv is blazing fast, well thought out, and combines features from several other tools I was using separately. If you’re someone who gets annoyed at how long Python dependency installation takes, you might want to give uv a try.

What is uv?

At its core, uv is an extremely fast package manager, project manager, and Python version manager. It’s written in Rust, which explains why it’s absurdly quick compared to pip and Poetry.

But it’s not just a package installer—uv aims to replace a bunch of tools in the Python ecosystem:

  • pip & pip-tools → Installs and syncs dependencies
  • pipx → Runs Python CLI tools in isolated environments
  • Poetry → Manages dependencies and lockfiles
  • pyenv → Installs and manages Python versions
  • virtualenv → Handles virtual environments
  • twine → Publishes packages

So instead of juggling 4+ tools, uv tries to do it all in one go. And so far, it does a good job of it.

Why You Might Want to Use uv

Before I jump into installation and usage, let’s quickly go over why you might even care about switching to uv.

  1. It’s ridiculously fast. Installing dependencies with uv is 10x-100x faster than pip. Yes, seriously. If you’re tired of staring at “Collecting dependencies…” for minutes, uv is your new best friend.
  2. It replaces multiple tools. If you’re using pyenv, Poetry, and pipx separately, you can cut down your toolchain complexity.
  3. Python version management built-in. No need to rely on pyenv to manage different versions of Python—uv does it natively.
  4. Reproducible environments. Thanks to lockfiles (uv.lock), you get fully deterministic installs, similar to Poetry.
  5. It just works. No messing with shell scripts, PATH issues, or activation quirks. It’s smooth.

Installing uv

The folks at Astral (the creators) made uv super easy to install. Here’s how you do it:

📦 Install uv (the recommended way)

On macOS/Linux

curl -LsSf https://astral.sh/uv/install.sh | sh

On Windows (PowerShell)

powershell -ExecutionPolicy ByPass -c "irm https://astral.sh/uv/install.ps1 | iex"

Alternatively, via pip or pipx (my favorite way)

pip install uv

Once installed, you can check if it’s working:

uv --version

Setting Up a New Python Project with uv

Alright, let’s say you’re starting a new Python project. Normally, you’d go through the whole:

python -m venv .venv
source .venv/bin/activate
pip install ...
pip freeze > requirements.txt

With uv, it’s way simpler:

uv init my_project
cd my_project

This sets up a pyproject.toml file (like Poetry), and uv creates a virtual environment (.venv) inside the project automatically the first time you add a dependency or run your code.

Adding dependencies? Just run:

uv add requests

This:
✅ Installs requests in a snap
✅ Updates your pyproject.toml
✅ Updates uv.lock (so you get reproducible installs)

Want to remove it?

uv remove requests

Running Python Code with uv

uv has an easy way to run scripts inside its environment without manually activating anything.

uv run python script.py

This ensures your script runs inside your project’s virtual environment, even if you’re not inside an activated shell session.
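uv can also run single-file scripts that declare their own dependencies via inline metadata (PEP 723). A minimal sketch; the script below needs nothing beyond the standard library, so its `dependencies` list is empty:

```python
# /// script
# requires-python = ">=3.9"
# dependencies = []
# ///
# Save this as hello.py and launch it with: uv run hello.py
# uv reads the inline metadata block above, provisions an isolated
# environment that satisfies it, and runs the script inside.
import sys

print(f"Hello from Python {sys.version_info.major}.{sys.version_info.minor}")
```

If the script declared real dependencies (say, requests), uv would install them into a cached, throwaway environment on first run, with no manual venv management at all.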

uv vs. Other Python Environment Tools

Let’s compare uv against some of the big players:

| Feature | uv | venv | pip | Poetry | pyenv |
| --- | --- | --- | --- | --- | --- |
| Installs Packages | ✅ | ❌ | ✅ | ✅ | ❌ |
| Lockfile Support | ✅ | ❌ | ❌ | ✅ | ❌ |
| Python Version Management | ✅ | ❌ | ❌ | ❌ | ✅ |
| Runs CLI Tools (pipx alternative) | ✅ | ❌ | ❌ | ❌ | ❌ |
| Reproducible Installs | ✅ | ❌ | ❌ | ✅ | ❌ |
| Speed | 🚀🚀🚀 | 🚶 | 🐢 | 🐇 | 🚶 |

If you’re currently using venv + pip, uv is a massive upgrade. If you’re using Poetry, it’s more of a lateral move, but the speed boost alone might make you switch.

Using uv in Docker

I always try to set up my projects so they work well in both local and containerized environments. uv makes that easier too.

Here’s how I set up a Dockerfile with uv:

FROM python:3.13-alpine

# Install uv
RUN pip install uv

# Set up the application
WORKDIR /app
COPY . /app

# Create the virtual environment
RUN uv venv .venv

# Ensure the environment is activated for `uv sync`
ENV VIRTUAL_ENV=/app/.venv
ENV PATH="$VIRTUAL_ENV/bin:$PATH"

# Install dependencies
RUN uv sync

# Run the application
CMD ["uv", "run", "python", "script.py"]

Final Thoughts: Should You Use uv?

After testing uv, I’m pretty convinced that this is the future of Python package management. The speed, simplicity, and feature set make it a no-brainer for anyone tired of sluggish pip installs or juggling multiple tools.

If you’re still using venv + pip, I highly recommend giving uv a shot. If you’re using Poetry, it’s at least worth testing—especially if you’re frustrated with Poetry’s occasional slowness.

At the very least, it’s another great tool to have in your Python toolbox. And who knows? You might just end up replacing your entire workflow with it.

Setting Up a Secure Local Network with Caddy, Cloudflare DNS, and Let’s Encrypt

Thu, 07 Nov 2024
As of 11/7/2024 — This is my home network software development setup. I have much more running than just Ollama, ChromaDb, etc. I also have several Postgres, Mongo, and other databases running in this setup.

Assumptions: You have a machine running Docker and have a local static IP set on that machine.

In this post, I’ll walk you through a setup for serving local network services securely over HTTPS, using Caddy, Cloudflare DNS, and Let’s Encrypt. This configuration allows you to have SSL certificates from Let’s Encrypt for your internal services, even though they’re only accessible within your local network. This setup is ideal if you want to avoid setting up a local certificate authority while still using HTTPS.

Overview of the Infrastructure

The core components of this setup include:

  • Caddy: A powerful web server with automatic HTTPS capabilities, configured with Cloudflare for DNS-01 challenge-based SSL certificates.
  • Cloudflare DNS: Handles DNS for our domain and provides an API that Caddy uses to validate certificates without exposing services publicly.
  • Let’s Encrypt: Provides SSL certificates via DNS-01 challenge, verified through Cloudflare.

The setup uses Docker to containerize Caddy and other services (e.g., Ollama, ChromaDb, TTS, Pihole, and anything else you desire).

How It Works

1. User Device Requests a Local Service

When a local device, like a laptop or phone, tries to access a service (e.g., https://ollama.webenclave.com), the request goes to Cloudflare DNS. Cloudflare manages the DNS for webenclave.com, pointing the subdomains (e.g., ollama.webenclave.com) to an internal IP address (e.g., 192.168.x.x). Cloudflare is set to “DNS only” mode to allow direct access.

2. Caddy as the Reverse Proxy

On the Docker host machine, Caddy listens for HTTPS requests. Caddy is configured with a Cloudflare DNS plugin that enables it to automatically obtain SSL certificates using Let’s Encrypt, without requiring external exposure. Caddy does the following:

  • Reverse Proxy: Caddy forwards requests to the appropriate Docker container based on the subdomain.
  • TLS with DNS-01 Challenge: Caddy handles HTTPS certificates for each service using a DNS-01 challenge. When a certificate is needed (for example, for ollama.webenclave.com), Caddy triggers the DNS-01 challenge with Cloudflare.

3. Let’s Encrypt and Cloudflare API for Certificate Management

Caddy communicates with Cloudflare’s DNS API to complete the DNS-01 challenge for Let’s Encrypt. Here’s how it works:

  • DNS-01 Challenge: Caddy requests a certificate from Let’s Encrypt for ollama.webenclave.com. Let’s Encrypt requires proof that we control this domain, so Caddy uses the Cloudflare API to create a temporary DNS TXT record for validation.
  • Certificate Issuance: Let’s Encrypt validates this TXT record, verifying our control over the domain, and issues a certificate. Caddy then stores and manages this certificate, automatically renewing it before it expires.

With this setup, Caddy provides valid HTTPS certificates from Let’s Encrypt for each local service, so connections are secure and trusted by default.

4. Reverse Proxy to Local Services

Once the certificate is obtained, Caddy acts as a reverse proxy to route requests to the appropriate Docker container for each service:

  • Ollama (ollama.webenclave.com)
  • ChromaDb (chroma.webenclave.com)
  • TTS (tts.webenclave.com)
  • Pihole (pihole.webenclave.com)

Each service is accessible over HTTPS, with traffic staying entirely within the local network.

Key Benefits of This Setup

  • Secure HTTPS for Local Services: Using Let’s Encrypt with Cloudflare’s DNS-01 challenge, you get trusted SSL certificates for local services without exposing them to the internet.
  • Automatic Certificate Management: Caddy handles obtaining and renewing certificates automatically, so you don’t need to manually manage SSL certificates.
  • Simple and Scalable: With Docker, it’s easy to add more services. You can configure additional subdomains and services in the Caddy configuration as needed.

Setting Up Your Own Environment

To set up this infrastructure on your own network, follow these steps:

  1. Register a Domain on Cloudflare (if you don’t already have one).
  2. Create DNS Records: In Cloudflare, create subdomain DNS records for each service, marked as “DNS Only” (gray). Set each subdomain to your local network IP (e.g., 192.168.x.x).
  3. Generate a Cloudflare API Token: This token needs DNS editing permissions to allow Caddy to use the DNS-01 challenge.
  4. Set Up Caddy with the Cloudflare Plugin: Use a Docker image for Caddy that includes the Cloudflare DNS plugin. Configure Caddy to use the Cloudflare API token for automatic HTTPS.
  5. Deploy Services: Deploy your services in Docker, configure them with Caddy as the reverse proxy, and access them securely over HTTPS within your network.

Code

Here’s an example Caddyfile that shows how Caddy is configured for DNS-01 challenges with Cloudflare:

{
    acme_dns cloudflare <YOUR_CLOUDFLARE_API_TOKEN>
}

ollama.webenclave.com {
    reverse_proxy http://x.x.x.x:11434
    tls {
        dns cloudflare <YOUR_CLOUDFLARE_API_TOKEN>
    }
}

chroma.webenclave.com {
    reverse_proxy http://x.x.x.x:8000
    tls {
        dns cloudflare <YOUR_CLOUDFLARE_API_TOKEN>
    }
}

tts.webenclave.com {
    reverse_proxy http://x.x.x.x:8020
    tls {
        dns cloudflare <YOUR_CLOUDFLARE_API_TOKEN>
    }
}

pihole.webenclave.com {
    reverse_proxy http://x.x.x.x
    tls {
        dns cloudflare <YOUR_CLOUDFLARE_API_TOKEN>
    }
}

Dockerfile (create docker image with Caddy + Cloudflare Plugin)

# syntax=docker/dockerfile:1
FROM caddy:2.8.4-builder AS builder
RUN xcaddy build \
  --with github.com/caddy-dns/cloudflare

FROM caddy:2.8.4 AS caddy
COPY --from=builder /usr/bin/caddy /usr/bin/caddy

docker-compose.yml

version: "3.8"
services:
  caddy:
    build:
      context: .  # Build context where the Dockerfile is located
    volumes:
      - ./Caddyfile:/etc/caddy/Caddyfile
    ports:
      - "443:443"
    networks:
      - mynetwork

networks:
  mynetwork:
    driver: bridge

My Journey Through Gamedev.js Jam 2024: Building “Neon Velocity”

Thu, 16 May 2024
Participating in the Gamedev.js Jam 2024 was a thrilling and enlightening experience. As a relatively new indie game developer, this was my third game jam, and it offered a unique opportunity to push my skills and creativity to new heights. My game, “Neon Velocity,” aimed to deliver a high-octane, multiplayer vehicular combat experience in a futuristic, neon-lit arena. Here’s a look at my journey, the challenges I faced, and the lessons I learned.

Game Concept and Features

“Neon Velocity” is designed to provide adrenaline-pumping vehicular combat in dynamic, morphing multiplayer arenas. Each match requires players to adapt their strategies to the ever-changing terrain. Key features include:

  • Dynamic Arenas: Shape-shifting platforms keep the gameplay fresh and challenging.
  • Diverse Vehicles and Power-Ups: Players can choose from various vehicles and use strategic power-ups to gain an edge.
  • 16-Player Battles: Intense multiplayer matches with up to 16 players.
  • Customization: Extensive options for personalizing vehicles and tracks.
  • Inclusivity: A focus on creating a non-toxic, inclusive environment for all gamers.

Development Tools and Techniques

The game was built using Phaser for the front-end and Colyseus for the server-side logic. I also integrated AI-generated music and art to enhance the game’s atmosphere. Here’s a brief overview of the development process:

  • Phaser: Chosen for its robust capabilities and ease of use for creating HTML5 games.
  • Colyseus: Utilized for managing multiplayer sessions and real-time game state synchronization.
  • AI-Generated Assets: Leveraged AI tools to create unique music and visuals, saving time and adding a unique flair to the game.

Challenges and Solutions

  1. Dynamic Arenas: One of the most challenging aspects was implementing the shape-shifting platforms. I had to ensure that the platforms changed in a way that was both strategic and fair, requiring careful consideration of game balance and mechanics.
  2. Multiplayer Integration: Ensuring smooth gameplay for up to 16 players was another significant challenge. I had to optimize the netcode and server performance to handle the real-time interactions and physics calculations efficiently.
  3. AI Players: Creating intelligent AI players that could provide a challenging and realistic practice mode required developing algorithms for pathfinding, decision-making, and adapting to the dynamic arenas.

Results and Reflection

Here’s how “Neon Velocity” ranked in various categories:

  • Innovation: #101 (2.828)
  • Audio: #123 (2.734)
  • Gameplay: #138 (2.640)
  • Overall: #148 (2.555)
  • Theme: #180 (2.216)
  • Graphics: #186 (2.357)

Considering there were 232 submissions, I am quite pleased with these results. The rankings reflect the game’s strengths in innovation and audio, and they highlight areas for improvement, such as theme alignment and graphical polish.

Lessons Learned

  1. Time Management: Balancing the various aspects of game development within the limited timeframe of a game jam is crucial. Prioritizing core features and iterative testing helped ensure a functional and enjoyable game.
  2. Community Feedback: Leveraging the #gamedevjs hashtag on Twitter and engaging with the Discord community provided valuable feedback that helped refine the gameplay and address issues early.
  3. Inclusivity Matters: Creating a game that appeals to a diverse audience requires intentional design choices. Ensuring a non-toxic, welcoming environment and offering gameplay modes that cater to different preferences significantly enhanced the game’s appeal.

Future Plans

The journey doesn’t end here. Based on the feedback and experiences from the jam, I plan to:

  • Enhance Graphics: Improve the visual quality to make the game more immersive.
  • Optimize Performance: Further refine the netcode and game performance to ensure smooth multiplayer experiences.
  • Expand Features: Introduce new vehicles, power-ups, and customization options.
  • Community Engagement: Foster a vibrant community through events, tournaments, and continued interaction on Discord and social media.

Conclusion

Participating in the Gamedev.js Jam 2024 was a rewarding experience that pushed my boundaries and significantly enhanced my game development skills. “Neon Velocity” is a testament to what can be achieved with creativity, hard work, and a passion for inclusive gaming. I’m excited to continue developing this game and look forward to the next challenge on my journey as an indie game developer.

Thank you for reading, and I hope you’ll check out “Neon Velocity” and join our growing community of players. If you have any feedback or questions, feel free to reach out!

Play the game: https://bchip1.itch.io/neon-velocity

The post My Journey Through Gamedev.js Jam 2024: Building “Neon Velocity” appeared first on Web Enclave.

Bridging Words and Pictures: How AI Understands and Generates Images from Text
Published: Fri, 26 Apr 2024
https://webenclave.com/2024/04/26/bridging-words-and-pictures-how-ai-understands-and-generates-images-from-text/

Introduction to Text-to-Image Modeling in the Digital Age

In today’s digital world, text and images are blending more closely than ever, driven by advances in artificial intelligence. This technology, called text-to-image modeling, is transforming how we approach creativity and communication.

The Mechanics of Text-to-Image AI and Its Training Process

Text-to-image modeling harnesses AI’s capacity to interpret both written words and visual content. Trained on extensive datasets comprising countless image-text pairs, these models grasp the nuanced relationships between descriptions and their corresponding visuals. Through this training, AI systems develop a profound comprehension of how specific terms align with visual characteristics.

Understanding Embeddings: The Core of Text-to-Image Technology

A pivotal element of this technology is the concept of embeddings: numerical representations of text and images within a shared, multi-dimensional space. This shared space allows the AI to compare and align text with images directly. A notable technique in this field is CLIP (Contrastive Language-Image Pre-training), which trains a model to place images and their matching text descriptions close together in that space.

Measuring Congruence in AI with Cosine Similarity

The AI evaluates the congruence between text and images using cosine similarity. This metric, measuring the cosine of the angle between text and image embeddings, identifies their similarity, with values nearing 1 indicating a close match. This capability enables the AI to produce images that are true to the provided text descriptions.
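As a concrete illustration, cosine similarity can be computed directly with NumPy. The three-dimensional vectors below are toy stand-ins invented for this example; real CLIP embeddings have hundreds of dimensions:

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction, 0.0 orthogonal."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy vectors standing in for embeddings of a caption and two candidate images.
text_vec  = np.array([0.9, 0.1, 0.0])  # "a cat on a mat"
image_cat = np.array([0.8, 0.2, 0.1])  # an image of a cat
image_car = np.array([0.0, 0.0, 1.0])  # an image of a car

print(cosine_similarity(text_vec, image_cat))  # near 1: a close match
print(cosine_similarity(text_vec, image_car))  # 0: unrelated
```

The caption's vector points in nearly the same direction as the cat image's vector, so their cosine similarity is close to 1, while the orthogonal car vector scores 0.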

Expanding Horizons: Applications of Text-to-Image Technology in Creative and Marketing Industries

Text-to-image technology’s potential extends across various sectors. In creative fields, artists and designers collaborate with AI to create innovative visuals that challenge conventional boundaries. Marketing professionals utilize these models to craft tailored visual content that resonates with specific audiences and enhances campaign effectiveness.

Educational and Communicative Benefits of Text-to-Image Modeling

Moreover, educational and communicative applications are profound. Educators incorporate this technology to devise illustrative aids that simplify complex subjects, while content creators generate impactful visuals that bolster their narratives.

The Future of Visual Representation: Advancements and Potential of AI Models

As these models advance, their applications expand, promising even more sophisticated and nuanced visual representations. This evolution marks a significant shift towards a more visually-oriented and AI-integrated future in communication and creative expression.

# This is a highly simplified example and does not reflect the actual complexity
# and data requirements of training models like CLIP, nor does it accurately simulate
# the embedding process, which in real applications involves deep learning techniques
# and a vast amount of training data.

import numpy as np

# Predefined embeddings for a small set of categories
category_embeddings = {
    "cat": np.array([1, 0, 0], dtype=np.float64),
    "dog": np.array([0, 1, 0], dtype=np.float64),
    "pet": np.array([0.5, 0.5, 0], dtype=np.float64),
    "rug": np.array([0, 0, 1], dtype=np.float64),
    "mat": np.array([0, 0, 0.5], dtype=np.float64)
}

def generate_embedding(description):
    """
    Generate a normalized embedding vector for a given description.

    Args:
        description (str): A textual description containing one or more keywords that map to predefined embeddings.

    Returns:
        numpy.ndarray: A normalized vector representing the aggregate embedding of the input description.
    """
    words = description.split()
    embedding = np.zeros_like(next(iter(category_embeddings.values())), dtype=np.float64)
    for word in words:
        if word in category_embeddings:
            embedding += category_embeddings[word]
    if np.linalg.norm(embedding) > 0:
        embedding /= np.linalg.norm(embedding)
    return embedding

class MockCLIPModel:
    """
    A mock model simulating the functionality of the CLIP model which maps images and text to a shared embedding space.
    """
    def __init__(self):
        """
        Initializes the MockCLIPModel with empty dictionaries to store text and image embeddings.
        """
        self.text_to_embedding = {}
        self.image_to_embedding = {}

    def embed_text(self, text):
        """
        Retrieve or create a normalized embedding for a given text.

        Args:
            text (str): The text to embed.

        Returns:
            numpy.ndarray: The embedding vector for the given text.
        """
        if text not in self.text_to_embedding:
            self.text_to_embedding[text] = generate_embedding(text)
        return self.text_to_embedding[text]

    def embed_image(self, image_description):
        """
        Retrieve or create a normalized embedding for a given image description.

        Args:
            image_description (str): The description of the image.

        Returns:
            numpy.ndarray: The embedding vector for the given image description.
        """
        if image_description not in self.image_to_embedding:
            self.image_to_embedding[image_description] = generate_embedding(image_description)
        return self.image_to_embedding[image_description]

    def find_similarities(self, text, image_description):
        """
        Calculate the cosine similarity between embeddings of text and image description.

        Args:
            text (str): The text to compare.
            image_description (str): The image description to compare.

        Returns:
            float: The cosine similarity score between the text and image description embeddings.
        """
        text_embedding = self.embed_text(text)
        image_embedding = self.embed_image(image_description)
        if np.linalg.norm(text_embedding) == 0 or np.linalg.norm(image_embedding) == 0:
            return 0  # Return 0 similarity if either embedding is a zero vector
        similarity = np.dot(text_embedding, image_embedding) / (
            np.linalg.norm(text_embedding) * np.linalg.norm(image_embedding)
        )
        return similarity

# Example usage
clip_model = MockCLIPModel()

# Embedding text and image descriptions
text = "A cat sitting on a mat"
an_image_of_a_pet_on_a_rug = "A pet on a rug"
an_image_of_a_person_driving_a_car = "A person driving a car"

# Finding similarity between the text and the image descriptions
similarity_for_pet_on_rug = clip_model.find_similarities(
    text, an_image_of_a_pet_on_a_rug
)
similarity_for_person_driving_car = clip_model.find_similarities(
    text, an_image_of_a_person_driving_a_car
)
print(
    f"Similarity for pet on rug: {similarity_for_pet_on_rug:.2f} and for person driving car: {similarity_for_person_driving_car:.2f}"
)

# Output:
# Similarity for pet on rug: 0.73 and for person driving car: 0.00

The post Bridging Words and Pictures: How AI Understands and Generates Images from Text appeared first on Web Enclave.

Building a Simple Blockchain in Python
Published: Fri, 12 Apr 2024
https://webenclave.com/2024/04/12/building-a-simple-blockchain-in-python/

I’m excited to share my recent presentation at the Michigan Python Group Meetup, where I demonstrated how to build a simple blockchain using Python. This session was designed to demystify the concepts of blockchain and hashing algorithms, making them accessible to both coding novices and seasoned developers.

Introduction to Blockchain and Proof of Work

I started the session by introducing the basic concept of a blockchain and the cryptographic proof known as “proof of work.” This system is fundamental to understanding how technologies like Bitcoin operate. Proof of work is a mechanism that requires a participant to do a certain amount of computational work which is easy to verify but challenging to produce, thus ensuring the security and integrity of the blockchain network.

Using SHA-256 Hashing Algorithm

The focus of our build was the SHA-256 hashing algorithm, also used by Bitcoin. A hashing algorithm helps convert an input of any size into a fixed-size string or a hash, which acts like a fingerprint for data. I emphasized that SHA-256 is deterministic, meaning the same input will always produce the same output, but it’s designed to be unpredictable so the output hash doesn’t reveal anything about the input.
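These properties are easy to see with Python's built-in hashlib module. The input strings here are arbitrary examples chosen for this demonstration:

```python
import hashlib

def sha256_hex(data: str) -> str:
    """Return the SHA-256 digest of a string as a 64-character hex fingerprint."""
    return hashlib.sha256(data.encode("utf-8")).hexdigest()

# Deterministic: the same input always yields the same hash.
print(sha256_hex("hello"))
print(sha256_hex("hello"))

# Unpredictable: a one-character change produces a completely different hash.
print(sha256_hex("hello!"))
```

No matter how large the input is, the output is always 256 bits (64 hex characters), and there is no practical way to work backward from the hash to the input.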

Building the Blockchain: A Step-by-Step Guide

During the presentation, I walked through the coding process, starting with defining a blockchain class and explaining each line of code that handles the creation of new blocks and the chain itself. Here are the key concepts we covered:

  • Creating a Block: Each block contains data, the hash of the previous block (linking the chain), and its own hash calculated from its contents.
  • Initializing the Blockchain: The genesis block starts the blockchain, with subsequent blocks added containing transactional data.
  • Implementing Proof of Work: A nonce value in the block is adjusted until the block's hash meets a network difficulty target, such as starting with a required number of leading zeros.
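A minimal sketch of these ideas follows. This is simplified relative to the code shown at the meetup, and the class and field names here are illustrative rather than the exact ones from the talk:

```python
import hashlib
import json

class Block:
    def __init__(self, data, previous_hash):
        self.data = data
        self.previous_hash = previous_hash
        self.nonce = 0

    def compute_hash(self):
        # Hash the block's full contents; including previous_hash links the chain.
        payload = json.dumps({"data": self.data,
                              "previous_hash": self.previous_hash,
                              "nonce": self.nonce}, sort_keys=True)
        return hashlib.sha256(payload.encode("utf-8")).hexdigest()

    def mine(self, difficulty):
        # Proof of work: bump the nonce until the hash starts with `difficulty` zeros.
        target = "0" * difficulty
        while not self.compute_hash().startswith(target):
            self.nonce += 1
        return self.compute_hash()

# The genesis block starts the chain; each new block links to the previous hash.
genesis = Block("genesis", "0" * 64)
genesis_hash = genesis.mine(difficulty=3)
block1 = Block("Alice pays Bob 5", genesis_hash)
block1_hash = block1.mine(difficulty=3)
print(genesis_hash)
print(block1_hash)
```

Verifying a mined block takes a single hash computation, while producing one requires thousands of attempts on average, which is exactly the asymmetry proof of work relies on.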

Practical Demonstration

I also provided a practical demonstration, where I live-coded to show how to implement these concepts in Python. I showcased the dynamic addition of blocks to the blockchain and how the network verifies the integrity of the blockchain through re-mining and validation processes when changes occur.

Security Aspects of Blockchain

We discussed potential security vulnerabilities, such as the 51% attack, where an entity gains majority control of the network’s mining power and can alter the blockchain. I stressed the importance of decentralization in preventing such attacks.

Broader Applications of Blockchain Technology

To conclude, I explored how blockchain technology transcends cryptocurrencies. I discussed its applications in providing transparent, immutable records for supply chain management, digital identities, and more, highlighting its potential to revolutionize various industries by ensuring data integrity and security.

Conclusion

My presentation to the Michigan Python Group provided a comprehensive introduction to building a simple blockchain in Python. By breaking down the technical jargon and simplifying the coding process, I aimed to make blockchain technology accessible to all attendees. This session not only enlightened participants on the inner workings of blockchains but also inspired them to consider its broader implications and applications in their fields.

For those interested in exploring more, I encourage you to follow my GitHub for updates and further resources. Whether you’re a developer looking to implement blockchain solutions or simply curious about the technology, there’s no doubt that blockchain is a fascinating field worth exploring further.

Github: https://github.com/BChip/SimpleBlockchain-SVSUBitcoinGroup

The post Building a Simple Blockchain in Python appeared first on Web Enclave.

Enhanced Satellite Orbit Visualization and Pass Prediction
Published: Wed, 03 Apr 2024
https://webenclave.com/2024/04/03/enhanced-satellite-orbit-visualization-and-pass-prediction/

Introduction: Satvis.Space, the powerful web application for satellite orbit visualization and pass prediction, has undergone a series of optimizations in a recent fork. This enhanced version not only improves the application’s performance but also significantly reduces the Docker build time, making it even more efficient and user-friendly.

Features: The forked version of Satvis.Space retains all the impressive features of the original application. It accurately calculates satellite positions and orbits using TLE data, allows users to select their groundstation through geolocation or map interaction, and provides local browser notifications for upcoming satellite passes. The serverless architecture and offline functionality as a Progressive Web App (PWA) remain intact, ensuring seamless access to satellite data in any environment.

Optimized Docker Build: One of the most notable improvements in this fork is the optimization of the Docker build process. By leveraging a multi-stage build approach and carefully selecting the appropriate base images, the build time has been reduced from approximately 900 seconds to an impressive 150 seconds on an Oracle Ampere instance.

The optimized Dockerfile starts with the official Node.js 18 image for installing dependencies. It then switches to the Oven Bun image, a lightweight and fast JavaScript runtime, for building the application. The installed dependencies are copied from the previous stage, ensuring a clean and efficient build process.

To further optimize the build, the TLE data is updated during the build process itself using the bun run update-tle command. This eliminates the need for a separate step and reduces the overall build time.

Finally, the built application files are copied to the Nginx image for serving the application. Nginx, known for its high performance and stability, ensures efficient delivery of the Satvis.Space application to users’ browsers.
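Sketched as a Dockerfile, the multi-stage build described above looks roughly like the following. The stage names, image tags, and paths are illustrative assumptions, not the exact file from the fork:

```dockerfile
# Stage 1: install dependencies with the official Node.js 18 image
FROM node:18 AS deps
WORKDIR /app
COPY package.json package-lock.json ./
RUN npm ci

# Stage 2: build with the lightweight Bun runtime, reusing the installed deps
FROM oven/bun:latest AS build
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY . .
# Fetch fresh TLE data as part of the build itself
RUN bun run update-tle
RUN bun run build

# Stage 3: serve the static build with Nginx
FROM nginx:alpine
COPY --from=build /app/dist /usr/share/nginx/html
```

Because each stage starts from a purpose-chosen base image and only the build output reaches the final Nginx stage, the resulting image stays small and the earlier stages can be cached between builds.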

Performance Optimization: In addition to the Docker build optimization, the forked version of Satvis.Space focuses on enhancing the application’s performance. Efforts have been made to optimize the code and improve the overall speed and responsiveness of the application across various browsers.

These performance optimizations may include techniques such as code minification, efficient asset loading, and browser-specific optimizations. The goal is to provide users with a smooth and seamless experience while interacting with the satellite visualization and pass prediction features, regardless of their browser choice.

Automatic TLE Updates: To ensure the accuracy of satellite position calculations and pass predictions, the forked version of Satvis.Space includes an automated process for updating the TLE data. The TLE data is fetched and updated during the Docker build process itself, eliminating the need for manual intervention.

This automated update process guarantees that users always have access to the most recent satellite data, enhancing the reliability and precision of the orbit visualizations and pass predictions.

Conclusion: The optimized fork of Satvis.Space takes the already impressive satellite orbit visualization and pass prediction application to new heights. With significant improvements in the Docker build process, the build time has been reduced from ~900 seconds to a mere ~150 seconds on an Oracle Ampere instance. This optimization streamlines the deployment process and makes it more efficient.

Moreover, the performance optimizations ensure a smooth and responsive user experience across different browsers, making the application accessible to a wider audience. The automated TLE data updates during the build process further enhance the accuracy and reliability of the satellite information.

Whether you’re a satellite enthusiast, researcher, or professional, this optimized version of Satvis.Space is a valuable tool that combines powerful features with improved performance and efficient deployment capabilities. Its user-friendly interface, accurate calculations, and seamless user experience make it an indispensable resource for anyone interested in exploring the fascinating world of satellites and their orbits.

View: https://space.webenclave.com/

Fork: https://github.com/BChip/space

The post Enhanced Satellite Orbit Visualization and Pass Prediction appeared first on Web Enclave.

Second Game Jam – itch.io Mini Jam 154: Travel
Published: Mon, 18 Mar 2024
https://webenclave.com/2024/03/18/second-game-jam-itch-io-mini-jam-154-travel/

What an incredible journey this game jam has been! Working with my team of 7 has been an absolute blast. From conceptualizing the idea to the final submission, every step has been a learning curve and a chance to push our creative and technical skills to the limit.

We dove headfirst into the world of game development with Phaser, weaving together dynamic gameplay with React for the UI, and using Vite and NX for an efficient development workflow. The challenge of incorporating the game jam’s limitation inspired us – what if Jeff Bezos extended Amazon Prime delivery to space? Thus, our game was born, focusing on the high-stakes world of interstellar cargo delivery.

Creating the main game was an exhilarating experience. We spent hours fine-tuning the mechanics to ensure that navigating through asteroid fields felt dangerous yet rewarding, and that the space battles were as intense as they were fun. The strategic gameplay elements, requiring players to make split-second decisions, were designed to immerse players in the role of a cadet of the Galactic Retrieval Unit.

Finding the perfect sound assets was like digging for treasure. Each sound effect and piece of background music had to enhance the ambiance, pulling players deeper into the cosmic setting of our game. When we finally saw (and heard) it all come together, the feeling was indescribable.

Creating the YouTube gameplay video/ad was another highlight. It allowed us to step back and see our game through the eyes of potential players. Crafting this video, we aimed to capture the essence of our game’s adventure and the sheer fun of playing it. Watching the final cut, we felt a mix of pride and excitement – we couldn’t wait for the world to see it.

And let’s not forget the cookies! Six cookies fueled our late-night coding sessions, with one special cookie that will forever remain a cherished team meme.

In conclusion, this game jam has been an unforgettable experience. Not only did we create a game we’re proud of, but we also forged stronger bonds as a team and expanded our horizons as developers. To everyone who embarks on the journey, we hope you find it as thrilling to play as it was for us to create. Here’s to many more adventures in game development!

Play the game: https://kbve.itch.io/travelbox

The post Second Game Jam – itch.io Mini Jam 154: Travel appeared first on Web Enclave.

First Game Jam – itch.io Mini Jam 153: Fishing
Published: Sat, 09 Mar 2024
https://webenclave.com/2024/03/09/first-game-jam-itch-io-mini-jam-153-fishing/

As a software developer, I’ve always been passionate about the potential of code to create immersive experiences. So, when I decided to participate in my first game jam, the excitement was palpable. The Mini Jam, a 72-hour-long video game development marathon hosted on Itch.io, was my chosen battlefield. Every two weeks, this jam challenges developers with a theme and a unique limitation, pushing the boundaries of creativity and innovation.

This jam’s theme was “Fishing,” a concept ripe with potential. However, the real twist came with the announcement of the limitation at the start of the jam: No water. Initially, this threw our plans into disarray. How could we create a fishing game without water? But as we chewed over this curveball, it sparked an unexpected burst of creativity. Our solution? “Fish & Chip” – a mystical journey set in a world where the oceans have turned to sand, and magic replaces water.

The Genesis of “Fish & Chip”

“Fish & Chip” invites players into a realm transformed, where they assume the role of a legendary Desert Fisher. Using their typing skills, players cast their lines into the dunes to catch mystical fish hidden beneath the sands. This innovative gameplay mechanic blends typing games with fishing, creating a unique challenge that tests both speed and accuracy.

Our game was brought to life by a diverse team: myself, Chip, David, Holy, Nezt50, Retornodomal, Archandroid, and KBVE. We utilized the Phaser engine, integrated with Astro, React, TailwindCSS, and several other libraries to build our game. Credit also goes to FinalBossBlues and Annoraaq for the base tilemap data and placeholder assets, which we extensively redesigned to fit our sandy theme. The protagonist, Chip, was meticulously hand-drawn before being pixelated, showcasing the incredible talent within our team.

Overcoming the Limitation

The limitation of no water initially seemed like a setback but ultimately served as the bedrock of our game’s identity. Transforming the limitation into the cornerstone of our creative direction, we developed a desert-themed mini RPG that incorporated fishing in a way no one expected. This not only allowed us to meet the challenge head-on but also to innovate within the confines of the jam’s criteria.

The Jam Results

After an intense 72 hours, “Fish & Chip” was born, and the results were more than encouraging. We ranked in several categories among dozens of entries, a testament to the hard work and creativity of our team:

  • Presentation: #76
  • Concept: #91
  • Overall: #96
  • Use of the Limitation: #99
  • Enjoyment: #110

These results, derived from 34 ratings, reflect the game’s reception and the potential for future development. The feedback has been invaluable, highlighting both our game’s strengths and areas for improvement.

Looking Forward

While “Fish & Chip” was born out of a game jam, the journey doesn’t end here. Encouraged by the feedback and the enjoyment we had in creating this game, we’re considering further development. The dream? To evolve “Fish & Chip” from a game jam project into a full-fledged game.

Final Thoughts

Participating in the Mini Jam was an incredible learning experience. It challenged us to think outside the box, work under pressure, and collaborate effectively. The limitation, initially daunting, proved to be a source of inspiration, reminding us that creativity truly does thrive within constraints.

As for the cookies, let’s just say that both the physical and the browser variety played a significant role in fueling our creativity throughout this adventure.

Stay tuned for more updates on “Fish & Chip,” and thank you to everyone who supported us on this journey. This game jam might have been my first, but it certainly won’t be my last.

Play the game on itch.io: https://kbve.itch.io/fishchip

The post First Game Jam – itch.io Mini Jam 153: Fishing appeared first on Web Enclave.
