Inspiration

Managing daily campus logistics at Binghamton is a fragmented nightmare. The data exists, but it is scattered across half a dozen apps, legacy websites, and hidden portals. To find out whether the East Gym is packed, the Route 14 bus is on time, or there are washers open in Mountainview, students have to constantly app-switch. Binghamton's current bot, Baxter, is just a static FAQ reader. We were inspired by the Model Context Protocol (MCP) to upgrade Baxter from a passive knowledge base into an autonomous agent that can actually unify and interact with this live, scattered infrastructure. By building on an open standard like MCP, we ensured our solution isn't locked into a single ecosystem: it works with any LLM provider.

What it does

BingMCP upgrades Baxter into an autonomous campus agent. Using the Model Context Protocol, it orchestrates live data across six different campus domains: OCCT bus + shuttle ETAs, laundry availability, gym occupancy, LibCal study room reservations, upcoming BEngaged events, and dining hall menus.

Instead of forcing students to juggle six different apps or websites, BingMCP lets them pose complex logistical questions in natural language. If a student asks, "I have 90 minutes. Are there study rooms open in the library, and what's on the menu at C4?", the agent routes the query, executes the necessary tools, and returns a unified, real-time answer.

How we built it

We engineered a deliberately decoupled architecture so each integration can scale independently, focusing almost entirely on backend systems and API standardization.

The Orchestrator: We bypassed heavy third-party frameworks and built a lean, custom reasoning loop directly into our backend. This engine securely routes user queries to the appropriate tools with minimal overhead.

The MCP Server (The Core Architecture): We built a standalone Python MCP Server using FastMCP. This acts as a universal translation layer between the LLM and the university's data.
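
In practice, each campus domain becomes one small, typed tool function on that server. Here is a standalone, hypothetical example of what a laundry tool's body looks like; on the real server the function carries FastMCP's @mcp.tool decorator (so its docstring becomes the tool description the LLM sees), and the payload shape shown is invented, not Binghamton's actual schema.

```python
# Hypothetical MCP tool body: summarize laundry availability for a building.
# On the FastMCP server this function is registered with @mcp.tool; here it
# takes the raw room payload as an argument so it can run standalone.

def laundry_status(building: str, rooms: dict[str, list[dict]]) -> str:
    """Report open washers and dryers in `building` from a raw machine list."""
    machines = rooms.get(building, [])
    washers = sum(1 for m in machines
                  if m["type"] == "washer" and m["status"] == "available")
    dryers = sum(1 for m in machines
                 if m["type"] == "dryer" and m["status"] == "available")
    return f"{building}: {washers} washers and {dryers} dryers available"
```

The point of the translation layer is exactly this: the LLM never sees the vendor's payload, only a short typed answer.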

Reverse Engineering the Campus: The heavy lifting was standardizing Binghamton's fragmented data silos. We sniffed network traffic to map the undocumented API endpoints for LibCal study rooms and dining hall menus. We reverse-engineered the legacy PHP servers powering the OCCT and B-Line transit apps. We then scraped the laundry endpoints and tapped directly into the live production database of BingGym.com.

The Frontend: We rapidly prototyped the chat interface using Next.js and the Vercel AI SDK, allowing us to dedicate our engineering sprint entirely to the MCP server and API integrations.

Challenges we ran into

Binghamton's infrastructure is a wild mix of legacy systems and third-party vendor software. Dealing with undocumented endpoints for LibCal and the dining halls required intercepting network requests and mapping out the data structures blindly. Furthermore, the legacy transit servers return massive, unstructured JSON payloads that would instantly blow out an LLM's context window. We had to build lightweight, stateless parsing logic inside our MCP tools that extracts only the fields the agent actually needs, so each tool returns a compact, filterable answer instead of the raw payload.
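
That filtering step can be sketched like this, assuming a hypothetical payload shape: the real OCCT/B-Line responses differ, and the field names here are invented for illustration.

```python
# Stateless filter: collapse a huge transit payload down to the few fields
# the agent needs, keeping tool output small enough for an LLM context
# window. Payload shape and field names are hypothetical.

def filter_arrivals(payload: dict, route: str, limit: int = 3) -> list[dict]:
    """Keep only route name, stop, and ETA for the requested route,
    returning at most `limit` arrivals sorted soonest-first."""
    slim = [
        {"route": v["routeName"], "stop": v["stopName"], "eta_min": v["eta"]}
        for v in payload.get("vehicles", [])
        if v.get("routeName") == route
    ]
    return sorted(slim, key=lambda a: a["eta_min"])[:limit]
```

Everything else the server sends (coordinates, headings, speeds) is dropped before the LLM ever sees it.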

Accomplishments that we're proud of

We are incredibly proud of achieving true architectural decoupling. By isolating our reverse-engineered APIs into a standalone MCP server, our campus tools are completely LLM-agnostic. Because we built to the MCP standard, any client or provider that supports MCP connectors (whether it is an Anthropic Claude instance, Cursor, or our custom orchestrator) can instantly interface with Binghamton's live data. In a single weekend, we took six messy, undocumented campus data sources and standardized them into one universally consumable protocol.

What we learned

We learned that integrating this many moving parts takes more than raw effort: strong architectural planning at the very start of a project is what ultimately dictates how well a team can collaborate later. On the technical side, we learned how to reverse-engineer undocumented API endpoints and how to distill that chaotic data into clean, narrowly scoped MCP tools.

What's next for BingMCP

Since we have successfully decoupled the integration layer, scaling the agent is as simple as writing more MCP tools. We plan to build integrations for the Canvas LMS to fetch upcoming assignments, hook into BU Brain to check for registration holds, and add a tool to check campus ID card (CBI) balances.
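
To make that concrete, a new integration is just one more tool function on the server. A hypothetical stub for the planned Canvas tool (the actual Canvas API fetch is elided; the function formats assignments that have already been retrieved):

```python
# Hypothetical future tool: upcoming Canvas assignments. The Canvas API
# call itself is stubbed out; only the tool's shape matters here.

def upcoming_assignments(course_id: str, fetched: list[dict]) -> list[str]:
    """Format already-fetched Canvas assignments as short
    '[course] name -- due date' lines for the agent."""
    return [f"[{course_id}] {a['name']} -- due {a['due_at']}" for a in fetched]
```

Because the orchestrator speaks MCP, registering this function makes it available to every connected client with no other changes.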
