<![CDATA[Sonjj]]>https://sonjj.com/https://sonjj.com/favicon.pngSonjjhttps://sonjj.com/Ghost 6.20Sun, 22 Mar 2026 00:16:51 GMT60<![CDATA[Lessons From Running OpenClaw: Cron Scripts & n8n Webhooks]]>https://sonjj.com/openclaw-cron-script-n8n-webhook/69be232bb8b74452cf3a93c9Sat, 21 Mar 2026 04:53:19 GMTEarly 2025, an AI coding agent on Replit (an online IDE with built-in AI agents) got a straightforward task: update a few records in a database. What happened? The agent spiraled out of control, ignored every safety instruction, and wiped out 1,206 executive records and 1,196 company entries in a single run. Then it fabricated test results to cover up the damage.

This isn't some edge case. Anyone letting AI agents touch production systems has seen something like it — or will.

I use OpenClaw daily to manage the SONJJ Ecosystem — SmailPro, Smser, YChecker, and several other products. Each one has its own database, its own API, real data from real users. A Replit-style disaster isn't a matter of "if" — it's "when," unless you put guardrails in place.

After a few months of trial and error, I landed on two simple patterns that make working with OpenClaw significantly safer — no fancy frameworks required:

  1. Keep scripts separate from cron — let the agent write script files, don't let it control cron directly
  2. Use n8n webhooks as a middle layer — the agent only calls webhooks, never touches your database or product APIs

Why You Need a Middle Layer

OWASP ranks "Excessive Agency" (LLM06:2025) among the top risks when deploying AI agents — broken down into three dimensions:

  • Excessive Functionality — giving the agent more tools than it actually needs
  • Excessive Autonomy — letting the agent perform critical actions without approval
  • Excessive Permissions — granting higher access than required (write access when read would do)

Sounds abstract. But map it to real OpenClaw usage and you'll recognize these immediately:

Inline cron: You ask the agent to set up a cron job. It writes a command straight into crontab -e. When something breaks, you have no idea what cron is running, can't test it in isolation, and there's no version control. The agent can schedule anything on your machine.

Custom API endpoints for the agent: You add new routes to your product's codebase — more code to write, maintain, and deploy. Over time, the codebase gets cluttered with endpoints that only serve the AI agent, not your actual users.

Direct SQL queries: This is the most dangerous one. Give the agent a connection string and it can SELECT, UPDATE, or DELETE from any table. The Replit incident from the intro? Exactly this scenario.

According to Gartner, 80% of organizations report their AI agents misbehaving, leaking data, or hallucinating. OpenClaw itself had CVE-2026-25253 (CVSS 8.8) — a vulnerability allowing remote code execution that bypassed container isolation.

The problem isn't that AI agents are unreliable. The problem is we're giving them too much direct access. The fix is simple: add a middle layer.

Pattern 1: Keep Scripts Separate From Cron

Python script file

The Problem

When you ask OpenClaw to "set up a cron job to back up the database every day at 2 AM," here's what the agent does — it opens crontab -e and writes:

0 2 * * * mysqldump -u root -p'password' mydb > /tmp/backup.sql

Looks fine. Until you need to debug it. Cron runs at 2 AM while you're asleep. Next morning, no backup — and no clue why. No logs, no error output, no way to test it without waiting for the next 2 AM cycle. And since the command lives inside crontab, there's no version control — you don't know what the agent changed, or when.

This is an old DevOps problem: inline cron commands are an anti-pattern. But when an AI agent is the one writing cron entries, it gets worse — you didn't even write that command.

The Fix: Let the Agent Write Script Files

Instead of letting OpenClaw write directly into crontab, change how you give instructions:

"Write me a script backup-smailpro-db.sh to back up the SmailPro database. Don't set up cron yet — just the script."

The workflow becomes:

  1. Agent writes a script file (.sh or .py) — with a shebang line, error handling, and a descriptive name
  2. You review the script — read through it, check for anything unexpected
  3. Test manually — run bash backup-smailpro-db.sh in your terminal, check the output
  4. Add it to cron — only after the script works: 0 2 * * * /path/to/backup-smailpro-db.sh

Cron now does exactly one thing: schedule. All the logic lives in the script file — separation of concerns.
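To make that concrete, here's a sketch of what such a script might look like. The database name, paths, and mysqldump flags are placeholders, not the exact script I use:

```shell
#!/usr/bin/env bash
# backup-smailpro-db.sh -- sketch of an agent-written backup script.
# Database name, paths, and dump flags are placeholders.

DB_NAME="${DB_NAME:-smailpro}"
BACKUP_DIR="${BACKUP_DIR:-/tmp/db-backups}"
LOG_FILE="$BACKUP_DIR/backup.log"
STAMP="$(date +%Y%m%d-%H%M%S)"
OUTFILE="$BACKUP_DIR/$DB_NAME-$STAMP.sql.gz"

mkdir -p "$BACKUP_DIR"

# Log to a file AND stdout, so manual runs and cron runs both leave a trail
log() { echo "[$(date '+%F %T')] $*" | tee -a "$LOG_FILE"; }

run_backup() {
    log "starting backup of $DB_NAME"
    if mysqldump --single-transaction "$DB_NAME" | gzip > "$OUTFILE"; then
        log "backup ok: $OUTFILE"
    else
        log "backup FAILED for $DB_NAME"
        return 1
    fi
}

# Only dump when invoked with "run", so the pieces can be sourced and tested
if [ "${1:-}" = "run" ]; then
    run_backup
fi
```

After you've reviewed it and tested it manually (bash backup-smailpro-db.sh run), the cron entry stays a one-liner, and redirecting its output gives scheduled runs a log too: 0 2 * * * /path/to/backup-smailpro-db.sh run >> /var/log/backup-cron.log 2>&1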

Why This Works

  • Human-in-the-loop: The script is an artifact you review before it runs. The agent writes, you approve — exactly what OWASP recommends.
  • Easy debugging: Something broke? Run the script in your terminal, see the output right away. No waiting for the next cron cycle.
  • Version control: .sh files go into git. You know exactly what the agent changed through commit history.
  • Reusable: Once written, the script can be called from anywhere — cron, a webhook, manually, or even from n8n (more on that next).
  • Meaningful names: check-smailpro-health.sh, sync-user-stats.py — you know what a script does just by reading its name, instead of parsing a cryptic one-liner in crontab.

This isn't a new idea — separating scripts from cron has been best practice forever. What's new is that when an AI agent writes the script, this pattern becomes a natural safety layer: you always get a chance to review before anything runs on a schedule.

Pattern 2: n8n Webhooks — A "Firewall" for Your AI Agent

n8n execution log

Two Bad Options

When OpenClaw needs to interact with your products — pulling metrics, checking statuses, processing data — you typically face two choices:

Build a custom API endpoint for the agent: Add a new route to your product codebase, write a controller, implement the logic, redeploy. Every new task the agent needs means more code in your product. After a few months, the codebase is littered with endpoints that serve nobody but the AI agent.

Let the agent query SQL directly: Hand OpenClaw a connection string and let it write its own queries. Fast? Sure. But the agent has full database access — SELECT * is the mild case, DROP TABLE is the nightmare. No logs, no controls, no way to know what the agent queried until something goes wrong.

The Fix: n8n Webhooks as a Middle Layer

Instead of either option, I create n8n webhooks for each specific task. The flow is straightforward:

  1. Create a webhook in n8n — one webhook = one task (e.g., "count today's new SmailPro users")
  2. n8n handles the backend — queries the database, calls internal APIs, transforms data
  3. Returns a JSON response — the agent gets results, knows nothing about the DB or logic behind it

OpenClaw knows exactly one thing: the webhook URL. It calls that URL, gets JSON back. Done. It doesn't know where the database is, which tables exist, or what queries run.

Why n8n Instead of a Custom API?

  • No code in your product: Build workflows with n8n's visual editor — drag and drop nodes. Not a single line of code touches your product codebase.
  • Built-in execution logs: Every webhook call is recorded — timestamp, input payload, output, execution time. Debugging is faster than with any custom API.
  • Kill switch: Agent acting up? Go into n8n, disable the webhook. Instant revocation. No redeployment, no code changes.
  • Test vs Production URLs: n8n automatically creates two URLs per webhook — test freely, only activate the production URL when you're confident.
  • Authentication: Supports header auth and basic auth — only requests with valid credentials get processed.
  • No codebase bloat: Workflows live in n8n, completely separate from your product code. Delete a workflow and it's gone — no dead code left behind.

A Real Example

I needed OpenClaw to report daily SmailPro signups. Instead of giving the agent database access:

  • Created an n8n webhook: get-new-users-today
  • Workflow: Webhook node → MySQL node (query SELECT COUNT(*) FROM users WHERE DATE(created_at) = CURDATE()) → Respond to Webhook node (returns {"new_users": 42})
  • OpenClaw just calls curl https://n8n.example.com/webhook/get-new-users-today → gets the number

The agent never sees the SQL query, doesn't know the table is called users, has no idea where the database lives. All it knows: call this URL, get a number.
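In shell terms, the agent's entire interface is one authenticated HTTP call. A minimal sketch, assuming a placeholder URL and whatever header-auth header you configured on the webhook (the live curl call is shown commented out, with a canned response standing in for it):

```shell
# Placeholder URL and auth header -- substitute your own n8n webhook
WEBHOOK_URL="https://n8n.example.com/webhook/get-new-users-today"
AUTH_HEADER="X-Webhook-Token: replace-me"

# In production the agent would run something like:
#   curl -sf --max-time 10 -H "$AUTH_HEADER" "$WEBHOOK_URL"
# and get back the workflow's JSON. Using a canned response here:
RESPONSE='{"new_users": 42}'

# Pull the number out without knowing anything about the database behind it
NEW_USERS=$(printf '%s' "$RESPONSE" | grep -o '[0-9]\+')
echo "New SmailPro users today: $NEW_USERS"
```

That last line is everything the agent ever learns about SmailPro's database.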

The Common Thread

These two patterns look different but follow the same principle: always put a middle layer between your AI agent and your systems.

A script file sits between the agent and cron — you review it before it runs. An n8n webhook sits between the agent and your database — the agent only knows a URL, nothing behind it.

The more autonomy you give an agent, the more you need to restrict its direct access. Not because AI agents are dumb or untrustworthy — but because even humans need guardrails when working with production systems. AI agents are no different; they just need different guardrails.

You don't need a complex framework or an enterprise security platform to get started. A .sh file that gets reviewed before cron calls it, or an n8n webhook standing between the agent and your database — that's enough to cut your risk significantly.

If you're using OpenClaw or any AI agent, try one of these patterns. Start with the simplest task you have — you'll see the difference.

]]>
<![CDATA[Back Up N8N Workflows to Airtable — Smart Sync]]>https://sonjj.com/n8n-backup-airtable-smart-sync/69ad88cfb8b74452cf3a6f3bSun, 08 Mar 2026 14:48:00 GMT

If you're running self-hosted n8n, there's a risk nobody really talks about: no built-in backup. One server incident, one accidental reset, one bad upgrade — and everything you spent hours building is gone.

I looked around online and found mostly backup-to-Google-Drive guides: save all your workflow JSON files into a Drive folder, with Google Sheets acting as the index that tracks workflow names, file paths, and metadata. Reasonable in theory, but the setup is a pain (more on that in a moment), and even then you don't get smart sync.

This guide walks you through the N8N TO AIRTABLE BACKUP - SMART SYNC workflow — download one JSON file, fill in 4 fields, done. It runs every 4 hours, 6 times a day, and when you delete a workflow in n8n, Airtable removes it automatically.

If you don't have an Airtable account and Personal Access Token yet, read the Airtable workspace and API key setup guide first, then come back here.


Why Airtable Instead of Google Drive?

Here's how the Google Drive approach works: all your workflow JSON files go into a single Google Drive folder, and Google Sheets acts as the "registry" — storing workflow names, file paths, and other metadata. Fine in theory.

The problem is the setup. You need to create a Google Cloud project, enable Google Drive API and Google Sheets API separately, create an OAuth Client ID, and authenticate each service individually. Never done this before? That part alone takes 30–45 minutes — and you still end up without smart sync.

Airtable handles it differently:

  • Much simpler setup: Airtable only needs a Personal Access Token — create one token, paste it in, done. No Google Cloud project, no enabling APIs one by one, no OAuth flow.
  • Everything in one place: The JSON backup lives right inside the Airtable record as an attachment — alongside the workflow ID, name, and timestamps. No jumping between Google Sheets and Google Drive.
  • Sort and filter built in: Airtable automatically tracks created_at and updated_at fields. Want to see the most recently changed workflows? One click to sort.
  • Powerful search: Type the workflow name and it shows up immediately — no opening a spreadsheet, no copying paths, no hunting through Drive folders.
  • Free tier is plenty: Airtable's free plan supports 1,000 records per base. With under 100 workflows, you'll be on the free tier for years.

Straight up: Airtable turns your backup into a searchable, filterable, sortable database — and cuts out the 30–45 minute OAuth setup entirely.


Before You Start

Run through this checklist first:

  • A self-hosted n8n instance, up and running
  • An Airtable account with a Personal Access Token (see the setup guide linked above)
  • The workflow JSON file, backup-n8n-to-airtable.json (download link at the end of this post)

Once you import the workflow into n8n, the workflow description includes a link to clone the Airtable base template — you'll use that in the next step, so no need to build the base from scratch.

Got all three? Let's go.


Import and Configure the Workflow

Step 1: Import the Workflow into n8n

Open n8n → click Create in the top right → select Import from file.

Back Up N8N Workflows to Airtable — Smart Sync

Select the downloaded JSON file (backup-n8n-to-airtable.json) → n8n loads the workflow. You'll see two branches running in parallel: Intelligent Backup and Cleanup.

Back Up N8N Workflows to Airtable — Smart Sync

Step 2: Clone the Airtable Base Template

After importing, open the workflow description (click the workflow name in the header) — there's a link inside to clone the Airtable base template.

Click the link → Airtable asks you to log in → select the workspace to clone into → Add base.

Back Up N8N Workflows to Airtable — Smart Sync

Once cloned, open the new base and look at the URL bar:

https://airtable.com/appXXXXXXXXXXXXXX/tblXXXXXXXXXXXXXX/...

Copy these two parts:

  • Base ID: starts with app (e.g. appXXXXXXXXXXXXXX)
  • Table ID: starts with tbl (e.g. tblXXXXXXXXXXXXXX)

Back Up N8N Workflows to Airtable — Smart Sync
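If you prefer the terminal, the two IDs can be cut straight out of that URL. A small sketch with placeholder IDs:

```shell
# Placeholder URL copied from the Airtable address bar
URL="https://airtable.com/appXXXXXXXXXXXXXX/tblXXXXXXXXXXXXXX/viwYYYYYYYYYYYYYY"

# Split on "/": with the "//" after "https:", airtable.com lands in
# field 3, so the base ID is field 4 and the table ID is field 5
BASE_ID=$(printf '%s' "$URL" | cut -d/ -f4)
TABLE_ID=$(printf '%s' "$URL" | cut -d/ -f5)

echo "Base ID:  $BASE_ID"   # starts with app
echo "Table ID: $TABLE_ID"  # starts with tbl
```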

Step 3: Enter Base ID + Table ID in the "Airtable Info" Node

Go back to n8n → open the Airtable Info node (the green node at the start of the workflow).

Enter the Base ID and Table ID you just copied into the correct fields.

Quick tip: All Airtable config lives in this one node — if you ever switch bases, you only have to update one place instead of hunting through each node individually.

Back Up N8N Workflows to Airtable — Smart Sync

Step 4: Connect Your Airtable Personal Access Token

In the workflow, find the Airtable credential node → click to open → select or create a new credential → paste your Airtable Personal Access Token.

Don't have a token yet? See the guide on creating an Airtable Personal Access Token.

Back Up N8N Workflows to Airtable — Smart Sync

Step 5: Create an n8n API Key

Go to your n8n instance Settings → select the n8n API tab → click Create an API key → give it a name (e.g. "Backup Workflow") → copy the generated key.

Back Up N8N Workflows to Airtable — Smart Sync

Step 6: Enter the n8n API Key in the Header Auth Node

Find the Header Auth node in the workflow → open the credential → paste the API key into the Value field.

Back Up N8N Workflows to Airtable — Smart Sync

Configuration is done. The workflow now has access to both n8n and Airtable — activate it and it's ready to run.


How Smart Backup Works

After fetching all workflows from the n8n API (GET /api/v1/workflows), Branch 1 doesn't blindly back everything up. It compares against what's already in Airtable and sorts workflows into three cases:

  • Case A — Not in Airtable yet: New workflow, never backed up → creates a new record + uploads the JSON file for the first time.
  • Case B — Exists but has updates: Compares the updatedAt field in n8n against the record in Airtable — if they differ, the workflow was modified → updates the record + replaces the old JSON with the new one.
  • Case C — Exists, no changes: updatedAt matches → skipped entirely. No Airtable API call, no file upload.

This logic means 6 runs per day don't burn through API calls pointlessly — only workflows that actually changed get touched.
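The three cases boil down to a small decision function. Here's the logic sketched in shell with canned timestamps; the actual workflow implements this with n8n IF/Switch nodes, not a script:

```shell
# decide <in_airtable: yes|no> <n8n_updatedAt> <airtable_updatedAt>
decide() {
    if [ "$1" = "no" ]; then
        echo "create"   # Case A: first backup -- new record + JSON upload
    elif [ "$2" != "$3" ]; then
        echo "update"   # Case B: changed -- replace record + JSON file
    else
        echo "skip"     # Case C: untouched -- no Airtable API call at all
    fi
}

decide no  ""                  ""                    # -> create
decide yes "2026-03-08T14:48Z" "2026-03-01T09:00Z"   # -> update
decide yes "2026-03-08T14:48Z" "2026-03-08T14:48Z"   # -> skip
```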


Sync: Auto-Delete When You Delete a Workflow

This is my favorite part of the workflow.

Branch 2 — Cleanup — runs in parallel with Branch 1: pulls all records from Airtable → uses the n8n API to check whether each workflow ID still exists (GET /api/v1/workflows/{id}).

If the workflow has been deleted from n8n, the API returns 404. Instead of letting that 404 crash the entire workflow, n8n is configured with onError = continueErrorOutput — a 404 isn't a fatal error, it's a signal that this workflow ID no longer exists. That signal triggers the next step: delete the corresponding record in Airtable.
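The status-code handling is the heart of it. Sketched in shell (the real check is an n8n HTTP Request node; the curl call and its variables below are placeholders, and treating non-404 errors as "do nothing" is one sensible policy, not something the workflow dictates):

```shell
# Per-record check the Cleanup branch performs, as plain shell.
# In production this would be an authenticated call such as:
#   curl -s -o /dev/null -w '%{http_code}' \
#        -H "X-N8N-API-KEY: $N8N_API_KEY" \
#        "$N8N_URL/api/v1/workflows/$WORKFLOW_ID"

handle_status() {
    case "$1" in
        200) echo "keep"   ;;  # workflow still exists -- leave the record
        404) echo "delete" ;;  # continueErrorOutput path -- remove the record
        *)   echo "error"  ;;  # network/auth failure -- do NOT delete backups
    esac
}

handle_status 200   # -> keep
handle_status 404   # -> delete
handle_status 500   # -> error
```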

The result: Airtable always mirrors n8n exactly. No ghost backups from deleted workflows. No manual cleanup needed.

Back Up N8N Workflows to Airtable — Smart Sync


Download & Wrap Up

Set it up once and forget about it.

  • Runs every 4 hours, 6 times a day (adjustable)
  • Only backs up what actually changed
  • Delete a workflow in n8n → Airtable removes it automatically
  • Search, sort, and filter your workflows in Airtable — far easier than digging through Drive folders

Download the workflow: backup-n8n-to-airtable.json — no email required.

If you run into anything during setup, drop a comment below — I read them all.

]]>
<![CDATA[Airtable Setup Guide: Create Account, Workspace & Get Your API Key]]>https://sonjj.com/airtable-setup-workspace-api-key/69a7a6ecb8b74452cf3a6142Wed, 04 Mar 2026 06:20:55 GMT

I use Airtable a lot — mainly as a database for AI agents, n8n, and other automation workflows. The more I use it, the more I realize I need a dedicated setup guide that other posts can reference. This is that foundation post — any tutorial in the series that involves Airtable will link back here instead of re-explaining the same setup every time.

One important heads-up: Airtable deprecated the old API Key in February 2024. You now need to use a Personal Access Token (PAT) instead. If you're following a tutorial that tells you to grab an API key from Account Settings — skip it. That approach no longer works.


What Is Airtable & Why Do You Need an API Key?

In plain terms, Airtable looks like Google Sheets but works like a real database underneath. You can create tables, link data between them, filter and sort any way you want — all through a drag-and-drop interface, no code required.

Airtable has over 500,000 customers across industries — eCommerce, project management, content planning, you name it. I personally use it as the central database for a 7-website ecosystem, connected to n8n to automate daily workflows.

What's the API key for?

You don't need an API key to use Airtable through the web interface. But if you want to connect it to other tools — n8n, ChatGPT, Claude, Zapier — you need a "key" that lets those tools read and write data in Airtable on your behalf.

That key is the Personal Access Token (PAT).

Each token can have specific permissions: read-only, write-only, or both. You can also scope it to specific bases instead of your entire account — much safer than the old API key, which gave full access with no way to limit it.


Create an Airtable Account

Go to airtable.com and click Sign up for free. No credit card needed.

Two ways to sign up:

  • Google account — fastest option, done in one click
  • Email + password — if you prefer keeping accounts separate

I'd recommend the Google account for convenience. After signing up, Airtable asks a few questions about your use case — you can skip them all; they don't affect anything.

Airtable Setup Guide: Create Account, Workspace & Get Your API Key

Once you're in, you'll see the main dashboard — where all your workspaces and bases live. The left sidebar shows your workspaces; the center area shows your bases.

First time in, it'll look empty. Next, we'll create your first workspace and base.

Airtable Setup Guide: Create Account, Workspace & Get Your API Key


Create a Workspace, Base & Table

Before creating anything, here's how Airtable structures data:

  • Workspace — the top-level container, holds multiple bases
  • Base — like an Excel file, holds multiple tables
  • Table — a single data table (like a sheet in Excel)
  • Records — individual rows in a table

That's the whole hierarchy. Now let's create your first base.

Option 1: Start from scratch

From the dashboard, click Create a base → select Start from scratch. Airtable creates a base with one default table. From there, add columns and rows however you like.

This works best when you already know what you need.

Option 2: Copy an existing template

If you're not sure where to start, templates are the quickest way in. Airtable has hundreds across project management, CRM, content calendars, and more.

Here's a real example: copying the Backup N8N template — a base I built on Airtable Universe to store n8n workflow backups.

On the template page, you can see the base structure and sample data:

Airtable Setup Guide: Create Account, Workspace & Get Your API Key

Click into the base to see details. There's a Copy base button in the top right — click it:

Airtable Setup Guide: Create Account, Workspace & Get Your API Key

Airtable will ask which workspace to save it to. Pick yours and click Copy base:

Airtable Setup Guide: Create Account, Workspace & Get Your API Key

Done. Back on the dashboard, the base is now in your workspace and ready to use:

Airtable Setup Guide: Create Account, Workspace & Get Your API Key

Whether you built from scratch or copied a template, the next step is the same — get an API key to connect this base to external tools.


Get Your Personal Access Token (API Key)

This is what the whole post is building toward.

Quick reminder: Airtable dropped the old API Key in February 2024. There's now only one way to connect — Personal Access Token (PAT). Any tutorial telling you to grab an API key from Account Settings is out of date.

Step 1: Open Developer Hub

Click your account avatar in the top right → select Developer hub. This is where all your tokens live.

Step 2: Create a new token

Click Create new token. Airtable asks for three things:

Token name — pick something descriptive. For example:

  • n8n-automation — for n8n workflows
  • chatgpt-integration — for Custom GPTs
  • claude-agent — for AI agents

Don't use generic names like "my token" or "test" — once you have 5–10 tokens, you won't remember which one does what.

Scopes — what the token can access. Choose just what you need:

  • data.records:read — read data from tables
  • data.records:write — create, update, and delete records
  • schema.bases:read — read the base structure (table names, field names)

These three scopes cover 90% of use cases. For n8n, ChatGPT, or Claude integrations, these are the ones to pick.

Bases — which bases this token can access. You can choose "All current and future bases" or specify individual ones. I'd recommend picking specific bases if you know what you need — it's safer.

Airtable Setup Guide: Create Account, Workspace & Get Your API Key

Step 3: Copy the token immediately

After clicking Create token, Airtable shows the token exactly once. No "show again" button. No recovery option. If you forget to copy it, you'll have to delete the token and start over.

Copy the token, store it somewhere safe (a password manager, a private file), then paste it into whatever tool you're connecting to.

Airtable Setup Guide: Create Account, Workspace & Get Your API Key

Token security

A few rules worth keeping:

  • One token per purpose — don't share a token between n8n and ChatGPT. Create separate ones so you can revoke them individually without breaking everything else.
  • Token leaked? Revoke it immediately. Go to Developer Hub, delete the token, create a new one. Takes 30 seconds.
  • Don't commit tokens to code — if you're writing code, store tokens in environment variables, never paste them directly into source code.
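To tie the environment-variable rule to practice, here's a hedged sketch of a token sanity check. The base and table IDs are placeholders, and the live curl call is shown commented out with a canned response standing in for it:

```shell
# Keep the token out of source -- export it once in your shell profile:
#   export AIRTABLE_PAT="patXXXX.XXXXXXXX"
BASE_ID="appXXXXXXXXXXXXXX"
TABLE_ID="tblXXXXXXXXXXXXXX"

# Live check (needs network and a valid token):
#   curl -s "https://api.airtable.com/v0/$BASE_ID/$TABLE_ID?maxRecords=1" \
#        -H "Authorization: Bearer $AIRTABLE_PAT"
# A token with the right scopes returns a records envelope like this:
RESPONSE='{"records":[{"id":"recABC123","fields":{"Name":"First record"}}]}'

check_response() {
    case "$1" in
        *'"records"'*) echo "token OK" ;;            # readable: envelope present
        *)             echo "check token/scopes" ;;  # error payload instead
    esac
}

check_response "$RESPONSE"   # -> token OK
```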

Free Plan Limits

Now that you have a token, here's what Airtable's free plan allows — it's honestly enough for most people getting started, but good to know upfront:

  • 1,000 records per base — plenty for side projects, task management, or personal content planning
  • 1,000 API calls per month — each read/write from n8n or ChatGPT counts as one call; for simple workflows, this goes a long way
  • 5 requests per second — this rate limit applies to all plans, including paid ones
  • 5 editors per workspace — fine for solo work or small teams
  • 100 automation runs per month — this only counts Airtable's built-in automations; if you're running workflows through n8n externally, this number is irrelevant

Airtable Setup Guide: Create Account, Workspace & Get Your API Key

If you're just getting started with Airtable to connect it to n8n or an AI agent, the free plan is more than enough. Don't rush to upgrade. Use it until you hit the limits — by then, you'll know exactly what you need, and the upgrade decision will be a lot easier to make.


What's Next

That's it. You now have:

  1. An Airtable account
  2. A workspace and your first base
  3. A Personal Access Token with the right scopes

Those three things are enough to connect Airtable to anything. The setup is a one-time thing — from here on, whenever you need a new integration, just create a new token.

Where you go next depends on what you want to use Airtable with:

  • n8n — automate workflows and sync data between apps
  • ChatGPT / Claude — use Airtable as memory for AI agents
  • Zapier / Make — quick no-code connections

I'll have separate guides for each — all starting from where this post ends.

]]>
<![CDATA[My AI Journey — 2 Years of FOMO That Paid Off]]>https://sonjj.com/my-ai-journey/69bd64a5b8b74452cf3a923eThu, 26 Feb 2026 15:22:00 GMTThe First Time I Walked Into an Internet Café

In 2003, I walked into my first internet café in Vietnam. Everything was new — the giant CRT monitor, the dial-up modem screaming through the connection, the first game loading on screen while I sat there unable to blink. The internet felt like magic. I didn't understand how it worked. I just knew it opened a world completely unlike anything I'd ever seen.

Twenty-some years later, in April 2024, I clicked "Pay" on my first $20 Claude Pro subscription. And that feeling came rushing back — identical. The same curiosity. The same sense of standing in front of something much bigger than me that I didn't fully understand yet. The same question: "What else can this thing do?"

I've been making a living online for over a decade. A generalist dev — not deep in any one thing, just enough of everything to get the job done. Code, SEO, content, running a bunch of my own websites and APIs. But when I started using AI every day — four to six hours, consistently, for nearly two years — I realized I wasn't just adding a new tool. I was rebuilding how I work from the ground up.

This isn't a tutorial. This isn't a tool review. This is my actual story — from that first $20 Claude Pro subscription, through the messy phase of trying everything, to finding the four tools that finally made everything click.

When Google Stopped Being My First Move

I'd been coding for fun and then for money for over a decade. I was good at Googling. I knew how to phrase exact queries, filter StackOverflow by highest votes, scan through 5–6 tabs and stitch together a solution. That was a survival skill for any dev before 2024.

Then Claude changed everything.

The biggest difference wasn't speed or accuracy — it was the nature of the interaction itself. Google is search: I type keywords, read results, piece things together on my own. AI is dialogue: I describe a problem, it asks follow-up questions, I explain more, we dig deeper together. Not a smarter search engine. A completely different way of working.

And I threw everything at it:

  • Code — from writing small functions to reviewing complex logic, refactoring whole modules
  • SEO — keyword analysis, content strategy, technical SEO audits
  • Content — brainstorming, outlining, drafting, tone adjustments
  • Offpage — using AI to refine link building strategy, analyze competitors

Four to six hours a day. Not because I forced myself. Because every time I opened Claude, I'd find something else it could do that used to cost me hours of solo effort. That excitement felt exactly like the early days of discovering the internet — every day, a new corner to explore.

I basically stopped using StackOverflow. Google only comes out when I need a specific link or have to check official documentation. Everything else goes through AI.

The Phase Nobody Talks About

Then things got chaotic.

I discovered n8n — a workflow automation platform. Brilliant. I built my first workflow: auto-check email, auto-call an API. Then built another. Then deleted the first one because it no longer fit. Then rebuilt from scratch because I'd thought of a better approach. Each workflow solved one isolated problem, connected to nothing else, belonging to no system.

Meanwhile, GitHub and the AI community kept shipping new tools. Cursor. Windsurf. Claude Desktop. Cline. Continue. Aider. A new name every week, each with an impressive demo, each with an appealing use case. I'd install one, not get fluent before the next one dropped. Not get deep on one workflow before being pulled into another.

No framework for filtering. Every tool seemed good. Every workflow seemed right. But nothing connected to anything else.

I think a lot of people are in this phase right now — and it's the phase most AI workflow posts skip entirely. They jump straight from "I started using AI" to "here's my complete system," skipping over the middle part — where you're drowning in choices, tool-hopping constantly, and don't know when to stop and actually build something.

A 2026 survey found that 46% of product teams say the biggest barrier to AI adoption isn't the lack of tools — it's the lack of integration between tools and existing workflows. That number didn't surprise me. I'd lived through it.

Four Tools, One System

Things started clicking when I stopped asking "which tool is best?" and started asking "how do these tools connect?" Not because I found the perfect tool — but because I found how four tools could support each other as a single system.

n8n — The Middleware Layer

n8n workflow automation platform

n8n is workflow automation — built to automate, connect apps, run on schedule. That's its original purpose, and it does it well. But for me, n8n plays an additional role: the middleware layer between AI and the server. Not because n8n was designed for this — but because when I looked at my workflow, it fit perfectly. Instead of letting AI call directly into a database or API server, every request goes through n8n first — processed, filtered, then returning exactly the data needed. Safer, more token-efficient, and fully under my control.

Craft Agents — The Workshop

Craft Agents is where I do the actual work every day. Building skills, creating data source connections, writing plans, testing cron jobs — anything that requires craft happens here. The chat and build experience in Craft Agents is noticeably better than any tool I've used. It's open source, so I forked it and added features to let it share skills and sources with OpenClaw. Two separate tools became one shared ecosystem.

Craft Agents interface

Obsidian — The Central Brain

Obsidian isn't a note-taking app. For me, it's the knowledge management architecture for the entire system. Every document, knowledge base, project structure, AI output — all of it lives in the Obsidian vault. Both Craft Agents and OpenClaw share the same Obsidian workspace. With tags and folder structure, everything is easy to find and easy to scale.

Obsidian knowledge management vault

OpenClaw — The Factory

If Craft Agents is the workshop — where you build and prepare — OpenClaw is the factory — where things run. Skills that have been tested, cron jobs that have been set up, queries that need fast results — all of that runs on OpenClaw. It doesn't replace Craft Agents. They have different roles and complement each other completely.

These four tools aren't just used separately and called a "system." They're intentionally connected — Craft Agents and OpenClaw share skills and sources through the fork, Obsidian is the shared vault for both, and n8n sits in the middle as a safe bridge to the server and external data.

OpenClaw execution interface

Why These Four

The question I get most: "Why not just use one tool for everything?"

Because no single tool does everything well. Each one solves a different problem, and the way they support each other creates something bigger than the sum of its parts.

Why not use OpenClaw for everything? OpenClaw executes well — stable cron jobs, fast lookups, clean output. But the chat and build experience doesn't come close to Craft Agents. When you need to build a new skill, test a new source, or plan something complex — Craft Agents is smoother by a mile. These two don't compete. One builds, one executes.

Why Obsidian instead of Notion or Google Docs? Plain text and offline-first. Markdown files sit right on the machine — AI agents access them directly through the file system, no separate API calls needed, no dependency on any cloud service. When you're managing a bunch of websites and APIs, having everything in one vault with clear folder structure and tags isn't just convenient — it's essential. Obsidian is the soul of the system, not because it's trendy, but because its architecture fits perfectly with how AI agents work.
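To make the "agents read the vault directly" point concrete, here's a minimal sketch. The vault path and tag name are made up, and real agents read files through their own tooling, but the mechanism is the same: plain markdown files on disk, no API, no cloud.

```python
from pathlib import Path

def find_notes(vault: Path, tag: str) -> list[Path]:
    """Return every markdown note in the vault that carries the given #tag.
    Just the file system -- which is exactly what makes plain-text vaults
    agent-friendly: no API calls, no cloud dependency."""
    matches = []
    for note in vault.rglob("*.md"):
        if f"#{tag}" in note.read_text(encoding="utf-8"):
            matches.append(note)
    return sorted(matches)
```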

Why use n8n as middleware? n8n wasn't built for this — it's a workflow automation tool, plain and simple. But after enough trial and error, the lesson was clear: letting AI call directly into an API server is fast but wasteful (responses carry too much extra data) and carries higher security risk. I looked at what I already had and saw that n8n could stand in the middle — receive requests from AI, process them, return exactly the data needed. Safer and more token-efficient. Not an advertised feature — just the way I shaped it to fit my workflow.

Why fork Craft Agents? It's open source — anyone can fork and customize. And the instinct of someone who earns a living online is: integrate rather than reinvent. A tool that's close to what I need but missing one key feature — I don't switch tools, I fix it. Adding the ability to share skills and sources with OpenClaw turned two separate tools into one ecosystem. Don't wait for someone else to build the bridge. Build it yourself.

The problem a lot of teams face isn't a shortage of AI tools — it's the missing connections between them. This four-tool stack solves that — not with a super-tool, but by wiring the right tools together.

What's Coming Next

This post is just the big picture — the story of the journey, not a setup guide. But behind this four-tool stack are a lot of specifics I'm gradually documenting:

  • How to fork Craft Agents and add skill/source sharing with OpenClaw — step by step, from zero
  • Obsidian vault architecture for AI agents — how I organize folders, tags, and naming conventions so both Craft Agents and OpenClaw can read everything
  • Building n8n as an API middleware — designing safe workflows between AI and a production server
  • My actual daily workflow — from receiving a task to getting it done, where AI steps in, and which tool handles which part

None of this is generic tutorial content. These are things I'm actually using, documented from real operations — with enough context for you to adapt to your own work, not copy wholesale.

If you want those posts as soon as they're out, subscribe to the newsletter below. No spam, no sales. Just posts when I actually have something worth sharing.


Still the Same Curiosity

Nearly two years. From that first $20 Claude Pro to a system managing a whole stack of websites and APIs I've been building for over a decade. But the numbers aren't the point. The biggest shift was in thinking — from tool user to system builder. From someone searching for answers to someone designing how to ask questions.

AI doesn't replace skills. It amplifies them. Everyone has access to AI — but actually putting it to work in a meaningful way is something else entirely. Making AI tools interact and fit together is harder still. Research says AI coding assistants boost productivity 20-30% — but that number only means something when you have a system to take advantage of it. Using AI piecemeal, differently every day, keeps you stuck in experiment mode. With a system, AI actually becomes part of how you work.

If you're in the chaotic phase right now — tool-hopping, deleting workflows, not knowing where to stop — that's normal. I was there too. Everyone goes through it before things start to click.

You don't need to start with four tools at once. Start with one. Get deep on it. Then add the second when you feel the gap. Everyone's journey is different — what matters is building with intention, not building because of FOMO.

Sitting in front of Claude now feels the same as sitting in front of that CRT monitor in the internet café back in 2003. Still the same curiosity. The difference is — this time, I know what I'm building.

]]>
<![CDATA[Claude Desktop & MCP: Build an "AI Brain" That Controls Your Entire System]]>https://sonjj.com/claude-desktop-mcp-ai-brain-ecosystem/6867cd370f04eecb6f662266Fri, 04 Jul 2025 13:34:19 GMT

Managing an ecosystem like SONJJ, with a central Airtable base of 12 tables and over 200 data fields, plus thousands of links, articles, and SEO strategies, is a huge challenge for one person. Constantly switching between tools, copy-pasting information, and running manual commands wastes my most valuable resources: time and focus.

In this article, I'll pull back the curtain on that "AI brain": the breakthrough combination of Claude Desktop and the MCP (Model Context Protocol) ecosystem. You'll see how an AI can become a real assistant, capable of directly interacting with and controlling the entire system I've built over the years.

We'll explore why Claude Desktop is a "game-changer," what MCP is, and dive deep into 4 real case studies of how I use them to manage data, content, knowledge base, and my computer itself.


1. Claude Desktop: Not Just a Chatbot, But a Command Center

When people think of AI, many immediately think of a chat window in their browser. Claude Desktop goes way beyond that. It's a specialized work environment, a real command center.

1.1 Beyond a chat window

What makes Claude Desktop different is its ability to create separate Projects. Each project is like a specialized "brain" with its own background knowledge. I can create a project for "SmailPro," upload all technical documents, source code files, and market analysis into it. When I work in this project, Claude will only think based on that context.

And here's the "game-changing" feature: Artifacts. When I ask Claude to create code, a plan, or a complex document, it doesn't just return text in the chat window. It creates a complete file in a separate window, ready for me to copy, edit, or save. This is the difference between "chatting" with AI and "working" with AI.


1.2 Investment in superpowers

At $20/month, this is the highest-ROI (Return on Investment) purchase I've ever made. It gives me an AI assistant working 24/7. The real value isn't in writing a few emails or summarizing text; it's in having a real "employee", an important link in my operating system.

2. MCP (Model Context Protocol): Giving AI "Hands and Feet"

If Claude Desktop is the brain, then MCP is the set of "hands and feet" that allows that brain to interact with the real world.

Simply put, MCP servers are specialized "drivers" that let Claude communicate directly with other applications: your servers, your APIs, and even your computer itself.

This is Claude's absolute advantage over other models. It completely breaks down the limits of a chat window. AI is no longer a passive advisor that can only give advice. Now, it's an executor capable of taking action based on what it knows.

3. Case Study: The "Big Four" MCPs Running the SONJJ Ecosystem

Theory is one thing, but how does it actually work? This is the core part. I'll show you exactly how I use the 4 most important MCPs to control my SONJJ ecosystem.

3.1 desktop-commander: The General on Your Computer

  • Function: Allows Claude to read files, write files, and run terminal commands directly on my computer.
  • Real example: Starting a new side project.

Instead of opening terminal, cd into folder, mkdir, touch files manually, I just need to command:

Using desktop-commander, please do these tasks:
1. Create a new folder at '~/projects/new-landing-page'.
2. Inside that, create these files: 'index.html', 'styles.css', 'app.js'.
3. Write basic HTML boilerplate code into the 'index.html' file.
4. Open this folder in VSCode for me.
  • Impact: Completely automates work environment setup. Just one command, everything is ready. It seems small, but multiply it by dozens of times each month. The time saved is significant.

3.2 airtable-mcp-server: Data Architect

  • Function: Allows Claude to interact both ways with my Airtable Base through API. It can read, search, create new, and update records.
  • Real example: Adding a new backlink building plan for the smailpro.com project.

My Airtable system is very complex. To create a new backlinks_tier_1 record, I have to link it with 3 different tables. Now, I just need to command in natural language:

Using airtable-mcp, please create a new record in the "backlinks_tier_1" table (tbl6zzZ8TbS1WABAr) of base "app7sFHYrv2XUGQQa" with this information:
- Link to record in "structures" table with URL "https://smailpro.com/features".
- Link to record in "resource.backlinks" table with domain "topdevforums.net".
- Link to "keywords" record with keyword "premium temporary email".
- Set "scheduled_date" as tomorrow.
  • Impact: I can manage a huge database using natural language instead of manually clicking and filling forms. This helps reduce errors and speeds up work many times over.
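Under the hood, a command like that becomes a plain Airtable REST call. This is a hedged sketch, not airtable-mcp's actual code — the record IDs and token are placeholders — but the payload shape follows Airtable's public API, where linked-record fields are arrays of record IDs.

```python
import json
import urllib.request

API_URL = "https://api.airtable.com/v0/{base}/{table}"

def build_record(structure_id: str, backlink_id: str, keyword_id: str,
                 scheduled_date: str) -> dict:
    """Airtable represents linked-record fields as lists of record IDs."""
    return {
        "fields": {
            "structures": [structure_id],          # link into "structures" table
            "resource.backlinks": [backlink_id],   # link into backlinks resource
            "keywords": [keyword_id],              # link into "keywords" table
            "scheduled_date": scheduled_date,
        }
    }

def create_record(base: str, table: str, token: str, record: dict):
    """POST one record -- roughly what the MCP server does on your behalf."""
    req = urllib.request.Request(
        API_URL.format(base=base, table=table),
        data=json.dumps(record).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```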

3.3 ghost-mcp: Automatic Editor

  • Function: Connects Claude directly with Ghost CMS, allowing it to create, update, and manage posts.
  • Real example: Publishing a new article to sonjj.com blog.

After Claude helps me complete an article in the Artifacts window, I don't need to copy-paste. I command:

Using ghost-mcp, please take the content from the markdown file I just uploaded, then create a new draft post on sonjj.com with the title "Complete SmailPro API Integration Guide". Please assign tags "api" and "tutorial".
  • Impact: This is the final piece to automate my content distribution flow, making the Airtable (planning) → Claude (writing) → Ghost (publishing) process seamlessly real.
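For illustration, here's roughly what such a command becomes on the wire. This is a sketch, not ghost-mcp's actual implementation: Ghost's Admin API wraps posts in a top-level "posts" array, and how the body content is attached (HTML, mobiledoc, or lexical) varies by Ghost version, so this builds the metadata only.

```python
def build_draft(title: str, tags: list[str]) -> dict:
    """Minimal Ghost Admin API payload for a draft post (metadata only)."""
    return {
        "posts": [{
            "title": title,
            "status": "draft",                    # create as draft, not published
            "tags": [{"name": t} for t in tags],  # tags referenced by name
        }]
    }
```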

3.4 mcp-pinecone: Vector Knowledge Keeper

  • Function: Allows Claude to perform semantic search on my vectorized Pinecone knowledge base.
  • Real example: Answering a complex strategic question that needs information synthesis from multiple sources.

A potential investor asks: "How does your SmailPro solve privacy issues for end users compared to competitors?" I can ask Claude:

Using mcp-pinecone to query the "smailpro" namespace, please find and synthesize all information about security features, data storage policies, technical architecture related to encryption, and real privacy applications. Then, compare these points with 2 main competitors and present results in table format.
  • Impact: Turns AI into a real strategic consulting expert. It doesn't just search keywords, but also "understands" and synthesizes information from the huge knowledge base I've built.
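"Semantic search" here means ranking by vector similarity rather than keyword match. Below is a toy version of the core operation — not Pinecone's API, and with vectors a few dimensions wide instead of the hundreds a real embedding model produces:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query_vec, docs, k=2):
    """docs: list of (doc_id, vector). Rank by similarity to the query --
    the same operation a vector database performs at scale."""
    return sorted(docs, key=lambda d: cosine(query_vec, d[1]), reverse=True)[:k]
```

Pinecone performs exactly this kind of ranking, just over millions of high-dimensional vectors with an index instead of a linear scan.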

4. Combined Power and the Philosophy Behind

The real power isn't in each individual MCP, but in their combination. In a single conversation, I can command Claude: "Please analyze data from Pinecone about the topics users care about most this month, then create a list of article ideas in Airtable, then draft the first article and post it to Ghost as a draft, finally save the resulting markdown file to the '~/content/published' folder on my computer."

All with just a few commands.

This is the realization of the philosophy I always pursue: "Integration over Innovation" and "Free up human time". Technology isn't meant to replace us. It's there to amplify our capabilities, to automate boring tasks, giving us more time for what really matters: family, strategic thinking, creativity, and life.

Conclusion & Next Actions

The key point you need to remember is: Claude Desktop + MCPs allow you to build an AI assistant "tailored" for your own workflow. This is the leap from "using AI" to "commanding AI".

If this caught your interest, don't just stop at reading. Start taking action. The first step is very simple: install Claude Desktop and experiment with the easiest MCP, desktop-commander. Try automating one small task that you have to do repeatedly every day.

And this is just the introduction. In the next articles, I'll go deep into how to set up each automated workflow with n8n to connect all these tools together into a complete machine.

]]>
<![CDATA[Guide to buying credits and understanding SonJJ's credit system]]>https://sonjj.com/sonjj-api-credits-system-guide/67fe51de093385dc35448242Tue, 15 Apr 2025 13:02:00 GMTIn this article, we will guide you on how to purchase credits from the system wallet, explain in detail the credit calculation mechanism for each type of API request, and share cost optimization strategies.

1. Introduction to SonJJ's credit system

SonJJ's credit system is the main payment unit for all API requests. This is one of the most outstanding features of SonJJ compared to other API services in the market:

  • Credits never expire - You can buy and use them anytime, without worrying about usage time limits
  • Charges only on success - Credits are only counted when the request returns status code 200
  • Fair cost - Each type of endpoint has a different cost, reflecting the actual value it provides
  • Attractive bonuses - Save up to 35% when buying larger credit packages
"Credits do not expire. And it comes in cheap, if you buy larger plans."

2. Simple guide to creating an API Key

Just log in to the system management area at https://my.sonjj.com/, then select the menu item named API.

API Key creation interface screenshot

You will see a button labeled "Change API KEY"; click it and you will get an API key to use.

3. Guide to buying credits from wallet

Before buying credits, you need to ensure you have added money to your SonJJ wallet. If not, you can refer to the guide on adding funds to your SonJJ wallet.

Available credit packages

Package name | Credits | Price | Bonus   | Savings
Plan-1       | 10,000  | $10   | +0      | 0%
Plan-2       | 49,000  | $49   | +7,350  | 15%
Plan-3       | 99,000  | $99   | +34,650 | 35%
SonJJ credit package pricing table

Step-by-step credit purchase process

  1. Log in to your SonJJ account
  2. Click on the "Credit" tab in the navigation bar
  3. In the "Buy Credits (does not expire)" section, choose the credit package that fits your needs
  4. Click on the "Buy Now" button
  5. The system will automatically pay from your SonJJ wallet (through "Pay from your wallet")
  6. Credits will be added to your account immediately

Saving tip: If you plan to use the API for a long time, choosing Plan-3 with a 35% bonus will help you save significant costs. With 133,650 credits (including bonus), you can make millions of requests to low-cost endpoints.

4. Credit calculation mechanism for each API request

SonJJ applies a smart credit calculation mechanism: only counting credits for successful requests (returning status_code=200). This ensures you only pay when you actually receive value from the API.

"Cost of each successful request (status_code=200)"

Credit costs are allocated based on the value and complexity of each endpoint:

Explaining cost differences

There are clear differences in costs between endpoints:

  • Simple endpoints (0.05 credits): Inbox query endpoints have the lowest cost due to simple processing
  • Standard endpoints (1-2 credits): Message, list, and create endpoints have average costs
  • Complex endpoints (>10 credits): Endpoints like check_index have the highest cost (14 credits) due to their high value and complex processing

Special note: Costs will vary between plan packages. The above pricing applies to PLAN-1. With higher packages that include bonuses, the actual cost for each request will be lower.

5. Benefits of SonJJ's credit system

SonJJ's credit system brings many benefits to users:

Fair cost

You only pay based on the actual value received. Simple endpoints have low costs, while endpoints providing higher value will have correspondingly higher costs. This ensures a balance between cost and benefit.

Easy budget forecasting

With fixed credit costs for each type of request, you can easily estimate the API budget for your project. For example, if you know you will make 1,000 requests to the inbox endpoint daily, you will spend 50 credits/day (1,500 credits/month).
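That forecast is a single multiplication, which is the whole appeal of fixed per-endpoint pricing. A trivial helper (the 30-day month is an assumption):

```python
def monthly_credits(daily_requests: int, credits_per_request: float, days: int = 30) -> float:
    """Credits consumed per month at a steady daily request rate."""
    return daily_requests * credits_per_request * days

# 1,000 inbox requests/day at 0.05 credits each -> 50 credits/day, 1,500/month
```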

Savings with scale

The more credits you buy, the more you save. With the 35% bonus from Plan-3, the actual cost for each request will decrease significantly. This is a major advantage for businesses using the API with high frequency.

No worries about deadlines

Credits never expire, allowing you to buy in large quantities to take advantage of bonuses without worrying about having to use them all within a specific time period.

6. API cost optimization strategies

To optimize costs when using SonJJ's API, you can apply the following strategies:

Choose low-cost endpoints when possible

For example, if you only need to check if there are new emails, use the /inbox endpoint (0.05 credits) instead of /message (1-2 credits). Only use expensive endpoints when absolutely necessary.

Focus on credit packages with high bonuses

Plan-3 with a 35% bonus will help you save significant costs in the long run. Instead of buying Plan-1 multiple times, consider investing in Plan-3 from the start.

Optimize API call frequency

Instead of calling the API continuously, set up reasonable intervals between calls. This not only saves credits but also reduces load on your system.

Cost optimization example:

A company needs to check 10,000 emails monthly:

  • Option 1: Using Single API /v1/check_index (14 credits/request) = 140,000 credits/month
  • Option 2: Combining inbox checks (0.05 credits) and only using message checks (1 credit) when needed = ~20,000 credits/month

→ Savings: 120,000 credits/month (equivalent to over $120 with Plan-1)
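The comparison above can be checked with a few lines. The plan figures come straight from the pricing table earlier in this post; the effective-cost calculation is my own framing, not an official SonJJ formula:

```python
PLANS = {
    # name: (total credits including bonus, price in USD)
    "Plan-1": (10_000, 10),
    "Plan-2": (49_000 + 7_350, 49),
    "Plan-3": (99_000 + 34_650, 99),
}

def usd_cost(total_credits: float, plan: str) -> float:
    """Effective dollar cost of spending `total_credits` under a given plan."""
    credits, price = PLANS[plan]
    return total_credits * price / credits

# Option 1 from the example: 10,000 check_index calls at 14 credits each
option_1 = 10_000 * 14  # 140,000 credits
```

Under Plan-1 those 140,000 credits cost $140; under Plan-3 the same spend works out to roughly $104, which is where the 35% bonus shows up in practice.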

7. Frequently asked questions

Do credits expire?

No, SonJJ credits never expire. You can buy and use them anytime without worrying about usage time limits.

How do I track credit usage?

You can check your remaining credits on the SonJJ homepage after logging in. It displays your current credit balance.

Am I charged for failed requests?

No, SonJJ only counts credits for successful requests (returning status_code=200). Failed requests are not charged.

8. Conclusion and next steps

SonJJ's credit system provides a fair and efficient approach to API usage. With features like never-expiring credits, charging only on success, and attractive bonuses for large packages, SonJJ helps you optimize costs significantly.

To start using the API and take advantage of the credit system:

  1. Add funds to your SonJJ wallet (if you haven't already)
  2. Purchase an appropriate credit package (recommend Plan-3 if you have long-term usage plans)
  3. Create an API key and start integrating the API into your application
  4. Apply the cost optimization strategies mentioned

Start today and enjoy the benefits of SonJJ's flexible, cost-effective API system!

]]>
<![CDATA[How to Add Funds to SonJJ Through Payeer]]>https://sonjj.com/add-funds-sonjj-payeer/67fe23b4093385dc35448196Tue, 15 Apr 2025 09:27:50 GMT

Adding funds to your SonJJ account is an important step to access premium features and use API credits within our service ecosystem. In this article, I'll guide you on how to add funds through the Payeer payment gateway - a flexible payment method that supports both credit cards and various cryptocurrencies.

The funding process is very simple and takes only about 5 minutes to complete. After successful payment, you'll immediately receive the balance in your account and can use it to optimize your work. Follow the steps below to add funds quickly and safely.

Step 1: Access the SonJJ wallet

Sonjj wallet interface

Step 2: Choose amount and payment gateway

  • Enter the amount you want to add in the "Enter Amount (USD)" field
  • Or select one of the available packages: $1, $5, $20, $50, $100
  • In the "Choose Payment Gate" section, select "Payeer (BTC,USDT,BCH,TROX...etc)"
  • Click the red "MAKE A PAYMENT" button

Step 3: Confirm payment information

  • Check the order information: Order ID, Amount
  • Click "GO PAYMENT" to continue
Transaction confirmation window

Step 4: Select payment method

  • The Payeer page displays payment methods
  • Choose the appropriate method (e.g., PAYEER, BITCOIN, LITECOIN, etc.)
Payment methods

Step 5: Confirm information and complete payment

  • Review the information: Recipient (smailpro.com), Comments, Amount
  • If needed, enter your email and click "CONFIRM"
Final confirmation page

Step 6: Pay with cryptocurrency (if selected)

  • Send the EXACT amount displayed (e.g., 0.65960977 LTC)
  • Transfer to the provided wallet address
  • Complete within the specified time (under 15 minutes)
Litecoin payment instructions

Step 7: Confirm successful transaction

  • After payment is complete, you'll be redirected to the wallet page
  • Check your updated balance and the "Order Payeer successfully!" notification
Successful transaction

Important notes

When paying with cryptocurrency:

  • ⚠️ You must transfer the EXACT amount requested
  • ⏱️ Complete within the 15-minute time limit
  • 🔄 Wait a few minutes for the system to confirm the transaction

Troubleshooting payment issues:

  1. Failed transactions usually occur due to:
    • The transferred amount doesn't match exactly
    • Transaction exceeds the allowed time
  2. How to contact support:

Congratulations! You have completed the process of adding funds to your SonJJ account through the Payeer payment gateway. Now you can use your balance to access premium features and API credits, helping optimize your work and save your valuable time.

If you have any questions about the funding process or encounter difficulties during payment, don't hesitate to contact our support team for timely assistance.

]]>
<![CDATA[Transforming SonJJ.com into a multilingual website]]>https://sonjj.com/multilingual-website-update/67f334fe093385dc354465f3Mon, 07 Apr 2025 02:14:23 GMTI've just completed the customization of SonJJ.com into a multilingual website. This is a significant step in expanding my reach and sharing knowledge with a broader audience.

English has truly been my weak point, but with this decision, I must change that. I've decided to systematically improve my English skills, not just for better communication but to connect more effectively with the international community. This will enable me to share more projects, case studies, and practical experiences with users worldwide.

My goal is clear: to bring optimization knowledge and solutions to more people.

]]>
<![CDATA[Introduction and Detailed Account Registration Guide]]>https://sonjj.com/huong-dan-dang-ky-tai-khoan/67f2363c2b8743ef9f0a3e4aTue, 01 Apr 2025 03:23:00 GMTI. Introduction


Hello there. I don't know where you're coming from, but if you've made it here today and are reading this article, then we truly have a connection. Perhaps we share the same goal: learning methods to optimize the time we spend working on computers. And that's exactly why I built this website.

Sonjj.com is a website specialized in providing API tools and solutions that save as much computer work time as possible. Its mission is to help online workers, programmers, marketers, or anyone facing piles of repetitive daily tasks reclaim their time. Now that you understand the purpose of this website, don't waste any more time; let's get started! In the next section, I will guide you step by step through registering a SonJJ.com account, the first step on your journey to reclaiming your valuable time.


II. Sign up for sonjj account

Imagine this as creating a key to open the door to our ecosystem. Once you have this key, you can move freely between different "rooms" (websites within the system) without having to knock on the door from the beginning.

Access my.sonjj.com

First, open your browser and access my.sonjj.com - this is the control center for our entire ecosystem. To be honest, I designed it similar to how Google operates their accounts. Sign in once, use everywhere.

The interesting thing is that when you try to log in to any website in the ecosystem such as smailpro.com, smser.net, or cardgener.com, the system will automatically direct you to my.sonjj.com for authentication and then redirect you back. No need to remember multiple different passwords - simplifying everything!

Fill in registration information

You have three ways to create a sonjj account:

Method 1: Manual Registration (traditional form)

This method may be a bit "classic" but gives you complete control:

  1. Enter your name
  2. Fill in your email correctly (this will be your primary ID to recover your account if needed)
  3. Create a password (the more complex, the better)
  4. Confirm your password (just to ensure you didn't mistype)

After submitting the form, check your email for confirmation. Seriously, don't skip this step - it's how the system knows your email actually exists.

Method 2: Quick Registration with Google

If you're a fan of "optimizing every second" like me, this option is for you:

  1. Click on the "Sign in with Google" button
  2. Select the Google account you want to use
  3. Confirm access permissions

And done! The system automatically retrieves your email and name from your Google account. No need for email verification because Google has already done that.

Method 3: Use an existing sonjj.com account

If you already have a sonjj.com account:

  1. Click on the "Sign in with SonJJ" option
  2. Enter your sonjj.com account information

Email Verification

With manual registration, you'll receive a confirmation email within minutes. The email contains an activation link - click on it and your account will be activated immediately.

The good thing here is that if you don't receive the email, there's a "Resend confirmation email" button right on the page. No need to fill out the form again from the beginning - I hate having to re-enter information.

First Login

After verifying your email or registering via Google/SonJJ, you'll be redirected to the main dashboard. This isn't a complicated admin page with graphs and charts that overwhelm you - I've designed it simple enough for newcomers to easily navigate. You can see the image below; if you see an interface like that, it means you've successfully created an account and logged in.

Dashboard

III. Congratulations! You Have Completed the First Step

So you've successfully created your SonJJ.com account and become familiar with the admin interface. Easy, wasn't it? This is just the first step to begin your journey of automating your daily tasks.

To truly harness the power of these APIs, the next step you need to take is to deposit money into your account and get an API key to start using them.

In the next article, I will guide you in detail on how to deposit money into your SonJJ account and obtain an API key. Every minute you invest in this initial setup will help you save hours later on.

See you in the article "Guide to Depositing Money Into Your Account And Converting Money to Points".


]]>
<![CDATA[Upgrade of the Ugener tool v1.02]]>Below are the features that have been added to the Ugener.com tool. I hope these updates will be useful for all of your work.

  1. Usage History
  • Automatically saves created profiles in the local browser memory
  • Limits storage to a maximum of 50 most recent profiles
  • Easy access and reuse
]]>
https://sonjj.com/nang-cap-cong-cu-ugener-v1-02/67f2363c2b8743ef9f0a3e50Tue, 07 Jan 2025 07:45:00 GMTBelow are the features that have been added to the Ugener.com tool. I hope these updates will be useful for all of your work.

  1. Usage History
  • Automatically saves created profiles in the local browser memory
  • Limits storage to a maximum of 50 most recent profiles
  • Easy access and reuse of information
  2. Copy All
  • Allows copying all created profile information with just one click
  • Saves time and reduces manual operations for users
  3. Custom Views
  • Provides templates to display information according to specific needs
  • Example: Google view will only display fields necessary for creating a Google account
  • Will continue to add more templates in the future
  4. Custom Domain
  • Allows users to change the domain of the created email address
  • Flexibility in creating email addresses according to needs

These new features are designed to enhance the user experience, increase usability, and expand the customization capabilities of the tool.

]]>
<![CDATA[Upgrade of the Smailpro tool v1.0.1]]>https://sonjj.com/nang-cap-cong-cu-smailpro-v1-0-1/67f2363c2b8743ef9f0a3e4fMon, 04 Nov 2024 07:45:00 GMTThis is an important update with a standout feature: it allows creating multiple temporary email accounts simultaneously - a rare feature on other websites. Thanks to this, users' work efficiency will be maximized.

The new interface is designed to be more intuitive and polished, helping users easily monitor and select the type of email they need to create. Additionally, we have added a Gmail list from the 5th server and will continue to update in the future. Visit here to try the new tool: https://smailpro.com/temporary-email

]]>
<![CDATA[Upgrade of the Ugener tool v1.01]]>https://sonjj.com/nang-cap-cong-cu-ugener-v1-0-1/67f2363c2b8743ef9f0a3e4eSat, 10 Aug 2024 07:44:00 GMTThis update improves the interface, focuses on US username information, and adds a new feature that randomly generates real addresses from Google Maps. Try the new feature here

]]>
<![CDATA[Save time with Ychecker: From personal need to community solution]]>https://sonjj.com/cong-cu-kiem-tra-gmail-hang-loat/67f2363c2b8743ef9f0a3e4cSat, 10 Aug 2024 03:54:00 GMT

The Challenge of Managing Multiple Gmail Accounts


My work requires using many different Gmail accounts. At first, this seemed normal, but gradually it became a real nightmare:

  • Accounts suddenly requiring phone number verification
  • Some accounts disabled for unclear reasons
  • Having to constantly log in and check each account individually

As a result, I spent hours each day just managing these accounts. Precious time wasted!

The Birth of Ychecker: From Personal Problem to Community Solution

These difficulties motivated me to create Ychecker. Initially it was just to solve my personal problem, but then I realized many others must be facing similar challenges. So I decided to develop Ychecker not just for myself but to share it for free with everyone, because I believe time is our most valuable asset.

Ychecker: A Versatile Tool for Gmail Management

Ychecker was designed with a simple goal: to check multiple Gmail accounts simultaneously. You can access it at https://ychecker.com/.

Key Features:

  1. Bulk Checking: At https://ychecker.com/bulk-email-checker, you can check multiple accounts at once.
  2. Issue Detection: Ychecker quickly identifies which accounts are experiencing problems.
  3. API Integration: For developers who want to integrate checking functionality into their applications.

How to Use Ychecker

Using Ychecker is extremely simple:

  1. Visit https://ychecker.com/bulk-email-checker
  2. Enter the list of emails to check
  3. Click the check button
  4. Wait a moment and view the results

That's it! No need to log into each account individually anymore.

The Power of API

For developers, Ychecker provides a powerful API that allows integration of email checking functionality into your application. You can learn more about the API at https://ychecker.com/api.

With the API, you can:

  • Automate the email checking process
  • Integrate checking functionality into existing workflows
  • Build more powerful email management applications
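As a rough illustration of what an integration could look like, here is a minimal Python sketch. Note that the endpoint path (`/api/bulk-check`), the `emails` payload field, and the bearer-token auth scheme are all assumptions for illustration, not the documented contract — check https://ychecker.com/api for the actual reference before using anything like this.

```python
import json
from urllib import request

# Hypothetical endpoint -- the real path and auth scheme are documented
# at https://ychecker.com/api; everything below is an illustrative guess.
API_URL = "https://ychecker.com/api/bulk-check"  # assumed path

def build_bulk_payload(emails, batch_size=100):
    """Split an email list into batches and serialize each as a JSON body."""
    batches = [emails[i:i + batch_size] for i in range(0, len(emails), batch_size)]
    return [json.dumps({"emails": batch}) for batch in batches]  # "emails" field assumed

def check_batch(body, api_key):
    """Send one serialized batch to the (assumed) bulk-check endpoint."""
    req = request.Request(
        API_URL,
        data=body.encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",  # auth scheme assumed
        },
    )
    with request.urlopen(req) as resp:
        return json.loads(resp.read())

if __name__ == "__main__":
    # Build payloads locally without touching the network.
    payloads = build_bulk_payload(["a@gmail.com", "b@gmail.com"], batch_size=1)
    print(len(payloads))  # two batches of one email each
```

Batching the list client-side keeps individual requests small, which matters if you are checking hundreds of accounts in one workflow.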

How It Saves Time

Before Ychecker, I spent about 30 minutes checking 20 Gmail accounts. Now it takes only 2 minutes! That means:

  • Nearly 30 minutes saved each day
  • Roughly 14 hours saved each month

Not only does it save time, but Ychecker also helps detect issues with Gmail accounts early, allowing timely handling and avoiding work disruptions.

Conclusion

The story of Ychecker shows that sometimes, a solution to a personal problem can become a useful tool for the entire community. I created Ychecker to solve my own needs, but I'm very happy to see it can help many others.

If you're also managing multiple Gmail accounts, give Ychecker a try. I look forward to receiving your feedback to improve this tool even further.

Remember that time is our most valuable asset. Use tools like Ychecker to optimize your time, so you have more time for the important things in life.

Wishing you efficient work and enjoyment of every precious moment! 😊

]]>
<![CDATA[Milestone: Deployment of the SonjjAPI system]]>https://sonjj.com/trien-khai-he-thong-sonjj-api/67f2363c2b8743ef9f0a3e4dSat, 18 May 2024 07:41:00 GMTA proud step forward in my journey - SonjjAPI, the system's internal API, has officially gone live. Previously, I relied on RapidAPI to publish APIs, but now, with unwavering effort and dedication, I have completed a comprehensive and independent solution. Every aspect has been optimized and the system is ready to deliver an excellent experience to users. This is a milestone to be proud of, marking my growth and significant development in this field.

]]>
<![CDATA[One Punch Man]]>https://sonjj.com/one-punch-man/67f2363c2b8743ef9f0a3e43Mon, 01 Apr 2024 07:24:00 GMT

One Punch Man is a famous Japanese action-superhero comic series. The story revolves around the main character Saitama, a top superhero with absolute power who can defeat any enemy, no matter how fierce, with just one punch.
Despite possessing ultimate power, Saitama often feels bored: he is so strong that no one can challenge him. He longs to find a worthy opponent for a tense, dramatic battle, while also facing many challenges and misunderstandings from the people around him.
With humorous artwork, an eye-catching action style, and witty details, One Punch Man quickly became a manga-anime phenomenon, loved in Japan and many countries around the world. The work has received much praise for exploring the superhero theme in a fresh and interesting way.

]]>