An analysis of why companies like Feishu, Google, Stripe, ElevenLabs, and NetEase Music are all releasing CLI tools in 2026, and how CLI has become the universal plugin for AI.
Why Everyone Is Building CLI Tools Overnight
Feishu, Google, Stripe, ElevenLabs, NetEase Music.
In recent months, a group of seemingly unrelated companies coincidentally did the same thing: release CLI tools.
Karpathy recently wrote an article documenting his entire process of building an app with AI.
He spent most of his time not writing code but jumping between browser tabs: configuring API keys, changing DNS records, filling in environment variables.
In his own words:
"Your service should have a CLI tool. Don't make developers visit, view, or click. Directly instruct and empower their AI."
It's 2026, why is everyone suddenly going back to "command line" — something that looks so retro?
What Exactly Is CLI
If you're not a programmer, CLI sounds very technical. The concept is actually very simple.
GUI (graphical user interface) means opening an app, seeing buttons and menus, and clicking around with a mouse.
CLI (command-line interface) means opening a black window, typing a line of text, and pressing enter. Job done.
An analogy: GUI is going to a restaurant, looking at the menu, and pointing while telling the waiter "I want this one".
CLI is shouting straight to the kitchen: "Kung Pao Chicken, less oil, extra spicy". Same result, but CLI is more precise and easier to automate.
Why are CLI and AI particularly suited? Because AI is "text in, text out". GUI is for eyes to see, AI has no eyes.
CLI is pure text, AI naturally operates in this world.
When AI wants to help you compress a video, it doesn't need to open Premiere and hunt for the export button. It runs one line, `ffmpeg -i input.mp4 -crf 28 output.mp4`, and it's done.
Humans haven't fallen back in love with command line. It's AI that originally lived in the command line.
Where Are AI's Capability Boundaries
Many people imagine AI as an omniscient, omnipotent brain.
A more accurate analogy: a very smart new employee who can learn anything, and fast, but who needs two things: tools and manuals.
Installed ffmpeg, AI can process video. Installed Feishu CLI, AI can check your schedule and send messages for you.
Installed Google Workspace CLI, AI can manage your email and cloud drive.
Not installed? "Sorry, can't do that."
ffmpeg: An open-source audio/video processing tool, almost the industry standard for video processing.
Run `brew install ffmpeg` to install it; AI can then help you compress video, convert formats, and cut clips.
AI's actual capability = tools it can call + context it gets.
Tools are easy to understand. What's "context"? Simply put, manuals.
For classic tools like ffmpeg, AI has seen massive amounts of usage in training data, doesn't need additional manuals.
But Feishu CLI just launched in 2026, AI's training data has none of it. Without giving it a manual, AI doesn't even know it exists.
So new-generation CLI tools all come with files called Skills — essentially Markdown-written manuals telling AI what this tool can do and how to use it.
There's a corollary worth noting: the newer the tool, the more AI relies on such explicit manuals.
Training data can never catch up with tool release speed. Manuals will only become more important.
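For concreteness, a Skills file is just Markdown. A minimal sketch of what one might look like for a hypothetical tool (the tool name and commands below are invented for illustration, not from any real CLI):

```markdown
# acme-cli — Skills

## What it does
Manages Acme project tasks from the command line.

## Commands
- `acme-cli tasks list --json` — list tasks as JSON
- `acme-cli tasks add "title" --due 2026-03-01` — create a task

## Notes for agents
- Always pass `--json`; the default output is formatted for humans.
- Run destructive commands with `--dry-run` first to preview the effect.
```

That's the whole trick: a plain-text manual the AI can load into context, sitting next to the binary it describes.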
CLI Is Being Reinvented
Past CLI and current CLI, though both called CLI, are already two different things.
Traditional CLI tools, like ffmpeg, jq, and curl, were built for programmers.
Their output is colorful text meant for human eyes, and when a choice is needed an interactive menu pops up: natural for a human, but AI freezes the moment it hits such a prompt.
New-generation CLI assumes from design that callers might be AI:
- All operations are passed as parameters up front; no menus pop up
- Output is JSON that AI can parse directly, and a Skills manual is bundled
- `--dry-run` is supported, letting AI preview what will happen before executing
- AI can also ask the tool "what commands do you have? what parameters do they take?" without reading the full documentation
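A minimal sketch in Python of a tool with these traits (the tool name, commands, and flags are all invented for illustration): it takes everything as flags, prints JSON, supports `--dry-run`, and can describe itself.

```python
import argparse
import json
import sys

def build_parser():
    p = argparse.ArgumentParser(prog="demo-cli",
                                description="hypothetical AI-friendly CLI sketch")
    p.add_argument("action", choices=["send", "describe"])
    p.add_argument("--to", help="recipient; everything is a flag, no interactive prompt")
    p.add_argument("--text", help="message body")
    p.add_argument("--json", action="store_true", dest="as_json",
                   help="machine-readable output for AI callers")
    p.add_argument("--dry-run", action="store_true",
                   help="preview the effect without executing")
    return p

def run(argv):
    args = build_parser().parse_args(argv)
    if args.action == "describe":
        # self-inspection: an agent asks what exists instead of reading docs
        return json.dumps({"actions": ["send", "describe"],
                           "flags": ["--to", "--text", "--json", "--dry-run"]})
    # with --dry-run, report what WOULD happen instead of doing it
    result = {"action": "send", "to": args.to, "text": args.text,
              "executed": not args.dry_run}
    return json.dumps(result) if args.as_json else f"sent to {args.to}"

if __name__ == "__main__":
    print(run(sys.argv[1:]))
```

Nothing here is clever; the point is that every decision a menu would ask for is a flag, so a text-only caller never gets stuck.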
Take Feishu CLI as an example: after installation there are 200+ commands covering 11 domains, including calendar, messages, documents, tasks, and email.
Say "check my schedule for tomorrow", and AI calls `lark-cli calendar +agenda`;
say "message Zhang San that the meeting has moved to 3 PM", and AI calls the corresponding message command. The whole flow happens without ever opening the Feishu app.
Google Workspace CLI is more extreme — one command starts an MCP service letting AI directly operate Gmail, Google Drive, Google Calendar through standard protocol.
MCP (Model Context Protocol): Standard communication protocol between AI and external services, explained in next section.
CLI Is Becoming AI's Universal Plugin
Here's a phenomenon I observed while building CodePilot, one that has rarely been discussed so far.
First explain three concepts in one sentence:
- MCP: a standard communication protocol between AI and external services. Think of it as the USB port of the AI world.
- Skills: manuals that tell AI "how to use this tool".
- Plugin: a packaged, installable extension combining tools, protocols, and manuals, similar to an app on your phone.
The AI community has been debating which of these three will become mainstream.
But look closely at what new-generation CLI is doing — you'll find they package all three.
Google Workspace CLI is typical: CLI commands provide execution capability, built-in MCP service provides standard communication protocol, included Skills files serve as usage manuals.
Feishu CLI, Stripe CLI, ElevenLabs CLI, NetEase Music CLI — all follow this pattern.
In other words, a CLI tool is essentially a Plugin.
But it has several advantages over platform plugins.
Claude Code's Plugin can only be used in Claude Code.
Feishu CLI, once installed, can be used in Claude Code, Cursor, Gemini CLI — not platform-locked.
This is especially important domestically because, for well-known reasons, users might connect to Claude today, switch to DeepSeek tomorrow, try Qwen the day after.
CLI doesn't care which model calls it; it's a model-agnostic execution layer.
Plugin marketplaces have review processes; a CLI tool goes live the moment it's published to npm, as freely as launching a website.
And humans can use them too: you can run the same commands in a terminal yourself, no AI required. That gives developers more motivation to build and maintain them.
npm: the developer's version of an App Store; a massive number of command-line tools are distributed through it. Run `npm install -g toolname` to install one.
CLI can also do something Plugins can't: composition. `gws gmail +triage | jq '.messages[]'` pipes two tools together; the previous output becomes the next input.
The shell pipeline, a decades-old design, suddenly becomes valuable again in the AI era. Plugins are isolated, with no standard way to combine them.
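The composition idea can be sketched without a shell: one process writes JSON to stdout, the next reads it from stdin. Below, two tiny stand-in "tools" are simulated with Python subprocesses (real ones would be `gws` and `jq`; the data is invented):

```python
import json
import subprocess
import sys

def pipeline():
    """Pipe a JSON-emitting 'tool' into a JSON-filtering 'tool', like `a | b`."""
    # stand-in for `gws gmail +triage`: emits one JSON envelope on stdout
    producer = [sys.executable, "-c",
                "import json; print(json.dumps({'messages': [{'id': 1}, {'id': 2}]}))"]
    # stand-in for `jq '.messages[]'`: reads stdin, prints each message on its own line
    consumer = [sys.executable, "-c",
                "import json, sys\n"
                "for m in json.load(sys.stdin)['messages']:\n"
                "    print(json.dumps(m))"]
    p1 = subprocess.Popen(producer, stdout=subprocess.PIPE)
    p2 = subprocess.Popen(consumer, stdin=p1.stdout,
                          stdout=subprocess.PIPE, text=True)
    p1.stdout.close()  # let p1 receive SIGPIPE if p2 exits early
    out, _ = p2.communicate()
    return out

if __name__ == "__main__":
    print(pipeline())
```

The contract is just "JSON on stdout, JSON on stdin", which is exactly why any two AI-friendly CLIs can be snapped together without either knowing the other exists.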
While everyone debates which will win between MCP, Skills, and Plugins, the answer might be that CLI packages all of them — and cross-platform.
But It's Not All Good
That's a long list of benefits; now for the problems.
CLI's biggest structural flaw is security.
Plugins run in platform sandboxes with declarative permission control.
CLI executes shell commands directly: once AI can run `gws`, it can do anything that identity is allowed to do.
There is no fine-grained "read-only, no write" control. For now, `--dry-run` and confirmation prompts serve as stopgaps, but the gap to platform-level permission frameworks is large.
Sandbox: An isolation mechanism, similar to mobile app permission popups — "Allow access to camera?" Programs can only do what they're permitted; problems don't affect other system parts.
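One coarse stopgap, short of a real permission framework, is for the host application to gate commands through an allowlist before they ever reach a shell. A minimal sketch (the policy contents are invented for illustration):

```python
import shlex

# hypothetical policy: which subcommands of which tools the agent may run
ALLOWED = {
    "gws": {"gmail", "calendar"},   # only these read-oriented domains
    "ffmpeg": None,                 # None = any arguments allowed
}

def is_permitted(command_line: str) -> bool:
    """Allow a command only if its tool (and policed subcommand) is allowlisted."""
    tokens = shlex.split(command_line)
    if not tokens:
        return False
    tool, rest = tokens[0], tokens[1:]
    if tool not in ALLOWED:
        return False
    policed = ALLOWED[tool]
    return policed is None or (bool(rest) and rest[0] in policed)
```

This is still far weaker than a sandbox: it checks intent at the string level, not effects at the system level. But it illustrates where per-command control has to live today, in the host app, not the tool.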
In actual use, I've also stepped on many pitfalls integrating various CLI tools into CodePilot.
First: the manual is too large, and it blows up AI's context.
Some tools ship very large Skills files. AI's context window is limited; a single file can eat a huge chunk of it, and reasoning quality degrades noticeably once it's loaded.
By comparison, Google Workspace CLI's Skills files average 1.6 KB, giving AI exactly the information it needs: no more, no less.
Second: Interactive prompts freeze AI.
Early versions of Stripe CLI popped up a "? Which environment?" selection menu: natural for a human, but AI froze completely. A later `--no-interactive` flag solved it.
Third: output so long it drowns the useful information.
A single query can return tens of thousands of characters of JSON, all poured into the context, burying the information that's actually needed.
Google Workspace CLI uses field masks to control response size; few tools follow this design.
Field masks: When calling API, specify only which fields to return, not dumping all data. Querying emails only needs subject and sender, not body — saves huge context.
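The field-mask idea is easy to sketch: the caller names the fields it needs, and everything else is dropped before it ever reaches the context window. A toy version in Python (the message data is invented):

```python
def apply_field_mask(record: dict, mask: set) -> dict:
    """Keep only the fields named in mask, dropping everything else."""
    return {key: value for key, value in record.items() if key in mask}

# a fat API response: the body alone could swamp an AI's context window
message = {
    "subject": "Q3 review",
    "from": "alice@example.com",
    "body": "tens of thousands of characters of quoted reply chains",
    "headers": {"x-mailer": "something-long"},
}

# the agent asked only "who sent what?", so that's all it gets
slim = apply_field_mask(message, {"subject", "from"})
```

Real field masks (as in Google's APIs) also handle nested paths like `payload.headers.subject`, but the principle is the same: shrink the payload at the source, not after it lands in context.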
These problems share one root:
"Designed for AI" and "validated with AI" are two different things. It's like mobile adaptation: the responsive layout looks fine in the design tool, but on a real device the buttons can't be tapped.
My Experiment: Letting AI Manage Its Own Tools
When building CodePilot's CLI management feature, I experienced a mindset shift.
At first I took the traditional software approach:
write code to detect what's installed on the user's system, build a UI where users manage their tools, write logic to detect updates. The standard playbook.
Later realized: I already have AI in my product, why bypass it?
For tool installation, just start a conversation and let AI do it.
AI reads `--help`, figures out the operating system, handles permission errors, and walks the user through auth configuration.
Installation fails? It reads the error message and judges: does this need sudo? Should a dependency be installed first?
Tool registration works the same way: give AI a prompt template, have it read `--help`, and it automatically generates a structured description of what the tool can do, how to use it, and its typical scenarios.
sudo: a command on macOS/Linux that temporarily grants system administrator privileges. Installing some tools requires writing to system directories, which normal permissions can't do, so you prefix the command with sudo.
The idea is simple: don't build software to help users manage AI's tools; let AI manage its own tools.
Every tool's situation is different, and hard-coding installation logic is a never-ending job.
But AI can handle open-ended problems like that. Turning tool installation into an AI task is more reliable than writing an installer that covers every edge case.
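The registration step can be sketched: run the tool's `--help`, capture the text, and hand it to a model to summarize into capabilities and scenarios. The sketch below does only the capture half, and registers the Python interpreter itself as a stand-in "tool" so it runs anywhere; the summarization prompt is left out:

```python
import subprocess
import sys

def register_tool(executable: str) -> dict:
    """Capture a tool's --help text and wrap it into a structured record.
    A model would then turn help_excerpt into a description of what the
    tool can do and when to use it; here we just store the raw text."""
    proc = subprocess.run([executable, "--help"],
                          capture_output=True, text=True, timeout=10)
    help_text = proc.stdout or proc.stderr  # some tools print help to stderr
    return {
        "name": executable,
        "help_excerpt": help_text[:500],  # keep the context cost bounded
        "exit_code": proc.returncode,
    }

record = register_tool(sys.executable)  # demo: register Python itself
```

Note the truncation: even at registration time, context size is a budget, which is the same lesson as the oversized Skills files above.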
I also put together a five-dimension Agent-compatibility score: is the tool designed for AI, does it support structured output, self-inspection, and previews, and does it keep an eye on context size.
Honestly, the score is more of a call to action: I hope tool developers start asking themselves, "is my CLI AI-friendly?"
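For what it's worth, the five dimensions reduce to a checklist. A toy scoring function (the field names are my own shorthand, not any standard):

```python
# the five dimensions from the text, as checklist keys
DIMENSIONS = [
    "designed_for_ai",     # AI assumed as a caller from day one
    "structured_output",   # JSON out, not colored text
    "self_inspection",     # can describe its own commands and parameters
    "preview_support",     # --dry-run or equivalent
    "context_awareness",   # bounded output and small Skills files
]

def agent_compat_score(tool: dict) -> int:
    """Score a tool 0-5: one point per dimension it satisfies."""
    return sum(1 for dim in DIMENSIONS if tool.get(dim, False))

# hypothetical assessment of an imaginary tool
example = {"designed_for_ai": True, "structured_output": True,
           "self_inspection": True, "preview_support": True,
           "context_awareness": False}
```

A traditional tool like ffmpeg would score low here, and that's fine; the checklist is aimed at tools being designed now.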
What's Still Missing
CLI is becoming infrastructure for AI capability extension, but there are obvious gaps.
How do you find out that "there's a Feishu CLI that lets AI operate Feishu"?
Right now, basically word of mouth. npm and GitHub are best positioned to become the App Store for AI tools, but they have little incentive to. Discovery is a blank space.
Authentication is also a problem.
Feishu has one login flow, Google another, Stripe another: five tools means five logins. Too much friction for an average user.
The installation experience is also unreliable.
npm and brew were designed over a decade ago, on the assumption that the user is a command-line-savvy developer.
When the operator becomes AI, permission issues, missing dependencies, and path conflicts (the kind of problems a developer would just "check Stack Overflow" to solve) become real obstacles.
Homebrew (brew): Most popular package manager on macOS, specifically for installing command-line tools.
`brew install ffmpeg` installs ffmpeg, `brew install jq` installs jq; it's that simple.
Industry isn't short of tools, protocols, or manuals.
What's missing is the infrastructure layer enabling these three to be discovered, installed, and trusted.
Whoever builds this becomes npm of the AI era.
So, Why Is Everyone Building CLI
Because everyone has realized that CLI may currently be the most efficient way to distribute capabilities to AI.
A CLI tool simultaneously contains execution capability, communication protocol, and usage manual — it's a complete AI plugin.
Cross-platform, no review, usable by both humans and AI.
Every additional useful CLI tool installed gives your AI one more skill. Every reduction in unnecessary context noise makes your AI a bit smarter.
We're in a chaotic transition era between old and new.
Old formats, old data barriers, old package managers intertwined with new AI-native toolchains.
CLI isn't the only answer, but it's currently the most pragmatic one.