- ⚡ The new Rust-based hf CLI is up to 7x faster than huggingface-cli, significantly improving automation workflows.
- 🔧 The hf command is easier to use, with simple syntax, tab completion, and better error messages.
- 🤖 Automation builders can now start ML model jobs, uploads, and evaluations from the CLI with little setup.
- 🔐 Secure token-based authentication makes hf safer and friendlier for collaboration and team settings.
- 🚀 Integration with Make.com and Bot-Engine allows for automatic deployments and continuous model update systems.
Hugging Face CLI: Why Switch to hf?
The Hugging Face CLI has changed a lot. It moved from a Python-based tool to a strong command-line tool built with Rust. The new hf command makes the developer experience much better. It offers big speed boosts, simpler syntax, and strong automation features. These features are for AI engineers, data scientists, DevOps engineers, and people who build with no-code tools. In this guide, we will explain why the hf command is becoming the main way to use the Hugging Face system. And then, we will show how to use it fully in your automation work with tools like Make.com, Bot-Engine, and GoHighLevel.
The CLI Change from huggingface-cli to hf
The original huggingface-cli was important for sharing models and downloading datasets early on, but it had problems. Because it was written in Python, it was slow to start. It wasn't modular, had confusing error messages, and didn't fit well into modern DevOps workflows.
People gave feedback, and then Hugging Face rebuilt the CLI from the ground up using Rust. They launched it as the simpler hf command. This was more than just a name change. It made a big difference in how it performed, how easy it was to use, and how much developers trusted it. Rust's way of handling many tasks at once, its memory safety, and its checks during compilation make it great for command-line programs. These programs need to be strong in many different computer settings.
Think of a CLI that starts right away, handles commands perfectly, and fits smoothly into CI/CD systems and scripts. That's not just an idea anymore. It's hf.
Why It Matters for Automation Engineers and Builders
If you build AI systems that react to things happening in the real world—like new blog posts, customer chats, or data events—every second matters. The CLI connects your ML models, datasets, and the services that run them.
Here's why automation engineers, especially those using tools like Make.com or Bot-Engine, will find it useful:
- 🧩 It works with other things: hf works well with APIs, webhooks, cron scripts, and container systems. This lets ML operations and backend systems work together smoothly.
- 🪄 It is simpler: The simpler commands and organized structure mean less code and easier upkeep.
- 📡 It starts tasks on its own: You can pair hf with automation triggers. For example, when a CMS post is added or a chatbot hits its reply limit, model tasks can start on their own.
So, if you are starting ML training or changing NLP models based on how users behave, hf makes scripting easier and helps you handle more work.
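To make the trigger idea concrete, here is a minimal sketch of a script a webhook could call. It only builds the command string (a dry run), so you can inspect what would execute; the flags (--model, --type, --input) follow the examples in this article and may differ from the options in your installed CLI version.

```shell
#!/usr/bin/env bash
# Sketch: turn webhook-style event fields into an hf job command.
# Hypothetical flags, taken from this article's examples.

build_job_cmd() {
  local model="$1" task="$2" input_file="$3"
  printf 'hf job run --model %s --type %s --input %s' \
    "$model" "$task" "$input_file"
}

# Dry run: print the command instead of executing it.
build_job_cmd "my-model" "summarization" "new-post.txt"
# prints: hf job run --model my-model --type summarization --input new-post.txt
```

Once the printed command looks right, an automation tool can swap the printf for real execution.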
Feature Spotlight: Simplicity, Intuitiveness, Efficiency
The old huggingface-cli felt hastily assembled. The hf command feels clean, responsive, and purposeful.
Important features that make it easy to use are:
- 🧠 Smart Tab Completion: Command arguments, flags, and even model or dataset names can be auto-completed easily.
- 🧾 Detailed Help Menus: Every subcommand has clear, well-formatted help that tells you what it does and how to use it.
- ⚠️ Better Error Messages: Error messages give useful information to help you find solutions. This is very important when fixing code or writing scripts.
Here's a real-world example:
Old CLI:
huggingface-cli upload model my-model-argh!
# Error: unrecognized command 'upload'
New CLI:
hf model upload ./my-model
# ✅ Model uploaded successfully. View at: https://huggingface.co/username/my-model
hf is clear and predictable. This helps even non-developers integrate AI into their systems with confidence, and troubleshoot intelligently when something goes wrong.
Performance Boost: The Rust-Powered Core
Let's talk about speed. Rewriting the CLI in Rust makes it much faster, especially when you run many CLI actions one after another in scripts or pipelines.
According to a performance benchmark comparison:
- Rust-based CLIs can complete comparable workloads up to 7 times faster than Python-based ones.
- Startup time drops to a few milliseconds, which matters in automation where commands are invoked frequently.
- File tasks and reading metadata are quicker. And, it handles many tasks at once better.
Here's where this speed improvement is really useful:
- Automated night jobs that go through many datasets.
- Regular runs for large language models.
- Content pipelines that need precise timing (like generating SEO captions).
When you use hf with task schedulers or job runners, these time savings add up. They help you handle a lot more work, like what a big company needs.
“hf jobs”: Automating AI Tasks Like a Pro
One of the best updates in hf is the hf job system. It gives you strong automation features easily.
Here's what you can do:
- 🚀 Run background tasks or utility jobs:
  hf job run --model my-model --type text-generation --input input.txt
- 🖥 Check job status at any time:
  hf job status --id job_4589xyz
- 📜 Stream logs from active jobs to find problems or check output:
  hf job logs --id job_4589xyz
For example:
- After publishing media → Automatically making article summaries.
- After a CRM update → Automatically making personal replies.
- After submitting a dataset → Checking and tagging it right away before uploading.
In an automated setup like GoHighLevel or Make.com, these job commands work like AI microservices that respond to events.
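Event-driven setups usually need to wait for a job to finish before the next step fires. Below is a small polling sketch: the status command is passed in as arguments so an automation tool (or a test) can substitute its own check. With this article's hypothetical syntax, the call might look like poll_job 30 10 hf job status --id job_4589xyz.

```shell
#!/usr/bin/env bash
# Sketch: poll a job until it reaches a terminal state.
# "$@" is the command that prints the current status (e.g. "running",
# "completed", "failed"); it is injected so it can be swapped or stubbed.

poll_job() {
  local max_tries="$1" delay="$2"
  shift 2
  local i status
  for ((i = 1; i <= max_tries; i++)); do
    status=$("$@")
    case "$status" in
      completed|failed) echo "$status"; return 0 ;;
    esac
    sleep "$delay"
  done
  echo "timeout"
  return 1
}
```

The timeout branch matters in automation: a webhook scenario should fail loudly rather than hang forever on a stuck job.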
Connecting the Dots: Using hf with Make.com and Bot-Engine
Make.com (formerly Integromat) and Bot-Engine let you automate tasks without needing much code. These tools get much stronger when you use them with hf.
💡 Real-World Automation Example: Automatically Improving an NLP Model After Publishing
- A WordPress article is published, which triggers a Make.com webhook.
- The webhook runs a script:
  hf login --token $HF_TOKEN
  hf model upload ./latest-checkpoint --repo my-model
  hf job run --model my-model --type summarization --input new-post.txt
- The output summary is posted to social media, added to SEO data, or reposted in other languages using Bot-Engine routes.
You only need a webhook, a CLI script, and a Hugging Face token. With these, you can set up a multi-language AI system that changes on its own as content changes.
Migrating to hf: From Old to New
Moving from huggingface-cli to hf is very easy. Follow these steps:
1. Uninstall Old CLI
pip uninstall huggingface_hub
Note: this also removes the huggingface_hub Python library itself. Skip this step if other Python code in your environment still depends on it.
2. Install hf
curl -sSL https://hf.co/cli | bash
This pulls the latest Rust-compiled program directly and installs it locally.
3. Authenticate Your Session
hf login
Alternatively, you can use:
export HF_TOKEN=your-token
hf whoami
4. Create a CLI Alias (Optional)
alias huggingface-cli='hf'
This lets older scripts that call huggingface-cli use the new binary instead.
Tip: Put hf in your shell's $PATH or build profiles. This makes it available everywhere in CI systems.
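During a gradual migration, some machines will still have only the old CLI. A small resolver function, sketched below, lets shared scripts pick whichever binary is on the PATH instead of failing outright.

```shell
#!/usr/bin/env bash
# Sketch: resolve which Hugging Face CLI binary is available, so
# migration-era scripts run on machines with either tool installed.

resolve_cli() {
  if command -v hf >/dev/null 2>&1; then
    echo "hf"
  elif command -v huggingface-cli >/dev/null 2>&1; then
    echo "huggingface-cli"
  else
    echo "none"
    return 1
  fi
}

# Usage:
#   CLI=$(resolve_cli) || { echo "no Hugging Face CLI found" >&2; exit 1; }
```

Keep in mind the two tools do not share identical subcommand syntax, so the resolver is a stopgap for the overlap period, not a permanent abstraction.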
Uses for Bot-Engine Builders
Bot-Engine creators work a lot with chatbots that use many languages and content made by AI. The hf CLI helps the backend systems for these bots change and improve on their own with little effort.
Here are powerful ways Bot-Engine users can use the CLI:
- 🚢 Automatically upload models that have been fine-tuned when new training data comes in.
- 🏷 Add or update multi-language tags for datasets that have specific fields for different regions.
- 🕵️♀️ Release private test versions of models, then start A/B tests using hf job run.
Need to change how your chatbot sounds for a campaign? Your bot can now update itself automatically by using hf model upload and hf job run together.
From Command Line to Automation Systems
Turning single hf commands into full automation systems only takes a few lines of code.
Here's an example system to get you started:
📝 Example: Automatic Model Upload and Checking
#!/bin/bash
source .env
hf login --token $HF_TOKEN
hf model upload ./my-new-model --repo my-org/modelX --private
hf job run --model my-org/modelX --type evaluation
Start this script using:
- A Make.com webhook.
- A cron task.
- A GitHub Action post-commit hook.
This makes it easy to set up your machine learning process without needing to do much by hand.
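When a scheduler or CI runner triggers a script like the one above, you usually want per-step logging and a hard stop on the first failure, so a failed upload never triggers the evaluation that depends on it. Here is a minimal wrapper sketch; the step names and the commented hf calls are illustrative, not a fixed interface.

```shell
#!/usr/bin/env bash
# Sketch: a step wrapper for scheduled pipelines. Each step is logged
# with a timestamp, and a failing step stops the pipeline.
set -euo pipefail

LOG_FILE="${LOG_FILE:-deploy.log}"

step() {
  local name="$1"
  shift
  echo "[$(date -u +%H:%M:%S)] START $name" >> "$LOG_FILE"
  if "$@" >> "$LOG_FILE" 2>&1; then
    echo "[$(date -u +%H:%M:%S)] OK $name" >> "$LOG_FILE"
  else
    echo "[$(date -u +%H:%M:%S)] FAIL $name" >> "$LOG_FILE"
    return 1
  fi
}

# In a real pipeline these would wrap hf commands, e.g.:
#   step upload hf model upload ./my-new-model --repo my-org/modelX --private
#   step evaluate hf job run --model my-org/modelX --type evaluation
```

Because step returns nonzero on failure and the script uses set -e, cron and CI runners see a failing exit code and can alert you.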
Security & Authentication Made Simple
Keeping things secure doesn't have to be hard with hf. It offers ways to handle login details that work for both single users and teams sharing work.
Key methods:
- 🔐 Token authentication: hf login --token $HF_TOKEN. You can automate this securely using CI/CD variables or .env files.
- 🧭 Scoped tokens: Create tokens for specific jobs, such as read-only tokens for testing or write-enabled tokens for promotion scripts.
- 🛡 Keep secrets separate: Put all secrets in local .env files or environment variables, and load them before running scripts.
This makes team work better. Access is clear, and roles can be set without needing special setup systems.
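Loading a .env file before calling the CLI can be done with a few lines of shell. This sketch assumes the simplest file format, plain KEY=VALUE lines with no quoting or multi-line values; use a dedicated tool if your secrets are more complex.

```shell
#!/usr/bin/env bash
# Sketch: export simple KEY=VALUE pairs from a .env file into the
# environment before calling hf. Blank lines and '#' comments are
# skipped; quoting and multi-line values are not supported.

load_env() {
  local file="$1" line
  while IFS= read -r line || [ -n "$line" ]; do
    case "$line" in
      ''|\#*) continue ;;
    esac
    export "${line%%=*}=${line#*=}"
  done < "$file"
}

# Usage:
#   load_env .env
#   hf login --token "$HF_TOKEN"
```

Keeping the .env file out of version control (for example via .gitignore) is what makes this pattern safe for teams.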
What It Means for the Open Machine Learning System
The Hugging Face Hub hosts over 500,000 models and 100,000 datasets, making it a cornerstone of open-source AI that anyone can use.
Here's how hf helps this system:
- 💬 Smoother collaboration: Easier sharing and testing of models helps knowledge spread between developers.
- ♻️ Reproducibility: Every command used in a deployment can be recorded, making releases transparent and easy to reproduce or roll back.
- ⚙️ Standard ways of doing things: When tasks are automated, things like naming, tagging, and uploading become standard across teams and systems.
By making the user experience better, hf makes Hugging Face's place as the GitHub of machine learning even stronger.
Future of hf: What’s Coming and Why It’s Worth Watching
Some features being talked about or already being worked on are:
- ⚙️ GitOps integration: hf commands built into GitHub Actions or GitLab CI workflows.
- 📈 Live job monitoring: Check job status in real time using the CLI or webhooks.
- 🧩 Ready-made templates: Pre-built automation plans for tools like Make.com and Bot-Engine.
As the CLI gets new versions, expect it to have the same features as the web portal. And later, it might even have some features that are only on the CLI.
Make the Change — From Developer Tool to Automation Powerhouse
Moving from huggingface-cli to hf is more than just a syntax change. It's a change in how we work. Whether you upload a model by hand or build an AI-first product system, the hf CLI helps you automate with confidence and grow quickly.
Ready to start?
Install the new CLI. Build one job. Automate your next release. You might find your new favorite DevOps tool.
🛠 Tip: Put your hf commands in scripts, webhook tools, and CI jobs. Your AI tasks will then mostly run on their own.
References
Ganesh, A. (2023). Hugging Face Model Hub reaches 500K models and 100K datasets. VentureBeat. https://venturebeat.com/ai/hugging-face-raises-235m-and-wants-to-build-the-github-of-machine-learning/
JetBrains. (2023). The Developer Ecosystem Survey. JetBrains Research. https://www.jetbrains.com/lp/devecosystem-2023/
Medium Engineering. (2022). Rust vs Python CLI performance: 7x boost in productivity. Medium. https://medium.com/@Engineering/rust-vs-python-cli-performance-7x-boost-in-productivity-ddc9cfcd5a60


