LLM Functions
Easily create LLM tools and agents using plain Bash/JavaScript/Python functions.
About
LLM Functions
This project empowers you to effortlessly build powerful LLM tools and agents using familiar languages like Bash, JavaScript, and Python.
Forget complex integrations, harness the power of [function calling](https://platform.openai.com/docs/guides/function-calling) to connect your LLMs directly to custom code and unlock a world of possibilities. Execute system commands, process data, interact with APIs – the only limit is your imagination.
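To make the pattern concrete, here is a minimal sketch of what a Bash tool in this style looks like. The option name and output are invented for illustration; the `# @describe` / `# @option` comment-tags follow argc's convention, which this project uses to turn a plain function into a function-calling tool.

```sh
#!/usr/bin/env bash

# @describe Get the current weather for a location.
# @option --location! City name, e.g. "Paris" (required).

# argc reads the comment-tags above to generate the JSON schema the LLM
# sees, and exposes --location to the script as the variable $argc_location.
main() {
    # A real tool would call a weather API here; this sketch only echoes.
    echo "The weather in ${argc_location} is: sunny"
}
```

Real tools in the repository end with `eval "$(argc --argc-eval "$0" "$@")"`, which parses the command-line arguments and dispatches to `main`.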
Tools Showcase
Agents Showcase
Prerequisites
Make sure you have the required tools installed. At minimum you need `argc`, which every build and link step below uses; see the project README for the full list.
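A quick way to sanity-check that the prerequisites are on your `PATH` (only `argc` is certain from the steps below; extend the list to match the project README):

```sh
# Report any required command that is missing from PATH.
for cmd in argc; do
    command -v "$cmd" >/dev/null || echo "missing: $cmd"
done
```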
Getting Started with AIChat
Currently, AIChat is the only CLI tool that supports `llm-functions`. We look forward to more tools supporting `llm-functions`.
1. Clone the repository
```sh
git clone https://github.com/sigoden/llm-functions
cd llm-functions
```

2. Build tools and agents
#### I. Create a `./tools.txt` file with each tool filename on a new line.

```
get_current_weather.sh
execute_command.sh
#execute_py_code.py
```

Where is the `web_search` tool?
The `web_search` tool doesn't exist as a standalone script. Instead, you choose one of several web search tools and link it as `web_search`. To do so, follow these steps:
1. Choose a tool. Available tools include:
   - `web_search_cohere.sh`
   - `web_search_perplexity.sh`
   - `web_search_tavily.sh`
   - `web_search_vertexai.sh`
2. Link your choice. Use the `argc` command to link your chosen tool as `web_search`. For example, to use `web_search_perplexity.sh`:
```sh
$ argc link-web-search web_search_perplexity.sh
```
This command creates a symbolic link, making `web_search.sh` point to your selected `web_search_perplexity.sh` tool.
Now there is a `web_search.sh` ready to be added to your `./tools.txt`.
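To see what the linking step does on disk, here is a rough equivalent, sandboxed in a temporary directory (the real `argc link-web-search` manages the link inside the repository's tools directory for you):

```sh
cd "$(mktemp -d)"                      # sandbox for this sketch
touch web_search_perplexity.sh         # stand-in for the real tool script
ln -s web_search_perplexity.sh web_search.sh
readlink web_search.sh                 # prints: web_search_perplexity.sh
```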
#### II. Create a `./agents.txt` file with each agent name on a new line.

```
coder
todo
```

#### III. Build bin and functions.json

```sh
argc build
```

#### IV. Ensure that everything is ready (environment variables, Node/Python dependencies, mcp-bridge server)

```sh
argc check
```

3. Link LLM-functions and AIChat
AIChat expects LLM-functions to be placed in AIChat's `functions_dir` so that AIChat can use the tools and agents that LLM-functions provides.
You can symlink this repository directory to AIChat's functions_dir with:
```sh
ln -s "$(pwd)" "$(aichat --info | sed -n 's/^functions_dir\s\+//p')"
# OR
argc link-to-aichat
```

Alternatively, you can tell AIChat where the LLM-functions directory is by setting an environment variable:

```sh
export AICHAT_FUNCTIONS_DIR="$(pwd)"
```

4. Start using the functions
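For clarity, here is what that `sed` expression extracts from a sample `aichat --info` line (the path below is invented for illustration; GNU sed is assumed, since `\s` is a GNU extension):

```sh
# Strip the "functions_dir" label and surrounding whitespace,
# leaving only the configured path.
printf 'functions_dir  /home/alice/.config/aichat/functions\n' |
    sed -n 's/^functions_dir\s\+//p'
# prints: /home/alice/.config/aichat/functions
```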
Done! Now you can use the tools and agents with AIChat.