Conversation

@Amber-Williams

I noticed the commands returned by the LLM often didn't match what I needed on my system, so I've been augmenting the prompt with a shell function like the one below, with better results. Building this into llm-cmd would improve the plugin for others too.

ask_cmd() {
    # Keep the prompt local so it doesn't leak into the shell environment.
    # $MY_SYSTEM_CONTEXT is expected to hold the output of `uname -a`.
    local SYSTEM_PROMPT="Return only the command to be executed as a raw string, no string delimiters wrapping it, no yapping, no markdown, no fenced code blocks, what you return will be passed to subprocess.check_output() directly.

    The user's system output from uname -a is:
    $MY_SYSTEM_CONTEXT

    For example, if the user asks: undo last git commit
    You return only: git reset --soft HEAD~1"

    llm cmd --system "$SYSTEM_PROMPT" "$*"
}
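For completeness, a minimal sketch of how `MY_SYSTEM_CONTEXT` might be populated before calling `ask_cmd` (an assumption; the variable name and setup are from my own config, not part of llm-cmd itself):

```shell
# Capture the system description once per shell session, e.g. in
# ~/.bashrc or ~/.zshrc; ask_cmd interpolates this into its system prompt.
MY_SYSTEM_CONTEXT="$(uname -a)"
printf '%s\n' "$MY_SYSTEM_CONTEXT"
```

After that, `ask_cmd undo last git commit` works like `llm cmd` but with the system context baked in.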

Warning: I wasn't able to run the tests.

@ohEmily

ohEmily commented Apr 22, 2025

Could #18 be why you couldn't run the tests? pytest worked for me from a virtualenv once I added that line.
