Commit

Merge pull request #1 from yusufcanb/release/1.0-rc1

Release 1.0-rc1

yusufcanb authored Feb 24, 2024
2 parents d365231 + 7fbeade commit d08bb98
Showing 50 changed files with 1,115 additions and 1,108 deletions.
26 changes: 11 additions & 15 deletions .github/workflows/build.yaml
Original file line number Diff line number Diff line change
@@ -6,30 +6,26 @@ jobs:
build:

runs-on: ubuntu-latest

steps:
- uses: actions/checkout@v4

- name: Setup Go 1.20
- name: Setup Go 1.21
uses: actions/setup-go@v4
with:
go-version: 1.20
go-version: "1.21"

- name: Display Go version
run: go version

- name: Install dependencies
run: |
go get ./...
env:
GOPATH: /opt/hostedtoolcache/go
run: go install github.com/yusufcanb/tlm

- name: Build
run: go build -v cmd/cli.go
env:
GOPATH: /opt/hostedtoolcache/go

- name: Test with the Go CLI
run: go test
env:
GOPATH: /opt/hostedtoolcache/go
run: bash build.sh $(cat VERSION)

- name: Archive artifacts
uses: actions/upload-artifact@v4
with:
name: dist
path: |
dist
47 changes: 41 additions & 6 deletions README.md
@@ -1,7 +1,13 @@
# tlm - Your terminal companion, locally powered by CodeLLaMa.
# tlm - Local terminal companion, powered by CodeLLaMa.

tlm is your CLI companion which requires nothing but your workstation. It uses the most efficient and powerful [CodeLLaMa](https://ai.meta.com/blog/code-llama-large-language-model-coding/) in your local environment to provide you with the best possible command line suggestions.

![](./assets/suggest.gif)

![](./assets/explain.gif)

![](./assets/config.gif)

## Features

- 💸 No API Key (Subscription) is required. (ChatGPT, Github Copilot, Azure OpenAI, etc.)
@@ -15,7 +21,32 @@ tlm is your CLI companion which requires nothing then your workstation. It uses
- 🚀 One liner generation and command explanation.


![](./assets/tlm-in-action.png)
## Usage

```
$ tlm help
NAME:
tlm - local terminal companion powered by CodeLLaMa.
USAGE:
tlm [global options] command [command options]
VERSION:
1.0-rc1
COMMANDS:
suggest, s suggest a command.
explain, e explain a command.
install, i deploy CodeLLaMa to your system.
config, c configure preferences.
version, v print version.
GLOBAL OPTIONS:
--help, -h show help
--version, -v print the version
```

## Usage

@@ -53,7 +84,7 @@ GLOBAL OPTIONS:
Download latest release;

```bash
curl -fsSL -o tlm https://github.com/yusufcanb/tlm/releases/download/1.0-rc0/tlama_1.0-rc0_linux_amd64
curl -fsSL -o tlm https://github.com/yusufcanb/tlm/releases/download/1.0-rc1/tlm_1.0-rc1_linux_amd64
```

Make it executable;
@@ -68,7 +99,9 @@ Move it to your `$PATH`;
sudo mv tlm /usr/local/bin
```

⚠️ If you already have CodeLLaMa on your system, you can just use the following command to configure it;
> [!TIP]
> If you already have CodeLLaMa on your system, you can just use the following command to configure it;
```
tlm config set llm.host <codellama_host>
```
@@ -85,10 +118,12 @@ Finally, follow the instructions to install CodeLLaMa. This will install CodeLLa
Download latest release;

```powershell
Invoke-WebRequest -Uri "https://github.com/yusufcanb/tlm/releases/download/1.0-alpha.0/tlama_1.0-alpha.0_windows_amd64.exe" -OutFile "tlm.exe"
Invoke-WebRequest -Uri "https://github.com/yusufcanb/tlm/releases/download/1.0-rc1/tlama_1.0-rc1_windows_amd64.exe" -OutFile "tlm.exe"
```

⚠️ If you already have CodeLLaMa on your system, you can just use the following command to configure it;
> [!TIP]
> If you already have CodeLLaMa on your system, you can just use the following command to configure it;
```
.\tlm.exe config set llm.host <codellama_host>
```
1 change: 1 addition & 0 deletions VERSION
@@ -0,0 +1 @@
1.0-rc1
8 changes: 8 additions & 0 deletions app/Modelfile.explain
@@ -0,0 +1,8 @@
FROM codellama:7b

PARAMETER temperature 0.25
PARAMETER top_p 0.2
PARAMETER top_k 25
PARAMETER seed 42

SYSTEM You are a command line application which helps the user get brief explanations for shell commands. You will explain the given executable shell command with the shortest possible explanation. If the given input is not a shell command, you will respond with "I can only explain shell commands. Please provide a shell command to explain". You will never respond to any question outside the context of shell command explanation.
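This Modelfile follows Ollama's Modelfile format; the PR's `install` command presumably registers it with a local Ollama instance. Done by hand, that step would look roughly like this (the model name `tlm-explain` is illustrative, not tlm's actual naming, and a running Ollama server with the `codellama:7b` base model pulled is assumed):

```shell
# Register the explain variant with the local Ollama server (assumed running).
ollama create tlm-explain -f app/Modelfile.explain

# Ask the resulting model for a brief explanation of a shell command.
ollama run tlm-explain "tar -xzf archive.tar.gz -C /tmp"
```

The low `temperature` and `top_p` values above bias the model toward short, deterministic explanations, and the fixed `seed` makes repeated runs reproducible.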
8 changes: 8 additions & 0 deletions app/Modelfile.suggest
@@ -0,0 +1,8 @@
FROM codellama:7b

PARAMETER temperature 0.1
PARAMETER top_p 0.5
PARAMETER top_k 40
PARAMETER seed 1

SYSTEM You are a software program specifically for Command Line Interface usage. The user will ask you something that can be converted to a UNIX or Windows command. You won't provide information or explanations, and your output will be just an executable shell command inside three backticks.
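The suggest prompt instructs the model to return the command inside three backticks, which means the response must be unwrapped before it can be offered to the user. tlm's actual parsing code is not shown in this excerpt, so the following is a hypothetical Go sketch (`extractCommand` is an illustrative name, not tlm's API):

````go
package main

import (
	"fmt"
	"regexp"
	"strings"
)

// extractCommand unwraps a model response that carries the command inside
// a triple-backtick fence (optionally tagged with a language), falling
// back to the trimmed raw response when no fence is present.
func extractCommand(response string) string {
	re := regexp.MustCompile("(?s)```(?:[a-zA-Z]+)?\\s*(.*?)```")
	if m := re.FindStringSubmatch(response); m != nil {
		return strings.TrimSpace(m[1])
	}
	return strings.TrimSpace(response)
}

func main() {
	fmt.Println(extractCommand("```bash\nfind . -name '*.go'\n```"))
	// prints: find . -name '*.go'
}
````

The optional `(?:[a-zA-Z]+)?` group discards a language tag like `bash` after the opening fence, and the fallback path keeps the tool usable even when the model ignores the fencing instruction.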
64 changes: 64 additions & 0 deletions app/app.go
@@ -0,0 +1,64 @@
package app

import (
_ "embed"
ollama "github.com/jmorganca/ollama/api"
"github.com/yusufcanb/tlm/config"
"github.com/yusufcanb/tlm/explain"
"github.com/yusufcanb/tlm/install"
"github.com/yusufcanb/tlm/suggest"

"github.com/urfave/cli/v2"
)

//go:embed Modelfile.explain
var explainModelfile string

//go:embed Modelfile.suggest
var suggestModelfile string

type TlmApp struct {
App *cli.App

explainModelfile string
suggestModelfile string
}

func New(version string) *TlmApp {
con := config.New()
con.LoadOrCreateConfig()

o, _ := ollama.ClientFromEnvironment()
sug := suggest.New(o, suggestModelfile)
exp := explain.New(o, explainModelfile)
ins := install.New(o, suggestModelfile, explainModelfile)

cliApp := &cli.App{
Name: "tlm",
Usage: "local terminal companion powered by CodeLLaMa.",
Version: version,
HideHelpCommand: true,
Action: func(c *cli.Context) error {
return cli.ShowAppHelp(c)
},
Commands: []*cli.Command{
sug.Command(),
exp.Command(),
ins.Command(),
con.Command(),
&cli.Command{
Name: "version",
Aliases: []string{"v"},
Usage: "print version.",
Action: func(c *cli.Context) error {
cli.ShowVersion(c)
return nil
},
},
},
}

return &TlmApp{
App: cliApp,
}
}
Binary file added assets/config.gif
Binary file added assets/explain.gif
Binary file added assets/suggest.gif
41 changes: 41 additions & 0 deletions assets/tapes/config.tape
@@ -0,0 +1,41 @@
Output config.gif

Set Theme "GitHub Dark"
Set Margin 60
Set MarginFill "#4e8eff"

Set Width 1400
Set Height 1000
Set FontSize 26

Type "tlm config"
Sleep 1s
Enter
Sleep 1s

# host
Sleep 500ms
Enter
Sleep 500ms

# shell
Up
Sleep 500ms
Enter
Sleep 500ms

# suggest
Sleep 500ms
Up
Sleep 500ms
Enter
Sleep 500ms

# explain
Sleep 500ms
Down
Sleep 500ms
Enter
Sleep 500ms

Sleep 4s
14 changes: 14 additions & 0 deletions assets/tapes/explain.tape
@@ -0,0 +1,14 @@
Output explain.gif

Set Theme "GitHub Dark"
Set Margin 60
Set MarginFill "#4e8eff"

Set Width 1401
Set Height 1000
Set FontSize 26

Type "tlm explain 'wmic path win32_VideoController get name,status,adapterram'"
Sleep 250ms
Enter
Sleep 35
25 changes: 25 additions & 0 deletions assets/tapes/suggest.tape
@@ -0,0 +1,25 @@
Output suggest.gif

Set Theme "GitHub Dark"
Set Margin 60
Set MarginFill "#4e8eff"

Set Width 1401
Set Height 1000
Set FontSize 26

Type "tlm suggest 'list all network interfaces but only their ip addresses'"
Sleep 250ms
Enter
Sleep 8s

Enter
Sleep 500ms

Up
Sleep 250ms
Up
Sleep 250ms
Enter

Sleep 8s
2 changes: 1 addition & 1 deletion build.sh
@@ -19,7 +19,7 @@ build() {
fi

echo "Building for $os/$arch (version: $version) -> $output_name"
CGO_ENABLED=0 GOOS=$os GOARCH=$arch go build -o "dist/${version}/${output_name}" "cmd/cli.go"
CGO_ENABLED=0 GOOS=$os GOARCH=$arch go build -o "dist/${version}/${output_name}" main.go
}

# Replace this with the name of your main Go file (package)
17 changes: 0 additions & 17 deletions cmd/cli.go

This file was deleted.

58 changes: 58 additions & 0 deletions config/api.go
@@ -0,0 +1,58 @@
package config

import (
"fmt"
"github.com/spf13/viper"
"github.com/yusufcanb/tlm/shell"
"log"
"os"
"path"
)

var defaultLLMHost = "http://localhost:11434"

func isExists(path string) bool {
if _, err := os.Stat(path); os.IsNotExist(err) {
return false
}
return true
}

func (c *Config) LoadOrCreateConfig() {
viper.SetConfigName(".tlm")
viper.SetConfigType("yaml")
viper.AddConfigPath("$HOME")

homeDir, err := os.UserHomeDir()
if err != nil {
log.Fatal(err)
}

configPath := path.Join(homeDir, ".tlm.yaml")
if !isExists(configPath) {
viper.Set("shell", shell.GetShell())

viper.Set("llm.host", defaultLLMHost)
viper.Set("llm.suggestion", "balanced")
viper.Set("llm.explain", "balanced")

err := os.Setenv("OLLAMA_HOST", defaultLLMHost)
if err != nil {
fmt.Printf(shell.Err()+" error writing config file, %s", err)
}

if err := viper.WriteConfigAs(path.Join(homeDir, ".tlm.yaml")); err != nil {
fmt.Printf(shell.Err()+" error writing config file, %s", err)
}
}

err = viper.ReadInConfig()
if err != nil {
log.Fatalf("Error reading config file, %s", err)
}

err = os.Setenv("OLLAMA_HOST", viper.GetString("llm.host"))
if err != nil {
fmt.Printf(shell.Err()+" %s", err)
}
}
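Given the defaults set above, the `~/.tlm.yaml` that Viper writes out on first run would look roughly like this (the `shell` value depends on what `shell.GetShell()` detects on the workstation, so `bash` here is only an example):

```yaml
shell: bash
llm:
  host: http://localhost:11434
  suggestion: balanced
  explain: balanced
```

Viper expands the dotted keys (`llm.host`, `llm.suggestion`, `llm.explain`) into the nested `llm:` mapping shown here.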
