Groq::Model.load_models
drnic committed Apr 20, 2024
1 parent 9530a1c commit 847868e
Showing 5 changed files with 108 additions and 2 deletions.
4 changes: 2 additions & 2 deletions README.md
@@ -4,7 +4,7 @@ Groq Cloud runs LLM models fast and cheap. Llama 3, Mixtral, Gemma, and more at

[![speed-pricing](docs/images/groq-speed-price-20240421.png)](https://wow.groq.com/)

- Speed and pricing at 2024-04-21.
+ Speed and pricing at 2024-04-21. Also see their [changelog](https://console.groq.com/docs/changelog) for new models and features.

## Groq Cloud API

@@ -185,7 +185,7 @@ Assistant reply with model gemma-7b-it:

LLMs are increasingly supporting deferring to tools or functions to fetch data, perform calculations, or store structured data. Groq Cloud in turn then supports their tool implementations through its API.

- See the [Using Tools](https://console.groq.com/docs/tool-use) documentation for the list of models that currently support tools.
+ See the [Using Tools](https://console.groq.com/docs/tool-use) documentation for the list of models that currently support tools. Other models may accept tools intermittently and raise errors at other times.

```ruby
@client = Groq::Client.new(model_id: "mixtral-8x7b-32768")
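As a hedged illustration of the tool-use path exercised by `chat(messages, model_id: nil, tools: nil)`: the sketch below assumes the OpenAI-compatible function-tool schema that Groq Cloud accepts; the `get_weather` tool, its parameters, and the sample message are illustrative and not part of this commit.

```ruby
# Sketch only -- assumes the OpenAI-compatible tool schema; the tool name,
# parameters, and prompt below are illustrative, not from this commit.
tools = [{
  type: "function",
  function: {
    name: "get_weather",
    description: "Look up the current weather for a city",
    parameters: {
      type: "object",
      properties: {city: {type: "string"}},
      required: ["city"]
    }
  }
}]

response = @client.chat(
  [{"role" => "user", "content" => "What is the weather in Brisbane?"}],
  tools: tools
)
# When the model elects to call a tool, the reply carries the tool-call
# details instead of plain text content.
```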
7 changes: 7 additions & 0 deletions lib/groq/client.rb
@@ -46,6 +46,13 @@ def chat(messages, model_id: nil, tools: nil)
end
end

def get(path:)
client.get do |req|
req.url path
req.headers["Authorization"] = "Bearer #{@api_key}"
end
end

def post(path:, body:)
client.post do |req|
req.url path
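For orientation, a minimal usage sketch of the new `get` helper, assuming the client picks up its API key from configuration (as in the tests) and that the connection's JSON middleware parses the response body:

```ruby
# Sketch only: issue an authenticated GET via the new helper.
client = Groq::Client.new                       # API key comes from configuration
response = client.get(path: "/openai/v1/models")
response.body                                   # => {"object" => "list", "data" => [...]}
```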
18 changes: 18 additions & 0 deletions lib/groq/model.rb
@@ -49,5 +49,23 @@ def default_model
def default_model_id
default_model[:model_id]
end

# https://api.groq.com/openai/v1/models
# Output:
# {"object": "list",
# "data": [
# {
# "id": "gemma-7b-it",
# "object": "model",
# "created": 1693721698,
# "owned_by": "Google",
# "active": true,
# "context_window": 8192
# },
def load_models(client: nil)
client ||= Groq::Client.new
response = client.get(path: "/openai/v1/models")
response.body
end
end
end
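A usage sketch for the new class method; the `"data"`, `"active"`, and `"id"` keys are taken from the fixture asserted in the test below:

```ruby
# Sketch: list the ids of active models reported by the API.
models = Groq::Model.load_models(client: Groq::Client.new)
models["data"].select { |m| m["active"] }.map { |m| m["id"] }
# => ["gemma-7b-it", "llama2-70b-4096", "llama3-70b-8192", ...]
```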
58 changes: 58 additions & 0 deletions test/fixtures/vcr_cassettes/api/get_models.yml


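The cassette records the `GET /openai/v1/models` request and response so the test below replays it without hitting the live API. The VCR wiring lives in `test_helper.rb`, which is outside this diff; a typical setup (an assumption, not taken from this repository) might look like:

```ruby
# Hypothetical test_helper.rb excerpt -- not part of this commit.
require "vcr"
require "webmock/minitest"

VCR.configure do |config|
  # Matches the fixture path added in this commit.
  config.cassette_library_dir = "test/fixtures/vcr_cassettes"
  config.hook_into :webmock
  # Keep the real API key out of recorded cassettes.
  config.filter_sensitive_data("<GROQ_API_KEY>") { ENV["GROQ_API_KEY"] }
end
```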
23 changes: 23 additions & 0 deletions test/groq/test_models.rb
@@ -0,0 +1,23 @@
# frozen_string_literal: true

require "test_helper"

class TestGroqModel < Minitest::Test
def test_load_models
VCR.use_cassette("api/get_models") do
client = Groq::Client.new
models = Groq::Model.load_models(client: client)
expected = {
"object" => "list",
"data" => [
{"id" => "gemma-7b-it", "object" => "model", "created" => 1693721698, "owned_by" => "Google", "active" => true, "context_window" => 8192},
{"id" => "llama2-70b-4096", "object" => "model", "created" => 1693721698, "owned_by" => "Meta", "active" => true, "context_window" => 4096},
{"id" => "llama3-70b-8192", "object" => "model", "created" => 1693721698, "owned_by" => "Meta", "active" => true, "context_window" => 8192},
{"id" => "llama3-8b-8192", "object" => "model", "created" => 1693721698, "owned_by" => "Meta", "active" => true, "context_window" => 8192},
{"id" => "mixtral-8x7b-32768", "object" => "model", "created" => 1693721698, "owned_by" => "Mistral AI", "active" => true, "context_window" => 32768}
]
}
assert_equal expected, models
end
end
end
