ruby-mana 🔮

A pure embedded LLM engine for Ruby. Write natural language, it just runs.

Not an API wrapper. A language construct that weaves LLM into your code.

A Ruby 3.3+ gem that embeds an LLM as a native language construct. MIT licensed.

Building an AI agent? ruby-claw uses mana as its engine →
$ gem install ruby-mana

or gem "ruby-mana" in your Gemfile

What makes it different

A pure embedded LLM engine. Your natural language strings can read and write live Ruby state.

1. Semantic computation

Traditional approach

words = { "three" => 3, "cuatro" => 4 }
nums = [1, "2", "three", "cuatro", "五"]
# Need a lookup table for every language?
# What about "五"? Add Chinese? Japanese?
# This doesn't scale.

ruby-mana

require "mana"

numbers = [1, "2", "three", "cuatro", "五"]

~"compute the semantic average of <numbers>, store in <result>"

result  # => 3.0

2. Object manipulation

Traditional approach

client = LLM::Client.new
response = client.chat(
  "Classify: #{email.subject}"
)
data = JSON.parse(response)
email.category = data["category"]
email.priority = data["priority"]

ruby-mana

require "mana"

email = Email.new("URGENT: Server down")

~"read <email> subject, set its category and priority"

email.category  # => "infrastructure"
email.priority  # => "critical"

3. LLM compiler

Traditional approach

response = client.chat("Write a fibonacci function")
code = extract_code(response)
File.write("fibonacci.rb", code)
# manually load, validate, cache...
# what if the prompt changes?
# what about parameter signatures?

ruby-mana

require "mana"

mana def fibonacci(n)
  ~"return an array of the first n Fibonacci numbers"
end

fibonacci(10)  # first call → LLM generates → cached
fibonacci(20)  # second call → pure Ruby, zero API cost

4. Function discovery

Traditional approach

tools = [{
  name: "fetch_price",
  description: "Get stock price",
  parameters: {symbol: {type: "string"}}
}]
response = client.chat(prompt, tools: tools)
# parse tool_calls, dispatch manually...

ruby-mana

require "mana"

def fetch_price(symbol)
  { "AAPL" => 189.5, "TSLA" => 248.9 }[symbol]
end

portfolio = ["AAPL", "TSLA"]

~"call fetch_price for each in <portfolio>, sum into <total>"

puts total  # => 438.4

Features

🔮 Semantic computation

~"..." turns any string into an LLM prompt that reads and writes live Ruby state. One character, full binding access: variables, objects, functions.
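The `~"..."` form is plain Ruby, not a syntax extension: unary `~` on a string dispatches to `String#~`, which a gem is free to define. A minimal sketch of why it parses (the `#~` body here just echoes; a real engine would also capture the caller's binding and send it along):

```ruby
# Sketch only: shows that ~"..." is valid Ruby once String#~ exists.
class String
  def ~
    # A real engine would send `self` (the prompt) plus the caller's
    # live state to the LLM here. This stub just tags the string.
    "PROMPT: #{self}"
  end
end

~"compute the semantic average of <numbers>"
# => "PROMPT: compute the semantic average of <numbers>"
```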

📖 Object manipulation

LLM reads attributes with <var>, writes back directly. No JSON parsing, no serialization: it operates on your Ruby objects in place.

🧬 LLM compiler

mana def generates method implementations on first call and caches them as real .rb files. Subsequent calls are pure Ruby, with zero API cost.
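One way the compile-and-cache step could work, sketched with a hypothetical `generate_ruby(prompt)` helper standing in for the one-time LLM call (here it returns canned code so the sketch runs offline):

```ruby
require "fileutils"
require "tmpdir"

# Hypothetical stand-in for the one-time LLM call.
def generate_ruby(prompt)
  "def fibonacci(n)\n" \
  "  (0...n).each_with_object([]) { |i, a| a << (i < 2 ? i : a[-1] + a[-2]) }\n" \
  "end\n"
end

def mana_compile(name, prompt)
  path = File.join(Dir.tmpdir, "mana_cache", "#{name}.rb")
  unless File.exist?(path)                  # only the first call pays
    FileUtils.mkdir_p(File.dirname(path))
    File.write(path, generate_ruby(prompt)) # cache as a real .rb file
  end
  load path                                 # defines the method; later calls are pure Ruby
end

mana_compile(:fibonacci, "return an array of the first n Fibonacci numbers")
fibonacci(10)  # => [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```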

⚡ Function discovery

A Prism AST parser auto-discovers your methods and signatures. Just define Ruby functions; the LLM finds and calls them. No registration needed.
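Core Ruby already exposes the signature data discovery needs at runtime; a Prism AST walk recovers the same shape (plus comments) statically from source without loading the code. A rough illustration using core reflection:

```ruby
# Search users by name (the comment and signature are what discovery surfaces)
def find_users(query:, limit: 5)
  # ...
end

# Runtime view of the signature; a Prism AST walk over the source
# yields the same information without executing anything.
method(:find_users).parameters
# => [[:keyreq, :query], [:key, :limit]]
```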

🧩 Tool registration interface

Mana.register_tool lets frameworks like ruby-claw inject agent tools into the engine. Clean boundary between engine and consumer.

๐Ÿ—บ๏ธ Dispatch map architecture

Registered tools route through a clean dispatch map โ€” tool name to handler. No tangled if/else chains, just declarative routing.
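A dispatch map is just a hash from tool name to handler. A minimal sketch of the shape (`register_tool` and `dispatch` mirror the description above; the bodies are illustrative, not the gem's actual implementation):

```ruby
# Minimal dispatch-map sketch: tool name => handler, no if/else chains.
TOOLS = {}

def register_tool(name, &handler)
  TOOLS[name.to_s] = handler
end

def dispatch(name, *args)
  TOOLS.fetch(name.to_s).call(*args)  # raises KeyError for unknown tools
end

register_tool("fetch_price") { |symbol| { "AAPL" => 189.5 }[symbol] }

dispatch("fetch_price", "AAPL")  # => 189.5
```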

How it works

Mana is a pure stateless engine. No persistent memory: just context in, result out.

1. Context flows in. Your Ruby binding (variables, objects, functions) is captured and passed as context to the LLM. Nothing is stored between calls.
2. Tools are registered. Frameworks call Mana.register_tool to inject capabilities. The dispatch map routes each tool call to its handler.
3. LLM executes. The engine sends context + messages to the LLM provider, receives a result, and writes it back into your Ruby state. Pure function: no side-channel persistence.
4. ruby-claw builds on top. ruby-claw uses mana as its LLM engine, registering agent tools (web search, file I/O, code execution) through the tool interface. Mana stays stateless; claw manages the agent loop.
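The stateless boundary described above can be sketched as a pure function: prompt, context, and a provider go in, an updated context comes out, and nothing is retained between calls (all names here are illustrative, not the gem's API):

```ruby
# Stateless engine sketch: everything a call needs is an argument,
# nothing survives the call. `provider` stands in for the LLM client.
def run(prompt, context:, provider:)
  reply = provider.call(prompt: prompt, context: context)
  context.merge(reply[:writes])  # result out; no side-channel persistence
end

# Fake provider for illustration: "answers" by summing the numbers.
fake = ->(prompt:, context:) { { writes: { result: context[:numbers].sum } } }

run("sum <numbers> into <result>", context: { numbers: [1, 2, 3] }, provider: fake)
# => {:numbers=>[1, 2, 3], :result=>6}
```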

Examples

Semantic computation

LLM understands meaning across types, languages, and messy real-world data.

reviews = ["love it!", "broke in 2 days", "最高！"]

~"score sentiment of <reviews>, store in <scores> and <summary>"

puts scores   # => [0.9, -0.8, 0.95]
puts summary  # => "Mostly positive, one negative"

Object manipulation

LLM reads and writes object attributes directly; no serialization needed.

Task = Struct.new(:title, :tags, :estimate, :assignee)

task = Task.new("Migrate user auth to OAuth2")

~"read <task> title, set appropriate tags, estimate in hours, and suggest assignee"

puts task.tags      # => ["backend", "security", "auth"]
puts task.estimate  # => 8
puts task.assignee  # => "senior backend engineer"

LLM compiler

Describe what you want; the LLM writes the code once and caches it as a .rb file. After that, pure Ruby.

mana def slugify(title)
  ~"convert title to a URL-safe slug: lowercase, hyphens, no special chars"
end

slugify("Hello, World! 你好")    # first call → LLM generates → cached
slugify("Another Post")          # second call → pure Ruby, zero API cost

puts Mana.source(:slugify)     # view the generated code

Function discovery

Just define Ruby functions; the LLM discovers and calls them automatically via AST introspection.

# Search users by name
def find_users(query:, limit: 5)
  User.where("name LIKE ?", "%#{query}%").limit(limit)
end

# Send an email notification
def notify(user_id:, message:)
  Mailer.send(user_id, message)
end

~"find users named 'Alice', notify each about the maintenance window tomorrow"

# LLM calls find_users, then notify for each; all discovered from source

Full documentation on GitHub →

Building agents? ruby-claw is the agent framework powered by mana.