Qirrel is an extensible NLP framework for text processing and analysis.

Qirrel

TypeScript NLP pipeline for fast, structured text understanding

GitHub · NPM · Author: Damilare Osibanjo

Why Qirrel

Qirrel gives you a production-ready pipeline for extracting structure from raw text:

  • Tokenization with positional metadata.
  • Rule-based extraction for email, phone, URL, and number entities.
  • Optional LLM enrichment via adapter pattern.
  • Built-in caching and lifecycle events.
  • Batch processing with controllable concurrency.

Install

bun add qirrel

30-Second Start

import { processText } from 'qirrel';

const result = await processText(
  'Contact Jane at jane@example.com or +44 20 7946 0958',
);

console.log(result.data?.entities);
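
The exact shape of each entity comes from the QirrelContext typings; the field names used below (type, value) are assumptions, so check your editor's type hints before relying on them. A minimal sketch of filtering the extracted entities:

import { processText } from 'qirrel';

const result = await processText('Reach us at sales@example.com or https://example.com');

// Field names (type, value) are assumptions; inspect the QirrelContext
// typings shipped with the package for the actual entity shape.
const emails = (result.data?.entities ?? []).filter((e) => e.type === 'email');
console.log(emails.map((e) => e.value));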

Core Features

Feature                | API                                               | Notes
Single-text processing | processText(text)                                 | Returns QirrelContext
Batch processing       | processTexts(texts, configPath?, { concurrency }) | Keeps input order
Custom pipeline        | new Pipeline(configPath?)                         | Add processors and hooks
Events                 | pipeline.on(PipelineEvent.*, handler)             | Run/processor/error telemetry
Caching                | pipeline.isCached / getCached / setCached         | LRU + TTL
LLM adapter            | pipeline.getLLMAdapter()                          | gemini, openai, generic
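
A sketch of the event and cache hooks listed above. Everything beyond the names in the table is an assumption: the PipelineEvent members, the handler signature, and the cache key/value types should all be verified against the package's exported typings.

import { Pipeline, PipelineEvent } from 'qirrel';

const pipeline = new Pipeline();
await pipeline.init();

// PipelineEvent member names here are illustrative; check the exported enum
// for the actual run/processor/error events.
pipeline.on(PipelineEvent.Error, (err) => console.error('pipeline error:', err));

// Cache accessors come from the feature table (isCached/getCached/setCached);
// the string key and value used here are assumptions.
if (!pipeline.isCached('greeting')) {
  pipeline.setCached('greeting', { hello: 'world' });
}
console.log(pipeline.getCached('greeting'));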

Batch Example

import { processTexts } from 'qirrel';

const inputs = [
  'US: +1 415 555 2671',
  'URL: https://example.com',
  'Email: team@example.com',
];

const outputs = await processTexts(inputs, undefined, { concurrency: 2 });
console.log(outputs.map((o) => o.data?.entities));

LLM Example

import { Pipeline } from 'qirrel';

const pipeline = new Pipeline('./config-with-llm.yaml');
await pipeline.init();

const adapter = pipeline.getLLMAdapter();
if (adapter) {
  const response = await adapter.generate('Summarize: Qirrel is an NLP pipeline.');
  console.log(response.content);
}

Minimal LLM Config

llm:
  enabled: true
  provider: openai
  apiKey: ${QIRREL_LLM_API_KEY}
  model: gpt-4o-mini
  timeout: 30000
  cacheTtl: 300000
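
The ${QIRREL_LLM_API_KEY} placeholder suggests the key is substituted from the environment when the config is loaded (an assumption worth confirming in the config docs), so export it before starting your app:

export QIRREL_LLM_API_KEY=your-api-key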

Quality Gates (CI/CD)

This repository includes GitHub Actions workflows:

  • ci.yml: runs install, build, tests, and coverage on push/PR.
  • release.yml: runs release checks and can publish to npm on version tags (v*) when NPM_TOKEN is configured.

Agent-Native Mode

Qirrel ships with an agent-native bridge and an MCP server, so the same parsing core serves both application code and agent toolchains.

import { createQirrelAgentBridge } from 'qirrel';

const bridge = createQirrelAgentBridge();
const result = await bridge.callTool('qirrel.parse_text', { text: 'Email me@example.com' });
console.log(result.structuredContent);
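
The layout of structuredContent is not documented here; assuming it mirrors the entities returned by processText (an assumption, so log it once to confirm), downstream agent code could read it like this:

// structuredContent layout is an assumption; inspect the logged value first.
const entities = result.structuredContent?.entities ?? [];
console.log(entities.length, 'entities extracted');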

Run MCP server:

bun run mcp:start

Run benchmark:

bun run bench:agent

Framework comparison benchmark:

bun run bench:frameworks

Generate local markdown benchmark report:

bun run bench:report

Documentation

Contributing

See CONTRIBUTING.md.

License

MIT. See LICENSE.
