OpenMed GLiNER-Relex Base v1.0 MLX

This is a public OpenMed MLX artifact for zero-shot entity and relation extraction on Apple Silicon. It packages knowledgator/gliner-relex-base-v1.0 in the OpenMed MLX format so it can run locally in Python and in macOS and iOS apps through OpenMedKit.

The OpenMed Python library and the OpenMedKit Swift package are the main developer entry points for this artifact.

What This Model Does

OpenMed/gliner-relex-base-v1.0-mlx performs experimental zero-shot relation extraction. You provide:

  • entity labels, such as medication, condition, symptom, or procedure
  • relation labels, such as treats, causes, associated with, or contraindicated for
  • input text

The model returns extracted entities plus directed relation triples between entity pairs.

Artifact metadata:

  • OpenMed MLX task: zero-shot-relation-extraction
  • OpenMed MLX family: gliner-uni-encoder-token-relex
  • Backbone: DeBERTa-v3 / deberta-v2 config family
  • Max sequence length: 512
  • Weight format: weights.safetensors
  • Runtime APIs: GLiNERRelexMLXPipeline in Python and OpenMedRelationExtractor in Swift
  • Runtime status: experimental

Python Quick Start

Install OpenMed with MLX support:

pip install -U "openmed[mlx]"

Run relation extraction from the Hub snapshot:

from huggingface_hub import snapshot_download
from openmed.mlx.inference import GLiNERRelexMLXPipeline

model_path = snapshot_download("OpenMed/gliner-relex-base-v1.0-mlx")
extractor = GLiNERRelexMLXPipeline(model_path)

result = extractor.inference(
    "Aspirin was prescribed for headache after the patient reported migraine symptoms.",
    labels=["medication", "condition", "symptom"],
    relations=["treats", "associated with"],
    threshold=0.5,
    relation_threshold=0.9,
)

print(result["entities"])
print(result["relations"])
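For readability, you can flatten the raw relation dicts into (head, relation, tail) triples. The key names below (`head`, `tail`, `label`, `score`, `text`) mirror the Swift result fields shown later in this card and are assumptions about the exact Python payload; verify them against your pipeline's output:

```python
# Hedged sketch: format extracted relations as readable triples.
# Assumes each relation dict carries "head"/"tail" entity dicts with a
# "text" key, plus "label" and "score" -- check your actual output keys.
def format_triples(relations, min_score=0.0):
    """Return (head_text, relation_label, tail_text) tuples above min_score."""
    return [
        (r["head"]["text"], r["label"], r["tail"]["text"])
        for r in relations
        if r.get("score", 1.0) >= min_score
    ]
```

Raising `min_score` here plays the same role as `relation_threshold` at inference time, but lets you filter after the fact without re-running the model.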

You can also load a local snapshot directory:

from openmed.mlx.inference import create_mlx_pipeline

extractor = create_mlx_pipeline("/path/to/gliner-relex-base-v1.0-mlx")

Swift And iOS Quick Start

Add OpenMedKit to your Xcode project:

  1. In Xcode, choose File > Add Package Dependencies.
  2. Paste https://github.com/maziyarpanahi/openmed.
  3. Add the OpenMedKit package product to your app target.
  4. Download this MLX artifact once, then reuse the cached local model for offline inference.

Example:

import OpenMedKit

let modelURL = try await OpenMedModelStore.downloadMLXModel(
    repoID: "OpenMed/gliner-relex-base-v1.0-mlx"
)

let extractor = try OpenMedRelationExtractor(modelDirectoryURL: modelURL)

let result = try extractor.extract(
    "Aspirin was prescribed for headache after the patient reported migraine symptoms.",
    entityLabels: ["medication", "condition", "symptom"],
    relationLabels: ["treats", "associated with"],
    threshold: 0.5,
    relationThreshold: 0.9
)

for entity in result.entities {
    print(entity.label, entity.text, entity.score)
}

for relation in result.relations {
    print(relation.head.text, relation.label, relation.tail.text, relation.score)
}

Notes for Apple apps:

  • MLX inference runs on Apple Silicon and supported Apple devices.
  • The iOS Simulator is not the acceptance target for MLX inference; test on a physical iPhone or iPad.
  • After the first model download, inference is local and does not require a server.

Prompt Format

The artifact includes OpenMed prompt-packing metadata so Python and Swift build the same input sequence:

{
  "kind": "gliner-relex",
  "entity_token": "<<ENT>>",
  "relation_token": "<<REL>>",
  "separator_token": "<<SEP>>",
  "class_token_index": 128001,
  "rel_token_index": 128003,
  "embed_marker_token": true,
  "split_mode": "words"
}
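Conceptually, the metadata above describes how label prompts and text are packed into one input sequence. The helper below is a hypothetical reconstruction for illustration, not the actual OpenMed implementation; the ordering of entity and relation prompts is an assumption:

```python
# Illustrative prompt-packing sketch based on the metadata above.
# Hypothetical reconstruction -- not the real OpenMed packing code.
def pack_prompt(entity_labels, relation_labels, words,
                ent_tok="<<ENT>>", rel_tok="<<REL>>", sep_tok="<<SEP>>"):
    """Interleave marker tokens with labels, then append the text words."""
    seq = []
    for label in entity_labels:
        seq += [ent_tok, label]      # <<ENT>> medication <<ENT>> condition ...
    for label in relation_labels:
        seq += [rel_tok, label]      # <<REL>> treats <<REL>> causes ...
    seq.append(sep_tok)              # <<SEP>> separates the prompt from the text
    seq += words                     # split_mode "words": whitespace-split tokens
    return seq
```

Because both runtimes read the same metadata (`class_token_index`, `rel_token_index`, `split_mode`), Python and Swift produce identical sequences for identical inputs.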

Practical Guidance

  • Start with a conservative relation_threshold, such as 0.9, for cleaner demos.
  • Use concise labels. For example, prefer medication over a long natural-language phrase.
  • Tune threshold and relation_threshold separately because entity recall and relation precision behave differently.
  • Keep inputs under the artifact's 512-token context window, or chunk text before inference.
  • Treat this runtime as experimental while OpenMed continues parity testing across Python MLX and Swift MLX.

Credits

This artifact repackages knowledgator/gliner-relex-base-v1.0 for the OpenMed MLX runtime.