
BAAI/bge-en-icl

$0.010 / 1M tokens

An LLM-based embedding model with in-context learning capabilities that achieves state-of-the-art (SOTA) performance on BEIR and AIR-Bench. It leverages few-shot examples to improve task performance.

Model Information

BGE-EN-ICL

An embedding model built on a large language model, supporting in-context learning for enhanced task adaptation. Key features:

  • In-context learning with few-shot examples
  • SOTA performance on BEIR and AIR-Bench benchmarks
  • Flexible usage through FlagEmbedding or HuggingFace Transformers
  • Supports both zero-shot and few-shot scenarios
  • 7.11B parameters with F32 precision

For implementation details and usage examples, visit our GitHub repository.
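The few-shot usage above can be sketched with HuggingFace Transformers. Note that the in-context-learning prompt template and the last-token pooling below are assumptions modeled on typical LLM-based embedders, not confirmed specifics of this model; consult the GitHub repository for the exact format.

```python
# Sketch of few-shot embedding with an LLM-based model via HuggingFace
# Transformers. The prompt template and last-token pooling are assumptions;
# see the model's GitHub repository for the exact in-context-learning format.
import numpy as np


def build_icl_prompt(task: str, examples: list[tuple[str, str]], query: str) -> str:
    """Prepend few-shot (query, response) examples to the query.

    Hypothetical tag format, shown for illustration only.
    """
    parts = [f"<instruct>{task}"]
    for ex_query, ex_response in examples:
        parts.append(f"<query>{ex_query}\n<response>{ex_response}")
    parts.append(f"<query>{query}")
    return "\n".join(parts)


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


if __name__ == "__main__":
    # The heavy part (downloading the ~7B-parameter model) runs only when
    # this file is executed directly.
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("BAAI/bge-en-icl")
    model = AutoModel.from_pretrained("BAAI/bge-en-icl")

    prompt = build_icl_prompt(
        "Given a web search query, retrieve relevant passages.",
        [("what is a cookie?", "A cookie is a small piece of data stored by a browser.")],
        "how do embeddings work?",
    )
    inputs = tokenizer(prompt, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state
    # Last-token pooling — common for decoder-style embedders, assumed here.
    embedding = hidden[0, -1].numpy()
    print(embedding.shape)
```

In zero-shot scenarios, the same code applies with an empty `examples` list, so the prompt reduces to the instruction plus the query.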