
intfloat/multilingual-e5-large-instruct

The Multilingual-E5 models, initialized from XLM-RoBERTa, support up to 512 tokens per input — any longer text will be silently truncated. To ensure optimal performance, always prefix inputs with “query:” or “passage:”, as the model was explicitly trained with this format.
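As a minimal sketch of this prefix convention, the snippet below prepends "query:" or "passage:" to each text and uses the model's tokenizer to flag inputs that would exceed the 512-token limit. The helper name and the client-side length check are illustrative only, not part of the hosted API.

from transformers import AutoTokenizer

# Tokenizer for the model family, used here only to estimate token counts client-side.
tokenizer = AutoTokenizer.from_pretrained("intfloat/multilingual-e5-large-instruct")

def prepare(texts, kind="passage", max_tokens=512):
    """Prefix each text with 'query: ' or 'passage: ' and warn if it would be truncated."""
    assert kind in ("query", "passage")
    prepared = []
    for text in texts:
        prefixed = f"{kind}: {text}"
        n_tokens = len(tokenizer(prefixed)["input_ids"])
        if n_tokens > max_tokens:
            print(f"warning: {n_tokens} tokens; anything past {max_tokens} is truncated")
        prepared.append(prefixed)
    return prepared

queries = prepare(["how should multilingual-e5 inputs be formatted?"], kind="query")
passages = prepare(["Always prefix inputs with 'query:' or 'passage:'."], kind="passage")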

Public · $0.0005 / sec · 512 max input tokens
Project · Paper · License

Input

inputs: the texts to embed, each prefixed with "query:" or "passage:" as described above.


Settings

Service tier: the tier used for processing the request. When set to 'priority', the request is processed with higher priority.

Normalize: whether to normalize the computed embeddings.

Dimensions: the number of dimensions in the embedding. If not provided, the model's default is used; if the requested value is larger than the model's default, the embedding is padded with zeros. (Default: empty; 32 ≤ dimensions ≤ 8192)
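A hedged sketch of how these settings could accompany a request: the endpoint URL and the field names service_tier, normalize, and dimensions are assumptions chosen to mirror the descriptions above, not a documented request schema.

import requests

API_URL = "https://example.com/v1/inference/intfloat/multilingual-e5-large-instruct"  # hypothetical endpoint
HEADERS = {"Authorization": "Bearer YOUR_TOKEN"}  # placeholder token

payload = {
    "inputs": [
        "query: what is the capital of France",
        "passage: Paris is the capital and largest city of France.",
    ],
    "service_tier": "priority",  # assumed field name: process with higher priority
    "normalize": True,           # assumed field name: return unit-length embeddings
    "dimensions": 1024,          # assumed field name: must satisfy 32 <= dimensions <= 8192
}

resp = requests.post(API_URL, headers=HEADERS, json=payload)
resp.raise_for_status()
embeddings = resp.json()  # expected: one embedding vector per input, as in the Output example below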

Output

[
  [
    0,
    0.5,
    1
  ],
  [
    1,
    0.5,
    0
  ]
]
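The output is a list of embedding vectors, one per input. A small sketch of comparing two such vectors with cosine similarity (when normalization is enabled, the dot product alone gives the same ranking):

import math

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors of equal length."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Using the example vectors shown above:
print(cosine_similarity([0, 0.5, 1], [1, 0.5, 0]))  # 0.2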
Model Information