SEA Performance
Overall SEA Average
| Model | Size | Overall SEA Average |
|---|---|---|
| SEA-LION Embedding E5 (AISG) | 0.6B | 80.03 |
| Multilingual E5 Instruct (multilingual-e5-large-instruct, Microsoft) | 0.6B | 78.93 |
| SEA-LION ModernBERT Embedding (AISG) | 0.6B | 78.45 |
| Cohere embed-multilingual-v3.0 (CohereLabs) | | 78.31 |
| Qwen3 Embedding (Alibaba) | 8B | 77.26 |
| BGE-M3 (Beijing Academy of Artificial Intelligence) | 6B | 76.46 |
| SEA-LION ModernBERT Embedding (AISG) | 0.3B | 76.00 |
| Multilingual E5 (multilingual-e5-large, Microsoft) | 0.6B | 75.92 |
| BGE-Multilingual-Gemma2 (Beijing Academy of Artificial Intelligence) | 9B | 75.85 |
| jina-embeddings-v3 (Jina AI) | 0.6B | 75.33 |
| LaBSE (UKP Lab) | 0.5B | 74.99 |
| EmbeddingGemma | 0.3B | 70.44 |
| paraphrase-multilingual-mpnet-base-v2 (UKP Lab) | 0.3B | 65.40 |
| E5 Mistral Instruct (e5-mistral-7b-instruct, Microsoft) | 7B | 65.31 |
| GritLM (GritLM) | 7B | 64.44 |
| Qwen3 Embedding (Alibaba) | 0.6B | 60.66 |
| voyage-3 (Voyage AI) | | 59.58 |
| paraphrase-multilingual-MiniLM-L12-v2 (UKP Lab) | 0.1B | 54.39 |
| text-embedding-3-small (OpenAI) | | 52.89 |
Language Performance by Model
Filters: model size ≤200B; open instruct models only.
| Model | Size | Developer | ID | TH | VI | FIL | MY | TA | KM | MS | LO | TET |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| SEA-LION Embedding E5 | 0.6B | AISG | 80.82 | 81.34 | 79.54 | 80.77 | 78.46 | 80.64 | 79.68 | 85.29 | 84.05 | 69.67 |
| SEA-LION ModernBERT Embedding | 0.6B | AISG | 80.00 | 80.85 | 77.64 | 78.83 | 76.40 | 79.44 | 78.38 | 84.90 | 82.45 | 65.60 |
| Multilingual E5 Instruct (multilingual-e5-large-instruct) | 0.6B | Microsoft | 79.50 | 81.11 | 78.00 | 78.37 | 79.19 | 77.09 | 78.13 | 84.60 | 83.94 | 69.40 |
| SEA-LION ModernBERT Embedding | 0.3B | AISG | 78.70 | 78.90 | 76.06 | 77.40 | 73.92 | 77.35 | 77.30 | 83.83 | 80.24 | 56.31 |
| BGE-Multilingual-Gemma2 | 9B | Beijing Academy of Artificial Intelligence | 79.93 | 80.58 | 78.76 | 70.01 | 79.61 | 80.96 | 74.39 | 83.38 | 65.82 | 65.05 |
| Cohere embed-multilingual-v3.0 | | CohereLabs | 79.72 | 80.98 | 78.93 | 76.13 | 78.99 | 78.87 | 77.01 | 82.42 | 83.34 | 66.75 |
| LaBSE | 0.5B | UKP Lab | 73.98 | 70.20 | 72.60 | 73.63 | 76.99 | 76.59 | 74.06 | 82.87 | 79.84 | 69.11 |
| BGE-M3 | 6B | Beijing Academy of Artificial Intelligence | 78.09 | 77.59 | 75.91 | 73.12 | 75.78 | 77.51 | 76.23 | 82.54 | 82.26 | 65.53 |
| Qwen3 Embedding | 8B | Alibaba | 79.73 | 81.49 | 78.99 | 74.91 | 78.05 | 75.95 | 75.46 | 82.39 | 78.20 | 67.44 |
| EmbeddingGemma | 0.3B | | 80.19 | 80.58 | 76.40 | 70.42 | 57.11 | 76.10 | 64.07 | 79.19 | 53.01 | 67.32 |
| GritLM | 7B | GritLM | 80.47 | 72.84 | 77.37 | 45.05 | 77.49 | 60.42 | 52.58 | 78.41 | 30.07 | 69.66 |
| Multilingual E5 (multilingual-e5-large) | 0.6B | Microsoft | 78.59 | 79.89 | 78.93 | 70.28 | 77.98 | 77.83 | 72.11 | 80.10 | 79.91 | 63.55 |
| E5 Mistral Instruct (e5-mistral-7b-instruct) | 7B | Microsoft | 79.22 | 74.77 | 75.37 | 48.85 | 78.10 | 66.73 | 56.49 | 78.82 | 27.98 | 66.73 |
| jina-embeddings-v3 | 0.6B | Jina AI | 77.35 | 78.64 | 76.10 | 75.11 | 74.25 | 76.14 | 74.73 | 77.91 | 77.91 | 65.11 |
| text-embedding-3-small | | OpenAI | 78.34 | 55.24 | 70.06 | 32.79 | 68.08 | 35.38 | 30.15 | 69.78 | 23.97 | 65.09 |
| Qwen3 Embedding | 0.6B | Alibaba | 75.60 | 75.85 | 75.12 | 49.08 | 63.11 | 61.02 | 44.10 | 69.51 | 29.79 | 63.38 |
| voyage-3 | | Voyage AI | 75.56 | 69.78 | 73.68 | 48.19 | 71.43 | 67.28 | 35.02 | 69.13 | 24.27 | 61.48 |
| paraphrase-multilingual-mpnet-base-v2 | 0.3B | UKP Lab | 74.60 | 73.91 | 72.66 | 61.19 | 52.02 | 63.31 | 64.44 | 75.48 | 65.63 | 50.78 |
| paraphrase-multilingual-MiniLM-L12-v2 | 0.1B | UKP Lab | 71.47 | 70.42 | 69.89 | 54.48 | 47.22 | 27.88 | 39.92 | 69.58 | 45.34 | 47.69 |
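The overall SEA average appears to be the unweighted mean of the ten per-language scores above. A minimal sketch of that check, using three rows copied from the table (the assumption holds exactly for each, to two decimal places):

```python
# Per-language scores (ID, TH, VI, FIL, MY, TA, KM, MS, LO, TET) copied from the
# language table, paired with each model's reported overall SEA average.
rows = {
    "SEA-LION Embedding E5 0.6B": (
        [80.82, 81.34, 79.54, 80.77, 78.46, 80.64, 79.68, 85.29, 84.05, 69.67],
        80.03,
    ),
    "Multilingual E5 Instruct 0.6B": (
        [79.50, 81.11, 78.00, 78.37, 79.19, 77.09, 78.13, 84.60, 83.94, 69.40],
        78.93,
    ),
    "SEA-LION ModernBERT Embedding 0.6B": (
        [80.00, 80.85, 77.64, 78.83, 76.40, 79.44, 78.38, 84.90, 82.45, 65.60],
        78.45,
    ),
}

for name, (scores, reported) in rows.items():
    # Unweighted mean of the ten language scores, rounded to two decimals.
    mean = round(sum(scores) / len(scores), 2)
    assert mean == reported, (name, mean, reported)
    print(f"{name}: language mean = {mean:.2f}, reported overall = {reported:.2f}")
```

None of the checked means sit near a rounding boundary, so ordinary float rounding reproduces the reported figures exactly.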
SEA Competencies
Filters: model size ≤200B; open instruct models only.
| Model | Size | Developer | SEA | Multilabel Classification | Bitext Mining | QA Retrieval | Reranking | Classification | Semantic Textual Similarity | Pair Classification | Instruction Retrieval | Clustering |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| SEA-LION Embedding E5 | 0.6B | AISG | 80.03 | 89.41 | 89.59 | 77.78 | 78.11 | 76.06 | 76.52 | 67.28 | 69.98 | 53.98 |
| Multilingual E5 Instruct (multilingual-e5-large-instruct) | 0.6B | Microsoft | 78.93 | 87.84 | 87.86 | 77.16 | 77.24 | 77.70 | 75.59 | 66.58 | 69.10 | 58.09 |
| SEA-LION ModernBERT Embedding | 0.6B | AISG | 78.45 | 89.61 | 88.71 | 76.64 | 75.94 | 76.33 | 75.82 | 66.81 | 68.17 | 52.41 |
| Cohere embed-multilingual-v3.0 | | CohereLabs | 78.31 | 89.98 | 88.32 | 78.17 | 77.77 | 78.52 | 73.11 | 66.11 | 65.59 | 48.99 |
| Qwen3 Embedding | 8B | Alibaba | 77.26 | 90.57 | 84.78 | 81.99 | 78.51 | 78.60 | 75.31 | 63.10 | 70.81 | 52.93 |
| BGE-M3 | 6B | Beijing Academy of Artificial Intelligence | 76.46 | 89.89 | 86.18 | 73.56 | 75.98 | 75.98 | 73.27 | 68.73 | 58.51 | 42.23 |
| SEA-LION ModernBERT Embedding | 0.3B | AISG | 76.00 | 89.38 | 86.18 | 73.57 | 72.64 | 75.76 | 73.94 | 66.10 | 63.89 | 49.31 |
| Multilingual E5 (multilingual-e5-large) | 0.6B | Microsoft | 75.92 | 88.94 | 84.51 | 78.25 | 79.00 | 78.24 | 69.61 | 65.79 | 66.06 | 47.83 |
| BGE-Multilingual-Gemma2 | 9B | Beijing Academy of Artificial Intelligence | 75.85 | 90.89 | 82.02 | 80.55 | 69.04 | 78.13 | 72.53 | 73.87 | 71.52 | 49.14 |
| jina-embeddings-v3 | 0.6B | Jina AI | 75.33 | 88.97 | 81.86 | 76.28 | 72.49 | 77.40 | 73.17 | 63.61 | 69.11 | 50.90 |
| LaBSE | 0.5B | UKP Lab | 74.99 | 86.65 | 86.84 | 53.72 | 61.23 | 75.19 | 68.32 | 62.32 | 39.73 | 41.39 |
| EmbeddingGemma | 0.3B | | 70.44 | 89.19 | 72.87 | 77.13 | 78.44 | 77.25 | 68.13 | 69.60 | 70.72 | 40.22 |
| paraphrase-multilingual-mpnet-base-v2 | 0.3B | UKP Lab | 65.40 | 87.28 | 68.12 | 58.28 | 64.01 | 73.79 | 70.15 | 70.79 | 52.44 | 41.12 |
| E5 Mistral Instruct (e5-mistral-7b-instruct) | 7B | Microsoft | 65.31 | 88.32 | 65.30 | 72.93 | 75.33 | 76.65 | 63.50 | 63.81 | 54.46 | 49.48 |
| GritLM | 7B | GritLM | 64.44 | 88.76 | 63.63 | 65.97 | 73.37 | 77.47 | 64.69 | 63.86 | 67.60 | 46.29 |
| Qwen3 Embedding | 0.6B | Alibaba | 60.66 | 88.19 | 56.53 | 76.24 | 75.03 | 74.47 | 65.74 | 60.36 | 65.80 | 43.94 |
| voyage-3 | | Voyage AI | 59.58 | 88.70 | 55.62 | 62.91 | 74.62 | 75.72 | 61.97 | 60.23 | 61.77 | 45.15 |
| paraphrase-multilingual-MiniLM-L12-v2 | 0.1B | UKP Lab | 54.39 | 84.88 | 53.23 | 52.47 | 62.27 | 70.50 | 64.59 | 65.70 | 48.66 | 31.50 |
| text-embedding-3-small | | OpenAI | 52.89 | 88.19 | 43.12 | 65.18 | 71.25 | 72.88 | 52.31 | 60.16 | 52.87 | 39.34 |