🚀 GTE-ModernColBERT-v1
This is a PyLate model based on Alibaba-NLP/gte-modernbert-base. It maps sentences and paragraphs to sequences of 128-dimensional dense vectors and computes semantic textual similarity with the MaxSim operator.
🚀 Quick Start
First, install the PyLate library:
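The MaxSim operator scores a query against a document by taking, for each query token embedding, the maximum similarity over all document token embeddings, and summing those maxima over the query tokens. A minimal NumPy sketch, using toy 4-dimensional vectors in place of the model's 128-dimensional ones:

```python
import numpy as np

def maxsim_score(query_embeddings: np.ndarray, doc_embeddings: np.ndarray) -> float:
    """Late-interaction (MaxSim) score: for each query token vector, take the
    maximum dot product over all document token vectors, then sum over query tokens."""
    # Similarity matrix of shape (num_query_tokens, num_doc_tokens)
    sim = query_embeddings @ doc_embeddings.T
    return float(sim.max(axis=1).sum())

# Toy embeddings: 2 query tokens, 3 document tokens
query = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0]])
doc = np.array([[0.9, 0.1, 0.0, 0.0],
                [0.0, 0.8, 0.2, 0.0],
                [0.1, 0.1, 0.1, 0.7]])

print(maxsim_score(query, doc))  # ≈ 1.7 (best match per query token: 0.9 + 0.8)
```

Because each query token picks its own best-matching document token, MaxSim rewards fine-grained term-level matches that a single-vector dot product can average away.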
pip install -U pylate
Retrieval
PyLate provides a streamlined interface for indexing and retrieving documents with ColBERT models. Indexing relies on a Voyager HNSW index to handle document embeddings efficiently and enable fast retrieval.
Indexing documents
First, load the ColBERT model and initialize a Voyager index, then encode and index your documents:
from pylate import indexes, models, retrieve

# Step 1: load the ColBERT model
model = models.ColBERT(
    model_name_or_path="lightonai/GTE-ModernColBERT-v1",
)

# Step 2: initialize the Voyager index
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
    override=True,  # overwrite any existing index
)

# Step 3: encode the documents
documents_ids = ["1", "2", "3"]
documents = ["document 1 text", "document 2 text", "document 3 text"]
documents_embeddings = model.encode(
    documents,
    batch_size=32,
    is_query=False,  # set to False to indicate these are documents, not queries
    show_progress_bar=True,
)

# Step 4: add the document embeddings to the index along with their ids
index.add_documents(
    documents_ids=documents_ids,
    documents_embeddings=documents_embeddings,
)
Note that you do not have to recreate the index and re-encode the documents every time. Once an index has been created and documents have been added, you can reuse it by loading it:
# To load an index, simply instantiate it with the correct folder/name, without overriding it
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
)
Retrieving the top-k documents for queries
Once the documents are indexed, you can retrieve the top-k most relevant documents for a given set of queries. To do so, initialize the ColBERT retriever with the index to search, encode the queries, and then retrieve the top-k documents to get the ids and relevance scores of the best matches:
# Step 1: initialize the ColBERT retriever
retriever = retrieve.ColBERT(index=index)

# Step 2: encode the queries
queries_embeddings = model.encode(
    ["query for document 3", "query for document 1"],
    batch_size=32,
    is_query=True,  # set to True to indicate these are queries
    show_progress_bar=True,
)

# Step 3: retrieve the top-k documents
scores = retriever.retrieve(
    queries_embeddings=queries_embeddings,
    k=10,  # retrieve the top 10 matches for each query
)
Reranking
If you only want to use the ColBERT model to rerank on top of a first-stage retrieval pipeline, without building an index, simply use the rank function and pass in the queries and documents to rerank:
from pylate import rank, models

queries = [
    "query A",
    "query B",
]
documents = [
    ["document A", "document B"],
    ["document 1", "document C", "document B"],
]
documents_ids = [
    [1, 2],
    [1, 3, 2],
]

model = models.ColBERT(
    model_name_or_path="lightonai/GTE-ModernColBERT-v1",
)

queries_embeddings = model.encode(
    queries,
    is_query=True,
)
documents_embeddings = model.encode(
    documents,
    is_query=False,
)

reranked_documents = rank.rerank(
    documents_ids=documents_ids,
    queries_embeddings=queries_embeddings,
    documents_embeddings=documents_embeddings,
)
✨ Key Features
- Semantic similarity: maps sentences and paragraphs to sequences of 128-dimensional dense vectors and scores semantic textual similarity with the MaxSim operator.
- Efficient retrieval: uses a Voyager HNSW index for fast document retrieval.
- Long-context handling: performs strongly on long-context embedding benchmarks and can handle documents longer than its training length.
📦 Installation
pip install -U pylate
💻 Usage Examples
Basic usage
# Load the ColBERT model
from pylate import models

model = models.ColBERT(
    model_name_or_path="lightonai/GTE-ModernColBERT-v1",
)

# Encode documents
documents = ["document 1 text", "document 2 text"]
documents_embeddings = model.encode(
    documents,
    is_query=False,
)

# Encode queries
queries = ["query for document 1"]
queries_embeddings = model.encode(
    queries,
    is_query=True,
)
Advanced usage
# Index documents and retrieve
from pylate import indexes, retrieve

# Initialize the Voyager index
index = indexes.Voyager(
    index_folder="pylate-index",
    index_name="index",
    override=True,
)

# Add the document embeddings to the index
documents_ids = ["1", "2"]
index.add_documents(
    documents_ids=documents_ids,
    documents_embeddings=documents_embeddings,
)

# Initialize the ColBERT retriever
retriever = retrieve.ColBERT(index=index)

# Retrieve the top 10 documents
scores = retriever.retrieve(
    queries_embeddings=queries_embeddings,
    k=10,
)
📚 Documentation
Model details
Property | Details |
---|---|
Model type | PyLate model |
Base model | Alibaba-NLP/gte-modernbert-base |
Document length | 300 tokens |
Query length | 32 tokens |
Output dimensionality | 128 |
Similarity function | MaxSim |
Training dataset | ms-marco-en-bge-gemma |
Language | English |
License | Apache 2.0 |
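One practical consequence of the multi-vector design in the table above: each document is stored as up to 300 token vectors of 128 dimensions, rather than as a single vector. A back-of-the-envelope storage estimate, assuming float16 values (the actual index storage dtype may differ):

```python
# Rough per-document embedding footprint under the card's defaults
tokens_per_doc = 300   # default document length from the table above
dim = 128              # output dimensionality per token
bytes_per_value = 2    # float16 (an assumed storage dtype)

per_doc_bytes = tokens_per_doc * dim * bytes_per_value
print(per_doc_bytes, "bytes =", per_doc_bytes / 1024, "KiB per document")  # 76800 bytes = 75.0 KiB
```

Raising `document_length` (see below) scales this footprint proportionally, which is worth keeping in mind when indexing large corpora.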
Document length
GTE-ModernColBERT was trained on MS MARCO with knowledge distillation, using a document length of 300 tokens, which explains the default document length. However, as shown in the ModernBERT paper, ColBERT models can generalize to document lengths far beyond their training length, and GTE-ModernColBERT actually achieves results well above the previous state of the art on a long-context embedding benchmark; see the LongEmbed results. When loading the model, simply adjust the document length parameter as needed:
model = models.ColBERT(
    model_name_or_path="lightonai/GTE-ModernColBERT-v1",
    document_length=8192,
)
ModernBERT itself was only trained with an 8K context length, but GTE-ModernColBERT appears to generalize to even larger context sizes. This is not guaranteed, however, so test it for your own use case!
Model sources
- Documentation: PyLate Documentation
- Repository: PyLate on GitHub
- Hugging Face: PyLate models on Hugging Face
Full model architecture
ColBERT(
(0): Transformer({'max_seq_length': 299, 'do_lower_case': False}) with Transformer model: ModernBertModel
(1): Dense({'in_features': 768, 'out_features': 128, 'bias': False, 'activation_function': 'torch.nn.modules.linear.Identity'})
)
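The architecture above is a ModernBERT encoder followed by a bias-free 768-to-128 linear projection with an identity activation. A NumPy sketch of that projection step, using random stand-in values for the encoder outputs and the learned weight matrix:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the ModernBERT token outputs: (seq_len, hidden_size)
# seq_len 299 matches the Transformer module's max_seq_length above.
hidden_states = rng.standard_normal((299, 768))

# The Dense module above: a 768 -> 128 linear map with no bias and an
# identity activation, i.e. a plain matrix multiplication.
W = rng.standard_normal((768, 128)) * 0.02  # random stand-in weights
token_embeddings = hidden_states @ W

print(token_embeddings.shape)  # (299, 128): one 128-dim vector per token
```

Every token keeps its own 128-dimensional vector; there is no pooling step, which is what makes the MaxSim late interaction possible downstream.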
🔧 Technical Details
Training hyperparameters
Non-default hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 16
- learning_rate: 3e-05
- bf16: True
All hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 16
- per_device_eval_batch_size: 8
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 3e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 3
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 6
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: True
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
Training logs
Epoch | Step | Training Loss | NanoClimateFEVER_MaxSim_ndcg@10 | NanoDBPedia_MaxSim_ndcg@10 | NanoFEVER_MaxSim_ndcg@10 | NanoFiQA2018_MaxSim_ndcg@10 | NanoHotpotQA_MaxSim_ndcg@10 | NanoMSMARCO_MaxSim_ndcg@10 | NanoNFCorpus_MaxSim_ndcg@10 | NanoNQ_MaxSim_ndcg@10 | NanoQuoraRetrieval_MaxSim_ndcg@10 | NanoSCIDOCS_MaxSim_ndcg@10 | NanoArguAna_MaxSim_ndcg@10 | NanoSciFact_MaxSim_ndcg@10 | NanoTouche2020_MaxSim_ndcg@10 | NanoBEIR_mean_MaxSim_ndcg@10 |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
0.004 | 20 | 0.0493 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.008 | 40 | 0.0434 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.012 | 60 | 0.0324 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.016 | 80 | 0.0238 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.02 | 100 | 0.0202 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.024 | 120 | 0.0186 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.028 | 140 | 0.0172 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.032 | 160 | 0.0164 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.036 | 180 | 0.0157 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.04 | 200 | 0.0153 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.044 | 220 | 0.0145 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.048 | 240 | 0.014 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.052 | 260 | 0.0138 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.056 | 280 | 0.0135 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.06 | 300 | 0.0132 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.064 | 320 | 0.0129 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.068 | 340 | 0.0126 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.072 | 360 | 0.0123 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.076 | 380 | 0.0122 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.08 | 400 | 0.012 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.084 | 420 | 0.0121 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.088 | 440 | 0.0115 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.092 | 460 | 0.0113 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.096 | 480 | 0.0112 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.1 | 500 | 0.0111 | 0.3085 | 0.6309 | 0.9206 | 0.5303 | 0.8618 | 0.6893 | 0.3703 | 0.7163 | 0.9548 | 0.3885 | 0.4682 | 0.7930 | 0.5982 | 0.6331 |
0.104 | 520 | 0.0109 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.108 | 540 | 0.0109 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.112 | 560 | 0.0109 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.116 | 580 | 0.0105 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.12 | 600 | 0.0102 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.124 | 620 | 0.0104 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.128 | 640 | 0.0103 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.132 | 660 | 0.01 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.136 | 680 | 0.0101 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.14 | 700 | 0.0098 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.144 | 720 | 0.0097 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.148 | 740 | 0.0097 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.152 | 760 | 0.0096 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.156 | 780 | 0.0096 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.16 | 800 | 0.0094 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.164 | 820 | 0.0096 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.168 | 840 | 0.0095 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.172 | 860 | 0.0093 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.176 | 880 | 0.0092 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.18 | 900 | 0.0093 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.184 | 920 | 0.009 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.188 | 940 | 0.009 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.192 | 960 | 0.0089 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.196 | 980 | 0.0089 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.2 | 1000 | 0.0089 | 0.3148 | 0.6586 | 0.9335 | 0.5374 | 0.8810 | 0.6805 | 0.3746 | 0.7368 | 0.9486 | 0.3955 | 0.4824 | 0.8219 | 0.6089 | 0.6442 |
0.204 | 1020 | 0.0088 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.208 | 1040 | 0.0089 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.212 | 1060 | 0.0088 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.216 | 1080 | 0.0086 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.22 | 1100 | 0.0087 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.224 | 1120 | 0.0088 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.228 | 1140 | 0.0086 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.232 | 1160 | 0.0086 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.236 | 1180 | 0.0084 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.24 | 1200 | 0.0086 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.244 | 1220 | 0.0085 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.248 | 1240 | 0.0084 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.252 | 1260 | 0.0084 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.256 | 1280 | 0.0081 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.26 | 1300 | 0.0083 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.264 | 1320 | 0.0084 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.268 | 1340 | 0.0082 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.272 | 1360 | 0.0082 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.276 | 1380 | 0.008 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.28 | 1400 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.284 | 1420 | 0.0079 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.288 | 1440 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.292 | 1460 | 0.0081 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.296 | 1480 | 0.0081 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.3 | 1500 | 0.0079 | 0.3510 | 0.6590 | 0.9285 | 0.5463 | 0.8893 | 0.6853 | 0.3800 | 0.7370 | 0.9513 | 0.3980 | 0.5268 | 0.8268 | 0.6130 | 0.6533 |
0.304 | 1520 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.308 | 1540 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.312 | 1560 | 0.0077 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.316 | 1580 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.32 | 1600 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.324 | 1620 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.328 | 1640 | 0.0078 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.332 | 1660 | 0.0076 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.336 | 1680 | 0.0076 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.34 | 1700 | 0.0077 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.344 | 1720 | 0.0076 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.348 | 1740 | 0.0074 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.352 | 1760 | 0.0074 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.356 | 1780 | 0.0075 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.36 | 1800 | 0.0076 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.364 | 1820 | 0.0075 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.368 | 1840 | 0.0073 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.372 | 1860 | 0.0075 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.376 | 1880 | 0.0073 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.38 | 1900 | 0.0074 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.384 | 1920 | 0.0072 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.388 | 1940 | 0.0072 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.392 | 1960 | 0.0071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.396 | 1980 | 0.0073 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.4 | 2000 | 0.0071 | 0.3551 | 0.6807 | 0.9311 | 0.5340 | 0.8951 | 0.7019 | 0.3767 | 0.7460 | 0.9559 | 0.3912 | 0.5121 | 0.8245 | 0.6058 | 0.6546 |
0.404 | 2020 | 0.0073 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.408 | 2040 | 0.0072 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.412 | 2060 | 0.0071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.416 | 2080 | 0.0073 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.42 | 2100 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.424 | 2120 | 0.0071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.428 | 2140 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.432 | 2160 | 0.0071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.436 | 2180 | 0.0071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.44 | 2200 | 0.007 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.444 | 2220 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.448 | 2240 | 0.0071 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.452 | 2260 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.456 | 2280 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.46 | 2300 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.464 | 2320 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.468 | 2340 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.472 | 2360 | 0.0068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.476 | 2380 | 0.0068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.48 | 2400 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.484 | 2420 | 0.0068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.488 | 2440 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.492 | 2460 | 0.0068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.496 | 2480 | 0.0069 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.5 | 2500 | 0.0068 | 0.3647 | 0.6883 | 0.9435 | 0.5624 | 0.8946 | 0.7065 | 0.3815 | 0.7709 | 0.9658 | 0.3993 | 0.5631 | 0.8371 | 0.6076 | 0.6681 |
0.504 | 2520 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.508 | 2540 | 0.0068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.512 | 2560 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.516 | 2580 | 0.0068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.52 | 2600 | 0.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.524 | 2620 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.528 | 2640 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.532 | 2660 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.536 | 2680 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.54 | 2700 | 0.0068 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.544 | 2720 | 0.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.548 | 2740 | 0.0067 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.552 | 2760 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.556 | 2780 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.56 | 2800 | 0.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.564 | 2820 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.568 | 2840 | 0.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.572 | 2860 | 0.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.576 | 2880 | 0.0065 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.58 | 2900 | 0.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.584 | 2920 | 0.0065 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.588 | 2940 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.592 | 2960 | 0.0066 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.596 | 2980 | 0.0065 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.6 | 3000 | 0.0064 | 0.3585 | 0.7081 | 0.9409 | 0.5474 | 0.8915 | 0.7037 | 0.3796 | 0.7763 | 0.9540 | 0.4038 | 0.5628 | 0.8424 | 0.6042 | 0.6672 |
0.604 | 3020 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.608 | 3040 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.612 | 3060 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.616 | 3080 | 0.0065 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.62 | 3100 | 0.0065 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.624 | 3120 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.628 | 3140 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.632 | 3160 | 0.0062 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.636 | 3180 | 0.0062 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.64 | 3200 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.644 | 3220 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.648 | 3240 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.652 | 3260 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.656 | 3280 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.66 | 3300 | 0.0064 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.664 | 3320 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.668 | 3340 | 0.0061 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.672 | 3360 | 0.0062 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.676 | 3380 | 0.0061 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.68 | 3400 | 0.0063 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.684 | 3420 | 0.006 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.688 | 3440 | 0.0061 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.692 | 3460 | 0.0062 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.696 | 3480 | 0.0062 | - | - | - | - | - | - | - | - | - | - | - | - | - | - |
0.7 | 3500 | 0.0061 | 0.3783 | 0.7080 | 0.9441 | 0.5603 | 0.8902 | 0.7022 | 0.3824 | 0.7780 | 0.9612 | 0.3995 | 0.5414 | 0.8450 | 0.6049 | 0.6689 |







