Dataset Preview
The full dataset viewer is not available; only a preview of the rows is shown, because the dataset generation failed.
Error code: DatasetGenerationError
Exception: ArrowNotImplementedError
Message: Cannot write struct type 'attributes' with no child field to Parquet. Consider adding a dummy child field.
Traceback:
Traceback (most recent call last):
File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1831, in _prepare_split_single
writer.write_table(table)
File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 712, in write_table
self._build_writer(inferred_schema=pa_table.schema)
File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 757, in _build_writer
self.pa_writer = pq.ParquetWriter(
^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/pyarrow/parquet/core.py", line 1070, in __init__
self.writer = _parquet.ParquetWriter(
^^^^^^^^^^^^^^^^^^^^^^^
File "pyarrow/_parquet.pyx", line 2363, in pyarrow._parquet.ParquetWriter.__cinit__
File "pyarrow/error.pxi", line 155, in pyarrow.lib.pyarrow_internal_check_status
File "pyarrow/error.pxi", line 92, in pyarrow.lib.check_status
pyarrow.lib.ArrowNotImplementedError: Cannot write struct type 'attributes' with no child field to Parquet. Consider adding a dummy child field.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1847, in _prepare_split_single
num_examples, num_bytes = writer.finalize()
^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 731, in finalize
self._build_writer(self.schema)
File "/usr/local/lib/python3.12/site-packages/datasets/arrow_writer.py", line 757, in _build_writer
self.pa_writer = pq.ParquetWriter(
^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/pyarrow/parquet/core.py", line 1070, in __init__
self.writer = _parquet.ParquetWriter(
^^^^^^^^^^^^^^^^^^^^^^^
File "pyarrow/_parquet.pyx", line 2363, in pyarrow._parquet.ParquetWriter.__cinit__
File "pyarrow/error.pxi", line 155, in pyarrow.lib.pyarrow_internal_check_status
File "pyarrow/error.pxi", line 92, in pyarrow.lib.check_status
pyarrow.lib.ArrowNotImplementedError: Cannot write struct type 'attributes' with no child field to Parquet. Consider adding a dummy child field.
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1455, in compute_config_parquet_and_info_response
parquet_operations = convert_to_parquet(builder)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1054, in convert_to_parquet
builder.download_and_prepare(
File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 894, in download_and_prepare
self._download_and_prepare(
File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 970, in _download_and_prepare
self._prepare_split(split_generator, **prepare_split_kwargs)
File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1702, in _prepare_split
for job_id, done, content in self._prepare_split_single(
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.12/site-packages/datasets/builder.py", line 1858, in _prepare_split_single
raise DatasetGenerationError("An error occurred while generating the dataset") from e
datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
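The failure traces back to the "attributes" field inside "stateful_callbacks", which is an empty JSON object in every row below; Arrow infers it as a struct type with no child fields, and Parquet cannot encode such a struct. The sketch below is not taken from this dataset's code: it is a minimal, hypothetical reproduction with pyarrow, followed by the workaround the error message itself suggests (adding a dummy child field). The "_placeholder" name and the output path are illustrative only.

```python
import pyarrow as pa
import pyarrow.parquet as pq

# Reproduce: an empty JSON object ({}) is inferred as struct<> -- a struct
# type with zero child fields -- which the Parquet writer rejects.
attributes = pa.array([{}, {}])             # inferred type: struct<>
table = pa.table({"attributes": attributes})
try:
    pq.write_table(table, "out.parquet")
except pa.ArrowNotImplementedError as err:
    print(err)  # Cannot write struct type 'attributes' with no child field ...

# Workaround suggested by the error message: give the struct a dummy child
# field before writing. "_placeholder" is a hypothetical name, not part of
# the dataset schema.
patched = pa.array(
    [{"_placeholder": None}] * len(table),
    type=pa.struct([pa.field("_placeholder", pa.bool_())]),
)
table = table.set_column(0, "attributes", patched)
pq.write_table(table, "out.parquet")  # now succeeds
```

For this dataset, the equivalent fix would be to give "attributes" at least one key (or drop it) in the source JSON before the viewer's Parquet conversion runs.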
Column schema (19 columns):

| column | type |
|---|---|
| best_metric | null |
| best_model_checkpoint | null |
| epoch | float64 |
| eval_steps | int64 |
| global_step | int64 |
| is_hyper_param_search | bool |
| is_local_process_zero | bool |
| is_world_process_zero | bool |
| log_history | list |
| logging_steps | int64 |
| max_steps | int64 |
| num_input_tokens_seen | int64 |
| num_train_epochs | int64 |
| save_steps | int64 |
| stateful_callbacks | dict |
| total_flos | float64 |
| train_batch_size | int64 |
| trial_name | null |
| trial_params | null |
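The column names match the top-level keys of the trainer_state.json file that the transformers Trainer writes alongside its checkpoints, so each preview row appears to be one such training state; this identification is inferred from the field names, not stated by the dataset. A minimal sketch of inspecting one of these files locally (the path is hypothetical):

```python
import json

# Hypothetical local copy of one row's source file; the path is illustrative.
with open("trainer_state.json") as f:
    state = json.load(f)

print(state["global_step"])        # e.g. 232
print(state["num_train_epochs"])   # e.g. 116
print(state["log_history"][-1])    # last logged step: {"epoch": ..., "loss": ..., ...}
print(state["stateful_callbacks"]["TrainerControl"]["attributes"])  # {} -- the empty
# object that triggers the Parquet error above
```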
Row 1:
best_metric: null
best_model_checkpoint: null
epoch: 115.85906
eval_steps: 500
global_step: 232
is_hyper_param_search: false
is_local_process_zero: true
is_world_process_zero: true
log_history:
[
{
"epoch": 0.42953020134228187,
"grad_norm": 10.125,
"learning_rate": 4.166666666666666e-8,
"loss": 0.6557,
"step": 1
},
{
"epoch": 0.8590604026845637,
"grad_norm": 10.3125,
"learning_rate": 8.333333333333333e-8,
"loss": 0.6669,
"step": 2
},
{
"epoch": 1.429530201342282,
"grad_norm": 21.125,
"learning_rate": 1.25e-7,
"loss": 1.3476,
"step": 3
},
{
"epoch": 1.8590604026845639,
"grad_norm": 9.9375,
"learning_rate": 1.6666666666666665e-7,
"loss": 0.6643,
"step": 4
},
{
"epoch": 2.4295302013422817,
"grad_norm": 21,
"learning_rate": 2.0833333333333333e-7,
"loss": 1.3391,
"step": 5
},
{
"epoch": 2.859060402684564,
"grad_norm": 10.125,
"learning_rate": 2.5e-7,
"loss": 0.6637,
"step": 6
},
{
"epoch": 3.4295302013422817,
"grad_norm": 20.125,
"learning_rate": 2.916666666666667e-7,
"loss": 1.314,
"step": 7
},
{
"epoch": 3.859060402684564,
"grad_norm": 10.125,
"learning_rate": 3.333333333333333e-7,
"loss": 0.6712,
"step": 8
},
{
"epoch": 4.429530201342282,
"grad_norm": 20.5,
"learning_rate": 3.75e-7,
"loss": 1.331,
"step": 9
},
{
"epoch": 4.859060402684563,
"grad_norm": 9.9375,
"learning_rate": 4.1666666666666667e-7,
"loss": 0.668,
"step": 10
},
{
"epoch": 5.429530201342282,
"grad_norm": 20.375,
"learning_rate": 4.5833333333333327e-7,
"loss": 1.2948,
"step": 11
},
{
"epoch": 5.859060402684563,
"grad_norm": 10.3125,
"learning_rate": 5e-7,
"loss": 0.6892,
"step": 12
},
{
"epoch": 6.429530201342282,
"grad_norm": 20.25,
"learning_rate": 5.416666666666666e-7,
"loss": 1.2959,
"step": 13
},
{
"epoch": 6.859060402684563,
"grad_norm": 9.625,
"learning_rate": 5.833333333333334e-7,
"loss": 0.6469,
"step": 14
},
{
"epoch": 7.429530201342282,
"grad_norm": 20.625,
"learning_rate": 6.249999999999999e-7,
"loss": 1.3468,
"step": 15
},
{
"epoch": 7.859060402684563,
"grad_norm": 9.375,
"learning_rate": 6.666666666666666e-7,
"loss": 0.6381,
"step": 16
},
{
"epoch": 8.429530201342281,
"grad_norm": 19.375,
"learning_rate": 7.083333333333334e-7,
"loss": 1.2382,
"step": 17
},
{
"epoch": 8.859060402684564,
"grad_norm": 9.5,
"learning_rate": 7.5e-7,
"loss": 0.6261,
"step": 18
},
{
"epoch": 9.429530201342281,
"grad_norm": 19.5,
"learning_rate": 7.916666666666666e-7,
"loss": 1.3184,
"step": 19
},
{
"epoch": 9.859060402684564,
"grad_norm": 9.3125,
"learning_rate": 8.333333333333333e-7,
"loss": 0.6193,
"step": 20
},
{
"epoch": 10.429530201342281,
"grad_norm": 18.875,
"learning_rate": 8.75e-7,
"loss": 1.2506,
"step": 21
},
{
"epoch": 10.859060402684564,
"grad_norm": 9.4375,
"learning_rate": 9.166666666666665e-7,
"loss": 0.6397,
"step": 22
},
{
"epoch": 11.429530201342281,
"grad_norm": 18.25,
"learning_rate": 9.583333333333334e-7,
"loss": 1.2113,
"step": 23
},
{
"epoch": 11.859060402684564,
"grad_norm": 9.1875,
"learning_rate": 0.000001,
"loss": 0.6248,
"step": 24
},
{
"epoch": 12.429530201342281,
"grad_norm": 18,
"learning_rate": 9.951923076923077e-7,
"loss": 1.1951,
"step": 25
},
{
"epoch": 12.859060402684564,
"grad_norm": 9.25,
"learning_rate": 9.903846153846153e-7,
"loss": 0.6005,
"step": 26
},
{
"epoch": 13.429530201342281,
"grad_norm": 18,
"learning_rate": 9.85576923076923e-7,
"loss": 1.1784,
"step": 27
},
{
"epoch": 13.859060402684564,
"grad_norm": 8.6875,
"learning_rate": 9.807692307692306e-7,
"loss": 0.5676,
"step": 28
},
{
"epoch": 14.429530201342281,
"grad_norm": 17.75,
"learning_rate": 9.759615384615384e-7,
"loss": 1.1531,
"step": 29
},
{
"epoch": 14.859060402684564,
"grad_norm": 8.625,
"learning_rate": 9.711538461538462e-7,
"loss": 0.5612,
"step": 30
},
{
"epoch": 15.429530201342281,
"grad_norm": 17,
"learning_rate": 9.663461538461537e-7,
"loss": 1.1161,
"step": 31
},
{
"epoch": 15.859060402684564,
"grad_norm": 7.875,
"learning_rate": 9.615384615384615e-7,
"loss": 0.5301,
"step": 32
},
{
"epoch": 16.42953020134228,
"grad_norm": 15.875,
"learning_rate": 9.567307692307693e-7,
"loss": 1.0799,
"step": 33
},
{
"epoch": 16.859060402684563,
"grad_norm": 7.5,
"learning_rate": 9.519230769230768e-7,
"loss": 0.5175,
"step": 34
},
{
"epoch": 17.42953020134228,
"grad_norm": 15.375,
"learning_rate": 9.471153846153846e-7,
"loss": 1.056,
"step": 35
},
{
"epoch": 17.859060402684563,
"grad_norm": 7.28125,
"learning_rate": 9.423076923076923e-7,
"loss": 0.5176,
"step": 36
},
{
"epoch": 18.42953020134228,
"grad_norm": 14.0625,
"learning_rate": 9.374999999999999e-7,
"loss": 1.0123,
"step": 37
},
{
"epoch": 18.859060402684563,
"grad_norm": 7.0625,
"learning_rate": 9.326923076923077e-7,
"loss": 0.4993,
"step": 38
},
{
"epoch": 19.42953020134228,
"grad_norm": 13.6875,
"learning_rate": 9.278846153846154e-7,
"loss": 0.9754,
"step": 39
},
{
"epoch": 19.859060402684563,
"grad_norm": 6.53125,
"learning_rate": 9.230769230769231e-7,
"loss": 0.4802,
"step": 40
},
{
"epoch": 20.42953020134228,
"grad_norm": 13.6875,
"learning_rate": 9.182692307692307e-7,
"loss": 1.0118,
"step": 41
},
{
"epoch": 20.859060402684563,
"grad_norm": 6.28125,
"learning_rate": 9.134615384615383e-7,
"loss": 0.4703,
"step": 42
},
{
"epoch": 21.42953020134228,
"grad_norm": 12.8125,
"learning_rate": 9.086538461538461e-7,
"loss": 0.9598,
"step": 43
},
{
"epoch": 21.859060402684563,
"grad_norm": 5.90625,
"learning_rate": 9.038461538461538e-7,
"loss": 0.4764,
"step": 44
},
{
"epoch": 22.42953020134228,
"grad_norm": 12.0625,
"learning_rate": 8.990384615384616e-7,
"loss": 0.9186,
"step": 45
},
{
"epoch": 22.859060402684563,
"grad_norm": 5.5625,
"learning_rate": 8.942307692307692e-7,
"loss": 0.4616,
"step": 46
},
{
"epoch": 23.42953020134228,
"grad_norm": 11.3125,
"learning_rate": 8.894230769230768e-7,
"loss": 0.8699,
"step": 47
},
{
"epoch": 23.859060402684563,
"grad_norm": 5.65625,
"learning_rate": 8.846153846153846e-7,
"loss": 0.4505,
"step": 48
},
{
"epoch": 24.42953020134228,
"grad_norm": 10.875,
"learning_rate": 8.798076923076922e-7,
"loss": 0.894,
"step": 49
},
{
"epoch": 24.859060402684563,
"grad_norm": 5.15625,
"learning_rate": 8.75e-7,
"loss": 0.4373,
"step": 50
},
{
"epoch": 25.42953020134228,
"grad_norm": 10.25,
"learning_rate": 8.701923076923077e-7,
"loss": 0.8748,
"step": 51
},
{
"epoch": 25.859060402684563,
"grad_norm": 5.21875,
"learning_rate": 8.653846153846154e-7,
"loss": 0.4253,
"step": 52
},
{
"epoch": 26.42953020134228,
"grad_norm": 9.8125,
"learning_rate": 8.605769230769231e-7,
"loss": 0.8547,
"step": 53
},
{
"epoch": 26.859060402684563,
"grad_norm": 4.75,
"learning_rate": 8.557692307692306e-7,
"loss": 0.4297,
"step": 54
},
{
"epoch": 27.42953020134228,
"grad_norm": 9.3125,
"learning_rate": 8.509615384615384e-7,
"loss": 0.8236,
"step": 55
},
{
"epoch": 27.859060402684563,
"grad_norm": 4.375,
"learning_rate": 8.461538461538461e-7,
"loss": 0.4086,
"step": 56
},
{
"epoch": 28.42953020134228,
"grad_norm": 8.8125,
"learning_rate": 8.413461538461539e-7,
"loss": 0.8382,
"step": 57
},
{
"epoch": 28.859060402684563,
"grad_norm": 4.28125,
"learning_rate": 8.365384615384615e-7,
"loss": 0.4054,
"step": 58
},
{
"epoch": 29.42953020134228,
"grad_norm": 8.4375,
"learning_rate": 8.317307692307692e-7,
"loss": 0.8062,
"step": 59
},
{
"epoch": 29.859060402684563,
"grad_norm": 4.09375,
"learning_rate": 8.269230769230768e-7,
"loss": 0.3939,
"step": 60
},
{
"epoch": 30.42953020134228,
"grad_norm": 7.875,
"learning_rate": 8.221153846153845e-7,
"loss": 0.8,
"step": 61
},
{
"epoch": 30.859060402684563,
"grad_norm": 3.71875,
"learning_rate": 8.173076923076923e-7,
"loss": 0.3967,
"step": 62
},
{
"epoch": 31.42953020134228,
"grad_norm": 7.6875,
"learning_rate": 8.125e-7,
"loss": 0.7915,
"step": 63
},
{
"epoch": 31.859060402684563,
"grad_norm": 3.640625,
"learning_rate": 8.076923076923077e-7,
"loss": 0.3815,
"step": 64
},
{
"epoch": 32.42953020134228,
"grad_norm": 6.96875,
"learning_rate": 8.028846153846154e-7,
"loss": 0.7866,
"step": 65
},
{
"epoch": 32.85906040268456,
"grad_norm": 3.4375,
"learning_rate": 7.98076923076923e-7,
"loss": 0.3837,
"step": 66
},
{
"epoch": 33.42953020134228,
"grad_norm": 7.21875,
"learning_rate": 7.932692307692307e-7,
"loss": 0.794,
"step": 67
},
{
"epoch": 33.85906040268456,
"grad_norm": 3.21875,
"learning_rate": 7.884615384615384e-7,
"loss": 0.3734,
"step": 68
},
{
"epoch": 34.42953020134228,
"grad_norm": 6.625,
"learning_rate": 7.836538461538462e-7,
"loss": 0.7544,
"step": 69
},
{
"epoch": 34.85906040268456,
"grad_norm": 3.03125,
"learning_rate": 7.788461538461538e-7,
"loss": 0.3774,
"step": 70
},
{
"epoch": 35.42953020134228,
"grad_norm": 6.5625,
"learning_rate": 7.740384615384615e-7,
"loss": 0.7432,
"step": 71
},
{
"epoch": 35.85906040268456,
"grad_norm": 3.015625,
"learning_rate": 7.692307692307693e-7,
"loss": 0.3747,
"step": 72
},
{
"epoch": 36.42953020134228,
"grad_norm": 6.5625,
"learning_rate": 7.644230769230768e-7,
"loss": 0.7619,
"step": 73
},
{
"epoch": 36.85906040268456,
"grad_norm": 2.765625,
"learning_rate": 7.596153846153846e-7,
"loss": 0.366,
"step": 74
},
{
"epoch": 37.42953020134228,
"grad_norm": 6.4375,
"learning_rate": 7.548076923076922e-7,
"loss": 0.7233,
"step": 75
},
{
"epoch": 37.85906040268456,
"grad_norm": 2.78125,
"learning_rate": 7.5e-7,
"loss": 0.3688,
"step": 76
},
{
"epoch": 38.42953020134228,
"grad_norm": 5.84375,
"learning_rate": 7.451923076923077e-7,
"loss": 0.71,
"step": 77
},
{
"epoch": 38.85906040268456,
"grad_norm": 2.703125,
"learning_rate": 7.403846153846153e-7,
"loss": 0.3644,
"step": 78
},
{
"epoch": 39.42953020134228,
"grad_norm": 5.59375,
"learning_rate": 7.355769230769231e-7,
"loss": 0.7017,
"step": 79
},
{
"epoch": 39.85906040268456,
"grad_norm": 2.609375,
"learning_rate": 7.307692307692307e-7,
"loss": 0.3492,
"step": 80
},
{
"epoch": 40.42953020134228,
"grad_norm": 5.5625,
"learning_rate": 7.259615384615385e-7,
"loss": 0.7202,
"step": 81
},
{
"epoch": 40.85906040268456,
"grad_norm": 2.515625,
"learning_rate": 7.211538461538461e-7,
"loss": 0.3661,
"step": 82
},
{
"epoch": 41.42953020134228,
"grad_norm": 5.40625,
"learning_rate": 7.163461538461538e-7,
"loss": 0.699,
"step": 83
},
{
"epoch": 41.85906040268456,
"grad_norm": 2.5,
"learning_rate": 7.115384615384616e-7,
"loss": 0.3517,
"step": 84
},
{
"epoch": 42.42953020134228,
"grad_norm": 5.40625,
"learning_rate": 7.067307692307692e-7,
"loss": 0.7218,
"step": 85
},
{
"epoch": 42.85906040268456,
"grad_norm": 2.4375,
"learning_rate": 7.019230769230769e-7,
"loss": 0.3529,
"step": 86
},
{
"epoch": 43.42953020134228,
"grad_norm": 5.375,
"learning_rate": 6.971153846153845e-7,
"loss": 0.7068,
"step": 87
},
{
"epoch": 43.85906040268456,
"grad_norm": 2.4375,
"learning_rate": 6.923076923076922e-7,
"loss": 0.3433,
"step": 88
},
{
"epoch": 44.42953020134228,
"grad_norm": 5.375,
"learning_rate": 6.875e-7,
"loss": 0.6786,
"step": 89
},
{
"epoch": 44.85906040268456,
"grad_norm": 2.40625,
"learning_rate": 6.826923076923076e-7,
"loss": 0.3625,
"step": 90
},
{
"epoch": 45.42953020134228,
"grad_norm": 5.03125,
"learning_rate": 6.778846153846154e-7,
"loss": 0.6834,
"step": 91
},
{
"epoch": 45.85906040268456,
"grad_norm": 2.390625,
"learning_rate": 6.730769230769231e-7,
"loss": 0.3559,
"step": 92
},
{
"epoch": 46.42953020134228,
"grad_norm": 5.0625,
"learning_rate": 6.682692307692307e-7,
"loss": 0.6748,
"step": 93
},
{
"epoch": 46.85906040268456,
"grad_norm": 2.21875,
"learning_rate": 6.634615384615384e-7,
"loss": 0.3429,
"step": 94
},
{
"epoch": 47.42953020134228,
"grad_norm": 4.6875,
"learning_rate": 6.586538461538461e-7,
"loss": 0.669,
"step": 95
},
{
"epoch": 47.85906040268456,
"grad_norm": 2.3125,
"learning_rate": 6.538461538461538e-7,
"loss": 0.3511,
"step": 96
},
{
"epoch": 48.42953020134228,
"grad_norm": 4.90625,
"learning_rate": 6.490384615384615e-7,
"loss": 0.6536,
"step": 97
},
{
"epoch": 48.85906040268456,
"grad_norm": 2.25,
"learning_rate": 6.442307692307693e-7,
"loss": 0.3432,
"step": 98
},
{
"epoch": 49.42953020134228,
"grad_norm": 4.90625,
"learning_rate": 6.394230769230768e-7,
"loss": 0.6758,
"step": 99
},
{
"epoch": 49.85906040268456,
"grad_norm": 2.28125,
"learning_rate": 6.346153846153845e-7,
"loss": 0.3396,
"step": 100
},
{
"epoch": 50.42953020134228,
"grad_norm": 5.125,
"learning_rate": 6.298076923076923e-7,
"loss": 0.682,
"step": 101
},
{
"epoch": 50.85906040268456,
"grad_norm": 2.203125,
"learning_rate": 6.249999999999999e-7,
"loss": 0.3389,
"step": 102
},
{
"epoch": 51.42953020134228,
"grad_norm": 4.6875,
"learning_rate": 6.201923076923077e-7,
"loss": 0.6563,
"step": 103
},
{
"epoch": 51.85906040268456,
"grad_norm": 2.140625,
"learning_rate": 6.153846153846154e-7,
"loss": 0.3416,
"step": 104
},
{
"epoch": 52.42953020134228,
"grad_norm": 4.71875,
"learning_rate": 6.105769230769232e-7,
"loss": 0.6639,
"step": 105
},
{
"epoch": 52.85906040268456,
"grad_norm": 2.171875,
"learning_rate": 6.057692307692307e-7,
"loss": 0.3264,
"step": 106
},
{
"epoch": 53.42953020134228,
"grad_norm": 4.65625,
"learning_rate": 6.009615384615384e-7,
"loss": 0.655,
"step": 107
},
{
"epoch": 53.85906040268456,
"grad_norm": 2.171875,
"learning_rate": 5.961538461538461e-7,
"loss": 0.3341,
"step": 108
},
{
"epoch": 54.42953020134228,
"grad_norm": 4.875,
"learning_rate": 5.913461538461538e-7,
"loss": 0.661,
"step": 109
},
{
"epoch": 54.85906040268456,
"grad_norm": 2.125,
"learning_rate": 5.865384615384616e-7,
"loss": 0.3295,
"step": 110
},
{
"epoch": 55.42953020134228,
"grad_norm": 4.9375,
"learning_rate": 5.817307692307692e-7,
"loss": 0.6776,
"step": 111
},
{
"epoch": 55.85906040268456,
"grad_norm": 2.1875,
"learning_rate": 5.769230769230768e-7,
"loss": 0.3201,
"step": 112
},
{
"epoch": 56.42953020134228,
"grad_norm": 4.75,
"learning_rate": 5.721153846153846e-7,
"loss": 0.6722,
"step": 113
},
{
"epoch": 56.85906040268456,
"grad_norm": 2.03125,
"learning_rate": 5.673076923076922e-7,
"loss": 0.3236,
"step": 114
},
{
"epoch": 57.42953020134228,
"grad_norm": 4.75,
"learning_rate": 5.625e-7,
"loss": 0.6624,
"step": 115
},
{
"epoch": 57.85906040268456,
"grad_norm": 2.125,
"learning_rate": 5.576923076923077e-7,
"loss": 0.3312,
"step": 116
},
{
"epoch": 58.42953020134228,
"grad_norm": 4.5,
"learning_rate": 5.528846153846153e-7,
"loss": 0.6591,
"step": 117
},
{
"epoch": 58.85906040268456,
"grad_norm": 2.125,
"learning_rate": 5.480769230769231e-7,
"loss": 0.331,
"step": 118
},
{
"epoch": 59.42953020134228,
"grad_norm": 4.5,
"learning_rate": 5.432692307692307e-7,
"loss": 0.6645,
"step": 119
},
{
"epoch": 59.85906040268456,
"grad_norm": 2.03125,
"learning_rate": 5.384615384615384e-7,
"loss": 0.3195,
"step": 120
},
{
"epoch": 60.42953020134228,
"grad_norm": 4.71875,
"learning_rate": 5.336538461538461e-7,
"loss": 0.6419,
"step": 121
},
{
"epoch": 60.85906040268456,
"grad_norm": 2.15625,
"learning_rate": 5.288461538461539e-7,
"loss": 0.3401,
"step": 122
},
{
"epoch": 61.42953020134228,
"grad_norm": 4.53125,
"learning_rate": 5.240384615384615e-7,
"loss": 0.6366,
"step": 123
},
{
"epoch": 61.85906040268456,
"grad_norm": 2.03125,
"learning_rate": 5.192307692307692e-7,
"loss": 0.325,
"step": 124
},
{
"epoch": 62.42953020134228,
"grad_norm": 4.5,
"learning_rate": 5.144230769230769e-7,
"loss": 0.6545,
"step": 125
},
{
"epoch": 62.85906040268456,
"grad_norm": 2.03125,
"learning_rate": 5.096153846153845e-7,
"loss": 0.3192,
"step": 126
},
{
"epoch": 63.42953020134228,
"grad_norm": 4.625,
"learning_rate": 5.048076923076923e-7,
"loss": 0.6721,
"step": 127
},
{
"epoch": 63.85906040268456,
"grad_norm": 2,
"learning_rate": 5e-7,
"loss": 0.317,
"step": 128
},
{
"epoch": 64.42953020134229,
"grad_norm": 4.625,
"learning_rate": 4.951923076923076e-7,
"loss": 0.656,
"step": 129
},
{
"epoch": 64.85906040268456,
"grad_norm": 2.015625,
"learning_rate": 4.903846153846153e-7,
"loss": 0.3138,
"step": 130
},
{
"epoch": 65.42953020134229,
"grad_norm": 4.625,
"learning_rate": 4.855769230769231e-7,
"loss": 0.6579,
"step": 131
},
{
"epoch": 65.85906040268456,
"grad_norm": 2.015625,
"learning_rate": 4.807692307692307e-7,
"loss": 0.3155,
"step": 132
},
{
"epoch": 66.42953020134229,
"grad_norm": 4.625,
"learning_rate": 4.759615384615384e-7,
"loss": 0.6563,
"step": 133
},
{
"epoch": 66.85906040268456,
"grad_norm": 2.0625,
"learning_rate": 4.711538461538461e-7,
"loss": 0.3287,
"step": 134
},
{
"epoch": 67.42953020134229,
"grad_norm": 4.46875,
"learning_rate": 4.6634615384615384e-7,
"loss": 0.6236,
"step": 135
},
{
"epoch": 67.85906040268456,
"grad_norm": 2.03125,
"learning_rate": 4.6153846153846156e-7,
"loss": 0.3347,
"step": 136
},
{
"epoch": 68.42953020134229,
"grad_norm": 4.40625,
"learning_rate": 4.567307692307692e-7,
"loss": 0.6196,
"step": 137
},
{
"epoch": 68.85906040268456,
"grad_norm": 2.015625,
"learning_rate": 4.519230769230769e-7,
"loss": 0.3191,
"step": 138
},
{
"epoch": 69.42953020134229,
"grad_norm": 4.40625,
"learning_rate": 4.471153846153846e-7,
"loss": 0.6408,
"step": 139
},
{
"epoch": 69.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 4.423076923076923e-7,
"loss": 0.333,
"step": 140
},
{
"epoch": 70.42953020134229,
"grad_norm": 4.3125,
"learning_rate": 4.375e-7,
"loss": 0.6166,
"step": 141
},
{
"epoch": 70.85906040268456,
"grad_norm": 2.03125,
"learning_rate": 4.326923076923077e-7,
"loss": 0.3227,
"step": 142
},
{
"epoch": 71.42953020134229,
"grad_norm": 4.375,
"learning_rate": 4.278846153846153e-7,
"loss": 0.6322,
"step": 143
},
{
"epoch": 71.85906040268456,
"grad_norm": 2.015625,
"learning_rate": 4.2307692307692304e-7,
"loss": 0.3272,
"step": 144
},
{
"epoch": 72.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 4.1826923076923076e-7,
"loss": 0.6294,
"step": 145
},
{
"epoch": 72.85906040268456,
"grad_norm": 1.984375,
"learning_rate": 4.134615384615384e-7,
"loss": 0.3185,
"step": 146
},
{
"epoch": 73.42953020134229,
"grad_norm": 4.34375,
"learning_rate": 4.0865384615384614e-7,
"loss": 0.6261,
"step": 147
},
{
"epoch": 73.85906040268456,
"grad_norm": 2,
"learning_rate": 4.0384615384615386e-7,
"loss": 0.3194,
"step": 148
},
{
"epoch": 74.42953020134229,
"grad_norm": 4.5625,
"learning_rate": 3.990384615384615e-7,
"loss": 0.6478,
"step": 149
},
{
"epoch": 74.85906040268456,
"grad_norm": 2.03125,
"learning_rate": 3.942307692307692e-7,
"loss": 0.3218,
"step": 150
},
{
"epoch": 75.42953020134229,
"grad_norm": 4.71875,
"learning_rate": 3.894230769230769e-7,
"loss": 0.6065,
"step": 151
},
{
"epoch": 75.85906040268456,
"grad_norm": 1.9921875,
"learning_rate": 3.8461538461538463e-7,
"loss": 0.32,
"step": 152
},
{
"epoch": 76.42953020134229,
"grad_norm": 4.6875,
"learning_rate": 3.798076923076923e-7,
"loss": 0.6607,
"step": 153
},
{
"epoch": 76.85906040268456,
"grad_norm": 1.984375,
"learning_rate": 3.75e-7,
"loss": 0.3142,
"step": 154
},
{
"epoch": 77.42953020134229,
"grad_norm": 4.59375,
"learning_rate": 3.701923076923077e-7,
"loss": 0.6432,
"step": 155
},
{
"epoch": 77.85906040268456,
"grad_norm": 1.984375,
"learning_rate": 3.6538461538461534e-7,
"loss": 0.318,
"step": 156
},
{
"epoch": 78.42953020134229,
"grad_norm": 4.3125,
"learning_rate": 3.6057692307692306e-7,
"loss": 0.6283,
"step": 157
},
{
"epoch": 78.85906040268456,
"grad_norm": 1.9375,
"learning_rate": 3.557692307692308e-7,
"loss": 0.3121,
"step": 158
},
{
"epoch": 79.42953020134229,
"grad_norm": 4.53125,
"learning_rate": 3.5096153846153844e-7,
"loss": 0.6369,
"step": 159
},
{
"epoch": 79.85906040268456,
"grad_norm": 2.03125,
"learning_rate": 3.461538461538461e-7,
"loss": 0.3264,
"step": 160
},
{
"epoch": 80.42953020134229,
"grad_norm": 4.1875,
"learning_rate": 3.413461538461538e-7,
"loss": 0.6223,
"step": 161
},
{
"epoch": 80.85906040268456,
"grad_norm": 1.9609375,
"learning_rate": 3.3653846153846154e-7,
"loss": 0.3123,
"step": 162
},
{
"epoch": 81.42953020134229,
"grad_norm": 4.375,
"learning_rate": 3.317307692307692e-7,
"loss": 0.6508,
"step": 163
},
{
"epoch": 81.85906040268456,
"grad_norm": 2.015625,
"learning_rate": 3.269230769230769e-7,
"loss": 0.3138,
"step": 164
},
{
"epoch": 82.42953020134229,
"grad_norm": 4.46875,
"learning_rate": 3.2211538461538464e-7,
"loss": 0.6216,
"step": 165
},
{
"epoch": 82.85906040268456,
"grad_norm": 2,
"learning_rate": 3.1730769230769225e-7,
"loss": 0.3169,
"step": 166
},
{
"epoch": 83.42953020134229,
"grad_norm": 4.09375,
"learning_rate": 3.1249999999999997e-7,
"loss": 0.6275,
"step": 167
},
{
"epoch": 83.85906040268456,
"grad_norm": 1.953125,
"learning_rate": 3.076923076923077e-7,
"loss": 0.3122,
"step": 168
},
{
"epoch": 84.42953020134229,
"grad_norm": 4.40625,
"learning_rate": 3.0288461538461536e-7,
"loss": 0.651,
"step": 169
},
{
"epoch": 84.85906040268456,
"grad_norm": 2.03125,
"learning_rate": 2.980769230769231e-7,
"loss": 0.3156,
"step": 170
},
{
"epoch": 85.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 2.932692307692308e-7,
"loss": 0.6009,
"step": 171
},
{
"epoch": 85.85906040268456,
"grad_norm": 1.9609375,
"learning_rate": 2.884615384615384e-7,
"loss": 0.3202,
"step": 172
},
{
"epoch": 86.42953020134229,
"grad_norm": 4.34375,
"learning_rate": 2.836538461538461e-7,
"loss": 0.6324,
"step": 173
},
{
"epoch": 86.85906040268456,
"grad_norm": 1.9765625,
"learning_rate": 2.7884615384615384e-7,
"loss": 0.3263,
"step": 174
},
{
"epoch": 87.42953020134229,
"grad_norm": 4.6875,
"learning_rate": 2.7403846153846156e-7,
"loss": 0.6188,
"step": 175
},
{
"epoch": 87.85906040268456,
"grad_norm": 2.015625,
"learning_rate": 2.692307692307692e-7,
"loss": 0.3217,
"step": 176
},
{
"epoch": 88.42953020134229,
"grad_norm": 4.53125,
"learning_rate": 2.6442307692307694e-7,
"loss": 0.6405,
"step": 177
},
{
"epoch": 88.85906040268456,
"grad_norm": 1.9453125,
"learning_rate": 2.596153846153846e-7,
"loss": 0.3117,
"step": 178
},
{
"epoch": 89.42953020134229,
"grad_norm": 4.46875,
"learning_rate": 2.5480769230769227e-7,
"loss": 0.6075,
"step": 179
},
{
"epoch": 89.85906040268456,
"grad_norm": 1.9921875,
"learning_rate": 2.5e-7,
"loss": 0.3171,
"step": 180
},
{
"epoch": 90.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 2.4519230769230765e-7,
"loss": 0.6599,
"step": 181
},
{
"epoch": 90.85906040268456,
"grad_norm": 1.96875,
"learning_rate": 2.4038461538461537e-7,
"loss": 0.3044,
"step": 182
},
{
"epoch": 91.42953020134229,
"grad_norm": 4.5,
"learning_rate": 2.3557692307692306e-7,
"loss": 0.6261,
"step": 183
},
{
"epoch": 91.85906040268456,
"grad_norm": 1.9375,
"learning_rate": 2.3076923076923078e-7,
"loss": 0.309,
"step": 184
},
{
"epoch": 92.42953020134229,
"grad_norm": 4.34375,
"learning_rate": 2.2596153846153845e-7,
"loss": 0.613,
"step": 185
},
{
"epoch": 92.85906040268456,
"grad_norm": 2,
"learning_rate": 2.2115384615384614e-7,
"loss": 0.3183,
"step": 186
},
{
"epoch": 93.42953020134229,
"grad_norm": 4.34375,
"learning_rate": 2.1634615384615386e-7,
"loss": 0.6246,
"step": 187
},
{
"epoch": 93.85906040268456,
"grad_norm": 1.953125,
"learning_rate": 2.1153846153846152e-7,
"loss": 0.3089,
"step": 188
},
{
"epoch": 94.42953020134229,
"grad_norm": 4.6875,
"learning_rate": 2.067307692307692e-7,
"loss": 0.6457,
"step": 189
},
{
"epoch": 94.85906040268456,
"grad_norm": 1.921875,
"learning_rate": 2.0192307692307693e-7,
"loss": 0.3103,
"step": 190
},
{
"epoch": 95.42953020134229,
"grad_norm": 4.34375,
"learning_rate": 1.971153846153846e-7,
"loss": 0.6053,
"step": 191
},
{
"epoch": 95.85906040268456,
"grad_norm": 2,
"learning_rate": 1.9230769230769231e-7,
"loss": 0.3124,
"step": 192
},
{
"epoch": 96.42953020134229,
"grad_norm": 4.5,
"learning_rate": 1.875e-7,
"loss": 0.6622,
"step": 193
},
{
"epoch": 96.85906040268456,
"grad_norm": 1.8984375,
"learning_rate": 1.8269230769230767e-7,
"loss": 0.3082,
"step": 194
},
{
"epoch": 97.42953020134229,
"grad_norm": 4.625,
"learning_rate": 1.778846153846154e-7,
"loss": 0.6421,
"step": 195
},
{
"epoch": 97.85906040268456,
"grad_norm": 1.9296875,
"learning_rate": 1.7307692307692305e-7,
"loss": 0.3034,
"step": 196
},
{
"epoch": 98.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 1.6826923076923077e-7,
"loss": 0.6455,
"step": 197
},
{
"epoch": 98.85906040268456,
"grad_norm": 1.921875,
"learning_rate": 1.6346153846153846e-7,
"loss": 0.3193,
"step": 198
},
{
"epoch": 99.42953020134229,
"grad_norm": 4.40625,
"learning_rate": 1.5865384615384613e-7,
"loss": 0.6141,
"step": 199
},
{
"epoch": 99.85906040268456,
"grad_norm": 1.9296875,
"learning_rate": 1.5384615384615385e-7,
"loss": 0.3151,
"step": 200
},
{
"epoch": 100.42953020134229,
"grad_norm": 4.46875,
"learning_rate": 1.4903846153846154e-7,
"loss": 0.6238,
"step": 201
},
{
"epoch": 100.85906040268456,
"grad_norm": 1.9140625,
"learning_rate": 1.442307692307692e-7,
"loss": 0.3075,
"step": 202
},
{
"epoch": 101.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 1.3942307692307692e-7,
"loss": 0.6047,
"step": 203
},
{
"epoch": 101.85906040268456,
"grad_norm": 1.96875,
"learning_rate": 1.346153846153846e-7,
"loss": 0.3161,
"step": 204
},
{
"epoch": 102.42953020134229,
"grad_norm": 4.25,
"learning_rate": 1.298076923076923e-7,
"loss": 0.6286,
"step": 205
},
{
"epoch": 102.85906040268456,
"grad_norm": 1.9765625,
"learning_rate": 1.25e-7,
"loss": 0.3143,
"step": 206
},
{
"epoch": 103.42953020134229,
"grad_norm": 4.3125,
"learning_rate": 1.2019230769230769e-7,
"loss": 0.589,
"step": 207
},
{
"epoch": 103.85906040268456,
"grad_norm": 1.8984375,
"learning_rate": 1.1538461538461539e-7,
"loss": 0.3129,
"step": 208
},
{
"epoch": 104.42953020134229,
"grad_norm": 4.53125,
"learning_rate": 1.1057692307692307e-7,
"loss": 0.6231,
"step": 209
},
{
"epoch": 104.85906040268456,
"grad_norm": 1.9140625,
"learning_rate": 1.0576923076923076e-7,
"loss": 0.3118,
"step": 210
},
{
"epoch": 105.42953020134229,
"grad_norm": 4.40625,
"learning_rate": 1.0096153846153847e-7,
"loss": 0.6417,
"step": 211
},
{
"epoch": 105.85906040268456,
"grad_norm": 2.015625,
"learning_rate": 9.615384615384616e-8,
"loss": 0.304,
"step": 212
},
{
"epoch": 106.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 9.134615384615383e-8,
"loss": 0.6409,
"step": 213
},
{
"epoch": 106.85906040268456,
"grad_norm": 1.9453125,
"learning_rate": 8.653846153846153e-8,
"loss": 0.3041,
"step": 214
},
{
"epoch": 107.42953020134229,
"grad_norm": 4.5,
"learning_rate": 8.173076923076923e-8,
"loss": 0.6568,
"step": 215
},
{
"epoch": 107.85906040268456,
"grad_norm": 1.921875,
"learning_rate": 7.692307692307692e-8,
"loss": 0.3134,
"step": 216
},
{
"epoch": 108.42953020134229,
"grad_norm": 4.53125,
"learning_rate": 7.21153846153846e-8,
"loss": 0.6207,
"step": 217
},
{
"epoch": 108.85906040268456,
"grad_norm": 1.96875,
"learning_rate": 6.73076923076923e-8,
"loss": 0.311,
"step": 218
},
{
"epoch": 109.42953020134229,
"grad_norm": 4.6875,
"learning_rate": 6.25e-8,
"loss": 0.6236,
"step": 219
},
{
"epoch": 109.85906040268456,
"grad_norm": 1.9296875,
"learning_rate": 5.7692307692307695e-8,
"loss": 0.3157,
"step": 220
},
{
"epoch": 110.42953020134229,
"grad_norm": 4.1875,
"learning_rate": 5.288461538461538e-8,
"loss": 0.6104,
"step": 221
},
{
"epoch": 110.85906040268456,
"grad_norm": 2.03125,
"learning_rate": 4.807692307692308e-8,
"loss": 0.3118,
"step": 222
},
{
"epoch": 111.42953020134229,
"grad_norm": 4.6875,
"learning_rate": 4.326923076923076e-8,
"loss": 0.6517,
"step": 223
},
{
"epoch": 111.85906040268456,
"grad_norm": 1.921875,
"learning_rate": 3.846153846153846e-8,
"loss": 0.2999,
"step": 224
},
{
"epoch": 112.42953020134229,
"grad_norm": 4.46875,
"learning_rate": 3.365384615384615e-8,
"loss": 0.6313,
"step": 225
},
{
"epoch": 112.85906040268456,
"grad_norm": 2.03125,
"learning_rate": 2.8846153846153848e-8,
"loss": 0.3222,
"step": 226
},
{
"epoch": 113.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 2.403846153846154e-8,
"loss": 0.634,
"step": 227
},
{
"epoch": 113.85906040268456,
"grad_norm": 1.9609375,
"learning_rate": 1.923076923076923e-8,
"loss": 0.3112,
"step": 228
},
{
"epoch": 114.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 1.4423076923076924e-8,
"loss": 0.6463,
"step": 229
},
{
"epoch": 114.85906040268456,
"grad_norm": 1.9375,
"learning_rate": 9.615384615384615e-9,
"loss": 0.3102,
"step": 230
},
{
"epoch": 115.42953020134229,
"grad_norm": 4.40625,
"learning_rate": 4.807692307692308e-9,
"loss": 0.6202,
"step": 231
},
{
"epoch": 115.85906040268456,
"grad_norm": 2.015625,
"learning_rate": 0,
"loss": 0.316,
"step": 232
}
]
logging_steps: 1
max_steps: 232
num_input_tokens_seen: 0
num_train_epochs: 116
save_steps: 10
stateful_callbacks:
{
"TrainerControl": {
"args": {
"should_epoch_stop": false,
"should_evaluate": false,
"should_log": false,
"should_save": true,
"should_training_stop": true
},
"attributes": {}
}
}
total_flos: 209,921,899,528,978,430
train_batch_size: 1
trial_name: null
trial_params: null
Row 2:
best_metric: null
best_model_checkpoint: null
epoch: 115.85906
eval_steps: 500
global_step: 232
is_hyper_param_search: false
is_local_process_zero: true
is_world_process_zero: true
log_history:
[
{
"epoch": 0.42953020134228187,
"grad_norm": 8.8125,
"learning_rate": 4.166666666666666e-8,
"loss": 0.712,
"step": 1
},
{
"epoch": 0.8590604026845637,
"grad_norm": 8.9375,
"learning_rate": 8.333333333333333e-8,
"loss": 0.7198,
"step": 2
},
{
"epoch": 1.429530201342282,
"grad_norm": 17.375,
"learning_rate": 1.25e-7,
"loss": 1.4769,
"step": 3
},
{
"epoch": 1.8590604026845639,
"grad_norm": 8.5,
"learning_rate": 1.6666666666666665e-7,
"loss": 0.6968,
"step": 4
},
{
"epoch": 2.4295302013422817,
"grad_norm": 17.5,
"learning_rate": 2.0833333333333333e-7,
"loss": 1.472,
"step": 5
},
{
"epoch": 2.859060402684564,
"grad_norm": 8.625,
"learning_rate": 2.5e-7,
"loss": 0.693,
"step": 6
},
{
"epoch": 3.4295302013422817,
"grad_norm": 18,
"learning_rate": 2.916666666666667e-7,
"loss": 1.4794,
"step": 7
},
{
"epoch": 3.859060402684564,
"grad_norm": 8.625,
"learning_rate": 3.333333333333333e-7,
"loss": 0.7055,
"step": 8
},
{
"epoch": 4.429530201342282,
"grad_norm": 17.375,
"learning_rate": 3.75e-7,
"loss": 1.4165,
"step": 9
},
{
"epoch": 4.859060402684563,
"grad_norm": 8.75,
"learning_rate": 4.1666666666666667e-7,
"loss": 0.7142,
"step": 10
},
{
"epoch": 5.429530201342282,
"grad_norm": 17.375,
"learning_rate": 4.5833333333333327e-7,
"loss": 1.4198,
"step": 11
},
{
"epoch": 5.859060402684563,
"grad_norm": 8.5,
"learning_rate": 5e-7,
"loss": 0.7148,
"step": 12
},
{
"epoch": 6.429530201342282,
"grad_norm": 17.875,
"learning_rate": 5.416666666666666e-7,
"loss": 1.4431,
"step": 13
},
{
"epoch": 6.859060402684563,
"grad_norm": 8.5,
"learning_rate": 5.833333333333334e-7,
"loss": 0.7109,
"step": 14
},
{
"epoch": 7.429530201342282,
"grad_norm": 16.875,
"learning_rate": 6.249999999999999e-7,
"loss": 1.4174,
"step": 15
},
{
"epoch": 7.859060402684563,
"grad_norm": 8.3125,
"learning_rate": 6.666666666666666e-7,
"loss": 0.6954,
"step": 16
},
{
"epoch": 8.429530201342281,
"grad_norm": 18,
"learning_rate": 7.083333333333334e-7,
"loss": 1.4271,
"step": 17
},
{
"epoch": 8.859060402684564,
"grad_norm": 8.125,
"learning_rate": 7.5e-7,
"loss": 0.6788,
"step": 18
},
{
"epoch": 9.429530201342281,
"grad_norm": 16.5,
"learning_rate": 7.916666666666666e-7,
"loss": 1.3755,
"step": 19
},
{
"epoch": 9.859060402684564,
"grad_norm": 8.3125,
"learning_rate": 8.333333333333333e-7,
"loss": 0.7031,
"step": 20
},
{
"epoch": 10.429530201342281,
"grad_norm": 16.125,
"learning_rate": 8.75e-7,
"loss": 1.3792,
"step": 21
},
{
"epoch": 10.859060402684564,
"grad_norm": 7.75,
"learning_rate": 9.166666666666665e-7,
"loss": 0.6857,
"step": 22
},
{
"epoch": 11.429530201342281,
"grad_norm": 15.5625,
"learning_rate": 9.583333333333334e-7,
"loss": 1.3574,
"step": 23
},
{
"epoch": 11.859060402684564,
"grad_norm": 7.875,
"learning_rate": 0.000001,
"loss": 0.6794,
"step": 24
},
{
"epoch": 12.429530201342281,
"grad_norm": 16.5,
"learning_rate": 9.951923076923077e-7,
"loss": 1.3683,
"step": 25
},
{
"epoch": 12.859060402684564,
"grad_norm": 7.5625,
"learning_rate": 9.903846153846153e-7,
"loss": 0.6649,
"step": 26
},
{
"epoch": 13.429530201342281,
"grad_norm": 15.4375,
"learning_rate": 9.85576923076923e-7,
"loss": 1.2809,
"step": 27
},
{
"epoch": 13.859060402684564,
"grad_norm": 7.59375,
"learning_rate": 9.807692307692306e-7,
"loss": 0.6508,
"step": 28
},
{
"epoch": 14.429530201342281,
"grad_norm": 14.6875,
"learning_rate": 9.759615384615384e-7,
"loss": 1.3003,
"step": 29
},
{
"epoch": 14.859060402684564,
"grad_norm": 7.21875,
"learning_rate": 9.711538461538462e-7,
"loss": 0.6247,
"step": 30
},
{
"epoch": 15.429530201342281,
"grad_norm": 14,
"learning_rate": 9.663461538461537e-7,
"loss": 1.2513,
"step": 31
},
{
"epoch": 15.859060402684564,
"grad_norm": 7.0625,
"learning_rate": 9.615384615384615e-7,
"loss": 0.6358,
"step": 32
},
{
"epoch": 16.42953020134228,
"grad_norm": 13.1875,
"learning_rate": 9.567307692307693e-7,
"loss": 1.1943,
"step": 33
},
{
"epoch": 16.859060402684563,
"grad_norm": 6.65625,
"learning_rate": 9.519230769230768e-7,
"loss": 0.6106,
"step": 34
},
{
"epoch": 17.42953020134228,
"grad_norm": 13.1875,
"learning_rate": 9.471153846153846e-7,
"loss": 1.1643,
"step": 35
},
{
"epoch": 17.859060402684563,
"grad_norm": 6.4375,
"learning_rate": 9.423076923076923e-7,
"loss": 0.594,
"step": 36
},
{
"epoch": 18.42953020134228,
"grad_norm": 11.9375,
"learning_rate": 9.374999999999999e-7,
"loss": 1.1929,
"step": 37
},
{
"epoch": 18.859060402684563,
"grad_norm": 6.125,
"learning_rate": 9.326923076923077e-7,
"loss": 0.5921,
"step": 38
},
{
"epoch": 19.42953020134228,
"grad_norm": 12,
"learning_rate": 9.278846153846154e-7,
"loss": 1.1463,
"step": 39
},
{
"epoch": 19.859060402684563,
"grad_norm": 5.8125,
"learning_rate": 9.230769230769231e-7,
"loss": 0.5766,
"step": 40
},
{
"epoch": 20.42953020134228,
"grad_norm": 11,
"learning_rate": 9.182692307692307e-7,
"loss": 1.1282,
"step": 41
},
{
"epoch": 20.859060402684563,
"grad_norm": 5.40625,
"learning_rate": 9.134615384615383e-7,
"loss": 0.5585,
"step": 42
},
{
"epoch": 21.42953020134228,
"grad_norm": 11,
"learning_rate": 9.086538461538461e-7,
"loss": 1.12,
"step": 43
},
{
"epoch": 21.859060402684563,
"grad_norm": 5.375,
"learning_rate": 9.038461538461538e-7,
"loss": 0.5425,
"step": 44
},
{
"epoch": 22.42953020134228,
"grad_norm": 10.3125,
"learning_rate": 8.990384615384616e-7,
"loss": 1.1169,
"step": 45
},
{
"epoch": 22.859060402684563,
"grad_norm": 5,
"learning_rate": 8.942307692307692e-7,
"loss": 0.5358,
"step": 46
},
{
"epoch": 23.42953020134228,
"grad_norm": 10.375,
"learning_rate": 8.894230769230768e-7,
"loss": 1.0859,
"step": 47
},
{
"epoch": 23.859060402684563,
"grad_norm": 4.625,
"learning_rate": 8.846153846153846e-7,
"loss": 0.5327,
"step": 48
},
{
"epoch": 24.42953020134228,
"grad_norm": 10,
"learning_rate": 8.798076923076922e-7,
"loss": 1.102,
"step": 49
},
{
"epoch": 24.859060402684563,
"grad_norm": 4.84375,
"learning_rate": 8.75e-7,
"loss": 0.5151,
"step": 50
},
{
"epoch": 25.42953020134228,
"grad_norm": 9.0625,
"learning_rate": 8.701923076923077e-7,
"loss": 1.0243,
"step": 51
},
{
"epoch": 25.859060402684563,
"grad_norm": 4.46875,
"learning_rate": 8.653846153846154e-7,
"loss": 0.5328,
"step": 52
},
{
"epoch": 26.42953020134228,
"grad_norm": 8.8125,
"learning_rate": 8.605769230769231e-7,
"loss": 1.0692,
"step": 53
},
{
"epoch": 26.859060402684563,
"grad_norm": 4.1875,
"learning_rate": 8.557692307692306e-7,
"loss": 0.519,
"step": 54
},
{
"epoch": 27.42953020134228,
"grad_norm": 8.3125,
"learning_rate": 8.509615384615384e-7,
"loss": 0.9828,
"step": 55
},
{
"epoch": 27.859060402684563,
"grad_norm": 3.9375,
"learning_rate": 8.461538461538461e-7,
"loss": 0.5317,
"step": 56
},
{
"epoch": 28.42953020134228,
"grad_norm": 7.9375,
"learning_rate": 8.413461538461539e-7,
"loss": 0.9731,
"step": 57
},
{
"epoch": 28.859060402684563,
"grad_norm": 3.828125,
"learning_rate": 8.365384615384615e-7,
"loss": 0.5123,
"step": 58
},
{
"epoch": 29.42953020134228,
"grad_norm": 7.75,
"learning_rate": 8.317307692307692e-7,
"loss": 1.0084,
"step": 59
},
{
"epoch": 29.859060402684563,
"grad_norm": 3.5,
"learning_rate": 8.269230769230768e-7,
"loss": 0.4885,
"step": 60
},
{
"epoch": 30.42953020134228,
"grad_norm": 7.1875,
"learning_rate": 8.221153846153845e-7,
"loss": 1.0129,
"step": 61
},
{
"epoch": 30.859060402684563,
"grad_norm": 3.375,
"learning_rate": 8.173076923076923e-7,
"loss": 0.4978,
"step": 62
},
{
"epoch": 31.42953020134228,
"grad_norm": 6.875,
"learning_rate": 8.125e-7,
"loss": 1.0041,
"step": 63
},
{
"epoch": 31.859060402684563,
"grad_norm": 3.328125,
"learning_rate": 8.076923076923077e-7,
"loss": 0.4848,
"step": 64
},
{
"epoch": 32.42953020134228,
"grad_norm": 6.59375,
"learning_rate": 8.028846153846154e-7,
"loss": 1.0018,
"step": 65
},
{
"epoch": 32.85906040268456,
"grad_norm": 3.21875,
"learning_rate": 7.98076923076923e-7,
"loss": 0.4902,
"step": 66
},
{
"epoch": 33.42953020134228,
"grad_norm": 6.53125,
"learning_rate": 7.932692307692307e-7,
"loss": 0.9407,
"step": 67
},
{
"epoch": 33.85906040268456,
"grad_norm": 3.046875,
"learning_rate": 7.884615384615384e-7,
"loss": 0.4741,
"step": 68
},
{
"epoch": 34.42953020134228,
"grad_norm": 6.09375,
"learning_rate": 7.836538461538462e-7,
"loss": 0.9498,
"step": 69
},
{
"epoch": 34.85906040268456,
"grad_norm": 2.953125,
"learning_rate": 7.788461538461538e-7,
"loss": 0.4865,
"step": 70
},
{
"epoch": 35.42953020134228,
"grad_norm": 6.09375,
"learning_rate": 7.740384615384615e-7,
"loss": 0.9352,
"step": 71
},
{
"epoch": 35.85906040268456,
"grad_norm": 2.8125,
"learning_rate": 7.692307692307693e-7,
"loss": 0.4765,
"step": 72
},
{
"epoch": 36.42953020134228,
"grad_norm": 5.875,
"learning_rate": 7.644230769230768e-7,
"loss": 0.9345,
"step": 73
},
{
"epoch": 36.85906040268456,
"grad_norm": 2.625,
"learning_rate": 7.596153846153846e-7,
"loss": 0.4653,
"step": 74
},
{
"epoch": 37.42953020134228,
"grad_norm": 6,
"learning_rate": 7.548076923076922e-7,
"loss": 0.9161,
"step": 75
},
{
"epoch": 37.85906040268456,
"grad_norm": 2.71875,
"learning_rate": 7.5e-7,
"loss": 0.4687,
"step": 76
},
{
"epoch": 38.42953020134228,
"grad_norm": 5.4375,
"learning_rate": 7.451923076923077e-7,
"loss": 0.9321,
"step": 77
},
{
"epoch": 38.85906040268456,
"grad_norm": 2.640625,
"learning_rate": 7.403846153846153e-7,
"loss": 0.4636,
"step": 78
},
{
"epoch": 39.42953020134228,
"grad_norm": 5.28125,
"learning_rate": 7.355769230769231e-7,
"loss": 0.8965,
"step": 79
},
{
"epoch": 39.85906040268456,
"grad_norm": 2.578125,
"learning_rate": 7.307692307692307e-7,
"loss": 0.454,
"step": 80
},
{
"epoch": 40.42953020134228,
"grad_norm": 5.4375,
"learning_rate": 7.259615384615385e-7,
"loss": 0.9177,
"step": 81
},
{
"epoch": 40.85906040268456,
"grad_norm": 2.5,
"learning_rate": 7.211538461538461e-7,
"loss": 0.4784,
"step": 82
},
{
"epoch": 41.42953020134228,
"grad_norm": 5.09375,
"learning_rate": 7.163461538461538e-7,
"loss": 0.8982,
"step": 83
},
{
"epoch": 41.85906040268456,
"grad_norm": 2.5,
"learning_rate": 7.115384615384616e-7,
"loss": 0.445,
"step": 84
},
{
"epoch": 42.42953020134228,
"grad_norm": 5.34375,
"learning_rate": 7.067307692307692e-7,
"loss": 0.98,
"step": 85
},
{
"epoch": 42.85906040268456,
"grad_norm": 2.453125,
"learning_rate": 7.019230769230769e-7,
"loss": 0.4438,
"step": 86
},
{
"epoch": 43.42953020134228,
"grad_norm": 5.40625,
"learning_rate": 6.971153846153845e-7,
"loss": 0.9044,
"step": 87
},
{
"epoch": 43.85906040268456,
"grad_norm": 2.375,
"learning_rate": 6.923076923076922e-7,
"loss": 0.4486,
"step": 88
},
{
"epoch": 44.42953020134228,
"grad_norm": 5.09375,
"learning_rate": 6.875e-7,
"loss": 0.8871,
"step": 89
},
{
"epoch": 44.85906040268456,
"grad_norm": 2.375,
"learning_rate": 6.826923076923076e-7,
"loss": 0.4462,
"step": 90
},
{
"epoch": 45.42953020134228,
"grad_norm": 5.125,
"learning_rate": 6.778846153846154e-7,
"loss": 0.9203,
"step": 91
},
{
"epoch": 45.85906040268456,
"grad_norm": 2.3125,
"learning_rate": 6.730769230769231e-7,
"loss": 0.4525,
"step": 92
},
{
"epoch": 46.42953020134228,
"grad_norm": 4.96875,
"learning_rate": 6.682692307692307e-7,
"loss": 0.8813,
"step": 93
},
{
"epoch": 46.85906040268456,
"grad_norm": 2.296875,
"learning_rate": 6.634615384615384e-7,
"loss": 0.4475,
"step": 94
},
{
"epoch": 47.42953020134228,
"grad_norm": 5,
"learning_rate": 6.586538461538461e-7,
"loss": 0.8673,
"step": 95
},
{
"epoch": 47.85906040268456,
"grad_norm": 2.375,
"learning_rate": 6.538461538461538e-7,
"loss": 0.451,
"step": 96
},
{
"epoch": 48.42953020134228,
"grad_norm": 5.28125,
"learning_rate": 6.490384615384615e-7,
"loss": 0.8849,
"step": 97
},
{
"epoch": 48.85906040268456,
"grad_norm": 2.234375,
"learning_rate": 6.442307692307693e-7,
"loss": 0.4547,
"step": 98
},
{
"epoch": 49.42953020134228,
"grad_norm": 5.125,
"learning_rate": 6.394230769230768e-7,
"loss": 0.919,
"step": 99
},
{
"epoch": 49.85906040268456,
"grad_norm": 2.21875,
"learning_rate": 6.346153846153845e-7,
"loss": 0.4363,
"step": 100
},
{
"epoch": 50.42953020134228,
"grad_norm": 5.09375,
"learning_rate": 6.298076923076923e-7,
"loss": 0.8571,
"step": 101
},
{
"epoch": 50.85906040268456,
"grad_norm": 2.21875,
"learning_rate": 6.249999999999999e-7,
"loss": 0.45,
"step": 102
},
{
"epoch": 51.42953020134228,
"grad_norm": 5.0625,
"learning_rate": 6.201923076923077e-7,
"loss": 0.8721,
"step": 103
},
{
"epoch": 51.85906040268456,
"grad_norm": 2.28125,
"learning_rate": 6.153846153846154e-7,
"loss": 0.4487,
"step": 104
},
{
"epoch": 52.42953020134228,
"grad_norm": 4.6875,
"learning_rate": 6.105769230769232e-7,
"loss": 0.8641,
"step": 105
},
{
"epoch": 52.85906040268456,
"grad_norm": 2.234375,
"learning_rate": 6.057692307692307e-7,
"loss": 0.457,
"step": 106
},
{
"epoch": 53.42953020134228,
"grad_norm": 4.9375,
"learning_rate": 6.009615384615384e-7,
"loss": 0.8778,
"step": 107
},
{
"epoch": 53.85906040268456,
"grad_norm": 2.15625,
"learning_rate": 5.961538461538461e-7,
"loss": 0.4405,
"step": 108
},
{
"epoch": 54.42953020134228,
"grad_norm": 4.71875,
"learning_rate": 5.913461538461538e-7,
"loss": 0.8647,
"step": 109
},
{
"epoch": 54.85906040268456,
"grad_norm": 2.140625,
"learning_rate": 5.865384615384616e-7,
"loss": 0.4415,
"step": 110
},
{
"epoch": 55.42953020134228,
"grad_norm": 4.71875,
"learning_rate": 5.817307692307692e-7,
"loss": 0.8494,
"step": 111
},
{
"epoch": 55.85906040268456,
"grad_norm": 2.1875,
"learning_rate": 5.769230769230768e-7,
"loss": 0.4379,
"step": 112
},
{
"epoch": 56.42953020134228,
"grad_norm": 4.6875,
"learning_rate": 5.721153846153846e-7,
"loss": 0.9021,
"step": 113
},
{
"epoch": 56.85906040268456,
"grad_norm": 2.125,
"learning_rate": 5.673076923076922e-7,
"loss": 0.438,
"step": 114
},
{
"epoch": 57.42953020134228,
"grad_norm": 4.78125,
"learning_rate": 5.625e-7,
"loss": 0.8313,
"step": 115
},
{
"epoch": 57.85906040268456,
"grad_norm": 2.140625,
"learning_rate": 5.576923076923077e-7,
"loss": 0.4409,
"step": 116
},
{
"epoch": 58.42953020134228,
"grad_norm": 4.90625,
"learning_rate": 5.528846153846153e-7,
"loss": 0.9088,
"step": 117
},
{
"epoch": 58.85906040268456,
"grad_norm": 2.171875,
"learning_rate": 5.480769230769231e-7,
"loss": 0.4392,
"step": 118
},
{
"epoch": 59.42953020134228,
"grad_norm": 4.5625,
"learning_rate": 5.432692307692307e-7,
"loss": 0.8646,
"step": 119
},
{
"epoch": 59.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 5.384615384615384e-7,
"loss": 0.4294,
"step": 120
},
{
"epoch": 60.42953020134228,
"grad_norm": 4.84375,
"learning_rate": 5.336538461538461e-7,
"loss": 0.9012,
"step": 121
},
{
"epoch": 60.85906040268456,
"grad_norm": 2.109375,
"learning_rate": 5.288461538461539e-7,
"loss": 0.4294,
"step": 122
},
{
"epoch": 61.42953020134228,
"grad_norm": 4.8125,
"learning_rate": 5.240384615384615e-7,
"loss": 0.8819,
"step": 123
},
{
"epoch": 61.85906040268456,
"grad_norm": 2.109375,
"learning_rate": 5.192307692307692e-7,
"loss": 0.433,
"step": 124
},
{
"epoch": 62.42953020134228,
"grad_norm": 4.65625,
"learning_rate": 5.144230769230769e-7,
"loss": 0.8412,
"step": 125
},
{
"epoch": 62.85906040268456,
"grad_norm": 2.171875,
"learning_rate": 5.096153846153845e-7,
"loss": 0.4263,
"step": 126
},
{
"epoch": 63.42953020134228,
"grad_norm": 4.5625,
"learning_rate": 5.048076923076923e-7,
"loss": 0.8617,
"step": 127
},
{
"epoch": 63.85906040268456,
"grad_norm": 2.140625,
"learning_rate": 5e-7,
"loss": 0.4328,
"step": 128
},
{
"epoch": 64.42953020134229,
"grad_norm": 4.65625,
"learning_rate": 4.951923076923076e-7,
"loss": 0.88,
"step": 129
},
{
"epoch": 64.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 4.903846153846153e-7,
"loss": 0.4249,
"step": 130
},
{
"epoch": 65.42953020134229,
"grad_norm": 4.5,
"learning_rate": 4.855769230769231e-7,
"loss": 0.859,
"step": 131
},
{
"epoch": 65.85906040268456,
"grad_norm": 2.09375,
"learning_rate": 4.807692307692307e-7,
"loss": 0.4224,
"step": 132
},
{
"epoch": 66.42953020134229,
"grad_norm": 4.84375,
"learning_rate": 4.759615384615384e-7,
"loss": 0.8522,
"step": 133
},
{
"epoch": 66.85906040268456,
"grad_norm": 2.109375,
"learning_rate": 4.711538461538461e-7,
"loss": 0.4355,
"step": 134
},
{
"epoch": 67.42953020134229,
"grad_norm": 4.6875,
"learning_rate": 4.6634615384615384e-7,
"loss": 0.8479,
"step": 135
},
{
"epoch": 67.85906040268456,
"grad_norm": 2.109375,
"learning_rate": 4.6153846153846156e-7,
"loss": 0.4257,
"step": 136
},
{
"epoch": 68.42953020134229,
"grad_norm": 4.53125,
"learning_rate": 4.567307692307692e-7,
"loss": 0.86,
"step": 137
},
{
"epoch": 68.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 4.519230769230769e-7,
"loss": 0.4292,
"step": 138
},
{
"epoch": 69.42953020134229,
"grad_norm": 4.53125,
"learning_rate": 4.471153846153846e-7,
"loss": 0.8563,
"step": 139
},
{
"epoch": 69.85906040268456,
"grad_norm": 2.125,
"learning_rate": 4.423076923076923e-7,
"loss": 0.4379,
"step": 140
},
{
"epoch": 70.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 4.375e-7,
"loss": 0.8371,
"step": 141
},
{
"epoch": 70.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 4.326923076923077e-7,
"loss": 0.4232,
"step": 142
},
{
"epoch": 71.42953020134229,
"grad_norm": 4.59375,
"learning_rate": 4.278846153846153e-7,
"loss": 0.8841,
"step": 143
},
{
"epoch": 71.85906040268456,
"grad_norm": 2.046875,
"learning_rate": 4.2307692307692304e-7,
"loss": 0.3962,
"step": 144
},
{
"epoch": 72.42953020134229,
"grad_norm": 4.6875,
"learning_rate": 4.1826923076923076e-7,
"loss": 0.8918,
"step": 145
},
{
"epoch": 72.85906040268456,
"grad_norm": 2.03125,
"learning_rate": 4.134615384615384e-7,
"loss": 0.4198,
"step": 146
},
{
"epoch": 73.42953020134229,
"grad_norm": 4.5,
"learning_rate": 4.0865384615384614e-7,
"loss": 0.8469,
"step": 147
},
{
"epoch": 73.85906040268456,
"grad_norm": 2.0625,
"learning_rate": 4.0384615384615386e-7,
"loss": 0.4263,
"step": 148
},
{
"epoch": 74.42953020134229,
"grad_norm": 4.65625,
"learning_rate": 3.990384615384615e-7,
"loss": 0.8602,
"step": 149
},
{
"epoch": 74.85906040268456,
"grad_norm": 2.0625,
"learning_rate": 3.942307692307692e-7,
"loss": 0.4269,
"step": 150
},
{
"epoch": 75.42953020134229,
"grad_norm": 4.40625,
"learning_rate": 3.894230769230769e-7,
"loss": 0.8281,
"step": 151
},
{
"epoch": 75.85906040268456,
"grad_norm": 2.046875,
"learning_rate": 3.8461538461538463e-7,
"loss": 0.4213,
"step": 152
},
{
"epoch": 76.42953020134229,
"grad_norm": 4.53125,
"learning_rate": 3.798076923076923e-7,
"loss": 0.8655,
"step": 153
},
{
"epoch": 76.85906040268456,
"grad_norm": 2.0625,
"learning_rate": 3.75e-7,
"loss": 0.4241,
"step": 154
},
{
"epoch": 77.42953020134229,
"grad_norm": 4.65625,
"learning_rate": 3.701923076923077e-7,
"loss": 0.847,
"step": 155
},
{
"epoch": 77.85906040268456,
"grad_norm": 2.046875,
"learning_rate": 3.6538461538461534e-7,
"loss": 0.4176,
"step": 156
},
{
"epoch": 78.42953020134229,
"grad_norm": 4.53125,
"learning_rate": 3.6057692307692306e-7,
"loss": 0.8274,
"step": 157
},
{
"epoch": 78.85906040268456,
"grad_norm": 2.09375,
"learning_rate": 3.557692307692308e-7,
"loss": 0.4362,
"step": 158
},
{
"epoch": 79.42953020134229,
"grad_norm": 4.53125,
"learning_rate": 3.5096153846153844e-7,
"loss": 0.8632,
"step": 159
},
{
"epoch": 79.85906040268456,
"grad_norm": 2.0625,
"learning_rate": 3.461538461538461e-7,
"loss": 0.4179,
"step": 160
},
{
"epoch": 80.42953020134229,
"grad_norm": 4.34375,
"learning_rate": 3.413461538461538e-7,
"loss": 0.8338,
"step": 161
},
{
"epoch": 80.85906040268456,
"grad_norm": 2.046875,
"learning_rate": 3.3653846153846154e-7,
"loss": 0.4285,
"step": 162
},
{
"epoch": 81.42953020134229,
"grad_norm": 4.34375,
"learning_rate": 3.317307692307692e-7,
"loss": 0.8369,
"step": 163
},
{
"epoch": 81.85906040268456,
"grad_norm": 2.015625,
"learning_rate": 3.269230769230769e-7,
"loss": 0.4176,
"step": 164
},
{
"epoch": 82.42953020134229,
"grad_norm": 4.5,
"learning_rate": 3.2211538461538464e-7,
"loss": 0.8413,
"step": 165
},
{
"epoch": 82.85906040268456,
"grad_norm": 2.015625,
"learning_rate": 3.1730769230769225e-7,
"loss": 0.4267,
"step": 166
},
{
"epoch": 83.42953020134229,
"grad_norm": 4.375,
"learning_rate": 3.1249999999999997e-7,
"loss": 0.8479,
"step": 167
},
{
"epoch": 83.85906040268456,
"grad_norm": 2.0625,
"learning_rate": 3.076923076923077e-7,
"loss": 0.4152,
"step": 168
},
{
"epoch": 84.42953020134229,
"grad_norm": 4.5,
"learning_rate": 3.0288461538461536e-7,
"loss": 0.8393,
"step": 169
},
{
"epoch": 84.85906040268456,
"grad_norm": 1.9765625,
"learning_rate": 2.980769230769231e-7,
"loss": 0.4205,
"step": 170
},
{
"epoch": 85.42953020134229,
"grad_norm": 4.59375,
"learning_rate": 2.932692307692308e-7,
"loss": 0.8542,
"step": 171
},
{
"epoch": 85.85906040268456,
"grad_norm": 2,
"learning_rate": 2.884615384615384e-7,
"loss": 0.4161,
"step": 172
},
{
"epoch": 86.42953020134229,
"grad_norm": 4.46875,
"learning_rate": 2.836538461538461e-7,
"loss": 0.8537,
"step": 173
},
{
"epoch": 86.85906040268456,
"grad_norm": 1.9765625,
"learning_rate": 2.7884615384615384e-7,
"loss": 0.4179,
"step": 174
},
{
"epoch": 87.42953020134229,
"grad_norm": 4.5625,
"learning_rate": 2.7403846153846156e-7,
"loss": 0.8437,
"step": 175
},
{
"epoch": 87.85906040268456,
"grad_norm": 2,
"learning_rate": 2.692307692307692e-7,
"loss": 0.4224,
"step": 176
},
{
"epoch": 88.42953020134229,
"grad_norm": 4.5625,
"learning_rate": 2.6442307692307694e-7,
"loss": 0.8537,
"step": 177
},
{
"epoch": 88.85906040268456,
"grad_norm": 2.046875,
"learning_rate": 2.596153846153846e-7,
"loss": 0.422,
"step": 178
},
{
"epoch": 89.42953020134229,
"grad_norm": 4.59375,
"learning_rate": 2.5480769230769227e-7,
"loss": 0.8233,
"step": 179
},
{
"epoch": 89.85906040268456,
"grad_norm": 2,
"learning_rate": 2.5e-7,
"loss": 0.4161,
"step": 180
},
{
"epoch": 90.42953020134229,
"grad_norm": 4.40625,
"learning_rate": 2.4519230769230765e-7,
"loss": 0.8752,
"step": 181
},
{
"epoch": 90.85906040268456,
"grad_norm": 1.9453125,
"learning_rate": 2.4038461538461537e-7,
"loss": 0.41,
"step": 182
},
{
"epoch": 91.42953020134229,
"grad_norm": 4.53125,
"learning_rate": 2.3557692307692306e-7,
"loss": 0.8751,
"step": 183
},
{
"epoch": 91.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 2.3076923076923078e-7,
"loss": 0.4204,
"step": 184
},
{
"epoch": 92.42953020134229,
"grad_norm": 4.28125,
"learning_rate": 2.2596153846153845e-7,
"loss": 0.8359,
"step": 185
},
{
"epoch": 92.85906040268456,
"grad_norm": 1.984375,
"learning_rate": 2.2115384615384614e-7,
"loss": 0.414,
"step": 186
},
{
"epoch": 93.42953020134229,
"grad_norm": 4.71875,
"learning_rate": 2.1634615384615386e-7,
"loss": 0.827,
"step": 187
},
{
"epoch": 93.85906040268456,
"grad_norm": 2,
"learning_rate": 2.1153846153846152e-7,
"loss": 0.4217,
"step": 188
},
{
"epoch": 94.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 2.067307692307692e-7,
"loss": 0.8209,
"step": 189
},
{
"epoch": 94.85906040268456,
"grad_norm": 2,
"learning_rate": 2.0192307692307693e-7,
"loss": 0.414,
"step": 190
},
{
"epoch": 95.42953020134229,
"grad_norm": 4.46875,
"learning_rate": 1.971153846153846e-7,
"loss": 0.8536,
"step": 191
},
{
"epoch": 95.85906040268456,
"grad_norm": 2.015625,
"learning_rate": 1.9230769230769231e-7,
"loss": 0.4171,
"step": 192
},
{
"epoch": 96.42953020134229,
"grad_norm": 4.40625,
"learning_rate": 1.875e-7,
"loss": 0.8215,
"step": 193
},
{
"epoch": 96.85906040268456,
"grad_norm": 2.0625,
"learning_rate": 1.8269230769230767e-7,
"loss": 0.427,
"step": 194
},
{
"epoch": 97.42953020134229,
"grad_norm": 4.28125,
"learning_rate": 1.778846153846154e-7,
"loss": 0.8155,
"step": 195
},
{
"epoch": 97.85906040268456,
"grad_norm": 2.0625,
"learning_rate": 1.7307692307692305e-7,
"loss": 0.4218,
"step": 196
},
{
"epoch": 98.42953020134229,
"grad_norm": 4.40625,
"learning_rate": 1.6826923076923077e-7,
"loss": 0.8371,
"step": 197
},
{
"epoch": 98.85906040268456,
"grad_norm": 1.984375,
"learning_rate": 1.6346153846153846e-7,
"loss": 0.4196,
"step": 198
},
{
"epoch": 99.42953020134229,
"grad_norm": 4.375,
"learning_rate": 1.5865384615384613e-7,
"loss": 0.8204,
"step": 199
},
{
"epoch": 99.85906040268456,
"grad_norm": 2.015625,
"learning_rate": 1.5384615384615385e-7,
"loss": 0.4338,
"step": 200
},
{
"epoch": 100.42953020134229,
"grad_norm": 4.34375,
"learning_rate": 1.4903846153846154e-7,
"loss": 0.8246,
"step": 201
},
{
"epoch": 100.85906040268456,
"grad_norm": 1.9765625,
"learning_rate": 1.442307692307692e-7,
"loss": 0.4145,
"step": 202
},
{
"epoch": 101.42953020134229,
"grad_norm": 4.625,
"learning_rate": 1.3942307692307692e-7,
"loss": 0.8675,
"step": 203
},
{
"epoch": 101.85906040268456,
"grad_norm": 2,
"learning_rate": 1.346153846153846e-7,
"loss": 0.4205,
"step": 204
},
{
"epoch": 102.42953020134229,
"grad_norm": 4.46875,
"learning_rate": 1.298076923076923e-7,
"loss": 0.8331,
"step": 205
},
{
"epoch": 102.85906040268456,
"grad_norm": 2,
"learning_rate": 1.25e-7,
"loss": 0.4132,
"step": 206
},
{
"epoch": 103.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 1.2019230769230769e-7,
"loss": 0.8368,
"step": 207
},
{
"epoch": 103.85906040268456,
"grad_norm": 2.03125,
"learning_rate": 1.1538461538461539e-7,
"loss": 0.4184,
"step": 208
},
{
"epoch": 104.42953020134229,
"grad_norm": 4.375,
"learning_rate": 1.1057692307692307e-7,
"loss": 0.8658,
"step": 209
},
{
"epoch": 104.85906040268456,
"grad_norm": 2.015625,
"learning_rate": 1.0576923076923076e-7,
"loss": 0.4317,
"step": 210
},
{
"epoch": 105.42953020134229,
"grad_norm": 4.40625,
"learning_rate": 1.0096153846153847e-7,
"loss": 0.8195,
"step": 211
},
{
"epoch": 105.85906040268456,
"grad_norm": 2.109375,
"learning_rate": 9.615384615384616e-8,
"loss": 0.4314,
"step": 212
},
{
"epoch": 106.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 9.134615384615383e-8,
"loss": 0.8262,
"step": 213
},
{
"epoch": 106.85906040268456,
"grad_norm": 2.046875,
"learning_rate": 8.653846153846153e-8,
"loss": 0.4219,
"step": 214
},
{
"epoch": 107.42953020134229,
"grad_norm": 4.5625,
"learning_rate": 8.173076923076923e-8,
"loss": 0.8271,
"step": 215
},
{
"epoch": 107.85906040268456,
"grad_norm": 1.96875,
"learning_rate": 7.692307692307692e-8,
"loss": 0.4164,
"step": 216
},
{
"epoch": 108.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 7.21153846153846e-8,
"loss": 0.8535,
"step": 217
},
{
"epoch": 108.85906040268456,
"grad_norm": 2.046875,
"learning_rate": 6.73076923076923e-8,
"loss": 0.4105,
"step": 218
},
{
"epoch": 109.42953020134229,
"grad_norm": 4.40625,
"learning_rate": 6.25e-8,
"loss": 0.8138,
"step": 219
},
{
"epoch": 109.85906040268456,
"grad_norm": 1.984375,
"learning_rate": 5.7692307692307695e-8,
"loss": 0.4273,
"step": 220
},
{
"epoch": 110.42953020134229,
"grad_norm": 4.5,
"learning_rate": 5.288461538461538e-8,
"loss": 0.8655,
"step": 221
},
{
"epoch": 110.85906040268456,
"grad_norm": 2,
"learning_rate": 4.807692307692308e-8,
"loss": 0.4068,
"step": 222
},
{
"epoch": 111.42953020134229,
"grad_norm": 4.40625,
"learning_rate": 4.326923076923076e-8,
"loss": 0.8253,
"step": 223
},
{
"epoch": 111.85906040268456,
"grad_norm": 1.9921875,
"learning_rate": 3.846153846153846e-8,
"loss": 0.4206,
"step": 224
},
{
"epoch": 112.42953020134229,
"grad_norm": 4.65625,
"learning_rate": 3.365384615384615e-8,
"loss": 0.8335,
"step": 225
},
{
"epoch": 112.85906040268456,
"grad_norm": 2,
"learning_rate": 2.8846153846153848e-8,
"loss": 0.4198,
"step": 226
},
{
"epoch": 113.42953020134229,
"grad_norm": 4.5,
"learning_rate": 2.403846153846154e-8,
"loss": 0.8763,
"step": 227
},
{
"epoch": 113.85906040268456,
"grad_norm": 2,
"learning_rate": 1.923076923076923e-8,
"loss": 0.418,
"step": 228
},
{
"epoch": 114.42953020134229,
"grad_norm": 4.5625,
"learning_rate": 1.4423076923076924e-8,
"loss": 0.8562,
"step": 229
},
{
"epoch": 114.85906040268456,
"grad_norm": 1.9765625,
"learning_rate": 9.615384615384615e-9,
"loss": 0.4102,
"step": 230
},
{
"epoch": 115.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 4.807692307692308e-9,
"loss": 0.8371,
"step": 231
},
{
"epoch": 115.85906040268456,
"grad_norm": 1.9609375,
"learning_rate": 0,
"loss": 0.4188,
"step": 232
}
] | 1 | 232 | 0 | 116 | 10 |
{
"TrainerControl": {
"args": {
"should_epoch_stop": false,
"should_evaluate": false,
"should_log": false,
"should_save": true,
"should_training_stop": true
},
"attributes": {}
}
}
| 243,388,770,567,585,800 | 1 | null | null | null | null | 115.85906 | 500 | 232 | false | true | true |
[
{
"epoch": 0.42953020134228187,
"grad_norm": 17.375,
"learning_rate": 4.166666666666666e-8,
"loss": 0.9944,
"step": 1
},
{
"epoch": 0.8590604026845637,
"grad_norm": 17.125,
"learning_rate": 8.333333333333333e-8,
"loss": 1.0053,
"step": 2
},
{
"epoch": 1.429530201342282,
"grad_norm": 33.5,
"learning_rate": 1.25e-7,
"loss": 2.0029,
"step": 3
},
{
"epoch": 1.8590604026845639,
"grad_norm": 16.5,
"learning_rate": 1.6666666666666665e-7,
"loss": 1.0003,
"step": 4
},
{
"epoch": 2.4295302013422817,
"grad_norm": 34.5,
"learning_rate": 2.0833333333333333e-7,
"loss": 1.9647,
"step": 5
},
{
"epoch": 2.859060402684564,
"grad_norm": 16.375,
"learning_rate": 2.5e-7,
"loss": 0.9701,
"step": 6
},
{
"epoch": 3.4295302013422817,
"grad_norm": 34.5,
"learning_rate": 2.916666666666667e-7,
"loss": 2.055,
"step": 7
},
{
"epoch": 3.859060402684564,
"grad_norm": 16.75,
"learning_rate": 3.333333333333333e-7,
"loss": 0.9851,
"step": 8
},
{
"epoch": 4.429530201342282,
"grad_norm": 35,
"learning_rate": 3.75e-7,
"loss": 1.9802,
"step": 9
},
{
"epoch": 4.859060402684563,
"grad_norm": 17,
"learning_rate": 4.1666666666666667e-7,
"loss": 0.9851,
"step": 10
},
{
"epoch": 5.429530201342282,
"grad_norm": 33.75,
"learning_rate": 4.5833333333333327e-7,
"loss": 2.0142,
"step": 11
},
{
"epoch": 5.859060402684563,
"grad_norm": 16.625,
"learning_rate": 5e-7,
"loss": 0.9631,
"step": 12
},
{
"epoch": 6.429530201342282,
"grad_norm": 34,
"learning_rate": 5.416666666666666e-7,
"loss": 1.9978,
"step": 13
},
{
"epoch": 6.859060402684563,
"grad_norm": 17,
"learning_rate": 5.833333333333334e-7,
"loss": 0.9844,
"step": 14
},
{
"epoch": 7.429530201342282,
"grad_norm": 33.5,
"learning_rate": 6.249999999999999e-7,
"loss": 1.9302,
"step": 15
},
{
"epoch": 7.859060402684563,
"grad_norm": 15.9375,
"learning_rate": 6.666666666666666e-7,
"loss": 0.95,
"step": 16
},
{
"epoch": 8.429530201342281,
"grad_norm": 33.25,
"learning_rate": 7.083333333333334e-7,
"loss": 1.9643,
"step": 17
},
{
"epoch": 8.859060402684564,
"grad_norm": 15.625,
"learning_rate": 7.5e-7,
"loss": 0.9424,
"step": 18
},
{
"epoch": 9.429530201342281,
"grad_norm": 31.375,
"learning_rate": 7.916666666666666e-7,
"loss": 1.9306,
"step": 19
},
{
"epoch": 9.859060402684564,
"grad_norm": 15.625,
"learning_rate": 8.333333333333333e-7,
"loss": 0.952,
"step": 20
},
{
"epoch": 10.429530201342281,
"grad_norm": 29.125,
"learning_rate": 8.75e-7,
"loss": 1.8804,
"step": 21
},
{
"epoch": 10.859060402684564,
"grad_norm": 14.75,
"learning_rate": 9.166666666666665e-7,
"loss": 0.9159,
"step": 22
},
{
"epoch": 11.429530201342281,
"grad_norm": 29,
"learning_rate": 9.583333333333334e-7,
"loss": 1.8056,
"step": 23
},
{
"epoch": 11.859060402684564,
"grad_norm": 13.6875,
"learning_rate": 0.000001,
"loss": 0.9177,
"step": 24
},
{
"epoch": 12.429530201342281,
"grad_norm": 25.875,
"learning_rate": 9.951923076923077e-7,
"loss": 1.8447,
"step": 25
},
{
"epoch": 12.859060402684564,
"grad_norm": 11.625,
"learning_rate": 9.903846153846153e-7,
"loss": 0.9052,
"step": 26
},
{
"epoch": 13.429530201342281,
"grad_norm": 20,
"learning_rate": 9.85576923076923e-7,
"loss": 1.6995,
"step": 27
},
{
"epoch": 13.859060402684564,
"grad_norm": 10.1875,
"learning_rate": 9.807692307692306e-7,
"loss": 0.877,
"step": 28
},
{
"epoch": 14.429530201342281,
"grad_norm": 18,
"learning_rate": 9.759615384615384e-7,
"loss": 1.7052,
"step": 29
},
{
"epoch": 14.859060402684564,
"grad_norm": 8.4375,
"learning_rate": 9.711538461538462e-7,
"loss": 0.8363,
"step": 30
},
{
"epoch": 15.429530201342281,
"grad_norm": 16.25,
"learning_rate": 9.663461538461537e-7,
"loss": 1.6953,
"step": 31
},
{
"epoch": 15.859060402684564,
"grad_norm": 8.0625,
"learning_rate": 9.615384615384615e-7,
"loss": 0.8527,
"step": 32
},
{
"epoch": 16.42953020134228,
"grad_norm": 15.0625,
"learning_rate": 9.567307692307693e-7,
"loss": 1.6249,
"step": 33
},
{
"epoch": 16.859060402684563,
"grad_norm": 7.34375,
"learning_rate": 9.519230769230768e-7,
"loss": 0.8286,
"step": 34
},
{
"epoch": 17.42953020134228,
"grad_norm": 14.1875,
"learning_rate": 9.471153846153846e-7,
"loss": 1.5865,
"step": 35
},
{
"epoch": 17.859060402684563,
"grad_norm": 6.9375,
"learning_rate": 9.423076923076923e-7,
"loss": 0.7977,
"step": 36
},
{
"epoch": 18.42953020134228,
"grad_norm": 12.75,
"learning_rate": 9.374999999999999e-7,
"loss": 1.6016,
"step": 37
},
{
"epoch": 18.859060402684563,
"grad_norm": 6.625,
"learning_rate": 9.326923076923077e-7,
"loss": 0.7821,
"step": 38
},
{
"epoch": 19.42953020134228,
"grad_norm": 13.4375,
"learning_rate": 9.278846153846154e-7,
"loss": 1.5797,
"step": 39
},
{
"epoch": 19.859060402684563,
"grad_norm": 6.1875,
"learning_rate": 9.230769230769231e-7,
"loss": 0.7771,
"step": 40
},
{
"epoch": 20.42953020134228,
"grad_norm": 11.9375,
"learning_rate": 9.182692307692307e-7,
"loss": 1.5256,
"step": 41
},
{
"epoch": 20.859060402684563,
"grad_norm": 5.875,
"learning_rate": 9.134615384615383e-7,
"loss": 0.748,
"step": 42
},
{
"epoch": 21.42953020134228,
"grad_norm": 12.125,
"learning_rate": 9.086538461538461e-7,
"loss": 1.5201,
"step": 43
},
{
"epoch": 21.859060402684563,
"grad_norm": 5.75,
"learning_rate": 9.038461538461538e-7,
"loss": 0.7454,
"step": 44
},
{
"epoch": 22.42953020134228,
"grad_norm": 11.3125,
"learning_rate": 8.990384615384616e-7,
"loss": 1.531,
"step": 45
},
{
"epoch": 22.859060402684563,
"grad_norm": 5.53125,
"learning_rate": 8.942307692307692e-7,
"loss": 0.7526,
"step": 46
},
{
"epoch": 23.42953020134228,
"grad_norm": 11.3125,
"learning_rate": 8.894230769230768e-7,
"loss": 1.46,
"step": 47
},
{
"epoch": 23.859060402684563,
"grad_norm": 5.21875,
"learning_rate": 8.846153846153846e-7,
"loss": 0.7315,
"step": 48
},
{
"epoch": 24.42953020134228,
"grad_norm": 10.9375,
"learning_rate": 8.798076923076922e-7,
"loss": 1.4508,
"step": 49
},
{
"epoch": 24.859060402684563,
"grad_norm": 5.3125,
"learning_rate": 8.75e-7,
"loss": 0.7162,
"step": 50
},
{
"epoch": 25.42953020134228,
"grad_norm": 9.9375,
"learning_rate": 8.701923076923077e-7,
"loss": 1.4465,
"step": 51
},
{
"epoch": 25.859060402684563,
"grad_norm": 4.9375,
"learning_rate": 8.653846153846154e-7,
"loss": 0.7136,
"step": 52
},
{
"epoch": 26.42953020134228,
"grad_norm": 10,
"learning_rate": 8.605769230769231e-7,
"loss": 1.4159,
"step": 53
},
{
"epoch": 26.859060402684563,
"grad_norm": 4.8125,
"learning_rate": 8.557692307692306e-7,
"loss": 0.6983,
"step": 54
},
{
"epoch": 27.42953020134228,
"grad_norm": 9.625,
"learning_rate": 8.509615384615384e-7,
"loss": 1.4073,
"step": 55
},
{
"epoch": 27.859060402684563,
"grad_norm": 4.59375,
"learning_rate": 8.461538461538461e-7,
"loss": 0.7012,
"step": 56
},
{
"epoch": 28.42953020134228,
"grad_norm": 9.375,
"learning_rate": 8.413461538461539e-7,
"loss": 1.3826,
"step": 57
},
{
"epoch": 28.859060402684563,
"grad_norm": 4.5625,
"learning_rate": 8.365384615384615e-7,
"loss": 0.7103,
"step": 58
},
{
"epoch": 29.42953020134228,
"grad_norm": 9.0625,
"learning_rate": 8.317307692307692e-7,
"loss": 1.3456,
"step": 59
},
{
"epoch": 29.859060402684563,
"grad_norm": 4.34375,
"learning_rate": 8.269230769230768e-7,
"loss": 0.679,
"step": 60
},
{
"epoch": 30.42953020134228,
"grad_norm": 8.625,
"learning_rate": 8.221153846153845e-7,
"loss": 1.3536,
"step": 61
},
{
"epoch": 30.859060402684563,
"grad_norm": 4.15625,
"learning_rate": 8.173076923076923e-7,
"loss": 0.6913,
"step": 62
},
{
"epoch": 31.42953020134228,
"grad_norm": 8.875,
"learning_rate": 8.125e-7,
"loss": 1.3714,
"step": 63
},
{
"epoch": 31.859060402684563,
"grad_norm": 4.125,
"learning_rate": 8.076923076923077e-7,
"loss": 0.6586,
"step": 64
},
{
"epoch": 32.42953020134228,
"grad_norm": 8.5,
"learning_rate": 8.028846153846154e-7,
"loss": 1.3987,
"step": 65
},
{
"epoch": 32.85906040268456,
"grad_norm": 4.03125,
"learning_rate": 7.98076923076923e-7,
"loss": 0.6669,
"step": 66
},
{
"epoch": 33.42953020134228,
"grad_norm": 8.4375,
"learning_rate": 7.932692307692307e-7,
"loss": 1.3135,
"step": 67
},
{
"epoch": 33.85906040268456,
"grad_norm": 3.90625,
"learning_rate": 7.884615384615384e-7,
"loss": 0.6537,
"step": 68
},
{
"epoch": 34.42953020134228,
"grad_norm": 7.9375,
"learning_rate": 7.836538461538462e-7,
"loss": 1.2721,
"step": 69
},
{
"epoch": 34.85906040268456,
"grad_norm": 3.8125,
"learning_rate": 7.788461538461538e-7,
"loss": 0.6552,
"step": 70
},
{
"epoch": 35.42953020134228,
"grad_norm": 7.75,
"learning_rate": 7.740384615384615e-7,
"loss": 1.2754,
"step": 71
},
{
"epoch": 35.85906040268456,
"grad_norm": 3.71875,
"learning_rate": 7.692307692307693e-7,
"loss": 0.6506,
"step": 72
},
{
"epoch": 36.42953020134228,
"grad_norm": 7.59375,
"learning_rate": 7.644230769230768e-7,
"loss": 1.3171,
"step": 73
},
{
"epoch": 36.85906040268456,
"grad_norm": 3.53125,
"learning_rate": 7.596153846153846e-7,
"loss": 0.6397,
"step": 74
},
{
"epoch": 37.42953020134228,
"grad_norm": 7.75,
"learning_rate": 7.548076923076922e-7,
"loss": 1.2787,
"step": 75
},
{
"epoch": 37.85906040268456,
"grad_norm": 3.5625,
"learning_rate": 7.5e-7,
"loss": 0.6283,
"step": 76
},
{
"epoch": 38.42953020134228,
"grad_norm": 7.125,
"learning_rate": 7.451923076923077e-7,
"loss": 1.3052,
"step": 77
},
{
"epoch": 38.85906040268456,
"grad_norm": 3.578125,
"learning_rate": 7.403846153846153e-7,
"loss": 0.6376,
"step": 78
},
{
"epoch": 39.42953020134228,
"grad_norm": 7.5,
"learning_rate": 7.355769230769231e-7,
"loss": 1.2147,
"step": 79
},
{
"epoch": 39.85906040268456,
"grad_norm": 3.453125,
"learning_rate": 7.307692307692307e-7,
"loss": 0.6285,
"step": 80
},
{
"epoch": 40.42953020134228,
"grad_norm": 7.4375,
"learning_rate": 7.259615384615385e-7,
"loss": 1.2263,
"step": 81
},
{
"epoch": 40.85906040268456,
"grad_norm": 3.484375,
"learning_rate": 7.211538461538461e-7,
"loss": 0.6488,
"step": 82
},
{
"epoch": 41.42953020134228,
"grad_norm": 7.34375,
"learning_rate": 7.163461538461538e-7,
"loss": 1.2249,
"step": 83
},
{
"epoch": 41.85906040268456,
"grad_norm": 3.625,
"learning_rate": 7.115384615384616e-7,
"loss": 0.6248,
"step": 84
},
{
"epoch": 42.42953020134228,
"grad_norm": 7.71875,
"learning_rate": 7.067307692307692e-7,
"loss": 1.2908,
"step": 85
},
{
"epoch": 42.85906040268456,
"grad_norm": 3.734375,
"learning_rate": 7.019230769230769e-7,
"loss": 0.6246,
"step": 86
},
{
"epoch": 43.42953020134228,
"grad_norm": 8.125,
"learning_rate": 6.971153846153845e-7,
"loss": 1.2471,
"step": 87
},
{
"epoch": 43.85906040268456,
"grad_norm": 3.8125,
"learning_rate": 6.923076923076922e-7,
"loss": 0.608,
"step": 88
},
{
"epoch": 44.42953020134228,
"grad_norm": 7.25,
"learning_rate": 6.875e-7,
"loss": 1.2211,
"step": 89
},
{
"epoch": 44.85906040268456,
"grad_norm": 3.328125,
"learning_rate": 6.826923076923076e-7,
"loss": 0.6014,
"step": 90
},
{
"epoch": 45.42953020134228,
"grad_norm": 6.5625,
"learning_rate": 6.778846153846154e-7,
"loss": 1.2531,
"step": 91
},
{
"epoch": 45.85906040268456,
"grad_norm": 3.1875,
"learning_rate": 6.730769230769231e-7,
"loss": 0.6233,
"step": 92
},
{
"epoch": 46.42953020134228,
"grad_norm": 6.46875,
"learning_rate": 6.682692307692307e-7,
"loss": 1.1605,
"step": 93
},
{
"epoch": 46.85906040268456,
"grad_norm": 3.109375,
"learning_rate": 6.634615384615384e-7,
"loss": 0.6205,
"step": 94
},
{
"epoch": 47.42953020134228,
"grad_norm": 6.46875,
"learning_rate": 6.586538461538461e-7,
"loss": 1.2007,
"step": 95
},
{
"epoch": 47.85906040268456,
"grad_norm": 3.234375,
"learning_rate": 6.538461538461538e-7,
"loss": 0.6053,
"step": 96
},
{
"epoch": 48.42953020134228,
"grad_norm": 6.34375,
"learning_rate": 6.490384615384615e-7,
"loss": 1.1754,
"step": 97
},
{
"epoch": 48.85906040268456,
"grad_norm": 2.90625,
"learning_rate": 6.442307692307693e-7,
"loss": 0.6223,
"step": 98
},
{
"epoch": 49.42953020134228,
"grad_norm": 6.34375,
"learning_rate": 6.394230769230768e-7,
"loss": 1.1999,
"step": 99
},
{
"epoch": 49.85906040268456,
"grad_norm": 2.765625,
"learning_rate": 6.346153846153845e-7,
"loss": 0.589,
"step": 100
},
{
"epoch": 50.42953020134228,
"grad_norm": 5.71875,
"learning_rate": 6.298076923076923e-7,
"loss": 1.2062,
"step": 101
},
{
"epoch": 50.85906040268456,
"grad_norm": 2.703125,
"learning_rate": 6.249999999999999e-7,
"loss": 0.5984,
"step": 102
},
{
"epoch": 51.42953020134228,
"grad_norm": 5.78125,
"learning_rate": 6.201923076923077e-7,
"loss": 1.1375,
"step": 103
},
{
"epoch": 51.85906040268456,
"grad_norm": 2.609375,
"learning_rate": 6.153846153846154e-7,
"loss": 0.6177,
"step": 104
},
{
"epoch": 52.42953020134228,
"grad_norm": 5.53125,
"learning_rate": 6.105769230769232e-7,
"loss": 1.1858,
"step": 105
},
{
"epoch": 52.85906040268456,
"grad_norm": 2.53125,
"learning_rate": 6.057692307692307e-7,
"loss": 0.6026,
"step": 106
},
{
"epoch": 53.42953020134228,
"grad_norm": 5.4375,
"learning_rate": 6.009615384615384e-7,
"loss": 1.1762,
"step": 107
},
{
"epoch": 53.85906040268456,
"grad_norm": 2.53125,
"learning_rate": 5.961538461538461e-7,
"loss": 0.5848,
"step": 108
},
{
"epoch": 54.42953020134228,
"grad_norm": 5.40625,
"learning_rate": 5.913461538461538e-7,
"loss": 1.2002,
"step": 109
},
{
"epoch": 54.85906040268456,
"grad_norm": 2.4375,
"learning_rate": 5.865384615384616e-7,
"loss": 0.5962,
"step": 110
},
{
"epoch": 55.42953020134228,
"grad_norm": 5.125,
"learning_rate": 5.817307692307692e-7,
"loss": 1.1821,
"step": 111
},
{
"epoch": 55.85906040268456,
"grad_norm": 2.484375,
"learning_rate": 5.769230769230768e-7,
"loss": 0.6007,
"step": 112
},
{
"epoch": 56.42953020134228,
"grad_norm": 5.3125,
"learning_rate": 5.721153846153846e-7,
"loss": 1.1558,
"step": 113
},
{
"epoch": 56.85906040268456,
"grad_norm": 2.40625,
"learning_rate": 5.673076923076922e-7,
"loss": 0.6026,
"step": 114
},
{
"epoch": 57.42953020134228,
"grad_norm": 5.28125,
"learning_rate": 5.625e-7,
"loss": 1.153,
"step": 115
},
{
"epoch": 57.85906040268456,
"grad_norm": 2.453125,
"learning_rate": 5.576923076923077e-7,
"loss": 0.5808,
"step": 116
},
{
"epoch": 58.42953020134228,
"grad_norm": 5.46875,
"learning_rate": 5.528846153846153e-7,
"loss": 1.2185,
"step": 117
},
{
"epoch": 58.85906040268456,
"grad_norm": 2.359375,
"learning_rate": 5.480769230769231e-7,
"loss": 0.6032,
"step": 118
},
{
"epoch": 59.42953020134228,
"grad_norm": 5.0625,
"learning_rate": 5.432692307692307e-7,
"loss": 1.1406,
"step": 119
},
{
"epoch": 59.85906040268456,
"grad_norm": 2.34375,
"learning_rate": 5.384615384615384e-7,
"loss": 0.5825,
"step": 120
},
{
"epoch": 60.42953020134228,
"grad_norm": 5.1875,
"learning_rate": 5.336538461538461e-7,
"loss": 1.1886,
"step": 121
},
{
"epoch": 60.85906040268456,
"grad_norm": 2.375,
"learning_rate": 5.288461538461539e-7,
"loss": 0.5846,
"step": 122
},
{
"epoch": 61.42953020134228,
"grad_norm": 5.0625,
"learning_rate": 5.240384615384615e-7,
"loss": 1.164,
"step": 123
},
{
"epoch": 61.85906040268456,
"grad_norm": 2.296875,
"learning_rate": 5.192307692307692e-7,
"loss": 0.5941,
"step": 124
},
{
"epoch": 62.42953020134228,
"grad_norm": 4.9375,
"learning_rate": 5.144230769230769e-7,
"loss": 1.1717,
"step": 125
},
{
"epoch": 62.85906040268456,
"grad_norm": 2.25,
"learning_rate": 5.096153846153845e-7,
"loss": 0.5762,
"step": 126
},
{
"epoch": 63.42953020134228,
"grad_norm": 4.75,
"learning_rate": 5.048076923076923e-7,
"loss": 1.1301,
"step": 127
},
{
"epoch": 63.85906040268456,
"grad_norm": 2.28125,
"learning_rate": 5e-7,
"loss": 0.5839,
"step": 128
},
{
"epoch": 64.42953020134229,
"grad_norm": 4.875,
"learning_rate": 4.951923076923076e-7,
"loss": 1.1447,
"step": 129
},
{
"epoch": 64.85906040268456,
"grad_norm": 2.234375,
"learning_rate": 4.903846153846153e-7,
"loss": 0.5904,
"step": 130
},
{
"epoch": 65.42953020134229,
"grad_norm": 5,
"learning_rate": 4.855769230769231e-7,
"loss": 1.1521,
"step": 131
},
{
"epoch": 65.85906040268456,
"grad_norm": 2.203125,
"learning_rate": 4.807692307692307e-7,
"loss": 0.5651,
"step": 132
},
{
"epoch": 66.42953020134229,
"grad_norm": 5.0625,
"learning_rate": 4.759615384615384e-7,
"loss": 1.1653,
"step": 133
},
{
"epoch": 66.85906040268456,
"grad_norm": 2.34375,
"learning_rate": 4.711538461538461e-7,
"loss": 0.6035,
"step": 134
},
{
"epoch": 67.42953020134229,
"grad_norm": 4.84375,
"learning_rate": 4.6634615384615384e-7,
"loss": 1.1642,
"step": 135
},
{
"epoch": 67.85906040268456,
"grad_norm": 2.265625,
"learning_rate": 4.6153846153846156e-7,
"loss": 0.5703,
"step": 136
},
{
"epoch": 68.42953020134229,
"grad_norm": 4.71875,
"learning_rate": 4.567307692307692e-7,
"loss": 1.1526,
"step": 137
},
{
"epoch": 68.85906040268456,
"grad_norm": 2.234375,
"learning_rate": 4.519230769230769e-7,
"loss": 0.562,
"step": 138
},
{
"epoch": 69.42953020134229,
"grad_norm": 4.6875,
"learning_rate": 4.471153846153846e-7,
"loss": 1.1726,
"step": 139
},
{
"epoch": 69.85906040268456,
"grad_norm": 2.21875,
"learning_rate": 4.423076923076923e-7,
"loss": 0.5851,
"step": 140
},
{
"epoch": 70.42953020134229,
"grad_norm": 4.84375,
"learning_rate": 4.375e-7,
"loss": 1.1486,
"step": 141
},
{
"epoch": 70.85906040268456,
"grad_norm": 2.234375,
"learning_rate": 4.326923076923077e-7,
"loss": 0.5778,
"step": 142
},
{
"epoch": 71.42953020134229,
"grad_norm": 4.59375,
"learning_rate": 4.278846153846153e-7,
"loss": 1.1508,
"step": 143
},
{
"epoch": 71.85906040268456,
"grad_norm": 2.234375,
"learning_rate": 4.2307692307692304e-7,
"loss": 0.5722,
"step": 144
},
{
"epoch": 72.42953020134229,
"grad_norm": 4.9375,
"learning_rate": 4.1826923076923076e-7,
"loss": 1.1663,
"step": 145
},
{
"epoch": 72.85906040268456,
"grad_norm": 2.21875,
"learning_rate": 4.134615384615384e-7,
"loss": 0.5864,
"step": 146
},
{
"epoch": 73.42953020134229,
"grad_norm": 4.8125,
"learning_rate": 4.0865384615384614e-7,
"loss": 1.1551,
"step": 147
},
{
"epoch": 73.85906040268456,
"grad_norm": 2.109375,
"learning_rate": 4.0384615384615386e-7,
"loss": 0.573,
"step": 148
},
{
"epoch": 74.42953020134229,
"grad_norm": 4.625,
"learning_rate": 3.990384615384615e-7,
"loss": 1.1708,
"step": 149
},
{
"epoch": 74.85906040268456,
"grad_norm": 2.171875,
"learning_rate": 3.942307692307692e-7,
"loss": 0.5656,
"step": 150
},
{
"epoch": 75.42953020134229,
"grad_norm": 4.8125,
"learning_rate": 3.894230769230769e-7,
"loss": 1.171,
"step": 151
},
{
"epoch": 75.85906040268456,
"grad_norm": 2.09375,
"learning_rate": 3.8461538461538463e-7,
"loss": 0.5674,
"step": 152
},
{
"epoch": 76.42953020134229,
"grad_norm": 4.90625,
"learning_rate": 3.798076923076923e-7,
"loss": 1.1524,
"step": 153
},
{
"epoch": 76.85906040268456,
"grad_norm": 2.109375,
"learning_rate": 3.75e-7,
"loss": 0.5638,
"step": 154
},
{
"epoch": 77.42953020134229,
"grad_norm": 4.78125,
"learning_rate": 3.701923076923077e-7,
"loss": 1.1671,
"step": 155
},
{
"epoch": 77.85906040268456,
"grad_norm": 2.21875,
"learning_rate": 3.6538461538461534e-7,
"loss": 0.5748,
"step": 156
},
{
"epoch": 78.42953020134229,
"grad_norm": 4.65625,
"learning_rate": 3.6057692307692306e-7,
"loss": 1.1411,
"step": 157
},
{
"epoch": 78.85906040268456,
"grad_norm": 2.1875,
"learning_rate": 3.557692307692308e-7,
"loss": 0.5821,
"step": 158
},
{
"epoch": 79.42953020134229,
"grad_norm": 4.59375,
"learning_rate": 3.5096153846153844e-7,
"loss": 1.1462,
"step": 159
},
{
"epoch": 79.85906040268456,
"grad_norm": 2.15625,
"learning_rate": 3.461538461538461e-7,
"loss": 0.5651,
"step": 160
},
{
"epoch": 80.42953020134229,
"grad_norm": 5.21875,
"learning_rate": 3.413461538461538e-7,
"loss": 1.1256,
"step": 161
},
{
"epoch": 80.85906040268456,
"grad_norm": 2.1875,
"learning_rate": 3.3653846153846154e-7,
"loss": 0.5604,
"step": 162
},
{
"epoch": 81.42953020134229,
"grad_norm": 4.53125,
"learning_rate": 3.317307692307692e-7,
"loss": 1.113,
"step": 163
},
{
"epoch": 81.85906040268456,
"grad_norm": 2.171875,
"learning_rate": 3.269230769230769e-7,
"loss": 0.5801,
"step": 164
},
{
"epoch": 82.42953020134229,
"grad_norm": 4.6875,
"learning_rate": 3.2211538461538464e-7,
"loss": 1.1455,
"step": 165
},
{
"epoch": 82.85906040268456,
"grad_norm": 2.09375,
"learning_rate": 3.1730769230769225e-7,
"loss": 0.5647,
"step": 166
},
{
"epoch": 83.42953020134229,
"grad_norm": 4.46875,
"learning_rate": 3.1249999999999997e-7,
"loss": 1.1486,
"step": 167
},
{
"epoch": 83.85906040268456,
"grad_norm": 2.09375,
"learning_rate": 3.076923076923077e-7,
"loss": 0.5805,
"step": 168
},
{
"epoch": 84.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 3.0288461538461536e-7,
"loss": 1.1441,
"step": 169
},
{
"epoch": 84.85906040268456,
"grad_norm": 2.109375,
"learning_rate": 2.980769230769231e-7,
"loss": 0.5679,
"step": 170
},
{
"epoch": 85.42953020134229,
"grad_norm": 4.9375,
"learning_rate": 2.932692307692308e-7,
"loss": 1.1337,
"step": 171
},
{
"epoch": 85.85906040268456,
"grad_norm": 2.109375,
"learning_rate": 2.884615384615384e-7,
"loss": 0.5691,
"step": 172
},
{
"epoch": 86.42953020134229,
"grad_norm": 4.71875,
"learning_rate": 2.836538461538461e-7,
"loss": 1.1342,
"step": 173
},
{
"epoch": 86.85906040268456,
"grad_norm": 2.21875,
"learning_rate": 2.7884615384615384e-7,
"loss": 0.5844,
"step": 174
},
{
"epoch": 87.42953020134229,
"grad_norm": 4.75,
"learning_rate": 2.7403846153846156e-7,
"loss": 1.11,
"step": 175
},
{
"epoch": 87.85906040268456,
"grad_norm": 2.234375,
"learning_rate": 2.692307692307692e-7,
"loss": 0.5964,
"step": 176
},
{
"epoch": 88.42953020134229,
"grad_norm": 4.84375,
"learning_rate": 2.6442307692307694e-7,
"loss": 1.1519,
"step": 177
},
{
"epoch": 88.85906040268456,
"grad_norm": 2.15625,
"learning_rate": 2.596153846153846e-7,
"loss": 0.5713,
"step": 178
},
{
"epoch": 89.42953020134229,
"grad_norm": 4.46875,
"learning_rate": 2.5480769230769227e-7,
"loss": 1.1,
"step": 179
},
{
"epoch": 89.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 2.5e-7,
"loss": 0.569,
"step": 180
},
{
"epoch": 90.42953020134229,
"grad_norm": 4.8125,
"learning_rate": 2.4519230769230765e-7,
"loss": 1.1541,
"step": 181
},
{
"epoch": 90.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 2.4038461538461537e-7,
"loss": 0.5681,
"step": 182
},
{
"epoch": 91.42953020134229,
"grad_norm": 4.84375,
"learning_rate": 2.3557692307692306e-7,
"loss": 1.1761,
"step": 183
},
{
"epoch": 91.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 2.3076923076923078e-7,
"loss": 0.548,
"step": 184
},
{
"epoch": 92.42953020134229,
"grad_norm": 4.71875,
"learning_rate": 2.2596153846153845e-7,
"loss": 1.1111,
"step": 185
},
{
"epoch": 92.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 2.2115384615384614e-7,
"loss": 0.5646,
"step": 186
},
{
"epoch": 93.42953020134229,
"grad_norm": 4.84375,
"learning_rate": 2.1634615384615386e-7,
"loss": 1.1318,
"step": 187
},
{
"epoch": 93.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 2.1153846153846152e-7,
"loss": 0.5685,
"step": 188
},
{
"epoch": 94.42953020134229,
"grad_norm": 4.6875,
"learning_rate": 2.067307692307692e-7,
"loss": 1.1284,
"step": 189
},
{
"epoch": 94.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 2.0192307692307693e-7,
"loss": 0.5648,
"step": 190
},
{
"epoch": 95.42953020134229,
"grad_norm": 4.90625,
"learning_rate": 1.971153846153846e-7,
"loss": 1.1235,
"step": 191
},
{
"epoch": 95.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 1.9230769230769231e-7,
"loss": 0.5704,
"step": 192
},
{
"epoch": 96.42953020134229,
"grad_norm": 4.53125,
"learning_rate": 1.875e-7,
"loss": 1.1003,
"step": 193
},
{
"epoch": 96.85906040268456,
"grad_norm": 2.09375,
"learning_rate": 1.8269230769230767e-7,
"loss": 0.5748,
"step": 194
},
{
"epoch": 97.42953020134229,
"grad_norm": 5.0625,
"learning_rate": 1.778846153846154e-7,
"loss": 1.1609,
"step": 195
},
{
"epoch": 97.85906040268456,
"grad_norm": 2.046875,
"learning_rate": 1.7307692307692305e-7,
"loss": 0.574,
"step": 196
},
{
"epoch": 98.42953020134229,
"grad_norm": 4.71875,
"learning_rate": 1.6826923076923077e-7,
"loss": 1.165,
"step": 197
},
{
"epoch": 98.85906040268456,
"grad_norm": 2.0625,
"learning_rate": 1.6346153846153846e-7,
"loss": 0.5529,
"step": 198
},
{
"epoch": 99.42953020134229,
"grad_norm": 4.34375,
"learning_rate": 1.5865384615384613e-7,
"loss": 1.1181,
"step": 199
},
{
"epoch": 99.85906040268456,
"grad_norm": 2.140625,
"learning_rate": 1.5384615384615385e-7,
"loss": 0.5853,
"step": 200
},
{
"epoch": 100.42953020134229,
"grad_norm": 4.53125,
"learning_rate": 1.4903846153846154e-7,
"loss": 1.0952,
"step": 201
},
{
"epoch": 100.85906040268456,
"grad_norm": 2.046875,
"learning_rate": 1.442307692307692e-7,
"loss": 0.5715,
"step": 202
},
{
"epoch": 101.42953020134229,
"grad_norm": 4.875,
"learning_rate": 1.3942307692307692e-7,
"loss": 1.1688,
"step": 203
},
{
"epoch": 101.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 1.346153846153846e-7,
"loss": 0.5713,
"step": 204
},
{
"epoch": 102.42953020134229,
"grad_norm": 4.65625,
"learning_rate": 1.298076923076923e-7,
"loss": 1.111,
"step": 205
},
{
"epoch": 102.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 1.25e-7,
"loss": 0.5617,
"step": 206
},
{
"epoch": 103.42953020134229,
"grad_norm": 4.6875,
"learning_rate": 1.2019230769230769e-7,
"loss": 1.1196,
"step": 207
},
{
"epoch": 103.85906040268456,
"grad_norm": 2.046875,
"learning_rate": 1.1538461538461539e-7,
"loss": 0.5728,
"step": 208
},
{
"epoch": 104.42953020134229,
"grad_norm": 4.8125,
"learning_rate": 1.1057692307692307e-7,
"loss": 1.1649,
"step": 209
},
{
"epoch": 104.85906040268456,
"grad_norm": 2.109375,
"learning_rate": 1.0576923076923076e-7,
"loss": 0.5778,
"step": 210
},
{
"epoch": 105.42953020134229,
"grad_norm": 4.65625,
"learning_rate": 1.0096153846153847e-7,
"loss": 1.1146,
"step": 211
},
{
"epoch": 105.85906040268456,
"grad_norm": 2.21875,
"learning_rate": 9.615384615384616e-8,
"loss": 0.5838,
"step": 212
},
{
"epoch": 106.42953020134229,
"grad_norm": 4.65625,
"learning_rate": 9.134615384615383e-8,
"loss": 1.1526,
"step": 213
},
{
"epoch": 106.85906040268456,
"grad_norm": 1.9921875,
"learning_rate": 8.653846153846153e-8,
"loss": 0.5514,
"step": 214
},
{
"epoch": 107.42953020134229,
"grad_norm": 4.6875,
"learning_rate": 8.173076923076923e-8,
"loss": 1.1244,
"step": 215
},
{
"epoch": 107.85906040268456,
"grad_norm": 2.078125,
"learning_rate": 7.692307692307692e-8,
"loss": 0.5643,
"step": 216
},
{
"epoch": 108.42953020134229,
"grad_norm": 4.90625,
"learning_rate": 7.21153846153846e-8,
"loss": 1.1783,
"step": 217
},
{
"epoch": 108.85906040268456,
"grad_norm": 2.09375,
"learning_rate": 6.73076923076923e-8,
"loss": 0.5582,
"step": 218
},
{
"epoch": 109.42953020134229,
"grad_norm": 4.59375,
"learning_rate": 6.25e-8,
"loss": 1.0912,
"step": 219
},
{
"epoch": 109.85906040268456,
"grad_norm": 2.109375,
"learning_rate": 5.7692307692307695e-8,
"loss": 0.5761,
"step": 220
},
{
"epoch": 110.42953020134229,
"grad_norm": 4.46875,
"learning_rate": 5.288461538461538e-8,
"loss": 1.1257,
"step": 221
},
{
"epoch": 110.85906040268456,
"grad_norm": 2.09375,
"learning_rate": 4.807692307692308e-8,
"loss": 0.5646,
"step": 222
},
{
"epoch": 111.42953020134229,
"grad_norm": 5.15625,
"learning_rate": 4.326923076923076e-8,
"loss": 1.1416,
"step": 223
},
{
"epoch": 111.85906040268456,
"grad_norm": 2,
"learning_rate": 3.846153846153846e-8,
"loss": 0.5649,
"step": 224
},
{
"epoch": 112.42953020134229,
"grad_norm": 4.65625,
"learning_rate": 3.365384615384615e-8,
"loss": 1.1299,
"step": 225
},
{
"epoch": 112.85906040268456,
"grad_norm": 2.171875,
"learning_rate": 2.8846153846153848e-8,
"loss": 0.5831,
"step": 226
},
{
"epoch": 113.42953020134229,
"grad_norm": 4.5,
"learning_rate": 2.403846153846154e-8,
"loss": 1.1757,
"step": 227
},
{
"epoch": 113.85906040268456,
"grad_norm": 2.09375,
"learning_rate": 1.923076923076923e-8,
"loss": 0.5684,
"step": 228
},
{
"epoch": 114.42953020134229,
"grad_norm": 4.78125,
"learning_rate": 1.4423076923076924e-8,
"loss": 1.0999,
"step": 229
},
{
"epoch": 114.85906040268456,
"grad_norm": 2.1875,
"learning_rate": 9.615384615384615e-9,
"loss": 0.5761,
"step": 230
},
{
"epoch": 115.42953020134229,
"grad_norm": 4.4375,
"learning_rate": 4.807692307692308e-9,
"loss": 1.1087,
"step": 231
},
{
"epoch": 115.85906040268456,
"grad_norm": 2.0625,
"learning_rate": 0,
"loss": 0.5792,
"step": 232
}
] | 1 | 232 | 0 | 116 | 10 |
{
"TrainerControl": {
"args": {
"should_epoch_stop": false,
"should_evaluate": false,
"should_log": false,
"should_save": true,
"should_training_stop": true
},
"attributes": {}
}
}
| 217,714,658,528,722,940 | 1 | null | null | null | null | 77.430976 | 500 | 232 | false | true | true |
[
{
"epoch": 0.43097643097643096,
"grad_norm": 11.25,
"learning_rate": 4.166666666666666e-8,
"loss": 0.7093,
"step": 1
},
{
"epoch": 0.8619528619528619,
"grad_norm": 11.625,
"learning_rate": 8.333333333333333e-8,
"loss": 0.7181,
"step": 2
},
{
"epoch": 1,
"grad_norm": 11.6875,
"learning_rate": 1.25e-7,
"loss": 0.7035,
"step": 3
},
{
"epoch": 1.430976430976431,
"grad_norm": 11.625,
"learning_rate": 1.6666666666666665e-7,
"loss": 0.7126,
"step": 4
},
{
"epoch": 1.861952861952862,
"grad_norm": 11.0625,
"learning_rate": 2.0833333333333333e-7,
"loss": 0.7081,
"step": 5
},
{
"epoch": 2,
"grad_norm": 12.3125,
"learning_rate": 2.5e-7,
"loss": 0.7259,
"step": 6
},
{
"epoch": 2.430976430976431,
"grad_norm": 11.5625,
"learning_rate": 2.916666666666667e-7,
"loss": 0.7111,
"step": 7
},
{
"epoch": 2.861952861952862,
"grad_norm": 11.5,
"learning_rate": 3.333333333333333e-7,
"loss": 0.7134,
"step": 8
},
{
"epoch": 3,
"grad_norm": 10.8125,
"learning_rate": 3.75e-7,
"loss": 0.7033,
"step": 9
},
{
"epoch": 3.430976430976431,
"grad_norm": 11.375,
"learning_rate": 4.1666666666666667e-7,
"loss": 0.7029,
"step": 10
},
{
"epoch": 3.861952861952862,
"grad_norm": 11,
"learning_rate": 4.5833333333333327e-7,
"loss": 0.7111,
"step": 11
},
{
"epoch": 4,
"grad_norm": 12.0625,
"learning_rate": 5e-7,
"loss": 0.7147,
"step": 12
},
{
"epoch": 4.430976430976431,
"grad_norm": 11.25,
"learning_rate": 5.416666666666666e-7,
"loss": 0.7061,
"step": 13
},
{
"epoch": 4.861952861952862,
"grad_norm": 10.9375,
"learning_rate": 5.833333333333334e-7,
"loss": 0.7076,
"step": 14
},
{
"epoch": 5,
"grad_norm": 11.4375,
"learning_rate": 6.249999999999999e-7,
"loss": 0.6796,
"step": 15
},
{
"epoch": 5.430976430976431,
"grad_norm": 10.75,
"learning_rate": 6.666666666666666e-7,
"loss": 0.685,
"step": 16
},
{
"epoch": 5.861952861952862,
"grad_norm": 10.8125,
"learning_rate": 7.083333333333334e-7,
"loss": 0.7135,
"step": 17
},
{
"epoch": 6,
"grad_norm": 10.5,
"learning_rate": 7.5e-7,
"loss": 0.6538,
"step": 18
},
{
"epoch": 6.430976430976431,
"grad_norm": 10.8125,
"learning_rate": 7.916666666666666e-7,
"loss": 0.6866,
"step": 19
},
{
"epoch": 6.861952861952862,
"grad_norm": 9.875,
"learning_rate": 8.333333333333333e-7,
"loss": 0.671,
"step": 20
},
{
"epoch": 7,
"grad_norm": 10.625,
"learning_rate": 8.75e-7,
"loss": 0.7019,
"step": 21
},
{
"epoch": 7.430976430976431,
"grad_norm": 10.3125,
"learning_rate": 9.166666666666665e-7,
"loss": 0.6952,
"step": 22
},
{
"epoch": 7.861952861952862,
"grad_norm": 9.5625,
"learning_rate": 9.583333333333334e-7,
"loss": 0.6603,
"step": 23
},
{
"epoch": 8,
"grad_norm": 10.125,
"learning_rate": 0.000001,
"loss": 0.621,
"step": 24
},
{
"epoch": 8.43097643097643,
"grad_norm": 9.6875,
"learning_rate": 9.951923076923077e-7,
"loss": 0.656,
"step": 25
},
{
"epoch": 8.861952861952862,
"grad_norm": 9.5,
"learning_rate": 9.903846153846153e-7,
"loss": 0.6336,
"step": 26
},
{
"epoch": 9,
"grad_norm": 9.5,
"learning_rate": 9.85576923076923e-7,
"loss": 0.6676,
"step": 27
},
{
"epoch": 9.43097643097643,
"grad_norm": 9.125,
"learning_rate": 9.807692307692306e-7,
"loss": 0.6297,
"step": 28
},
{
"epoch": 9.861952861952862,
"grad_norm": 8.875,
"learning_rate": 9.759615384615384e-7,
"loss": 0.6129,
"step": 29
},
{
"epoch": 10,
"grad_norm": 8.75,
"learning_rate": 9.711538461538462e-7,
"loss": 0.6084,
"step": 30
},
{
"epoch": 10.43097643097643,
"grad_norm": 8.3125,
"learning_rate": 9.663461538461537e-7,
"loss": 0.5924,
"step": 31
},
{
"epoch": 10.861952861952862,
"grad_norm": 8.25,
"learning_rate": 9.615384615384615e-7,
"loss": 0.6045,
"step": 32
},
{
"epoch": 11,
"grad_norm": 7.96875,
"learning_rate": 9.567307692307693e-7,
"loss": 0.5726,
"step": 33
},
{
"epoch": 11.43097643097643,
"grad_norm": 7.78125,
"learning_rate": 9.519230769230768e-7,
"loss": 0.5654,
"step": 34
},
{
"epoch": 11.861952861952862,
"grad_norm": 7.375,
"learning_rate": 9.471153846153846e-7,
"loss": 0.5819,
"step": 35
},
{
"epoch": 12,
"grad_norm": 7.46875,
"learning_rate": 9.423076923076923e-7,
"loss": 0.5634,
"step": 36
},
{
"epoch": 12.43097643097643,
"grad_norm": 7.09375,
"learning_rate": 9.374999999999999e-7,
"loss": 0.5515,
"step": 37
},
{
"epoch": 12.861952861952862,
"grad_norm": 7.15625,
"learning_rate": 9.326923076923077e-7,
"loss": 0.5583,
"step": 38
},
{
"epoch": 13,
"grad_norm": 7.125,
"learning_rate": 9.278846153846154e-7,
"loss": 0.5337,
"step": 39
},
{
"epoch": 13.43097643097643,
"grad_norm": 6.875,
"learning_rate": 9.230769230769231e-7,
"loss": 0.5439,
"step": 40
},
{
"epoch": 13.861952861952862,
"grad_norm": 6.59375,
"learning_rate": 9.182692307692307e-7,
"loss": 0.5212,
"step": 41
},
{
"epoch": 14,
"grad_norm": 8.8125,
"learning_rate": 9.134615384615383e-7,
"loss": 0.5372,
"step": 42
},
{
"epoch": 14.43097643097643,
"grad_norm": 6.53125,
"learning_rate": 9.086538461538461e-7,
"loss": 0.5146,
"step": 43
},
{
"epoch": 14.861952861952862,
"grad_norm": 6.6875,
"learning_rate": 9.038461538461538e-7,
"loss": 0.518,
"step": 44
},
{
"epoch": 15,
"grad_norm": 7.21875,
"learning_rate": 8.990384615384616e-7,
"loss": 0.5047,
"step": 45
},
{
"epoch": 15.43097643097643,
"grad_norm": 6.5625,
"learning_rate": 8.942307692307692e-7,
"loss": 0.5069,
"step": 46
},
{
"epoch": 15.861952861952862,
"grad_norm": 6.5,
"learning_rate": 8.894230769230768e-7,
"loss": 0.4899,
"step": 47
},
{
"epoch": 16,
"grad_norm": 7.15625,
"learning_rate": 8.846153846153846e-7,
"loss": 0.4903,
"step": 48
},
{
"epoch": 16.430976430976433,
"grad_norm": 6.6875,
"learning_rate": 8.798076923076922e-7,
"loss": 0.49,
"step": 49
},
{
"epoch": 16.86195286195286,
"grad_norm": 6.6875,
"learning_rate": 8.75e-7,
"loss": 0.4689,
"step": 50
},
{
"epoch": 17,
"grad_norm": 7.28125,
"learning_rate": 8.701923076923077e-7,
"loss": 0.4833,
"step": 51
},
{
"epoch": 17.430976430976433,
"grad_norm": 6.5,
"learning_rate": 8.653846153846154e-7,
"loss": 0.4585,
"step": 52
},
{
"epoch": 17.86195286195286,
"grad_norm": 6.5,
"learning_rate": 8.605769230769231e-7,
"loss": 0.4662,
"step": 53
},
{
"epoch": 18,
"grad_norm": 6.6875,
"learning_rate": 8.557692307692306e-7,
"loss": 0.4687,
"step": 54
},
{
"epoch": 18.430976430976433,
"grad_norm": 6.34375,
"learning_rate": 8.509615384615384e-7,
"loss": 0.4531,
"step": 55
},
{
"epoch": 18.86195286195286,
"grad_norm": 7.03125,
"learning_rate": 8.461538461538461e-7,
"loss": 0.442,
"step": 56
},
{
"epoch": 19,
"grad_norm": 8.3125,
"learning_rate": 8.413461538461539e-7,
"loss": 0.4503,
"step": 57
},
{
"epoch": 19.430976430976433,
"grad_norm": 6.5625,
"learning_rate": 8.365384615384615e-7,
"loss": 0.4328,
"step": 58
},
{
"epoch": 19.86195286195286,
"grad_norm": 5.6875,
"learning_rate": 8.317307692307692e-7,
"loss": 0.424,
"step": 59
},
{
"epoch": 20,
"grad_norm": 6.6875,
"learning_rate": 8.269230769230768e-7,
"loss": 0.4779,
"step": 60
},
{
"epoch": 20.430976430976433,
"grad_norm": 5.21875,
"learning_rate": 8.221153846153845e-7,
"loss": 0.4261,
"step": 61
},
{
"epoch": 20.86195286195286,
"grad_norm": 5.125,
"learning_rate": 8.173076923076923e-7,
"loss": 0.4161,
"step": 62
},
{
"epoch": 21,
"grad_norm": 5.28125,
"learning_rate": 8.125e-7,
"loss": 0.4355,
"step": 63
},
{
"epoch": 21.430976430976433,
"grad_norm": 4.625,
"learning_rate": 8.076923076923077e-7,
"loss": 0.4074,
"step": 64
},
{
"epoch": 21.86195286195286,
"grad_norm": 4.5,
"learning_rate": 8.028846153846154e-7,
"loss": 0.4235,
"step": 65
},
{
"epoch": 22,
"grad_norm": 5.125,
"learning_rate": 7.98076923076923e-7,
"loss": 0.3985,
"step": 66
},
{
"epoch": 22.430976430976433,
"grad_norm": 4.4375,
"learning_rate": 7.932692307692307e-7,
"loss": 0.3986,
"step": 67
},
{
"epoch": 22.86195286195286,
"grad_norm": 4.03125,
"learning_rate": 7.884615384615384e-7,
"loss": 0.4172,
"step": 68
},
{
"epoch": 23,
"grad_norm": 4.46875,
"learning_rate": 7.836538461538462e-7,
"loss": 0.3858,
"step": 69
},
{
"epoch": 23.430976430976433,
"grad_norm": 3.890625,
"learning_rate": 7.788461538461538e-7,
"loss": 0.3938,
"step": 70
},
{
"epoch": 23.86195286195286,
"grad_norm": 3.796875,
"learning_rate": 7.740384615384615e-7,
"loss": 0.4001,
"step": 71
},
{
"epoch": 24,
"grad_norm": 4.25,
"learning_rate": 7.692307692307693e-7,
"loss": 0.4046,
"step": 72
},
{
"epoch": 24.430976430976433,
"grad_norm": 3.765625,
"learning_rate": 7.644230769230768e-7,
"loss": 0.3868,
"step": 73
},
{
"epoch": 24.86195286195286,
"grad_norm": 3.46875,
"learning_rate": 7.596153846153846e-7,
"loss": 0.3902,
"step": 74
},
{
"epoch": 25,
"grad_norm": 4.21875,
"learning_rate": 7.548076923076922e-7,
"loss": 0.4087,
"step": 75
},
{
"epoch": 25.430976430976433,
"grad_norm": 3.296875,
"learning_rate": 7.5e-7,
"loss": 0.3873,
"step": 76
},
{
"epoch": 25.86195286195286,
"grad_norm": 3.609375,
"learning_rate": 7.451923076923077e-7,
"loss": 0.3797,
"step": 77
},
{
"epoch": 26,
"grad_norm": 4.125,
"learning_rate": 7.403846153846153e-7,
"loss": 0.3971,
"step": 78
},
{
"epoch": 26.430976430976433,
"grad_norm": 3.265625,
"learning_rate": 7.355769230769231e-7,
"loss": 0.3687,
"step": 79
},
{
"epoch": 26.86195286195286,
"grad_norm": 3.203125,
"learning_rate": 7.307692307692307e-7,
"loss": 0.3932,
"step": 80
},
{
"epoch": 27,
"grad_norm": 4.1875,
"learning_rate": 7.259615384615385e-7,
"loss": 0.3753,
"step": 81
},
{
"epoch": 27.430976430976433,
"grad_norm": 3.1875,
"learning_rate": 7.211538461538461e-7,
"loss": 0.3724,
"step": 82
},
{
"epoch": 27.86195286195286,
"grad_norm": 3.015625,
"learning_rate": 7.163461538461538e-7,
"loss": 0.3737,
"step": 83
},
{
"epoch": 28,
"grad_norm": 4.1875,
"learning_rate": 7.115384615384616e-7,
"loss": 0.3912,
"step": 84
},
{
"epoch": 28.430976430976433,
"grad_norm": 2.984375,
"learning_rate": 7.067307692307692e-7,
"loss": 0.3757,
"step": 85
},
{
"epoch": 28.86195286195286,
"grad_norm": 3.15625,
"learning_rate": 7.019230769230769e-7,
"loss": 0.3698,
"step": 86
},
{
"epoch": 29,
"grad_norm": 4.53125,
"learning_rate": 6.971153846153845e-7,
"loss": 0.3626,
"step": 87
},
{
"epoch": 29.430976430976433,
"grad_norm": 2.734375,
"learning_rate": 6.923076923076922e-7,
"loss": 0.3765,
"step": 88
},
{
"epoch": 29.86195286195286,
"grad_norm": 3.203125,
"learning_rate": 6.875e-7,
"loss": 0.3533,
"step": 89
},
{
"epoch": 30,
"grad_norm": 3.96875,
"learning_rate": 6.826923076923076e-7,
"loss": 0.3833,
"step": 90
},
{
"epoch": 30.430976430976433,
"grad_norm": 2.859375,
"learning_rate": 6.778846153846154e-7,
"loss": 0.3649,
"step": 91
},
{
"epoch": 30.86195286195286,
"grad_norm": 2.734375,
"learning_rate": 6.730769230769231e-7,
"loss": 0.3633,
"step": 92
},
{
"epoch": 31,
"grad_norm": 3.625,
"learning_rate": 6.682692307692307e-7,
"loss": 0.3661,
"step": 93
},
{
"epoch": 31.430976430976433,
"grad_norm": 2.796875,
"learning_rate": 6.634615384615384e-7,
"loss": 0.3642,
"step": 94
},
{
"epoch": 31.86195286195286,
"grad_norm": 2.65625,
"learning_rate": 6.586538461538461e-7,
"loss": 0.3527,
"step": 95
},
{
"epoch": 32,
"grad_norm": 3.84375,
"learning_rate": 6.538461538461538e-7,
"loss": 0.3801,
"step": 96
},
{
"epoch": 32.43097643097643,
"grad_norm": 2.53125,
"learning_rate": 6.490384615384615e-7,
"loss": 0.3507,
"step": 97
},
{
"epoch": 32.861952861952865,
"grad_norm": 2.765625,
"learning_rate": 6.442307692307693e-7,
"loss": 0.3612,
"step": 98
},
{
"epoch": 33,
"grad_norm": 3.9375,
"learning_rate": 6.394230769230768e-7,
"loss": 0.3758,
"step": 99
},
{
"epoch": 33.43097643097643,
"grad_norm": 2.84375,
"learning_rate": 6.346153846153845e-7,
"loss": 0.3635,
"step": 100
},
{
"epoch": 33.861952861952865,
"grad_norm": 2.53125,
"learning_rate": 6.298076923076923e-7,
"loss": 0.3502,
"step": 101
},
{
"epoch": 34,
"grad_norm": 3.6875,
"learning_rate": 6.249999999999999e-7,
"loss": 0.3524,
"step": 102
},
{
"epoch": 34.43097643097643,
"grad_norm": 2.75,
"learning_rate": 6.201923076923077e-7,
"loss": 0.3529,
"step": 103
},
{
"epoch": 34.861952861952865,
"grad_norm": 2.453125,
"learning_rate": 6.153846153846154e-7,
"loss": 0.3567,
"step": 104
},
{
"epoch": 35,
"grad_norm": 4.1875,
"learning_rate": 6.105769230769232e-7,
"loss": 0.3478,
"step": 105
},
{
"epoch": 35.43097643097643,
"grad_norm": 2.484375,
"learning_rate": 6.057692307692307e-7,
"loss": 0.3483,
"step": 106
},
{
"epoch": 35.861952861952865,
"grad_norm": 2.40625,
"learning_rate": 6.009615384615384e-7,
"loss": 0.3509,
"step": 107
},
{
"epoch": 36,
"grad_norm": 4.21875,
"learning_rate": 5.961538461538461e-7,
"loss": 0.3679,
"step": 108
},
{
"epoch": 36.43097643097643,
"grad_norm": 2.53125,
"learning_rate": 5.913461538461538e-7,
"loss": 0.3591,
"step": 109
},
{
"epoch": 36.861952861952865,
"grad_norm": 2.4375,
"learning_rate": 5.865384615384616e-7,
"loss": 0.3478,
"step": 110
},
{
"epoch": 37,
"grad_norm": 3.609375,
"learning_rate": 5.817307692307692e-7,
"loss": 0.3272,
"step": 111
},
{
"epoch": 37.43097643097643,
"grad_norm": 2.59375,
"learning_rate": 5.769230769230768e-7,
"loss": 0.3548,
"step": 112
},
{
"epoch": 37.861952861952865,
"grad_norm": 2.640625,
"learning_rate": 5.721153846153846e-7,
"loss": 0.3497,
"step": 113
},
{
"epoch": 38,
"grad_norm": 4.15625,
"learning_rate": 5.673076923076922e-7,
"loss": 0.3222,
"step": 114
},
{
"epoch": 38.43097643097643,
"grad_norm": 2.4375,
"learning_rate": 5.625e-7,
"loss": 0.3511,
"step": 115
},
{
"epoch": 38.861952861952865,
"grad_norm": 2.5625,
"learning_rate": 5.576923076923077e-7,
"loss": 0.3489,
"step": 116
},
{
"epoch": 39,
"grad_norm": 3.234375,
"learning_rate": 5.528846153846153e-7,
"loss": 0.3264,
"step": 117
},
{
"epoch": 39.43097643097643,
"grad_norm": 2.53125,
"learning_rate": 5.480769230769231e-7,
"loss": 0.3484,
"step": 118
},
{
"epoch": 39.861952861952865,
"grad_norm": 2.484375,
"learning_rate": 5.432692307692307e-7,
"loss": 0.3396,
"step": 119
},
{
"epoch": 40,
"grad_norm": 3.5,
"learning_rate": 5.384615384615384e-7,
"loss": 0.3525,
"step": 120
},
{
"epoch": 40.43097643097643,
"grad_norm": 2.390625,
"learning_rate": 5.336538461538461e-7,
"loss": 0.3343,
"step": 121
},
{
"epoch": 40.861952861952865,
"grad_norm": 2.609375,
"learning_rate": 5.288461538461539e-7,
"loss": 0.3541,
"step": 122
},
{
"epoch": 41,
"grad_norm": 3.265625,
"learning_rate": 5.240384615384615e-7,
"loss": 0.3404,
"step": 123
},
{
"epoch": 41.43097643097643,
"grad_norm": 2.375,
"learning_rate": 5.192307692307692e-7,
"loss": 0.3431,
"step": 124
},
{
"epoch": 41.861952861952865,
"grad_norm": 2.390625,
"learning_rate": 5.144230769230769e-7,
"loss": 0.3375,
"step": 125
},
{
"epoch": 42,
"grad_norm": 4.375,
"learning_rate": 5.096153846153845e-7,
"loss": 0.3538,
"step": 126
},
{
"epoch": 42.43097643097643,
"grad_norm": 2.328125,
"learning_rate": 5.048076923076923e-7,
"loss": 0.3364,
"step": 127
},
{
"epoch": 42.861952861952865,
"grad_norm": 2.265625,
"learning_rate": 5e-7,
"loss": 0.3421,
"step": 128
},
{
"epoch": 43,
"grad_norm": 3.84375,
"learning_rate": 4.951923076923076e-7,
"loss": 0.3522,
"step": 129
},
{
"epoch": 43.43097643097643,
"grad_norm": 2.46875,
"learning_rate": 4.903846153846153e-7,
"loss": 0.3502,
"step": 130
},
{
"epoch": 43.861952861952865,
"grad_norm": 2.3125,
"learning_rate": 4.855769230769231e-7,
"loss": 0.3296,
"step": 131
},
{
"epoch": 44,
"grad_norm": 4.34375,
"learning_rate": 4.807692307692307e-7,
"loss": 0.339,
"step": 132
},
{
"epoch": 44.43097643097643,
"grad_norm": 2.703125,
"learning_rate": 4.759615384615384e-7,
"loss": 0.3252,
"step": 133
},
{
"epoch": 44.861952861952865,
"grad_norm": 2.46875,
"learning_rate": 4.711538461538461e-7,
"loss": 0.355,
"step": 134
},
{
"epoch": 45,
"grad_norm": 3.765625,
"learning_rate": 4.6634615384615384e-7,
"loss": 0.3297,
"step": 135
},
{
"epoch": 45.43097643097643,
"grad_norm": 2.28125,
"learning_rate": 4.6153846153846156e-7,
"loss": 0.3316,
"step": 136
},
{
"epoch": 45.861952861952865,
"grad_norm": 2.515625,
"learning_rate": 4.567307692307692e-7,
"loss": 0.3453,
"step": 137
},
{
"epoch": 46,
"grad_norm": 3.265625,
"learning_rate": 4.519230769230769e-7,
"loss": 0.3334,
"step": 138
},
{
"epoch": 46.43097643097643,
"grad_norm": 2.53125,
"learning_rate": 4.471153846153846e-7,
"loss": 0.3419,
"step": 139
},
{
"epoch": 46.861952861952865,
"grad_norm": 2.265625,
"learning_rate": 4.423076923076923e-7,
"loss": 0.3361,
"step": 140
},
{
"epoch": 47,
"grad_norm": 3.203125,
"learning_rate": 4.375e-7,
"loss": 0.3243,
"step": 141
},
{
"epoch": 47.43097643097643,
"grad_norm": 2.359375,
"learning_rate": 4.326923076923077e-7,
"loss": 0.3299,
"step": 142
},
{
"epoch": 47.861952861952865,
"grad_norm": 2.375,
"learning_rate": 4.278846153846153e-7,
"loss": 0.3504,
"step": 143
},
{
"epoch": 48,
"grad_norm": 4.75,
"learning_rate": 4.2307692307692304e-7,
"loss": 0.3105,
"step": 144
},
{
"epoch": 48.43097643097643,
"grad_norm": 2.34375,
"learning_rate": 4.1826923076923076e-7,
"loss": 0.3305,
"step": 145
},
{
"epoch": 48.861952861952865,
"grad_norm": 2.4375,
"learning_rate": 4.134615384615384e-7,
"loss": 0.3377,
"step": 146
},
{
"epoch": 49,
"grad_norm": 4.78125,
"learning_rate": 4.0865384615384614e-7,
"loss": 0.3438,
"step": 147
},
{
"epoch": 49.43097643097643,
"grad_norm": 2.265625,
"learning_rate": 4.0384615384615386e-7,
"loss": 0.3263,
"step": 148
},
{
"epoch": 49.861952861952865,
"grad_norm": 2.734375,
"learning_rate": 3.990384615384615e-7,
"loss": 0.3381,
"step": 149
},
{
"epoch": 50,
"grad_norm": 4.46875,
"learning_rate": 3.942307692307692e-7,
"loss": 0.3517,
"step": 150
},
{
"epoch": 50.43097643097643,
"grad_norm": 2.28125,
"learning_rate": 3.894230769230769e-7,
"loss": 0.3368,
"step": 151
},
{
"epoch": 50.861952861952865,
"grad_norm": 2.421875,
"learning_rate": 3.8461538461538463e-7,
"loss": 0.3366,
"step": 152
},
{
"epoch": 51,
"grad_norm": 3.28125,
"learning_rate": 3.798076923076923e-7,
"loss": 0.3184,
"step": 153
},
{
"epoch": 51.43097643097643,
"grad_norm": 2.234375,
"learning_rate": 3.75e-7,
"loss": 0.3278,
"step": 154
},
{
"epoch": 51.861952861952865,
"grad_norm": 2.515625,
"learning_rate": 3.701923076923077e-7,
"loss": 0.3408,
"step": 155
},
{
"epoch": 52,
"grad_norm": 3.890625,
"learning_rate": 3.6538461538461534e-7,
"loss": 0.33,
"step": 156
},
{
"epoch": 52.43097643097643,
"grad_norm": 2.375,
"learning_rate": 3.6057692307692306e-7,
"loss": 0.344,
"step": 157
},
{
"epoch": 52.861952861952865,
"grad_norm": 2.625,
"learning_rate": 3.557692307692308e-7,
"loss": 0.33,
"step": 158
},
{
"epoch": 53,
"grad_norm": 3.53125,
"learning_rate": 3.5096153846153844e-7,
"loss": 0.3111,
"step": 159
},
{
"epoch": 53.43097643097643,
"grad_norm": 2.46875,
"learning_rate": 3.461538461538461e-7,
"loss": 0.337,
"step": 160
},
{
"epoch": 53.861952861952865,
"grad_norm": 2.1875,
"learning_rate": 3.413461538461538e-7,
"loss": 0.3314,
"step": 161
},
{
"epoch": 54,
"grad_norm": 4.0625,
"learning_rate": 3.3653846153846154e-7,
"loss": 0.3229,
"step": 162
},
{
"epoch": 54.43097643097643,
"grad_norm": 2.234375,
"learning_rate": 3.317307692307692e-7,
"loss": 0.3331,
"step": 163
},
{
"epoch": 54.861952861952865,
"grad_norm": 2.609375,
"learning_rate": 3.269230769230769e-7,
"loss": 0.3308,
"step": 164
},
{
"epoch": 55,
"grad_norm": 3.65625,
"learning_rate": 3.2211538461538464e-7,
"loss": 0.3352,
"step": 165
},
{
"epoch": 55.43097643097643,
"grad_norm": 2.296875,
"learning_rate": 3.1730769230769225e-7,
"loss": 0.3456,
"step": 166
},
{
"epoch": 55.861952861952865,
"grad_norm": 2.328125,
"learning_rate": 3.1249999999999997e-7,
"loss": 0.3167,
"step": 167
},
{
"epoch": 56,
"grad_norm": 4.0625,
"learning_rate": 3.076923076923077e-7,
"loss": 0.338,
"step": 168
},
{
"epoch": 56.43097643097643,
"grad_norm": 2.375,
"learning_rate": 3.0288461538461536e-7,
"loss": 0.3418,
"step": 169
},
{
"epoch": 56.861952861952865,
"grad_norm": 2.25,
"learning_rate": 2.980769230769231e-7,
"loss": 0.3249,
"step": 170
},
{
"epoch": 57,
"grad_norm": 3.578125,
"learning_rate": 2.932692307692308e-7,
"loss": 0.3217,
"step": 171
},
{
"epoch": 57.43097643097643,
"grad_norm": 2.28125,
"learning_rate": 2.884615384615384e-7,
"loss": 0.3301,
"step": 172
},
{
"epoch": 57.861952861952865,
"grad_norm": 2.46875,
"learning_rate": 2.836538461538461e-7,
"loss": 0.3323,
"step": 173
},
{
"epoch": 58,
"grad_norm": 3.671875,
"learning_rate": 2.7884615384615384e-7,
"loss": 0.3324,
"step": 174
},
{
"epoch": 58.43097643097643,
"grad_norm": 2.25,
"learning_rate": 2.7403846153846156e-7,
"loss": 0.3264,
"step": 175
},
{
"epoch": 58.861952861952865,
"grad_norm": 2.390625,
"learning_rate": 2.692307692307692e-7,
"loss": 0.3358,
"step": 176
},
{
"epoch": 59,
"grad_norm": 4.0625,
"learning_rate": 2.6442307692307694e-7,
"loss": 0.3304,
"step": 177
},
{
"epoch": 59.43097643097643,
"grad_norm": 2.328125,
"learning_rate": 2.596153846153846e-7,
"loss": 0.3382,
"step": 178
},
{
"epoch": 59.861952861952865,
"grad_norm": 2.21875,
"learning_rate": 2.5480769230769227e-7,
"loss": 0.3239,
"step": 179
},
{
"epoch": 60,
"grad_norm": 4,
"learning_rate": 2.5e-7,
"loss": 0.329,
"step": 180
},
{
"epoch": 60.43097643097643,
"grad_norm": 2.328125,
"learning_rate": 2.4519230769230765e-7,
"loss": 0.3174,
"step": 181
},
{
"epoch": 60.861952861952865,
"grad_norm": 2.375,
"learning_rate": 2.4038461538461537e-7,
"loss": 0.3457,
"step": 182
},
{
"epoch": 61,
"grad_norm": 3.734375,
"learning_rate": 2.3557692307692306e-7,
"loss": 0.3233,
"step": 183
},
{
"epoch": 61.43097643097643,
"grad_norm": 2.390625,
"learning_rate": 2.3076923076923078e-7,
"loss": 0.3297,
"step": 184
},
{
"epoch": 61.861952861952865,
"grad_norm": 2.3125,
"learning_rate": 2.2596153846153845e-7,
"loss": 0.329,
"step": 185
},
{
"epoch": 62,
"grad_norm": 3.625,
"learning_rate": 2.2115384615384614e-7,
"loss": 0.3368,
"step": 186
},
{
"epoch": 62.43097643097643,
"grad_norm": 2.203125,
"learning_rate": 2.1634615384615386e-7,
"loss": 0.3318,
"step": 187
},
{
"epoch": 62.861952861952865,
"grad_norm": 2.15625,
"learning_rate": 2.1153846153846152e-7,
"loss": 0.3259,
"step": 188
},
{
"epoch": 63,
"grad_norm": 3.65625,
"learning_rate": 2.067307692307692e-7,
"loss": 0.337,
"step": 189
},
{
"epoch": 63.43097643097643,
"grad_norm": 2.296875,
"learning_rate": 2.0192307692307693e-7,
"loss": 0.3362,
"step": 190
},
{
"epoch": 63.861952861952865,
"grad_norm": 2.34375,
"learning_rate": 1.971153846153846e-7,
"loss": 0.3277,
"step": 191
},
{
"epoch": 64,
"grad_norm": 3.5,
"learning_rate": 1.9230769230769231e-7,
"loss": 0.3159,
"step": 192
},
{
"epoch": 64.43097643097643,
"grad_norm": 2.375,
"learning_rate": 1.875e-7,
"loss": 0.3356,
"step": 193
},
{
"epoch": 64.86195286195286,
"grad_norm": 2.28125,
"learning_rate": 1.8269230769230767e-7,
"loss": 0.3248,
"step": 194
},
{
"epoch": 65,
"grad_norm": 3.765625,
"learning_rate": 1.778846153846154e-7,
"loss": 0.3281,
"step": 195
},
{
"epoch": 65.43097643097643,
"grad_norm": 2.6875,
"learning_rate": 1.7307692307692305e-7,
"loss": 0.3291,
"step": 196
},
{
"epoch": 65.86195286195286,
"grad_norm": 2.328125,
"learning_rate": 1.6826923076923077e-7,
"loss": 0.3281,
"step": 197
},
{
"epoch": 66,
"grad_norm": 3.578125,
"learning_rate": 1.6346153846153846e-7,
"loss": 0.3377,
"step": 198
},
{
"epoch": 66.43097643097643,
"grad_norm": 2.265625,
"learning_rate": 1.5865384615384613e-7,
"loss": 0.3251,
"step": 199
},
{
"epoch": 66.86195286195286,
"grad_norm": 2.4375,
"learning_rate": 1.5384615384615385e-7,
"loss": 0.3375,
"step": 200
},
{
"epoch": 67,
"grad_norm": 3.5625,
"learning_rate": 1.4903846153846154e-7,
"loss": 0.319,
"step": 201
},
{
"epoch": 67.43097643097643,
"grad_norm": 2.734375,
"learning_rate": 1.442307692307692e-7,
"loss": 0.322,
"step": 202
},
{
"epoch": 67.86195286195286,
"grad_norm": 2.421875,
"learning_rate": 1.3942307692307692e-7,
"loss": 0.3436,
"step": 203
},
{
"epoch": 68,
"grad_norm": 3.9375,
"learning_rate": 1.346153846153846e-7,
"loss": 0.3104,
"step": 204
},
{
"epoch": 68.43097643097643,
"grad_norm": 2.25,
"learning_rate": 1.298076923076923e-7,
"loss": 0.333,
"step": 205
},
{
"epoch": 68.86195286195286,
"grad_norm": 2.34375,
"learning_rate": 1.25e-7,
"loss": 0.3273,
"step": 206
},
{
"epoch": 69,
"grad_norm": 3.734375,
"learning_rate": 1.2019230769230769e-7,
"loss": 0.326,
"step": 207
},
{
"epoch": 69.43097643097643,
"grad_norm": 2.1875,
"learning_rate": 1.1538461538461539e-7,
"loss": 0.3258,
"step": 208
},
{
"epoch": 69.86195286195286,
"grad_norm": 2.28125,
"learning_rate": 1.1057692307692307e-7,
"loss": 0.3415,
"step": 209
},
{
"epoch": 70,
"grad_norm": 3.4375,
"learning_rate": 1.0576923076923076e-7,
"loss": 0.3046,
"step": 210
},
{
"epoch": 70.43097643097643,
"grad_norm": 2.171875,
"learning_rate": 1.0096153846153847e-7,
"loss": 0.3283,
"step": 211
},
{
"epoch": 70.86195286195286,
"grad_norm": 2.28125,
"learning_rate": 9.615384615384616e-8,
"loss": 0.3328,
"step": 212
},
{
"epoch": 71,
"grad_norm": 3.4375,
"learning_rate": 9.134615384615383e-8,
"loss": 0.3222,
"step": 213
},
{
"epoch": 71.43097643097643,
"grad_norm": 2.671875,
"learning_rate": 8.653846153846153e-8,
"loss": 0.3205,
"step": 214
},
{
"epoch": 71.86195286195286,
"grad_norm": 2.609375,
"learning_rate": 8.173076923076923e-8,
"loss": 0.3391,
"step": 215
},
{
"epoch": 72,
"grad_norm": 3.328125,
"learning_rate": 7.692307692307692e-8,
"loss": 0.3253,
"step": 216
},
{
"epoch": 72.43097643097643,
"grad_norm": 2.546875,
"learning_rate": 7.21153846153846e-8,
"loss": 0.333,
"step": 217
},
{
"epoch": 72.86195286195286,
"grad_norm": 2.21875,
"learning_rate": 6.73076923076923e-8,
"loss": 0.3286,
"step": 218
},
{
"epoch": 73,
"grad_norm": 3.796875,
"learning_rate": 6.25e-8,
"loss": 0.3201,
"step": 219
},
{
"epoch": 73.43097643097643,
"grad_norm": 2.390625,
"learning_rate": 5.7692307692307695e-8,
"loss": 0.3307,
"step": 220
},
{
"epoch": 73.86195286195286,
"grad_norm": 2.359375,
"learning_rate": 5.288461538461538e-8,
"loss": 0.326,
"step": 221
},
{
"epoch": 74,
"grad_norm": 4.375,
"learning_rate": 4.807692307692308e-8,
"loss": 0.337,
"step": 222
},
{
"epoch": 74.43097643097643,
"grad_norm": 2.125,
"learning_rate": 4.326923076923076e-8,
"loss": 0.3317,
"step": 223
},
{
"epoch": 74.86195286195286,
"grad_norm": 2.328125,
"learning_rate": 3.846153846153846e-8,
"loss": 0.3358,
"step": 224
},
{
"epoch": 75,
"grad_norm": 3.984375,
"learning_rate": 3.365384615384615e-8,
"loss": 0.2995,
"step": 225
},
{
"epoch": 75.43097643097643,
"grad_norm": 2.625,
"learning_rate": 2.8846153846153848e-8,
"loss": 0.3232,
"step": 226
},
{
"epoch": 75.86195286195286,
"grad_norm": 2.265625,
"learning_rate": 2.403846153846154e-8,
"loss": 0.329,
"step": 227
},
{
"epoch": 76,
"grad_norm": 3.5625,
"learning_rate": 1.923076923076923e-8,
"loss": 0.348,
"step": 228
},
{
"epoch": 76.43097643097643,
"grad_norm": 2.5625,
"learning_rate": 1.4423076923076924e-8,
"loss": 0.3301,
"step": 229
},
{
"epoch": 76.86195286195286,
"grad_norm": 2.546875,
"learning_rate": 9.615384615384615e-9,
"loss": 0.3246,
"step": 230
},
{
"epoch": 77,
"grad_norm": 3.34375,
"learning_rate": 4.807692307692308e-9,
"loss": 0.3421,
"step": 231
},
{
"epoch": 77.43097643097643,
"grad_norm": 2.25,
"learning_rate": 0,
"loss": 0.3283,
"step": 232
}
] | 1 | 232 | 0 | 116 | 10 |
{
"TrainerControl": {
"args": {
"should_epoch_stop": false,
"should_evaluate": false,
"should_log": false,
"should_save": true,
"should_training_stop": true
},
"attributes": {}
}
}
| 488,582,934,243,360,800 | 1 | null | null | null | null | 77.430976 | 500 | 232 | false | true | true |
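The bracketed list above is a per-step training log (one record per logged optimizer step, carrying epoch, grad_norm, learning_rate, loss, and step), and the {"TrainerControl": ...} block alongside it is a serialized callback snapshot. As a minimal sketch of how such a list can be summarized once parsed as JSON — the variable name log_history is illustrative, and the two records are copied verbatim from the list above:

import json

# Two records copied from the log above (steps 188 and 232); the full list
# holds one such record per logged training step.
log_history = json.loads("""
[
  {"epoch": 62.861952861952865, "grad_norm": 2.15625,
   "learning_rate": 2.1153846153846152e-7, "loss": 0.3259, "step": 188},
  {"epoch": 77.43097643097643, "grad_norm": 2.25,
   "learning_rate": 0, "loss": 0.3283, "step": 232}
]
""")

final = max(log_history, key=lambda r: r["step"])
best = min(log_history, key=lambda r: r["loss"])
print(f"final step {final['step']}: loss={final['loss']}, lr={final['learning_rate']}")
print(f"lowest logged loss {best['loss']} at step {best['step']}")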
[
{
"epoch": 0.43097643097643096,
"grad_norm": 9.5625,
"learning_rate": 4.166666666666666e-8,
"loss": 0.7483,
"step": 1
},
{
"epoch": 0.8619528619528619,
"grad_norm": 10,
"learning_rate": 8.333333333333333e-8,
"loss": 0.7577,
"step": 2
},
{
"epoch": 1,
"grad_norm": 9.5625,
"learning_rate": 1.25e-7,
"loss": 0.7634,
"step": 3
},
{
"epoch": 1.430976430976431,
"grad_norm": 10,
"learning_rate": 1.6666666666666665e-7,
"loss": 0.767,
"step": 4
},
{
"epoch": 1.861952861952862,
"grad_norm": 9.4375,
"learning_rate": 2.0833333333333333e-7,
"loss": 0.7352,
"step": 5
},
{
"epoch": 2,
"grad_norm": 9.875,
"learning_rate": 2.5e-7,
"loss": 0.7742,
"step": 6
},
{
"epoch": 2.430976430976431,
"grad_norm": 9.75,
"learning_rate": 2.916666666666667e-7,
"loss": 0.7649,
"step": 7
},
{
"epoch": 2.861952861952862,
"grad_norm": 9.4375,
"learning_rate": 3.333333333333333e-7,
"loss": 0.7255,
"step": 8
},
{
"epoch": 3,
"grad_norm": 10.375,
"learning_rate": 3.75e-7,
"loss": 0.8037,
"step": 9
},
{
"epoch": 3.430976430976431,
"grad_norm": 9.6875,
"learning_rate": 4.1666666666666667e-7,
"loss": 0.7575,
"step": 10
},
{
"epoch": 3.861952861952862,
"grad_norm": 9.5,
"learning_rate": 4.5833333333333327e-7,
"loss": 0.7439,
"step": 11
},
{
"epoch": 4,
"grad_norm": 9.5,
"learning_rate": 5e-7,
"loss": 0.7527,
"step": 12
},
{
"epoch": 4.430976430976431,
"grad_norm": 9.5,
"learning_rate": 5.416666666666666e-7,
"loss": 0.7478,
"step": 13
},
{
"epoch": 4.861952861952862,
"grad_norm": 9.6875,
"learning_rate": 5.833333333333334e-7,
"loss": 0.7407,
"step": 14
},
{
"epoch": 5,
"grad_norm": 9.5,
"learning_rate": 6.249999999999999e-7,
"loss": 0.7651,
"step": 15
},
{
"epoch": 5.430976430976431,
"grad_norm": 8.875,
"learning_rate": 6.666666666666666e-7,
"loss": 0.7251,
"step": 16
},
{
"epoch": 5.861952861952862,
"grad_norm": 9.0625,
"learning_rate": 7.083333333333334e-7,
"loss": 0.7423,
"step": 17
},
{
"epoch": 6,
"grad_norm": 9.8125,
"learning_rate": 7.5e-7,
"loss": 0.7722,
"step": 18
},
{
"epoch": 6.430976430976431,
"grad_norm": 9.125,
"learning_rate": 7.916666666666666e-7,
"loss": 0.736,
"step": 19
},
{
"epoch": 6.861952861952862,
"grad_norm": 8.4375,
"learning_rate": 8.333333333333333e-7,
"loss": 0.7303,
"step": 20
},
{
"epoch": 7,
"grad_norm": 8.5,
"learning_rate": 8.75e-7,
"loss": 0.708,
"step": 21
},
{
"epoch": 7.430976430976431,
"grad_norm": 8.375,
"learning_rate": 9.166666666666665e-7,
"loss": 0.7205,
"step": 22
},
{
"epoch": 7.861952861952862,
"grad_norm": 8.1875,
"learning_rate": 9.583333333333334e-7,
"loss": 0.7153,
"step": 23
},
{
"epoch": 8,
"grad_norm": 9.0625,
"learning_rate": 0.000001,
"loss": 0.7314,
"step": 24
},
{
"epoch": 8.43097643097643,
"grad_norm": 8.1875,
"learning_rate": 9.951923076923077e-7,
"loss": 0.7274,
"step": 25
},
{
"epoch": 8.861952861952862,
"grad_norm": 7.8125,
"learning_rate": 9.903846153846153e-7,
"loss": 0.6805,
"step": 26
},
{
"epoch": 9,
"grad_norm": 8.25,
"learning_rate": 9.85576923076923e-7,
"loss": 0.6962,
"step": 27
},
{
"epoch": 9.43097643097643,
"grad_norm": 7.625,
"learning_rate": 9.807692307692306e-7,
"loss": 0.673,
"step": 28
},
{
"epoch": 9.861952861952862,
"grad_norm": 7.53125,
"learning_rate": 9.759615384615384e-7,
"loss": 0.6913,
"step": 29
},
{
"epoch": 10,
"grad_norm": 7.40625,
"learning_rate": 9.711538461538462e-7,
"loss": 0.6668,
"step": 30
},
{
"epoch": 10.43097643097643,
"grad_norm": 7.375,
"learning_rate": 9.663461538461537e-7,
"loss": 0.6549,
"step": 31
},
{
"epoch": 10.861952861952862,
"grad_norm": 6.75,
"learning_rate": 9.615384615384615e-7,
"loss": 0.6662,
"step": 32
},
{
"epoch": 11,
"grad_norm": 7.1875,
"learning_rate": 9.567307692307693e-7,
"loss": 0.654,
"step": 33
},
{
"epoch": 11.43097643097643,
"grad_norm": 6.71875,
"learning_rate": 9.519230769230768e-7,
"loss": 0.6347,
"step": 34
},
{
"epoch": 11.861952861952862,
"grad_norm": 6.46875,
"learning_rate": 9.471153846153846e-7,
"loss": 0.6414,
"step": 35
},
{
"epoch": 12,
"grad_norm": 7.03125,
"learning_rate": 9.423076923076923e-7,
"loss": 0.6641,
"step": 36
},
{
"epoch": 12.43097643097643,
"grad_norm": 6.5625,
"learning_rate": 9.374999999999999e-7,
"loss": 0.6166,
"step": 37
},
{
"epoch": 12.861952861952862,
"grad_norm": 5.9375,
"learning_rate": 9.326923076923077e-7,
"loss": 0.6303,
"step": 38
},
{
"epoch": 13,
"grad_norm": 6.625,
"learning_rate": 9.278846153846154e-7,
"loss": 0.6309,
"step": 39
},
{
"epoch": 13.43097643097643,
"grad_norm": 5.875,
"learning_rate": 9.230769230769231e-7,
"loss": 0.5971,
"step": 40
},
{
"epoch": 13.861952861952862,
"grad_norm": 6.0625,
"learning_rate": 9.182692307692307e-7,
"loss": 0.6181,
"step": 41
},
{
"epoch": 14,
"grad_norm": 6.1875,
"learning_rate": 9.134615384615383e-7,
"loss": 0.6161,
"step": 42
},
{
"epoch": 14.43097643097643,
"grad_norm": 5.6875,
"learning_rate": 9.086538461538461e-7,
"loss": 0.6082,
"step": 43
},
{
"epoch": 14.861952861952862,
"grad_norm": 5.75,
"learning_rate": 9.038461538461538e-7,
"loss": 0.5807,
"step": 44
},
{
"epoch": 15,
"grad_norm": 6.09375,
"learning_rate": 8.990384615384616e-7,
"loss": 0.5885,
"step": 45
},
{
"epoch": 15.43097643097643,
"grad_norm": 5.4375,
"learning_rate": 8.942307692307692e-7,
"loss": 0.5793,
"step": 46
},
{
"epoch": 15.861952861952862,
"grad_norm": 5.875,
"learning_rate": 8.894230769230768e-7,
"loss": 0.5928,
"step": 47
},
{
"epoch": 16,
"grad_norm": 5.78125,
"learning_rate": 8.846153846153846e-7,
"loss": 0.5399,
"step": 48
},
{
"epoch": 16.430976430976433,
"grad_norm": 5.53125,
"learning_rate": 8.798076923076922e-7,
"loss": 0.575,
"step": 49
},
{
"epoch": 16.86195286195286,
"grad_norm": 5.71875,
"learning_rate": 8.75e-7,
"loss": 0.5664,
"step": 50
},
{
"epoch": 17,
"grad_norm": 18,
"learning_rate": 8.701923076923077e-7,
"loss": 0.5323,
"step": 51
},
{
"epoch": 17.430976430976433,
"grad_norm": 5.65625,
"learning_rate": 8.653846153846154e-7,
"loss": 0.5475,
"step": 52
},
{
"epoch": 17.86195286195286,
"grad_norm": 5.40625,
"learning_rate": 8.605769230769231e-7,
"loss": 0.5487,
"step": 53
},
{
"epoch": 18,
"grad_norm": 5.28125,
"learning_rate": 8.557692307692306e-7,
"loss": 0.5806,
"step": 54
},
{
"epoch": 18.430976430976433,
"grad_norm": 5.34375,
"learning_rate": 8.509615384615384e-7,
"loss": 0.5424,
"step": 55
},
{
"epoch": 18.86195286195286,
"grad_norm": 5.25,
"learning_rate": 8.461538461538461e-7,
"loss": 0.5483,
"step": 56
},
{
"epoch": 19,
"grad_norm": 5.90625,
"learning_rate": 8.413461538461539e-7,
"loss": 0.5175,
"step": 57
},
{
"epoch": 19.430976430976433,
"grad_norm": 5.25,
"learning_rate": 8.365384615384615e-7,
"loss": 0.5353,
"step": 58
},
{
"epoch": 19.86195286195286,
"grad_norm": 4.875,
"learning_rate": 8.317307692307692e-7,
"loss": 0.5333,
"step": 59
},
{
"epoch": 20,
"grad_norm": 5,
"learning_rate": 8.269230769230768e-7,
"loss": 0.5064,
"step": 60
},
{
"epoch": 20.430976430976433,
"grad_norm": 4.5,
"learning_rate": 8.221153846153845e-7,
"loss": 0.5244,
"step": 61
},
{
"epoch": 20.86195286195286,
"grad_norm": 4.40625,
"learning_rate": 8.173076923076923e-7,
"loss": 0.5134,
"step": 62
},
{
"epoch": 21,
"grad_norm": 5.21875,
"learning_rate": 8.125e-7,
"loss": 0.5235,
"step": 63
},
{
"epoch": 21.430976430976433,
"grad_norm": 3.96875,
"learning_rate": 8.076923076923077e-7,
"loss": 0.5239,
"step": 64
},
{
"epoch": 21.86195286195286,
"grad_norm": 4.375,
"learning_rate": 8.028846153846154e-7,
"loss": 0.491,
"step": 65
},
{
"epoch": 22,
"grad_norm": 4.84375,
"learning_rate": 7.98076923076923e-7,
"loss": 0.527,
"step": 66
},
{
"epoch": 22.430976430976433,
"grad_norm": 4,
"learning_rate": 7.932692307692307e-7,
"loss": 0.5102,
"step": 67
},
{
"epoch": 22.86195286195286,
"grad_norm": 3.9375,
"learning_rate": 7.884615384615384e-7,
"loss": 0.4958,
"step": 68
},
{
"epoch": 23,
"grad_norm": 4.4375,
"learning_rate": 7.836538461538462e-7,
"loss": 0.5022,
"step": 69
},
{
"epoch": 23.430976430976433,
"grad_norm": 3.59375,
"learning_rate": 7.788461538461538e-7,
"loss": 0.4984,
"step": 70
},
{
"epoch": 23.86195286195286,
"grad_norm": 3.28125,
"learning_rate": 7.740384615384615e-7,
"loss": 0.4911,
"step": 71
},
{
"epoch": 24,
"grad_norm": 4.6875,
"learning_rate": 7.692307692307693e-7,
"loss": 0.5074,
"step": 72
},
{
"epoch": 24.430976430976433,
"grad_norm": 3.390625,
"learning_rate": 7.644230769230768e-7,
"loss": 0.5145,
"step": 73
},
{
"epoch": 24.86195286195286,
"grad_norm": 3.328125,
"learning_rate": 7.596153846153846e-7,
"loss": 0.4677,
"step": 74
},
{
"epoch": 25,
"grad_norm": 4.03125,
"learning_rate": 7.548076923076922e-7,
"loss": 0.48,
"step": 75
},
{
"epoch": 25.430976430976433,
"grad_norm": 3.15625,
"learning_rate": 7.5e-7,
"loss": 0.4715,
"step": 76
},
{
"epoch": 25.86195286195286,
"grad_norm": 3.328125,
"learning_rate": 7.451923076923077e-7,
"loss": 0.499,
"step": 77
},
{
"epoch": 26,
"grad_norm": 4.15625,
"learning_rate": 7.403846153846153e-7,
"loss": 0.485,
"step": 78
},
{
"epoch": 26.430976430976433,
"grad_norm": 3.09375,
"learning_rate": 7.355769230769231e-7,
"loss": 0.4855,
"step": 79
},
{
"epoch": 26.86195286195286,
"grad_norm": 3.109375,
"learning_rate": 7.307692307692307e-7,
"loss": 0.4821,
"step": 80
},
{
"epoch": 27,
"grad_norm": 3.75,
"learning_rate": 7.259615384615385e-7,
"loss": 0.457,
"step": 81
},
{
"epoch": 27.430976430976433,
"grad_norm": 2.859375,
"learning_rate": 7.211538461538461e-7,
"loss": 0.4594,
"step": 82
},
{
"epoch": 27.86195286195286,
"grad_norm": 2.9375,
"learning_rate": 7.163461538461538e-7,
"loss": 0.4983,
"step": 83
},
{
"epoch": 28,
"grad_norm": 3.78125,
"learning_rate": 7.115384615384616e-7,
"loss": 0.4535,
"step": 84
},
{
"epoch": 28.430976430976433,
"grad_norm": 2.828125,
"learning_rate": 7.067307692307692e-7,
"loss": 0.4679,
"step": 85
},
{
"epoch": 28.86195286195286,
"grad_norm": 3.140625,
"learning_rate": 7.019230769230769e-7,
"loss": 0.4771,
"step": 86
},
{
"epoch": 29,
"grad_norm": 3.796875,
"learning_rate": 6.971153846153845e-7,
"loss": 0.4686,
"step": 87
},
{
"epoch": 29.430976430976433,
"grad_norm": 2.859375,
"learning_rate": 6.923076923076922e-7,
"loss": 0.4678,
"step": 88
},
{
"epoch": 29.86195286195286,
"grad_norm": 2.671875,
"learning_rate": 6.875e-7,
"loss": 0.4572,
"step": 89
},
{
"epoch": 30,
"grad_norm": 3.484375,
"learning_rate": 6.826923076923076e-7,
"loss": 0.503,
"step": 90
},
{
"epoch": 30.430976430976433,
"grad_norm": 2.640625,
"learning_rate": 6.778846153846154e-7,
"loss": 0.4541,
"step": 91
},
{
"epoch": 30.86195286195286,
"grad_norm": 2.640625,
"learning_rate": 6.730769230769231e-7,
"loss": 0.47,
"step": 92
},
{
"epoch": 31,
"grad_norm": 3.453125,
"learning_rate": 6.682692307692307e-7,
"loss": 0.4849,
"step": 93
},
{
"epoch": 31.430976430976433,
"grad_norm": 2.828125,
"learning_rate": 6.634615384615384e-7,
"loss": 0.4649,
"step": 94
},
{
"epoch": 31.86195286195286,
"grad_norm": 2.6875,
"learning_rate": 6.586538461538461e-7,
"loss": 0.45,
"step": 95
},
{
"epoch": 32,
"grad_norm": 3.84375,
"learning_rate": 6.538461538461538e-7,
"loss": 0.4912,
"step": 96
},
{
"epoch": 32.43097643097643,
"grad_norm": 2.625,
"learning_rate": 6.490384615384615e-7,
"loss": 0.4627,
"step": 97
},
{
"epoch": 32.861952861952865,
"grad_norm": 2.6875,
"learning_rate": 6.442307692307693e-7,
"loss": 0.4642,
"step": 98
},
{
"epoch": 33,
"grad_norm": 4.0625,
"learning_rate": 6.394230769230768e-7,
"loss": 0.4362,
"step": 99
},
{
"epoch": 33.43097643097643,
"grad_norm": 2.578125,
"learning_rate": 6.346153846153845e-7,
"loss": 0.4674,
"step": 100
},
{
"epoch": 33.861952861952865,
"grad_norm": 2.546875,
"learning_rate": 6.298076923076923e-7,
"loss": 0.4527,
"step": 101
},
{
"epoch": 34,
"grad_norm": 3.359375,
"learning_rate": 6.249999999999999e-7,
"loss": 0.4408,
"step": 102
},
{
"epoch": 34.43097643097643,
"grad_norm": 2.484375,
"learning_rate": 6.201923076923077e-7,
"loss": 0.457,
"step": 103
},
{
"epoch": 34.861952861952865,
"grad_norm": 2.5,
"learning_rate": 6.153846153846154e-7,
"loss": 0.4615,
"step": 104
},
{
"epoch": 35,
"grad_norm": 3.359375,
"learning_rate": 6.105769230769232e-7,
"loss": 0.4326,
"step": 105
},
{
"epoch": 35.43097643097643,
"grad_norm": 2.5625,
"learning_rate": 6.057692307692307e-7,
"loss": 0.4549,
"step": 106
},
{
"epoch": 35.861952861952865,
"grad_norm": 2.53125,
"learning_rate": 6.009615384615384e-7,
"loss": 0.4608,
"step": 107
},
{
"epoch": 36,
"grad_norm": 3.5,
"learning_rate": 5.961538461538461e-7,
"loss": 0.4261,
"step": 108
},
{
"epoch": 36.43097643097643,
"grad_norm": 2.65625,
"learning_rate": 5.913461538461538e-7,
"loss": 0.4605,
"step": 109
},
{
"epoch": 36.861952861952865,
"grad_norm": 2.40625,
"learning_rate": 5.865384615384616e-7,
"loss": 0.4485,
"step": 110
},
{
"epoch": 37,
"grad_norm": 3.828125,
"learning_rate": 5.817307692307692e-7,
"loss": 0.433,
"step": 111
},
{
"epoch": 37.43097643097643,
"grad_norm": 2.40625,
"learning_rate": 5.769230769230768e-7,
"loss": 0.4547,
"step": 112
},
{
"epoch": 37.861952861952865,
"grad_norm": 2.5625,
"learning_rate": 5.721153846153846e-7,
"loss": 0.4533,
"step": 113
},
{
"epoch": 38,
"grad_norm": 3.109375,
"learning_rate": 5.673076923076922e-7,
"loss": 0.4244,
"step": 114
},
{
"epoch": 38.43097643097643,
"grad_norm": 2.390625,
"learning_rate": 5.625e-7,
"loss": 0.4618,
"step": 115
},
{
"epoch": 38.861952861952865,
"grad_norm": 2.265625,
"learning_rate": 5.576923076923077e-7,
"loss": 0.4479,
"step": 116
},
{
"epoch": 39,
"grad_norm": 3.546875,
"learning_rate": 5.528846153846153e-7,
"loss": 0.4054,
"step": 117
},
{
"epoch": 39.43097643097643,
"grad_norm": 2.296875,
"learning_rate": 5.480769230769231e-7,
"loss": 0.4479,
"step": 118
},
{
"epoch": 39.861952861952865,
"grad_norm": 2.296875,
"learning_rate": 5.432692307692307e-7,
"loss": 0.4403,
"step": 119
},
{
"epoch": 40,
"grad_norm": 3.53125,
"learning_rate": 5.384615384615384e-7,
"loss": 0.464,
"step": 120
},
{
"epoch": 40.43097643097643,
"grad_norm": 2.46875,
"learning_rate": 5.336538461538461e-7,
"loss": 0.4361,
"step": 121
},
{
"epoch": 40.861952861952865,
"grad_norm": 2.484375,
"learning_rate": 5.288461538461539e-7,
"loss": 0.4607,
"step": 122
},
{
"epoch": 41,
"grad_norm": 3.546875,
"learning_rate": 5.240384615384615e-7,
"loss": 0.4249,
"step": 123
},
{
"epoch": 41.43097643097643,
"grad_norm": 2.375,
"learning_rate": 5.192307692307692e-7,
"loss": 0.4387,
"step": 124
},
{
"epoch": 41.861952861952865,
"grad_norm": 2.4375,
"learning_rate": 5.144230769230769e-7,
"loss": 0.4352,
"step": 125
},
{
"epoch": 42,
"grad_norm": 3.71875,
"learning_rate": 5.096153846153845e-7,
"loss": 0.4855,
"step": 126
},
{
"epoch": 42.43097643097643,
"grad_norm": 2.375,
"learning_rate": 5.048076923076923e-7,
"loss": 0.4609,
"step": 127
},
{
"epoch": 42.861952861952865,
"grad_norm": 2.359375,
"learning_rate": 5e-7,
"loss": 0.4259,
"step": 128
},
{
"epoch": 43,
"grad_norm": 3.421875,
"learning_rate": 4.951923076923076e-7,
"loss": 0.4364,
"step": 129
},
{
"epoch": 43.43097643097643,
"grad_norm": 2.4375,
"learning_rate": 4.903846153846153e-7,
"loss": 0.4507,
"step": 130
},
{
"epoch": 43.861952861952865,
"grad_norm": 2.53125,
"learning_rate": 4.855769230769231e-7,
"loss": 0.4384,
"step": 131
},
{
"epoch": 44,
"grad_norm": 3.5625,
"learning_rate": 4.807692307692307e-7,
"loss": 0.4231,
"step": 132
},
{
"epoch": 44.43097643097643,
"grad_norm": 2.359375,
"learning_rate": 4.759615384615384e-7,
"loss": 0.4435,
"step": 133
},
{
"epoch": 44.861952861952865,
"grad_norm": 2.15625,
"learning_rate": 4.711538461538461e-7,
"loss": 0.4321,
"step": 134
},
{
"epoch": 45,
"grad_norm": 3.9375,
"learning_rate": 4.6634615384615384e-7,
"loss": 0.4561,
"step": 135
},
{
"epoch": 45.43097643097643,
"grad_norm": 2.28125,
"learning_rate": 4.6153846153846156e-7,
"loss": 0.4379,
"step": 136
},
{
"epoch": 45.861952861952865,
"grad_norm": 2.390625,
"learning_rate": 4.567307692307692e-7,
"loss": 0.445,
"step": 137
},
{
"epoch": 46,
"grad_norm": 3.6875,
"learning_rate": 4.519230769230769e-7,
"loss": 0.4272,
"step": 138
},
{
"epoch": 46.43097643097643,
"grad_norm": 2.234375,
"learning_rate": 4.471153846153846e-7,
"loss": 0.4434,
"step": 139
},
{
"epoch": 46.861952861952865,
"grad_norm": 2.40625,
"learning_rate": 4.423076923076923e-7,
"loss": 0.4447,
"step": 140
},
{
"epoch": 47,
"grad_norm": 3.28125,
"learning_rate": 4.375e-7,
"loss": 0.4029,
"step": 141
},
{
"epoch": 47.43097643097643,
"grad_norm": 2.1875,
"learning_rate": 4.326923076923077e-7,
"loss": 0.4264,
"step": 142
},
{
"epoch": 47.861952861952865,
"grad_norm": 2.421875,
"learning_rate": 4.278846153846153e-7,
"loss": 0.4471,
"step": 143
},
{
"epoch": 48,
"grad_norm": 3.40625,
"learning_rate": 4.2307692307692304e-7,
"loss": 0.4442,
"step": 144
},
{
"epoch": 48.43097643097643,
"grad_norm": 2.234375,
"learning_rate": 4.1826923076923076e-7,
"loss": 0.4241,
"step": 145
},
{
"epoch": 48.861952861952865,
"grad_norm": 2.234375,
"learning_rate": 4.134615384615384e-7,
"loss": 0.4439,
"step": 146
},
{
"epoch": 49,
"grad_norm": 3.703125,
"learning_rate": 4.0865384615384614e-7,
"loss": 0.4536,
"step": 147
},
{
"epoch": 49.43097643097643,
"grad_norm": 2.25,
"learning_rate": 4.0384615384615386e-7,
"loss": 0.4479,
"step": 148
},
{
"epoch": 49.861952861952865,
"grad_norm": 2.171875,
"learning_rate": 3.990384615384615e-7,
"loss": 0.428,
"step": 149
},
{
"epoch": 50,
"grad_norm": 3.734375,
"learning_rate": 3.942307692307692e-7,
"loss": 0.4263,
"step": 150
},
{
"epoch": 50.43097643097643,
"grad_norm": 2.28125,
"learning_rate": 3.894230769230769e-7,
"loss": 0.43,
"step": 151
},
{
"epoch": 50.861952861952865,
"grad_norm": 2.171875,
"learning_rate": 3.8461538461538463e-7,
"loss": 0.4453,
"step": 152
},
{
"epoch": 51,
"grad_norm": 3.75,
"learning_rate": 3.798076923076923e-7,
"loss": 0.423,
"step": 153
},
{
"epoch": 51.43097643097643,
"grad_norm": 2.3125,
"learning_rate": 3.75e-7,
"loss": 0.4188,
"step": 154
},
{
"epoch": 51.861952861952865,
"grad_norm": 2.28125,
"learning_rate": 3.701923076923077e-7,
"loss": 0.4514,
"step": 155
},
{
"epoch": 52,
"grad_norm": 3.203125,
"learning_rate": 3.6538461538461534e-7,
"loss": 0.4353,
"step": 156
},
{
"epoch": 52.43097643097643,
"grad_norm": 2.390625,
"learning_rate": 3.6057692307692306e-7,
"loss": 0.4247,
"step": 157
},
{
"epoch": 52.861952861952865,
"grad_norm": 2.3125,
"learning_rate": 3.557692307692308e-7,
"loss": 0.4497,
"step": 158
},
{
"epoch": 53,
"grad_norm": 3.71875,
"learning_rate": 3.5096153846153844e-7,
"loss": 0.4205,
"step": 159
},
{
"epoch": 53.43097643097643,
"grad_norm": 2.359375,
"learning_rate": 3.461538461538461e-7,
"loss": 0.4403,
"step": 160
},
{
"epoch": 53.861952861952865,
"grad_norm": 2.15625,
"learning_rate": 3.413461538461538e-7,
"loss": 0.4362,
"step": 161
},
{
"epoch": 54,
"grad_norm": 3.265625,
"learning_rate": 3.3653846153846154e-7,
"loss": 0.4107,
"step": 162
},
{
"epoch": 54.43097643097643,
"grad_norm": 2.34375,
"learning_rate": 3.317307692307692e-7,
"loss": 0.4428,
"step": 163
},
{
"epoch": 54.861952861952865,
"grad_norm": 2.203125,
"learning_rate": 3.269230769230769e-7,
"loss": 0.4328,
"step": 164
},
{
"epoch": 55,
"grad_norm": 3.53125,
"learning_rate": 3.2211538461538464e-7,
"loss": 0.4097,
"step": 165
},
{
"epoch": 55.43097643097643,
"grad_norm": 2.125,
"learning_rate": 3.1730769230769225e-7,
"loss": 0.4295,
"step": 166
},
{
"epoch": 55.861952861952865,
"grad_norm": 2.328125,
"learning_rate": 3.1249999999999997e-7,
"loss": 0.4331,
"step": 167
},
{
"epoch": 56,
"grad_norm": 3.484375,
"learning_rate": 3.076923076923077e-7,
"loss": 0.4458,
"step": 168
},
{
"epoch": 56.43097643097643,
"grad_norm": 2.140625,
"learning_rate": 3.0288461538461536e-7,
"loss": 0.4323,
"step": 169
},
{
"epoch": 56.861952861952865,
"grad_norm": 2.234375,
"learning_rate": 2.980769230769231e-7,
"loss": 0.4336,
"step": 170
},
{
"epoch": 57,
"grad_norm": 3.296875,
"learning_rate": 2.932692307692308e-7,
"loss": 0.4337,
"step": 171
},
{
"epoch": 57.43097643097643,
"grad_norm": 2.28125,
"learning_rate": 2.884615384615384e-7,
"loss": 0.413,
"step": 172
},
{
"epoch": 57.861952861952865,
"grad_norm": 2.09375,
"learning_rate": 2.836538461538461e-7,
"loss": 0.4392,
"step": 173
},
{
"epoch": 58,
"grad_norm": 3.796875,
"learning_rate": 2.7884615384615384e-7,
"loss": 0.471,
"step": 174
},
{
"epoch": 58.43097643097643,
"grad_norm": 2.296875,
"learning_rate": 2.7403846153846156e-7,
"loss": 0.4298,
"step": 175
},
{
"epoch": 58.861952861952865,
"grad_norm": 2.34375,
"learning_rate": 2.692307692307692e-7,
"loss": 0.4323,
"step": 176
},
{
"epoch": 59,
"grad_norm": 3.71875,
"learning_rate": 2.6442307692307694e-7,
"loss": 0.4405,
"step": 177
},
{
"epoch": 59.43097643097643,
"grad_norm": 2.15625,
"learning_rate": 2.596153846153846e-7,
"loss": 0.4249,
"step": 178
},
{
"epoch": 59.861952861952865,
"grad_norm": 2.1875,
"learning_rate": 2.5480769230769227e-7,
"loss": 0.428,
"step": 179
},
{
"epoch": 60,
"grad_norm": 3.65625,
"learning_rate": 2.5e-7,
"loss": 0.4671,
"step": 180
},
{
"epoch": 60.43097643097643,
"grad_norm": 2.328125,
"learning_rate": 2.4519230769230765e-7,
"loss": 0.4329,
"step": 181
},
{
"epoch": 60.861952861952865,
"grad_norm": 2.359375,
"learning_rate": 2.4038461538461537e-7,
"loss": 0.4281,
"step": 182
},
{
"epoch": 61,
"grad_norm": 3.53125,
"learning_rate": 2.3557692307692306e-7,
"loss": 0.4393,
"step": 183
},
{
"epoch": 61.43097643097643,
"grad_norm": 2.203125,
"learning_rate": 2.3076923076923078e-7,
"loss": 0.4434,
"step": 184
},
{
"epoch": 61.861952861952865,
"grad_norm": 2.375,
"learning_rate": 2.2596153846153845e-7,
"loss": 0.4307,
"step": 185
},
{
"epoch": 62,
"grad_norm": 4.0625,
"learning_rate": 2.2115384615384614e-7,
"loss": 0.3933,
"step": 186
},
{
"epoch": 62.43097643097643,
"grad_norm": 2.078125,
"learning_rate": 2.1634615384615386e-7,
"loss": 0.4351,
"step": 187
},
{
"epoch": 62.861952861952865,
"grad_norm": 2.265625,
"learning_rate": 2.1153846153846152e-7,
"loss": 0.4273,
"step": 188
},
{
"epoch": 63,
"grad_norm": 3.90625,
"learning_rate": 2.067307692307692e-7,
"loss": 0.4315,
"step": 189
},
{
"epoch": 63.43097643097643,
"grad_norm": 2.390625,
"learning_rate": 2.0192307692307693e-7,
"loss": 0.4215,
"step": 190
},
{
"epoch": 63.861952861952865,
"grad_norm": 2.421875,
"learning_rate": 1.971153846153846e-7,
"loss": 0.439,
"step": 191
},
{
"epoch": 64,
"grad_norm": 3.296875,
"learning_rate": 1.9230769230769231e-7,
"loss": 0.4359,
"step": 192
},
{
"epoch": 64.43097643097643,
"grad_norm": 2.375,
"learning_rate": 1.875e-7,
"loss": 0.4373,
"step": 193
},
{
"epoch": 64.86195286195286,
"grad_norm": 2.140625,
"learning_rate": 1.8269230769230767e-7,
"loss": 0.431,
"step": 194
},
{
"epoch": 65,
"grad_norm": 3.359375,
"learning_rate": 1.778846153846154e-7,
"loss": 0.4122,
"step": 195
},
{
"epoch": 65.43097643097643,
"grad_norm": 2.375,
"learning_rate": 1.7307692307692305e-7,
"loss": 0.4426,
"step": 196
},
{
"epoch": 65.86195286195286,
"grad_norm": 2.203125,
"learning_rate": 1.6826923076923077e-7,
"loss": 0.4228,
"step": 197
},
{
"epoch": 66,
"grad_norm": 3.5,
"learning_rate": 1.6346153846153846e-7,
"loss": 0.4206,
"step": 198
},
{
"epoch": 66.43097643097643,
"grad_norm": 2.140625,
"learning_rate": 1.5865384615384613e-7,
"loss": 0.4272,
"step": 199
},
{
"epoch": 66.86195286195286,
"grad_norm": 2.21875,
"learning_rate": 1.5384615384615385e-7,
"loss": 0.4375,
"step": 200
},
{
"epoch": 67,
"grad_norm": 3.375,
"learning_rate": 1.4903846153846154e-7,
"loss": 0.4203,
"step": 201
},
{
"epoch": 67.43097643097643,
"grad_norm": 2.390625,
"learning_rate": 1.442307692307692e-7,
"loss": 0.4314,
"step": 202
},
{
"epoch": 67.86195286195286,
"grad_norm": 2.296875,
"learning_rate": 1.3942307692307692e-7,
"loss": 0.426,
"step": 203
},
{
"epoch": 68,
"grad_norm": 3.4375,
"learning_rate": 1.346153846153846e-7,
"loss": 0.442,
"step": 204
},
{
"epoch": 68.43097643097643,
"grad_norm": 2.4375,
"learning_rate": 1.298076923076923e-7,
"loss": 0.4342,
"step": 205
},
{
"epoch": 68.86195286195286,
"grad_norm": 2.25,
"learning_rate": 1.25e-7,
"loss": 0.4299,
"step": 206
},
{
"epoch": 69,
"grad_norm": 3.90625,
"learning_rate": 1.2019230769230769e-7,
"loss": 0.4224,
"step": 207
},
{
"epoch": 69.43097643097643,
"grad_norm": 2.328125,
"learning_rate": 1.1538461538461539e-7,
"loss": 0.4266,
"step": 208
},
{
"epoch": 69.86195286195286,
"grad_norm": 2.234375,
"learning_rate": 1.1057692307692307e-7,
"loss": 0.4435,
"step": 209
},
{
"epoch": 70,
"grad_norm": 3.703125,
"learning_rate": 1.0576923076923076e-7,
"loss": 0.4042,
"step": 210
},
{
"epoch": 70.43097643097643,
"grad_norm": 2.15625,
"learning_rate": 1.0096153846153847e-7,
"loss": 0.4341,
"step": 211
},
{
"epoch": 70.86195286195286,
"grad_norm": 2.140625,
"learning_rate": 9.615384615384616e-8,
"loss": 0.4251,
"step": 212
},
{
"epoch": 71,
"grad_norm": 3.234375,
"learning_rate": 9.134615384615383e-8,
"loss": 0.4363,
"step": 213
},
{
"epoch": 71.43097643097643,
"grad_norm": 2.21875,
"learning_rate": 8.653846153846153e-8,
"loss": 0.4494,
"step": 214
},
{
"epoch": 71.86195286195286,
"grad_norm": 2.140625,
"learning_rate": 8.173076923076923e-8,
"loss": 0.4059,
"step": 215
},
{
"epoch": 72,
"grad_norm": 3.8125,
"learning_rate": 7.692307692307692e-8,
"loss": 0.4442,
"step": 216
},
{
"epoch": 72.43097643097643,
"grad_norm": 2.40625,
"learning_rate": 7.21153846153846e-8,
"loss": 0.4361,
"step": 217
},
{
"epoch": 72.86195286195286,
"grad_norm": 2.28125,
"learning_rate": 6.73076923076923e-8,
"loss": 0.427,
"step": 218
},
{
"epoch": 73,
"grad_norm": 3.5,
"learning_rate": 6.25e-8,
"loss": 0.4243,
"step": 219
},
{
"epoch": 73.43097643097643,
"grad_norm": 2.234375,
"learning_rate": 5.7692307692307695e-8,
"loss": 0.4249,
"step": 220
},
{
"epoch": 73.86195286195286,
"grad_norm": 2.21875,
"learning_rate": 5.288461538461538e-8,
"loss": 0.4351,
"step": 221
},
{
"epoch": 74,
"grad_norm": 3.421875,
"learning_rate": 4.807692307692308e-8,
"loss": 0.4347,
"step": 222
},
{
"epoch": 74.43097643097643,
"grad_norm": 2.34375,
"learning_rate": 4.326923076923076e-8,
"loss": 0.4375,
"step": 223
},
{
"epoch": 74.86195286195286,
"grad_norm": 2.328125,
"learning_rate": 3.846153846153846e-8,
"loss": 0.4315,
"step": 224
},
{
"epoch": 75,
"grad_norm": 3.796875,
"learning_rate": 3.365384615384615e-8,
"loss": 0.4076,
"step": 225
},
{
"epoch": 75.43097643097643,
"grad_norm": 2.1875,
"learning_rate": 2.8846153846153848e-8,
"loss": 0.428,
"step": 226
},
{
"epoch": 75.86195286195286,
"grad_norm": 2.1875,
"learning_rate": 2.403846153846154e-8,
"loss": 0.4327,
"step": 227
},
{
"epoch": 76,
"grad_norm": 3.328125,
"learning_rate": 1.923076923076923e-8,
"loss": 0.431,
"step": 228
},
{
"epoch": 76.43097643097643,
"grad_norm": 2.234375,
"learning_rate": 1.4423076923076924e-8,
"loss": 0.4342,
"step": 229
},
{
"epoch": 76.86195286195286,
"grad_norm": 2.421875,
"learning_rate": 9.615384615384615e-9,
"loss": 0.4324,
"step": 230
},
{
"epoch": 77,
"grad_norm": 3.890625,
"learning_rate": 4.807692307692308e-9,
"loss": 0.4132,
"step": 231
},
{
"epoch": 77.43097643097643,
"grad_norm": 2.234375,
"learning_rate": 0,
"loss": 0.4387,
"step": 232
}
] | 1 | 232 | 0 | 116 | 10 |
{
"TrainerControl": {
"args": {
"should_epoch_stop": false,
"should_evaluate": false,
"should_log": false,
"should_save": true,
"should_training_stop": true
},
"attributes": {}
}
}
| 567,702,596,938,014,700 | 1 | null | null | null | null | 77.430976 | 500 | 232 | false | true | true |
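The learning_rate values in the log histories above follow a simple pattern: a linear warmup over the first 24 logged steps up to a peak of 1e-6, then a linear decay that reaches zero at step 232. A small sketch that reproduces this schedule — the constants are read off the logged values themselves, not taken from any configuration in this dataset:

import math

PEAK_LR = 1e-6        # value logged at step 24
WARMUP_STEPS = 24
TOTAL_STEPS = 232     # final logged step, where learning_rate is 0

def lr_at(step: int) -> float:
    # Linear warmup followed by linear decay, as implied by the logs above.
    if step <= WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    return PEAK_LR * (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS)

# Spot checks against values that appear verbatim in the logs above.
assert math.isclose(lr_at(12), 5e-7)                   # step 12
assert math.isclose(lr_at(100), 6.346153846153845e-7)  # step 100
assert lr_at(TOTAL_STEPS) == 0.0                       # step 232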
[
{
"epoch": 0.43097643097643096,
"grad_norm": 12.9375,
"learning_rate": 4.166666666666666e-8,
"loss": 1.133,
"step": 1
},
{
"epoch": 0.8619528619528619,
"grad_norm": 13.6875,
"learning_rate": 8.333333333333333e-8,
"loss": 1.1669,
"step": 2
},
{
"epoch": 1,
"grad_norm": 12.75,
"learning_rate": 1.25e-7,
"loss": 1.1397,
"step": 3
},
{
"epoch": 1.430976430976431,
"grad_norm": 13.0625,
"learning_rate": 1.6666666666666665e-7,
"loss": 1.1526,
"step": 4
},
{
"epoch": 1.861952861952862,
"grad_norm": 13.0625,
"learning_rate": 2.0833333333333333e-7,
"loss": 1.1573,
"step": 5
},
{
"epoch": 2,
"grad_norm": 13.5,
"learning_rate": 2.5e-7,
"loss": 1.1022,
"step": 6
},
{
"epoch": 2.430976430976431,
"grad_norm": 13.375,
"learning_rate": 2.916666666666667e-7,
"loss": 1.1536,
"step": 7
},
{
"epoch": 2.861952861952862,
"grad_norm": 12.4375,
"learning_rate": 3.333333333333333e-7,
"loss": 1.1162,
"step": 8
},
{
"epoch": 3,
"grad_norm": 13.875,
"learning_rate": 3.75e-7,
"loss": 1.2274,
"step": 9
},
{
"epoch": 3.430976430976431,
"grad_norm": 12.875,
"learning_rate": 4.1666666666666667e-7,
"loss": 1.1449,
"step": 10
},
{
"epoch": 3.861952861952862,
"grad_norm": 12.8125,
"learning_rate": 4.5833333333333327e-7,
"loss": 1.1399,
"step": 11
},
{
"epoch": 4,
"grad_norm": 13.375,
"learning_rate": 5e-7,
"loss": 1.1437,
"step": 12
},
{
"epoch": 4.430976430976431,
"grad_norm": 12.75,
"learning_rate": 5.416666666666666e-7,
"loss": 1.1336,
"step": 13
},
{
"epoch": 4.861952861952862,
"grad_norm": 12.5,
"learning_rate": 5.833333333333334e-7,
"loss": 1.1251,
"step": 14
},
{
"epoch": 5,
"grad_norm": 13.875,
"learning_rate": 6.249999999999999e-7,
"loss": 1.1827,
"step": 15
},
{
"epoch": 5.430976430976431,
"grad_norm": 12.25,
"learning_rate": 6.666666666666666e-7,
"loss": 1.1322,
"step": 16
},
{
"epoch": 5.861952861952862,
"grad_norm": 12.5,
"learning_rate": 7.083333333333334e-7,
"loss": 1.1,
"step": 17
},
{
"epoch": 6,
"grad_norm": 12.875,
"learning_rate": 7.5e-7,
"loss": 1.1677,
"step": 18
},
{
"epoch": 6.430976430976431,
"grad_norm": 12,
"learning_rate": 7.916666666666666e-7,
"loss": 1.1225,
"step": 19
},
{
"epoch": 6.861952861952862,
"grad_norm": 11.875,
"learning_rate": 8.333333333333333e-7,
"loss": 1.0985,
"step": 20
},
{
"epoch": 7,
"grad_norm": 12.6875,
"learning_rate": 8.75e-7,
"loss": 1.0917,
"step": 21
},
{
"epoch": 7.430976430976431,
"grad_norm": 11.4375,
"learning_rate": 9.166666666666665e-7,
"loss": 1.1029,
"step": 22
},
{
"epoch": 7.861952861952862,
"grad_norm": 11.5,
"learning_rate": 9.583333333333334e-7,
"loss": 1.0748,
"step": 23
},
{
"epoch": 8,
"grad_norm": 11.8125,
"learning_rate": 0.000001,
"loss": 1.1137,
"step": 24
},
{
"epoch": 8.43097643097643,
"grad_norm": 11.3125,
"learning_rate": 9.951923076923077e-7,
"loss": 1.0812,
"step": 25
},
{
"epoch": 8.861952861952862,
"grad_norm": 10.5,
"learning_rate": 9.903846153846153e-7,
"loss": 1.0494,
"step": 26
},
{
"epoch": 9,
"grad_norm": 10.5,
"learning_rate": 9.85576923076923e-7,
"loss": 1.0699,
"step": 27
},
{
"epoch": 9.43097643097643,
"grad_norm": 9.75,
"learning_rate": 9.807692307692306e-7,
"loss": 1.0282,
"step": 28
},
{
"epoch": 9.861952861952862,
"grad_norm": 10.25,
"learning_rate": 9.759615384615384e-7,
"loss": 1.0428,
"step": 29
},
{
"epoch": 10,
"grad_norm": 9.875,
"learning_rate": 9.711538461538462e-7,
"loss": 1.0142,
"step": 30
},
{
"epoch": 10.43097643097643,
"grad_norm": 9.625,
"learning_rate": 9.663461538461537e-7,
"loss": 1.0216,
"step": 31
},
{
"epoch": 10.861952861952862,
"grad_norm": 9.625,
"learning_rate": 9.615384615384615e-7,
"loss": 0.9952,
"step": 32
},
{
"epoch": 11,
"grad_norm": 10.125,
"learning_rate": 9.567307692307693e-7,
"loss": 0.9656,
"step": 33
},
{
"epoch": 11.43097643097643,
"grad_norm": 9.6875,
"learning_rate": 9.519230769230768e-7,
"loss": 0.9718,
"step": 34
},
{
"epoch": 11.861952861952862,
"grad_norm": 9.5,
"learning_rate": 9.471153846153846e-7,
"loss": 0.9743,
"step": 35
},
{
"epoch": 12,
"grad_norm": 9.6875,
"learning_rate": 9.423076923076923e-7,
"loss": 0.984,
"step": 36
},
{
"epoch": 12.43097643097643,
"grad_norm": 9.375,
"learning_rate": 9.374999999999999e-7,
"loss": 0.9413,
"step": 37
},
{
"epoch": 12.861952861952862,
"grad_norm": 9.75,
"learning_rate": 9.326923076923077e-7,
"loss": 0.9654,
"step": 38
},
{
"epoch": 13,
"grad_norm": 9.6875,
"learning_rate": 9.278846153846154e-7,
"loss": 0.9133,
"step": 39
},
{
"epoch": 13.43097643097643,
"grad_norm": 9.0625,
"learning_rate": 9.230769230769231e-7,
"loss": 0.9116,
"step": 40
},
{
"epoch": 13.861952861952862,
"grad_norm": 9.625,
"learning_rate": 9.182692307692307e-7,
"loss": 0.9378,
"step": 41
},
{
"epoch": 14,
"grad_norm": 8.875,
"learning_rate": 9.134615384615383e-7,
"loss": 0.8992,
"step": 42
},
{
"epoch": 14.43097643097643,
"grad_norm": 9,
"learning_rate": 9.086538461538461e-7,
"loss": 0.8975,
"step": 43
},
{
"epoch": 14.861952861952862,
"grad_norm": 8.625,
"learning_rate": 9.038461538461538e-7,
"loss": 0.8899,
"step": 44
},
{
"epoch": 15,
"grad_norm": 9.375,
"learning_rate": 8.990384615384616e-7,
"loss": 0.9031,
"step": 45
},
{
"epoch": 15.43097643097643,
"grad_norm": 8.4375,
"learning_rate": 8.942307692307692e-7,
"loss": 0.8635,
"step": 46
},
{
"epoch": 15.861952861952862,
"grad_norm": 8.75,
"learning_rate": 8.894230769230768e-7,
"loss": 0.8832,
"step": 47
},
{
"epoch": 16,
"grad_norm": 8.625,
"learning_rate": 8.846153846153846e-7,
"loss": 0.8494,
"step": 48
},
{
"epoch": 16.430976430976433,
"grad_norm": 8.5625,
"learning_rate": 8.798076923076922e-7,
"loss": 0.8395,
"step": 49
},
{
"epoch": 16.86195286195286,
"grad_norm": 8.1875,
"learning_rate": 8.75e-7,
"loss": 0.8598,
"step": 50
},
{
"epoch": 17,
"grad_norm": 8.125,
"learning_rate": 8.701923076923077e-7,
"loss": 0.8247,
"step": 51
},
{
"epoch": 17.430976430976433,
"grad_norm": 8.125,
"learning_rate": 8.653846153846154e-7,
"loss": 0.8142,
"step": 52
},
{
"epoch": 17.86195286195286,
"grad_norm": 8.125,
"learning_rate": 8.605769230769231e-7,
"loss": 0.8256,
"step": 53
},
{
"epoch": 18,
"grad_norm": 7.84375,
"learning_rate": 8.557692307692306e-7,
"loss": 0.8434,
"step": 54
},
{
"epoch": 18.430976430976433,
"grad_norm": 7.5625,
"learning_rate": 8.509615384615384e-7,
"loss": 0.805,
"step": 55
},
{
"epoch": 18.86195286195286,
"grad_norm": 7.8125,
"learning_rate": 8.461538461538461e-7,
"loss": 0.7945,
"step": 56
},
{
"epoch": 19,
"grad_norm": 8.5,
"learning_rate": 8.413461538461539e-7,
"loss": 0.8119,
"step": 57
},
{
"epoch": 19.430976430976433,
"grad_norm": 7.5625,
"learning_rate": 8.365384615384615e-7,
"loss": 0.7873,
"step": 58
},
{
"epoch": 19.86195286195286,
"grad_norm": 7.375,
"learning_rate": 8.317307692307692e-7,
"loss": 0.7888,
"step": 59
},
{
"epoch": 20,
"grad_norm": 7.3125,
"learning_rate": 8.269230769230768e-7,
"loss": 0.7436,
"step": 60
},
{
"epoch": 20.430976430976433,
"grad_norm": 6.6875,
"learning_rate": 8.221153846153845e-7,
"loss": 0.7814,
"step": 61
},
{
"epoch": 20.86195286195286,
"grad_norm": 6.5625,
"learning_rate": 8.173076923076923e-7,
"loss": 0.7413,
"step": 62
},
{
"epoch": 21,
"grad_norm": 7.53125,
"learning_rate": 8.125e-7,
"loss": 0.7735,
"step": 63
},
{
"epoch": 21.430976430976433,
"grad_norm": 5.96875,
"learning_rate": 8.076923076923077e-7,
"loss": 0.7398,
"step": 64
},
{
"epoch": 21.86195286195286,
"grad_norm": 6.15625,
"learning_rate": 8.028846153846154e-7,
"loss": 0.7457,
"step": 65
},
{
"epoch": 22,
"grad_norm": 6.59375,
"learning_rate": 7.98076923076923e-7,
"loss": 0.7762,
"step": 66
},
{
"epoch": 22.430976430976433,
"grad_norm": 5.25,
"learning_rate": 7.932692307692307e-7,
"loss": 0.7274,
"step": 67
},
{
"epoch": 22.86195286195286,
"grad_norm": 5.40625,
"learning_rate": 7.884615384615384e-7,
"loss": 0.7495,
"step": 68
},
{
"epoch": 23,
"grad_norm": 6.25,
"learning_rate": 7.836538461538462e-7,
"loss": 0.7053,
"step": 69
},
{
"epoch": 23.430976430976433,
"grad_norm": 5.09375,
"learning_rate": 7.788461538461538e-7,
"loss": 0.726,
"step": 70
},
{
"epoch": 23.86195286195286,
"grad_norm": 4.8125,
"learning_rate": 7.740384615384615e-7,
"loss": 0.7257,
"step": 71
},
{
"epoch": 24,
"grad_norm": 5.78125,
"learning_rate": 7.692307692307693e-7,
"loss": 0.7055,
"step": 72
},
{
"epoch": 24.430976430976433,
"grad_norm": 4.46875,
"learning_rate": 7.644230769230768e-7,
"loss": 0.7143,
"step": 73
},
{
"epoch": 24.86195286195286,
"grad_norm": 4.875,
"learning_rate": 7.596153846153846e-7,
"loss": 0.7021,
"step": 74
},
{
"epoch": 25,
"grad_norm": 5.1875,
"learning_rate": 7.548076923076922e-7,
"loss": 0.7505,
"step": 75
},
{
"epoch": 25.430976430976433,
"grad_norm": 4.125,
"learning_rate": 7.5e-7,
"loss": 0.7027,
"step": 76
},
{
"epoch": 25.86195286195286,
"grad_norm": 4.0625,
"learning_rate": 7.451923076923077e-7,
"loss": 0.7091,
"step": 77
},
{
"epoch": 26,
"grad_norm": 5.09375,
"learning_rate": 7.403846153846153e-7,
"loss": 0.7165,
"step": 78
},
{
"epoch": 26.430976430976433,
"grad_norm": 3.828125,
"learning_rate": 7.355769230769231e-7,
"loss": 0.6989,
"step": 79
},
{
"epoch": 26.86195286195286,
"grad_norm": 3.8125,
"learning_rate": 7.307692307692307e-7,
"loss": 0.6927,
"step": 80
},
{
"epoch": 27,
"grad_norm": 5.15625,
"learning_rate": 7.259615384615385e-7,
"loss": 0.7312,
"step": 81
},
{
"epoch": 27.430976430976433,
"grad_norm": 3.6875,
"learning_rate": 7.211538461538461e-7,
"loss": 0.6839,
"step": 82
},
{
"epoch": 27.86195286195286,
"grad_norm": 3.734375,
"learning_rate": 7.163461538461538e-7,
"loss": 0.695,
"step": 83
},
{
"epoch": 28,
"grad_norm": 5,
"learning_rate": 7.115384615384616e-7,
"loss": 0.7322,
"step": 84
},
{
"epoch": 28.430976430976433,
"grad_norm": 3.546875,
"learning_rate": 7.067307692307692e-7,
"loss": 0.6901,
"step": 85
},
{
"epoch": 28.86195286195286,
"grad_norm": 3.328125,
"learning_rate": 7.019230769230769e-7,
"loss": 0.7092,
"step": 86
},
{
"epoch": 29,
"grad_norm": 4.40625,
"learning_rate": 6.971153846153845e-7,
"loss": 0.6386,
"step": 87
},
{
"epoch": 29.430976430976433,
"grad_norm": 3.453125,
"learning_rate": 6.923076923076922e-7,
"loss": 0.6861,
"step": 88
},
{
"epoch": 29.86195286195286,
"grad_norm": 3.203125,
"learning_rate": 6.875e-7,
"loss": 0.6823,
"step": 89
},
{
"epoch": 30,
"grad_norm": 4.28125,
"learning_rate": 6.826923076923076e-7,
"loss": 0.7013,
"step": 90
},
{
"epoch": 30.430976430976433,
"grad_norm": 3.109375,
"learning_rate": 6.778846153846154e-7,
"loss": 0.6564,
"step": 91
},
{
"epoch": 30.86195286195286,
"grad_norm": 3.0625,
"learning_rate": 6.730769230769231e-7,
"loss": 0.7077,
"step": 92
},
{
"epoch": 31,
"grad_norm": 4.5,
"learning_rate": 6.682692307692307e-7,
"loss": 0.6906,
"step": 93
},
{
"epoch": 31.430976430976433,
"grad_norm": 2.921875,
"learning_rate": 6.634615384615384e-7,
"loss": 0.6646,
"step": 94
},
{
"epoch": 31.86195286195286,
"grad_norm": 3.296875,
"learning_rate": 6.586538461538461e-7,
"loss": 0.674,
"step": 95
},
{
"epoch": 32,
"grad_norm": 5.625,
"learning_rate": 6.538461538461538e-7,
"loss": 0.7492,
"step": 96
},
{
"epoch": 32.43097643097643,
"grad_norm": 2.84375,
"learning_rate": 6.490384615384615e-7,
"loss": 0.683,
"step": 97
},
{
"epoch": 32.861952861952865,
"grad_norm": 2.890625,
"learning_rate": 6.442307692307693e-7,
"loss": 0.6796,
"step": 98
},
{
"epoch": 33,
"grad_norm": 4.4375,
"learning_rate": 6.394230769230768e-7,
"loss": 0.6483,
"step": 99
},
{
"epoch": 33.43097643097643,
"grad_norm": 2.796875,
"learning_rate": 6.346153846153845e-7,
"loss": 0.6884,
"step": 100
},
{
"epoch": 33.861952861952865,
"grad_norm": 2.765625,
"learning_rate": 6.298076923076923e-7,
"loss": 0.6739,
"step": 101
},
{
"epoch": 34,
"grad_norm": 4.59375,
"learning_rate": 6.249999999999999e-7,
"loss": 0.634,
"step": 102
},
{
"epoch": 34.43097643097643,
"grad_norm": 2.765625,
"learning_rate": 6.201923076923077e-7,
"loss": 0.674,
"step": 103
},
{
"epoch": 34.861952861952865,
"grad_norm": 2.8125,
"learning_rate": 6.153846153846154e-7,
"loss": 0.6732,
"step": 104
},
{
"epoch": 35,
"grad_norm": 3.984375,
"learning_rate": 6.105769230769232e-7,
"loss": 0.6642,
"step": 105
},
{
"epoch": 35.43097643097643,
"grad_norm": 2.78125,
"learning_rate": 6.057692307692307e-7,
"loss": 0.663,
"step": 106
},
{
"epoch": 35.861952861952865,
"grad_norm": 2.9375,
"learning_rate": 6.009615384615384e-7,
"loss": 0.6762,
"step": 107
},
{
"epoch": 36,
"grad_norm": 4.03125,
"learning_rate": 5.961538461538461e-7,
"loss": 0.6714,
"step": 108
},
{
"epoch": 36.43097643097643,
"grad_norm": 2.75,
"learning_rate": 5.913461538461538e-7,
"loss": 0.6838,
"step": 109
},
{
"epoch": 36.861952861952865,
"grad_norm": 2.828125,
"learning_rate": 5.865384615384616e-7,
"loss": 0.6636,
"step": 110
},
{
"epoch": 37,
"grad_norm": 4.125,
"learning_rate": 5.817307692307692e-7,
"loss": 0.6345,
"step": 111
},
{
"epoch": 37.43097643097643,
"grad_norm": 2.65625,
"learning_rate": 5.769230769230768e-7,
"loss": 0.6715,
"step": 112
},
{
"epoch": 37.861952861952865,
"grad_norm": 2.671875,
"learning_rate": 5.721153846153846e-7,
"loss": 0.6528,
"step": 113
},
{
"epoch": 38,
"grad_norm": 3.890625,
"learning_rate": 5.673076923076922e-7,
"loss": 0.69,
"step": 114
},
{
"epoch": 38.43097643097643,
"grad_norm": 2.484375,
"learning_rate": 5.625e-7,
"loss": 0.6795,
"step": 115
},
{
"epoch": 38.861952861952865,
"grad_norm": 2.609375,
"learning_rate": 5.576923076923077e-7,
"loss": 0.6652,
"step": 116
},
{
"epoch": 39,
"grad_norm": 4.3125,
"learning_rate": 5.528846153846153e-7,
"loss": 0.6105,
"step": 117
},
{
"epoch": 39.43097643097643,
"grad_norm": 2.6875,
"learning_rate": 5.480769230769231e-7,
"loss": 0.6669,
"step": 118
},
{
"epoch": 39.861952861952865,
"grad_norm": 2.671875,
"learning_rate": 5.432692307692307e-7,
"loss": 0.662,
"step": 119
},
{
"epoch": 40,
"grad_norm": 4.1875,
"learning_rate": 5.384615384615384e-7,
"loss": 0.6532,
"step": 120
},
{
"epoch": 40.43097643097643,
"grad_norm": 2.578125,
"learning_rate": 5.336538461538461e-7,
"loss": 0.6306,
"step": 121
},
{
"epoch": 40.861952861952865,
"grad_norm": 2.578125,
"learning_rate": 5.288461538461539e-7,
"loss": 0.689,
"step": 122
},
{
"epoch": 41,
"grad_norm": 3.984375,
"learning_rate": 5.240384615384615e-7,
"loss": 0.67,
"step": 123
},
{
"epoch": 41.43097643097643,
"grad_norm": 2.5625,
"learning_rate": 5.192307692307692e-7,
"loss": 0.6475,
"step": 124
},
{
"epoch": 41.861952861952865,
"grad_norm": 2.53125,
"learning_rate": 5.144230769230769e-7,
"loss": 0.6606,
"step": 125
},
{
"epoch": 42,
"grad_norm": 3.9375,
"learning_rate": 5.096153846153845e-7,
"loss": 0.7006,
"step": 126
},
{
"epoch": 42.43097643097643,
"grad_norm": 2.5625,
"learning_rate": 5.048076923076923e-7,
"loss": 0.6532,
"step": 127
},
{
"epoch": 42.861952861952865,
"grad_norm": 2.515625,
"learning_rate": 5e-7,
"loss": 0.6641,
"step": 128
},
{
"epoch": 43,
"grad_norm": 4.0625,
"learning_rate": 4.951923076923076e-7,
"loss": 0.6607,
"step": 129
},
{
"epoch": 43.43097643097643,
"grad_norm": 2.578125,
"learning_rate": 4.903846153846153e-7,
"loss": 0.6739,
"step": 130
},
{
"epoch": 43.861952861952865,
"grad_norm": 2.671875,
"learning_rate": 4.855769230769231e-7,
"loss": 0.6452,
"step": 131
},
{
"epoch": 44,
"grad_norm": 4.09375,
"learning_rate": 4.807692307692307e-7,
"loss": 0.6473,
"step": 132
},
{
"epoch": 44.43097643097643,
"grad_norm": 2.609375,
"learning_rate": 4.759615384615384e-7,
"loss": 0.6559,
"step": 133
},
{
"epoch": 44.861952861952865,
"grad_norm": 2.390625,
"learning_rate": 4.711538461538461e-7,
"loss": 0.6482,
"step": 134
},
{
"epoch": 45,
"grad_norm": 4.15625,
"learning_rate": 4.6634615384615384e-7,
"loss": 0.6874,
"step": 135
},
{
"epoch": 45.43097643097643,
"grad_norm": 2.421875,
"learning_rate": 4.6153846153846156e-7,
"loss": 0.6561,
"step": 136
},
{
"epoch": 45.861952861952865,
"grad_norm": 2.5,
"learning_rate": 4.567307692307692e-7,
"loss": 0.6676,
"step": 137
},
{
"epoch": 46,
"grad_norm": 4.03125,
"learning_rate": 4.519230769230769e-7,
"loss": 0.6185,
"step": 138
},
{
"epoch": 46.43097643097643,
"grad_norm": 2.59375,
"learning_rate": 4.471153846153846e-7,
"loss": 0.6455,
"step": 139
},
{
"epoch": 46.861952861952865,
"grad_norm": 2.640625,
"learning_rate": 4.423076923076923e-7,
"loss": 0.671,
"step": 140
},
{
"epoch": 47,
"grad_norm": 3.6875,
"learning_rate": 4.375e-7,
"loss": 0.6383,
"step": 141
},
{
"epoch": 47.43097643097643,
"grad_norm": 2.515625,
"learning_rate": 4.326923076923077e-7,
"loss": 0.6588,
"step": 142
},
{
"epoch": 47.861952861952865,
"grad_norm": 2.71875,
"learning_rate": 4.278846153846153e-7,
"loss": 0.6583,
"step": 143
},
{
"epoch": 48,
"grad_norm": 4.125,
"learning_rate": 4.2307692307692304e-7,
"loss": 0.6262,
"step": 144
},
{
"epoch": 48.43097643097643,
"grad_norm": 2.515625,
"learning_rate": 4.1826923076923076e-7,
"loss": 0.6363,
"step": 145
},
{
"epoch": 48.861952861952865,
"grad_norm": 2.40625,
"learning_rate": 4.134615384615384e-7,
"loss": 0.6735,
"step": 146
},
{
"epoch": 49,
"grad_norm": 4.40625,
"learning_rate": 4.0865384615384614e-7,
"loss": 0.6464,
"step": 147
},
{
"epoch": 49.43097643097643,
"grad_norm": 2.75,
"learning_rate": 4.0384615384615386e-7,
"loss": 0.6565,
"step": 148
},
{
"epoch": 49.861952861952865,
"grad_norm": 2.5625,
"learning_rate": 3.990384615384615e-7,
"loss": 0.6388,
"step": 149
},
{
"epoch": 50,
"grad_norm": 4.3125,
"learning_rate": 3.942307692307692e-7,
"loss": 0.6859,
"step": 150
},
{
"epoch": 50.43097643097643,
"grad_norm": 2.5625,
"learning_rate": 3.894230769230769e-7,
"loss": 0.6576,
"step": 151
},
{
"epoch": 50.861952861952865,
"grad_norm": 2.46875,
"learning_rate": 3.8461538461538463e-7,
"loss": 0.6495,
"step": 152
},
{
"epoch": 51,
"grad_norm": 3.875,
"learning_rate": 3.798076923076923e-7,
"loss": 0.6457,
"step": 153
},
{
"epoch": 51.43097643097643,
"grad_norm": 2.609375,
"learning_rate": 3.75e-7,
"loss": 0.6138,
"step": 154
},
{
"epoch": 51.861952861952865,
"grad_norm": 2.4375,
"learning_rate": 3.701923076923077e-7,
"loss": 0.6805,
"step": 155
},
{
"epoch": 52,
"grad_norm": 4.1875,
"learning_rate": 3.6538461538461534e-7,
"loss": 0.6769,
"step": 156
},
{
"epoch": 52.43097643097643,
"grad_norm": 2.4375,
"learning_rate": 3.6057692307692306e-7,
"loss": 0.6475,
"step": 157
},
{
"epoch": 52.861952861952865,
"grad_norm": 2.640625,
"learning_rate": 3.557692307692308e-7,
"loss": 0.6638,
"step": 158
},
{
"epoch": 53,
"grad_norm": 4.65625,
"learning_rate": 3.5096153846153844e-7,
"loss": 0.6296,
"step": 159
},
{
"epoch": 53.43097643097643,
"grad_norm": 2.703125,
"learning_rate": 3.461538461538461e-7,
"loss": 0.6555,
"step": 160
},
{
"epoch": 53.861952861952865,
"grad_norm": 2.53125,
"learning_rate": 3.413461538461538e-7,
"loss": 0.646,
"step": 161
},
{
"epoch": 54,
"grad_norm": 4.03125,
"learning_rate": 3.3653846153846154e-7,
"loss": 0.6546,
"step": 162
},
{
"epoch": 54.43097643097643,
"grad_norm": 2.640625,
"learning_rate": 3.317307692307692e-7,
"loss": 0.6416,
"step": 163
},
{
"epoch": 54.861952861952865,
"grad_norm": 2.609375,
"learning_rate": 3.269230769230769e-7,
"loss": 0.6516,
"step": 164
},
{
"epoch": 55,
"grad_norm": 4.5,
"learning_rate": 3.2211538461538464e-7,
"loss": 0.6745,
"step": 165
},
{
"epoch": 55.43097643097643,
"grad_norm": 2.453125,
"learning_rate": 3.1730769230769225e-7,
"loss": 0.6413,
"step": 166
},
{
"epoch": 55.861952861952865,
"grad_norm": 2.4375,
"learning_rate": 3.1249999999999997e-7,
"loss": 0.6661,
"step": 167
},
{
"epoch": 56,
"grad_norm": 3.625,
"learning_rate": 3.076923076923077e-7,
"loss": 0.6304,
"step": 168
},
{
"epoch": 56.43097643097643,
"grad_norm": 2.421875,
"learning_rate": 3.0288461538461536e-7,
"loss": 0.6396,
"step": 169
},
{
"epoch": 56.861952861952865,
"grad_norm": 2.609375,
"learning_rate": 2.980769230769231e-7,
"loss": 0.6644,
"step": 170
},
{
"epoch": 57,
"grad_norm": 4.1875,
"learning_rate": 2.932692307692308e-7,
"loss": 0.6359,
"step": 171
},
{
"epoch": 57.43097643097643,
"grad_norm": 2.25,
"learning_rate": 2.884615384615384e-7,
"loss": 0.6424,
"step": 172
},
{
"epoch": 57.861952861952865,
"grad_norm": 2.4375,
"learning_rate": 2.836538461538461e-7,
"loss": 0.6367,
"step": 173
},
{
"epoch": 58,
"grad_norm": 4.53125,
"learning_rate": 2.7884615384615384e-7,
"loss": 0.7177,
"step": 174
},
{
"epoch": 58.43097643097643,
"grad_norm": 2.390625,
"learning_rate": 2.7403846153846156e-7,
"loss": 0.644,
"step": 175
},
{
"epoch": 58.861952861952865,
"grad_norm": 2.296875,
"learning_rate": 2.692307692307692e-7,
"loss": 0.6598,
"step": 176
},
{
"epoch": 59,
"grad_norm": 3.921875,
"learning_rate": 2.6442307692307694e-7,
"loss": 0.6329,
"step": 177
},
{
"epoch": 59.43097643097643,
"grad_norm": 2.453125,
"learning_rate": 2.596153846153846e-7,
"loss": 0.6454,
"step": 178
},
{
"epoch": 59.861952861952865,
"grad_norm": 2.453125,
"learning_rate": 2.5480769230769227e-7,
"loss": 0.6465,
"step": 179
},
{
"epoch": 60,
"grad_norm": 3.96875,
"learning_rate": 2.5e-7,
"loss": 0.6695,
"step": 180
},
{
"epoch": 60.43097643097643,
"grad_norm": 2.328125,
"learning_rate": 2.4519230769230765e-7,
"loss": 0.6445,
"step": 181
},
{
"epoch": 60.861952861952865,
"grad_norm": 2.75,
"learning_rate": 2.4038461538461537e-7,
"loss": 0.653,
"step": 182
},
{
"epoch": 61,
"grad_norm": 4.21875,
"learning_rate": 2.3557692307692306e-7,
"loss": 0.6487,
"step": 183
},
{
"epoch": 61.43097643097643,
"grad_norm": 2.640625,
"learning_rate": 2.3076923076923078e-7,
"loss": 0.6372,
"step": 184
},
{
"epoch": 61.861952861952865,
"grad_norm": 2.515625,
"learning_rate": 2.2596153846153845e-7,
"loss": 0.6621,
"step": 185
},
{
"epoch": 62,
"grad_norm": 4.75,
"learning_rate": 2.2115384615384614e-7,
"loss": 0.6451,
"step": 186
},
{
"epoch": 62.43097643097643,
"grad_norm": 2.421875,
"learning_rate": 2.1634615384615386e-7,
"loss": 0.6628,
"step": 187
},
{
"epoch": 62.861952861952865,
"grad_norm": 2.46875,
"learning_rate": 2.1153846153846152e-7,
"loss": 0.6486,
"step": 188
},
{
"epoch": 63,
"grad_norm": 3.640625,
"learning_rate": 2.067307692307692e-7,
"loss": 0.606,
"step": 189
},
{
"epoch": 63.43097643097643,
"grad_norm": 2.40625,
"learning_rate": 2.0192307692307693e-7,
"loss": 0.6471,
"step": 190
},
{
"epoch": 63.861952861952865,
"grad_norm": 2.34375,
"learning_rate": 1.971153846153846e-7,
"loss": 0.6492,
"step": 191
},
{
"epoch": 64,
"grad_norm": 4.34375,
"learning_rate": 1.9230769230769231e-7,
"loss": 0.6491,
"step": 192
},
{
"epoch": 64.43097643097643,
"grad_norm": 2.390625,
"learning_rate": 1.875e-7,
"loss": 0.6413,
"step": 193
},
{
"epoch": 64.86195286195286,
"grad_norm": 2.40625,
"learning_rate": 1.8269230769230767e-7,
"loss": 0.6666,
"step": 194
},
{
"epoch": 65,
"grad_norm": 3.859375,
"learning_rate": 1.778846153846154e-7,
"loss": 0.614,
"step": 195
},
{
"epoch": 65.43097643097643,
"grad_norm": 2.5625,
"learning_rate": 1.7307692307692305e-7,
"loss": 0.6571,
"step": 196
},
{
"epoch": 65.86195286195286,
"grad_norm": 2.3125,
"learning_rate": 1.6826923076923077e-7,
"loss": 0.6284,
"step": 197
},
{
"epoch": 66,
"grad_norm": 4.625,
"learning_rate": 1.6346153846153846e-7,
"loss": 0.6826,
"step": 198
},
{
"epoch": 66.43097643097643,
"grad_norm": 2.40625,
"learning_rate": 1.5865384615384613e-7,
"loss": 0.6262,
"step": 199
},
{
"epoch": 66.86195286195286,
"grad_norm": 2.578125,
"learning_rate": 1.5384615384615385e-7,
"loss": 0.6765,
"step": 200
},
{
"epoch": 67,
"grad_norm": 4.25,
"learning_rate": 1.4903846153846154e-7,
"loss": 0.6241,
"step": 201
},
{
"epoch": 67.43097643097643,
"grad_norm": 2.328125,
"learning_rate": 1.442307692307692e-7,
"loss": 0.6577,
"step": 202
},
{
"epoch": 67.86195286195286,
"grad_norm": 2.421875,
"learning_rate": 1.3942307692307692e-7,
"loss": 0.6427,
"step": 203
},
{
"epoch": 68,
"grad_norm": 3.703125,
"learning_rate": 1.346153846153846e-7,
"loss": 0.6346,
"step": 204
},
{
"epoch": 68.43097643097643,
"grad_norm": 2.390625,
"learning_rate": 1.298076923076923e-7,
"loss": 0.66,
"step": 205
},
{
"epoch": 68.86195286195286,
"grad_norm": 2.453125,
"learning_rate": 1.25e-7,
"loss": 0.6272,
"step": 206
},
{
"epoch": 69,
"grad_norm": 4.5,
"learning_rate": 1.2019230769230769e-7,
"loss": 0.6707,
"step": 207
},
{
"epoch": 69.43097643097643,
"grad_norm": 2.328125,
"learning_rate": 1.1538461538461539e-7,
"loss": 0.6397,
"step": 208
},
{
"epoch": 69.86195286195286,
"grad_norm": 2.578125,
"learning_rate": 1.1057692307692307e-7,
"loss": 0.6571,
"step": 209
},
{
"epoch": 70,
"grad_norm": 4.3125,
"learning_rate": 1.0576923076923076e-7,
"loss": 0.6455,
"step": 210
},
{
"epoch": 70.43097643097643,
"grad_norm": 2.796875,
"learning_rate": 1.0096153846153847e-7,
"loss": 0.6518,
"step": 211
},
{
"epoch": 70.86195286195286,
"grad_norm": 2.515625,
"learning_rate": 9.615384615384616e-8,
"loss": 0.6466,
"step": 212
},
{
"epoch": 71,
"grad_norm": 3.921875,
"learning_rate": 9.134615384615383e-8,
"loss": 0.6385,
"step": 213
},
{
"epoch": 71.43097643097643,
"grad_norm": 2.265625,
"learning_rate": 8.653846153846153e-8,
"loss": 0.6468,
"step": 214
},
{
"epoch": 71.86195286195286,
"grad_norm": 2.359375,
"learning_rate": 8.173076923076923e-8,
"loss": 0.6488,
"step": 215
},
{
"epoch": 72,
"grad_norm": 3.953125,
"learning_rate": 7.692307692307692e-8,
"loss": 0.6483,
"step": 216
},
{
"epoch": 72.43097643097643,
"grad_norm": 2.40625,
"learning_rate": 7.21153846153846e-8,
"loss": 0.6374,
"step": 217
},
{
"epoch": 72.86195286195286,
"grad_norm": 2.5625,
"learning_rate": 6.73076923076923e-8,
"loss": 0.6591,
"step": 218
},
{
"epoch": 73,
"grad_norm": 3.734375,
"learning_rate": 6.25e-8,
"loss": 0.6459,
"step": 219
},
{
"epoch": 73.43097643097643,
"grad_norm": 2.59375,
"learning_rate": 5.7692307692307695e-8,
"loss": 0.6445,
"step": 220
},
{
"epoch": 73.86195286195286,
"grad_norm": 2.65625,
"learning_rate": 5.288461538461538e-8,
"loss": 0.6464,
"step": 221
},
{
"epoch": 74,
"grad_norm": 4.25,
"learning_rate": 4.807692307692308e-8,
"loss": 0.6614,
"step": 222
},
{
"epoch": 74.43097643097643,
"grad_norm": 2.421875,
"learning_rate": 4.326923076923076e-8,
"loss": 0.655,
"step": 223
},
{
"epoch": 74.86195286195286,
"grad_norm": 2.328125,
"learning_rate": 3.846153846153846e-8,
"loss": 0.6331,
"step": 224
},
{
"epoch": 75,
"grad_norm": 3.71875,
"learning_rate": 3.365384615384615e-8,
"loss": 0.6695,
"step": 225
},
{
"epoch": 75.43097643097643,
"grad_norm": 2.359375,
"learning_rate": 2.8846153846153848e-8,
"loss": 0.6616,
"step": 226
},
{
"epoch": 75.86195286195286,
"grad_norm": 2.328125,
"learning_rate": 2.403846153846154e-8,
"loss": 0.6332,
"step": 227
},
{
"epoch": 76,
"grad_norm": 5.09375,
"learning_rate": 1.923076923076923e-8,
"loss": 0.6526,
"step": 228
},
{
"epoch": 76.43097643097643,
"grad_norm": 2.484375,
"learning_rate": 1.4423076923076924e-8,
"loss": 0.6523,
"step": 229
},
{
"epoch": 76.86195286195286,
"grad_norm": 2.828125,
"learning_rate": 9.615384615384615e-9,
"loss": 0.6386,
"step": 230
},
{
"epoch": 77,
"grad_norm": 4.40625,
"learning_rate": 4.807692307692308e-9,
"loss": 0.6632,
"step": 231
},
{
"epoch": 77.43097643097643,
"grad_norm": 2.328125,
"learning_rate": 0,
"loss": 0.6418,
"step": 232
}
] | 1 | 232 | 0 | 116 | 10 |
{
"TrainerControl": {
"args": {
"should_epoch_stop": false,
"should_evaluate": false,
"should_log": false,
"should_save": true,
"should_training_stop": true
},
"attributes": {}
}
}
| 485,317,661,597,368,300 | 1 | null | null | null | null | 115.85906 | 500 | 232 | false | true | true |
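The {"TrainerControl": ...} blocks attached to each log history appear to record the Trainer's control flags at the point the run stopped. A minimal sketch for reading those flags back out — the block below is copied verbatim from above, and the variable name stateful_callbacks is illustrative:

import json

stateful_callbacks = json.loads("""
{
  "TrainerControl": {
    "args": {
      "should_epoch_stop": false,
      "should_evaluate": false,
      "should_log": false,
      "should_save": true,
      "should_training_stop": true
    },
    "attributes": {}
  }
}
""")

flags = stateful_callbacks["TrainerControl"]["args"]
assert flags["should_training_stop"] and flags["should_save"]
print({name: value for name, value in flags.items() if value})
# -> {'should_save': True, 'should_training_stop': True}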
[
{
"epoch": 0.42953020134228187,
"grad_norm": 5.40625,
"learning_rate": 4.166666666666666e-8,
"loss": 0.4544,
"step": 1
},
{
"epoch": 0.8590604026845637,
"grad_norm": 5.40625,
"learning_rate": 8.333333333333333e-8,
"loss": 0.4636,
"step": 2
},
{
"epoch": 1.429530201342282,
"grad_norm": 11.125,
"learning_rate": 1.25e-7,
"loss": 0.9304,
"step": 3
},
{
"epoch": 1.8590604026845639,
"grad_norm": 5.125,
"learning_rate": 1.6666666666666665e-7,
"loss": 0.4599,
"step": 4
},
{
"epoch": 2.4295302013422817,
"grad_norm": 11.375,
"learning_rate": 2.0833333333333333e-7,
"loss": 0.9355,
"step": 5
},
{
"epoch": 2.859060402684564,
"grad_norm": 5.3125,
"learning_rate": 2.5e-7,
"loss": 0.4638,
"step": 6
},
{
"epoch": 3.4295302013422817,
"grad_norm": 10.625,
"learning_rate": 2.916666666666667e-7,
"loss": 0.9105,
"step": 7
},
{
"epoch": 3.859060402684564,
"grad_norm": 5.375,
"learning_rate": 3.333333333333333e-7,
"loss": 0.4634,
"step": 8
},
{
"epoch": 4.429530201342282,
"grad_norm": 10.875,
"learning_rate": 3.75e-7,
"loss": 0.9292,
"step": 9
},
{
"epoch": 4.859060402684563,
"grad_norm": 5.3125,
"learning_rate": 4.1666666666666667e-7,
"loss": 0.4652,
"step": 10
},
{
"epoch": 5.429530201342282,
"grad_norm": 10.75,
"learning_rate": 4.5833333333333327e-7,
"loss": 0.9002,
"step": 11
},
{
"epoch": 5.859060402684563,
"grad_norm": 5.5,
"learning_rate": 5e-7,
"loss": 0.4812,
"step": 12
},
{
"epoch": 6.429530201342282,
"grad_norm": 10.6875,
"learning_rate": 5.416666666666666e-7,
"loss": 0.901,
"step": 13
},
{
"epoch": 6.859060402684563,
"grad_norm": 5.15625,
"learning_rate": 5.833333333333334e-7,
"loss": 0.4532,
"step": 14
},
{
"epoch": 7.429530201342282,
"grad_norm": 11.4375,
"learning_rate": 6.249999999999999e-7,
"loss": 0.9517,
"step": 15
},
{
"epoch": 7.859060402684563,
"grad_norm": 5.09375,
"learning_rate": 6.666666666666666e-7,
"loss": 0.454,
"step": 16
},
{
"epoch": 8.429530201342281,
"grad_norm": 10.6875,
"learning_rate": 7.083333333333334e-7,
"loss": 0.8804,
"step": 17
},
{
"epoch": 8.859060402684564,
"grad_norm": 5.25,
"learning_rate": 7.5e-7,
"loss": 0.4492,
"step": 18
},
{
"epoch": 9.429530201342281,
"grad_norm": 10.875,
"learning_rate": 7.916666666666666e-7,
"loss": 0.9459,
"step": 19
},
{
"epoch": 9.859060402684564,
"grad_norm": 5.25,
"learning_rate": 8.333333333333333e-7,
"loss": 0.4485,
"step": 20
},
{
"epoch": 10.429530201342281,
"grad_norm": 10.6875,
"learning_rate": 8.75e-7,
"loss": 0.9095,
"step": 21
},
{
"epoch": 10.859060402684564,
"grad_norm": 5.375,
"learning_rate": 9.166666666666665e-7,
"loss": 0.4664,
"step": 22
},
{
"epoch": 11.429530201342281,
"grad_norm": 10.3125,
"learning_rate": 9.583333333333334e-7,
"loss": 0.9063,
"step": 23
},
{
"epoch": 11.859060402684564,
"grad_norm": 5.28125,
"learning_rate": 0.000001,
"loss": 0.4646,
"step": 24
},
{
"epoch": 12.429530201342281,
"grad_norm": 10.3125,
"learning_rate": 9.951923076923077e-7,
"loss": 0.9011,
"step": 25
},
{
"epoch": 12.859060402684564,
"grad_norm": 5.375,
"learning_rate": 9.903846153846153e-7,
"loss": 0.4588,
"step": 26
},
{
"epoch": 13.429530201342281,
"grad_norm": 10.25,
"learning_rate": 9.85576923076923e-7,
"loss": 0.9001,
"step": 27
},
{
"epoch": 13.859060402684564,
"grad_norm": 4.96875,
"learning_rate": 9.807692307692306e-7,
"loss": 0.4407,
"step": 28
},
{
"epoch": 14.429530201342281,
"grad_norm": 11.0625,
"learning_rate": 9.759615384615384e-7,
"loss": 0.9151,
"step": 29
},
{
"epoch": 14.859060402684564,
"grad_norm": 5.1875,
"learning_rate": 9.711538461538462e-7,
"loss": 0.4529,
"step": 30
},
{
"epoch": 15.429530201342281,
"grad_norm": 10.6875,
"learning_rate": 9.663461538461537e-7,
"loss": 0.9026,
"step": 31
},
{
"epoch": 15.859060402684564,
"grad_norm": 5.15625,
"learning_rate": 9.615384615384615e-7,
"loss": 0.4418,
"step": 32
},
{
"epoch": 16.42953020134228,
"grad_norm": 10.5,
"learning_rate": 9.567307692307693e-7,
"loss": 0.8999,
"step": 33
},
{
"epoch": 16.859060402684563,
"grad_norm": 5.09375,
"learning_rate": 9.519230769230768e-7,
"loss": 0.436,
"step": 34
},
{
"epoch": 17.42953020134228,
"grad_norm": 10.5,
"learning_rate": 9.471153846153846e-7,
"loss": 0.8918,
"step": 35
},
{
"epoch": 17.859060402684563,
"grad_norm": 5.1875,
"learning_rate": 9.423076923076923e-7,
"loss": 0.4494,
"step": 36
},
{
"epoch": 18.42953020134228,
"grad_norm": 10.1875,
"learning_rate": 9.374999999999999e-7,
"loss": 0.8895,
"step": 37
},
{
"epoch": 18.859060402684563,
"grad_norm": 5.21875,
"learning_rate": 9.326923076923077e-7,
"loss": 0.4412,
"step": 38
},
{
"epoch": 19.42953020134228,
"grad_norm": 10.375,
"learning_rate": 9.278846153846154e-7,
"loss": 0.8686,
"step": 39
},
{
"epoch": 19.859060402684563,
"grad_norm": 5.03125,
"learning_rate": 9.230769230769231e-7,
"loss": 0.4311,
"step": 40
},
{
"epoch": 20.42953020134228,
"grad_norm": 10.875,
"learning_rate": 9.182692307692307e-7,
"loss": 0.9158,
"step": 41
},
{
"epoch": 20.859060402684563,
"grad_norm": 5.09375,
"learning_rate": 9.134615384615383e-7,
"loss": 0.4339,
"step": 42
},
{
"epoch": 21.42953020134228,
"grad_norm": 11.125,
"learning_rate": 9.086538461538461e-7,
"loss": 0.8986,
"step": 43
},
{
"epoch": 21.859060402684563,
"grad_norm": 5.03125,
"learning_rate": 9.038461538461538e-7,
"loss": 0.4437,
"step": 44
},
{
"epoch": 22.42953020134228,
"grad_norm": 10.375,
"learning_rate": 8.990384615384616e-7,
"loss": 0.8707,
"step": 45
},
{
"epoch": 22.859060402684563,
"grad_norm": 4.875,
"learning_rate": 8.942307692307692e-7,
"loss": 0.4377,
"step": 46
},
{
"epoch": 23.42953020134228,
"grad_norm": 10.25,
"learning_rate": 8.894230769230768e-7,
"loss": 0.8443,
"step": 47
},
{
"epoch": 23.859060402684563,
"grad_norm": 5.1875,
"learning_rate": 8.846153846153846e-7,
"loss": 0.4418,
"step": 48
},
{
"epoch": 24.42953020134228,
"grad_norm": 10.125,
"learning_rate": 8.798076923076922e-7,
"loss": 0.8762,
"step": 49
},
{
"epoch": 24.859060402684563,
"grad_norm": 4.96875,
"learning_rate": 8.75e-7,
"loss": 0.4323,
"step": 50
},
{
"epoch": 25.42953020134228,
"grad_norm": 9.6875,
"learning_rate": 8.701923076923077e-7,
"loss": 0.8705,
"step": 51
},
{
"epoch": 25.859060402684563,
"grad_norm": 5.3125,
"learning_rate": 8.653846153846154e-7,
"loss": 0.4304,
"step": 52
},
{
"epoch": 26.42953020134228,
"grad_norm": 10.0625,
"learning_rate": 8.605769230769231e-7,
"loss": 0.8701,
"step": 53
},
{
"epoch": 26.859060402684563,
"grad_norm": 5.03125,
"learning_rate": 8.557692307692306e-7,
"loss": 0.4374,
"step": 54
},
{
"epoch": 27.42953020134228,
"grad_norm": 9.9375,
"learning_rate": 8.509615384615384e-7,
"loss": 0.8511,
"step": 55
},
{
"epoch": 27.859060402684563,
"grad_norm": 4.96875,
"learning_rate": 8.461538461538461e-7,
"loss": 0.4227,
"step": 56
},
{
"epoch": 28.42953020134228,
"grad_norm": 10.0625,
"learning_rate": 8.413461538461539e-7,
"loss": 0.8821,
"step": 57
},
{
"epoch": 28.859060402684563,
"grad_norm": 5.09375,
"learning_rate": 8.365384615384615e-7,
"loss": 0.4288,
"step": 58
},
{
"epoch": 29.42953020134228,
"grad_norm": 10.3125,
"learning_rate": 8.317307692307692e-7,
"loss": 0.8643,
"step": 59
},
{
"epoch": 29.859060402684563,
"grad_norm": 5,
"learning_rate": 8.269230769230768e-7,
"loss": 0.4251,
"step": 60
},
{
"epoch": 30.42953020134228,
"grad_norm": 10.125,
"learning_rate": 8.221153846153845e-7,
"loss": 0.8542,
"step": 61
},
{
"epoch": 30.859060402684563,
"grad_norm": 4.75,
"learning_rate": 8.173076923076923e-7,
"loss": 0.4227,
"step": 62
},
{
"epoch": 31.42953020134228,
"grad_norm": 9.9375,
"learning_rate": 8.125e-7,
"loss": 0.8503,
"step": 63
},
{
"epoch": 31.859060402684563,
"grad_norm": 5,
"learning_rate": 8.076923076923077e-7,
"loss": 0.4192,
"step": 64
},
{
"epoch": 32.42953020134228,
"grad_norm": 9.5625,
"learning_rate": 8.028846153846154e-7,
"loss": 0.8478,
"step": 65
},
{
"epoch": 32.85906040268456,
"grad_norm": 4.84375,
"learning_rate": 7.98076923076923e-7,
"loss": 0.4235,
"step": 66
},
{
"epoch": 33.42953020134228,
"grad_norm": 10.0625,
"learning_rate": 7.932692307692307e-7,
"loss": 0.8821,
"step": 67
},
{
"epoch": 33.85906040268456,
"grad_norm": 4.78125,
"learning_rate": 7.884615384615384e-7,
"loss": 0.4102,
"step": 68
},
{
"epoch": 34.42953020134228,
"grad_norm": 9.5,
"learning_rate": 7.836538461538462e-7,
"loss": 0.8394,
"step": 69
},
{
"epoch": 34.85906040268456,
"grad_norm": 4.84375,
"learning_rate": 7.788461538461538e-7,
"loss": 0.4265,
"step": 70
},
{
"epoch": 35.42953020134228,
"grad_norm": 10.1875,
"learning_rate": 7.740384615384615e-7,
"loss": 0.8449,
"step": 71
},
{
"epoch": 35.85906040268456,
"grad_norm": 4.75,
"learning_rate": 7.692307692307693e-7,
"loss": 0.4192,
"step": 72
},
{
"epoch": 36.42953020134228,
"grad_norm": 9.8125,
"learning_rate": 7.644230769230768e-7,
"loss": 0.864,
"step": 73
},
{
"epoch": 36.85906040268456,
"grad_norm": 4.5625,
"learning_rate": 7.596153846153846e-7,
"loss": 0.4132,
"step": 74
},
{
"epoch": 37.42953020134228,
"grad_norm": 10.25,
"learning_rate": 7.548076923076922e-7,
"loss": 0.849,
"step": 75
},
{
"epoch": 37.85906040268456,
"grad_norm": 4.65625,
"learning_rate": 7.5e-7,
"loss": 0.4184,
"step": 76
},
{
"epoch": 38.42953020134228,
"grad_norm": 9.75,
"learning_rate": 7.451923076923077e-7,
"loss": 0.8327,
"step": 77
},
{
"epoch": 38.85906040268456,
"grad_norm": 4.78125,
"learning_rate": 7.403846153846153e-7,
"loss": 0.423,
"step": 78
},
{
"epoch": 39.42953020134228,
"grad_norm": 9.4375,
"learning_rate": 7.355769230769231e-7,
"loss": 0.8156,
"step": 79
},
{
"epoch": 39.85906040268456,
"grad_norm": 4.625,
"learning_rate": 7.307692307692307e-7,
"loss": 0.4061,
"step": 80
},
{
"epoch": 40.42953020134228,
"grad_norm": 9.625,
"learning_rate": 7.259615384615385e-7,
"loss": 0.8401,
"step": 81
},
{
"epoch": 40.85906040268456,
"grad_norm": 4.65625,
"learning_rate": 7.211538461538461e-7,
"loss": 0.4225,
"step": 82
},
{
"epoch": 41.42953020134228,
"grad_norm": 9.3125,
"learning_rate": 7.163461538461538e-7,
"loss": 0.8225,
"step": 83
},
{
"epoch": 41.85906040268456,
"grad_norm": 4.6875,
"learning_rate": 7.115384615384616e-7,
"loss": 0.4087,
"step": 84
},
{
"epoch": 42.42953020134228,
"grad_norm": 9.25,
"learning_rate": 7.067307692307692e-7,
"loss": 0.8392,
"step": 85
},
{
"epoch": 42.85906040268456,
"grad_norm": 4.5625,
"learning_rate": 7.019230769230769e-7,
"loss": 0.4144,
"step": 86
},
{
"epoch": 43.42953020134228,
"grad_norm": 9.3125,
"learning_rate": 6.971153846153845e-7,
"loss": 0.844,
"step": 87
},
{
"epoch": 43.85906040268456,
"grad_norm": 4.65625,
"learning_rate": 6.923076923076922e-7,
"loss": 0.4065,
"step": 88
},
{
"epoch": 44.42953020134228,
"grad_norm": 9.6875,
"learning_rate": 6.875e-7,
"loss": 0.8209,
"step": 89
},
{
"epoch": 44.85906040268456,
"grad_norm": 4.625,
"learning_rate": 6.826923076923076e-7,
"loss": 0.4236,
"step": 90
},
{
"epoch": 45.42953020134228,
"grad_norm": 9.1875,
"learning_rate": 6.778846153846154e-7,
"loss": 0.8065,
"step": 91
},
{
"epoch": 45.85906040268456,
"grad_norm": 4.6875,
"learning_rate": 6.730769230769231e-7,
"loss": 0.4197,
"step": 92
},
{
"epoch": 46.42953020134228,
"grad_norm": 9.0625,
"learning_rate": 6.682692307692307e-7,
"loss": 0.8189,
"step": 93
},
{
"epoch": 46.85906040268456,
"grad_norm": 4.53125,
"learning_rate": 6.634615384615384e-7,
"loss": 0.408,
"step": 94
},
{
"epoch": 47.42953020134228,
"grad_norm": 9.125,
"learning_rate": 6.586538461538461e-7,
"loss": 0.799,
"step": 95
},
{
"epoch": 47.85906040268456,
"grad_norm": 4.53125,
"learning_rate": 6.538461538461538e-7,
"loss": 0.4187,
"step": 96
},
{
"epoch": 48.42953020134228,
"grad_norm": 9.625,
"learning_rate": 6.490384615384615e-7,
"loss": 0.7994,
"step": 97
},
{
"epoch": 48.85906040268456,
"grad_norm": 4.65625,
"learning_rate": 6.442307692307693e-7,
"loss": 0.4162,
"step": 98
},
{
"epoch": 49.42953020134228,
"grad_norm": 9.3125,
"learning_rate": 6.394230769230768e-7,
"loss": 0.819,
"step": 99
},
{
"epoch": 49.85906040268456,
"grad_norm": 4.65625,
"learning_rate": 6.346153846153845e-7,
"loss": 0.41,
"step": 100
},
{
"epoch": 50.42953020134228,
"grad_norm": 9.375,
"learning_rate": 6.298076923076923e-7,
"loss": 0.8426,
"step": 101
},
{
"epoch": 50.85906040268456,
"grad_norm": 4.46875,
"learning_rate": 6.249999999999999e-7,
"loss": 0.409,
"step": 102
},
{
"epoch": 51.42953020134228,
"grad_norm": 9,
"learning_rate": 6.201923076923077e-7,
"loss": 0.7921,
"step": 103
},
{
"epoch": 51.85906040268456,
"grad_norm": 4.53125,
"learning_rate": 6.153846153846154e-7,
"loss": 0.4125,
"step": 104
},
{
"epoch": 52.42953020134228,
"grad_norm": 9.375,
"learning_rate": 6.105769230769232e-7,
"loss": 0.8194,
"step": 105
},
{
"epoch": 52.85906040268456,
"grad_norm": 4.53125,
"learning_rate": 6.057692307692307e-7,
"loss": 0.3989,
"step": 106
},
{
"epoch": 53.42953020134228,
"grad_norm": 9,
"learning_rate": 6.009615384615384e-7,
"loss": 0.7911,
"step": 107
},
{
"epoch": 53.85906040268456,
"grad_norm": 4.46875,
"learning_rate": 5.961538461538461e-7,
"loss": 0.4055,
"step": 108
},
{
"epoch": 54.42953020134228,
"grad_norm": 9.5625,
"learning_rate": 5.913461538461538e-7,
"loss": 0.8213,
"step": 109
},
{
"epoch": 54.85906040268456,
"grad_norm": 4.46875,
"learning_rate": 5.865384615384616e-7,
"loss": 0.4033,
"step": 110
},
{
"epoch": 55.42953020134228,
"grad_norm": 9.375,
"learning_rate": 5.817307692307692e-7,
"loss": 0.8328,
"step": 111
},
{
"epoch": 55.85906040268456,
"grad_norm": 4.5,
"learning_rate": 5.769230769230768e-7,
"loss": 0.3976,
"step": 112
},
{
"epoch": 56.42953020134228,
"grad_norm": 9.0625,
"learning_rate": 5.721153846153846e-7,
"loss": 0.8259,
"step": 113
},
{
"epoch": 56.85906040268456,
"grad_norm": 4.5,
"learning_rate": 5.673076923076922e-7,
"loss": 0.3978,
"step": 114
},
{
"epoch": 57.42953020134228,
"grad_norm": 9.4375,
"learning_rate": 5.625e-7,
"loss": 0.8224,
"step": 115
},
{
"epoch": 57.85906040268456,
"grad_norm": 4.46875,
"learning_rate": 5.576923076923077e-7,
"loss": 0.4042,
"step": 116
},
{
"epoch": 58.42953020134228,
"grad_norm": 8.75,
"learning_rate": 5.528846153846153e-7,
"loss": 0.8015,
"step": 117
},
{
"epoch": 58.85906040268456,
"grad_norm": 4.5,
"learning_rate": 5.480769230769231e-7,
"loss": 0.41,
"step": 118
},
{
"epoch": 59.42953020134228,
"grad_norm": 8.875,
"learning_rate": 5.432692307692307e-7,
"loss": 0.8112,
"step": 119
},
{
"epoch": 59.85906040268456,
"grad_norm": 4.53125,
"learning_rate": 5.384615384615384e-7,
"loss": 0.3943,
"step": 120
},
{
"epoch": 60.42953020134228,
"grad_norm": 9.125,
"learning_rate": 5.336538461538461e-7,
"loss": 0.8038,
"step": 121
},
{
"epoch": 60.85906040268456,
"grad_norm": 4.5625,
"learning_rate": 5.288461538461539e-7,
"loss": 0.4169,
"step": 122
},
{
"epoch": 61.42953020134228,
"grad_norm": 8.6875,
"learning_rate": 5.240384615384615e-7,
"loss": 0.7946,
"step": 123
},
{
"epoch": 61.85906040268456,
"grad_norm": 4.375,
"learning_rate": 5.192307692307692e-7,
"loss": 0.3981,
"step": 124
},
{
"epoch": 62.42953020134228,
"grad_norm": 9.25,
"learning_rate": 5.144230769230769e-7,
"loss": 0.8225,
"step": 125
},
{
"epoch": 62.85906040268456,
"grad_norm": 4.4375,
"learning_rate": 5.096153846153845e-7,
"loss": 0.3955,
"step": 126
},
{
"epoch": 63.42953020134228,
"grad_norm": 9.25,
"learning_rate": 5.048076923076923e-7,
"loss": 0.8322,
"step": 127
},
{
"epoch": 63.85906040268456,
"grad_norm": 4.4375,
"learning_rate": 5e-7,
"loss": 0.396,
"step": 128
},
{
"epoch": 64.42953020134229,
"grad_norm": 9.125,
"learning_rate": 4.951923076923076e-7,
"loss": 0.8144,
"step": 129
},
{
"epoch": 64.85906040268456,
"grad_norm": 4.375,
"learning_rate": 4.903846153846153e-7,
"loss": 0.3951,
"step": 130
},
{
"epoch": 65.42953020134229,
"grad_norm": 9.0625,
"learning_rate": 4.855769230769231e-7,
"loss": 0.8118,
"step": 131
},
{
"epoch": 65.85906040268456,
"grad_norm": 4.3125,
"learning_rate": 4.807692307692307e-7,
"loss": 0.395,
"step": 132
},
{
"epoch": 66.42953020134229,
"grad_norm": 9.4375,
"learning_rate": 4.759615384615384e-7,
"loss": 0.823,
"step": 133
},
{
"epoch": 66.85906040268456,
"grad_norm": 4.28125,
"learning_rate": 4.711538461538461e-7,
"loss": 0.4049,
"step": 134
},
{
"epoch": 67.42953020134229,
"grad_norm": 8.6875,
"learning_rate": 4.6634615384615384e-7,
"loss": 0.7828,
"step": 135
},
{
"epoch": 67.85906040268456,
"grad_norm": 4.375,
"learning_rate": 4.6153846153846156e-7,
"loss": 0.4122,
"step": 136
},
{
"epoch": 68.42953020134229,
"grad_norm": 8.5625,
"learning_rate": 4.567307692307692e-7,
"loss": 0.7799,
"step": 137
},
{
"epoch": 68.85906040268456,
"grad_norm": 4.4375,
"learning_rate": 4.519230769230769e-7,
"loss": 0.4009,
"step": 138
},
{
"epoch": 69.42953020134229,
"grad_norm": 8.3125,
"learning_rate": 4.471153846153846e-7,
"loss": 0.7939,
"step": 139
},
{
"epoch": 69.85906040268456,
"grad_norm": 4.46875,
"learning_rate": 4.423076923076923e-7,
"loss": 0.4141,
"step": 140
},
{
"epoch": 70.42953020134229,
"grad_norm": 9.0625,
"learning_rate": 4.375e-7,
"loss": 0.7813,
"step": 141
},
{
"epoch": 70.85906040268456,
"grad_norm": 4.34375,
"learning_rate": 4.326923076923077e-7,
"loss": 0.4003,
"step": 142
},
{
"epoch": 71.42953020134229,
"grad_norm": 8.9375,
"learning_rate": 4.278846153846153e-7,
"loss": 0.7962,
"step": 143
},
{
"epoch": 71.85906040268456,
"grad_norm": 4.25,
"learning_rate": 4.2307692307692304e-7,
"loss": 0.4028,
"step": 144
},
{
"epoch": 72.42953020134229,
"grad_norm": 8.8125,
"learning_rate": 4.1826923076923076e-7,
"loss": 0.7956,
"step": 145
},
{
"epoch": 72.85906040268456,
"grad_norm": 4.25,
"learning_rate": 4.134615384615384e-7,
"loss": 0.395,
"step": 146
},
{
"epoch": 73.42953020134229,
"grad_norm": 8.5625,
"learning_rate": 4.0865384615384614e-7,
"loss": 0.7906,
"step": 147
},
{
"epoch": 73.85906040268456,
"grad_norm": 4.4375,
"learning_rate": 4.0384615384615386e-7,
"loss": 0.398,
"step": 148
},
{
"epoch": 74.42953020134229,
"grad_norm": 8.875,
"learning_rate": 3.990384615384615e-7,
"loss": 0.8147,
"step": 149
},
{
"epoch": 74.85906040268456,
"grad_norm": 4.3125,
"learning_rate": 3.942307692307692e-7,
"loss": 0.4061,
"step": 150
},
{
"epoch": 75.42953020134229,
"grad_norm": 9.1875,
"learning_rate": 3.894230769230769e-7,
"loss": 0.788,
"step": 151
},
{
"epoch": 75.85906040268456,
"grad_norm": 4.125,
"learning_rate": 3.8461538461538463e-7,
"loss": 0.3916,
"step": 152
},
{
"epoch": 76.42953020134229,
"grad_norm": 8.6875,
"learning_rate": 3.798076923076923e-7,
"loss": 0.8253,
"step": 153
},
{
"epoch": 76.85906040268456,
"grad_norm": 4.28125,
"learning_rate": 3.75e-7,
"loss": 0.3927,
"step": 154
},
{
"epoch": 77.42953020134229,
"grad_norm": 8.625,
"learning_rate": 3.701923076923077e-7,
"loss": 0.8138,
"step": 155
},
{
"epoch": 77.85906040268456,
"grad_norm": 4.34375,
"learning_rate": 3.6538461538461534e-7,
"loss": 0.3969,
"step": 156
},
{
"epoch": 78.42953020134229,
"grad_norm": 8.5625,
"learning_rate": 3.6057692307692306e-7,
"loss": 0.7829,
"step": 157
},
{
"epoch": 78.85906040268456,
"grad_norm": 4.375,
"learning_rate": 3.557692307692308e-7,
"loss": 0.3952,
"step": 158
},
{
"epoch": 79.42953020134229,
"grad_norm": 8.5625,
"learning_rate": 3.5096153846153844e-7,
"loss": 0.7945,
"step": 159
},
{
"epoch": 79.85906040268456,
"grad_norm": 4.40625,
"learning_rate": 3.461538461538461e-7,
"loss": 0.4081,
"step": 160
},
{
"epoch": 80.42953020134229,
"grad_norm": 8.5625,
"learning_rate": 3.413461538461538e-7,
"loss": 0.7685,
"step": 161
},
{
"epoch": 80.85906040268456,
"grad_norm": 4.1875,
"learning_rate": 3.3653846153846154e-7,
"loss": 0.3931,
"step": 162
},
{
"epoch": 81.42953020134229,
"grad_norm": 8.6875,
"learning_rate": 3.317307692307692e-7,
"loss": 0.8036,
"step": 163
},
{
"epoch": 81.85906040268456,
"grad_norm": 4.15625,
"learning_rate": 3.269230769230769e-7,
"loss": 0.3912,
"step": 164
},
{
"epoch": 82.42953020134229,
"grad_norm": 8.75,
"learning_rate": 3.2211538461538464e-7,
"loss": 0.7929,
"step": 165
},
{
"epoch": 82.85906040268456,
"grad_norm": 4.34375,
"learning_rate": 3.1730769230769225e-7,
"loss": 0.4027,
"step": 166
},
{
"epoch": 83.42953020134229,
"grad_norm": 8.4375,
"learning_rate": 3.1249999999999997e-7,
"loss": 0.7854,
"step": 167
},
{
"epoch": 83.85906040268456,
"grad_norm": 4.25,
"learning_rate": 3.076923076923077e-7,
"loss": 0.3975,
"step": 168
},
{
"epoch": 84.42953020134229,
"grad_norm": 8.375,
"learning_rate": 3.0288461538461536e-7,
"loss": 0.7996,
"step": 169
},
{
"epoch": 84.85906040268456,
"grad_norm": 4.25,
"learning_rate": 2.980769230769231e-7,
"loss": 0.3964,
"step": 170
},
{
"epoch": 85.42953020134229,
"grad_norm": 8.875,
"learning_rate": 2.932692307692308e-7,
"loss": 0.7679,
"step": 171
},
{
"epoch": 85.85906040268456,
"grad_norm": 4.28125,
"learning_rate": 2.884615384615384e-7,
"loss": 0.4038,
"step": 172
},
{
"epoch": 86.42953020134229,
"grad_norm": 8.4375,
"learning_rate": 2.836538461538461e-7,
"loss": 0.7907,
"step": 173
},
{
"epoch": 86.85906040268456,
"grad_norm": 4.40625,
"learning_rate": 2.7884615384615384e-7,
"loss": 0.411,
"step": 174
},
{
"epoch": 87.42953020134229,
"grad_norm": 9.125,
"learning_rate": 2.7403846153846156e-7,
"loss": 0.7833,
"step": 175
},
{
"epoch": 87.85906040268456,
"grad_norm": 4.21875,
"learning_rate": 2.692307692307692e-7,
"loss": 0.3966,
"step": 176
},
{
"epoch": 88.42953020134229,
"grad_norm": 8.6875,
"learning_rate": 2.6442307692307694e-7,
"loss": 0.8066,
"step": 177
},
{
"epoch": 88.85906040268456,
"grad_norm": 4.21875,
"learning_rate": 2.596153846153846e-7,
"loss": 0.3927,
"step": 178
},
{
"epoch": 89.42953020134229,
"grad_norm": 8.875,
"learning_rate": 2.5480769230769227e-7,
"loss": 0.7937,
"step": 179
},
{
"epoch": 89.85906040268456,
"grad_norm": 4.25,
"learning_rate": 2.5e-7,
"loss": 0.3917,
"step": 180
},
{
"epoch": 90.42953020134229,
"grad_norm": 8.375,
"learning_rate": 2.4519230769230765e-7,
"loss": 0.8233,
"step": 181
},
{
"epoch": 90.85906040268456,
"grad_norm": 4.34375,
"learning_rate": 2.4038461538461537e-7,
"loss": 0.3893,
"step": 182
},
{
"epoch": 91.42953020134229,
"grad_norm": 8.625,
"learning_rate": 2.3557692307692306e-7,
"loss": 0.7911,
"step": 183
},
{
"epoch": 91.85906040268456,
"grad_norm": 4.21875,
"learning_rate": 2.3076923076923078e-7,
"loss": 0.3896,
"step": 184
},
{
"epoch": 92.42953020134229,
"grad_norm": 8.75,
"learning_rate": 2.2596153846153845e-7,
"loss": 0.7709,
"step": 185
},
{
"epoch": 92.85906040268456,
"grad_norm": 4.1875,
"learning_rate": 2.2115384615384614e-7,
"loss": 0.4018,
"step": 186
},
{
"epoch": 93.42953020134229,
"grad_norm": 8.4375,
"learning_rate": 2.1634615384615386e-7,
"loss": 0.7832,
"step": 187
},
{
"epoch": 93.85906040268456,
"grad_norm": 4.3125,
"learning_rate": 2.1153846153846152e-7,
"loss": 0.394,
"step": 188
},
{
"epoch": 94.42953020134229,
"grad_norm": 8.1875,
"learning_rate": 2.067307692307692e-7,
"loss": 0.8081,
"step": 189
},
{
"epoch": 94.85906040268456,
"grad_norm": 4.28125,
"learning_rate": 2.0192307692307693e-7,
"loss": 0.3906,
"step": 190
},
{
"epoch": 95.42953020134229,
"grad_norm": 8.6875,
"learning_rate": 1.971153846153846e-7,
"loss": 0.7729,
"step": 191
},
{
"epoch": 95.85906040268456,
"grad_norm": 4.28125,
"learning_rate": 1.9230769230769231e-7,
"loss": 0.3922,
"step": 192
},
{
"epoch": 96.42953020134229,
"grad_norm": 8.375,
"learning_rate": 1.875e-7,
"loss": 0.8232,
"step": 193
},
{
"epoch": 96.85906040268456,
"grad_norm": 4.28125,
"learning_rate": 1.8269230769230767e-7,
"loss": 0.3904,
"step": 194
},
{
"epoch": 97.42953020134229,
"grad_norm": 8.625,
"learning_rate": 1.778846153846154e-7,
"loss": 0.8214,
"step": 195
},
{
"epoch": 97.85906040268456,
"grad_norm": 4.15625,
"learning_rate": 1.7307692307692305e-7,
"loss": 0.3819,
"step": 196
},
{
"epoch": 98.42953020134229,
"grad_norm": 9,
"learning_rate": 1.6826923076923077e-7,
"loss": 0.8211,
"step": 197
},
{
"epoch": 98.85906040268456,
"grad_norm": 4.25,
"learning_rate": 1.6346153846153846e-7,
"loss": 0.4015,
"step": 198
},
{
"epoch": 99.42953020134229,
"grad_norm": 8.125,
"learning_rate": 1.5865384615384613e-7,
"loss": 0.7773,
"step": 199
},
{
"epoch": 99.85906040268456,
"grad_norm": 4.3125,
"learning_rate": 1.5384615384615385e-7,
"loss": 0.3961,
"step": 200
},
{
"epoch": 100.42953020134229,
"grad_norm": 8.8125,
"learning_rate": 1.4903846153846154e-7,
"loss": 0.7914,
"step": 201
},
{
"epoch": 100.85906040268456,
"grad_norm": 4.1875,
"learning_rate": 1.442307692307692e-7,
"loss": 0.3912,
"step": 202
},
{
"epoch": 101.42953020134229,
"grad_norm": 9,
"learning_rate": 1.3942307692307692e-7,
"loss": 0.7793,
"step": 203
},
{
"epoch": 101.85906040268456,
"grad_norm": 4.28125,
"learning_rate": 1.346153846153846e-7,
"loss": 0.4001,
"step": 204
},
{
"epoch": 102.42953020134229,
"grad_norm": 8.25,
"learning_rate": 1.298076923076923e-7,
"loss": 0.7872,
"step": 205
},
{
"epoch": 102.85906040268456,
"grad_norm": 4.1875,
"learning_rate": 1.25e-7,
"loss": 0.3942,
"step": 206
},
{
"epoch": 103.42953020134229,
"grad_norm": 8.75,
"learning_rate": 1.2019230769230769e-7,
"loss": 0.7743,
"step": 207
},
{
"epoch": 103.85906040268456,
"grad_norm": 4.15625,
"learning_rate": 1.1538461538461539e-7,
"loss": 0.392,
"step": 208
},
{
"epoch": 104.42953020134229,
"grad_norm": 8.1875,
"learning_rate": 1.1057692307692307e-7,
"loss": 0.7783,
"step": 209
},
{
"epoch": 104.85906040268456,
"grad_norm": 4.21875,
"learning_rate": 1.0576923076923076e-7,
"loss": 0.3919,
"step": 210
},
{
"epoch": 105.42953020134229,
"grad_norm": 8.5,
"learning_rate": 1.0096153846153847e-7,
"loss": 0.8066,
"step": 211
},
{
"epoch": 105.85906040268456,
"grad_norm": 4.21875,
"learning_rate": 9.615384615384616e-8,
"loss": 0.3834,
"step": 212
},
{
"epoch": 106.42953020134229,
"grad_norm": 8.625,
"learning_rate": 9.134615384615383e-8,
"loss": 0.8136,
"step": 213
},
{
"epoch": 106.85906040268456,
"grad_norm": 4.28125,
"learning_rate": 8.653846153846153e-8,
"loss": 0.3896,
"step": 214
},
{
"epoch": 107.42953020134229,
"grad_norm": 8.5625,
"learning_rate": 8.173076923076923e-8,
"loss": 0.8189,
"step": 215
},
{
"epoch": 107.85906040268456,
"grad_norm": 4.1875,
"learning_rate": 7.692307692307692e-8,
"loss": 0.3924,
"step": 216
},
{
"epoch": 108.42953020134229,
"grad_norm": 8.5625,
"learning_rate": 7.21153846153846e-8,
"loss": 0.7861,
"step": 217
},
{
"epoch": 108.85906040268456,
"grad_norm": 4.3125,
"learning_rate": 6.73076923076923e-8,
"loss": 0.401,
"step": 218
},
{
"epoch": 109.42953020134229,
"grad_norm": 8.5625,
"learning_rate": 6.25e-8,
"loss": 0.7834,
"step": 219
},
{
"epoch": 109.85906040268456,
"grad_norm": 4.375,
"learning_rate": 5.7692307692307695e-8,
"loss": 0.3986,
"step": 220
},
{
"epoch": 110.42953020134229,
"grad_norm": 8.3125,
"learning_rate": 5.288461538461538e-8,
"loss": 0.7767,
"step": 221
},
{
"epoch": 110.85906040268456,
"grad_norm": 4.21875,
"learning_rate": 4.807692307692308e-8,
"loss": 0.3972,
"step": 222
},
{
"epoch": 111.42953020134229,
"grad_norm": 8.5625,
"learning_rate": 4.326923076923076e-8,
"loss": 0.8032,
"step": 223
},
{
"epoch": 111.85906040268456,
"grad_norm": 4.28125,
"learning_rate": 3.846153846153846e-8,
"loss": 0.3836,
"step": 224
},
{
"epoch": 112.42953020134229,
"grad_norm": 8.75,
"learning_rate": 3.365384615384615e-8,
"loss": 0.8067,
"step": 225
},
{
"epoch": 112.85906040268456,
"grad_norm": 4.21875,
"learning_rate": 2.8846153846153848e-8,
"loss": 0.4021,
"step": 226
},
{
"epoch": 113.42953020134229,
"grad_norm": 8.5,
"learning_rate": 2.403846153846154e-8,
"loss": 0.7858,
"step": 227
},
{
"epoch": 113.85906040268456,
"grad_norm": 4.3125,
"learning_rate": 1.923076923076923e-8,
"loss": 0.3969,
"step": 228
},
{
"epoch": 114.42953020134229,
"grad_norm": 8.4375,
"learning_rate": 1.4423076923076924e-8,
"loss": 0.8095,
"step": 229
},
{
"epoch": 114.85906040268456,
"grad_norm": 4.1875,
"learning_rate": 9.615384615384615e-9,
"loss": 0.3909,
"step": 230
},
{
"epoch": 115.42953020134229,
"grad_norm": 8.6875,
"learning_rate": 4.807692307692308e-9,
"loss": 0.7915,
"step": 231
},
{
"epoch": 115.85906040268456,
"grad_norm": 4.1875,
"learning_rate": 0,
"loss": 0.3963,
"step": 232
}
]
| 1
| 232
| 0
| 116
| 10
|
{
"TrainerControl": {
"args": {
"should_epoch_stop": false,
"should_evaluate": false,
"should_log": false,
"should_save": true,
"should_training_stop": true
},
"attributes": {}
}
}
| 133,227,408,936,402,940
| 1
| null
| null
| null
| null
| 109.85906
| 500
| 220
| false
| true
| true
| [{"epoch":0.42953020134228187,"grad_norm":4.4375,"learning_rate":4.545454545454545e-8,"loss":0.5185,(...TRUNCATED)
| 1
| 220
| 0
| 110
| 10
| {"TrainerControl":{"args":{"should_epoch_stop":false,"should_evaluate":false,"should_log":false,"sho(...TRUNCATED)
| 146,568,203,953,242,100
| 1
| null
| null
| null
| null
| 115.85906
| 500
| 232
| false
| true
| true
| [{"epoch":0.42953020134228187,"grad_norm":6.875,"learning_rate":4.166666666666666e-8,"loss":0.5747,"(...TRUNCATED)
| 1
| 232
| 0
| 116
| 10
| {"TrainerControl":{"args":{"should_epoch_stop":false,"should_evaluate":false,"should_log":false,"sho(...TRUNCATED)
| 138,321,463,118,659,580
| 1
| null | null |
No dataset card yet
Downloads last month: 8
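
Note on the `TrainerControl` cells shown in the rows above: each one carries an empty `"attributes": {}` object. A struct with no child fields generally cannot be serialized to a Parquet column, so tooling that converts rows like these to Parquet typically has to pad or drop that field first. The snippet below is a minimal, hypothetical sketch of the padding approach, assuming the rows are available as plain Python dicts; the `pad_empty_attributes` helper, the `"placeholder"` child field, and the output filename are illustrative and not part of the original data.

```python
# Minimal sketch (see assumptions above): give the empty "attributes" struct
# a single placeholder child so the rows can be written to Parquet.
import pyarrow as pa
import pyarrow.parquet as pq

rows = [
    {
        "TrainerControl": {
            "args": {
                "should_epoch_stop": False,
                "should_evaluate": False,
                "should_log": False,
                "should_save": True,
                "should_training_stop": True,
            },
            # Empty struct: no child fields, so Parquet has nothing to store.
            "attributes": {},
        }
    }
]

def pad_empty_attributes(row):
    # Hypothetical helper: replace the empty struct with one that has a
    # single string child, so Arrow infers a schema Parquet can write.
    if not row["TrainerControl"]["attributes"]:
        row["TrainerControl"]["attributes"] = {"placeholder": ""}
    return row

table = pa.Table.from_pylist([pad_empty_attributes(r) for r in rows])
pq.write_table(table, "trainer_control.parquet")  # illustrative filename
```

If the field carries no information at all, an equally reasonable alternative is simply to delete the `attributes` key from each row before building the Arrow table.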