Update README.md
README.md CHANGED
@@ -8,7 +8,7 @@ license: apache-2.0
 ## Model Resources
 
 - **Repository:** https://github.com/openevabyte/evabyte
-- **Blog:** https://hkunlp.github.io/blog/2025/evabyte
+- **Blog:** https://hkunlp.github.io/blog/2025/evabyte and https://sambanova.ai/blog/evabyte-efficient-byte-level-language-models-at-scale
 - **Paper:** Coming soon
 
 ## Model Details

@@ -92,7 +92,7 @@ As a pretrained base model, **EvaByte-Phase1** has not been fine-tuned for chat
 
 ## Evaluation
 
-For detailed evaluation results,
+For detailed evaluation results, check out our blog post at [SambaNova](https://sambanova.ai/blog/evabyte-efficient-byte-level-language-models-at-scale) or [HKUNLP](https://hkunlp.github.io/blog/2025/evabyte).
 
 ## Citation
 ```bibtex