From 83900f2343d89096e6dcb5e19417f24f43648cfc Mon Sep 17 00:00:00 2001
From: Jai
Date: Thu, 7 Mar 2024 03:36:36 +0000
Subject: [PATCH] added codecov badge

---
 readme.md | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/readme.md b/readme.md
index 5f2c42a..fb88150 100644
--- a/readme.md
+++ b/readme.md
@@ -2,6 +2,8 @@
 
 [![build_and_tests](https://github.com/jkbhagatio/nanoGPT/actions/workflows/build_env_run_tests.yml/badge.svg)](https://github.com/jkbhagatio/nanoGPT/actions/workflows/build_env_run_tests.yml)
 
+[![codecov coverage](https://codecov.io/gh/jkbhagatio/nanoGPT/graph/badge.svg?token=HxfQTpcSZR)](https://codecov.io/gh/jkbhagatio/nanoGPT)
+
 A minimal (nanomal?) repository containing code for building, training, and running nanoGPT: a nano-version of OpenAI's GPT-3 Decoder-only Transformer, following this tutorial from Andrej Karpathy: https://www.youtube.com/watch?v=kCc8FmEb1nY
 
 A trained nanoGPT using this codebase acts only as a character-level text-completer (i.e. the end of the "pretraining stage" in typical Large Language Model development, here with tokens as only single characters).