Changes between January 9th and January 15th

Released by @KarelZe on 15 Jan 13:29 · 267 commits to main since this release · 3ca6b27

What's Changed

Empirical Study ⚗️

  • Run feature engineering on large scale (100 %) 💡 by @KarelZe in #109
  • Run exploratory data analysis on cluster (10 %) by @KarelZe in #108

Writing 📖

  • Add chapter on input embedding (finished) and positional encoding (cont'd) 🛌 by @KarelZe in #107
  • Finish chapter on positional encoding 🧵 by @KarelZe in #111
  • Add chapter on TabTransformer 🔢 by @KarelZe in #112
  • Add chapter on FTTransformer 🤖 by @KarelZe in #113
  • Correct the column embedding in the TabTransformer chapter 🤖 by @KarelZe in #115

Other Changes

Outlook 💡

  • Perform a code review of all previously written code.
  • Continue with transformer week 🤖: write the remaining chapters on the classical transformer architecture, attention and MHSA, as well as the pre-training of transformers.
  • Research additional tricks from literature to optimize training behaviour of transformers. Structure them for the chapter on training and tuning our models.
  • Improve the current transformer implementations by applying the tricks above, aiming to match the performance of gradient-boosted trees.
  • Add shared embeddings to the TabTransformer implementation.
  • Restructure notes and draft chapter for model selection of supervised and semi-supervised models.
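
For the shared-embedding item above, a minimal PyTorch sketch of what shared (per-column) embeddings for the TabTransformer could look like: each categorical token reserves its first few dimensions for a vector shared by all categories of that column, so the model can tell columns apart. All names, dimensions, and the class itself are hypothetical, not the repository's actual implementation.

```python
import torch
import torch.nn as nn


class SharedColumnEmbedding(nn.Module):
    """Hypothetical sketch: categorical embeddings where the first
    `shared_dim` dimensions of every token are a learned vector shared
    across all categories of the same column."""

    def __init__(self, cardinalities, d_token=32, shared_dim=8):
        super().__init__()
        # per-category embeddings fill the remaining dimensions
        self.embeddings = nn.ModuleList(
            nn.Embedding(card, d_token - shared_dim) for card in cardinalities
        )
        # one learned vector per column, shared by all its categories
        self.shared = nn.Parameter(torch.randn(len(cardinalities), shared_dim))

    def forward(self, x):
        # x: (batch, n_cols) integer category ids
        tokens = []
        for i, emb in enumerate(self.embeddings):
            per_category = emb(x[:, i])                    # (batch, d_token - shared_dim)
            shared = self.shared[i].expand(x.size(0), -1)  # (batch, shared_dim)
            tokens.append(torch.cat([shared, per_category], dim=-1))
        return torch.stack(tokens, dim=1)                  # (batch, n_cols, d_token)
```

The shared slice lets every token carry a column identity even when two columns happen to map different categories to similar per-category vectors.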

Full Changelog: 23-02...23-03