Why multi-head self-attention works: math, intuitions, and 10+1 hidden insights

Learn everything there is to know about the attention mechanism of the famous transformer, through 10+1 hidden insights and observations