- The Transformer architecture has expanded far beyond NLP and is emerging as a general-purpose architecture for machine learning.
- Large language models (LLMs) are in a scale-out phase and are becoming “nationalised”: each country wants its own LLM.
- AI-first approaches have taken structural biology by storm: proteins and RNA (cellular machinery) are being simulated with high fidelity.
- JAX is emerging as a popular ML framework as the pace of research accelerates and researchers become first-class citizens.
- Chinese universities have rocket…