The LLaMA model family spans several sizes, and even the 13B model is competitive with much larger models. This highlights the potential of smaller, openly released models in the evolving landscape of open-source LLMs.
> LLaMA comes in different sizes: 7B, 13B, 33B, and 65B. Even the 13B model can compete with much larger models.
> Decoder-based transformer.
> Sparked the open-source LLM revolution.
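The "decoder-based transformer" note refers to the decoder-only architecture LLaMA uses: each token attends only to itself and earlier tokens via a causal mask. A minimal single-head sketch of that causal self-attention (plain numpy, random weights for illustration only, not LLaMA's actual parameters or full block):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def causal_self_attention(x, Wq, Wk, Wv):
    """Single-head causal self-attention: position i attends only to
    positions <= i, which is what makes the model decoder-only."""
    T, d = x.shape
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = (q @ k.T) / np.sqrt(d)
    # Causal mask: forbid attention to future positions
    future = np.triu(np.ones((T, T), dtype=bool), k=1)
    scores[future] = -np.inf
    return softmax(scores) @ v

rng = np.random.default_rng(0)
T, d = 4, 8
x = rng.normal(size=(T, d))
Wq, Wk, Wv = (rng.normal(size=(d, d)) * 0.1 for _ in range(3))
out = causal_self_attention(x, Wq, Wk, Wv)
```

A useful sanity check of the causal mask: perturbing a later token must leave the outputs at earlier positions unchanged.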
Microsoft has released Skala, a neural-network exchange-correlation functional for density functional theory that achieves chemical accuracy comparable to hybrid functionals at semi-local cost. This could be relevant for engineers working on computational chemistry applications.
> Microsoft just released Skala on Hugging Face: a neural network exchange-correlation functional for density functional theory that achieves chemical accuracy on par with hybrid functionals at semi-local cost.
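To make "semi-local cost" concrete: a semi-local functional's energy integrand depends only on the density and its gradient at each grid point, so evaluation is a single pass over the grid, whereas hybrids need nonlocal exact exchange. The sketch below is not Skala itself (its learned form is not given here) but a standard GGA-style exchange term, LDA exchange times a PBE-like enhancement factor, illustrating the kind of local inputs a semi-local (or learned) functional consumes:

```python
import numpy as np

def lda_exchange(rho):
    # LDA exchange energy per particle: -(3/4) * (3/pi)^(1/3) * rho^(1/3)
    return -0.75 * (3.0 / np.pi) ** (1.0 / 3.0) * rho ** (1.0 / 3.0)

def pbe_enhancement(s, kappa=0.804, mu=0.21951):
    # PBE enhancement factor over LDA, as a function of the reduced gradient s
    return 1.0 + kappa - kappa / (1.0 + mu * s**2 / kappa)

def gga_exchange_energy(rho, grad_rho, weights):
    """Semi-local exchange energy on a quadrature grid: the integrand at
    each point uses only rho and |grad rho| there (one O(N_grid) pass)."""
    kf = (3.0 * np.pi**2 * rho) ** (1.0 / 3.0)  # local Fermi wavevector
    s = np.abs(grad_rho) / (2.0 * kf * rho)     # reduced density gradient
    eps_x = lda_exchange(rho) * pbe_enhancement(s)
    return np.sum(weights * rho * eps_x)
```

A neural functional like Skala replaces the hand-designed enhancement with a learned model over similar local features, keeping the same per-grid-point evaluation pattern.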