Benchmarking EngGPT2-16B-A3B against Comparable Italian and International Open-source LLMs
Abstract
arXiv:2605.07731v1 Announce Type: cross

This report benchmarks the performance of ENGINEERING Ingegneria Informatica S.p.A.'s EngGPT2MoE-16B-A3B LLM, a 16-billion-parameter Mixture of Experts (MoE) model with 3 billion active parameters. Performance is investigated across a wide variety of representati…