KAI PhD Colloquium - Filip Kerák (10.11.2025)
Monday, 10.11.2025 at 13:10 in room I/9
Speaker: Filip Kerák
Title: Shrinking the giants - Sparse Neural Networks, a necessity of the future
Date: 10.11.2025, 13:10, room I/9
Abstrakt:
How to live long enough to understand the ultimate question of life, the universe and everything... Modern neural networks often comprise billions of parameters, resulting in substantial time and computational demands during both training and inference. While these large-scale models have achieved remarkable success across a wide range of tasks, their size and resource requirements pose significant challenges for broader adoption and sustainability.
A promising solution lies in reducing the size of these models. Strategies such as lowering numerical precision or quantization can only be applied up to a certain point; therefore, minimizing the number of parameters without sacrificing accuracy is an increasingly important next step. One effective approach is to introduce sparsity during training. By selectively retaining only the most essential connections, sparse training techniques can produce models that are significantly smaller and faster, yet still highly capable.
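To illustrate the idea of retaining only the most essential connections, here is a minimal sketch of magnitude-based pruning (one common way to induce sparsity, not necessarily the specific technique the talk covers). The function name `magnitude_prune` and the use of NumPy are illustrative assumptions; real sparse-training pipelines typically reapply such a mask after each weight update.

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Return a boolean mask keeping the fraction (1 - sparsity) of weights
    with the largest absolute value; the rest are pruned to zero."""
    flat = np.abs(weights).ravel()
    k = int(round(sparsity * flat.size))  # number of weights to drop
    if k == 0:
        return np.ones(weights.shape, dtype=bool)
    # k-th smallest magnitude serves as the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    return np.abs(weights) > threshold

# Sketch of one sparse-training step: mask the weights after an update.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 4))          # stand-in for a layer's weight matrix
mask = magnitude_prune(W, sparsity=0.75)
W_sparse = W * mask                  # only 25% of the connections survive
```

In an actual training loop, the mask would be recomputed or kept fixed according to the chosen sparse-training schedule (e.g. gradual pruning versus a fixed sparse topology).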

