Description
The increasing complexity of modern neural network architectures demands fast and memory-efficient implementations to mitigate computational bottlenecks. In this talk, we present a comprehensive evaluation of the recently proposed BITNET architecture across multiple HEP tasks, including quark-gluon discrimination, SMEFT parameter estimation, and detector simulation. We assess its performance in classification, regression, and generative modeling, benchmarking it against state-of-the-art methods. Our findings indicate that BITNET achieves competitive performance in classification tasks, while its effectiveness in regression and generation depends significantly on the network configuration. These results provide insights into both the current limitations and the potential of BITNET for future HEP applications.
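The defining idea behind BITNET is replacing full-precision linear layers with layers whose weights are quantized to the ternary set {-1, 0, +1} using an absmean scale, which shrinks memory and replaces most multiplications with additions. A minimal NumPy sketch of this quantization scheme is below; the function names (`absmean_ternary`, `bitlinear_forward`) are illustrative and not taken from the talk's implementation, and real BITNET layers also quantize activations and use straight-through gradients during training, which this inference-only sketch omits.

```python
import numpy as np

def absmean_ternary(w, eps=1e-8):
    # BitNet-style weight quantization: scale by the mean absolute
    # weight, round to the nearest integer, clip to {-1, 0, +1}.
    scale = np.mean(np.abs(w)) + eps
    q = np.clip(np.rint(w / scale), -1, 1)
    return q, scale

def bitlinear_forward(x, w):
    # Inference with ternary weights: the matmul reduces to signed
    # additions; the per-tensor scale restores the output magnitude.
    q, scale = absmean_ternary(w)
    return (x @ q.T) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 8))   # hypothetical layer: 8 inputs, 4 outputs
x = rng.normal(size=(2, 8))   # batch of 2 inputs
y = bitlinear_forward(x, w)
q, _ = absmean_ternary(w)
print(np.unique(q))           # only values from {-1, 0, +1}
```

Because the quantized weights carry at most log2(3) ≈ 1.58 bits each, the trade-off the talk examines is whether this compression costs accuracy on classification, regression, and generative HEP tasks.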
Secondary track: T12 - Data Handling and Computing