A Tighter Complexity Analysis of SparseGPT
CoRR (2024)
Abstract
In this work, we improve the analysis of the running time of SparseGPT [Frantar, Alistarh ICML 2023] from O(d^3) to O(d^ω + d^{2+a+o(1)} + d^{1+ω(1,1,a)-a}) for any a ∈ [0, 1], where ω is the exponent of matrix multiplication and ω(1,1,a) is the exponent of multiplying a d × d matrix by a d × d^a matrix. In particular, for the current bound ω ≈ 2.371 [Alman, Duan, Williams, Xu, Xu, Zhou 2024], our running time boils down to O(d^{2.53}). This improvement comes from analyzing the lazy update behavior in iterative maintenance problems, as in [Deng, Song, Weinstein 2022; Brand, Song, Zhou ICML 2024].
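As a rough illustration (not taken from the paper), the overall exponent is obtained by choosing a ∈ [0, 1] to balance the three terms in the bound above. The sketch below does this numerically, using the crude convexity-based upper bound ω(1,1,a) ≤ 2 + a(ω − 2) as a placeholder; the paper's O(d^{2.53}) relies on sharper rectangular matrix-multiplication exponents, so this placeholder only reaches roughly d^{2.61}.

```python
# Minimal sketch: balance the three terms in O(d^omega + d^{2+a} + d^{1+omega(1,1,a)-a})
# by a grid search over a in [0, 1]. The rectangular exponent omega(1,1,a) is replaced
# by a crude convexity upper bound, NOT the tighter bounds used in the paper.

OMEGA = 2.371  # current upper bound on the matrix multiplication exponent omega


def omega_rect_upper(a: float) -> float:
    # Convexity-based upper bound on omega(1,1,a), the exponent of multiplying
    # a d x d matrix by a d x d^a matrix (placeholder assumption).
    return 2.0 + a * (OMEGA - 2.0)


def total_exponent(a: float) -> float:
    # Exponent of the overall running time for a given choice of a.
    return max(OMEGA, 2.0 + a, 1.0 + omega_rect_upper(a) - a)


# Grid search for the a that minimizes the overall exponent.
best_a, best_exp = min(
    ((a / 1000, total_exponent(a / 1000)) for a in range(1001)),
    key=lambda pair: pair[1],
)
print(f"a ~ {best_a:.3f}, overall exponent ~ {best_exp:.3f}")
# Under this crude bound the optimum is around a ~ 0.61 with exponent ~ 2.61;
# tighter omega(1,1,a) bounds bring the exponent down to the ~2.53 reported in the paper.
```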