AI在线

Meta Launches CATransformers Framework to Help the AI Industry Achieve Emission Reduction Goals

In today's rapidly advancing field of artificial intelligence, the FAIR team at Meta, in collaboration with the Georgia Institute of Technology, has developed a brand-new framework called CATransformers. This framework is designed with the core concept of reducing carbon emissions, aiming to significantly decrease the carbon footprint of AI technology during operations by optimizing model architectures and hardware performance, thereby laying the foundation for sustainable AI development.

With the widespread application of machine learning across fields from recommendation systems to autonomous driving, the computational demands behind these technologies keep growing, and their high energy consumption is an increasingly prominent problem. Traditional AI systems typically require powerful computing resources and rely on custom hardware accelerators, consuming large amounts of energy during both training and inference and producing substantial operational carbon emissions. In addition, the hardware lifecycle itself, from manufacturing through disposal, releases "embodied carbon," further adding to the ecological burden.


Current emission reduction strategies tend to focus on operational efficiency, such as lowering energy consumption and increasing hardware utilization, but they often overlook the carbon emitted during hardware design and manufacturing. The CATransformers framework was created to address this gap. It uses a multi-objective Bayesian optimization engine to jointly evaluate model architectures and hardware accelerators, balancing latency, energy consumption, accuracy, and total carbon footprint.
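To make the idea concrete, here is a minimal sketch of what evaluating candidates against four objectives and scalarizing them into one score might look like. Everything here is an assumption for illustration: the cost models, weights, and search ranges are invented, and a random-search loop stands in for the actual multi-objective Bayesian optimization engine, which would require a dedicated library.

```python
import random

# Hypothetical toy cost models; none of these formulas or numbers come
# from the CATransformers paper. Each candidate pairs a model "width"
# with a hardware "clock", and the four objectives mirror the ones the
# article names: latency, energy, accuracy, and total carbon
# (operational plus embodied).
def evaluate(width, clock):
    latency = 100.0 / (width * clock)      # ms (toy model)
    energy = width * clock * 0.5           # J per inference (toy model)
    accuracy = 1.0 - 1.0 / width           # saturating toy curve
    carbon = energy * 0.2 + width * 0.05   # operational + embodied proxy
    return latency, energy, accuracy, carbon

# Scalarize the four objectives into a single score to minimize.
# Accuracy gets a negative weight because we want to maximize it.
def score(objs, weights=(1.0, 1.0, -5.0, 2.0)):
    return sum(w * o for w, o in zip(weights, objs))

# Random search as a stand-in for the Bayesian optimization loop:
# sample candidate (width, clock) pairs and keep the best score.
def search(n_trials=200, seed=0):
    rng = random.Random(seed)
    best = None
    for _ in range(n_trials):
        width = rng.uniform(1.0, 8.0)
        clock = rng.uniform(0.5, 2.0)
        s = score(evaluate(width, clock))
        if best is None or s < best[0]:
            best = (s, width, clock)
    return best
```

A real co-design system would also return a Pareto front rather than a single weighted-sum optimum, so that designers can trade latency against carbon explicitly.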


The CATransformers framework is optimized in particular for edge inference devices. By pruning a large CLIP model, it generates variants with lower carbon emissions but strong performance. For example, CarbonCLIP-S matches TinyCLIP-39M in accuracy while cutting carbon emissions by 17% and keeping latency under 15 milliseconds, and CarbonCLIP-XS improves accuracy by 8% over TinyCLIP-8M while reducing carbon emissions by 3%, with latency under 10 milliseconds.
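The article does not describe the specific pruning procedure behind these variants. As a generic illustration of the simplest family of such techniques, magnitude pruning, the sketch below zeroes out the smallest-magnitude fraction of a weight list; the function name and the flat-list representation are assumptions for this example, not the paper's method.

```python
def prune_by_magnitude(weights, sparsity):
    """Zero out roughly the smallest-magnitude `sparsity` fraction of
    a flat list of weights.

    Generic magnitude-pruning illustration only; the CarbonCLIP
    variants come from a different, hardware-aware pruning pipeline.
    """
    n_prune = int(len(weights) * sparsity)
    if n_prune == 0:
        return list(weights)
    # Threshold at the n_prune-th smallest absolute value; ties at the
    # threshold may prune slightly more than the requested fraction.
    threshold = sorted(abs(w) for w in weights)[n_prune - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]
```

In practice, structured pruning (removing whole attention heads or layers) is preferred for edge hardware, since unstructured zeros rarely translate into real latency or energy savings.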


The research findings indicate that designs optimized solely for latency can hide carbon increases of up to 2.4 times. In contrast, a design strategy that considers both carbon emissions and latency achieves total emission reductions of 19% to 20% with negligible latency loss. CATransformers thus provides a solid foundation for designing sustainable machine learning systems, showing how an AI development model that accounts for hardware capabilities and carbon impact from the outset can deliver both performance and sustainability.

As AI technology continues to evolve, CATransformers will provide the industry with a practical path to emission reduction, helping to realize the future of green technology.
