Artificial intelligence (AI) is now widely used in many fields, including computer vision, natural language processing, time series analysis, and speech synthesis. In the deep learning era, and especially since the emergence of large language models (LLMs), most research attention has focused on chasing new state-of-the-art (SOTA) results, driving ever-larger model sizes and computational complexity. The demand for high computing power raises carbon emissions and also shuts out small and medium-sized companies and research institutions with limited funding, undermining research fairness. To address AI's challenges in computing resources and environmental impact, green computing has become a hot research topic.

Recently, Ant Group, together with a number of Chinese universities and research institutions, released a survey that systematically reviews the technologies used in green computing and proposes a green computing framework with four key components:

- Measures of Greenness: key factors and methods for measuring the computing resources an intelligent system requires. Common measurements include direct indicators such as running time, power consumption, and model size, as well as indirect indicators such as carbon emissions.
- Energy-Efficient AI: energy-efficient methods that optimize the entire life cycle of AI models, covering model design, training, and inference, as well as optimization techniques that reduce the power large language models consume during training and inference.
- Energy-Efficient Computing Systems: technologies for optimizing resource consumption in computing systems, including cluster resource scheduling, partitioning, and data management optimization.
- AI for Sustainability: applications that use AI to improve sustainability, both for environmental benefit (green computing for the environment) and for engineering efficiency (green computing for engineering). The former includes applications such as monitoring air pollution emissions with satellite-imagery computer vision and estimating carbon sequestration; the latter includes optimizing database security encryption.

The study states, "This new research direction has the potential to resolve the conflict between resource limitations and AI development." The survey, titled "On the Opportunities of Green Computing: A Survey", has been published on the preprint site arXiv.

Paper link: https://arxiv.org/abs/2311.00447

Drawing on the training and inference practice of many AI algorithms, the study identifies model size, parameter tuning, and training data as the three main factors affecting computing resource consumption. On this basis, it summarizes six common "greenness" measurements: running time, model size, FPO/FLOPs (floating-point operations), hardware power consumption, energy consumption, and carbon emissions. Tools for tracking these measurements include tfprof, Green Algorithms, CodeCarbon, Carbontracker, and the Automatic AI Model Greenness Tracking Toolkit.
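As a concrete illustration of how such measurements can be collected, below is a minimal sketch using the EmissionsTracker from CodeCarbon, one of the tools named above. The train_one_epoch function, the epoch count, and the project name are hypothetical placeholders, not details from the survey.

```python
# Minimal sketch: estimating the carbon footprint of a training run with
# CodeCarbon. The training loop is a hypothetical stand-in for a real workload.
from codecarbon import EmissionsTracker

def train_one_epoch() -> None:
    """Placeholder: substitute an actual model-training step here."""
    pass

tracker = EmissionsTracker(project_name="green-ai-demo")  # hypothetical name
tracker.start()
try:
    for _ in range(10):      # ten "epochs" of the placeholder workload
        train_one_epoch()
finally:
    # stop() returns the estimated emissions in kilograms of CO2-equivalent,
    # derived from measured energy use and a regional carbon-intensity factor.
    emissions_kg = tracker.stop()

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```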
In image classification, object detection, and other AI tasks, classic deep neural networks such as LeNet, VGG, and GoogLeNet achieve good performance but demand excessive computing resources. The survey therefore highlights Depth-wise Separable Convolution, Fire Convolution, Flattened Convolution, and Shrinked Convolution as more efficient alternatives (the first is sketched at the end of this section).

For neural networks built on graph data, the survey presents ImprovedGCN, which retains only the essential components of a GCN, and also recommends SeHGNN, which aggregates pre-computed neighbor representations, reducing complexity and avoiding the redundant re-aggregation of neighboring vertices in every training epoch.

For time series classification, the commonly used ensemble learning methods require substantial computing resources, so the survey points to two lighter-weight methods, LightTS and LightCTS.

The Transformer is a powerful sequence model, but the time and memory self-attention requires grow quadratically with sequence length, making long sequences expensive to process. To meet this challenge, the survey recommends Effective Attention and models such as EdgeBERT and R2D2.

Beyond the design of specific neural network components, there are general strategies for efficient architecture design, such as low-rank modules, static parameter sharing, dynamic networks, and supernetwork strategies. These strategies can be integrated into almost any parameterized structure.

For model training, the survey summarizes three directions: effective training paradigms, training data efficiency, and hyperparameter optimization. To achieve green AI and reduce the energy consumption of neural networks, compression techniques such as model pruning, low-rank decomposition, quantization, and distillation can also be applied; minimal sketches of a depth-wise separable convolution block, a low-rank layer, and a pruning-plus-distillation step follow below.
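To make the first of these ideas concrete, here is a minimal PyTorch sketch (not from the survey) of a depth-wise separable convolution block: a per-channel convolution followed by a 1x1 point-wise convolution. The channel sizes in the comparison are illustrative assumptions.

```python
import torch
import torch.nn as nn

class DepthwiseSeparableConv(nn.Module):
    """A 3x3 per-channel (depth-wise) convolution followed by a 1x1
    point-wise convolution. Versus a dense 3x3 convolution, parameters
    drop from in*out*9 to in*9 + in*out."""
    def __init__(self, in_channels: int, out_channels: int) -> None:
        super().__init__()
        self.depthwise = nn.Conv2d(in_channels, in_channels, kernel_size=3,
                                   padding=1, groups=in_channels, bias=False)
        self.pointwise = nn.Conv2d(in_channels, out_channels, kernel_size=1,
                                   bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.pointwise(self.depthwise(x))

# Illustrative parameter-count comparison (channel sizes are assumptions).
dense = nn.Conv2d(64, 128, kernel_size=3, padding=1, bias=False)
separable = DepthwiseSeparableConv(64, 128)
print(sum(p.numel() for p in dense.parameters()))      # 73728
print(sum(p.numel() for p in separable.parameters()))  # 8768 (~8x fewer)
```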
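The low-rank module strategy can be sketched in the same spirit: factor one large dense layer into two thin ones. The dimensions and rank below are assumptions chosen only to show the parameter savings, not values from the survey.

```python
import torch
import torch.nn as nn

class LowRankLinear(nn.Module):
    """Replaces one d_in x d_out weight matrix with a d_in x r and an
    r x d_out factor; parameters and FLOPs shrink when r << min(d_in, d_out)."""
    def __init__(self, d_in: int, d_out: int, rank: int) -> None:
        super().__init__()
        self.down = nn.Linear(d_in, rank, bias=False)
        self.up = nn.Linear(rank, d_out, bias=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.up(self.down(x))

# Assumed sizes: a 4096x4096 dense layer (~16.8M parameters) versus a
# rank-64 factorization (~0.53M parameters).
low_rank = LowRankLinear(4096, 4096, rank=64)
print(sum(p.numel() for p in low_rank.parameters()))  # 528384
```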
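Finally, here is a hedged sketch of two of the compression techniques named above, combining PyTorch's built-in pruning utilities with a standard knowledge-distillation loss. The layer size, pruning ratio, temperature, and mixing weight are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F
import torch.nn.utils.prune as prune

# Magnitude pruning: zero out the 50% of weights with the smallest |w|.
layer = nn.Linear(256, 10)                    # stand-in for a real model layer
prune.l1_unstructured(layer, name="weight", amount=0.5)
prune.remove(layer, "weight")                 # make the pruning permanent
sparsity = (layer.weight == 0).float().mean().item()
print(f"Weight sparsity after pruning: {sparsity:.0%}")

# Knowledge distillation: a small student mimics a large teacher's soft
# outputs while still fitting the hard labels.
def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      T: float = 2.0, alpha: float = 0.5) -> torch.Tensor:
    soft = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                    F.softmax(teacher_logits / T, dim=-1),
                    reduction="batchmean") * (T * T)   # soft-target term
    hard = F.cross_entropy(student_logits, labels)     # hard-label term
    return alpha * soft + (1.0 - alpha) * hard
```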
For energy-efficient computing systems, the survey briefly introduces solutions on several fronts, including optimizing cloud database resource utilization and hardware-software co-design. The same principles apply to data analytics, where techniques such as hybrid query optimization and machine learning can improve the energy efficiency of processing.

Notably, green computing holds that AI should not only be energy-efficient in its own development and operation, but should also actively contribute to green applications that address environmental and sustainability challenges. The survey points out that AI can effectively extract useful information from monitoring data, remote sensing data, and meteorological data in fields such as air pollution monitoring, carbon sequestration estimation, and carbon price forecasting, thereby informing decisions and action.

Although green computing has already delivered gains in energy efficiency and carbon emission reduction, computing resources remain a bottleneck for industry growth. The survey therefore proposes several future research directions: adding "greenness" measurements to model evaluation, developing a widely accepted greenness evaluation framework, exploring smaller but more efficient language models, and encouraging more industrial applications that reduce environmental impact.

Finally, the survey notes that the future of green computing will depend on the joint efforts of academia, industry, and government to balance environmental sustainability with AI efficiency. Policy support, innovative collaboration, and the sharing of best practices will be key to advancing the field.

Author: Yan Yimi | Editor: Academic Jun