From Structs and Lambdas to CO2 Emissions: Microsoft Software Development Gets Greener

News
As humanity plummets into a suicidal climate-change death spiral (some might say), Microsoft-centric software development is increasingly adopting “green” practices such as lowering CO2 emissions in order to slow the roll.
More and more, developers who previously worried about working with structs, lambdas and loops are now taking into account techniques to tune machine learning models and optimize network traffic to lessen environmental impacts.
Microsoft has undertaken myriad efforts that target sustainable software engineering and sustainable computing in general. In fact, the company is even taking the remarkable step of factoring progress on sustainability goals into executive pay. That’s likely to speed things up (more on that below).
Another example: Microsoft developer blogs now have a “Sustainable Software” section, which has published several posts in just the last month.
The introductory blog post of that series, published in August 2020, was written by Scott Chamberlin, Principal Software Engineering Lead, who said, “Green Software Engineering is an emerging discipline at the intersection of climate science, software practices and architecture, electricity markets, hardware and datacenter design.” Chamberlin pointed to the eight Principles of Green Software Engineering that help guide Microsoft’s efforts, a personal project led by Asim Hussain, Green Cloud Advocacy Lead at Microsoft, who also contributes to the Sustainable Software blog. The eight principles cover carbon efficiency, energy efficiency, carbon intensity, embodied carbon, energy proportionality, networking, demand shaping, and measurement and optimization.
Along with those principles, displayed on the Principles.Green site, are specific techniques for applying them to common application architectures such as Web-Queue-Worker, N-tier and Microservices. Drilling down into those areas, you can find guidance such as optimizing network traffic, increasing compute utilization, optimizing databases and more.
Drilling down further into the Web-Queue-Worker area (a web front end that handles HTTP requests paired with a worker that handles time- or processing-intensive operations), you can find network optimization techniques that reduce both the amount of traffic an architecture creates per operation and the distance each request and response travels.
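The Principles.Green guidance describes these techniques in prose rather than code, but the core ideas are easy to sketch. The following minimal Python example (not taken from Microsoft’s materials; the payload and function names are hypothetical) shows two common traffic-reduction moves, compressing response payloads and caching repeated responses, using only the standard library:

```python
import gzip
import json
from functools import lru_cache

def build_report() -> dict:
    """A stand-in for a payload a web or worker endpoint returns repeatedly."""
    return {"rows": [{"id": i, "status": "ok", "detail": "x" * 40} for i in range(500)]}

def to_wire(payload: dict, compress: bool = True) -> bytes:
    """Serialize a response; gzip it to cut the bytes sent per operation."""
    raw = json.dumps(payload).encode("utf-8")
    return gzip.compress(raw) if compress else raw

@lru_cache(maxsize=128)
def cached_report_bytes(cache_key: str) -> bytes:
    """Serve repeated identical requests from an in-memory cache instead of
    rebuilding and recompressing the payload each time; a client- or CDN-side
    cache would also avoid the network round trip entirely."""
    return to_wire(build_report())

if __name__ == "__main__":
    uncompressed = to_wire(build_report(), compress=False)
    compressed = cached_report_bytes("daily-report")
    print(f"uncompressed: {len(uncompressed):,} bytes")
    print(f"gzip + cached: {len(compressed):,} bytes")
```

The same thinking scales up to CDNs and regional caches, which shrink the distance each response travels rather than its size.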
For hands-on learning of the above, Microsoft offers a course titled The Principles of Sustainable Software Engineering, consisting of 12 units that the company estimates will take 33 minutes to work through.
Following those eight principles and other guidance, Microsoft has approached sustainable software engineering on many fronts, including the blog post “A Visual Guide To Sustainable Software Engineering,” published a few months ago.
While most Microsoft nuts-and-bolts software development posts don’t say much about green techniques or sustainable software engineering, Microsoft’s Bill Johnson points out there is a role for individual software engineers. Noting the broader carbon reduction efforts of major cloud providers, Johnson said, “it’s not always easy to draw a line between the code we write and sustainability efforts like these.”
Johnson, Principal Software Engineering Manager on the Azure SRE team, positions software engineers on the technical level of three levels of sustainable engineering that also include operational (DevOps/site reliability engineer) and environmental (sustainability engineer). He said sustainable software engineering “is about finding the balance between the technical, operational, and environmental aspects of a system to provide an optimal level of sustainability.”
“Technical sustainability covers the direct decisions we make for the system to produce its desired results,” Johnson said. “This includes both hardware decisions (CPUs, memory, networks) and software decisions (language, architecture, complexity) as well as things like latency in the system, testing requirements, or the scale up/out requirements. You can loosely think of this as ‘traditional’ software engineering.”
Individual Microsoft software developers/engineers/data scientists are indeed doing their part. One close-to-home example is Dr. James McCaffrey of Microsoft Research, a pre-eminent data scientist who writes The Data Science Lab in his role as a senior technical editor for Visual Studio Magazine. He was recently featured in a Pure AI article about intelligent sampling of huge machine learning datasets to reduce costs and maintain model fairness.
McCaffrey teamed up with fellow researchers Ziqi Ma, Paul Mineiro and KC Tung on a technique that also had green connotations, explained under the heading “Energy Savings and CO2 Emissions.” It read:
Large, deep neural network machine learning models, with millions or billions of trainable parameters, can require weeks of processing to train. This is costly in terms of money as well as in associated CO2 emissions. Training a very large natural language model, such as a BERT (Bidirectional Encoder Representations from Transformers) model can cost well over $1 million in cloud compute resources. Even a moderately sized machine learning model can cost thousands of dollars to train — with no guarantee that the resulting model will be any good.
It’s not unreasonable to assume a near-linear relationship between the size of a training dataset and the time required to train a machine learning model. Therefore, reducing the size of a training dataset by a factor of 90 percent will reduce the time required to train the model by approximately 90 percent. This in turn will reduce the amount of electrical energy required by about 90 percent, and significantly reduce the amount of associated CO2 emissions.
For example, a commercial airliner flying from New York to San Francisco will emit approximately 2,000 lbs. (one ton) of CO2 into the atmosphere — per person on the plane. This is a scary statistic. And unfortunately, it has been estimated that the energy required to train a large BERT model releases approximately 600,000 lbs. of CO2 into the atmosphere. In short, reducing the size of machine learning training datasets can have a big positive impact on CO2 emissions and their effect on climate conditions.
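The Pure AI article describes the researchers’ intelligent sampling technique only at a high level, and its exact algorithm isn’t reproduced here. As a rough illustration of the underlying idea (train on a small but representative fraction of the data so compute drops roughly in proportion), the following Python sketch draws a stratified 10 percent sample that preserves class proportions; the dataset and field names are hypothetical:

```python
import random
from collections import defaultdict

def stratified_sample(rows, label_key, fraction, seed=0):
    """Return a random subset of rows that keeps each label's share intact,
    so a model trained on roughly `fraction` of the data sees the same class balance."""
    rng = random.Random(seed)
    by_label = defaultdict(list)
    for row in rows:
        by_label[row[label_key]].append(row)
    sample = []
    for group in by_label.values():
        k = max(1, round(len(group) * fraction))  # keep at least one row per class
        sample.extend(rng.sample(group, k))
    rng.shuffle(sample)
    return sample

if __name__ == "__main__":
    # Hypothetical training set: 90 percent negative, 10 percent positive examples.
    data = [{"text": f"example {i}", "label": int(i % 10 == 0)} for i in range(10_000)]
    subset = stratified_sample(data, label_key="label", fraction=0.10)
    positives = sum(r["label"] for r in subset)
    print(f"kept {len(subset):,} of {len(data):,} rows; positive share: {positives / len(subset):.1%}")
```

If training time really does scale near-linearly with dataset size, as the passage above assumes, training on this 10 percent subset would cut compute, energy and the associated CO2 emissions by roughly 90 percent, at the cost of whatever accuracy the smaller sample gives up.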
Those concerns were summarized by Tung, who said, “I was surprised to learn how much CO2 is released during machine learning model training. The current approach for building ML models is not sustainable and we will hit a ceiling soon, if not already.”
For this article, McCaffrey shared his thoughts on the matter.
“Many of us at Microsoft watched as the company went all-in on cloud computing — it was one of those key inflection points that large companies go through every 10 years or so,” McCaffrey said. “I vividly remember seeing for the first time some photographs of one of Microsoft’s datacenters — with huge buildings and rack after rack of server machines. That image more or less galvanized many of us towards the reality of a new era of computer science where scales are gigantic — quite a change from the days of early PCs.
“A natural consequence of large scale is large impact. One of the earliest projects that I worked on in my role as the director of the internal Microsoft AI School was an effort to use machine learning to reduce the energy used by a Microsoft datacenter in Quincy, Wash. There’s a delicate balance between cooling the facility, which is very expensive, and saving cooling costs, which can result in increased hardware failures. This energy-saving effort, and others like it, naturally led to investigations of environmental factors, such as CO2 emissions and their impact on global climate. With large scale, a small improvement in efficiency can have a big impact.”
Microsoft’s green software engineering push is part of a broader sustainability effort across all of the company’s computing that began well over a decade ago. A big milestone in that effort was a Jan. 16, 2020, post that said “Microsoft will be carbon negative by 2030.” It started out: “The scientific consensus is clear. The world confronts an urgent carbon problem. The carbon in our atmosphere has created a blanket of gas that traps heat and is changing the world’s climate. Already, the planet’s temperature has risen by 1 degree centigrade. If we don’t curb emissions, and temperatures continue to climb, science tells us that the results will be catastrophic.”
Earlier this year, the company provided a one-year progress report on that carbon-negative commitment.
A key figure in Microsoft’s efforts is Dr. Lucas Joppa, Chief Environmental Officer. As McCaffrey explained, “Lucas was very passionate about the importance of environmental issues and he quickly formed an organization within Microsoft to look at these issues.”
In January Joppa commented on the one-year carbon negative progress report mentioned above: “As Microsoft’s Chief Environmental Officer, I know it won’t be easy to achieve these commitments. It will take the entire decade and it won’t happen if we ‘set it and forget it.’ It will be the result of a decade of purposeful action to enact operational and systemic changes. But over the next decade we will act in accordance with what we think needs to be done today to create the world we need to be operating in by 2030.”
About the Author
David Ramel is an editor and writer for Converge360.
