Editorial Note (9/1/13): The program “Marketplace” recently looked into Mr. Mills’ claim. They found it to be nonsense. See: http://www.marketplace.org/topics/sustainability/no-your-phone-doesnt-use-much-electricity-refrigerator. I like that they put the answer right in their headline: NO.
Editorial Note (8/23/13): since originally posting the article, a commenter (“Jay”) pointed out that the number I pulled from the article was off by 3 orders of magnitude. I had mistakenly copied “3.5kWh/year” as “4.5 kWh/charge”. The correct number for the iPhone 5 is 9.5 Wh/charge! Thanks for catching that – the post is completely revamped with the correct numbers. Having revamped the conclusions, I also did some digging into the original claimant, and added my findings at the end. This is what I get for writing the original article while super-tired. Thanks, Jay!
The following claim was recently brought to my attention by someone on a social network, nick-named “kete”. It’s a cool claim, and I thought I’d post my response here.
The claim has been circulating in the news headlines recently, as evidenced by the Forbes article linked at the bottom of this page (“Your iPhone uses more energy than your refrigerator”).
CLAIM: your iPhone uses more energy than your refrigerator.
Before we dive into this, keep in mind that the refrigerator is old technology, and that thanks to government regulation it has been made into a leaner, more energy-efficient device over decades. Modern refrigerators are a pinnacle of good engineering design. By comparison, the modern smartphone is only about a decade old, and while its development cycle is short, the main pressure on phone batteries is to last longer – which means supplying bigger batteries (not necessarily more efficient ones) that take more energy to charge from empty to full.
For this calculation, we have to define what we mean by “a fridge.” Is it old? Is it brand new? They come in many sizes, models, and qualities. A 20-year-old, 18 cubic-foot (CF) unit can use about 1200 kWh per year; a 10-year-old model, about 800 kWh; and a new model, about 500 kWh. I confirmed these numbers against the U.S. Department of Energy website. You can see the tremendous strides in refrigerator efficiency that government regulation helped drive.
What about an iPhone? We have to make some assumptions here about battery life, because that depends on how you use it. If you use GPS heavily, play video and music, and keep Bluetooth, wifi, and 3G/4G running, you probably have to charge every day or every other day. Let’s assume every other day; we can always double the number later to cover doing a full recharge of the phone every day.
Charging an iPhone 5 (battery capacity about 2000 mAh) from empty to 100% full has been measured at about 9.5 Wh. My wife has an iPhone 4 (battery capacity about 1500 mAh), and I could fire up the “Kill-A-Watt” and measure it next time to compare, but let’s assume you have an iPhone 5. If you charge it from 0% to 100% every other day, you use 365/2 * 0.0095 kWh = 1.7 kWh every year to charge it. If you charge every day, then it’s more like 3.5 kWh per year.
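The arithmetic above is simple enough to sanity-check in a few lines (this sketch just redoes the charging math with the measured 9.5 Wh figure):

```python
# Annual charging energy for an iPhone 5, using ~9.5 Wh per full charge.
WH_PER_CHARGE = 9.5  # measured empty-to-full charge energy, in Wh

kwh_every_other_day = (365 / 2) * WH_PER_CHARGE / 1000  # kWh per year
kwh_every_day = 365 * WH_PER_CHARGE / 1000              # kWh per year

print(f"every other day: {kwh_every_other_day:.1f} kWh/yr")  # ~1.7 kWh/yr
print(f"every day:       {kwh_every_day:.1f} kWh/yr")        # ~3.5 kWh/yr
```

Either way, the annual total is a few kilowatt-hours at most.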
So, if we’re comparing the iPhone 5 (probably the most efficient iPhone, although certainly it also has the largest battery) to any 18 CF refrigerator within the last 20 years, and you charge the iPhone from 0% to 100% every other day, then it’s no comparison – your fridge costs way more energy and the iPhone 5 is pretty energy-friendly by comparison.
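Putting the two side by side makes the gap obvious (a quick sketch using the fridge figures above and the every-other-day charging figure of 1.7 kWh/yr):

```python
# Annual energy use: 18 CF fridges of various ages vs. an iPhone 5
# charged from 0% to 100% every other day.
fridge_kwh_per_year = {"20-year-old": 1200, "10-year-old": 800, "new": 500}
iphone_kwh_per_year = 1.7

for age, kwh in fridge_kwh_per_year.items():
    ratio = kwh / iphone_kwh_per_year
    print(f"A {age} fridge uses ~{ratio:.0f}x the iPhone's charging energy")
```

Even the newest, most efficient fridge uses the phone's annual charging energy roughly 300 times over.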
So this claim is false. There is no reasonable way to make charging your iPhone during the year outpace the energy consumption of your fridge.
The claim has been printed in Forbes and repeated in “The Week”. It’s based on a paper by Mark Mills, CEO of Digital Power Group, entitled “The Cloud Begins With Coal: Big Data, Big Networks, Big Infrastructure, and Big Power.” Scary title.
We can use our bullshit detector skills to critically analyze the paper:
- It’s not published or peer reviewed. That’s a huge red flag.
- It doesn’t appear in a journal. It appears on the web. That’s another huge red flag.
- The sponsors of the study are listed as “National Mining Association” and “American Coalition for Clean Coal Electricity.” Not exactly basic science funding agencies. Red flag.
- According to his bio, “Mark holds a BSc Honours degree in physics from Queen’s University, Canada, and is a member of numerous professional societies including the American Physical Society and Institute of Electric and Electronic Engineers.” Red flags abound. He has no formal research training – a B.Sc. in physics (I found another source that specified his area of study) does not provide adequate research training, even if it involves research. He is a member of some fancy-sounding societies – except you join them by paying. Not very prestigious.
How does Mr. Mills draw his conclusion? He writes:
Reduced to personal terms, although charging up a single tablet or smart phone requires a negligible amount of electricity, using either to watch an hour of video weekly consumes annually more electricity in the remote networks than two new refrigerators use in a year. 
This text is followed by a footnote. Reading the footnote, we find:
New refrigerator 350 kWh per EPA Energy Star; ~700 kWh/yr weekly streaming HD from [network operations] + [network embodied energy] + [tablet embodied energy]; note, ignores data centers & end-use tablet charging: ~300 kWh/yr wireless network operations from HD video 2.8 GB/hr per Netflix, network energy ~2 kWh/GB. Note energy use varies w location (type/age equipment), system utilization (see Auer et al, “How Much Energy is Needed to Run a Wireless Network?” June 2012). Network energy ranges from 19 kWh/GB The Mobile Economy, 2013, ATKearney, to ~2 kWh/GB per CEET, The Power of Wireless Cloud, April 2013. Annualized embodied/manufacturing energy to produce tablet (details in this report) ~100 kWh/yr per tablet, and cell network operating energy equals annualized embodied energy of network equipment used for 5 years. Refrigerator embodied energy adds 5-10% to lifecycle energy use of refrigerator.
So, in fact, what Mr. Mills is doing is adding:
- The cost of operating wireless networks
- The cost of manufacturing your mobile device
So “Forbes” and “The Week” got the headline all wrong. This is a paper (whose claims would need close review for a full assessment) about the total cost of running a wireless infrastructure and building a mobile device to operate on that infrastructure. Mr. Mills is comparing apples and oranges. He didn’t include the cost of manufacturing the refrigerator in his comparison (but DID include the cost of manufacturing the mobile device). He actually ignores the cost of charging in his paper!
This is a paper about the cost of running wireless infrastructure. That is hardly a fair comparison to owning a refrigerator; Mr. Mills is comparing two wildly different things. A fairer comparison would be the energy cost of operating a water infrastructure (e.g. fresh water, water recycling, sewage, etc.) against a wireless infrastructure. Both are infrastructures. The users of each infrastructure (e.g. toilets, sinks . . . heck, even refrigerators with ice makers and water dispensers) cost energy to manufacture. Why not do a FAIR comparison of infrastructure and the tools that use the infrastructure?
No doubt, wireless infrastructure costs energy. We need to get that under control. But this paper, and the poor headlines that resulted from it, are BAD SCIENCE and BAD SCIENCE REPORTING.
I rate the claim as embodied in these badly written news headlines as false (iPhones do NOT cost more to use each year compared to fridges, if you make a fair wall-socket comparison of power needs). I rate the comparison of operating a fridge to operating a household and nationwide wireless infrastructure as SHENANIGANS. Mr. Mills should have compared water infrastructure to wireless infrastructure if he wanted an apples-to-apples comparison of energy usage.