Setting a phone down on a pad and watching it charge still feels a little like a miracle. But the science behind the trick is old—its story measures in centuries—and the technology is only now becoming ubiquitous in the things you buy. Your next phone will most likely have it, and some cars even build it into the front seat (and electric cars might one day wirelessly charge themselves).
The wireless power market is expected to grow to one billion charging units by 2020, according to IHS, a market research firm in London. So how does this sci-fi-like technology work, and why is it suddenly the new hotness?
First, a History
Wireless charging actually isn’t a novel idea. It’s even older than the Ford Model T. In 1831, the English physicist Michael Faraday first discovered the underlying magnetic and electrical principles behind induction charging, which transfers energy wirelessly between two coils: a transmitter and a receiver.
He described his experiment, which produced a “current of electricity by ordinary magnets,” in an 1831 series of lectures at the Royal Society in London. Faraday had used a liquid battery to send an electric current through a small coil. When he moved that coil in or out of a larger coil, the changing magnetic field induced a momentary voltage in the larger coil.
Then, there’s Nikola Tesla, who was hellbent on transmitting electricity without wires. He used Faraday’s underlying principles to first demonstrate the ability to transmit energy through the air. He created a magnetic field between two circuits, a transmitter and a receiver, in the late 19th century. And if you’re picturing something straight out of The Prestige, you’re not far off.
If you head over to the Griffith Observatory in Los Angeles, you can see this history in action. Tesla’s coil prototype has been on display there since 1937. In the demo, it powers a neon sign without any wires—and that’s what’s going on inside your smartphone when you place it on a wireless charger.
Although scientists had discovered wireless charging, it didn’t have many practical uses, at least not at first. Prior to smartphones, smartwatches and electric vehicles, most applications for wireless charging came down to…electric toothbrushes. Since the 1990s, electric toothbrushes with sealed plastic bottoms have used inductive charging built into the stand.
So…How Does It Work?
BMW sedans and iPhones rely on the same concept to catch a wireless charge: inductive charging. Long story short, inductive charging transfers energy from a charger to a receiver in the back of the phone through electromagnetic induction. Inside the charging pad is an induction coil that creates an oscillating electromagnetic field. The receiver coil in the smartphone or other device helps convert that magnetic field back into electricity to charge up the battery, just like Tesla had done back in the 1800s with his massive transmitter and receiver—only smaller.
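The coupling between the two coils can be sketched numerically. The values below are illustrative assumptions, not the specs of any real charger; the point is Faraday’s law of induction, under which the voltage that appears in the receiver coil is proportional to how fast the transmitter current changes.

```python
import math

# Illustrative sketch of inductive coupling (assumed values, not real charger specs).
MUTUAL_INDUCTANCE_H = 5e-6  # assumed coupling between pad and phone coils, in henries
PEAK_CURRENT_A = 1.0        # assumed peak transmitter-coil current, in amperes
FREQUENCY_HZ = 140_000      # Qi pads operate at roughly 100-200 kHz

def induced_voltage(t: float) -> float:
    """Voltage induced in the receiver coil at time t (seconds), per Faraday's law:
    v(t) = -M * dI/dt, with I(t) = I_peak * sin(2*pi*f*t)."""
    omega = 2 * math.pi * FREQUENCY_HZ
    di_dt = PEAK_CURRENT_A * omega * math.cos(omega * t)
    return -MUTUAL_INDUCTANCE_H * di_dt

# The peak induced voltage is M * I_peak * omega.
peak_v = MUTUAL_INDUCTANCE_H * PEAK_CURRENT_A * 2 * math.pi * FREQUENCY_HZ
print(f"Peak induced voltage: {peak_v:.2f} V")
```

The receiver’s electronics then rectify that alternating voltage into the direct current the battery needs—the “convert that magnetic field back into electricity” step described above.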
The larger the coils are inside the charger, the farther away you can move your phone, laptop or other device. There are two primary standards for wireless charging:
- Wireless Power Consortium’s Qi standard: Primarily used for smartphones, this standard also applies to other consumer devices. There are about 3,700 Qi-certified products on the market at the moment, according to the consortium, and each can support between 5 and 15 watts.
- AirFuel Alliance Resonant standard: This newer standard allows devices to charge from up to 50 millimeters away, meaning there’s no need to perfectly align your device with the charger and you’re free to pick up your phone while it charges. It also supports charging multiple items at once, like a smartwatch and a smartphone.
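To get a feel for what those wattage figures mean in practice, here’s a rough charge-time estimate. The battery capacity and efficiency numbers are assumptions for illustration, not values from the Qi or AirFuel specifications.

```python
# Back-of-the-envelope charge times at the wattages mentioned above.
BATTERY_WH = 12.0   # assumed battery capacity (~3,100 mAh at 3.85 V), in watt-hours
EFFICIENCY = 0.70   # assumed fraction of pad power that actually reaches the battery

def hours_to_full(pad_watts: float) -> float:
    """Estimated hours to fill an empty battery at a given pad wattage."""
    return BATTERY_WH / (pad_watts * EFFICIENCY)

for watts in (5, 10, 15):
    print(f"{watts:>2} W pad: about {hours_to_full(watts):.1f} hours")
```

Under these assumptions, tripling the pad’s wattage from 5 to 15 watts cuts the estimated charge time to a third—which is why the standards’ power ceilings matter.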
A Look Into the Future
Wireless charging isn’t widespread in most applications because there are limits to how far a device can be from the charging pad. But companies like WiTricity, founded by researchers at the Massachusetts Institute of Technology in Cambridge, Massachusetts, are focused on creating wireless charging that is actually practical for uses in the real world.
WiTricity’s CEO Morris Kesler told NPR last year that he imagines a future where wireless charging is ubiquitous.
“You drive your electric car into a garage, where wireless charging pads are on the floor,” he said. “You open the door to the house and throw your cellphone on the kitchen counter, where wireless charging tech is built into the countertops.”
Meanwhile, in 2017, researchers at Disney Research showed that open-air wireless charging is possible. As in, charging your phone while the receiver is across the room, much as a laptop picks up a WiFi signal through the air instead of being tethered to an ethernet cable.
Disney calls it “quasistatic cavity resonance,” and it allows structures like cabinets to generate quasistatic magnetic fields that “safely deliver kilowatts of power to mobile receivers contained nearly anywhere within.” Tesla would be proud.
A Few Drawbacks
Let’s rewind to 2017, when Apple came out with guns blazing at its annual developers conference. AirPower, as the company called it, would be a wireless charging pad that could power your iPhone X or iPhone 8, plus the Apple Watch and of course your AirPods, all at once.
Apple’s concept, at least based on patent filings, was to pack many overlapping 3D coils in close proximity, which would allow multiple devices to charge regardless of orientation. But that would also require complicated power management and a way to deal with excessive heat.
And since iPhones wirelessly charge at just 7.5 watts—rather than the Qi standard maximums of between 10 and 15 watts, which many Android phones use—the phones could get way too hot.
Wireless charging—which really should be thought of as plugless charging in its current state—does have its limitations. Sure, you can rest your phone right on the dock, but you can’t really use it unless you hold the whole charging station, which is not an altogether pleasurable experience. Plus, wireless charging is actually slower than the wired charger that came in the box with most phones.
Is Wireless Charging Worth Using?
Wireless charging also uses more energy, which means slightly higher electric bills. And because it’s less efficient, the lost energy mostly takes the form of heat, which can mean extra wear and tear on your battery.
That raises the question: what can companies do to mitigate this loss of energy? The Wireless Power Consortium says that the roughly 30 percent of energy wasted, on average, during each charge adds up to just pennies on your bill.
However, the environmental costs are more serious. A smartphone may draw only about 5 watts while charging, but with a billion phones in use, charging them all consumes roughly 900 billion watt-hours of energy per year. If each one used some kind of wireless charger, that figure rises to 1.13 trillion watt-hours—a net waste of 225 billion watt-hours of energy per year.
That wasted energy could power 35,000 homes and produce about 100,000 metric tons of carbon dioxide pollution per year, according to data from a 2012 report put out by the California Energy Commission.
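The arithmetic behind those figures checks out if you take the numbers at face value (treating the 1.13 trillion figure as the unrounded 1.125 trillion, so the subtraction matches the quoted 225 billion):

```python
# Sanity-checking the energy figures quoted above (units: watt-hours per year).
WIRED_WH_PER_YEAR = 900e9        # energy delivered to a billion phones via wired charging
WIRELESS_WH_PER_YEAR = 1.125e12  # same charge delivered through wireless pads (~1.13 trillion)

wasted = WIRELESS_WH_PER_YEAR - WIRED_WH_PER_YEAR
overhead = wasted / WIRED_WH_PER_YEAR

print(f"Extra energy consumed: {wasted / 1e9:.0f} billion Wh per year")
print(f"Overhead relative to wired charging: {overhead:.0%}")
```

In other words, the quoted numbers imply wireless pads draw about 25 percent more energy than wired chargers to deliver the same charge.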