Solar panels aren’t magic: a guide to understanding the ratings
Let’s start with the worst case scenario: you bought a 100 watt solar panel to charge your 500 watt-hour ebike battery with the expectation that it would charge your battery from empty to full in 5 hours. What could be simpler? 100 watts times 5 hours equals 500 watt-hours, right?
You unwrap your shiny, new panel, plug it into your boost solar charge controller, plug that into your watt-hour meter (because you’re a smart cookie) and plug that into your ebike battery. Trembling with anticipation, you focus on the meter’s display but instead of 100 watts, it shows that you’re only pushing 70 watts into the battery. You remember reading somewhere that tilting the solar panel to directly face the sun is a good thing so you try that and the power goes up to 82 watts. That’s better. But almost immediately, it starts to go down again, 81 watts, 80, 79… finally settling in at 75 watts as the panel warms up in the sun. You’re thinking “What’s going on here? Did I get ripped off? This solar crap is a total scam! Somebody owes me an explanation!!” Ok, settle down. Let’s break it down.
Did you get ripped off by a dishonest purveyor of solar merchandise? Probably not. Did the seller’s marketing pitch set realistic expectations about the product’s performance under real-world conditions? Probably not. And that’s the problem.
That 100 watt label is a nominal power rating, variously referred to as rated power, PMP, PMPP, Pmax, or Pnom. A solar panel’s voltage, current and power output vary depending on the temperature of the cells and the irradiance level (sun brightness), so the solar industry had to decide on a standard set of conditions under which solar panels would be tested and labeled. The conditions they picked are called “STC” or “Standard Test Conditions” and are defined as an irradiance of 1000 W/m2 and a cell temperature of 25°C (77°F). These values were selected not because they represent typical outdoor conditions under which the panels will be used but because they are cost-effective when flash-testing each panel as it comes off the assembly line in a factory operating at a comfortable room temperature.
For decades, this caused relatively few problems. Consumer solar devices were limited to things like solar calculators and tiny portable lights where the solar panel’s rating didn’t matter to the consumer. Serious, big-boy solar panels like the ones used on rooftops were installed by professionals who understood the measurement conventions and no one was upset or confused by the whole thing.
More recently, prices have come down and efficiency has gone up until suddenly small but relatively powerful solar panels can be had for US$ 1-3 per watt. Supply and demand being what they are, we now have a very competitive marketplace full of inexpensive solar charging products and equally full of marketing claims. What a time to be alive! The average consumer doesn’t want to study for an electrical engineering degree before buying a solar panel and the solar industry hasn’t been able to get away from that STC rating scheme because it’s just too well established. That’s how we get 100 watt solar panels that only give us 75 watts under relatively ideal conditions. It’s not a scam. It’s just a B2B standard in a B2C world.
Warning: math ahead. If math gives you hives, shortness of breath and/or flatulence, take a deep breath and go to your happy place. I’ll try to be gentle.
This is the label from a typical 100 watt solar panel. Notice how in tiny print it says “Standard Test Conditions”? This panel was tested to have these voltage, current and power values at 1000 W/m2 and 25°C. What the label does not tell you is how to figure out the voltage, current and power values at any other conditions. For that, we will need to look at the datasheet (PDF). I picked this panel because apparently Sunpower is the only semi-flexible panel manufacturer who can be bothered to publish a datasheet. This would never fly in the world of big-boy glass-and-aluminum framed panels with bankable 25 year power warranties, but I digress.
If your irradiance level is 500 W/m2 because it’s slightly cloudy then you might expect to get 50 watts from the panel (500/1000 * 100) but only as long as the temperature of the individual cells inside the panel is 25°C (77°F). Because the cells are almost black, they heat up significantly in the sun. On a warm, summer day when the air temperature is 25°C and we’re getting full irradiance at noon, the cell temperature is going to be somewhere between 45°C and 55°C, depending on air flow over the top and bottom of the panel. On a really hot day, it can get up to 75°C (167°F). Yes, you could cook an egg on that. I wouldn’t.
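If you like your arithmetic in code form, the irradiance scaling above is just a linear ratio. Here’s a minimal sketch in Python using the numbers from this example (a first-order approximation; real panels deviate a bit at very low light):

```python
# Rough linear scaling of panel power output with irradiance,
# valid while the cells are still at the STC temperature of 25C.
STC_IRRADIANCE = 1000  # W/m^2, per Standard Test Conditions

def power_at_irradiance(rated_watts, irradiance_w_m2):
    """Estimate panel output at a given irradiance, cells at 25C."""
    return rated_watts * irradiance_w_m2 / STC_IRRADIANCE

print(power_at_irradiance(100, 500))  # slightly cloudy -> 50.0 W
```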
Let’s assume a cell temperature of 50°C which is 25°C above the standard temperature (50-25). From the datasheet, we see that our panel has a Power Temperature Coefficient of -0.35%/°C meaning that for every °C of temperature rise above 25°C we lose 0.35% of power. That’s right. Photovoltaic (solar electric) panels lose power as they get hotter. I know, it’s kind of counterintuitive but you get used to it. With a 25°C rise, that’s a loss of 8.75% (25 * -0.35%) or we can call it a 91.25% derate (100-8.75).
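The temperature derate works the same way in code. This sketch uses the -0.35%/°C coefficient from the Sunpower datasheet mentioned above; your panel’s datasheet may list a different value:

```python
# Temperature derating using the panel's Power Temperature Coefficient.
# -0.35 %/degC is the value from the datasheet used in this article.
STC_TEMP_C = 25.0          # cell temperature at Standard Test Conditions
POWER_TEMP_COEFF = -0.35   # percent of rated power lost per degC above 25C

def temp_derate(cell_temp_c):
    """Fraction of rated power remaining at a given cell temperature."""
    delta = cell_temp_c - STC_TEMP_C
    return 1 + (POWER_TEMP_COEFF / 100) * delta

print(temp_derate(50))  # 0.9125, i.e. an 8.75% loss at 50C cells
```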
So, why aren’t we getting 91 watts (100*0.9125)? Odds are, we’re not getting the full 1000 W/m2 of irradiance. Even with perfect panel positioning relative to the sun’s position in the sky, the atmosphere filters out some light, more so when the sun is lower in the sky or when there’s air pollution. Let’s assume we’re getting 900 W/m2 because we don’t see any clouds in the sky and it’s the middle of the day. That would mean we should be getting 82 watts (100*0.9125*0.9).
Since we’re measuring at the output of the solar controller, we have to take into account any losses there as well. No charge controller is 100% efficient. A high frequency DC-DC converter like our boost charge controller is likely to be 85-95% efficient under typical operating conditions. Don’t be fooled by the marketing copy promising “99% peak efficiency”. We could measure the charge controller’s efficiency with a second watt-hour meter connected inline on the input side but let’s just assume it’s operating at 93% efficiency which gives us 76 watts (100*0.9125*0.9*0.93). The remaining 1 watt loss is in the connectors and wiring.
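Chaining all three factors together gives the number on the meter. This sketch multiplies out the example figures from this article (90% irradiance, 91.25% temperature derate, 93% controller efficiency); none of them are universal constants:

```python
# Expected power at the charge controller output:
# rated power x irradiance fraction x temperature derate x controller efficiency.
def expected_output(rated_w, irradiance_frac, temp_derate, controller_eff):
    return rated_w * irradiance_frac * temp_derate * controller_eff

watts = expected_output(100, 0.9, 0.9125, 0.93)
print(round(watts, 1))  # -> 76.4 W, before connector and wiring losses
```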
So, how long would it take to charge our 500 Wh battery using a 100 watt solar panel on a clear, sunny day? Lithium batteries like to be charged using a CC/CV profile meaning that roughly the first 80% of the charge is at constant current and the last 20% is at constant voltage. That last part is much slower, which is not the solar panel’s fault, so let’s ask instead “How long to get to 80% state of charge?” Using our 75 watt example, that would be 5.3 hours (500*0.8/75) assuming you start a little before solar noon and keep re-positioning the panel to maintain the optimal angle. If it’s cloudy or you’re charging very early/late in the day, it may take quite a bit longer.
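The charge-time estimate is simple enough to sketch too. This assumes the charge power holds roughly constant during the constant-current phase, which is optimistic for a solar source whose output drifts with sun angle:

```python
# Hours to reach a target state of charge from empty,
# assuming roughly constant charge power (the CC phase).
# 500 Wh and 75 W are the worked-example numbers from this article.
def hours_to_soc(capacity_wh, target_soc, charge_watts):
    return capacity_wh * target_soc / charge_watts

print(round(hours_to_soc(500, 0.8, 75), 1))  # -> 5.3 hours to 80%
```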
I’m leaving out some details here, like the fact that a battery’s internal resistance means you have to push in a little more than 500 Wh to fill a 500 Wh battery, but I’m running out of steam and I’m craving a cookie so let’s call it done for now.
BONUS: If you’re matching a solar panel to the input and output limits of your charge controller and to your battery, you’ll need to take temperature into account. The VOC and VMP values on the label are just a starting point. You’ll need the Voltage Temperature Coefficient from the datasheet for those calculations. If there’s interest, I’ll cover that in part 2. Let me know in the comments. Don’t forget to like, subscribe and smash that notification bell! Oh, wait, this isn’t YouTube. Yeah, I suck at social media.