What could happen if you connect a 60W max USB C cable to a 100W charger?

In my case, the charger proceeded to power the device without checking the cable. It could be that the charger or the receiver is not checking compatibility; my sample size is too limited to tell what went wrong.

Equipment tested:

  • Cable: Anker Powerline II (5A / 100W capable)
  • Cable: Anker Powerline III (60W max)
  • Cable: Apple iPad Pro stock cable (USB-C to C, slim cable, max power unknown)
  • Charger: AUKEY Omnia 100W (PA-B5) (max output 100W)
  • Power bank: Anker Powercore Elite III (87W variant, max input 100W)

I used a basic inline USB-C power meter to check voltage, current, and power.
All three cables carried 20V/5A (100W), even though two of them are not 5A certified.

Is this right? What could happen? I did not let the test run for more than a minute, worried about damage to the devices.

Damage is not possible. The Power Delivery protocol lets each side communicate, and there are electronics in the cable.

A 100W cable advertises that it is 100W capable.

The only way you can damage anything is to use a really cheap non-PD charger with a device not capable of handling the voltage. For example, a 5V-only device supplied with 20V would probably be harmed. But if a PD device which can take at most 15V is connected via a 100W cable to a 20V-capable PD charger, they'd negotiate and use 15V, the highest both can handle.

So I think what you're saying is that the 60W cables are being metered at 100W, and so they may be faulty. That's unlikely for these brands, so I suspect the cables are tested to 60W but advertise themselves as 100W. If you were to place a USB-C meter at the opposite end of the cable to measure the voltage drop, you may see the 60W cables showing lower voltage at the output than the 100W cable, as they have higher resistance.

The 20V in will come out as less than 20V. The thinner the metal in the cable, the higher its resistance. The longer the cable, the higher the resistance. The higher the resistance, the greater the voltage drop. So if you measure the voltage at the power-source end, then move the meter to the other end (or better, use two meters), you'll see a lower voltage there. The best cable is the one with the smallest voltage drop.
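To make the arithmetic concrete, here is a minimal sketch of the drop and heating involved. The 0.1 ohm round-trip resistance and the 5A/20V figures are assumed example values, not measurements of these particular cables:

```python
# Back-of-envelope voltage drop and cable heating, per Ohm's law.
# Assumed: 0.1 ohm round-trip (both conductors) cable resistance,
# 5 A at 20 V -- illustrative numbers only.

def voltage_drop(current_a, resistance_ohm):
    """Ohm's law: V = I * R."""
    return current_a * resistance_ohm

def power_lost(current_a, resistance_ohm):
    """Power dissipated as heat in the cable: P = I^2 * R."""
    return current_a ** 2 * resistance_ohm

drop = voltage_drop(5.0, 0.1)   # 0.5 V lost along the cable
heat = power_lost(5.0, 0.1)     # 2.5 W dissipated as heat
print(f"Voltage at device end: {20.0 - drop:.2f} V")  # 19.50 V
print(f"Heat in cable: {heat:.2f} W")
```

Note the squared current term: the same cable carrying 5A instead of 3A dissipates almost three times the heat, which is why forcing 5A through a 3A cable matters.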

If Anker is selling 100W-marked cables as 60W then that's naughty. It could be that the cable is just tested to be reliable at 60W and you shouldn't really use it at 100W. You'd be able to tell by using the USB meter.

Not exactly what I meant, but you did give me a good answer. Cables don't think for themselves; they just tell both sides what they are capable of. The issue here is that neither the power provider nor the receiver acknowledged this info and still proceeded to pump more power than the cables are advertised/marked for. My original concern is heat and a potential fire hazard.

If that's true then there is a safety issue with the Powercore. There are a number of places in the chain (Powercore, meter, cable, device) where something may be failing to adhere to Power Delivery. Below are steps to find which is the culprit.

  • If the Powercore is ignoring PD, then it's forcing 5A through a cable which is rated for only 60W (3A), risking damage to the cable, potentially fire. If so, it's urgent to email support and have the product pulled from sale, all units recalled, and the lawyers have their day. You have to do the tests below first to eliminate the other possible faults.
  • If the meter is at fault, not passing through the PD protocol and negotiation, then it's causing the problem (and their lawyers have their day). You can test this: if it always presents 5A 20V capability regardless of what is connected to it, you'd see a difference based on which end of the cable it is connected to. If it shows 5A 20V when connected directly to the Powercore, but shows 3A 20V when placed at the opposite end of a 60W cable away from the Powercore, then the cable is correctly stating it can only do 60W and the meter was incorrectly overruling it.
  • If Anker is selling 100W-emarked cables as 60W, then that's a potential issue: if the cable is only rated for 60W but advertises 100W, the cable risks being damaged. This is easy to test: measure the voltage at each end; a large difference is a sign of heat. It should be less than 0.5V; 2V, for example, would be a red flag.

So you have it in your hands now to figure it out, move the meter between the ends of the cables and note how it changes.
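The diagnostic steps above can be sketched as a small decision helper. This is purely illustrative: the function name, the 3.5A threshold, and the readings are hypothetical, standing in for "does the meter report 5A or 3A at each end when a 60W (3A) cable is in the chain":

```python
# Hypothetical helper summarizing the diagnostic logic above for a
# 60W cable test. Names and threshold are illustrative assumptions.

def diagnose(amps_at_source_end, amps_at_device_end):
    """Interpret meter readings taken at each end of a 60W (3A) cable."""
    if amps_at_source_end > 3.5 and amps_at_device_end > 3.5:
        # 5A flows no matter where the meter sits: either the source
        # ignores PD or the cable wrongly advertises 5A capability.
        return "source or cable at fault"
    if amps_at_source_end > 3.5 and amps_at_device_end <= 3.5:
        # Reading depends on meter position: the meter itself is
        # advertising 5A capability and overruling the cable.
        return "meter at fault"
    return "everything adhering to PD (3A negotiated)"

print(diagnose(5.0, 5.0))  # source or cable at fault
print(diagnose(5.0, 3.0))  # meter at fault
print(diagnose(3.0, 3.0))  # everything adhering to PD (3A negotiated)
```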


It is interesting that the combination in your case appears to be ignoring the lack of an emark chip in the cable. @professor did an excellent job running down the possibilities; the power meter misleading both ends seems like the most likely candidate.

More information on the design of these cables:

You only need one emark chip in the circuit, so it shouldn’t matter which end the meter is on if that is the problem.

It is very unlikely that Anker included an emark chip in a cable that isn't advertised or priced as having it. That would not be cost effective. And the iPad Pro doesn't use more than 30W, so no reason for Apple to provide one either.

You aren’t going to damage the devices at each end directly - they both can provide / take the power. The only real concern is damage to the cable.

Without knowing the gauge of the wire in the cable and the length, it is hard to predict. Since they didn't fail immediately, the remaining concern is heating over time if the wires have too much resistance, and that depends on the gauge. Ampacity information for small-gauge wires is hard to come by anyway: the NEC doesn't cover them, and I am not aware of another governing body that does. Different manufacturers have different guidelines and tables for that, and length comes into play as well as ambient temperature.
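For a rough feel of how gauge and length matter, here is a back-of-envelope sketch using the standard AWG diameter formula and copper resistivity. The gauges tried and the 1 m length are assumptions, since we don't know what's inside these cables:

```python
import math

# Estimate round-trip resistance and I^2*R heating for a given copper
# wire gauge and cable length. AWG formula and copper resistivity are
# standard; the example gauges and 1 m length are assumptions.

RHO_COPPER = 1.68e-8  # ohm * metre, resistivity of copper at 20 C

def awg_diameter_m(awg):
    """AWG to diameter in metres: d = 0.127 mm * 92^((36 - n) / 39)."""
    return 0.127e-3 * 92 ** ((36 - awg) / 39)

def round_trip_resistance(awg, length_m):
    """Resistance of both conductors (out and back) of the cable."""
    area = math.pi * (awg_diameter_m(awg) / 2) ** 2
    return RHO_COPPER * (2 * length_m) / area

for awg in (20, 24, 28):                  # plausible power-wire gauges
    r = round_trip_resistance(awg, 1.0)   # assume a 1 m cable
    heat = 5.0 ** 2 * r                   # watts dissipated at 5 A
    print(f"AWG {awg}: {r * 1000:.0f} mOhm round trip, {heat:.1f} W at 5A")
```

The spread is large: stepping down a few gauge sizes multiplies the heat several times over, which is why the wire inside a cheap thin cable matters so much at 5A.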

To remove the power meter from the question, you could fully discharge the power bank, then fully charge it with the 100W cable, and then repeat with the 60W cable, timing each run and comparing the times. If it is substantially slower with the 60W cable, everything is probably working correctly and you can blame the power meter. Note that it won't be 40% slower, as the charge current always tapers as the bank gets close to fully charged.

You can infer whether the cable is heating by measuring the voltage drop between the ends while the current is at its maximum for a few minutes.


You can't measure the resistance directly with the tools you own, but you can infer it from the voltage difference between the ends: the difference divided by the current gives the resistance, and the difference times the current gives the watts lost in the cable. A lot of lost watts in a short cable is a red flag. So move the meter around and see the differences.
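As a worked example of that inference (the 19.4V device-end reading is hypothetical, not one of the actual measurements in this thread):

```python
# Inferring cable heat and resistance from two voltage readings,
# per the method above. All numbers are assumed examples.

v_source, v_device, current = 20.0, 19.4, 5.0  # hypothetical readings

drop = v_source - v_device       # 0.6 V difference between the ends
watts_lost = drop * current      # P = dV * I: heat generated in the cable
resistance = drop / current      # R = dV / I: inferred cable resistance
print(f"{watts_lost:.1f} W lost in the cable, ~{resistance * 1000:.0f} mOhm")
```

Three watts dissipated along a one-metre cable would be well past the "less than 0.5V drop" rule of thumb given earlier in the thread.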
