In layman’s terms, think of electricity like water:
* Amps are the stream width (really, the nozzle width).
* Volts are the spray pressure.
* Watts are the volume of water delivered each second.
So, 0.00000001 amps at a million volts won’t kill you, but it might poke a tiny hole in you. 1,000 amps at 5v would be like having an ocean land on top of you slowly.
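To put numbers on those two extremes: power in watts is just volts times amps. A minimal sketch of the math, using the figures above:

```python
def watts(volts, amps):
    """Power (watts) = pressure (volts) x flow (amps)."""
    return volts * amps

# A million volts at a hundred-millionth of an amp: the tiny poke.
static_shock = watts(1_000_000, 0.00000001)   # about 0.01 W

# 5 volts at 1,000 amps: the slow ocean.
slow_ocean = watts(5, 1000)                   # 5,000 W
```

Same formula both ways: a huge voltage with almost no current, or a huge current at a harmless-sounding voltage, and the watts tell you which one actually carries dangerous energy.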
At zero volts, amps is meaningless. It’s frozen. At zero amps, volts does not matter. All the pressure in the world cannot do anything if not a single molecule of water (or electron) can flow.
Capacity matters too. It is how much charge can flow before the volts and amps drop off, like the last dribbles from a bucket or a drained water tower: the stream thins, and the pressure falls off.
Power banks mostly use Lithium Ion cells, which are nominally 3.7 volts. Some are flat pouches, and some are coiled into a round cell like the 18650 used in most 1″-thick chargers. Same volts, though.
Your device uses 3.7v inside too, since it also has a Lithium Ion cell, but USB has to be 5v per the standard.
There is a conversion loss stepping up to 5v in the power bank, and another loss stepping back down inside your device, plus losses in the USB cable depending on the wire thickness and length (short and fat is lowest loss).
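Those losses multiply along the chain. Here is a rough sketch with made-up but plausible efficiency figures (the real numbers depend on the specific converter chips and the cable):

```python
# Hypothetical efficiencies -- actual values vary by hardware.
boost_eff = 0.92   # 3.7v -> 5v step-up inside the power bank
cable_eff = 0.97   # resistive loss in a short, thick USB cable
buck_eff  = 0.92   # 5v -> 3.7v step-down inside the phone

# Each stage keeps only its fraction of what the previous stage passed on.
overall = boost_eff * cable_eff * buck_eff   # roughly 0.82
print(f"About {overall:.0%} of the bank's energy reaches the phone's cell")
```

Even with decent converters on both ends, it is normal to lose 15-20% of the stored energy before it lands in your device’s battery.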
There is also a loss that depends on current load: pulling more current (amps) drops the voltage faster. Once the cell hits some minimum voltage, maybe 2.4v, the bank shuts off. Basically, the battery burns very slowly, and at a certain point it can crack inside. Charging it un-burns it, but does not (usually) weld the cracks back together. Eventually, the cracks isolate parts of the battery from the wires inside, which is why rechargeable batteries wear out.
Rather than guess at all of those losses, manufacturers just add together the best-case, new-cell mAh ratings. That is usually optimistic, and it decays with use.
The solution for setting expectations is to use watt-hours. Volts times amps is watts; for milliamps, just divide by 1,000. A good device will be about 95% efficient and will start slightly above its rated capacity. An okay device will be about 90% efficient and provide 90% of its rated capacity.
Capacity usually drops off by around 30% after 300 cycles, but leaving a cell charging past 99% can steal some of that, as can depleting it below 30%. A good device factors that into its claims and charging circuitry. Some devices do not, either to make the electronics cheaper or to inflate the new-battery capacity claims.
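That 30%-per-300-cycles figure works out to roughly 0.1% per full cycle. A crude linear sketch of the fade (real degradation is nonlinear and depends heavily on charging habits):

```python
def remaining_wh(rated_wh, cycles, fade_per_cycle=0.001):
    """Crude linear fade model: ~0.1% of rated capacity lost per full
    cycle, i.e. about 30% gone after 300 cycles. Charging past 99% or
    draining below 30% eats capacity faster; a good charging circuit
    guards against both."""
    return rated_wh * max(0.0, 1.0 - cycles * fade_per_cycle)

after_300 = remaining_wh(37.0, 300)   # about 25.9 Wh, ~70% of new
```

It is only a back-of-the-envelope model, but it explains why a year-old, daily-cycled power bank visibly under-delivers against its sticker rating.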
Usually, discharge should take 1-2 hours. Anything faster will reduce effective capacity, but the only way to know the real limit for sure is to find out the cell’s rated discharge rate (1C means draining the full capacity in 1 hour), the real cell capacity in watt-hours, the conversion efficiency of the converters, and the load in watts placed on the device (and, by proxy, the cells).
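Given those four numbers, run time is just usable energy divided by load. A sketch, where the 95% efficiency is an assumed converter figure, not a spec:

```python
def discharge_hours(cell_wh, load_watts, efficiency=0.95):
    """Run time = usable energy / power drawn.
    efficiency is an assumed converter figure, not a measured spec."""
    return cell_wh * efficiency / load_watts

def c_rate(cell_wh, load_watts, efficiency=0.95):
    """1C means draining the whole capacity in 1 hour; 2C in 30 minutes."""
    return 1.0 / discharge_hours(cell_wh, load_watts, efficiency)

# A 37 Wh bank feeding a 10 W load: about 3.5 hours, comfortably under 1C.
hours = discharge_hours(37.0, 10)
```

If the computed rate comes out above roughly 1C, expect the effective capacity (and the cell’s lifespan) to suffer.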
So, a 10,000 mAh 3.7v battery holds 37,000 mWh, or 37 watt-hours. At 95% efficiency, that is 35.15 Wh. At 5v, that is 7.03 Ah, or 7,030 mAh, or about 4.5 recharges of an iPhone 5 from zero to full with the phone off (low/no load).
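That arithmetic, written out as one runnable chain. The iPhone 5 battery rating of 1,440 mAh is an assumed figure, and comparing 5v amp-hours against the phone’s own mAh rating is only the same rough comparison made above, not a strict energy balance:

```python
rated_mah = 10_000                    # advertised rating, at the 3.7v cell voltage
cell_wh   = rated_mah / 1000 * 3.7    # 37.0 Wh stored in the cells
usable_wh = cell_wh * 0.95            # 35.15 Wh after a 95%-efficient step-up
ah_at_5v  = usable_wh / 5             # 7.03 Ah delivered over USB at 5v

iphone5_mah = 1440                    # assumed iPhone 5 battery rating
recharges = ah_at_5v * 1000 / iphone5_mah   # ~4.9 best case; charging overhead
                                            # and phone-side losses pull the
                                            # real-world figure down further
```

Run the same chain with your own device’s mAh rating and a 90% efficiency to get the pessimistic end of the range.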