How is that the case, then? A higher-rated adapter will be larger and have a better heatsink, so it will be able to dissipate heat better.
Also, switched-mode power supplies (which most quality adapters are these days) are most efficient at around 40-60% of their maximum output, so they run more efficiently (and cooler) there than at 100% load.
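To put rough numbers on it: a phone that charges at about 1 A sitting on a 2.1 A adapter loads it to roughly 50%, right in that sweet spot, while the same phone would run a 1 A adapter flat out at 100%.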
Well, just try and find me an official source for your claim...
Here's one I'll provide:
A dozen USB chargers in the lab: Apple is very good, but not quite the best
Multiple tests, on adapters rated from 0.7 A to 2.1 A... funnily enough, efficiency is similar across all the brand-name ones (a decent range), while the cheap no-name ones do badly.
If an adapter were as inefficient as you claim, it would be a risk regardless of which device it was used with.
What it also comes down to is that the device is what controls the power draw, and if anything, charging an 18650 at 0.5 A is better for its lifespan.
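(For a sense of scale: typical 18650 cells are somewhere around 2000-3500 mAh, so 0.5 A works out to roughly 0.15-0.25C, a fairly gentle charge rate compared to the 0.5-1C most cells are rated for.)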
And no... the device cannot "force" the adapter to push out more power, like many believe it can (causing this mythical overheating). The device simply draws what it needs from what it's given. If extra amps are available, it will still only take up to what it's designed to handle as a maximum.
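Quick example: a phone designed to charge at 5 V / 1 A will draw about 5 W whether you plug it into a 1 A adapter or a 2.4 A one - the extra 1.4 A of headroom on the bigger adapter simply goes unused.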
And in the end, the thing is, adapters that aren't crap actually have to pass efficiency testing and fall within a proper range - it's an industry standard. So if you have a crappy adapter, odds are it isn't certified.
If the adapter cannot dissipate the extra heat, then it's not a very good one. A larger adapter may take a bit longer to heat up simply because of its size, but it will heat up just the same.