I have zero knowledge of electronics, but what I gathered from the ProVari, DNA 20, etc. information on the web is that higher resistance at the same wattage will give less amp draw. The first part of what you described matches what I read, that watts are what matter. So to achieve the high voltage, the ProVari for example calculates the wattage required and draws the amps required from the battery; with higher resistance this draw would be less. As for the second part of what you are saying, does this mean that for a set wattage the amp draw is always the same irrespective of the resistance?
I am quoting here the ProVari 1 FAQ:
Although this does not specify the load resistance, it assumes a 2.5 amp output current, which is the max.
From the DNA chip datasheet we also have a minimum input current of 1.5A and a maximum of 12A. Does this mean the DNA 30 will always draw 12A from the battery at 30 watts, whatever the resistance is?
(Sorry too much OT)
What it means is that the amount of energy being drawn from the battery is the same as what you are putting into the coil, plus 2% to 15% or so depending on the converter and where it is in its efficiency range. Though not technically equivalent (for those who nitpick), wattage is the same as energy for our purposes. So a .1 ohm coil at 20 watts draws the same amperage from the battery as a 10 ohm coil at 20 watts (once again neglecting small variances in efficiency, although they could be identical or not). The practical advantage possible would be with a down converter using 2 or more batteries. In that case the lower amperage (approximately 1/2 for 2 batteries, 1/3 for 3 batteries, etc.) allows for using batteries of higher mAh rating, since the amperage needed is less and the higher mAh batteries typically have a lower maximum amperage rating. I have to get back to work now, however here is something I wrote one time regarding amperage and converters:
I will not go into the internal circuit design itself here, as it requires advanced knowledge of power electronics and a little black magic, as many have found out during implementation of a design in the real world. I will, however, attempt to explain the basics necessary to understand the relationships between input voltage and amperage and output voltage and amperage, which are hidden behind a lot of equations that look like alien writing. Bear in mind this is simplified and there are many things I am leaving out about the actual internal workings. I will also assume the external circuit is properly designed for the application.
To understand these relationships we need to briefly look at how the converters work.
A DC-DC buck converter, also called a down converter, will always have the output voltage lower than the input voltage.
(A linear regulator is a form of down converter but is less efficient: since it does not store energy, it has to burn off the excess power as heat.)
A DC-DC boost converter, also called an up converter, is very similar to a buck except the output voltage will always be higher than the input.
There are various other converters (buck-boost, flyback, SEPIC, Cuk, and variations) that will provide both lower and higher output voltages than the input voltage but are normally found in complete end user devices.
The following will apply to all modern DC-DC converters.
A DC-DC converter briefly stores energy from the input in either a magnetic field or a capacitor. In almost all converters it is a magnetic field, so we will look at those. The magnetic field is built up in what is called an inductor: a coil, or that block-looking thing with a coil inside it, which is normally the largest component in a converter. Once the energy is stored in the magnetic field there is no longer voltage or current, just the magnetic field strength. The converter then releases that energy to the output. During this release, the design of the converter (and possibly external components) determines what average voltage is present at the output.
What about current? Since voltage and current "disappear" inside the inductor when they create the magnetic field, input current has no direct bearing on output current.
Determining the output current is simple: divide the output voltage by the load resistance. For example, if the output voltage is 4 volts and the load resistance is 2 ohms, then the output current is 2 amps (4V / 2Ω = 2A).
To determine the input current, however, you must know the input or output wattage, the converter efficiency at the working parameters, and the input voltage. Since we will normally know the output side, we take output voltage times output current: using the above example, 4V × 2A = 8W of output wattage. Converters do not have the same efficiency throughout their range, so normally you will have to look at the charts in the manufacturer's data sheet to find the efficiency at the input and output you want to figure it for. For simplicity we will say that in this case the converter is working at 90% efficiency. So if we are getting 8W out of it, there must be 8.9W going into it: 8W / 90% = 8.9W (rounded to the nearest 1/10th watt). To figure the input current, divide the input wattage by the input voltage. Let's say we have 8.2V input (2 cells at 4.1V each): 8.9W / 8.2V = 1.1A (again rounded).
So you can see that while we are getting 2A out of the converter, we are only drawing 1.1A from the batteries. The input current will rise as the batteries discharge, however. Let's say our batteries have fallen from 8.2V total to 6.4V total. Thanks to the converter we still have 8 watts of output, which means we still have 8.9 watts of input (assuming the efficiency has not changed). But for the converter to get that 8.9 watts from the batteries, it has to draw more current since we are giving it a lower voltage: 8.9W / 6.4V = 1.4A. In both cases our batteries see less amperage than the output is providing.
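The buck arithmetic above is easy to check with a few lines of Python (same numbers as the worked example; rounding is to one decimal place):

```python
def input_current(v_out, r_load, v_in, efficiency):
    """Return (output amps, output watts, input watts, input amps) for a converter."""
    i_out = v_out / r_load        # Ohm's law: I = V / R
    p_out = v_out * i_out         # output wattage
    p_in = p_out / efficiency     # the converter must draw more than it delivers
    i_in = p_in / v_in            # W = V x A, so A = W / V
    return i_out, p_out, p_in, i_in

# Fresh batteries: 2 cells at 4.1V = 8.2V in, 4V out into a 2 ohm load, 90% efficient
i_out, p_out, p_in, i_in = input_current(4.0, 2.0, 8.2, 0.90)
print(round(i_out, 1), round(p_in, 1), round(i_in, 1))  # 2.0 A out, 8.9 W in, 1.1 A in

# Depleted batteries: 6.4V in, same 8W output
_, _, _, i_in_low = input_current(4.0, 2.0, 6.4, 0.90)
print(round(i_in_low, 1))  # 1.4 A in
```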
Now let's look at a boost converter. The math is the same. Say we still have 8W output at 4V, our input voltage is now 3.2V (1 cell), and our efficiency is still 90%. The input wattage will still be 8.9W, but the input current will not be the same: 8.9W / 3.2V = 2.8A. Now our battery sees more amperage than the output is providing.
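The same arithmetic for the boost case, as a quick check:

```python
p_out = 8.0                # watts delivered to the load
efficiency = 0.90          # assumed, as in the example above
v_in = 3.2                 # single cell at 3.2V
i_in = p_out / efficiency / v_in
print(round(i_in, 1))      # 2.8 A from the battery, more than the 2 A at the output
```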
As a general rule of thumb, a buck converter will always draw less amperage from the batteries than the output is providing, and a boost converter will always draw more. A converter that can do both will, as a rule of thumb, behave like a buck when it is giving less voltage out than in and like a boost when it is giving more voltage out than in.
You should always look at the lowest battery voltage the converter will see when determining the maximum amperage draw on the batteries.
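As a sketch, this is also how you would bound the DNA 30 question from earlier. The cutoff voltage and efficiency below are assumptions for illustration, not data-sheet values, so check the real efficiency curve:

```python
p_out = 30.0       # watts at the coil (DNA 30 at full power)
efficiency = 0.85  # assumed figure; the data sheet's efficiency chart is authoritative
v_in_min = 3.0     # assumed lowest battery voltage the converter will see
i_in_max = p_out / efficiency / v_in_min
print(round(i_in_max, 1))  # 11.8 A worst-case battery draw
```

Note the result depends only on the output wattage, efficiency, and input voltage, not on the coil resistance, which is the point: 30 watts only approaches the 12A maximum input near the battery cutoff voltage.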
Almost all modern DC-DC converters have a thermal shutdown mode: internal components determine if the converter is getting too hot and shut it down. Converters need to shed heat. Heat buildup is determined by load, duty cycle (how long they are on compared to off), and how well they can shed that heat. If a converter is driven hard and long, it is important not to block the airflow that carries this heat away, or thermal shutdown may occur.
Almost all modern DC-DC converters also incorporate what is normally called "hiccup" mode. When a setpoint is reached (determined by the manufacturer) where the electronics inside the converter cannot provide the current you are trying to draw from it, the converter goes into hiccup mode: it outputs the most it can for a very brief time, then outputs nothing for a longer period (a short duty cycle), then starts over. You can see where the name came from. This is to prevent damage to the internal components. The point at which this happens is normally referred to as the "inception point". The inception point will always be higher than the manufacturer's rated amperage for the converter; if not, the converter could never reach its rated amperage without cycling into and out of hiccup mode. Normally a manufacturer will set that point at 110% to 140% of rated amperage, below what destructive testing has shown will damage the internal components. The inception point is normally included in the manufacturer's data sheet.
Most manufacturers will also include a warning that a fuse should be used to protect the converter and power supply. This is because hiccup mode will only protect the converter against a dead short for a short period of time, in some cases 1 second or less. If the converter fails shorted instead of open, the batteries will be directly connected to the load, since the majority of converters are "non-isolated" and the input ground is also the output ground.