Looks like a good one to use.
To find maximum output current, you first need to determine maximum switching current.
For that particular controller, maximum switching current is set by the configuration of the IPRG pin and either the ON resistance of the MOSFET (in the higher-efficiency configuration) or the value of the current-sense resistor (in the lower-efficiency configuration).
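For the sense-resistor configuration, the rough first-order relationship is I_limit ≈ V_sense(max) / R_sense, where V_sense(max) is the current-sense threshold selected by the IPRG pin. A quick sketch, with the threshold value as a placeholder (pull the real one from the datasheet for your IPRG setting):

```python
# Rough switching-current limit for the current-sense-resistor configuration.
# V_SENSE_MAX is the sense threshold selected by the IPRG pin; the value
# below is an assumed placeholder for illustration -- check the datasheet.
V_SENSE_MAX = 0.100  # volts (assumed, not from the datasheet)

def max_switch_current(r_sense_ohms, v_sense_max=V_SENSE_MAX):
    """Peak switch current at which the controller limits, in amps."""
    return v_sense_max / r_sense_ohms

# With a 20 mOhm sense resistor, the limit works out to about 5 A.
i_limit = max_switch_current(0.020)
print(i_limit)
```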
To find maximum output current, first calculate maximum input power (minimum input voltage × maximum switching current). Then knock roughly 20% off that for converter losses to get maximum output power, and divide by the output voltage. For example, with a 5A maximum switching current at 3.5V minimum input, you'd have a maximum of 17.5W in. Less 20% would be 14W out. At 6V out, that's a maximum output current of about 2.33A.
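The worked example above, spelled out step by step (the 20% derating is the assumed loss budget from the text, not a measured efficiency):

```python
# Output-current estimate for the example numbers in the text.
V_IN_MIN = 3.5     # volts, minimum input voltage
I_SW_MAX = 5.0     # amps, maximum switching current
V_OUT = 6.0        # volts, output voltage
EFFICIENCY = 0.80  # the "less 20%" derating for converter losses (assumed)

p_in_max = V_IN_MIN * I_SW_MAX     # 17.5 W maximum input power
p_out_max = p_in_max * EFFICIENCY  # 14.0 W after the 20% derating
i_out_max = p_out_max / V_OUT      # maximum output current

print(round(i_out_max, 2))  # 2.33
```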
Tolerance on max output current is mainly going to depend on MOSFET selection and inductor selection. These will determine the physical limit on switching current. That in turn will determine the setting requirement for the IPRG pin. If you design for 10A switching current and your MOSFET or inductor is only capable of 5A, either could overheat and burn out with higher output currents.
It's a fairly high-frequency switcher, so you'll need a MOSFET with low gate charge; that's a critical requirement. You'll also want one with an ON resistance appropriate to the selection range of the IPRG pin. And of course you'll want fairly high current capacity: 10A is a good minimum for both inductor and MOSFET in a high-output booster, though you could probably go somewhat lower if you need to.
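To see why low gate charge is critical at high switching frequency: the standard first-order estimate for gate-drive power is Qg × Vdrive × fsw, so the loss scales linearly with both gate charge and frequency. The numbers below are illustrative, not from any particular part:

```python
# First-order gate-drive loss estimate: P = Qg * Vdrive * fsw.
# All example values are assumed for illustration, not from a datasheet.
def gate_drive_power(qg_coulombs, v_drive, f_sw_hz):
    """Average power spent charging/discharging the MOSFET gate, in watts."""
    return qg_coulombs * v_drive * f_sw_hz

# A 60 nC gate driven at 5 V and 550 kHz costs roughly 0.165 W,
# dissipated in the gate driver -- doubling fsw or Qg doubles it.
p_gate = gate_drive_power(60e-9, 5.0, 550e3)
print(p_gate)
```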
If none of what I just said makes sense, you might have a hard time designing a booster that works well. For DC-DC converters, component selection and PCB layout are just as important as anything else.