I will start with the caveat that it has been a little while since I have done much with buck or boost converters, so hopefully someone with more recent experience will also chime in.
Not sure what that "optimal max" rating is, I have never encountered that term before. If it's the output at which the converter has maximum efficiency then it sounds rather poor, unless you meant 2.1A?
How much current the converter draws to do its job will depend on its percentage efficiency and the actual output load it is running at.
No. If the converter is rated for 3 amps at 5 volts, then the maximum output will be 3 amps. Depending on the quality of the converter you might be able to push it a little beyond that rating, or you might be close to breaking point running it near the maximum. It depends on whether the rating has a safety factor built in, or whether it has been wildly overstated (not uncommon in my experience). If your converter has an adjustable voltage output, and is nominally rated 3A @ 5V, then the available output current can vary depending on what voltage it is set to.
Assuming that you are running the converter at its maximum, 5V 3A, and assuming an efficiency of 90%, your 15W output works out to around 16.7W input, which for a 12V supply is around 1.4A of input current, less than your available maximum supply current.
Your actual values will all differ depending on the rated efficiency of your converter, and your actual load current values.
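If you want to play with the numbers yourself, the arithmetic above is easy to sketch out. This is just an illustration of the estimate I did by hand; the function name and the example values (5V/3A out, 90% efficiency, 12V in) are mine, so plug in your converter's actual figures.

```python
def input_current(v_out, i_out, v_in, efficiency):
    """Estimate converter input current from output power and efficiency."""
    p_out = v_out * i_out       # output power in watts
    p_in = p_out / efficiency   # input power must also cover the losses
    return p_in / v_in          # input current in amps

# Example from above: 5 V at 3 A output, 90% efficient, 12 V supply.
i_in = input_current(v_out=5.0, i_out=3.0, v_in=12.0, efficiency=0.9)
print(f"{i_in:.2f} A")  # roughly 1.39 A
```

Note that efficiency is not a single fixed number in practice; it varies with load and with the input/output voltage ratio, so check the efficiency curves in the datasheet if you have them.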
It also depends on whether you have a true buck converter, or whether your device is actually a 3-pin linear regulator. For a 3-pin regulator I would expect to see a sizeable heatsink, since it dissipates the voltage difference as heat rather than converting it.