Abstract
We examine GRBs with both Fermi-LAT and X-ray afterglow data. Assuming that
the 100 MeV (LAT) emission is radiation from cooled electrons accelerated by
external shocks, we show that the kinetic energy of the blast wave estimated
from the 100 MeV flux is 50 times larger than that estimated from the X-ray
flux. This can be explained if either (i) electrons radiating in the X-ray band are
significantly cooled by SSC (suppressing the synchrotron flux above the cooling
frequency), or (ii) the X-ray-emitting electrons, unlike those emitting at
100 MeV, are in the slow-cooling regime. In either case the X-ray flux
is no longer a direct proxy for the blast-wave kinetic energy. We model the
LAT, X-ray and optical data and show that, in general, these possibilities are
consistent with the data and explain the apparent disagreement between the X-ray
and LAT observations. All possible solutions require weak magnetic fields:
$10^{-6} < \epsilon_B < 10^{-3}$ (where $\epsilon_B$ is the fraction of shocked
plasma energy in magnetic fields). Using the LAT emission as a proxy for the
blast-wave kinetic energy, we find that the derived prompt efficiencies are of
order 15%, considerably lower than previous estimates (87% and higher for the
same bursts). This provides at least a partial solution to the
"prompt high efficiency paradox".