Most µ+ beams today are composed of muons literally emitted by positive pion decay at rest in the surface layer of the primary production target, where the pions themselves are produced by collisions of high-energy protons with target nuclei -- hence the common mnemonic name, ``surface muons''.
The asymmetry factor a increases monotonically with the e+ energy and reaches 100% at the maximum energy. Note that a changes sign at low energy; however, very few positrons are emitted with such low energies (see above), and those that are will usually not be detected.
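For reference (and assuming the conventional normalization, with x = E/E_max the reduced positron energy, E_max being approximately 52.8 MeV, and taking E(x) below to denote the normalized energy spectrum), the standard muon-decay forms are
\[
  E(x) \;=\; 2x^{2}\,(3-2x), \qquad a(x) \;=\; \frac{2x-1}{3-2x},
\]
so that a(x) is negative below x = 1/2, vanishes at x = 1/2, and rises to 1 (100%) at the endpoint; the spectrum-weighted average of a over all energies is 1/3.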
In any real experiment, some of the lower-energy positrons fail to penetrate the intervening material and trigger the detectors, or are ``curled up'' by applied magnetic fields, so that the efficiency f(x) for detecting positrons is energy dependent; the average asymmetry must then be obtained by integrating E(x) a(x) f(x) dx and normalizing by the integral of E(x) f(x) dx. This, combined with the finite solid angle of any real detector, renders the experimental maximum asymmetry Ao an empirical parameter to be determined by measurement on a sample known not to produce any muon depolarization but otherwise identical in every respect to the one under investigation. Obviously, this calibration can never be perfect; in general one should not trust any absolute calibration of Ao to better than about 5% of itself.
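As a purely illustrative sketch of that weighted average (not the calibration procedure itself), the following assumes the standard spectrum and asymmetry quoted above; the step-function efficiency and its cutoff at x > 0.4 are hypothetical, chosen only to show how suppressing the low-energy positrons raises the observable asymmetry above the ideal-detector value of 1/3.
\begin{verbatim}
import numpy as np

def spectrum(x):
    # Normalized positron energy spectrum, E(x) = 2 x^2 (3 - 2x).
    return 2.0 * x**2 * (3.0 - 2.0 * x)

def asymmetry(x):
    # Energy-dependent decay asymmetry, a(x) = (2x - 1) / (3 - 2x).
    return (2.0 * x - 1.0) / (3.0 - 2.0 * x)

def average_asymmetry(efficiency, n=200_001):
    # <a> = [integral of E(x) a(x) f(x) dx] / [integral of E(x) f(x) dx],
    # evaluated by a simple Riemann sum on [0, 1].
    x = np.linspace(0.0, 1.0, n)
    w = spectrum(x) * efficiency(x)
    return np.sum(w * asymmetry(x)) / np.sum(w)

# Ideal detector: every positron counted, regardless of energy.
print(average_asymmetry(lambda x: np.ones_like(x)))           # ~ 0.333
# Hypothetical hard cutoff: positrons below x = 0.4 are never counted.
print(average_asymmetry(lambda x: (x > 0.4).astype(float)))   # ~ 0.39
\end{verbatim}
In practice f(x) is set by detector geometry, intervening material and the applied field, which is precisely why Ao must be calibrated empirically rather than computed.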
It is perhaps unfortunate that a is traditionally known as the ``asymmetry'' (rather than, say, the ``anisotropy''), since this term does not connote polarization (of a spin ensemble) to most people in the magnetic resonance community. However, at this point we are stuck with the term, much as we are stuck with the technical misnomer ``muonium.''