It is here that event generators come to the rescue. In an event generator, the objective striven for is to use computers to generate events as detailed as could be observed by a perfect detector. This is not done in one step, but rather by `factorizing' the full problem into a number of components, each of which can be handled reasonably accurately. Basically, this means that the hard process is used as input to generate bremsstrahlung corrections, and that the result of this exercise is thereafter left to hadronize. This sounds a bit easier than it really is -- else this report would be a lot thinner. However, the basic idea is there: if the full problem is too complicated to be solved in one go, it may be possible to subdivide it into smaller tasks of more manageable proportions. In the actual generation procedure, most steps therefore involve the branching of one object into two, or at least into a very small number, with the daughters free to branch in their turn. A lot of book-keeping is involved, but much is of a repetitive nature, and can therefore be left for the computer to handle.
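The branching picture described above can be caricatured in a few lines of code. In the toy sketch below, an object either stops radiating or splits its energy between two daughters, which branch in their turn; the constant stopping probability and the uniform energy-sharing rule are arbitrary illustrative choices, not any real generator's splitting kernel.

```python
import random

def branch(energy, cutoff=1.0):
    """Toy recursive branching: an object either stops (below cutoff,
    or by chance) or splits its energy between two daughters, which
    branch in their turn. The 0.5 stopping probability and the uniform
    energy sharing are illustrative choices, not a physical kernel."""
    if energy < cutoff or random.random() < 0.5:
        return [energy]            # no further branching: a final object
    z = random.uniform(0.1, 0.9)   # energy fraction taken by one daughter
    return branch(z * energy, cutoff) + branch((1.0 - z) * energy, cutoff)

random.seed(1)
final_state = branch(100.0)        # list of final-object energies
```

Note that the book-keeping the text alludes to is handled implicitly by the recursion: energy is conserved at every splitting, so the final-state energies always sum back to the initial 100 units.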

As the name indicates, the output of an event generator should be in the form of `events', with the same average behaviour and the same fluctuations as real data. In the data, fluctuations arise from the quantum mechanics of the underlying theory. In generators, Monte Carlo techniques are used to select all relevant variables according to the desired probability distributions, and thereby ensure (quasi-)randomness in the final events. Clearly some loss of information is entailed: quantum mechanics is based on amplitudes, not probabilities. However, only very rarely do (known) interference phenomena appear that cannot be cast in a probabilistic language. This is therefore not a more restraining approximation than many others.
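One standard Monte Carlo technique for selecting a variable according to a desired probability distribution is inverse-transform sampling: draw a uniform random number and solve the cumulative distribution for the variable. A minimal sketch, using an exponential decay-time distribution with an assumed mean lifetime `tau` (the distribution and parameter are chosen purely for illustration):

```python
import math
import random

def sample_exponential(tau):
    """Inverse-transform sampling: solve F(t) = u for t, where
    F(t) = 1 - exp(-t/tau) is the desired cumulative distribution.
    Any distribution with an invertible CDF can be sampled this way."""
    u = random.random()            # uniform in [0, 1)
    return -tau * math.log(1.0 - u)

random.seed(2)
tau = 3.0
times = [sample_exponential(tau) for _ in range(100_000)]
mean = sum(times) / len(times)    # fluctuates around tau
```

Each call gives an independent `event'; the sample mean reproduces the true mean only on average, with statistical fluctuations that shrink as the number of generated events grows.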

Once there, an event generator can be used in many different ways. The five main applications are probably the following:

- To give physicists a feeling for the kind of events one may expect/hope to find, and at what rates.
- As a help in the planning of a new detector, so that detector performance is optimized, within other constraints, for the study of interesting physics scenarios.
- As a tool for devising the analysis strategies that should be used on real data, so that signal-to-background conditions are optimized.
- As a method for estimating detector acceptance corrections that have to be applied to raw data, in order to extract the `true' physics signal.
- As a convenient framework within which to interpret the observed phenomena in terms of a more fundamental underlying theory (usually the Standard Model).

Where does a generator fit into the overall analysis chain of an experiment? In `real life', the machine produces interactions. These events are observed by detectors, and the interesting ones are written to tape by the data acquisition system. Afterward the events may be reconstructed, i.e. the electronics signals (from wire chambers, calorimeters, and all the rest) may be translated into a deduced setup of charged tracks or neutral energy depositions, in the best of worlds with full knowledge of momenta and particle species. Based on this cleaned-up information, one may proceed with the physics analysis. In the Monte Carlo world, the rôle of the machine, namely to produce events, is taken by the event generators described in this report. The behaviour of the detectors -- how particles produced by the event generator traverse the detector, spiral in magnetic fields, shower in calorimeters, or sneak out through cracks, etc. -- is simulated in programs such as GEANT [Bru89]. Be warned that this latter activity is sometimes called event simulation, which is somewhat unfortunate since the same words could equally well be applied to what, here, we call event generation. A more appropriate term is detector simulation. Ideally, the output of this simulation has exactly the same format as the real data recorded by the detector, and can therefore be put through the same event reconstruction and physics analysis chain, except that here we know what the `right answer' should be, and so can see how well we are doing.

Since the full chain of detector simulation and event reconstruction is very time-consuming, one often does `quick and dirty' studies in which these steps are skipped entirely, or at least replaced by very simplified procedures which only take into account the geometric acceptance of the detector and other trivial effects. One may then use the output of the event generator directly in the physics studies.
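Such a `quick and dirty' study might look like the following sketch, in which generator-level particles are simply passed through a polar-angle and transverse-momentum cut; the cut values and the toy particle spectrum are made-up illustrative numbers, not those of any actual detector.

```python
import math
import random

def accepted(theta, pt, theta_min=0.35, pt_min=0.5):
    """Hypothetical geometric-acceptance model: keep a particle only if
    its polar angle is away from the beam pipe and its transverse
    momentum exceeds a tracking threshold. Cut values are illustrative."""
    return theta_min < theta < math.pi - theta_min and pt > pt_min

random.seed(3)
# toy generated particles: isotropic polar angle, falling pT spectrum
events = [(math.acos(random.uniform(-1.0, 1.0)), random.expovariate(1.0))
          for _ in range(10_000)]
n_seen = sum(accepted(th, pt) for th, pt in events)
acceptance = n_seen / len(events)   # fraction surviving the cuts
```

The ratio `acceptance` is exactly the kind of correction factor that, in a real analysis, would be applied to raw data to recover the underlying physics rate.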

There are still many holes in our understanding of the full event structure, despite an impressive amount of work and detailed calculations. To put together a generator therefore involves making choices about what to include, and how to include it. At best, the spread between generators can be used to give some impression of the uncertainties involved. A multitude of approximations will be discussed in the main part of this report, but already here it should be noted that many major approximations are related to the almost complete neglect of non-`trivial' higher-order effects, as already mentioned. It can therefore only be hoped that the `trivial' higher-order parts give the bulk of the experimental behaviour. By and large, this seems to be the case; for e+e- annihilation it even turns out to be a very good approximation.

The necessity to make compromises has one major implication: to write a good event generator is an art, not an exact science. It is therefore essential not to blindly trust the results of any single event generator, but always to make several cross-checks. In addition, with computer programs of tens of thousands of lines, the question is not whether bugs exist, but how many there are, and how critical their positions. Further, an event generator cannot be thought of as all-powerful, or able to give intelligent answers to ill-posed questions; sound judgement and some understanding of a generator are necessary prerequisites for successful use. In spite of these limitations, the event-generator approach is the most powerful tool at our disposal if we wish to gain a detailed and realistic understanding of physics at current or future high-energy colliders.