Abstract:
A model for concentration fluctuation moments has previously been developed for a scalar dispersing in a turbulent flow (see Mole et al., 1997). This model assumes that the mean concentration is known as a function of space and of time $t$, and that the higher moments of concentration for a dispersing cloud are fully determined by two further parameters $\alpha(t)$ and $\beta(t)$ (see Chatwin and Sullivan, 1990a). A closure is used which enables a coupled pair of first-order differential equations for $\alpha$ and $\beta$ to be written down. Here attention is restricted to cases in which the mean concentration is self-similar, with a spatial scale $L(t)$. (It is assumed that $L^{-1}\,\mathrm{d}L/\mathrm{d}t \to 0$ as $t \to \infty$, so cloud growth must be slower than exponential.) It is shown that there is a constant $\alpha_s$, dependent on the spatial form of the mean concentration, such that $\alpha \to \alpha_s$ as $t \to \infty$ when $\alpha_s > 0$, and $\alpha \to \infty$ when $\alpha_s < 0$ (with $\beta \to 0$ in all cases). In the former case the asymptotic analysis shows that $\alpha - \alpha_s \propto (L^{-1}\,\mathrm{d}L/\mathrm{d}t)^{1/2}$ and $\beta \propto (L^{-1}\,\mathrm{d}L/\mathrm{d}t)^{1/2}$. In the latter case it shows that $\alpha \propto (L^{-1}\,\mathrm{d}L/\mathrm{d}t)^{-1}$ and $\beta \propto L^{-1}\,\mathrm{d}L/\mathrm{d}t$. These results are supported by numerical solutions for a variety of cases. Some of the corresponding results for the concentration moments are compared with experimental measurements for line sources in wind tunnels.
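For quick reference, the two asymptotic regimes quoted above can be collected in display form. This is only a restatement of the abstract's results; the shorthand $\gamma(t) = L^{-1}\,\mathrm{d}L/\mathrm{d}t$ is introduced here for compactness and is not notation taken from the paper.
\[
\alpha_s > 0:\quad \alpha - \alpha_s \propto \gamma^{1/2}, \qquad \beta \propto \gamma^{1/2};
\]
\[
\alpha_s < 0:\quad \alpha \propto \gamma^{-1}, \qquad \beta \propto \gamma;
\]
\[
\text{with } \gamma(t) = L^{-1}\,\frac{\mathrm{d}L}{\mathrm{d}t} \to 0 \ \text{ and } \ \beta \to 0 \ \text{ as } t \to \infty.
\]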