This was my PhD project.
Particle tracking in three dimensions is based on stereometric identification of particle tracks. Until now particle tracking has primarily been used for two-dimensional systems, but a system for particle tracking in three dimensions is being built at Risø National Laboratory. With this system it will be possible to study turbulent flow and turbulent diffusion under controlled conditions. Since particle tracking makes it possible to determine the statistical properties of a turbulent system very accurately, it can become an invaluable touchstone for basic elements in the models that are being developed to solve practical problems regarding the spreading of matter in the atmosphere. Among users of risk analyses where spreading of matter in the atmosphere is a central element, there is a growing need to know the uncertainties of, for instance, estimated safety distances. To meet this demand it is necessary to continue the development and testing of theories that view turbulence as a stochastic phenomenon. The particle tracking setup at Risø National Laboratory will be an important tool for testing these theories.
Risø report on the project:
Jakob Mann, Søren Ott, and Jacob Sparre Andersen: Experimental Study of Relative, Turbulent Diffusion. Risø National Laboratory, August 1999, ISBN 87-550-2603-6, 75 p.
I visited École Normale Supérieure in Paris from September 1999 to March 2000, to learn about the measurement techniques used in gas flow experiments, and to study passive scalars in turbulence.
My master's thesis presents two experimental studies of relative diffusion. I show how my measurements compare with Kolmogorov's theory for fully developed turbulence and with a later theory of weak turbulence. I present analyses using both the "classic" view, where you look at the spreading of a cluster of particles that are initially very close together, and the Kolmogorov view, where you look at a random collection of particles in the velocity field.
I have presented a general method for optimising algorithms to make the most use of multiprocessor computers. The key point in the method is to map the data flow in the algorithm onto a directed acyclic graph, where the nodes correspond to operations on data, and the edges correspond to the flow of data from one operation to the next. I then use simulated annealing to distribute the nodes on the various processors, such that the estimated execution time is minimised.
The function that estimates the execution time of a distribution of nodes will of course have to be tailored to the target architecture, but the rest of the process is the same for any system.
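The idea can be sketched as follows. This is a minimal illustration, not the actual implementation: the task graph, the per-node costs, and the cost model (per-processor compute load plus a fixed penalty for every edge that crosses a processor boundary) are all invented for the example.

```python
import math
import random

# Hypothetical task graph: node -> (compute cost, list of predecessors).
# Names and costs are purely illustrative.
GRAPH = {
    "load":   (2.0, []),
    "fft":    (5.0, ["load"]),
    "filter": (3.0, ["load"]),
    "merge":  (4.0, ["fft", "filter"]),
}
NUM_PROCS = 2
COMM_COST = 1.0  # assumed fixed cost per cross-processor edge

def estimate_time(assignment):
    """Crude execution-time estimate: the heaviest per-processor compute
    load, plus a penalty for every edge whose endpoints sit on different
    processors.  A real estimator would be tailored to the target
    architecture."""
    load = [0.0] * NUM_PROCS
    comm = 0.0
    for node, (cost, preds) in GRAPH.items():
        load[assignment[node]] += cost
        for pred in preds:
            if assignment[pred] != assignment[node]:
                comm += COMM_COST
    return max(load) + comm

def anneal(steps=5000, t0=10.0, cooling=0.999):
    """Simulated annealing over node-to-processor assignments."""
    assignment = {n: random.randrange(NUM_PROCS) for n in GRAPH}
    best = dict(assignment)
    best_cost = cost = estimate_time(assignment)
    t = t0
    for _ in range(steps):
        node = random.choice(list(GRAPH))          # perturb one node
        old_proc = assignment[node]
        assignment[node] = random.randrange(NUM_PROCS)
        new_cost = estimate_time(assignment)
        # Metropolis criterion: always accept improvements, accept
        # worse moves with probability exp(-delta / t).
        if new_cost <= cost or random.random() < math.exp((cost - new_cost) / t):
            cost = new_cost
            if cost < best_cost:
                best_cost, best = cost, dict(assignment)
        else:
            assignment[node] = old_proc            # undo the move
        t *= cooling                               # cool the temperature
    return best, best_cost
```

Swapping in a different `estimate_time` is all it takes to retarget the search, which is the point made above: only the cost model is architecture-specific.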
I have made some simulations of the evolution of forest fires based on a three-state model with nearest-neighbour interactions. The model has two control parameters, and I have looked at how the final state of the model depends on the values of these control parameters.
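A three-state forest-fire model of this kind can be sketched as a cellular automaton. The rules below follow the common Drossel–Schwabl formulation, and the assumption that the two control parameters are the tree-growth probability `p` and the spontaneous-ignition probability `f` is mine; the original model may differ in its details.

```python
import random

EMPTY, TREE, FIRE = 0, 1, 2

def step(grid, p, f):
    """One synchronous update on a periodic square lattice:
    a burning cell becomes empty; a tree catches fire if any nearest
    neighbour burns, or spontaneously with probability f; an empty
    cell grows a tree with probability p."""
    n = len(grid)
    new = [[EMPTY] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            cell = grid[i][j]
            if cell == FIRE:
                new[i][j] = EMPTY
            elif cell == TREE:
                neighbours = (grid[(i - 1) % n][j], grid[(i + 1) % n][j],
                              grid[i][(j - 1) % n], grid[i][(j + 1) % n])
                if FIRE in neighbours or random.random() < f:
                    new[i][j] = FIRE
                else:
                    new[i][j] = TREE
            elif random.random() < p:
                new[i][j] = TREE
    return new

def simulate(n=32, p=0.05, f=0.001, steps=200):
    """Run the automaton from an empty lattice and return the final state."""
    grid = [[EMPTY] * n for _ in range(n)]
    for _ in range(steps):
        grid = step(grid, p, f)
    return grid
```

Scanning the final state (e.g. the tree density) over a range of `p` and `f` is one way to map out the dependence on the two control parameters.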