# Direct Numerical Simulation and Large Eddy Simulation of Turbulent Pipe and Channel Flows

A substantial fraction of the energy required to push fluids through pipes and channels is dissipated by turbulence in the vicinity of solid walls. As a consequence, the study of wall-bounded turbulence in general, and of pipe flow in particular, is not only of theoretical interest but also of importance for many real-world engineering applications.

While most applied computational fluid dynamics (CFD) approaches, such as Reynolds-averaged Navier-Stokes (RANS) simulations, require turbulence modelling, we study wall turbulence by means of direct numerical simulation (DNS) and large-eddy simulation (LES). In a DNS, all turbulent scales are fully resolved in space and time, whereas in an LES only the large scales are resolved and the so-called sub-grid scales are modelled.

Our work focusses in particular on the characterisation and interaction of turbulent coherent structures that appear in wall-bounded turbulence. Characteristic are streaky structures of different scales; the most energetic of them appear in the buffer layer and scale in viscous units (buffer-layer streaks). Additionally, so-called very-large-scale motions (VLSMs) appear in the outer layer of high-Reynolds-number wall turbulence and interact with the near-wall small-scale structures.

The different scales of turbulent pipe flow at Re_τ = 1500, where Re_τ = u_τ R / ν is the friction Reynolds number based on the friction velocity u_τ, the pipe radius R, and the kinematic viscosity ν, can be seen in Figs. 1 and 2, where iso-contours and iso-volumes of the streamwise turbulent velocity fluctuation are shown.
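The definition of the friction Reynolds number can be made concrete with a short calculation; the numerical values below are purely illustrative (not taken from the simulations described here):

```python
# Illustrative computation of the friction Reynolds number
# Re_tau = u_tau * R / nu. The values of u_tau, R, and nu below
# are assumed for demonstration, not from the study.
u_tau = 0.05   # friction velocity in m/s (assumed)
R = 0.03       # pipe radius in m (assumed)
nu = 1.0e-6    # kinematic viscosity in m^2/s (roughly water)
Re_tau = u_tau * R / nu
print(Re_tau)  # 1500.0
```

Because Re_τ is built from the friction velocity rather than the bulk velocity, it directly measures the scale separation between the pipe radius and the viscous wall units in which the buffer-layer streaks scale.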

To study the interaction between VLSMs and the small viscous scales, Reynolds numbers and computational domains large enough for VLSMs to develop are required. Hence we set up DNSs with friction Reynolds numbers up to Re_τ = 2880 and computational domains up to 42R in length. Since these simulations involve up to 32 billion grid points and thus require large computational power and memory, they are carried out on high-performance computing (HPC) clusters. Our simulations are partly performed on the DLR clusters SCART and CARA, as well as on the LRZ cluster SuperMUC (project id pr62zu).
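A back-of-envelope estimate illustrates why such simulations need HPC resources. Assuming one snapshot stores three velocity components plus pressure in double precision (the field count and precision are assumptions for illustration, not stated above):

```python
# Back-of-envelope memory estimate for one snapshot of a DNS with
# 32 billion grid points. Field count (u, v, w, p) and double
# precision are assumed for illustration.
n_points = 32e9          # grid points
n_fields = 4             # three velocity components plus pressure
bytes_per_value = 8      # double precision (64 bit)
snapshot_bytes = n_points * n_fields * bytes_per_value
print(snapshot_bytes / 1e12)  # 1.024, i.e. about 1 TB per snapshot
```

A single snapshot on the order of a terabyte makes clear that both the time integration and the storage of flow fields for statistics exceed the capacity of ordinary workstations.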

From our simulations we obtain velocity and pressure field snapshots (as shown in Figs. 1 and 2) as well as statistical quantities such as the two-point velocity correlation shown in Fig. 3.
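For periodic directions such as the pipe axis, the two-point autocorrelation can be computed efficiently via the FFT. A minimal sketch on a synthetic one-dimensional signal (standing in for a line of streamwise velocity fluctuations from a snapshot):

```python
import numpy as np

# Minimal sketch: streamwise two-point autocorrelation R(r) of a
# velocity fluctuation signal u'(x) on a periodic domain. The signal
# below is synthetic and only stands in for DNS data.
n = 256
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
u = np.sin(3.0 * x) + 0.1 * np.random.default_rng(0).standard_normal(n)
u -= u.mean()  # fluctuation: subtract the mean

# For periodic data the autocorrelation follows from the power
# spectrum (Wiener-Khinchin theorem).
spec = np.abs(np.fft.fft(u)) ** 2
corr = np.fft.ifft(spec).real / n
corr /= corr[0]  # normalise so that R(0) = 1
print(corr[0])   # 1.0
```

The separation at which the correlation first decays (or the extent of elevated correlation at large separations) is what reveals the streamwise length of coherent structures such as VLSMs.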

Currently we are investigating the exchange of kinetic energy between the different turbulent scales by evaluating the transport equation of turbulent kinetic energy for filtered velocity fields.
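The first step of such an analysis is separating the velocity field into resolved (large-scale) and sub-filter (small-scale) parts. A minimal sketch, using a sharp spectral low-pass filter on a synthetic one-dimensional signal (the filter type and cutoff are assumptions for illustration):

```python
import numpy as np

# Minimal sketch of scale separation by a sharp spectral low-pass
# filter, the kind of decomposition underlying a kinetic-energy
# budget for filtered velocity fields. The 1-D signal is synthetic.
n = 512
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
u = np.sin(2.0 * x) + 0.3 * np.sin(40.0 * x)  # large + small scale

k_cut = 10  # cutoff wavenumber separating "large" from "small" (assumed)
u_hat = np.fft.rfft(u)
u_hat[k_cut + 1:] = 0.0            # discard wavenumbers above the cutoff
u_large = np.fft.irfft(u_hat, n)   # filtered (large-scale) field
u_small = u - u_large              # sub-filter (small-scale) residual

# With a sharp Fourier filter the two parts are orthogonal, so the
# kinetic energy splits exactly into resolved and sub-filter shares.
e_total = np.mean(u ** 2)
e_large = np.mean(u_large ** 2)
e_small = np.mean(u_small ** 2)
print(abs(e_total - (e_large + e_small)) < 1e-12)  # True
```

Evaluating the transport-equation terms for `u_large` as a function of the cutoff then quantifies how much kinetic energy is exchanged across that scale, e.g. between VLSMs and the near-wall structures.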

We gratefully acknowledge the Gauss Centre for Supercomputing e.V. (www.gauss-centre.eu) for partly funding our project under grant pr62zu by providing computing time on the GCS supercomputer SuperMUC at the Leibniz Supercomputing Centre (www.lrz.de).

Selected publications:

C. Bauer, D. Feldmann, and C. Wagner, On the convergence and scaling of high-order statistical moments in turbulent pipe flow using direct numerical simulations, *Physics of Fluids*, 29, 125105 (2017)

D. Feldmann, C. Bauer, and C. Wagner, Computational domain length and Reynolds number effects on large-scale coherent motions in turbulent pipe flow, *Journal of Turbulence*, 19(3), 274--295 (2018)

C. Bauer, A. von Kameke, and C. Wagner, Kinetic energy budget of the largest scales in turbulent pipe flow, *Physical Review Fluids*, 4, 064607 (2019)

# Contact

Christian Bauer

German Aerospace Center (DLR)

Institute of Aerodynamics and Flow Technology, Department Ground Vehicles

Göttingen

Phone: +49 551 709-2132