mgaedtke
Forum Replies Created
mgaedtke (Keymaster)
Hi mengqiang,
you should use the reference density, which is in most cases given by the density at mean temperature.
Best,
Max

July 5, 2021 at 9:35 pm in reply to: Boundary conditions in coupling of Navier-Stokes and thermal advection-diffusion #5805
mgaedtke (Keymaster)

Hi mengqiang,
the setLocal*Boundary functions are indeed using the regularized boundary formulations. For additional stability, I would suggest using the interpolated versions of those.
For pressure outlets, I recommend using the local convection boundary for the corresponding temperature field. Make sure to execute the coupling operation on those material numbers as well.
Also check that your relaxation time and lattice velocity are in the stable regime. Timm Krüger’s book on LBM has a chapter on how to choose these parameters consistently.
Best regards,
Max

mgaedtke (Keymaster)

Hi Alex,
It sounds reasonable that 3D simulations are more prone to instability. You have more degrees of freedom and more calculations that could potentially introduce numerical errors. Also, do not forget that turbulence in 2D is a completely different phenomenon than in 3D!
Above Ra >= 1e7, these simulations transition to turbulent natural convection and eddies will arise. Are you using a turbulence model for your simulations, or do you want to resolve all structures? If no turbulence model is applied, check your resolution: it should be high enough to resolve the boundary layer that develops close to the heated and cooled walls. You can find some more information in, e.g., https://doi.org/10.1016/j.camwa.2018.08.018.
Best,
Max

mgaedtke (Keymaster)

Hi Ramiro,
great to hear that you had success in setting up the total enthalpy method, at least to some extent.
For adiabatic outlet boundaries, OpenLB provides a special convection boundary type, which you can set via setAdvectionDiffusionConvectionBoundary. For the other, stationary adiabatic boundaries, bounce back is what I used in my simulations, too.

Best regards,
Max

mgaedtke (Keymaster)

Hi Tamas,
this looks a lot like a resolution-dependent problem. Keep in mind that those are only warnings (the solver falls back to a regular bounce-back boundary for these cells); the simulation will still run if the Mach number is realistic.
You can either try to find a compromise resolution for which the mesh still fits into your RAM, or run this case on a cluster computer. For 700 million cells I recommend at least 100 cores; 2000 would be better.
Best,
Max

mgaedtke (Keymaster)

Hi Andreas,
I’m not an expert in weather or atmospheric modeling, but I find the overview in https://core.ac.uk/download/pdf/207513141.pdf very comprehensive. See, for example, the work of Hess et al. in the references therein (https://gmd.copernicus.org/articles/3/415/2010/gmd-3-415-2010.pdf), where a weather model based on LES is proposed.
Cheers,
Max

mgaedtke (Keymaster)

Hi Bhuttu,
it’s the number of cuboids into which we decompose the domain for parallel execution. To get a deeper insight into OpenLB, I can recommend attending the upcoming Spring School: https://www.openlb.net/spring-school-2021/
Best,
Max

mgaedtke (Keymaster)

Hi Bhuttu,
this is just a warning from the gnuplot interface and will be fixed in the upcoming release.
Thanks for mentioning it!
Best Regards,
Max

mgaedtke (Keymaster)

Hi Bhuttu,
Good that the simulation is still working. This is just a warning from the gnuplot interface and will be fixed in the upcoming release.
Thanks for mentioning it!
Best Regards,
Max

August 10, 2020 at 9:02 am in reply to: Density calculation error in Sourced AD BGK dynamics computeRhoU #5089
mgaedtke (Keymaster)

Hi Julius,
thank you very much for identifying this. You are absolutely right. I changed our development version accordingly and the model will be fixed in the upcoming release.
Kind Regards,
Max

mgaedtke (Keymaster)

Hey Julius,
I can’t reproduce your errors. Please make sure you understand the return values of our cell objects: the distribution functions we store in the data structure are actually f* = f - w_i, so that the sum of the f* equals rho - 1, because the weights sum to 1. We apply the same shift to all the equilibrium distributions.
See Section V in https://journals.aps.org/pre/abstract/10.1103/PhysRevE.48.4823 for some more information on this topic.
Cheers,
Max

mgaedtke (Keymaster)

Hi Sergey,
using a constant relaxation time while changing the resolution results in changing the lattice velocity as well. These parameters are not independent. However, the Reynolds number is 20 for both cases. So, you should see similar simulation results for both cases. Keep in mind that changing the discretization will also change corresponding discretization errors.
For a good write-up of the dimensionalization of the lattice Boltzmann equation, have a look at chapter 7 of the book by Krüger et al. (https://www.springer.com/de/book/9783319446479).
Best,
Max

mgaedtke (Keymaster)

Hello Bhuttu,
in theory, this should be possible by making the model constant G a function of the local temperature. I don’t know of anyone who has tried this before, although it sounds quite interesting. Feel free to report back on your findings. Looking forward to it!
Best regards,
Max

mgaedtke (Keymaster)

Hi iJoker,
I can only recommend having a look at the textbook by Krüger et al.: https://www.springer.com/de/book/9783319446479. Chapter 7 covers the dimensionalization of the lattice Boltzmann equation and how to choose stable discretization parameters.
Best,
Max

mgaedtke (Keymaster)

Hi iJoker,
I just sent you an email.
Best,
Max