Update 3/30
MHD
Test started recently. Will do some visualization checks soon, but appears to be running (slowly, but well). Will have to stop to finish the charge exchange postprocessing.
Charge Exchange
Low wind finished. Here are movies of both (missing last few frames of low wind):
High Wind
Low Wind
Spring Code
Converted Alice's code to C++ to make some of the math significantly easier to express. Still working on sorting and refactoring the majority of the code. There also appear to be dozens of unused routines - need to check on those.
Misc
Contacted by Research Outreach, a paid magazine publication service. I'm assuming we're not interested, but wanted to make sure.
Updates on High Resistivity case and Equivalent Resistivity
High Resistivity (Moon-like)
The prior runs of this case (back in October '19) did not have the proper absorbing boundary conditions, and the density of the planet was not being used. That case was running quite fast.
Now, the density of the moon is kept low, as it is supposed to be an excellent absorber of plasma. The fields pass through quickly and try to pull material out (of which there is not much), leading to smaller timesteps.
Here is the simulation with a higher density wind and moon:
Another one with present day solar wind (and moon with half the density):
We also found an unintended line in the boundary condition function, so fixing that could have helped as well.
Calculating Effective Resistivity
To compare the effective resistivity of different models, we are doing a weighted average of the conductivity with the current (suggested by Prof. Pierre Gourdain). That is: (image to be added later; LaTeX makes the blog crash.)
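In lieu of the image, the current-weighted average described above can be written as follows (my reconstruction from the verbal description; the exact weighting should be checked against the intended figure):

```latex
\sigma_{\mathrm{eff}}
  = \frac{\int \sigma \,\lvert \mathbf{J} \rvert \, dV}{\int \lvert \mathbf{J} \rvert \, dV},
\qquad
\eta_{\mathrm{eff}} = \frac{1}{\sigma_{\mathrm{eff}}}
```

That is, the conductivity averaged with the current magnitude as the weight, with the effective resistivity following as its reciprocal.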
Convection project update 03-30-2020
MESA profile
- Used MESA EOS to "regenerate" MESA Profile
- Profile on EOS parameter space:
There seems to be an inconsistency between the profile and the EOS. Perhaps this results from the way MESA interpolates the EOS.
astrobear
- Modified EOS.f90 slightly to calculate the X, Y, and Z used in the EOS. AstroBEAR uses atomic number fractions whereas MESA uses mass fractions.
- There is still a problem with running the code. Not sure how to solve it.
Update 3/23
MHD
Diffusion coefficient still appears too high. However, when set at what I think it should be, I run into min timestep issues (Atma appears to be having a similar issue). I first believed this to be related to the addition of a DIFF_ALPHA2 term for the maxspeed:
maxspeed(Info%level) = max(maxsolverspeed(Info%level), max(maxspeed(Info%level), 10d0*DIFF_ALPHA2/levels(Info%level)%dx))
But looking again: the lower DIFF_ALPHA2 is, the lower that term should be, so I don't believe that's the problem.
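For reference, this reasoning can be checked against the standard explicit-diffusion stability limit: the diffusive timestep scales as dx²/α, which is equivalent to feeding an effective "speed" of order α/dx into the usual CFL machinery, as the maxspeed line does. A quick sketch with generic numbers (not our actual run parameters):

```python
def diffusive_speed(alpha, dx, safety=10.0):
    """Effective signal speed a diffusion term contributes to the CFL limit:
    dt ~ dx^2 / alpha  <=>  speed ~ alpha / dx (times a safety factor,
    mirroring the 10d0*DIFF_ALPHA2/dx term in maxspeed)."""
    return safety * alpha / dx

def cfl_dt(speed, dx, cfl=0.5):
    """Timestep allowed by a signal of the given speed."""
    return cfl * dx / speed

dx = 0.01
for alpha in (0.1, 0.01, 0.001):
    dt = cfl_dt(diffusive_speed(alpha, dx), dx)
    print(f"alpha={alpha:g}  dt={dt:.3e}")
```

Lowering DIFF_ALPHA2 raises the allowed dt, consistent with the conclusion that the maxspeed term is not what drives the min-timestep crashes.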
Charge Exchange
Low-wind run finished over the weekend. Working on the postprocessing now, along with the new separate-temperature postprocessing for both.
COMMON ENVELOPE SIMULATIONS
Papers relevant for the dependence of CE outcomes on initial separation and the transition from the RLOF phase to the CE phase
Iaconi, Reichardt, Staff, De Marco, Passy, Price, Wurster & Herwig 2017
MacLeod, Ostriker & Stone 2018a
MacLeod, Ostriker & Stone 2018b
Reichardt, De Marco, Iaconi, Tout & Price 2019
MacLeod, Vick, Lai & Stone 2019
MacLeod & Loeb 2019
MacLeod & Loeb 2020
Goals
- Achieve a numerically stable simulation with separation equal to Roche Limit separation
- Calculated to be 108 Rsun for our fiducial RGB primary (radius 48.1 Rsun, M_1=2 Msun) and secondary (M_2=1 Msun).
- It would be fine to gradually increase the separation in successive runs, thus working our way out to 108 Rsun, if necessary.
- Low ambient density
- But ambient pressure needs to be high to cut off stellar pressure profile and thus avoid inability to resolve pressure scale height
- Hence the usual problem: by making the density low, the temperature gets high and the timestep gets very small, making the simulation slow.
- Is there a way around this problem?
- One possibility is to just be patient!
- Another is to include a hydrostatic atmosphere (typically with some constant temperature—see MacLeod+18a, but they use a spherical mesh)
- This atmosphere cannot extend all the way to the mesh boundary because it produces gradients at the corners that crash the code (results from 3 years ago)
- Can try making the atmosphere transition to a uniform ambient medium at some radius (See Run 199, below)
- Rotation of the primary
- The Roche analysis is technically only valid for stars that are in corotation with the orbital angular velocity
- I have not yet tried to give the star an initial rotation (an important step in the near future)
- Part of the reason the primary star distorts in some of the simulations below may be physical and related to the results of MacLeod+19a
- Stable primary core
- We have seen with recent models (AGB run) that this can be achieved by putting higher resolution at the core
- Larger simulation domain
- The strategy I am using is to double the mesh size while reducing the base resolution by a factor of two (so 512³ would stay as 512³), and increasing maxlevel by 1. (Alternatively, one could keep the base resolution the same (512³ becomes 1024³) and keep maxlevel the same, which would result in ~8 times more base cells but one less AMR level.)
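The strategy for goal 5 can be sanity-checked by comparing the finest effective cell size under both options (a toy calculation with illustrative box sizes and level counts, not our actual setup):

```python
def finest_dx(box_size, base_cells, maxlevel):
    """Finest cell size on an AMR mesh: each refinement level halves dx."""
    return box_size / (base_cells * 2**maxlevel)

# Original setup: box L, 512^3 base, maxlevel 4 (numbers are illustrative).
L, base, maxlev = 1.0, 512, 4
dx0 = finest_dx(L, base, maxlev)

# Strategy A: double the box, keep 512^3 (base resolution halves), maxlevel+1.
dxA = finest_dx(2 * L, base, maxlev + 1)

# Strategy B: double the box, keep the base resolution (1024^3), same maxlevel:
# ~8x more base cells, but one less AMR level of refinement above the base.
dxB = finest_dx(2 * L, 2 * base, maxlev)

print(dx0, dxA, dxB)  # all three are equal: the finest resolution is preserved
```

Both options preserve the finest resolution; they only trade where the cells live (base grid vs. AMR hierarchy).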
Comments
- Goal 1 is the main goal.
- Goal 2 is desirable but not essential in the short term.
- Goal 3 is also desirable but not essential in the short term, though it may make achieving goal 1 easier (see below).
- Goal 4 we know how to achieve at a high enough fidelity for the time being.
- Goal 5 is also desirable but not essential in the short term.
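As a cross-check on the number in goal 1: the ~108 Rsun Roche-limit separation can be reproduced by setting the Roche-lobe radius equal to the primary's radius in the Eggleton (1983) fitting formula (a quick sketch; the exact prescription used for our setup may differ slightly):

```python
import math

def eggleton_rl_over_a(q):
    """Roche-lobe radius / orbital separation (Eggleton 1983 fit),
    with q = M_donor / M_accretor."""
    q23 = q ** (2.0 / 3.0)
    return 0.49 * q23 / (0.6 * q23 + math.log(1.0 + q ** (1.0 / 3.0)))

R1 = 48.1        # fiducial RGB primary radius [Rsun]
q = 2.0 / 1.0    # M_1 / M_2
a_roche = R1 / eggleton_rl_over_a(q)
print(f"Roche-limit separation ~ {a_roche:.0f} Rsun")
```

This gives ~109 Rsun, consistent with the quoted 108 Rsun to within the accuracy of the fit.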
RLOF+CEE Simulations
- Continuation of old runs (only the last chombo of each had been kept!)
- Run 195: Separation of 1.5 times old separation, restarted from frame 50 of Run 161 with higher density thresholds for refinement and one extra refinement level for very high density—> completed up to frame 201
- Run 201: Roche-limit separation (=2.2 times old separation), restarted from frame 53 of Run 160 with higher density thresholds for refinement and one extra refinement level for very high density—> completed up to frame 251
- Run 202: Roche-limit separation (=2.2 times old separation) with low density ambient (67 times lower than fiducial), restarted from frame 27 of Run 162 with higher density thresholds for refinement and one extra refinement level for very high density—> completed up to frame 57
- New runs
- Run 199: Setup with a smoothly matched hydrostatic atmosphere transitioning to a constant ambient at larger radius, with density thresholds for refinement. >>Crashes ~immediately due to high pressure gradients just outside the primary. Rerunning now with AstroBEAR default refinement —> got an insufficient memory error
- Run 196: Setup with low constant ambient, refined on density; eventually slows down a lot. Restarted from frame 29 with AstroBEAR default refinement. Runs fast, but chombo sizes are HUGE since much more volume is refined. (First I tried qTolerance 0.2, then later changed to 0.4 to reduce file size; DesiredFillRatios set to 0.7.) Ran but then crashed on a restart when reading the chombo (not sure why). Restarted from frame 33 with qTolerance 0.2 and ran up to frame 40. After that, slurm and chombo output stopped, presumably because of insufficient memory (in the past this symptom was found to be caused by insufficient memory); chombo file sizes were >130 GB and increasing.
Run 195
Movies of gas density slices
- Face-on, simulation coordinates, frames 0-40 (old movie of Run 161 from ~two years ago)
- Face-on, simulation coordinates, frames 50-201
- Face-on, simulation coordinates, frames 50-201, zoomed in
Snapshots of Pressure and Temperature
- Pressure frame 50
- Temperature frame 50
- Pressure frame 70
- Temperature frame 70
- Pressure frame 90
- Temperature frame 90
- Pressure frame 140
- Temperature frame 140
- Pressure frame 201
- Temperature frame 201
Snapshots showing mesh
- mesh frame 50 -- for frames 0-50 used more conservative density-based refinement
- mesh frame 51 -- for frames 51-201 used less conservative density-based refinement with additional AMR level for extra resolution near particles
- mesh frame 77
- mesh frame 50, zoomed in
- mesh frame 51, zoomed in
- mesh frame 77, zoomed in
Separation vs time
Run 201
Movies of gas density slices
- Face-on, simulation coordinates, frames 0-251 (frames 0-53 are from ~2 years ago and frames 54-251 are new)
- Face-on, simulation coordinates, frames 53-170, now showing mesh
Snapshots with and without mesh
- Frame 53 with mesh
- Frame 53 without mesh
- Frame 53 pressure
- Frame 53 temperature
- Frame 54 pressure
- Frame 54 temperature
- Frame 116 pressure
- Frame 116 temperature
Run 202
Movies of gas density slices
Snapshots
- Frame 27 Density
- Frame 27 Density extended colorbar
- Frame 27 Pressure
- Frame 27 Temperature
- Frame 27 Temperature with mesh
- Frame 28 Density
- Frame 28 Density extended colorbar
- Frame 28 Pressure
- Frame 28 Temperature
- Frame 28 Temperature with mesh
- Frame 57 Density
- Frame 57 Density extended colorbar
- Frame 57 Pressure
- Frame 57 Temperature
- Frame 57 Temperature with mesh
Run 196
Snapshots
- temperature frame 25
- temperature frame 25, with mesh
- density frame 29
- density frame 29, with mesh
- temperature frame 29
- temperature frame 29, zoomed colorbar
- density frame 40
- density frame 40, with mesh
- pressure frame 40
- temperature frame 40
- temperature frame 40, with mesh
Comparing setup to that of Run 195
- Recall differences in setup from Run 195:
- simulation box twice as big as Run 195's (and base resolution coarser by a factor of two)
- extra AMR level not present in Run 195 for refinement around particles
- start out with a higher density threshold for the bulk of the envelope, so not as much of the ambient is refined compared to Run 195; but after frame 29, more of the ambient is refined than before frame 29 (switched from density-threshold to the default gradient-based refinement)
- ambient density is 6.7 times lower than for Run 195 (and 6.7 times lower than at the stellar surface)
Results and Interpretation
- Less stable at stellar surface than Run 195.
- Is this partly caused by density jump between surface and ambient medium?
- But surface is also less stable than Run 202, where ambient density was 10 times lower still, so this is probably not the reason.
- Is it related to refining less of the ambient at the beginning of the simulation than for Run 195?
- From the snapshots, it does seem like numerical instabilities form where the mesh transitions between AMR levels and propagate inward, "squeezing" the star along the rectangular mesh
- It seems like more refinement of the ambient medium helps to avert this problem
- This is consistent with what MacLeod found: lower base resolution exacerbates this effect, also seen in his simulations (private communication)
Conclusions
- Tentatively, it seems necessary to refine very conservatively such that much of the surrounding ambient is refined at the beginning of the simulation before the dynamical plunge-in phase.
- When the dynamical plunge-in phase begins, this can be relaxed and the refinement can be concentrated on the stellar material only.
- Fundamental oscillation modes can be excited when the primary does not rotate synchronously with the orbit at the Roche limit separation (MacLeod+19). This instability is likely being seen in our Runs 201 and 202. Initializing the primary in synchronous rotation might drastically improve stability.
- Since the numerical instability occurs at the stellar surface, it would probably also be beneficial to increase the resolution there, if possible
Run 199
Snapshots (screenshots for debugging)
- density frame 0, with mesh
- density frame 0, with mesh, extended range
- pressure frame 0, with mesh, extended range
- pressure frame 0, with mesh, extended range, zoomed in
- temperature frame 0
- density frame 1, extended range
- pressure frame 1
- pressure frame 1, small range
- pressure frame 1, small range, zoomed in
- temperature frame 1
PN paper revision
List of new runs
- Run like model A, but with rsoft=5, rwind=4 (production runs have rsoft=5, rwind=6). Stopped at frame 32; too slow. mintracer=1d-7.
- Another run with rsoft = rwind = 5 to capture the central mass. Also, set mintracer=.1 for quiescent period.
- Run like model A, but without the quiescent phase; then compare the momentum analysis for both. — queued on bluehive.
- Run like model A, but with larger refinement on level 7, restarting from frame 149. — waiting for a reservation on bluehive.
Major issues
Issue 1.
- In section 2, paragraph 3, explain the SPH-AMR mapping and whether it is conservative or not. Basically averaging and interpolating by standard SPH practice. Add input from Thomas.
Issue 2.
- Modify section 2 to discuss the choice of momentum parameters; explain that the momentum was chosen to be in line with Bujarrabal.
- Discuss how sensitive the results are to the choice of injection radius.
— New run 1, but it had an issue with central high-density cells.
— Also compare the outflow starting time: New run 2.
Issue 3.
- max refinement level was never reduced during the simulation.
- The boxy structure is always leading ahead of the central cylinder of refinement.
- New run 3, doubling refinement radius on level 7.
Issue 4.
- Clearer description of the 'tracer' in AstroBEAR.
The referee was confused by the tracer, taking it for added material, but it's just a label, or a 'dye'.
Issue 5.
More discussion on the appendix in the main text. The horizontally launched jets enhance the shaping argument.
Issue 6.
- Add input from Bruce.
- Produce density projection figures.
Minor issues
- In table 2, added a line to indicate model names (A, B, C, D) for the corresponding run parameters.
- Fixed typos throughout the paper.
-TBD-
- Replace the text labels in figures with larger font.
- Redo figure 6. Experiment with other color tables and vector properties; it is currently very hard to interpret. Try using arrows of variable length on a uniform mesh of points.
Dust in AstroBEAR - Update 2020/03/16
Comment:
I wasn't able to make blog posts for the last couple of weeks so this is a summary of the progress made since then.
Objectives:
- Finish active advection routines
- Extend interpolation options to dust
- Continue dust processing routines
Progress:
Interpolation
- I've extended the interpolation scheme to include dust. So PCM, PLM, and PPM are now also available for the dust.
Dust Advection
- I reduced the Cloud Crushing setup to just a clump moving through gas, as I wasn't sure whether the issue was with the boundaries, the wind, the fluxes, etc.
- Found out that way that the x- and y-fluxes kept getting messed up → fixed in F_prim, and added an HLL solver for the dust to both the HLL and HLLC gas solvers.
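For the record, the dust HLL solver presumably follows the standard HLL flux, with the wave speeds of a pressureless fluid being just the material velocities; a minimal 1D sketch (my own illustration, not the actual AstroBEAR implementation):

```python
import numpy as np

def dust_flux(U):
    """Physical flux of a 1D pressureless dust fluid, U = [rho, rho*u]."""
    rho, mom = U
    u = mom / rho
    return np.array([mom, mom * u])

def hll_dust(UL, UR):
    """HLL flux for pressureless dust: with no sound speed, the wave-speed
    estimates reduce to the left/right material velocities."""
    uL, uR = UL[1] / UL[0], UR[1] / UR[0]
    SL, SR = min(uL, uR), max(uL, uR)
    FL, FR = dust_flux(UL), dust_flux(UR)
    if SL >= 0.0:    # all waves move right: upwind from the left state
        return FL
    if SR <= 0.0:    # all waves move left: upwind from the right state
        return FR
    return (SR * FL - SL * FR + SL * SR * (UR - UL)) / (SR - SL)

# Dust moving right on both sides -> flux is the left (upwind) state's flux.
UL = np.array([1.0, 2.0])   # rho=1,   u=2
UR = np.array([0.5, 0.5])   # rho=0.5, u=1
print(hll_dust(UL, UR))
```

For supersonic (here: uniformly signed) dust velocities the scheme reduces to pure upwinding, which is why getting the flux direction right in F_prim matters so much.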
Below are videos of the tests I ran to check the advection:
Simulations with the HLL Gas Solver (and HLL Dust Solver)
- Moving Clump (initial velocity in x): https://youtu.be/RcKyjCDi2HM
- Moving Clump (initial velocity in y): https://youtu.be/eQp_NtMpx0M
- Moving Clump (initial velocity in x and y): https://youtu.be/Z0rndg-28ug
- Moving Clump (initial velocity in x and y with the dust velocity equal to 1.5 times the gas velocity): https://youtu.be/_rNcDg2Ba64
Simulations with the HLLC Gas Solver (and HLL Dust Solver)
- Cloud Crushing Setup (constant dust distribution): https://youtu.be/uE0pXMotopQ
- Cloud Crushing with HLLC and PCM: https://youtu.be/2W6eQVvJEgY
- Cloud Crushing with HLLC and PLM: https://youtu.be/D48wUrkqpwg
- Cloud Crushing with HLLC and PPM: https://youtu.be/rhRvjBOLxjQ
- Cloud Crushing with HLLC and PCM and reflective boundary along the y-axis: https://youtu.be/Zi-2GXu0oQc
- Cloud Crushing with HLLC and PCM and dust grain size distribution across the bins: https://youtu.be/0O9YaNauS1o
- Cloud Crushing with HLLC and PCM in 3D (Slice): https://youtu.be/GJWClB9gUEA
- Cloud Crushing with HLLC and PCM in 3D (Contour): https://youtu.be/PbXodsg2PZ4
This looks like it's working alright, considering the dust is implemented as a pressureless fluid. Once the dust and the gas are coupled via drag, the dust should start picking up a velocity in y as well.
Next steps:
- Now working on turning on the drag routines I have in the code (they still need to be properly integrated into the source routine)
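As for the drag coupling to come: a common approach (a simple stopping-time relaxation model; this is an assumption about the eventual form, not the actual source routine in the code) is to update the dust velocity toward the gas velocity implicitly, which stays stable even when the stopping time is much shorter than the hydro timestep:

```python
def drag_update(v_dust, v_gas, dt, t_stop):
    """Backward-Euler update of dv_d/dt = (v_g - v_d) / t_stop, treating the
    gas velocity as fixed over the step (one-way coupling)."""
    return (v_dust + (dt / t_stop) * v_gas) / (1.0 + dt / t_stop)

# Dust initially at rest in a gas stream; it relaxes onto the gas velocity.
v_d, v_g = 0.0, 1.0
for _ in range(10):
    v_d = drag_update(v_d, v_g, dt=0.1, t_stop=0.05)
print(v_d)  # close to 1.0 after a few stopping times
```

A full two-way coupling would instead apply a momentum-conserving pairwise update so the gas feels the back-reaction.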
Update 3/16
AMR Line Transfer
Buffered sends with a stack of rays is a significant improvement over blocking and looping. Tests on 120 cores:
           | Buff+Stack | Block+Loop | Proj    |
Time       | ~16 hrs    | ~30 hrs    | ~12 hrs |
% in LT    | ~72%       | ~79%       | ~50%    |
Time in LT | ~12 hrs    | ~24 hrs    | ~6 hrs  |
I've merged the AMR and projected versions of line transfer into one branch, so it's possible to choose between them with a single switch; the default is non-AMR. During these tests, I noticed that the AMR version is ~2x faster than the non-AMR version on 24 cores. The merge hasn't affected execution time for the non-AMR version.
There is still probably a way to improve the ray processing order to minimize dependencies, but it will take significant thought.
Charge exchange
To get a reasonable time for averaging, need about 9 more frames. Almost there…
MHD
Testing a diffusion (DIFF_ALPHA2) run. Still getting pressure protections, but entirely in ghost zones now. Not clear yet whether runtimes will improve if these protections are solved, or whether the run will behave well (seems like too much diffusion right now).
DIFF_ALPHA = 0
DIFF_ALPHA2 = 0.1
With Diffusion, Frame 39
Without Diffusion, Frame 27
Convection project update 03-16-2020: EOS test
Problem
When astrobear initializes, it converts primitive form → conservative form and back, which can cause problems.
The root of the problem is that P(D, E) and E(D, P) have trouble with self-consistency.
Technically speaking, P(D, T) and T(D, P), and E(D, T) and T(D, E), have this problem as well, but they are direct inverses from the inversion algorithm, so the numerical effects are much smaller.
Most obvious consequence: energy is not conserved.
Test of self-consistency
Choose
-12 ≤ log(rho) ≤ -3
3 ≤ log(pressure) ≤ 8
0 ≤ log(T) ≤ 8
and discard everything outside our original tables P(D, T) and E(D, T).
Flipping between P(D, E) and E(D, P) 200 times:
Input P0, calculate E0, then use E0 to find P1, then P1 to find E1, and so on until the index reaches 200.
Plotted histograms of the differences P0 - P200 and E0 - E200.
Similarly, do the same for P(D, T) and E(D, T)
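The drift mechanism can be reproduced with a toy table: take an exact forward relation, build its inverse by linear interpolation on a coarse grid, and iterate the round trip. Each pass pulls the value toward a grid node (everything below is made up for illustration, not the MESA tables):

```python
import numpy as np

# Toy "EOS" at fixed density: exact forward relation E(P),
# lossy tabulated inverse P(E).
P_grid = np.linspace(1.0, 10.0, 10)
E_grid = P_grid ** 1.5

def E_of_P(P):
    return P ** 1.5                       # exact

def P_of_E(E):
    return np.interp(E, E_grid, P_grid)   # linear interpolation on the table

P0 = 2.5        # off-grid starting point
P = P0
drifts = []
for _ in range(200):
    P = P_of_E(E_of_P(P))                 # one P -> E -> P round trip
    drifts.append(abs(P - P0))

print(f"drift after 1 flip:    {drifts[0]:.4f}")
print(f"drift after 200 flips: {drifts[-1]:.4f}")
```

The drift grows over iterations and then saturates once the value reaches a grid node; that saturation is exactly the "flip the EOS until reaching equilibrium" idea listed under potential solutions below.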
(Histogram plots: PDE - EDP differences in pressure and energy; PDT - TDP differences in temperature and pressure.)
Conclusion: PDE - EDP has worse self-consistency than PDT - TDP.
Potential solutions
- flip the EOS until reaching equilibrium
- use one of PDE or EDP and invert it (this gives consistency between PDE and EDP, but not across the whole chain)
NOT Potential solutions
(Where I tried and failed)
- Changing the inversion interpolation routine to bicubic. This actually increases the numerical problem.
- Adding resolution to the final table directly (cubic interpolation and finer sampling). This reduces numerical noise somewhat, but the improvement is minimal compared to the memory used (doubling the resolution quadruples the file size).
MESA profile
- Tried to rewrite the get-profile script in Python, but ran into some issues understanding the problem setup.
COMMON ENVELOPE SIMULATIONS
AGB paper
I have "completed" all the figures for the AGB paper. Will now write it up.
AGB paper (pay attention to the figures and ignore everything else).
RLOF+CEE Simulations
Goal for this month is to (1) set up new high resolution, large box, optimal RLOF run (now undergoing testing) and (2) restart from end of old RLOF runs with improved refinement and evolve as long as possible (to serve ultimately as low resolution comparison runs).
- New runs
- Run 199: Setup with smoothly matched hydrostatic atmosphere matching on to constant ambient at larger radius and density thresholds for refinement. >>Crashes ~immediately due to high pressure gradients just outside primary. Rerunning now with AstroBEAR default refinement. —> queued
- Run 196: Setup with low constant ambient, refined on density, eventually slows down a lot. Restarted from frame 29 with AstroBEAR default refinement. Runs fast but chombo sizes are HUGE since much more volume is refined. (First I tried qTolerance 0.2, then later changed to 0.4 to reduce file size. DesiredFillRatios set to 0.7.) Ran but then crashed on a restart when reading chombo (not sure why). Restart from frame 35 is now in the queue.
- Continuation of old runs (only the last chombo of each had been kept!)
- Run 195: Separation of 1.5 times old separation, restarted from frame 50 of Run 161 with higher density thresholds for refinement and one extra refinement level for very high density—> completed up to frame 201
- Run 201: Roche-limit separation (=2.2 times old separation), restarted from frame 53 of Run 160 with higher density thresholds for refinement and one extra refinement level for very high density—> running, completed up to frame 146
- Run 202: Roche-limit separation (=2.2 times old separation) with low density ambient (67 times lower than fiducial), restarted from frame 27 of Run 162 with higher density thresholds for refinement and one extra refinement level for very high density—> queued
Update 03/02 - Jet
http://www.pas.rochester.edu/~yzou5/CE/Blogpost_20200302/
- Backed up data and code versions from Stampede2 to Bluehive.
- Extended model 6 (half-mass companion) to frame 146.
Update 3/2
Charge Exchange
Weak wind should be done this week. Here are the current figures for the high wind:
I find the differences between the initial state and the high wind interesting. The absorption is definitely more spread out, but I'm not yet confident about the significantly shallower line-center absorption. Need to compare total optical depths.
Weak wind
MHD
Results from fixed left boundary field. Left is original run, right is with fixed boundary conditions.
Convection project update 03-02-2020: clump test (3)
Convergence study
- Upper: dx = 0.005
- lower: dx = 0.01
Movies
- subsonic: v = 1e7
- supersonic: v = 1e8 (Mach 7)
Clump blowing up
- Ambient density: 1e-8
- Ambient pressure: 1e6
- Clump density: 1e-7
- Clump pressure: 1e9
Movie
- Left: ideal gas
- Right: table eos
Clump blowing up (Energy)
- plot: https://www.pas.rochester.edu/~yishengtu/research_files/CEE_gp_meeting_03022020/Energy_analysis.png
- Theory: Ionization = Energy increase?
Notebook: https://www.pas.rochester.edu/~yishengtu/research_files/CEE_gp_meeting_03022020/CEE_convection_notebook.pdf
pg 4, 13, 25
Next step
- What are the units of the internal energy output in the chombo files?
- dx = 0.0025 is running. ETA Thursday (earliest).
- Setup a star on grid?