Posts for the month of March 2020

Coupled EBM Project Update 03/30

Update 3/30

MHD

Test started recently. Will do some visualization checks soon, but appears to be running (slowly, but well). Will have to stop to finish the charge exchange postprocessing.

Charge Exchange

Low wind finished. Here are movies of both (missing last few frames of low wind):

High Wind

https://www.pas.rochester.edu/~adebrech/ChargeExchange/HD209458b_charge_exchange0220.png

Low Wind

https://www.pas.rochester.edu/~adebrech/ChargeExchange/HD209458b_charge_exchange_low0220.png

https://www.pas.rochester.edu/~adebrech/ChargeExchange/HD209458b_charge_exchange_low_species0220.png

Spring Code

Converted Alice's code to C++ to make some of the math significantly easier to express. Still working on sorting and refactoring the majority of the code. There also appear to be dozens of unused routines - need to check on those.

Misc

Contacted by Research Outreach, a paid magazine publication service. I'm assuming we're not interested, but wanted to make sure.

Updates on High Resistivity case and Equivalent Resistivity

High Resistivity (Moon-like)

The prior runs of this case (back in October '19) did not have the proper absorbing boundary conditions, and the density of the planet was not being used. That case was running quite fast.

Now, the density of the moon is kept low as it is supposed to be an excellent absorber of plasma. The fields go through quickly and try to pull material out (which there is not much of) leading to smaller timesteps.

Here is the simulation with a higher density wind and moon:

Another one with present day solar wind (and moon with half the density):

We also found an unintended line in the boundary condition function, which could also have contributed.

Calculating Effective Resistivity

To compare the effective resistivity of different models, we are computing an average of the conductivity weighted by the current (suggested by Prof. Pierre Gourdain). That is (image to be added later; LaTeX makes the blog crash):
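As a sketch of the idea (the exact expression is in the pending image, so the straightforward current-weighted form below is an assumption, and the arrays are placeholders rather than simulation output):

```python
import numpy as np

def effective_resistivity(sigma, J):
    """Current-weighted average of the conductivity, returned as a resistivity:
    sigma_eff = sum(sigma * |J|) / sum(|J|),  eta_eff = 1 / sigma_eff."""
    weights = np.abs(J)
    sigma_eff = np.sum(sigma * weights) / np.sum(weights)
    return 1.0 / sigma_eff

sigma = np.array([1.0, 10.0, 100.0])  # cell conductivities (arbitrary units)
J = np.array([1.0, 1.0, 2.0])         # cell current magnitudes
eta = effective_resistivity(sigma, J)
```

The weighting means cells carrying little current contribute little to the comparison, which is presumably the motivation for the suggestion.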

Convection project update 03-30-2020

MESA profile

There seems to be an inconsistency between the profile and the EOS, perhaps resulting from the way MESA interpolates the EOS.

astrobear

  • modified EOS.f90 slightly to calculate the X, Y, and Z used in the EOS. AstroBEAR uses atomic number fractions whereas MESA uses mass fractions
  • There is still a problem with running the code. Not sure how to solve it.
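The mass-fraction to number-fraction conversion mentioned above can be sketched as follows. The atomic masses are illustrative (H = 1, He = 4, and a single representative "metal" mass of 16 amu for Z is an assumption; the actual EOS.f90 change may treat metals differently):

```python
def number_fractions(X, Y, Z, A_metal=16.0):
    """Convert mass fractions (X, Y, Z) to atomic number fractions.
    Number density of each species is proportional to (mass fraction / atomic mass)."""
    n_H = X / 1.0
    n_He = Y / 4.0
    n_Z = Z / A_metal  # assumed representative metal mass
    n_tot = n_H + n_He + n_Z
    return n_H / n_tot, n_He / n_tot, n_Z / n_tot

x, y, z = number_fractions(0.70, 0.28, 0.02)  # solar-ish composition
```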

Coupled EBM Project Update 03/23

Update 3/23

MHD

Diffusion coefficient still appears too high. However, when set at what I think it should be, I run into min timestep issues (Atma appears to be having a similar issue). I first believed this to be related to the addition of a DIFF_ALPHA2 term for the maxspeed:

maxspeed(Info%level) = max(maxsolverspeed(Info%level), max(maxspeed(Info%level), 10d0*DIFF_ALPHA2/levels(Info%level)%dx))

But looking again, the lower DIFF_ALPHA2 is, the lower that term should be, so I don't believe that's the problem.
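The reasoning can be checked with a quick translation of the quoted Fortran into Python (maxsolverspeed and the CFL factor are placeholders, not the actual AstroBEAR values): a smaller DIFF_ALPHA2 lowers the diffusion contribution to maxspeed and therefore can only keep the timestep the same or make it larger.

```python
def max_speed(maxsolverspeed, diff_alpha2, dx):
    # mirrors: max(maxsolverspeed, 10d0*DIFF_ALPHA2/levels(...)%dx)
    return max(maxsolverspeed, 10.0 * diff_alpha2 / dx)

def timestep(maxsolverspeed, diff_alpha2, dx, cfl=0.5):
    # generic CFL-style dt; the real code's prefactor may differ
    return cfl * dx / max_speed(maxsolverspeed, diff_alpha2, dx)

dt_high = timestep(maxsolverspeed=1.0, diff_alpha2=1.0, dx=0.01)
dt_low = timestep(maxsolverspeed=1.0, diff_alpha2=0.1, dx=0.01)
# dt_low >= dt_high: lowering DIFF_ALPHA2 never shrinks dt through this term
```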

Charge Exchange

Low-wind run finished over the weekend. Working on the postprocessing now, along with the new separate-temperature postprocessing for both.

COMMON ENVELOPE SIMULATIONS

Papers relevant for the dependence of CE outcomes on initial separation, and for the transition from the RLOF phase to the CE phase

Iaconi, Reichardt, Staff, De Marco, Passy, Price, Wurster & Herwig 2017
MacLeod, Ostriker & Stone 2018a
MacLeod, Ostriker & Stone 2018b
Reichardt, De Marco, Iaconi, Tout & Price 2019
MacLeod, Vick, Lai & Stone 2019
MacLeod & Loeb 2019
MacLeod & Loeb 2020

Goals

  1. Achieve a numerically stable simulation with separation equal to Roche Limit separation
    • Calculated to be 108 Rsun for our fiducial RGB primary (radius 48.1 Rsun, M_1=2 Msun) and secondary (M_2=1 Msun).
    • It would be fine to gradually increase the separation in successive runs, thus working our way out to 108 Rsun, if necessary.
  2. Low ambient density
    • But the ambient pressure needs to be high to cut off the stellar pressure profile and thus avoid being unable to resolve the pressure scale height
    • Hence the usual problem: making the density low makes the temperature high and the timestep very small, so the simulation is slow.
    • Is there a way around this problem?
      • One possibility is to just be patient!
      • Another is to include a hydrostatic atmosphere (typically with some constant temperature—see MacLeod+18a, but they use a spherical mesh)
        • This atmosphere cannot extend all the way to the mesh boundary because it produces gradients at corners that crash the code (results from 3 years ago)
        • Can try making the atmosphere transition to a uniform ambient medium at some radius (See Run 199, below)
  3. Rotation of the primary
    • The Roche analysis is technically only valid for stars that are in corotation with the orbital angular velocity
    • I have not yet tried to give the star an initial rotation (an important step in the near future)
    • Part of the reason the primary star distorts in some of the simulations below may be physical and related to the results of MacLeod+19a
  4. Stable primary core
    • We have seen with recent models (AGB run) that this can be achieved by putting higher resolution at the core
  5. Larger simulation domain
    • the strategy I am using is to double the mesh size while reducing the base resolution by a factor of two (so 512³ would stay as 512³), and increasing maxlevel by 1. (Alternatively, one could keep the base resolution the same (512³ becomes 1024³) and keep maxlevel the same, which would result in ~8 times more base cells but one less AMR level.)
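The Roche-limit separation quoted in Goal 1 can be reproduced with Eggleton's (1983) approximation for the Roche-lobe radius (whether the 108 Rsun figure was computed exactly this way is an assumption):

```python
import math

def eggleton_roche_fraction(q):
    """R_L / a for mass ratio q = M_donor / M_accretor (Eggleton 1983)."""
    q13 = q ** (1.0 / 3.0)
    return 0.49 * q13**2 / (0.6 * q13**2 + math.log(1.0 + q13))

R1 = 48.1      # fiducial RGB primary radius in Rsun
q = 2.0 / 1.0  # M_1 / M_2 = 2 Msun / 1 Msun
a_roche = R1 / eggleton_roche_fraction(q)  # separation at which R_L = R1
```

This gives a_roche ≈ 109 Rsun, consistent with the 108 Rsun quoted above (the small difference could come from rounding or from using a different Roche-lobe fit).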

Comments

  • Goal 1 is the main goal.
  • Goal 2 is desirable but not essential in the short term.
  • Goal 3 is also desirable but not essential in the short term, though it may make achieving goal 1 easier (see below).
  • Goal 4 we know how to achieve at a high enough fidelity for the time being.
  • Goal 5 is also desirable but not essential in the short term.

RLOF+CEE Simulations

  1. Continuation of old runs (only the last chombo of each had been kept!)
    1. Run 195: Separation of 1.5 times old separation, restarted from frame 50 of Run 161 with higher density thresholds for refinement and one extra refinement level for very high density—> completed up to frame 201
    2. Run 201: Roche-limit separation (=2.2 times old separation), restarted from frame 53 of Run 160 with higher density thresholds for refinement and one extra refinement level for very high density—> completed up to frame 251
    3. Run 202: Roche-limit separation (=2.2 times old separation) with low density ambient (67 times lower than fiducial), restarted from frame 27 of Run 162 with higher density thresholds for refinement and one extra refinement level for very high density—> completed up to frame 57
  2. New runs
    1. Run 199: Setup with smoothly matched hydrostatic atmosphere matching on to constant ambient at larger radius and density thresholds for refinement. >>Crashes ~immediately due to high pressure gradients just outside primary. Rerunning now with AstroBEAR default refinement. —> get insufficient memory error
    2. Run 196: Setup with low constant ambient, refined on density; eventually slows down a lot. Restarted from frame 29 with AstroBEAR default refinement. Runs fast, but chombo sizes are HUGE since much more volume is refined. (First I tried qTolerance 0.2, then later changed to 0.4 to reduce file size; DesiredFillRatios set to 0.7.) Ran but then crashed on a restart when reading the chombo (not sure why). Restarted from frame 33 with qTolerance 0.2 and ran up to frame 40; after that, slurm and chombo output stopped, presumably because of insufficient memory (in the past this symptom was found to be caused by insufficient memory). Chombo file sizes are >130 GB and increasing.

Run 195

Movies of gas density slices

  1. Face-on, simulation coordinates, frames 0-40 (old movie of Run 161 from ~two years ago)
  2. Face-on, simulation coordinates, frames 50-201
  3. Face-on, simulation coordinates, frames 50-201, zoomed in

Snapshots of Pressure and Temperature

  1. Pressure frame 50
  2. Temperature frame 50
  3. Pressure frame 70
  4. Temperature frame 70
  5. Pressure frame 90
  6. Temperature frame 90
  7. Pressure frame 140
  8. Temperature frame 140
  9. Pressure frame 201
  10. Temperature frame 201

Snapshots showing mesh

  1. mesh frame 50 -- for frames 0-50 used more conservative density-based refinement
  2. mesh frame 51 -- for frames 51-201 used less conservative density-based refinement with additional AMR level for extra resolution near particles
  3. mesh frame 77
  4. mesh frame 50, zoomed in
  5. mesh frame 51, zoomed in
  6. mesh frame 77, zoomed in

Separation vs time

  1. Comparison with fiducial run (Run 143)

Run 201

Movies of gas density slices

  1. Face-on, simulation coordinates, frames 0-251 (frames 0-53 are from ~2 years ago and frames 54-251 are new)
  2. Face-on, simulation coordinates, frames 53-170, now showing mesh

Snapshots with and without mesh

  1. Frame 53 with mesh
  2. Frame 53 without mesh
  3. Frame 53 pressure
  4. Frame 53 temperature
  5. Frame 54 pressure
  6. Frame 54 temperature
  7. Frame 116 pressure
  8. Frame 116 temperature

Run 202

Movies of gas density slices

  1. Face-on, simulation coordinates, frames 0-57 (frames 0-27 are from ~2 years ago and frames 28-57 are new)

Snapshots

  1. Frame 27 Density
  2. Frame 27 Density extended colorbar
  3. Frame 27 Pressure
  4. Frame 27 Temperature
  5. Frame 27 Temperature with mesh
  6. Frame 28 Density
  7. Frame 28 Density extended colorbar
  8. Frame 28 Pressure
  9. Frame 28 Temperature
  10. Frame 28 Temperature with mesh
  11. Frame 57 Density
  12. Frame 57 Density extended colorbar
  13. Frame 57 Pressure
  14. Frame 57 Temperature
  15. Frame 57 Temperature with mesh

Run 196

Snapshots

  1. temperature frame 25
  2. temperature frame 25, with mesh
  3. density frame 29
  4. density frame 29, with mesh
  5. temperature frame 29
  6. temperature frame 29, zoomed colorbar
  7. density frame 40
  8. density frame 40, with mesh
  9. pressure frame 40
  10. temperature frame 40
  11. temperature frame 40, with mesh

Comparing setup to that of Run 195

  • Recall differences in setup from Run 195:
    1. simulation box twice as big as Run 195's (and base resolution coarser by a factor of two)
    2. extra AMR level not present in Run 195 for refinement around particles
    3. start out with higher density threshold for bulk of envelope so not as much of ambient is refined compared to Run 195, but after frame 29 more of ambient is refined than before frame 29 (switched from density threshold to default gradients-based refinement)
    4. ambient density is 6.7 times lower than for Run 195 (and 6.7 times lower than at the stellar surface)

Results and Interpretation

  • Less stable at stellar surface than Run 195.
    • Is this partly caused by density jump between surface and ambient medium?
      • But surface is also less stable than Run 202, where ambient density was 10 times lower still, so this is probably not the reason.
    • Is it related to refining less of the ambient at the beginning of the simulation than for Run 195?
      • From the snapshots, it does seem like numerical instabilities form where the mesh transitions between AMR levels and propagate inward, "squeezing" the star along the rectangular mesh
      • It seems like more refinement of the ambient medium helps to avert this problem
        • This is consistent with what MacLeod found (lower base resolution exacerbates this effect, also seen in his sims; private communication)

Conclusions

  • Tentatively, it seems necessary to refine very conservatively such that much of the surrounding ambient is refined at the beginning of the simulation before the dynamical plunge-in phase.
  • When the dynamical plunge-in phase begins, this can be relaxed and the refinement can be concentrated on the stellar material only.
  • Fundamental oscillation modes can be excited when the primary does not rotate synchronously with the orbit at the Roche limit separation (MacLeod+19). This instability is likely being seen in our Runs 201 and 202. Initializing the primary in synchronous rotation might drastically improve stability.
  • Seeing as how the numerical instability occurs at the stellar surface, it would probably also be beneficial to increase the resolution there, if possible

Run 199

Snapshots (screenshots for debugging)

  1. density frame 0, with mesh
  2. density frame 0, with mesh, extended range
  3. pressure frame 0, with mesh, extended range
  4. pressure frame 0, with mesh, extended range, zoomed in
  5. temperature frame 0
  6. density frame 1, extended range
  7. pressure frame 1
  8. pressure frame 1, small range
  9. pressure frame 1, small range, zoomed in
  10. temperature frame 1

PN paper revision

List of new runs

  1. Run like model A, but has rsoft=5, rwind=4 (production runs have rsoft = 5, rwind=6). -stopped at frame 32, too slow. mintracer=1d-7
    • Another run with rsoft = rwind = 5 to capture the central mass. Also, set mintracer=.1 for quiescent period.
  2. Run like model A, but without the quiescent phase. then compare the momentum analysis for both. — queue on bluehive.
  3. Run like model A, but with larger refinement on level 7, restart on frame 149. — waiting for reservation on bluehive.

Major issues

Issue 1.

  • In section 2, paragraph 3, explain the SPH-AMR mapping and whether it is conservative or not. Basically averaging and interpolating by standard SPH practice. Add input from Thomas.

Issue 2.

  • Modify section 2 to discuss the choice of momentum parameters. Explain that the momentum was chosen to be in line with Bujarrabal.
  • Discuss how sensitive are the results to the choice of injection radius.

— New run 1. but had issue with central high density cells.

— also compare the outflow starting time, New run 2.

Issue 3.

  • max refinement level was never reduced during the simulation.
  • boxy structure: it always leads ahead of the central cylinder of refinement.
  • New run 3, doubling refinement radius on level 7.

Issue 4.

  • Clearer description about 'tracer' in Astrobear.

The referee was confused about tracer as added material, but it's just a label, or a 'dye'.

Issue 5.

More discussion on the appendix in the main text. The horizontally launched jets enhance the shaping argument.

Issue 6.

  • Add input from Bruce.
  • Produce density projection figures.

Minor issues

  • In table 2, added a line to indicate model names (A,B,C,D) to the corresponding run parameters.
  • Fixed typos throughout the paper.

-TBD-

  • Replace the text labels in figures with larger font.
  • Redo figure 6. Experiment with other color tables and vector properties; it is just very hard to interpret. Try using arrows of variable length on a uniform mesh of points.

Dust in AstroBEAR - Update 2020/03/16

Comment:

I wasn't able to make blog posts for the last couple of weeks so this is a summary of the progress made since then.

Objectives:

  • Finish active advection routines
  • Extend interpolation options to dust
  • Continue dust processing routines

Progress:

Interpolation

  • I've extended the interpolation scheme to include dust. So PCM, PLM, and PPM are now also available for the dust.

Dust Advection

  • I reduced the Cloud Crushing setup to just a clump moving through gas, as I wasn't sure whether the issue was with the boundaries, the wind, the fluxes, etc.
  • Found out that way that the x- and y-fluxes kept getting messed up → fixed in F_prim, and added an HLL solver for the dust to both the HLL and HLLC gas solvers.

Below are videos of the tests I ran to check the advection:

Simulations with the HLL Gas Solver (and HLL Dust Solver)

Simulations with the HLLC Gas Solver (and HLL Dust Solver)

This looks like it's working alright, considering the dust is implemented as a pressureless fluid. Once the dust and the gas are coupled via drag, the dust should start picking up a velocity in y as well.
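For reference, an HLL flux for a 1D pressureless dust fluid can be sketched as below. This is a generic textbook-style formulation, not the actual AstroBEAR dust solver, which may differ in wave-speed estimates and variable layout; with no pressure, the wave speeds are just the fluid velocities.

```python
import numpy as np

def dust_flux(U):
    """Physical flux of a pressureless fluid. State U = (rho, rho*u);
    flux F = (rho*u, rho*u**2) since there is no pressure term."""
    rho, mom = U
    u = mom / rho
    return np.array([mom, mom * u])

def hll_dust_flux(UL, UR):
    """HLL interface flux between left/right states of pressureless dust."""
    uL, uR = UL[1] / UL[0], UR[1] / UR[0]
    SL, SR = min(uL, uR), max(uL, uR)  # pressureless: signal speeds = velocities
    FL, FR = dust_flux(UL), dust_flux(UR)
    if SL >= 0.0:
        return FL  # all waves move right: pure upwind from the left
    if SR <= 0.0:
        return FR  # all waves move left: pure upwind from the right
    return (SR * FL - SL * FR + SL * SR * (UR - UL)) / (SR - SL)

F = hll_dust_flux(np.array([1.0, 2.0]), np.array([0.5, 0.5]))
```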

Next steps:

  • Now working on turning on the drag routines I have in the code (they still need to be properly integrated into the source routine)

Update 3/16

AMR Line Transfer

Buffered sends with a stack of rays is a significant improvement over blocking and looping. Tests on 120 cores:

              Buff+Stack   Block+Loop   Proj
  Time        ~16 hrs      ~30 hrs      ~12 hrs
  % in LT     ~72%         ~79%         ~50%
  Time in LT  ~12 hrs      ~24 hrs      ~6 hrs

I've merged the AMR and projected versions of line transfer into one branch, so it's possible to choose between them with a single switch. The default is non-AMR. During these tests, I noticed that the AMR version is ~2x faster than the non-AMR version on 24 cores. The merge hasn't affected execution time for the non-AMR version.

There is still probably a way to improve the ray processing order to minimize dependencies, but it will take significant thought.

Charge exchange

To get a reasonable time for averaging, need about 9 more frames. Almost there…

https://www.pas.rochester.edu/~adebrech/ChargeExchange/HD209458b_charge_exchange_low0220.png

https://www.pas.rochester.edu/~adebrech/ChargeExchange/HD209458b_charge_exchange_low_species0220.png

MHD

Testing a diffusion (DIFF_ALPHA2) run. Still getting pressure protections, but they are now entirely in ghost zones. Not clear yet whether solving these protections will improve runtimes, or whether the run will behave well (it looks like too much diffusion right now).

DIFF_ALPHA = 0

DIFF_ALPHA2 = 0.1

With Diffusion, Frame 39

https://www.pas.rochester.edu/~adebrech/MHD/diff_comp.png

Without Diffusion, Frame 27

https://www.pas.rochester.edu/~adebrech/MHD/MHD_high_res0027.png

Convection project update 03-16-2020: EOS test

Problem

When AstroBEAR initializes, it converts primitive form → conservative form and back, which can cause problems.
The root of the problem is that P(D, E) and E(D, P) have trouble with self-consistency.
Technically, P(D, T) and T(D, P), and E(D, T) and T(D, E), have this problem as well, but those pairs are direct inverses produced by the inversion algorithm, so the numerical effects are much smaller.

https://www.pas.rochester.edu/~yishengtu/research_files/CEE_gp_meeting_03162020/How_EOS_tables_obtained.png

Most obvious consequence: energy is not conserved.

Test of self-consistency

Choose

-12 ≤ log(rho) ≤ -3
3 ≤ log(P) ≤ 8
0 ≤ log(T) ≤ 8

And discard everything outside our original tables P(D, T) and E(D, T)

Flipping between P(D, E) and E(D, P) 200 times:
Input P0, calculate E0, then use E0 to find P1, then P1 to find E1, and so on, until the index reaches 200.
Plotted histograms of the differences P0 − P200 and E0 − E200.

Similarly, do the same for P(D, T) and E(D, T)
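The flip test can be illustrated on a toy 1D "table" rather than the real (D, T) EOS tables: P(E) and E(P) are each represented by linear interpolation on grids that do not share nodes, which mimics the self-consistency error the test measures. The toy EOS P = E^1.3 and the grid sizes are placeholders.

```python
import numpy as np

# Forward table P(E) on one grid, inverse table E(P) on a different grid,
# so the two interpolants are not exact inverses of each other.
e_grid = np.linspace(1.0, 10.0, 50)
p_of_e_table = e_grid ** 1.3  # toy monotonic EOS, P = E^1.3
p_grid = np.linspace(p_of_e_table[0], p_of_e_table[-1], 37)
e_of_p_table = np.interp(p_grid, p_of_e_table, e_grid)  # tabulated inverse

def flip(p0, n=200):
    """Alternate P -> E -> P n times and return the final P."""
    p = p0
    for _ in range(n):
        e = np.interp(p, p_grid, e_of_p_table)  # E(P)
        p = np.interp(e, e_grid, p_of_e_table)  # P(E)
    return p

p0 = 5.0
drift = abs(flip(p0) - p0)  # the accumulated self-consistency error
```

In the real test, histogramming this drift over the whole (D, P) range gives the plots linked below.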

PDE - EDP
Pressure https://www.pas.rochester.edu/~yishengtu/research_files/CEE_gp_meeting_03162020/Pressure_PDE_panta.png
Energy https://www.pas.rochester.edu/~yishengtu/research_files/CEE_gp_meeting_03162020/Energy_PDE_panta.png
PDT - TDP
Temperature https://www.pas.rochester.edu/~yishengtu/research_files/CEE_gp_meeting_03162020/Temperature_TDP_panta.png
Pressure https://www.pas.rochester.edu/~yishengtu/research_files/CEE_gp_meeting_03162020/Pressure_TDP_panta.png

Conclusion: PDE - EDP has worse self-consistency than PDT - TDP.

Potential solutions

  • flip the EOS until reaching equilibrium
  • use one of PDE or EDP, and invert that EOS (this gives consistency between PDE and EDP, but not for the chain)

NOT Potential solutions

(Where I tried and failed)

  • changing the inversion interpolation routine to bicubic. This actually increases the numerical problem
  • adding resolution to the final table directly (cubic interpolate and sample more finely). In some sense this reduces numerical noise, but the improvement is minimal compared to the memory used (doubling the resolution quadruples the file size)

MESA profile

  • Tried to rewrite the get-profile script in Python, but ran into some issues understanding the problem setup.

COMMON ENVELOPE SIMULATIONS

AGB paper

I have "completed" all the figures for the AGB paper. Will now write it up.
AGB paper (pay attention to the figures and ignore everything else).

RLOF+CEE Simulations

Goal for this month is to (1) set up new high resolution, large box, optimal RLOF run (now undergoing testing) and (2) restart from end of old RLOF runs with improved refinement and evolve as long as possible (to serve ultimately as low resolution comparison runs).

  1. New runs
    1. Run 199: Setup with smoothly matched hydrostatic atmosphere matching on to constant ambient at larger radius and density thresholds for refinement. >>Crashes ~immediately due to high pressure gradients just outside primary. Rerunning now with AstroBEAR default refinement. —> queued
    2. Run 196: Setup with low constant ambient, refined on density, eventually slows down a lot. Restarted from frame 29 with AstroBEAR default refinement. Runs fast but chombo sizes are HUGE since much more volume is refined. (First I tried qTolerance 0.2, then later changed to 0.4 to reduce file size. DesiredFillRatios set to 0.7.) Ran but then crashed on a restart when reading chombo (not sure why). Restart from frame 35 is now in the queue.
  2. Continuation of old runs (only the last chombo of each had been kept!)
    1. Run 195: Separation of 1.5 times old separation, restarted from frame 50 of Run 161 with higher density thresholds for refinement and one extra refinement level for very high density—> completed up to frame 201
    2. Run 201: Roche-limit separation (=2.2 times old separation), restarted from frame 53 of Run 160 with higher density thresholds for refinement and one extra refinement level for very high density—> running, completed up to frame 146
    3. Run 202: Roche-limit separation (=2.2 times old separation) with low density ambient (67 times lower than fiducial), restarted from frame 27 of Run 162 with higher density thresholds for refinement and one extra refinement level for very high density—> queued

Update 03/02 - Jet

http://www.pas.rochester.edu/~yzou5/CE/Blogpost_20200302/

  • Backed up data and code versions from Stampede2 to Bluehive.
  • Extended model 6 (half-mass companion) to frame 146.

Update 3/2

Charge Exchange

Weak wind should be done this week. Here are the current figures for the high wind:

https://www.pas.rochester.edu/~adebrech/ChargeExchange/HD209458b_charge_exchange_obs_comparison_three_diff.png

https://www.pas.rochester.edu/~adebrech/ChargeExchange/HD209458b_charge_exchange_attenuation_comparison_three.png

I find the differences between the initial state and the high wind interesting. The absorption is definitely more spread out, but I'm not confident about the significantly shallower line-center absorption. Need to compare total optical depths.
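The total-optical-depth comparison amounts to integrating tau over the line profile for each run. A minimal sketch, with Gaussian tau profiles standing in for the actual postprocessed spectra (the widths, depths, and velocity grid are all illustrative):

```python
import numpy as np

def total_optical_depth(velocity, tau):
    """Integrate tau over the line profile with the trapezoid rule."""
    dv = np.diff(velocity)
    return float(np.sum(0.5 * (tau[1:] + tau[:-1]) * dv))

v = np.linspace(-100.0, 100.0, 401)            # km/s from line center
tau_initial = 3.0 * np.exp(-(v / 10.0) ** 2)   # deep, narrow (placeholder)
tau_high_wind = 1.5 * np.exp(-(v / 25.0) ** 2) # shallower, more spread out
t1 = total_optical_depth(v, tau_initial)
t2 = total_optical_depth(v, tau_high_wind)
```

A profile that is shallower at line center can still carry more total optical depth if it is broad enough, which is the ambiguity the comparison would resolve.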

Weak wind

https://www.pas.rochester.edu/~adebrech/ChargeExchange/HD209458b_charge_exchange_low0220.png

https://www.pas.rochester.edu/~adebrech/ChargeExchange/HD209458b_charge_exchange_low_species0220.png

MHD

Results from fixed left boundary field. Left is original run, right is with fixed boundary conditions.

https://www.pas.rochester.edu/~adebrech/MHD/MHD_fixed_left_comp0100.png

Convection project update 03-02-2020: clump test (3)

Convergence study

  • Upper: dx = 0.005
  • lower: dx = 0.01

Movies

  • subsonic: v = 1e7

https://www.pas.rochester.edu/~yishengtu/research_files/CEE_gp_meeting_03022020/res_fixed_grid_subsonic/table_eos_fixed_grid_subsonic.gif

  • supersonic: v = 1e8 (Mach 7)

https://www.pas.rochester.edu/~yishengtu/research_files/CEE_gp_meeting_03022020/res_fixed_grid_supersonic/table_eos_fixed_grid_supersonic.gif

Clump blowing up

  • Ambient density: 1e-8
  • Ambient pressure: 1e6
  • Clump density: 1e-7
  • Clump pressure: 1e9

Movie

  • Left: ideal gas
  • Right: table eos

https://www.pas.rochester.edu/~yishengtu/research_files/CEE_gp_meeting_03022020/res_steady_state/table_eos_fixed_grid_subsonic.gif

Clump blowing up (Energy)

Notebook: https://www.pas.rochester.edu/~yishengtu/research_files/CEE_gp_meeting_03022020/CEE_convection_notebook.pdf
pg 4, 13, 25

Next step

  • What is the unit of the internal energy output in the chombo files?
  • dx = 0.0025 is running. ETA Thursday (earliest).
  • Setup a star on grid?