# Grid aligned effects

With the planet creation correctly implemented, I've now rediscovered the grid-aligned effects. Increasing the resolution does not resolve the issue; it only seems to exacerbate it: the time step during the grid-aligned phase is significantly smaller than after the planet reaches a steady state. At higher resolution the code appears to take even smaller time steps to resolve the effects.

# CE

It is unphysical to force the initial condition to be exactly the one that Thomas (Orsola's student) sent me, since the density in SPH is sampled via a kernel function with its own smoothing radius, while AstroBEAR has a different mesh size. In short, the density at the central point is sampled within a small volume in SPH, but our finest grid cell is huge compared to that volume. If nothing is done to the initial condition, the core will explode, since it contains around 3.5 solar masses. The density scale is in computational units (1e-12 g/cc).
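The kernel-vs-cell volume mismatch can be made concrete with the standard cubic-spline SPH kernel. Below is a minimal Python sketch; the smoothing length `h` and cell size `dx` are hypothetical placeholders, not values from Thomas's data:

```python
import numpy as np

def cubic_spline_W(r, h):
    """Standard 3D cubic-spline SPH kernel (Monaghan 1992), normalized so
    that the integral of W over all space is 1."""
    q = r / h
    sigma = 1.0 / (np.pi * h**3)  # 3D normalization constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    elif q < 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0  # kernel has compact support: W = 0 beyond r = 2h

# Hypothetical numbers purely to illustrate the volume mismatch:
h  = 1.0e10   # SPH smoothing length near the core (cm) -- placeholder
dx = 1.0e12   # finest AstroBEAR cell size (cm) -- placeholder

V_kernel = (4.0 / 3.0) * np.pi * (2.0 * h)**3  # kernel support volume
V_cell   = dx**3                               # one finest grid cell

# The grid cell averages the peaked SPH density over a far larger volume,
# so mapping the central SPH density directly onto the mesh puts ~3.5 Msun
# of material at a density the grid cannot represent in equilibrium.
print(V_cell / V_kernel)
```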

I carved out the core, replacing it with a spherical boundary of radius 50 R_sun, temperature 10000 Kelvin, density 5e-9 g/cc, and zero outward velocity. The density here is in physical units (g/cc). Sorry that I did not make the two sets of movies on the same scale.

The flow looks smoother and the outflow is less violent. The flow structure is preserved within 6 years of simulation.
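The carve-out described above amounts to overwriting every cell inside a chosen radius with uniform replacement values. A schematic numpy version (the function name and array layout are illustrative, not AstroBEAR's actual interface):

```python
import numpy as np

RSUN = 6.957e10  # cm

def carve_core(rho, temp, vel, x, y, z, r_carve=50.0 * RSUN,
               rho_new=5.0e-9, temp_new=1.0e4):
    """Replace every cell within r_carve of the origin with a uniform
    sphere: fixed density (g/cc), fixed temperature (K), zero velocity."""
    r = np.sqrt(x**2 + y**2 + z**2)
    inside = r < r_carve
    rho[inside] = rho_new
    temp[inside] = temp_new
    vel[inside] = 0.0
    return rho, temp, vel
```

In the actual run this sphere also acts as an internal spherical boundary condition rather than a one-time replacement.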

# Update on CE project

Recap of Last Post

- In the last post I presented the first runs that included a secondary point mass. I had forgotten to initialize the primary with an orbital velocity.

- I had also presented the first runs that attempted to translate the star across the grid. I had been implementing this incorrectly, as I had been adding the velocity every time step rather than just at the start of the run, and also not giving any velocity to the point mass.

New Work

This time I've tried the following:

- Translating the RG across the grid at 100 km/s. This is important because otherwise in the binary, there would be no way of knowing whether effects on the RG were caused by the secondary or by motion through the ambient medium. We want to make sure the RG is stable as it moves through the ambient medium.

- Evolving the RG from t=0 (now without translating) with a nested grid that minimizes resolution outside the RG. The point is to save computation time since there is no need to have high resolution outside the star.

- I've also written a script in IDL that calculates the required initial velocities for a given set of orbital parameters (masses, binary separation, eccentricity, orientation of the orbit on the x-y plane), and makes a simple animation of the orbit, showing the positions and velocities as a function of time in CM coordinates.
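The script itself is in IDL; the core of the velocity calculation can be sketched in Python. Starting the orbit at periastron, the relative speed follows from the vis-viva equation, and each body's CM-frame velocity is the relative velocity weighted by the other body's mass fraction (function and variable names here are mine, not the script's):

```python
import numpy as np

G = 6.674e-8     # gravitational constant, cgs
MSUN = 1.989e33  # g

def initial_conditions(m1, m2, a, e=0.0, phi=0.0):
    """Positions and velocities of a binary at periastron, in CM
    coordinates. a is the semi-major axis (cm); phi orients the orbit
    in the x-y plane. Returns (r1, r2, v1, v2) as 2D numpy vectors."""
    M = m1 + m2
    r_peri = a * (1.0 - e)
    # vis-viva: v_rel^2 = G M (2/r - 1/a)
    v_rel = np.sqrt(G * M * (2.0 / r_peri - 1.0 / a))
    # unit vectors: rhat along the separation, that perpendicular to it
    rhat = np.array([np.cos(phi), np.sin(phi)])
    that = np.array([-np.sin(phi), np.cos(phi)])
    # each body offset/weighted by the other body's mass fraction
    r1 = -(m2 / M) * r_peri * rhat
    r2 = +(m1 / M) * r_peri * rhat
    v1 = -(m2 / M) * v_rel * that
    v2 = +(m1 / M) * v_rel * that
    return r1, r2, v1, v2
```

For e = 0 this reduces to the circular-orbit value v_rel = sqrt(GM/a).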

Summary of New Results

- I was able to translate the star across the grid. After about a dynamical time, it begins to become unstable on the trailing side.

- The simulation with low resolution in the ambient medium runs about 4-5 times faster than the previous sims which allowed for AMR in the ambient medium. However, the star becomes unstable earlier, and relatively large density contrasts develop in the ambient medium near the star.

Results

I) Translation across the grid at 100 km/s

Damp069) Extrapolated hydro BCs, Multipole expansion Poisson BCs, ambient dyne/cm

(bluehive standard 120 cores)

( cm, , 5 levels AMR)

(Restarted from run Damp062, at s, after damping stopped, to s)

2d density (continuous)

2d density (1 loop)

2d density and velocity (continuous)

2d density and velocity (1 loop)

For comparison, from a previous post, here is the same run without translating

Damp062) Extrapolated hydro BCs, Multipole expansion Poisson BCs, ambient dyne/cm

( cm, , 5 levels AMR)

(bluehive standard 120 cores up to frame 7 and then about 2 days on comet compute 864 cores, 2 cpus/task up to frame 150 )

2d density (continuous)

2d density (1 loop)

COMMENT: The star becomes unstable on the trailing side sooner than it does when it is not in motion.

II) Evolve RG with low resolution outside the star

Damp070) Periodic hydro BCs, Periodic Poisson BCs, ambient dyne/cm

(bluestreak standard 8192 cores, about 3-4 days computation time)

( cm, , 5 levels AMR)

(As Damp059 except that resolution reduced in ambient medium, with 2 buffer cells per level)

2d density (continuous)

2d density (1 loop)

For comparison, from a previous post, here is the same run without constraining refinement outside RG

Damp059) Periodic hydro BCs, Periodic Poisson BCs, ambient dyne/cm

(bluestreak standard 8192 cores, about 15 days computation time up to s)

( cm, , 5 levels AMR, run up to s)

2d density (continuous)

2d density (1 loop)

Damp078) Extrapolated hydro BCs, Multipole Expansion Poisson BCs, ambient

(bluehive standard 120 cores, about 34 hours computation time)

( cm, , 5 levels AMR)

(As Damp070 except that different BCs, and now 8 buffer cells per level)

2d density (continuous)

2d density (1 loop)

2d density with mesh (continuous)

2d density with mesh (1 loop)

COMMENT: The star becomes unstable sooner than it does when low resolution is not imposed on the ambient medium. But there is a tradeoff, as the computation time is reduced by a factor of 4-5.

III) Circular binary orbit with 1 solar mass secondary (as Ohlmann+16a)

(comet compute 1728 cores, 2 cores per task to increase memory per task, a little over 1 day computation time)

( cm, , 5 levels AMR)

2d density (continuous)

2d density (1 loop)

2d density (2x zoom, continuous)

2d density (2x zoom, 1 loop)

2d density (4x zoom, continuous)

2d density (4x zoom, 1 loop)

2d density (Edge-on, continuous)

2d density (Edge-on, 1 loop)

2d density (Edge-on, 2x zoom, continuous)

2d density (Edge-on, 2x zoom, 1 loop)

2d density (Edge-on, 4x zoom, continuous)

2d density (Edge-on, 4x zoom, 1 loop)

Discussion

The RG is still not quite stable enough after damping is turned off and it is allowed to evolve for a few dynamical times. The situation worsens somewhat when the star is translated across the grid. It is worth comparing with Ohlmann+ to see what differences may be important between their setup and ours. Here is a table comparing the two setups:

The most obvious difference is that the density of the ambient medium is much larger in our setup. I am currently trying to reduce this ambient density to see if it helps to improve the stability of the RG.

The other curious thing is that they used a very small box in their relaxation run, and they used periodic BCs. Does this explain the oscillations they were getting?

Next steps

- Experiment with lower ambient densities

- Longer binary runs

# AMR flux

Link to some thoughts.

# CE outflow

Initial condition: from Orsola's SPH result.

Boundary condition: a spherically symmetric outflow set at r=156 R_sun; the outflow density is 5e-11 g/cc and the outflow speed ramps from 10 km/s to 1000 km/s. The duration of the 1000 km/s outflow is from time=1 day to time=40 day. The mass loss rate is 1e-1 solar mass per year and the total mass loss is about 1e-2 solar mass. I have also added a 2 solar mass object at the center of the outflow.
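As a consistency check (not part of the simulation itself), the quoted mass-loss rate follows from the boundary values via Mdot = 4π r² ρ v, using the 1000 km/s phase:

```python
import numpy as np

RSUN = 6.957e10   # cm
MSUN = 1.989e33   # g
YR = 3.156e7      # s

r = 156.0 * RSUN  # outflow boundary radius
rho = 5.0e-11     # outflow density, g/cc
v = 1.0e8         # 1000 km/s in cm/s

mdot = 4.0 * np.pi * r**2 * rho * v           # g/s through the sphere
mdot_msun_yr = mdot * YR / MSUN
total = mdot * 39.0 * 86400.0 / MSUN          # day 1 to day 40

print(mdot_msun_yr)  # ~0.1 Msun/yr, consistent with the quoted rate
print(total)         # ~0.01 Msun total, consistent as well
```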

Simulation: I use gamma=1.667 in this simulation. The maximum temperature late in the simulation is about 5e7 Kelvin, which is unphysical. That high temperature occurs in low-density regions where shock heating is strong. I used 120 cores for 25 hours, i.e. 3000 CPU hours, with static mesh refinement.

# Hydrostatic equilibrium planetary atmosphere

*AstroBEAR* Planetary HSE

Currently the "hydrostatic" atmosphere is diffusing into the ambient medium. These runs take a couple of hours to produce a few frames on 120 cores. From the images (left: density, right: pressure) it appears the core of the planet is not being reset every time step as I intended. This is attempted by mimicking how the star is reset in your simulations: the "surface" is the masked region of constant values deep in the planet's interior, and the "envelope" is the layer with variables given as a function of

and held steady. This is significantly different from the issue I had before, and I am looking into why it has arisen. I'll email the source code shortly.
This could also be a resolution issue, judging by the images. I don't see any further refinement (GmX=64^{3}), even though my global.data asked for 4 additional SMR levels (MaxLevel=4, LastStaticLevel=-1). Either the chombos or my VisIt technique is failing to plot the higher resolution I thought I had, or I misused the refinement feature in *AstroBEAR*.

## Radiative Transfer in *ATHENA*

The primary radiative energy considerations were previously recorded by Jonathan back in January (link). For reference on cooling rates, the recombination rate is taken from Osterbrock (1989), pg. 19 and Lyman alpha from Black (1981).

To prevent large transients from the initial ionization of the neutral planet, the incoming flux is ramped by the function
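The exact constants did not survive in this post, so the numbers below are placeholders; the ramp has the generic error-function form:

```python
from math import erf

def ramped_flux(t, F0, t_half, dt_ramp):
    """Incoming flux ramped smoothly from 0 to F0 with an error function.
    t_half sets when the flux reaches F0/2; dt_ramp sets how fast it rises."""
    return F0 * 0.5 * (1.0 + erf((t - t_half) / dt_ramp))

# Placeholder constants for illustration only (not the run's actual values):
t_half, dt_ramp = 1.0e4, 2.0e3   # seconds
```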

The two constants within the error function prescribe when the function is at half value and the speed at which it ramps. Note

and

# First common envelope trial runs

Recap of Last Post

In the last post I presented runs of an isolated RG that was quasi-stable after applying the damping prescription of Ohlmann+17, using the free-fall timescale of s as the dynamical timescale.

New Work

This time I've tried the following:

- Translating the star across the grid

- Adding a secondary

Summary of New Results

- Translating the star did not work because, I now realize, I was doing it incorrectly.

- Adding the secondary seems to produce reasonable results, except I realize now that I forgot to initialize the primary with the required velocity.

Results

**I) Isolated RG runs from last blog**

**Damp059) Periodic hydro BCs, Periodic Poisson BCs, ambient ** dyne/cm

**(bluehive standard 120 cores up to 33 then bluestreak 8192 cores, 2 cpus/task)**

**(** cm, , 4 levels AMR)

2d density

- Note: this has been extended in time since the last post, but I haven't used this run for the simulations presented below. Instead I've used run "Damp062" below with extrapolated hydro BCs and multipole expansion Poisson BCs.

**Damp060) Extrapolated hydro BCs, Multipole expansion Poisson BCs, ambient ** dyne/cm

**(** cm, , 4 levels AMR)

**(Damp060 stampede normal, about 2 days with 1024 cores, 1 cpu/task for half and then 512 cores, 1 cpu/task for half)**

2d density

**Damp062) Extrapolated hydro BCs, Multipole expansion Poisson BCs, ambient ** dyne/cm

**(** cm, , 5 levels AMR)

**(Bluehive standard 120 cores up to frame 7 and then about 2 days on comet compute 864 cores, 2 cpus/task up to frame 150 )**

2d density

**II) Translating the RG across the grid**

- Two methods were tried: 1) add 30 km/s to every point; and 2) add 30 km/s only to those points that belong to the star, defined by a density threshold.
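Method 2 is a masked velocity boost. A numpy sketch of the masking (the threshold and boost values here are placeholders; as noted in the Discussion, the boost should be applied once at the start, together with the point mass, not every time step):

```python
import numpy as np

def boost_star(rho, vx, v_boost=3.0e6, rho_thresh=1.0e-9):
    """Add a bulk x-velocity of v_boost (30 km/s in cm/s) only to cells
    whose density exceeds rho_thresh, i.e. cells 'belonging' to the star.
    Both values are illustrative placeholders. Returns a new array."""
    star = rho > rho_thresh
    vx = vx.copy()
    vx[star] += v_boost
    return vx
```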

**Damp064) Restarts Damp062 from ** s just after damping ends

**(Bluehive standard 120 cores)**

2d density

- Translating every point at once leads to a numerical instability and the code crashes

**Damp067) Restarts Damp062 from ** s just after damping ends

**(Bluehive standard 120 cores)**

2d density

2d zoomed density

- Translating only points with g/cm allows the code to run, but the results look unphysical. Also, the code becomes unreasonably slow.

**III) Adding a secondary**

- A secondary of mass Msun was added at binary separation Rsun, just outside the outer radius of the RG Rsun.

- The secondary was given an initial tangential velocity of km/s for run Damp066, and km/s for run Damp068. The latter velocity should result in a circular orbit according to my analytical calculation.

- The IACCRETE variable was specified to be "KRUMHOLZ_ACCRETION"

- The softening length for the spline softening was specified to be the same as that used for the RG.

**Damp066) Restarts Damp062 from ** s just after damping ends, km/s

**(Bluehive standard 120 cores)**

2d density

2d zoomed density

**Damp068) Restarts Damp062 from ** s just after damping ends, km/s

**(Comet compute 1728 cores, 2 cores/task)**

2d density

2d zoomed density

**Discussion and Next Steps**

- Translating the star needs to be done by giving a velocity to the central point mass and envelope only at the start.

- When introducing the secondary, I forgot to give the primary an initial velocity. I will redo run Damp068, giving the primary the appropriate initial velocity.

- At the same time I am working on improving the stability of the RG. I will be experimenting with buffer zones, which Baowei has shown me how to implement. The gain in efficiency should allow me to impose a higher max resolution.

- Must put label "X" on particles…

# Meeting Update --5/15/17

- Disk Space
- received 12TB external hard disks from Erica.
- archiving Planetary Atmosphere data on Bluehive. Will free ~2.9TB of space.
- received several 500GB/1TB hard disks with total size ~5TB for clover from Dave. Will use them for archiving also.
- grassdata/ is mainly occupied by the WT data. So will not change it for now.

# Meeting update 05/09/2017

- 1. Poster Print fee/grant account for Dave
- 2. Updated Bluehive space: **39TB** at $97 per TB per year: blog:bliu04182017. New external disk?
- 3. Wire Turbulence poster and paper conclusions:
  1. The turbulence generated is mainly solenoidal and follows the -5/3 Kolmogorov law for both hydro and MHD velocities.
  2. The driving factor is ~1/3 since solenoidal turbulence dominates, which makes the Mach number > 1 for both hydro and MHD runs.

# Update 5/9

- Wrote multidimensional Euler solver, found here. Some highlights of the tests:

Comparing HLL to HLLC with stationary CD:

Strips in x and y only seem to appear after the wave hits the boundary - something to worry about?

Can't see quite as much - can't get VisIt to make slices of pseudocolor plots. But still looks pretty good.

The remainder of the tests can be found here.

- Anything else to do with code? Haven't implemented RCM, FVS, Roe, Osher, higher-order (WAF, finite volume, etc.) schemes, source terms.

- WASP12:

# Some feedback on 'Orbital evolution in binary systems with giant stars'

Orbital evolution in binary systems with giant stars

Binary torque

Angular momentum loss and mass-exchange instability in binary stars

An Eccentric Circumbinary Accretion Disk and the Detection of Binary Massive Black Holes

Eccentricity

Orbital evolution of mass-transferring eccentric binary systems. I. Phase-dependent evolution

Orbital evolution of mass-transferring eccentric binary systems. II. Secular Evolution

Soker

Forming equatorial rings around dying stars

Sweden

# Update on RGB star for CE sims

Introduction

In the last blog post I was struggling to avoid a cubical ("boxy") and thus unstable star. This boxiness is worse with AMR but is reduced by damping.
However, we cannot keep damping turned on indefinitely. I had found that changing the BCs can make a difference.
It occurred to me that using periodic BCs may avoid this problem. Periodic BCs were used by Ohlmann+17.
I experimented again with different BCs in Sections I and II.

In Section II below I vary the value of the damping time scale according to the prescription of Ohlmann+17, using the free-fall time s as the dynamical timescale , rather than the more conservative sound-crossing time of . Here is ramped up to over , and then left undamped for another . In Section III I try a run that includes AMR and the Ohlmann damping prescription (still running).
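Schematically, the relaxation damps the gas velocity as dv/dt = -v/τ, with τ of order the dynamical time early on, then ramped up (weakening the damping) and finally switched off. The ramp shape and constants in this Python sketch are illustrative, not Ohlmann+17's exact values:

```python
import numpy as np

def tau_damp(t, t_dyn, tau0=None, t_ramp_start=2.0, t_off=5.0):
    """Damping timescale vs time (illustrative ramp): constant at tau0
    until t_ramp_start*t_dyn, then growing exponentially; damping is
    switched off entirely after t_off*t_dyn."""
    if tau0 is None:
        tau0 = t_dyn
    if t < t_ramp_start * t_dyn:
        return tau0
    if t < t_off * t_dyn:
        return tau0 * np.exp((t - t_ramp_start * t_dyn) / t_dyn)
    return np.inf  # no damping

def damp_velocity(v, dt, t, t_dyn):
    """Exact update of dv/dt = -v/tau over one step dt."""
    return v * np.exp(-dt / tau_damp(t, t_dyn))
```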
Results

- For the small-box simulations without AMR of Section II below, the periodic/periodic hydro/Poisson BCs seem to help to keep the star spherical, though they increase the computation time by a factor of a few compared to extrapolating/multipole expansion BCs.

- For AMR and a twice larger box, the BCs seem to matter much less (see comparison of Ib and If below).

- It dawned on me that maybe the ambient pressure is just too high at dyne/cm , and that this leads to boxiness. This value had led to a more stable star than dyne/cm in the small-box, low-resolution, uniform-grid sims. But with AMR we can still resolve the outer scale height if the ambient pressure is .

- So I did some runs with dyne/cm , and the results were encouraging. Not only is the star more spherical, but the computation time is typically reduced by a factor of a few.

- The star is still not perfectly spherical nor perfectly stable for the dyne/cm runs below, but clearly reducing the ambient pressure is the correct thing to do.

Next steps

- Run III(a) is ongoing. In the meantime, it will be worth doing the same run but with Extrapolating/multipole expansion BCs, which should be faster. If the results are similar, I will stick with the Extrapolating/multipole expansion, which are probably more physical than periodic/periodic and reduce the computation time.

- Simultaneously I will try the same run but with the ambient pressure reduced to dyne/cm and dyne/cm to see if further improvements can be made. Clearly the pressure must be as low as possible while still adequately resolving the scale height at the surface.

- Then it would be worth increasing the box size and resolution to be more comparable with those of Ohlmann+17. At this point we would have to make a final choice for the ambient pressure and dynamical time scale.

- After this, if everything looks good, we need to test the stability of the star as it translates along the grid. Finally we can introduce the secondary (point particle).

**I) Damping with AMR**

**a) Reflecting hydro BCs, Multipole expansion Poisson BCs, ** s, ambient dyne/cm (Damp044)

2d density
2d density and velocity

**b) Extrapolating hydro BCs, Multipole expansion Poisson BCs, ** s, ambient dyne/cm (Damp047 27 hrs on comet compute 576 cores)

2d density
2d density and velocity

**c) Extrapolating hydro BCs, Multipole expansion Poisson BCs, ** s, ambient dyne/cm (Damp049)

2d density
2d density and velocity

**d) Extrapolating hydro BCs, Multipole expansion Poisson BCs, ** s, ambient dyne/cm (Damp050 run on comet)

2d density
2d density and velocity

**e) Extrapolating hydro BCs, Multipole expansion Poisson BCs, ** s, ambient dyne/cm (Damp058 run on comet)

2d density
2d density and velocity

**f) Periodic hydro BCs, Periodic Poisson BCs, ** s, ambient dyne/cm (Damp057 22 hrs on stampede normal 512 cores)

2d density
2d density and velocity

**Comparison with (a) on left and (b) on right**

2d density
2d density and velocity

**Comparison with (b) on left and (f) on right**

2d density
2d density and velocity

**II) Damping with evolving tau**

- Damping prescription as in Ohlmann+17, using s (about equal to the freefall time, while the sound-crossing time is about s).

**a) Reflecting hydro BCs, Multipole expansion Poisson BCs, ambient ** dyne/cm (Damp051)

2d density
2d density and velocity

**b) Extrapolating hydro BCs, Multipole expansion Poisson BCs, ambient ** dyne/cm (Damp052 9 hrs on bluehive standard 120 cores)

2d density
2d density and velocity

**c) Extrapolating hydro BCs, Periodic Poisson BCs, ambient ** dyne/cm (Damp053 8 hrs on bluehive standard 120 cores)

2d density
2d density and velocity

**d) Periodic hydro BCs, Periodic Poisson BCs, ambient ** dyne/cm (Damp054 33 hrs on bluehive standard 120 cores)

2d density
2d density and velocity

**e) Extrapolating hydro BCs, Multipole expansion Poisson BCs, ambient ** dyne/cm (Damp055 4 hrs on bluehive standard 120 cores)

2d density
2d density and velocity

**f) Periodic hydro BCs, Periodic Poisson BCs, ambient ** dyne/cm (Damp056 10 hrs on bluehive standard 120 cores)

2d density
2d density and velocity

**III) Damping with AMR and evolving tau**

**a) Periodic hydro BCs, Periodic Poisson BCs, ambient ** dyne/cm (Damp059 bluehive standard 120 cores up to 33 then bluestreak 8192 cores)

2d density
2d density and velocity

**UPDATE, May 9, 2017**

- Run III(a) above is almost but not quite complete now (120 of 150 frames). It is running on bluestreak

**III) Damping with AMR and evolving tau**

**a) Periodic hydro BCs, Periodic Poisson BCs, ambient ** dyne/cm

**(Damp059 bluehive standard 120 cores up to 33 then bluestreak 8192 cores, 2 cpus/task up to ** s )

**(** cm, , 4 levels AMR)

2d density

**b) Extrapolated hydro BCs, Multipole expansion Poisson BCs, ambient ** dyne/cm

**(** cm, , 4 levels AMR)

**(Damp060 stampede normal, about 2 days with 1024 cores, 1 cpu/task for half and then 512 cores, 1 cpu/task for half)**

2d density

**c) Extrapolated hydro BCs, Multipole expansion Poisson BCs, ambient ** dyne/cm

**(** cm, , 5 levels AMR)

**(Damp062 Bluehive standard 120 cores up to frame 7 and then about 2 days on comet compute 864 cores, 2 cpus/task up to frame 150 )**

2d density

**Discussion**

- Ohlmann+17 also tried an adaptive cubic grid and found results that were almost identical to results using their HEALPix grid. But their simulations employed a moving mesh "with an adaptive refinement ensuring similar cell masses."

**Next Steps?**

- Increase box size further to cm ( with 6 levels of AMR or with 5 levels of AMR?).
- Increase max resolution by a factor of two.

**Computing Issues**

- Current runs take about two days on XSEDE/comet (or a little less on stampede) with the max number of cores, plus a few days in the queue, so say 1 week per run. Each frame produces a chombo file of about 20 GB.

- Is it better to use and 5 levels AMR or and 4 levels of AMR?

- Would I gain much by forcing only the center and surface of the star to have max refinement instead of the whole star?

- How does the result depend on choice of machine used, e.g. bluestreak vs stampede?

- When running jobs that need more than the default memory, is it necessary to run with half the cores?

**Upcoming Conferences**

The Physics of Evolved Stars: The Role of Binarity

Nice, France, July 10-13, 2017

Accepted for a poster. Possibility of sharing a talk slot.

The Impact of Binaries on Stellar Evolution

Garching, Germany, July 3-7, 2017

Accepted for a poster.