Binary simulation
- This is in fact a binary star, but the secondary is 400 AU away from the primary and its mass is only 1e-9 solar mass (Pluto-like), so its impact is negligible.
The primary star uses the Krumholz accretion criterion. I think the rectangular geometry induces the asymmetry, and the accreted gas behaves similarly to the no-accretion case.
Why does Krumholz accretion give a more symmetric simulation?
In the Lagrangian picture (lab frame), the momentum equation for a gas parcel is
dv/dt = -(1/rho) grad(p) - grad(Phi),
with the isothermal equation of state p = rho*kB*T/(mu*mH), where kB is the Boltzmann constant and T is the isothermal temperature.
Therefore:
dv/dt = -(kB*T/(mu*mH)) grad(ln rho) - grad(Phi).
Since the gas is not in Keplerian motion, the pressure gradient balances a fraction of the gravity.
The effective gravity is therefore smaller than in the case without the isothermal assumption. Gas can stay in a higher orbit if there is a pressure gradient, and in the isothermal simulation without accretion this is indeed the case.
Krumholz accretion removes gas and therefore removes pressure. The gas feels much more "gravity" and falls back, until it is finally eaten by the sink particle.
So even if there is some asymmetry, i.e. angular momentum, in this simulation, we cannot see it because the gas is eaten. In the no-accretion case, however, the unwanted "angular momentum" is amplified by pressure.
In all, the simulation becomes less sensitive to angular momentum when there is accretion. That is why Krumholz accretion seems to make the fallback symmetric. It is still not entirely convincing, though.
- Plot of the density isosurface around the sink particle (half of the maximum density). The data for this movie are from the same simulation as the one above.
- The wind density decays exponentially at first. After 1 e-folding time, I shut off the wind (at cycle 150, time = 50). There is still a simulation running on bluehive that does not shut off the wind. The wind is ejected from the primary at the escape velocity from 10 AU.
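For reference, a minimal sketch of the wind ramp-down described above, assuming the decay time scale tau equals the shut-off time (tau = 50 in code units) and a nominal base density rho0; the actual module parameters may differ:

```python
import numpy as np

def wind_density(t, rho0=1.0, tau=50.0):
    """Exponentially decaying wind density, shut off after one e-folding time (t = tau)."""
    if t >= tau:
        return 0.0
    return rho0 * np.exp(-t / tau)
```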
Binary simulation
- 40 AU separation, 1.5 solar mass + 1.0 solar mass. There is no accretion onto either star. I did not assume the primary star has radiation pressure that balances out its gravity in the outflow phase; that is, the primary exerts gravity on the gas throughout the simulation. Outflow (wind) temperature = 2000 K, so the sound speed is roughly 4 km/s; the wind velocity is 8 km/s, therefore Mach = 2, a supersonic flow. The mass loss rate is supposed to be 10e-5 solar mass/yr, but I doubt that because the radius of the primary seems larger than the 10 AU it is supposed to be; in the code, however, its initial setting is indeed 10 AU. The wind duration is 1200 yr, and the whole simulation time is 2400 yr. The simulation box dimensions are 1200 AU x 1200 AU x 600 AU.
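A quick check of the quoted sound speed (an assumption on my part: isothermal sound speed for atomic hydrogen with mean molecular weight mu = 1):

```python
import numpy as np

k_B = 1.380649e-23   # J/K
m_H = 1.6726e-27     # kg

def isothermal_sound_speed(T, mu=1.0):
    """c_s = sqrt(k_B * T / (mu * m_H)), in m/s."""
    return np.sqrt(k_B * T / (mu * m_H))

cs = isothermal_sound_speed(2000.0)   # ~4.1 km/s for T = 2000 K
mach = 8.0e3 / cs                     # ~2 for the 8 km/s wind
```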
Comment: The fallback disk is not so obvious when looking from the side; I think it may be due to the high temperature (2000 K) of the outflow.
z=0 plane view
Side view
Enlarged view.
- single star
Meeting Update 1125
Triggered Star Formation, with Federrath + Mach 3, high res.
Created two particles off-center…
http://www.pas.rochester.edu/~shuleli/tsf_feed/tsf1125.gif
Meeting Update 11/25/2013 - Eddie
- Made some cool looking emission maps for Andrea ehansen11212013
- Fixed a memory leak in Marvin's code, but it did not resolve the error. Couldn't find any other leaks, so maybe it is time for Baowei and/or Jonathan to pick this up?
- I have a test run of my 3D pulsed jet waiting in the queue on bluestreak. I want to see if it can run at least 12 frames without too many restarts before I ask for another reservation.
- I found an error in my mach stem calculations because I was using the wrong angle. I fixed that and changed the simulation setups that I plan on running. ehansen11192013
- Extended the lower boundary on my mach stem simulation. The result was the same. I will try a 2D run next.
Error in CND-Simulation & Quasiperiodic Boundary Conditions
Eddie and I worked on finding the error that causes my code crashes. He found a memory leak in the outflow module, but unfortunately that didn't solve the problem. I was able to install valgrind on one of the clusters that I am using and found out that the code often tries to use an outflow object that has recently been freed by the routine SinkParticleRestore (see more details in ticket 324). So I will now focus my attention on this routine, and I am open to suggestions and clues leading to the capture of the culprit.
Furthermore, I worked on the quasiperiodic boundary conditions; I have already finished the parallelisation. I am now working on the MHD part (see ticket 317); I created a module with a disk and a toroidal magnetic field as a test case. Here you can see the movie. The results show on the left side a simulation with QPBC and on the right side a simulation of the full disk (but I only show one quarter here). The arrows show the direction of the magnetic field and the colors code the magnetic energy density. In the QPBC simulation the field energy gets concentrated at the center and a kind of striped pattern appears at the lower boundary. I think there is still something going wrong, maybe it's the EMF synchronisation.
cool emission maps of lab jets
Just wanted to share these images with everyone, because I think they look cool.
These are emission maps of H-alpha (green) and [S II] (red). They are generated from data that Andrea provided from laboratory jet experiments. They are consistent with the concept that H-alpha marks shock fronts, and [S II] follows behind shock fronts in cooling regions.
edit to mach stem runs
I looked at Kris' paper on mach stems and hysteresis: http://adsabs.harvard.edu/abs/2013HEDP....9..251Y
Here they labeled the critical angle as the angle that the bow shock makes with a vertical line, not the included angle between the bow shocks. So my previous calculations to find the critical separation distance were actually using the wrong angle.
With this correction, the critical separations decreased. Unfortunately, some of them decreased to the point where it would be difficult to simulate. In other words, when the critical separation gets really close to 2 rclump it becomes difficult to put the clumps close together below critical because the clumps would overlap. So I had to change my runs to be 5% above and 5% below critical instead of 10% like I had before. Below is the corrected table with the grid domain and resolution for each of my runs:
Run | Mach Stem? | Domain Size (x, y, z) | Base Grid (mx, my, mz) | Notes
---|---|---|---|---
A | no | 1.243, 2.486, 0.6215 | 6, 12, 3 | lowest physical resolution at 77 cells/rclump
B | yes | 1.125, 2.250, 0.5625 | 6, 12, 3 |
C | no | 1.205, 2.410, 0.6025 | 6, 12, 3 |
D | yes | 1.090, 2.180, 0.545 | 6, 12, 3 |
E | no | 1.168, 2.336, 0.584 | 6, 12, 3 |
F | yes | 1.057, 2.114, 0.5285 | 6, 12, 3 |
G | no | 1.143, 2.286, 0.5715 | 6, 12, 3 |
H | yes | 1.034, 2.068, 0.517 | 6, 12, 3 | highest physical resolution at 92 cells/rclump
According to these numbers, the run that I showed yesterday should not have formed a mach stem. It was at x = 1.242, which is pretty close to what I am now calling run A. Perhaps what we saw was something non-physical due to some boundary effects. I am rerunning that setup with the lower y boundary extended.
Meeting Update 1118
- ART problem: I think Baowei is getting a mismatch between the bottom temperature of his profile and the given bottom heat flux (the bottom temperature should give a flux approximately equal to the q0 given by Riccardo).
- Wrote a routine that reads from Jacques' Argon and SiO2 ionization fraction Tables, and wired it to the resistivity routine. Current features:
- read Table.txt and, from the cell density and temperature, infer the ionization fraction Z by averaging the lower and higher table bounds (this should work better with Jonathan's extrapolation routine, which I have not received yet, by the way).
- look at the tracer field and determine which table to use (whichever material has the higher fraction). It may be worthwhile to do another average here.
- from Z, calculate the desired physical quantity in that cell using the Spitzer formulae (resistivity and heat conduction).
I also modified the heat conduction routine to do the same thing, so that only ionized electrons conduct heat, and only along the B field direction (the anisotropy is determined by the gyroradius). So I think we are more sophisticated than before with the explicit routines, except that they do not subcycle. (A sketch of the table-lookup + Spitzer step is at the end of this post.)
- Got contacted by UIUC comp astro group, read some of their papers (they do GRMHD, I found the magnetized disk simulations particularly interesting).
- TSF paper (short version) should be out to Adam and Eric this week. The isothermal runs got a memory error, see below.
http://www.pas.rochester.edu/~shuleli/TSF.e3864249
- Started high-res magnetized TSF (Krumholz) runs on bluestreak. Until the end of this year, I'll be more dedicated to writing (short letter for the TSF, resistive clumps paper, and also potentially the longer TSF paper) while waiting for these magnetized runs.
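As promised above, here is a minimal sketch of the table-lookup + Spitzer step. The table layout and function names are hypothetical; the actual AstroBEAR routine reads Jacques' Table.txt format and may average or extrapolate differently:

```python
import numpy as np

def ionization_fraction(table, rho, T):
    """Infer Z from a (density, temperature) -> Z table by averaging the
    bracketing table entries, roughly as described in the post."""
    # table: dict with 1D grids 'rho' and 'T' and a 2D array 'Z'
    i = np.searchsorted(table['rho'], rho) - 1
    j = np.searchsorted(table['T'], T) - 1
    return 0.25 * (table['Z'][i, j] + table['Z'][i + 1, j] +
                   table['Z'][i, j + 1] + table['Z'][i + 1, j + 1])

def spitzer_resistivity(Z, T, coulomb_log=10.0):
    """Approximate (parallel) Spitzer resistivity in Ohm*m:
    eta ~ 5.2e-5 * Z * lnLambda / T_e[eV]^(3/2)."""
    T_eV = T / 11604.5
    return 5.2e-5 * Z * coulomb_log / T_eV**1.5
```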
Meeting Update 11/18/2013 - Eddie
Pulsed Jets
I fixed a couple of lines in my pulsed jet problem module, and the code ran better without restarting. I am doing one more test, and if everything looks good, I will be ready to try this again with another reservation.
Mach Stems
I came up with a plan for the mach stem runs: ehansen11122013.
I changed the grid and mesh to make these runs go a bit faster…
- The mach stem should occur in the upper z plane, so I cut off part of the clump in the z-direction.
- I implemented some derefinement inside the clump and below the clump.
With these changes, I was able to complete run B in less than 1.5 days on 8 cores on Grass. When I use more cores on bluestreak, I should be able to complete all the runs in a day. I could also increase the resolution and still get the runs done in a reasonable amount of time.
Below are some density and velocity images and movies for run B:
Meeting update - Nov. 18, 2013
Last week was a short week for me; I left town late Thursday for the weekend to celebrate my grandmother's 70th birthday. I spent the time I had learning about estimating memory usage for simulations (following this page: http://astrobear.pas.rochester.edu/trac/astrobear/wiki/Tutorials/JobSizes). It was basically straightforward in retrospect, but I got a little confused by the wording/explanation on the wiki page, so I spent a bit of time sorting that out. I plan on editing that page to make it clearer. After many emails back and forth, I think this is how it goes:
- The number of allocated variables for MHD vs. hydro simulations is different than I expected. For hydro simulations, you only count the fluid variables you need for the simulation (e.g. in 2D this would be rho, px, py, E: 4 variables). For MHD, there are *always* 8 variables in the q-array (cell-centered quantities), no matter the dimension of the problem: rho, px, py, pz, E, Bx, By, Bz. In addition to these 8 variables, you also count the variables in the aux arrays; these include the EMFs on cell edges and the B fields at cell faces. There are different arrays for EMF parents/children and the accumulated EMF over the steps. Together in 2D these give another 5 'auxiliary' variables: Bx, By, Ez, Ez_child, Ez_parent. In 3D there are 12: Bx, By, Bz, Ex, Ey, Ez, Ex_child, Ey_child, Ez_child, Ex_parent, Ey_parent, Ez_parent. What about 1D? And should we put that on the wiki page too?
- GB is a misnomer for RAM, where the unit is actually the GiB in *most* usages of the word: 1 GiB = 1024^3 bytes. So when I was doing my memory calculations using GB = 1e9 bytes, I was off from the values quoted on the wiki page. I didn't know RAM is actually in units of GiB, even though the CIRC pages quote them in GB. This should be explained on the wiki page for memory estimates as well. (A quick estimate script is sketched below.)
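A minimal sketch of the kind of estimate described above, assuming 8-byte (double precision) reals and counting only the q and aux arrays per the counts listed; ghost zones, tree overhead, and extra copies per step are ignored, so treat this as a lower bound:

```python
def memory_estimate_gib(ncells, ndim=3, mhd=True):
    """Very rough memory estimate (GiB) from the variable counts above."""
    if mhd:
        nq = 8                          # rho, px, py, pz, E, Bx, By, Bz (always 8 for MHD)
        naux = 5 if ndim == 2 else 12   # face-centered B plus edge EMFs (+ child/parent copies)
    else:
        nq = 2 + ndim                   # rho, momenta, E
        naux = 0
    bytes_per_real = 8                  # assuming double precision
    return ncells * (nq + naux) * bytes_per_real / 1024**3

print(memory_estimate_gib(512**3))      # e.g. a 512^3 MHD run: ~20 GiB for q + aux alone
```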
Also just a tip — we need 9 credit hours to be full-time graduate students now — not the usual 12, according to the department website and verified by the graduate student coordinator.
Spherical winds with toroidal B fields
A modification of:
http://adsabs.harvard.edu/abs/1996ApJ...469L.127R (Rozyczka and Franco '96)
http://adsabs.harvard.edu/abs/2000ApJ...544..336G (Garcia-Segura & Lopez 2000)
Our field is: Btoro = (Rstar/x)^2 * (x/Rstar - 1) * sigma. In contrast to the above authors, ours is not zero at the pole; I did not find the correct morphologies otherwise.
log(particles/cc) in colors. log(Btoroidal) contours.
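A minimal sketch of the profile as written above (sigma is the field-strength normalization; the names are illustrative only):

```python
import numpy as np

def b_toroidal(x, Rstar=1.0, sigma=1.0):
    """Toroidal field profile quoted above: (Rstar/x)^2 * (x/Rstar - 1) * sigma.
    It vanishes at x = Rstar and peaks at x = 2*Rstar."""
    return (Rstar / x)**2 * (x / Rstar - 1.0) * sigma

x = np.linspace(1.0, 10.0, 200)   # radius in units of Rstar
B = b_toroidal(x)
```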
Circumnuclear Disk
Here we see a simulation of the circumnuclear disk with a total time of about 1e6 years. I updated some of the parameters: I decreased the disk mass by one order of magnitude (it is now about 4e4 solar masses), increased the velocity of the outflow from 10 km/s to 700 km/s (the speed of sound of the ambient medium is 100 km/s), and changed the outflow density so that the mass outflow rate is about 4e-3 M_sun/year.
At the beginning the edge-on view shows some Kelvin-Helmholtz-type instabilities, but after some time the inner region of the disk seems to reach a more or less settled state.
Quasiperiodic Boundary Conditions
This clump that is moving around is a test of the quasiperiodic boundary conditions with a refinement level of 3, but still without magnetic fields and without parallelisation.
Binary simulation
- Separation = 40 AU, 1.5 solar mass primary + 1 solar mass secondary. The initial wind velocity is 1 c.u. = 4 km/s (below the escape velocity), the wind density is 2.64e-5 c.u., the wind is emitted from 10 AU, and the mass loss rate is 10e-5 solar mass per year. Neither star accretes.
The result is very complicated. It seems the gas is either on the verge of falling back, or the wave pattern will continue out to a very large distance.
I doubted that the unbinding effect wins at this separation, so I ran another simulation.
This is the side view of the same simulation.
- Separation = 400 AU, everything else unchanged.
Instead of the gas falling back directly, there is an acceleration region around the primary star which is not due to the unbinding effect of the secondary, which is very strange. The gas then decelerates very quickly in a small region, but I do not see a significant fallback process.
Neither star is accreting gas.
Journal Club
Here is the paper I mentioned was so cool yesterday —
Mach Stem Runs
Following work done by Kris (see Mach Stems), I redid some of my old calculations.
The formulas Kris and I used in our calculations were for bow shocks from jets, but they should be applicable to bow shocks formed by a wind and clump interaction.
The theory gives a critical angle based entirely on gamma. Mach stems should form at angles equal to and greater than the critical angle. This does not depend on how the bow shock is formed.

gamma | critical angle (deg)
---|---
5/3 | 37.4
1.4 | 41.7
1.2 | 47.7
1.1 | 53.5
The setup I am doing right now is for two stationary clumps. Their symmetric bow shocks will interact exactly in between the clumps and either form or not form a mach stem. The angle formed between the bow shocks decreases with increasing separation distance. So if the critical angle is a minimum for mach stem formation, then the critical separation distance is a maximum.
Based on the formula for the shape of a bow shock, you can get a critical separation distance for each critical angle. Note that the separation distance is measured from clump center to clump center. These are given in units of clump radii…
gamma | critical angle (deg) | critical separation (rclump)
---|---|---
5/3 | 37.4 | 2.76
1.4 | 41.7 | 2.64
1.2 | 47.7 | 2.52
1.1 | 53.5 | 2.45
Since I am simulating one of the bow shocks with a reflecting wall, my domain has to be half the size of this critical separation. I think it will suffice to do two runs for each gamma: one above critical and one below. Here are my proposed runs:
Run | gamma | critical separation (rclump) | simulation separation (rclump) | Mach Stem?
---|---|---|---|---
A | 5/3 | 2.76 | 3.036 | no
B | 5/3 | 2.76 | 2.484 | yes
C | 1.4 | 2.64 | 2.904 | no
D | 1.4 | 2.64 | 2.376 | yes
E | 1.2 | 2.52 | 2.772 | no
F | 1.2 | 2.52 | 2.268 | yes
G | 1.1 | 2.45 | 2.695 | no
H | 1.1 | 2.45 | 2.205 | yes
If I use 4 levels of AMR and aim for at least 64 cells per clump radius, I can use these resolutions (a quick check of the effective resolution is sketched after the table):
Run | Mach Stem? | Domain Size (x, y, z) | Base Grid (mx, my, mz) | Notes
---|---|---|---|---
A | no | 1.518, 3.036, 0.759 | 8, 16, 4 |
B | yes | 1.242, 2.484, 0.621 | 6, 12, 3 |
C | no | 1.452, 2.904, 0.726 | 6, 12, 3 | lowest physical resolution at 66 cells/rclump
D | yes | 1.188, 2.376, 0.594 | 6, 12, 3 |
E | no | 1.386, 2.772, 0.693 | 6, 12, 3 |
F | yes | 1.134, 2.268, 0.567 | 6, 12, 3 |
G | no | 1.3475, 2.695, 0.67375 | 6, 12, 3 |
H | yes | 1.1025, 2.205, 0.55125 | 6, 12, 3 | highest physical resolution at 87 cells/rclump
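A minimal sketch of how the cells-per-clump-radius numbers in the table can be checked, assuming the x domain is covered by a base grid of mx cells and the clump radius is the code length unit (so cells/rclump = (mx/Lx) * 2^levels):

```python
def cells_per_rclump(mx, Lx, levels=4, rclump=1.0):
    """Finest-level cells per clump radius for a base grid of mx cells spanning Lx."""
    dx_fine = (Lx / mx) / 2**levels
    return rclump / dx_fine

print(cells_per_rclump(6, 1.452))    # run C: ~66 cells/rclump (lowest)
print(cells_per_rclump(6, 1.1025))   # run H: ~87 cells/rclump (highest)
```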
Meeting Update 11/11/2013 - Eddie
I did not get much work done in the past 4 days or so because I was busy packing, moving, unpacking, etc.
Pulsed Jets
The restart that I ran on bluestreak did not give me any more frames. I need to check the output to see if anything looks odd.
Mach Stems
Redid the previous run, but this time with density gradient refinement and no position-based refinement. Only got 80 frames out of 100 before our systems got rebooted last week. I still want to go through some of my old notes, redo some calculations, and come up with a grid of problems to run (different gammas and different separations).
Emission Maps
I am finished with the coding aspect of this side project for Andrea. I wrote a module that takes a table of x, y, z, n, T and initializes a grid. The gas is assumed to be purely hydrogen, and it uses the equilibrium ionization fraction at the given temperatures. It outputs the initial chombo and also the corresponding emission projections as a bov file, and then stops.
All that is left to do is run it and give the data and some images to Andrea.
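For reference, a minimal sketch of one way to get an equilibrium ionization fraction for pure hydrogen from n and T. This uses the Saha/LTE equation purely as an illustrative assumption; the actual AstroBEAR routine may use a different equilibrium (e.g. collisional ionization):

```python
import numpy as np

k_B = 1.380649e-16      # erg/K
m_e = 9.109e-28         # g
h   = 6.626e-27         # erg s
chi = 13.6 * 1.602e-12  # H ionization energy, erg

def saha_ionization_fraction(n, T):
    """Solve x^2/(1-x) = S(T)/n for the H ionization fraction x (n in cm^-3, T in K)."""
    S = (2.0 * np.pi * m_e * k_B * T / h**2)**1.5 * np.exp(-chi / (k_B * T))
    a = S / n
    return 0.5 * (-a + np.sqrt(a * a + 4.0 * a))   # positive root of x^2 + a*x - a = 0

ne = 1e4 * saha_ionization_fraction(1e4, 1.5e4)    # electron density for n = 1e4 cm^-3, T = 15000 K
```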
Other
An astronomy senior at RIT (Dave Anderson) is looking at UR for grad school. He is interested in astrophysics and is curious about our group. If there are no objections, I would like to invite him to next week's group meeting.
I am giving a colloquium at Geneseo on the 21st, so I need to start planning that. I suppose it should just be a general talk about what our research group does. Adam, you have done these kinds of talks before; any suggestions?
Binary simulation
- 40 AU separation, 1.5 solar mass primary, 1 solar mass secondary, wind emitted from 10 AU with a velocity of 1 km/s, which is far below the escape velocity. The mass loss rate is 10E-4 solar mass per year. Something like a CE or fallback disk forms, and there is a numerical shock. The secondary is actually unbinding the envelope.
- 40 AU separation, 1.5 solar mass primary, 1 solar mass secondary, wind emitted from 10 AU with a velocity of 5 km/s, also below the escape velocity. The mass loss rate is 10E-2 solar mass per year, which is rather fast. A steady disk forms around the secondary. We can see the boundary of the primary impose some non-physical effects and induce inconsistent behaviour.
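A quick check of the escape velocity quoted above (an assumption on my part: escape speed from the 1.5 solar mass primary alone at the 10 AU launch radius, ignoring the secondary):

```python
import numpy as np

G     = 6.674e-11   # m^3 kg^-1 s^-2
M_sun = 1.989e30    # kg
AU    = 1.496e11    # m

v_esc = np.sqrt(2 * G * 1.5 * M_sun / (10 * AU))   # ~16 km/s
print(v_esc / 1e3)  # so the 1 km/s and 5 km/s winds are indeed far below escape velocity
```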
Meeting update
This past week I ran 2D simulations of the colliding flows (hydro; see previous posts), began working on understanding how the module adds clumps to the flow, and am working through the literature on colliding flows. I found a very interesting paper on this problem in 2D, which I will post in my library. I tried moving to Bluehive for the runs, but am having a bit of trouble there. I will check into this later today.
Meeting Update 11/11/2013 -- Baowei
- Users
- new users: from the Instituto de Astrofisica de Andalucia (formation and X-ray emission from planetary nebulae and Wolf-Rayet nebulae) and from the Institute of Astronomy and Astrophysics in Taiwan (code comparison).
- Resources
- alfalfa for Zhuo?
- the intel fortran compiler on the local workstations was not working properly last Friday due to software updates, but it is fixed now
- Worked on
- ticket #309: fixed a new bug (http://astrobear.pas.rochester.edu/trac/astrobear/ticket/309#comment:27), reran the tests, and obtained hydro test results very close to Reale's paper with the average particle mass set to half the solar abundance (http://astrobear.pas.rochester.edu/trac/astrobear/ticket/309#comment:28). The new code with the Ablative RT module (my data files) produced NaNs in hypre.
- new user & local users
- Will attend Supercomputing Conference (SC13 Denver, CO) with Jonathan next week.
Comparison Runs
I spent today playing with visualizing these runs in VisIt and adjusting some of the parameters in the data files. Here is a plot summarizing some of my findings -
Focusing on the left column of the plot, we see the field restricts the lateral dispersal of the collision region away from the colliding streams. This seems to have 3 important effects. 1) In the hydro run, the clumps fragment and become localized structures moving in the flow, whereas in the MHD run a long filamentary structure forms. 2) The 'clumping' of material in the MHD run is enhanced, producing higher densities than in the Hydro run. A sink particle forms in the MHD run by frame 134, but does not form (at least by the final frame 140) in the hydro run. 3) The collision region seems bound in the MHD run, but not in the hydro run.
The right column shows the following: without cooling, we see the collision region expands (quickly) — increasing density leads to increased pressure, which forces the evacuation of the gas. In the hydro case, we see little localized clumps of lower density material form within the collision region. These structures do not form in the MHD runs, but rather we see again longer filamentary structures form there.
The boundary conditions are reflecting on the left and right, and extrapolating on the top and bottom. The behavior in the hydro, non-cooling case makes sense with these BCs, but I am not entirely sure what is going on in the MHD non-cooling case, where there is some weird flow coming back into the grid on the top and bottom.
High Res 2D MHD Colliding Flow run
This is a 64 + 3 levels 2D run with reflecting, B-parallel BCs on x1 and x2 (extrapolating on y1 and y2). I am seeing the cells reach higher densities in this higher-res run than in the lower-res run, and a single sink particle by about 16 Myr (compared to tsink ~ 20 Myr in the 3D smooth hydro runs).
http://www.pas.rochester.edu/~erica/CFrho_reflectingBParallelBCs.gif
I am still NOT seeing striations with these boundary conditions, even at higher resolution here in 2D.
For comparison, here is the same movie at the original lower resolution. Note that no sink particles form here, and the max density is ~500 -
http://www.pas.rochester.edu/~erica/CFrho_reflectingBParallelBCs_lowres.gif
2D MHD Colliding Flows and different Boundary Condition Effects
2D Mods to global.data
A 2D version of the colliding flows problem seems to be working. I just modified 3 lines in the global.data file:
ndim = 2 !before was 3
…
mx = 64, 64, 1 !before was 64, 64, 64
…
xbounds = -25d0, -25d0, 0d0, 25d0, 25d0, 0.78125d0 !this has domain running from 0 to dx in z direction
Results
This reduces simulation time from ~hr. to ~min. While gravity in 2D might be wonky, this allows us to quickly debug the striations.
I checked whether the striations at the boundary still exist in 2D with extrapolating BCs. They do -
http://www.pas.rochester.edu/~erica/2DCF_rho_extrapBCs.gif
I next tried to change the boundary on the left and right sides of the box (x1 and x2). I find that different boundary conditions produce wildly different results in the box,
- Using the BCs "reflecting, B-parallel" reduces or nearly completely removes the striations. These BCs force the magnetic field to remain normal at the wall.
- Using "reflecting, wall" seems to create near vacuum states in the colliding flows. The lower densities seem to be greatly reducing the time step, making these simulations run quite long (~>hr). I don't see a change in the max speed in the standard out however, which I am curious about. This is clearly not right. I am not sure why the boundary condition is leading to this behavior inside the flow.
- Using "periodic" boundary conditions leads to very strange behavior as well. It seems to make sense that you'd expect this boundary condition and reflecting to produce the same behavior, but they do not. Again, I am not sure why the boundary is effecting the flow in this way.
Here is a movie of the density for the different runs (the simulation is 2D in the x-y plane) - http://www.pas.rochester.edu/~erica/all4BCs_CFrho.gif
I will make note of these effects, and when I can look more deeply at the code's prescription for boundary conditions I will check my understanding. For now, I am running the 3D version of the code with the reflecting, B-parallel BCs to check that they still look good.
Directory Locations
Build directory is @: /grassdata/erica/CollidingFlows/scrambler_3.0
Runs are @: /grassdata/erica/CollidingFlows/CollidingFlows/2D/MHD
Binary simulation
I don't quite understand the meaning of t1, t2, and radiusw in the binary module. Does radiusw refer to where the wind is emitted, in units of base cells? That is, if the simulation box is 200 AU and I have a 256-cell base grid, then each cell is 0.8 AU, and if radiusw = 12 the wind is emitted at roughly 10 AU?
Meeting Update 1104
http://www.pas.rochester.edu/~shuleli/tsf_feed/fediso.gif
density movie with gamma = 1.000001.
Meeting Update 11/04/2013 - Eddie
- 3D pulsed jet with beta = 1 is running on my reservation on bluestreak. We will see how far it runs by Wednesday morning.
- I am running mach stem simulations with density gradient refinement. This will take longer than previous runs, but it may be fine.
- Working on creating synthetic emission maps for Andrea. I need to figure out how to use the emission routines with ASCII data outside of astrobear. Another issue is that Andrea only has data for x, y, z, n, and T. The emission routines require ne, x, and T. Perhaps I can use some assumptions and approximations to at least produce something that is qualitatively relevant.
Meeting Update -- 11/4/13 -- Erica
Working on readings and colliding flows simulations. I first ran a quick and dirty simulation with MHD on, and extrapolating BCs to reproduce the striations Jonathan talked about earlier.
MHD COLLIDING FLOWS:
Here are some movies,
normal slice (y-z plane is normal to flow direction) of the density and Bx-
http://www.pas.rochester.edu/~erica/normal_slice_BandRho.gif
parallel slice (slice plane parallel to the flow direction) of the density and Bx-
http://www.pas.rochester.edu/~erica/parallel_slice_BandRho.gif
here is positive vx on a log scale, both types of slice -
http://www.pas.rochester.edu/~erica/sliced_vx.gif
The large shock feature that blows backward off of the collision region jumps out at me. The sound speed is initially Cs = 117 everywhere in the 0th frame, and dt = 0.01, so the farthest a sound wave should have been able to travel is ~1 by the 1st frame. What is causing this blow off? Actually, I think this estimate is wrong, because I didn't take B into account in the signal speed. . .
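For reference, the relevant signal speed with a magnetic field is the fast magnetosonic speed, v_fast = sqrt(Cs^2 + vA^2) for propagation perpendicular to B, with the Alfven speed vA = B/sqrt(4*pi*rho) in cgs; so with a strong field the front can legitimately outrun the purely hydrodynamic estimate above.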
here is a normal slice of pressure -
http://www.pas.rochester.edu/~erica/normal_slice_press.gif
HYDRO COLLIDING FLOWS:
normal slice of rho-
http://www.pas.rochester.edu/~erica/normal_slice_rho.gif
vx -
http://www.pas.rochester.edu/~erica/normal_slice_vx_hydroCF.gif
press-
http://www.pas.rochester.edu/~erica/normal_slice_press_hydroCF.gif
to see that the striations are absent in the hydro set-up, see this parallel slice of vx -
http://www.pas.rochester.edu/~erica/parallel_slice_vx_hydroCF.gif
For the hydro case, dt = 0.8, scalegrav = 0.02, Cs = 90. The Jeans refinement criterion is lambdaJ = 4dx; solving for rho gives rho = (Cs/(4dx))^2 * (pi/G) ~ 100,000 (an increase of more than 5 orders of magnitude) before a sink can form…
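A quick check of that threshold, assuming dx here is the base-grid cell size (50/64 in code units) and that scalegrav = 0.02 plays the role of G in code units (both assumptions on my part):

```python
import numpy as np

Cs = 90.0          # code-unit sound speed quoted above
dx = 50.0 / 64     # base-grid cell size in code units (assumed)
G  = 0.02          # scalegrav, taken as G in code units (assumed)

rho_jeans = (Cs / (4 * dx))**2 * (np.pi / G)   # ~1.3e5, i.e. ~100,000
print(rho_jeans)
```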
Meeting Update 11/04/2013 -- Baowei
- Tickets
- new: #311(Implement energy & momentum conserving self gravity), #312(Array bound mismatch in ProlongateCellCenteredData), #313(NaNs in Info%q when creating displaced disk), #314(Link broken), #315(Bugs in Binary), #316(Bugs in Binary), #317(Quasi Periodic boundaries in a quarter plane), #318(Array bound mismatch in SyncHydroFlux), #319(Invalid pointer assignment in ObjectListRemove), #320(Usage of uninitialized variable levels(0)%gmbc(1) in CreateAmbient), #321(Implementing simple line transfer in AstroBEAR)
- closed: #315 duplicate
- Resources
- New machine to replace Alethea (for Joe)?
- Worked on
- ticket #316 (Joe's jobs on bluehive) and bugs found by Marvin with the gfortran compiler (#312, #313, #318) — all our local machines & Teragrid use ifort as the fortran compiler, which is more tolerant of array bounds issues. I tried running the test suites with gfortran on alfalfa and found more tiny bugs and a fatal compile-time error with HDF5_gfortran. Still working on it.
- ticket #309 (Conduction Front Test with hydro)
Simulation of the circumnuclear disk
This is a simulation of the circumnuclear disk with the same parameters as last week; I only extended the final time by a factor of 2.5 (and worked on a different cluster that uses gcc instead of intel compilers).
So the first 40% of this simulation should look the same as last week's simulation, but as you can see, there is a kind of wind/outflow from the bottom of the disk. The reason is that the initial conditions are not set properly, as you can see in the second picture, where a contour plot of the disk at t=0 shows a kind of patchy structure in one of the quarters.
I'm currently trying to find out what is causing this behaviour.