Meeting Update 09/29/2014 - Eddie
- Did some interesting testing on velocity/momentum smoothing for clumps: ehansen09262014. What do we want to be the "correct" way? Should I remove velocity smoothing in our main development branch?
- Tested the 3-D clump setup (see below)
- Plan on doing 2.5-D pulsed jet paper revisions tonight.
3-D Clump
- single clump (density contrast = 2000)
- no cooling, gamma = 5/3
- ambient/wind and clump in motion (M ~ 10)
- velocity smoothing is still on
- uses exact Riemann solver, linear interpolation, CFL = 0.5, 0.2, 0.5
- effective resolution = 40 cells/rclump
Below is a 2-D slice of the density:
Still playing around with 3-D rendering. It takes a long time on my machine, so I may try using interactive jobs on bluehive from now on.
The above simulation completed 28.5/100 frames in 8 hours on 64 procs on Stampede. The remaining wall time estimates lead to a total run time of about 2 days for the entire 100 frames. On 64 procs, this is about 3000 SUs. Using Z cooling will at least double this to 6000. Adding a 2nd clump will generate more gradients and significantly increase run time (I'm guessing by a factor of at least 10). Best case scenario is that these runs take about 60,000 SUs at this low resolution of 40 cells/rclump.
I should also note that the above numbers are for a simulation time of 32.87 years, which seems reasonable when compared to observational time scales. Pat's HST observations show changes in emission features on even shorter time scales (~10 years). We may be able to improve resolution and run times if we limit our simulation parameters in such a way as to make it easier on the code.
I spoke with Shule, and he said that his shocked clump simulations were at 64 cells/rclump, MHD, and took up to 2 days on 1024 procs on Kraken. So about 48,000 SUs. This seems much better than what I'm getting. It's hard to compare hydro + Z cooling vs. MHD + no cooling, but maybe if I tweak my parameters I can get up to this resolution.
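As a sanity check, the SU arithmetic here is just wall-clock hours times processor count. A minimal sketch (the ~3000 SU figure folds in the expected slowdown at later frames):

```python
# Back-of-the-envelope check of the SU figures (SUs = wall-clock hours
# x processor count). The ~2-day / ~3000 SU estimate assumes the later
# frames run slower than the first 28.5 did.
frames_done, hours_so_far, procs = 28.5, 8.0, 64
linear_hours = 100.0 / frames_done * hours_so_far  # ~28 hr at the current rate
print(linear_hours * procs)                        # ~1800 SUs; ~3000 SUs over 2 full days

# Shule's shocked-clump runs: up to 2 days on 1024 procs on Kraken.
print(2 * 24 * 1024)                               # 49152, i.e. ~48,000 SUs
```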
Meeting Update 09/29/2014
Tasks for this week:
- Particular CDM movies.
- Post Processing for convergence tests on Bluestreak. Encountered the following error on Erica's account:
2014-09-27 05:11:02.120 (FATAL) [0x40001069280] 22137:ibm.runjob.client.Job: could not start job: job failed to start
2014-09-27 05:11:02.126 (FATAL) [0x40001069280] 22137:ibm.runjob.client.Job: Load failed on R00-ID-J06: Application executable ELF header contains invalid value, errno 8 Exec format error
- Reading Shapes and Shaping of Planetary Nebulae (2002) by Bruce and Adam.
- Attempting to visualize these ASCII files in SHAPE.
Clump Behavior: Velocity/Momentum Smoothing
After comparing the last set of simulations with moving vs. stationary clumps, I decided to investigate the role of velocity/momentum smoothing at the clump boundary. I found a small possible error in the code: momentum gets smoothed by two factors of f, where f is a tanh function, because both density and velocity are smoothed by a factor of f. I think it might be better to smooth only density, since smoothing velocity makes the code more dependent on reference frame. I took out the extra velocity smoothing, but I am not sure what effect, if any, this will have.
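To make the double-smoothing concrete, here is a minimal sketch; the tanh profile, smoothing width, and values are illustrative assumptions, not the exact AstroBEAR implementation. Both density and velocity pick up a factor of f, so the momentum carries an f² cross term that depends on the ambient velocity, i.e. on the reference frame:

```python
import numpy as np

# Minimal sketch of the boundary smoothing (tanh profile, width, and
# values are illustrative, not the exact AstroBEAR implementation).
def f(r, r_clump=1.0, width=0.1):
    """Smoothing factor: ~1 inside the clump, ~0 in the ambient wind."""
    return 0.5 * (1.0 - np.tanh((r - r_clump) / width))

r = np.linspace(0.0, 2.0, 201)
rho_amb, rho_cl = 1.0, 500.0   # ambient vs. clump density
v_amb, v_cl = 10.0, 0.0        # wind vs. clump velocity

rho = rho_amb + f(r) * (rho_cl - rho_amb)

# Original scheme: velocity is smoothed too, so the momentum rho*v
# contains an f^2 cross term involving v_amb (frame-dependent).
v_orig = v_amb + f(r) * (v_cl - v_amb)
p_orig = rho * v_orig

# Corrected scheme: smooth density only; the velocity jump stays sharp,
# so the smoothed state transforms consistently under a Galilean boost.
v_corr = np.where(r < 1.0, v_cl, v_amb)
p_corr = rho * v_corr
```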
I looked at the initial frames for a few cases in velocity and momentum. There is the original smoothing, the corrected smoothing, and no smoothing. Below are the images for momentum.
Stationary (orig) | Moving (orig) |
---|---|
Moving (correct) | Moving (no smoothing) |
Below are the movies showing density.
Stationary (orig) | Moving (orig) |
---|---|
movie | movie |
Moving (correct) | Moving (no smoothing) |
movie | movie |
Some strange stuff happens without any momentum smoothing. This is most likely because density is still smoothed, so there is a mismatch between density and momentum.
The corrected smoothing case looks a bit more like the stationary case which is good. However, there is velocity smoothing even in the original stationary model. So I should really do a new stationary run without the extra factor in velocity smoothing, and compare that to the moving clump also without that extra smoothing factor. We want these two simulations to look the same since the code should be Galilean invariant. Below are the final results with corrected smoothing:
Stationary (correct) | Moving (correct) |
---|---|
movie | movie |
So taking out the extra velocity smoothing didn't affect the stationary model as much as it affected the moving model. However, I would still argue that the correction is an improvement. I don't think we should expect the stationary and moving models to be exactly the same, since you will get a different amount of numerical diffusion when you have things moving on the grid differently. Stay tuned for 3D runs.
Candidate movies to show on Collaboratory Wall for the film
These are the candidate movies I've received so far:
- Eddie's high-res 2.5-D MHD jet simulations (the one with all 4 jets evolving in [S II] and Hα) https://astrobear.pas.rochester.edu/trac/blog/ehansen09292013
- Some of Bruce/Kira's simulations of PN lobe evolution
a) 3D: http://www.pas.rochester.edu/~bliu/pnStudy/rhoPN_3d.gif
b) 2D: http://www.pas.rochester.edu/~bliu/pnStudy/2Dclump_bl.gif
data set to bring up directly from VisIt
- Some of Zhuo's simulations of fallback shells and binary evolution
https://astrobear.pas.rochester.edu/trac/wiki/u/zchen/simulations
https://astrobear.pas.rochester.edu/trac/wiki/u/zchen/3Dsimulations
- A rotating version of a SHAPE visualization of one of Bruce/Kira's simulations? https://astrobear.pas.rochester.edu/trac/blog/crl618Figures http://www.pas.rochester.edu/~martinhe/2012/crl/f4.
- Magnetic Tower
- Accretion Disks
http://www.pas.rochester.edu/~martinhe/2012/binary/10lines2.gif
http://www.pas.rochester.edu/~martinhe/2012/binary/20lines2.gif
http://www.pas.rochester.edu/~martinhe/2011/binary/gene-4.gif
http://www.pas.rochester.edu/~martinhe/2011/binary/20mar1144.gif
http://www.pas.rochester.edu/~martinhe/2011/binary/40au-bb5-3d.gif
- Youtube channel:
CF High Resolution Runs Visualized
Open the tab on the following page: https://astrobear.pas.rochester.edu/trac/wiki/u/erica/CFRunStatsHighRes
Run of blast wave across magnetized wire up to late times
Attached - MPEG comparing density plots for runs with:
- Top: 10 T wire surface field
- Bottom: hydro
- The blast wave followed the behavior described in post aliao09082014; the density of the wire was 7.5 g/cc (80% Cu).
- Contours of the magnetic field strength are overlaid.
Key observations from the video:
- Before the rarefaction of the blast wave reaches the wire, i.e. before the peak ram pressure/density hits, the bow shock moves leeward monotonically. Once the ram pressure starts to decrease, the bow shock relaxes windward.
- At early times, i.e. before the shock passes the center of the FOV, the contact discontinuity is visibly deflected by the magnetic field. At high latitudes with respect to the wire, this bowing out should be readily distinguishable from the hydro case, regardless of possible projection or timing complications.
- At late times, the displacement of the nose of the bow shock in the magnetized run vis-a-vis the hydro run re-emerges after a long intermission during which magnetic effects were suppressed by the high sigma. At these late times, the ram pressure has dropped back down to the low-sigma regime.
- At late times, the dynamics of the near-wire region are dominated by the "ablation" of the most windward crescent of the wire. In both runs, the CD is visible between the "ablated" wire material and the wind. The dynamics of the runs are indistinguishable within this inner region.
- The wire temperature is set close to absolute zero, and the "ablation" at late times only spreads cold, non-emitting matter around the inner region. Thus, in both runs, the inner region is not a significant source of emission. I don't trust what the wire is doing at all here.
Image: emissivity of final frame of movie ~100 ns
3D Binary
3D binary star:
The mass loss rate varies from 10⁻⁷ to 10⁻⁵ solar masses per year. The AGB star is located at the origin, and the separation of the two stars is 7.5 AU.
From the side view, we can see Bondi-Hoyle (BH) accretion onto the secondary star during the pulsation of the AGB star.
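For reference (reading "BH" as Bondi-Hoyle), the standard accretion radius for a secondary of mass $M_2$ moving at relative speed $v_{\rm rel}$ through gas with sound speed $c_s$ is

$$ R_{\rm BH} = \frac{2\,G M_2}{v_{\rm rel}^2 + c_s^2}, $$

so the accretion should peak when the pulsation drives dense, slow gas past the secondary.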
I think the tilted disk is gone, since this is a low-temperature simulation and during the evolution we saw a flat plane of high-density gas.
The simulation got stuck due to a "Restart Required" error.
Meeting Update 09/22/2014 -- Baowei
- worked with users from SUNY Oswego and Laurence's student to install AstroBEAR on their machines; ran into issues with the compilers and libraries on their machines.
- configure file (ticket #255): the first version on the development branch works on local machines, and hopefully on most other machines.
1) The problem module is set with the option "--with-module=". The module list will be shown in the README and INSTALL documents. This option is required; an error is reported if no module is given.
2) Checks for the hdf5, fftw3, and hypre libraries. The paths can be set with the options "--with-hdf5=", "--with-fftw3=", and "--with-hypre=". These options are optional; if a library is not found, the script reports an error and provides help information about downloading and installing the library.
3) A new run_dir/ folder is created. If the folder already exists, a backup "run_dir_Currenttime/" is made to avoid erasing previous runs. After compiling, all necessary data files and the executable "astrobear" are copied to the run_dir/ folder, and an out/ subfolder is also created. Will add the pbs and slurm sample scripts to make run_dir/ really ready to go on all machines.
4) pthreads is there but hasn't been tested.
5) Haven't included the IBM xl compilers, OpenMP, etc. yet, but planning to.
- OpenMP optimization (ticket #361): on it…
- Trying to install ParaView on Bluehive: currently getting errors with the qt4 library and VTK.
Meeting Update 09/22/2014
Last week was spent attempting to construct a quick-and-dirty table of the high-res colliding flows runs… however, I ran into a plethora of issues regarding VisIt on BH2. Using x2go is fine and works well. Apparently my account is unable to use public key authentication on some particular nodes. I talked with Jonathan, and this is getting sorted out with CIRC.
Visualizing from my own computer while ssh-ing into a local machine is painstaking; it takes half a day just to get a movie from a .bov. So now I am working on Erica's computer, and it is going quite swimmingly. We also have these simulations with sink particles. There are still some issues regarding particular files, however, that are putting a damper on getting the data out there.
As I've gone about visualizing the runs, I noticed some issues with the files other than data endians. A few are missing or were lost/deleted during our move around.
Last week I accomplished the following:
- Using sed I changed the data endians and variable names.
- Wrote a script to rename the files so they are all consistent and we can efficiently access them with the "smart" grouping in VisIt (see the sketch after this list). Did this for both BS and local sets of the data.
- Finished the NoShear data set (now we have all hist and pdfs — transferred between local and BS).
- Cherry-picking sections of missing data or areas where VisIt doesn't like the files. Still working on this, actually…
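For the record, the renaming script boiled down to something like the following (the glob pattern and target naming scheme are hypothetical stand-ins for the actual chombo names):

```python
import glob
import os
import re

# Hypothetical sketch of the renaming step: normalize frame numbers so
# VisIt's "smart" grouping sees one consistent sequence.
for path in sorted(glob.glob("out/*.hdf")):
    match = re.search(r"(\d+)", os.path.basename(path))
    if match:
        frame = int(match.group(1))
        os.rename(path, os.path.join("out", f"chombo{frame:05d}.hdf"))
```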
Meeting Update 09/22/2014 - Eddie
Clump Behavior
Before moving on to 3-D simulations, I wanted to see how setting the clump in motion affected the clump behavior. I set the clump velocity at 10 km/s and reduced the wind by the same amount to keep the relative velocity the same. I tested both the low density contrast (X = 500) and high density contrast (X = 2000) setups.
There was also a small error in my previous simulations: I forgot to change the ambient temperature to 5000 K as I had stated. The temperature was actually 1000 K, which means that the shock was not Mach 10; it was more like Mach 22. This doesn't change what we found out about the different solvers and whatnot. The new set of runs below uses the correct ambient temperature of 5000 K ==> M = 10 shock.
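Since the wind speed was unchanged, the Mach number scales as 1/√T. A quick check of the Mach-22 figure:

```python
import math

# At fixed wind speed, M ~ 1/sqrt(T) since the sound speed scales as sqrt(T).
M_intended, T_intended, T_actual = 10.0, 5000.0, 1000.0
print(M_intended * math.sqrt(T_intended / T_actual))  # ~22.4, i.e. "more like Mach 22"
```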
| Low Contrast | High Contrast |
---|---|---|
Stationary | movie | movie |
Moving | movie | movie |
These were all done with 2nd order interpolation and the exact Riemann solver, which still seems like the way to go.
Other Stuff
- The 3-D version of the high contrast, moving clump simulation above is currently running. If that looks good, I can think about using a 2nd clump again, and Z cooling as well.
- Need to redo the 2.5-D pulsed jets emission maps at higher resolution for the film thingy on Friday
- Working on 2.5-D pulsed jets paper revisions
Clump Behavior: Alexei's Setup + High Density Contrast
The same setup as before (ehansen09162014), except now the clump density has been increased by a factor of 4 to make the density contrast 2000. This time, I only tested the 3 different Riemann solvers with 2nd order accurate interpolation. Also, the CFL vars have been lowered to 0.5, 0.2, 0.5 for all runs.
Exact | HLL | HLLC |
---|---|---|
movie | movie | movie |
Some interesting facts about run times… These all ran on 64 procs on Stampede.
Solver | Run Time (mins) |
---|---|
Exact | 66.88 |
HLL | 180 (to frame 92.5) |
HLLC | 68.34 |
Some Observations
- HLLC is still very asymmetric
- Exact has become more asymmetric
- HLL is now most symmetric but is also most diffusive and for some reason the code is much slower with HLL
So Exact still looks like the best option even though a high density contrast has led to more asymmetry. Is it possible that we are pushing the accuracy limits of the code with such a high density contrast?
I think the next logical step is to put the contrast back down to 500 and put the clump in motion. Then we will know how clump motion affects the simulation independent of the high density contrast.
Clump Behavior w/ Alexei's Setup
Going back to basics, with a simulation setup that we know should work. Here are the parameters:
nwind = 1000 cm⁻³
Twind = 5000 K
vwind = 74.53 km/s ==> M = 10
nclump = 500,000 cm⁻³ ==> X = 500
Tclump = 10 K (pressure equilibrium)
vclump = 0
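A quick consistency check on these parameters (pressure equilibrium means nT matches across the clump boundary):

```python
# Pressure equilibrium: n*T must match across the clump boundary.
n_wind, T_wind = 1.0e3, 5000.0    # cm^-3, K
n_clump, T_clump = 5.0e5, 10.0    # cm^-3, K
assert n_wind * T_wind == n_clump * T_clump  # both sides: 5e6 K cm^-3
print(n_clump / n_wind)                      # density contrast X = 500
```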
I tested different Riemann solvers and different interpolation orders. I was only able to test the Sweep method, because MUSCL is only implemented for fixed grid and 1 proc.
Also, the HLL linear run had the most trouble: it only got through 61/100 frames in 2 hours on 64 procs. This may have to do with whatever happened around frame 45… it looks like a density protection, maybe. All the other runs got through the full 100 frames, usually in much less than 2 hours on 64 procs.
| Exact | HLL | HLLC |
---|---|---|---|
1st Order (Godunov) | movie | movie | movie |
2nd Order (linear) | movie | movie | movie |
I also did a run with the 'default' solver settings (sweep, HLLC, linear), but with a lower CFL. The original CFL variables were 1, 0.5, 0.5; the run below uses 0.5, 0.2, 0.5.
Some Observations
- All the Godunov runs are more diffusive which is expected.
- Out of all 3 Godunov runs, only the HLLC solver produces asymmetries.
- All of the 2nd order runs become asymmetric at some point with perhaps the Exact solver being most symmetric and HLLC most asymmetric.
- Lowering the CFL did not seem to make a significant difference.
Meeting Update 09/15/2014 - Eddie
- A decent chunk of my time is being spent on diagnosing the strange behavior in my clump simulations.
- Checked AMR vs. fixed, and cooling: ehansen09092014
- Tried 2.5-D: ehansen09112014
- Looked back at Kris' shocked clump paper, and I am working on calculating different time scales. I'm curious to see if there are any instability time scales, or perhaps the cooling time scale, that are much shorter than the clump destruction time scale.
- Ran some simulations to check for convergence in my 2.5D pulsed jet simulations: ehansen09122014. Just need to come up with a better metric to show convergence.
- Implemented tracking for pressure and density protections. See image below where the cells that have triggered protections are marked in black. I will post another blog post later to further explain this new feature.
Meeting Update 9/15/2014
- WindOutflow module: first run and debugging.
- B.C.s can be replicated from the Stone & Proga paper.
- Working on incorporating the non-radial component of the velocity in the code.
Convergence of 2.5D Pulsed Jets
This is related to one of the comments from the reviewer of my 2.5D pulsed jets paper: basically, he wanted to know whether our simulations had converged to a solution. So I did a quick resolution study. From left to right is increasing levels of AMR.
The resolutions are so high in all of these simulations that you can't really see a difference here. In other words, there are more computational cells than there are pixels to display. I'll post an enlarged image of one of the clumps later.
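One candidate metric, for when I get to it (an assumption on my part, not something already in the paper), is to coarsen each higher-resolution run onto the next-lower grid and track the L1 norm of the difference between levels:

```python
import numpy as np

# Block-average the fine solution onto the coarse grid, then take the
# mean absolute difference; the error should shrink as AMR levels are added.
def l1_error(coarse, fine):
    f = fine.shape[0] // coarse.shape[0]      # refinement factor per axis
    binned = fine.reshape(coarse.shape[0], f, coarse.shape[1], f).mean(axis=(1, 3))
    return np.abs(binned - coarse).mean()
```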
UPDATE
Below is the clump located near z = 19 Rj from the above image.
2.5D Single Clump Tests
Since the instabilities at the head of my 2-D clumps are not going away, we decided to try 2.5-D. I ran two cases, one where the clump is still in the center of the grid (like simulating the cross section of a donut), and one where the clump is centered on the z-axis to take advantage of the imposed symmetry (like simulating the cross section of a ball).
I'm beginning to think that these instabilities may be physical. If I'm right then perhaps the reason I see them in my simulations when others have not is due to the large density contrast between clump and wind. Stay tuned as I figure out if there's a reasonable explanation for the instabilities.
Plotting for MHD CF paper
Here are some comparisons of Jonathan's paper figures and the plots that I have:
Density vs Pressure PDFs
Questions:
- How do I choose the bounds?
- How do I add those lines in VisIt?
Density vs. velocity
The plot Jonathan used in the paper was the density-weighted velocity dispersion over time. The plot I have now is a volume-weighted density vs. velocity PDF. It doesn't show the information as cleanly as the line plot does.
Weighted mixing ratios
The plot Jonathan used in the paper was the mass-weighted mixing ratio; what I have is the volume-weighted mixing ratio.
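For reference, the difference between the two is just the weighting in the average. In standard form (not necessarily the exact form of eq. 7):

$$ \langle q \rangle_V = \frac{\sum_i q_i \, \Delta V_i}{\sum_i \Delta V_i}, \qquad \langle q \rangle_M = \frac{\sum_i \rho_i \, q_i \, \Delta V_i}{\sum_i \rho_i \, \Delta V_i} $$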
So here is a collection of plots:
Plot | Status |
---|---|
Column density | have |
Density-weighted joint probability distribution function for density vs. pressure | have |
Kinetic and gravitational energy spectra | — |
Mean kinetic and gravitational energy densities Ē_{k,g} against time, split between large (> D, see eq. 2) and small scales | — |
Mass history against time | have |
Density-weighted velocity dispersion against time | — |
Core mass distribution | have |
Mixing bias (eq. 8) vs. distance of cores to mid-plane, at time t | have |
Mass-weighted mixing ratio (eq. 7) for the gas in the analysis region | — |
Mixing ratio of cores at time t | have |
Strangeness in Clump Simulations
In order to figure out what is going on with these clumps, I am simplifying the problem to 1 moving clump. We will also be looking at a clump that is initialized in a moving ambient.
Here is the run with all of the original parameters. This uses Z cooling and 80 cells per rclump (4 levels of AMR), with Hvisc and lapplydiffusion on. Additionally, interporder = 2, lcharlimiters is on, lrestartonpressureprotections and lrestartondensityprotections are both off, and cfl_vars = 1, 0.5, 0.5.
Test Runs
Here are all the models we talked about yesterday. They all have an effective/actual resolution of 80 cells/rclump.
- Zcool/nocool, AMR/fixed models are self-explanatory
- lower cooling = all densities decreased by factor of 10
- lower contrast = clump density decreased by factor of 50
- increase diffusion = increased DIFF_ALPHA in solver.data from 0.1 to 0.2
Model | Zcool, AMR (orig) | Zcool, fixed | nocool, AMR | nocool, fixed |
---|---|---|---|---|
Image | ||||
Movie | movie | movie | movie | movie |
Model | Zcool, AMR (orig) | lower cooling | lower contrast | increase diffusion |
---|---|---|---|---|
Image | ||||
Movie | movie | movie | movie | movie |
My thoughts:
- Zcool, fixed still shows the weird stuff at the head of the clump. It starts off symmetric but then develops asymmetries, which is odd. Thus, it would appear that the AMR is not seeding any instabilities.
- nocool, AMR/fixed both look better but not perfect. So although much of the problem may be caused by the cooling, I may still need to add some diffusion to get rid of the problem entirely.
- lowering the cooling by lowering the density everywhere helped. I suspect that not resolving the cooling length may directly lead to the instabilities, or it causes pressure protections that lead to the instabilities. One thing to try is to turn on restarts for pressure protections.
- lowering the contrast by lowering the clump density did not help. The clump was destroyed much more easily, so it's hard to draw any useful conclusions from this run.
- increasing the diffusion helped a lot. This run looks nice and smooth. We just have to ask ourselves at what point does adding more diffusion become nonphysical.
UPDATE
Tried turning on restarts for pressure protections. Restarts are triggered almost immediately and often; this will prevent the run from ever finishing, so I had to stop it. I don't see any NaNs, so the protections are presumably caused by very strong cooling that brings the temperature below the mintemp of 1d-10. Perhaps increased resolution could help, or lowering the mintemp and/or raising the floortemp.
They are being triggered at the point (-0.41, 9.12), which is near the lower-left edge of the clump. Maybe there is something with the softening or refinement buffering that might help.
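For context, a minimal sketch of the protection logic as described here (names and control flow are simplified; the actual AstroBEAR source differs):

```python
# Simplified sketch: strong cooling drives T below mintemp, which triggers
# a protection; with lrestartonpressureprotections on, each protection
# forces a restart instead of just flooring the temperature.
def protect(T, mintemp=1e-10, floortemp=1e-10, lrestartonpressureprotections=True):
    if T < mintemp:
        if lrestartonpressureprotections:
            raise RuntimeError("pressure protection -> restart required")
        return max(T, floortemp)  # floor the temperature and continue
    return T
```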
Running magnetized wire problem with Peter Graham's blast wave
Previous runs of the magnetized wire problem used a uniform wind. Since early last month, more realistic blast wave results have become available:
- Rise time of shock density: 10 ns for a 1000-fold exponential increase with time from the initial 0.01 mg/cc.
- Decay time of rarefaction: 100 ns for a 1000-fold exponential decay with time from the peak density of (1).
- Deceleration of the flow: 40 ns for a 5-fold exponential decay with time from the initial 150 km/s* (*this is already less than half of what Peter Graham starts with, >300 km/s).
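In code form, these profiles amount to piecewise exponentials. A sketch of my reading of the numbers above (the actual tabulated blast-wave data will differ in detail):

```python
# Piecewise-exponential blast wave: 1000-fold density rise over 10 ns,
# 1000-fold decay over 100 ns, and a 5-fold velocity decay over 40 ns.
def density(t_ns, rho0=0.01):                    # mg/cc
    if t_ns < 10.0:
        return rho0 * 1000.0 ** (t_ns / 10.0)
    return rho0 * 1000.0 * 1000.0 ** (-(t_ns - 10.0) / 100.0)

def velocity(t_ns, v0=150.0):                    # km/s
    return v0 * 5.0 ** (-t_ns / 40.0)
```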
The simulations with and without B are running fairly slowly, much more so than what I've seen before. By the time the rise phase of the blast wave passes completely by the wire, we will be seeing times much later than what we've seen before.
Below: emissivity maps at 10 ns, separation between wire center and left boundary is 1.25 mm; the density peak of the blast wave has just emerged from the left boundary.
We should be able to see an effect of the magnetic field on the flow long before the bulk of the flow passes by the wire. If we can avoid going through the hump, we can mitigate some of the effects of wire erosion and high sigma due to the large material flux.
Note: the wire density in the simulation is only 1% that of copper, so whatever erosion we see at later times is a gross overestimate.
Update
Will work on Fourier stuff this week.
Working on submitting abstract for poster, here is a rough outline:
Large-scale colliding flows are an interesting model for molecular cloud formation. The model produces clouds that are turbulent and filamentary in nature and that form stars across the whole cloud interface (anything else? lifetimes that match??). Previously, we looked at the hydro case, examining energy budgets, sinks, etc., using the AstroBEAR AMR fluid code. Now we have extended this further, adding the effects of magnetic fields and shear. In addition to performing the analyses as before, we plan to extend the work by studying the filaments formed in our model. How large are they? How many stars do they form, and what is their mass distribution? What is the gravitational stability of the filaments? (Take for example points brought up in the observational papers that studied Sco Cen.) Given the large-scale nature of our simulations (quote the resolution), we form dozens of filaments and are excited to see the distribution of filament properties over the different runs. (Cite the runs?)
Meeting Update 09/08/2014 - Eddie
Not too much to report since last Thursday. I've been really busy setting up the PHY 101 course. The organization for this course has been a complete mess, but all the confusion should clear up this week, which will free up more time for me to get back to research.
Mach Stems
I have a few runs queued up on Stampede to test a moving clump through an ambient. Not sure why they haven't started yet. Perhaps there is something going on with Stampede that I'm unaware of. ehansen09042014
This Z cooling run took about 90 minutes on 256 processors on Stampede (80 cells per clump radius). A no-cooling run took about 50 minutes.
The other test runs will be testing parameters related to diffusion.
Intersecting Shocks
I am going to work on the theory behind shock interactions today. I need to use shock jump conditions to figure out parameters for a post shock region behind two intersecting oblique shocks.
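As a starting point, the standard Rankine-Hugoniot jump conditions applied to the normal component of each oblique shock (a generic sketch, not the full intersecting-shock solution):

```python
import math

# Density and pressure jumps across an oblique shock with upstream Mach
# number M and shock angle beta, using the normal Mach number M*sin(beta).
def jump(M, beta_deg, gamma=5.0 / 3.0):
    Mn = M * math.sin(math.radians(beta_deg))
    rho_ratio = (gamma + 1.0) * Mn**2 / ((gamma - 1.0) * Mn**2 + 2.0)
    p_ratio = (2.0 * gamma * Mn**2 - (gamma - 1.0)) / (gamma + 1.0)
    return rho_ratio, p_ratio

print(jump(10.0, 90.0))  # normal M = 10: density x ~3.88, pressure x ~125
```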
New Outflow Object
I am also testing the new outflow object. The diverging wind option quit with an error that I am working on debugging. I can expand on this topic tomorrow.
Meeting Update 09/08/2014
Over the past few weeks I've spent my time shuffling around data and organizing things to make visualization not… impossible? So now we have all of the data together in their respective directories, shared between bamboodata and grassdata, although one can use Erica's directories on grassdata as "home base" since I made symlinks to the bamboodata directories. So everything appears to be stored with its applicable files on grassdata.
Current Disk Usage:
Disk | Usage |
---|---|
/bamboodata/ | 86% |
/grassdata/ | 89% |
Now we're doing post processing on the data, along with resolution testing. We have updated our wikipage (see: CFRunStatsHighRes).
I am also studying for the physics GRE, and given my study/work schedule I might not show up for group meetings. I'll still post updates.
Running Jobs
- Currently:
- Bluestreak (BS):
- Post processing (PP) on our chombos (reservation for B1S60 and B10S15, 256 nodes). Once those two runs are done, we'll do the other two. Eventually we'll need to throw the NoShear case into the mix (see 2.).
- Convergence testing at AMR level 3 (pending in Standard queue for both shear 60 cases). Once these jobs are done, we'll do the other two if necessary.
- Stampede:
- Running the NoShear case on there. Have done approximately 3 restarts. The current restart began at 187, so this should be the last restart there. After each restart completes and I start the next one, I transfer files back to our local directories. Once we have all the files, I'll transfer them to BS to do PP.
- Future:
- PP on BS of NoShear data.
- PP for B10S30 and B10S60 on BS.
- Convergence Tests for the rest of the data sets.
Transferring Files
- Once all PP on BS is done, transfer everything back to local directories.
- Transferring NoShear to BS and local machines.
Visualization
- Erica submitted the proposal to the Vista Collaboratory Contest this past Friday.
- Visualization on BS using x2go.
Science Meeting Update 09/08/14 -- Baowei
- Ablative RT
- Coarse grid + 1 level AMR: Movie with no Mesh; Movie with Mesh
- Coarse grid + 3 levels AMR: Movie with no Mesh
- Compare with results at the original resolution and 0 levels of AMR in blog:bliu08182014 (still waiting for the growth rate).
- The way to get a coarse grid with AstroBEAR: blog:bliu08282014
Mach stems w/ moving clumps
I'm in the process of setting this up. Here is an example image/movie of what the simulation looks like:
So the functionality is there, but I need to get the spacing and timing right so that we get a satisfactory simulation to look at and analyze.
Update
The above simulation was using Z cooling. The one below is without cooling…