Meeting 4/28 - Ruka
- Mach Stems - all but one run is done. I made a very silly mistake with restarts; everything should be finished within 4 days.
- Finished Chapters 13 & 14 of Toro. I have some ideas for the scope of my final paper - possibly a New User's Guide to Numerics?
- PN poster: received the poster template from Martin; I will begin constructing it to present at the CIRC symposium. What is the timeline?
More solvers!
I spent the majority of my time last week troubleshooting my adaptive Riemann solvers, which are now working (see attached for pretty pictures). The HLL solver is almost finished; I'm optimistic about getting it fully functional in the next day or two. Where to next?
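For reference, the HLL flux I'm finishing replaces the full wave structure with a single averaged state between the fastest left- and right-going signals (Toro Ch. 10). A minimal sketch; the state/flux vectors UL, UR, FL, FR and the speed estimates SL, SR are assumed to come from elsewhere in the code:

```python
def hll_flux(UL, UR, FL, FR, SL, SR):
    """HLL intercell flux (Toro Ch. 10). UL, UR are conserved-state
    vectors (e.g. NumPy arrays), FL, FR their fluxes, and SL, SR the
    left/right signal-speed estimates."""
    if SL >= 0.0:
        return FL                # all waves move right: use the left flux
    if SR <= 0.0:
        return FR                # all waves move left: use the right flux
    # Subsonic case: integral average over the star region
    return (SR * FL - SL * FR + SL * SR * (UR - UL)) / (SR - SL)
```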
On Mach Stems: 1 of 16 runs is complete, 2 of 16 are ~60% finished after one day of runtime and are currently queued to restart, and 13 of 16 still need to be run.
In other news, I just started a Coursera course on high-performance scientific computing taught by Dr. LeVeque at the University of Washington. It focuses on parallel processing and the associated algorithms in Python and Fortran. It may be of interest to others in the group - https://class.coursera.org/scicomp-002 - it looks like it should be a lot of fun!
Meeting 3/24
I'm now helping Eddie run his high-res Mach Stems on BlueStreak. They look like they will take ~30 hours each on 128 nodes, plus queue time.
I've written my approximate Riemann solvers and restructured my code so that it's easier to swap different solvers into the update stage. It currently has these solvers, which approximate the star states:
- Exact
- Primitive Variable
- Two Shock
- Two Rarefaction
as well as options that choose among those routines depending on the state U (see the sketch after this list):
- Exact Solver
- Adaptive Iterative
- Adaptive Non-Iterative.
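For the adaptive non-iterative option, the selection logic I'm following is Toro's (Sec. 9.5): try the cheap PVRS estimate first, and fall back on the two-rarefaction or two-shock estimate when the pressure jump is large. A self-contained sketch; gamma = 1.4 and the switching constant Q_user = 2 are Toro's suggested values, not necessarily what my module uses:

```python
import numpy as np

GAMMA = 1.4  # ideal-gas ratio of specific heats (assumed value)

def p_pvrs(rhoL, uL, pL, aL, rhoR, uR, pR, aR):
    """Primitive Variable RS: linearized star-region pressure (Toro Ch. 9)."""
    return 0.5 * (pL + pR) - 0.125 * (uR - uL) * (rhoL + rhoR) * (aL + aR)

def p_trrs(uL, pL, aL, uR, pR, aR, g=GAMMA):
    """Two-Rarefaction RS: exact when both nonlinear waves are rarefactions."""
    z = (g - 1.0) / (2.0 * g)
    num = aL + aR - 0.5 * (g - 1.0) * (uR - uL)
    den = aL / pL**z + aR / pR**z
    return (num / den) ** (1.0 / z)

def p_tsrs(rhoL, pL, uL, rhoR, pR, uR, p0, g=GAMMA):
    """Two-Shock RS: shock relations evaluated at the guess pressure p0."""
    def gk(rho, pk):
        A = 2.0 / ((g + 1.0) * rho)
        B = (g - 1.0) / (g + 1.0) * pk
        return np.sqrt(A / (p0 + B))
    gL, gR = gk(rhoL, pL), gk(rhoR, pR)
    return (gL * pL + gR * pR - (uR - uL)) / (gL + gR)

def anrs_pstar(rhoL, uL, pL, aL, rhoR, uR, pR, aR, Q_user=2.0):
    """Adaptive Non-Iterative RS (Toro Sec. 9.5): cheapest adequate estimate."""
    p_pv = max(0.0, p_pvrs(rhoL, uL, pL, aL, rhoR, uR, pR, aR))
    p_min, p_max = min(pL, pR), max(pL, pR)
    if p_max / p_min <= Q_user and p_min <= p_pv <= p_max:
        return p_pv                                   # mild jump: PVRS suffices
    if p_pv < p_min:
        return p_trrs(uL, pL, aL, uR, pR, aR)         # two rarefactions
    return p_tsrs(rhoL, pL, uL, rhoR, pR, uR, p_pv)   # at least one shock
```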
Unfortunately, I'm running into some bugs where my code crashes about halfway through the run, which I think may be related to the routine that finds Smax (the largest wave speed) in order to calculate the time step. I'm going to get that ironed out this week and hopefully implement the HLLC solver in the flux-calculation portion of the code as well.
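The time-step logic itself should just reduce to the CFL condition, dt = C * dx / Smax. A minimal sketch of what that routine ought to do (the array names and the gamma = 1.4 ideal-gas sound speed are my assumptions); the guard at the end is also a cheap way to catch a corrupted state before the crash:

```python
import numpy as np

def compute_dt(rho, u, p, dx, gamma=1.4, cfl=0.9):
    """CFL-limited time step, dt = cfl * dx / Smax, where Smax is the
    largest wave speed |u| + a on the grid (a = ideal-gas sound speed)."""
    a = np.sqrt(gamma * p / rho)     # local sound speed in every cell
    Smax = np.max(np.abs(u) + a)     # fastest signal anywhere on the grid
    if not np.isfinite(Smax) or Smax <= 0.0:
        raise ValueError("bad Smax -- likely a corrupted state upstream")
    return cfl * dx / Smax
```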
Fun with Approximate Solvers pt 1
Martin was unable to meet with me last week, but gave me some homework (reading through the current iteration of the module and coming up with questions) that I am working on now.
I spent some quality time with Toro over the last week and burned through Chapter 9 and most of Chapter 10, which cover the following approximate solvers (a sketch of the first one follows the list):
- Primitive Variable Riemann Solver
- Two Rarefaction Riemann Solver
- Two Shock Riemann Solver
- Adaptive Iterative Riemann Solver
- Adaptive Non-iterative Riemann Solver
- HLL Solver
- HLLC Solver
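As an example of how simple some of these are, the primitive-variable solver is just a linearization about the mean state. A minimal sketch, with the usual (rho, u, p) primitives and sound speeds aL, aR as inputs:

```python
def pvrs_star(rhoL, uL, pL, aL, rhoR, uR, pR, aR):
    """Primitive Variable Riemann Solver (Toro Ch. 9): star-region
    pressure and velocity from a linearization about the mean state."""
    rho_bar = 0.5 * (rhoL + rhoR)   # mean density
    a_bar = 0.5 * (aL + aR)         # mean sound speed
    p_star = 0.5 * (pL + pR) + 0.5 * (uL - uR) * rho_bar * a_bar
    u_star = 0.5 * (uL + uR) + 0.5 * (pL - pR) / (rho_bar * a_bar)
    return p_star, u_star
```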
The exact solver that I wrote was entirely self-contained, and I hadn't anticipated that the new solvers I'd be learning about would follow essentially the same procedure (although in retrospect, it seems obvious). Before running the tests in Chapter 9 using the AIRS and ANRS, I want to rework my code to be a little more flexible. The main code will look something like this (sketched after the list):
1) Run the initialize-grid script, which returns the initial state.
2) Update the state; this function chooses which solver to use based on user input or adaptive conditions from a switch.
3) Print data.
4) Loop steps 2 and (optionally) 3 until the final time step.
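A bare-bones sketch of that driver, as I'm imagining it; every routine named here is a placeholder for something I still have to write, so they are passed in as functions:

```python
def run(initialize_grid, update_state, print_data, compute_dt,
        solver="exact", t_final=0.25, output_every=10):
    """Skeleton driver for the framework above. The four routines are
    injected as arguments, since they're exactly the pieces that will
    keep changing as solvers get added (all names are placeholders)."""
    U = initialize_grid()                        # 1) initial state
    t, step = 0.0, 0
    while t < t_final:                           # 4) loop to the final time
        dt = compute_dt(U)                       # stable step from Smax
        U = update_state(U, dt, solver=solver)   # 2) chosen/adaptive solver
        t, step = t + dt, step + 1
        if step % output_every == 0:
            print_data(U, t)                     # 3) periodic output
    print_data(U, t)                             # final state
```

Passing the routines in as arguments keeps the loop ignorant of which solver is active, which is exactly the flexibility I'm after.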
This basic framework should make it easier to continue to add solvers as I learn about them. Incidentally, I now have a better understanding of why AstroBear is set up the way it is! I should have plots and tests up before next meeting.
Question: Can I continue writing in Python, or is it going to become a hindrance when I get farther along in the book? Also, when do I move past 1D?
Meeting 2/24
Continuing to work through Toro, but I also found this article on using stochastic methods to arrive at Riemann invariants. It's particularly interesting to me because I'm also doing an independent study in stochastic calculus; unfortunately, it's still a bit over my head, but I would like to present on it in the future (once I'm a bit further into both books). In the meantime, this is a cool paper on N-body simulations of planetesimals in a binary system.
Update 10/28
With a lot of Baowei's help, I managed to figure out why the spherical wind module was not working for me.
- The hydro runs for it are complete, and the results are on the PN page.
- Updated all of the MHD runs with the new color bars/schemes.
Meeting update 9/23
The Beta = 0.5 runs are complete, and the problem with the toroidal ambient is fixed (thanks, Martin!).
Made a couple of goofs over the weekend:
- The spherical wind finished running, but the wind was a factor of 10 too fast.
- The density plots for Beta = 0.5 did not switch between runs.
I fixed both mistakes, and those runs are going now. The MHD movies should be completed and live by the end of the meeting.
Comparison plots:
- Did not quite work the way I expected them to.
Meeting update 9/16
- Included temperature plots and the mesh in the newest runs; corrected the expression for temperature (see the sketch after this list).
- Found a bug in the hydro runs, so I re-ran them and am now processing the images again.
- Creating a plot of low-res hydro simulations versus higher-res (25 zones/radius vs. 50).
- Re-running the MHD runs with beta = 0.5.
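For the record, the conversion is presumably just the ideal-gas law; a minimal sketch of the standard expression in cgs, where the mean molecular weight mu is an example value of mine, not necessarily what the module uses:

```python
# Ideal-gas temperature from pressure and density (cgs units).
K_B = 1.380649e-16   # Boltzmann constant [erg/K]
M_H = 1.6726e-24     # hydrogen (proton) mass [g]

def temperature(p, rho, mu=1.0):
    """T = mu * m_H * p / (rho * k_B); mu = 1 is an assumed placeholder."""
    return mu * M_H * p / (rho * K_B)
```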
Update 8/19
- The "memory error" bug seems to have been fixed *knock on wood*
- The low-res (~25 cells/radius) grid on the PN page should be fleshed out again by the end of tonight.
- Will queue up the higher-res runs (64 cells/radius) overnight; if they work, we can probably close the ticket I opened.
- Baowei helped me set up accounts on BlueStreak and Stampede
- I probably don't need them anymore…but can't hurt?
Meeting update 8/12
- My runs had been sitting in the queue for about a week, and once I was able to run the high-res MHD, I got lots of memory errors, for which I submitted ticket #306.
- I re-ran all of the MHD cases at high resolution, and they all crash at 64 cells/radius. At 32 cells/radius, the stratified diverging wind case and the constant ambient jet case finish, with the rest hitting the same error. At ~25 cells/radius, they all finish.
Meeting update 7/29
- Attended teleconference with Bruce Balick
- Tried a few different runs to fix the issues with the MHD runs, but many of those problems seem to have stemmed from a bug in the module, which Martin has since fixed.
- Re-running the constant ambient DW model at 64 zones/radius.
Meeting Update July 23
- All of the basic runs for the PN page are done!
- Added two new sections to the planetary nebulae wiki page
- Trying to figure out what could be the cause of the "bubbles" in the MHD runs
- Trying different parameters for the torus case, as its results seem very similar to the constant ambient case.
Meeting Update July 8
My project for the last week has been writing scripts to run, and make movies of, all of the planetary nebulae cases at once.
- Still working on getting the movie-making part to work properly.
- This will be useful when testing how changing one parameter affects all of the runs, and it will make organization a lot easier (see the sketch below).
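A rough sketch of the kind of batch script I mean; the directory names, executable, and frame pattern below are placeholders for illustration, not the actual setup:

```python
import subprocess
from pathlib import Path

# Hypothetical layout: one directory per PN case, each already
# containing its own .data files.
CASES = ["jet_constant", "jet_stratified", "dw_10deg", "dw_20deg"]

for case in CASES:
    rundir = Path(case)
    # Run the simulation for this case (executable name is a placeholder).
    subprocess.run(["./astrobear"], cwd=rundir, check=True)
    # Stitch the output frames into a movie with ffmpeg.
    subprocess.run(["ffmpeg", "-y", "-i", "frames/%04d.png", case + ".mp4"],
                   cwd=rundir, check=True)
```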
MHD is now working; there is a new table for it at the bottom of the PN page.
Meeting Update July 1
- Been working on adding MHD, lots of new things to learn!
- The module compiles, but gives errors when trying to run MHD… it still works with all of the hydro runs.
- Re-ran all of the hydro runs with the most current version of the module.
- VisIt is being buggy. Apparently we don't usually run VisIt on BlueHive? Trying to figure out how to get it to work locally. The error I keep getting:
"The compute engine running on hay.pas.rochester.edu has exited abnormally. Shortly thereafter, the following occurred... VisIt could not find a compute engine to use for the plot on host hay.pas.rochester.edu. VisIt will try to launch a compute engine on that host."
- The Planetary Nebula page looks a little prettier now.
Meeting update 6/24
- Created a wiki page for the Planetary Nebulae Project
- Redoing previous runs (everything not toroidal ambient) with the newest version of the module
- Trying a full grid run, just to see what happens
- Meeting with Martin on Thursday
Meeting Updates 6/17/13
- BlueHive has been incredibly slow over the last week; it seems to have gotten better this afternoon.
- Still working on getting the toroidal ambient to work with jets
- Editing the module so that the diverging wind has an arc for its opening angle instead of being flat
- Experimenting with different boundary conditions, trying full grid runs
Hopefully more pretty pictures next week!
Meeting Update 6/10/13
Got almost all of the low-res runs working (no MHD yet):
- Stratified Jet
- Diverging Wind (10 Degrees)
- Diverging Wind (20 Degrees)
- Stratified Diverging Wind (10 Degrees)
- Stratified Diverging Wind (20 Degrees)
- Toroidal Clump
For next week, we need to get the toroidal ambient working with the jets (currently throwing NaNs) and incorporate MHD into the diverging winds case.
Update 5/28/13
This past week, I've been updating Martin's module to work with the latest revision of the code. So far, we've gotten the following 3D runs:
(1) Jet, constant ambient
(2) Jet, stratified ambient
(3) Clump, constant ambient
(4) Clump, stratified ambient
- The torus ambient does not seem to be running properly at the moment, and needs some more work.
- We want to edit the module to run in 2.5D instead of 3D.
- Meeting with Martin tomorrow to discuss next steps
As far as reading goes, I've read:
- Shaping Bipolar Planetary Nebulae: Effects of Stellar Rotation, Photoionization Heating, and Magnetic Fields (Garcia-Segura et al. 1999)
- Magnetically Driven Winds from Post-Asymptotic Giant Branch Stars: Solutions for High-Speed Winds and Extreme Collimation (Garcia-Segura et al. 2004)
- From Bipolar to Elliptical: Simulating the Morphological Evolution of Planetary Nebulae (Huarte-Espinosa et al. 2012)
- The Formation and Evolution of Wind-Capture Disks in Binary Systems (Huarte-Espinosa et al. 2013)
- Outflows from Evolved Stars: The Rapidly Changing Fingers of CRL 618 (Balick et al., submitted)
- Exploring Model Paradigms for the Cores of Active Pre-Planetary Nebulae (Balick et al., draft)
Also picked up Toro for some supplementary reading this summer.
Meeting Update
Dec 6:
- The Bondi module is now in the test suite.
- Still working on my paper.
Nov 29:
- I've made some cosmetic/usability changes to the Bondi module: cleaned it up, added lots of comments, and moved all relevant variables to problem.data.
- Next step: need to talk to Eddie about adding the module to the test suite.
- Got the first draft of my paper back from Martin with lots of constructive criticism. I am planning on revising it today and tomorrow before submitting another draft. I will focus on this first and then copy over portions to the wiki page.
Nov 22:
- This week I've mostly been working on the term paper; I submitted a first draft to Martin earlier this afternoon. Over break I will edit and revise it, and then add the information to the testing page on the wiki.
Meeting 11/15
It works, it works!
Here are the (semi-)latest runs of the Bondi module:
- Rho Lineout
- Velocity Lineout
- Velocity Pseudocolor
- Rho Pseudocolor
Beyond this, I've made some changes to global.data and physics.data, and the module now also works properly for the numbers Shu gives in Physics of Astrophysics, Vol. II. However, the kink at r ~ 0 still persists in all of the chombos except the very first one.
Meeting Update 11/01/11
I've been working on getting Bondi accretion to run successfully. Last time, the program bounced the gas falling in toward the center back out toward the edges. I tried to solve this problem by forcing the program to reset the momenta and density to constant values inside an inner radius and outside an outer radius. This, in effect, should take gas out of the center and inject gas into the edge of the grid.
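Schematically (not my actual Fortran), the reset amounts to overwriting the variables in two shells every step; a toy 1D sketch, where the array and parameter names are made up for illustration:

```python
import numpy as np

def enforce_bondi_zones(r, rho, mom, r_inner, r_outer,
                        rho_inner, mom_inner, rho_ambient, mom_ambient):
    """Hold density and momentum fixed inside r_inner (so accreted gas is
    removed from the center) and outside r_outer (so fresh ambient gas is
    fed in at the grid edge). All arguments are illustrative placeholders."""
    inner = r < r_inner
    outer = r > r_outer
    rho[inner], mom[inner] = rho_inner, mom_inner
    rho[outer], mom[outer] = rho_ambient, mom_ambient
    return rho, mom
```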
I had some issues this week with getting computational units to work correctly, and I suspect there may still be some related issues in my module, because I get obscenely small values for rho in my chombos. From the most recent test I've run, I obtained these gifs:
Meeting 10/18
Main achievements:
- Better understanding of the code
- Got a non-zero set of chombo files!!
Main issues:
- Do the data make sense?
- Need to talk to Martin about specific parts I still don't understand (e.g., what values pointgravityobj takes care of, why angular momentum was included in Travis's module, rs/cs values)
Progress:
Last week I managed to get the module to compile properly and thought everything was more or less in order. Unfortunately, I kept running into "Nans found in ProblemBeforeStep restart requested". I started fiddling around with the modules, thinking it might have to do with the conditional on lRestart; however, I learned at the testing meeting that that is an entirely different function/feature. Up until then, I had simply been editing Travis's previous module for the Bondi problem, but over the weekend I decided to write it from (almost) scratch, using Travis's module and Eddie's RT Instability module as references. I was careful not to copy/paste anything unless I at least understood what it was doing. I finally got it working earlier today, meaning it compiled, ran without problems, and gave me nonzero chombo files. (Hooray!)
These pictures look right to me:
Questions/Issues for this week:
These images seem less right:
Basically, the simulation starts out doing exactly what I think it will, then it starts looking kind of funny. The goal for this week, then, is to diagnose and fix this, and maybe then run postprocessing (bear2fix?). Also, I need to get my webspace set up to at least host the images (I had to upload to imgur this week), and I need to figure out how to make those nice .gifs in VisIt.
First Blog Post 10/04/11
I was assigned the Bondi accretion problem for my first module. After consulting with Martin extensively, I've been reading Bondi's original paper, as well as:
- Galactic Black Hole - Falcke & Hehl
- High Energy Astrophysics - Melia
- Physics of Astrophysics, Vol. 2 - Shu
My initial issue was looking through the equations and finding the simpler solution for particles that have already crossed the sonic point, rather than the more complicated full solution to the problem. Eventually, the simple solution v = sqrt(2GM/r) made sense: the particles have just enough energy to get back out to infinity, but since they are being accreted, their velocity points in the opposite direction.
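That limit is just energy conservation with zero total energy, (1/2)v^2 = GM/r. A quick sanity check in cgs; the solar-mass accretor and 1 AU radius are example numbers of mine, not from the problem setup:

```python
import numpy as np

G = 6.674e-8       # gravitational constant [cm^3 g^-1 s^-2]
M_SUN = 1.989e33   # solar mass [g]

def v_freefall(r, M=M_SUN):
    """Zero-total-energy infall speed, v = sqrt(2GM/r)."""
    return np.sqrt(2.0 * G * M / r)

# At r = 1 AU from a 1 M_sun point mass: ~4.2e6 cm/s, i.e. ~42 km/s.
print(v_freefall(1.496e13))
```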
This last week has been a tutorial in translating physics into code. I managed to compile astrobear with my problem.f90 file, but got many errors while trying to run the program. I spent a good part of the weekend trying to figure out whether anything in my problem.f90 file was causing these errors, but eventually realized it was because all of my .data files were only in the Problem directory, and not in the astrobear directory as well. After fixing this, that error went away, but now astrobear complains of an error with my global.data and physics.data.
I feel as if once I have successfully run this simulation, future ones will be a lot simpler to do!