Progress in Fusor Plasma Simulations

User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Progress in Fusor Plasma Simulations

Post by Liam David »

Progress update since I've been quiet lately.

I held off on machining a cube-like fusor in the hopes that simulations would give insights into electron/ion behavior and general plasma dynamics. Much of my effort these past several months has been spent writing a fully collisional, GPU-accelerated, particle-in-cell plasma simulation from scratch. I'm happy to say that after much debugging, the code is almost complete. I'll go much more in depth in the future when everything's done, but here's a very basic outline of how the code works:
  • Import CAD model, generate initial electric field, prepare cross-section data
  • Initialize all arrays for particle position, velocity, etc... included species: D+, D2+, D3+, D2, D, e
  • Move relevant data to the GPU
  • In a loop:
    • Find collisions of particles with background gas using previous timestep's particle velocities
    • Delete particles that reacted, add products
    • Move particles using the Boris algorithm (sketched just after this list; will be adding magnetic fields soon)
    • Delete particles that impact the cathode and chamber
    • Add secondary electrons from ions that impact the cathode
    • Find particle densities, charge densities on the simulation grid
    • Compute space charge and new electric field
    • Compute the fusion rate using energy-binned particle densities
    • Repeat indefinitely, or until some criterion is met
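For anyone curious about the push step mentioned above, here is a minimal, self-contained sketch of the textbook Boris algorithm in CUDA C, written as a host-callable function so it runs without any GPU plumbing. It is not my actual kernel; the deuteron test in main() with its uniform 0.1 T field and 1 ns timestep is just a placeholder to show that the rotation step conserves speed.

// boris.cu -- minimal sketch of the textbook Boris particle push.
// Compile: nvcc boris.cu -o boris
#include <cstdio>
#include <cmath>

struct Vec3 { double x, y, z; };

__host__ __device__ Vec3 cross(Vec3 a, Vec3 b) {
    Vec3 c = { a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x };
    return c;
}

// One Boris step: rotate the velocity around B between two half-kicks from E.
__host__ __device__ void borisPush(Vec3 &x, Vec3 &v, Vec3 E, Vec3 B,
                                   double q, double m, double dt) {
    double h = 0.5 * q * dt / m;
    // First half electric kick: v- = v + (q dt / 2m) E
    Vec3 vm = { v.x + h*E.x, v.y + h*E.y, v.z + h*E.z };
    // Magnetic rotation: t = (q dt / 2m) B, s = 2t / (1 + |t|^2)
    Vec3 t  = { h*B.x, h*B.y, h*B.z };
    double t2 = t.x*t.x + t.y*t.y + t.z*t.z;
    Vec3 vp = cross(vm, t);
    vp = { vm.x + vp.x, vm.y + vp.y, vm.z + vp.z };          // v' = v- + v- x t
    Vec3 s  = { 2.0*t.x/(1.0+t2), 2.0*t.y/(1.0+t2), 2.0*t.z/(1.0+t2) };
    Vec3 w  = cross(vp, s);
    Vec3 vpl = { vm.x + w.x, vm.y + w.y, vm.z + w.z };       // v+ = v- + v' x s
    // Second half electric kick, then drift
    v = { vpl.x + h*E.x, vpl.y + h*E.y, vpl.z + h*E.z };
    x = { x.x + v.x*dt, x.y + v.y*dt, x.z + v.z*dt };
}

int main() {
    // Placeholder test: a deuteron gyrating in a uniform B field (E = 0).
    const double q = 1.602e-19, m = 3.344e-27;   // deuteron charge and mass
    Vec3 x = {0, 0, 0}, v = {1e5, 0, 0};         // 100 km/s initial speed
    Vec3 E = {0, 0, 0}, B = {0, 0, 0.1};         // 0.1 T axial field
    double dt = 1e-9;
    for (int n = 0; n < 1000; ++n) borisPush(x, v, E, B, q, m, dt);
    // The Boris rotation conserves speed to machine precision.
    printf("speed after 1000 steps: %.6e m/s\n",
           sqrt(v.x*v.x + v.y*v.y + v.z*v.z));
    return 0;
}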
Porting parts of this code to the GPU has resulted in a massive speedup over a standard CPU implementation - on the order of 100-1000x, depending on the operation. It's very necessary since the electron timestep is 2e-12s and I simulate several microseconds...

The results are in good agreement with other simulations and measurements. Central ion densities are on the order of 1e15/m^3 and computed fusion rates are ~1e6/s - 1e7/s. As expected, the vast majority of fusion reactions are beam-background and fast neutral-background, with the total contribution of beam-beam being a negligible ~0.0001% of the total. Further crushing hopes of recirculation in standard fusors: in the 10-50 mtorr range, the mean free path of D+, D2+, and D3+ ions is abysmal, at a few cm.

Fields in my current fusor.


Included atomic/molecular processes.


(Unsmoothed) cross-sections for all included processes.


At 10mtorr deuterium. Discontinuities due to limits of cross-section data.


D2+ ion density near center of fusor.


Simulation of discharge between parallel plates, showing reasonable agreement with the Paschen curve, albeit slightly shifted. I've been using this as an indicator of the physical accuracy of my code. The most recent revision is even better.
User avatar
Nicolas Krause
Posts: 207
Joined: Fri Sep 30, 2016 7:36 pm
Real name: Nicolas Krause
Location: Canada
Contact:

Re: Progress in Fusor Plasma Simulations

Post by Nicolas Krause »

Wow! Fantastic work Liam! Is the simulation 2D or 3D? For the first diagram of the fields in the fusor, what package did you use to plot the vector lines? It looks great!
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

It's 2D but represents 3D cylindrical geometry, revolved 180° around the horizontal midline. The electric field is plotted using the Matlab streamline function, nothing fancy. Everything is written in Matlab (including visualizations) and CUDA C.
User avatar
Richard Hull
Moderator
Posts: 14140
Joined: Fri Jun 15, 2001 9:44 am
Real name: Richard Hull

Re: Progress in Fusor Plasma Simulations

Post by Richard Hull »

Curious: did the calcs take into consideration the central grid geometry? I wonder if the math would get very complex with a 6-inch spherical chamber and a 1.5-inch wire geodesic grid with 12 or 16 openings (12 to 16 streamlines, like my avatar of fusor III's early imaging).

Richard Hull
Attachments
13pt.starSPC.jpg
Progress may have been a good thing once, but it just went on too long. - Yogi Berra
Fusion is the energy of the future....and it always will be
Retired now...Doing only what I want and not what I should...every day is a saturday.
User avatar
Nicolas Krause
Posts: 207
Joined: Fri Sep 30, 2016 7:36 pm
Real name: Nicolas Krause
Location: Canada
Contact:

Re: Progress in Fusor Plasma Simulations

Post by Nicolas Krause »

Richard, I believe if you look at the diagrams Liam has posted, he's modelling a cylindrical grid of the type that's been used in a number of the cube fusors. As for your question about different grid geometries, the best reference I'm aware of is Matthew Lilley's work on ion orbits in the fusor. He investigated a 2D simulation of the standard fusor-type grid you're asking about. I don't know that the math is any more complicated than what Liam is doing (which is quite complicated), but a relationship between the stable orbit of a particle and its initial distance from the grid was demonstrated in the work I linked to.
User avatar
Nicolas Krause
Posts: 207
Joined: Fri Sep 30, 2016 7:36 pm
Real name: Nicolas Krause
Location: Canada
Contact:

Re: Progress in Fusor Plasma Simulations

Post by Nicolas Krause »

How many particles can the simulation run effectively, Liam? Tens of thousands? Hundreds of thousands? Millions?
User avatar
Nathan Marshall
Posts: 53
Joined: Wed May 08, 2019 8:13 pm
Real name: Nathan Marshall

Re: Progress in Fusor Plasma Simulations

Post by Nathan Marshall »

Fantastic work, Liam! I am impressed. Great job getting things ported to GPU. This is definitely one of those "pleasingly parallel" problems where GPU is the obvious choice. How long does a microsecond-scale simulation like this take, and what GPU are you running this on?
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

Nicolas is correct, I'm modeling a cylindrical geometry. I'm not doing any math that's explicitly geometry-dependent--the simulation tracks individual particles that accelerate via the electric field, which is calculated numerically on a mesh from boundary conditions. Changing the geometry is as simple as changing the boundary conditions, which can be done manually or with a CAD model. While it's been a long time since I ran a simulation with a more traditional fusor configuration, back then I got star mode to form quite easily. My compute time is currently saturated with hundreds of Paschen curve validation simulations, but perhaps when that's done I'll redo those older sims...

Each particle represents 250,000 real particles at the moment. I can simulate as many particles as the GPU VRAM will hold, and I'm also limited by the GPU kernel timeout of 2s imposed by the OS. It'll run tens of thousands very quickly and millions with no issue. Each timestep takes ~1-10ms depending on several factors. If I turn off atomic/molecular processes, space charge, and all the more involved computations, I can get <<0.1ms per step due to the massive parallelization. A microsecond-long simulation takes on the order of 10-20 minutes. Currently I'm running this on a laptop with an i7-6700HQ and a GTX 960M, but I'm about to upgrade to an R9 5950X and RTX 3080 desktop, which should allow me to run many simulations in parallel and in much less time.
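For a rough sense of where the 10-20 minutes comes from (assuming a typical per-step cost of ~1-2 ms out of the 1-10 ms range above):

\[ \frac{1\ \mu\text{s}}{2\times10^{-12}\ \text{s/step}} = 5\times10^{5}\ \text{steps}, \qquad 5\times10^{5}\ \text{steps}\times(1\text{-}2\ \text{ms/step}) \approx 8\text{-}17\ \text{min}. \]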

I've looked at some of Matt Lilley's work, including that poster, but I haven't found his code anywhere. I'm not convinced that analytic approaches are sufficient due to the immense complexity of the dynamics. Space charge, collisions, etc. are very much nonnegligible, and given the abysmal mean free path of D2+ and D+ at standard fusor pressures, I don't think the mechanism he suggests is the full story. But it's interesting work and I think I understand his analysis of the ODE.
User avatar
Joe Gayo
Posts: 395
Joined: Sun Jan 06, 2019 9:34 pm
Real name: Joe Gayo
Location: USA

Re: Progress in Fusor Plasma Simulations

Post by Joe Gayo »

Nice work Liam.

Do you have a plot of the plasma potential vs. linear distance through the cylindrical axis of the cathode?
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

Here's a plot of the potential along the centerline. Conditions are -50kV, 50mtorr, and 1.1us. The particle densities and potentials are still rising at this point, and I'm hitting my hardware limitations for this laptop. Please ignore the disparate plot formatting...



1D plasma potential


2D plasma potential


D2+ density



I've avoided reporting it thus far, but two effects that several people have observed in their fusors are present in this last image, namely the off-axis beams and shiny ring bisecting the inside of the cathode.
Mine: viewtopic.php?f=18&t=13542&start=30
Jim: viewtopic.php?f=18&t=13077&start=70
Jon: viewtopic.php?f=6&t=12954&start=20#p85463

The high pressure allows for a self-sustaining discharge off-axis, and the field simply focuses the ion beam to a point. In my fusor, I see this fade and disappear as the pressure drops and voltage rises. The ring is simply the plane of zero axial electric field, so any slower ions that cannot "miss" the cathode will converge towards this point. The electric field plot in the first image of this thread makes this pretty apparent.

I think I also have a basis for the color bands seen on the ends of the cathodes, but I'll hold off until I'm more sure of the results.

The challenge I'm currently facing is getting the chamber to break down at the voltages and pressures I measure. While I get very close to the theoretical parallel-plate Paschen curve, the fusor is proving trickier. Experimentally I can easily get plasma at ~10mtorr (gauge corrected for deuterium) and some 50kV, but I require ~50mtorr in the simulation. I think there's some process I'm leaving out... probably not thermionic emission, especially for a "cold start", but perhaps cold cathode emission? I ran the numbers using the equations found in the Wikipedia article, but got a negligible current density... I do include secondary electron emission in the simulation.
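For reference, the parallel-plate curve I validate against is the standard Paschen form, with A and B gas-dependent constants and gamma_se the secondary electron emission coefficient:

\[ V_b = \frac{B\,pd}{\ln(A\,pd) - \ln\!\left[\ln\!\left(1 + \dfrac{1}{\gamma_{se}}\right)\right]} \]

so the simulated breakdown pressure hinges heavily on how secondary emission is modeled.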
User avatar
Nicolas Krause
Posts: 207
Joined: Fri Sep 30, 2016 7:36 pm
Real name: Nicolas Krause
Location: Canada
Contact:

Re: Progress in Fusor Plasma Simulations

Post by Nicolas Krause »

I'd contacted Matthew Lilley a few years back asking if I could see his code. He let me know he doesn't have it anymore. I believe it was written in Fortran, and he indicated it was a poster for a conference; I don't know if he was trying to get funding for a project or look better on the job market, but it's over a decade old at this point. I agree that the dynamics are very complex. I guess the reason I'm interested in the nonlinear methods he was attempting is twofold. Firstly, it seems a good way to try to overcome some of the limitations amateurs like us have; buying a supercomputer to simulate some ungodly number of particles is a bit out of reach. Secondly, I think trying something different allows you to make different discoveries. I understand that PIC simulations are the standard for plasma modelling, and with very good reason, but I like the nonlinear math, and instead of copying what everyone else is doing it's at least a different tack.
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

I'm not trying to downplay a more analytic approach - quite the opposite. I think it complements simulations and can give some great insights, as Lilley's work demonstrates. It'd be interesting to see if I/we could replicate some of his results, and how well that lines up with his ODE... we could solve it numerically after all. To word what I meant earlier more carefully, it's important to recognize idealizations where they exist, and much work on fusors over the years has shown that they don't work quite like we thought, as in recirculation, beam-beam reactions, and all that. I'm quite the math-y guy and really do enjoy PDEs and the like....
User avatar
Richard Hull
Moderator
Posts: 14140
Joined: Fri Jun 15, 2001 9:44 am
Real name: Richard Hull

Re: Progress in Fusor Plasma Simulations

Post by Richard Hull »

Agreed, any path to discovery and learning more about what we are doing to increase the yield is for the best, regardless of how it is arrived at, be it via simulation or in hardware. I have long believed there is no one explanation for how all fusion is done in the simple fusor.

Richard Hull
Progress may have been a good thing once, but it just went on too long. - Yogi Berra
Fusion is the energy of the future....and it always will be
Retired now...Doing only what I want and not what I should...every day is a saturday.
User avatar
Nicolas Krause
Posts: 207
Joined: Fri Sep 30, 2016 7:36 pm
Real name: Nicolas Krause
Location: Canada
Contact:

Re: Progress in Fusor Plasma Simulations

Post by Nicolas Krause »

Ah right, I see what you're saying Liam, and yes I'd definitely agree with that. The whole art of simulating is picking the right simplifications so that the model captures the desired behaviour. I think the reason I'm interested in Lilley's work is that Richard talks a lot about how he has to condition his fusor to get good fusion numbers. The leading theory is that this conditioning comes from wall loading of the device with deuterium, and the wall is about the furthest starting point for an orbit one could imagine. So the mechanism in my head is this: a stable orbit leads to more chances to fuse, so increasing the number of particles with stable orbits increases fusion, and the best way to increase stable orbits is by starting the particles from their furthest possible point, which is what wall loading does. That's the logical chain of thought I have regarding Lilley's work; no clue if people here would agree or disagree with it, but it seems plausible to me.
User avatar
Richard Hull
Moderator
Posts: 14140
Joined: Fri Jun 15, 2001 9:44 am
Real name: Richard Hull

Re: Progress in Fusor Plasma Simulations

Post by Richard Hull »

I have always believed that the walls in 6" and larger fusors, given the long MFP, become charged with neutral D molecules/atoms over time. Bombarding electrons at near full energy, and even some high-energy neutrals, pop embedded D out (once loading is sufficient) as deuterons that undergo full acceleration towards the grid. Due to the MFP few make it, and fewer still circulate. Fusion occurs in velocity space within the spherical fusion reactor. This was pointed out by empirical experiment way back in 2004 by U of W. In 1999 Robert Hirsch told me he believed our fusion took place in velocity space and not in a thermal environment. Velocity space includes inside the grid, but the grid represents near zero volume in the sphere relative to the overall volume of the vessel.

Spheres offer only an increased 360-degree volume of velocity space over a well done BOT type design. There are many other rare processes: fast neutral/wall D fusions, neutral/neutral, neutral/deuteron; the list goes on and on. This is not to boost the sphere as a fusor, for I have already denigrated it for its well-proven lower neutron and fusion production compared to other designs already extant and in use here. I work it more for aesthetics than any other reason now.

Richard Hull
Progress may have been a good thing once, but it just went on too long. - Yogi Berra
Fusion is the energy of the future....and it always will be
Retired now...Doing only what I want and not what I should...every day is a saturday.
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

Just a quick update as I work on a new and improved (read: much faster and with fewer bugs) iteration of the code. Some eye candy with ion optics:

https://youtu.be/XyJW5LpbuBE
User avatar
Nicolas Krause
Posts: 207
Joined: Fri Sep 30, 2016 7:36 pm
Real name: Nicolas Krause
Location: Canada
Contact:

Re: Progress in Fusor Plasma Simulations

Post by Nicolas Krause »

Lovely visualization Liam, are those level surfaces of ions in the video?
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

It's the density of D2+ ions on the domain.
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

I've rewritten much of the code and have fundamentally changed the data storage architecture to reduce memory operations, resulting in about an order of magnitude speedup especially at large particle counts. The precision has also been reduced from double to single.

I have also focused on maximizing the performance of collisionless particle tracking, meaning simulations of several million particles evolving for several milliseconds are possible with an overnight run (with a timestep of 40ps). A common performance metric for PIC codes is particle push/boundary check operations per second, and mine achieves ~10^10/s on an RTX 3080, or very roughly 600 GFLOPS. It's a memory-limited application, so the raw FLOP performance does not come close to the maximum of about 30 TFLOPS.
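Making the rough FLOP estimate explicit (the ~60 FLOP per push-plus-boundary-check figure is an assumption, but it is about the cost of a Boris push and is what the numbers above imply):

\[ 10^{10}\ \tfrac{\text{push+check}}{\text{s}} \times \sim\!60\ \tfrac{\text{FLOP}}{\text{push+check}} \approx 6\times10^{11}\ \tfrac{\text{FLOP}}{\text{s}} = 600\ \text{GFLOPS} \ll 30\ \text{TFLOPS peak}. \]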

Other changes include:
  • Implementing a 5-point stencil (instead of a 3-point) for the Laplacian in the electric field solver, and making it fully applicable to cylindrical coordinates by including the (1/r)*d(phi)/dr term (see the sketch after this list). It now accurately produces the logarithmic potentials found in cylindrical geometries.
  • Adding magnetic fields, which is accomplished by providing a magnetization density vector field M and solving a Poisson equation much like the electrostatic case. Some preliminary simulations indicate that a moderate axial magnetic field can enhance ion lifetime, as well as the phase space of stable and quasi-stable orbits. The Boris algorithm is used to push the particles.
  • Accounting for collision angle in the fusion rate calculation (work in progress).
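To illustrate the first bullet, here is a small standalone sketch (not my actual solver) of Jacobi relaxation with the 5-point cylindrical stencil, including the (1/r) d(phi)/dr term, on a coaxial test problem where the exact answer is the logarithmic potential. The 64x32 grid, the radii, and the -50 kV value are placeholders chosen only so the example runs and can be checked.

// laplace_cyl.cu -- Jacobi relaxation of Laplace's equation in (r,z) using the
// 5-point stencil with the (1/r) d(phi)/dr term. Test problem: coaxial
// cylinders (inner radius at V0, outer at 0), periodic in z, so the exact
// solution is V0*ln(rout/r)/ln(rout/rin).
// Compile: nvcc laplace_cyl.cu -o laplace_cyl
#include <cuda_runtime.h>
#include <cstdio>
#include <cmath>
#include <vector>
#include <utility>

#define NR 64
#define NZ 32

__global__ void jacobiStep(const double *phi, double *phiNew,
                           double rin, double dr, double dz) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;   // radial index
    int j = blockIdx.y * blockDim.y + threadIdx.y;   // axial index
    if (i < 1 || i > NR - 2 || j >= NZ) return;      // Dirichlet walls untouched
    double r  = rin + i * dr;
    double cp = 1.0/(dr*dr) + 1.0/(2.0*r*dr);        // coefficient of phi(i+1,j)
    double cm = 1.0/(dr*dr) - 1.0/(2.0*r*dr);        // coefficient of phi(i-1,j)
    double cz = 1.0/(dz*dz);
    int jp = (j + 1) % NZ, jm = (j + NZ - 1) % NZ;   // periodic in z
    phiNew[i*NZ + j] = (cp*phi[(i+1)*NZ + j] + cm*phi[(i-1)*NZ + j]
                      + cz*(phi[i*NZ + jp] + phi[i*NZ + jm]))
                      / (2.0/(dr*dr) + 2.0/(dz*dz));
}

int main() {
    const double rin = 0.01, rout = 0.10, V0 = -50e3;  // placeholder geometry/voltage
    const double dr = (rout - rin)/(NR - 1), dz = 0.002;
    std::vector<double> h(NR*NZ, 0.0);
    for (int j = 0; j < NZ; ++j) h[0*NZ + j] = V0;     // inner conductor at V0, outer stays 0

    double *dA, *dB;
    cudaMalloc(&dA, NR*NZ*sizeof(double));
    cudaMalloc(&dB, NR*NZ*sizeof(double));
    cudaMemcpy(dA, h.data(), NR*NZ*sizeof(double), cudaMemcpyHostToDevice);
    cudaMemcpy(dB, h.data(), NR*NZ*sizeof(double), cudaMemcpyHostToDevice);

    dim3 block(16, 16), grid((NR + 15)/16, (NZ + 15)/16);
    for (int it = 0; it < 20000; ++it) {
        jacobiStep<<<grid, block>>>(dA, dB, rin, dr, dz);
        std::swap(dA, dB);                             // ping-pong buffers
    }
    cudaMemcpy(h.data(), dA, NR*NZ*sizeof(double), cudaMemcpyDeviceToHost);

    int i = NR/2;
    double r = rin + i*dr;
    printf("r = %.3f m: numeric %.1f V, exact %.1f V\n",
           r, h[i*NZ], V0*log(rout/r)/log(rout/rin));
    cudaFree(dA); cudaFree(dB);
    return 0;
}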
5-point Laplacian

Magnetic field lines superimposed on electric potential
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

As I hinted at before, the simulation reproduces the color bands seen on cylindrical cathodes, as well as the off-axis beams and circumferential ring bisecting the inside. Due to all the code changes, I can now simulate steady state, which constitutes ~8e6 total particles. I'm also getting hints of what may be ion acoustic waves (or transit time resonance) at ~13 MHz... I'm slowly building confidence in the simulation, although the calculated locations of fusion reactions are not where expected, as in not within the cathode due to anomalous electron trapping. More results to come.

High density beamline obscured to enhance contrast
User avatar
Javier Lopez
Posts: 141
Joined: Wed Feb 22, 2006 3:32 am
Real name: Javier L

Re: Progress in Fusor Plasma Simulations

Post by Javier Lopez »

Can you set external magnetic fields in your simulations?
User avatar
Dennis P Brown
Posts: 2709
Joined: Sun May 20, 2012 10:46 am
Real name: Dennis P Brown
Location: Glen Arm, MD

Re: Progress in Fusor Plasma Simulations

Post by Dennis P Brown »

With all those posts and you still not using your full name, either you are intentionally breaking the rules or you feel you are above them. I'd suggest you follow the rules here if you want to continue posting.
User avatar
Richard Hull
Moderator
Posts: 14140
Joined: Fri Jun 15, 2001 9:44 am
Real name: Richard Hull

Re: Progress in Fusor Plasma Simulations

Post by Richard Hull »

All of the posts by "fusion" will be deleted if he doesn't change his user name along with any responses to his posts.

Richard Hull
Progress may have been a good thing once, but it just went on too long. - Yogi Berra
Fusion is the energy of the future....and it always will be
Retired now...Doing only what I want and not what I should...every day is a saturday.
Frank Sanns
Site Admin
Posts: 1981
Joined: Fri Jun 14, 2002 2:26 pm
Real name: Frank Sanns

Re: Progress in Fusor Plasma Simulations

Post by Frank Sanns »

Negative. Do not delete Javier's posts.

He has been on here for over a decade but has just come back after the rule changes.

Give him a chance to respond. I have sent him an email but I do not want to change his login name as it will prevent him access.

Under no circumstances should we be nuking somebody with over 100 posts. It is not good form and it will screw up over 100 threads.
Achiever's madness; when enough is still not enough. ---FS
User avatar
Richard Hull
Moderator
Posts: 14140
Joined: Fri Jun 15, 2001 9:44 am
Real name: Richard Hull

Re: Progress in Fusor Plasma Simulations

Post by Richard Hull »

Can we go back and un-pink all those posts? I do not know how to do it.

Richard Hull
Progress may have been a good thing once, but it just went on too long. - Yogi Berra
Fusion is the energy of the future....and it always will be
Retired now...Doing only what I want and not what I should...every day is a saturday.
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

I'm rewriting much of the plasma simulation to improve its accuracy and speed, and one of the modules that most needs updating is the electromagnetic (EM) field solver.

The previous EM solver wasn't a true EM solver. That is to say, it solved the electrostatic and magnetostatic Poisson equations using relaxation and/or finite difference methods, depending on the situation.

poissonE.png
poissonM.png
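For anyone without the images, these are the standard electrostatic and magnetostatic forms:

\[ \nabla^2\varphi = -\frac{\rho}{\epsilon_0}, \qquad \nabla^2\mathbf{A} = -\mu_0\mathbf{J}. \]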

These methods completely decouple the electric and magnetic fields, and give accurate solutions only when the fields' and sources' rates of change are very slow (i.e. quasi-static). In addition, relaxation is ridiculously slow, and even using sparse matrices to solve an Ax=b finite difference scheme in >2D is extremely memory-intensive (many TB of RAM).

The new 3D solver uses the finite difference time domain (FDTD) method, which fully couples the electric and magnetic fields, any charges and currents present, and all boundary conditions in a relativistic, self-consistent manner. There are still drawbacks, such as non-uniform propagation along diagonals when using a Cartesian grid (visible in the videos) and the need to store previous iterations of the fields, but it is still much better than the other methods. Technically, it solves the electric potential and magnetic vector potential equations, converting them into the E and B fields in a post-processing step.

potentialEquations.png

All that aside, here are some videos of a test simulation with the new EM solver. The domain is a conducting box at 0 V with two conducting hemispheres at +100 kV and -100 kV. All other boundaries are open. The potentials are applied instantly, simulating a step function response. Note the interference and reflection of the waves. The timestep is 2e-13 s and the domain is 256^3 cells spanning (84 mm)^3. Steady-state (i.e. the Poisson equation solution) is approached after some time. Since energy is lost only through the open boundaries, reaching steady-state takes a long time. Videos are of the YZ plane.
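As a sanity check on the timestep, the usual 3D Courant (CFL) limit for an explicit scheme like this is

\[ \Delta x = \frac{84\ \text{mm}}{256} \approx 0.33\ \text{mm}, \qquad \Delta t \le \frac{\Delta x}{c\sqrt{3}} \approx \frac{3.3\times10^{-4}\ \text{m}}{(3\times10^{8}\ \text{m/s})(1.73)} \approx 6.3\times10^{-13}\ \text{s}, \]

so the 2e-13 s timestep is comfortably within the stable range.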

E Field: https://youtu.be/8IOmu2IlU3Y
Potential: https://youtu.be/erYUmWrbwc0

domain.png
Efield.png
User avatar
Nathan Marshall
Posts: 53
Joined: Wed May 08, 2019 8:13 pm
Real name: Nathan Marshall

Re: Progress in Fusor Plasma Simulations

Post by Nathan Marshall »

Very nice, Liam! I am curious how you implemented the boundary conditions. I have played around with 2D FDTD wave equations but always had issues when trying to implement open boundary conditions.
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

Yeah, the open boundary conditions are the difficult part. I'm using equation 3.9 in the attached paper for each boundary, representing a wave traveling out of the boundary. I haven't yet completely verified my implementation, so I can't say much about its usefulness, but it seems to work alright. For the conductor boundaries, I just enforce the applied potentials which leads to the 100% reflections.
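In 1D, the idea boils down to a one-way-wave (Mur-type) update of the boundary node from its interior neighbor. Here is a little standalone test of that, my paraphrase of the approach rather than a verbatim copy of the paper's equation 3.9: a Gaussian pulse splits, hits both boundaries, and mostly leaves the domain.

// mur1d.cu -- 1D scalar wave equation with first-order one-way-wave (Mur-type)
// open boundaries. A Gaussian pulse splits, travels to both walls, and should
// exit with only a small residual. Host-side code for simplicity; compile with nvcc.
#include <cstdio>
#include <cmath>
#include <vector>

int main() {
    const int    N  = 400;
    const double c  = 3.0e8, dx = 1.0e-3;
    const double dt = 0.9 * dx / c;               // CFL-limited timestep
    const double C  = c * dt / dx;
    const double k  = (c*dt - dx) / (c*dt + dx);  // one-way-wave coefficient
    std::vector<double> uOld(N), u(N), uNew(N);

    // Initial condition: stationary Gaussian pulse in the middle of the domain
    for (int i = 0; i < N; ++i) {
        double x = (i - N/2) * dx;
        u[i] = uOld[i] = exp(-x*x / (2.0 * 25.0 * dx * dx));
    }
    double u0 = 0.0;
    for (int i = 0; i < N; ++i) u0 = fmax(u0, fabs(u[i]));

    for (int n = 0; n < 2000; ++n) {
        // Standard leapfrog interior update
        for (int i = 1; i < N - 1; ++i)
            uNew[i] = 2.0*u[i] - uOld[i] + C*C*(u[i+1] - 2.0*u[i] + u[i-1]);
        // First-order open boundaries: advance each wall node from its neighbor
        uNew[0]   = u[1]   + k * (uNew[1]   - u[0]);
        uNew[N-1] = u[N-2] + k * (uNew[N-2] - u[N-1]);
        uOld = u; u = uNew;
    }

    double res = 0.0;
    for (int i = 0; i < N; ++i) res = fmax(res, fabs(u[i]));
    printf("residual amplitude after pulse exits: %.2e (initial %.2e)\n", res, u0);
    return 0;
}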

The other thing I have not implemented is the Lorenz gauge condition, which is required to get a physical solution since the potential wave equations are derived under its assumption. Since phi and A are decoupled, if I initialize some phi, it will never produce a B-field unless I initialize a valid A. It also calculates a non-physical E-field since it depends on A as well. Depending on how difficult the Lorenz gauge is to implement, I may backtrack to the coupled phi and A equations.
Attachments
ln_fdtd_1d.pdf
User avatar
Nathan Marshall
Posts: 53
Joined: Wed May 08, 2019 8:13 pm
Real name: Nathan Marshall

Re: Progress in Fusor Plasma Simulations

Post by Nathan Marshall »

Wow, I implemented the same boundary condition into my wave equation solver and it worked like a charm! I was overthinking the problem and trying to implement perfectly matched layers from research literature which was too complicated. I didn't think that simply using a 1D open boundary condition would work so well but should have tried that first. I do notice some minor reflections on close inspection since these boundary conditions are not perfect in 2D, but it works well enough for my applications! Thanks for the suggestion.

Quick clip of single slit diffraction from my FDTD solver: https://youtube.com/shorts/kdysUlUlRHI?feature=share
Attachments
Screenshot from 2022-08-30 20-44-51.png
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

Great to hear that it works for you too, and nice simulation! I also get minor reflections in 3D... something to work on but it's not a priority. I've got more significant errors elsewhere. I was also looking at perfectly matched layers, but like you say they're much more complicated and they also artificially increase the domain size.
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

There are still some hiccups in the FDTD solver; namely, there's a spurious open boundary that is radiating inwards, but it doesn't affect the dynamics of the plasma which is confined mostly in the center of the domain. I'm probably combing over the GPU solver as you read this...

My latest challenge is one of disparate scales. Accurately capturing electron dynamics requires timesteps on the order of 1e-13 s, but I need simulation durations on the order of hundreds of microseconds, which translates to billions of timesteps. Even with each timestep taking some tens of milliseconds, that means weeks of computation (note that each timestep is rather involved, requiring moving particles, computing molecular reactions, depositing charge and current onto the mesh, solving the FDTD equations, adjusting the cathode potential, and logging any important parameters). I need the long duration to limit the divergence of the current. If the background pressure is over the glow discharge threshold, the number of particles grows without bound, quickly surpassing the 20 million limit imposed by my 8 GB of GPU VRAM. Lowering the cathode potential increases the discharge pressure, but changing it unphysically fast generates massive EM waves that mess things up. Lots of work to do.

Deliberately asymmetric.
Pablo Llaguno
Posts: 104
Joined: Sun Feb 05, 2017 6:00 pm
Real name: Pablo Llaguno

Re: Progress in Fusor Plasma Simulations

Post by Pablo Llaguno »

Amazing work Liam.

When you solve the scalar and vector potential wave (inhomogeneous) equations, I am confused as to why you need to implement a Lorenz gauge condition. The physical electric and magnetic fields are independent of the gauge function (gauge invariance), and the usual treatment in textbooks is that if such a gauge function can be found, then it is not necessary to solve for it.

For the Coulomb gauge the gauge function is determined by a Poisson equation, with the divergence of the vector potential as the source. For the Lorenz gauge I can't remember, but my electrodynamics book (Griffiths) has a problem (10.6) that shows it is always possible to meet the Lorenz gauge, assuming you have solutions for the inhomogeneous wave equations for V and A. Wouldn't your numerical solutions for V and A make implementing a Lorenz gauge function unnecessary?
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

Thanks, Pablo. Indeed, you are correct: FDTD solvers don't require you to pick a gauge. I decided to incorporate it (although nowhere do I explicitly solve it) for a few reasons.

1. It's relativistically valid. While I don't use relativistic particle pushing yet, it is relatively simple to implement.
2. It decouples the potential equations. This makes writing a finite difference scheme much simpler.
3. It is less numerically intensive as it requires slightly less memory access and has far fewer floating point operations.
4. Who doesn't like the symmetry?
Coupled equations
Lorenz gauge
Decoupled potential equations
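For anyone reading without the images, these should be the standard textbook forms (Griffiths ch. 10). Coupled, with no gauge chosen:

\[ \nabla^2\varphi + \frac{\partial}{\partial t}\!\left(\nabla\cdot\mathbf{A}\right) = -\frac{\rho}{\epsilon_0}, \qquad \left(\nabla^2\mathbf{A} - \mu_0\epsilon_0\frac{\partial^2\mathbf{A}}{\partial t^2}\right) - \nabla\!\left(\nabla\cdot\mathbf{A} + \mu_0\epsilon_0\frac{\partial\varphi}{\partial t}\right) = -\mu_0\mathbf{J}. \]

The Lorenz gauge condition,

\[ \nabla\cdot\mathbf{A} + \mu_0\epsilon_0\frac{\partial\varphi}{\partial t} = 0, \]

decouples them into the two wave equations

\[ \nabla^2\varphi - \mu_0\epsilon_0\frac{\partial^2\varphi}{\partial t^2} = -\frac{\rho}{\epsilon_0}, \qquad \nabla^2\mathbf{A} - \mu_0\epsilon_0\frac{\partial^2\mathbf{A}}{\partial t^2} = -\mu_0\mathbf{J}. \]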
Pablo Llaguno
Posts: 104
Joined: Sun Feb 05, 2017 6:00 pm
Real name: Pablo Llaguno

Re: Progress in Fusor Plasma Simulations

Post by Pablo Llaguno »

Using the Lorenz gauge for this simulation is actually a very astute thing to do. However, I am still confused by the reasoning behind this quote:
Liam David wrote: Mon Aug 29, 2022 10:05 am The other thing I have not implemented is the Lorenz gauge condition, which is required to get a physical solution since the potential wave equations are derived under its assumption. Since phi and A are decoupled, if I initialize some phi, it will never produce a B-field unless I initialize a valid A. It also calculates a non-physical E-field since it depends on A as well. Depending on how difficult the Lorenz gauge is to implement, I may backtrack to the coupled phi and A equations.
How does initializing a phi (in the grid boundary conditions I imagine) affect the Lorenz gauge? Couldn't you initialize A such that the initial B is zero?

On another note, have you considered using Multiphysics simulators such as COMSOL? I did some FDTD simulations for an optical waveguide and while different from plasma simulations, it still solves Maxwell's equations. I guess it's much faster to do PIC simulations in MATLAB?
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

Consider this example: I have two parallel plates that are instantaneously set to some potential difference at t=0. We know in advance that the resulting EM waves within the gap between the plates will have both E and B components due to the coupling in Maxwell's equations. There are no charges or currents in the gap, so the potential wave equations become homogeneous--both the coupled ones and the decoupled ones.

Consider first the coupled equations (no applied gauge). We have a nonzero d(phi)/dt due to our initial conditions. The first coupled equation tells us that the time derivative of the divergence of A must be nonzero. Thus, except perhaps instantaneously, A cannot be zero. The 2nd coupled equation leads us to the same conclusion. Our algorithm, developed around finite difference stencils, would take this coupling into account and give us equations of the form
phi(n+1) = f (phi(n-1), A(n-1); phi(n-2), A(n-2); ...)
A(n+1) = g (phi(n-1), A(n-1); phi(n-2), A(n-2); ...)
Here n is the timestep index.

Now consider the decoupled equations (in the Lorenz gauge). Since the initial conditions for phi have no coupling to A, unless A is given an initial condition that satisfies the Lorenz gauge, not only will it be zero for all time, but the solution will be nonphysical essentially as constructed. The finite difference stencils give equations of the form
phi(n+1) = f' (phi(n-1); phi(n-2); ...)
A(n+1) = g' (A(n-1); A(n-2); ...)
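To make the stencil form concrete, one standard explicit update for the Lorenz-gauge scalar equation is (the index offsets here differ from my shorthand above, which just indicates which stored history each buffer depends on):

\[ \varphi_{i,j,k}^{\,n+1} = 2\varphi_{i,j,k}^{\,n} - \varphi_{i,j,k}^{\,n-1} + c^2\Delta t^2\left[\left(\nabla_h^2\varphi^{\,n}\right)_{i,j,k} + \frac{\rho_{i,j,k}^{\,n}}{\epsilon_0}\right], \]

where \(\nabla_h^2\) is the discrete Laplacian, and similarly for each component of A with source \(\mu_0\mathbf{J}\).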

Another issue can (and does) arise: the continuity equation. It is a corollary to Maxwell's equations and can thus be derived from them, and it must hold true for the simulation to be physical. It relates the change in charge density (rho) to the current density (J) and sort of (although incompletely) couples the decoupled phi and A equations. The discretization of the simulation can increasingly violate continuity over time, causing divergences in the fields. This qualitatively hasn't been an issue for me yet, but it is something I will eventually fix. One method is to define a fictitious field that effectively transports charge conservation errors out of the simulation domain at v > c.

Continuity equation
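(For those without the image, it's just the standard form:)

\[ \frac{\partial\rho}{\partial t} + \nabla\cdot\mathbf{J} = 0. \]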
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

Some preliminary plots of electron+ion flux on the surfaces inside a standard cube fusor. Normalization is unimportant--I'm literally counting simulation particles here. The highest-flux region at the endcaps is ~1.2" in diameter, which closely matches the measured value (~1.1") within uncertainties. Upper plot: linear color scale; lower plot: log color scale.

collision map 2.png

collision map.png
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

Really getting down to the limits of my hardware... I managed to eke out (roughly) another order of magnitude in computation speed by optimizing memory access, limiting CUDA kernel grid sizes, and, most importantly, sorting particles by their global cell index (i.e. the linear index in an n x m x p simulation grid). The latter allows the GPU to limit global memory access (latency of 100s of clocks); it can reuse data already loaded in the registers (single-clock latency). I also fixed a few bugs, notably one that didn't save/delete the correct particles in some rare cases.
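If anyone wants to play with the sorting idea, here is roughly what it looks like with Thrust. This is a sketch, not my actual code; the 128^3 grid, unit-cube domain, and made-up particle positions are placeholders. The pattern is: compute each particle's linear cell index in a kernel, sort a permutation array by that key, then gather the particle arrays into the new order.

// cellsort.cu -- sketch of sorting particles by global cell index so that
// neighboring threads touch neighboring memory. Grid dimensions and particle
// data are placeholders just so the example runs.
// Compile: nvcc cellsort.cu -o cellsort
#include <thrust/device_vector.h>
#include <thrust/sort.h>
#include <thrust/gather.h>
#include <thrust/sequence.h>
#include <cstdio>

__global__ void computeCellIndex(const float *x, const float *y, const float *z,
                                 int *cell, int np,
                                 float dx, int nx, int ny, int nz) {
    int p = blockIdx.x * blockDim.x + threadIdx.x;
    if (p >= np) return;
    int i = min(nx - 1, max(0, (int)(x[p] / dx)));
    int j = min(ny - 1, max(0, (int)(y[p] / dx)));
    int k = min(nz - 1, max(0, (int)(z[p] / dx)));
    cell[p] = (i * ny + j) * nz + k;          // linear index in an nx*ny*nz grid
}

int main() {
    const int np = 1 << 20;                   // 1M test particles
    const int nx = 128, ny = 128, nz = 128;
    const float dx = 1.0f / nx;               // unit-cube domain (placeholder)

    thrust::device_vector<float> x(np), y(np), z(np);
    // Fill with something arbitrary just so the example runs
    thrust::sequence(x.begin(), x.end(), 0.0f, 1.0f / np);
    thrust::sequence(y.begin(), y.end(), 1.0f, -1.0f / np);
    thrust::sequence(z.begin(), z.end(), 0.5f, 0.0f);

    thrust::device_vector<int> cell(np), perm(np);
    thrust::sequence(perm.begin(), perm.end());   // 0,1,2,... particle ids

    int threads = 256, blocks = (np + threads - 1) / threads;
    computeCellIndex<<<blocks, threads>>>(
        thrust::raw_pointer_cast(x.data()), thrust::raw_pointer_cast(y.data()),
        thrust::raw_pointer_cast(z.data()), thrust::raw_pointer_cast(cell.data()),
        np, dx, nx, ny, nz);

    // Sort the permutation by cell index, then gather each particle array
    thrust::sort_by_key(cell.begin(), cell.end(), perm.begin());
    thrust::device_vector<float> xs(np);
    thrust::gather(perm.begin(), perm.end(), x.begin(), xs.begin());
    // ...repeat the gather for y, z, velocities, weights, etc.

    printf("first particle now in cell %d\n", (int)cell[0]);
    return 0;
}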

To give a relative sense of the performance, the peak particle push rate for the state-of-the-art PIC code VPIC 2.0 on an Nvidia A100 GPU is 6 particles/nanosecond (https://arxiv.org/pdf/2102.13133, Figure 6). Notably, this metric includes only pushing the particles and not calculating collisions, advancing the fields, etc.. My RTX 3080, which is not optimized for computation, achieves 2.1 particles/nanosecond in this step. The total loop performance is 0.42 particles/ns. Moreover, my grid size is >16 million, while the VPIC grid sizes are much smaller.

I think the next improvements are physics upgrades, now that I have a pretty fast code.

efficiency.png
User avatar
Nicolas Krause
Posts: 207
Joined: Fri Sep 30, 2016 7:36 pm
Real name: Nicolas Krause
Location: Canada
Contact:

Re: Progress in Fusor Plasma Simulations

Post by Nicolas Krause »

Hi Liam,

I'm not too sure how much additional programming work it would be, but I don't know if you've heard of Lambda Labs? They have a GPU cloud with some pretty beefy hardware, and since you've already rewritten your program to use CUDA, I can't imagine porting would be too bad. Their prices seem pretty reasonable, around $2-4/hr for some of their smaller GPU setups. If you wanted to scale some parameter like particle count, rather than reworking a bunch of the math into some different model, it might be worthwhile.
User avatar
Liam David
Posts: 380
Joined: Sat Jan 25, 2014 5:30 pm
Real name: Liam David
Location: Arizona

Re: Progress in Fusor Plasma Simulations

Post by Liam David »

I haven't heard of them before, but their prices seem very competitive. 8x A100s with 240 GB VRAM and 1.8 TB RAM is just $8.80/hr. My code and investigation are just not at that stage yet.