## Python Simulation

- Nicolas Krause
**Posts:** 169 **Joined:** Fri Sep 30, 2016 11:36 pm **Real name:** Nicolas Krause **Location:** Canada

### Re: Python Simulation

Very nice simulation Nathan. This poster is about the only academic reference I've been able to find that analyzes the possible orbits in a Fusor. It has a very nice perspective from non-linear dynamics about why things like star mode exist in the Fusor.

- Liam David
**Posts:** 207 **Joined:** Sat Jan 25, 2014 10:30 pm **Real name:** Liam David **Location:** Arizona

### Re: Python Simulation

Nicholas, Nathan, your efforts have inspired me to dig out some simulations that I shelved a year or two ago. With a few more semesters of physics and math under my belt, as well as a functioning fusor (unlike last time), I've decided to take another stab at ion optics and space charge effects.

I've opted to go for a 2D particle-in-cell method as opposed to a direct sum or tree algorithm for a few reasons. It avoids divergences in the forces between particles, allows me to simulate millions of particles at O(N) efficiency, and makes it easier to determine bulk plasma properties. I started on a 3D version but decided to downscale by one dimension due to the matrix sizes needed to solve for the potential. A 3D model discretized into 100^3 cells needs a 1e6 x 1e6 matrix, which even as a sparse matrix takes way more RAM than I have. Sizes up to 70^3 are possible with 32GB, but that's pretty coarse. In 2D, I can easily run a 300^2 simulation with only a few GB.
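
To illustrate why 2D stays cheap, here's a minimal sketch (in Python/scipy rather than my Matlab code, and with illustrative names) of assembling the 2D five-point Laplacian as a sparse matrix; every row has at most five nonzeros:

```python
import numpy as np
import scipy.sparse as sp

def laplacian_2d(n, h=1.0):
    """Five-point discrete Laplacian on an n x n grid, flattened to n^2 x n^2."""
    main = -2.0 * np.ones(n)
    off = np.ones(n - 1)
    d2 = sp.diags([off, main, off], [-1, 0, 1]) / h**2  # 1D second difference
    eye = sp.identity(n)
    # The Kronecker sum combines the x- and y-direction operators
    return (sp.kron(eye, d2) + sp.kron(d2, eye)).tocsr()

A = laplacian_2d(300)   # 90000 x 90000, but only ~5 nonzeros per row
```

Stored densely, that matrix would be ~65 GB of doubles; in CSR form it's a few MB, so in 2D the cost is dominated by the solve, not the storage.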

The potential solver inverts the discrete Laplacian matrix to solve grad^2(phi) = -rho/epsilon0 as a standard matrix equation Ax=b. I account for the boundary conditions, i.e. the grid and chamber, by setting the appropriate cells of b=rho to the boundary values and the corresponding rows of A=grad^2 to identity rows. That way, when I compute x = (A^-1)b, I get back the boundary conditions plus the vacuum potential and the potential from the particles.
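
A sketch of that boundary-row replacement, again in Python/scipy for illustration (the function and variable names are mine, not from my actual code): each boundary cell gets an identity row in A and its fixed potential in b, which pins the solution there.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

def solve_potential(A, rho, boundary_mask, boundary_vals):
    """Solve A*phi = b with Dirichlet cells pinned to fixed potentials."""
    A = sp.lil_matrix(A)              # row assignment is cheap in LIL format
    b = rho.astype(float).copy()
    for i in np.flatnonzero(boundary_mask):
        A.rows[i] = [i]               # replace row i with an identity row
        A.data[i] = [1.0]
        b[i] = boundary_vals[i]       # ...so x[i] is forced to the boundary value
    return spla.spsolve(A.tocsr(), b)

# 1D check: Laplace's equation between plates at 0 V and 1 V gives a linear ramp
n = 5
A1 = sp.diags([np.ones(n - 1), -2.0 * np.ones(n), np.ones(n - 1)], [-1, 0, 1])
mask = np.zeros(n, dtype=bool); mask[[0, -1]] = True
vals = np.zeros(n); vals[-1] = 1.0
phi = solve_potential(A1, np.zeros(n), mask, vals)   # [0, 0.25, 0.5, 0.75, 1]
```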

The macroparticles (each representing ~3000 real particles) are binned using a square kernel the size of a grid cell, so each can contribute to the properties of up to four cells. This density matrix, with boundary values substituted, is the b in the equation above. In addition to the density, I find the mean velocity of the particles in each cell, and from that the estimated fusion rate by evaluating the cross-section and f = n1*n2*<sigma*v>*volume. The fusion rate is still very much up in the air; the main uncertainty is whether I should scale my 2D densities to 3D by just dividing by a length, or by raising them to the 3/2 power. I could argue either way, so perhaps someone has some insights. Particles that hit the walls or cathode are removed.
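
That square-kernel binning is what the PIC literature calls cloud-in-cell weighting. A minimal numpy sketch (illustrative names, and it assumes particles stay at least one cell away from the grid edge):

```python
import numpy as np

def deposit_cic(x, y, w, nx, ny, dx, dy):
    """Cloud-in-cell: share each macroparticle weight w bilinearly among
    the four grid cells nearest to (x, y)."""
    fx, fy = x / dx, y / dy
    i, j = np.floor(fx).astype(int), np.floor(fy).astype(int)
    tx, ty = fx - i, fy - j              # fractional position inside the cell
    rho = np.zeros((nx, ny))
    np.add.at(rho, (i,     j),     w * (1 - tx) * (1 - ty))
    np.add.at(rho, (i + 1, j),     w * tx       * (1 - ty))
    np.add.at(rho, (i,     j + 1), w * (1 - tx) * ty)
    np.add.at(rho, (i + 1, j + 1), w * tx       * ty)
    return rho / (dx * dy)               # weight per cell -> density
```

np.add.at accumulates correctly even when several particles map to the same cell, and total weight is conserved: rho.sum()*dx*dy equals w.sum().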

I found this database of cross-sections for all kinds of particle interactions: https://www-amdis.iaea.org/ALADDIN/collision.html. I downloaded data for three reactions, assuming that values for deuterium are roughly the same as those for hydrogen:

- Single charge exchange: D, D+ --> D+, D
- Single ionization: D+, D --> D+, D+, e (discard the electron, for now)
- Double ionization: D, D --> D+, D+, 2e

Each time step, the simulation calculates the distance each particle traveled, computes the interaction probability from the cross-sections and the local density, and adds ions and fast neutrals as needed. Post-collision particles are given velocities from the standard elastic-collision equations, plus some randomness in what you could call the impact parameter. Spoiler alert: at 10 mTorr the mean free path is much smaller than the device, so lots of interactions occur. The vast majority of ions disappear after traveling just 10 cm, and the peak fast-neutral densities after a couple hundred nanoseconds are comparable to the peak ion densities.
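
As a sanity check on that mean-free-path claim: with the ideal-gas neutral density at 10 mTorr and a charge-exchange cross-section of order 1e-19 m^2 (an assumed round number for illustration, not a value pulled from the ALADDIN data), the mean free path comes out at the centimeter scale. A per-step Monte Carlo collision test then looks like:

```python
import numpy as np

k_B = 1.380649e-23                               # Boltzmann constant, J/K
n_neutral = (10e-3 * 133.322) / (k_B * 300.0)    # 10 mTorr, 300 K -> ~3.2e20 m^-3
sigma_cx = 1e-19                                 # assumed cross-section, m^2
mfp = 1.0 / (n_neutral * sigma_cx)               # ~3 cm, well under chamber size

def collides(ds, rng):
    """Bernoulli collision test for particles that moved a distance ds this step.

    P(collision) = 1 - exp(-ds / mfp), the standard exponential free-path law.
    """
    return rng.random(ds.shape) < 1.0 - np.exp(-ds / mfp)
```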

The timestep is 0.1 ns so that ions cross only a few cells per step even at maximum speed (~2e6 m/s at 50 kV). Each step takes ~1 second to compute, increasing somewhat as particles are added; even so, simulating 1 us takes several hours, and equilibrium is not reached.
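
That maximum-speed figure is just the non-relativistic energy balance qV = mv^2/2 for a deuteron; a quick check using only physical constants:

```python
import numpy as np

q = 1.602176634e-19            # elementary charge, C
m_d = 2.014 * 1.66053907e-27   # deuteron mass, kg

def ion_speed(V):
    """Non-relativistic speed of a singly charged deuteron after falling through V volts."""
    return np.sqrt(2.0 * q * V / m_d)

v_max = ion_speed(50e3)        # ~2.2e6 m/s
step = v_max * 0.1e-9          # ~0.2 mm traveled per 0.1 ns timestep
```

At ~0.2 mm per step, an ion crosses less than one millimeter-scale cell per step (my assumption about the grid spacing), consistent with the 0.1 ns choice.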

Space charge from the ions causes a deviation from the vacuum potential of only ~2 V at ~1 us, which is negligible. I'd have to run the simulation much longer for this to build up, I'm sure.

Up until today, my initial condition has been a uniform distribution of ions. Currently I am using two 1 mA ion guns.

While I've been focusing on linear cathodes, specifically ion-gunned designs, I decided to model the standard spherical fusor as well. This simulation used 1e6 particles and a 200^2 grid, and it ignored charge exchange, neutrals, and everything other than electrostatic interactions, simply because I hadn't implemented those yet: https://youtu.be/-QoMSFVQbX0. The maximum central density is on the order of 1e8/cm^2, which matches up pretty well with the thesis "Improved Lifetimes and Synchronization Behavior in Multi-grid Inertial Electrostatic Confinement Fusion Devices", which is too large to attach. The star mode also looks pretty nice.

Some more images, specifically of a dual-gunned linear cathode with charge exchange and particle interactions:

I'll post more as I make progress. Perhaps it's worth collaborating to produce a better model.

- Nicolas Krause
**Posts:** 169 **Joined:** Fri Sep 30, 2016 11:36 pm **Real name:** Nicolas Krause **Location:** Canada

### Re: Python Simulation

Great work Liam! I assume you wrote this in Python? Collaboration on a simulation would be fantastic; I've got a ton of ideas that I simply haven't had time to explore because of school. Given the scaling problems you noted in 3D, I'd started moving toward some sort of parallelization. I have four Raspberry Pis and a network switch, and was hoping to set up a toy Beowulf cluster to learn about these things, but I've been working on the physical side of the fusor in my spare time recently. The other option for speeding up the simulation is to purchase computation time on something like Amazon Web Services. I understand it's not too expensive, but I have no idea how to structure code for that. There seem to be a ton of people offering compute time with the rise of AI, though.

I think the other big challenge, which I simply haven't tackled because I don't know how, is the electron waves that can occur in a plasma. I've read Vlasov's original paper, which has some fearsome mathematics in it, but it's pretty apparent that the plasma is a huge, strange soup and doesn't behave the way my simplified model predicts. I hope to use the simulation to clarify how the fusor actually operates, and from there to guide areas of investigation for increasing the fusion rate. I'm currently reading some of Irving Langmuir's original papers on electrons in plasmas in the hope that they'll be useful.

I have a couple of Pluto notebooks (like Jupyter notebooks, but for the Julia language) with some more complicated grid shapes that I put together over Christmas, but beyond that, I think everything I've done is visible here on fusor.net. I suppose the way to collaborate would be to set up a meeting and sketch out some initial ideas?

- Richard Hull
- Moderator
**Posts:** 13213 **Joined:** Fri Jun 15, 2001 1:44 pm **Real name:** Richard Hull

### Re: Python Simulation

Nicolas, welcome to the reality club. Plasmas do not obey any real calculable rules until you create a rigidly controlled environment of great simplicity, and even then there are gotchas, unexpected and unexplained, that throw all of the best mathematical work into a cocked hat. In the case of the fusor, it is a real mess. Wall loading and unloading and the velocity-space fusion really mess the thing up. And yet this is just a spherical ball with a central electrode. Seems simple, but it isn't. Precise calculations fail due to the gotchas. Yes, you can figure on a lot of generalizations in such a simple system, but high-precision, close predictions of fusion via mathematics fail, even at the hyper-advanced amateur level found here. Still, it's an interesting and challenging endeavor. Keep at it.

Richard Hull

Progress may have been a good thing once, but it just went on too long. - Yogi Berra

Fusion is the energy of the future....and it always will be

Retired now...Doing only what I want and not what I should...every day is a saturday.

- Liam David
**Posts:** 207 **Joined:** Sat Jan 25, 2014 10:30 pm **Real name:** Liam David **Location:** Arizona

### Re: Python Simulation

I've been using Matlab for all my simulations. The vector-matrix operations built into the syntax make it comparatively easy to vectorize the code, and it's the language I'm most familiar with from other work. I've done some work in Python, though it isn't ideal for fast simulations, and I've never used Julia, but it seems similar to Matlab. The direction I'm leaning is to write the overhead, visualization, post-processing, etc. in a high-level language like Matlab, Julia, or Python, but write the actual integrators/solvers in C++. I had some success running CUDA code on an NVIDIA GPU for another project, which could help speed everything up, but my experience there is limited.

In my current simulations I wanted to add electrons, but the particle-in-cell method runs into a problem. My timestep is already 0.1 ns, and since electrons are so light, accurately simulating their trajectories would require a picosecond-scale timestep. They're also relativistic at 50 kV, which adds a layer of complexity to the integrators. I'm not thrilled about waiting years for a few microseconds of total simulation time, so like you, I think the ideal next step is to look at the Vlasov-Maxwell equations. According to this paper I've just started reading (https://www-m16.ma.tum.de/foswiki/pub/M ... -Notes.pdf), the Vlasov equation is fully relativistic and apparently works for non-thermalized plasmas. This paper also goes into some detail (http://www.phys.nthu.edu.tw/~thschang/notes/PP06.pdf). They're both really heavy on the math, and naturally they don't include many of the processes we're interested in, like charge exchange and impact ionization, so there's a lot of work ahead.

I think it would be great to discuss general directions and ideas in a meeting. I've got finals till next week but should be good to go after those.

- Nicolas Krause
**Posts:** 169 **Joined:** Fri Sep 30, 2016 11:36 pm **Real name:** Nicolas Krause **Location:** Canada

### Re: Python Simulation

Sounds good to me. I will note that I started in Matlab too, but it really bogged down when I tried to make things a bit larger. Those problems were far less apparent once I switched to Python (using numpy), and later Julia. Thanks for the links to the papers; I've got a copy of Vlasov's original paper on hand that I can send you if you'd like.

- Nathan Marshall
**Posts:** 47 **Joined:** Thu May 09, 2019 12:13 am **Real name:** Nathan Marshall

### Re: Python Simulation

Good work, Liam! I too am interested in digging deeper with these Vlasov-Maxwell equations. I'll have more time after I finish finals next week, graduate, and get settled at my internship for this summer. It would be great to collaborate with you two.

- Liam David
**Posts:** 207 **Joined:** Sat Jan 25, 2014 10:30 pm **Real name:** Liam David **Location:** Arizona

### Re: Python Simulation

It'd be great to have a small team working on this! Sounds like we'll have to have a meeting in a couple weeks.

- Nicolas Krause
**Posts:** 169 **Joined:** Fri Sep 30, 2016 11:36 pm **Real name:** Nicolas Krause **Location:** Canada

### Re: Python Simulation

Can we set a tentative date of May 29th? Also what's everyone's time zone? I'm on Mountain Standard Time.

- Liam David
**Posts:** 207 **Joined:** Sat Jan 25, 2014 10:30 pm **Real name:** Liam David **Location:** Arizona

### Re: Python Simulation

The 29th should work for me. I'm also on MST, and I think Nathan's MST+2. I've started working in Julia and have gotten some basic n-body code to run. I'm slowly starting to get the hang of it, but it'll be a while till I'm proficient. I'll probably try and port my Matlab code to get some practice.