A number of blogs this week have been critical of the BBC News hype about nuclear fusion researchers at the National Ignition Facility reaching a "milestone."
The NIF is an exciting research facility because it allows us to understand nature better and because it helps us understand the D-T fusion process used for military applications. However, I'm afraid that what we've learned this week is that the science media (once again) chases whatever generates hype.
So, let's be clear about what happened at NIF last month.
192 lasers generated photons carrying 1.8 MJ of energy. (It should be pointed out that the NIF site consumed well over 1.8 MJ of electricity to create those 1.8 MJ of photons. If this had been a semi-continuous event, the lasers would likely have needed at least 5 MJ of electricity to create the 1.8 MJ in photons.)
Of the 1.8 MJ in photons, less than 14 kJ of energy reached the inside of the target as high-energy X-rays, and 14 kJ of neutrons were generated by the reaction. Assuming the neutrons were used to run a Rankine-cycle power plant (as in nuclear fission), roughly 5 kJ of electricity could be generated from them.
This means that the site spent ~5 MJ of electricity to perhaps obtain ~5 kJ of electricity.
This means that NIF is three orders of magnitude away from "breakeven," and four orders of magnitude away from being "thermodynamically viable." This is far from a "milestone," and it falls well short of what has already been achieved by the magnetically-confined fusion plasmas at JET.
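The energy chain above can be sketched in a few lines. The ~5 MJ wall-plug figure and the ~33% thermal-to-electric efficiency (typical of a fission-like steam cycle) are my assumptions for the sketch; the other numbers are the rounded figures from the text.

```python
import math

# Back-of-envelope energy chain for the NIF shot described above
wall_plug_J = 5.0e6    # assumed ~5 MJ of electricity to drive the lasers
laser_J     = 1.8e6    # 1.8 MJ of laser photons from the 192 beams
xray_J      = 14.0e3   # <14 kJ of X-rays reaching the target interior
neutron_J   = 14.0e3   # 14 kJ of fusion neutrons produced
rankine_eff = 0.33     # assumed thermal-to-electric conversion efficiency

electric_out_J = neutron_J * rankine_eff            # ~5 kJ of electricity
orders_short = math.log10(wall_plug_J / electric_out_J)

print(f"laser photons:    {laser_J / 1e6:.1f} MJ")
print(f"X-rays on target: {xray_J / 1e3:.0f} kJ")
print(f"electricity out:  {electric_out_J / 1e3:.1f} kJ")
print(f"orders of magnitude short of wall-plug breakeven: {orders_short:.1f}")
```

Under these assumptions the shot returns roughly one thousandth of the electricity it consumed, which is the "three orders of magnitude" gap stated above.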
I think that it's silly that NIF is trying to sell itself as an energy source of the future.
That said, I want to point out that nuclear fusion is not a complete pipe dream. There is a possibly viable route to electricity production via magnetically-confined fusion plasmas, such as the still-under-construction ITER experiment in Cadarache, France.
This experiment is really expensive, and there's still a chance that another plasma instability will keep the system from reaching the real "breakeven" milestone (i.e., generating more potential electricity from the neutrons than the electricity consumed to heat the plasma). Even so, I am proud that this facility is getting funding from world governments, including the US. The research at ITER is ground-breaking, and magnetically-confined fusion plasma is a potential future energy source if we can figure out how to control a few more of the instabilities that have appeared over the last ~60 years of research in this field.
I'd like to end this post by detailing some of the main engineering breakthroughs required before magnetically-confined fusion plasma can become engineering-viable.
List of engineering breakthroughs required for magnetically-confined fusion plasma
(Also see slide 4 of the following presentation. The required engineering 'feats' or breakthroughs are well known. They are all likely achievable...just really damn hard, and they require lots of upfront capital for the research.)
(1) Controlling any instabilities that occur through alpha-heating (i.e., there are likely to be instabilities due to the fact that the alpha particles emerge with energies of roughly 3.5 MeV, while the core temperature of the plasma is only on the order of 10-20 keV). The ability to control such instabilities in fusion-powered plasmas will be tested at ITER.
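To put that mismatch in numbers, here is a back-of-envelope sketch. The 3.5 MeV alpha birth energy is the standard D-T value; the ~15 keV core temperature is a typical tokamak design figure I'm assuming, not a quoted ITER specification.

```python
# D-T alpha particle birth energy vs. typical core plasma temperature
alpha_energy_keV = 3.5e3   # alphas are born at 3.5 MeV (standard D-T value)
core_temp_keV    = 15.0    # assumed ITER-like core temperature (~15 keV)

ratio = alpha_energy_keV / core_temp_keV
print(f"alphas are born ~{ratio:.0f}x more energetic than the thermal plasma")
```

A population of particles a couple of hundred times more energetic than the background plasma is exactly the kind of free-energy source that can drive new instabilities, which is why this can only be tested in a burning plasma like ITER's.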
(2) Not-steady-state operation: Tokamak plasmas require a toroidal electric field that must be applied by a time-varying magnetic field. This makes the process inherently non-steady-state: eventually you need to reverse the direction of the electric field as you reach the maximum magnetic field that can be generated. This means the plasma needs to be turned off (likely on a weekly/monthly basis), and the current restarted in the opposite direction. An engineering 'feat' is required here to design a system that doesn't break during these scheduled start-ups/shut-downs (or a breakthrough is required in steady-state plasmas) and that isn't cost-prohibitive. So far, the steady-state stellarator designs have been cost-prohibitive.
(3) Wall materials that can withstand a high flux of ions, electrons, photons, and neutrons still need to be tested and proven to work. Also, the process for breeding tritium from lithium needs to be demonstrated on a continuous basis. (Note: there are plans to do this testing; I'm just pointing out that it has yet to be demonstrated.)
(4) There are also a number of challenges associated with making cheap, superconducting, high-field magnets; with fueling the plasma; with removing heat; and with designing wall materials that can withstand instabilities releasing large amounts of energy to the wall without shedding material that ends up cooling the core of the plasma.
My overall conclusion (i.e., educated guess) is that magnetically-confined fusion plasma may be engineering-feasible sometime in the next 50 years, but it may not be economically competitive in the next 100 years. There's just too much uncertainty to know whether magnetically-confined fusion will ever be economically viable against other sources of energy.
What can be stated with 99% certainty is that inertially-confined, laser-driven fusion is nowhere close to being engineering-viable or economically-viable. As a US taxpayer, I'd like to be able to vote on where my taxes go. I would vote for magnetically-confined fusion plasma research, but I would not vote for my tax dollars to go to inertially-confined fusion research.