jsbiff wrote:
The MIT "Future of Nuclear Power" paper, has this to say: "Typical LWR spent fuel today reaches a burnup of 50,000 MWD/MT"

There is a lot of information in the MIT study and it is definitely worth reading. A burn-up of 50 GWd/MTIHM (50 gigawatt-days per metric ton of initial heavy metal) is typical of a high-burn-up LWR. It is the thermal energy released divided by the mass of enriched fuel that was loaded into the fuel rods before irradiation. It tells you how much energy you can get out of a fuel element and how long it can stay in the core before unloading, which matters for reactor availability. It says nearly nothing about how efficiently the reactor uses natural uranium. To reach a higher burn-up you need a higher enrichment, hence more depleted uranium tails at your enrichment plant. The balance in terms of natural uranium usage is almost even (to be more specific, it is strongly influenced by how efficient the enrichment process is). As an example, CANDU reactors have very low burn-ups since they use un-enriched uranium as fuel. More uranium has to pass through the reactor, but they do not consume more natural uranium than LWRs.
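To make the "almost even" balance concrete, here is a minimal back-of-the-envelope sketch in Python. The enrichment/burn-up pairs (3.5% for 40 GWd/t versus 4.5% for 50 GWd/t) and the 0.25% tails assay are illustrative assumptions of mine, not figures from the MIT study; the only physics in it is the standard enrichment mass balance.

# Rough sketch of the natural-uranium balance discussed above.
# The enrichment/burn-up pairs and the 0.25% tails assay are
# illustrative assumptions, not figures from the MIT study.

X_NAT = 0.00711    # U-235 fraction in natural uranium
X_TAILS = 0.0025   # assumed tails assay at the enrichment plant

def feed_per_product(x_product, x_tails=X_TAILS, x_feed=X_NAT):
    # Tons of natural uranium fed per ton of enriched product (mass balance only).
    return (x_product - x_tails) / (x_feed - x_tails)

# (enrichment, burn-up in GWd per ton of enriched fuel) -- assumed typical pairs
cases = {"lower burn-up": (0.035, 40.0), "higher burn-up": (0.045, 50.0)}

for label, (x_p, burnup) in cases.items():
    nat_u_per_gwd = feed_per_product(x_p) / burnup
    print(f"{label}: {nat_u_per_gwd:.3f} t of natural U per GWd (thermal)")

With these assumed numbers the two cases come out within a few percent of each other (about 0.18 t of natural uranium per GWd thermal either way), which is exactly why pushing burn-up higher does little for natural uranium consumption.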
Axil wrote:
A breeder should convert 99.9% of the fuel energy content to thermal power.

With the open fuel cycle practiced in the US and in many other countries, LWRs fission barely half a percent of the natural uranium that has to be withdrawn to feed the process; of what does fission, roughly two thirds is U-235 and one third is U-238 (after transmuting into plutonium in the core). Most breeder designs cannot reach high burn-ups from a single load. They require many cycles of loading depleted fuel (blankets) and enriched fuel (inner core), irradiation, unloading, reprocessing, and so on. Not that 99.9% is theoretically impossible, but reaching it would take centuries. Most people would be very happy with 50%, to be compared with the current 0.5%.
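If you want to check the half-a-percent figure yourself, here is a minimal sketch. The rule of thumb of roughly 1 MWd of thermal energy per gram of heavy metal fissioned, and the feed factor of about 9 tons of natural uranium per ton of enriched fuel, are round-number assumptions on my part rather than numbers taken from the study.

# Back-of-the-envelope check of the "half a percent" figure above.
# The 1 MWd-per-gram rule of thumb and the feed factor of ~9 are
# round-number assumptions, not data from the MIT study.

MWD_PER_GRAM_FISSIONED = 1.0   # ~1 MWd (thermal) released per gram of heavy metal fissioned
BURNUP_MWD_PER_T = 50_000.0    # typical high burn-up LWR fuel
FEED_FACTOR = 9.0              # ~9 t natural U per t enriched fuel (4.5% product, 0.25% tails)

grams_fissioned_per_t_fuel = BURNUP_MWD_PER_T / MWD_PER_GRAM_FISSIONED
fraction_of_fuel_fissioned = grams_fissioned_per_t_fuel / 1.0e6        # 1 t = 1e6 g
fraction_of_natural_u_fissioned = fraction_of_fuel_fissioned / FEED_FACTOR

print(f"fraction of the enriched fuel fissioned:   {fraction_of_fuel_fissioned:.1%}")       # ~5%
print(f"fraction of the natural uranium fissioned: {fraction_of_natural_u_fissioned:.2%}")  # ~0.6%

About 5% of the enriched fuel is fissioned at 50 GWd/t, and dividing by the natural-uranium feed factor brings that down to a bit over 0.5% of the uranium that came out of the ground, consistent with the figure quoted above.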
Now, enough about uranium resources; let us think about the wastes. That is where the big deal is.