From the University of California – San Diego via Eurekalert

First quantitative measure of radiation leaked from Fukushima reactor
Observations of radioactive sulfur that formed when seawater was used to cool reactors and spent fuel ponds reveal the amount of radiation leaked from damaged fuel
Atmospheric chemists at the University of California, San Diego, report the first quantitative measurement of the amount of radiation leaked from the damaged nuclear reactor in Fukushima, Japan, following the devastating earthquake and tsunami earlier this year.
Their estimate, reported this week in the early, online edition of the Proceedings of the National Academy of Sciences, is based on a signal sent across the Pacific Ocean when operators of the damaged reactor had to resort to cooling overheated fuel with seawater.
“In any disaster, there’s always a lot to be learned by analysis of what happened,” said senior author Mark Thiemens, Dean of the Division of Physical Sciences at UC San Diego. “We were able to say how many neutrons were leaking out of that core when it was exposed.”
On March 28, 2011, 15 days after operators began pumping seawater into the damaged reactors and pools holding spent fuel, Thiemens’ group observed an unprecedented spike in the amount of radioactive sulfur in the air in La Jolla, California. They recognized that the signal came from the crippled power plant.
Neutrons and other products of the nuclear reaction leak from fuel rods when they melt. Seawater pumped into the reactor absorbed those neutrons, which collided with chloride ions in the saltwater. Each collision knocked a proton out of the nucleus of a chloride atom, transforming the atom to a radioactive form of sulfur.
When the water hit the hot reactors, nearly all of it vaporized into steam. To prevent explosions of the accumulating hydrogen, operators vented the steam, along with the radioactive sulfur, into the atmosphere.
In air, sulfur reacts with oxygen to form sulfur dioxide gas and then sulfate particles. Both blew across the Pacific Ocean on prevailing westerly winds to an instrument at the end of the pier at UC San Diego’s Scripps Institution of Oceanography where Thiemens’ group continuously monitors atmospheric sulfur.
Using a model based on NOAA’s observations of atmospheric conditions, the team determined the path the air took on its way to the pier over the preceding 10 days and found that it led back to Fukushima.
Then they calculated how much radiation must have been released. “You know how much seawater they used, how far neutrons will penetrate into the seawater and the size of the chloride ion. From that you can calculate how many neutrons must have reacted with chlorine to make radioactive sulfur,” said Antra Priyadarshi, a post-doctoral researcher in Thiemens’ lab and first author of the paper.
After accounting for losses along the way as the sulfate particles fell into the ocean, decayed, or eddied away from the stream of air heading toward California, the researchers calculated that 400 billion neutrons were released per square meter surface of the cooling pools, between March 13, when the seawater pumping operation began, and March 20, 2011.
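For readers who want a feel for the physics, the conversion step can be roughed out from standard reference values. The sketch below uses textbook thermal-neutron cross-sections and a typical seawater chloride content — these are illustrative inputs of my own, not the numbers from the PNAS paper — and asks what fraction of thermalized neutrons absorbed in seawater would end up as S-35:

```python
# Rough sketch: what fraction of thermal neutrons absorbed in seawater
# end up making S-35 via 35Cl(n,p)35S? Illustrative textbook values only;
# NOT the inputs used in the PNAS paper.
AVOGADRO = 6.022e23
BARN = 1e-28  # m^2

# Typical seawater composition
cl_g_per_m3 = 19.4e3                              # ~19.4 g chloride per liter
n_cl35 = cl_g_per_m3 / 35.45 * AVOGADRO * 0.758   # Cl-35 atoms per m^3
n_h = 0.965 * 1.025e6 / 18.0 * AVOGADRO * 2       # H atoms per m^3 (~96.5% water)

# Thermal-neutron cross-sections (textbook values, barns)
sigma_np_cl35 = 0.49    # 35Cl(n,p)35S -- the reaction that makes S-35
sigma_ng_cl35 = 43.6    # 35Cl(n,gamma)36Cl -- the dominant chlorine absorber
sigma_h = 0.332         # 1H(n,gamma)2H

# Macroscopic cross-sections (per meter of seawater)
big_np = n_cl35 * sigma_np_cl35 * BARN
big_abs = big_np + n_cl35 * sigma_ng_cl35 * BARN + n_h * sigma_h * BARN

frac_to_s35 = big_np / big_abs
print(f"~{frac_to_s35:.1%} of absorbed thermal neutrons make S-35")
```

Only a few neutrons per thousand get “tagged” as radioactive sulfur, which is why the inferred neutron release is so much larger than the sulfur signal itself.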
The trace levels of radiation that reached the California coast never posed a threat to human health. “Although the spike that we measured was very high compared to background levels of radioactive sulfur, the absolute amount of radiation that reached California was small. The levels we recorded aren’t a concern for human health. In fact, it took sensitive instruments, measuring radioactive decay for hours after lengthy collection of the particles, to precisely measure the amount of radiation,” Thiemens said.
Concentrations a kilometer or so above the ocean near Fukushima must have been about 365 times higher than natural levels to account for the levels they observed in California.
The radioactive sulfur that Thiemens and his team observed must have been produced by partially melted nuclear fuel in the reactors or storage ponds. Although cosmic rays can produce radioactive sulfur in the upper atmosphere, that rarely mixes down into the layer of air just above the ocean, where these measurements were made.
Over a four day period ending on March 28th, they measured 1501 atoms of radioactive sulfur in sulfate particles per cubic meter of air, the highest they’ve ever seen in more than two years of recordings at the site.
Even intrusions from the stratosphere – rare events that bring naturally produced radioactive sulfur toward the Earth’s surface – have produced spikes of only 950 atoms per cubic meter of air at this site.
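To put those atom counts in perspective, activity follows A = λN with λ = ln 2/T½ (T½ ≈ 87.4 days is the standard S-35 half-life); a quick sketch:

```python
import math

T_HALF_S35_DAYS = 87.4                            # S-35 half-life
lam = math.log(2) / (T_HALF_S35_DAYS * 86400)     # decay constant, 1/s

n_atoms = 1501                 # reported S-35 atoms per cubic meter of air
activity_bq = lam * n_atoms    # decays per second per cubic meter
print(f"{activity_bq:.2e} Bq per cubic meter")    # ~1.4e-4 Bq/m^3
```

That is roughly one decay every two hours per cubic meter of air, which is why lengthy collection and sensitive counting were needed.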
The nuclear reaction within the cooling seawater marked sulfur that originated in a specific place for a discrete period of time. That allowed researchers to time the transformation of sulfur to sulfur dioxide gas and sulfate particles, and measure their transport across the ocean, both important factors for understanding how sulfate pollutants contribute to climate change.
“We’ve really used the injection of a radioactive element to an environment to be a tracer of a very important process in nature for which there are some big gaps in understanding,” Thiemens said.
The event also created a pulse of labeled sulfur that can be traced in the streams and soils in Japan, to better understand how this element cycles through the environment, work that Thiemens and colleagues in Japan have already begun.
###
I’ve located what should have been in the press release, and added the graph above too:
Reference
- Priyadarshi, A., Dominguez, G. & Thiemens, M. H. Proc. Natl Acad. Sci. USA http://www.pnas.org/content/early/2011/08/11/1109449108.abstract (2011).
Full paper: http://www.pnas.org/content/early/2011/08/11/1109449108.full.pdf+html
NOTE: I originally published this at 1PM today, but Willis also published a story right about the same time, so I put this one on hold after it had been up for two minutes in deference to Willis’ new article, since this could keep a few hours. So if anybody wonders why there was a difference in Tweet and Facebook timings, and the story, that’s why. – Anthony
Sorry, I’m not buying it. More on that later, but I got started with some background info for those who might be interested in it (see below) and have to scoot for a while, so I’ll comment later on why I don’t think the detection of sulfur in California is a credible way to determine much of anything about conditions at any time over at Fukushima Daiichi.
TOO TRUE!!! So to lighten the mood a little perhaps, here’s a fun one: Neutrons and Coffee http://www.guardian.co.uk/science/life-and-physics/2010/oct/17/1 (and yes, this really has a LOT to do with neutrons, it’s not a joke).
One that may amaze – Nature’s Nuclear Reactors (yes, there were naturally occurring long running nuclear reactors) http://blogs.scientificamerican.com/guest-blog/2011/07/13/natures-nuclear-reactors-the-2-billion-year-old-natural-fission-reactors-in-gabon-western-africa/
And it may interest some to know that deaths and injuries due to nuclear power generation are about the lowest there are for all forms of electricity generation, including renewable or alternative energy sources (lifecycle): Deaths per TWH by energy source http://nextbigfuture.com/2011/03/deaths-per-twh-by-energy-source.html
Then for whatever the added tidbits of info might be worth to some of you…
Whenever a reactor is ‘shut down’ immediately (a SCRAM), whether by plant operators or automatically, the neutron production is drastically reduced virtually immediately. There are still some spontaneous fissions, but a tiny fraction of the number that occur while operating even at very low power (and boron has been periodically added to the reactors and SFPs to quench those and as an added safety margin against the already remote possibility of re-criticality post SCRAM). The power output is a good reflection of the total amount of neutron production + decay heat contributions. When a reactor is SCRAMd, power output drops to about 7% of full power in one second. The majority of that power is from decay products, not the few spontaneous fissions. IIRC, within 1 hr. of SCRAM, power is down to 1.5% of full power, and it continues the exponential drop from there. Here’s a bit of info on decay heat: http://mitnse.com/2011/03/16/what-is-decay-heat/
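A rough check on those decay-heat figures: the classic Way–Wigner approximation, P/P₀ ≈ 0.066·[t^(-0.2) − (t+T)^(-0.2)] for a reactor shut down after operating for T seconds, roughly reproduces the numbers quoted above. This is a textbook approximation, not the actual plant analysis, and the assumed 1.5-year operating time is illustrative:

```python
def decay_heat_fraction(t_s, t_op_s=1.5 * 365 * 86400):
    """Way-Wigner approximation: fraction of full thermal power remaining
    t_s seconds after shutdown, for a core operated t_op_s seconds.
    Textbook sketch only; accuracy is on the order of tens of percent."""
    return 0.066 * (t_s ** -0.2 - (t_s + t_op_s) ** -0.2)

# Roughly matches the figures quoted above:
print(f"1 s after SCRAM:  {decay_heat_fraction(1.0):.1%}")    # ~6-7% of full power
print(f"1 hr after SCRAM: {decay_heat_fraction(3600.0):.1%}") # ~1% of full power
```

The point stands either way: within an hour of a successful SCRAM, the core is producing only about a hundredth of its operating power, essentially all of it from decay products rather than fission.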
Perhaps I should also note that all indications are that all three Fukushima Daiichi units safely and fully SCRAMd – all control rods fully inserted – automatically. Again, IIRC, that was within 5 seconds of the first ground movement associated with the earthquake. All of the primary safety systems – e.g., backup power from diesels, emergency core cooling systems, immediate closure of the main steam isolation valves, etc. – also came on line immediately, as designed.
The key problem of course was the massive 14-15m tsunami (46-50 ft high!!!) and the fact that the main electrical switchboards and diesel generators were in the basement level – still far above sea level and behind the 5.7m (19 ft) tsunami protection wall. I cannot tell you how many articles and comments I have read castigating a supposed failure to adequately protect against this size tsunami. Ostensibly that’s true – the tsunami occurred, and the plant didn’t have sufficiently high sea walls for it.
Consider, however, that these plants are of 50-60 year old design and had operated safely for 40 yrs – AND this was the 5th largest earthquake in the recorded history of the entire world. Meanwhile, much of the normal non-essential staff evacuated in case of tsunami. Said tsunami of course not only occurred, but killed 20,000 people, demolished ~200 tsunami shelters (!) and something like 200,000+ buildings and homes, caused a very large major oil facility to burn out of control for more than 10 days, & I shudder to think of all of the toxins, chemicals, biohazard wastes, biological wastes, etc. that were strewn across the countryside. All on top of the EQ, which had demolished the infrastructure, making roads impassable for many days until the most basic repairs could be made, cutting power and water and sewage lines, etc.
Meanwhile, it appears that there are zero deaths or cases of radiation sickness (nor any exposures large enough to be an issue in this regard) or even subclinical temporary blood work changes associated with the Daiichi disaster. Of course those things are still possible if they aren’t very careful as they continue work at the plant, considering some of the very ‘hot’ spots around the plant facilities – but it looks pretty unlikely. Frankly, all things considered and as nasty and difficult as the situation is locally, that more damage wasn’t done is rather a testament to some pretty incredible design work & construction back in the 50’s & 60’s.
Rampant claims were made by the main steam/scream media and in blog articles and comments that knowledge of massive tsunami along the N & E coasts of Japan from something like 800 years ago (or was it in the year 856 or thereabouts?) was ignored in a gross dereliction of safety/duty when it came to the nuclear plants (hey, what about all the buildings, schools, tsunami SHELTERS, and so on???)… but the fact is that discovery/warning came from a handful of scientists just a few years ago who were just creating the new field of seismic paleo-forensics (I may have the order of the names mixed up – paleo-seismic forensics? or ? :0) ). Point is, their claim was quite new and still tentative in terms of scientific validity and acceptance, etc.
Then we can get into the issue of just how big a disaster does one have to design for? There will always be a bigger disaster that can be postulated, and that is virtually certain to occur somewhere on the planet in the next x100 years or x1000 years…and such disasters striking a chemical plant or some other types of facilities can create every bit as much damage or more than at a nuclear power plant. Look up Bhopal, India if you don’t vividly recall it already. Or consider that something like 70% of Tokyo’s downtown skyscrapers and buildings are only designed to withstand a 7.0 earthquake – and we’ve clearly seen that far larger is possible.
Others have carried on about the below grade location of the backup diesels and switchboard, citing this as a gross obvious design flaw. Except the design put those key safety components in the basement as protection against heavy flying debris such as ‘tree missiles.’ In other words, trees or telephone poles or the like that are effectively turned into missiles by either hurricane or tornado winds. I’ve also read that it provides some earthquake protection, but don’t personally know if that is correct or not. That part of Japan certainly gets hurricane force winds as best I understand. So…. clearly lessons learned and more needs to be done (actually HAS already been done) to protect against flooding – but it’s not nearly so clear that there was a greater flood risk than the flying debris risk, or that this was in actuality a design flaw that could be reasonably foreseen.
Anyhow, IIRC (not having reviewed it recently), the following is a fairly decent overview of BWR design (e.g., Fukushima design) and major related safety systems, including a lot more factual data (limits, parameters, etc) than most of the typical stuff out there post Fukushima. http://www.enotes.com/topic/Boiling_water_reactor#The_safety_systems_in_action:_the_Design_Basis_Accident
Enuf for now.
Errmmmm, if we got the activated chloride products, where are all the sodium activation products?
…. I’m just saying.
…….(I mean, food for thought…..)
Needs to be adjusted to the level of Least Significant Digits, which in this case is statistically indistinguishable from 0 (zero). That’s the accuracy of molecule counts at the 1 in 100 billion billion (10^20) level.
Moreover, units 1 through 4 were already slated for shutdown and dismantling. Unit 1 was to be shut down at the end of March and the dismantling process started. Had the quake waited only a couple more weeks, the problems would have been less, as there would have been only two operating units rather than three. Units 1 through 4 were supposed to be replaced by units 5 through 8, of which 5 and 6 were already built and survived the quake and tsunami just fine, though they were not running at the time. They are built much higher up from the sea than units 1 through 4.
The Japanese more recently recognized the danger and had already slated the plants for shutdown. What they couldn’t know was that the worst quake in recorded Japanese history was going to strike this year. Had it waited a decade, we wouldn’t be discussing this. The plants on the other side of town at Dai-ni are ok. They survived the same quake and the same tsunami. Those are newer units built higher up.
When the Dai-ichi plant site was first selected, the idea of plate tectonics was not accepted science. They had no idea what a “subduction fault” was as the notion of plate subduction and resulting megathrust quakes were not established science. Once the potential danger was recognized, it was decided to move the site higher up. Also, the plant design of units 5 – 8 does not require outside power to circulate water. They have steam turbines that can use decay heat to generate enough power in an emergency to circulate water.
The ultimate problem here was the loss of the “ultimate heat sink”, in this case the ocean. They had no way to circulate water. What really bugs me, though, is how people attempt to frame-drag decisions made decades ago into today’s knowledge base. What looks like an absolutely idiotic decision today was at the time a perfectly reasonable decision.
@Lisa K
Cooling is always preferable to not cooling.
@Thor
I’d say you’re doomed by natural background radiation, if you follow your line of reasoning. There is also a case to be made that slightly increased doses over background have a preventive effect.
@Jeremy
By proton radiation, are you talking about alpha particles? The only things that come from a reactor are alpha, beta, gamma, and neutron radiation.
Interesting thing is one can read this article and easily see how very smart people who understand a narrow slice of the big picture can get something like climate science so wrong.
Mostly I believe I’m confused why a number of free neutrons per square meter can be used as evidence of ongoing nuclear fission since (again correct me if I am wrong) fission leaves behind neutron sources even after it has stopped. It seems as though you would still get significantly irradiated seawater (and radioactive sulfur) even if the reaction was completely shut down.
I keep looking at this article and thinking that it is so twisted up, and the calculation assumptions so flawed in so many different ways, that it must be some mistake on my part with my off-the-cuff impression… and that I need to double check some technical aspects that I haven’t visited in a while – but I don’t think it’s me. I think it really is mucked up that badly, which is a scary thing coming from folks with those sorts of credentials – tho I haven’t checked their bios either, so who knows. Maybe it’s because they’re atmospheric chemists, and not health physicists/radiological safety engineers/nuclear engineers/reactor water chemistry experts or something along those lines.
First, assuming for the sake of argument only that there even were any neutrons to speak of (highly doubtful, especially since reactors scrammed properly, cooling functioned for the most part properly for about an hour before the tsunami hit, and boron – strong neutron absorber – was being added too)… then:
Pretty sure they don’t know the amount of seawater put into those cores – I may be recalling incorrectly, but thought that there wasn’t nearly the ‘accounting’ in the first few hours/days to be able to come close to knowing. And without knowing reasonably precisely just how much Cl was available to begin with, I don’t see any way in h-e-double-toothpicks that one could then correlate to 1501 – not 1500 mind you {VBG} – S-35 atoms per m3
Not to mention they’re saying 1501 ATOMS per m3… the half life of S-35 is 87.44 days, so what’d they do, count 2 whole disintegrations over 4 days after pulling multiple olympic swimming pool sized volumes of air thru their filter/counter, and then round it up? (I’m being facetious of course)
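For what it’s worth, the expected number of decays is straightforward to estimate: N atoms with half-life T½ yield N·(1 − 2^(−t/T½)) decays over a counting time t. Taking the paper’s 1501 atoms/m³ at face value:

```python
n_atoms = 1501            # S-35 atoms per m^3 of air (the paper's figure)
t_half_days = 87.44       # S-35 half-life
t_days = 4.0              # counting/collection period

# Decays expected from n_atoms over t_days of radioactive decay
decays = n_atoms * (1 - 2 ** (-t_days / t_half_days))
print(f"~{decays:.0f} decays per m^3 of sampled air over 4 days")  # ~47
```

So the count rate is low but not absurdly so; a modest volume of filtered air gives countable statistics, though low-background beta counting is still required.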
No way to estimate the length of time the sea water was in each core, because no one knows what the escape pathways are for each reactor. Clearly much of it never made it up the hardened stacks, or hydrogen wouldn’t have accumulated on the SFP floors and exploded – and look at what that does to the dispersion and distribution of whatever was being released.
Some evaporation occurred from the exposed spent fuel pools, but geeze, there’s effectively no real heat column there to really loft anything up all that high, no way to estimate even the exposed pool surface available for evaporation, no indications that I’m aware of that any of the pools actually boiled even, or that any of the fuel rods, or hot equipment (meaning radioactively, not thermally) was ever uncovered. Nor any way to tell how much of the seawater they attempted to get into the pools ever actually made it in vs. just overflowing and pooling in the basements, how much diluted with the existing non-seawater… and so on.
Yet the article says: “calculated that 400 billion neutrons were released per square meter surface of the cooling pools” Um, so they figure all the activated S came solely from the spent fuel pools???!!! Makes no sense even in context with the rest of the article. Makes even less sense to try to say x neutrons were released per square meter seawater surface within the reactor vessels either. Seems that we have to assume they were meaning from all possible sources tho – even tho that creates all the more problems with any sort of meaningful calculations.
No way to tell what the core geometries were – estimates of melt vary by reactor and run from partial melt of central core only with the rest of the bundles/rods still in place, to virtually entire core slumped either to core plate or bottom of reactor vessel or more likely, some combination thereof. Plus, sea water crusts on the fuel bundles and in the channels and gawd knows where else, so no way to even begin to tell how water was flowing within the reactor vessel. Fluid flow for cooling was a massive concern with the use of seawater, because there wasn’t any way to know these things or predict just how much the crusted/plated salt would block fuel bundle channels, etc.
So there’s no way in the world to calc from values of neutron penetration ‘in water’ when you’re actually dealing with some combination of water and steam, with the fuel in who knows what geometry – all of which affects just where neutrons would even be scattering to, what materials they’d be hitting, and what other than the water would be absorbing, reflecting, or letting them pass thru. Bet you they used clean fresh still water values to boot, i.e., what would actually be used in a normal reactor or SFP – which is incredibly free of impurities, & so wouldn’t be anything close to accurate compared to seawater; let alone seawater pumped up just after a massive tsunami, with the impurities inherent in seawater plus all the crud that may have been entrained in it even if they were able to filter to some degree – and I’ve no idea if they were even able to filter!!
Add to it all that there’s no way to tell just how much water is in each reactor vs. steam. For most of the time (and still in U2 & U3, & in one channel of U1) they thought the water was roughly 1/2 way up the fuel bundles… but then they replaced one of the water level gauges in U1 (or the channel, which may have two gauges, I’m not clear which) only to have it show water level below core bottom… and the neutron flux is quite different in steam vs. in water – yet they say they used the depth neutrons penetrate WATER, at the same time saying supposedly almost all the water pumped in immediately flashed to steam. My head is spinning with the inconsistencies/contradictions.
Then, as you’ll already have picked up, there are all the Cl and/or S losses within the system before it’s even released – to salt plating out on surfaces, crusting up within the reactor vessels themselves, and anywhere else along the release pathways that’s got the right surface or chemistry or temperatures. I’d just love to know how they accounted for that (bet they didn’t).
All of that and we haven’t even gotten to the atmospheric dispersion problems, & I’ve had enuf and am not going into the myriad aspects of trying to plume model over thousands of miles. I know it’s been a few years since I ran plume projections and building interference etc., but I just don’t believe that they got SO accurate about it that they could possibly come anywhere close to telling where their detector was relative to plume center, either vertically or horizontally. Going back to the beginning, I don’t see how they could possibly have accurately projected plume travel from Fukushima 5+ days across the Pacific, and account for all washout from any storm showers, eddies, loft, dispersion, etc., etc., such that a single measurement point on the far side of the ocean would be in any way able to use data to back calculate and determine diddly about quantitative release values at the source. Even if they’d started with a high loft situation and a discrete plume with high levels of S-35, by the time you got across the ocean a single site collection won’t give you jack to estimate any meaningful ‘quantitative’ source term values, let alone some bizarre value of ‘neutrons released per square meter surface of the cooling pools.’
And what is it with their timeframes? Am I missing something? First they say they originally detected elevated S 15 days after Fukushima first started pumping in seawater (why 15 days, when I & Cs were detected 5 to 7 days after the accident occurred?). Then the graph text states they first detected it after 10 days – IF we assume their 3rd day moving box means 3rd day post accident, maybe that’s the 7 day more typical transit, but it doesn’t match initial detection on day 15 as stated…. not to mention the ‘moving box’ picking up this massive clean spike is all a back calculated spike apparently created to fit all of their preconceived assumptions! Then further down in the article they say the calculated neutron ‘release’ was from March 13 thru March 20 – but the accident happened on March 11, and the 13th is 2 days later, not 3 (re the moving box) – & how does that fit with them first detecting it 15 days after Fukushima started pumping in seawater?
Oh, and neutrons don’t “leak” from the fuel rods when they melt. Neutrons are emitted from active fissioning fuel, and the fuel rod cladding doesn’t do much to slow them down, which is why water is used as a moderator, to slow and reflect some of the neutrons back into the core, so the reaction can be self sustaining. Whatever core melting occurred at Fukushima, the reactors had already been shut down, and the melt was from heat, not from an operating core with ongoing fission.
Ok, this is intriguing. Anyone here a solar and/or “cosmic ray atmospheric interactions” guru? As they mention, cosmic rays do produce some atmospheric S-35. Well, perhaps not too surprisingly, it also turns out that solar flares significantly increase the cosmic ray flux hitting the earth (I’m ignorant enuf about this area that for all I knew, flares were associated with a magnetic burst that’d actually reduce the cosmic flux impacting Earth). Then, lo and behold, recall the big Class X flare in early March? March 9 to be exact: Sol is Finally Waking Up http://wattsupwiththat.com/2011/03/10/sol-is-finally-waking-up/ Ah, I see from a quick skim there was another Class X Feb 15, but I’ll hit on Q’s about these below. PLUS, without verifying, I gather there were at least 2 Class M flares: 03/07 19:43:00 20:12:00 M3.7 and 03/08 10:35:00 10:44:00 M5.3 http://www.youtube.com/watch?v=Kgh9cXZ-IJI Perhaps there were others in the days preceding Japan’s EQ & tsunami?
It sure would be interesting if others here were willing to kick in their knowledge (or bounce this along to other gurus who might be interested but not happen to catch this post) to work up a back of the envelope calc, or even just a good idea of the probability, that the S-35 detected was from solar flare effects on the cosmic ray flux hitting earth and subsequent atmospheric interactions – and that it wasn’t Fukushima at all! That’d be par for the course – history is rife with little examples where folks were dead certain detected radiation – or mutations, or illness, etc. – was ‘because of the nuclear power plant’ or ‘because of the research reactor,’ when solid investigation discovered entirely different clear cut sources or causes and the power plant or research reactor etc. were completely absolved of any blame.
Heck, we could post it as a follow up article here if Anthony would allow it. I just don’t have it in me right now to try to chase down and learn the associated relevant aspects well enough to be able to figure out if the S-35 levels detected would be well within the size that might be expected from actual flare activity prior to Fukushima Dai-ichi, combined with the general atmospheric conditions associated with possible increased S-35 transport to what I assume was a close to sea level detector….
So now begins the ignorance questions of the day… Q’s that I would think necessary to know the answer to in order to solve the puzzle plus ones I’m just curious about, (so thanks in advance for any replies!), not in any particular order:
1. How to best determine which flares are relevant to the question, e.g., how far prior to March 28th, the date they detected the S-35 spike do we look for flare activity?
2. Are there actual measurements of the Earth cosmic ray flux for that time period that can be accessed? If not, is it even possible to do a rough calculation to figure out what it likely would have been?
3. How much lag is there for the cosmic ray flux to increase after a flare?
4. Can we determine the approx. S-35 increase expected per x increase in cosmic rays?
5. What determines whether S-35 makes it down to the surface or not? and what is the lag time from formation to possible detection at ground level?
6. Can a large flare during a weak solar magnetism period cause significantly larger cosmic ray flux to hit Earth as compared to an identical flare but stronger overall solar magnetism field?
7. Does the flare(s) angle on striking the Earth affect where the S-35 is most likely to be detected?
In other words, could the sudden spate of solar flares account for the S-35 increase/spike better than Fukushima? Or of course, can anyone else think of other possible sources. :0)
Plus of course, were there any other solar flare sets & appropriate conditions over the past 2 years since these folks have been monitoring that were similar to this recent spate?
@Patrick
Aaah, yes, perspective. And my perspective is that there is a difference between external radiation, which is transient, and a hot particle that is absorbed by the tissue inside your body and keeps radiating the same spot for years and years. Sulfur-35 isn’t that big of a problem since it will basically be gone within 3 years.
But, consider Strontium-90, which has chemical properties similar to calcium and tends to be absorbed by bone structure. It is a beta particle emitter, and will keep radiating bone and bone marrow for the rest of your life (half life of 28.8 years). Which source do you think gives the larger chance of cancer? The small hot particle that is virtually undetectable from the outside, or the radiation received during an 8 hour airplane flight? The interesting part is that the total dose received, averaged over your entire body, might actually be higher in the airplane case…
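The persistence contrast is easy to quantify from the half-lives quoted above (87.4 days for S-35, 28.8 years for Sr-90), ignoring biological elimination; a minimal sketch:

```python
def fraction_remaining(t_years, t_half_years):
    """Fraction of a radionuclide remaining after t_years of pure decay
    (biological elimination ignored)."""
    return 2 ** (-t_years / t_half_years)

s35 = fraction_remaining(3, 87.4 / 365.25)   # S-35 after 3 years
sr90 = fraction_remaining(3, 28.8)           # Sr-90 after 3 years
print(f"S-35 after 3 years:  {s35:.1e}")     # ~2e-4 -- effectively gone
print(f"Sr-90 after 3 years: {sr90:.2f}")    # ~0.93 -- nearly all still there
```

So the “gone within 3 years” claim for S-35 checks out, while a retained Sr-90 particle would barely have decayed at all over the same period.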
As for our energy supply? Yes, I do believe that nuclear power generation is necessary to meet the energy demands of our world. But the safety measures of the nuclear reactors currently operating around the world are clearly inadequate. Passive nuclear reactor safety is much preferable to active intervention in case of failure. Liquid fuel Thorium reactors sound interesting though, maybe there is a future there – I don’t know. I’m digressing though….
Is it me or does every press article on Fukushima basically consist of:
Scientist gives a bunch of numbers to a reporter who doesn’t understand them.
Reporter relates gibberish (to them) numbers to some other number not given.
Reporter concludes that it’s, you know, bad.
If you don’t know what the numbers *mean*, they’re just numbers.
Also, if you’re going to compare radiation to background radiation, what’s the background radiation, and how does that relate to, say, downtown Miami or Lake Tahoe?
Thor:
A hot particle is a wholly different animal than the molecules of S-35 which they are measuring in the air. A hot particle is a whole bunch of atoms bound together. In this study they are measuring individual atoms in the air that are not bound and could only be ingested as individual molecules. You correctly noted that as the S-35 atom decays, it is immediately converted to a Cl-35 atom (non-radioactive) with the emission of a beta particle. The beta particle is considered a low energy beta at 167 keV. That means not a lot of energy is deposited as the particle interacts with matter, e.g. cellular components. 0.07E6 atoms (from graph) converts to 0.63 Bq (disintegrations per second) for S-35. As numerous posters have noted, the naturally occurring nuclides in your body are putting out about 4400 disintegrations (alpha, beta and gamma mix) per second. This doesn’t even count the amount of radiation interacting with your body from all sorts of external sources of radiation.
The principal route of ingestion in this case would be inhalation. I don’t have my book handy that would tell me exactly how SO4 is utilized/excreted by the body, but you are correct that sulfur is readily taken up by the body. FYI, the critical organ is the testes (principally due to accumulation in the bladder as sulfur is excreted via urine). However, I would hazard that SO4 would most likely be uniformly distributed throughout the body after the body breaks it into usable components.
There is no difference in dose delivery when the source is an external gamma ray and equivalent to the dose from the internally deposited beta emitter. Dose is dose. Your use of the word “transient” as applied to the effect of a gamma/x-ray exposure belies a misunderstanding of how radiation interacts with the human body. Both a beta emission and a gamma/x-ray emission will deposit energy in the body until the photon or particle runs out of energy – or it exits the body. There is a finite time for this event to occur (usually milliseconds) and then the event is over. So in the sense that you appear to be using the word “transient”, both beta and gamma/x-ray deposition events are transient.
There is something like a 1 in a quadrillion (10^15) chance that one “hit” from a radiation event will cause a cancer. The number of disintegrations (and the dose) that you might receive by ingesting the SO4 would be absolutely minuscule compared to what you are receiving now. The researchers got one thing right – there is no health threat.
As a lot of commenters have noted, something does not make sense with this study.
New sulfate particles created last night:
http://survivaljapan.wordpress.com/2011/09/22/typhoon-roke-aftermath-in-fukushima/