Pinned by moderator

Rhys Taylor

• Astronomy/Astrophysics ☄  - 
Help keep Arecibo Observatory funded

Arecibo Observatory has been facing funding problems for many years. Now the National Science Foundation is preparing an "Environmental Impact Statement", which is lawyer-speak for "how should we proceed?". Options range from keeping everything as it is to full site closure. Although a decision is not likely until sometime next year, the official public consultation lasts only until June 23rd this year. So this is your chance to have your say in Arecibo's future.

If you simply want to express your support for Arecibo and don't wish to commit to any specific option under consideration, please consider signing the petition linked below. If you want to be more vocal, or prefer some specific funding model, you can send the NSF an e-mail or even write them a letter, if anyone remembers what those funny "envelope" thingies are. Instructions are here:

If you aren't sure if Arecibo should continue receiving funding or just want more information about the situation, you can read my personal experiences with the Observatory at the following link. A suggested generic show of support message is included for those who wish to write to the NSF; feel free to modify it as appropriate.

The short version of the above article is that Arecibo is very far from being outdated, nor is it likely to be surpassed in the next decade or two. Arecibo is an extremely mature facility - rather than being outdated, it's more capable now than ever. It's had many upgrades since its construction in 1963; new discoveries are still resulting from the last one in 2004, and further upgrades could improve it still more. No other planned facility could fully supersede Arecibo, save perhaps the Square Kilometre Array, which is unlikely to be operational within the next 15 years (and if you're American and worry about these things, the US isn't playing much of a role in that). Even the SKA will not necessarily reproduce, let alone exceed, all of Arecibo's capabilities. Arecibo requires a relatively modest amount of funding for a unique and diverse range of scientific outputs.

Some projects which depend on Arecibo:
ALFALFA - a large-area survey of neutral hydrogen that's catalogued over 15,000 galaxies to date
(see those galaxies here:

GALFA-HI - a project to map the hydrogen in the Milky Way with exquisite sensitivity and resolution:

PALFA - detecting and measuring pulsars and the mysterious fast radio bursts:
Arecibo was essential to the 1993 Nobel Prize in Physics: the discovery of the first binary pulsar there provided the first good evidence that gravitational waves exist.

AGES - a deeper survey of hydrogen than ALFALFA over a smaller area, which has discovered starless hydrogen clouds and giant streams:

GALFACTS - measuring the magnetic field of the Milky Way:

NANOGrav - a multi-telescope project (of which Arecibo is an essential part) that uses pulsars to directly measure gravitational waves that instruments like LIGO cannot:

Not a specific project, but let's not forget the planetary radar (one of only two such systems in the world), which also measures small bodies in the Solar System (e.g. asteroids) and helps determine, among many other things, whether they're likely to hit us:

And it also looks for aliens:

What more can you ask of a radio telescope?

Thanks to +Lacerant Plainer and +Peter Edenist for permission to post this here; I know this is at least as much about politics as science.
We the People petition: "The Arecibo Observatory in PR is in danger of being closed or severely impaired; take action to keep it well funded." Created by E.D. on May 27, 2016. Needs 94,642 signatures by June 26.

Es Einsteinium

• Physics  - 
Transistors: Understanding the Future

The mid-20th century saw the emergence of electronics, and with it, the transistor. Transistors act as tiny electronic switches and amplifiers; chained together, these switches carry out the logical and mathematical instructions at the heart of modern electronics. This physics lesson from Es Einsteinium explains how transistors work through amplification and switching, and reviews their revolutionary history.
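
A toy sketch of the idea (my own illustration, not from the lesson): if we model a transistor as nothing more than a voltage-controlled switch, chaining a few of them already yields the logic gates that digital circuits are built from.

```python
# Toy model (not real device physics): a transistor as a voltage-controlled
# switch. Composing such switches gives logic gates, the building blocks of
# the "mathematical instructions" in modern electronics.

def transistor(gate: bool) -> bool:
    """An idealized switch: conducts only when the gate voltage is high."""
    return gate

def nand(a: bool, b: bool) -> bool:
    # Two switches in series pull the output low only when both conduct.
    return not (transistor(a) and transistor(b))

# NAND is functionally complete: every other gate can be built from it.
def not_(a): return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b): return nand(not_(a), not_(b))

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            print(a, b, "-> AND:", and_(a, b), "OR:", or_(a, b))
```

NAND being functionally complete is why a single mass-produced switching element suffices to build every circuit in a computer.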


Inspired by the biology of insects like bees, a research team at Harvard's school of engineering and applied science is developing man-made robotic bees - the Robobees - that could play a key role in the future.

A Robobee measures about half the length of a paper clip, weighs less than a tenth of a gram, and flies using artificial muscles made of materials that contract when a voltage is applied. Rotary motors, gears, nuts or bolts cannot be used at such a scale, where even a small amount of turbulence would knock the Robobee out of the air. The robot includes various smart sensors, such as vision and UV sensors, to navigate its environment. The problem is that the robot is too small to carry even the tiniest microchips, meaning that the Robobee cannot make decisions on its own. It has to transmit the collected data over a wire to a computer for interpretation. One of the final goals in the development of the Robobee is to make them fly in a swarm.

Continue reading at:

Page of Harvard's Wyss Institute for Biologically Inspired Engineering about the Robobees:

Scientific Report in the Science Journal:
Robobees - robots inspired by the biology of insects like bees - could play a key role in the future by helping in rescue missions or by pollinating crops.
Scientists have discovered an often ignored but very influential process that affects more than just the air we breathe: it affects our weather and climate. The research identified oligomerization - a process that causes smaller molecules to combine and form larger molecules - as the most influential process affecting secondary organic aerosol formation in the atmosphere. Learn more about this PNNL-led research at

* * *

One of the secrets relevant to climate change involves how emissions from nature and human activities are changed in the atmosphere to become secondary organic aerosols (SOA).

Now a team of scientists led by Pacific Northwest National Laboratory has discovered an often ignored but very influential process that can affect not only the air we breathe but also the weather and climate. Oligomerization, or molecular bonding, is a process by which smaller molecules combine to form larger molecules, increasing the amount and lifetime of secondary organic aerosols in the atmosphere. This bonding may hold the key to more successful modeling of the particles, yet most atmospheric models ignore oligomerization.

"The study's comprehensive sensitivity analysis clearly shows that leaving out secondary organic aerosol oligomerization processes in climate models could severely limit our ability to understand and predict their impacts," noted Dr. Manish Shrivastava, PNNL atmospheric scientist and lead author of the study. "Our variance-based sensitivity analysis technique shows a lot of promise in ranking the most important processes/parameters producing climate-relevant SOA particles."

Researchers from PNNL and the +University of California, Berkeley started with real-world measurements from the 2010 Carbonaceous Aerosols and Radiative Effects (CARES) field study. They also included recent laboratory measurements, especially those concerning rapid oligomerization processes affecting the formation and evolution of secondary organic aerosols from both natural and human-made sources. Based on these measurement sets, the team developed a new method to model aerosol particles, which they used in the Weather Research and Forecasting model coupled to chemistry (WRF-Chem). In a first-of-its-kind sensitivity analysis for SOAs, the team looked at how model estimates of SOAs changed when simultaneously varying seven model parameters, such as particle-phase oligomerization and the precursor chemical emissions that lead to SOA formation. Surprisingly, they found that the oligomerization process was even more influential than the precursor emissions.
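
As a rough illustration of what a variance-based sensitivity analysis does - ranking parameters by how much of the output variance each one explains - here is a minimal sketch. The model and parameter names below are made up for the illustration, not the actual WRF-Chem setup.

```python
# Minimal variance-based sensitivity sketch: estimate first-order indices
# S1 = Var(E[Y | X_i]) / Var(Y) by binning on each input. The toy "SOA"
# model below is hypothetical and deliberately nonlinear in oligomerization.
import random

random.seed(0)

def model(oligomerization, emissions, oxidant):
    # Hypothetical SOA response (illustrative only).
    return emissions * (1.0 + 8.0 * oligomerization**2) + 0.1 * oxidant

N = 20000
samples = [(random.random(), random.random(), random.random()) for _ in range(N)]
ys = [model(*s) for s in samples]
mean_y = sum(ys) / N
var_y = sum((y - mean_y) ** 2 for y in ys) / N

def first_order_index(dim, bins=20):
    """Crude estimate of Var(E[Y | X_dim]) / Var(Y) via binning on X_dim."""
    groups = [[] for _ in range(bins)]
    for s, y in zip(samples, ys):
        groups[min(int(s[dim] * bins), bins - 1)].append(y)
    cond_means = [sum(g) / len(g) for g in groups if g]
    m = sum(cond_means) / len(cond_means)
    return sum((c - m) ** 2 for c in cond_means) / len(cond_means) / var_y

for name, d in [("oligomerization", 0), ("emissions", 1), ("oxidant", 2)]:
    print(f"{name:16s} S1 = {first_order_index(d):.2f}")
```

In this toy setup the oligomerization parameter comes out as the most influential, mirroring the kind of ranking the study's analysis produces.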

When scientists included the oligomerization process in the model, natural vapors wafting from trees proved to be the largest instigator of SOAs in the study area, and model predictions were closer to surface measurements from the CARES 2010 field campaign. Model simulations that neglected this rapid oligomerization predicted much lower SOA particle concentrations than measured, and overemphasized the contribution of fossil-fuel emissions in forming SOAs.

Finding influential processes and parameters that control the particle formation is critical to improve climate models and make reliable predictions about impacts on the present and future climate.

Why is this important? Unseen by the naked eye, large quantities of carbon-containing vapors enter the atmosphere from trees, fossil-fuel burning, and forest fires. Through a complex interaction between sunlight and atmospheric oxidants, these vapors cook an atmospheric stew holding millions of new carbon-containing molecules. Some of these molecules then condense into SOAs that in turn change clouds, influence precipitation, and affect the amount of solar energy reaching the Earth's surface.

Recent measurements led scientists to identify several processes that improve the understanding of how chemically complex SOAs form and persist in the atmosphere. As this understanding evolves, scientists need to determine the most influential parameters and processes affecting SOAs. Ultimately, the most influential must be included in models and long-term climate simulations to better understand the impacts of SOAs on climate change.

In this PNNL study, the scientists found that accounting for a rapid particle-phase oligomerization process in the simulations not only increased the amount of SOA particles, but also affected which sources contributed the most SOAs.

What's Next? As process modeling improves, scientists will use comprehensive sensitivity analysis techniques such as the one demonstrated in this study to tease out the secrets and rank the most influential parameters affecting the amount of secondary organic aerosols in the atmosphere. More effectively modeling these influential parameters will help improve the understanding of their impacts.  

Acknowledgments: This research was supported by the +U.S. Department of Energy Office of Science, Office of Biological and Environmental Research for the Atmospheric System Research program. 

Daniel Montesinos

• Biology  - 
Two invasive acacia species secure generalist pollinators in invaded communities

Exotic entomophilous plants need to establish effective pollinator interactions in order to succeed after being introduced into a new community, particularly if they are obligatory outbreeders. By establishing these novel interactions in the new non-native range, invasive plants are hypothesized to drive changes in the composition and functioning of the native pollinator community, with potential impacts on the pollination biology of native co-flowering plants.

In this study, we used two different sites in Portugal, each invaded by a different acacia species, to assess whether two native Australian trees, Acacia dealbata and Acacia longifolia, were able to recruit pollinators in Portugal, and whether the pollinator community visiting acacia trees differed from the pollinator communities interacting with native co-flowering plants.

Our results indicate that in the invaded range of Portugal both acacia species were able to establish novel mutualistic interactions, predominantly with generalist pollinators. At each of the two studied sites, only two other co-occurring native plant species presented partially overlapping phenologies. We observed significant differences in pollinator richness and visitation rates among native and non-native plant species, although the study of β diversity indicated that only the native plant Lithodora fruticosa presented a differentiated set of pollinator species. Acacias received a large number of visits by numerous pollinator species, but massive acacia flowering resulted in visitation rates frequently lower than those of the native co-flowering species.
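
For readers unfamiliar with β diversity, here is a minimal sketch of the kind of comparison involved, using Jaccard dissimilarity between pollinator assemblages. The species sets below are hypothetical examples, not the study's data.

```python
# Pairwise beta diversity via Jaccard dissimilarity between pollinator
# assemblages. Species lists are hypothetical, for illustration only.

def jaccard_dissimilarity(a: set, b: set) -> float:
    """1 - |A ∩ B| / |A ∪ B|: 0 = identical assemblages, 1 = no shared species."""
    return 1.0 - len(a & b) / len(a | b)

pollinators = {
    "Acacia dealbata":     {"Apis mellifera", "Bombus terrestris", "Eristalis tenax"},
    "Acacia longifolia":   {"Apis mellifera", "Bombus terrestris", "Lasioglossum sp."},
    "Lithodora fruticosa": {"Anthophora sp.", "Bombylius sp."},
}

plants = list(pollinators)
for i, p1 in enumerate(plants):
    for p2 in plants[i + 1:]:
        d = jaccard_dissimilarity(pollinators[p1], pollinators[p2])
        print(f"{p1} vs {p2}: {d:.2f}")
```

A plant whose pollinator set barely overlaps with the others (like Lithodora fruticosa in the study) stands out with dissimilarities near 1.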

The establishment of these mutualisms contributes to the effective and profuse production of acacia seeds in Portugal. Despite the massive flowering of A. dealbata and A. longifolia, native plant species attained visitation rates similar to those of the acacias.

Link to pdf:


Marco Calignano

• Astronomy/Astrophysics ☄  - 
Gravitational waves are perturbations of space-time, usually produced by sudden variations in gravitational fields. This happens, for example, when two black holes collide and merge into one. LIGO (Laser Interferometer Gravitational-wave Observatory) detected the first gravitational wave in September 2015; now a second gravitational wave has been detected, again from the coalescence of two black holes into one.
Here you can find the news from the LIGO observatory:

and here you can find the actual paper:

Todd William

• Robotics  - 
Some ideas on consciousness and how it might be simulated...
(references at bottom of post)
A Brief Thought Experiment ~ Simulating Consciousness

There is a raging debate going on. Will we ultimately advance computing technology far enough to create artificial consciousness? Experts in many fields constantly weigh in on this, and there's no shortage of strong opinions for and against the possibility.

It's necessary to begin with the assumption that the universe and everything in it can be explained entirely by physical, natural phenomena/laws - the philosophical position of Naturalism (if this isn't your worldview, this debate isn't for you).

Not everyone who maintains this perspective, though, agrees that such a stance automatically implies that consciousness can be replicated with computers - and for good reasons.

Bad Metaphors

For starters, the metaphors we've used through history to describe the brain have proved embarrassingly shortsighted.

Descartes thought the brain operated like a hydraulic pump. Freud explained it as being similar to a steam engine (1). As recently as a century ago it was thought to be a telephone-switching style network, a battery, and now, thanks to the current paradigm, a digital computer.

What's more likely: that people 100 years from now will look back amazed that we finally got it right, or that they'll view our assumptions with the same level of amusement and chuckle at how shortsighted we were?

Perhaps we're the lucky generation that happens to live in the era when we finally got the metaphor right. Maybe computing power really is all it takes to simulate consciousness. There is a good case to be made for this.

Computing Power

Scott Aaronson, a respected theoretical computer scientist at MIT, made an interesting observation about the potential for the artificial development of consciousness through computing:

"Because the brain exists inside the universe, and because computers can simulate the entire universe given enough power, your entire brain can be simulated in a computer."

"And because it can be simulated in a computer, its structure and functions, including your consciousness, must be entirely logical and computational." (2)

This does seem to make sense, and it implies that if you don't believe consciousness requires supernatural elements, you must conclude it can be replicated in a computer - in theory.

Aaronson's statement seems to sneak in a crafty assumption, though: that the theoretical computer can simulate the entire universe. How do we know this? Isn't it possible the universe contains elements that cause computation problems?

For instance, dividing by zero, or any calculation involving infinity, creates a multitude of computational issues. And yet the universe is full of objects with zero size and infinite density - otherwise known as black holes.
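
That computational awkwardness is easy to see directly. A quick demonstration of how zero and infinity behave in ordinary IEEE-754 floating-point arithmetic (my example, not the author's):

```python
# Why zero and infinity are awkward computationally: IEEE-754 float
# behavior as exposed by Python.
import math

inf = float("inf")

print(inf + 1 == inf)        # True: infinity absorbs finite arithmetic
print(inf - inf)             # nan: no well-defined answer
print(inf / inf)             # nan
print(float("nan") == float("nan"))  # False: nan is not even equal to itself

try:
    1 / 0                    # division by zero raises an exception outright
except ZeroDivisionError as e:
    print("error:", e)
```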

Testing for Consciousness

Maybe we're really putting the cart before the horse here. We really ought to be asking whether it's even possible to test for consciousness, and if so, how. If we can't find an objective way to detect consciousness, then what's the point of trying?

A machine merely telling us it feels conscious is meaningless - how hard would it be to program such a thing? Even the Turing Test only accounts for our perception of intelligence, not actual consciousness (3). It seems we really need something more meaningful and objective.

American philosopher John Searle provides an interesting perspective on this:

"Suppose we discover that there are very specific brain processes that cause consciousness - so that, for example, in brain-damaged patients we can re-introduce consciousness by artificially producing certain kinds of brain mechanisms. To give these mechanisms a label, let's just call them XYZ: it's XYZ that causes consciousness."

"Now we go down the phylogenetic scale and we discover, no question, that dogs and cats and primates all have XYZ; but when we get very low we discover that termites have XYZ but snails don't. Then we'd have to say, 'Well, OK, snails are conscious and termites are not'" (4)

On this basis, there is at least hope for a methodology by which we could judge whether we've succeeded. We'd then need only duplicate the brain processes described above in digital form to achieve our goal.

This would not only make the approach much more efficient, but also adds a real sense of objectivity to the whole process.


If we do succeed in finding particular mechanisms in the brain that cause consciousness, there are some interesting parallels that should give us all some pause.

Consider the history of flight. For thousands of years, flight was completely unobtainable for humans. It was a difficult problem to crack.

Yet notice, when humanity did learn to fly, we didn't achieve flight by copying insects and birds. We did it by understanding the basic concepts of aerodynamics and developing our own models.

The result? By engineering flight artificially, we fly many times faster and higher than anything else on earth - we blew right past the abilities of everything that can fly naturally.

Paradigm Shift

Rather than trying to duplicate human capacities, the path to developing artificial consciousness may instead be understanding the mechanisms of consciousness.

Like flight, would this imply we could then build our own, potentially better models of consciousness?

It's not unfathomable. We already know that signals in silicon circuits travel far faster than signals in the biological material of our brains - thousands of times faster (5). We might just blow by everything on earth that naturally has consciousness.

What would that even mean? What might it produce?

What would the conscious experience of thinking thousands of times faster feel like?

Are we really ready for this?


(3) Computing Machinery and Intelligence (1950)
(4) Conversations on Consciousness (2007) Susan Blackmore (211-212)

(Artwork by: Igor Morski)

Using the metaphor of the brain as a computer might be the right way to go. However, what if the brain is not a computer but a conductor of a source? If so, all the effort spent trying to find the "right" material to build AI would fall short. Perhaps it would be better to focus our efforts on the discovery of a material that would enhance the power or ability of the brain to tap the source?

Developing AI, to me, is like trying to reinvent the wheel. The wheel is already in use, we simply need to figure out how to change from stone wheels to radial tires.

ULg Reflexions

• Ecology/Nature ✿  - 
#oceanography   #climatechange   #chemistry   #GHG  
A team of researchers from #Belgium has identified significant concentrations of #methane in the surface waters of the #North #Sea, mainly near the Belgian and English coasts. To understand the origins of this methane, it is necessary to go back 16,000 years, to a time when forests and peatlands connected England and Ireland to continental Europe. Trapped in marine sediment today, this organic matter produces methane which is easily released into the atmosphere from the shallower zones of the basin. This ground-breaking study includes the coastal regions in the quantification of the methane cycle - a quest made all the more difficult by the many sinks and sources of this hydrocarbon, of both anthropogenic and natural origin. A better understanding of methane, the second most efficient greenhouse gas after carbon dioxide, could be key to slowing down climate change. A study published in Scientific Reports. +Université de Liège (ULg)
Info:
Original paper:
"methane, the second most efficient greenhouse gas after carbon dioxide" <-- This is incorrect. Carbon dioxide is much more abundant in the atmosphere, but methane as a molecule is a much more powerful greenhouse gas.

See here:

Marco Calignano

• Astronomy/Astrophysics ☄  - 
Loop quantum gravity (LQG) is a theory that tries to integrate quantum mechanics and gravity. As Einstein showed, gravity is a property of space-time. LQG describes space-time as a network of finite loops on the scale of the Planck length.
Using this theory, Pranzetti and his team were able to obtain a better calculation of the entropy of a black hole, and also to show that the holographic hypothesis for black holes is accurate and plausible.
+Andreas Geisler theories with holes in are thought of all the time... Pun intended. 

About this community

Welcome to the Science Community on G+ featuring science research, news, and more.

New Members: Please read the rules category, the pinned post, and the Community Policy category before posting. This is not a meme depository. Please do not post collections. This applies to posts and comments. *One liners will be deleted!* Please cite any work you reference or state as fact. Put effort into your posts. We all prefer intelligent, descriptive posts that tell us what to expect in any links and/or videos.

***Basic Guidelines:***
1. Post about science only. The science should be obvious.
2. No links / videos without an explanation of the science and links to reputable sites with references, ideally to the original research papers.
3. No spam, flooding or reposts (English only please).
4. Be civil.
5. No plagiarism.
6. Please do not self-promote or advertise services.
7. No memes or infographics without prior approval from mods.
8. Please do not argue with the moderators. Take the time to make a reasoned, logical case (if required). Do not remove moderator comments.
9. Please check comments on your own post; do not disable comments unless instructed to by moderators.
10. Not more than one post in 12 hours. (Anti-flooding guideline)

*Announcements* Any announcements of events that members of the community are invited to need to be cleared by moderators.

*Questions* We welcome thoughtful questions. We do ask that you have done some work before you ask. If you have a question, please read the link below before you ask. This is not a place to do your homework.

The moderators reserve the right to ban, remove or carry out any action on posts which, in their view, are not in keeping with the community focus, may be repetitive or uninteresting, and/or do not further the case for science or the community.

Pam Bloom

• Physics  - 
Hello! I am new here, so hope I'm in the right place!

I'm a new science fiction author aiming at the younger end of the market, and as such am blogging about general sciency things on my website,

My latest blog is about the possibility of wormholes, and is called Timey wimey, wormy squirmy. Check it out! I'd love to connect with other people interested in general science.
Thanks! I'm just an ordinary person with no science background, so my blogs are more your basic 'explain this in simple terms' type.
Local wind events often lift particles into the atmosphere, where they affect the mechanisms of our climate. Now, scientists at PNNL have developed a new method to represent small-scale, localized wind events that is computationally efficient in global models. When researchers considered local wind events in a global climate model, they found two surprising results: In some cases, the amount of annual dust increased by more than 50 percent. In others, the annual mean remained the same, but the amount of dust raised by weaker winds was higher. Learn more about this global climate dust simulation research at

* * *

Dust can cloud your vision or sting your face, depending on how it is tossed into the atmosphere. Scientists know that these tiny particles of dust and sea salt can enter the air naturally to affect the climate as well, but when and how varies based in part on local surface wind speeds. This small-scale wind speed variance is often a missing component of large-scale climate models.

Now, PNNL scientists have developed a new method to represent how local wind speeds can lift natural aerosol particles into the atmosphere. When researchers considered these effects in a global climate model, they found two surprising results. In some cases, the amount of dust produced a year (yearly mean emission) increased by more than 50 percent. In others, the mean remained the same, but the amount of dust raised by weaker winds was higher.

"We knew that small-scale wind variances might greatly affect natural aerosol emissions in models," said PNNL atmospheric scientist Dr. Kai Zhang, who led the study. "With the new method we can quantitatively count that small-scale effect in global climate models, with a very small increase in the computational cost."

Scientists analyzed results of local surface wind speeds from finely detailed global and regional models and determined how much variability was currently included. They then developed a way to predict the small-scale wind speed variability and tested their approach in the widely accepted global climate model, Community Atmosphere Model version 5. They used a range of data to represent wind speed variations, taking into account the impact from turbulence (irregular air movement), convection (how heat is transferred in the atmosphere), and topography (the surface features of land, e.g. mountains). Finally, they calculated how emissions might change in each model grid box using different wind speeds.

Their results indicated that, while the changes in small-scale wind speeds have a rather minor impact on sea salt emission calculations, such changes strongly affect the modeling of when and how much dust is released from the ground.

Why is this important? In global climate models, typically simulated at scales of 100 kilometers (62 miles) per grid box, many of the small-scale details are averaged out. For instance, important features like clouds are considered small-scale, and so are wind events for the most part. Yet, scientists have known for decades that things happening at scales smaller than the grid boxes of global and regional climate models can have a large effect on a model's results. That's one of the reasons researchers continue to evaluate which small-scale processes are important to consider.

Surface wind speed, for example, affects how sea salt and dust are sent into the atmosphere, both in computational simulations and in the real world. Most global climate models factor in only a single speed for wind, despite the fact that wind speed can vary considerably across a typical model grid box 100 to 200 kilometers across. Unfortunately, including small-scale calculations such as those for local wind speed in global climate models often comes with a large cost in computer time. The PNNL method, however, proved to be computationally efficient, making the approach more feasible to adopt.
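
A toy calculation shows why sub-grid wind variability matters so much for emissions. Dust flux is a thresholded, strongly nonlinear function of wind speed (a cubic law above a threshold is a common schematic form in emission schemes; the threshold and spread below are purely illustrative, not the study's values):

```python
# Why averaging out sub-grid winds changes dust emissions: with a
# thresholded, nonlinear flux law, a grid-mean wind below the threshold
# emits nothing, while realistic gusts around that same mean emit plenty.
# Numbers are illustrative only.
import random

random.seed(1)

U_THRESHOLD = 7.0  # m/s, hypothetical threshold for dust lifting

def dust_flux(u: float) -> float:
    return max(0.0, u - U_THRESHOLD) ** 3

grid_mean_wind = 6.0  # m/s: below threshold, so the mean alone lifts no dust
flux_from_mean = dust_flux(grid_mean_wind)

# Sample sub-grid winds around the same mean (Gaussian spread of 2 m/s).
winds = [random.gauss(grid_mean_wind, 2.0) for _ in range(100000)]
flux_with_variability = sum(dust_flux(u) for u in winds) / len(winds)

print(f"flux using grid-mean wind only : {flux_from_mean:.3f}")
print(f"flux averaging sub-grid winds  : {flux_with_variability:.3f}")
```

The grid-mean wind alone produces zero emission, while the same mean with realistic variability produces a substantial flux - the kind of difference the PNNL parameterization is designed to capture cheaply.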

What's next? The new method of representing wind speed variability provides the basis for future work to make simulated, wind-driven aerosol emissions more realistic. PNNL scientists are currently studying additional aerosol-related processes that are affected by differences in local meteorological conditions to improve state-of-the-art climate models.

Acknowledgments: This research was supported by the U.S. Department of Energy's Office of Science, Office of Biological and Environmental Research as part of the Scientific Discovery Through Advanced Computing project in the Earth Systems Modeling Program. 

Rhys Taylor

• Astronomy/Astrophysics ☄  - 
Attack of the Flying Snakes: Dark galaxies or tidal debris?

Whenever I publish a paper I try and do an outreach-friendly version. You can read the full paper here: The very short version, with links to more detailed public-friendly blog posts, is below.

There are some hydrogen clouds in the Virgo cluster without any stars. The nearest galaxies look undisturbed and show no signs of any extended hydrogen streams, and they're pretty far away from the clouds. Yet the most popular explanation is that the clouds are some form of "tidal debris", meaning that they were ripped out of galaxies as they passed close to each other. Generally speaking this is quite a sensible explanation: after all, the gas has got to come from somewhere.

But some of these clouds have caused a lot of controversy over the years. The gif below shows one such cloud, VIRGOHI21, which about ten years ago was making a lot of headlines as a so-called "dark galaxy". The idea was that it's not tidal debris at all, but a rotating disc of gas embedded in a dark matter halo (the main reason being that it has a very high velocity width, which makes it look as if it's rotating). If so, it would be massive enough to explain the gas stream linking it to a nearby spiral, which also appears to have been disturbed by a massive object. I describe the observational evidence for similar dark galaxy candidates here:

The problem is that a couple of simulations had shown that the appearance of rotation could just be an illusion, and that VIRGOHI21 in particular could just be tidal debris after all. But, despite the limitations of these simulations, they have been repeatedly invoked to explain any and all such clouds, even though they require some rather unlikely conditions. It's true that the simulations did show that the tidal debris explanation was possible, but they made no comment on how probable it was.

While it's extremely difficult to decide if any individual cloud is really rotating, now that we know of more such clouds we need to try and decide how likely it is that they could just be tidal debris - how likely is it that these features could be produced by gravitational interactions and mimic rotation? So we ran a new batch of simulations, dropping gas streams into a realistic model of a galaxy cluster to see how frequently the interactions produce fake dark galaxies.
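
The force of a frequency estimate like this is easy to sketch: if interactions produce a fake "dark galaxy" only rarely, the chance that several observed clouds are all debris collapses very quickly. The 0.2% rate comes from the results described below; the cloud count and the independence assumption here are mine, for illustration only.

```python
# Schematic version of the frequency argument. Assumes (my simplification)
# that each observed cloud is an independent trial with probability p_fake
# of being a debris-produced, high-velocity-width feature.

p_fake = 0.002          # fraction of simulated debris mimicking rotation
n_observed_clouds = 5   # hypothetical number of observed candidate clouds

p_all_debris = p_fake ** n_observed_clouds
print(f"P(all {n_observed_clouds} clouds are debris) = {p_all_debris:.1e}")
```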

It seems that people have over-interpreted the previous simulations. They show features which are, at a casual glance, quite similar to the observed clouds, but the apparently minor differences turn out to be a huge problem. Specifically, our simulations are consistent with the previous ones, but show that clouds with high velocity widths arise only about 0.2% of the time - far too rarely for tidal debris to be a sensible explanation for the real clouds.
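For a sense of how damning that number is, here's a quick back-of-the-envelope sketch (assuming each cloud is independent; the sample sizes are made up for illustration, not taken from the paper):

```python
# Back-of-the-envelope: if a gravitational interaction produces a
# high-velocity-width cloud only ~0.2% of the time, how plausible is
# it that an observed sample of such clouds is *all* tidal debris?
# The sample sizes n below are illustrative, not the paper's numbers.

p_debris = 0.002  # per-cloud probability suggested by the simulations
p_all = {n: p_debris ** n for n in (1, 3, 5, 8)}

for n, p in p_all.items():
    print(f"probability that all {n} clouds are debris: {p:.1e}")
```

Even for a handful of clouds, the odds of the tidal debris explanation working for every one of them become astronomically small.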

We also tested the alternative hypothesis that the clouds could themselves be genuine dark galaxies - rotating hydrogen discs embedded in dark matter halos. That scenario turns out to do a far, far better job of explaining the observations, and seems to tie in quite nicely with the newly-discovered "ultra diffuse galaxies" (which are very faint galaxies discovered in the Virgo cluster which do at least have some stars, just not very many).

You can read a more detailed description of the simulations, with movies and lolcats, here :

Why do these stupid poxy gaseous anomalies matter ? Because dark galaxies were proposed to explain the missing satellite problem, the observation that there are far fewer small galaxies than predicted by simulations. This has been a major thorn in the side of cosmological models for the last 20 years or so.

Not that we should get carried away. We've shown that tidal debris definitely doesn't work, and dark galaxies do work. But a model which works is not the same as a model which is correct. Other explanations are possible and our simulations (like the previous ones) are missing a lot of important physics. The take-home message of the paper is that if you find a mysterious hydrogen cloud, hand-waving explanations about "tidal debris" are just not good enough.
In the quest to synthesize a useful double perovskite material not found in nature, a PNNL team developed a multidimensional analysis approach that resulted in the first direct atomic-scale measurement of ordering in the material. The researchers showed that combining multiscale synthesis, characterization and modeling techniques can lead to a better understanding of complex materials systems. These results will help scientists precisely engineer next-generation materials for data storage and solar cell applications. Read more about this work at

* * *

The results are published in Chemistry of Materials. Lead author Dr. Steven Spurgeon, a postdoctoral associate at PNNL; his mentor, materials expert Dr. Scott Chambers; and colleagues shed light on a fundamental challenge in materials science: many proposed materials have never been synthesized because existing characterization and modeling approaches fail to capture the inherent complexity of solid state systems.

Spurgeon, Chambers, and colleagues at PNNL, North Carolina State University (NCSU), +Sandia National Labs, and instrument manufacturer +Gatan, Inc., put forth an approach to connect atomic-scale structure to macroscale magnetic properties.

Spurgeon's background is in scanning transmission electron microscopy (STEM), one of the instruments of choice for materials scientists. The underlying technique, transmission electron microscopy (TEM), has been around since the 1930s, and it can provide amazing images of even single atoms. But TEM images do not always represent a material's overall structure.

 The PNNL team developed a new approach to characterize material locally using STEM, along with a three-dimensional (3D) analysis technique called atom probe tomography (APT). "This combination gave us beautiful insight into how the material looks across a range of length scales," said Spurgeon. "By using STEM at the atomic scale, APT at the midrange (mesoscale), and magnetic characterization at the macroscale, we could understand how this hierarchy of structure gives rise to properties. That's huge."

This approach requires advanced instrumentation unavailable at most labs, so DOE's Environmental Molecular Sciences Laboratory (EMSL) located at PNNL is key. "The suite of tools at EMSL allows us to do unique materials science. Only a handful of labs worldwide have these kinds of resources in house," said Spurgeon. The study also took advantage of the Advanced Photon Source at Argonne National Laboratory, another DOE national user facility.

PNNL scientists have also used molecular beam epitaxy (MBE) to synthesize materials a single layer of atoms at a time. With MBE, researchers can use a monitor to watch crystal layers grow, and even count single layers of material as they form. "The ability to synthesize materials this way takes years to learn, and it's as much art as science," said Spurgeon.

Once the PNNL team successfully synthesized LMNO (La2MnNiO6, the double perovskite targeted in this study), measurements confirmed the stability of its structure. But its magnetic properties were not as good as previous theory calculations suggested they should be. "There should have been a much larger magnetic moment than what we observed," said Spurgeon. "So we threw every technique we could at it, including using the X-ray beam line at the Advanced Photon Source, to look at its overall magnetic properties and structure. Nothing made sense."

He wondered if something was going on locally that the X-ray analysis was missing. "Our STEM analysis showed that even though we got the crystal structure we expected, the manganese (Mn) and nickel (Ni) atoms were swapping places. When Mn and Ni atoms alternate in chains through the crystal, they give rise to good magnetic properties, but if these atoms are swapped or out of place, poor magnetic properties result."

Using STEM, the PNNL and Sandia scientists found that though the LMNO crystal structure was correct, Mn and Ni atoms were randomly distributed throughout it (see left image at top), which is difficult to see using X-ray measurements because of the similarity of the two atoms.

Heating the material caused the atoms to reorganize into a much more ordered structure (right image at top). "The arrangement of Mn and Ni went from a random state to a checkerboard," said Spurgeon. At the same time, they saw a huge increase in magnetic properties. By analyzing the material before and after heat treatment, they confirmed the ordering was greatly enhancing the material's magnetic properties.
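As a cartoon of that random-to-checkerboard transition, here is a minimal lattice-annealing sketch: an interaction that prefers unlike neighbors, plus thermally activated atom swaps, is enough to drive ordering. The model and every parameter in it are invented for illustration and are not the physics used in the study.

```python
# Toy 2D lattice model of the Mn/Ni ordering trend: an interaction that
# favors unlike nearest neighbors, plus thermally activated atom swaps,
# drives a random arrangement toward a checkerboard. This is a cartoon
# of annealing-driven ordering, not the actual physics of LMNO.
import math
import random

random.seed(42)
N = 6          # N x N lattice, toy scale
J = 1.0        # energy penalty per like-neighbor bond
T = 0.3        # low "annealing" temperature (arbitrary units)
ATTEMPTS = 10000

# Equal numbers of Mn (+1) and Ni (-1), randomly placed.
sites = [1] * (N * N // 2) + [-1] * (N * N // 2)
random.shuffle(sites)
lat = [sites[i * N:(i + 1) * N] for i in range(N)]

def energy():
    """Total energy: +J for every like pair of nearest neighbors
    (periodic boundaries; each bond counted once)."""
    e = 0.0
    for i in range(N):
        for j in range(N):
            for di, dj in ((1, 0), (0, 1)):
                if lat[i][j] == lat[(i + di) % N][(j + dj) % N]:
                    e += J
    return e

def unlike_fraction():
    """Fraction of bonds joining unlike atoms (1.0 = checkerboard)."""
    return 1.0 - energy() / (2 * N * N * J)

before = unlike_fraction()
for _ in range(ATTEMPTS):
    # Kawasaki-style move: swap two random sites, conserving composition.
    i1, j1 = random.randrange(N), random.randrange(N)
    i2, j2 = random.randrange(N), random.randrange(N)
    if lat[i1][j1] == lat[i2][j2]:
        continue
    e_old = energy()
    lat[i1][j1], lat[i2][j2] = lat[i2][j2], lat[i1][j1]
    d_e = energy() - e_old
    if d_e > 0 and random.random() >= math.exp(-d_e / T):
        # Reject the move: undo the swap.
        lat[i1][j1], lat[i2][j2] = lat[i2][j2], lat[i1][j1]
after = unlike_fraction()
print(f"unlike-neighbor bond fraction: {before:.2f} -> {after:.2f}")
```

With these settings the unlike-neighbor fraction climbs from roughly one half (random) toward one (checkerboard), mirroring the random-to-ordered change seen after heat treatment.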

Enter RevSTEM. While the magnetic properties were now much improved, theory suggested that they could be even better. The scientists knew they wanted to directly image the material at larger length scales, moving from a handful of atoms to broader regions, so they used a new STEM technique called RevSTEM developed by collaborators at NCSU. Because of the way STEM images are typically acquired, they are very sensitive to small movements of the sample, making it hard to accurately measure bonds between atoms. The NCSU group developed a method that uses many fast images to correct for these distortions.

Spurgeon explains, "By correcting the distortions, we can obtain more precise measurements of crystal structure and then measure the angle between atoms and compare it to what we expect. Any deviation in these angles can affect magnetic properties. That's the tie back." Using RevSTEM, the team could map the LMNO crystal structure to look for any disorder over larger areas. They found broad regions where the LMNO was slightly structurally disordered.
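The core idea of correcting distortions with many fast frames can be caricatured in one dimension: estimate each frame's drift by cross-correlation against a reference, undo it, and average. This sketch is only the principle; the actual RevSTEM technique is considerably more sophisticated (it exploits image series taken with rotated scan directions), and all the numbers here are invented.

```python
# 1D caricature of drift correction by fast-frame averaging: each
# quickly acquired "frame" is a shifted, noisy copy of the same signal;
# cross-correlation recovers each frame's shift so the frames can be
# realigned and averaged. Illustration only -- not the RevSTEM algorithm.
import random

random.seed(0)
L = 48

def roll(s, k):
    """Circularly shift a list right by k."""
    return [s[(i - k) % L] for i in range(L)]

# Ground truth: two "atomic columns" as triangular peaks.
truth = [0.0] * L
for center in (12, 30):
    for d in range(-2, 3):
        truth[(center + d) % L] += 3 - abs(d)

# Simulated acquisition: each frame drifts further and gains noise.
drifts = [0, 2, 5, 7, 11]
frames = [[v + random.gauss(0, 0.1) for v in roll(truth, k)]
          for k in drifts]

def best_shift(frame, ref):
    """Integer shift maximizing circular cross-correlation with ref."""
    def corr(k):
        return sum(a * b for a, b in zip(roll(frame, -k), ref))
    return max(range(L), key=corr)

ref = frames[0]
shifts = [best_shift(f, ref) for f in frames]
aligned = [roll(f, -k) for f, k in zip(frames, shifts)]
avg = [sum(col) / len(frames) for col in zip(*aligned)]
peak = avg.index(max(avg))
print("recovered drifts:", shifts)
print("peak column found at index:", peak)
```

Once the drift is removed, the averaged frame has a higher signal-to-noise ratio than any single fast frame, which is what makes precise bond-angle measurement possible.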

This hint of structural disorder led the team to a shocking discovery: the supposedly pure material was actually laced with small, 2- to 5-nanometer regions of nickel oxide (NiO). 3D atom maps measured using APT showed these regions were distributed throughout the film and were so small that the X-ray analysis could easily overlook them. Because NiO is antiferromagnetic, it could greatly reduce the magnetic properties of the ferromagnetic LMNO. But why was NiO forming?

The team turned to PNNL scientist Dr. Peter Sushko, who specializes in density functional theory (DFT), a popular technique for calculating crystal properties. His models showed that for the synthesis conditions used in the study, the material is at a tipping point where LMNO and NiO can form together.

As Spurgeon explains, "That's the downside to MBE. We can't put much oxygen into the material. If you are using low oxygen pressure, there's a specific regime where you can get phase separation. We hadn't considered that before. But the upside is that within this growth regime, we could make a material where you might want phase separation, such as in a nanocomposite structure. To engineer these structures controllably would be of great value."

Why is this important? LMNO has been studied by scientists for the past decade because of its value in thermoelectric and spintronic applications (see sidebar). LMNO, a ferromagnetic semiconductor material, exemplifies the challenges associated with structure-property engineering of multicomponent material systems. Much debate has revolved around what chemical structure LMNO forms and how to engineer the structure to improve its magnetic properties.

Existing materials characterization techniques have limitations. For example, while it is straightforward to measure overall magnetic properties, figuring out how atomic structure gives rise to those properties is much harder. Traditional analysis relies heavily on X-ray scattering, which can give detailed but indirect insight into a material's structure. The challenge is to directly image the material at the atomic scale, connect those images to larger length scales, and, finally, determine how structure controls properties. This study has met that challenge, bringing scientists closer to their goal of controllably engineering such a material.  

What's Next? While LMNO has been studied for many years, scientists are just now beginning to understand the direct connection between structure and magnetic properties. More important, the use of advanced synthesis, characterization, and modeling approaches will allow them to gain deep insight into how materials form and what gives rise to their properties.

Acknowledgments: This work was supported by the +U.S. Department of Energy Office of Science, Office of Basic Energy Sciences Division of Materials Sciences and Engineering. Much of the work was performed in EMSL, a national science user facility sponsored by DOE's Office of Biological and Environmental Research and located at PNNL. Other work was performed at Sandia National Laboratories for DOE's National Nuclear Security Administration, at the Analytical Instrumentation Facility at North Carolina State University, and the Advanced Photon Source at Argonne National Laboratory.

Kam-Yung Soh

• Chemistry  - 
Research on using date seeds to remove environmental toxins. Paper at [ ]. "Over the past four years Dr Hanano, who works in the commission’s molecular-biology department, and his colleagues have developed a way to use the stones (or pits) of dates, a waste product of the fruit-packing industry, to clean up dioxins, a particularly nasty and persistent type of organic pollutant that can lead to reproductive and developmental problems, damage the immune system, and even cause cancer. Dioxins are produced mainly as a by-product of industrial processes.

Dr Hanano lit on date stones for this task for three reasons. One was that they are rich in oils of a sort that have an affinity for dioxins. The second was that, though they are not unique in this oil-richness, unlike other oil-rich seeds (olives, rape, sesame and so on) they have no commercial value. The third was that, despite lacking commercial value, they are abundant.
Remediating polluted land might also, the researchers hope, be on the cards, although they have yet to work out how to recover the droplets once the emulsion has been sprayed on the affected ground. If they can do so, however, the group are likely to have plenty of customers. Substances like 2,3,7,8-tetrachlorodibenzo-p-dioxin are so long-lived that even today the Vietnamese are still trying to clean up the mess Agent Orange created."
Syrian researchers use date stones to suck up toxic materials
You can grow date trees from them.
The buildup of protein deposits in cells is a hallmark of neurodegenerative diseases such as Alzheimer’s and Parkinson’s. “If the process by which the cell removes those proteins can be enhanced, then you might be able to prevent that disease progression,” says Carsten Sachse from EMBL. Before scientists can give the cell’s rubbish collectors a boost, they have to understand how the system works. In a paper published in EMBO Reports, Sachse and his lab drew on expertise from colleagues throughout EMBL to do just that.
How cells eliminate protein deposits that can lead to Alzheimer's, Parkinson's and other neurodegenerative disorders
Citing "If the process by which the cell removes those proteins can be enhanced, then you might be able to prevent that disease progression", may I propose downregulation of the Notch signaling pathway to reduce activation of atherosclerotic macrophages that might compromise the neuronal mitochondrial membrane with calcium (calcium-induced mitochondrial apoptosis?), changing the innate to an adaptive immune response, to gain phagocytosis of APP-CTF-rich exosomes.

Then enhanced by PPAR-gamma activation.

When nanorods were created in an experiment that didn't go as planned, researchers gave the microscopic spawns of science a closer examination. The nanorods had an unusual property – they spontaneously emitted water. With further development, research has shown that this serendipitous innovation could be used for water harvesting and purification, or even sweat-gathering fabric. Watch our video and learn more at

* * *

Chemist Satish Nune was inspecting the solid, carbon-rich nanorods with a vapor analysis instrument when he noticed the nanorods mysteriously lost weight as humidity increased. Thinking the instrument had malfunctioned, Nune and his colleagues moved on to another tool, a high-powered microscope.

They jumped as they saw an unknown fluid unexpectedly appear between bunches of the tiny sticks and ooze out. Video recorded under the microscope is shaky at the beginning, as they quickly moved the viewfinder to capture the surprising event again.

The team at the Department of Energy's Pacific Northwest National Laboratory would go on to view the same phenomenon more than a dozen times. Immediately after expelling the fluid, the nanorods' weight decreased by about half, causing the researchers to scratch their heads even harder.

A recent paper published in Nature Nanotechnology describes the physical processes behind this spectacle, which turned out to be the first experimental viewing of a phenomenon theorized some 20 years ago. The discovery could lead to a large range of real-world applications, including low-energy water harvesting and purification for the developing world, and fabric that automatically pulls sweat away from the body and releases it as a vapor.

"Our unusual material behaves a bit like a sponge; it wrings itself out halfway before it's fully saturated with water," explained PNNL post-doctoral research associate David Lao, who manufactured the material.

"Now that we've gotten over the initial shock of this unforeseen behavior, we're imagining the many ways it could be harnessed to improve the quality of our lives," said PNNL engineer David Heldebrant, one of the paper's two corresponding authors.

"But before we can put these nanorods to good use, we need to be able to control and perfect their size and shape," added Nune, the paper's other corresponding author.

Expectations v. reality: Ordinarily, materials take on more water as the humidity around them increases. But these carbon-rich nanorods — which the researchers mistakenly created while trying to fabricate magnetic nanowires — suddenly expelled a large amount of water as the relative humidity inside the specimen holder reached anywhere between 50 and 80 percent.

Water expulsion can clearly be seen in the microscope video. Water is visible as a gray, cloudy haze — and only emerges from where nanorods intersect. When the team went on to raise the humidity further, the nanorods' weight also increased, indicating they were taking on water again. The process was also reversible, with water being ejected and later absorbed as the humidity was gradually lowered back down.

The team was further intrigued. They couldn't think of any other material that takes on water at a low humidity and spontaneously releases it at a high humidity. So they dug through the canon of scientific literature to find an explanation.

Old theory, new evidence: They found a 2012 paper in the Journal of Physical Chemistry B that explained how, in certain situations where liquid is confined in a teeny-tiny space (roughly 1.5 nanometers wide), the liquid can spontaneously evaporate. And the authors of a 2013 paper in the Journal of Chemical Physics described how water can condense in the confines of closely spaced hydrophobic materials, which do not play well with water, and quickly turn into vapor due to attractive forces between the surfaces of the two materials facing each other. The 2013 paper gave this phenomenon a very long, technical name: "solvent cavitation under solvo-phobic confinement."

These papers also noted the process was theorized as early as the 1990s by scientists examining crystallized proteins. Back then, scientists noticed they only saw water vapor surrounding hydrophobic sections of protein, while liquid water would surround other areas. The researchers proposed that there was some sort of process that enabled the water caught between hydrophobic protein sections to suddenly vaporize.

Armed with this knowledge, the PNNL team hypothesized water was condensing and forming a bridge between the nanorods, through a process known as capillary condensation. Next, they believe water between rods forms a curved cavity whose surface tension pulls the adjacent rods closer together. When two intersecting nanorods reach about 1.5 nanometers apart, the team reasoned, the water caught between them could be forced to quickly evaporate.
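A rough way to check the plausibility of the capillary condensation picture is the classical Kelvin equation, which relates humidity to the gap width at which vapor condenses between two surfaces. The slit geometry and perfect-wetting assumption below are simplifications made for illustration; the numbers are standard room-temperature properties of water.

```python
# Rough Kelvin-equation estimate of the gap width below which water
# vapor condenses between two surfaces (slit geometry and a wetting
# surface assumed; real nanorod junctions are more complex). Standard
# water properties at room temperature; an order-of-magnitude check only.
import math

gamma = 0.072     # surface tension of water, N/m
V_m = 1.8e-5      # molar volume of liquid water, m^3/mol
R = 8.314         # gas constant, J/(mol K)
T = 298.0         # temperature, K

def critical_gap_nm(rh):
    """Slit width (nm) below which capillary condensation occurs at
    relative humidity rh (0 < rh < 1):
    d_c = 2 * gamma * V_m / (R * T * ln(1/rh))."""
    r_kelvin = gamma * V_m / (R * T * math.log(1.0 / rh))
    return 2.0 * r_kelvin * 1e9

for rh in (0.5, 0.6, 0.8):
    print(f"RH {rh:.0%}: condensation below ~{critical_gap_nm(rh):.1f} nm")
```

At 50% relative humidity this gives a critical gap of roughly 1.5 nanometers — intriguingly close to the separation at which the team reasoned the trapped water would evaporate, though agreement at that level is partly fortuitous.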

Putting it to good use

Though understanding the nanorods’ unexpected behavior is a triumph in itself, the PNNL team also foresees a future where this phenomenon could improve quality of life. They see their discovery as a potential humanitarian lifesaver, describing it as “a paradigm shift in water purification and separation” in their paper.

Theoretically, large quantities of the water-spitting nanomaterial could repeatedly take on and then eject collected water when a certain humidity level is reached. Such a system could be used in remote deserts, where it would collect water from the air and harvest it for human consumption.

Another vision is to create a membrane that takes on and later expels water as humidity changes. The membrane could be used in jacket fabrics and enable more comfortable outdoor adventures by removing sweat from inside a jacket and emitting it outside as a vapor.

To make these applications possible, the team is exploring ways to make more of its nanorods spray water. The team estimates only around 10 to 20 percent of the material spits water right now. The plan is to scale up production of the current material, creating more than a few grams of the material at a time. They will do further analysis to ensure the phenomenon persists at these larger quantities. They are also conducting a more detailed examination of the material’s physical and chemical properties and determining whether other materials have similar properties. And the team is intrigued by the idea that different nanomaterials could potentially be developed to collect other liquids, such as methanol.

This research is funded by PNNL's internal Materials Synthesis and Simulation Across Scales Initiative. The project also used the expertise of staff and several advanced instruments — environmental transmission and scanning electron microscopes, an X-ray photoelectron spectrometer and a Mössbauer spectrometer — at EMSL, the Environmental Molecular Sciences Laboratory, a DOE Office of Science User Facility at PNNL.

Kam-Yung Soh

• Biology  - 
Ants are known to interact with butterfly larvae. Here is the first description of an interaction between ants and adult butterflies. Link includes a video. Link to paper (PDF) at [ ]. Article by Aaron Pomerantz. "Phil [Torres] and I both have backgrounds in entomology, and yet we had never seen anything like this before. I mean sure, we knew that some butterfly larvae have symbiotic relationships with ants, known as myrmecophily. This is well documented, as many of the caterpillars that associate with ants have special organs that secrete sugars and amino acids. The ants get a sugary nutritious meal from the caterpillars and, in return, the fragile caterpillars get personal ant bodyguards which defend against predators and parasites. But this is not the case for the adult butterflies, which usually have to evade ants, lest they become their next meal.
The butterfly appears to be a known species, Adelotypa annulifera, but these pictures could be revealing an undocumented observation for this butterfly interacting with ants and a potentially new wing-mimicry pattern. Super cool, I thought, but there was just one problem: we know little about this butterfly beyond some dead pinned specimens. What is its life cycle? Where do the larvae develop? What do the larvae even look like? In other words, next to nothing was known about the life history of this butterfly. So to solve this mystery, Phil and I decided to collaborate. I was making a return trip to this exact field site in the coming months, so I set out to uncover the missing pieces of this puzzle."
It was late 2014 when Phil Torres first showed me the photos from his recent trip to the Peruvian Amazon. Among them were amazing images of the tropical wildlife, from brilliant macaws to elusive pumas. But there were a few critters in that album that stood out to us in particular.
#HPC - In the first study of its kind, scientists developed a novel model seeking a more intricate look at what happens during the final stages of nuclear fission. Using the model, they determined that fission fragments remain connected far longer than expected before the daughter nuclei split apart. This work provides a long-awaited description of real-time fission dynamics within a microscopic framework, opening a pathway to a theoretical method with abundant predictive power. Read more at

* * *

For nearly 80 years, nuclear fission has awaited a description within a microscopic framework. In the first study of its kind, scientists collaborating from the University of Washington, Warsaw University of Technology (Poland), Pacific Northwest National Laboratory, and Los Alamos National Laboratory developed a novel model to take a more intricate look at what happens during the last stages of the fission process. Using the model, they determined that fission fragments remain connected far longer than expected before the daughter nuclei split apart.

Moreover, they noted the predicted kinetic energy agreed with results from experimental observations. This discovery indicates that complex calculations of real-time fission dynamics without physical restrictions are feasible and opens a pathway to a theoretical microscopic framework with abundant predictive power.

In addition to its publication, “Induced Fission of 240Pu Within a Real-time Microscopic Framework” was highlighted as an Editors’ Suggestion by Physical Review Letters—ranked first among physics and mathematics journals by the Google Scholar five-year h-index. Only about one letter in six is highlighted based on its particular importance, innovation, and broad appeal.

Methods: The researchers extended the density functional theory (DFT) modeling method designed for electronic structure systems to strongly interacting many-fermion systems and real-time dynamics, creating a time-dependent superfluid local density approximation (TDSLDA). For the study reported, evaluating the theory amounted to solving ≈56,000 complex coupled nonlinear, time-dependent, three-dimensional partial differential equations for a 240Pu nucleus using a highly efficient parallelized graphic processing unit (GPU) code. The calculations required ≈1760 GPUs and 550 minutes total wall time on Titan, a Cray XK7 supercomputer located at the Oak Ridge Leadership Computing Facility (OLCF).
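The production code is obviously far beyond a snippet, but the basic notion of evolving a quantum system forward in real time can be illustrated with a single 1D time-dependent Schrödinger equation, integrated here with the norm-preserving Crank-Nicolson scheme. This toy shares only the "real-time evolution" idea with the TDSLDA — no superfluidity, no nuclear functional, no GPUs — and every parameter is arbitrary.

```python
# Minimal flavor of real-time quantum evolution: a 1D time-dependent
# Schrodinger equation integrated with the Crank-Nicolson scheme,
# which preserves the wavefunction norm exactly (up to round-off).
# A toy in every respect: the actual TDSLDA evolves ~56,000 coupled
# nonlinear 3D equations on GPUs.
import cmath

N = 200          # grid points
dx = 0.1
dt = 0.005
steps = 200

# Harmonic potential and an off-center Gaussian wave packet.
x = [(i - N / 2) * dx for i in range(N)]
V = [0.5 * xi * xi for xi in x]
psi = [cmath.exp(-((xi - 1.0) ** 2)) for xi in x]
norm0 = sum(abs(p) ** 2 for p in psi) * dx
psi = [p / norm0 ** 0.5 for p in psi]

def thomas(a, b, c, d):
    """Solve a complex tridiagonal system (sub, diag, super, rhs)."""
    n = len(d)
    cp, dp = [0] * n, [0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    out = [0] * n
    out[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        out[i] = dp[i] - cp[i] * out[i + 1]
    return out

# H psi = -psi''/2 + V psi (finite differences); Crank-Nicolson step:
# (1 + i dt H / 2) psi_new = (1 - i dt H / 2) psi_old
r = 1j * dt / (4 * dx * dx)
for _ in range(steps):
    a = [-r] * N
    b = [1 + 2 * r + 1j * dt * V[i] / 2 for i in range(N)]
    c = [-r] * N
    # Right-hand side with hard-wall boundaries (psi = 0 outside grid).
    d = []
    for i in range(N):
        left = psi[i - 1] if i > 0 else 0
        right = psi[i + 1] if i < N - 1 else 0
        d.append(r * left + (1 - 2 * r - 1j * dt * V[i] / 2) * psi[i]
                 + r * right)
    psi = thomas(a, b, c, d)

norm = sum(abs(p) ** 2 for p in psi) * dx
print(f"norm after {steps} steps: {norm:.6f}")
```

The wave packet sloshes back and forth in the trap while its norm stays pinned at one; scaling this idea up to tens of thousands of coupled nonlinear equations is what demands a machine like Titan.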

Unlike other models that incorporate a nuclear energy density functional, the TDSLDA is the only theoretical framework that allows the nucleus to evolve non-adiabatically while including all known collective degrees of freedom. The potential and kinetic energies remain a continuous function of the nuclear shape with no restrictions.

“One notable discovery in using the framework was the time it takes a nucleus to descend to the scission configuration—timescales an order of magnitude greater than those predicted in existing literature,” explained co-author Kenneth Roche, a scientist in PNNL’s High Performance Computing Group and an Affiliate Associate Professor with the University of Washington, Department of Physics. “One might expect this slow evolution was due to viscosity, but the simulations suggest that many shape and pairing modes are excited, causing energy exchanges in the collective degrees of freedom.”

Why is this important? Apart from its fundamental significance in theoretical physics, providing a usable capability that can accurately model fission dynamics will impact research areas such as future reactor fuel compositions, nuclear forensics, and studies of nuclear reactions. Excitation energies of fission fragments are not directly accessible by experiments but are crucial inputs to key activities at National Nuclear Security Administration laboratories. The capability developed by this research stands to improve activities that depend upon empirical data and evaluation models by aligning these with predictive theory.

What’s Next? In their article, the authors note that their method could be extended to two-body observables within the fission process, including mass and charge, and eventually, they may be able to introduce additional random aspects that will result in more detailed information.

Acknowledgments: This work was supported in part by grants and an Early Career Award from the U.S. Department of Energy (DOE) and contracts from the Polish National Science Center. Calculations were performed using Titan, a high-performance system housed within OLCF, a DOE Office of Science (SC) User Facility, and the DOE-SC National Energy Research Scientific Computing Center’s (NERSC) Edison system.