Asteroid barrage, ancient marine life boom not linked

An asteroid bombardment that some say triggered an explosion of marine animal diversity around 471 million years ago actually had nothing to do with it.

Precisely dating meteorites from the salvo, researchers found that the space rock barrage began at least 2 million years after the start of the Great Ordovician Biodiversification Event. So the two phenomena are unrelated, the researchers conclude January 24 in Nature Communications.

Some scientists had previously proposed a causal link between the two events: Raining debris from an asteroid breakup (SN: 7/23/16, p. 4) drove evolution by upsetting ecosystems and opening new ecological niches. The relative timing of the impacts and biodiversification was uncertain, though.
Geologist Anders Lindskog of Lund University in Sweden and colleagues examined 17 crystals buried alongside meteorite fragments. Gradual radioactive decay of uranium atoms inside the crystals allowed the researchers to accurately date the sediment layer to around 467.5 million years ago. Based in part on this age, the researchers estimate that the asteroid breakup took place around 468 million years ago. That’s well after fossil evidence suggests that the diversification event kicked off.
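The uranium clock behind a date like this can be sketched in a few lines. This is a minimal illustration of the principle, not the team's actual method: the decay constant of uranium-238 is a standard value, but the helper function and the lead/uranium ratio below are hypothetical numbers chosen only to reproduce a ~467.5-million-year age.

```python
import math

# Decay constant of uranium-238 (per year); half-life is about 4.468 billion years.
LAMBDA_U238 = math.log(2) / 4.468e9

def u_pb_age(pb206_u238_ratio):
    """Age in years from a measured lead-206/uranium-238 ratio.

    Radiogenic daughter builds up as D/P = e^(lambda*t) - 1,
    so t = ln(1 + D/P) / lambda.
    """
    return math.log(1 + pb206_u238_ratio) / LAMBDA_U238

# Hypothetical ratio picked to illustrate the ~467.5-million-year sediment date.
ratio = math.exp(LAMBDA_U238 * 467.5e6) - 1
print(f"age ≈ {u_pb_age(ratio) / 1e6:.1f} million years")
```

The precision of this clock is what lets a 2-million-year offset between the bombardment and the biodiversification event stand out.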

Other forces such as climate change and shifting continents instead promoted biodiversity, the researchers propose.

LSD’s grip on brain protein could explain drug’s long-lasting effects

Locked inside a human brain protein, the hallucinogenic drug LSD takes an extra-long trip.

New X-ray crystallography images reveal how an LSD molecule gets trapped within a protein that senses serotonin, a key chemical messenger in the brain. The protein, called a serotonin receptor, belongs to a family of proteins involved in everything from perception to mood.

The work is the first to decipher the structure of such a receptor bound to LSD, which gets snared in the protein for hours. That could explain why “acid trips” last so long, study coauthor Bryan Roth and colleagues report January 26 in Cell. It’s “the first snapshot of LSD in action,” he says. “Until now, we had no idea how it worked at the molecular level.”
But the results might not be that relevant to people, warns Cornell University biophysicist Harel Weinstein.

Roth’s group didn’t capture the main target of LSD, a serotonin receptor called 5-HT2A, instead imaging the related receptor 5-HT2B. That receptor is “important in rodents, but not that important in humans,” Weinstein says.

Roth’s team has devoted decades to working on 5-HT2A, but the receptor has “thus far been impossible to crystallize,” he says. Predictions of 5-HT2A’s structure, though, are very similar to that of 5-HT2B, he says.

LSD, or lysergic acid diethylamide, was first cooked up in a chemist’s lab in 1938. It was popular (and legal) for recreational use in the early 1960s, but the United States later banned the drug (also known as blotter, boomer, Purple Haze and electric Kool-Aid).

It’s known for altering perception and mood — and for its unusually long-lasting effects. An acid trip can run some 15 hours, and at high doses, effects can linger for days. “It’s an extraordinarily potent drug,” says Roth, a psychiatrist and pharmacologist at the University of North Carolina School of Medicine in Chapel Hill.
Scientists have known for decades that LSD targets serotonin receptors in the brain. These proteins, which are also found in the intestine and elsewhere in the body, lodge within the outer membranes of nerve cells and relay chemical signals to the cells’ interiors. But no one knew exactly how LSD fit into the receptor, or why the drug was so powerful.

Roth and colleagues’ work shows the drug hunkered deep inside a pocket of the receptor, grabbing onto an amino acid that acts like a handle to pull down a lid. It’s like a person holding the door of a storm cellar closed during a tornado, Roth says.

When the team did additional molecular experiments, tweaking the lid’s handle so that LSD could no longer hang on, the drug slipped out of the pocket faster than when the handle was intact. That was true whether the team used receptor 5-HT2B or 5-HT2A, Roth says. (Though the researchers couldn’t crystallize 5-HT2A, they were able to grow the protein inside cells in the lab for use in their other experiments.) The results suggest that LSD’s grip on the receptor is what keeps it trapped inside. “That explains to a great extent why LSD is so potent and why it’s so long-lasting,” Roth says.

David Nutt, a neuropsychopharmacologist at Imperial College London, agrees. He calls the work an “elegant use of molecular science.”

Weinstein remains skeptical. The 5-HT2A receptor is the interesting one, he maintains. A structure of that protein “has been needed for a very long time.” That’s what would really help explain the hallucinogenic effects of LSD, he says.

Mysteries of time still stump scientists

The topic of time is both excruciatingly complicated and slippery. The combination makes it easy to get bogged down. But instead of an exhaustive review, journalist Alan Burdick lets curiosity be his guide in Why Time Flies, an approach that leads to a light yet supremely satisfying story about time as it runs through — and is perceived by — the human body.

Burdick doesn’t restrict himself to any one aspect of his question. He spends time excavating what he calls the “existential caverns,” where philosophical questions, such as the shifting concept of now, dwell. He describes the circadian clocks that keep bodies running efficiently, making sure our bodies are primed to digest food at mealtimes, for instance. He even covers the intriguing and slightly insane self-experimentation by the French scientist Michel Siffre, who crawled into caves in 1962 and 1972 to see how his body responded in places without any time cues.
In the service of his exploration, Burdick lived in constant daylight in the Alaskan Arctic for two summery weeks, visited the master timekeepers at the International Bureau of Weights and Measures in Paris to see how they precisely mete out the seconds and plunged off a giant platform to see if time felt slower during moments of stress. The book deals not only with fascinating temporal science but also with how time is largely a social construct. “Time is what everybody agrees the time is,” one researcher told Burdick.
That subjective truth also applies to the brain. Time, in a sense, is created by the mind. “Our experience of time is not a cave shadow to some true and absolute thing; time is our perception,” Burdick writes. That subjective experience becomes obvious when Burdick recounts how easily our brains’ clocks can be swayed. Emotions, attention (SN: 12/10/16, p. 10) and even fever can distort our time perception, scientists have found.

Burdick delves deep into several neuroscientific theories of how time runs through the brain (SN: 7/25/15, p. 20). Here, the story narrows somewhat in an effort to thoroughly explain a few key ideas. But even amid these details, Burdick doesn’t lose the overarching truth — that for the most part, scientists simply don’t know the answers. That may be because there is no one answer; instead, the brain may create time by stitching together a multitude of neural clocks.
After reading Why Time Flies, readers will be convinced that no matter how much time passes, the mystery of time will endure.

Germanium computer chips gain ground on silicon — again

First germanium integrated circuits

Integrated circuits made of germanium instead of silicon have been reported … by researchers at International Business Machines Corp. Even though the experimental devices are about three times as large as the smallest silicon circuits, they reportedly offer faster overall switching speed. Germanium … has inherently greater mobility than silicon, which means that electrons move through it faster when a current is applied. — Science News, February 25, 1967

UPDATE:
Silicon circuits still dominate computing. But demand for smaller, high-speed electronics is pushing silicon to its physical limits, sending engineers back for a fresh look at germanium. Researchers built the first compact, high-performance germanium circuit in 2014, and scientists continue to fiddle with its physical properties to make smaller, faster circuits. Although not yet widely used, germanium circuits and those made from other materials, such as carbon nanotubes, could help engineers make more energy-efficient electronics.
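The mobility advantage the 1967 excerpt describes can be put in rough numbers. The values below are approximate textbook electron mobilities at room temperature, used here only to illustrate the ratio; the field strength is an arbitrary choice and cancels out of the comparison.

```python
# Electron drift velocity is v = mu * E, so at the same applied field,
# germanium's higher mobility means carriers move proportionally faster.
MU_SI = 1400   # approximate electron mobility of silicon, cm^2/(V*s)
MU_GE = 3900   # approximate electron mobility of germanium, cm^2/(V*s)

E = 100        # illustrative electric field, V/cm (cancels in the ratio)
v_si = MU_SI * E
v_ge = MU_GE * E
print(f"Ge/Si electron speed ratio ≈ {v_ge / v_si:.1f}")
```

That factor of roughly three in carrier speed is the "inherently greater mobility" that made germanium's slower-to-shrink circuits competitive on switching speed.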

Helium’s inertness defied by high-pressure compound

Helium — the recluse of the periodic table — is reluctant to react with other elements. But squeeze the element hard enough, and it will form a chemical compound with sodium, scientists report.

Helium, a noble gas, is one of the periodic table’s least reactive elements. Originally, the noble gases were believed incapable of forming any chemical compounds at all. But after scientists created xenon compounds in the early 1960s, a slew of other noble gas compounds followed. Helium, however, has largely been a holdout.
Although helium was known to hook up with certain elements, the bonds in those compounds were weak, or the compounds were short-lived or electrically charged. But the new compound, called sodium helide or Na2He, is stable at high pressure, and its bonds are strong, an international team of scientists reports February 6 in Nature Chemistry.

As a robust helium compound, “this is really the first that people ever observed,” says chemist Maosheng Miao of California State University, Northridge, who was not involved with the research.

The material’s properties are still poorly understood, but it is unlikely to have immediate practical applications — scientists can create it only in tiny amounts at very high pressures, says study coauthor Alexander Goncharov, a physicist at the Carnegie Institution for Science in Washington, D.C. Instead, the oddball compound serves as inspiration for scientists who hope to produce weird new materials at lower pressures. “I would say that it’s not totally impossible,” says Goncharov. Scientists may be able to tweak the compound, for example, by adding or switching out elements, to decrease the pressure needed.

To coerce helium to link up with another element, the scientists, led by Artem Oganov of Stony Brook University in New York, first performed computer calculations to see which compounds might be possible. Sodium, calculations predicted, would form a compound with helium if crushed under enormously high pressure. Under such conditions, the typical rules of chemistry change — elements that refuse to react at atmospheric pressure can sometimes become bosom buddies when given a squeeze.

So Goncharov and colleagues pinched small amounts of helium and sodium between a pair of diamonds, reaching pressures more than a million times that of Earth’s atmosphere, and heated the material with lasers to temperatures above 1,500 kelvins (about 1200° Celsius). By scattering X-rays off the compound, the scientists could deduce its structure, which matched the one predicted by calculations.
“I think this is really the triumph of computation,” says Miao. In the search for new compounds, computers now allow scientists to skip expensive trial-and-error experiments and zero in on the best candidates to create in a laboratory.

Na2He is an unusual type of compound known as an electride, in which pairs of electrons are cloistered off, away from any atoms. But despite the compound’s bizarre nature, it behaves somewhat like a commonplace compound such as table salt, in which negatively charged chloride ions alternate with positively charged sodium. In Na2He, the isolated electron pairs act like negative ions in such a compound, and the eight sodium atoms surrounding each helium atom are the positive ions.

“The idea that you can make compounds with things like helium which don’t react at all, I think it’s pretty interesting,” says physicist Eugene Gregoryanz of the University of Edinburgh. But, he adds, “I would like to see more experiments” to confirm the result.

The scientists’ calculations also predicted that a compound of helium, sodium and oxygen, called Na2HeO, should form at even lower pressures, though that one has yet to be created in the lab. So the oddball new helium compound may soon have a confirmed cousin.

New, greener catalysts are built for speed

Platinum, one of the rarest and most expensive metals on Earth, may soon find itself out of a job. Known for its allure in engagement rings, platinum is also treasured for its ability to jump-start chemical reactions. It’s an excellent catalyst, able to turn standoffish molecules into fast friends. But Earth’s supply of the metal is limited, so scientists are trying to coax materials that aren’t platinum — aren’t even metals — into acting like they are.

For years, platinum has been offering behind-the-scenes hustle in catalytic converters, which remove harmful pollutants from auto exhaust. It’s also one of a handful of rare metals that move along chemical reactions in many well-established industries. And now, clean energy technology opens a new and growing market for the metal. Energy-converting devices like fuel cells being developed to power some types of electric vehicles rely on platinum’s catalytic properties to transform hydrogen into electricity. Even generating the hydrogen fuel itself depends on platinum.

Without a cheaper substitute for platinum, these clean energy technologies won’t be able to compete against fossil fuels, says Liming Dai, a materials scientist at Case Western Reserve University in Cleveland.

To reduce the pressure on platinum, Dai and others are engineering new materials that have the same catalytic powers as platinum and other metals — without the high price tag. Some researchers are replacing expensive metals with cheaper, more abundant building blocks, like carbon. Others are turning to biology, using catalysts perfected by years of evolution as inspiration. And when platinum really is best for a job, researchers are retooling how it is used to get more bang for the buck.

Moving right along
Catalysts are the unsung heroes of the chemical reactions that make human society tick. These molecular matchmakers are used in manufacturing plastics and pharmaceuticals, petroleum and coal processing and now clean energy technology. Catalysts are even inside our bodies, in the form of enzymes that break food into nutrients and help cells make energy.
During any chemical reaction, molecules break chemical bonds between their atomic building blocks and then make new bonds with different atoms — like swapping partners at a square dance. Sometimes, those partnerships are easy to break: A molecule has certain properties that let it lure away atoms from another molecule. But in stable partnerships, the molecules are content as they are. Left together for a very long period of time, a few might eventually switch partners. But there’s no mass frenzy of bond breaking and rebuilding.

Catalysts make this breaking and rebuilding happen more efficiently by lowering the activation energy — the threshold amount of energy needed to make a chemical reaction go. Starting and ending products stay the same; the catalyst just changes the path, building a paved highway to bypass a bumpy dirt road. With an easier route, molecules that might take years to react can do so in seconds instead. A catalyst doesn’t get used up in the reaction, though. Like a wingman, it incentivizes other molecules to react, and then it bows out.

A hydrogen fuel cell, for example, works by reacting hydrogen gas (H2) with oxygen gas (O2) to make water (H2O) and electricity. The fuel cell needs to break apart the atoms of the hydrogen and oxygen molecules and reshuffle them into new molecules. Without some assistance, the reshuffling happens very slowly. Platinum propels those reactions along.
Platinum works well in fuel cell reactions because it interacts just the right amount with both hydrogen and oxygen. That is, the platinum surface attracts the gas molecules, pulling them close together to speed along the reaction. But then it lets its handiwork float free. Chemists call that “turnover” — how efficiently a catalyst can draw in molecules, help them react, then send them back out into the world.

Platinum isn’t the only superstar catalyst. Other metals with similar chemical properties also get the job done — palladium, ruthenium and iridium, for example. But those elements are also expensive and hard to get. They are so good at what they do that it’s hard to find a substitute. But promising new options are in the works.

Carbon is key
Carbon is a particularly attractive alternative to precious metals like platinum because it’s cheap, abundant and can be assembled into many different structures.

Carbon atoms can arrange themselves into flat sheets of orderly hexagonal rings, like chicken wire. Rolling these chicken wire sheets — known as graphene — into hollow tubes makes carbon nanotubes, which are stronger than steel for their weight. But carbon-only structures don’t make great catalysts.

“Really pure graphene isn’t catalytically active,” says Huixin He, a chemist at Rutgers University in Newark, N.J. But replacing some of the carbon atoms in the framework with nitrogen, phosphorus or other atoms changes the way electric charge is distributed throughout the material. And that can make carbon behave more like a metal. For example, nitrogen atoms sprinkled like chocolate chips into the carbon structure draw negatively charged electrons away from the carbon atoms. The carbon atoms are left with a more positive charge, making them more attractive to the reaction that needs a nudge.

That movement of electrical charge is a prerequisite for a material to act as a catalyst, says Dai, who has pioneered the development of carbon-based, metal-free catalysts. His lab group demonstrated in 2009 in Science that clumps of nitrogen-containing carbon nanotubes aligned vertically — like a fistful of uncooked spaghetti — could stand in for platinum to help break apart oxygen inside fuel cells.
To perfect the technology, which he has patented, Dai has been swapping in different atoms in different combinations and experimenting with various carbon structures. Should the catalyst be a flat sheet of graphene or a forest of rolled up nanotubes, or some hybrid of both? Should it contain just nitrogen and carbon, or a smorgasbord of other elements, too? The answer depends on the specific application.

In 2015 in Science Advances, Dai demonstrated that nitrogen-studded nanotubes worked in acid-containing fuel cells, one of the most promising designs for electric vehicles.

Other researchers are playing their own riffs on the carbon concept. Producing graphene’s orderly structure requires just the right temperature and specific reaction conditions. Amorphous carbon materials — in which the atoms are randomly clumped together — can be easier to make, Rutgers’ He says.

In one experiment, He’s team started with liquid phytic acid, a substance made of carbon, oxygen and phosphorus. Microwaving the liquid for less than a minute transformed it into a sooty black powder that she describes as a sticky sort of sand.

“Phytic acid strongly absorbs microwave energy and changes it to heat so fast,” she says. The heat rearranges the atoms into a jumbled carbon structure studded with phosphorus atoms. Like the nitrogen atoms in Dai’s nanotubes, the phosphorus atoms changed the movement of electric charge through the material and made it catalytically active, He and colleagues reported last year in ACS Nano.

The sooty phytic acid–based catalyst could help move along a different form of clean energy: It sped up a reaction that turns a big, hard-to-use molecule found in cellulose — a tough, woody component of plants — into something that can react with other molecules. That product could then be used to make fuel or other chemicals. He is still tweaking the catalyst to make it work better.

He’s catalyst particles get mixed into the chemical reaction (and later need to be strained out). These more jumbled carbon structures with nitrogen or phosphorus sprinkled in can work in fuel cells, too — and, she says, they’re easier to make than graphene.

Enzyme-inspired energy
Rather than design new materials from the bottom up, some scientists are repurposing catalysts already used in nature: enzymes. Inside living things, enzymes are involved in everything from copying genetic material to breaking down food and nutrients.

Enzymes have a few advantages as catalysts, says M.G. Finn, a chemist at Georgia Tech. They tend to be very specific for a particular reaction, so they won’t waste much energy propelling undesired side reactions. And because they can evolve, enzymes can be tailored to meet different needs.

On their own, enzymes can be too fragile to use in industrial manufacturing, says Trevor Douglas, a chemist at Indiana University in Bloomington. For a solution, his team looked to viruses, which already package enzymes and other proteins inside protective cases.

“We can use these compartments to stabilize the enzymes, to protect them from things that might chew them up in the environment,” Douglas says. The researchers are engineering bacteria to churn out virus-inspired capsules that can be used as catalysts in a variety of applications.
His team mostly uses enzymes called hydrogenases, but other enzymes can work, too. The researchers put the genetic instructions for making the enzymes and for building a protective coating into Escherichia coli bacteria. The bacteria go into production mode, pumping out particles with the hydrogenase enzymes protected inside, Douglas and colleagues reported last year in Nature Chemistry. The protective coating keeps chunky enzymes contained, but lets the molecules they assist get in and out.

“What we’ve done is co-opt the biological processes,” Douglas says. “All we have to do is grow the bacteria and turn on these genes.” Bacteria, he points out, tend to grow quite easily. It’s a sustainable system, and one that’s easily tailored to different reactions by swapping out one enzyme for another.

The enzyme-containing particles can speed along generation of the hydrogen fuel, he has found. But there are still technical challenges: These catalysts last only a couple of days, and figuring out how to replace them inside a consumer device is hard.

Other scientists are using existing enzymes as templates for catalysts of their own design. The same family of hydrogenase enzymes that Douglas is packaging into capsules can be a launching point for lab-built catalysts that are even more efficient than their natural counterparts.

One of these hydrogenases has an iron core plus an amine — a nitrogen-containing string of atoms — hanging off. Just as the nitrogen worked into Dai’s carbon nanotubes affected the way electrons were distributed throughout the material, the amine changes the way the rest of the molecule acts as a catalyst.

Morris Bullock, a researcher at Pacific Northwest National Laboratory in Richland, Wash., is trying to figure out exactly how that interaction plays out. He and colleagues are building catalysts with cheap and abundant metals like iron and nickel at their core, paired with different types of amines. By systematically varying the metal core and the structure and position of the amine, they’re testing which combinations work best.

These amine-containing catalysts aren’t ready for prime time yet — Bullock’s team is focused on understanding how the catalysts work rather than on perfecting them for industry. But the findings provide a springboard for other scientists to push these catalysts toward commercialization.

Sticking with the metals
These new types of catalysts are promising — many of them can speed up reactions almost as well as a traditional platinum catalyst. But even researchers working on platinum alternatives agree that making sustainable and low-cost catalysts isn’t always as simple as removing the expensive and rare metals.

“The calculation of sustainability is not completely straightforward,” Finn says. Though he works with enzymes in his lab, he says, “a platinum-based catalyst that lasts for years is probably going to be more sustainable than an enzyme that degrades.” It might end up being cheaper in the long run, too. That’s why researchers working on these alternative catalysts are pushing to make their products more stable and longer-lasting.
“If you think about a catalyst, it’s really the atoms on the surface that participate in the reaction. Those in the bulk may just provide mechanical support or are just wasted,” says Younan Xia, a chemist at Georgia Tech. Xia is working on minimizing that waste.

One promising approach is to shape platinum into what Xia dubs “nanocages” — instead of a solid cube of metal, just the edges remain, like a frame.

Metals’ staying power is also why many scientists haven’t given up on them. “I don’t think you can say, ‘Let’s do without metals,’ ” says James Clark, a chemist at the University of York in England. “Certain metals have a certain functionality that’s going to be very hard to replace.” But, he adds, there are ways to use metals more efficiently, such as using nanoparticle-sized pieces that have a higher surface area than a flat sheet, or strategically combining small amounts of a rare metal with cheaper, more abundant nickel or iron. Changing the structure of the material on a nanoscale level also can make a difference.

In one experiment, Xia started with cubes of a different rare metal, palladium. He coated the palladium cubes with a thin layer of platinum just a few atoms thick — a pretty straightforward process. Then, a chemical etched away the palladium inside, leaving a hollow platinum skeleton. Because the palladium is removed from the final product, it can be used again and again. And the nanocage structure leaves less unused metal buried inside than a large flat sheet or a solid cube, Xia reported in 2015 in Science.

Since then, Xia’s team has been developing more complex shapes for the nanocages. An icosahedron, a ball with 20 triangular faces, worked especially well. The slight disorder to the structure — the atoms don’t crystallize quite perfectly — helped make it four times as active as a commercial platinum catalyst. He has made similar cages out of other rare metals like rhodium that could work as catalysts for other reactions.

It’ll take more work before any of these new catalysts fully dethrone platinum and other precious metals. But once they do, that’ll leave more precious metals to use in places where they can truly shine.

Earth’s mantle may be hotter than thought

Temperatures across Earth’s mantle are about 60 degrees Celsius higher than previously thought, a new experiment suggests. Such toasty temperatures would make the mantle runnier than earlier research suggested, a development that could help explain the details of how tectonic plates glide on top of the mantle, geophysicists report in the March 3 Science.

“Scientists have been arguing over the mantle temperature for decades,” says study coauthor Emily Sarafian, a geophysicist at the Woods Hole Oceanographic Institution in Massachusetts and at MIT. “Scientists will argue over 10 degree changes, so changing it by 60 degrees is quite a large jump.”
The mostly solid mantle sits between Earth’s crust and core and makes up around 84 percent of Earth’s volume. Heat from the mantle fuels volcanic eruptions and drives plate tectonics, but taking the mantle’s temperature is trickier than dropping a thermometer down a hole.

Scientists know from the paths of earthquake waves and from measures of how electrical charge moves through Earth that a boundary in the mantle exists a few dozen kilometers below Earth’s surface. Above that boundary, mantle rock can begin melting on its way up to the surface. By mimicking the extreme conditions in the deep Earth — squeezing and heating bits of mantle that erupt from undersea volcanoes or similar rocks synthesized in the lab — scientists can also determine the melting temperature of mantle rock. Using these two facts, scientists have estimated that temperatures at the boundary depth below Earth’s oceans are around 1314° to 1464° Celsius when adjusted to surface pressure.

But the presence of water in the collected mantle bits, primarily peridotite rock, which makes up much of the upper mantle, has caused problems for researchers’ calculations. Water can drastically lower the melting point of peridotite, but researchers can’t prevent the water content from changing over time. In previous experiments, scientists tried to completely dry peridotite samples and then manually correct for measured mantle water levels in their calculations. The scientists, however, couldn’t tell for sure if the samples were water-free.

The measurement difficulties stem from the fact that peridotite is a mix of the minerals olivine and pyroxene, and the mineral grains are too small to experiment with individually. Sarafian and colleagues overcame this challenge by inserting spheres of pure olivine large enough to study into synthetic peridotite samples. These spheres exchanged water with the surrounding peridotite until they had the same dampness, and so could be used for water content measurements.

Using this technique, the researchers found that the “dry” peridotite used in previous experiments wasn’t dry at all. In fact, the water content was spot on for the actual wetness of the mantle. “By assuming the samples are dry, then correcting for mantle water content, you’re actually overcorrecting,” Sarafian says.
The new experiment suggests that, if adjusted to surface pressure, the mantle under the eastern Pacific Ocean where two tectonic plates diverge, for example, would be around 1410° Celsius, up from 1350°. A hotter mantle is less viscous and more malleable, Sarafian says. Scientists have long been puzzled about some of the specifics of plate tectonics, such as to what extent the mantle resists the movement of the overlying plate. That resistance depends in part on the mix of rock, temperature and how melted the rock is at the boundary between the two layers (SN: 3/7/15, p. 6). This new knowledge could give researchers more accurate information on those details.

The revised temperature is only for the melting boundary in the mantle, so “it’s not the full story,” notes Caltech geologist Paul Asimow, who wrote a perspective on the research in the same issue of Science. He agrees that the team’s work provides a higher and more accurate estimate of that adjusted temperature, but he doesn’t think the researchers should assume temperatures elsewhere in the mantle would be boosted by a similar amount. “I’m not so sure about that,” he says. “We need further testing of mantle temperatures.”

Ancient dental plaque tells tales of Neandertal diet and disease

Dental plaque preserved in fossilized teeth confirms that Neandertals were flexible eaters and may have self-medicated with an ancient equivalent of aspirin.

DNA recovered from calcified plaque on teeth from four Neandertal individuals suggests that those from the grasslands around Belgium’s Spy cave ate woolly rhinoceros and wild sheep, while their counterparts from the forested El Sidrón cave in Spain consumed a menu of moss, mushrooms and pine nuts.

The evidence bolsters an argument that Neandertals’ diets spanned the spectrum of carnivory and herbivory based on the resources available to them, Laura Weyrich, a microbiologist at the University of Adelaide in Australia, and her colleagues report March 8 in Nature.

The best-preserved Neandertal remains were from a young male from El Sidrón whose teeth showed signs of an abscess. DNA from a diarrhea-inducing stomach bug and several gum disease pathogens turned up in his plaque. Genetic material from poplar trees, which contain the pain-killing aspirin ingredient salicylic acid, and a plant mold that makes the antibiotic penicillin hint that he may have used natural medication to ease his ailments.

The researchers were even able to extract an almost-complete genetic blueprint, or genome, for one ancient microbe, Methanobrevibacter oralis. At roughly 48,000 years old, it’s the oldest microbial genome yet sequenced, the researchers report.

Extreme gas loss dried out Mars, MAVEN data suggest

The Martian atmosphere definitely had more gas in the past.

Data from NASA’s MAVEN spacecraft indicate that the Red Planet has lost most of the gas that ever existed in its atmosphere. The results, published in the March 31 Science, are the first to quantify how much gas has been lost over time and offer clues to how Mars went from a warm, wet place to a cold, dry one.

Mars is constantly bombarded by charged particles streaming from the sun. Without a protective magnetic field to deflect this solar wind, the planet loses about 100 grams of its now thin atmosphere every second (SN: 12/12/15, p. 31). To determine how much atmosphere has been lost over the planet’s lifetime, MAVEN principal investigator Bruce Jakosky of the University of Colorado Boulder and colleagues measured and compared the abundances of two isotopes of argon at different altitudes in the Martian atmosphere. Because the lighter isotope escapes to space more readily than the heavier one, the shift in their ratio records how much gas has been stripped away. Using those measurements and an assumption about the isotopes’ abundances in the planet’s early atmosphere, the team estimates that about two-thirds of all of Mars’ argon gas has been ejected into space. Extrapolating from the argon data, the researchers also determined that the majority of carbon dioxide that the Martian atmosphere ever had was likewise kicked into space by the solar wind.
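The isotope bookkeeping described above can be sketched as a simple Rayleigh-distillation calculation. The ratios and escape efficiency below are illustrative placeholders chosen for the sketch, not the MAVEN team’s actual inputs:

```python
# Toy Rayleigh-distillation estimate of atmospheric loss from an isotope ratio.
# All numbers here are illustrative placeholders, NOT the mission's real values.

def fraction_lost(r_now, r_initial, alpha):
    """Fraction of the light isotope lost to space.

    r_now, r_initial: heavy/light isotope ratios today and originally.
    alpha: relative escape efficiency of the heavy isotope (0 < alpha < 1;
    the lighter isotope escapes more readily, so the ratio climbs over time).
    Rayleigh distillation gives r_now / r_initial = f ** (alpha - 1),
    where f is the fraction of the light isotope still remaining.
    """
    f_remaining = (r_now / r_initial) ** (1.0 / (alpha - 1.0))
    return 1.0 - f_remaining

# Placeholder present-day and initial 38Ar/36Ar ratios, placeholder efficiency:
lost = fraction_lost(r_now=0.24, r_initial=0.188, alpha=0.80)
print(f"~{lost:.0%} of the light argon isotope lost to space")
```

With these made-up inputs the sketch lands in the same ballpark as the reported result, roughly two-thirds of the argon lost; the real analysis folds in altitude-dependent measurements and a modeled escape process.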

A thicker atmosphere filled with carbon dioxide and other greenhouse gases could have insulated early Mars and kept it warm enough for liquid water and possibly life. Losing an extreme amount of gas, as the results suggest, may explain how the planet morphed from lush and wet to barren and icy, the researchers write.

Cells’ stunning complexity on display in a new online portal

Computers don’t have eyes, but they could revolutionize the way scientists visualize cells.

Researchers at the Allen Institute for Cell Science in Seattle have devised 3-D representations of cells, compiled by computers learning where thousands of real cells tuck their component parts.

Most drawings of cells in textbooks come from human interpretations gleaned by looking at just a few dead cells at a time. The new Allen Cell Explorer, which premiered online April 5, presents 3-D images of genetically identical stem cells grown in lab dishes (composite, above), revealing a huge variety of structural differences.

Each cell comes from a skin cell that was reprogrammed into a stem cell. Important proteins were tagged with fluorescent molecules so researchers could keep tabs on the cell membrane, DNA-containing nucleus, energy-generating mitochondria, microtubules and other cell parts. Using the 3-D images, computer programs learned where the cellular parts are in relation to each other. From those rules, the programs can generate predictive transparent models of a cell’s structure (below). The new views, which can capture cells at different time points, may offer clues into their inner workings.

The project’s tools are available for other researchers to use on various types of cells. Insights gained from the explorations might lead to a better understanding of human development, cancer, health and diseases.
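The learn-then-generate idea behind those predictive models can be illustrated with a toy sketch: estimate where an organelle typically sits relative to the cell from several tagged cells, then use that learned rule to place it in a model cell. The coordinates below are invented for illustration, not Allen Institute data:

```python
# Toy sketch of the Allen Cell Explorer's learn-then-generate approach:
# learn average organelle positions across cells, then build a "typical" cell.
# The coordinates are made-up illustrative values, not real imaging data.
import statistics

# (cell_bottom_z, nucleus_z, mitochondria_z) for a few imaginary cells, in microns
cells = [
    (0.0, 4.1, 7.8),
    (0.0, 3.9, 8.2),
    (0.0, 4.3, 7.5),
]

# "Learn" the rule: average height of each organelle above the cell bottom
nucleus_height = statistics.mean(c[1] - c[0] for c in cells)
mito_height = statistics.mean(c[2] - c[0] for c in cells)

# "Generate" a model cell by placing organelles at their learned heights
print(f"model cell: nucleus ~{nucleus_height:.1f} um up, "
      f"mitochondria ~{mito_height:.1f} um up (near the top)")
```

The real system learns far richer spatial relationships from thousands of 3-D images, but the principle is the same: statistical rules extracted from many cells drive a generated model of one.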

Researchers have already learned from the project that stem cells aren’t the shapeless blobs they might appear to be, says Susanne Rafelski, a quantitative cell biologist at the Allen Institute. Instead, the stem cells have a definite bottom and top, a proposed structure that’s now confirmed by the combined cell data, Rafelski says. A solid foundation of skeleton proteins forms at the bottom. The nucleus is usually found in the cell’s center. Microtubules bundle together into large fibers that tend to radiate from the top of the cell toward the bottom. During cell division, microtubules form structures called bipolar spindles that are necessary to divvy up DNA.

One surprise was that the membrane surrounding the nucleus gets ruffled, but never completely disappears, during cell division. Near the top of the cell, above the nucleus, stem cells store tubelike mitochondria much the way plumbing and electrical wires are tucked into ceilings. The tubular mitochondria were notable because some researchers thought that since stem cells don’t require much energy, the organelles might separate into small, individual units.

Old ways of observing cells were like trying to get to know a city by looking at a map, Rafelski says. The cell explorer is more like a documentary of the lives of the citizens.