Archive for the ‘Uncategorized’ Category

Computer science not widely taught in Washington area schools

Thursday, April 24th, 2014

Nationally, AP computer science gives a particularly vivid glimpse of the subject: 29,555 students took the AP computer science exam in 2013 — far fewer than the hundreds of thousands who took calculus, for example. In Maryland, 1,629 students took the computer science AP exam, and 1,655 took it in Virginia. In D.C.’s public and private schools, there were 96 exam-takers.

Here is a snapshot of the Washington region, by the numbers. School districts appear in order of the size of their high school student population, from largest to smallest:

Fairfax County (Va.)

Total high school students: 51,928

Students in computer science courses: 2,343 (4.5 percent)

Students who took the AP computer science exam in 2013: 740

Total number of high schools: 27

High schools with computer science courses: 17

High schools with AP computer science: 17

Full-time and part-time computer science teachers: 24

Montgomery County (Md.)

Total high school students: 45,132

Students in computer science courses: 2,746 (6.1 percent)

Students who took the AP computer science exam in 2013: 521

Total number of high schools: 25

High schools with computer science courses: 19

High schools with AP computer science: 19

Full-time and part-time computer science teachers: At least 50

Prince George’s County (Md.)

Total high school students: 35,174

Students in computer science courses: 209 (0.6 percent)

Students who took the AP computer science exam in 2013: 105

Total number of high schools: 23

High schools with computer science courses: 11

High schools with AP computer science: 5

Full-time and part-time computer science teachers: 11

Prince William County (Va.)

Total high school students: 24,665

Students in computer science courses: 705 (2.9 percent)

Students who took the AP computer science exam in 2013: 124

Total number of high schools: 11

High schools with computer science courses: 9

High schools with AP computer science: 5

Full-time computer science teachers: 10

Loudoun County (Va.)

Total high school students: 19,813

Students in computer science courses: 845 (4.3 percent)

Students who took the AP computer science exam in 2013: 341

Total number of high schools: 13

High schools with computer science courses: 13

High schools with AP computer science: 13

Full-time and part-time computer science teachers: 13

D.C. Public Schools

Total high school students: 10,200

Students in computer science courses: Over 400 (3.9 percent)

Students who took the AP exam in computer science in 2013: 7

Total number of high schools: 16

High schools with computer science courses: 6, growing to 10 next year

High schools with AP computer science: 1

Full-time and part-time computer science teachers: 8

Arlington County (Va.)

Total high school students: 5,847

Students in computer science courses: 370 (6.3 percent)

Students who took the AP computer science exam in 2013: 15

Total number of high schools: 4

High schools with computer science courses: 4

High schools with AP computer science: 4

Full-time and part-time computer science teachers: 4

Alexandria (Va.)

Total high school students: 3,324

Students in computer science courses: 7 (0.2 percent)

Students who took the AP computer science exam in 2013: 14

Total number of high schools: 1

High schools with computer science courses: 1

High schools with AP computer science: 1

Full-time and part-time computer science teachers: None; the teacher resigned

Note: Figures in most cases do not include courses given under Career and Technical Education, which may include some computer science.
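The participation rates quoted in parentheses above are simple ratios of course enrollment to total high school enrollment. As a sketch, the figures transcribed from the list can be recomputed in a few lines (the D.C. entry uses 400, the floor of the reported "over 400"):

```python
# District data transcribed from the list above:
# (students in computer science courses, total high school students)
districts = {
    "Fairfax County": (2343, 51928),
    "Montgomery County": (2746, 45132),
    "Prince George's County": (209, 35174),
    "Prince William County": (705, 24665),
    "Loudoun County": (845, 19813),
    "D.C. Public Schools": (400, 10200),   # "over 400" reported
    "Arlington County": (370, 5847),
    "Alexandria": (7, 3324),
}

# Print each district's participation rate to one decimal place.
for name, (cs_students, total) in districts.items():
    print(f"{name}: {cs_students / total:.1%}")
```

Running this reproduces the published percentages (4.5%, 6.1%, 0.6%, 2.9%, 4.3%, 3.9%, 6.3%, 0.2%), confirming the article's arithmetic.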

Science Tools Anyone Can Afford

Tuesday, April 22nd, 2014

Depp Impact: The Science Behind ‘Transcendence’

Saturday, April 19th, 2014

In the new science-fiction thriller “Transcendence,” Johnny Depp uploads his mind to a powerful computer, melding his consciousness with artificial intelligence in a scenario many refer to as “the singularity.”

Far-fetched? Yes, but so is the idea of a brooding Depp as an awkward neuroscientist. That doesn’t mean people are not trying to make it happen in real life, including Russian billionaire Dmitry Itskov, who makes “immortality” an explicit goal of his 2045 Initiative.

Two researchers who consulted on “Transcendence,” both professors of electrical engineering and neuroscience at the University of California, Berkeley, are not sure that is an attainable goal, but that does not mean it’s not worth pursuing.

“Will what we see in the movie be happening in 30 years? I would have to say no, because we don’t even understand what consciousness is,” Jose Carmena told NBC News.

“To upload the mind, you would have to build technology that would let you interface with the brain,” added his colleague, Michel Maharbiz. “The race to try and do that could motivate a lot of technology along the way, and that could potentially help a lot of people.”

In other words, before we get a virtual Johnny Depp, we are going to need to really understand how the brain works — a goal that a lot of people have put a lot of money behind.

In April 2013, President Barack Obama announced the BRAIN Initiative, a $232 million collaboration between the government and private companies to map the human brain.

People with disabilities could benefit the most from this kind of research. Zac Vawter, a Seattle-area man who lost his lower right leg in a motorcycle accident, made headlines when he was outfitted with a prosthetic leg that he could control with signals from his brain.

Other scientists are looking into whether people like Stephen Hawking could communicate with the outside world without moving a muscle — something made easier with technology that can measure brain activity without the need to connect electrodes to someone’s scalp. That same type of technology has also been used for the less noble cause of wriggling robotic cat ears.

Carmena and Maharbiz spent 10 hours in Los Angeles, followed by two more visits, going over the science in the script with “Transcendence” director Wally Pfister. This being Hollywood, plenty of the scenes include a bit of creative license, including one (spoiler alert!) involving a popular science-fiction trope called “grey goo,” a mass of self-replicating nanobots that can heal people and create matter out of nothing in a matter of seconds.

While that might be impossible, nanotechnology in general, which extends to the fields of chemistry, biology, physics, materials science and engineering, is thriving in labs across the country.

“It’s progressing fairly rapidly,” Maharbiz said. “The ability to engineer incredibly small machines down at the nanoscale is being pursued very aggressively by lots of people.”

The hope is that eventually nanobots could be made to attack cancer and other diseases. It’s an ambitious goal, much like mapping the billions of neurons in the human brain. But it could result in much greater things than a fun night out at the movies.

“In 50 years, the landscape will be very different, and you will see very advanced ways of connecting to the brain,” said Maharbiz. “That is the primary benefit from people who have science-fiction goals in mind.”

Americans wary of futuristic science, tech

Thursday, April 17th, 2014

Americans are wary of many sci-fi developments in science and technology, according to a new Pew survey. Read on to see U.S. adults’ opinions about drones, robot nurses, lab-engineered meat and other futuristic advancements.

(CNN) — Americans are generally excited about the new technology they expect to see in their lifetimes. But when confronted with some advances that already appear possible — from skies filled with drones to meat made in a lab — they get nervous.

Those are the findings in a report released Thursday by the Pew Research Center, which sought to gauge public opinion about our rapidly changing world of science and tech.

“The American public anticipates that the coming half-century will be a period of profound scientific change, as inventions that were once confined to the realm of science fiction come into common usage,” reads the report.

Overall, respondents to Pew’s survey were upbeat about how technology will shape the near future. In the report, 59% of Americans think tech developments will make life in the next half-century better, while only 30% said they will make life worse.

More than eight out of 10 respondents (81%) said they think that in the next 50 years, people who need transplants will be able to get them with organs grown in labs. And more than half (51%) think computers will be able to create art as skillfully as humans do.

They’re a little less optimistic about some science-fiction staples, though. Only 39% think it’s likely scientists will have figured out how to teleport things (or, presumably, people), 33% say we’ll have long-term space colonies by 2064 and a mere 19% expect humans will be able to control the weather.

Interestingly, some of the advances that may be closest to becoming reality are the ones survey respondents were most worried about.

Nearly two out of three Americans think it would make things worse if U.S. airspace is opened up to personal drones. A similar number dislike the idea of robots being used to care for the sick and elderly, and of parents being able to alter the DNA of their unborn children.

Meanwhile, only 37% of respondents think it will be good if wearable devices or implants allow us to be digitally connected all the time.

With the advent of Google Glass and other wearable technology, that may not be such a distant dream. And already, researchers are developing robots to provide elder care, 3D printers are replicating parts of the human body and government regulators are considering allowing nonmilitary drones to legally operate in U.S. airspace.

“In the long run, Americans are optimistic about the impact that scientific developments will have on their lives and the lives of their children — but they definitely expect to encounter some bumps along the way,” said Aaron Smith, a senior researcher at Pew and the author of the report. “They are especially concerned about developments that have the potential to upend long-standing social norms around things like personal privacy, surveillance, and the nature of social relationships.”

Other responses shined a light on what may be our cautious human nature. While generally excited about future tech, many survey participants weren’t so keen on testing out those advances themselves.

People were split almost evenly (48%-50%) on whether they would ride in a driverless car. But only 26% said they’d get a brain implant to improve their memory or intelligence, and a mere 20% said they’d try eating meat made in a lab.

While they were imagining the future, Pew gave respondents the chance, in their own words, to share the one piece of futuristic technology they’d most like to own.

Some 9% said they’d like to be able to time travel. A similar number said they’d like something that would keep them healthy or extend their lives, 6% said they wanted a flying car (or bike), 3% said they’d take a teleportation device and a mere 1% said they want their own jetpack.

The report was based on telephone interviews conducted February 13-18 with 1,001 adults from all 50 U.S. states and the District of Columbia.

GOP pushes funding cuts for social science work

Tuesday, April 15th, 2014

WASHINGTON — If Republicans have their way, a Harvard University anthropologist would not be using tax dollars to study the impact of China’s one-child policy. A Massachusetts Institute of Technology political scientist would not have the money to research how Medicare changes might shape seniors’ political attitudes. And a Brown University archeologist would not be spending hundreds of thousands of dollars examining textiles from the Viking Age.

This is the latest front in a GOP-led war against the federal funding of social science and other research, including the study of climate change, in an age of fiscal austerity. House Republicans are questioning millions of dollars in National Science Foundation grants awarded to researchers across the country by singling out dozens of projects for extra scrutiny.

They have recently proposed a bill — titled the “Frontiers in Innovation, Research, Science, and Technology (First) Act of 2014” — that would cut foundation spending for research in the social, behavioral, and economic sciences by more than 40 percent. The bill would shift some $160 million that the federal government has allocated for the social sciences and geosciences toward Republican priorities in the physical and biological sciences, as well as engineering.

While on the surface it might seem like the usual partisan bickering, the proposal raises deeper questions about what role the government should play in funding science where the payoff is more difficult to discern than, for example, the quest for the genetic codes that could unlock the mysteries of cancer.

It also pits the interests of politicians, who control the scientific purse strings, against the judgments of the scientific community, which selects projects to fund through peer evaluations.

“We have to question spending nearly $700,000 of taxpayer dollars to fund a climate change musical or over $220,000 to study animal photos in National Geographic,” said US Representative Lamar Smith of Texas, chairman of the House Science, Space and Technology Committee, in a statement to the Globe. “It’s the role of Congress to make sure we’re using limited federal funds for the highest priority research.”

The actions have prompted Democrats and many in the scientific community to accuse Republicans of meddling in science — a dangerous precedent, critics say, because it would allow politicians to decide what areas to prioritize based on their own partisan ideologies, over the expertise of scientists.

“For a committee that is supposed to be advancing science, we seem to be doing an awfully good job of advancing selective science,” said Representative Joseph P. Kennedy III, a Massachusetts Democrat on the same committee. “It’s been frustrating, particularly of late.”

Kennedy called the Republican bill an “opportunistic approach to defunding or attacking certain areas of science that you either don’t agree with or that you don’t want to see what the results might actually be.”

Smith’s bill, up for a committee vote soon, would require the National Science Foundation to publicly justify why each grant it awards meets the national interest.

The bill would give Congress authority to decide how to divide the pool of grant money the foundation awards, instead of allowing the agency to determine how much to allot toward each field of science.

This House effort follows a budget amendment by Senator Tom Coburn of Oklahoma last year to temporarily restrict the funding of political science research to projects that promote US national security or economic development. The amendment was adopted, leading the foundation to cancel a round of political science grant competitions.

In recent weeks, Democrats were able to amend the new House bill and boost social science funding by $50 million to a total of $200 million – but that figure is still 26 percent less than the $272 million for the social sciences that President Obama and the NSF have requested for 2015.

Obama’s proposed budget for social science research accounts for far less than 1 percent of the agency’s $5.8 billion total research funding, a “very, very modest proportion,” said John Holdren, director of the White House Office of Science and Technology Policy, at a recent House Science Committee hearing on science agency budgets.

“Some of the funny-sounding titles, when you look into them, do make a lot of sense,” said Holdren, an environmental policy and earth and planetary science professor at Harvard. “I just don’t feel that most people in this room are well qualified to second-guess NSF’s superb peer review committees.”

Social science research helps the nation understand poverty, control the spread of infectious diseases, reduce human trafficking, understand the conduct of other nations and the effectiveness of sanctions, and optimize disaster response, among other benefits, Holdren said.

“This is an attempt to politicize the grant-making process instead of leaving it up to the experts,” said Wendy Naus, executive director of the Consortium of Social Science Associations in Washington. “Even for the scientific disciplines that would actually see an increase at the expense of social sciences, it’s a slippery slope. It may work out for them this year, but in future years, who’s to say?”

Already Republicans have raised concerns about the merits of some biological research, including a $385,000 study by Yale University ornithologists on the sexual behavior of ducks, which landed on a House science committee spreadsheet of nearly 100 questionable projects.

Patricia Brennan, an evolutionary biologist at University of Massachusetts Amherst who had been awarded the grant to study sexual conflict in ducks when she was a postdoctoral fellow at Yale, said basic research like hers is easy to poke fun at because it does not have clear real-world applicability.

“A lot of the research we fund has to do with topics that may not be immediately apparent why it’s important to human beings,” but it can prove to be, said Brennan.

Brennan and other scientists from around the country descended upon Washington last week to talk with lawmakers about the importance of NSF funding for basic research.

Michelle McKinley, a law professor at the University of Oregon, said she almost fell off her chair when she discovered in March that House Republicans had been citing her $50,000 grant to study lawsuits in Peru between 1600 and 1700 as an example of waste. McKinley said her research was funded only after surviving two rounds of rigorous reviews by other social scientists. She uses historical records to study the institution of slavery to answer questions about social justice and inequality — work she defends as valuable “even if it’s not going to give us clean energy.”

“It doesn’t usually have great results for the production of knowledge when Congress starts questioning why certain things are receiving funding,” McKinley said.

Coburn, an obstetrician who is retiring from Congress in December, has for years tried to eliminate funding for political science research, and he targeted social science in his annual “Wastebook.” Coburn has gone a step further than House Republicans, proposing the elimination of funding altogether for social, behavioral, and economic research within the National Science Foundation to save the nation about $255 million a year.

“Rather than ramping up the amount spent on political science and other social and behavioral research, NSF’s mission should be redirected toward truly transformative sciences with practical uses,” Coburn wrote in a scathing 2011 review of the National Science Foundation.

One of Coburn’s targets has been research on how members of Congress engage their constituents, conducted by David Lazer, a professor of political science and computer and information sciences at Northeastern University. In 2009, Coburn accused the NSF of wasting tax dollars on helping “members of Congress improve their dismal approval ratings” instead of using the money for cancer and other disease research.

“Coburn was the opening wedge of the attack on the more general social sciences, science that may be producing conclusions that are disliked by particular political players today,” Lazer said.

“The hope is that academia can help speak truth to power, but that doesn’t work when the powerful turn the lights out on those academics.”

Book review: ‘Why Science Does Not Disprove God’ by Amir D. Aczel

Sunday, April 13th, 2014

Richard Dawkins; Lawrence Krauss, with his bestseller “A Universe From Nothing”; and Sam Harris, with his bestseller “The End of Faith,” are prominent New Atheists who use modern science to argue that God is not only unnecessary but unlikely to exist at all, even behind the curtains. There’s a certain religious fervor in all these books. Atheists, unite.

Aczel, trained as a mathematician, currently a research fellow in the history of science at Boston University and the author of “Fermat’s Last Theorem,” takes aim at the New Atheists in his intelligent and stimulating book “Why Science Does Not Disprove God.” He attempts to show that the New Atheists’ analyses fall far short of disproving the existence of God. In fact, he accuses these folks of staining the scientific enterprise by bending it to their dark mission. (“The purpose of this book is to defend the integrity of science,” he writes in his introduction.) Yet Aczel has a sly mission of his own. Invoking various physical phenomena that do not (yet) have convincing scientific explanations, he sets out not only to debunk the arguments of the New Atheists but also to gently suggest that the findings of science actually point to the existence of God.

In stockpiling his arguments, Aczel quotes from his interviews with dozens of leading scientists and theologians, and interprets statements in a range of popular writings. The resulting book is part science (interesting but superficial summaries of cosmology, quantum mechanics, evolutionary biology, chaos theory), part history of religion, part philosophy, part spirituality, and a modicum of backbiting and invective. The latter applies to the writings of the New Atheists as well.

Let’s start with the origin of the universe. There is plenty of good scientific evidence that our universe began about 14 billion years ago, in a Big Bang of enormously high density and temperature, long before planets, stars and even atoms existed. But what came before? Krauss in his book discusses the current thinking of physicists that our entire universe could have emerged from a jitter in the amorphous haze of the subatomic world called the quantum foam, in which energy and matter can materialize out of nothing. (On the level of single subatomic particles, physicists have verified in the lab that such creation from “nothing” can occur.) Krauss’s punch line is that we do not need God to create the universe. The quantum foam can do it quite nicely all on its own. Aczel asks the obvious question: But where did the quantum foam come from? Where did the quantum laws come from? Hasn’t Krauss simply passed the buck? Legitimate questions. But ones we will probably never be able to answer.

In his foray into biology, Aczel says the theory of evolution is flawed. In particular, he points out that it does not explain altruistic behavior with no apparent survival benefit to the genes of the do-gooder. He cites a recent example of a Mount Everest climbing expedition in which an Israeli climber was well on his way to the top when he discovered a fallen Turkish climber who had lost his face mask and oxygen supply. At the cost of losing his own fingers and toes to frostbite, and sacrificing the glory of reaching the summit, the Israeli stopped and saved the life of the Turkish fellow. Why did he do it? “Human decency and goodness,” Aczel writes, with the implication that such qualities come from religion and spirituality. (In another chapter, he explains how a code of morality developed in early religions.)

Aczel discusses the mysteries of “emergent” phenomena — when a complex system exhibits a qualitative behavior that cannot be explained in terms of the workings of its individual parts: for example, the emergence of self-replicating life from inanimate molecules or the emergence of consciousness from a collection of connected neurons. He writes, “The inexplicability of such emergent phenomena is the reason why we cannot disprove the idea of some creative power behind everything.”

I disagree. It is not the inability of science to explain some physical phenomenon that shows we cannot disprove the existence of a creative power (i.e., God). Science is a work in progress, and phenomena that science cannot explain now may be explained 100 years from now. Before the 18th century, people had no explanation for lightning. The reason that science cannot disprove the existence of God, in my opinion, is that God, as understood by all human religions, exists outside time and space. God is not part of our physical universe (although God may choose to enter the physical universe at times). God is not subject to experimental tests. Either you believe or you don’t believe.

Thus, no matter what scientific evidence is amassed to explain the architecture of atoms, or the ways that neurons exchange chemical and electrical signals to create the sensations in our minds, or the manner in which the universe may have been born out of the quantum foam, science cannot disprove the existence of God — any more than a fish can disprove the existence of trees. Likewise, no matter what gaps exist in current scientific knowledge, no matter what baffling good deeds people do, no matter what divine and spiritual feelings people have, theology cannot prove the existence of God. The most persuasive evidence of God, according to the great philosopher and psychologist William James in his landmark book “The Varieties of Religious Experience” (1902), is not physical or objective or provable. It is the highly personal transcendent experience.

There is one scientific conundrum that practically screams out the limitations of both science and religion. And that is the “fine tuning” problem. For the past 50 years or so, physicists have become more and more aware that various fundamental parameters of our universe appear to be fine-tuned to allow the emergence of life — not only life as we know it but life of any kind. For example, if the nuclear force were slightly stronger than it is, then all of the hydrogen atoms in the infant universe would have fused with other hydrogen atoms to make helium, and there would be no hydrogen left. No hydrogen means no water. On the other hand, if the nuclear force were substantially weaker than it is, then the complex atoms needed for biology could not hold together.

In another, even more striking example, if the cosmic “dark energy” discovered 15 years ago were a little denser than it actually is, our universe would have expanded so rapidly that matter could never have pulled itself together to form stars. And if the dark energy were a little smaller, the universe would have collapsed long before stars had time to form. Atoms are made in stars. Without stars there would be no atoms and no life.

So, the question is: Why? Why do these parameters lie in the narrow range that allows life? There are three possibilities: First, there might be some as-yet-unknown physics that requires these parameters to be what they are. But this explanation is highly questionable — why should the laws of physics care about the emergence of life? Second possibility: God created the universe, God wanted life (for whatever reasons), so God designed the universe so that it would allow life. Third possibility, and the one favored by many physicists today: Our universe is one of zillions of different universes with a huge range of parameters, including many different values for the strength of the nuclear force and the density of dark energy.

Some universes have stars and planets, some do not. Some harbor life, some do not. In this scenario, our universe is simply an accident. If our particular universe did not have the right parameters to allow the emergence of life, we wouldn’t be here to talk about it. In a similar way, Earth happens to be at the right distance from the sun to have liquid water, a nice oxygen atmosphere and so on. We can ask why our planet has all these lovely properties, amenable to life. And the explanation is that there is nothing special or designed about Earth. Other planets exist. But if we lived on Mercury, where the temperature is 800 degrees, or on Neptune, where it is 328 degrees below zero, we could not exist. Unfortunately, it is almost certain that we cannot prove the existence of these other universes. We must accept their existence as a matter of faith.

And here we come to the fascinating irony of the fine-tuning problem. Both the theological explanation and the scientific explanation require faith. To be sure, there are huge differences between science and religion. Religion knows about the transcendent experience. Science knows about the structure of DNA and the orbits of planets. Religion gathers its knowledge largely by personal testament. Science gathers its knowledge by repeated experiments and mathematical calculations, and has been enormously successful in explaining much of the physical universe. But, in the manner I have described, faith enters into both enterprises.

Several years ago, I thought that the writings and arguments of such people as Dawkins and Aczel, attempting to disprove or prove the existence of God, were a terrible waste of calories. I have changed my mind. I now believe that the discussions of science and religion, even the attempts of one side to disprove the other, are part of the continuing and restorative conversation of humanity with itself. In the end, all of our art, our science and our theological beliefs are an attempt to make sense of this fabulous and fleeting existence we find ourselves in.

Alan Lightman is a physicist, novelist and professor of the practice of the humanities at MIT. His latest book is “The Accidental Universe.”

Opinion: Science Is Running Out of Things to Discover

Friday, April 11th, 2014

Call it confirmation bias, but I keep seeing signs that science—and especially fundamental physics, which seeks to discern the basic rules of reality—is running out of gas, just as I predicted in my 1996 book The End of Science.

The latest evidence is a “Correspondence” published today in the journal Nature. A group of six researchers, led by Santo Fortunato, professor of complex systems at Aalto University in Finland, points out that it is taking longer and longer for scientists to receive Nobel Prizes for their work.

The trend is weakest in prizes for physiology or medicine and strongest in physics. Prior to 1940, only 11 percent of physics prizes, 15 percent of chemistry prizes, and 24 percent of physiology or medicine prizes were awarded for work more than 20 years old. Since 1985, those percentages have risen to 60 percent, 52 percent, and 45 percent, respectively. If these trends continue, the Nature authors note, by the end of this century no one will live long enough to win a Nobel Prize, which cannot be awarded posthumously.

In their brief Nature letter, Fortunato and co-authors do not speculate on the larger significance of their data, except to say that they are concerned about the future of the Nobel Prizes. But in an unpublished paper called “The Nobel delay: A sign of the decline of Physics?” they suggest that the Nobel time lag “seems to confirm the common feeling of an increasing time needed to achieve new discoveries in basic natural sciences—a somewhat worrisome trend.”

This comment reminds me of an essay published in Nature a year ago, “After Einstein: Scientific genius is extinct.” The author, psychologist Dean Keith Simonton, suggested that scientists have become victims of their own success. “Our theories and instruments now probe the earliest seconds and farthest reaches of the universe,” he writes. Hence, scientists may produce no more “momentous leaps” but only “extensions of already-established, domain-specific expertise.” Or, as I wrote in The End of Science, “further research may yield no more great revelations or revolutions, but only incremental, diminishing returns.”

Needless to say, not all physicists accept this view—or the claim of Fortunato and co-authors that the Nobel time lag reported in Nature is a symptom of physics’ decline. The British astrophysicist Martin Rees spins the Nobel trend in the opposite direction, suggesting that it reflects “a growing backlog of potential winners.”

Rees conjectures that “there are more people than ever before whose achievements are up to the standard of most earlier winners.” But he concedes that “there is indeed perhaps a lull in particle physics.”

The recent discovery of the Higgs boson by the Large Hadron Collider (LHC) represents, paradoxically, both a triumph for particle physics and a sign of the field’s troubles. Peter Higgs and François Englert, who received the 2013 Nobel Prize in physics, predicted the existence of the Higgs boson—the fabled “God particle”—a half century ago.

The experimental evidence from the LHC that bears out their prediction stands as the capstone of the Standard Model of particle physics, which provides quantum accounts of the electroweak and strong nuclear forces governing the interactions of the known subatomic particles. But the Standard Model—often called the “theory of almost everything”—falls short of a full explanation of reality. For decades, physicists have sought to vault beyond it by proposing a host of unified theories, which assume deep connections between the electroweak and strong forces and even gravity. The most popular of these unified theories postulates that reality stems from infinitesimal strings wriggling in a hyperspace of nine or more dimensions.

But evidence—and hence Nobel recognition—for string theory and other unified theories remains elusive. Most recent Nobel Prizes in physics have instead recognized work that contributed to the conventional Standard Model and other preexisting theories rather than providing profound new insights into reality. For example, the 2003 and 1996 physics prizes honored research on superfluidity, a phenomenon first discovered in 1938.

I hope I’m wrong that the era of fundamental revelations is over, and there are grounds to argue I may be. In the late 1990s, for instance, two groups of astrophysicists discovered that the universe is expanding at an accelerating rate. The researchers won the 2011 Nobel Prize in physics for this totally unexpected finding, which hints that our understanding of the cosmos may indeed be radically incomplete.

Just last month, moreover, researchers announced that new observations of microwaves pervading the universe provide evidence of inflation, a dramatic theory of cosmic creation. Inflation theory holds that an instant after the big bang, our cosmos underwent a fantastically rapid, faster-than-light growth spurt. Inflation implies that our entire cosmos is just a tiny bubble in an oceanic “multiverse.”

But I remain skeptical of inflation. There are so many different versions of the theory that it can “predict” practically any observation, meaning that it doesn’t really predict anything at all. String theory suffers from the same problem. As for multiverse theories, all those hypothetical universes out there are unobservable by definition. It’s hard to imagine a better reason to think we may be running out of new things to discover than the fascination of physicists with these highly speculative ideas.

I would nonetheless be delighted if further observations provide enough evidence of inflation to impress the Nobel judges, who historically have had very high standards of evidence. Physicist Max Tegmark, a proponent of multiverse theories, thinks that inflation has a “good shot” at winning a Nobel.

If the Nobel Committee for Physics does decide to award prizes for the invention of inflation, it shouldn’t dally. The theory was originally proposed more than 30 years ago, and its inventors, including Alan Guth and Andrei Linde—at ages 67 and 66, respectively—aren’t getting any younger.

John Horgan teaches at Stevens Institute of Technology and writes the Cross-Check blog for Scientific American. Follow him on Twitter at @horganism.

‘Oh S***’: Kids’ Science Center Reportedly Removes Evolution Warning …

Wednesday, April 9th, 2014

Following criticism from atheists, a California science center has reportedly removed a disclaimer about evolution that it placed on a poster advertising one of its public shows.

CuriOdyssey, a science and wildlife center in San Mateo, Calif., used language to let parents know that the subject of evolution would be discussed in “Animal Connections,” a live animal demonstration. A line on the poster read, “This program may discuss the topic of evolution.”

After noticing the disclaimer, Adam Rogers, an editor and writer at Wired, tweeted a picture of the sign, writing, “Apparently evolution is something they warn you about now, like smoke effects in the theater.”

Atheists who noticed the picture soon spoke out against CuriOdyssey’s warning, reportedly leading the science center to remove the line about evolution.

“It’s like a warning sign… But for whom? Did someone attend the program, hear the ‘E’ word come up, and go, ‘Oh s***! I didn’t know this was that kind of museum!?’” wrote Friendly Atheist blogger Hemant Mehta. “Since when does a science museum need to warn people that a science presentation will include science?”

And Jerry Coyne, an atheist biologist at the University of Chicago, sent the museum a letter, calling the evolutionary warning “unnecessary” and writing that there’s no need for scientific institutions to cater to faith-based opposition to evolution.

“Evolution happens to be true, and people need to learn about it,” Coyne wrote. “Making it seem ‘scary’ in this way only adds to the bad feelings people have about such a marvelous view of life, and deprives children of a proper grounding in biology.”

A screen shot from the CuriOdyssey website

According to Mehta, a spokeswoman for CuriOdyssey told him that the evolution line will now be removed following public response.

Mehta said he was told that the science center put the disclaimer on the poster to accommodate religious visitors who were surprised that evolution was discussed during “Animal Connections.”

“But after hearing feedback from science advocates, they decided the disclaimer didn’t align with their mission and they will no longer be including it on any promotional materials,” Mehta wrote.

(H/T: Friendly Atheist)

Featured image via @jetjocko’s Twitter account

Parallel universes, milk and evolution: your science questions answered

Monday, April 7th, 2014

Q “All cows eat grass,” I was taught in music many years ago. Grass contains chlorophyll, which is based on the only mineral grass contains, magnesium. So why do people drink milk for the calcium it is said to contain? asks Tony Hunting

A Magnesium has many roles in a plant – including in chlorophyll molecules (the biological pigment needed for photosynthesis in green plants), where a magnesium ion sits in the central cavity of the large ring-shaped part of the structure. However, it is not the only “mineral nutrient” in plants.

Besides phosphorus, potassium, sulphur and other nutrients, plants also contain calcium, which is used, among other things, in the cell walls of a plant.

Cows munch on grass and other plant matter and their digestive processes break it down and allow the cow to absorb a proportion of the nutrients within. Magnesium and calcium play important roles in the body, including in muscle contraction and skeletal development. After a cow has had a calf, it produces milk to feed its offspring and help it grow and develop. Nutrients are carried in the blood to the mammary glands where milk is produced. In fact, milk does not only contain calcium; apart from fats and proteins, it has a host of other components, including vitamin B12, potassium, zinc and magnesium. In modern society we have harnessed this milk production to produce copious amounts of the white stuff for our supermarket shelves.

Q I have been reading a book that suggested the possibility of the existence of parallel universes. Is this theory still considered to be too “far out” to be generally accepted? asks George Lange

A As Dr Daniel Mortlock of Imperial College London tells me: “Probably the strictest definition of parallel universes relates to the ‘many worlds’ interpretation of quantum mechanics, which imagines that all the possible results of every decision, measurement, etc are realised in one of an infinity of parallel universes.” However, such a view is not championed by all.

“This many worlds interpretation of quantum mechanics is certainly not generally accepted by the world’s physicists, albeit not for the usual scientific reason (ie, that it makes predictions that aren’t supported by experiment), but for the more philosophical reason that its only predictions are those that were already made by quantum mechanics,” Mortlock says.

“There is hence lots of debate about whether this is really a theory at all or just an interpretation of quantum mechanics.”

There is, however, another meaning attached to the concept of parallel universes: the theory of multiple universes, or the multiverse. “This is much more like a standard physical theory: our universe is part of some much larger construction in which, for example, separate ‘bubble’ universe regions are formed, possibly with different physical constants, etc,” says Mortlock.

“In some versions of these theories the expanding bubbles can collide. This theory is hence potentially testable – the detection of a ‘bubble collision’ signature would represent strong evidence for the multiverse.”

While the multiverse theory has also not been generally accepted, it has not been completely ruled out. “The most compelling arguments in favour of the multiverse are theoretical: one is that, if the different universes have different physical constants, it’s not such insanely good luck that our universe happens to be just right for life to exist.

“Another is that the theory of inflation (the best model we have for the first instants after the big bang) very naturally predicts that many bubble universes form,” says Mortlock. “It is actively being worked on by many well-respected theoretical physicists and is one of the main suspects for a fundamental description of reality.”

Q Why do liquid molecules exert force in all directions (asked in relation to the buoyant force exerted by water in an upward direction)? asks Amitabh Saran

A First, it’s worth noting that pressure is simply a measure of force per unit area. Now, let’s imagine a beaker of water sitting at rest on a table top. At any point in the beaker, the pressure acting on that point is the same in all directions – this is because, unlike in a solid, the molecules in a liquid can move past each other. What’s more, any two points in the beaker at the same depth will experience the same pressure. However, the pressure increases with depth because of the increasing weight of the water above bearing down.

Now let’s put a cube of wood into our beaker of water. The bottom of the block is deeper in the beaker than the top and so is experiencing a greater pressure than the top of the block. But the pressure acting on the different sides of the block is the same at any given depth, so there is no net pressure to move the block left or right.

But where does buoyancy come in? Well, as the pressure acting on the bottom of the block is greater than that acting on the top, it follows that the force acting on the bottom of the block is greater than that acting on the top, so the net force is in the upwards direction. This net force is equal to the weight of water displaced by the block. And the volume of water displaced is equal to the volume of the block submerged in the water. It was this revelation that got Archimedes so excited he allegedly jumped out of his bathtub shouting “Eureka!”

If the weight of the block is greater than that of the water it displaced, the block will sink, whereas if it is lighter, the block will bob up, like a floating plastic duck.
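The two relationships in the answer above – pressure growing linearly with depth, and the buoyant force equalling the weight of displaced water – can be sketched in a few lines of code. This is an illustrative sketch only, not from the article; the densities, block size and constant values are assumptions chosen for the example.

```python
# Hydrostatic pressure and Archimedes' principle, as described in the answer.
RHO_WATER = 1000.0  # density of fresh water, kg/m^3 (assumed value)
G = 9.81            # gravitational acceleration, m/s^2

def pressure_at_depth(depth_m, surface_pressure_pa=101_325.0, rho=RHO_WATER):
    """Pressure a given depth below the surface: p = p0 + rho * g * h."""
    return surface_pressure_pa + rho * G * depth_m

def buoyant_force(submerged_volume_m3, rho=RHO_WATER):
    """Archimedes: buoyant force equals the weight of the displaced water."""
    return rho * G * submerged_volume_m3

def floats(block_density_kg_m3, rho=RHO_WATER):
    """A uniform block floats when it is less dense than the fluid."""
    return block_density_kg_m3 < rho

# A 10 cm wooden cube (assumed density ~600 kg/m^3), fully submerged:
volume = 0.1 ** 3                 # 0.001 m^3
print(pressure_at_depth(1.0))     # pressure one metre down, in pascals
print(buoyant_force(volume))      # upward force on the cube, ~9.81 N
print(floats(600.0))              # wood is less dense than water: it bobs up
```

The net upward force on the cube is simply the buoyant force minus the cube’s own weight; a positive result means it rises, a negative one means it sinks.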

Q Has the evolution of humans slowed down or stopped? asks Anna Leoni

A As Dr Chris Tyler-Smith of the Wellcome Trust Sanger Institute says, humans still face selective pressures, though these have changed. “Particularly in the westernised world, in the developed world, infectious diseases with some exceptions, in general are less of a selective force,” he says.

But this has not always been the case. “For most of our evolution, and unfortunately still in some parts of the world, they are one of the major selective forces.”

But there are other factors that affect who passes on their genes. “I think things like mate choice and family-size choice will now be influencing our evolution still, even in developed countries.

“Something like choice of family size is something that has not really been much of an option for most of our evolutionary history. Mate choice in one form or another has probably been a major force throughout and continues to be.”

However, measuring how quickly we are evolving is far from straightforward.

“It is tempting to think that evolution is slowing down as some of these pressures are changing, but I don’t know how we would really quantitate that unless we can wait for a million years to pass and then look back,” he says.

Elsewhere in Science, 4 April 2014

Friday, April 4th, 2014

Each week, Science publishes a number of articles that are likely to be of interest to career-minded readers. Because those articles are published on the other Science sites, Science Careers readers could easily overlook them.

To remedy that, every Friday we’re pointing our readers toward articles appearing in Science—the print magazine as well as ScienceInsider, ScienceNOW, Science Translational Medicine (Sci. TM) and Science Signaling—that hold some relevance to careers in science and technical fields. (Note that while articles appearing in ScienceInsider and ScienceNOW can be read by anyone, articles appearing in Sci. TM, Science Signaling, and Science may require AAAS (publisher of Science Careers) membership, a Science subscription, or a site license.)

• The controversy continues over STAP—stimulus-triggered acquisition of pluripotency—which was presented in two January Nature papers as a new method of creating stem cells. The validity of the papers was called into question after bloggers and PubPeer contributors “started pointing out possibly manipulated images and apparently plagiarized text.” At ScienceInsider on Tuesday, Dennis Normile wrote that a report by an investigation committee at RIKEN in Kobe, Japan, where the work was performed, found that “falsification and fabrication mar” the papers. The committee concluded that the researchers’ actions constitute research misconduct but did not ask for the papers to be retracted.

Lead author Haruko Obokata of the RIKEN Center for Developmental Biology (RIKEN CDB) was the only author found guilty, “[b]ut the report notes that co-authors Teruhiko Wakayama, a former RIKEN researcher now at the University of Yamanashi in Kofu, and Yoshiki Sasai, of RIKEN CDB, who worked with Obokata to finalize the research, ‘allowed the papers to be submitted to Nature without verifying the accuracy of the data, and they bear heavy responsibility for the research misconduct that resulted from this failure on their part.’”

“I am filled with feelings of indignation and surprise,” Obokata said in a statement.

• Last Friday, we mentioned embryologist Kenneth Ka-Ho Lee’s attempts to reproduce Obokata’s result. Last week, Lee, who works at the Chinese University of Hong Kong, started live-blogging the attempt on ResearchGate. Earlier this week, as thousands watched, Lee reported a surprising result: evidence of pluripotency in a sample that had been forced through very small pipettes—a step in the STAP technique—but had not been exposed to the acid solution. Lee insisted that it wasn’t an April Fools’ joke.

According to a Friday ScienceInsider post, Lee has now given up on reproducing the STAP experiment. “I don’t think STAP cells exist and it will be a waste of manpower and research funding to carry on with this experiment any further,” Lee wrote on his ResearchGate page yesterday. If he ever does decide to try again, he told Science in an interview, “I’m not going to live-blog it.”

• On Thursday at ScienceInsider, Kelly Servick reported a 1 April announcement that the Defense Advanced Research Projects Agency (DARPA) will be creating a new division “that will consolidate biology research scattered across its existing six divisions and possibly expand the arsenal of projects.” Focus areas will span a wide range: technology to support service members, synthetic biology research, and complex biological systems.

Alicia Jackson, deputy director of the new Biological Technologies Office, “wants to focus on recruiting new program managers—who normally serve 3- to 5-year stints—and reach out to ‘young researchers and start-ups who may have little idea of how to interact with DARPA or that DARPA exists at all.’”

• This week’s big career-related offering in Science is a special News Focus package called “The Hunt for Money in Biomedicine.” The package leads off with an overview, “Chasing the Money” by Jennifer Couzin-Frankel.  The overview is followed by a series of funding-focused profiles:

 - “The Vulnerable,” about a young Parkinson’s researcher struggling to land her first grant;
 - “The Veteran,” about a long-established biochemist who is shocked when his grants aren’t renewed and has to scramble to find alternatives;
 - “The Adapter,” about a developmental neuroscientist who is letting her postdoc fellows go;
 - “The Administrator,” about a vice president for research who has managed to keep the dollars flowing to his well-funded institution, Northwestern University;
 - “The Well-Heeled,” about a geneticist who still has nearly $3 million a year in funding;
 - “The Crowd-Funder,” about a nutritionist who lost funding in the middle of a clinical trial and turned to the Internet for help; but will she raise enough before the deadline?
 - “Anatomy of a Grant,” in which a microbiologist opens his books and shows where his grant dollars go;
 - And don’t miss the Science Careers tie-ins, “Research on a Shoestring in India” and “Scarcity Breeds Opportunity.” Also related is Beryl Benderly’s review of Michael Teitelbaum’s new book.

• In a Policy Forum, Bruce Weinberg of the National Bureau of Economic Research and six co-authors—including corresponding author Julia Lane of the American Institutes for Research—begin to lay rigorous empirical foundations for an evidence-based science policy. Using data from 2012, they document the short-term “production” resulting from science investments at nine institutions. The data show that these funds lead to local economic activity and support scientists in diverse roles: one in three was a student (either graduate or undergraduate), about one in ten was a postdoc, and one in three was a member of the research staff or a staff scientist. The composition of the workforce supported by these funds varies across scientific fields. For example, in computational and information sciences, a large portion of the money goes to supporting graduate students. Funding from the various institutes within the National Institutes of Health, on the other hand, is more likely to support research staff.

• In this week’s Letters section, the latest NextGen VOICES survey presents short essays in which early-career scientists explain how they would use an extra 5 hours a week if they had it. A surprisingly large proportion would devote the extra time to science outreach: most are eager to reach out to young people, but a few would aim their efforts at the broader public, at patients, or at lawmakers. My favorite response: Michael Kemp, of the Department of Biochemistry and Biophysics at the University of North Carolina School of Medicine, would use the time to do science, since right now he spends so much time writing and submitting grant proposals to try to make up for historically low funding rates. It’s a vicious cycle.

Top Image: Blood Cells. CREDIT: Mustafa Mir, Sam Copeland, and Gabriel Popescu/National Science Foundation/DARPA
