The Standard Model of particle physics: The absolutely amazing theory of almost everything

How does our world work on a subatomic level?
Varsha Y S, CC BY-SA

By Glenn Starkman, Case Western Reserve University

The Standard Model. What a dull name for the most accurate scientific theory known to human beings.

More than a quarter of the Nobel Prizes in physics of the last century are direct inputs to or direct results of the Standard Model. Yet its name suggests that if you can afford a few extra dollars a month you should buy the upgrade. As a theoretical physicist, I’d prefer The Absolutely Amazing Theory of Almost Everything. That’s what the Standard Model really is.

Many recall the excitement among scientists and media over the 2012 discovery of the Higgs boson. But that much-ballyhooed event didn’t come out of the blue – it capped a five-decade undefeated streak for the Standard Model. Every fundamental force but gravity is included in it. Every attempt to overturn it, to demonstrate in the laboratory that it must be substantially reworked – and there have been many over the past 50 years – has failed.

In short, the Standard Model answers this question: What is everything made of, and how does it hold together?

The smallest building blocks

But these elements can be broken down further.
Rubén Vera Koster, CC BY-SA

You know, of course, that the world around us is made of molecules, and molecules are made of atoms. Chemist Dmitri Mendeleev figured that out in the 1860s and organized all atoms – that is, the elements – into the periodic table that you probably studied in middle school. But there are 118 different chemical elements. There’s antimony, arsenic, aluminum, selenium … and 114 more.

Physicists like things simple. We want to boil things down to their essence, a few basic building blocks. Over a hundred chemical elements is not simple. The ancients believed that everything is made of just five elements – earth, water, fire, air and aether. Five is much simpler than 118. It’s also wrong.

By 1932, scientists knew that all those atoms are made of just three particles – neutrons, protons and electrons. The neutrons and protons are bound together tightly into the nucleus. The electrons, thousands of times lighter, whirl around the nucleus at speeds approaching that of light. Physicists Planck, Bohr, Schroedinger, Heisenberg and friends had invented a new science – quantum mechanics – to explain this motion.

That would have been a satisfying place to stop. Just three particles. Three is even simpler than five. But held together how? The negatively charged electrons and positively charged protons are bound together by electromagnetism. But the protons are all huddled together in the nucleus and their positive charges should be pushing them powerfully apart. The neutral neutrons can’t help.

What binds these protons and neutrons together? “Divine intervention” a man on a Toronto street corner told me; he had a pamphlet, I could read all about it. But this scenario seemed like a lot of trouble even for a divine being – keeping tabs on every single one of the universe’s 10⁸⁰ protons and neutrons and bending them to its will.

Expanding the zoo of particles

Meanwhile, nature cruelly declined to keep its zoo of particles to just three. Really four, because we should count the photon, the particle of light that Einstein described. Four grew to five when Anderson measured electrons with positive charge – positrons – striking the Earth from outer space. At least Dirac had predicted these first anti-matter particles. Five became six when the pion, which Yukawa predicted would hold the nucleus together, was found.

Then came the muon – 200 times heavier than the electron, but otherwise a twin. “Who ordered that?” I.I. Rabi quipped. That sums it up. Number seven. Not only not simple, redundant.

By the 1960s there were hundreds of “fundamental” particles. In place of the well-organized periodic table, there were just long lists of baryons (heavy particles like protons and neutrons), mesons (like Yukawa’s pions) and leptons (light particles like the electron, and the elusive neutrinos) – with no organization and no guiding principles.

Into this breach sidled the Standard Model. It was not an overnight flash of brilliance. No Archimedes leapt out of a bathtub shouting “eureka.” Instead, there was a series of crucial insights by a few key individuals in the mid-1960s that transformed this quagmire into a simple theory, and then five decades of experimental verification and theoretical elaboration.

Quarks. They come in six varieties we call flavors. Like ice cream, except not as tasty. Instead of vanilla, chocolate and so on, we have up, down, strange, charm, bottom and top. In 1964, Gell-Mann and Zweig taught us the recipes: Mix and match any three quarks to get a baryon. Protons are two ups and a down quark bound together; neutrons are two downs and an up. Choose one quark and one antiquark to get a meson. A pion is an up or a down quark bound to an anti-up or an anti-down. All the material of our daily lives is made of just up and down quarks and anti-quarks and electrons.
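
To see how the recipe works in practice, here is a minimal sketch – an illustration added here, not part of the original article – that adds up quark electric charges using the standard values of +2/3 for the up quark and -1/3 for the down quark (antiquarks carry the opposite sign):

```python
from fractions import Fraction

# Electric charges of the up and down quarks, in units of the proton charge.
# (Standard values; antiquarks carry the opposite sign.)
CHARGE = {"u": Fraction(2, 3), "d": Fraction(-1, 3)}

def total_charge(quarks):
    """Add up the charges of a list of quarks; a leading '~' marks an antiquark."""
    total = Fraction(0)
    for q in quarks:
        if q.startswith("~"):
            total -= CHARGE[q[1:]]
        else:
            total += CHARGE[q]
    return total

# Baryons are three quarks; mesons are a quark plus an antiquark.
print("proton  (u u d)   :", total_charge(["u", "u", "d"]))   # 1
print("neutron (u d d)   :", total_charge(["u", "d", "d"]))   # 0
print("pi+     (u anti-d):", total_charge(["u", "~d"]))       # 1
```

The totals come out to +1 for the proton, 0 for the neutron and +1 for the positive pion – exactly the charges measured in the laboratory.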

The Standard Model of elementary particles provides an ingredients list for everything around us.
Fermi National Accelerator Laboratory, CC BY

Simple. Well, simple-ish, because keeping those quarks bound is a feat. They are tied to one another so tightly that you never ever find a quark or anti-quark on its own. The theory of that binding, and the particles called gluons (chuckle) that are responsible, is called quantum chromodynamics. It’s a vital piece of the Standard Model, but mathematically difficult, even posing an unsolved problem of basic mathematics. We physicists do our best to calculate with it, but we’re still learning how.

The other aspect of the Standard Model is “A Model of Leptons.” That’s the name of the landmark 1967 paper by Steven Weinberg that pulled together quantum mechanics with the vital pieces of knowledge of how particles interact and organized the two into a single theory. It incorporated the familiar electromagnetism, joined it with what physicists called “the weak force” that causes certain radioactive decays, and explained that they were different aspects of the same force. It incorporated the Higgs mechanism for giving mass to fundamental particles.

Since then, the Standard Model has predicted the results of experiment after experiment, including the discovery of several varieties of quarks and of the W and Z bosons – heavy particles that are for weak interactions what the photon is for electromagnetism. The possibility that neutrinos aren’t massless was overlooked in the 1960s, but slipped easily into the Standard Model in the 1990s, a few decades late to the party.

3D view of an event recorded at the CERN particle accelerator showing characteristics expected from the decay of the SM Higgs boson to a pair of photons (dashed yellow lines and green towers).
McCauley, Thomas; Taylor, Lucas; for the CMS Collaboration CERN, CC BY-SA

Discovering the Higgs boson in 2012, long predicted by the Standard Model and long sought after, was a thrill but not a surprise. It was yet another crucial victory for the Standard Model over the dark forces that particle physicists have repeatedly warned loomed over the horizon. Concerned that the Standard Model didn’t adequately embody their expectations of simplicity, worried about its mathematical self-consistency, or looking ahead to the eventual necessity to bring the force of gravity into the fold, physicists have made numerous proposals for theories beyond the Standard Model. These bear exciting names like Grand Unified Theories, Supersymmetry, Technicolor, and String Theory.

Sadly, at least for their proponents, beyond-the-Standard-Model theories have not yet successfully predicted any new experimental phenomenon or any experimental discrepancy with the Standard Model.

After five decades, far from requiring an upgrade, the Standard Model is worthy of celebration as the Absolutely Amazing Theory of Almost Everything.

Glenn Starkman, Distinguished University Professor of Physics, Case Western Reserve University

This article was originally published on The Conversation. Read the original article.

The secret to turtle hibernation: Butt-breathing

Turtles can’t head south for the winter, so they hibernate in rivers, lakes and ponds.
(Pexels)

By Jacqueline Litzgus, Laurentian University

To breathe or not to breathe, that is the question.

What would happen if you were submerged in a pond where the water temperature hovered just above freezing and the surface was capped by a lid of ice for 100 days?

Well, obviously you’d die.

And that’s because you’re not as cool as a turtle. And by cool I don’t just mean amazing, I mean literally cool, as in cold. Plus, you can’t breathe through your butt.

But turtles can, which is just one of the many reasons that turtles are truly awesome.

Cold-weather slowdown

As an ectotherm — an animal that relies on an external source of heat — a turtle’s body temperature tracks that of its environment. If the pond water is 1℃, so is the turtle’s body.

But turtles have lungs and they breathe air. So, how is it possible for them to survive in a frigid pond with a lid of ice that prevents them from coming up for air? The answer lies in the relationship between body temperature and metabolism.

A cold turtle in cold water has a slow metabolism. The colder it gets, the slower its metabolism, which translates into lower energy and oxygen demands.
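
A standard physiological rule of thumb, the Q10 coefficient, makes the effect concrete: metabolic rate changes by a factor of roughly two to three for every 10℃ change in body temperature. The sketch below is an illustration using assumed values, not a calculation from the article:

```python
def relative_metabolic_rate(t_cold, t_warm, q10=2.5):
    """Metabolic rate at t_cold as a fraction of the rate at t_warm (Q10 rule)."""
    return q10 ** ((t_cold - t_warm) / 10.0)

# A turtle at 1 °C in a winter pond versus the same turtle at a summery 21 °C.
# Q10 = 2.5 is a typical mid-range value for ectotherms (an assumed figure).
fraction = relative_metabolic_rate(1, 21)
print(f"Metabolic rate at 1 °C is about {fraction:.0%} of the rate at 21 °C")
```

At 1℃ the turtle is running on a small fraction of its summer energy budget, which is why its oxygen needs shrink so dramatically.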

When turtles hibernate, they rely on stored energy and take up oxygen from the pond water by moving it across body surfaces that are flush with blood vessels. In this way, they can get enough oxygen to support their minimal needs without using their lungs. Turtles have one area that is especially well vascularized — their butts.

See, I wasn’t kidding, turtles really can breathe through their butts. (The technical term is cloacal respiration.)

Not frozen, just cold

We are not turtles. We are endotherms — expensive metabolic heat furnaces — that need to constantly fuel our bodies with food to generate body heat and maintain a constant temperature to stay alive and well.

When it’s cold out, we pile on clothes to trap metabolic heat and stay warm. We could never pick up enough oxygen across our vascularized surfaces, other than our lungs, to supply the high demand of our metabolic furnaces.

Turtles will bask in the sun to warm up and ease their crampy muscles.
(Patrick Moldowan), Author provided

For humans, a change in body temperature is generally a sign of illness, that something is wrong. When a turtle’s body temperature changes, it’s simply because the environment has become warmer or colder.

But even ectotherms have their limits. With very few exceptions (e.g., box turtles), adult turtles cannot survive freezing temperatures; they cannot survive having ice crystals in their bodies. This is why freshwater turtles hibernate in water, where their body temperatures remain relatively stable and will not go below freezing.

Water acts as a temperature buffer; it has a high specific heat, which means it takes a lot of energy to change water temperature. Pond water temperatures remain quite stable over the winter and an ectotherm sitting in that water will have a similarly stable body temperature. Air, on the other hand, has a low specific heat so its temperature fluctuates, and gets too cold for turtle survival.
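
As a rough back-of-the-envelope illustration (the specific heats are standard textbook values; the 1,000 kg figure is an arbitrary example mass, not from the article):

```python
# Energy needed to change the temperature of water vs. air by 1 °C.
SPECIFIC_HEAT = {"water": 4184, "air": 1005}   # J per kg per °C (textbook values)

mass_kg = 1000
for substance, c in SPECIFIC_HEAT.items():
    energy_megajoules = c * mass_kg * 1 / 1e6   # Q = m * c * delta_T
    print(f"Warming 1,000 kg of {substance} by 1 °C takes {energy_megajoules:.1f} MJ")
```

Kilogram for kilogram, water soaks up roughly four times as much energy as air for the same temperature change – and a pond contains vastly more mass than the air above it, so its temperature drifts only slowly.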

Crampy muscles

An ice-covered pond presents two problems for turtles: they can’t surface to take a breath, and little new oxygen gets into the water. On top of that, there are other critters in the pond consuming the oxygen that was produced by aquatic plants during the summer.

Over the winter, as the oxygen is used up, the pond becomes hypoxic (low oxygen content) or anoxic (depleted of oxygen). Some turtles can handle water with low oxygen content — others cannot.

Snapping turtles and painted turtles tolerate this stressful situation by switching their metabolism to one that doesn’t require oxygen. This ability is amazing, but can be dangerous, even lethal, if it goes on for too long, because acids build up in their tissues as a result of this metabolic switch.

But how long is “too long”? Both snapping turtles and painted turtles can survive forced submergence at cold water temperatures in the lab for well over 100 days. Painted turtles are the kings of anoxia-tolerance. They mobilize calcium from their shells to neutralize the acid, in much the same way we take calcium-containing antacids for heartburn.

In the spring, when anaerobic turtles emerge from hibernation, they are basically one big muscle cramp. It’s like when you go for a hard run — your body switches to anaerobic metabolism, lactic acid builds up and you get a cramp. The turtles are desperate to bask in the sun to increase their body temperature, to fire up their metabolism and eliminate these acidic by-products.

And it’s hard to move when they’re that crampy, making them vulnerable to predators and other hazards. Spring emergence can be a dangerous time for these lethargic turtles.

Cold weather turtle tracking

Field biologists tend to do their research during the spring and summer, when animals are most active. But in Ontario, where the winters are long, many turtle species are inactive for half of their lives.

Understanding what they do and need during winter is essential to their conservation and habitat protection, especially given that two-thirds of turtle species are at risk of extinction.

X marks the spot. Former graduate student Bill Greaves tracks turtles during a cold Ontario winter.
Author provided

My research group has monitored several species of freshwater turtles during their hibernation. We attach tiny devices to the turtles’ shells that measure temperature and allow us to follow them under the ice.

We’ve found that all species choose to hibernate in wetland locations that hover just above freezing, that they move around under the ice, hibernate in groups and return to the same places winter after winter.

Despite all this work, we still know so little about this part of turtles’ lives.

So, I do what any committed biologist would do: I send my students out to do field research at -25℃. We are not restricted to fair-weather biology here.

Besides, there is unparalleled beauty in a Canadian winter landscape, especially when you envision all of those awesome turtles beneath the ice, breathing through their butts.

Jacqueline Litzgus, Professor, Department of Biology, Laurentian University

This article was originally published on The Conversation. Read the original article.

Half of Earth’s satellites restrict use of climate data

Dust storms in the Gulf of Alaska, captured by NASA’s Aqua satellite.
NASA

By Mariel Borowitz, Georgia Institute of Technology

Scientists and policymakers need satellite data to understand and address climate change. Yet data from more than half of unclassified Earth-observing satellites is restricted in some way, rather than shared openly.

When governments restrict who can access data, or limit how people can use or redistribute it, that slows the progress of science. Now, as U.S. climate funding is under threat, it’s more important than ever to ensure that researchers and others make the most of the collected data.

Why do some nations choose to restrict satellite data, while others make it openly available? My book, “Open Space,” uses a series of historical case studies, as well as a broad survey of national practices, to show how economic concerns and agency priorities shape the way nations treat their data.

The price of data

Satellites can collect comprehensive data over the oceans, arctic areas and other sparsely populated zones that are difficult for humans to monitor. They can collect data consistently over both space and time, which allows for a high level of accuracy in climate change research.

For example, scientists use data from the U.S.-German GRACE satellite mission to measure the mass of the land ice in both the Arctic and Antarctic. By collecting data on a regular basis over 15 years, GRACE demonstrated that land ice sheets in both Antarctica and Greenland have been losing mass since 2002, and that both have lost mass more rapidly since 2009.

Satellites collect valuable data, but they’re also expensive, typically ranging from US$100 million to nearly $1 billion per mission. They’re usually designed to operate for three to five years, but quite often continue well beyond their design life.

Many nations attempt to sell or commercialize data to recoup some of the costs. Even the U.S. National Oceanic and Atmospheric Administration and the European Space Agency – agencies that now make nearly all of their satellite data openly available – attempted data sales at an earlier stage in their programs. The U.S. Landsat program, originally developed by NASA in the early 1970s, was turned over to a private firm in the 1980s before later returning to government control. Under these systems, prices often ranged from hundreds to thousands of dollars per image.

(Interactive chart: https://datawrapper.dwcdn.net/IpN0F/1/)

In other cases, agency priorities prevent any data access at all. As of 2016, more than 35 nations have been involved in the development or operation of an Earth observation satellite. In many cases, nations with small or emerging space programs, such as Egypt and Indonesia, have chosen to build relatively simple satellites to give their engineers hands-on experience.

Since these programs aim to build capacity and demonstrate new technology, rather than distribute or use data, data systems don’t receive significant funding. Agencies can’t afford to develop data portals and other systems that would facilitate broad data access. They also often mistakenly believe that demand for the data from these experimental satellites is low.

If scientists want to encourage nations to make more of their satellite data openly available, both of these issues need to be addressed.

Landsat 8, an American Earth observation satellite.
NASA, CC BY

Promoting access

Since providing data to one user doesn’t reduce the amount available for everyone else, distributing data widely will maximize the benefits to society. The more that open data is used, the more we all benefit from new research and products.

In my research, I’ve found that making data freely available is the best way to make sure the greatest number of people access and use it. In 2001, the U.S. Geological Survey sold 25,000 Landsat images, a record at the time. Then Landsat data was made openly available in 2008. In the year following, the agency distributed more than 1 million Landsat images.

For nations that believe demand for their data is low, or that lack resources to invest in data distribution systems, economic arguments alone are unlikely to spur action. Researchers and other user groups need to raise awareness of the potential uses of this data and make clear to governments their desire to access and use it.

Intergovernmental organizations like the Group on Earth Observations can help with these efforts by connecting research and user communities with relevant government decision-makers. International organizations can also encourage sharing by providing nations with global recognition of their data-sharing efforts. Technical and logistical assistance – helping to set up data portals or hosting foreign data in existing portals – can further reduce the resource investment required by smaller programs.

Promise for the future

Satellite technology is improving rapidly. I believe that agencies must find ways to take advantage of these developments while continuing to make data as widely available as possible.

Satellites are collecting more data than ever before. Landsat 8 collected more data in its first two years of operation than Landsat 4 and 5 collected over their combined 32-year lifespan. The Landsat archive currently grows by a terabyte a day.

This avalanche of data opens promising new possibilities for big data and machine learning analyses – but that would require new data access systems. Agencies are embracing cloud technology as a way to address this challenge, but many still struggle with the costs. Should agencies pay commercial cloud providers to store their data, or develop their own systems? Who pays for the cloud resources needed to carry out the analysis: agencies or users?
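
As one hedged illustration of what such a data access system can look like, cloud-hosted archives are often exposed through searchable catalogs that users can query programmatically instead of downloading bulk imagery. The sketch below assumes the open-source `pystac_client` Python package and uses a placeholder catalog URL and collection name; it is not drawn from the article or from any particular agency’s system:

```python
from pystac_client import Client

# Placeholder STAC catalog endpoint and collection id -- assumptions for illustration.
catalog = Client.open("https://example.com/stac/v1")

search = catalog.search(
    collections=["landsat-example"],      # assumed collection name
    bbox=[-84.5, 33.6, -84.2, 33.9],      # area of interest (roughly Atlanta)
    datetime="2020-06-01/2020-06-30",     # one month of scenes
    max_items=5,
)

for item in search.items():
    # Each item describes one scene; its assets point to cloud-hosted files.
    print(item.id, list(item.assets))
```

The appeal of interfaces like this is that analysis can run next to the data in the cloud, rather than every user copying terabytes to their own machines – which is exactly where the cost questions above come in.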

Satellite data can contribute significantly to a wide range of areas – climate change, weather, natural disasters, agricultural development and more – but only if users can access the data.

Mariel Borowitz, Assistant Professor of International Affairs, Georgia Institute of Technology

This article was originally published on The Conversation. Read the original article.