A few years back, the late Michael Wells noted that, with the revolution in biology, many young life scientists lacked the quantitative skills necessary to be effective researchers. Mike asked for a fix. My suggestion was to take a calculus course, followed by differential equations, and then take the toughest stat class they could handle. The stat course, I had to admit, did not really exist. Soon thereafter, Mike secured funding from the Howard Hughes Medical Institute with the charge to create such a course.

Learn some math, learn how to apply it, and learn how to bring your data to your science. In recommending what is new, we honor what is old. In the 17th century, the haberdasher John Graunt introduced statistical methods, today called demography, to understand the population of London, one of the first big cities.

Late in the 18th century, Carl Friedrich Gauss created the statistical method of least squares in time for the astronomer Giuseppe Piazzi to follow a small object, the asteroid Ceres, at a big distance. Nearly two centuries after Graunt, the nurse Florence Nightingale brought new ways of displaying data to a big concern, the health of soldiers during the Crimean War.

In a world of big data, as in Graunt's social science, Gauss' physical science and Nightingale's medical science, new ideas in statistical science are needed to address big questions. Today, big begets bigger: computer and information scientists and engineers bring us more data, better algorithms and faster processors. In this new world, as in the old, the statistical scientist is an essential companion on a journey to the edge of what is known.

Demographers are analyzing tweets to uncover critical details in the conflict in Syria. Astronomers are preparing for the deluge of data from the Large Synoptic Survey Telescope to probe the mysteries of dark matter. Medical researchers are using extensive genetic data to understand how the microbiome, the critters in our gut, relates to our state of health.

In preparing for the future, a successful curriculum for the emerging generation must not be overly prescriptive. Rather, it must enable students to marry questions to their own well-designed implementation of statistical techniques. Can we manufacture tiny nanogrooves so that we can examine a single organic molecule? Can we organize electrophysiology data so that we can investigate how the brain learns and remembers? Can we sequence the genetic transcripts of white blood cells so that we can gain insights into our immune system? My students' questions are simply put: Can we do the statistics so that we can do the science?

Learn some math, learn how to apply it, learn how to bring your data to your science and your findings to the public: 2013 is the International Year of Statistics for a reason. The McKinsey Global Institute notes, "The United States alone faces a shortage of 140,000 to 190,000 people with analytical expertise and 1.5 million managers and analysts with the skills to understand and make decisions based on the analysis of big data."

Local manufacturing and biotech sectors are looking to the University of Arizona for interns and future employees in statistics. Moreover, they want the university to assist in the statistics training of their current employees.

In trying to make good on Wells' challenge, I have been privileged to witness the excitement as students take on big questions with big data. We owe them our support through public advocacy of a modern education system. And we owe them our gratitude as they address, with skill, intelligence and purpose, the big challenges ahead.


Watch the video "Improving Human Welfare in 2013, International Year of Statistics" at www.youtube.com/watch?v=nTBZuQR7dRc

Joe Watkins is the chair of the Graduate Interdisciplinary Program in Statistics at the University of Arizona and a member of the School of Mathematical Sciences.