A couple of cooled “cats” in the basement of a building off Speedway are simultaneously recording sea temperatures in the Antarctic and predicting the shape and size of the black hole at the center of the Milky Way — among many other things.

“Ocelote” and “El Gato” are the two most recent components of a high-performance research-computing system at the University of Arizona that grows and shifts to meet demand, and that allows researchers to buy additional processors with their grant funds.

Mike Bruck, assistant director for research computing at UA, prefers the term “high-performance computing” to describe what his arrays of processors do. He reserves the word “supercomputer” for the giant banks of processors put together by the National Science Foundation at eight university and institute sites across the country.

Those true supercomputers contain 200,000 to nearly 1 million processors. The research computers at the UA have about 10,000. “Ours are pretty super, but they’re pretty small compared to the big NSF computers,” Bruck said.

When chemist Steve Schwartz moved his research group to the UA in 2012, he could have bought his own computers with the startup money he received, but he knew he would have to set up and maintain them.

He instead bought into the university’s first foray into research computing.

When that system was upgraded for a second time this year, he used $200,000 in leftover startup money from the Arizona Board of Regents Technology Research Initiative Fund and a $200,000 match from the National Institutes of Health to “buy” 1,500 processors on the UA’s new “Ocelote” system. The Ocelote cluster adds nearly 10,000 cores and 60 terabytes of memory.

“I don’t actually ‘own’ anything,” said Schwartz, but his group has first dibs on using his processors.

It’s needed, he said. His 13-member group, which does theoretical and computational studies of complex systems, has projects that require a good deal of computing power.

Schwartz is trying to figure out the physical process that allows enzymes to speed up chemical reactions and is examining the physical properties of surfactants, the molecules that make detergents work and put the foam in beer.

In partnership with his wife, UA medical researcher Jil Tardiff, he is also researching the physical mechanism that leads to heart failure in young athletes.

“We’re studying how mutations in your heart cause hypertrophic cardiomyopathy,” he said.

Schwartz said his wife “brought me kicking and screaming into this with very tiny pieces of cardiac thin filament.”

“Now we study the entire cardiac thin filament — all the atoms and all the water molecules, over 4 million atoms. That’s a very big system, and you need significant computing power for that.”

Schwartz said his group can now spend all its time on research, instead of maintaining and fixing computers or writing grants to get on the NSF’s supercomputers.

Schwartz had his own bank of processors when he and his wife were researchers at Albert Einstein College of Medicine.

“In New York, I owned my own cluster. If something broke, one of us had to go over and figure out what to do about it,” he said. He arrived at the UA as the university was ramping up its research-computing facilities.

“Out of my startup money I bought a nice chunk of that first computer. It was getting turned on the day I walked in the door,” he said.

Bruck said the benefits of central research computing are many.

A team of technicians keeps things running in the basement of the University Information Technology Services Building. The computers are cooled by radiators fed by the university’s chilled water supply. Redundant power feeds from Tucson Electric Power are backed by banks of batteries and a diesel generator on the building’s south side.

You don’t have to buy processors to use the research computers, said Bruck.

The main funding for the UA’s high-performance computers comes from the Office of the Chief Information Officer and the Office for Research and Discovery. They are available to anyone doing research on campus and used by more than 120 research groups, he said.

When Schwartz isn’t using his processors, they become “windfall” time for projects with lesser priority.

Schwartz said he made use of that windfall time on the El Gato system, the 2012 upgrade made possible by grant money from the UA’s Department of Astronomy.

Schwartz said the heart muscle work required visualization, and El Gato was outfitted with 140 graphics-processing units (GPUs), the kind of processors that power video games.

The astronomers use them to simulate astrophysical phenomena. Feryal Ozel and Dimitrios Psaltis, the husband-wife team that is preparing the theoretical basis for an attempt to image the black hole at the center of our galaxy, have fed all of the data about black holes and all of Einstein’s theories into El Gato to produce animations of what that image should look like if current theories of relativity are accurate.

Oceanographer Joellen Russell was an early proponent of high-performance computing at the UA. The advances of the past few years have allowed her to make the UA the computing center of an international effort to sample the biogeochemical content of the Southern Ocean.

Russell, an associate professor of geosciences, is a principal investigator for the NSF’s Southern Ocean Carbon and Climate Observations and Modeling project. The project is run by Princeton University and the “floats” are deployed by the Scripps Institution of Oceanography.

Currently, the project has deployed 57 of 200 floats in the Antarctic. The floats are missile-shaped tanks that submerge to gather data at the 1,000- and 2,000-meter levels. The data are available within eight hours, she said.

Those data are combined with atmospheric observations to create a more complete picture of the effects of a warming world.

The Southern Ocean absorbs half the world’s carbon production and most of its heat, Russell said. Measuring it is important but difficult. Research cruises are expensive and can only be undertaken during the southern summer.

“The Southern Ocean is the hardest to observe, because it is so far from ship-enriched countries. It has incredibly bad weather. The westerlies are 30 percent stronger and they beat you to death with icebergs, massive waves and wind.”

The floats deployed by the Southern Ocean project provide data year-round. Scientists will be able to see how much the ocean is warming and what effect that has. “We’re actually watching the acidification of the ocean in real time,” Russell said.

The data collected will also serve as a check on treaty promises to reduce carbon emissions and help predict our climate future, Russell said. “We don’t have any crystal ball except math and supercomputers,” she said. “This is the new picture of oceanography. It’s a revolution. We’re going to change the world.”

Russell said the work could not have been done at the UA 10 years ago, but the computing power and speed have grown exponentially in the past few years.

Russell, who recently resigned as chair of the Research Computing Governance Committee, said the computing center generates $200 million a year in grants.

“It’s shocking, really. This is one of those ‘Stone Soup’ things where everyone brings their good stuff to the table. And this happened during the recession. It’s the best thing about being here. We know we’re not Harvard. We know someone is not going to give us $200 million to buy a supercomputer. We’re all basically chipping in to support the university,” Russell said.

Contact reporter Tom Beal at tbeal@tucson.com or 520-573-4158. Follow on Facebook or @bealagram on Twitter.