Brookhaven National Laboratory deploys equipment in new computing center

By Allison Nichols
April 11, 2022

April 11, 2022 — Packing and moving is a daunting task. Now imagine factoring in large, expensive, and sensitive equipment, unpredictable weather events, and a global pandemic! These are just some of the challenges facing the Scientific Data and Computing Center (SDCC) at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory as a crew transports computing and network equipment from its longtime home in Building 515 to an upgraded facility housed in Building 725. While moving this mission-critical equipment, the team also needed to deploy new hardware, including five new racks and an advanced cooling system.

It was worth the effort to transform the former National Synchrotron Light Source building into a state-of-the-art computing and data storage facility. The new facility’s additional space and adaptable infrastructure will be critical to delivering tomorrow’s data solutions while seamlessly supporting the science already in progress today.

The SDCC manages large amounts of data with minimal downtime for several major research projects inside and outside of Brookhaven Lab. “These experiments cannot tolerate significant downtime in accessing our computing resources,” said Imran Latif, SDCC’s Head of Infrastructure and Operations. “To pull off this move, we needed the cooperation of multiple teams and their unique areas of expertise.”

More science, more data

The original SDCC, formerly known as the RHIC and ATLAS Computing Facility, has grown and adapted to meet the needs of the Laboratory over several decades. Its computers have played a crucial role in the storage, distribution, and analysis of data for experiments at the Relativistic Heavy Ion Collider (RHIC) – a DOE Office of Science user facility for nuclear physics research at Brookhaven – as well as the ATLAS experiment at the Large Hadron Collider (LHC) at CERN in Switzerland and the Belle II experiment at Japan’s SuperKEKB particle accelerator. As research at these facilities grows and accelerates, so does the volume of data. As of 2021, SDCC had stored approximately 215 petabytes (PB) of data from these experiments. If you were to translate that amount of data into full high-definition video, it would take over 700 years to watch everything continuously!
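
As a rough sanity check on that comparison, the short Python sketch below converts 215 PB into continuous playback time; the roughly 40 Mbit/s full-HD bitrate is an assumed, illustrative value, not an SDCC figure.

    # Back-of-the-envelope check of the "over 700 years of HD video" comparison.
    # The 215 PB figure comes from the article; the bitrate is an assumption.
    stored_bytes = 215e15                  # 215 petabytes (decimal)
    assumed_hd_bitrate_bps = 40e6          # ~40 Mbit/s, Blu-ray-quality full HD (assumed)

    playback_seconds = stored_bytes * 8 / assumed_hd_bitrate_bps
    playback_years = playback_seconds / (365.25 * 24 * 3600)
    print(f"Continuous playback: about {playback_years:,.0f} years")  # ~1,360 years

With these assumed numbers the result lands well above the article’s “over 700 years,” so the comparison holds even for fairly high-bitrate video.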

At SDCC, all of the hardware used to store and process data is stacked in specialized cabinets called “racks,” arranged in aisles much like shelves in a supermarket. Large, powerful computer equipment emits a lot of heat. You’ve probably felt the heat generated by a small laptop, and a typical PC can generate at least double that amount, so you can imagine how hot an entire data center can get. Arranging server racks in rows helps maximize airflow between them, but constructing these aisles requires a lot of space. This was a significant issue in Building 515, where the size, non-uniform layout, and aging infrastructure presented major challenges for expansion.
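
To give a sense of scale, the minimal sketch below extends that laptop comparison; the per-server power draw and rack density are assumed, typical figures, not SDCC specifications.

    # Rough illustration of the heat a data hall must reject.
    # All numbers here are assumed, typical values, not SDCC specifications.
    watts_per_laptop = 50       # assumed small-laptop draw under load
    watts_per_server = 500      # assumed rack-mounted server under load
    servers_per_rack = 40       # assumed rack density

    rack_heat_w = watts_per_server * servers_per_rack
    print(f"One rack: ~{rack_heat_w / 1000:.0f} kW, "
          f"about {rack_heat_w // watts_per_laptop} laptops' worth of heat")

Multiply that across rows of racks and the hall is continuously rejecting hundreds of kilowatts of heat, which is why aisle layout and cooling design matter so much.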

The Main Data Hall being assembled in Building 725, where the old synchrotron’s equipment once resided, has plenty of room to adapt and grow. Its modular layout also includes a new power distribution system that can feed racks of equipment with differing power demands. This allows equipment to be moved and rearranged almost anywhere in the data hall, if needed, to accommodate future projects.

As the SDCC grows, an important goal is to implement measures that will reduce the economic and ecological impacts of day-to-day operations. One way to achieve this is through more efficient cooling measures.

The original SDCC relied solely on a cooling system that circulates chilled air throughout an entire room via vents in the floor. The new installation, on the other hand, uses a state-of-the-art system based on rear door heat exchangers (RDHx), a more modern and efficient option for cooling at this scale. RDHx units attach directly to the back of each server rack and use chilled water to remove heat close to the source. This direct heat control allows servers to be grouped closer together, maximizing space utilization.
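
A minimal sketch of the water-side heat balance behind such a door follows; the flow rate and temperature rise are assumed, illustrative values, not USystems or SDCC specifications.

    # Heat removed by chilled water in a rear door heat exchanger:
    # Q = m_dot * c_p * delta_T. All inputs below are assumed values.
    WATER_SPECIFIC_HEAT = 4186.0            # J/(kg*K)

    def rdhx_heat_removal_kw(flow_lpm, delta_t_c):
        """Heat removed (kW) for a given water flow (L/min) and temperature rise (C)."""
        mass_flow_kg_s = flow_lpm / 60.0    # ~1 L of water is ~1 kg
        return mass_flow_kg_s * WATER_SPECIFIC_HEAT * delta_t_c / 1000.0

    # Example: 60 L/min of chilled water warming by 5 C across the door
    print(f"{rdhx_heat_removal_kw(60, 5):.0f} kW removed")  # ~21 kW

At those assumed figures a single door absorbs on the order of 20 kW, roughly the full heat output of the example rack above, which is why removing heat right at the rack is far more efficient than conditioning the whole room.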

Teamwork
The process of receiving and installing sophisticated equipment, as well as building the new infrastructure to house it, relied on the cooperation and expertise of internal laboratory staff and external vendors. Tight deadlines had to be stretched to accommodate safety measures, including keeping a reduced workforce on site during the pandemic. Shipping uncertainty also complicated the process for overseas suppliers, including UK-based USystems, which supplied the SDCC’s new cooling system.

The project fell three months behind schedule, but the group was determined to meet the timeline for each experiment. ATLAS was a particular priority: it was a race to be ready for LHC Run 3 as its start date was finalized.

Mother Nature also intervened.

“We started all of this at the onset of the intense weather last spring,” Latif said. He recalled how one part of the project in particular – moving and installing several existing racks alongside five new racks scheduled for delivery – had to be completed in the middle of a storm within a tight 72-hour window. The weather event could have cost them an entire day if they hadn’t gotten ahead of it.

“We all sat down together, the whole group, and reached a consensus about whether we could do it or not,” Latif recalled. “We took a calculated risk, while coordinating with our UK supplier.” With research, careful planning, and a bit of luck, “we were able to get everything inside the building before the worst of the storm hit.”

Setting up the new infrastructure and installing the equipment in its new home also required significant support from Brookhaven’s Facilities and Operations (F&O) team. Riggers, carpenters, and plumbers played a vital role in ensuring the new space was ready in a timely manner.

“Without their hard work, this wouldn’t be possible; it was a team effort,” Latif said.

The path ahead

Although the majority of the SDCC will now be located in Building 725, the legacy data center will still have a purpose. When the migration concludes in fall 2023, components of Building 515 will find new life as a hub for secondary data storage. Its scale and functionality will remain limited for the foreseeable future, with only legacy tape libraries and their associated servers remaining.

Meanwhile, the relocated SDCC has an exciting year ahead. Physicists involved in the ATLAS experiment will resume data collection for LHC Run 3, taking advantage of the upgraded data center resources. There will also be plenty of data to analyze from the nuclear physics experiments at RHIC, including resumed data collection from STAR and the upcoming sPHENIX detector. There are even plans to migrate data from ongoing experiments at the National Synchrotron Light Source II (NSLS-II) beamlines, another DOE Office of Science user facility. And, eventually, SDCC will take in data from the Electron-Ion Collider (EIC).

Latif distilled the goal of the new SDCC facility into three succinct principles: “adaptability, scalability and efficiency.” Designing with the future in mind opens up an exciting array of possibilities.

SDCC upgrades were supported by the DOE Office of Science. Operations at NSLS-II and RHIC are also supported by the Office of Science.

About Brookhaven National Laboratory

Brookhaven National Laboratory is supported by the U.S. Department of Energy’s Office of Science. The Office of Science is the single largest supporter of basic research in the physical sciences in the United States and is working to address some of the most pressing challenges of our time. For more information, visit science.energy.gov.


Source: Denise Yazak, Brookhaven National Laboratory
