UC San Diego debuts large data storage cloud
The San Diego Supercomputer Center (SDSC) at the University of California, San Diego (UC San Diego) has launched what is purported to be the largest academic-based cloud storage system in the U.S. The system is designed for researchers, students, academics, and industry users who require secure storage and sharing of digital information, including large data sets.
Conceived in planning for UC San Diego's campus research cyberinfrastructure project, the initiative grew in scope and partners. At its launch, users and research partners include UC San Diego's Libraries, School of Medicine, Rady School of Management, Jacobs School of Engineering, and SDSC researchers, as well as federally funded research projects from the National Science Foundation, National Institutes of Health, and Centers for Medicare & Medicaid Services.
"We believe that the SDSC Cloud may well revolutionize how data is preserved and shared among researchers, especially massive datasets that are becoming more prevalent in this new era of data-intensive research and computing," says Michael Norman, director of SDSC. "The SDSC Cloud goes a long way toward meeting federal data sharing requirements, since every data object has a unique URL and could be accessed over the web."
SDSC's web-based system is 100 percent disk-based and interconnected by high-speed 10 gigabit Ethernet switching technology for fast read and write performance. With an initial raw capacity of 5.5 petabytes – one petabyte equals one quadrillion bytes, or the equivalent of about 250 billion pages of text – the SDSC Cloud has sustained read rates of 8 to 10 gigabytes (GB) per second, which are expected to improve as more nodes and storage are added. That is akin to reading all the contents of a 250 GB laptop drive in less than 30 seconds, according to UC San Diego.
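The laptop-drive comparison follows from simple arithmetic: dividing the drive's capacity by the system's sustained read rate gives the transfer time. A minimal sketch (the capacity and rate figures are from the article; the script itself is illustrative):

```python
# Back-of-envelope check of the article's comparison:
# time to read a 250 GB laptop drive at the SDSC Cloud's
# sustained read rates of 8 to 10 GB per second.
capacity_gb = 250.0  # laptop drive size cited in the article

for rate_gb_per_s in (8.0, 10.0):
    seconds = capacity_gb / rate_gb_per_s
    print(f"{rate_gb_per_s:.0f} GB/s -> {seconds:.1f} seconds")
```

At the upper end of the quoted range (10 GB/s), the read takes 25 seconds, consistent with the "less than 30 seconds" figure.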
Moreover, the SDSC Cloud is scalable by orders of magnitude to hundreds of petabytes, with aggregate performance and capacity both scaling almost linearly with growth.