Tape-based Storage Remains as a Key Element for On-Premises Storage Strategies
$1,500.00
Authors: Mark Nossokoff and Jaclyn Ludema
Publication Date: June 2023
Length: 3 pages
While advances in HDD capacity and in SSD performance, latency, bandwidth, and throughput dominate HPC storage-related investments and headlines, tape-based storage solutions remain a critical part of data center storage architectures. Tape continues to be used at roughly half of the sites surveyed, and average tape capacity at those sites is approximately 2.5x greater than average disk (both HDD and SSD) capacity across all surveyed sites. These findings come from the eighth edition of Hyperion Research's annual end-user-based study of the high-performance computing (HPC) marketplace, which covered 181 HPC end-user sites operating 3,830 HPC systems.
Related Products
NOAA and Microsoft Announce Cloud Computing Collaboration to Advance Climate-Ready Nation Mission
Jaclyn Ludema and Mark Nossokoff
The US National Oceanic and Atmospheric Administration (NOAA) and Microsoft have entered into a Cooperative Research and Development Agreement (CRADA), formalizing NOAA's commitment to using Microsoft Azure cloud computing resources in the pursuit of NOAA's mission to build a Climate-Ready Nation by 2030. Several initiatives are envisioned whereby NOAA scientists and engineers will work with Microsoft experts to leverage Azure's machine learning and HPC capabilities:
▪ Fast-tracking innovative contributions to NOAA Earth Prediction Innovation Center (EPIC) earth systems modeling and research
▪ Applying machine learning capabilities to improve models supporting air quality, smoke, and particulate pollution forecasts, as well as relevant NOAA climate models
▪ Accelerating NOAA Fisheries' survey and observations data collection and management
▪ Creating new ocean observations cataloging efforts
▪ Designing resilient and accessible weather modeling and forecasting that can incorporate external data sources with NOAA enterprise data
December 2022
Cerebras Announces Capability to Train Largest Models Easily
Alex Norton and Thomas Sorensen
In mid-June 2022, Cerebras Systems announced a new feature that allows users to train some of the largest AI models in the world within a single CS-2 machine using a simplified software support scheme. The announcement highlights multiple capabilities that Cerebras sees as its competitive advantages over other companies. Notable examples cited include the ability to hold an entire training model in memory, via Cerebras' Weight Streaming software on the Wafer Scale Engine (WSE), rather than splitting it across processors, as well as the ability for users to adjust a few inputs within the software scheme and GUI to choose the scale of model desired for training (e.g., GPT-3 13B, GPT-3XL 1.3B). Cerebras claims that this advancement can cut the setup of large model training runs from months to minutes, with the Cerebras software managing much of the initial setup.
September 2022