
Application Scaling for Typical HPC Site Covers Broad Range from Single-Core to Multi-Node

$2,000.00

Authors: Melissa Riddle and Mark Nossokoff

Publication Date: June 2023

Length: 5 pages

Category: Uncategorized
Description

In a recent study, respondents reported that slightly less than half of all their HPC applications (46.8%) run on multiple nodes, leaving the remaining applications running on a single node or less. Looking at the top two applications per site, roughly two-thirds (68.9%) were run on multiple nodes. This suggests that the typical HPC site has at least several important, large-scale applications alongside a very large set of single-node (and smaller) jobs. This data is from an annual study that is part of the eighth edition of Hyperion Research’s HPC end-user-based tracking of the HPC marketplace, which included 181 HPC end-user sites with 3,830 HPC systems.
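
The two figures cited above use different denominators: 46.8% is the share of all applications at a site that run on multiple nodes, while 68.9% is the share of only each site's top two applications. The following minimal Python sketch, using a made-up single-site application inventory rather than any Hyperion Research data, illustrates how the two shares are computed.

    # Hypothetical single-site application inventory (illustrative only, not study data).
    # Each entry records the application's importance rank at the site and whether it
    # runs on multiple nodes.
    apps = [
        {"name": "app_A", "rank": 1, "multi_node": True},
        {"name": "app_B", "rank": 2, "multi_node": True},
        {"name": "app_C", "rank": 3, "multi_node": False},
        {"name": "app_D", "rank": 4, "multi_node": False},
        {"name": "app_E", "rank": 5, "multi_node": False},
    ]

    # Share of all applications that run on multiple nodes (the study reports 46.8% across sites).
    overall_share = sum(a["multi_node"] for a in apps) / len(apps)

    # Share of only the top two applications that run on multiple nodes (the study reports 68.9%).
    top_two = [a for a in apps if a["rank"] <= 2]
    top_two_share = sum(a["multi_node"] for a in top_two) / len(top_two)

    print(f"All applications multi-node: {overall_share:.1%}")      # 40.0% in this toy inventory
    print(f"Top-two applications multi-node: {top_two_share:.1%}")  # 100.0% in this toy inventory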

Related Products

    Continued Development of DNA-based Storage Solutions

    Mark Nossokoff, Earl Joseph

    Catalog and Seagate Technology recently announced a collaboration to advance DNA-based technology toward becoming a commercially viable storage and computing solution. Catalog brings to the partnership its molecule designs for storing data in DNA and for performing computation across a library of molecules, while Seagate Technology will contribute its silicon-based lab-on-a-chip technology to reduce the volume of chemistry required for DNA-based storage and computation.

    September 2022 | Uncategorized

    Slurm Remains Top Resource Manager

    Melissa Riddle and Mark Nossokoff

    Slurm continues to be the most popular job queuing, resource management, and scheduling software at HPC sites around the world. In a recent study, Slurm maintained its lead, with half of all respondents (50.0%) reporting that they use Slurm at least some of the time. After Slurm, the most popular resource managers and schedulers were OpenPBS (18.9%), PBS Pro (13.9%), Torque (13.3%), NQS (12.2%), and LSF (10.6%). This data is from an annual study that is part of the eighth edition of Hyperion Research's HPC end-user-based tracking of the HPC marketplace, which included 181 HPC end-user sites with 3,830 HPC systems.

    June 2023 | Uncategorized
