
Large Language Models: Finding Their Place in the HPC Ecosystem
$8,000.00
Authors: Bob Sorensen and Tom Sorensen
Publication Date: September 2023
Length: 33 pages
The purpose of this study was to gain a better understanding of how large language models (LLMs), an emerging class of AI algorithms, can benefit the overall HPC community. Key goals of this effort included describing the base of current and planned HPC-related activity that could incorporate LLMs, assessing the level of ongoing LLM activity within end-user organizations, characterizing interest in general-purpose LLM applications, exploring the prospects for LLM integration into traditional HPC algorithms, and highlighting the key challenges of integrating LLM capabilities into HPC-based workloads.
Related Products
2022 HPC End Users Perspectives on Trends and Forecast in HPC Storage and Interconnects – Key Findings
Mark Nossokoff, Jaclyn Ludema and Earl Joseph
Key findings from a recent Hyperion Research study indicate that HPC storage solutions, and the associated storage and system interconnects, remain critical for HPC infrastructure to deliver optimal capabilities and the fastest time to results for systems' users. Data-intensive workloads driven by new AI/ML/DL applications, the increasing scale of traditional HPC modeling and simulation, emerging edge computing, and emerging composable systems are placing greater demands and requirements on HPC storage systems. Insights into the critical factors driving these and other trends are detailed in the 2022 iteration of Hyperion Research's annual MCS end-user study, 2022 HPC Multi-Client Study: Trends and Forecasts in HPC Storage and Interconnects. Key findings from the report are summarized in this document.
February 2023 | Special Report
Expertise is a Major Concern for Both HPC and AI Users
Melissa Riddle and Mark Nossokoff
Expertise is now a major concern for both HPC and AI users, outranked only by budget concerns among the top barriers to expanding HPC on-premises. A third of HPC sites (33.1%) reported that a lack of knowledge or skilled support staff was one of their top three barriers, and only about a third of respondents (35.4%) reported having no staffing concerns within the next year. When asked about barriers to furthering AI capabilities, respondents again identified expertise as a significant concern: access to AI expertise (48.9%), skills in AI model development (47.2%), and skills in AI programming (36.1%) all ranked among the leading barriers. The data come from the eighth edition of Hyperion Research's annual HPC end-user-based tracking of the HPC marketplace, which covered 181 HPC end-user sites with 3,830 HPC systems.
June 2023 | Special Report

