
Cerebras Announces Capability to Train Largest Models Easily
Alex Norton and Thomas Sorensen
In mid-June 2022, Cerebras Systems announced a new capability that allows users to train some of the largest AI models in the world on a single CS-2 machine with a simplified software setup. The announcement highlights several capabilities that Cerebras sees as competitive advantages over other vendors. Notable examples include the ability to hold an entire training model in memory via Cerebras' Weight Streaming software on the Wafer Scale Engine (WSE), rather than splitting it across processors, and the ability to select the scale of the model to be trained (e.g., GPT-3 13B, GPT-3 XL 1.3B) by changing only a few inputs in the software configuration and GUI. Cerebras claims this advance can cut the setup of large model training runs from months to minutes, with the Cerebras software handling much of the initial configuration.
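As a rough illustration of what such a config-driven workflow could look like, the sketch below builds a training configuration from a handful of high-level inputs. The preset names, parameter fields, and function are hypothetical and invented for this example; they are not Cerebras's actual software interface, and the layer/width figures are only approximate public GPT-3 family sizes.

```python
# Hypothetical sketch of a config-driven model-scale selection.
# None of these names correspond to Cerebras's real API.

MODEL_PRESETS = {
    # Approximate public GPT-3 family dimensions, used here only as labels.
    "gpt3-xl-1.3b": {"hidden_size": 2048, "num_layers": 24},
    "gpt3-13b":     {"hidden_size": 5140, "num_layers": 40},
}

def build_training_config(model_name: str, batch_size: int = 512) -> dict:
    """Assemble a flat training config from a small set of user inputs."""
    preset = MODEL_PRESETS[model_name]
    return {
        "model": {"name": model_name, **preset},
        "train": {"batch_size": batch_size, "max_steps": 100_000},
    }

if __name__ == "__main__":
    # Switching from a 1.3B- to a 13B-parameter run is a one-line change
    # in this sketch; the runtime would handle the rest of the setup.
    cfg = build_training_config("gpt3-13b")
    print(cfg)
```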
September 2022
Continued Development of DNA-based Storage Solutions
Mark Nossokoff, Earl Joseph
Catalog and Seagate Technology recently announced a collaboration to advance DNA-based technology toward becoming a commercially viable storage and computing solution. Catalog brings to the partnership its molecule designs for storing data in DNA and performing computation across a library of molecules, while Seagate Technology will contribute its silicon-based lab-on-a-chip technology to reduce the volume of chemistry required for DNA-based storage and computation.
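For readers unfamiliar with the underlying idea, the toy sketch below shows a textbook two-bits-per-base mapping of binary data onto the four nucleotides. It is a generic illustration only, not Catalog's proprietary molecule-design or encoding scheme.

```python
# Generic 2-bits-per-base DNA encoding, for illustration only.

BASE_FOR_BITS = {"00": "A", "01": "C", "10": "G", "11": "T"}
BITS_FOR_BASE = {bases: bits for bits, bases in BASE_FOR_BITS.items()}

def encode(data: bytes) -> str:
    """Encode bytes as a DNA base sequence (2 bits per base)."""
    bits = "".join(f"{byte:08b}" for byte in data)
    return "".join(BASE_FOR_BITS[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(sequence: str) -> bytes:
    """Recover the original bytes from a base sequence."""
    bits = "".join(BITS_FOR_BASE[base] for base in sequence)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

if __name__ == "__main__":
    payload = b"HPC"
    strand = encode(payload)   # -> "CAGACCAACAAT"
    assert decode(strand) == payload
    print(strand)
```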
September 2022