Making hyperscale AI infrastructure more accessible
for scaling any AI model and application
Full-stack infrastructure software from PyTorch to GPUs for the LLM era
September 3, 2024
There are no barriers to fine-tuning Llama 3.1 405B on the MoAI platform. The Moreh team has demonstrated fine-tuning of the model on 192 AMD GPUs.
August 19, 2024
The MoAI platform provides comprehensive GPU virtualization, including fine-grained resource allocation, multi-GPU scaling, and heterogeneous GPU support.
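As a rough illustration of what multi-GPU scaling can mean from the user's side, below is a minimal PyTorch training-step sketch. It assumes the virtualized GPU cluster is exposed to PyTorch as a single logical device; the "cuda" device name, the toy model, and the batch size are illustrative assumptions, not documented MoAI API. Under that assumption, GPU placement and scaling happen beneath the framework rather than in user code.

```python
# Minimal sketch (illustrative only): unmodified PyTorch code targeting one logical device.
# Assumption: the virtualization layer maps this device onto one or many physical GPUs.
import torch
import torch.nn as nn

device = torch.device("cuda")  # assumed single logical device backed by the GPU cluster

# Toy model standing in for a large LLM.
model = nn.Sequential(
    nn.Linear(4096, 4096),
    nn.GELU(),
    nn.Linear(4096, 4096),
).to(device)

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Dummy batch for one training step.
x = torch.randn(8, 4096, device=device)
y = torch.randn(8, 4096, device=device)

loss = nn.functional.mse_loss(model(x), y)
loss.backward()   # placement and scaling are handled below the framework, not in this code
optimizer.step()
```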
August 14, 2023
Moreh trained the largest-ever Korean LLM, with 221B parameters, on top of the MoAI platform and a 1,200-GPU AMD MI250 cluster system.
November 18, 2024
Press release — Joint R&D on AI data center solutions integrating Tenstorrent's semiconductors with Moreh's software, targeting the NVIDIA-dominated market with competitive solutions.
January 18, 2024
The Korea Economic Daily — South Korean artificial intelligence tech startup Moreh Inc. said on Thursday that its large language model (LLM) topped a performance assessment by global leading AI platform operator Hugging Face Inc.
October 26, 2023
TechCrunch — Advanced Micro Devices (AMD) and Korean telco KT are among the investors of Moreh, which builds an AI software tool that optimizes and creates AI models.