Making hyperscale AI infrastructure more accessible
for scaling any AI model and application
Full-stack infrastructure software spanning PyTorch to GPUs for the LLM era
February 20, 2025
MoAI provides a PyTorch-compatible environment that makes it easy to fine-tune LLMs, including the DeepSeek 671B MoE model, on hundreds of AMD GPUs.
December 2, 2024
Moreh announces the release of Motif, a high-performance 102B-parameter Korean large language model (LLM), which will be made available as an open-source model.
September 3, 2024
There are no barriers to fine-tuning Llama 3.1 405B on the MoAI platform. The Moreh team has demonstrated fine-tuning the model on 192 AMD GPUs.
January 28, 2025
Global Corporate Venturing — The startup that perhaps comes closest to DeepSeek’s approach is South Korea’s Moreh, which has created a software tool that allows users to build and optimise their own AI models using a more flexible, modular approach.
November 18, 2024
Press release — Joint R&D of AI data center solutions integrating Tenstorrent's semiconductors with Moreh's software; targeting the NVIDIA-dominated market with competitive solutions.
January 18, 2024
The Korea Economic Daily — South Korean artificial intelligence tech startup Moreh Inc. said on Thursday that its large language model (LLM) topped a performance assessment by Hugging Face Inc., a leading global AI platform operator.