Alibaba’s Qwen powers latest version of AI Singapore’s home-grown Sea-Lion LLM

The partnership also sees Alibaba offering technical support for “advanced post-training” of the LLM

Chloe Lim
Published Mon, Nov 24, 2025 · 01:51 PM
    Qwen-Sea-Lion-v4 was built using Alibaba's Qwen3-32B as a foundation model. PHOTO: BT FILE

    [SINGAPORE] Alibaba’s cloud computing arm on Monday (Nov 24) announced its support for the latest version of AI Singapore’s (AISG) Sea-Lion large language model (LLM), dubbed Qwen-Sea-Lion-v4.

    This latest version was built on Alibaba’s Qwen3-32B foundation model, said Alibaba Cloud in a news release, noting that its launch marked a “significant step in AISG’s efforts to deliver increasingly capable and accessible AI solutions for the region”.

    The partnership also sees Alibaba offering technical support for “advanced post-training” of the LLM.

    Alibaba Cloud also said the base Qwen3-32B model had been further trained on over 100 billion South-east Asian language tokens to enhance its ability to interpret local expressions, conversational nuances and regional knowledge domains.

    As for the base Qwen3 model – also the latest iteration of the Qwen family – it was already pre-trained on a large and diverse dataset spanning 119 languages and dialects, and totalling 36 trillion tokens.

    This gives it broader linguistic exposure from the outset, including to South-east Asian languages that are typically under-represented in mainstream AI models.

    For its part, AI Singapore contributed its open-source, region-specific data curation, optimisation and evaluation across South-east Asian language tasks.

    Combining the model’s multilingual and reasoning strengths with AI Singapore’s deep regional expertise shows how open collaboration can make advanced AI more “inclusive and locally relevant”, said Choong Hon Keat, general manager for Singapore at Alibaba Cloud Intelligence.

    “We look forward to enabling more developers, enterprises and public-sector partners to build applications that truly understand the languages and cultures of this region,” he added.

    Qwen-Sea-Lion-v4 now ranks first on the South-east Asian Holistic Evaluation of Language Models (SEA-HELM) benchmark among open-source models with fewer than 200 billion parameters. It can be downloaded for free from the AI Singapore website or Hugging Face.
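
    For developers, the Hugging Face release means the model can be pulled into existing workflows with standard open-source tooling. The snippet below is a minimal sketch of loading and prompting it with the Hugging Face transformers library; the repository ID shown is an assumed placeholder for illustration, and the exact name should be taken from AI Singapore's Hugging Face page.

```python
# Minimal sketch: loading Qwen-Sea-Lion-v4 from Hugging Face with transformers.
# The repository ID below is an assumption for illustration only.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "aisingapore/Qwen-SEA-LION-v4-32B-IT"  # assumed repo ID, check AISG's page

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Prompt the model in a South-east Asian language (Bahasa Indonesia here).
messages = [{"role": "user", "content": "Apa itu nasi lemak?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```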

    In December 2023, the S$70 million Sea-Lion initiative was publicly launched to build an open-source LLM that reflects the native characteristics of South-east Asia.

    The project is funded by the National Research Foundation and backed by the Infocomm Media Development Authority and the Agency for Science, Technology and Research.

