AI Singapore's Sea-Lion v2 AI model is now available

Built on Meta's Llama 3.

Photo Credit: Paul Mah

Sea-Lion v2, the new version of the made-in-Singapore AI model, is now available.

No mainstream publication has reported on it yet, but Dr. Leslie Teo, who helms the project, announced its availability in a LinkedIn post yesterday.

The background of Sea-Lion

Created by AI Singapore, the Sea-Lion project was designed specifically to understand and represent Southeast Asia's linguistic and cultural diversity.

The original Sea-Lion:

  • Trained using 8x Nvidia A100 GPUs.
  • Created by a lean team of 20 Singaporeans.
  • Outperformed other LLMs on Southeast Asian tasks.

I've written about how tech giants have trampled on copyright and ethics as they rushed to accumulate sufficient data to train their AI models.

However, AI Singapore took great pains to source its data ethically.

I spoke with Leslie in April, and he told me: "We want to do things correctly and uphold a higher standard with our AI model in Singapore."

Sea-Lion v2

Unlike the original Sea-Lion which was trained from scratch, Sea-Lion v2 is built on Meta's Llama 3.

The project is now part of the National Multimodal LLM Programme (NMLP), which sees the Singapore government setting aside SG$70M* to develop AI talent.

According to Leslie, Sea-Lion v2's key features include:

  • Continued pre-training and fine-tuning.
  • Trained on 50B tokens from Southeast Asian languages.
  • Licensed under the Meta Llama 3 Community License.

It is instruction-tuned in English, Bahasa Indonesia, Thai, Vietnamese, and Tamil.

Sea-Lion v2 demonstrates superior performance on tasks in regional languages while retaining Llama 3's general capabilities.

*The SG$70M is spread across several initiatives, not just Sea-Lion.

The training process

Sea-Lion v2 was trained using 64x Nvidia H100 GPUs in just two days per run. This excludes the "many" experiments with hyperparameters and data mixes.
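As a rough back-of-envelope estimate (my arithmetic, not a figure from the team), reading "two days" as about 48 wall-clock hours gives the compute budget per run:

```python
# Rough compute estimate for a single Sea-Lion v2 training run.
# Assumes "two days" means ~48 wall-clock hours (my reading of the post).
gpus = 64           # Nvidia H100s
hours_per_run = 48  # assumed wall-clock hours
gpu_hours = gpus * hours_per_run
print(gpu_hours)    # 3072 GPU-hours per run
```

That's per successful run; the experimental runs mentioned above would multiply this several times over.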

According to Leslie, the challenge with Continued Pre-Training (CPT) is not the new knowledge but preserving older knowledge.
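One common mitigation for this kind of forgetting (a standard technique, not necessarily what the Sea-Lion team did) is "replay": mixing a fraction of data from the original training distribution back into the continued pre-training corpus. A minimal sketch, with hypothetical document names:

```python
import random

def build_cpt_mix(new_docs, original_docs, replay_ratio=0.2, seed=0):
    """Build a continued pre-training corpus that interleaves new-domain
    documents with 'replay' samples from the original distribution,
    which helps preserve the base model's existing knowledge."""
    rng = random.Random(seed)
    n_replay = int(len(new_docs) * replay_ratio)
    replay = [rng.choice(original_docs) for _ in range(n_replay)]
    mix = list(new_docs) + replay
    rng.shuffle(mix)
    return mix

# Hypothetical corpora: 100 Southeast Asian docs plus English replay docs.
corpus = build_cpt_mix(
    [f"doc_sea_{i}" for i in range(100)],
    [f"doc_en_{i}" for i in range(1000)],
)
```

The `replay_ratio` here is illustrative; in practice the data mix is exactly the kind of thing the team's "many" experiments would have tuned.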

There appear to be plans to build on Google's Gemma 2 and AI startup Reka's models next.

For now, Sea-Lion v2 is available to download as a base model, instruction-tuned model, or smaller quantised models.

Sorry, there's no online demo, though the instruction-tuned model supports basic "chats" once deployed in a suitable environment.

Read Leslie's LinkedIn post here.