
An LLM of our own

02.11.2025

In August, I had the opportunity to attend an AI journalism lab alongside many like-minded journalists. During the two-day training, we met an official from the National University of Sciences and Technology (NUST) who explained how the university is working with the government and a telecom company to build Pakistan’s indigenous large language model (LLM).

One reporter asked: why do we need our own LLM when we already have great models in the market? My understanding so far is that we are building our localised, fine-tuned LLM on open-source foundations such as Llama, built by Meta, Facebook’s parent company. And before I get back to the reporter’s question, it is time for a flashback.
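One reason a localised model matters, hedged as a rough illustration rather than a description of the NUST project: tokenizers trained mostly on English text tend to split low-resource languages such as Urdu into many more pieces, which makes off-the-shelf models slower and weaker on local text. As a crude stand-in for token counts, the sketch below compares UTF-8 byte lengths of an English sentence and an Urdu one (the sentences are my own examples, not from the article):

```python
# Crude proxy for tokenizer coverage: English (ASCII) text encodes at
# one byte per character, while Urdu (Arabic script) needs roughly two
# bytes per character in UTF-8, so byte-level tokenizers built mainly
# on English data break it into far more pieces.
english = "Pakistan is building its own language model."
urdu = "پاکستان اپنا لینگویج ماڈل بنا رہا ہے۔"

print(len(english), len(english.encode("utf-8")))  # same count: 1 byte/char
print(len(urdu), len(urdu.encode("utf-8")))        # byte count roughly doubles
```

Byte length is only a proxy, but it hints at why fine-tuning on local-language data, rather than using a stock English-centric model, is part of the case for an indigenous LLM.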

Earlier this year, during my Deep Learning course, we had an assignment to build a model that could recognise an individual’s gender, age and race from a photo. It is a classic exercise, and YouTube is full of code walkthroughs for those who want to try. Anyway, when I fed my public photo to the model, it came up with a hilarious result: Male, 10 and White. Beyond the initial laughter, we identified several factors that explain the weak result. First, I had pigtails, which may have confused the model. Second, the dataset we used had only a few photos of people from South Asia, so the model never got to learn my features. When a model is not shown what a particular group of people looks like, it is bound to make mistakes.........
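The dataset problem above can be sketched in a few lines. This is a toy illustration, not the author’s assignment code: a one-nearest-neighbour “age” predictor trained on data that under-represents one demographic group, where the single number stands in for extracted facial features.

```python
# Toy illustration of dataset bias (hypothetical data, not the
# course assignment): each sample is (feature_value, age). Group A
# clusters near feature 0, group B near feature 5, and the training
# set contains many group-A adults but only one group-B sample.
training_data = [
    (0.1, 30), (0.2, 41), (-0.1, 25), (0.0, 36),  # many group-A adults
    (5.2, 10),                                     # a single group-B child
]

def predict_age(feature):
    # 1-nearest-neighbour: return the age of the closest training sample.
    nearest = min(training_data, key=lambda s: abs(s[0] - feature))
    return nearest[1]

# A group-B adult (feature near 5) can only be matched to the lone
# group-B sample, a child, so the model confidently predicts age 10.
print(predict_age(5.0))   # -> 10 (wrong: the only nearby sample is a child)
print(predict_age(0.05))  # -> 30 (plausible: many group-A adults nearby)
```

The failure is not in the algorithm but in the data: with so few examples of one group, the model generalises from whatever it has, which is exactly the mistake the photo classifier made.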

© The News International