SATURDAY | MAY 24, 2025
Rethinking learning in age of artificial intelligence
MyMahir AI Council launched to future-proof workforce
BY KIRTINEE RAMESH newsdesk@thesundaily.com
PETALING JAYA: In a strategic move to future-proof the nation’s workforce, Malaysia officially launched the MyMahir National AI Council for Industry (MyMahir-NAICI) yesterday.

The council is spearheaded by Talent Corporation Malaysia Berhad (TalentCorp) under the Human Resources Ministry and the National AI Office through MyDIGITAL Corporation under the Digital Ministry. It aims to coordinate AI talent development and accelerate industry adoption.

The launch included the inaugural council meeting and the signing of a memorandum of understanding between TalentCorp and MyDIGITAL, witnessed by Human Resources Minister Steven Sim and Digital Minister Gobind Singh Deo.

“The question is not whether AI will replace jobs, but whether we will empower Malaysians to evolve with it,” Sim said. “Through MyMahir-NAICI, we’re building a whole-of-nation strategy – aligning skills with strategy, technology with talent, and policy with purpose.”

Gobind highlighted the need to link innovation with real-world application. “The National AI Office drives demand and deployment, TalentCorp shapes the talent supply, and MyMahir-NAICI closes the loop by informing policy and practice.”

The council will operate over a three-year period, focusing on four strategic pillars: AI talent development, industry integration, policy and funding alignment, and stakeholder governance. TalentCorp will act as secretariat, while the National AI Office ensures alignment with the national AI roadmap.

Supporting tools include the MyMahir Impact Study, the MyMahir.my platform, the GIAT Action Plan, which promotes coordination across government, industry, academia and training providers, and a comprehensive AI Talent Framework. The AI Readiness Index, embedded in MyMahir.my, will help companies assess their preparedness and guide transformation strategies.
According to the impact study, 620,000 jobs, or 18% of Malaysia’s formal sector, are expected to be significantly impacted by AI within the next three to five years. In response, 60 new roles have been identified across the AI, digital, green and deep tech sectors.

The initiative will be monitored by the newly established AI Implementation Monitoring Unit and supports national agendas including the Madani Economy, the New Industrial Master Plan 2030 and the Malaysia Digital Economy Blueprint.

Call for support to fund inmate education

MALACCA: The Prisons Department is calling on the private sector, corporate bodies and non-governmental organisations to help fund higher education programmes for inmates keen to continue their studies.

Deputy commissioner-general (security and corrections) Datuk Ibrisam Abdul Rahman said that although these individuals are serving sentences, many have expressed strong interest in pursuing studies at diploma, bachelor’s, master’s and even doctoral levels.

“This cooperation through corporate social responsibility programmes would be immensely helpful, as current funds are limited and insufficient to cover the full cost of inmates’ education, especially for those without financial support from their families.

“Some may be fortunate to receive help from their families, but for those who cannot afford it, we must step in to secure suitable funding. Without such assistance, many may only be able to complete their Sijil Pelajaran Malaysia,” he said in Telok Mas on Thursday. – Bernama
Universities urged to prioritise creativity, ethics and critical thinking as AI tools reshape student work habits

BY QIRANA NABILLA MOHD RASHIDI newsdesk@thesundaily.com

PETALING JAYA: As artificial intelligence (AI) becomes embedded in students’ daily academic routines, universities face urgent calls to rethink how they teach and assess learning before real understanding is replaced by AI-generated shortcuts.

Universiti Teknikal Malaysia Melaka Faculty of Artificial Intelligence and Cybersecurity dean Assoc Prof Dr Muhammad Hafidz Fazli Md Fauadi warned that over-reliance on AI risks producing graduates who pass exams but lack critical thinking and practical skills.

“Recent advances in AI have reshaped education. You’ll hear students say, ‘I finished my final assignment in four hours using AI.’ It may sound like a joke, but it reflects a real challenge in today’s higher education landscape,” he said.

His comments followed theSun’s report on a student’s Facebook post claiming they breezed through assignments with AI tools, prompting backlash from netizens concerned about eroding cognitive skills.

AI tools now enable students to generate polished essays, technical reports or even coding projects within minutes, often without grasping the underlying content. Muhammad Hafidz noted that this demands a serious rethink by universities, particularly among lecturers and administrators, to ensure that teaching remains relevant in the digital age.

“Policy-makers and academic leaders must redesign curricula to integrate AI responsibly.”

He stressed that assessment should go beyond factual knowledge to evaluate creativity, critical thinking, problem-solving and ethics.

He said banning AI outright is neither feasible nor productive given its growing presence in modern workplaces. Instead, students should be taught to use it ethically, in line with the World Economic Forum’s 2025 report, which highlights AI literacy, creativity and critical thinking as essential job skills.

“Educators also need ongoing professional development to understand AI tools and how to assess AI-assisted work fairly.

“By adapting frameworks such as Bloom’s Taxonomy, institutions can build clear rubrics that measure originality, technical skills and practical AI usage, helping separate genuine learning from over-reliance.”

He added that the focus should shift from outcomes to learning processes. Requiring students to submit multiple drafts, include software logs or maintain reflective journals can promote engagement and limit misuse.

“Transparency is key. Students should disclose AI prompts, responses and their own edits. This fosters explainability, especially in STEM disciplines, and helps them critically assess the role of AI in their work.”

Muhammad Hafidz also recommended expanding oral and in-person assessments. “When asked, ‘Why did you choose this approach?’, students who overly rely on AI often can’t explain their work. That’s a red flag.”

He further advocated phased assessments – breaking assignments into proposals, drafts and final submissions – to reduce last-minute dependency on generative tools. Some platforms now even allow educators to track AI interactions in real time.

The most effective safeguard, he noted, is designing assignments that require real-world problem-solving and creativity, areas where AI alone falls short. “Engineering students, for instance, could be tasked with designing a solar-powered system for a specific village, complete with site planning, cost analysis and social impact evaluation. That’s not something you can easily copy-paste from AI.”

Ultimately, he urged educators to go beyond preparing students for exams and equip them for the real world.

Muhammad Hafidz said educators also need ongoing professional development to understand AI tools and how to assess AI-assisted work fairly. – ADAM AMIR HAMZAH/THESUN

Varsities boost efforts to address academic integrity

PETALING JAYA: With artificial intelligence tools increasingly used by students to complete assignments, academic institutions are stepping up efforts to detect AI-generated content, viewing it as a growing threat to academic integrity.

Universiti Teknologi Mara College of Computing, Informatics and Mathematics (computer science) head Assoc Prof Dr Nor Shahniza Kamal Bashah said that while AI has the potential to enhance learning, its use raises important questions about integrity, policy and fairness.

She noted that academics recognise AI’s benefits, such as helping generate engaging content, guiding students to accurate answers and correcting coding errors.

“Currently, there’s no specific policy or guideline governing responsible or acceptable AI use in academic work. Institutions typically monitor the similarity index to ensure it remains below 30%, but now many also use AI detectors to assess how much of a student’s work may have been generated by tools such as ChatGPT.”

Nor Shahniza said students at the university are required to use Turnitin, which now detects both similarity and the percentage of AI-generated content. “In most cases, if a student’s work shows a high percentage of AI-generated text, they will be asked to revise and resubmit their assignment until the AI score is brought down.”

She warned that misusing AI, such as submitting AI-generated work without disclosure, relying entirely on it to complete assignments or using it to bypass learning objectives, undermines academic integrity.

“Such actions can violate academic policies and may result in disciplinary consequences, similar to plagiarism.

“AI is here to stay, but so is the importance of academic honesty. As students navigate this new landscape, learning to use AI wisely is essential to remain innovative and ethical.”

Theatre student Alini Anak Dolly, 22, said she occasionally uses tools such as ChatGPT or Google Gemini, mainly to generate ideas and improve her writing. “I think AI helps me understand certain topics better because it explains things in ways that suit my learning style. Not everyone processes information the same way.”

She is aware of her university’s policy on AI use and believes it is fair, valuing the balance between using technology and building her own skills.

Directing in Film student Muhammad Azim Irfan Bahtiar, 22, shared that he often turns to AI tools when struggling to begin assignments. “Sometimes I also ask it to explain theories or terms I don’t understand. It’s like a study buddy that guides me or explains things better than some textbooks.”

Commenting on the use of AI detectors by lecturers, he said the aim is to ensure students do not simply copy AI-generated answers. “But not all AI use is cheating. Detectors aren’t always accurate and can flag original work unfairly. Instead of relying solely on these tools, it’s better to teach students responsible AI use and foster mutual trust.” – by Qirana Nabilla Mohd Rashidi