
BIZ & FINANCE TUESDAY | FEB 3, 2026

Getting ready for agentic AI

From GPUs to CPUs and high-speed networks, AI infrastructure must evolve to support the scale and complexity of agentic AI systems.

AROUND the world, governments and industries are moving quickly to secure leadership in artificial intelligence (AI). For many countries, this momentum intersects with pressing demographic realities: an aging population, a shrinking workforce, and the urgent need to reinvent productivity. We cannot afford to fall behind.

The rise of agentic AI promises to accelerate this transformation. Unlike traditional AI models, agentic AI does not just respond to queries – it reasons, plans, and takes actions across systems. For example, instead of simply answering a question on travel recommendations, an agentic system would book your flights, update your calendar, send reminders, and even adjust your itinerary based on weather or delays – all without being prompted for each step. This marks a shift from passive AI responses to proactive, collaborative systems that work alongside humans.

Agentic AI will require significantly more compute power – not just for single tasks or queries, but for extended workflows that involve reasoning, planning, and continuous adaptation. As the technology matures and adoption expands, the world is effectively adding billions of virtual users to the compute fabric. The question for every country, including Malaysia, is whether its AI infrastructure is ready to support this scale and complexity.

High-performance graphics processing units (GPUs) often dominate AI conversations, especially for training and running large-scale models. But central processing units (CPUs) are just as critical in powering AI systems behind the scenes – handling essential tasks such as data movement, memory management, thread coordination, and orchestrating GPU workloads. In fact, many AI workloads – including language models with up to 13 billion parameters, image recognition, fraud detection, and recommendation systems – can run efficiently on CPU-only servers, particularly when powered by high-performance CPUs like the AMD EPYC 9005 Series processors.

As AI models evolve into more modular architectures, such as the mixture-of-experts systems popularised by DeepSeek and others, the need for smarter resource orchestration grows. CPUs must deliver high instructions per clock (IPC), fast input/output (I/O), and the ability to manage multiple concurrent tasks with precision.

Equally critical is connectivity – the "glue" that binds modern AI systems together. Advanced networking components such as smart network interface controllers (NICs) help route data efficiently and securely between components, offloading traffic from GPUs and reducing latency. High-speed, low-latency interconnects help ensure data flows seamlessly across systems, while scalable fabric ties nodes together into powerful distributed AI clusters.

In the age of agentic AI, heterogeneous system design becomes critical. AI infrastructure must go beyond raw compute – it must integrate CPUs, GPUs, networking, and memory in a flexible and scalable way. Systems built this way can deliver the speed, coordination, and throughput needed to support the rapid, real-time interactions of billions of intelligent agents. As adoption scales, rack-level optimisation – where compute, storage, and networking are tightly co-designed – will be key to delivering the next wave of performance and efficiency.

As AI systems grow more complex and distributed, the need for openness in software, hardware, and systems design becomes a strategic imperative. Closed ecosystems risk vendor lock-in, limit flexibility, and can constrain innovation at a time when adaptability is key to scaling AI. This is why open software stacks like AMD ROCm are essential. ROCm gives developers and researchers the freedom to build, optimise, and deploy AI models across a wide range of environments. It supports popular frameworks like PyTorch and TensorFlow, includes advanced tools for performance tuning, and offers portability across hardware – all available as open source. In the context of Malaysia's ambition to foster innovation across academia, startups, and industry, open AI software offers broader accessibility, faster iteration, and lower barriers to entry.

Similarly vital is openness at the hardware and systems level. As AI compute evolves towards large-scale, heterogeneous deployments, rack-scale architecture becomes foundational. Open standards such as the Open Compute Project (OCP) support modular system design, while emerging collaborations like the Ultra Accelerator Link (UALink) aim to create open, high-bandwidth connections between AI accelerators across servers. Meanwhile, the Ultra Ethernet Consortium (UEC) is defining next-generation networking standards purpose-built for AI – enabling low-latency, high-throughput data movement across distributed systems. These open initiatives give cloud and data centre operators the ability to build flexible, interoperable infrastructure that keeps pace with AI's explosive growth.

For Malaysia, embracing an open ecosystem positions the country to benefit from global innovation while cultivating local differentiation. It enables governments and businesses to build infrastructure that is performant, energy-efficient, and tailored to domestic needs – without being locked into proprietary limitations. In the coming era defined by multi-agent AI, openness is not just a philosophy – it is a prerequisite for scale, sovereignty, and sustained leadership.

As agentic AI reshapes how everything is done, the focus must go beyond GPUs to encompass CPUs, high-speed interconnects, and smart networking – all equally essential for orchestrating the complex, real-time decisions AI agents make at scale. Just as critical is an open ecosystem – with open software like ROCm, industry standards for rack-scale design, and collaborative efforts such as UALink and UEC enabling greater flexibility, faster innovation, and interoperability from edge to cloud. This is why AMD is advancing its vision with Helios – a next-generation rack-scale reference design for AI infrastructure, due for release in 2026, designed to unify high-performance compute, open software, and scalable architecture to meet the demands of agentic AI.

For Malaysia, building open, heterogeneous, and scalable infrastructure like this is more than a technology choice – it is a strategic foundation for national competitiveness. As the country navigates rising automation needs and growing regional AI ambitions, future-ready AI infrastructure will be essential to unlocking sustainable growth, innovation, and resilience.

This article is contributed by Alexey Navolokin, general manager, Apac, AMD.

Beyond the checkout: What's defining Malaysia's retail in 2026

AS Malaysia's retail landscape grows more competitive, payments are no longer just an operational function in the background. They are now the point where trust is earned, customer experience is shaped, and business resilience is tested. Entering 2026, three shifts stand out: how automation is changing the way trust is built, how flexibility is reshaping in-store commerce, and how checkout has become a critical moment where speed, security, and confidence must come together.

Agentic commerce – where AI systems begin to initiate and support transactions on behalf of consumers – is emerging. As these interactions enter the customer journey, they challenge assumptions around how intent, identity, and consent are verified – placing greater importance on payment infrastructure that can assess legitimacy and operate reliably at scale. The industry is increasingly collaborating to establish shared standards for agent-led interactions, ensuring emerging AI capabilities integrate safely and seamlessly into existing commerce operations. This approach prioritises accountability, transparency, and customer needs, while giving businesses continued visibility and control over transactions, payment methods, routing, and data. Crucially, agent-led interactions are designed to plug into the payment, risk, and compliance systems retailers already trust, avoiding fragmented or siloed workflows as automation evolves.

This matters even more in a market like Malaysia, where fraud continues to rise for both consumers and businesses. As automation increases, security cannot be added after the fact. Trust needs to be built directly into real-time transaction flows, using data-driven decisioning to protect customers, support compliance, and maintain conversion without adding friction.

Consumers today expect shopping to be fast, seamless, and flexible – whether in-store or online. In Malaysia, over half of shoppers will abandon a purchase if checkout is slow or their preferred payment method is unavailable. In an environment where convenience is non-negotiable, a clunky or limited checkout experience is often enough to cost retailers the sale. To meet these expectations, retailers are increasingly rethinking how and where payments happen.

According to the Adyen Index 2025 Retail Report, 58% of stores plan to speed up checkout through queue-busting tools, while 31% intend to equip store staff with mobile point-of-sale (MPOS) terminals and Tap to Pay capabilities – enabling faster, more flexible transactions beyond the traditional checkout counter. With payments no longer confined to fixed counters, transactions can now take place in aisles, at café tables, during peak-hour rushes, or at pop-up stores. This mobility gives retailers the flexibility to rethink store layouts, minimise bottlenecks, and create space for more engaging and interactive experiences, while empowering staff to close sales wherever customers are.

These same expectations apply just as strongly online. As shoppers move quickly between platforms and devices, they expect transactions to be quick and secure. Yet traditional online checkouts redirect shoppers away from the retailer's website to potentially unfamiliar external payment gateways – leading to increased drop-offs. To keep pace, more retailers are investing in native checkout experiences, where customers buy products directly within an app or platform (such as social media), making the transaction process smoother while also boosting consumer trust.

As customers shop across more channels and touchpoints, payments sit at the centre of the retail experience. They are where automation meets accountability, where speed meets security, and where operational decisions are most visible to customers. For local businesses, the direction is clear: the future of payments will favour platforms and partners that can support intelligent automation, flexible in-store experiences, and real-time trust without adding friction. In 2026 and beyond, every transaction is more than a transaction. It is a moment to prove reliability, reinforce confidence, and keep customers coming back.
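The real-time, data-driven decisioning described above can be sketched in miniature. The Python fragment below is purely illustrative: the function name, risk signals, and thresholds are hypothetical examples, not any real payment provider's API, but they show the shape of scoring a transaction (including an agent-initiated one) inline rather than bolting security on afterwards.

```python
# Illustrative sketch of real-time transaction decisioning.
# All names, signals, and thresholds here are hypothetical.

def decide(transaction):
    """Score a transaction and return 'approve', 'review', or 'decline'."""
    score = 0
    # Signal 1: an unfamiliar device raises risk.
    if transaction.get("new_device"):
        score += 30
    # Signal 2: agent-initiated purchases need verified consent.
    if transaction.get("agent_initiated") and not transaction.get("consent_verified"):
        score += 50
    # Signal 3: amounts far above the customer's usual spend raise risk.
    if transaction.get("amount", 0) > 10 * transaction.get("avg_amount", float("inf")):
        score += 40
    # Map the combined score to an outcome without blocking good customers.
    if score >= 60:
        return "decline"
    if score >= 30:
        return "review"
    return "approve"

print(decide({"amount": 120, "avg_amount": 100}))                      # approve
print(decide({"new_device": True, "amount": 120, "avg_amount": 100}))  # review
print(decide({"agent_initiated": True, "consent_verified": False,
              "new_device": True}))                                    # decline
```

Real systems replace these hand-set rules with models trained on transaction data, but the design point is the same: the decision runs inside the payment flow, so legitimate customers see no added friction.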

This article is contributed by Soon Yean Lee, country manager (Malaysia), Adyen.
