
The Interconnection of AI and APIs

with Aki Ranin, AI [r]ecursive

Aki Ranin, co-founder of Bambu, published author and AI transformation consultant, discusses the interconnection between AI and APIs. He shares his journey in the AI data science space and explains how AI and APIs are intricately linked. Aki highlights the potential of large language models and AI agents in transforming industries and making AI-assisted tasks more efficient. He also discusses the challenges of discoverability and the importance of metadata in making information accessible to AI agents. Aki provides recommendations for individuals looking to understand the trajectory of AI and APIs.

When Jon Scheele sat down with Aki Ranin for this episode of TheLoopAsia, the conversation centred on a deceptively simple idea: that AI and APIs are not parallel trends but a single, inseparable phenomenon. Two years on, with Aki preparing to keynote at apidays Singapore 2026, that idea has never felt more relevant.


Aki brings an unusual vantage point to this conversation. Finnish by birth, Singaporean by choice, he trained as a computer scientist at Aalto University before spending years in robotics, ERP software, and UX — a winding path that eventually led him to co-found Bambu, a robo-advisory platform built on the premise that sophisticated financial intelligence could be packaged and delivered through APIs. It was, in hindsight, an early experiment in exactly what the industry is now racing to build: AI capability made accessible through clean integration interfaces.


The most enduring insight from this episode is Aki's framing of large language models as connective tissue. Where earlier AI systems were point solutions — smart at one thing, isolated from everything else — LLMs introduced something qualitatively different: the ability to act as a dynamic layer between a user's intent and a constellation of underlying services. Aki described the Bambu platform as a collection of intelligent APIs, each doing a specific job in financial planning. What LLMs added was the glue — the capacity to receive a natural language request and orchestrate the right combination of those APIs in response, without requiring a human to navigate through screens and menus to get there.
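The "glue" pattern Aki describes can be sketched in a few lines: each backend capability is registered as a tool with machine-readable metadata, and the model's role shrinks to choosing which tools to invoke for a natural-language request. The endpoint names below (risk profiling, goal projection) are illustrative stand-ins for Bambu-style financial-planning APIs, not their actual interfaces, and the "plan" is hard-coded where a real deployment would take it from an LLM's tool-calling output.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Tool:
    name: str
    description: str          # machine-readable metadata the model reads
    func: Callable[..., dict]

# Hypothetical stand-ins for intelligent financial-planning APIs.
def risk_profile(age: int) -> dict:
    return {"risk": "moderate" if age < 50 else "conservative"}

def goal_projection(monthly: float, years: int) -> dict:
    return {"projected": round(monthly * 12 * years * 1.04, 2)}

REGISTRY = {
    "risk_profile": Tool("risk_profile", "Assess investor risk tolerance", risk_profile),
    "goal_projection": Tool("goal_projection", "Project savings toward a goal", goal_projection),
}

def orchestrate(plan: list[tuple[str, dict]]) -> dict:
    """Execute the sequence of tool calls an LLM planner would emit."""
    results = {}
    for tool_name, args in plan:
        results[tool_name] = REGISTRY[tool_name].func(**args)
    return results

# In production this plan would come from the model's tool-calling output;
# it is hard-coded here to keep the sketch self-contained.
result = orchestrate([
    ("risk_profile", {"age": 42}),
    ("goal_projection", {"monthly": 500.0, "years": 10}),
])
print(result)
```

The point of the pattern is that no screen or menu sits between the user's intent and the APIs: the registry's descriptions are what the model reads, and the orchestration layer is what it drives.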


This is not merely a technical observation. It has direct implications for how enterprises should think about their AI investments. Aki was clear that the infrastructure underneath the agent matters as much as the agent itself. A well-designed AI interface sitting on top of fragmented, poorly documented, or inaccessible systems will underperform — not because the AI is weak, but because it has nowhere useful to go.


That thread runs directly into Aki's recent Substack article, How to Connect AI Agents To Your Business, published in February 2026. In it, he maps out three emerging architectural models for enterprise AI agents — Closed Agents, Platform Agents, and Screenless Agents — and argues that the real unlock is not the agent but the integration layer connecting it to an organisation's systems of record: its CRM, data warehouse, ERP, and the accumulated SaaS tools that most enterprises have built up over the years. His conclusion is practical and actionable: give agents access to your APIs, start with read-only permissions, and build from there. The article is essential reading for any technology leader thinking seriously about agent deployment in 2026.
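Aki's "start with read-only permissions" advice can be made concrete with a small gate in front of the agent's API access: safe, read-style HTTP methods pass through, and anything that mutates state is refused until trust is established. This is a minimal sketch of the idea, not a real SDK or gateway product.

```python
# Safe, read-only HTTP methods per standard HTTP semantics.
ALLOWED_METHODS = {"GET", "HEAD", "OPTIONS"}

class ReadOnlyGate:
    """Permission gate placed between an agent and an organisation's APIs."""

    def __init__(self, allowed=ALLOWED_METHODS):
        self.allowed = set(allowed)

    def check(self, method: str, path: str) -> bool:
        # Reads pass; writes (POST, PUT, PATCH, DELETE) are blocked for now.
        return method.upper() in self.allowed

gate = ReadOnlyGate()
print(gate.check("GET", "/crm/contacts"))    # reads allowed
print(gate.check("POST", "/crm/contacts"))   # writes denied at this stage
```

Widening the `allowed` set endpoint by endpoint is then the "build from there" step: write access is granted deliberately, once the agent's read-only behaviour has been observed.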


Aki also touches on the challenge of discoverability — the question of how AI agents find and understand the services available to them. Just as the early web required metadata and search to make content accessible to humans, the agentic web will require well-structured, machine-readable API documentation and interfaces. Organisations that invest in this now will have a meaningful advantage as agent adoption accelerates.
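What machine-readable documentation buys you can be shown with a small sketch: deriving the tool descriptions an agent needs directly from an OpenAPI-style spec. The spec fragment below is hypothetical; the takeaway is that structured metadata (summaries, parameter schemas) is what makes an endpoint legible to an agent rather than only to a human reading docs.

```python
# Hypothetical OpenAPI-style fragment for a single endpoint.
SPEC = {
    "paths": {
        "/portfolio/{id}": {
            "get": {
                "summary": "Fetch a portfolio by id",
                "parameters": [{"name": "id", "in": "path", "schema": {"type": "string"}}],
            }
        }
    }
}

def tools_from_spec(spec: dict) -> list[dict]:
    """Flatten an OpenAPI-style spec into agent-consumable tool descriptions."""
    tools = []
    for path, methods in spec["paths"].items():
        for method, op in methods.items():
            # Build a stable tool name from the method and path.
            slug = path.strip("/").replace("/", "_").replace("{", "").replace("}", "")
            tools.append({
                "name": f"{method}_{slug}",
                "description": op.get("summary", ""),
                "parameters": op.get("parameters", []),
            })
    return tools

tools = tools_from_spec(SPEC)
print(tools)
```

An endpoint with an empty `summary` and untyped parameters flattens into a tool the agent cannot reason about, which is exactly the discoverability gap Aki warns against.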


The conversation closes with career advice that applies equally to individuals and organisations: the question is not just where AI is today, but where it is heading. Building strategies around current limitations is risky. The more durable investment is in the foundational infrastructure — the integration layer — that will remain valuable regardless of which model or platform sits on top of it.


Aki Ranin returns to the apidays Singapore stage on 14–15 April 2026, where his keynote will build on these ideas. You can read his article here: How to Connect AI Agents To Your Business

apidays Singapore returns, 14–15 April 2026

If you attended a previous apidays Singapore, you know the energy of the community. We’re bringing it back on 14–15 April 2026 with a focus on AI-readiness, API strategy, platform engineering, and cybersecurity.

Whether you’re building APIs, consuming them, or want to connect your AI Agents to your existing services — this is the place to connect with practitioners across Asia-Pacific who are navigating the same challenges.

Register / Learn more

See Aki Ranin's talks at apidays Singapore

Building an AI Operating System

Aki Ranin's talk at apidays Singapore 2024

The Great Fintech Convergence

Aki Ranin's talk at apidays Singapore 2020

Searching for Scale through APIs

Aki Ranin's talk at apidays Singapore 2019

powered by blue connector

API Strategy and Tech Advisory, Training and Events

We connect your organisation, your customers, partners, and suppliers with the information and knowledge you need to make your tech work for you.

 Learn more