SINGAPORE, July 26, 2024 – The global edge AI market, valued at $14,787.5 million in 2022, is forecast to expand at a compound annual growth rate (CAGR) of 21.0% from 2023 to 2030. This rapid growth is driven by the increasing adoption of edge AI workloads and infrastructure across diverse industries, improving cloud resource efficiency and ensuring data sovereignty.
Luxembourg-headquartered Gcore stands at the forefront of this market with its extensive global network of over 180 edge nodes across six continents, including more than 25 cloud locations. Trusted by public organizations, telecommunications firms, and global corporations, Gcore delivers scalable and reliable edge AI, cloud, network, and security solutions. Its cloud infrastructure is designed for the edge, facilitating both the training of large language models (LLMs)—a market valued at over $200 billion—and the inference of AI applications at the edge.

Speaking to AsiaBizToday at a select press gathering in Singapore earlier this month, Andre Reitenbach, CEO of Gcore, said, “We are on the cusp of an AI revolution that will transform how companies operate. Gcore is perfectly positioned to connect the world to AI, anywhere and anytime, by delivering innovative AI, cloud, and edge solutions.”
$60 million Series A Funding
On July 23, Gcore announced that it had raised $60 million in a Series A funding round. This pivotal investment, spearheaded by Wargaming with additional backing from Constructor Capital and Han River Partners, marks the company’s first external raise since its inception over a decade ago. With this new funding, Gcore aims to advance AI innovation within its solutions and technology, particularly through investment in cutting-edge AI servers powered by NVIDIA GPUs.
Fabrice Moizan, Chief Revenue Officer at Gcore said, “Over the past decade, Gcore has established itself as a global leader in Edge networking with a primary focus on optimizing latency and automation for CDN, security, and Edge Cloud services. We have an unwavering commitment to delivering innovative solutions for Edge AI by leveraging our expertise and deploying cutting-edge AI solutions worldwide, ultimately shaping the future of AI at the Edge.”

Edge AI
Edge artificial intelligence (edge AI) refers to the deployment of AI algorithms close to the data source or user. This approach allows data processing and inference to occur without relying on central cloud servers, resulting in faster and more efficient AI inference and lower latency for end users. Keeping data local also offers security and data privacy benefits.
Edge AI allows businesses to perform high-performance computing and deep learning closer to end-user devices. Reducing the distance data must travel improves response times and efficiency, a paradigm shift for industries where AI speed and reliability are mission-critical.
Transitioning AI inferencing from the cloud to the edge enhances real-time decision-making by bringing data processing closer to data sources. For businesses, this shift significantly reduces latency, directly improving user experience through near-instant content delivery and real-time interaction.
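To make the contrast concrete, the minimal Python sketch below compares local inference on an edge node with sending the same request to a distant cloud endpoint. It is purely illustrative and not tied to any vendor: the model file (model.onnx) and the cloud URL are hypothetical placeholders.

```python
import time
import numpy as np
import onnxruntime as ort  # pip install onnxruntime
import requests

# Hypothetical pre-trained model already deployed on this edge node.
EDGE_MODEL_PATH = "model.onnx"
# Hypothetical central cloud inference endpoint, far from the end user.
CLOUD_ENDPOINT = "https://cloud.example.com/v1/predict"

# Example input: one 224x224 RGB image batch, as many vision models expect.
sample = np.random.rand(1, 3, 224, 224).astype(np.float32)

# --- Edge path: inference runs next to the data source ---
session = ort.InferenceSession(EDGE_MODEL_PATH)
input_name = session.get_inputs()[0].name

start = time.perf_counter()
edge_output = session.run(None, {input_name: sample})
edge_ms = (time.perf_counter() - start) * 1000
print(f"Edge inference: {edge_ms:.1f} ms (no network round trip)")

# --- Cloud path: the same request travels to a remote data centre ---
start = time.perf_counter()
response = requests.post(CLOUD_ENDPOINT, json={"inputs": sample.tolist()}, timeout=10)
cloud_ms = (time.perf_counter() - start) * 1000
print(f"Cloud inference: {cloud_ms:.1f} ms (includes network latency)")
```

In practice the edge path avoids the wide-area round trip entirely, which is where most of the latency saving described above comes from.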
Gcore Inference at the Edge: Revolutionizing AI Deployment
In June, Gcore launched Gcore Inference at the Edge, an innovative solution designed to provide ultra-low latency experiences for AI applications. This solution enables the distributed deployment of pre-trained machine learning (ML) models to edge inference nodes, ensuring real-time inference and seamless performance.
The new service leverages Gcore’s global network of 180+ edge nodes, interconnected by sophisticated low-latency smart routing technology. Each high-performance node is strategically placed close to end users, ensuring response times of under 30 milliseconds. Gcore Inference at the Edge runs on NVIDIA L40S GPUs, offering cost-effective, scalable, and secure AI model deployment.
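Gcore has not published a code-level API in this announcement, so the Python sketch below is only a hypothetical illustration of how an application might exercise such a service: it assumes a placeholder endpoint (inference.example-edge.net) that smart routing resolves to the nearest node, and checks whether observed round trips stay within the sub-30-millisecond target cited above.

```python
import time
import statistics
import requests

# Hypothetical endpoint; in a real deployment, smart routing would
# resolve this hostname to the nearest edge inference node.
EDGE_ENDPOINT = "https://inference.example-edge.net/v1/models/my-model:predict"
TARGET_MS = 30  # sub-30 ms response target cited in the announcement


def measure_round_trip(payload: dict, runs: int = 20) -> float:
    """Send repeated inference requests and return the median latency in ms."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        requests.post(EDGE_ENDPOINT, json=payload, timeout=5)
        timings.append((time.perf_counter() - start) * 1000)
    return statistics.median(timings)


if __name__ == "__main__":
    median_ms = measure_round_trip({"inputs": [[0.1, 0.2, 0.3]]})
    status = "within" if median_ms <= TARGET_MS else "above"
    print(f"Median round trip: {median_ms:.1f} ms ({status} the {TARGET_MS} ms target)")
```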
At the launch, Andre Reitenbach stated, “Gcore Inference at the Edge empowers customers to focus on getting their machine learning models trained, rather than worrying about the costs, skills, and infrastructure required to deploy AI applications globally. At Gcore, we believe the edge is where the best performance and end-user experiences are achieved, and that is why we are continuously innovating to ensure every customer receives unparalleled scale and performance.”

Strategic Partnership with NHN Cloud
In a strategic move to bolster its presence in the Asia Pacific region, Gcore announced a partnership this April with NHN Cloud, a subsidiary of South Korean technology and gaming giant NHN Group. The partnership aims to integrate Gcore’s pioneering cloud services, including CDN, 5G eSIM, and accelerated AI processing, into the NHN Cloud ecosystem, providing APAC enterprises with access to Gcore’s advanced technologies.
In return, Gcore will serve as NHN Cloud’s strategic partner in Europe, the Middle East, and Africa, facilitating the expansion of NHN Cloud’s customer base into these regions. The collaboration also lays the groundwork for the joint development of new solutions and commercial initiatives.
With these strategic initiatives and innovative solutions, Gcore is set to continue its leadership in the global edge AI market, driving growth and transformation for businesses worldwide.