Early deployment highlights Korea’s push to strengthen domestic AI research infrastructure.
Kakao said it will build and operate large-scale GPU infrastructure to support South Korea’s artificial intelligence research and development after being selected as the final operator of a government-led GPU procurement project. The initiative is part of a national strategy to provide advanced computing resources to the private sector as Korea aims to position itself among the world’s top three AI powers.
Under the project, Kakao will secure and operate a total of 2,424 high-performance GPUs, including NVIDIA B200 models, on a commissioned basis for five years. The company said the infrastructure will be used to support domestic AI research across industry, academia, and public research institutions.
Faster-than-planned infrastructure deployment
Kakao said it has already completed the bulk of the infrastructure build at Kakao Data Center Ansan in Ansan, Gyeonggi Province, moving well ahead of its original schedule. The company has deployed 255 computing nodes, or 2,040 GPUs, representing about 84% of the total allocation.
This is more than four times the company’s original year-end construction target, underscoring how quickly the project has progressed since Kakao was selected as the operator in August. The remaining GPUs are expected to be installed as the project moves into its next phase.
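The reported figures line up with a simple back-of-envelope check. The short Python sketch below reproduces the arithmetic; the 8-GPUs-per-node value is inferred from the 255-node, 2,040-GPU split rather than stated by Kakao.

```python
# Back-of-envelope check of the reported deployment figures.
# The 8-GPUs-per-node value is inferred from the 255-node / 2,040-GPU split.

TOTAL_GPUS = 2_424        # total allocation under the project
DEPLOYED_NODES = 255      # nodes installed at Kakao Data Center Ansan
GPUS_PER_NODE = 8         # inferred, not stated by Kakao

deployed_gpus = DEPLOYED_NODES * GPUS_PER_NODE   # 2,040
share_of_total = deployed_gpus / TOTAL_GPUS      # ~0.84

print(f"Deployed GPUs: {deployed_gpus}")
print(f"Share of total allocation: {share_of_total:.0%}")  # about 84%
```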
Factors behind the early rollout
Kakao attributed the accelerated schedule to a combination of in-house capabilities and tighter project management. According to the company, key factors included:
- Established data center infrastructure capable of supporting high-density GPU servers
- Prior experience in building and operating GPU clusters
- Early coordination with GPU suppliers to secure critical equipment
- Advance technical verification, including proof-of-concept testing, to reduce operational risks
By addressing potential performance and stability issues early, Kakao said it was able to speed up deployment without compromising reliability.
Power and cooling as a core consideration
High-performance GPUs place heavy demands on power and cooling systems, and Kakao said these factors were addressed early in the Ansan data center build-out. The facility was prepared with reinforced power supply systems to support continuous operation of dense GPU clusters.
To manage heat more efficiently, Kakao implemented a hot aisle containment system, which separates hot exhaust air from cooler intake air and channels the exhaust directly toward cooling units. The company said this setup improves cooling efficiency and helps maintain stable operating conditions for large-scale AI workloads.
Software environment for AI developers
In addition to hardware, Kakao is setting up a software environment designed to let researchers focus on AI development rather than system operations. The company will link its platform with the national AI computing resource support portal, so that users can access Kakao Cloud, operated by Kakao Enterprise, through a unified interface.
Kakao is also offering Kubeflow, a cloud-native AI platform designed to support the full machine learning lifecycle. Built on Kubernetes, the platform supports model development, training, deployment, and inference, allowing researchers to automate workflows and manage computing resources more efficiently.
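As a rough illustration of what that lifecycle support looks like in practice, the sketch below defines a minimal single-step pipeline with the Kubeflow Pipelines SDK (kfp v2). The component, pipeline, and parameter names are placeholders chosen for this example, not details of Kakao's environment.

```python
# Minimal Kubeflow Pipelines (kfp v2) sketch: one placeholder training step.
# All names and parameters here are illustrative, not Kakao's actual setup.
from kfp import dsl, compiler


@dsl.component(base_image="python:3.11")
def train_model(epochs: int, learning_rate: float) -> str:
    """Placeholder training step; a real component would load data and train a model."""
    return f"trained for {epochs} epochs at lr={learning_rate}"


@dsl.pipeline(name="example-training-pipeline")
def training_pipeline(epochs: int = 3, learning_rate: float = 1e-3):
    task = train_model(epochs=epochs, learning_rate=learning_rate)
    # Request one GPU for the step; the accelerator label depends on the cluster.
    task.set_accelerator_type("nvidia.com/gpu")
    task.set_accelerator_limit(1)


if __name__ == "__main__":
    # Compile to a YAML spec that can be uploaded to a Kubeflow Pipelines endpoint.
    compiler.Compiler().compile(training_pipeline, "training_pipeline.yaml")
```

Compiling produces a portable pipeline spec that can be uploaded through the Kubeflow UI or submitted programmatically, which is the kind of workflow automation the platform is meant to provide.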
Access for industry, academia, and research
Kakao said it is currently conducting network and performance tests on the deployed nodes. Starting in January next year, the company plans to begin providing GPU resources to projects selected by the Ministry of Science and ICT and the National IT Industry Promotion Agency.
The initial rollout will focus on industry–academia–research collaboration projects, reflecting the government’s aim to spread AI infrastructure access beyond large corporations to a wider research community.
Outlook and implications
“Stable deployment and operation of large-scale GPU infrastructure is essential to AI competitiveness,” said Kim Se-woong, AI Synergy Performance Leader at Kakao. He added that Kakao plans to contribute to the growth of Korea’s AI ecosystem by providing a reliable development environment based on its data center and cloud capabilities.
While the project highlights strong execution on infrastructure, its long-term impact will depend on how effectively researchers and companies can use the resources to produce competitive AI models and applications. As global demand for AI computing continues to rise, Kakao’s role as an infrastructure operator places it at the center of Korea’s broader effort to translate public investment in GPUs into tangible research and industrial outcomes.