Why Japan is lagging behind in generative A.I. — and how it can create its own large language models
Japan is already behind other countries when it comes to developing its own generative artificial intelligence algorithms.
Since OpenAI made waves with its chatbot ChatGPT, generative AI has been the hottest topic in tech. Goldman Sachs research suggests that advances in generative AI could boost global GDP by 7% over the next decade, or almost $7 trillion.
Generative AI relies on large language models, such as those behind ChatGPT and Baidu’s Ernie Bot, which process vast data sets to generate text and other content. Noriyuki Kojima, co-founder of Japanese LLM startup Kotoba Technology, said Japan is currently falling behind the U.S., China, and the EU in developing these algorithms.
According to research from a consortium of state-run institutes, at least 79 LLMs have been launched in China in the past three years, including those from Alibaba and Tencent. Kojima credited U.S. companies such as OpenAI, Microsoft, Google and Meta with propelling LLM advancements in the country.
Japan lagging behind in generative AI
Japan’s LLM development, however, lags behind that of the U.S., China, and Europe.
As Kojima noted, Japan’s trailing position is largely the result of its comparative shortcomings in deep learning and software development.
To develop the necessary infrastructure and applications for deep learning, Kojima said a “robust community of software engineers” is needed. The Ministry of Economy, Trade and Industry estimates that Japan will face a shortage of roughly 789,000 software engineers by 2030. Japan is ranked 28th out of 63 countries in terms of technological knowledge in the IMD World Digital Competitiveness Ranking.
Moreover, Japan faces hardware challenges, since LLMs must be trained on AI supercomputers such as IBM’s Vela and the machine Microsoft built for its Azure cloud. As reported by Nikkei Asia, no private company in Japan has a “world-class machine” with those capabilities.
According to Kojima, government-run supercomputers such as Fugaku hold the key to Japan’s pursuit of LLMs, because access to such large-scale machines has traditionally been the most significant bottleneck in LLM development.
How Japan’s supercomputers can help
In May, Fujitsu announced that two Japanese universities will use Fugaku to develop LLMs based on Japanese data in collaboration with the supercomputer’s developers.
According to Fujitsu, the organizations plan to publish their research results in 2024 to assist Japanese researchers and engineers in developing LLMs.