ODDO BHF AM: Fund Insight - ODDO BHF Artificial Intelligence
15 Mar 2023
Theme: Funds
Fundhouse: ODDO BHF AM

We believe that we are witnessing the rapid emergence of a huge market for content generation by Artificial Intelligence (now known as "Generative AI"). This anticipation is based on two observations. First of all, never in history has a computer application grown as fast as ChatGPT, proof of the simultaneous (and often competing) efforts of many players to position themselves and capture large market share.

Second, an ecosystem and an infrastructure already exist, making it easy to rapidly deploy new uses and innovative initiatives based on Generative AI. It is therefore possible to anticipate a potentially considerable automation of many tasks performed today by "white-collar workers". Countless companies, all over the world, will go down this road and integrate Generative AI. Indeed, this is a matter of fiduciary duty (they will have to serve their customers with the increased efficiency that AI brings), of social responsibility (more efficiency means less consumption of resources) and, of course, of survival (so as not to be overtaken by competitors bolder in integrating AI).

We would therefore like to devote this Fund Insight to an updated overview of the players most involved in Generative AI, in order to identify the likely winners and potential losers this technology may create.

SHOVEL AND PICKAXE SUPPLIERS FIRST

"During the gold rush, it was not the gold diggers who got richer, but the shovel and pickaxe dealers."

This quote has served as our compass several times in the recent history of technological disruptions. This time around, it could prove a guiding principle. Indeed, Generative AI and the language models on which it is based require major investments. It is therefore essential to understand the key infrastructure segments on which Generative AI will be anchored, in order to identify their respective "shovel and pickaxe suppliers", who stand to benefit from this powerful wave:

Cloud Computing 2.0: The global infrastructure of remote computer server networks, commonly referred to as "cloud computing", did not wait for ChatGPT to experience strong, steady, uninterrupted growth. It did not need to, because the migration of enterprise IT systems to the public cloud and the growing need for infrastructure to handle the explosion of data generated by Internet traffic were already powerful drivers. However, we are convinced that the emergence of Generative AI tools (and the language models that underlie them, each with hundreds of billions of parameters) is changing the game. It represents an additional windfall, on a completely different scale, for public cloud providers. This primarily concerns the three players dominating this market worldwide: Amazon Web Services, Microsoft Azure and Alphabet Cloud. This sudden acceleration of needs and services is paving the way for stage 2.0 of cloud computing.

Artificial intelligence chips: Nvidia, an ultra-dominant player. The American manufacturer of high-performance graphics processors appears today as the big winner of Generative AI. The recent rise in its share price attests to this. This position rests on the unrivalled computing power of Nvidia's chips, due to their unique technical characteristics (in particular their "parallelization"), but also on the unique hardware and software ecosystem that Nvidia provides to its customers. These chips and this ecosystem are de facto a standard for AI. Thus Nvidia, whose chips provide 80% of the power required by Generative AI, is currently the only chip manufacturer able to satisfy Generative AI's ogre-like appetite. And this demand is growing structurally.

For proof: language models such as ChatGPT already require phenomenal computing power to develop. Like top-level athletes, they must be trained (the step known as "training"), i.e. fed and tested on billions of examples and situations. And the computing power required grows exponentially with each new generation of language models. A single query on ChatGPT (an "inference") already requires about 4 times more computing power than an Internet search via Alphabet.

Remember that Gordon Moore, co-founder of Intel, predicted that the power of microprocessors would double every two years. More than fifty years later, this "Moore's law" was still holding! The growing need for computing capacity in AI looks like a new version of this famous law, mainly to the benefit of Nvidia for the moment.
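To give an order of magnitude of what such a doubling law implies (a simple back-of-the-envelope calculation, not a figure drawn from the fund's research): doubling every two years over fifty years means twenty-five successive doublings, i.e.

$$2^{50/2} = 2^{25} \approx 3.4 \times 10^{7},$$

in other words a roughly thirty-million-fold increase in computing power over the period.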

However, other semiconductor players also stand to benefit from this strong growth in demand for computing capacity:

Marvell Technology and AMD: for their presence in the data center chip segment.

Marvell Technology (again) and Broadcom: for their leadership in ASICs (integrated circuits dedicated to specific applications), such as Alphabet's TPU (co-designed by Alphabet and Broadcom).

Among the players in the value chain of computing chips, let us also mention, downstream, the Taiwanese TSMC (for its top-level manufacturing capabilities) and, upstream, the memory manufacturers Samsung Electronics, SK Hynix and Micron Technology (whose high-bandwidth DRAM memories are sub-components of Nvidia's most elaborate chips). Finally, it should be remembered that an artificial intelligence server requires 2 to 3 times more DRAM memory than a conventional server.

High-performance network equipment: necessary for the spread of Generative AI, this concerns routers as well as optical components. Companies such as Arista, Cisco or Juniper Networks (for network equipment), but also Coherent or Lumentum (for the optical part), appear well placed.

MICROSOFT: A CREDIBLE PROSPECT OF SEEING ARTIFICIAL INTELLIGENCE SPREAD TO A LARGE PART OF THE SOFTWARE OFFERING

After an initial $1 billion at its launch, Microsoft will reinvest $10 billion in OpenAI (the company behind ChatGPT) in the coming weeks. By doing so, Microsoft has clearly taken the lead and should be able to deploy the following strategy:

Integrate ChatGPT into its search engine Bing: the goal is to capture part of the windfall that Alphabet enjoys today (each percentage point of market share taken from Alphabet means roughly $2 billion of additional advertising revenue for Microsoft). Alphabet benefits from its quasi-monopolistic, and therefore ultra-profitable, market share in the Internet search business. Microsoft's immediate gains should, in our opinion, remain moderate, but could put some pressure on Alphabet's gross margin in this activity (due to the training and inference costs of Generative AI).

Deploy artificial intelligence modules in a large part of its offer for enterprises: the notion of "copilot", inherent to Generative AI, could enrich entire areas of Microsoft's offer ("Premium" Teams video conferencing, an Office software suite enhanced by AI, etc.). This could also extend to the "GitHub Copilot" tool, which is currently intended for software developers.

Boost the growth of Azure, its public cloud division: already the most powerful on the market in terms of computing capacity, Azure could benefit from the volumes processed by ChatGPT, not to mention those of other Generative AI platforms to come soon (e.g. platforms verticalized by industry or, on the contrary, generalist ones).

Monetize its stake in OpenAI: since this company may not have seen the end of its revaluation yet, the war chest that Microsoft could eventually recover could turn out to be colossal.

FOR WHICH OTHER TECHNOLOGY COMPANIES COULD GENERATIVE AI BE A THREAT AS WELL AS AN OPPORTUNITY?

It is too early to pass definitive judgement on who the clear losers will be. However, the following companies have real challenges to face:

Alphabet: the new competition from Bing (developed by Microsoft) threatens Alphabet's ultra-dominant market share in Internet search (its core business). In our opinion, this risk should be put into perspective. On the other hand, the risk linked to a decline in the profitability of this activity (due to the investments required to counter Bing) is real.

Content creation software publishers (e.g. Adobe or Canva): they could see some of their functionalities cannibalized by the new Generative AI modules that could be integrated into the software suites of more generalist competitors (e.g. Microsoft's Office suite). Generally speaking, the professions linked to creation and design should be disrupted by the content created by Generative AI, especially images.

Publishers of "low code" / "no code" solutions: for software publishers such as Github, Gitlab or Atlassian, Generative AI represents both a powerful productivity gas pedal for their existing offer (it is Artificial Intelligence that generates the lines of code of computer programs) and a threat, through the lowering of barriers to entry to the benefit of newentrants.

Semiconductor design tool vendors: companies like Cadence, Synopsys, Mentor or ARM now see their core business equipped with revolutionary tools thanks to AI. De facto, this makes it technically easier for their customers (Nvidia, Apple and Tesla, to name but three) to vertically integrate design and verification functions.