How is Google revolutionizing Apple’s AI with its tailor-made chips? Discover the secret behind Gemini!

IN BRIEF

  • Google uses custom chips to optimize the performance of its artificial intelligence.
  • The Gemini project is at the heart of this revolution, aiming to surpass Apple’s offerings.
  • These advanced technologies enable faster and more efficient data processing.
  • Google emphasizes a multidisciplinary approach to strengthen its AI capabilities.
  • The battle between Google and Apple is intensifying in the field of artificial intelligence.

In a world where artificial intelligence is taking an increasingly central place, competition between the technology giants is intensifying. Google, with its ambitious Gemini project, is radically transforming the AI landscape, pushing Apple to rethink its own strategies. At the heart of this revolution, Google’s custom chips are proving to be a key element, optimizing performance and increasing the efficiency of AI systems. Let’s discover together the foundations of this innovation and its impact on the Apple ecosystem.

The beginnings of Google’s custom chips

At the heart of Google’s headquarters in Mountain View, California, hundreds of server racks hum as they perform tasks far removed from simply running the search engine. They run tests on Google’s tensor processing units (TPUs), special chips developed to improve the performance of artificial intelligence models.

The advantages of Google TPUs for Apple

In July 2024, Apple revealed that it uses Google’s TPUs to train its AI models. This collaboration allows Apple to strengthen its capabilities in the field of artificial intelligence, particularly with Apple Intelligence. Google also uses TPUs to train and run its own AI chatbot, Gemini.
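Neither Apple nor Google has published the training code in question, but a minimal, hypothetical sketch in JAX (a framework Google commonly pairs with TPUs) gives a sense of the kind of workload involved. The model, parameter names, and learning rate below are illustrative assumptions, not anyone’s production code; the same step runs unchanged on CPU, GPU, or Cloud TPU backends.

```python
# Minimal illustrative training step in JAX (hypothetical toy model and names);
# XLA compiles it for whatever accelerator is attached, including Cloud TPUs.
import jax
import jax.numpy as jnp

def predict(params, x):
    # A toy linear model standing in for a real network.
    w, b = params
    return x @ w + b

def loss_fn(params, x, y):
    # Mean squared error between predictions and targets.
    return jnp.mean((predict(params, x) - y) ** 2)

@jax.jit  # compiled once, then executed on the available backend
def train_step(params, x, y, lr=0.01):
    grads = jax.grad(loss_fn)(params, x, y)
    # Plain gradient descent over the parameter pytree.
    return jax.tree_util.tree_map(lambda p, g: p - lr * g, params, grads)

key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (128, 16))
y = jax.random.normal(key, (128, 1))
params = (jnp.zeros((16, 1)), jnp.zeros((1,)))
params = train_step(params, x, y)
```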

How TPUs changed the game

TPUs are application-specific integrated circuits (ASICs) designed to perform large-scale matrix multiplications quickly. With the arrival of the sixth generation, called Trillium, Google promises even greater performance. Since launching TPUs in 2015, Google has dominated the AI accelerator market with a 58% share.
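To make “large-scale matrix multiplication” concrete, here is a minimal JAX sketch (the matrix sizes and function name are assumptions for illustration, not Google internals). The jit-compiled product below is exactly the kind of operation a TPU’s matrix units accelerate, and the compiler dispatches it to a TPU automatically when one is attached, falling back to GPU or CPU otherwise.

```python
# Illustrative sketch: a large bfloat16 matrix multiplication, the core
# workload TPUs are built to accelerate.
import jax
import jax.numpy as jnp

@jax.jit  # compiled via XLA, which targets TPUs, GPUs, or CPUs
def matmul(a, b):
    return jnp.dot(a, b)

key = jax.random.PRNGKey(0)
k1, k2 = jax.random.split(key)
# bfloat16 is the numeric format TPUs handle natively.
a = jax.random.normal(k1, (4096, 4096), dtype=jnp.bfloat16)
b = jax.random.normal(k2, (4096, 4096), dtype=jnp.bfloat16)

print(jax.devices())                     # lists TPU devices on a TPU host
c = matmul(a, b).block_until_ready()     # wait for the accelerator to finish
print(c.shape)                           # (4096, 4096)
```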

Collaboration and chip production

The complexity of these chips requires major collaborations. Google works with Broadcom and Taiwan Semiconductor Manufacturing Company (TSMC) for the production and assembly of its chips. These partnerships ensure that Google can meet the growing demands of AI.

Impact on cloud and revenue

Google’s AI solutions have had a significant impact on Google Cloud’s revenue. Alphabet, Google’s parent company, reported a 29% increase in cloud revenue, surpassing $10 billion for the first time.

Environmental objectives

To reduce its carbon footprint, Google uses direct chip cooling techniques that use much less water. This approach addresses growing environmental concerns related to data centers and the high energy requirements of AI.

| Element              | Details                              |
|----------------------|--------------------------------------|
| Specialized chips    | Google TPUs, designed for AI         |
| Collaboration        | Apple uses Google TPUs               |
| Market share         | 58% of custom AI accelerators        |
| TPU generations      | Sixth generation, Trillium           |
| Partners             | Broadcom, TSMC                       |
| Environmental impact | Greener cooling techniques           |
  • The collaboration between Apple and Google has strengthened Apple’s AI capabilities using Google TPUs.
  • TPUs represent 58% of the market share of custom AI accelerators.
  • Google works with Broadcom and TSMC to produce its advanced chips.
  • Direct chip cooling reduces Google’s carbon footprint.
  • Alphabet reported a 29% increase in cloud revenue thanks to AI.