Microsoft has a lot riding on AI, so much that it's reportedly developing its own superpowered chip designed specifically for training and running its sophisticated chatbot systems. This chip, code-named "Athena," would need Olympus-sized power to channel all of Microsoft's ambitions in the AI space.
On Tuesday, The Information reported, based on two anonymous sources with direct knowledge of the project, that Microsoft's upcoming Athena chip has been in the works since 2019. That year also happens to be when the Redmond, Washington tech giant made its first investment in OpenAI, the maker of ChatGPT and GPT-4. Per the report, the chip is being tested behind the scenes by a small number of Microsoft and OpenAI employees. The chip is reportedly designed to handle both training and running its AI systems (which means it's meant for Microsoft's internal setups, rather than your personal PC).

Gizmodo reached out to Microsoft, which declined to comment. Of course, it makes sense that the Redmond company is trying to develop its own proprietary tech to handle its growing AI ambitions. Ever since it added a ChatGPT-like interface to its Bing app, the company has worked to install a large language model-based chatbot in everything from its 365 apps to Windows 11 itself.

Microsoft is also trying to cut down on the money it's paying other AI chipmakers. Nvidia is the biggest player in this space, with its $10,000 A100 chip taking up more than 90% of the data center GPU market for both training and running AI, according to investor Nathan Benaich in his State of AI report from last October.

Nvidia now has a more advanced chip, the H100, which is supposed to deliver nine times as much training performance as its previous AI training chip, according to the chipmaker. Nvidia claimed last month that OpenAI and Stable Diffusion-maker Stability AI each used the H100 in part to train and run their current-gen AI models. Thing is, the H100 isn't cheap. As first reported by CNBC, the newer GPU has gone for upwards of $40,000 on eBay, when it has previously cost closer to $36,000.

Nvidia declined to comment on Microsoft's plans to stop relying on the company's tech. Though Nvidia maintains the biggest lead in AI training chips, the company probably still wants to keep Microsoft as a customer. Last year, the U.S. introduced new restrictions to keep the company from shipping its A100 and H100 chips to Russia and China. Last month, Nvidia said it was allowing for more cloud-based access to its H100 chips, and that Meta was also jumping on the H100 bandwagon.

Competition and costs are reportedly speeding up Athena's development. Google, the other major tech giant trying to make a statement in the burgeoning AI industry, is also working on its own AI chips. Earlier this month, the company offered more details on its own Tensor Processing Unit supercomputers. The company said it connected several thousand of these chips together to build a machine learning supercomputer, and that this system was used to train its PaLM model, which in turn was used to create its Bard AI.

Google even claimed its chips use two to six times less energy and produce roughly 20 times less CO2 than "contemporary DSAs." Knowing just how much energy it takes to train and run these AI models, Microsoft's new chip will need to contend with the massive environmental cost of proliferating AI.
