fallingdowndizzyvr@alien.top to LocalLLaMA · English · 2 years ago

Microsoft announced the Maia 100 AI Accelerator Chip. It's also expanding the use of the AMD MI300 in its datacenters. Is this the beginning of the end of CUDA dominance?

news.microsoft.com

With a systems approach to chips, Microsoft aims to tailor everything ‘from silicon to service’ to meet AI demand - news.microsoft.com
Microsoft unveils two custom chips, new industry partnerships and a systems approach to Azure hardware optimized for internal and customer workloads
  • Material_Policy6327@alien.top · 1 point · 2 years ago

    Would be nice to have some options in this space for sure

LocalLLaMA (!localllama@poweruser.forum)

Community to discuss about Llama, the family of large language models created by Meta AI.


  • 4 users / day
  • 4 users / week
  • 4 users / month
  • 4 users / 6 months
  • 1 local subscriber
  • 11 subscribers
  • 1.02K Posts
  • 5.82K Comments