• AutoTL;DRB
    11 years ago

    This is the best summary I could come up with:


    Meta has been snapping up AI training chips and building out data centers in order to create a more powerful new chatbot it hopes will be as sophisticated as OpenAI’s GPT-4, according to The Wall Street Journal.

    The Journal writes that Meta has been buying more Nvidia H100 AI-training chips and is beefing up its infrastructure so that, this time around, it won’t need to rely on Microsoft’s Azure cloud platform to train the new chatbot.

    The company reportedly assembled a group earlier this year to build the model, with the goal of speeding up the creation of AI tools that can emulate human expressions.

    A June leak claimed there was an Instagram chatbot with 30 personalities being tested, which sounds a lot like the unannounced AI “personas” the company is said to be launching this month.

    Meta has reportedly seen heavy AI researcher turnover this year, driven by disputes over how computing resources were split between multiple LLM projects.

    OpenAI said in April that it wasn’t training a GPT-5 and “won’t for some time,” but Apple has reportedly been dumping millions of dollars daily into its own “Ajax” AI model that it apparently thinks is more powerful than even GPT-4.


    The original article contains 301 words, the summary contains 197 words. Saved 35%. I’m a bot and I’m open source!