

That was an interesting Substack article. I’m not super deep into the AI stuff and had never heard of Gary Marcus. I agree that they went with scaling LLMs first because (1) it’s easier to scale than to tie in new ways of doing things, and (2) companies like Nvidia were in line to make a ton of money as crypto mining started to fall out of favor and real-time ray tracing wasn’t giving them as big an advantage as they’d hoped.
They are still making a ton of dough!
Any alternatives? Lately I just send links to ChatGPT and ask for detailed summaries.