
My main takeaway is that generative AI has hit a wall. New paradigms, architectures, and breakthroughs are necessary for the field to progress, but this raises the question: if everyone knows the current paradigms have hit a wall, why is so much money being spent on LLMs, diffusion models, etc., which are bound to become obsolete within a few(?) years?

