---
categories:
- AI and Machine Learning
date: '2023-12-20 06:37:14'
draft: false
preview: /social/74dec459b08c7d33313051d560eae8cb93f954c96c2d904514120502112e370a.png
tags:
- nlp
title: NLP in the Post-LLM World
type: posts
url: /2023/12/20/nlp-in-the-post-llm-world/
---

<!-- wp:paragraph -->
<p>I really enjoyed diving into Seb Ruder's <a href="https://nlpnewsletter.substack.com/p/nlp-research-in-the-era-of-llms">latest NLP Newsletter</a>, which focuses on all the areas of NLP that are still in desperate need of attention in a post-LLM world.</p>
<!-- /wp:paragraph -->

<!-- wp:quote -->
<blockquote class="wp-block-quote"><!-- wp:paragraph -->
<p>In an era where running state-of-the-art models requires a garrison of expensive GPUs, what research is left for academics, PhD students, and newcomers to NLP without such deep pockets?</p>
<!-- /wp:paragraph -->

<!-- wp:paragraph -->
<p>...while massive compute often achieves breakthrough results, its usage is often inefficient. Over time, improved hardware, new techniques, and novel insights provide opportunities for dramatic compute reduction...</p>
<!-- /wp:paragraph --></blockquote>
<!-- /wp:quote -->

<!-- wp:paragraph -->
<p>I wrote about some of the same issues in my post <a href="https://brainsteam.co.uk/2023/03/25/nlp-is-more-than-just-llms/">NLP is more than just LLMs</a> earlier this year, and <a href="https://brainsteam.co.uk/2023/12/05/carbon-footprint-of-generative-models/">I recently speculated</a> about how the current industry AI darlings, sexy scale-up companies very much in "growth" mode as opposed to incumbents in "cost-saving" mode, are just not incentivised to be compute-efficient.</p>
<!-- /wp:paragraph -->

<!-- wp:paragraph -->
<p>If you are just starting out in this space, there are plenty of opportunities and lots of problems to solve - particularly around trust, reliability and energy efficiency.</p>
<!-- /wp:paragraph -->