
Weekend Briefing No. 21

Big updates to NumPy are proposed, and what are the 7 "must-have" features for crafting custom LLMs?
Photo by David Menidrey / Unsplash

Good Saturday morning! Welcome to this weekend's Briefing. This week we learn that NumPy is proposing some big enhancements for version 2.0, find out what the 7 must-have features for crafting custom LLMs are, and ask whether a search engine augmented with LLMs beats plain old search. Plus, a special infographic for Halloween!

Interesting data points

NumPy proposed enhancements

If you're a data scientist or an analyst who codes in Python, chances are you've used NumPy. NumPy is a numerical computing package for the scientific community. It's open source, widely used, and supports a wide range of hardware and computing platforms.
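For anyone who hasn't touched it yet, here's a minimal sketch of the vectorized array operations that make NumPy the workhorse it is: arithmetic applies element-wise across whole arrays, with no Python loops.

```python
import numpy as np

# Vectorized arithmetic: operations apply element-wise across arrays.
a = np.array([1.0, 2.0, 3.0])
b = np.array([10.0, 20.0, 30.0])

total = a + b    # element-wise sum -> array([11., 22., 33.])
dot = a @ b      # dot product: 1*10 + 2*20 + 3*30 = 140.0
mean = b.mean()  # aggregate over the array -> 20.0
```

Under the hood these operations run in compiled C code, which is exactly why the C-API changes discussed below matter for performance.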

Now the maintainers and developers of the package want to clean it up for its version 2.0 release, tentatively scheduled for December 2023!

NumPy 2.0 development status & announcements · Issue #24300 · numpy/numpy
The purpose of this issue is to serve as a brief “umbrella issue” which (a) links out to some key design proposals and other places where design changes and guidance for the 2.0 release are describ…

Here are some proposed enhancements:

The proposed changes to the NumPy C-API and its cleanup are the ones I'm most interested in, because of the efficiencies that can be gained. I'm looking forward to this next release!

7 Must-have features for crafting custom LLMs

What a fantastic article highlighting the 7 must-haves for your custom LLM and RAG applications. This is a must-read for anyone trying to build an LLM application in a highly competitive market.

7 Must-Have Features for Crafting Custom LLMs
Subscribe • Previous Issues Keys to a Robust Fleet of Custom LLMs The rising popularity of Generative AI is driving companies to adopt custom large language models (LLMs) to address concerns about intellectual property, and data security and privacy. Custom LLMs can safeguard proprietary data while…
Customizing an LLM isn't just about technical finesse; it’s about aligning technology with real-world applications. 

The key features are:

  1. Having a versatile and adaptive tuning toolkit
  2. Human-integrated customization
  3. Data augmentation and synthesis
  4. Facilitation of experimentation
  5. Distributed computing accelerator
  6. Unified lineage and collaboration suite
  7. Excellence in documentation and testing

FreshLLMs: Refreshing large language models with search engine augmentation

Researchers have shown that LLMs augmented with Google Search generate better answers than a plain old Google Search alone. They created FRESHQA, a new dataset of 600 questions that evaluates a broad range of reasoning abilities, and FRESHPROMPT, a method for feeding search results into the LLM's prompt.

FreshLLMs: Refreshing Large Language Models with Search Engine Augmentation
Most large language models (LLMs) are trained once and never updated; thus, they lack the ability to dynamically adapt to our ever-changing world. In this work, we perform a detailed study of the factuality of LLM-generated text in the context of answering questions that test current world knowledge…
FRESHPROMPT significantly improves performance over competing search engine-augmented approaches on FRESHQA, and an ablation reveals that factors such as the number of incorporated evidences and their order impact the correctness of LLM-generated answers.

If you ask me, this makes sense: integrate LLMs with a search engine application and you should see better results. I would call this integration "Beyond Search," and it bears keeping a close eye on.
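The core FRESHPROMPT idea, as I read the paper, is to prepend retrieved search evidence to the question, with the most recent snippets placed closest to it (the ablation shows both the number and order of evidences matter). Here's a rough sketch of that prompt assembly; the helper name, field names, and example sources are illustrative, not the paper's actual code.

```python
def build_fresh_prompt(question, evidences):
    """Assemble a FRESHPROMPT-style prompt (illustrative sketch).

    `evidences` is a list of dicts with 'source', 'date', and 'snippet'
    keys, e.g. parsed from search-engine results. We sort oldest-first
    so the freshest evidence sits nearest the question.
    """
    lines = []
    for e in sorted(evidences, key=lambda e: e["date"]):
        lines.append(f"source: {e['source']}")
        lines.append(f"date: {e['date']}")
        lines.append(f"snippet: {e['snippet']}")
        lines.append("")
    lines.append(f"question: {question}")
    lines.append("answer:")
    return "\n".join(lines)

prompt = build_fresh_prompt(
    "Who is the current CEO of Twitter/X?",
    [
        {"source": "news.example.com", "date": "2023-05-12",
         "snippet": "Linda Yaccarino named CEO of Twitter."},
        {"source": "archive.example.com", "date": "2021-11-29",
         "snippet": "Parag Agrawal becomes Twitter CEO."},
    ],
)
```

The resulting string would then be sent to whatever LLM you're using; the interesting part is purely the evidence ordering, not the model call.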

Franchises with the most horror films

Just in time for Halloween, @idlecrowdesigns gives us the franchises with the most horror films. The Ring scared the hell out of me when I first saw it!

Source: r/coolguides

Help me reach my BHAG!

Hi friends, I have a Big, Hairy, Audacious Goal (BHAG) for the end of the year: I want to reach 1,000 newsletter subscribers! I'm asking for your help to get this done, so if you liked this newsletter (or any of the past articles), please share it using one of the sharing icons below. Thank you!