LukeMarshall.net
← All posts

AI versus Marshy - The Importance of Context Windows

I’ve been following the latest developments in AI, and one thing that’s caught my attention is the release of Gemini Pro 1.5. This update is a significant step forward in the field, and I want to dive into what makes it so impressive. Specifically, I want to explore the concept of context windows and how they’re changing the game for AI-powered tools.

Google had already released a Pro version, but this update is about Gemini Pro 1.5. Hopefully I didn’t confuse you! Dan Shipper, founder of Every and its publication Chain of Thought, had a chance to play with the latter.

We all know that Google has been working on AI for a long time. We could also claim that ChatGPT caught them with their pants down (rude!). But it’s not like one of the world’s most valuable companies was sitting on its hands. Gemini Pro 1.5 is proof of that. Dan’s article goes into the nuts and bolts of why, but the headline is: context windows.

A way to think about this is how near or far the LLM can zoom into a topic. The example Dan gives is about an obscure book: he asks about a particular scene. Um, no. The answer is wrong, and because ChatGPT can’t retrieve that level of detail in one fell swoop (it moves in chunks), it guesses what the most likely answer is. The example is helpful because it gives you a sense of how the tech works. If he gives it the whole book, ChatGPT can answer, but not because it can read the whole book: it looks for “first answers after an event” and then provides the best possible answer for that. Dan balances this against Gemini Pro 1.5, which absolutely nails it with succinct detail.

I’ve noticed this while building my GPT, by uploading my book and some steering. The tool doesn’t quite nail it, and I had to add strict guardrails to make it stick to the principles. The reason for this is context windows. You can’t get an AI to operate as a detailed book instructs it to - yet. But with Gemini Pro 1.5 it appears you can. The plot thickens.
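To make the “it moves in chunks” point concrete, here is a toy sketch (not any real ChatGPT or Gemini internals; the document, query, and scoring are all made up) of why chunk-based retrieval can miss a detail that a model holding the whole text in its context window would catch:

```python
# Toy illustration: chunk-based retrieval vs. a large context window.
# This is a hypothetical keyword retriever, not how any real product works.

def chunk(text, size=5):
    """Split a document into fixed-size word chunks."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def retrieve(chunks, query):
    """Return the single chunk sharing the most words with the query."""
    q = set(query.lower().split())
    return max(chunks, key=lambda c: len(q & set(c.lower().split())))

book = ("The storm broke at dawn. After the storm the captain "
        "spoke first, before anyone else dared to move.")
query = "who spoke first after the storm"

chunks = chunk(book)
print(retrieve(chunks, query))  # → "After the storm the captain"
```

The retriever surfaces the chunk mentioning the event (“After the storm the captain”), but the actual answer (“spoke first”) sits in the next chunk, so a system answering from retrieved chunks alone has to guess. A model with a context window big enough to hold the whole book never faces that boundary.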
[Image: “Draw me a gemini swallowing a chatgpt”]

If this trend continues, a large context window will have the room to absorb all of your personal data and keep it at your fingertips, ready to go. This is utopia for academic researchers, and basically the same for me (I have books, book highlights, saved articles, 1,000+ notes on various subjects and projects, etc.). Look out! Keep in mind: we’re only in Year 2 since LLMs went big-time (yes, you read that correctly).

The impact of AI on freelancing jobs

Via Bloomberry. Henley wrote an amazing piece looking at the data on changes in job ads on Upwork since 1st November 2022. That’s around when ChatGPT broke out, and you would think that with a 16-month data set you would see some obvious change? Yes, and no.

Henley looked at:
- jobs with the biggest decrease in volume
- jobs least affected
- jobs with the biggest decrease in pay
- which AI skills had the biggest increase in job postings

Here’s the summary:

Originally appeared in newsletter: AI versus Marshy #36: context windows, job updates, and HR

Want more of this?

Weekly-ish thoughts on AI, growth, and being human in tech. Sometimes useful, sometimes not.

Subscribe to AI versus Marshy →