Empowering End Users with Local Models
I pulled down Llama 3.1 8B for a long flight recently. It felt magical in the same way downloading all of Wikipedia does: like taking an entire universe with you.
The missing piece is the context to steer the LLM. We need a Microsoft Access for RAG and LLM eval.
If I was missing any programming documentation, I could ask the LLM to write the function for me. If I didn't understand a concept or definition, I could ask. If I got stuck writing, I could ask it to continue my blog post. Small models that fit locally on a machine feel like giving power back to end users. In a world where we're hamstrung when we're offline, dependent on cloud services that may change their pricing or policies, or just plain go out of business, it's refreshing to have that power independent of other entities.
But it's no secret that the main missing piece right now is steering the universe you brought with you (the LLM) with the context of your life and your work. Even with larger context windows, it seems both RAG and eval will remain important going forward, because the hard part of delivering end value to users is stuffing the right information into the context for the LLM to use (writing the prompt too, but we'll get used to that in time).
RAG is effectively information retrieval: databases and search indices! Eval is setting up pipelines and metrics, and doing the unsexy work of systematizing intuitions about what "good" looks like.
Both still require expertise and tools beyond the reach of prosumers who lean on Excel, Notion, and Airtable in their day-to-day work. As a result, even with LLMs that fit on a local machine, users will never be completely free of cloud services with unstable long-term policies if the RAG and eval parts of the supply chain remain out of reach for prosumers to build themselves.
Why wasn't there a successor to Microsoft Access? I think it occupied a weird middle ground: too complicated for casual users, but too unsophisticated for engineers. In the late '90s and early 2000s, CRUD web apps largely filled that void, and there wasn't yet a large prosumer market like there is today.
So I think there is room for a local-first successor to Microsoft Access that provides RAG and eval for prosumers. We can rightfully argue that Notion and Airtable are spiritual successors to Access, and they've already demonstrated that this class of users exists as a market.
For those who believe local-first software has a role to play going forward, a local Microsoft Access that helps prosumers build RAG and eval for local LLMs will be enormously important.
But the question remains: what sort of economic advantage would a local-first Microsoft Access give a company that adopts it as a strategy? Will prosumers care enough about the advantages of local-first to pay for it? Even if they do, will the unit economics work out? Every new technology seeking adoption needs an economic advantage that renders the incumbents' advantages irrelevant, and at the moment, no one's quite figured that out.