Hey 👋
I'm Saurabh, and I'm thrilled to welcome you to Apperture Focus. This isn't just another newsletter; it's the culmination of our team's year-long deep dive into the intersection of AI and finance.
Every week, we'll send you a 5-minute read that distills the noise into actionable insights. It's an invitation into our world: a behind-the-scenes look at our latest research, the "aha!" moments, and the trends that keep us up at night (in a good way).
I'd love to hear your thoughts. What resonates? What doesn't? Your feedback will shape the evolution of Apperture Focus, ensuring it delivers real value to you.
Let's begin.
AI models are getting cheaper by the week. What once cost millions to develop and deploy is now becoming accessible to the masses. Meta's Llama 3.1 is the latest example of this democratisation of AI. It's an open-source model that anyone can use, and it's already outperforming some of the most advanced proprietary models in certain tasks.
But it's not just Meta. Across the board, we're seeing a race to the bottom on pricing. According to OpenAI's press release, "the cost per token of GPT-4o mini has dropped by 99% since text-davinci-003, a less capable model introduced in 2022." That also works out to roughly a 20x price cut compared to GPT-4o itself.
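For a rough sense of scale, here's the arithmetic behind those claims. The prices below are list prices per million input tokens as we recall them at launch, so treat them as illustrative (and note that the exact multiple versus GPT-4o depends on how you blend input and output tokens):

```python
# Sanity-checking the headline numbers with assumed list prices (USD per 1M input tokens).
prices = {
    "text-davinci-003": 20.00,  # $0.02 per 1K tokens back in 2022
    "gpt-4o": 5.00,
    "gpt-4o-mini": 0.15,
}

drop_vs_davinci = 1 - prices["gpt-4o-mini"] / prices["text-davinci-003"]
multiple_vs_4o = prices["gpt-4o"] / prices["gpt-4o-mini"]

print(f"Price drop vs text-davinci-003: {drop_vs_davinci:.1%}")      # ~99%
print(f"GPT-4o costs ~{multiple_vs_4o:.0f}x more than GPT-4o mini")  # on input tokens
```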
OpenAI hasn't released a detailed paper on GPT-4o mini's development, keeping us guessing about their exact methods. However, we can piece together some likely approaches based on recent research from OpenAI, Google, and Meta's newly published Llama 3.1 paper. Here's what we think might be happening behind the scenes:
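Our best guess (and it is only a guess) is that knowledge distillation is part of the recipe: train a small "student" model to mimic a much larger "teacher", which is roughly the spirit of how Meta describes using the 405B model to improve the smaller Llama 3.1 variants. A minimal sketch of the classic distillation loss, assuming PyTorch and pre-computed teacher logits:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=2.0, alpha=0.5):
    """Blend a soft loss (match the teacher's softened distribution) with the
    usual hard-label cross-entropy. T flattens the distributions so the student
    also learns how the teacher ranks wrong answers; alpha weights the two terms."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1 - alpha) * hard

# Toy shapes: a batch of 4 examples over a 10-way vocabulary.
student = torch.randn(4, 10, requires_grad=True)
teacher = torch.randn(4, 10)
labels = torch.randint(0, 10, (4,))
print(distillation_loss(student, teacher, labels))
```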
This development is part of a broader trend we've been seeing in AI. While we might not be seeing massive leaps in response quality, the rapid decrease in costs is remarkable. It's opening doors for smaller businesses and developers to access powerful AI tools.
The competition between OpenAI, Google, and Meta is driving innovation and cost reduction. As these models become more affordable and accessible, we're likely to see a surge in AI-driven applications across various industries.
It's worth noting that while these advancements are exciting, they're more about making existing capabilities more efficient and cost-effective rather than introducing entirely new functionalities. Still, the potential impact of widely accessible, powerful AI models is significant.
As we watch this space evolve, it'll be interesting to see how these more affordable models are put to use and what new applications might emerge as a result.
Midday is a new open-source tool that's got the freelance and small-business world buzzing. Think of it as QuickBooks or Xero, but with a serious AI upgrade and a fresh coat of paint.
Key Feature That Caught Our Eye:
Well, for starters, it's got this AI assistant that's pretty impressive.
You can ask it things like, "Hey, what's my burn rate?" or "Can you find that $100 receipt from last month?" and it'll dig up the answer for you. No more drowning in spreadsheets or playing hide-and-seek with your finances.
But here's the thing - it's not just about the AI. Midday's got all the usual suspects - time tracking, invoicing, file storage - but they've made it look good and work smoothly. They've even got this "Magic Inbox" that automatically sorts out your invoices. Neat, right?
Oh, and did I mention it's open source? Yeah, you can tinker with it all you want or even host it yourself if you're into that kind of thing.
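We haven't dug through Midday's codebase, so treat this as a generic sketch of the pattern rather than Midday's actual implementation: assistants like this usually work via tool calling, where the model turns your question into a structured call against your own ledger. Everything below (the ledger, the tool names, the hardcoded "model output") is hypothetical:

```python
# A toy tool-calling round trip over a tiny, made-up ledger.
transactions = [
    {"amount": -1200.00, "description": "AWS", "month": "2024-06"},
    {"amount": 4500.00, "description": "Client invoice #42", "month": "2024-06"},
    {"amount": -100.00, "description": "Team dinner receipt", "month": "2024-06"},
]

def burn_rate(month: str) -> float:
    """Net cash out for a month: outflows minus inflows."""
    outflows = sum(-t["amount"] for t in transactions if t["month"] == month and t["amount"] < 0)
    inflows = sum(t["amount"] for t in transactions if t["month"] == month and t["amount"] > 0)
    return outflows - inflows

def find_transaction(amount: float) -> list:
    """Look up transactions by absolute amount, e.g. 'that $100 receipt'."""
    return [t for t in transactions if abs(t["amount"]) == amount]

TOOLS = {"burn_rate": burn_rate, "find_transaction": find_transaction}

# In a real assistant, the LLM would emit this call after reading your question;
# we hardcode it here just to show the shape of the round trip.
tool_call = {"name": "find_transaction", "arguments": {"amount": 100.00}}
print(TOOLS[tool_call["name"]](**tool_call["arguments"]))
```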
What sets Midday apart is its sleek, user-friendly design coupled with powerful AI integration. It's like having a financial advisor, project manager, and filing clerk all rolled into one intuitive platform. The AI chatbot isn't just a gimmick; it's a game-changer that can pull up crucial financial data in seconds, freeing you to focus on what truly matters: growing your business.
For those wary of complex accounting software, Midday offers a refreshing alternative. It automates the tedious aspects of bookkeeping while providing a clear, visually appealing overview of your financial health. The open-source nature of the tool is the cherry on top, offering unparalleled flexibility and the potential for continuous improvement.
In a plot twist worthy of a Silicon Valley thriller, we're witnessing a fascinating showdown in the AI world. On one side, we have OpenAI, the pioneer that's ironically not so open anymore. On the other, Meta and Mistral, championing the open-source movement. Let's dive into this AI soap opera and what it means for the future of artificial intelligence.
First, let's talk numbers - and they're eye-watering. OpenAI is reportedly burning through cash faster than a neural network processes data. Despite raking in $3.4 billion in annual revenue, their burn rate is a staggering $5 billion per year. Building AGI, it seems, is not for the faint of heart (or wallet).
While OpenAI will likely secure funding to keep the lights on, this financial pressure raises a crucial question: Is the closed, proprietary model sustainable in the long run?
With the release of Llama 3.1, Zuckerberg and co. are playing a brilliant game of "follow the leader." They're essentially letting OpenAI do the heavy lifting in research and development, then open-sourcing their own versions. It's like getting a gourmet meal recipe for free after someone else spent years perfecting it.
What's more, Meta claims they're not even looking to monetize their AI efforts directly. When your social media cash cow is still producing, why not use AI as a loss leader?
So where do we go from here? The answer lies in the treasure trove of proprietary data that companies generate every day.
This isn't just any data - it's context-rich, highly specific, and immensely valuable for creating AI solutions tailored to individual business needs.
Here's where it gets interesting for us finance folks. The cost per token for these models has been in freefall, and with Meta's 405 billion parameter model now open-source, we're looking at a classic race to the bottom. Soon, AI model pricing might just be infrastructure costs plus a small margin.
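Here's a back-of-the-envelope version of "infrastructure costs plus a small margin." Every number below is an assumption we picked for illustration (GPU price, throughput, margin), not a benchmark:

```python
# Back-of-the-envelope serving cost; all figures are illustrative assumptions.
gpu_cost_per_hour = 2.50     # assumed on-demand price for one GPU, USD
tokens_per_second = 1500     # assumed aggregate serving throughput for a small open model
margin = 0.10                # 10% margin on top of raw infrastructure cost

tokens_per_hour = tokens_per_second * 3600
cost_per_million = gpu_cost_per_hour / tokens_per_hour * 1_000_000
price_per_million = cost_per_million * (1 + margin)

print(f"Raw infrastructure cost: ${cost_per_million:.2f} per 1M tokens")
print(f"With a 10% margin:       ${price_per_million:.2f} per 1M tokens")
```

The exact figure isn't the point; it's that once the weights are free, the floor on pricing is set by hardware and utilisation rather than by whatever a model vendor decides to charge.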
Adding insult to injury, Meta has made their AI available on WhatsApp and is gunning to overtake ChatGPT in popularity by year's end. It's like OpenAI built the stage, and now Meta's stealing the show.
At Apperture, we're eyeing a sweet spot in this AI battlefield. We believe the future lies in self-hosted, open-source models trained on proprietary data. This approach offers a tantalising trifecta: reliability, security, and affordability.
Imagine having the power of a state-of-the-art AI model, customised with your company's unique data, all without breaking the bank or compromising on security.
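For a flavour of what that looks like in practice, here's a minimal sketch. It assumes you already have a Llama 3.1 variant running behind a local, OpenAI-compatible endpoint (vLLM and Ollama both expose one); the URL, model name, and "proprietary" context below are placeholders:

```python
# Query a self-hosted open model through an OpenAI-compatible endpoint.
# Assumes `pip install openai` and a local server (e.g. vLLM) listening on port 8000.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed-locally")

# Placeholder for the kind of internal data you'd rather not send to a third-party API.
internal_context = "Q2 gross margin: 61%. Top cost driver: cloud spend, up 18% QoQ."

response = client.chat.completions.create(
    model="meta-llama/Meta-Llama-3.1-8B-Instruct",  # placeholder; use whatever you serve
    messages=[
        {"role": "system", "content": "You are a finance analyst. Use only the provided context."},
        {"role": "user", "content": f"Context:\n{internal_context}\n\nWhat should we flag for the board?"},
    ],
)
print(response.choices[0].message.content)
```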
As this AI drama unfolds, we're left wondering: in the battle between open and closed AI, has OpenAI ironically positioned itself on the losing side? Will the future of AI be democratised and open, or will proprietary models find a way to justify their premium?
That's all.
Stay curious, leaders! See you next week.
How did you like today's email?
We'd greatly appreciate your thoughts on the structure, content, and insights provided. Your feedback will help us refine and improve future editions to better serve your needs and interests.