“Many of life's failures are people who did not realize how close they were to success when they gave up.”
- Thomas Edison
Bridging the Financial Gap
Note: I am not an economist.
As I was listening to the All-In podcast this week, an interesting economic concept was discussed. At a high level, the idea is to replace income taxation with consumption tax. If you run this through ChatGPT, you'll find a few important pieces of information:
This concept has been debated in the US for decades
There have been multiple proposals, including the FairTax, the Hall-Rabushka flat tax, and value-added taxes (VATs)
Before the federal income tax (1913), the US government was primarily funded through excise taxes, tariffs, and sales taxes on specific goods
Texas and Florida, which are both seeing massive influxes of US citizens, have no state income tax and instead rely more on sales and consumption taxes (although their citizens are taxed federally)
Pros include: encouraging savings and investment, simplifying tax administration (shrinking the IRS's role), and broadening the tax base (capturing spending by undocumented workers and tourists)
Cons include: a regressive impact (lower-income people spend a larger share of their income on necessities, so they'd bear a heavier relative tax burden), challenges during the transition period, and federal government revenue instability
This is very interesting. The only challenge I believe is worth thinking through more thoroughly is the regressive impact; the transition challenges and revenue instability seem more easily resolved to me.
So this is the story ChatGPT tells:
I think the 30% number is likely way too high, so I pushed back on this. There are also additional ways to account for the regressive impact, such as adjustments or exemptions for essential goods (groceries, rent, healthcare). I'll stop here, but I hope this makes your brain start churning like it did mine!
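To make the regressivity point concrete, here's a quick sketch with entirely made-up household numbers (the tax rate and incomes are hypothetical, not from any actual proposal). It shows why a flat consumption tax hits lower earners harder, and how exempting essentials changes the picture:

```python
# Hypothetical illustration of why a flat consumption tax is regressive,
# and how exempting essentials softens it. All numbers are invented.

CONSUMPTION_TAX = 0.23  # hypothetical flat rate on purchases

def effective_rate(income, spending, essential_spending=0.0, exempt_essentials=False):
    """Tax paid as a share of income under a flat consumption tax."""
    taxable = spending - essential_spending if exempt_essentials else spending
    return CONSUMPTION_TAX * taxable / income

# Lower-income household: spends nearly all income, mostly on necessities.
low = effective_rate(income=40_000, spending=38_000)
# Higher-income household: saves and invests half its income (untaxed).
high = effective_rate(income=400_000, spending=200_000)
print(f"low-income effective rate:  {low:.1%}")   # ~21.9% of income
print(f"high-income effective rate: {high:.1%}")  # ~11.5% of income

# Exempting essentials (groceries, rent, healthcare) narrows the gap.
low_exempt = effective_rate(40_000, 38_000, essential_spending=30_000, exempt_essentials=True)
high_exempt = effective_rate(400_000, 200_000, essential_spending=50_000, exempt_essentials=True)
print(f"with exemptions: low {low_exempt:.1%}, high {high_exempt:.1%}")
```

Same flat rate, very different effective burdens, because the high earner's savings are never taxed at the register.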
How Much Thorium Does it Take to Power a Lightbulb for 60,000 Years?
While you can never believe anything coming out of China, this news is at least interesting.
Chinese surveyors have found a deposit of thorium (a radioactive metal often found alongside rare earth elements) large enough to power China's energy needs for, well, almost forever.
There are a bunch of startups entering this space who've identified new technologies for discovering rare earth metals. It was previously extremely difficult to detect where these metals are hidden in the Earth's crust. Now, scientists are using AI to ideate new and groundbreaking (no pun intended) approaches.
It's possible, maybe even likely, that we are sitting on top of enough rare earth material to power the entire US forever as well! We'll likely come to find that out in the next 5-10 years or so. Exciting times ahead.
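As for the question in this section's title, a back-of-the-envelope calculation gets you an answer. The sketch below assumes complete fission of Th-232 (via the U-233 breeding cycle) at 100% conversion efficiency and a standard 60 W bulb; real reactors convert far less, so treat the result as an idealized lower bound:

```python
# Back-of-the-envelope: thorium needed to run a 60 W bulb for 60,000 years,
# assuming every Th-232 atom is fully fissioned at 100% efficiency.

SECONDS_PER_YEAR = 3.156e7
ENERGY_PER_FISSION_J = 200e6 * 1.602e-19  # ~200 MeV per fission, in joules
AVOGADRO = 6.022e23                        # atoms per mole
MOLAR_MASS_TH232_KG = 0.232                # kg per mole of Th-232

energy_needed_j = 60 * 60_000 * SECONDS_PER_YEAR                    # bulb's total energy
energy_per_kg_j = (AVOGADRO / MOLAR_MASS_TH232_KG) * ENERGY_PER_FISSION_J
mass_kg = energy_needed_j / energy_per_kg_j
print(f"~{mass_kg:.1f} kg of thorium")
```

It works out to roughly a kilogram and a half under these idealized assumptions, which is why thorium's energy density makes headlines.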
Meme of the week
The Golden Age of AI Models
New models are coming out every week. I don't think people truly understand where we are in this AI revolution. This moment feels like the dot-com bubble inflating. When will it pop?
We now live in a time where OpenAI's latest model release barely made headlines, and where Alibaba is open-sourcing text-to-video models for everyone around the world.
While this is all amazing for consumers, I want to highlight something that is eating at me. All of these companies know what benchmarks they are evaluated on. So they're incentivized to optimize their models for those benchmarks, instead of optimizing for usefulness, accuracy, trustworthiness, and all of the other things that truly matter.
One approach I found entertaining to learn about was Anthropic testing their newest models by having them play Pokemon. This is the type of independent test I want to see. Which model gets the furthest in the game? Which model builds the best "team" of Pokemon? Let's have them battle it out against each other!
Siri Still Sucks
When Apple released Siri it was groundbreaking. Back then, simple things like being able to ask questions and get search results or set reminders/notes were amazing. That was all before the AI hype started.
Now Siri feels like a baby at the big people table. New information has come out that Apple doesn't foresee Siri being "good" until 2027. How can a company with so much cash not have a proprietary AI model?
First, I want to talk through some of the strategic challenges Apple must be facing. Foremost among them: privacy, their differentiator. When you think about Apple, you think of the animation of a lock turning into the Apple logo. Every product release mentions privacy a million times. That is powerful, and it's one of the reasons they are still so popular.
Their challenge is balancing the new AI capabilities, without which their products risk feeling out of touch, against their fundamental commitment to privacy. People may not know this, but today's large language models can't easily run on a phone's hardware. That means anytime someone wants to use AI model capabilities from an iPhone, data has to flow from the phone to the third-party model provider and back. This exposes information that Apple has historically protected with everything they have. You see this now in their demos with ChatGPT, where the user has to approve the data flows.
Instead of using a third party, if Apple were to develop its own model, it could control security and privacy end-to-end. Here's the challenge with that, and it's directly connected to why Siri is still so shitty: you can't store users' prompts or data and still claim to be a company focused on privacy and security. So you have no way to do reinforcement learning from real usage to train and personalize your model. That's not a problem for simply running LLM prompts, but it makes it impossible to identify Siri use cases directly from users.
So what do you do? Well, you have a fat stack of cash. Pay people and get qualitative information and start to build all of the use cases! That's one challenge. Another challenge is more fundamental to the operating system and hardware.
Great, now you can ask Siri to do a lot more and it won't tell you "I'm sorry, I can't do that right now" or whatever bullshit it says today. But how does it actually do those things? For apps run by Apple, they can easily handle those use cases with privacy and security. But many of those use cases will involve third-party apps, meaning Apple's future AI agent (Siri) would need to privately and securely pass critical user data to third-party apps. I haven't the slightest clue about the technical complexities there, so I'll have to leave you wondering.
LLM Hallucinations vs Dementia
I was running a Deep Research prompt the other day about building versus buying a content management system (CMS). OpenAI's Deep Research shows you a side panel as it works, explaining everything it's doing. I was looking through it and saw something that made me laugh. In between searching two open-source CMS tools, it randomly ran a search about aluminum refining. So it made me think: what are LLM hallucinations compared to human dementia? This is what I found:
Quick Tip - GTM
Share with friends
Enjoying The Hillsberg Report? Share it with friends who might find it valuable!