vrypan |--o--|
@vrypan.eth
Apple has advocated a "local-first" strategy for years. They said it's for privacy; their competitors said it's because Apple is not very good at cloud services.

The recent announcement of developer access to on-device AI APIs adds a new twist to "local-first": developers can offer smart services without paying the cost to host or rent LLM capacity.

This can be a huge differentiator. Say you want your app to automatically generate transcripts of the videos it shows, index them, translate them, etc. Normally, you would have to do this server-side and pay the cost of using or hosting an LLM, a cost that grows with usage. But if the AI functionality uses the user's device resources, it's free for the developer. And it's (practically) free for the user.

What does this mean for Apple's competitors, like Android? Will they have to accept that the same feature will cost more on their platforms? Or will they subsidize the cost and offer their LLM infra to devs for free?
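The economics above can be sketched in a few lines. All the numbers below (per-token price, requests per user, tokens per request) are illustrative assumptions, not any vendor's actual pricing; the point is only that server-side inference cost grows linearly with users while on-device inference has zero marginal cost to the developer.

```python
def server_side_cost(users, requests_per_user, cost_per_1k_tokens, tokens_per_request):
    """Developer pays for every request: cost grows linearly with usage."""
    total_tokens = users * requests_per_user * tokens_per_request
    return total_tokens / 1000 * cost_per_1k_tokens

def on_device_cost(users, requests_per_user):
    """Inference runs on the user's own hardware: zero marginal cost to the developer."""
    return 0.0

if __name__ == "__main__":
    # Hypothetical workload: 30 requests/user/month, 500 tokens each, $0.002 per 1k tokens.
    for users in (1_000, 100_000, 10_000_000):
        server = server_side_cost(users, requests_per_user=30,
                                  cost_per_1k_tokens=0.002, tokens_per_request=500)
        device = on_device_cost(users, requests_per_user=30)
        print(f"{users:>10,} users: server-side ${server:>12,.2f}/mo, on-device ${device:.2f}/mo")
```

Under these toy assumptions, going from 1,000 to 10,000,000 users multiplies the developer's server-side bill by 10,000x, while the on-device bill stays at zero.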
4 replies
0 recast
28 reactions
Nandit Mehra
@nanditmehra
Interesting analysis 🧐
0 reply
0 recast
0 reaction
Koolkheart
@koolkheart.eth
Do you think this will finally force Android OEMs to build more unified, device-level AI stacks? Feels like Google will have to respond
0 reply
0 recast
0 reaction
Jithin Raj
@jithinraj
Interesting take; local-first is a win-win for end users and Apple if on-device models get as efficient as cloud ones over time.
0 reply
0 recast
0 reaction
SQX
@sqx
Not sure this strategy will work, though it makes sense for them to try. I see a world with fewer phones. And more servers. Not more powerful phones.
0 reply
0 recast
0 reaction