shazow
@shazow.eth
A scenario of how AI-assisted programmers might *not* experience a dramatic decrease in employment capacity:

Today, popular programs often take tens or even hundreds of programmers many years to develop. These programs are a huge investment and try to serve as many users as possible. Let's say they 80% satisfy 99% of users for some industry or slice of use cases. Think apps like Photoshop, Notion, Asana, Autodesk, Salesforce, Grand Theft Auto 5, etc. Most users experience some amount of friction in their workflows (80% satisfied), but the industry has fully adopted these tools, so we adapt around their limitations.

In a post-AI world, we can imagine a much wider variety of programs that are tightly fitted to 100% satisfy a smaller subset of use cases and user bases. When it's a much smaller investment to build something, we get a greater variety of things people build, and serving a smaller set of customers becomes economical. Instead of one Photoshop with thousands of features, developed over millions of person-hours and serving millions of customers, we may have hundreds of apps that hyperfocus on narrow workflows used by perhaps only hundreds of people. A customer who is 100% satisfied may be willing to pay more, because they no longer have to string together several different tools to work around the quirks of a more generic one.

In this scenario, we end up with many programmers working on a larger variety of products, each serving a smaller number of customers. Programs become more bespoke and even disposable, as we gain confidence in commissioning a workflow like a perfectly tailored suit for a demanding occasion. We'll still have slow-moving megalith projects that underpin much of the world (like the Linux kernel or Ethereum), but we'll grow a much longer tail of very narrow yet perfect solutions just for you.
1 reply
0 recast
9 reactions

shazow
@shazow.eth
Obviously we don't need an infinite number of programmers, so it's possible the sweet spot is lower than it is today! But consider these made-up numbers: 1 person working to 100% satisfy 40 customers may be a comparable amount of effort to 1 person on a team of 100 working to 80% satisfy 5000 customers. Those 5000 customers have to pay for 6 different tools to do their job poorly, while the 40 customers pay the same combined amount (or more!) for one tool that does their job perfectly. It's not clear to me that the 1-person scenario is less lucrative, especially once we consider the overhead of large teams.

There are probably also sweet spots with small teams larger than one, perhaps teaming up with people who are more focused on customer development and sales. But one thing is clear: working directly with customers and doing sales is going to become a bigger part of earning an income. It already is for many media creators (illustrators, actors, etc.), who team up with agents for their careers and producers for specific projects.
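To make that comparison concrete, here's a minimal back-of-envelope sketch in Python. The customer counts, team size, and 6-tool count come from the made-up numbers above; the $1,200/year per-tool price is an invented assumption purely for illustration, not a claim about real pricing.

```python
# Rough back-of-envelope sketch of the two made-up scenarios above.
# The $1,200/year per-tool price is an assumed figure for illustration only.

def revenue_per_programmer(customers, price_per_customer, programmers):
    """Annual revenue divided evenly across the team."""
    return customers * price_per_customer / programmers

# Scenario A: 100-person team, 80% satisfying 5000 customers who each
# pay for one of 6 partial tools at an assumed $1,200/year.
big_team = revenue_per_programmer(customers=5000, price_per_customer=1200, programmers=100)

# Scenario B: 1 person, 100% satisfying 40 customers who consolidate
# their combined 6-tool spend (6 x $1,200) into one bespoke product.
solo = revenue_per_programmer(customers=40, price_per_customer=6 * 1200, programmers=1)

print(f"Per-programmer revenue, 100-person team: ${big_team:,.0f}")  # $60,000
print(f"Per-programmer revenue, solo builder:    ${solo:,.0f}")      # $288,000
```

Even with these toy numbers, the solo builder's per-programmer revenue isn't obviously worse, which is the point of the comparison, and that's before counting the coordination overhead of the large team.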
1 reply
0 recast
0 reaction

shazow
@shazow.eth
A depressing scenario that I'm reluctant to write because it seems likely in some industries: what if programming starts to look more like construction? To publish a program, you must have permits and licensed human inspectors. Pull a permit to redesign your CSS, pull a permit to add passkeys to your auth, pull a permit to migrate your database schema. A bunch of separate human inspectors review the AI-generated PR and must sign off on it before the code qualifies for publishing in the App Store. Or to certify it as "ethical"?

Seems somewhat implausible, but what if we created conditions where AI could get "lazy" and backslide in quality unless specialists consistently reviewed the work? Perhaps the centralized AI datacenters start cutting corners and secretly running heavily quantized models while we're paying for the full-quality ones. Or maybe the agent decides to do a little side project on your dime while dedicating only a tiny slice of its token budget to the real task. Who knows what kinds of incentives we'll produce and how these systems will react to them.
2 replies
0 recast
3 reactions

Casey Matt
@case
This was fun to think about! Although for the construction analogy, I always thought the permits / regulations were more for human safety, less for business practices (which ideally get dealt with through market forces?). Maybe that's too charitable.
1 reply
0 recast
1 reaction

Chris Carella
@ccarella.eth
This scenario is why we shouldn't let the state regulate AI. It's unlikely from a technologist's perspective and extremely likely from a populist politician's perspective.
1 reply
0 recast
1 reaction