https://warpcast.com/~/channel/worldcoin

ted (not lasso) pfp
ted (not lasso)
@ted
sharing a few speculative thoughts / questions if @worldcoin app becomes a social app, after reading this @bankless piece by @robinson.
half-baked hypotheses:
1. would shift world from identity-focused to content-focused, which is *hard*
2. user behavior, engagement, and content become highly valuable training data for OpenAI (if users consent to it; opt-in should be imperative here)
3. then, in theory, $WLD could be the mechanism to compensate users for providing training data (in contrast to reddit selling user data while users get no cut whatsoever)
open questions i can't stop thinking about:
1. how do you measure which data is most valuable? not all data is equal
2. who gets to choose who the data is sold to, and at what cost?
3. does incentivizing data actually warp the data? a la goodhart's law: "when a measure becomes a target, it ceases to be a good measure"
https://www.bankless.com/read/world-openai-social-network
9 replies
8 recasts
104 reactions

zoo pfp
zoo
@zoo
what would a data dao for this look like?
3 replies
0 recast
6 reactions

Omar pfp
Omar
@dromar.eth
It could be a way to stratify social data as well, which would be powerful. Though Sama hasn't been a fan of social media data in the past, as he detailed again in a recent blog post.
2 replies
0 recast
2 reactions

Frank pfp
Frank
@deboboy
Had coffee yesterday with someone making $2K/day labeling data; I'd prefer that to giving the eyeball orb my data for training.
1 reply
0 recast
0 reaction

RyanFox.eth pfp
RyanFox.eth
@ryanfox.eth
1) How do you measure which data is valuable: start with usage.
• Unused data is worthless.
• Data's value is directly tied to how much "state" it changes. If I tell you it's raining and that changes your state from "no umbrella" to "umbrella", that data has some value. If I tell 30,000 listeners via radio broadcast that it's raining and 10,000 people's state changes to "umbrella", that data has more value.
It's obviously much more complicated than that, and not at all even; there's no unified value calculation. But measuring "usage" is a starting point.
I don't know if you're watching Kevin Rose and Alexis Ohanian, but Digg.com is rebooting and they seem to be working very hard on this very problem. 🧐
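The "state change" heuristic above can be sketched as a toy calculation. Everything here is illustrative: the agents, their decision policies, and the per-change value are made-up assumptions, not anything World or Digg has actually built.

```python
# Toy sketch: a datum's value is proxied by how many receivers'
# "state" (decision) it flips, per the umbrella example.

def state_change_value(datum, receivers, value_per_change=1.0):
    """Count receivers whose decision changes after seeing `datum`."""
    changes = 0
    for agent in receivers:
        before = agent["decision"]
        after = agent["policy"](datum)  # each agent maps datum -> decision
        if after != before:
            changes += 1
    return changes * value_per_change

# "it's raining" broadcast to 30,000 listeners deciding umbrella vs. not.
listeners = [
    {"decision": "no-umbrella",
     "policy": lambda d: "umbrella" if d == "rain" else "no-umbrella"}
    for _ in range(30_000)
]
# 20,000 of them ignore the broadcast (already committed to a decision).
for agent in listeners[10_000:]:
    agent["policy"] = lambda d: "no-umbrella"

print(state_change_value("rain", listeners))  # prints 10000.0
```

This only captures the "usage" starting point from the post: it says nothing about how valuable each individual state change is, which is where the "not at all even" caveat kicks in.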
1 reply
0 recast
3 reactions

llbxdll pfp
llbxdll
@mrbriandesign
Didn’t hear much about World Coin for a long time. Now making noise. I scheduled an appointment to get my eyeball scanned in Shibuya a couple years ago and ended up detouring to Ebisu for some sushi instead… my gut instinct was no.
0 reply
0 recast
1 reaction

Connor McCormick ☀️ pfp
Connor McCormick ☀️
@nor
Re. 2) You can do dropout on data to find its information-contribution value.
What I would do is train a secondary model to estimate the dropout value of data, then allow data holders to pay to dispute their value attribution. When a funding threshold is reached, the model is ablation-tested to find the true value of their data. Set up the funding mechanism so that accurate error detection is incentivized (i.e. if the ablation test returns a large delta against the secondary model's estimates, the funding is mostly returned).
With this approach, a huge amount of compute would be allocated to learning categories of data informativeness, which could not only accurately reward contributors but also direct data-generation efforts. It would be largely robust to your concerns about Goodharting (at least as much as the rest of the world is; today we'd call the reflexive parts 'trends').
You'd in some sense be learning the derivative of data value, which gives much richer data + credible neutrality.
0 reply
0 recast
0 reaction

Igor Yuzovitskiy pfp
Igor Yuzovitskiy
@igoryuzo.eth
#3 is something I think about often. The accumulation of data should be abstracted to the point that user behavior remains the same, so it wouldn't warp. Hard to do, but it's somewhat the thesis for Griv (people can't see it in its current form). Also heard second hand from @chuckstock that Worldcoin pales in user quality next to Farcaster. No shade; I wish everyone enormous success. Still early.
0 reply
0 recast
0 reaction

Bullers pfp
Bullers
@db
Starts with data dignity. Could spur data unions. If you haven't already, check out content from Jaron Lanier. Looking at MCPs, World could play the identity layer, but other parties would need to manage the personalization piece; that's what we are focusing on.
0 reply
0 recast
0 reaction

🌟 pfp
🌟
@metawavestudio
Too much friction to onboard users AND create a threshold for content quality, imo.
Bankless and OpenAI are obviously going to lean optimistic, because their incentives kinda require that bias. As in psychological case studies, incentives are absolutely statistical variables that would need to be isolated for clean, truthful data, because they can and will skew it. But it's a paradox: since businesses aren't bound by the truth-seeking constraints of academia, omitting the incentive entirely would miss out on the trending aspect that actually acquires target users.
I think Worldcoin is an unreasonably optimistic sunk cost anchored in solving a bot problem many humans are fully capable of brain-filtering naturally, because that incentive aligns with their futuristic brand, not because it solves something meaningful, or real. I don't see the collective world accepting dystopian eye-scanning orbs. The 4 million or so onboards in Africa seem pervasive to me as well.
0 reply
0 recast
0 reaction