martin ↑
@martin
sam altman tends to refer to any downsides of AI as "society will need to figure this out," whereas any good things are "ChatGPT will enable people to do this." privatize the profits, socialize the losses, i guess
6 replies
0 recast
36 reactions
martin ↑
@martin
in the same thought, he talks about how kids love voice mode and it's amazing, but also how "society will need to figure out guard rails" for parasocial relationships. essentially, "we made a really good drug, but _society_ needs to figure out how to not get people addicted to it." https://open.spotify.com/show/0zojMEDizKMh3aTxnGLENP
2 replies
0 recast
5 reactions
Jack
@jackten
I mean, yeah... it's called personal responsibility. It's important for a culture to cultivate that, unless you want to live in a nanny state where the man tells you how to live and what you can do with your life
3 replies
0 recast
0 reaction
martin ↑
@martin
he has no problem with the man telling us how to live; it'll just be the government and not him, so he can play the victim
0 reply
0 recast
0 reaction
Valerie Feria-Isacks
@bkwrmgal
Also, when you invent something, part of your personal responsibility should be giving recommended guidance on best practices for its use. Should a company enforce that? That's a larger, more complicated debate that involves a lot of social conditions and, I'd argue, depends greatly on each culture (some cultures are more libertarian and some more communally oriented). Personally, leaning libertarian, I lean toward non-enforcement, or, in more communal societies, toward the government (not companies) wearing that role. OTOH, offering no guidance, thought, or care at all for how others use or misuse your invention... that's a suicidal level of independence. One should care about one's customers beyond profit. Thinking it through and giving safety recommendations is due diligence. Plus, it can prevent being sued.
0 reply
0 recast
0 reaction
Valerie Feria-Isacks
@bkwrmgal
As an educator, I'd say parameters are good for kids. It's actually good to have a bit of 'nanny state' (based on ages & stages) for those whose brains aren't fully developed yet, so that by the time they get older they're responsible enough not to abuse the full, uncensored version of a thing. It's part of how you teach personal responsibility... gradually.
0 reply
0 recast
0 reaction