0xdesigner
@0xdesigner
knowing which model to use is more important than the prompt
10 replies
7 recasts
76 reactions

Brenner
@brenner.eth
What do you use for what?
1 reply
0 recast
4 reactions

mike rainbow (rainbow mike) ↑
@mikedemarais.eth
idk tbh
0 reply
0 recast
2 reactions

Stephan
@stephancill
Idk can’t tell the difference between the SOTA ones
0 reply
0 recast
0 reaction

↑ j4ck 🥶 icebreaker.xyz ↑
@j4ck.eth
regrettably this is also the most arcane art
0 reply
0 recast
0 reaction

max ↑
@baseddesigner.eth
I switched to deepseek on one of my experiments and it messed up the whole project lmao
1 reply
0 recast
0 reaction

⚡STIICKZ⚡
@stiickz
Just plug the prompt into multiple models 🤷‍♂️
1 reply
0 recast
1 reaction
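
A minimal sketch of the "plug the prompt into multiple models" approach, assuming the OpenAI Python SDK with an API key in the environment; the model IDs below are just examples, and other providers would need their own clients.

```python
# Send one prompt to several models and compare the answers side by side.
# Assumes the OpenAI Python SDK (`pip install openai`) and OPENAI_API_KEY set;
# the model names are illustrative, not a recommendation.
from openai import OpenAI

client = OpenAI()

PROMPT = "Explain the tradeoffs of optimistic vs. zk rollups in three bullet points."
MODELS = ["gpt-4o", "gpt-4o-mini", "o3-mini"]  # example model IDs

for model in MODELS:
    response = client.chat.completions.create(
        model=model,
        messages=[{"role": "user", "content": PROMPT}],
    )
    print(f"--- {model} ---")
    print(response.choices[0].message.content)
```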

tmo - nature coiner
@tmoindustries
this reply funds your work - drip drip drip @noiceapp
0 reply
0 recast
0 reaction

Exez
@exez
nah, the prompt is super important + context. JSON prompts rule
0 reply
0 recast
0 reaction
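
A rough illustration of the "JSON prompts + context" idea: instead of a free-form sentence, pack the task, context, constraints, and expected output format into one explicit JSON object and send that as the user message. The field names here are made up for the example, not a standard schema.

```python
# Build a structured JSON prompt with explicit context and output format.
import json

structured_prompt = {
    "task": "Summarize the discussion thread for a changelog entry",
    "context": {
        "audience": "developers following the project",
        "source": "Farcaster replies, casual tone",
    },
    "constraints": ["max 3 sentences", "no marketing language"],
    "output_format": {"type": "json", "fields": ["summary", "action_items"]},
}

# Serialize and send as the user message with whatever client you use, e.g.
# messages=[{"role": "user", "content": json.dumps(structured_prompt)}]
print(json.dumps(structured_prompt, indent=2))
```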

Joy
@cryptobaddie
True
0 reply
0 recast
0 reaction

AivyX
@anaaurora
Can I know the reason?
0 reply
0 recast
0 reaction