
ickas
@ickas
AI is moving fast, and design, UX in particular, needs to keep up. Not just on the web, but (soon) across mobile apps too.
As LLMs evolve, our UX needs to evolve with them: easier (and faster) input methods, and smart outputs that drive real insight, not just blobs of text. Visual, context-driven UI is the new standard.
Traditional chat-based AI feels more and more outdated. The next wave is task-driven UX: sliders, buttons, visual nodes, and tailored actions. Users get power and clarity, and AI delivers real, actionable value.
And yes, MCPs will be a huge thing to watch as AI features get smarter and more deeply integrated everywhere we work and create.
It’s not just about being AI-first, but AI-smart: integrating AI into users' natural workflows and wherever real work happens.
I'm super excited to see how these patterns shape the future, and even better, to be part of it!
What a time to be alive!