A few days ago, a friend (Greg) asked me:
What skills do you think a product designer should learn now?
Considering the AI landscape, the most straightforward answer is:
Learn how to write prompts to get the results you want.
But then, I remembered seeing a tweet by Halli:
To become good at something you need to practice being bad at it for a long time.
Over time that training helps the person understand the complexity of the tasks required.
But if people can skip the practice part by using AI, will they be able to become great at something?
Think about that.
That’s when I realized: My straightforward answer - “write good prompts” - is very outcome-driven.
What do you mean?
Well, at the moment, I can think of two mindsets: an outcome-driven one (just get the result) and a process-driven one (learn how things actually work).
To give an example, let’s use Webflow - my current no-code platform.
When I first learned Webflow, there was a learning curve.
I didn’t come from a web development background. So, when I first saw things like div, section, and containers, I was like
WTF are these?
Not to mention that I used Figma before. I was like
Why can’t I just drag stuff around? Why is everything so rigid?
But eventually - I learned. I learned about the box model. I learned about the basics of web development. I learned how to use Webflow.
Now, I have a good understanding of the platform. And I have built a few websites with Webflow.
What if one day I can just tell Webflow what I want to build? And it builds it exactly the way I want.
I know - we’re not there yet. But with the current trajectory, I don’t think this is a distant future.
In this case, is it even important that I understand web development? Is it even important that I understand Webflow?
If I can get the results I want, why do I need to learn about the topic/subject?
I don’t want to sound lazy. But are we heading in that direction?
I think, in the end, the question we should be asking is:
How much do we need to know to perform a task?
It seems like, with AI, the bar to complete a task keeps getting lower and lower - across many different kinds of tasks.
And personally, I’m conflicted.
On one hand, I love to learn new things. I love to understand how something works. I want to learn. There’s a sense of accomplishment after I build something with my own hands.
But also, who doesn’t want their life to be easier? To have something built for them automatically - exactly the way they want it.
I don’t know which is better. I may also be missing out on some perspectives. Again, I don’t know what I don’t know.
But the world is not black and white. So I want to identify some key statements:
To generate a great prompt, you need to understand the inner workings first.
This could be a sweet spot. I guess the magical part is that you can also use AI to help you learn new things.
But again, I may have to experiment with this first. I’ve also seen examples where people have built apps with AI without knowing development.
My scope is narrow.
I understand - I’m only looking at this through the lens of tech. Thus far, I’ve mostly seen AI used to develop apps and designs.
But maybe it’s different in other industries.
The role of AI is to augment, not replace, humans.
This is what Marcy - ex-Design Lead at Meta AI - told me.
It’s a future I want to believe in.
The future is unpredictable. Who knows - maybe people will think of new ways to live and learn. They may come up with new ways to interact and work with AI.
Ultimately, I believe AI will affect our fundamental understanding of two things:
Please let me know if you have thoughts about the effects of AI, or if I missed any important perspectives. I’d love to chat!