Here’s an exercise. Take anything you want to recommend to your client and assign a probability that it will actually produce great results for the client. Clearly that number is above 50%, or else you wouldn’t be recommending it, right? So let’s say 50 out of every 100 of your ideas will be good.
Now, only some fraction of those will actually turn out to be good. Say it’s 70%: take 70% of the 50, and that leaves 35 ideas that may actually be good. Maybe the real number is more like 20%, which leaves 10 out of every 100. Now take another 50% of those, because only half of your ideas will be realistic enough that other team members would be open to doing them. So somewhere between 5 and roughly 18 of your 100 ideas might even be good enough to be feasible. And these numbers are optimistic!
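If you want to see how quickly these filters compound, here’s a minimal sketch of the arithmetic. The rates are just the illustrative guesses from above, not real data.

```python
# Back-of-envelope: how many of 100 ideas survive each filter.
# All rates are the illustrative guesses from the text, not measurements.
ideas = 100
recommend_rate = 0.50    # you judge about half of them worth recommending
feasibility_rate = 0.50  # the team would see about half as realistic

for actually_good_rate in (0.70, 0.20):  # optimistic vs. pessimistic guess
    survivors = ideas * recommend_rate * actually_good_rate * feasibility_rate
    print(f"actually-good rate {actually_good_rate:.0%}: "
          f"~{survivors:.0f} of {ideas} ideas survive")
```

Running it gives roughly 18 survivors under the optimistic guess and 5 under the pessimistic one, which is the whole point: each filter is individually forgiving, but multiplied together they leave very little standing.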
You see the point: no matter how competent you are, humans are just terrible at estimating how good or bad their own ideas are. In particular, humans think and act with far more confidence than their ideas warrant.
It’s nothing personal! You are human, after all.
This observation has substantial implications.
On the smallest level, be a little less confident in the ideas you recommend and push forward. One way to do this is to add qualifications when you write up strategy docs. Another is to always make sure you both think through your assumptions and articulate them (in writing, of course).
It’s easy to do this in a perfunctory, formal way, but that’s just a waste of time. What matters is truly doing it: intellectual humility, in other words. And thus the bigger consequence: rethink your work at every level. Will your design actually achieve the objectives? Approach the problem from wildly different angles. Come up with different options, some of them crazy. Test, test, test. And so forth.
On the highest level, this also implies the importance of sincerely, and with an open mind, questioning things you never thought to question: if you question your assumptions, then really question your assumptions. As one example, we learn that science works via the consensus of the famous scientists; determining scientific truth is a popularity contest of ideas among the cool kids, in other words. Thinking through the implications of that one observation alone is a life-long project, and one that will definitely distract you from client work. Hypothetically, of course.