Sat June 21 2014

The Best Design is Opinionated and Has Low Cognitive Load

The best software has a vision. The best software takes sides. When someone uses software, they’re not just looking for features, they’re looking for an approach. They’re looking for a vision. Decide what your vision is and run with it.

– Jason Fried, Founder of 37signals, Getting Real

It’s not a novel idea that simplicity of product is incredibly important to crafting a compelling user experience. My partner Andy talks about getting that one thing right.

However, for many products, this message seems to be lost in translation. There is a gap between the idea of simplicity and its actual execution.

I had a call with a founder last week who said all the right things:

"We think about user experience constantly… We want to make the product as simple as possible… I want to emphasize this is our minimum viable product…"

Unfortunately, the product was anything but that. It was so complex and overthought that my brain shut down, a victim of cognitive overload.

Cognitive load is the mental effort a task demands from a finite reservoir of decision-making power. John Tierney, writing in The New York Times Magazine, described the result of sustained high cognitive load as decision fatigue.

Typically, high cognitive load happens when your brain tries to store too many things at once in its short-term memory. (Similarly, our computers have a finite amount of memory/RAM before they grind to a halt and start using swap space.)

When your brain is faced with too many options, you run into the Paradox of Choice: counterintuitively, an abundance of choices creates anxiety rather than satisfaction. And Hick’s Law says that the time it takes to make a decision grows logarithmically with the number of choices.
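Hick’s Law is usually written as T = a + b·log₂(n + 1) for n equally likely choices. A minimal sketch, with illustrative constants rather than empirical ones:

```python
import math

def hick_decision_time(n_choices, base=0.2, slope=0.15):
    """Hick's Law: decision time grows logarithmically with the
    number of equally likely choices.  `base` and `slope` are
    illustrative constants (in seconds), not measured values."""
    return base + slope * math.log2(n_choices + 1)

# Each doubling of the choices adds only a constant amount of time,
# but every extra option still costs the user something.
for n in (1, 3, 7, 15):
    print(f"{n:2d} choices -> {hick_decision_time(n):.2f}s")
```

The logarithm is the silver lining: menus don’t get linearly worse as they grow. The cognitive-load point stands regardless: the curve only ever goes up.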

So, have an opinion and do one thing really well. Chances are, your opinion will be wrong. But at least you will know exactly why your opinion is wrong, because you’ve only tried one thing.1

Consider the alternative. You lack an opinion and try to do many things at once. The end result is the same: a lukewarm response. Except in this case, you have no idea why your opinion is wrong.2

Remember: more is less, and less is more.

  1. Think scientific method: controlling your variables matters more than having many independent variables. 

  2. Here, you have many independent variables and few controls, so it is difficult to isolate the cause and effect relationship between your product decisions and the end result. 

Sat June 21 2014

Fooled By Randomness in Product Design and Investing

"Fail fast."

You hear this all the time. In entrepreneurship, there is a big emphasis on the mantra of “failing fast” and learning from your mistakes. It’s important to understand why you failed, but I think it’s even more important to understand why you’re successful.

You almost always know why you failed because you’re forced to confront it head on. When you fail, you are naturally driven to understand what went wrong so you can prevent it from happening again in the future.

It takes a much greater conscious effort to understand why you’re successful.

Two cognitive biases, the pattern seeking fallacy and the hindsight bias, skew our perception of why we were successful.

False certainty about the factors that contribute to success can lead you down the wrong path. Believing that one set of learnings applies directly to another scenario without concrete logic to support the belief is dangerous.

Fooled by Randomness in Product Design

In product design, I’ve seen entrepreneurs decide they “must” gamify their product by putting in “mayorships” and “points” because “everyone” is doing it. Gamification is not a panacea that will make your user growth and engagement magically skyrocket. Without understanding the specific ways in which game mechanics drive behavioral change, and then relentlessly A/B testing those hypotheses, you cannot definitively say that game mechanics are relevant to driving user growth and engagement.

In product analytics, it’s extremely easy to find patterns where they don’t actually exist.

I cannot emphasize this point enough: Your data can lie to you.

That spike that happened on November 25, 2011? You didn’t cause it. It was Black Friday. Unless you can manipulate national holidays, I wouldn’t call that a pattern.

This is different from being aware of the fact that it is Black Friday, understanding the consumer behavior behind the event, and structuring your product, marketing, and biz dev strategies to take full advantage of that exogenous event.

Trying to find a pattern amidst chaos is backwards; instead, form a theory and test it with a proper experiment: hypothesis, independent variable, dependent variable, control, and result.

Test relentlessly to isolate a cause-and-effect relationship.
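As a minimal sketch of what "relentless testing" looks like in practice, here is a two-proportion z-test over hypothetical conversion counts (the badge experiment and all numbers are invented for illustration):

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: did variant B convert at a genuinely
    different rate than variant A, or is the gap plausibly noise?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: adding badges ("gamification") to signup.
# 200/1000 converted without badges, 230/1000 with badges.
z = two_proportion_z(200, 1000, 230, 1000)
print(f"z = {z:.2f}")  # |z| > 1.96 would be significant at the 5% level
```

With these made-up numbers z comes out around 1.63, below the 1.96 threshold: a three-point lift that looks like a win could still be noise, which is exactly why the hypothesis has to be tested rather than eyeballed.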

Fooled by Randomness in Investing

On the investment side, I’ve seen investors enjoy nice exits based mostly on luck, and convince themselves that they are amazing investors, that they’ve cracked the secret sauce for all investing. But a single data point does not make a pattern.

Chris Dixon touched on this topic in a thought-provoking post, arguing that investing based on pattern recognition can bias our decision-making: we miss good investments and overweight bad ones. There are a select few investors who have made consistently high returns over their lifetimes; these investors have definitely cracked the code.

I was talking to my friend Eric Wiesen recently, and he brought up a fascinating study covered by Nassim Taleb in his book Fooled By Randomness.

A group of people was instructed to flip a coin and predict which side it would land on. If they predicted correctly, they would advance to the next round. After a few rounds, the group had been whittled down to a small number of people. The researchers brought in reporters to interview the remaining contestants, asking them how they were so good at flipping coins. Fooled into thinking they had mastered the art of coin flipping, the remaining contestants came up with explanations of why their technique was better than everyone else’s.

This example is a striking encapsulation of the hindsight bias. These people were essentially claiming that they had discovered a way to manipulate the most classic example of pure randomness: a fair coin flip, where every flip is independent and the probability of either side is always 50%, regardless of what came before.
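The arithmetic behind the tournament is easy to sketch: with a fair coin, roughly half the field is eliminated each round, so some people always survive, and skill never enters into it. A small simulation (field size and round count are arbitrary):

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def run_tournament(n_people, rounds):
    """Each person predicts a fair coin flip; wrong guessers are
    eliminated.  The survivors have no skill -- only luck."""
    survivors = n_people
    for _ in range(rounds):
        # Each survivor independently has a 1/2 chance of guessing right.
        survivors = sum(1 for _ in range(survivors) if random.random() < 0.5)
    return survivors

n, rounds = 1024, 8
print(run_tournament(n, rounds), "of", n, "survive", rounds, "rounds")
# On average n / 2**rounds = 4 people survive purely by chance --
# and each of them has a great story about their technique.
```

Run it enough times and the survivor count hovers around n / 2^rounds, exactly what randomness predicts, with no coin-flipping mastery required.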

Data is an imperfect proxy to the causes of success or failure. Data indicates the evidence of, not the reasons for success or failure.

Understanding the real reasons behind a pattern is much more important than being able to see the pattern itself.
