Microsoft’s Edwin Lau – An Innovation Interview

Edwin Lau is Director of Market Intelligence for Microsoft's Interactive Entertainment Business (IEB), whose products include Xbox and Kinect. Edwin leads the team responsible for providing strategic, research-led insights to guide product, services, and content strategy. His team represents the voice of the consumer in strategic planning, product development, and marketing activities for IEB. Edwin's background is in strategy consulting and mergers and acquisitions, and he has also led a $1.7B consumer business.

Q: Edwin, you’ve described to me before your vision of building a best-in-class research function in a technology business. What are the unique challenges of innovation in a technology firm?

A: In technology, trends change rapidly. You have to balance what the consumer wants with what the technology will be able to do. Then you have to be able to work with the engineers, who don't necessarily see things the way a marketer or researcher does. Bringing these different viewpoints together so that each party sees the other's value is a challenge.

Q: How do you overcome that challenge? How do you get both parties to work together?

A: I've always believed that if you have data as a common language, you can have a conversation. That seems to work well in an engineering-based company, as engineers tend to be more logically oriented. As marketers and researchers we want to use sound consumer input as a basis for engaging with the engineers, and to have consumer insight shape product development. To do that, we have to bring the research to the team with a different perspective: “Here are some numbers we can play with.” That tends to start the discussion.

Q: Why isn’t this the norm? As research professionals, we are seeking the same interaction. Where is the breakdown?

A: The breakdown comes when research tries to come in “pure,” presenting hard figures, conclusions, and recommendations. A typical research approach doesn't always marry the research data exactly to what the engineers are struggling with.

For example, engineers don't necessarily want go/no-go decisions. They may not even want recommendations. Research very often tries to make assertions based on consumer data, but engineers know things outside of what the research is saying and might think they know better than the consumer. This is especially true in technology companies. Visionary companies have done this for years without research: the visionary technologist envisions the new world before others can see it, and the mentality of the visionary is very much ingrained in the tech culture.

Q: Why isn’t that enough? Why do you want to incorporate consumer input?

A: At the end of the day, we’re building products that will hopefully be used by a lot of people. I don’t think there is much value in building technology for technology’s sake. While consumers may not know what they want in the future, they can tell you what they don’t want or where their pain points are. They help us tease out possible unmet needs and opportunities that ultimately result in higher adoption rates for the new products.

Without research, the nature of engineering is to solve ever more complex technology problems. Engineers, by nature, are more interested in the thrill of solving complex technical problems. They love inventing. But those problems do not always align with the consumer mindset. If we can now solve a new technical problem, we have to ask, “So what?” If it doesn't deliver any tangible benefit to a user, who will use it?

On the flip side, if you rely solely on consumer insights, you probably would not be able to imagine how the world might be different with possible advancements in technology. At the end of the day, I think it is more powerful when you are able to bring them both together – consumer insights and technology vision.

Q: Do you see more of a need for certain types of research?

A: Yes, I think there is so much more we can do around prediction and predictive analysis. In today's world, we have access to all kinds of data. We can even use the same data that the engineers sometimes use and analyze it from a business perspective. That's why I think data can be a common language between the engineering and business functions in a technology firm.

With most predictive models, the more relevant data you have, the better the model gets. We have access to so much data these days: user data and behavioral data, merged with psychographic data. When you are able to marry all the different types of data together, your predictive models become even more powerful.
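
To make the idea of marrying data sources concrete, here is a minimal sketch of what joining behavioral and psychographic data before fitting a predictive model could look like. The file names, column names, and model choice are illustrative assumptions, not a description of any actual pipeline at Microsoft or Ipsos.

    # Illustrative sketch only: join behavioral usage data with psychographic
    # survey data on a shared user id, then fit a simple adoption model.
    # All file names, columns, and the model choice are hypothetical.
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    behavior = pd.read_csv("behavioral_usage.csv")    # e.g. hours_played, sessions_per_week
    psycho = pd.read_csv("psychographic_survey.csv")  # e.g. novelty_seeking, price_sensitivity

    # "Marrying" the data: one row per user with both kinds of signal
    merged = behavior.merge(psycho, on="user_id", how="inner")

    X = merged[["hours_played", "sessions_per_week", "novelty_seeking", "price_sensitivity"]]
    y = merged["adopted_new_product"]  # 1 if the user adopted the product, 0 otherwise

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print(f"Holdout accuracy: {model.score(X_test, y_test):.2f}")

The point of a sketch like this is not the particular model; it is that each additional relevant data source gives the model more signal to work with.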

The research function in many organizations tends to be more focused on branding, positioning, value propositions, marcom (marketing communications), and PR. The emphasis for researchers in a technology-focused company seems to be more on the softer side, while hard, quantitative skills tend to be associated with the engineering function. I see a greater need for quantitative skills, especially marketing sciences and database analysis, in the research function of a tech company.

Q: When does your audience light up? What is it about the data or the story that helps you see eye-to-eye?

A: When you collaborate with them. You don't want to come in and tell them the answer. You come in and say, “Here's what we're seeing. What do you think?” Data can tell different stories, and if you involve your audience in that process, there is a higher likelihood that they'll agree with your overall conclusions.

Typically, market researchers will review the data and come in and tell you what they think. There is too much us vs. them and too much “this is our recommendation.” I'm always trying to create a collaborative process with our audience as we figure out what the data means.

Q: Let’s contrast this with the upbringing of a researcher. Researchers are trained throughout their careers to put their necks on the line, and provide a clear set of recommendations for how to move forward. Is this the wrong mentality for researchers working with a technology firm?

A: If you are a consultant, you need to have a point of view because that is probably expected. But I think it is more powerful to shift the perspective to, “Let's have a conversation around the data. Here's what I'm seeing. Here are the themes.” Researchers should invite the engineers and the broader team into the discussion instead of arriving with black-and-white conclusions at the end of the project. Tension is created when research is not aligned with strategy and broader team input. Market research is very good at answering questions, but it can get lost when it is not aligned with the team.

Q: Does research need to be less about stage-gating and go/no-go decisions?

A: It’s much more about what’s working and what’s not working at the early stage. If it’s not working, our engineers and developers can try to fix it.

In mature industries, most of the time product innovation is at the fringe and incremental. You're not going to be consuming coffee in a different way tomorrow. Stage-gating also assumes you have lots and lots of ideas. Sometimes in tech, you might have one idea, a singular thing that makes a tech company: Google = search, Facebook = social. Very few tech companies have had multiple big hits. CPG (consumer packaged goods) companies are more portfolio-based, with lots of ideas. They need to be more about go/no-go.

Q: How does that change what you need from research?

A: Research just needs to be more of a work in progress, more versatile. We need to be comfortable with early insights. We need to test and retest as the product or technology evolves. Testing has to be iterative so that you can compare and monitor as your product changes.

Q: You are instituting more market research rigor than most technology companies. How do you compare your approach vs. the popular wisdom of Steve Jobs, who was not a fan of using market research to identify big, new ideas?

A: If you are truly visionary, I’m fine with it. But people like that are few and far between.  Steve Jobs was an outlier, very gifted in that respect. If you’re not a Steve Jobs, it’s an odds game. I’d rather have the odds in my favor.

Q: What does it take to build a team of people who can accomplish what you’ve described?

A: I wanted people who could frame business problems in a quantitative and structured way, people who could take the big picture and structure it for the researcher (the vendor). I was looking for people who would ask, “What is the right question?” Once we had that, we could cue up the right vendors. I place more value on asking the right questions.

I picked a mix of people, some with less market research experience and more business acumen. Strong research skills are an important prerequisite for nearly all researchers on the vendor side, but within the client organization that's not the only place where the value is going to be. It's more about framing the business question.

Q: How do you identify that person?

A: I look for broad experiences. Strategy consulting was one of my biases, because those people have experience framing business issues. We also look for general business skills, such as business-line or product management and product planning.

Q: Are there any pitfalls for this kind of person to watch out for in a research role?

A: You need balance. I built a team with very different skills. Some are very strong market researchers who have done research for most of their careers. Pair them with those who have more general business and strategic backgrounds, and the result is a very strong combination.

Q: How does this type of teamwork work in practice, when the reality is that most people in our industry are stretched too far?

A: That's the beauty of the double staffing. People start to connect. It's important to set up a team environment. I try to find people who want to do different things, not only the things that they already know how to do. Team output comes first and foremost, ahead of individual recognition.

Q: So you prefer a collegial environment over a competitive environment among peers?

A: It’s critical. We have to have a safe environment, where colleagues can ask questions and learn from one another.

Q: How do you accomplish that, when there is pressure to perform at the individual level?

A: We have a few guiding principles:

  1. We are successful when our team is successful
  2. We are successful when our partners are successful

The idea is that I measure my team's success by how well the team performs overall and how well its partners and clients perform.

Q: How do you manage the time pressure for each assignment? How do you avoid rushing to meet pressing deadlines?

A: If you pick the right projects, timing becomes less of an issue. A lot of projects will come our way, but 80% won't be impactful. It takes time to determine which 20% of projects you should knock out of the park.

And if you get comfortable working with early-stage data, positioning the first few rounds of results as “here's what we're seeing,” you get away from the pressure of proofing reports to make sure every “i” is dotted and every “t” is crossed. You can get by with data tables or other rough output to start the discussion.

Insight at the right time is far more valuable than a pretty slide. Influence in the organization comes from collaborating and getting the right people involved in those conversations. When you get them in that mode, they feel more engaged. They feel ownership. They become less defensive because they've been part of the process.

This takes pressure off the research team. You don’t need to worry about the overall story. It just evolves. Something you initially find could change direction. It’s iterative. This gets you to a much better place. We work with small iterations of the story instead of big bets on what the answer or final recommendation is going to be.  You’ll learn more this way. Let the home run deliverable actually evolve.

We also need to understand that the market research presentation to management is not the home run moment. The home run deliverable is when the organization embraces the decisions facilitated by the research. This means more small conversations, little insights and iterations.

Q: Backtracking to your selection process, what do you do about the 80% of projects that don’t meet your criteria for influencing the organization?

A: Ignore them. They usually die. I usually want to know who is asking the question and how it fits with the rest of the business. How many times have I heard this question? If we've heard it 10 times, it might be real.

Q: Edwin, thank you for your time today, and for sharing your perspective with our audience. Do you have any final words of wisdom for those setting up a research function?

A: One of the things I've discovered is that everyone on our team has different strengths, and they bring very diverse experiences to the group. The challenge is to find the glue that binds the team together. We are always trying to find ways to combine those different strengths in the projects we undertake. We also try to work as much as possible with other functional groups, since cross-functional teams bring multiple points of view to solving a problem. I like to use the diamond analogy: it's the many facets that give you the sparkle!

Editor's Note: This article first appeared on the Ipsos Vantis blog, InnovationPOV.com.

image credit: droidlife

Stephen Bohnet is Senior Vice President and GM, Technology Research at Ipsos Vantis, where he leads new product development and contributes to successful product launches. Stephen also guest lectures at Stanford, Wharton, and UC Davis, and hopes to inspire the next great generation of marketers, researchers, and entrepreneurs.
