Shailesh Kapoor: Researching The Research Department

22 Nov, 2012

By Shailesh Kapoor


The organization chart in our television industry has been through its share of evolution. When satellite television first came to India, it was all about “programming” and “ad sales”. All other departments were more like support functions, enabling creation and selling of content. As time progressed, some other functions came into prominence, viz. marketing, distribution and research.


The research function has grown significantly in importance over the last decade. As viewership data got more complex, with wider market coverage and more frequent reporting, the need for more sophisticated analysis surfaced. Hence, growing from one or two executives handling all the viewership analysis needs of a channel to three or four was a natural progression.


But the key trigger that boosted the research department was the growth in competition. More competition meant more chances of failure. In most companies, across sectors and countries, research is often a reactive response to failure, either post-failure or (the smarter version) pre-failure. When there were only about 20 channels on television, chances of failure were minimal. You could “succeed” to some extent at least, purely because you were being beamed. As the number of channels increased, this ceased to be the case. Programme and channel failure rates increased. And the research department came into prominence for prognosis and diagnosis.


Having interacted with the research departments of at least 40 different channels over the last four years, one knows that there is no defined “research executive type”. Every research executive thinks differently from the others, with no real pattern at large. Put the same material in the hands of research heads at five competing channels and you will see five altogether different stories unfold. This is the reverse of the problem in the sales department, where most executives across the industry think alike. At most times, too alike for their own good.


Here are two facets about the research department that I find interesting to share. In a follow-up post in December, I’ll share some more.


Researcher v/s Research Executive

There is a difference between a researcher (someone like Ormax Media) and a research executive. It’s the same difference as that between a line producer and an executive producer on a film. The skill sets are different, as is the job description. Good research executives realize this. They would never burden a researcher with questions like: “How will you find these respondents?”, “I have often seen that people say one thing but mean another, so how do we handle that?”, “I think this question can be phrased differently”, etc.


The point is this: If you didn’t trust the researcher, you shouldn’t be working together in the first place. But if you do, leave their job to them. Channels have not done themselves a great favor by hiring people from the research industry. A researcher becomes a research executive, but is never trained to handle the change in his role. He continues to behave like the backroom researcher in the research agency he came from, when, in fact, he should be focusing on getting his key take-outs to the boardroom.


Some of our best and most stimulating work at Ormax has happened when we have been given the mandate to do it the way we feel is right. Often, such carte blanche comes only from senior management, who are more interested in results than in the process. At junior levels, there is an apparent need to justify one’s job, and hence, intervention in the process is a natural outcome. Having said that, there are a handful of executives at the middle and junior levels who are more result-focused than process-focused, and I have great respect for these people. I hope and believe that they will grow to become powerful people in the media business in the years to come.


Quantitative v/s Qualitative Research


The understanding of quantitative and qualitative research differs across research executives. Invariably, most have a higher comfort level with one over the other. This is not surprising, given that this disposition has itself been researched and found to be personality-linked. But when an executive is put in an unfamiliar situation, he is like a fish out of water.


At times, the only right way to answer a business question is a statistically robust quantitative research design. But some research executives are “born and brought up” on focus group discussions. They would rather treat qualitative feedback from eight groups of respondents as a decisive go or no-go verdict on a new programme being tested, than test it with 400 respondents in a way that yields a statistically valid answer to the go or no-go question, as well as a viewership forecast.


The reverse happens less often, where executives insist on quantitative research when the right technique should be more qualitative (e.g. content development). That’s because qualitative research is generally better “understood” than quantitative research. But to anyone who asks “how many people in the group discussion said this” or says “this was said by only one person, so it should not be in the presentation”, I have only one piece of advice: read the marketing research textbook again.


In our work, there is great satisfaction in being told: “We have a business question. But you are the experts. You decide what’s the best way to answer our question.” I hope to hear more of this in the times to come.


PS: I’ve clearly not finished yet. So watch out for Part 2 in December!


Shailesh Kapoor is founder and CEO of media & entertainment research and consulting firm Ormax Media. He spent nine years in the television industry before turning entrepreneur. He can be reached on Twitter at @shaileshkapoor.


