Category Archives : Market Intelligence
Our client, a leading provider of education software, is a large and established company tasked with preserving market share, mitigating threats from emerging competitors and discovering new sources of organic growth. With fast-paced innovation in their industry and a flood of information available on market trends, our client finds it difficult to get a clear understanding of the competitive landscape in the market segments they want to compete in. Under our On-Request market intelligence partnership, we are uniquely positioned to provide customized market intelligence on an ongoing basis to support decision making in their Strategy team and the business units it supports.
We work with the Vice President of Strategy at a leading vendor of Learning Management Systems (LMS) who required a greater understanding of an important market segment: the corporate learning and learning management space. While the company had found tremendous success acquiring and retaining customers in the higher education and K-12 segments, it had not yet been able to crack the corporate market. To help develop the right mix of features in its product offering and create the optimal value proposition within this segment, our client asked us to provide insight into the market for these products. Specifically, they needed clarity about the overall size of the market opportunity for corporate training products across a variety of industry verticals, to identify both the major and emerging players operating in those spaces, and to understand the key trends playing out in the market.
Working under a hypothesis that changing customer needs and the evolution of industry standard technologies would create opportunities for new product features or positioning, we collaborated with them to understand which key intelligence questions needed answers and set out to craft a customized research plan to address them.
We collaborated on a customized study to enhance their understanding of this market. The purpose of this customized study was to:
- Size the market for corporate learning and learning management products in the United States and globally
- Provide insight into key trends such as technological innovation, use cases and implementation issues which affect the space
- Identify the major and emerging players in corporate learning and learning management and provide answers to the key intelligence questions our client needed visibility on
Providing a more holistic and up-to-date view of the fast-moving corporate LMS market gave our client’s Strategy team a framework with which to view its competitors’ offerings. In addition to identifying and profiling key market trends to help with product planning, we provided analysis on where the competition was weak. Because our competitor research was aimed squarely at analyzing the strengths and weaknesses of major competitors and emerging players, it was a valuable tool to identify service gaps or whitespace to exploit. This analytical framework proved valuable to the product strategy team as they made decisions to re-position the company in the corporate market segment.
At its completion, the client’s team made the decision to engage in additional follow-up research aiming to dig deeper into specific needs, pain points and requirements among corporate customers in select industry segments. As a valuable added benefit, the On-Request partnership makes it possible for our client’s strategy team to get the external market insight they need in a flexible manner, without the need for time-consuming procurement processes for each ask.
For more information about how we help our client within our On-Request partnership model or to learn more about the types of research output provided to our clients, please contact us at firstname.lastname@example.org
For this Q&A we spoke with Partha Bhattacharjee, who has written about automation in competitive intelligence in a series titled “Riding the Competitive Intelligence Automation Wave”. Partha is a Senior Solutions Engineer at Cambridge Semantics, Inc. (CSI), an enterprise analytics and big data management software company. He has been a CI professional for over five years and holds graduate degrees in Systems Engineering and Technology and Policy from MIT.
Emerging Strategy: Taking the title of your series into account, “Riding The Competitive Intelligence Automation Wave”, I will just get right to the point: should competitive intelligence professionals worry about their jobs being automated any time soon?
Partha Bhattacharjee: No, CI professionals will not lose their jobs. Instead, their jobs will evolve and it will be critical that they keep up. It’s true that parts of a CI professional’s job will be automated going forward. What’s important to realize is that this trend is not new. Parts of CI have continuously been automated over time. News crawling, for instance, underwent transformation with the foray of RSS feeds into the mainstream. To the best of my knowledge, not too many CI professionals lost their jobs to prior phases of automation.
Having said that, the current wave of automation is more sophisticated and encompasses tasks that require a higher degree of cognition. Analytics on unstructured text, and the harmonization with structured data, is a prime example in the context of CI. Advances in semantic technologies, natural language processing, and machine learning, often overlapping in nature, now enable tools to process diverse text ranging from analyst call transcripts to scientific publications in a manner that approximates human interaction with the content. Data extracted from the text can then be linked to and stored in spreadsheets and databases. If you come to think of it, these steps represent the bulk of the secondary research performed by CI analysts. But this type of secondary research performed by a computer is unlikely to put a competent CI professional out of a job – on the contrary, the analyst will have more bandwidth for higher value tasks such as primary research and data analysis.
ES: You write that the utility of CI is not in question, but that the outdated methods of CI collection are putting pressure on departments to visibly demonstrate their value to senior managers. So how can senior managers who consume CI get the most out of their limited (or shrinking) CI budgets in today’s environment?
PB: Senior managers need to both envision and evangelize a CI project as a digital initiative because deep entrenchment of technology is critical for maximizing a project’s value. The project roadmap needs to include tools used at each stage as well as a factual basis for selecting one tool over another. CI managers have typically been subject and/or practice area experts. By viewing a CI project through a technology lens, managers will be able to maximize the throughput at each stage of the project. Native infusion of technology from the outset that complies with a vision, instead of adding tools in an ad hoc manner, will only augment a manager’s effectiveness in leading a CI project.
CI managers also need to zealously scrutinize the value returned from every dollar invested using metrics such as Time To Value. The goal must be allocation of most investment to high-value tasks such as accessing sources of differentiating information and insight generation. For most companies, such scrutiny will in all likelihood reveal that an inordinate amount of analysts’ time is spent collating data and beating it into a shape amenable to analysis. The majority of such tasks can be automated, thus enabling CI managers to deploy their scarce human expertise toward connecting dots between data, augmenting subject matter expertise, and generating insights.
Other cost sinks often emerge around primary research. The trend of minimizing costs using technology to bridge physical distance has been in effect for several years now. Tools for large scale surveys and reporting, text-to-speech applications, and mobile-based data collection tools are helping minimize costs and improve data quality. Interconnecting or harmonizing such data will generate significant value.
ES: Do you have any particular advice for analysts burdened with a great deal of tedious ‘legwork’ that takes time away from higher value activities such as analysis that managers expect of them?
PB: I am fortunate to have had the opportunity to work as a CI analyst early in my career before diving into the world of creating smart data-driven CI solutions. The more I see of the spectrum of tools that a CI analyst can potentially use, the more I am convinced that familiarity with a wide variety of analytical tools and techniques is probably the most important asset a CI analyst can possess apart from an inquisitive nature and subject matter expertise.
A CI analyst needs to view herself as the conductor of an orchestra. The musicians are the software that perform one or more of the required tasks in the CI pipeline. The quality of insights is a function of how well the analyst can strike harmony between an array of tools and techniques. Every step of the CI process can leverage automation to varying degrees.
I would particularly draw attention to unstructured text analytics given that secondary research, apart from cleaning messy data, tends to be the most time intensive exercise in most CI projects. To drastically reduce the time invested in secondary research and enhance productivity and scale of analysis, I strongly urge CI practitioners to consider using text analytics tools. A skilled CI analyst can identify and visualize crucial data from thousands of documents in a matter of hours through an intelligent combination of multiple annotators and dashboards.
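As a toy illustration of the kind of task such tools automate, the sketch below scans a set of documents and tallies candidate entity mentions. It uses a crude capitalization heuristic in place of the trained NLP annotators Partha describes; all company names and documents here are invented.

```python
import re
from collections import Counter

# Hypothetical source documents; real tools would ingest thousands of
# transcripts, filings, and publications.
documents = [
    "Acme Learning acquired Beta Skills to expand its corporate LMS line.",
    "Analysts expect Acme Learning to target mid-market training budgets.",
]

# Naive heuristic: treat runs of two or more capitalized words as candidate
# organization names. Trained NER models are far more robust than this.
ORG_PATTERN = re.compile(r"\b(?:[A-Z][a-z]+ )+[A-Z][a-z]+\b")

mentions = Counter()
for doc in documents:
    for match in ORG_PATTERN.findall(doc):
        mentions[match] += 1

# Mention counts become the structured data fed into dashboards.
print(mentions.most_common())
```

Even this crude pass turns unstructured text into countable, chartable data; the gap between it and a production annotator is model quality, not workflow shape.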
ES: What are the best practices for structuring MI/CI projects that require regular updating and managing of complex information from disparate sources?
PB: There are a few core design decisions that need to be considered:
- Adopting smart data lakes: CI managers across enterprises are realizing that they are fighting a losing battle against the burgeoning diversity of data. Traditional data management systems are unable to cope with the variety of data, and analysts are consequently too bogged down in low-value repetitive data collation tasks to fully apply themselves to insight generation. Going forward, MI/CI projects will need to be anchored to what are colloquially known as ‘smart’ data lakes to be able to handle complex data from disparate sources. In such data lakes, all the data that a CI analyst has access to are interlinked and stored using a flexible data model so that depending on the project, the relevant data can be extracted and reused.
- Using advanced text analytics solutions: As previously discussed, current text analytics tools provide an array of features ranging from entity extraction to document translation. The use of such tools is a must to identify data of interest in unstructured text.
- Establishing robust data pipelines: Data lakes need to be connected to relevant data sources through data ‘pipelines’ that periodically update their content. The flexibility of Resource Description Framework (RDF), the semantic data model, is a critical differentiator in this context as it can seamlessly accommodate changes in data’s structure as well as context.
- Avoiding vendor lock-in: The key to successful management of diverse data is the ability to use best of breed tools or components that fit one’s workflow. Hence, it is important for practitioners to be able to work with platform(s) where different tools can be used concurrently.
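The flexibility RDF brings can be illustrated with a toy subject-predicate-object store. This is a deliberately simplified sketch of the triple idea, not the RDF standard itself, and every fact and name in it is hypothetical.

```python
class TripleStore:
    """A toy graph store: every fact is a (subject, predicate, object) triple."""

    def __init__(self):
        self.triples = set()

    def add(self, subject, predicate, obj):
        self.triples.add((subject, predicate, obj))

    def query(self, subject=None, predicate=None, obj=None):
        """Return triples matching the given pattern (None acts as a wildcard)."""
        return [t for t in self.triples
                if (subject is None or t[0] == subject)
                and (predicate is None or t[1] == predicate)
                and (obj is None or t[2] == obj)]

store = TripleStore()
# A pipeline can introduce new predicates at any time; unlike a fixed
# relational schema, nothing needs to be migrated.
store.add("AcmeCorp", "competesWith", "BetaSoft")
store.add("AcmeCorp", "launchedProduct", "LMS Cloud")
store.add("BetaSoft", "headquarteredIn", "Austin")

# Pull everything known about one competitor for a project.
print(store.query(subject="AcmeCorp"))
```

The point of the sketch is the schema flexibility: when a new source arrives with a new kind of fact, it becomes one more predicate rather than a database redesign, which is exactly the property that makes RDF-backed data lakes suited to diverse, changing CI data.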
A truly revolutionary product was launched in 1997, one that would allow people anywhere on the planet to pick up a phone and make a call. Iridium’s satellite constellation communication network launched to great fanfare as a possible replacement for land-based cell towers, but while this Motorola-backed startup got the technology right (the constellation is still in use today), it failed to assess the market correctly and filed for bankruptcy shortly after launch. Iridium’s executives did not rigorously canvass the market for their product, leading to a catastrophic rollout of a technologically innovative product. Had Iridium done its homework, it might have discovered that its cost structure could not be supported by the small customer base willing to pay for its product. Instead, its research failed to provide objective insight into its market, and the result speaks for itself.
Conducting a truly objective study of the market before hitting the launch button could have avoided this outcome. Testing central assumptions, validating customer preferences and prioritizing feature sets are mission critical objectives for any B2B product development team going through the new product development (NPD) process. From a new office chair to a constellation of satellites, getting the product and the market right are prerequisites for a successful product launch. Well designed and executed customer research allows companies to hit the mark with their NPD efforts.
Validate central assumptions early on with qualitative customer insights
Early stage product development is generally about testing out new ideas and central assumptions about the market. Just like hockey players are taught to skate to where the puck will be, rather than where it is, product developers and managers need to position their products to meet the future needs of customers, not simply fulfill their wishes today. Isolating exactly where there is room for improvement and how you can position your future product among existing competitors is where customer insights come in handy.
Early stage product development is a critical juncture. This is the time when companies need to place big bets or fold altogether. Market research that collects firsthand insight from your customer base can help you validate or invalidate the central assumptions your team holds about its market. When undertaking a major product launch or overhaul, most managers say the more information the better. However, it is having information that is objective and sourced from a variety of inputs that can make or break a product launch.
For products in the early stages of development, a qualitative approach is generally preferred over a quantitative approach since you will likely be sourcing insights from a smaller group of customers. For this reason, conducting more open-ended qualitative interviews or focus groups is a good choice. Reaching out to existing customers using similar products for market research also serves as a great way to build an audience for your product, especially in niche markets.
Later stage research helps nail down feature sets, customer preferences and messaging
For products further along in the development cycle, market research can help teams optimize the value of products that have already been approved for development. To paint a complete picture of your customer base, gather customer insights from a variety of sources, both qualitative and quantitative. While identifying or validating customer preferences is the main objective in the earlier stages, once a product has moved further along in the development process it becomes important to hammer out which specific feature sets get included in the final version.
Surveys yield quantitative insights that can be especially useful for developers working on products with a large pool of potential customers. Quantitative methodologies are especially helpful for ranking the importance of features against each other, in relative terms, or assessing each feature on its own, in absolute terms. Importance ranking forces participants to rank certain features or preferences higher than others (rank these five features in order of importance to you), whereas importance rating judges which features or preferences are important in an absolute sense (rate the importance of these five features on a scale of 1 to 5). Both methodologies can yield useful insights, but each must be applied deliberately in the appropriate context.
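The distinction can be made concrete with a small worked example. The sketch below computes mean ranks and mean ratings over the same five hypothetical features; all feature names and respondent data are invented.

```python
# Two respondents, two question formats over the same five features.

# Importance ranking: each respondent orders features 1 (most important)
# to 5 (least). Forced trade-offs mean no ties within one respondent.
rank_responses = [
    {"price": 1, "support": 2, "integrations": 3, "reporting": 4, "mobile": 5},
    {"support": 1, "price": 2, "reporting": 3, "integrations": 4, "mobile": 5},
]

# Importance rating: each feature rated independently on a 1-5 scale,
# so several features can all score high at once.
rating_responses = [
    {"price": 5, "support": 5, "integrations": 4, "reporting": 3, "mobile": 2},
    {"price": 4, "support": 5, "integrations": 4, "reporting": 4, "mobile": 3},
]

features = ["price", "support", "integrations", "reporting", "mobile"]

# Lower mean rank = more important (relative view).
mean_rank = {f: sum(r[f] for r in rank_responses) / len(rank_responses)
             for f in features}

# Higher mean rating = more important (absolute view).
mean_rating = {f: sum(r[f] for r in rating_responses) / len(rating_responses)
               for f in features}

by_rank = sorted(features, key=lambda f: mean_rank[f])
by_rating = sorted(features, key=lambda f: -mean_rating[f])
print("Relative (ranking) order:", by_rank)
print("Absolute (rating) order: ", by_rating)
```

Note how the rating data can report that both price and support are "very important" in absolute terms, while the ranking data forces respondents to break that tie, which is exactly why the two formats suit different decisions.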
For markets with a smaller customer base, such as niche B2B markets, a methodology which elicits more detailed insight from a small group of people is generally more productive. Collecting insight through customer interviews and focus groups works well for this purpose. One important consideration with qualitative interviews or focus groups, however, is sampling. Choosing a diverse set of sample participants while keeping costs low is a challenge, but one that should not be overlooked. Customer insights extracted from interviews or focus groups should ideally conform to your understanding of the market, but in cases where there is inconsistency, a deeper look may be warranted.
Present and share your data effectively for maximum impact
Even the most well-thought-out study can turn out to be meaningless if the results are not communicated effectively within your organization. Establish communication lines to ensure product development team members and senior management see the research. Invite as many people as you can to view survey results or interview transcripts. Moving information across boundaries between functional silos can be difficult in some organizations, but if customer research gets lost in the shuffle of interdepartmental communication, it cannot add value to products.
Debrief sessions with senior management are a great way to share impactful insight gleaned from customer research. Discussing research findings in this way should highlight what was found in the course of research, how the new findings should be applied to the product development process and also identify new possibilities for further research. To take full advantage of debrief opportunities, engage your audience with your presentations or documentation. Rather than dumping your data into a debrief document, consider distilling customer insights into customer personas with similar characteristics. Personas give your data a memorable face and provide a common language for your team to discuss it, making it much easier to comprehend and share throughout the organization.
Customer insights are a linchpin of the NPD process. Without a thoughtful approach to customer research, all the hard work that goes into developing a new product could go to waste. Had Iridium’s executive team insisted on investing in rigorous, high quality and objective research to inform their product planning and strategy, they could have averted their disastrous launch and possibly changed the way we communicate with each other.
The transformative effects of e-commerce and the opportunities it presents across various industries and markets have been well documented. For most companies, developing an e-commerce strategy is no longer merely a consideration or an experiment, but integral to their business. Compared to many other industries, however, the automotive aftermarket’s e-commerce activity is less prominent. This is largely due to the complex nature of the industry’s products, supply chains, and channel relationships. Innovative players who can overcome these challenges have the potential to capitalize on a growing but underdeveloped service that could disrupt the way this industry conducts business.
As technological innovation continues to evolve at breakneck speed, disruptive changes can emerge swiftly and unexpectedly. These technological advancements both contribute to and are shaped by the ongoing data revolution, enabling companies to interact with customers and competitors in real time.
The field of market intelligence has been a clear witness to this synergy. Market intelligence operates by mining relevant data, and then analyzing and curating it to deliver particular insights. These include industry profiles, sales forecasts, customer trends, competitor developments, and the potential for expansion.
The development of intelligent tools, such as artificial intelligence (AI) and crowdsourcing, to efficiently analyze unprecedented amounts of data is thus poised to revolutionize the way market intelligence functions. Failure to sustainably incorporate such strategies can result in companies being overtaken by forward-looking competitors.
As the competitive landscape in the global economy shifts from west to east, established multinational corporations (MNCs) and start-ups alike must gain a competitive edge in order to remain relevant. While there are a number of areas in which spending on internal organization and management can generate a large return on investment (ROI), HR stands out among them as an area with tremendous potential for improvement.
Benchmarking is a tool which elite organizations can use to learn what works within their business, and to gauge which activities can become more efficient based on industry best practices. Applied to HR, benchmarking can help businesses achieve high quality leadership development and in turn ensure a steady stream of talent. In this way, organizations can gain a competitive advantage in their hiring and leadership development processes to get an edge over peers.
Long promoted by academics and management gurus alike, benchmarking has become an indispensable tool in any firm’s workbench. Especially in the increasingly competitive global marketplace caused by the shift in consumer demand and corporate influence from the developed to the emerging world, thorough benchmarking services have become essential in identifying an organization’s core strengths and weaknesses, and where it stands relative to its competitors.
Customized benchmarking is a powerful tool that best-in-class organizations can utilize to stay ahead of their competition. It is the indispensable process of comparing organizational structure and design, operational performance and efficiency, compensation levels and headcount, and business processes and tools to learn what works, and what can be done better based on industry best practices. In this way, companies can obtain the competitive intelligence they need to gain an edge over their close peers and drastically improve operational performance, all while enjoying a hefty ROI on their initial outlay for the service itself.
Expected to grow three times faster than the world’s developed countries, emerging markets are fast becoming the engines of global economic growth. Brazil is among the biggest emerging markets with the greatest market potential.
On the one hand, Brazil’s abundant human, mineral, and agricultural resources, and high level of internet usage are serving as a magnet for foreign direct investment (FDI); on the other, a number of problems with its infrastructure, regulatory framework and labor force have presented roadblocks for foreign ventures. In this article, we look at some of the challenges unique to Brazil, and discuss how they can be solved with well conducted market intelligence (MI).
By Daniel Wagner
Many of President Obama’s critics are characterizing him as a bumbling idiot and a traitor for having broken the 55-year-old ice between the U.S. and Cuba. If they had their way, Cuba’s people would remain imprisoned by U.S. sanctions for another half century, long after the Castro brothers are gone. These are the same critics who continue to refer to China as “Red China”, would like to ensure that a new Cold War endures with Russia, and would be very pleased if the U.S. were to bomb Iran. They just don’t get it. The Cold War as they’ve known it and would like to see it has been gone for more than 20 years. More to the point, if the U.S. doesn’t think more strategically about its foreign policy, as Obama has in this case, it risks making itself a relic of the Cold War on the global chess board.
By Daniel Wagner
Many common perceptions foreign investors have about Brazil are misplaced. By all rights, given its size, location, and natural resource base, Brazil should be an economic juggernaut. But the truth is that Brazil should never have been designated a BRIC because it is a poorly managed economy that has rarely lived up to its potential. In 2001, when the country was designated a BRIC, it was $200 billion in debt and spent 38% of its GDP servicing that debt. The following year Brazil took out a $40 billion loan from the IMF. The country’s average GDP growth rate since 2000 has been under 2%, and Brazil has consistently underperformed its BRIC counterparts. So, what’s all the fuss about?