
As part of our largest-ever research study on business analytics, which surveyed more than 2,600 organizations to assess the maturity and competency of analytics in business units, IT and vertical industries, we looked at how IT applies analytics to support its own business activities. One of the things we found is that, charged with enabling business units to use information systems as effectively as possible, the IT department, like the shoemaker’s barefoot children in the old tale, typically stands last in line for resources to manage its own performance. In trying to understand and tune the collection of networking and operating systems, middleware and applications an enterprise needs to operate, IT professionals usually have to make do with small sets of historical data stored in spreadsheets and in data warehouses and marts that are not as well managed as the systems they maintain to support the business. In most cases IT cannot apply the same level of analytics to its own operations that it provides to business units. This also has effects beyond IT itself: To the extent that the result is subpar performance of core information systems, the business will suffer.

To break out of this frustrating cycle, IT needs to make the rest of the organization aware of the role it actually performs, of course, and it needs metrics and measurements, which in turn require analytics to standardize and generate them routinely. IT needs to be able to analyze both historical and real-time events involving data and processes so managers can determine the right level of automation and efficiency to demand from the technology. And IT needs the capability delivered by predictive analytics to anticipate situations and outcomes so it can prepare properly for them. In short, the CIO and IT staff need to manage their portfolio as a business asset, not merely a collection of technologies.

Metrics about its own operations and systems also enable IT to determine priorities for improvement. To fully understand the state of their existing investments and processes, IT organizations should not just measure them but analyze them to develop insights into the future outcomes of their systems. This more sophisticated approach to analytics can help IT determine where to focus resources and what to do with legacy systems. With that knowledge, IT can prioritize precious budget dollars and justify investments more convincingly.

Our research found that IT’s concerns currently center on cost and operational efficiency. The most important financial metrics are return on investment, cost per project, budget utilization and adherence to budget. The most important process metrics address timeliness in IT’s core function of service to the business: delivery of projects on time, speed of technology implementation and help desk response time.

In our research, which we presented in a webinar on IT analytics, two metrics loomed large in participants’ perceptions of what is most important to executives and managers: business user satisfaction and compliance with service level agreements (SLAs). The executives themselves rated the two metrics nearly equal in importance, but their direct reports (vice presidents) most often named, by a slight margin, adherence to governance and risk management requirements instead. These responses suggest that people may work somewhat at cross-purposes in pursuing IT analytics.

The research also strongly suggests that organizations ought to involve more people in the process of establishing requirements for defining analytics. Research participants asserted overwhelmingly that they and the head of their own business unit are involved in establishing requirements important to their jobs, but the percentages drop for heads of, and business analysts in, other business units. This disparity takes on more weight when we recall that business user satisfaction and SLA compliance are important metrics for leaders.

For analytics to deliver value, they must be available to those who need them; the research shows that this is an issue for many organizations. No more than half have analytics generally available to address any of seven major IT management tasks, and only for budget analysis are analytics completely available in even one-fourth of organizations. In a related finding, more than half said it is very important to make it simpler to provide analytics and metrics; less than 10 percent said that is only somewhat important or not important. As well, over a third said they can significantly improve their use of analytics and performance indicators, and over a third are not satisfied with the process currently used to create analytics.

The process of applying analytics also affects IT’s effectiveness. The IT Analytics benchmark research found that users in nearly two-thirds of all organizations spend most of their time in unproductive chores that precede analyzing their data: preparing it for analysis, reviewing it for quality and consistency, and waiting for it. Before that, collecting the data raises another roadblock: in more than half of organizations it is very difficult or challenging enough to impede the creation of metrics and performance indicators.
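To give a sense of what that preparatory work looks like in practice, here is a minimal sketch in Python using the pandas library; it is my own illustration, not a tool or data set the research prescribes. The table of help desk tickets and its column names are hypothetical. The sketch performs the kind of quality-and-consistency review described above, which has to finish before any metric can be trusted.

import pandas as pd

# Hypothetical extract of help desk tickets pulled from an operational system.
tickets = pd.DataFrame({
    "ticket_id": [101, 102, 102, 104],
    "opened": ["2011-03-01", "2011-03-02", "2011-03-02", "not recorded"],
    "hours_to_resolve": [4.5, None, None, 2.0],
})

# Quality review: missing values, duplicate keys and unparseable dates
# all must be found and resolved before the data can feed any metric.
print(tickets.isna().sum())                  # count missing values per column
print(tickets.duplicated("ticket_id").sum()) # count duplicate ticket ids
opened = pd.to_datetime(tickets["opened"], errors="coerce")
print(opened.isna().sum())                   # dates that failed to parse

# Consistency fix-ups: drop duplicates and keep only fully parseable records.
clean = tickets.drop_duplicates("ticket_id").assign(opened=opened).dropna()

Even this toy extract needs three separate checks and two repairs before analysis can begin, which is consistent with the finding that preparation and review consume most of users’ time.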

These functional barriers also can get in the way of analysts performing important tasks. Among capabilities they need in order to work effectively with analytics and metrics, 42 percent said access to source data is the most important, and at least one-third identified as most important the abilities to search for existing data and analytics, to take action based on analytics and to design and maintain both business models and metrics for analytics. Applying predictive analytics to project future outcomes, a hallmark of advanced maturity in the use of IT analytics, was cited by 31 percent.

IT professionals need appropriate tools to facilitate these and other analytics-related activities. In more than half of these organizations, business intelligence technologies for query, reporting and analysis are the most important of these tools. Yet even in this technologically astute environment, desktop spreadsheets are often used to generate analytics and are an important information source for building IT analytics. But spreadsheets require manual effort to populate with data and are prone to error, and thus are not appropriate for collaborative and enterprise-wide activities. We think their widespread use is a factor in half of organizations being only somewhat satisfied or not satisfied with their current technology for creating and applying analytics.

As part of our benchmark research methodology, Ventana Research has developed a model for assessing maturity that classifies organizations at four maturity levels (from bottom to top: Tactical, Advanced, Strategic and Innovative) in each of four categories: People, Process, Information and Technology. With respect to their use of and plans for IT analytics, our Maturity Index analysis found only 15 percent whose responses place them at the highest, Innovative level of maturity. One important finding bearing on organizations’ maturity is that two-thirds said the data used in preparing metrics and performance indicators is only somewhat accurate or somewhat inaccurate. As well, 35 percent of organizations take more than one week to provide updated metrics and performance indicators to the people who need them, and nearly as many take up to a week.

It is a positive sign that improvements, where planned, most often aim to improve business processes or decision-making rather than to gain operational efficiency and cost savings; the first two motivations are more likely to produce better business results. Similarly, maximizing IT effectiveness and improving the value of IT to business managers rank as more important than issues involving resources, costs and budget.

However, these opinions come from organizations that plan to change the way they generate and apply analytics in the next 12 to 18 months, and they comprise only 28 percent of the total; another 36 percent said changes are needed but are not currently a priority. The primary barriers to such an initiative are both fiscal (lack of resources and budget) and perceptual (lack of awareness and a sense that the business case is not strong enough). Recognizing a problem but not being willing or able to remedy it is another sign of immaturity.

To maximize its value, IT should use analytics and metrics to help set its own goals and objectives and to ensure they serve the business strategies of the organization. This innovative path means embracing IT performance management. Few organizations have taken the necessary steps to actually manage performance and to align, optimize and understand the full range of their IT processes and resources. We believe, and this benchmark research confirms, that it is time for them to take those steps, with executive management supporting them with resources.

Regards,

David Menninger – VP & Research Director

In various forms, business intelligence (BI) – as queries, reporting, dashboards and online analytical processing (OLAP) – is being used ever more widely. And as basic BI capabilities spread to more organizations, innovative ones increasingly are exploring how to take advantage of the next step in the strategic use of BI: predictive analytics. The trend in Web searches for the phrase “predictive analytics” gives one indication of the rising interest in this area. From 2004 through 2008 the number of such searches was relatively steady; beginning in 2009 it rose significantly and has continued to rise.

While a market for products that deliver predictive analytics has existed for many years and enterprises in a few vertical industries and specific lines of business have been willing to pay large sums to acquire those capabilities, this constitutes but a fraction of the potential user base. Predictive analytics are nowhere near the top of the list of requirements for business intelligence implementations. In our recent benchmark research on business analytics, participants from more than 2,600 organizations ranked predictive analytics 10th among technologies they use today to generate analytics; only one in eight companies use them. Planning and forecasting fared only slightly better, ranking sixth with use by 26 percent. Yet in other research, 80 percent of organizations indicated that applying predictive analytics is important or very important.

For the untapped majority, technology has now advanced to a stage at which it is feasible to harness the potential of predictive analytics, and the value of these advanced analytics is increasingly clear. Like BI, predictive analytics can appear in several guises, including predictive statistics, forecasting, simulation, optimization and data mining. Among lines of business, contact centers and supply chains are most interested in predictive analytics. By industry, telecommunications, medicine and financial services are among the heavier users of these analytics, but even there no more than 20 percent now employ them. Ironically, our recent research on analytics shows that finance departments are the line of business least likely to use predictive analytics, even though they could find wide applicability for them.

Predictive analytics, defined broadly, have a checkered history. Some software vendors – including Angoss, SAS and SPSS, which is now part of IBM – have long offered tools for statistics, forecasting and data mining. I was part of the team at Oracle that acquired data mining technology for the Oracle database a dozen years ago. Companies like KXEN sprang up around that time, perhaps hoping to capitalize on what appeared to be a growing market. The open source statistics project known simply as R has been around for more than a decade, and some BI vendors have embedded it into their products, as Information Builders has done with WebFOCUS RStat.

But then things quieted down for almost a decade as the large vendors focused on consolidating BI capabilities into a single platform. Around 2007, interest began to rise again, as BI vendors looked for other ways to differentiate their product offerings. Information Builders began touting its successes in predictive analytics with a customer case study of how the Richmond, Va., police department was using the technology to fight crime. In 2008 Tibco acquired Insightful and Netezza acquired NuTech, both to gain more advanced analytics. And in 2009 IBM acquired SPSS.

As well as these combinations, there are signs that more vendors see a market in supplying advanced analytics. Several startups have focused their attention on predictive analytics, including some, such as Predixion, with offerings that run in the cloud. Database companies have been pushing to add more advanced analytics to their products, some by forming partnerships with or acquiring advanced analytics vendors and others by building their own analytics frameworks. The trend has even been noted by the mainstream media: For example, in 2009 an article in The New York Times suggested that “data mining has entered a golden age, whether being used to set ad prices, find new drugs more quickly or fine-tune financial models.”

Thus the predictive analytics market seems to be coming to life just as need intensifies. The huge volumes of data processed by major websites and online retailers, to name just two types, cannot be moved easily or quickly to a separate system for more advanced analyses. The Apache Hadoop project has been extended with a project called Mahout, a machine learning library that supports large data sets. Simultaneously, data warehouse vendors have been adding more analytics to their databases. Other vendors are touting in-memory systems as the way to deliver advanced analytics on large-scale data. It is possible that this marriage of analytics with large-scale data will cause a surge in the use of predictive analytics – unless other challenges prevent more widespread use of these capabilities.

One is the inherent complexity of the analytics themselves. They employ arcane algorithms and often require knowledge of sampling techniques to avoid bias in the analyses. Putting such analytics to work involves creating models, deploying them and managing them to understand when a model has become “stale” or outdated and ought to be revised or replaced. It should be obvious that only the most technically advanced users will have (or desire) any familiarity with this complexity; to achieve broad adoption, either there must be more highly trained users (which is unlikely) or vendors must mask the complexity and make their tools easier to use.
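To make that lifecycle concrete, here is a minimal sketch in Python using the open source scikit-learn library – my own illustration, not any vendor’s method. It trains a model, records its accuracy at deployment and flags the model as stale when accuracy on recent data drifts below that baseline; the data, names and tolerance threshold are all assumptions for the example.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))           # stand-in historical data
y = (X[:, 0] + X[:, 1] > 0).astype(int)  # stand-in outcome to predict

# Create and deploy the model, recording its accuracy at deployment time.
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)
baseline = accuracy_score(y_test, model.predict(X_test))

def is_stale(model, X_recent, y_recent, baseline, tolerance=0.05):
    # Flag the model for revision if accuracy on recent data has drifted
    # more than `tolerance` below the baseline measured at deployment.
    recent = accuracy_score(y_recent, model.predict(X_recent))
    return recent < baseline - tolerance

# Simulate drift: the relationship in newly arriving data has changed.
X_new = rng.normal(size=(200, 4))
y_new = (X_new[:, 0] - X_new[:, 1] > 0).astype(int)
if is_stale(model, X_new, y_new, baseline):
    model = LogisticRegression().fit(X_new, y_new)  # revise or replace it

A production system would also watch input distributions and score drift, but even this simple accuracy check illustrates the ongoing management burden described above and why vendors need to mask it from ordinary users.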

It also is possible that a lack of understanding of the business value these analyses provide is holding back adoption. Predictive analytics have been employed to advantage in certain segments of the market; other segments can learn from those deployments, but it is up to vendors to make the benefits and examples more widely known. In an upcoming benchmark research study I’ll be investigating what is necessary to deploy these techniques more broadly and looking for best practices from successful implementations.

Regards,

David Menninger – VP & Research Director
