
InetSoft is a business intelligence vendor that is not well-known but has more than 3,000 customers. Why do you need to know about another BI vendor? As I’ve written in the past, there’s a place in this market for both the megavendors and smaller vendors. InetSoft, one of the latter, has developed a broad set of capabilities over the years that have resonated with its customers. It recently announced and brought to market a significant new release, Style Intelligence 11.   

InetSoft’s BI capabilities include dashboards, visualization, enterprise reporting and access to a variety of data sources. While previous releases delivered these capabilities in separate products, Style Intelligence 11 integrates them all into a single product. InetSoft has also architected its components to make them reusable and sharable. The product line comprises Style Report for reporting; Style Scope for dashboarding and visualization; and Style Intelligence, which includes all these capabilities plus advanced data source connectors for OLAP and ERP access. In addition, Style Studio integrates these capabilities into one framework for rapid assembly of analytics and information to meet specific application needs.

The reporting capabilities in Style Report include drag-and-drop report creation of pixel-perfect production reports as well as interactive reports and ad-hoc reporting. Report layout is flexible and can be made interactive with scripting of business logic via JavaScript that is linked to different events and elements within the report. A library of over 30 different chart types provides a starting point for many common types of reports. Reports can be viewed interactively in real time or as self-contained extracts of data, which is useful when working offline. The product also includes scheduling, bursting, clustered reporting and auditing, features often found only in high-end products. 

InetSoft’s dashboarding with Style Scope includes interactive visualization and data exploration capabilities as well as more traditional portal-style displays using graphical objects such as tables, charts, gauges and maps. Visualization includes a variety of interactive multidimensional chart types as well as brushing to highlight selected data elements across multiple related views. The dashboard components can be used for both input and output: in addition to displaying data, dashboards can gather it, enabling what-if analysis. Email alerts and notifications can be generated, including dashboard snapshots that enable offline viewing; this overcomes one of the common downsides of dashboard-based applications.

The InetSoft architecture reflects the maturity of the product gained over 15 years of development. Several performance and scalability features are built into the product, such as connection pooling to minimize database resource requirements and clustered deployments for increased scalability. The data access architecture, referred to as Data Blocks, provides data mash-up capabilities for rapid prototyping and self-service data integration. An intelligent caching scheme provides enhanced performance when accessing remote data. InetSoft also provides built-in connectivity to a variety of data sources including relational databases, OLAP cubes and applications from salesforce.com, SAP, PeopleSoft, JD Edwards and Siebel. In addition, InetSoft incorporates a security model that provides fine-grained access down to the cell level for users, roles and groups. It also supports single sign-on to leverage existing security models and makes it easier for authorized end users to access the system. 
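Connection pooling is worth a brief illustration. The idea, in any BI server, is to reuse a fixed set of database connections across requests rather than paying the cost of opening a new one each time. The sketch below is a generic, minimal pool in Python; it is not InetSoft's implementation, and the class and method names are invented for illustration.

```python
import queue

class ConnectionPool:
    """Minimal connection pool: hand out and reclaim a fixed set of
    connections instead of opening a new one per request."""

    def __init__(self, factory, size=4):
        # Open all connections up front; the queue tracks the idle ones.
        self._idle = queue.Queue(maxsize=size)
        for _ in range(size):
            self._idle.put(factory())

    def acquire(self):
        # Blocks if every connection is currently in use.
        return self._idle.get()

    def release(self, conn):
        # Return a connection to the pool for reuse.
        self._idle.put(conn)

# Usage sketch with a stand-in factory (a real one might call
# sqlite3.connect or a JDBC/ODBC driver):
pool = ConnectionPool(factory=object, size=2)
conn = pool.acquire()
# ... run queries ...
pool.release(conn)
```

The payoff is that database resource requirements stay bounded by the pool size no matter how many concurrent report or dashboard requests arrive.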

To broaden its market further, InetSoft needs to make additional investments in mobile capabilities. The current product, although browser-based, has no specific mobile capabilities and uses Adobe Flash technology, which makes it difficult to support Apple iOS devices such as the iPad and iPhone, now emerging as the most important mobile platforms, though it could operate on other smartphones and tablets running software from Android, HP and RIM. OEMs constitute about half of InetSoft’s business today, based in part on how easily this J2EE application can be embedded in other software systems. And while the product appears ready for the cloud with multitenancy and clustering capabilities, there is no cloud version yet; to be fair, most BI vendors do not offer one either. I suspect the OEM market might begin to demand both cloud and mobile capabilities, as identified in our recent business analytics research.

All in all, InetSoft offers a broad set of business intelligence features in a well-integrated, lightweight architecture. Its support for rapid assembly of analytics and information is what our research into information applications and business intelligence has found to be critical for business analysts facing pressing needs to publish information. This application-assembly approach is not found in most BI vendors’ products today and is a key differentiator that helps organizations deliver business analytics in a simple and usable form. InetSoft isn’t the biggest or most established vendor in the market, but it does have a product proven by thousands of customers. For those of you looking for an alternative for your BI needs, InetSoft may be worth considering and is easy to evaluate.

Regards, 

David Menninger – VP & Research Director

In various forms, business intelligence (BI) – as queries, reporting, dashboards and online analytical processing (OLAP) – is being used increasingly widely. And as basic BI capabilities spread to more organizations, innovative ones increasingly are exploring how to take advantage of the next step in the strategic use of BI: predictive analytics. The trend in Web searches for the phrase “predictive analytics” gives one indication of the rise in interest in this area. From 2004 through 2008, the number of Web searches was relatively steady. Beginning in 2009, it rose significantly and has continued to rise.

While a market for products that deliver predictive analytics has existed for many years and enterprises in a few vertical industries and specific lines of business have been willing to pay large sums to acquire those capabilities, this constitutes but a fraction of the potential user base. Predictive analytics are nowhere near the top of the list of requirements for business intelligence implementations. In our recent benchmark research on business analytics, participants from more than 2,600 organizations ranked predictive analytics 10th among technologies they use today to generate analytics; only one in eight companies use them. Planning and forecasting fared only slightly better, ranking sixth with use by 26 percent. Yet in other research, 80 percent of organizations indicated that applying predictive analytics is important or very important.

For the untapped majority, technology has now advanced to a stage at which it is feasible to harness the potential of predictive analytics, and the value of these advanced analytics is increasingly clear. Like BI, predictive analytics can appear in several forms, including predictive statistics, forecasting, simulation, optimization and data mining. Among lines of business, contact centers and supply chains are most interested in predictive analytics. By industry, telecommunications, medicine and financial services are among the heavier users of these analytics, but even there no more than 20 percent now employ them. Ironically, our recent research on analytics shows that finance departments are the line of business least likely to use predictive analytics, even though they could find wide applicability for them.
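To make one of these forms concrete, forecasting in its simplest guise can be as little as a few lines of code. The sketch below applies simple exponential smoothing, a standard baseline technique, to a made-up demand series; the data and smoothing factor are illustrative and not tied to any product discussed here.

```python
def exponential_smoothing(series, alpha=0.5):
    """One-step-ahead forecasts via simple exponential smoothing.
    Each forecast blends the latest observation with the prior forecast;
    alpha in (0, 1] controls how heavily recent data is weighted."""
    forecast = series[0]            # seed with the first observation
    forecasts = [forecast]
    for value in series[1:]:
        forecast = alpha * value + (1 - alpha) * forecast
        forecasts.append(forecast)
    return forecasts

# Made-up monthly demand figures, purely for illustration:
demand = [100, 110, 105, 115, 120]
print(exponential_smoothing(demand, alpha=0.5))
# → [100, 105.0, 105.0, 110.0, 115.0]
```

Production forecasting tools layer trend, seasonality and confidence intervals on top of this kind of core, which is part of why masking complexity matters so much for adoption.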

Predictive analytics, defined broadly, have a checkered history. Some software vendors have long offered tools for statistics, forecasting and data mining, including Angoss, SAS and SPSS, which is now part of IBM. I was part of the team at Oracle that acquired data mining technology for the Oracle database a dozen years ago. Companies like KXEN sprang up around that time, perhaps hoping to capitalize on what appeared to be a growing market. The open source statistics project known simply as R has been around for more than a decade, and some BI vendors have embedded it in their products; Information Builders, for example, offers WebFOCUS RStat.

But then things quieted down for almost a decade as the large vendors focused on consolidating BI capabilities into a single platform. Around 2007, interest began to rise again, as BI vendors looked for other ways to differentiate their product offerings. Information Builders began touting its successes in predictive analytics with a customer case study of how the Richmond, Va., police department was using the technology to fight crime. In 2008 Tibco acquired Insightful and Netezza acquired NuTech, both to gain more advanced analytics. And in 2009 IBM acquired SPSS.

As well as these combinations, there are signs that more vendors see a market in supplying advanced analytics. Several startups have focused their attention on predictive analytics, including some such as Predixion with offerings that run in the cloud. Database companies have been pushing to add more advanced analytics to their products, some by forming partnerships with or acquiring advanced analytics vendors and others by building their own analytics frameworks. The trend has even been noted by mainstream media: For example, in 2009 an article in The New York Times suggested that “data mining has entered a golden age, whether being used to set ad prices, find new drugs more quickly or fine-tune financial models.”

Thus the predictive analytics market seems to be coming to life just as need intensifies. The huge volumes of data processed by major websites and online retailers, to name just two types, cannot be moved easily or quickly to a separate system for more advanced analyses. The Apache Hadoop project has been extended with a project called Mahout, a machine learning library that supports large data sets. Simultaneously, data warehouse vendors have been adding more analytics to their databases. Other vendors are touting in-memory systems as the way to deliver advanced analytics on large-scale data. It is possible that this marriage of analytics with large-scale data will cause a surge in the use of predictive analytics – unless other challenges prevent more widespread use of these capabilities.
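The "cannot be moved easily" point is the crux of in-database and in-cluster analytics: rather than exporting the data, compute statistics incrementally where the data lives, in a single pass. A minimal illustration of that style (not tied to Hadoop, Mahout or any vendor mentioned here) is Welford's one-pass algorithm for mean and variance:

```python
def running_mean_variance(stream):
    """Welford's one-pass algorithm: compute mean and population variance
    from a stream without holding the full data set in memory."""
    count, mean, m2 = 0, 0.0, 0.0
    for x in stream:
        count += 1
        delta = x - mean
        mean += delta / count          # update the running mean
        m2 += delta * (x - mean)       # accumulate squared deviations
    variance = m2 / count if count else 0.0
    return mean, variance

mean, variance = running_mean_variance([2, 4, 4, 4, 5, 5, 7, 9])
# mean ≈ 5.0, variance ≈ 4.0
```

Because the state is just three numbers, the same pattern scales from a laptop to a partitioned cluster, which is the property that makes analytics-near-the-data attractive for the data volumes described above.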

One such challenge is the inherent complexity of the analytics themselves. They employ arcane algorithms and often require knowledge of sampling techniques to avoid bias in the analyses. Developing such analytics involves creating models, deploying them and managing them to understand when a model has become “stale” or outdated and ought to be revised or replaced. It should be obvious that only the most technically advanced users will have (or desire) any familiarity with this complexity; to achieve broad adoption, either there must be more highly trained users (which is unlikely) or vendors must mask the complexity and make their tools easier to use.
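To give a hedged sense of what managing a deployed model can mean in practice, one common pattern is to track the model's recent prediction error and flag it as stale when that error drifts well above the level seen at training time. The class below is a sketch of that idea only; the window size, tolerance and names are illustrative choices, not a standard.

```python
from collections import deque

class StalenessMonitor:
    """Flag a deployed model as stale when its recent average prediction
    error drifts well above the error measured at training time."""

    def __init__(self, baseline_error, window=50, tolerance=1.5):
        self.baseline = baseline_error       # error observed at training time
        self.tolerance = tolerance           # allowed ratio over the baseline
        self.recent = deque(maxlen=window)   # rolling window of live errors

    def record(self, predicted, actual):
        # Store the absolute error of one live prediction.
        self.recent.append(abs(predicted - actual))

    def is_stale(self):
        if len(self.recent) < self.recent.maxlen:
            return False                     # not enough evidence yet
        avg = sum(self.recent) / len(self.recent)
        return avg > self.tolerance * self.baseline
```

A scheduler could call `is_stale()` periodically and trigger a retraining job; masking exactly this kind of bookkeeping is one way vendors can hide the complexity from end users.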

It also is possible that a lack of understanding of the business value these analyses provide is holding back adoption. Predictive analytics have been employed to advantage in certain segments of the market; other segments can learn from those deployments, but it is up to vendors to make the benefits and examples more widely known. In upcoming benchmark research I’ll be investigating what is necessary to deploy these techniques more broadly and looking for best practices from successful implementations.

Regards,

David Menninger – VP & Research Director
