
At Informatica’s recent industry analyst summit, Chris Boorman, the company’s chief marketing officer, opened the event by describing Informatica as expanding beyond its core data integration offering into a broader market. He compared this growth to Amazon’s expansion from online bookseller to provider of computing resources via Amazon Web Services. I see it almost the opposite way: Informatica has always been in the data integration business, and it has excelled at making this area of IT more relevant and applicable to broader audiences. My colleague described its latest efforts to focus on line-of-business users in a recent post. My purpose here is to review some of the highlights of the company’s latest product releases.

The focus of the soon-to-be-released version of its flagship product, Informatica 9.1, is to consolidate a range of offerings into a single platform. This consolidation includes the master data management (MDM) products acquired from Siperian during 2010, which we commented on at the time. We’ve seen similar efforts from other data integration vendors, for example the consolidation of the IBM InfoSphere platform.

Informatica 9.1 includes a laundry list of information management capabilities, starting with data integration and continuing with quality, profiling, governance, MDM, federation and virtualization, life-cycle management, event processing and low-latency messaging. So Informatica can be a one-stop shop for a complete portfolio of data integration tools. It also has aggressively embraced cloud computing, providing access to cloud-based data sources and offering its products as services in the cloud. As more enterprise data resides in the cloud, a hybrid configuration likely will become more appealing. We’ll have more data on this subject soon, as we are beginning our Business Data and the Cloud benchmark research program.

It’s impossible to cover a product line as broad as Informatica’s in a single blog post. For example, this vendor not only provides core data integration, data quality and data profiling capabilities but also has a dedicated product for information life-cycle management that includes data archiving and test-data management. It offers data virtualization and federation capabilities via what it calls data services. And the list goes on. For the moment then, I’ll point out what caught my attention, setting aside Informatica’s impressive ability to execute and continue to expand its presence in the market.

Informatica has invested heavily in the technology acquired from Siperian. As a result, MDM is now a key part of Informatica’s platform – in fact, MDM seems to be everywhere in it. While many vendors have limited or specialized their MDM offerings to customer-related data domains, Informatica’s works well across multiple domains. At the analyst conference presenters shared MDM examples ranging from traditional customer data to aircraft engine specifications to seed catalogs. (However, despite the support for many domains, Informatica does not provide a prebuilt product information management (PIM) application such as those covered in our PIM Value Index for 2011, though it supports product data integration.) Informatica’s MDM approach provides both repository-based and registry-based hub architectures via a single product, which ensures consistency between the two. In version 9 you can also change data models and rules while the system is running, producing an environment with no downtime. The service-oriented architecture (SOA) of Informatica also comes through in a variety of ways. Informatica created data controls to embed data quality and MDM capabilities in other applications, demonstrating at the conference Informatica functionality embedded within Excel and salesforce.com for true self-service capabilities. Presenters also demonstrated packaging of Informatica activities as a Web service – that goes beyond the needs of most end users, but a skilled technical resource could embed Informatica in virtually any application.
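To make the distinction between the two hub architectures concrete, here is a minimal sketch in Python. It is purely illustrative: the record shapes, the matching rule and the merge logic are my own assumptions, not Informatica’s actual implementation. A registry-style hub stores only cross-references back to the source systems, while a repository-style hub persists a consolidated “golden record.”

```python
# Hypothetical sketch contrasting registry-based and repository-based MDM hubs.
# Source systems, field names and the match rule are invented for illustration.

source_records = {
    ("CRM", "c-101"): {"name": "Acme Corp", "city": "Boston"},
    ("ERP", "e-555"): {"name": "ACME Corporation", "city": "Boston"},
}

def match_key(name):
    """Toy match rule: case-insensitive comparison on the first word of the name."""
    return name.lower().split()[0]

def build_registry(records):
    """Registry-style hub: keep only pointers (cross-references) to source records."""
    registry = {}
    for key, rec in records.items():
        registry.setdefault(match_key(rec["name"]), []).append(key)
    return registry

def build_repository(records, registry):
    """Repository-style hub: persist one consolidated golden record per entity."""
    golden = {}
    for entity, keys in registry.items():
        merged = {}
        for key in keys:
            merged.update(records[key])  # last source wins in this toy merge
        golden[entity] = merged
    return golden

registry = build_registry(source_records)
repository = build_repository(source_records, registry)
print(registry["acme"])    # cross-references to both source systems
print(repository["acme"])  # a single merged golden record
```

The practical trade-off the sketch hints at: a registry is lightweight and leaves data in place, while a repository gives consumers a single authoritative record; offering both from one product, as Informatica does, avoids maintaining two inconsistent match rules.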

On the other side of the equation, Informatica supports a variety of new data sources via its Data Services offering. These data services can be used for virtualization and federation similar to those of other vendors such as Composite Software or Denodo, but Informatica can also apply other features from its technology stack, such as data quality, to its data sources, in effect creating “data quality in flight.” If your organization relies on virtualization and federation, you can use these features to ensure that the quality of that data matches the standards you have set for the rest of your data. Data services also provide another powerful capability. When you define a logical data source, you can include a variety of other steps (such as data quality and data governance) in its definition. So even if you don’t plan to use virtualized or federated data sources, you should still look into data services as a way to associate all the necessary information management processes with the data sources you make available to users in your organization.
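The “data quality in flight” idea can be sketched in a few lines of Python. This is a conceptual illustration, not Informatica’s API: the point is that quality steps are attached to the logical data source itself, so every consumer reads cleansed rows at access time rather than after a separate batch cleanup.

```python
# Illustrative sketch of "data quality in flight": quality steps are part of the
# definition of a logical (virtual) data source, so rows are cleansed as they
# are read. Step names and row fields are assumptions made for this example.

raw_rows = [
    {"name": "  acme corp ", "country": "usa"},
    {"name": "Widgets Inc", "country": "USA"},
]

def trim_fields(row):
    """Quality step: strip stray whitespace from every string field."""
    return {k: v.strip() if isinstance(v, str) else v for k, v in row.items()}

def standardize_country(row):
    """Quality step: normalize country codes to upper case."""
    row = dict(row)
    row["country"] = row["country"].upper()
    return row

def logical_source(rows, quality_steps):
    """A virtual data source: yield rows with each quality step applied at read time."""
    for row in rows:
        for step in quality_steps:
            row = step(row)
        yield row

clean = list(logical_source(raw_rows, [trim_fields, standardize_country]))
print(clean[0])  # {'name': 'acme corp', 'country': 'USA'}
```

Because the steps live in the source definition, a report, a federated query and an export all see the same cleansed view without duplicating the quality logic.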

In event processing, Informatica has a bit of a split personality. It has both a complex event processing (CEP) engine acquired from Agent Logic and a low-latency messaging infrastructure acquired from 29 West. What the company refers to as CEP is, in my opinion, more similar to a rules engine than to a product like StreamBase – in fact, the product is called RulePoint. It is designed for complex rule structures but not necessarily for very low latency data such as stock quotes. That’s where the 29 West product, which we covered at the time of acquisition, comes in. Informatica Ultra Messaging provides a messaging infrastructure that can deal with very low latency data – measured in microseconds – but does not provide the same robust rule capabilities. I expect we’ll see more convergence between these two products over time if Informatica intends to be competitive in the CEP space and what we call Operational Intelligence. However, for most organizations that don’t need to process ultra low latency data, RulePoint is probably sufficient.
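To illustrate what a rules-engine style of event processing looks like in contrast to raw low-latency message delivery, here is a toy sketch. The rule syntax and event fields are invented for this example; RulePoint’s actual rule language is quite different, but the shape of the problem – declarative conditions evaluated against a stream of events – is the same.

```python
# Toy sketch of rules-engine-style event processing: declarative conditions
# evaluated over a stream of events. Rule and event structures are invented
# for illustration and are not RulePoint's actual rule language.

rules = [
    {"name": "large-trade", "when": lambda e: e.get("type") == "trade" and e.get("qty", 0) > 1000},
    {"name": "error-event", "when": lambda e: e.get("type") == "error"},
]

def process(events, rules):
    """Return (rule name, event) pairs for every rule each event satisfies."""
    hits = []
    for event in events:
        for rule in rules:
            if rule["when"](event):
                hits.append((rule["name"], event))
    return hits

events = [
    {"type": "trade", "qty": 5000},
    {"type": "heartbeat"},
    {"type": "error"},
]
print(process(events, rules))
```

The trade-off the post describes falls out naturally: evaluating arbitrary rule predicates per event costs time, which is fine for most operational scenarios but not for microsecond-scale feeds, where a thin messaging layer like Ultra Messaging is the better fit.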

Cloud computing continues to play a big role for Informatica. It bet early and bet big on the cloud, as mentioned in my previous blog post “Clouds are Raining Corporate Data”.

I arrived at the event wondering how Informatica could sustain its growth and remain so strongly competitive. I expected to hear of plans for acquisitions and expansion into other aspects of business intelligence or data warehousing. I left with the impression that Informatica not only has thrived in its core market but also has found ways to expand its product line and broaden its addressable market within the data integration segment.

Regards,

David Menninger – VP & Research Director

Cloud computing is having an increasingly large influence on the IT landscape. Whether you realize it or not, corporate data likely exists in or is migrating to locations outside the walls of your organization. Recent research by Ventana Research shows that in areas such as customer service, sales, and workforce or human capital management, software-as-a-service (SaaS) or cloud-based applications increasingly are being accepted and adopted. In our benchmark research on business intelligence and performance management, for example, only 53 percent of participants prefer their systems on-premises, and we expect that percentage to decline over the next 12 to 24 months, during which more than one-third of organizations plan to begin using cloud-based or SaaS applications.

However, cloud-based applications and services raise information management challenges that don’t necessarily exist in on-premises deployments. The new silos of applications and software that enable doing business “in the cloud” also are new corporate data repositories that must be integrated with other enterprise data and must be managed as a whole. Among the many challenges lurking inside the cloud are data accessibility, data consistency, data integration, data quality and data governance.

In many cases the advocates and buyers for using cloud-based services are line-of-business managers who see such solutions as addressing their immediate concerns for rapid deployment with minimal capital outlays, but these business folks may not be aware of the data challenges associated with moving to the cloud. For instance, as more and more data resides in applications managed by third parties, how will the organization bring it all together for analysis, reporting and other necessary uses? Without a capable data integration infrastructure, will users be forced to cut and paste data from reports or export it to spreadsheets, encountering the issues of consistency and accuracy that practice raises?

At last month’s salesforce.com Dreamforce event that my colleague assessed, I spent time examining some of the data integration alternatives available for cloud-based data. Informatica has been investing significant resources in a cloud-based product and now boasts over 1,000 customers using its cloud-based services. At Dreamforce, Informatica added two new products to this portfolio. At $99 a month, Cloud Express is the lowest-priced offering that includes support. This pricing is usage-based and includes up to 300,000 rows of data movement per month. Cloud Express includes scheduling (not available in Informatica’s free product) but is limited to salesforce.com data and does not include application integration features. At the other end of the spectrum, for $6,000 per month, Informatica’s enterprise version offers integration with its PowerCenter product and provides an environment for hybrid integration of cloud and on-premises data. Informatica now has five different cloud-based offerings and price points, which constitute a relatively complete product line.

Pervasive Software has also made a significant investment in cloud-based data integration services. Focused initially on small to midsize business opportunities and point-to-point integration, Pervasive has cloud-enabled its core product as Pervasive Data Integrator v10 Cloud Edition, which my colleague assessed earlier in the year. With 250 customers in production in the cloud and four years of experience working there, Pervasive is a serious contender in the cloud data integration market. Its DataRush technology provides highly parallel operations coupled with elasticity features that spread operations across multiple servers for better performance and throughput. Pervasive does not charge per connection with v10 in the cloud, which could be a significant differentiator for organizations that need to connect to many different data sources. But there are some limitations to be aware of: life-cycle management and data lineage features are not yet fully supported in the cloud.

Cloud data integration (like data integration in general) goes beyond traditional structured data. An operating unit of Information Builders, iWay Software not only integrates structured data but can also integrate data from salesforce.com’s collaboration technology Chatter with enterprise systems. My colleague covered iWay’s cloud offerings and their integration with Chatter last year. Cast Iron Systems, acquired by IBM last year, also offers integration with Chatter as well as other structured data sources.

A relatively small company, Boomi has made a name for itself in the cloud data integration space and as a result recently was acquired by Dell. Boomi uses a deployment architecture based on Java Virtual Machines (JVMs). The company calls these deployment units “atoms,” and because they are based on a JVM architecture, they can easily be deployed on-premises as well as in the cloud. These “atoms” can be run in parallel to enhance performance, throughput and scalability, but beware that the process of creating multiple instances is a manual one today. Now that it has Dell’s backing, I expect to see this process built into the product to compete at the enterprise level.

SnapLogic, also competing in the cloud data integration space, has some good credentials. Founded by Gaurav Dhillon, a founder of Informatica, SnapLogic has focused on building a large developer community to create “Snaps,” or connections to a variety of data sources. While this model makes sense as a way to get more connectivity, for an organization needing to connect to many different data sources it can become a costly alternative, especially when competitors like Pervasive offer connectors at no additional charge. But in the spirit of community-centric software development like that found in the open source market, this could be a new approach to sharing interfaces built by customers, partners and software developers.

Jitterbit offers one of the more interesting applications of cloud data integration capabilities. An assessment of its cloud-based data migration services appeared in an earlier post. Jitterbit has recognized that the cloud brings different challenges to the world of data integration. It introduced CloudReplicate, which allows you to create a copy of your salesforce.com data in a separate RDBMS instance in the cloud; the vendor keeps this version in sync with the instance in salesforce.com. Customers use the replicated version for a variety of reasons, ranging from simple backup to data federation to historical data analysis.
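The replication pattern behind a product like CloudReplicate can be sketched simply: copy cloud records into a local relational store, then keep the copy current by polling for records modified since the last sync. The sketch below is a hedged illustration of that pattern, not Jitterbit’s implementation; the record shapes and the `modified` watermark field are assumptions.

```python
# Hedged sketch of watermark-based incremental replication from a cloud
# application to a local copy. Record fields are invented for illustration;
# real implementations typically use a last-modified timestamp per record.

cloud = {
    "001": {"name": "Acme", "modified": 5},
    "002": {"name": "Widgets", "modified": 9},
}

def sync(cloud_records, replica, last_sync):
    """Upsert records changed since last_sync; return the new high-water mark."""
    high_water = last_sync
    for rec_id, rec in cloud_records.items():
        if rec["modified"] > last_sync:
            replica[rec_id] = dict(rec)  # upsert into the local copy
            high_water = max(high_water, rec["modified"])
    return high_water

replica = {}
watermark = sync(cloud, replica, last_sync=0)  # initial full copy
cloud["001"]["modified"] = 12                  # a record changes in the cloud
cloud["001"]["name"] = "Acme Corp"
watermark = sync(cloud, replica, watermark)    # incremental sync picks it up
print(replica["001"]["name"])  # Acme Corp
```

This also echoes a theme later in the post: cloud integration tends to favor frequent, small transfers (only the changed rows) over large bulk operations.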

The market for cloud data integration products and services is still emerging. We have learned some lessons in the on-premises past that will be applicable to the cloud, and established vendors are aggressively pursuing these market opportunities through a combination of development efforts and acquisitions. We also see new vendors, such as Jitterbit and SnapLogic, entering the space. One common theme I heard repeated by large and small vendors alike is that cloud data integration is about frequent, smaller transfers of data rather than large bulk operations. Another common theme is that no vendor yet offers all the functionality of fully established on-premises solutions.

As an industry we’ve only begun to understand the challenges and opportunities that are unique to the cloud. This is an area where I’ll be focusing additional attention with some of my research efforts during 2011. Stay tuned for more information as we begin new benchmark research into the current use and market demand.

Let me know your thoughts or come and collaborate with me on Facebook, LinkedIn and Twitter.

Regards,

David Menninger – VP & Research Director
