
More than 13,000 self-described “data and visualization nerds” gathered in Austin, TX, recently for Tableau Software’s annual customer conference. In his inaugural keynote, Tableau’s new CEO, Adam Selipsky, said that nearly 9,000 were first-time attendees. I was impressed with the enthusiasm of the customers who had gathered for the event, cheering as company officials reviewed product plans and demonstrated new features. This enthusiasm suggests Tableau has provided capabilities that resonate with its users. Among other things, the company used the conference to outline a number of planned product enhancements.

Not resting on its laurels as a data visualization software provider, Tableau demonstrated a variety of new capabilities for displaying data. Enhanced tooltips are common to several of the new features. For example, visualization within visualization displays a graph on top of another graph as the user hovers over data points. Hover lines, which are vertical lines on charts, show a variety of data values, not just the value at the selected point. Interactive tooltips show the values used to filter the displayed data and can highlight items with similar characteristics in the charts or graphs.

Mapping capabilities are enhanced so that layers of different sets of data or different types of location data can be displayed simultaneously. Tableau executives also demonstrated a natural-language query interface that can interpret concepts such as “near” or “expensive” using machine-learning techniques. All in all, these interface enhancements make it easier for users to get more information as they are navigating through data.
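Tableau has not disclosed how this natural-language interpretation works internally, so the following is only a minimal sketch, assuming one plausible approach: mapping a vague adjective such as "expensive" onto a data-driven threshold. The function name and data are invented for illustration.

    import pandas as pd

    def resolve_vague_term(series: pd.Series, term: str) -> pd.Series:
        """Map a fuzzy adjective onto a boolean filter over a numeric column.

        A hypothetical illustration, not Tableau's implementation.
        """
        if term == "expensive":
            # Interpret "expensive" as the top quartile of observed values.
            return series >= series.quantile(0.75)
        if term == "cheap":
            return series <= series.quantile(0.25)
        raise ValueError(f"no interpretation for {term!r}")

    listings = pd.DataFrame({"price": [90, 120, 250, 400, 1100]})
    print(listings[resolve_vague_term(listings["price"], "expensive")])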

Data preparation is a hot topic, as I have written. For its part, Tableau is adding self-service data preparation through an effort called Project Maestro. Data preparation will be a stand-alone application that enables users to access and prepare data prior to visualization. With interactive capabilities to join tables, split fields and profile data, users can be more certain that the data is ready for the analyses they want to perform (a rough sketch of these operations follows below). Having long focused on business users, Tableau is also trying to appeal to IT by introducing new governance capabilities to Tableau Server. As data sources are published, they can be reviewed and certified. IT personnel can ensure that the proper joins, security rules and performance optimizations are incorporated, and business users can trust the content on which they rely. The governance features will also track frequency of usage and provide impact analysis to identify where data sources are used.
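Project Maestro was shown only as a visual application, so the code below is merely a rough pandas sketch of the three preparation operations named above: joining tables, splitting fields and profiling data. The tables and column names are invented for illustration.

    import pandas as pd

    orders = pd.DataFrame({"customer_id": [1, 2, 2], "amount": [100.0, 250.0, 75.0]})
    customers = pd.DataFrame({"customer_id": [1, 2],
                              "name": ["Ada Lovelace", "Alan Turing"]})

    # Join: combine orders with customer attributes.
    prepared = orders.merge(customers, on="customer_id", how="left")

    # Split: break a full-name field into first- and last-name columns.
    prepared[["first_name", "last_name"]] = prepared["name"].str.split(" ", n=1, expand=True)

    # Profile: summary statistics show whether the data is ready for analysis.
    print(prepared.describe(include="all"))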

In some other areas Tableau is catching up with the competition, adding features that users value but that were missing from its products previously. One example is alerts. Another is collaboration, including threaded discussions in which analyses can be embedded. I've written about the value of collaboration for years. Our data and analytics in the cloud benchmark research shows that more than half (52%) of organizations use or intend to use collaboration with analytics. Users should welcome these new features, but Tableau could make collaboration even more valuable by adding tasks or assignments so that analysis can lead to action, not just discussion. Tableau also continues to round out its cloud capabilities, but like some other vendors it has not brought its cloud implementation to parity with the desktop product. As part of its cloud efforts, Tableau also announced a version of its server for Linux, a common operating system for cloud-based systems.

Tableau will also be introducing a new data engine based on its acquisition of Hyper earlier this year. Tableau uses an in-memory data engine to accelerate the speed of interacting with data, but the current engine has posed two limitations that the company hopes to address with Hyper: scalability and near-real-time capabilities. During the conference the company demonstrated a development version of Hyper loading more than 300,000 rows per second. The Hyper architecture allows data loading to happen simultaneously with querying, eliminating downtime. Hyper will appear first in Tableau's cloud offerings as a proving ground and later in the on-premises versions.
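Tableau has not published Hyper's internals, but the behavior described, queries continuing while new rows stream in, can be illustrated by analogy. The sketch below uses SQLite in write-ahead-logging (WAL) mode, which lets a reader run alongside a writer; it stands in for the concept only, not for Hyper's actual design.

    import sqlite3
    import threading
    import time

    conn = sqlite3.connect("demo.db")
    conn.execute("PRAGMA journal_mode=WAL")  # lets readers overlap with the writer
    conn.execute("CREATE TABLE IF NOT EXISTS events (id INTEGER, value REAL)")
    conn.commit()

    def loader():
        # Writer thread: append rows continuously, committing in small batches.
        w = sqlite3.connect("demo.db")
        for batch in range(5):
            w.executemany("INSERT INTO events VALUES (?, ?)",
                          [(batch * 1000 + i, i * 0.5) for i in range(1000)])
            w.commit()
            time.sleep(0.05)
        w.close()

    t = threading.Thread(target=loader)
    t.start()
    while t.is_alive():
        # Reader: queries see a consistent snapshot while loading continues.
        count = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]
        print(f"rows visible so far: {count}")
        time.sleep(0.05)
    t.join()
    conn.close()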

The conference allowed Tableau to polish its core strengths of visualization and usability. The company promises a set of valuable enhancements as well as some features that will help it keep up with the competition. Most of these new features are scheduled to be released in 2017. Tableau has also advanced its mobile capabilities to address the gaps I identified in the 2016 Ventana Research Mobile Business Intelligence Value Index, making it easier to consume analytics on smartphones and tablets. Tableau has a loyal following of customers that allows it the luxury of time to deliver these capabilities. You may want to check these out, too, especially the clever visualization techniques being added.

Regards,

David Menninger

SVP & Research Director

Follow Me on Twitter @dmenningerVR and Connect with me on LinkedIn.

Tableau Software officially released Version 6 of its product this week. Tableau approaches business intelligence from the end user's perspective, focusing primarily on delivering tools that allow people to easily interact with data and visualize it. With this release, Tableau has advanced its in-memory processing capabilities significantly. Fundamentally, Tableau 6 shifts from the intelligent caching scheme used in prior versions to a columnar, in-memory data architecture in order to increase performance and scalability.

Tableau provides an interesting twist in its implementation of in-memory capabilities, combining in-memory data with data stored on disk. One of the big knocks against in-memory architectures has been the limitation imposed by the physical memory on the machine. In some cases products were known to crash if the data exceeded available memory. In other cases the system didn't crash, but it performed so much more slowly once memory was exceeded that it almost appeared to have crashed.

The advent of 64-bit operating systems dramatically increased the theoretical limitations that existed in 32-bit operating systems. Servers can now be configured with significant amounts of memory at reasonable prices, but putting an entire warehouse or large-scale data set in memory on a single machine is still a stretch for most organizations. With Tableau 6 a portion of the data can be loaded into memory and the remainder left on disk. Coupled with the feature that allows links to data in an RDBMS, this provides considerable flexibility: data can be loaded into memory, put on disk or linked to one of many supported databases. As the user interacts with the data, it is retrieved from the appropriate location. Tableau 6 also includes assistance in managing and optimizing the dividing line between in-memory and on-disk data, based on usage patterns.
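To make the hybrid approach concrete, here is a toy sketch of the general idea: hold a budgeted number of columns in memory and read the rest from disk on demand. Tableau has not documented its engine at this level, so the class, file layout and eviction policy here are invented for illustration.

    import numpy as np

    class HybridColumnStore:
        def __init__(self, path_by_column, memory_budget=2):
            self.path_by_column = path_by_column  # column name -> .npy file on disk
            self.memory_budget = memory_budget    # max columns held in memory
            self.cache = {}                       # columns currently in memory

        def column(self, name):
            if name not in self.cache:
                if len(self.cache) >= self.memory_budget:
                    # Evict the oldest column; a real engine would use usage patterns.
                    self.cache.pop(next(iter(self.cache)))
                self.cache[name] = np.load(self.path_by_column[name])
            return self.cache[name]

    # Write two columns to disk, then query them through the store.
    np.save("sales.npy", np.array([100.0, 250.0, 75.0]))
    np.save("units.npy", np.array([1, 3, 2]))
    store = HybridColumnStore({"sales": "sales.npy", "units": "units.npy"})
    print(store.column("sales").sum())  # loaded from disk into memory on first use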

However, one of the places where this new architecture comes up short is the data refresh process. In the current release, users must manually request a refresh of the data that is in memory; ideally there would be an optional automated way to keep the in-memory data up to date. The other thing I would like to see in Tableau 6 and other in-memory products is better read/write facilities. Although this version includes better "table calcs," which can be used to display some derived data and perform some limited what-if analyses, there is no write-back capability that would let you use Tableau as a planning tool and record the changes you explore.

Tableau 6 includes a number of other features beyond the in-memory capabilities. It now supports a form of data federation in which data from multiple sources can be combined in a single analysis. The data can be joined on the fly in the Tableau server; Tableau refers to this as "data blending" (a minimal sketch of the idea follows below). Users can also create hierarchies on the fly simply by dragging and dropping dimensions. And there are new interactive graphing features, such as dual-axis graphs with different levels of detail on each axis and the ability to exclude items from one axis but not the other, which can help correct for outliers such as the impact of one big sale on profitability or average deal size.
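As a minimal sketch of the blending idea, assuming invented table and column names, the snippet below aggregates rows from two different sources, a relational table and a flat-file-style data frame, to a shared "region" level and joins them at query time. Tableau performs this inside its server rather than in user code.

    import sqlite3
    import pandas as pd

    # Source 1: quota targets stored in a relational database.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE quotas (region TEXT, quota REAL)")
    db.executemany("INSERT INTO quotas VALUES (?, ?)",
                   [("East", 500.0), ("West", 400.0)])
    quotas = pd.read_sql_query("SELECT * FROM quotas", db)

    # Source 2: actual sales arriving from a flat-file extract.
    sales = pd.DataFrame({"region": ["East", "West", "West"],
                          "amount": [220.0, 150.0, 300.0]})

    # Blend: aggregate each source to the shared "region" level, then join.
    blended = (sales.groupby("region", as_index=False)["amount"].sum()
                    .merge(quotas, on="region"))
    blended["attainment"] = blended["amount"] / blended["quota"]
    print(blended)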

This release also supports several new data sources, including the Open Data Protocol (OData), the DataMarket section of Windows Azure Marketplace and Aster Data, which my colleague recently assessed.

Version 6 also includes some IT-oriented enhancements. As Tableau has grown, its software has been deployed in ever-larger installations, which puts a focus on its administrative facilities. The new release includes improved management of large numbers of users, with grouping, privilege assignment and specific selection and editing options. It also includes a new project view of objects created and managed within Tableau. All of these help bring it forward as a departmental- and enterprise-class analytics technology.

Overall, the release includes features that should be well received by both end users and IT. Tableau shares the end-user analytics category with QlikView 10, which I recently assessed, and Tibco Spotfire. I'll be eager to see whether the company can push the in-memory capabilities even further in future releases. It is clear that Tableau brings another viable option to the category of analytics for analysts, with its new in-memory computing and blending of data across data sources.

Let me know your thoughts or come and collaborate with me on Facebook, LinkedIn and Twitter.

Regards,

David Menninger – VP & Research Director
