Tableau tops a pile of announcements with totally new data insight
Tableau has announced so much at its European conference that it is difficult to know where to start, so I’ll start with Tableau itself. The most notable recent development, at least from my perspective, is an increased focus on geo-spatial capabilities, for example the introduction of vector maps. More developments in this area can be expected in the future. Another major feature is the addition of parameter actions, which got a lot of interest during the Iron Viz competition.
Last year the company introduced data preparation capabilities, so I suppose it was inevitable that it would add data cataloguing, which will be available in the 2019.3 release. The Tableau Catalog does pretty much what you would expect a catalogue to do, with the exception that, because it is integrated not just with data prep but also with visualisation, it provides rather better data lineage than would otherwise be the case: right from the source system, via the catalogue, through to the visualisation. And, of course, you can visualise the lineage. The product is only available server-side (it has a built-in PostgreSQL database with graph capabilities exposed via a GraphQL API) and it will be delivered as part of a Data Management add-on, which will also include Data Prep Conductor. It comes with an open API to integrate with third-party catalogues (Alation, Collibra, Informatica and so on), but I was disappointed that the company was not even aware of the open source ODPi Egeria project for open metadata and governance (it is now), let alone supporting it. Egeria is also based around a graph database, which makes me wonder whether Tableau also uses JanusGraph.
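To make the lineage point a little more concrete, here is a minimal sketch of the kind of question that GraphQL API lets you ask: which upstream data sources and tables feed a particular workbook. The endpoint path, field names, workbook name and authentication header in this snippet are my own assumptions for illustration, not confirmed details of Tableau's schema, so treat it as a shape of a query rather than a working recipe.

```python
# Illustrative sketch only: querying source-to-visualisation lineage through
# the Catalog's GraphQL API. Endpoint, fields and auth header are assumptions.
import requests

SERVER = "https://tableau.example.com"            # hypothetical server URL
AUTH_TOKEN = "<sign-in token obtained separately>"  # placeholder credential

# Ask, in one round trip, which published data sources and upstream tables
# feed a given (hypothetical) workbook.
query = """
{
  workbooks(filter: {name: "Regional Sales"}) {
    name
    upstreamDatasources { name }
    upstreamTables { name, schema }
  }
}
"""

response = requests.post(
    f"{SERVER}/api/metadata/graphql",       # assumed endpoint path
    json={"query": query},
    headers={"X-Tableau-Auth": AUTH_TOKEN},
)
response.raise_for_status()
print(response.json())
```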
Another pair of products, where one might be considered a follow-on to the other, are Ask Data and Explain Data. The former was released earlier this year and provides a query facility based on natural language. This isn’t a new concept, so I shan’t discuss it in any detail. Currently it only works with English (US), but other languages, including English (UK), aren’t very far away. Explain Data, which is related to Ask Data only in the sense that it also uses natural language, though this time for generation rather than processing, is a completely different beast, and I haven’t seen anything comparable. What it attempts to do is to explain “why” there are, say, differences in sales across regions. Explain Data uses built-in machine learning to analyse your data and make suggestions, in natural language and ordered by relevance, as to why these differences exist. This is impressive.
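Tableau has not published how Explain Data works internally, so the following is purely an illustrative sketch of the general idea: score candidate dimensions by how much of the variation in a measure they account for, then phrase the strongest candidates as suggestions. The toy data, column names and the crude variance-based scoring are all my own assumptions, not Tableau's algorithm.

```python
# Illustrative sketch of "explain a difference": rank dimensions by how much
# of the variation in a measure their group means account for.
import pandas as pd

sales = pd.DataFrame({
    "region":  ["North", "North", "South", "South", "South", "North"],
    "channel": ["Web", "Store", "Web", "Store", "Web", "Web"],
    "segment": ["SMB", "Enterprise", "SMB", "SMB", "Enterprise", "SMB"],
    "amount":  [120, 340, 80, 95, 310, 150],
})

measure, dimensions = "amount", ["region", "channel", "segment"]

# Score each dimension by the variance of its group means relative to the
# overall variance of the measure (a rough "how much does it explain" proxy).
overall_var = sales[measure].var()
scores = {
    dim: sales.groupby(dim)[measure].mean().var() / overall_var
    for dim in dimensions
}

# Present the candidates in relevance order, most explanatory first.
for dim, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(f"Differences in {measure} are most associated with '{dim}' "
          f"(relative score {score:.2f}).")
```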
The other two major announcements were Tableau Blueprint and Project McKinley. The former is a (strategic) methodology to guide companies evolving into data-driven organisations; I will discuss it, and how it supports the mutable enterprise, in a subsequent article. The latter is intended to provide enhanced security, management and scalability for mission-critical deployments. In this context it is worth commenting that Tableau is reporting a significant increase in enterprise-wide deployments (often tens of thousands of users): a proof point that its original “land and expand” sales model has genuinely started to expand.
As I said, there was a lot for one conference. And a lot to like.