Governance in Tableau
This content is part of Tableau Blueprint – a maturity framework allowing you to zoom in and improve how your organisation uses data to drive impact. To begin your journey, take our assessment.
Governance in Tableau is a critical step in driving usage and adoption of analytics while maintaining the security and integrity of the data. You must define standards, processes, and policies to securely manage data and content through the modern analytics workflow. Just as important as defining these is having everyone in the workflow understand and comply so that users will have trust and confidence in the analytics they’ll use to make data-driven decisions.
Data governance in Tableau
The purpose of data governance in the modern analytics workflow is to ensure that the right data is available to the right people in the organisation, at the time they need it. It creates accountability and enables, rather than restricts, access to secure and trusted content for users of all skill levels.
Data source management
Data source management includes processes related to selection and distribution of data within your organisation. Tableau connects to your enterprise data platforms and leverages the governance you already have applied to those systems. In a self-service environment, content authors and data stewards have the ability to connect to various data sources, build and publish data sources, workbooks and other content. Without these processes, there will be a proliferation of duplicate data sources, which will cause confusion among users, increase the likelihood of errors and consume system resources.
Tableau’s hybrid data architecture provides two modes for interacting with data, using a live query or an in-memory extract. Switching between the two is as easy as selecting the right option for your use case. In both live and extract use cases, users may connect to your existing data warehouse tables, views and stored procedures to leverage those with no additional work.
Live queries are appropriate if you have invested in a fast database or need up-to-the-minute data. In-memory extracts should be used if your database or network is too slow for interactive queries, to take the load off transactional databases, or when offline data access is required.
With support for a new multi-table logical layer and relationships in Tableau 2020.2, users aren’t limited to using data from a single, flat, denormalized table in a Tableau data source. They can now build multi-table data sources with flexible, LOD-aware relationships between tables, without having to specify join types in anticipation of what questions can be asked of the data. With multi-table support, Tableau data sources can now directly represent common enterprise data models such as star and snowflake schemas, as well as more complex, multi-fact models. Multiple levels of detail are supported in a single data source, so fewer data sources are needed to represent the same data. Relationships are more flexible than database joins and can support additional use cases as they arise, reducing the need to build new data models to answer new questions. Using relationships in well modelled schemas can reduce the time to create a data model as well as the number of data sources to answer business questions. For more information, see Metadata management later in this section and The Tableau data model.
When publishing a workbook to Tableau Server or Tableau Cloud, the author will have the choice of publishing the data source or of leaving it embedded in the workbook. The data source management processes you define will govern this decision. With Tableau Data Server, which is a built-in component of the Tableau platform, you can share and reuse data models, secure how your users access data, and manage and consolidate extracts with published data sources. Further, published data sources allow Tableau Creator- and Explorer-licensed users to have access to secure, trusted data in Tableau for web authoring and Ask Data.
With increased data discovery capabilities, Tableau Catalog indexes all content, including workbooks, data sources and flows to allow authors to search for fields, columns, databases and tables in workbooks and published data sources. For more information, see Data management.
When Tableau Catalog is enabled, content authors can search for data by selecting from data sources, databases and files, or tables to see if it exists in Tableau Server and Tableau Cloud and minimise duplication of data sources.
In addition, the Data details tab on a view published to Tableau Server and Tableau Cloud will provide consumers with relevant information about the data used in it. Details include information about the workbook (name, author, date modified), the data sources used in the view and a list of the fields in use.
For data stewards who create new published data sources, the workflow below shows the two major decision points that impact data source management – live or extract and embedded or shared data model. This is not to imply that a formal modelling process must always occur before analysis begins.
Key considerations for data source management
- What are the key sources of data for a department or team?
- Who is the data steward or owner of the data?
- Will you connect live or extract the data?
- Should the data source be embedded or published?
- Do variants of a dataset exist? If so, can they be consolidated as an authoritative source?
- If multiple data sources are consolidated, does the single data source performance or utility suffer by attempting to fulfil too many use cases at once?
- What business questions need to be answered by the data source?
- What naming conventions are used for published data sources?
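Naming conventions are easiest to enforce when they can be checked automatically, for example as part of a publishing review. The sketch below assumes a hypothetical convention of the form "Department - Subject - Refresh cadence"; the pattern is illustrative, not a Tableau requirement.

```python
import re

# Hypothetical convention: "<Department> - <Subject> - <Cadence>".
# Adjust the pattern to match your organisation's actual standard.
NAME_PATTERN = re.compile(r"^[A-Z][A-Za-z ]+ - [A-Z][A-Za-z ]+ - (Daily|Weekly|Monthly)$")

def check_data_source_name(name: str) -> bool:
    """Return True if a published data source name follows the convention."""
    return bool(NAME_PATTERN.match(name))

print(check_data_source_name("Sales - Orders - Daily"))   # True
print(check_data_source_name("sales_orders_final_v2"))    # False
```

A check like this could run in a publishing workflow or periodically against the list of published data sources returned by the REST API.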
Data quality
Data quality is a measure of data's fitness to serve its purpose in a given context – in this case, for making business decisions. The quality of data is determined by factors such as accuracy, completeness, reliability, relevance and freshness. You likely already have processes in place to ensure data quality as it is ingested from source systems, and the more that is fixed in upstream processes, the less correction will be needed at the time of analysis. You should ensure data quality is consistent all the way through to consumption.
As you are planning, it is a good time to review existing upstream data quality checks because data will be available to a larger group of users under a self-service model. In addition, Tableau Prep Builder and Tableau Desktop are great tools for detecting data quality issues. By establishing a process to report data quality issues to the IT team or data steward, the data quality will become an integral part of building trust and confidence in the data.
With Tableau Data Management and Tableau Catalog, you should communicate data quality issues to your users to increase visibility and trust in the data. When a problem exists, you can set a warning message on a data asset so that users of that data asset are aware of particular issues. For example, you might want to let users know that the data hasn't been refreshed in two weeks or that a data source has been deprecated. You can set one data quality warning per data asset, such as a data source, database, flow or table, choosing from the following types: Warning, Deprecated, Stale data and Under maintenance. For more information, see Set a Data Quality Warning.
Note that you can also set a data quality warning using the REST API. For more information, see Add data quality warning in the Tableau REST API help.
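As a sketch of what that REST call involves, the helper below builds a `tsRequest` XML body for a data quality warning. The element and attribute names follow the general shape of the REST API's data quality warning methods, but the exact type values and endpoint URL depend on your server version, so treat this as an assumption to verify against the REST API reference.

```python
import xml.etree.ElementTree as ET

def build_dqw_request(warning_type: str, message: str, is_active: bool = True) -> bytes:
    """Build a tsRequest XML body for a data quality warning.

    warning_type values such as "WARNING", "DEPRECATED" or "STALE" are
    assumptions here -- check the Tableau REST API reference for the exact
    enumeration your server version supports.
    """
    ts_request = ET.Element("tsRequest")
    ET.SubElement(ts_request, "dataQualityWarning", {
        "type": warning_type,
        "isActive": str(is_active).lower(),
        "message": message,
    })
    return ET.tostring(ts_request)

body = build_dqw_request("STALE", "Not refreshed since the source migration")
print(body.decode())
```

The resulting bytes would be sent as the request body of the appropriate data quality warning endpoint, authenticated with your usual REST API credentials.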
Key considerations for data quality
- What processes exist for ensuring accuracy, completeness, reliability and relevance?
- Have you developed a checklist to operationalise the process?
- Who needs to review data prior to it becoming shared and trusted?
- Is your process adaptable to business users and are they able to partner with data owners to report issues?
Enrichment & preparation
Enrichment and preparation include the processes used to enhance, refine or prepare raw data for analysis. Often a single data source does not answer all the questions a user may have. Adding data from different sources adds valuable context. You likely already have ETL processes to clean, combine, aggregate and store data when ingesting raw data from various sources. With command line interfaces and APIs, Tableau can be integrated with your existing processes.
To combine multiple sources of data and automate flows on a schedule, use Tableau Prep Builder and Tableau Prep Conductor. Tableau Prep supports multiple output types to Tableau Server or Tableau Cloud, including CSV, Hyper and TDE files, as well as published data sources. Beginning with 2020.3, Tableau Prep outputs include database tables, where the result of a flow can be saved to a table in a relational database. This means that prepped data from Tableau Prep Builder can be stored and governed in a central location and used throughout your organisation. Tableau Prep Builder is part of the Tableau Creator licence, while Tableau Prep Conductor is part of Tableau Data Management. Tableau Data Management helps you better manage the data within your analytics environment, from data preparation to cataloguing, search and governance, ensuring that trusted and up-to-date data is always used to drive decisions.
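The enrich-and-store pattern described above can be sketched in miniature: join raw sources and persist the prepped result to a central database table that other tools can reuse. All table and column names below are illustrative; in practice the join would be a step in a Tableau Prep flow and the output a governed table in your warehouse.

```python
import sqlite3

# Two illustrative raw sources: order line items and a region lookup.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (order_id INTEGER, region_code TEXT, amount REAL);
    CREATE TABLE raw_regions (region_code TEXT, region_name TEXT);
    INSERT INTO raw_orders VALUES (1, 'EMEA', 120.0), (2, 'AMER', 80.0);
    INSERT INTO raw_regions VALUES ('EMEA', 'Europe, Middle East & Africa'),
                                   ('AMER', 'Americas');
""")
# The "prepped" output table: enriched with the friendly region name and
# stored centrally, analogous to a Prep flow's database-table output.
conn.execute("""
    CREATE TABLE prepped_orders AS
    SELECT o.order_id, r.region_name, o.amount
    FROM raw_orders o JOIN raw_regions r USING (region_code)
""")
rows = conn.execute(
    "SELECT region_name, amount FROM prepped_orders ORDER BY order_id"
).fetchall()
print(rows)
```

Storing the output as a table rather than a file is what lets the prepped data be governed, secured and reused like any other warehouse asset.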
With visual, smart, direct feedback at every step, Tableau Prep Builder will help users to prototype and prepare disparate sources of data for analysis. Once the steps are defined and verified, the flow should be published to Tableau Server and Tableau Cloud, where Prep Conductor will execute the flow and output a published data source on the specified schedule. Automation creates a consistent process, reduces error-prone manual steps, tracks success/failure and saves time. Users will have confidence in the output because the steps can be viewed on Tableau Server or Tableau Cloud.
Tableau Prep flow
Tableau Prep flow in Tableau Server or Tableau Cloud
Key considerations for data enrichment
- Will data enrichment and preparation be centralised or self-service?
- What organisational roles perform data enrichment and preparation?
- What ETL tools and processes should be used to automate enrichment and/or preparation?
- What sources of data provide valuable context when combined with each other?
- How complex are the data sources to be combined?
- Will users be able to use Tableau Prep Builder and/or Tableau Desktop to combine datasets?
- Have standardised join or blend fields been established by the DBA to enable users to enrich and prepare datasets?
- How will you enable self-service data preparation?
Data security
Data security is of utmost importance in every enterprise. Tableau allows customers to build upon their existing data security implementations. IT administrators have the flexibility to implement security within the database with database authentication, within Tableau with permissions or a hybrid approach of both. Security will be enforced regardless of whether users are accessing the data from published views on the web, on mobile devices, or through Tableau Desktop and Tableau Prep Builder. Customers often favour the hybrid approach for its flexibility to handle different kinds of use cases. Start by establishing a data security classification to define the different types of data and levels of sensitivity that exist in your organisation.
When leveraging database security, it is important to note that the method chosen for authentication to the database is key. This level of authentication is separate from the Tableau Server or Tableau Cloud authentication (i.e. when a user logs into Tableau Server or Tableau Cloud, they are not yet logging into the database). This means that Tableau Server and Tableau Cloud users will also need to have credentials (their own username/password or service account username/password) to connect to the database for the database-level security to apply. To further protect your data, Tableau only needs read-access credentials to the database, which prevents publishers from accidentally changing the underlying data. Alternatively, in some cases, it is useful to give the database user permission to create temporary tables. This can have both performance and security advantages because the temporary data is stored in the database rather than in Tableau. For Tableau Cloud, you need to embed credentials to use automatic refreshes in the connection information for the data source. For Google and Salesforce.com data sources, you can embed credentials in the form of OAuth 2.0 access tokens.
Extract encryption at rest is a data security feature that allows you to encrypt .hyper extracts while they are stored on Tableau Server. Tableau Server administrators can enforce encryption of all extracts on their site or allow users to specify to encrypt all extracts associated with particular published workbooks or data sources. For more information, see Extract Encryption at Rest.
If your organisation is deploying Data Extract Encryption at Rest, then you may optionally configure Tableau Server to use AWS as the KMS for extract encryption. To enable AWS KMS or Azure KMS, you must deploy Tableau Server in AWS or Azure, respectively, and be licensed for Advanced Management for Tableau Server. In the AWS scenario, Tableau Server uses the AWS KMS customer master key (CMK) to generate an AWS data key. Tableau Server uses the AWS data key as the root master key for all encrypted extracts. In the Azure scenario, Tableau Server uses the Azure Key Vault to encrypt the root master key (RMK) for all encrypted extracts. However, even when configured for AWS KMS or Azure KMS integration, the native Java keystore and local KMS are still used for secure storage of secrets on Tableau Server. The AWS KMS or Azure KMS is only used to encrypt the root master key for encrypted extracts. For more information, see Key Management System.
For Tableau Cloud, all data is encrypted at rest by default. However, with Advanced Management for Tableau Cloud, you can take more control over key rotation and auditing by leveraging customer-managed encryption keys. Customer-managed encryption keys give you an extra level of security by allowing you to encrypt your site’s data extracts with a customer-managed site-specific key. The Salesforce Key Management System (KMS) instance stores the default site-specific encryption key for anyone who enables encryption on a site. The encryption process follows a key hierarchy. First, Tableau Cloud encrypts an extract. Next, Tableau Cloud KMS checks its key caches for a suitable data key. If a key isn’t found, one is generated by the KMS GenerateDataKey API, using the permission granted by the key policy that's associated with the key. AWS KMS uses the CMK to generate a data key and returns a plaintext copy and encrypted copy to Tableau Cloud. Tableau Cloud uses the plaintext copy of the data key to encrypt the data and stores the encrypted copy of the key along with the encrypted data.
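The key hierarchy described above is an instance of envelope encryption: data is encrypted with a per-extract data key, and only the wrapped (encrypted) copy of that data key is stored alongside the data. The sketch below illustrates the pattern only; the XOR keystream is a deliberately toy stand-in for a real cipher such as AES, and nothing here reflects Tableau's actual implementation.

```python
import hashlib
import os

def keystream(key: bytes, n: int) -> bytes:
    """Derive n deterministic pseudo-random bytes from key.

    Toy stand-in for a real cipher -- do NOT use for actual encryption.
    """
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

# 1. The KMS holds the root/master key and never releases it in plaintext.
master_key = os.urandom(32)
# 2. A fresh data key is generated per extract (cf. the GenerateDataKey step).
data_key = os.urandom(32)
# 3. The extract is encrypted with the plaintext data key...
ciphertext = xor(b"extract bytes", data_key)
# 4. ...and only the wrapped copy of the data key is stored with the data.
wrapped_data_key = xor(data_key, master_key)
# Decryption unwraps the data key first, then decrypts the extract.
recovered = xor(ciphertext, xor(wrapped_data_key, master_key))
print(recovered)
```

The point of the hierarchy is that compromising stored data yields only ciphertext plus a wrapped key; the root key never leaves the KMS.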
You can limit which users see what data by setting user filters on data sources in both Tableau Server and Tableau Cloud. This allows you to better control what data users see in a published view based on their Tableau Server login account. Using this technique, a regional manager is able to view data for her region but not the data for the other regional managers. With these data security approaches, you can publish a single view or dashboard in a way that provides secure, personalised data and analysis to a wide range of users on Tableau Cloud or Tableau Server. For more information, see Data security and Restrict access at the data row level. If row-level security is paramount to your analytics use case, with Tableau Data Management, you can leverage virtual connections with data policies to implement user filtering at scale. For more information, see Virtual connections and data policies.
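Conceptually, a user filter maps the logged-in username to the subset of rows that user is entitled to see. The sketch below models that idea in plain Python with an illustrative entitlements table; in Tableau the mapping would be driven by user functions or an entitlements table joined to the data, not application code.

```python
# Illustrative entitlements: Tableau username -> regions the user may see.
ENTITLEMENTS = {
    "amelia@example.com": {"EMEA"},
    "jo@example.com": {"AMER", "APAC"},
}

ROWS = [
    {"region": "EMEA", "sales": 120},
    {"region": "AMER", "sales": 80},
    {"region": "APAC", "sales": 60},
]

def visible_rows(username: str, rows: list) -> list:
    """Return only the rows the given user is entitled to see."""
    allowed = ENTITLEMENTS.get(username, set())  # unknown users see nothing
    return [r for r in rows if r["region"] in allowed]

print(visible_rows("amelia@example.com", ROWS))  # only the EMEA row
```

The important design property is the default: a user with no entitlement sees no rows, rather than all of them.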
Key considerations for data security
- How do you classify different types of data according to its sensitivity?
- How does someone request access to data?
- Will you use a service account or database security to connect to data?
- What is the appropriate approach to secure data according to sensitivity classification?
- Does your data security meet legal, compliance and regulatory requirements?
Metadata management
Metadata management includes policies and processes that ensure information can be accessed, shared, analysed and maintained across the organisation as an extension of data source management. Metadata is a business-friendly representation of data in common terms, similar to a semantic layer in traditional BI platforms. Curated data sources hide the complexity of your organisation's modern data architecture and make fields immediately understandable, regardless of the data store and table from which they were sourced.
Tableau employs a simple, elegant and powerful metadata system that gives users flexibility while allowing for enterprise metadata management. The Tableau data model can be embedded in a workbook or centrally managed as a published data source with Data Server. After connecting to data and creating the Tableau data model, which will become a published data source on Tableau Server or Tableau Cloud, look at it from your users' perspective and see how much easier analytics will be when they have a well-formatted starting point, filtered and sized to the business questions it can answer. For more information on published data sources, see The Tableau data model.
The diagram below shows where elements exist in the Tableau data model:
Beginning in 2020.2, the data source includes the connection, connection attributes, and the physical and logical layers within a data model. Upon connection, Tableau automatically characterises fields as dimensions or measures. In addition, the data model stores calculations, aliases and formatting. The physical layer includes physical tables defined by joins, unions and/or custom SQL. Each group of one or more physical tables defines a logical table, which resides in the logical layer along with relationships.
Relationships are a new way to model data that is more flexible than using joins. A relationship describes how two tables relate to each other, based on common fields, but it does not combine the tables together as the result of a join does. Relationships provide several advantages over using joins.
- You don't need to configure join types between tables. You only need to select the fields to define the relationship.
- Relationships use joins, but they are automatic. Relationships postpone the selection of join types to the time and context of analysis.
- Tableau uses relationships to automatically generate correct aggregations and appropriate joins during analysis, based on the current context of the fields in use in a worksheet.
- Multiple tables at different levels of detail are supported in a single data source, so fewer data sources are needed to represent the same data.
- Unmatched measure values are not dropped (no accidental loss of data).
- Tableau will generate queries only for the data that is relevant to the current view.
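The difference between a flat join and relationship-style behaviour is easiest to see with two tables at different levels of detail. In the illustrative example below, pre-joining a per-order table to a per-region targets table duplicates each region's target on every order row, inflating SUM(target); aggregating each logical table at its own level of detail first, which is what relationships effectively do, keeps the total correct.

```python
# Two tables at different levels of detail (values are illustrative):
# orders has one row per line item, targets has one row per region.
orders = [
    {"region": "EMEA", "sales": 100},
    {"region": "EMEA", "sales": 50},
    {"region": "AMER", "sales": 70},
]
targets = [
    {"region": "EMEA", "target": 120},
    {"region": "AMER", "target": 60},
]

# A flat physical join repeats each region's target once per order row,
# so summing the joined column double-counts EMEA's target.
joined = [{**o, **t} for o in orders for t in targets if o["region"] == t["region"]]
inflated_target = sum(r["target"] for r in joined)   # 120 + 120 + 60

# Relationship-style behaviour: aggregate each logical table at its own
# level of detail before combining, so the total is correct.
correct_target = sum(t["target"] for t in targets)

print(inflated_target, correct_target)
```

This is the "unmatched measure values are not dropped / aggregates stay correct" behaviour in miniature: the join type and grain are resolved per query, not baked into the data model.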
At run time in the VizQL model, multiple queries are built dynamically, based on the dimensions and measures of the visualisation, and filters, aggregations and table calculations are applied. Tableau uses the contextual information of the separate logical table to determine what joins are applied to provide the correct aggregation. This enables one user to design the data source without needing to know, plan or otherwise account for all the variations of analysis to be performed with the data source by other users.
Data stewards or authors with direct access to sources of data should prototype data sources as an embedded data source in a Tableau workbook and then create a published data source in Tableau to share the curated data model, as shown below in the direct access workflow:
If authors do not have direct access to sources of data, they will rely on a DBA or data steward to provide the prototype data source embedded in a Tableau workbook. After reviewing and verifying it contains the needed data, a Site Administrator or Project Leader will create a published data source in Tableau to share the Tableau data model, as shown below in the restricted access workflow:
The metadata checklist identifies best practices for curating a published data source. By establishing data standards using the checklist, you’ll enable the business with governed self-service data access that is user-friendly and easy to understand. Prior to creating an extract or published data source in Tableau, review and apply the following checklist to the Tableau data model:
- Validate the data model
- Filter and size to the analysis at hand
- Use standard, user-friendly naming conventions
- Add field name synonyms and custom suggestions for Ask Data
- Create hierarchies (drill paths)
- Set data types
- Apply formatting (dates, numbers)
- Set fiscal year start date, if applicable
- Add new calculations
- Remove duplicate or test calculations
- Enter field descriptions as comments
- Aggregate to highest level
- Hide unused fields
Beginning in 2019.3, with Tableau Data Management, Tableau Catalog discovers and indexes all of the content on Tableau, including workbooks, data sources, sheets and flows. Indexing is used to gather information about the metadata, schemas and lineage of the content. Then, from the metadata, Tableau Catalog identifies all of the databases, files and tables used by the content on your Tableau Server or Tableau Cloud site. Knowing where your data comes from is key to trusting the data, and knowing who else uses it means you can analyse the impact of changes on data in your environment. The lineage feature in Tableau Catalog indexes both internal and external content. For more information, see Use Lineage for Impact Analysis.
Using lineage, you can trace down to content owners at the end of the lineage graph. The list of owners includes anyone assigned as the owner of a workbook, data source or flow, and anyone assigned as the contact for a database or table in the lineage. If a change is going to be made, you can email owners to let them know about its impact. For more information, see Use email to contact owners.
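Impact analysis over a lineage graph is, at heart, a downstream walk from the changed asset to everything that depends on it, collecting owners along the way. The sketch below models that with an illustrative graph; it is a conceptual picture, not Tableau Catalog's API or data model.

```python
from collections import deque

# Illustrative lineage: asset -> assets directly downstream of it.
DOWNSTREAM = {
    "table:orders": ["datasource:sales_model"],
    "datasource:sales_model": ["workbook:pipeline", "workbook:forecast"],
}
# Illustrative owners/contacts per asset.
OWNERS = {
    "datasource:sales_model": "steward@example.com",
    "workbook:pipeline": "amelia@example.com",
    "workbook:forecast": "jo@example.com",
}

def impacted_owners(changed_asset: str) -> set:
    """Breadth-first walk downstream, collecting owners of affected assets."""
    seen, queue, owners = set(), deque([changed_asset]), set()
    while queue:
        asset = queue.popleft()
        for child in DOWNSTREAM.get(asset, []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
                if child in OWNERS:
                    owners.add(OWNERS[child])
    return owners

print(sorted(impacted_owners("table:orders")))
```

The resulting owner list is exactly the audience for the "email owners about an upcoming change" step described above.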
Key considerations for metadata management
- What is the process for curating data sources?
- Has the data source been sized to the analysis at hand?
- What is your organisational standard for naming conventions and field formatting?
- Does the Tableau data model meet all criteria for curation, including user-friendly naming conventions?
- Has the metadata checklist been defined, published and integrated into the validation, promotion and certification processes?
Monitoring & management
Monitoring is a critical piece of the self-service model, as it allows IT and administrators to understand how data is being used and be proactive and responsive about usage, performance, data connectivity and refresh failures. Depending on your company’s database standards, IT will use a combination of tools and job schedulers for ingesting and monitoring raw data and server health.
Just as business users leverage data to make smarter decisions, administrators are also empowered to make data-driven decisions about their Tableau deployment. Tableau Server and site administrators use default administrative views to monitor the status of extract refreshes, data source utilisation and the delivery of subscriptions and alerts, and can build custom administrative views from Tableau Server's repository data to answer more specific questions. In Tableau Cloud, site administrators have access to Monitor site activity with default administrative views and can Use Admin Insights to create custom views. For more information, see Tableau Monitoring and Measurement of Tableau user engagement and adoption.
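One recurring monitoring task is flagging stale content from usage data. The sketch below hard-codes a couple of records and an arbitrary 90-day threshold; in practice the last-accessed dates would come from the repository or Admin Insights, and the threshold would be set by your governance policy.

```python
from datetime import date, timedelta

# Assumed policy: content untouched for more than 90 days is "stale".
STALE_AFTER = timedelta(days=90)
TODAY = date(2024, 6, 1)  # fixed for reproducibility; use date.today() live

# Illustrative usage records; real data would come from admin views.
content = [
    {"name": "Pipeline", "last_accessed": date(2024, 5, 20)},
    {"name": "Old KPI board", "last_accessed": date(2023, 11, 2)},
]

stale = [c["name"] for c in content if TODAY - c["last_accessed"] > STALE_AFTER]
print(stale)
```

A report like this feeds the "process for removing stale published data sources" question in the considerations below.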
Key considerations for monitoring & management
- Are schedules available for the times needed for extract refreshes?
- How is raw data ingestion monitored from source systems? Did the jobs complete successfully?
- Are there duplicate sources of data?
- When are extract refreshes scheduled to run? How long do extracts run on the server? Did the refresh succeed or fail?
- Are subscription schedules available after extract refreshes have occurred?
- Are data sources being used? By whom? How does this compare with the expected audience size?
- What is the process for removing stale published data sources?
Data governance summary
Striking the balance between control and agility is critical. When governance policies are overly stringent, users often go down the route of saving sensitive data and analytics locally for quick analysis. In a self-service environment, the role of data governance is to permit access to data and enable users to get the answers they need while ensuring security is enforced. Although every organisation has different requirements, the table below describes the ideal state for governing self-service data access:
Data source management
- IT administrators / BI professionals: Provide access to sources of data and comply with organisational data strategy, policies and procedures.
- Content authors: Define, manage and update data models used for analysis.
Data quality
- IT administrators / BI professionals: Define the process to validate data and build trust in its accuracy for decision-making.
- Content authors: Capture and expose data-cleansing rules applied to published data models.
Enrichment & preparation
- IT administrators / BI professionals: Create ETL processes from multiple sources of data to make data ready for analysis.
- Content authors: Capture and expose enrichment and preparation rules applied to published data models.
Data security
- IT administrators / BI professionals: Define security parameters and access controls for published data models.
- Content authors: Comply with enterprise data security policies and external regulations.
Metadata management
- IT administrators / BI professionals: Define organisational policies and processes for metadata management.
- Content authors: Define, update and expose field-level metadata for users.
Monitoring & management
- IT administrators / BI professionals: Monitor and audit usage to ensure compliance and appropriate use of data assets.
- Content authors: Monitor and track usage metrics for centrally managed data models.
Content governance in Tableau
As the use of analytics increases, a growing number of mission-critical business decisions will become data-driven. The net effect is not only an increase in content volume but also in the varying skill levels among its users who will be collaborating and uncovering valuable insights. With more and more people using data daily, it is critical that Tableau content can be secured, governed and trusted – as well as organised so that people can discover, consume and create content with confidence. Without content governance, users will find it increasingly difficult to find what they need among irrelevant, stale or duplicate workbooks and data sources.
Content governance involves the processes that keep content relevant and fresh, such as knowing when to decommission content because it’s not getting the expected traffic or finding out why no one is using an important dashboard for decision-making. The responsibility of ensuring compliance with an organisation’s content governance policies is a core responsibility of content authors.
This section provides IT administrators and business users with the core concepts underpinning Tableau’s content governance features and guidance on how these concepts should be applied to manage the content created in a thriving modern analytics platform.
Defining a consistent content organisation structure allows administrators to manage content and makes content more discoverable by users. Tableau Server and Tableau Cloud give you the flexibility needed to structure your environment and manage content based on your specific governance requirements. Thoughtfully structuring your site will help you deliver true self-service analytics at scale and ensure the responsible use of data to enable your users to discover and share insights.
To share and collaborate, users will create and publish content to a project in Tableau Server or Tableau Cloud. Projects are the default containers used to organise and secure content, holding workbooks, data sources, flows and other nested projects within them. This creates a scalable structure for managing access to the content published to Tableau.
Organisations are not flat, and neither is the way you govern your content. Projects and nested projects behave much like file system folders to provide hierarchical structures that gather related data and content with the users, groups and corresponding permissions that mirror your business. Only administrators can create top-level projects, but it is easy to delegate nested projects to project owners or project leaders for their specific needs. Common content management approaches include organisational (by department/team), functional (by topic), or hybrid (a combination of organisational and functional). When planning the content structure, the cross-functional Tableau team should establish consistent naming conventions for projects and the groups who will have access to them.
For example, in the initial Tableau Server deployment, the sales, marketing and IT departments will be onboarded. Following the organisational structure, a top-level project will be created for each department. The users in these three departments also happen to be part of the cross-functional digital transformation team, and because digital transformation content spans users from multiple departments, a separate project named Digital Transformation will also be needed. Users from each department will be members of a group that grants access to the relevant projects. Users and groups only see projects to which they have access, so do not be concerned by the number of projects you see as an administrator.
Sandbox and certified projects
To support self-service, sandbox and production projects should be used. Sandbox projects contain ad-hoc or uncertified content, and production projects contain validated, certified content. Users should understand the difference in purpose between these two project types. All content authors with access to a sandbox project can freely explore data, author content and perform ad-hoc analysis. The production project’s validated and certified content means that there is a high degree of trust and confidence in it for data-driven decision making.
Publishing to the production project is limited to a small group of users who will validate, promote and certify content for this location. These content management tasks should be delegated to users who are project owners and project leaders. For more information, see Project Level Administration (Tableau Server | Tableau Cloud). The roles and the process of content validation, promotion and certification are described later in this topic.
The diagram below shows the Sales department's project hierarchy with a Sales Department Data Sources project, which holds department-wide data sources. The nested projects within the sales department's project map to sales regions. Groups corresponding to the users within each region have access to the appropriate regional nested project. The content created by the regions will exist alongside nested projects within them, which will be used to organise and secure it as needed. Your organisational structure is an appropriate place to begin mapping out your Tableau content structure because departments likely already have security, data and application access that correspond with their job functions.
As a department-team example, Marketing branches out to accommodate shared resources such as department-wide production content and data sources, but then locks down specific resources for a group such as Digital which has its own production and sandbox projects. The marketing project hierarchy is shown below.
Permissions should be managed at the project level using locked projects and groups to enforce governed access to content and simplify administration. While it is possible to manage permissions at an item level with unlocked projects, they will quickly become unwieldy to manage. Locked projects secure data while providing collaboration across projects when you need it. For more information, see Use projects to manage content access (Windows | Linux).
With the introduction of locked nested projects in 2020.1, a project can be locked at any level in the project hierarchy, regardless of whether the parent is locked with different permissions. Tableau Server and site administrators and Tableau Cloud site administrators can manage content and permissions more effectively by delegating content management responsibilities to project owners or project leaders, who are closer to the work. Those delegates can then lock nested projects with the permissions model that meets their group’s needs at any level in the hierarchy.
Check Apply to nested projects to lock nested projects independently.
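As an illustration of how a locked (or locked nested) project could be created programmatically, the sketch below builds the request body for the REST API’s Create Project call. It assumes the `contentPermissions` and `parentProjectId` request attributes described in the REST API reference; the API version, site ID and project LUID in the comments are placeholders.

```python
import xml.etree.ElementTree as ET

def project_request_xml(name, parent_project_id=None,
                        content_permissions="LockedToProject"):
    """Build a Create Project request body for the Tableau REST API.

    contentPermissions="LockedToProject" locks permission rules for all
    content in the project; pass parent_project_id to create a locked
    nested project (supported since 2020.1).
    """
    ts = ET.Element("tsRequest")
    attrs = {"name": name, "contentPermissions": content_permissions}
    if parent_project_id is not None:
        attrs["parentProjectId"] = parent_project_id
    ET.SubElement(ts, "project", attrs)
    return ET.tostring(ts, encoding="unicode")

# POST the body to /api/<version>/sites/<site-id>/projects with a valid
# X-Tableau-Auth header; <version> and <site-id> are deployment-specific.
body = project_request_xml("Sales", parent_project_id="parent-project-luid")
```

The same helper covers both top-level and nested projects, so a provisioning script can create an entire department hierarchy from one definition.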
Collections, introduced in 2021.2, provide a virtual container for content. Think of Collections as a playlist like you would find in Spotify, giving you the ability to curate the combination of content you want to share with others. This functionality differs from favouriting, which you cannot share with others.
Getting started with Collections is easy and available for any Tableau user site role.
You can add most content types (e.g. workbooks, views, data sources, etc.) to a collection from anywhere across a single site regardless of its project location. It’s a flexible way to onboard new team members, support your workflows and share related content without moving or duplicating existing items. Item permissions are still enforced, so only the appropriate users will see and have access to content that’s in the collection.
There are many ways to use Collections as part of your organisation’s content management framework. Continuing the example above, imagine that your organisation has multiple projects (Sales and Marketing). You want to give users the ability to easily find related content from across these projects, so you create a Collection. Now teams can easily weave a complete picture of a topic from one place.
To provide a place for all individuals to save their work securely on Tableau Server or Tableau Cloud, you should create a single personal sandbox project with permissions that restrict content owners to viewing only their own items. The personal sandbox can be used for ad-hoc or in-progress analysis, and it hides content that is not ready for broad release. When ready, users can move their content to the department sandbox for the validation, promotion and certification process. A single personal sandbox for all users reduces administrative overhead by reducing the number of projects to secure and manage. After creating a top-level project named “Personal Sandbox”, set the permissions on the project for All Users to Publish, with None for workbooks, data sources, flows and metrics. (The legacy Metrics feature will be retired in Tableau Server version 2024.2 and in February 2024 for Tableau Cloud. For more information, see Create and Troubleshoot Metrics (Retired).)
Publisher-only permissions at the project level only
With personal sandbox content in a single location, administrators can monitor how often content is viewed, suggest when owners should delete stale content and check who is making the most use of the personal sandbox. Content owners can always see the content they own, even if it is published in a project where they are denied the ability to view workbooks and data sources. Authorisation is explained in more detail in the next section.
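The personal sandbox pattern above can also be scripted. The sketch below builds a permissions request body for the REST API; it assumes the `granteeCapabilities` payload shape from the REST API reference and that the project-level `Write` capability corresponds to Publish — verify both against your server’s API version. The group LUID is a placeholder.

```python
import xml.etree.ElementTree as ET

def capability_rule_xml(group_id, capabilities):
    """Build a permissions request body granting a group the given
    capability modes, e.g. {"Write": "Allow"} to allow publishing.
    """
    ts = ET.Element("tsRequest")
    perms = ET.SubElement(ts, "permissions")
    grantee = ET.SubElement(perms, "granteeCapabilities")
    ET.SubElement(grantee, "group", {"id": group_id})
    caps = ET.SubElement(grantee, "capabilities")
    for name, mode in capabilities.items():
        ET.SubElement(caps, "capability", {"name": name, "mode": mode})
    return ET.tostring(ts, encoding="unicode")

# For a "Personal Sandbox" project: allow All Users to publish at the
# project level (PUT .../projects/<project-id>/permissions) and add no
# rules to the project's default-permissions endpoints, leaving workbook
# and data source capabilities Unspecified ("None").
body = capability_rule_xml("all-users-group-luid", {"Write": "Allow"})
```

Leaving capabilities unspecified rather than denied is what lets owners still see their own items while everyone else sees nothing.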
Both Tableau Server and Tableau Cloud support multi-tenancy using sites. In Tableau Server, you can create multiple sites to establish a security boundary that isolates specific users, groups, data and content on the same Tableau Server deployment. Users of one site do not have access to another site, including visibility of its existence. Because of the strict boundaries, sites work well when there is a deliberate need to prevent users from collaborating, or when content can remain separate during all phases of development.
For example, the diagram below shows two Tableau Server sites. In this example, unique users in Site 1 have no access to Site 2, including data and content. A user with access to both Site 1 and Site 2 can only sign into one site at a time. If some content is needed by both sites’ users, it will need to be duplicated within each site, or a new site will need to be created for the shared content for these users, which creates a lot more administrative overhead to monitor, measure and maintain. In Tableau Cloud, your instance of Tableau is a single site.
Sites create hard boundaries (see diagram above)
Sites in Tableau Server may initially appear to be a useful construct for segmenting data sources, workbooks and users, but the security boundary prohibits collaboration and content promotion that most organisations need for true self-service at scale. For this reason, carefully consider the implications of using sites instead of projects in a single site with delegated content management responsibilities. To illustrate the hard boundaries across sites, when you stand up a new site, relevant data sources need to be re-created in the new instance.
New sites should only be created when you need to manage a unique set of users and their content separately from all other Tableau users and content, because content is purposefully not shareable across the boundaries. For more information and examples of when it makes sense to use sites, see Sites overview (Windows | Linux).
Key considerations for content management
- Will workbooks and data sources be shared across the company?
- Will sites be used to isolate sensitive content or departments?
- Will projects use an organisational (departments/teams), functional (topics) or hybrid approach?
- Have sandbox and production projects been set up to support ad-hoc and validated content?
- Are content naming conventions used?
- Are authors publishing multiple copies of the same workbook with different filters selected?
- Does content have a description and tags? Does it comply with visual styles?
- Do you have a load time expectation and an exception procedure in place?
- After employees leave the company, what is the process to reassign content ownership?
When a user attempts to log in to Tableau, authentication verifies the user’s identity. Everyone who needs access to Tableau Server must be represented as a user in Tableau Server’s identity store. Tableau Cloud authentication supports Tableau, Google and SAML to verify a user’s identity. Authorisation refers to what users can access on Tableau Server and Tableau Cloud, and how, after they have been authenticated. Authorisation includes:
- What users are allowed to do with content hosted in Tableau Server and Tableau Cloud, including site, projects, workbooks, views, data sources and flows.
- What tasks users are allowed to perform to administer Tableau Server and Tableau Cloud, such as configuring server and site settings, running command line tools, creating sites and other tasks.
Authorisation for these actions is managed by Tableau Server and Tableau Cloud, and determined by a combination of the user’s licence type, site role and permissions associated with specific entities such as workbooks and data sources. Tableau’s role-based licences have implicit governance built in because of the capabilities that are included with them.
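As a minimal sketch of the authentication step, the helper below builds a REST API Sign In request body using a personal access token (the request shape follows the REST API reference; the token name and secret are placeholders). The token returned by the sign-in call is what every subsequent request is authorised against, according to the signed-in user’s site role and permissions.

```python
import xml.etree.ElementTree as ET

def signin_request_xml(pat_name, pat_secret, site_content_url=""):
    """Build a Sign In request body using a personal access token.

    An empty contentUrl targets the default site on Tableau Server, or
    your single site on Tableau Cloud.
    """
    ts = ET.Element("tsRequest")
    creds = ET.SubElement(ts, "credentials", {
        "personalAccessTokenName": pat_name,
        "personalAccessTokenSecret": pat_secret,
    })
    ET.SubElement(creds, "site", {"contentUrl": site_content_url})
    return ET.tostring(ts, encoding="unicode")

# POST the body to /api/<version>/auth/signin; the credentials token in
# the response is sent as the X-Tableau-Auth header on later calls.
body = signin_request_xml("automation-token", "token-secret", "marketing")
```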
When you add users to a site on Tableau Server or Tableau Cloud, independent of their licence type, you must apply a site role to them. The site role signifies the maximum level of access a user can have on the site.
Users with a Tableau Creator licence have access to Tableau Server or Tableau Cloud, Tableau Desktop, Tableau Prep Builder and Tableau Mobile. The following site roles use a Tableau Creator licence:
Server Administrator
Available on Tableau Server only; not applicable to Tableau Cloud.
Configure settings for Tableau Server, all sites on the server, users and groups, and all content assets, such as projects, data sources (including connection information), workbooks and flows.
Creator
Connect to Tableau published data sources or external data from the browser, Tableau Desktop or Tableau Prep Builder; create and publish new data sources and flows; author and publish workbooks.
Site Administrator Creator
Unrestricted access to content as described above, but at the site level. Connect to Tableau or external data in the browser, Tableau Desktop or Tableau Prep Builder; create new data sources; build and publish content.
On Tableau Server, server administrators can determine whether to allow site administrators to manage users and assign site roles and site membership. By default, on Tableau Server, and always on Tableau Cloud, site administrators are allowed these capabilities.
This is the highest level of access for Tableau Cloud. Site administrators have access to site configuration settings.
Connect to data to author new data sources and dashboards, which are published and shared on Tableau Server and Tableau Cloud. Data stewards (DBA or data analyst) publish data sources. Creators incorporate process definitions, policies, guidelines and business knowledge on enterprise metadata management in compliance with organisational and/or regulatory obligations.
Users with a Tableau Explorer licence have access to Tableau Server or Tableau Cloud and Tableau Mobile. The following site roles use a Tableau Explorer licence:
Site Administrator Explorer
Same access to site and user configuration as Site Administrator Creator, but cannot connect to external data from the web editing environment.
Connect to Tableau published data sources to create new workbooks and edit and save existing workbooks.
Explorer (Can Publish)
Publish new content from browser, browse and interact with published views, use all interaction features. In the web editing environment, can edit and save existing workbooks, and save new standalone data sources from data connections embedded in workbooks, but cannot connect to external data and create new data sources.
Explorer
Browse and interact with published views. Can subscribe to content, create data-driven alerts, connect to Tableau published data sources and open workbooks in the web authoring environment for ad-hoc queries, but cannot save their work.
Users with a Tableau Viewer licence have access to Tableau Server or Tableau Cloud and Tableau Mobile.
Viewer
View and interact with filters and content. Viewers can also receive alerts triggered by business events.
Users who have been added to Tableau Server or Tableau Cloud but without a licence are unlicensed.
Unlicensed users cannot sign in to Tableau Server or Tableau Cloud.
Site roles along with content permissions determine who can publish, interact with or only view published content, as well as who can manage the site’s users and administer the site itself. The project team should work together to define the content permissions model. Tableau Server and/or site administrators will assign permission rules to groups and lock them to the project. Locked projects enforce permission rules on all content within the container, including nested projects.
Tableau has default permission rules for projects, workbooks and data sources, or you can define custom permission rules for these content types.
Permission rules template
Combined with the appropriate site role, allows the user or group full access to the project, its child projects, and content published into that project hierarchy.
Allows the user or group to connect to, edit, download, delete, and set permissions for data sources or workbooks in the project.
They can also publish data sources, and if they own a published data source, they can update its connection information and extract refresh schedules. This permission is also relevant for views when the view a user accesses connects to a data source.
Allows the user or group to publish workbooks and data sources to the project.
Allows the user or group to connect to data sources in the project.
Allows the user or group to view the workbooks and views in the project.
Sets all capabilities for the permission rule to Unspecified.
Sets all capabilities for the permission rule to Denied.
Custom permissions allow more granularity in permissions – from accessing or downloading a data source to how a user interacts with published content. Tableau’s intuitive interface makes it easy to associate users to functional groups, assign permissions to the groups and see who has access to which content. For more information, see Set permissions on individual content resources. If Data Management is present, permissions for external assets have additional considerations. For more information, see Manage Permissions for External Assets.
You should create groups locally on the server, or import them from Active Directory/LDAP and synchronise them on a set schedule. Synchronisation schedules are set by the Tableau Server administrator or Tableau Cloud site administrators. To simplify maintenance, assign permissions to groups at the project level, as shown below. For Tableau Cloud, you can Automate User Provisioning and Group Synchronisation in Tableau Cloud through an External Identity Provider via SCIM, and programmatically add or remove users and group members using the REST API.
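As a hedged sketch of the programmatic route, the helpers below build request bodies for the REST API’s Add User to Site and Add User to Group calls (the attribute names follow the REST API reference; endpoint paths in the comments assume the documented URI structure). A sync job would diff your identity provider’s membership against the Tableau group and post or delete accordingly.

```python
import xml.etree.ElementTree as ET

def add_user_xml(name, site_role="Explorer"):
    """Add User to Site body (POST .../sites/<site-id>/users).

    siteRole caps the maximum level of access the user can have on the
    site, e.g. "Creator", "Explorer", "ExplorerCanPublish", "Viewer".
    """
    ts = ET.Element("tsRequest")
    ET.SubElement(ts, "user", {"name": name, "siteRole": site_role})
    return ET.tostring(ts, encoding="unicode")

def add_user_to_group_xml(user_id):
    """Add User to Group body (POST .../groups/<group-id>/users)."""
    ts = ET.Element("tsRequest")
    ET.SubElement(ts, "user", {"id": user_id})
    return ET.tostring(ts, encoding="unicode")

# A sync script compares the identity provider's membership list with the
# Tableau group's current members, then POSTs these payloads for missing
# users and DELETEs departed ones.
```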
Key considerations for authorisation
- What is the minimum site role for Active Directory/LDAP or SCIM group synchronisation?
- Have you set all permissions for the All users group in the Default project to None?
- Are any explicit restrictions (Deny permissions) needed on the All users group to propagate to every user account?
- Have you created groups that correspond to a set of authoring and viewing capabilities for each project?
- Have you reviewed effective permissions on select users to test your permissions model?
- Have you locked permissions at the parent project to maintain security throughout the project hierarchy?
- Have service account usernames/passwords been established for published data sources?
Content validation is the first step in a series of events that will culminate in content certification. Similar to the data quality area in data governance, content validation encompasses the processes to validate that content is accurate, complete, reliable, relevant and recent.
The first role to validate content should be its author. Authors should also solicit feedback from the target audience, either in an informal feedback group or by sharing a link to the workbook. Data stewards should play a role in ensuring accuracy; if the data source is embedded in the workbook, the data steward should consider whether it is a potential candidate for publishing and certifying. Beyond data and calculation correctness, content validation should also include a review of branding, layout, formatting, performance, filters, dashboard actions and edge-case behaviours by users with the Site Administrator or Project Leader site roles.
Key considerations for content validation
- Who is involved in the validation process?
- Is the workbook accurate, complete, reliable, relevant and recent?
- Does the new content replace existing content?
- Are the underlying data and calculations correct?
- Does the workbook reflect corporate branding?
- Does the workbook have a logical layout?
- Are all axes and numbers formatted correctly?
- Do dashboards load within the acceptable performance time?
- Do filters and dashboard actions behave on the targeted views?
- Does the dashboard remain useful in edge case behaviours (filtered to all, none, one value, etc.)?
After content validation is complete, the process of content promotion is used to publish the workbook to a trusted project location or add the certification badge designation for published data sources. An example of a workbook workflow is shown below.
Content authors will connect to data, create new dashboards and publish to the sandbox project. Site administrators or project leaders will validate and approve the content, which is then published to the production project. The Content Migration Tool, licensed as part of Tableau Advanced Management, provides an easy way to promote or migrate content between Tableau Server projects. You can do this between projects on separate Tableau Server installations (for instance, between a development instance of Tableau Server and a production installation, with appropriate licensing for each environment), or between projects on a single Tableau Server installation. The Content Migration Tool user interface walks you through the steps necessary to build a “migration plan” that you can use either a single time or as a template for multiple migrations. To learn more about use cases, visit Content migration tool use cases.
If IT requirements mandate three separately licensed environments (dev, QA, and production), try not to replicate a traditional waterfall development cycle with a modern analytics platform. Users may favour the QA environment to circumvent stringent policies or delays to get content into production, so work towards a good balance by automating content migration to the production server with custom workflow scripts using Tableau’s REST APIs.
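One way such a custom workflow script might work, sketched under the assumption of the documented Publish Workbook request shape: download the validated workbook from the development environment (GET .../workbooks/&lt;id&gt;/content), then republish it to the production project with a multipart request whose XML part is built below. The workbook name and project LUID are placeholders.

```python
import xml.etree.ElementTree as ET

def publish_workbook_xml(workbook_name, target_project_id,
                         show_tabs=False):
    """Build the XML part of a Publish Workbook multipart request.

    The downloaded workbook file forms the other part of the multipart
    payload; this part names the workbook and its target project.
    """
    ts = ET.Element("tsRequest")
    wb = ET.SubElement(ts, "workbook", {
        "name": workbook_name,
        "showTabs": str(show_tabs).lower(),
    })
    ET.SubElement(wb, "project", {"id": target_project_id})
    return ET.tostring(ts, encoding="unicode")

# Promotion sketch: download from the dev server, run validation checks,
# then POST this part plus the file to
# /api/<version>/sites/<site-id>/workbooks?overwrite=true on production.
body = publish_workbook_xml("Quarterly Sales", "production-project-luid")
```

Keeping the validation checks in the script (rather than in a manual gate) is what makes the promotion agile enough that users aren’t tempted to work around it.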
Key considerations for content promotion
- Who is involved in the promotion process?
- Do content-promoting roles have a checklist of criteria to evaluate?
- Have you clearly delineated between certified content and ad-hoc content by projects?
- Is the process agile, to support iterations and innovation?
- Do you have workflows to address both direct and restricted sources of data and workbooks?
After content has been validated and promoted, it achieves a trusted, certified status when a Site Administrator, Project Leader or Publisher (content author or data steward) with the appropriate permission for the production project promotes the workbook or data source to the designated location. Certification makes content discoverable by content consumers and improves data stewards’ abilities to govern enterprise-wide data more effectively in Tableau by reducing the proliferation of duplicate workbooks and data sources.
Use the baseline requirements that were established in key considerations for content validation as the criteria for becoming certified. Content authors should have a clear understanding of how the certification process works from start to finish, and content consumers should know where certified content is published in the production project, as defined by your content management standards.
Data source certification enables data stewards to promote specific data sources in your Tableau deployment as trusted and ready for use. Certified data sources get preferential treatment in Tableau Server and Tableau Cloud search results and in our smart data source recommendations algorithm so that they are discoverable and easily reusable.
Certified data source
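Certification itself can be automated once the checklist passes. The sketch below builds the request body for the REST API’s Update Data Source call, assuming the `isCertified` and `certificationNote` attributes described in the REST API reference; verify them against your API version.

```python
import xml.etree.ElementTree as ET

def certify_datasource_xml(certified=True, note=""):
    """Build an Update Data Source body toggling certification.

    The certification note should tell consumers why the data source can
    be trusted (who validated it, against what criteria).
    """
    ts = ET.Element("tsRequest")
    ET.SubElement(ts, "datasource", {
        "isCertified": str(certified).lower(),
        "certificationNote": note,
    })
    return ET.tostring(ts, encoding="unicode")

# PUT the body to /api/<version>/sites/<site-id>/datasources/<ds-id>
# once the content has passed the validation and promotion checklists.
body = certify_datasource_xml(True, "Validated by the data steward")
```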
Key considerations for content certification
- Who is responsible for designating certified content?
- Have all criteria for achieving certification status been met?
- Are all fields completed: about, certification notes, tags?
Content utilisation is a measurement of the effective use of the data for business decisions, but the complete picture cannot be told through traffic to views alone. Measurement of content utilisation helps your deployment to operate at scale and evolve by understanding user behaviour – who creates and consumes content, and the quality and relevance of the dashboards and data sources. If content isn’t being consumed, you will be able to identify it and take the appropriate next steps.
Tableau Server administrators and Tableau Cloud site administrators should monitor broad usage patterns with default administrative views. For more specific requirements, it is possible to create custom administrative views. For Tableau Server, this can be done with Tableau Server repository data. In Tableau Cloud, site administrators have access to Monitor site activity with default administrative views and can Use Admin Insights to create custom views. Site administrators should measure and audit usage of published content – both certified and ad-hoc – within their site. For example, if ad-hoc content utilisation is significantly higher than certified content utilisation, perhaps the promotion process is too restrictive or takes too long for business needs.
Site administrators should review content utilisation in the context of the expected audience sizes that were documented on the Tableau use cases and data sources tab. Individual content authors should also review utilisation for their content in the sparkline tooltip by hovering over the workbook’s thumbnail, or by selecting Who has seen this view from the menu. For more information, see Measurement of Tableau user engagement and adoption.
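As a small utilisation-audit sketch: assuming the Query Views for Site call is made with its usage-statistics option, the helper below parses the response and flags low-traffic views as candidates for stale-content review. The sample response is illustrative; real responses carry an XML namespace that would need handling.

```python
import xml.etree.ElementTree as ET

def stale_views(views_xml, min_total_views):
    """Return names of views whose total view count falls below
    min_total_views -- candidates for archiving or purging.
    """
    root = ET.fromstring(views_xml)
    stale = []
    for view in root.iter("view"):
        usage = view.find("usage")
        total = int(usage.get("totalViewCount", "0")) if usage is not None else 0
        if total < min_total_views:
            stale.append(view.get("name"))
    return stale

# Illustrative response (namespace omitted for brevity).
sample = """<tsResponse><views>
  <view name="Pipeline"><usage totalViewCount="412"/></view>
  <view name="Old Forecast"><usage totalViewCount="3"/></view>
</views></tsResponse>"""
print(stale_views(sample, 10))  # flags "Old Forecast"
```

The threshold and the purge policy belong to your content governance standards; the script only surfaces the candidates.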
Key considerations for content utilisation
- How much traffic goes to each view?
- What is the definition of stale content? How often is stale content purged?
- How much indirect utilisation (alerts & subscriptions) occurs?
- Are subscriptions delivered on time?
- Does the actual audience size match with expectations?
- Does content follow a weekly, monthly, quarterly trend?
- What is the frequency of login or days since last login by user cohort?
- What is the distribution of workbook and data source size?
Content governance summary
The table below defines the ideal state for promoting and governing content in a thriving modern analytics deployment. For each area, the first responsibility belongs to IT administrators/BI professionals and the second to content authors:

Content storage and organisation
Create and maintain an environment for storing and organising published content.
Ensure content in their site or project is relevant.

Security & permissions
Secure analytical content and grant users the appropriate levels of access based on content type, sensitivity, business need, etc.
Comply with organisational security and permissions policies.

Content validation
Confirm that the process for validating content is correct.
Access platform capabilities to assist with validation and accuracy verification of user-generated analytical content.

Content promotion
Define process for promoting content.
Promote validated analytical content to a centralised, trusted environment as determined by the governance process.

Content certification
Define process for certifying content.
Certify content as trusted and delineate from untrusted content in the same environment.

Content utilisation
Measure broad usage patterns across organisational business units.
Measure and audit usage of published content and track usage of untrusted content.