Governance in Tableau

Governance in Tableau is a critical step in driving usage and adoption of analytics while maintaining the security and integrity of your data. You must define standards, processes, and policies to securely manage data and content through the Modern Analytics Workflow. Just as important as defining these is having everyone in the workflow understand and comply with them, so that users will have trust and confidence in the analytics they’ll use to make data-driven decisions.

To define your organization’s Tableau Governance Models, you should work through the areas of data and content governance that are outlined in the diagram below using the Tableau Blueprint Planner.

 

Data Governance in Tableau

The purpose of data governance in the Modern Analytics Workflow is to ensure that the right data is available to the right people in the organization, at the time they need it. It creates accountability and enables, rather than restricts, access to secure and trusted content for users of all skill levels.

Data Source Management

Data source management includes processes related to selection and distribution of data within your organization. Tableau connects to your enterprise data platforms and leverages the governance you already have applied to those systems. In a self-service environment, content authors and data stewards have the ability to connect to various data sources, build and publish data sources, workbooks, and other content. Without these processes, there will be a proliferation of duplicate data sources, which will cause confusion among users, increase likelihood of errors, and consume system resources.

Tableau’s hybrid data architecture provides two modes for interacting with data, using a live query or an in-memory extract. Switching between the two is as easy as selecting the right option for your use case. In both live and extract use cases, users may connect to your existing data warehouse tables, views, and stored procedures to leverage those with no additional work. 

Live queries are appropriate if you have invested in a fast database, need up-to-the-minute data, or use Initial SQL. In-memory extracts should be used if your database or network is too slow for interactive queries, to take load off transactional databases, or when offline data access is required.

With support for a new multi-table logical layer and relationships in Tableau 2020.2, users aren’t limited to using data from a single, flat, denormalized table in a Tableau Data Source. They can now build multi-table data sources with flexible, LOD-aware relationships between tables, without having to specify join types in anticipation of what questions can be asked of the data. With multi-table support, Tableau data sources can now directly represent common enterprise data models such as star and snowflake schemas, as well as more complex, multi-fact models. Multiple levels of detail are supported in a single data source, so fewer data sources are needed to represent the same data. Relationships are more flexible than database joins and can support additional use-cases as they arise, reducing the need to build new data models to answer new questions. Using relationships in well-modeled schemas can reduce both the time to create a data model as well as the number of data sources to answer business questions. For more information, see Metadata Management later in this section and The Tableau Data Model.

When publishing a workbook to Tableau Server or Tableau Online, the author will have a choice to publish the data source or leave it embedded in the workbook. The data source management processes you define will govern this decision. With Tableau Data Server, which is a built-in component of the Tableau platform, you can share and reuse data models, secure how your users access data, and manage and consolidate extracts with Published Data Sources. Further, Published Data Sources allow Tableau Creator- and Explorer-licensed users to have access to secure, trusted data in Tableau for web authoring and Ask Data. For more information, see Best Practices for Published Data Sources, Edit Views on the Web, and Optimize Data for Ask Data.

With increased data discovery capabilities, Tableau Catalog indexes all content, including workbooks, data sources, and flows to allow authors to search for fields, columns, databases, and tables in workbooks and published data sources. For more information, see Data Management Add-on.

When Tableau Catalog is enabled, content authors can Search for Data by selecting from Data Sources, Databases and Files, or Tables to see whether the data they need already exists on Tableau Server or Tableau Online, minimizing duplication of data sources.

In addition, the Data Details tab on a view published to Tableau Server and Tableau Online will provide consumers with relevant information about the data used in it. Details include information about the workbook (name, author, date modified), the data sources used in the view, and a list of the fields in use.

For data stewards who create new Published Data Sources, the workflow below shows the two major decision points that impact data source management—live or extract and embedded or shared data model. This is not to imply that a formal modeling process must always occur before analysis begins.
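These two decision points can be sketched as a simple helper. The inputs and logic below are illustrative assumptions distilled from the guidance above, not Tableau settings:

```python
# Sketch of the two data source management decision points:
# live vs. extract, and embedded vs. published. Inputs are
# illustrative assumptions, not actual Tableau configuration.

def connection_mode(fast_database: bool, needs_live_data: bool,
                    offline_access_required: bool) -> str:
    """Choose a live query or an in-memory extract per the guidance above."""
    if offline_access_required or not fast_database:
        return "extract"
    return "live" if needs_live_data else "extract"

def data_model_scope(reused_across_workbooks: bool) -> str:
    """Embed the data model in one workbook, or publish it for reuse."""
    return "published" if reused_across_workbooks else "embedded"
```

A data source reused by multiple teams against a slow transactional database would come out as a published extract, for example.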


To discover and prioritize key sources of data, use the Tableau Data and Analytics Survey and Tableau Use Cases and Data Sources tabs in the Tableau Blueprint Planner.

 

Key Considerations for Data Source Management

  • What are the key sources of data for a department or team?
  • Who is the data steward or owner of the data?
  • Will you connect live or extract the data?
  • Should the data source be embedded or published?
  • Do variants of a dataset exist? If so, can they be consolidated as an authoritative source?
  • If multiple data sources are consolidated, does the performance or utility of the single data source suffer from attempting to fulfill too many use cases at once?
  • What business questions need to be answered by the data source?
  • What naming conventions are used for Published Data Sources?

 

Data Quality

Data quality is a measure of data's fitness to serve its purpose in a given context—in this case, for making business decisions. The quality of data is determined by factors such as accuracy, completeness, reliability, relevance, and freshness. You likely already have processes in place to ensure data quality as it is ingested from source systems, and the more that is fixed in upstream processes, the less correction will be needed at the time of analysis. You should ensure data quality is consistent all the way through to consumption.

As you are planning, it is a good time to review existing upstream data quality checks, because data will be available to a larger group of users under a self-service model. In addition, Tableau Prep Builder and Tableau Desktop are great tools for detecting data quality issues. By establishing a process to report data quality issues to the IT team or data steward, data quality will become an integral part of building trust and confidence in the data.

With the Tableau Data Management Add-on and Tableau Catalog, you should communicate data quality issues to your users to increase visibility and trust in the data. When a problem exists, you can set a warning message on a data asset so that users of that data asset are aware of particular issues. For example, you might want to let users know that the data hasn't been refreshed in two weeks or that a data source has been deprecated. You can set one data quality warning per data asset, such as a data source, database, flow, or table. For more information, see Set a Data Quality Warning, including the following types: Warning, Deprecated, Stale Data, and Under Maintenance.

Note that you can set a data quality warning using the REST API. For more information, see Add Data Quality Warning in the Tableau REST API Help.
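As a rough sketch, the request body for that call can be built as below. The attribute names, type values, and endpoint path are assumptions drawn from the REST API Help and should be verified against your server's API version:

```python
import xml.etree.ElementTree as ET

def dqw_request_body(warning_type: str, message: str, is_active: bool = True) -> bytes:
    """Build the XML body for a data quality warning request (assumed shape)."""
    ts_request = ET.Element("tsRequest")
    ET.SubElement(ts_request, "dataQualityWarning", {
        "type": warning_type,               # e.g. WARNING, DEPRECATED, STALE, MAINTENANCE
        "isActive": str(is_active).lower(),
        "message": message,
    })
    return ET.tostring(ts_request)

body = dqw_request_body("STALE", "Source table has not refreshed in two weeks")
# PUT /api/<version>/sites/<site-id>/dataQualityWarnings/datasource/<datasource-id>
```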

Key Considerations for Data Quality

  • What processes exist for ensuring accuracy, completeness, reliability, and relevance?
  • Have you developed a checklist to operationalize the process?
  • Who needs to review data prior to it becoming shared and trusted?
  • Is your process adaptable to business users and are they able to partner with data owners to report issues?

 

Enrichment & Preparation

Enrichment and preparation include the processes used to enhance, refine, or prepare raw data for analysis. Often a single data source does not answer all the questions a user may have. Adding data from different sources adds valuable context. You likely already have ETL processes to clean, combine, aggregate, and store data when ingesting raw data from various sources. With command line interfaces and APIs, Tableau can be integrated with your existing processes.

For self-service data preparation, Tableau Prep Builder and Tableau Prep Conductor should be used to combine multiple sources of data and automate flows on a schedule. Tableau Prep supports multiple output types on Tableau Server or Tableau Online, including CSV, Hyper, and TDE files, as well as Published Data Sources. Beginning with 2020.3, Tableau Prep outputs also include database tables, where the result of a flow can be saved to a table in a relational database. This means that prepped data from Tableau Prep Builder can be stored and governed in a central location and leveraged throughout your organization. Tableau Prep Builder is part of the Tableau Creator license, while Tableau Prep Conductor is part of the Tableau Data Management Add-on. Tableau Data Management helps you better manage the data within your analytics environment from data preparation to cataloging, search, and governance, ensuring that trusted and up-to-date data is always used to drive decisions.

With visual, smart, direct feedback at every step, Tableau Prep Builder helps users prototype and prepare disparate sources of data for analysis. Once the steps are defined and verified, the flow should be published to Tableau Server or Tableau Online, where Prep Conductor will execute the flow and output a Published Data Source on the specified schedule. Automation creates a consistent process, reduces error-prone manual steps, tracks success/failure, and saves time. Users will have confidence in the output because the steps can be viewed on Tableau Server or Tableau Online.

 

Tableau Prep Flow

Tableau Prep Flow in Tableau Server or Tableau Online

Key Considerations for Data Enrichment

  • Will data enrichment and preparation be centralized or self-service?
  • What organizational roles perform data enrichment and preparation?
  • What ETL tools and processes should be used to automate enrichment and/or preparation?
  • What sources of data provide valuable context when combined with each other?
  • How complex are the data sources to be combined?
  • Will users be able to use Tableau Prep Builder and/or Tableau Desktop to combine datasets?
  • Have standardized join or blend fields been established by the DBA to enable users to enrich and prepare datasets?
  • How will you enable self-service data preparation?

 

Data Security

Data security is of utmost importance in every enterprise. Tableau allows customers to build upon their existing data security implementations. IT administrators have the flexibility to implement security within the database with database authentication, within Tableau with permissions, or a hybrid approach of both. Security will be enforced regardless of whether users are accessing the data from published views on the web, on mobile devices, or through Tableau Desktop and Tableau Prep Builder. Customers often favor the hybrid approach for its flexibility to handle different kinds of use cases. Start by establishing a data security classification to define the different types of data and levels of sensitivity that exist in your organization.

When leveraging database security, it is important to note that the method chosen for authentication to the database is key. This level of authentication is separate from Tableau authentication (i.e. when a user logs into Tableau, he or she is not yet logging into the database). This means that Tableau users will also need credentials (their own username/password or a service account username/password) to connect to the database for the database-level security to apply. To further protect your data, Tableau needs only read-access credentials to the database, which prevents publishers from accidentally changing the underlying data. Alternatively, in some cases, it is useful to give the database user permission to create temporary tables. This can have both performance and security advantages because the temporary data is stored in the database rather than in Tableau.

In addition, extract encryption at rest is a data security feature that allows you to encrypt .hyper extracts while they are stored on Tableau Server. Available as of 2019.3, Tableau Server administrators can enforce encryption of all extracts on their site or enable users to encrypt all extracts associated with particular published workbooks or data sources. For more information, see Extract Encryption at Rest. Tableau Online is already fully encrypted at rest.

You can limit which users see what data by setting user filters on data sources. This allows you to better control what data users see in a published view based on their Tableau login account. Using this technique, a regional manager can view data for her region but not the data for the other regional managers. With these data security approaches, you can publish a single view or dashboard in a way that provides secure, personalized data and analysis to a wide range of users on Tableau. For more information, see Data Security and Restrict Access at the Data Row Level.
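The effect of a user filter can be illustrated with a small sketch. The user-to-region mapping and field names below are hypothetical; in Tableau the filter itself is defined on the data source, not in application code:

```python
# Conceptual sketch of row-level security: each user sees only the
# rows for their own region. Mapping and data are hypothetical.
USER_REGIONS = {"amy@example.com": "West", "raj@example.com": "East"}

def rows_visible_to(user: str, rows: list) -> list:
    """Return only the rows matching the user's region; unknown users see nothing."""
    region = USER_REGIONS.get(user)
    return [row for row in rows if row["Region"] == region]
```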

Key Considerations for Data Security

  • How do you classify different types of data according to sensitivity?
  • How does someone request access to data?
  • Will you use a service account or database security to connect to data?
  • What is the appropriate approach to secure data according to sensitivity classification?
  • Does your data security meet legal, compliance, and regulatory requirements?

 

Metadata Management

Metadata management includes policies and processes that ensure information can be accessed, shared, analyzed, and maintained across the organization, as an extension of Data Source Management. Metadata is a business-friendly representation of data in common terms, similar to a semantic layer in traditional BI platforms. Curated data sources hide the complexity of your organization’s modern data architecture and make fields immediately understandable regardless of the data store and table from which they were sourced.

Tableau employs a simple, elegant, and powerful metadata system that gives users flexibility while allowing for enterprise metadata management. The Tableau Data Model can be embedded in a workbook or centrally managed as a Published Data Source with Data Server. After connecting to data and creating the Tableau Data Model, which will become a Published Data Source on Tableau Server or Tableau Online, look at it from your users’ perspective and see how much easier analytics will be when they have a well-formatted starting point, filtered and sized to the business questions it can answer. For more information on Published Data Sources, visit The Tableau Data Model, Best Practices for Published Data Sources and Enabling Governed Data Access with Tableau Data Server.

The diagram below shows where elements exist in the Tableau Data Model:

Beginning in 2020.2, the Data Source includes the connection, connection attributes, and the physical and logical layers within a Data Model. Upon connection, Tableau automatically characterizes fields as dimensions or measures. In addition, the Data Model stores calculations, aliases, and formatting. The physical layer includes physical tables defined by joins, unions, and/or custom SQL. Each group of one or more physical tables defines a logical table, which resides in the logical layer along with relationships.

Relationships are a new way to model data that is more flexible than using joins. A relationship describes how two tables relate to each other, based on common fields, but it does not combine the tables together as the result of a join does. Relationships provide several advantages over using joins.

  • You don't need to configure join types between tables. You only need to select the fields to define the relationship.
  • Relationships use joins, but they are automatic. Relationships postpone the selection of join types to the time and context of analysis.
  • Tableau uses relationships to automatically generate correct aggregations and appropriate joins during analysis, based on the current context of the fields in use in a worksheet.
  • Multiple tables at different levels of detail are supported in a single data source, so fewer data sources are needed to represent the same data.
  • Unmatched measure values are not dropped (no accidental loss of data).
  • Tableau will generate queries only for the data that is relevant to the current view.

At run time in the VizQL model, multiple queries are built dynamically based on the dimensions and measures of the visualization, and filters, aggregations, and table calculations are applied. Tableau uses the contextual information of the separate logical tables to determine which joins are applied to provide the correct aggregation. This enables one user to design the Data Source without needing to know, plan, or otherwise account for all the variations of analysis to be performed with it by other users.
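This join-culling idea can be sketched roughly as follows. The tables and fields are hypothetical, and real VizQL query generation is considerably more sophisticated:

```python
# Sketch: only logical tables whose fields appear in the current view
# participate in the generated query. Tables and fields are hypothetical.
TABLE_FIELDS = {
    "Orders":    {"Order ID", "Sales", "Customer ID"},
    "Customers": {"Customer ID", "Customer Name", "Segment"},
    "Shipments": {"Order ID", "Ship Mode", "Ship Date"},
}

def tables_needed(fields_in_view: set) -> set:
    """Return only the logical tables relevant to the fields in use."""
    return {table for table, fields in TABLE_FIELDS.items()
            if fields & fields_in_view}
```

A view using only Sales and Segment, for instance, would join Orders and Customers and leave Shipments out of the query entirely.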

Data stewards or authors with direct access to sources of data should prototype data sources as an embedded data source in a Tableau workbook and then create a Published Data Source in Tableau to share the curated Tableau Data Model, as shown below in the direct access workflow:

If authors do not have direct access to sources of data, they will rely on a DBA or data steward to provide the prototype data source embedded in a Tableau workbook. After reviewing and verifying it contains the needed data, a Site Administrator or Project Leader will create a Published Data Source in Tableau to share the Tableau Data Model, as shown below in the restricted access workflow:

The metadata checklist identifies best practices for curating a Published Data Source. By establishing data standards using the checklist, you’ll enable the business with governed self-service data access that is user-friendly and easy to understand. Prior to creating an extract or Published Data Source in Tableau, review and apply the following checklist to the Tableau Data Model:

  • Validate the data model
  • Filter and size to the analysis at hand
  • Use standard, user-friendly naming conventions
  • Add field name synonyms and custom suggestions for Ask Data
  • Create hierarchies (drill paths)
  • Set data types
  • Apply formatting (dates, numbers)
  • Set fiscal year start date, if applicable
  • Add new calculations
  • Remove duplicate or test calculations
  • Enter field descriptions as comments
  • Aggregate to highest level
  • Hide unused fields

Beginning in 2019.3 with the Data Management Add-on, Tableau Catalog discovers and indexes all of the content on Tableau, including workbooks, data sources, sheets, and flows. Indexing is used to gather information about the metadata, schemas, and lineage of the content. From the metadata, Tableau Catalog identifies all of the databases, files, and tables used by the content on your Tableau Server or Tableau Online site. Knowing where your data comes from is key to trusting it, and knowing who else uses it means you can analyze the impact of changes to data in your environment. The lineage feature in Tableau Catalog indexes both internal and external content. For more information, see Use Lineage for Impact Analysis.

Using lineage, you can trace down to content owners at the end of the lineage graph. The list of owners includes anyone assigned as the owner of a workbook, data source, or flow, and anyone assigned as the contact for a database or table in the lineage. If a change is going to be made, you can email owners to let them know about its impact. For more information, see Use email to contact owners.
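Lineage can also be queried programmatically through the Tableau Metadata API (GraphQL). The sketch below builds such a query; the field names follow the Metadata API documentation but are assumptions to verify for your version, and the data source name is hypothetical:

```python
import json

# Hypothetical lineage query: upstream tables and the owner of a
# published data source named "Sales". Field names are assumptions
# based on the Metadata API docs.
LINEAGE_QUERY = """
{
  publishedDatasources(filter: {name: "Sales"}) {
    name
    owner { name email }
    upstreamTables { name database { name } }
  }
}
"""

payload = json.dumps({"query": LINEAGE_QUERY})
# POST the payload to /api/metadata/graphql with an authenticated session.
```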

Key Considerations for Metadata Management

  • What is the process for curating data sources?
  • Has the data source been sized to the analysis at hand?
  • What is your organizational standard for naming conventions and field formatting?
  • Does the Tableau Data Model meet all criteria for curation, including user-friendly naming conventions?
  • Has the metadata checklist been defined, published, and integrated into the validation, promotion, and certification processes?

 

Monitoring & Management

Monitoring is a critical piece of the self-service model as it allows IT and administrators to understand how data is being used and be proactive and responsive about usage, performance, data connectivity, and refresh failures. Depending on your company’s database standards, IT will use a combination of tools and job schedulers for ingesting and monitoring raw data and server health.

Just as business users leverage data to make smarter decisions, administrators are empowered to make data-driven decisions about their Tableau deployment. Using Tableau Server's default administrative views, Tableau Server and Site Administrators can monitor the status of extract refreshes, data source utilization, and the delivery of subscriptions and alerts; custom administrative views can be created from Tableau Server's repository data. In Tableau Online, Site Administrators have access to Monitor Site Activity with default administrative views and can Use Admin Insights to Create Custom Views. For more information, see Tableau Monitoring and the Measurement of Tableau User Engagement and Adoption.
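One common monitoring task, flagging stale content, can be sketched as below. The records are hypothetical; on a real deployment this information would come from the administrative views or the REST API:

```python
from datetime import datetime, timedelta

def stale_items(items: list, now: datetime, max_age_days: int = 30) -> list:
    """Return names of items not refreshed within max_age_days (illustrative)."""
    cutoff = now - timedelta(days=max_age_days)
    return [item["name"] for item in items if item["last_refreshed"] < cutoff]
```

Administrators could review such a list on a schedule and contact content owners before removing anything.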

Key Considerations for Monitoring & Management

  • Are schedules available for the times needed for extract refreshes?
  • How is raw data ingestion monitored from source systems? Did the jobs complete successfully?
  • Are there duplicate sources of data?
  • When are extract refreshes scheduled to run? How long do extracts run on server? Did the refresh succeed or fail?
  • Are subscription schedules available after extract refreshes have occurred?
  • Are data sources being used? By whom? How does this compare with the expected audience size?
  • What is the process to remove stale Published Data Sources?

 

Data Governance Summary

Striking the balance between control and agility is critical. In spite of stringent governance policies, users will often resort to saving sensitive data and analytics locally for quick analysis. In a self-service environment, the role of data governance is to permit access to data and enable users to get the answers they need while ensuring security is enforced. Although every organization has different requirements, the table below describes the ideal state for governing self-service data access:

 

Data Source Management
  • IT Administrators/BI Professionals: Provide access to sources of data and comply with organizational data strategy, policies, and procedures.
  • Content Authors: Define, manage, and update data models used for analysis.

Data Quality
  • IT Administrators/BI Professionals: Define the process to validate data and build trust in its accuracy for decision making.
  • Content Authors: Capture and expose data-cleansing rules applied to published data models.

Enrichment & Preparation
  • IT Administrators/BI Professionals: Create ETL processes from multiple sources of data to make data ready for analysis.
  • Content Authors: Capture and expose enrichment and preparation rules applied to published data models.

Data Security
  • IT Administrators/BI Professionals: Define security parameters and access controls for published data models.
  • Content Authors: Comply with enterprise data security policies and external regulations.

Metadata Management
  • IT Administrators/BI Professionals: Define organizational policies and processes for metadata management.
  • Content Authors: Define, update, and expose field-level metadata for users.

Monitoring & Management
  • IT Administrators/BI Professionals: Monitor and audit usage to ensure compliance and appropriate use of data assets.
  • Content Authors: Monitor and track usage metrics of centrally-managed data models.

 

Content Governance in Tableau

As the use of analytics increases, a growing number of mission-critical business decisions will become data-driven. The net effect is not only an increase in content volume but also in the varying skill levels among its users who will be collaborating and uncovering valuable insights. With more and more people using data daily, it is critical that Tableau content can be secured, governed, and trusted—as well as organized so that people can discover, consume, and create content with confidence. Without content governance, users will find it increasingly difficult to find what they need among irrelevant, stale, or duplicate workbooks.

Content governance involves the processes that keep content relevant and fresh, such as knowing when to decommission content because it’s not getting the expected traffic or finding out why no one is using an important dashboard for decision-making. Ensuring compliance with an organization’s content governance policies is a core responsibility of content authors.

This section provides IT administrators and business users with the core concepts underpinning Tableau’s content management features and guidance on how these concepts should be applied to manage the content created in a thriving modern analytics platform.

Content Management

Defining a consistent content organization structure allows administrators to manage content and makes content more discoverable by users. Tableau Server and Tableau Online give you the flexibility needed to structure your environment and manage content based on your specific governance requirements.

To isolate content, you can organize content on separate sites. This is known as multi-tenancy. Users of one site do not have access to another site or even awareness of its existence. Each site can have unique users, data, and content. Sites work well when content can remain completely separate during all phases, and there is little to no user overlap. Create a new site only when you need to manage a unique set of users and their content completely separately from all other Tableau users and content. While sites may initially appear to be an easier way to segment data sources, workbooks, and users, carefully consider whether content will be shared across the organization. For more information, see Sites Overview (Windows | Linux).

Projects are containers for your workbooks, data sources, flows, and other projects, and they help you to create a scalable process for managing access to the content published to Tableau. They group together items and behave much like folders to provide hierarchical organization. Projects and nested projects are used to group similar content with the corresponding levels of permission that administrators define. Permissions should be managed at the project level using groups for simplified administration. For more information, see Use Projects to Manage Content Access (Windows | Linux).
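Projects can also be created programmatically. The sketch below builds a REST API request body for a project with permissions locked to the project; the attribute names and values are assumptions from the REST API Help and should be verified against your API version:

```python
import xml.etree.ElementTree as ET

def create_project_body(name: str, description: str = "") -> bytes:
    """Assumed XML body for the REST API Create Project call."""
    ts_request = ET.Element("tsRequest")
    ET.SubElement(ts_request, "project", {
        "name": name,
        "description": description,
        "contentPermissions": "LockedToProject",  # vs. ManagedByOwner
    })
    return ET.tostring(ts_request)

# POST to /api/<version>/sites/<site-id>/projects
body = create_project_body("Marketing Sandbox", "Ad hoc Marketing content")
```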

Content organization structure

Depending on your requirements, content can be managed by organizational (by department/team), functional (by topic), or hybrid approaches. The deployment project team should work together to create a content organization framework of different projects with consistent naming conventions that include sandbox projects for ad hoc or uncertified content, and production projects for validated, certified content. In a self-service sandbox project hierarchy, content authors can freely explore, author, and perform ad hoc analysis. Publishing to the production project is limited to a small group of users who will validate, promote, and certify content in this location as trusted for data-driven decision-making. As business users create new content based on trusted data, these items will be certified and promoted to a production project. The process of content validation, promotion, and certification is described later in this document. This ensures that the organization’s primary data sources and dashboards are constantly improving and evolving.

Examples of project hierarchies for Marketing Production, Marketing Sandbox, and Marketing Data Sources are shown below. You should carefully consider your content organization approach with respect to security and permissions requirements. In the Marketing Production and Sandbox project hierarchies, permissions are set by the administrator and locked, while Published Data Sources in the Marketing Data Sources project are secured and permissioned on each data source. Using this approach, Marketing workbooks can be secured to only the Marketing department, and Marketing data sources can be accessed by specified groups outside of Marketing who are granted permission to them. Having a separate sandbox hierarchy supports content review and promotion requirements.

Departmental Project Hierarchy

To provide a place for all individuals to securely save their work on Tableau Server or Tableau Online, you should use a single Personal Sandbox and permissions to restrict content owners to only viewing their own items. Once ready, the user can publish their content to the department sandbox for the validation, promotion, and certification process. This has the benefit of reducing administrative overhead by reducing the number of projects to secure and manage. Apply the permissions to the Personal Sandbox project as shown below:

  1. Create the Personal Sandbox Project, and lock content permissions to the project.

Permissions locked to the project

  2. Set permissions for All Users to Publisher on the project, None for workbooks, and None for data sources.

Publisher only permissions
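These steps can also be scripted. The sketch below builds a hypothetical REST API permissions body granting a group Read/Write (Publisher-style) capabilities on the project; the capability names, XML shape, and group id are assumptions to check against the REST API Help:

```python
import xml.etree.ElementTree as ET

def sandbox_permissions_body(all_users_group_id: str) -> bytes:
    """Assumed XML body granting project-level Read/Write to a group."""
    ts_request = ET.Element("tsRequest")
    permissions = ET.SubElement(ts_request, "permissions")
    grantee = ET.SubElement(permissions, "granteeCapabilities")
    ET.SubElement(grantee, "group", {"id": all_users_group_id})
    capabilities = ET.SubElement(grantee, "capabilities")
    for name in ("Read", "Write"):
        ET.SubElement(capabilities, "capability", {"name": name, "mode": "Allow"})
    return ET.tostring(ts_request)

# PUT to /api/<version>/sites/<site-id>/projects/<project-id>/permissions;
# workbook and data source view capabilities are simply left unspecified.
```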

With Personal Sandbox content in a single location, administrators can monitor how often content is viewed, suggest owners delete stale content, and check who is making the most use of the Personal Sandbox. Content owners can always see the content they own, even if it’s published in a project where they are denied the ability to view workbooks and data sources. Authorization is explained in more detail in the next section.

 

Key Considerations for Content Management

  • Will workbooks and data sources be shared across the company?
  • Will sites be used to isolate sensitive content or departments?
  • Will projects use an organizational (departments/teams), functional (topics), or hybrid approach?
  • Have sandbox and production projects been setup to support ad-hoc and validated content?
  • Are content naming conventions used?
  • Are authors publishing multiple copies of the same workbook with different filters selected?
  • Does content have a description, tags, and comply with visual styles?
  • Do you have a load time expectation and an exception procedure in place?
  • After employees leave the company, what is the process to reassign content ownership?

 

Authorization

When a user attempts to log in to Tableau, authentication verifies the user’s identity. Everyone who needs access to Tableau Server must be represented as a user in Tableau Server’s identity store (Windows | Linux). Tableau Online Authentication supports Tableau, Google, and SAML to verify a user's identity. Authorization refers to how and what users can access on Tableau Server and Tableau Online after the user has been authenticated. Authorization includes:

  • What users are allowed to do with content hosted in Tableau Server and Tableau Online, including site, projects, workbooks, views, data sources, and flows.
  • What tasks users are allowed to perform to administer Tableau Server and Tableau Online, such as configuring server and site settings, running command line tools, creating sites, and other tasks.

Authorization for these actions is managed by Tableau Server and Tableau Online and determined by a combination of the user's license type, site role, and permissions associated with specific entities such as workbooks and data sources. Tableau’s role-based licenses have implicit governance built in because of the capabilities that are included with them. For more information on specific capabilities by each license, see Tableau for Teams and Organizations.

When you add users to a site on Tableau Server or Tableau Online, independent of their license type, you must apply a site role to them. The site role signifies the maximum level of access a user can have on the site.

Users with a Tableau Creator license have access to Tableau Server or Tableau Online, Tableau Desktop, Tableau Prep Builder, and Tableau Mobile. The following site roles use a Tableau Creator license:

  • Server Administrator: Available on Tableau Server only; not applicable to Tableau Online. Configure settings for Tableau Server, all sites on the server, users and groups, and all content assets, such as projects, data sources (including connection information), workbooks, and flows. Connect to Tableau Published Data Sources or external data from the browser, Tableau Desktop, or Tableau Prep Builder; create and publish new data sources and flows; author and publish workbooks.
  • Site Administrator Creator: Unrestricted access to content as described above, but at the site level. Connect to Tableau or external data in the browser, Tableau Desktop, or Tableau Prep Builder; create new data sources; build and publish content. On Tableau Server, server administrators can determine whether to allow site administrators to manage users and assign site roles and site membership. By default on Tableau Server, and always on Tableau Online, site administrators are allowed these capabilities. This is the highest level of access for Tableau Online, where Site Administrators also have access to site configuration settings.
  • Creator: Connect to data to author new data sources and dashboards, which are published and shared on Tableau Server and Tableau Online. Data stewards (DBAs or data analysts) publish data sources. Creators incorporate process definitions, policies, guidelines, and business knowledge for enterprise metadata management in compliance with organizational and/or regulatory obligations.

 

 

Users with a Tableau Explorer license have access to Tableau Server or Tableau Online and Tableau Mobile. The following site roles use a Tableau Explorer license:

  • Site Administrator Explorer: Same access to site and user configuration as Site Administrator Creator, but cannot connect to external data from the web editing environment. Connect to Tableau Published Data Sources to create new workbooks and to edit and save existing workbooks.
  • Explorer (Can Publish): Publish new content from the browser, browse and interact with published views, and use all interaction features. In the web editing environment, can edit and save existing workbooks and save new standalone data sources from data connections embedded in workbooks, but cannot connect to external data or create new data sources.
  • Explorer: Browse and interact with published views. Can subscribe to content, create data-driven alerts, connect to Tableau Published Data Sources, and open workbooks in the web authoring environment for ad-hoc queries, but cannot save their work.

 

 

Users with a Tableau Viewer license have access to Tableau Server or Tableau Online and Tableau Mobile.

  • Viewer: View and interact with filters and content. Viewers can also receive alerts triggered by business events.

 

Users who have been added to Tableau Server or Tableau Online but do not have a license are Unlicensed.

  • Unlicensed: Unlicensed users cannot sign in to Tableau Server or Tableau Online.

 

 

Site roles, along with content permissions, determine who can publish, interact with, or only view published content, as well as who can manage the site's users and administer the site itself. The project team should work together to define the content permissions model. Tableau Server and/or Site Administrators will assign permission rules to groups and lock them to the project. Locked projects enforce permission rules on all content within the container, including nested projects. For more information, see Set Project Default Permissions and Lock the Project.
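Locking can also be applied programmatically: the REST API's Update Project call accepts a contentPermissions attribute. The helper below only builds that request body; the attribute values shown are the ones documented for recent API versions, so confirm them against the REST API reference for your deployment.

```python
import xml.etree.ElementTree as ET

def update_project_xml(content_permissions: str) -> str:
    """Build the XML body for the REST API's Update Project call.

    Documented values for contentPermissions include "LockedToProject"
    and "ManagedByOwner"; verify against your API version's reference.
    """
    root = ET.Element("tsRequest")
    ET.SubElement(root, "project", contentPermissions=content_permissions)
    return ET.tostring(root, encoding="unicode")

# Lock permissions to the project so rules propagate to all content in it.
body = update_project_xml("LockedToProject")
```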

Tableau has default permission rules for projects, workbooks, and data sources, or you can define custom permission rules for these content types.

  • Project Leader: Combined with the appropriate site role, allows the user or group full access to the project, its child projects, and content published into that project hierarchy.
  • Editor: Allows the user or group to connect to, edit, download, delete, and set permissions for data sources or workbooks in the project. They can also publish data sources and, provided they are the owner of a data source they publish, can update connection information and extract refresh schedules. This permission is relevant for views when the view they access connects to a data source.
  • Publisher: Allows the user or group to publish workbooks and data sources to the project.
  • Connector: Allows the user or group to connect to data sources in the project.
  • Viewer: Allows the user or group to view the workbooks and views in the project.
  • None: Sets all capabilities for the permission rule to Unspecified.
  • Denied: Sets all capabilities for the permission rule to Denied.

Custom permissions allow more granularity in permissions—from accessing or downloading a data source to how a user interacts with published content. Tableau’s intuitive interface makes it easy to associate users to functional groups, assign permissions to the groups, and see who has access to which content. For more information, see Set Permissions on Individual Content Resources. If the Data Management Add-on is present, permissions for external assets have additional considerations. For more information, see Manage Permissions for External Assets.


https://help.tableau.com/current/server/en-us/Img/perms_projectwrkbkperms.png

You should create groups locally on the server or import them from Active Directory/LDAP and synchronize (Windows | Linux) on a set schedule. Synchronization schedules are set by the Tableau Server Administrator or Tableau Online Site Administrators. To simplify maintenance, assign permissions to groups at the project level as shown below. For Tableau Online, you can Automate User Provisioning and Group Synchronization in Tableau Online through an External Identity Provider via SCIM, and programmatically add or remove users and group members using the REST API.

For more information, see Set-up Permissions Quick Start, Configure Projects, Groups, and Permissions for Managed Self-Service, and Permissions Reference.
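To illustrate the programmatic option mentioned above, the REST API's Add User to Group endpoint takes a one-element XML body. The sketch below builds the URL and request body only; the server address, API version, and IDs are placeholders, and authentication (the X-Tableau-Auth header) is omitted.

```python
import xml.etree.ElementTree as ET

API_VERSION = "3.4"  # placeholder; match your server's REST API version

def add_user_to_group_request(server: str, site_id: str,
                              group_id: str, user_id: str):
    """Return (url, xml_body) for the REST API's Add User to Group call."""
    url = (f"{server}/api/{API_VERSION}/sites/{site_id}"
           f"/groups/{group_id}/users")
    root = ET.Element("tsRequest")
    ET.SubElement(root, "user", id=user_id)
    return url, ET.tostring(root, encoding="unicode")

url, body = add_user_to_group_request(
    "https://tableau.example.com", "site-id", "group-id", "user-id")
# POST `body` to `url` with an X-Tableau-Auth header to perform the call.
```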

 

Key Considerations for Authorization

  • What is the minimum site role for Active Directory/LDAP or SCIM group synchronization?
  • Have you set all permissions for the All Users group in the Default project to None?
  • Are any explicit restrictions (Deny permissions) needed on the All Users group to propagate to every user account?
  • Have you created groups that correspond to a set of authoring and viewing capabilities for each project?
  • Have you reviewed effective permissions on select users to test your permissions model?
  • Have you locked permissions at the parent project to maintain security throughout the project hierarchy?
  • Have service account usernames/passwords been established for Published Data Sources?

 

Content Validation

Content validation is the first step in a series of events that will culminate in content certification. Similar to the data quality area in data governance, content validation encompasses the processes to validate that content is accurate, complete, reliable, relevant, and recent.

The first role to validate content should be its author. Authors should also solicit feedback from the target audience, either in an informal feedback group or by sharing a link to the workbook. Data stewards should also play a role in ensuring correctness and, if the data source is embedded in the workbook, should consider whether it is a potential candidate for publishing and certifying. Beyond data and calculation correctness, content validation should also include a review of the branding, layout, formatting, performance, filters, dashboard actions, and edge case behaviors by the Site Administrator or Project Leader site roles.

Key Considerations for Content Validation

  • Who is involved in the validation process?
  • Is the workbook accurate, complete, reliable, relevant, and recent?
  • Does the new content replace existing content?
  • Are the underlying data and calculations correct?
  • Does the workbook reflect corporate branding?
  • Does the workbook have a logical layout?
  • Are all axes and numbers formatted correctly?
  • Do dashboards load within the acceptable performance time?
  • Do filters and dashboard actions behave on the targeted views?
  • Does the dashboard remain useful in edge case behaviors (filtered to all, none, one value, etc.)?

 

Content Promotion

After content validation is complete, the process of content promotion is used to publish the workbook to a trusted project location or add the certification badge designation for Published Data Sources. An example of a workbook workflow is shown below.

Workbook Workflow

Content authors will connect to data, create new dashboards, and publish to the sandbox project. Site Administrators or Project Leaders will validate and approve the content. The approved content will be published to the production project. The Content Migration Tool, licensed as part of the Tableau Server Management Add-on, provides an easy way to promote or migrate content between Tableau Server projects. You can do this between projects on separate Tableau Server installations (for instance, between a development instance of Tableau Server and a production installation, with appropriate licensing for each environment), or between projects on a single Tableau Server installation. The Content Migration Tool user interface walks you through the steps necessary to build a "migration plan" that you can use once or as a template for multiple migrations. To learn more about use cases, visit Content Migration Tool Use Cases.

If IT requirements mandate three separately licensed environments (Dev, QA, and Production), try not to replicate a traditional waterfall development cycle with a modern analytics platform. Users may favor the QA environment to circumvent stringent policies or delays to get content into production, so work towards a good balance by automating content migration to the production server with custom workflow scripts using Tableau’s REST APIs.
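A custom promotion script along these lines can be sketched with the tableauserverclient library. Everything here is a placeholder: server URL, token names, and the sandbox/production naming convention are assumptions, and you would insert your validation checklist gates before the publish step.

```python
def production_target(sandbox_project: str) -> str:
    """Map a sandbox project name to its production counterpart.

    Assumes a hypothetical "<Team> Sandbox" / "<Team> Production"
    naming convention; adapt to your own project structure.
    """
    return sandbox_project.replace("Sandbox", "Production")

def promote_workbook(workbook_name: str, sandbox_project: str) -> None:
    """Download an approved workbook from the sandbox project and
    republish it to the matching production project (sketch only)."""
    # Third-party library: pip install tableauserverclient
    import tableauserverclient as TSC

    auth = TSC.PersonalAccessTokenAuth("token-name", "token-secret", "site")
    server = TSC.Server("https://tableau.example.com",
                        use_server_version=True)
    with server.auth.sign_in(auth):
        source = next(w for w in TSC.Pager(server.workbooks)
                      if w.name == workbook_name
                      and w.project_name == sandbox_project)
        path = server.workbooks.download(source.id, filepath="/tmp")
        target = next(p for p in TSC.Pager(server.projects)
                      if p.name == production_target(sandbox_project))
        server.workbooks.publish(TSC.WorkbookItem(target.id), path,
                                 TSC.Server.PublishMode.Overwrite)

dest = production_target("Marketing Sandbox")
```

Keeping the mapping in a small pure function makes the promotion rule easy to review and test independently of any server connection.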

Key Considerations for Content Promotion

  • Who is involved in the promotion process?
  • Do content-promoting roles have a checklist of criteria to evaluate?
  • Have you clearly delineated between certified content and ad-hoc content by projects?
  • Is the process agile to support iterations and innovation?
  • Do you have workflows to address both direct and restricted sources of data and workbooks?

 

Content Certification

After content has been validated and promoted, it achieves a trusted, certified status when a Site Administrator, Project Leader, or a Publisher (content author or data steward) with permission to the production project promotes the workbook or data source to the designated location. Certification makes content discoverable by content consumers and improves data stewards’ abilities to govern enterprise-wide data more effectively in Tableau by reducing the proliferation of duplicate workbooks and data sources.

Use the baseline requirements that were established in key considerations for content validation as the criteria for becoming certified. Content authors should have a clear understanding of how the certification process works from start to finish, and content consumers should know where certified content is published in the production project, as defined by your content management standards.

Data source certification enables data stewards to promote specific data sources in your Tableau deployment as trusted and ready for use. Certified Data Sources get preferential treatment in Tableau Server and Tableau Online search results and in our smart data source recommendations algorithm so that they are discoverable and easily reusable.

https://cdnl.tblsft.com/sites/default/files/blog/data_source_certification_3new.png
Certified Data Source
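Certification can also be toggled programmatically: the REST API's Update Data Source call accepts isCertified and certificationNote attributes. The sketch below only builds the request body; the note text is a made-up example, and the site and data source IDs in the target URL would come from your environment.

```python
import xml.etree.ElementTree as ET

def certify_datasource_xml(note: str, certified: bool = True) -> str:
    """Build the XML body for the REST API's Update Data Source call,
    setting the documented isCertified and certificationNote attributes."""
    root = ET.Element("tsRequest")
    ET.SubElement(root, "datasource",
                  isCertified=str(certified).lower(),
                  certificationNote=note)
    return ET.tostring(root, encoding="unicode")

body = certify_datasource_xml("Validated by the Finance data steward, Q3.")
# PUT `body` to /api/{version}/sites/{site-id}/datasources/{datasource-id}
```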

Key Considerations for Content Certification

  • Who is responsible for designating certified content?
  • Have all criteria for achieving certification status been met?
  • Are all fields completed: about, certification notes, tags?

 

Content Utilization

Content utilization is a measurement of the effective use of the data for business decisions, but the complete picture cannot be told through Traffic to Views alone. Measurement of content utilization helps your deployment to operate at scale and evolve by understanding user behaviors—who creates and consumes content, and the quality and relevance of the dashboards and data sources. If content isn’t being consumed, you will be able to identify it, and take the appropriate next steps.

Tableau Server Administrators and Tableau Online Site Administrators should monitor broad usage patterns with default administrative views. For more specific requirements, it is possible to create custom administrative views. For Tableau Server, this can be done with Tableau Server repository data. In Tableau Online, Site Administrators have access to Monitor Site Activity with default administrative views and can Use Admin Insights to Create Custom Views. Site Administrators should measure and audit usage of published content—both certified and ad-hoc—within their site. For example, if ad-hoc content utilization is significantly higher than certified content utilization, perhaps the promotion process is too restrictive or takes too long for business needs.

Site Administrators should review content utilization in the context of the expected audience sizes that were documented on the Tableau Use Cases and Data Sources tab of the Tableau Blueprint Planner. Individual content authors should also review utilization for their content in the sparkline tooltip by hovering over the workbook's thumbnail or selecting Who Has Seen This View from the menu. For more information, see Measurement of Tableau User Engagement and Adoption.
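The "identify stale content" step can be as simple as comparing last-viewed dates against a cutoff. The sketch below assumes you have already extracted content-name/last-viewed pairs (for example, from Admin Insights or a custom administrative view); the 90-day threshold is an arbitrary example, not a Tableau default.

```python
from datetime import date, timedelta

def find_stale_content(last_viewed: dict, today: date,
                       max_age_days: int = 90) -> list:
    """Return names of items not viewed within the window, oldest first.

    `last_viewed` maps content name -> date of most recent view.
    """
    cutoff = today - timedelta(days=max_age_days)
    stale = [(viewed, name) for name, viewed in last_viewed.items()
             if viewed < cutoff]
    return [name for viewed, name in sorted(stale)]

# Hypothetical usage data pulled from your monitoring source.
usage = {
    "Sales Pipeline": date(2020, 6, 1),
    "Churn Deep Dive": date(2019, 11, 3),
    "Exec Scorecard": date(2020, 5, 28),
}
stale = find_stale_content(usage, today=date(2020, 6, 15), max_age_days=90)
# Only "Churn Deep Dive" falls outside the 90-day window.
```

The resulting list can feed the "suggest owners delete stale content" step described earlier, for example via a scheduled notification to each content owner.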

Key Considerations for Content Utilization

  • How much traffic goes to each view?
  • What is the definition of stale content? How often is stale content purged?
  • How much indirect utilization (alerts & subscriptions) occurs?
  • Are subscriptions delivered on time?
  • Does the actual audience size match with expectations?
  • Does content follow a weekly, monthly, quarterly trend?
  • What is the frequency of login or days since last login by user cohort?
  • What is the distribution of workbook and data source size?

 

Content Governance Summary

The table below defines the ideal state for promoting and governing content in a thriving modern analytics deployment:

 

  • Content Management: IT Administrators/BI Professionals create and maintain an environment for storing and organizing published content; Content Authors ensure content is relevant in their site or project.
  • Security & Permissions: IT Administrators/BI Professionals secure analytic content and grant users the appropriate levels of access based on content type, sensitivity, business need, etc.; Content Authors comply with organizational security and permissions policies.
  • Content Validation: IT Administrators/BI Professionals define the process for validating that content is correct; Content Authors access platform capabilities to assist with validation and accuracy verification of user-generated analytic content.
  • Content Promotion: IT Administrators/BI Professionals define the process for promoting content; Content Authors promote validated analytic content to a centralized, trusted environment as determined by the governance process.
  • Content Certification: IT Administrators/BI Professionals define the process for certifying content; Content Authors certify content as trusted and delineate it from untrusted content in the same environment.
  • Content Utilization: IT Administrators/BI Professionals measure broad usage patterns across organizational business units; Content Authors measure and audit usage of published content and track usage of untrusted content.

 

 
