At Collibra, we believe in sharing knowledge. Browse through pertinent how-tos, expert tips, and solutions that can help you get the best out of your Collibra Platform.
How to…
Working with cron time zones
See the list of expected time zone names for the cronTimeZone key of the Catalog REST API.
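If you want to check candidate values locally, here is a minimal sketch that assumes the cronTimeZone key accepts IANA time zone names (the format also used by Python's zoneinfo module):

```python
# A minimal sketch, assuming cronTimeZone accepts IANA time zone IDs
# such as "Europe/Brussels" or "America/New_York".
from zoneinfo import available_timezones

# Print candidate values you could pass to cronTimeZone.
for name in sorted(available_timezones()):
    if name.startswith(("Europe/", "America/")):
        print(name)
```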
Run a DQ Job from a PySpark notebook
Run DQ Jobs from a PySpark notebook environment, such as Databricks Notebooks, Google Colab, or Jupyter.
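As a rough illustration only: the host, endpoint path, payload fields, and token below are placeholders, not the documented Collibra DQ API; the linked how-to describes the supported notebook setup. A generic REST call triggered from a notebook cell might look like this:

```python
# Hedged sketch: every URL, field, and header value here is a placeholder.
import requests

DQ_BASE_URL = "https://dq.example.com"      # hypothetical DQ host
API_TOKEN = "..."                           # hypothetical auth token

payload = {
    "dataset": "public.nyse",               # hypothetical dataset name
    "runDate": "2024-01-01",
}

resp = requests.post(
    f"{DQ_BASE_URL}/jobs/run",              # placeholder endpoint path
    json=payload,
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    timeout=60,
)
resp.raise_for_status()
print(resp.json())
```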
Collibra AI model Python integration
Transfer the metadata related to your AI models to Collibra using Python and the Collibra REST Import API.
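A minimal sketch of such an integration, assuming the Import API's JSON job endpoint at /rest/2.0/import/json-job and basic authentication; the instance URL, community, domain, and asset type names are illustrative, not a complete AI model template:

```python
import json
import requests

BASE_URL = "https://your-instance.collibra.com"   # hypothetical instance URL
AUTH = ("username", "password")                   # or a session/JWT of your choice

# One import command describing an AI model; adjust names to your operating model.
commands = [
    {
        "resourceType": "Asset",
        "identifier": {
            "name": "churn-prediction-v2",
            "domain": {"name": "AI Models", "community": {"name": "Data Science"}},
        },
        "type": {"name": "AI Model"},
        "attributes": {"Description": [{"value": "Gradient-boosted churn model"}]},
    }
]

files = {"file": ("ai-models.json", json.dumps(commands), "application/json")}
resp = requests.post(f"{BASE_URL}/rest/2.0/import/json-job", auth=AUTH, files=files)
resp.raise_for_status()
print(resp.json())  # returns the import job, which you can poll for completion
```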
Get the IDs for JDBC schema ingestion with the REST API
Retrieve the IDs you need to register a data source using a JDBC driver with the Catalog REST API.
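For example, a hedged sketch that looks up a domain ID through the assumed /rest/2.0/domains search endpoint; the other IDs you need come from the endpoints described in the how-to, and the instance URL and domain name are placeholders:

```python
import requests

BASE_URL = "https://your-instance.collibra.com"   # hypothetical instance URL
AUTH = ("username", "password")

# Search domains by name and print their IDs.
resp = requests.get(
    f"{BASE_URL}/rest/2.0/domains",
    params={"name": "My Schema Domain"},
    auth=AUTH,
)
resp.raise_for_status()
for domain in resp.json()["results"]:
    print(domain["id"], domain["name"])
```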
Add an attachment with a workflow
Use workflows to add attachments to a community, domain, or asset in your Collibra Platform.
Create a workflow to change asset responsibilities
Learn how to create a workflow that allows users to automatically change their responsibility on an asset.
Count the number of assets in a domain with the REST API
Learn how to use the REST API to get the total number of assets in a domain.
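A minimal sketch, assuming the /rest/2.0/assets search endpoint, its domainId filter, and its countLimit parameter for an exact total; the instance URL and domain ID are placeholders:

```python
import requests

BASE_URL = "https://your-instance.collibra.com"     # hypothetical instance URL
AUTH = ("username", "password")
DOMAIN_ID = "00000000-0000-0000-0000-000000000000"  # the domain to count

resp = requests.get(
    f"{BASE_URL}/rest/2.0/assets",
    params={"domainId": DOMAIN_ID, "limit": 1, "countLimit": -1},
    auth=AUTH,
)
resp.raise_for_status()
print("Assets in domain:", resp.json()["total"])
```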
Fix hidden tasks in workflow diagram view
Learn how to fix hidden tasks in the Eclipse workflow diagram view.
Start a workflow when a definition is added
Configure a workflow definition to start the workflow when an attribute, such as a definition, is added to an asset in your Collibra Platform.
Set a default filter in Collibra On-the-Go for Desktop
Customize your default filter for Collibra On-the-Go for Desktop so it applies to every search query you make.
Display an error message in a workflow
Use workflow exceptions to display a customized error message in your Collibra Platform from a workflow.
Add extra information to your workflows
Use annotations in your workflow diagrams to add extra information that explains how the workflow works, the purpose of individual tasks or the role of the users.
Export Collibra Platform users to Microsoft Excel
Use Microsoft Excel to get a list of all the users of your Collibra Platform as a worksheet in your workbook.
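The linked how-to covers the Excel-side steps. If you prefer a programmatic alternative, here is a hedged Python sketch that pages through the assumed /rest/2.0/users endpoint (the field names are assumptions too) and writes the result to a workbook with pandas:

```python
import pandas as pd
import requests

BASE_URL = "https://your-instance.collibra.com"   # hypothetical instance URL
AUTH = ("username", "password")

# Page through all users.
users, offset = [], 0
while True:
    resp = requests.get(
        f"{BASE_URL}/rest/2.0/users",
        params={"offset": offset, "limit": 100},
        auth=AUTH,
    )
    resp.raise_for_status()
    page = resp.json()["results"]
    if not page:
        break
    users.extend(page)
    offset += len(page)

# Write selected (assumed) fields to an Excel workbook; requires openpyxl.
pd.DataFrame(
    [{"userName": u.get("userName"),
      "firstName": u.get("firstName"),
      "lastName": u.get("lastName")} for u in users]
).to_excel("collibra_users.xlsx", index=False)
```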
Working with date attributes
Learn how to format dates and times when interacting with the Collibra Platform through its APIs, and get familiar with the Unix timestamp.
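For a quick reference, assuming date values are exchanged as Unix timestamps in milliseconds (UTC), the conversion in Python looks like this:

```python
from datetime import datetime, timezone

# Python datetime -> epoch milliseconds, suitable for an API payload.
dt = datetime(2024, 5, 1, 12, 0, tzinfo=timezone.utc)
epoch_ms = int(dt.timestamp() * 1000)
print(epoch_ms)  # 1714564800000

# Epoch milliseconds from an API response -> Python datetime.
print(datetime.fromtimestamp(epoch_ms / 1000, tz=timezone.utc))
```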
Run Catalog data profiling jobs on Apache Spark clusters
In the Collibra Platform, profiling jobs are executed in JobServer, which runs Spark in local mode. With the Collibra Catalog Profiling Library, you can leverage your infrastructure and scale up profiling jobs to get more out of your Catalog.
Configure Google Cloud Platform for Collibra Insights consumption
Follow these steps to create and configure a Google Cloud Platform bucket where you store the Reporting Data Layer tables, and to configure BigQuery to use that data.
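As a hedged sketch of the BigQuery side, using the google-cloud-bigquery client: the project, dataset, bucket path, and Parquet format are assumptions, not the documented Reporting Data Layer layout.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")   # hypothetical project

# Point an external table at the exported files in the bucket.
external_config = bigquery.ExternalConfig("PARQUET")  # format is an assumption
external_config.source_uris = ["gs://my-insights-bucket/reporting/assets/*"]

# The dataset "collibra_insights" is assumed to exist already.
table = bigquery.Table("my-gcp-project.collibra_insights.assets")
table.external_data_configuration = external_config
client.create_table(table, exists_ok=True)
print("External table ready for querying from BigQuery or Tableau.")
```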
Tableau – Google BigQuery configuration
Use the reporting capabilities of Tableau to get the most out of the Reporting Data Layer tables stored on Google Cloud Platform.