📄️ Build an ML Model and update ML Catalog
Build an ML Model and register it in the ML Catalog.
📄️ Convert a Delimited File into a Lakehouse Table
Create or Update Lakehouse Table from a Delimited File in the Data Lake.
📄️ Convert a JSON file into a Lakehouse Table
Create or Update Lakehouse Table from a JSON file in the Data Lake.
📄️ Convert a Parquet file into a Lakehouse Table
Create or Update Lakehouse Table from a Parquet file in the Data Lake.
📄️ Convert a Text file into a Lakehouse Table
Create or Update Lakehouse Table from a Text file in the Data Lake.
📄️ Convert an Excel Worksheet cell range into a Lakehouse Table
Create or Update Lakehouse Table from an Excel Worksheet cell range in the Data Lake.
📄️ Convert an Excel Worksheet into a Lakehouse Table
Create or Update Lakehouse Table from an Excel Worksheet in the Data Lake.
📄️ Convert an XML file into a Lakehouse Table
Create or Update Lakehouse Table from an XML file in the Data Lake.
📄️ Create a Databricks Function
Create a Function within Databricks.
📄️ Create a Databricks View
Create a View within Databricks.
📄️ Create/Update a Feature Store Table
Run Enrichment Notebook that will create/update a Lakehouse Table.
📄️ Create/Update a Training Feature Log
Run Enrichment Notebook that will create/update a Lakehouse Table.
📄️ Execute Azure Batch Task
Create and execute Azure Batch tasks.
📄️ Execute SQL against the Lakehouse
Execute SQL against the Lakehouse.
📄️ Execute SQL in Azure SQL DB
Execute SQL in an Azure SQL DB. The SQL may request an email to be sent and/or the Task to fail.
📄️ Execute SQL in SQL Server DB
Execute SQL in a SQL Server DB. The SQL may request an email to be sent and/or the Task to fail.
📄️ Export File Server Binary File to Azure Storage Account
Export a binary file from file server path to Azure Data Lake Storage Gen2.
📄️ Export Lakehouse Notebook Results to Azure SQL DB
Export data from Lakehouse to Azure SQL DB.
📄️ Export Lakehouse Notebook Results to Cosmos DB for NoSQL
Export data from Lakehouse to Json format in Azure Cosmos DB for NoSQL.
📄️ Export Lakehouse Notebook Results to CSV in ADLS (via MSI) with URL as Secret
Export data from Lakehouse to a CSV File in Azure Data Lake Storage Gen2 (accessed via MSI).
📄️ Export Lakehouse Notebook Results to CSV/Parquet/JSON File in ADLS (via MSI)
Export data from Lakehouse to a CSV File in Azure Data Lake Storage Gen2 (accessed via MSI).
📄️ Export Lakehouse Notebook Results to SQL Server DB
Export data from Lakehouse to SQL Server DB.
📄️ Export Lakehouse Query Results to Azure SQL DB
Execute a Query against the Lakehouse and export the results to Azure SQL DB.
📄️ Export Lakehouse Query Results to Cosmos DB for NoSQL
Execute a Query against the Lakehouse and export the results in Json format to Azure Cosmos DB for NoSQL.
📄️ Export Lakehouse Query Results to CSV in ADLS (via MSI) with URL as Secret
Execute a Query against the Lakehouse and export the results to a CSV File in Azure Data Lake Storage Gen2.
📄️ Export Lakehouse Query Results to CSV/Parquet/JSON File in ADLS (via MSI)
Execute a Query against the Lakehouse and export the results to a CSV File in Azure Data Lake Storage Gen2.
📄️ Export Lakehouse Query Results to CSV/Parquet/JSON File in Azure Blob Storage
Execute a Query against the Lakehouse and export the results to a CSV File in Azure Blob Storage (accessed via MSI).
📄️ Export Lakehouse Query Results to CSV/Parquet/JSON File in SFTP (via SSH Public Key)
Export data from Lakehouse to a CSV File at an SFTP endpoint (accessed via SSH Public Key).
📄️ Export Lakehouse Query Results to CSV/Parquet/JSON File in SFTP
Export data from Lakehouse to a CSV File at an SFTP endpoint (accessed via Basic Auth).
📄️ Export Lakehouse Query Results to SQL Server DB
Execute a Query against the Lakehouse and export the results to SQL Server DB.
📄️ Export Lakehouse Results to CSV/Parquet/JSON File in Azure Blob Storage (via MSI)
Execute a Query against the Lakehouse and export the results to a CSV File in Azure Blob Storage (accessed via MSI).
📄️ Export Lakehouse Results to CSV/Parquet/JSON File in SFTP
Export data from Lakehouse to a CSV File at an SFTP endpoint.
📄️ Export Notebook Results to Dataverse Online (via Service Principal)
Export data from a Databricks notebook to Dataverse Online (accessed via Service Principal).
📄️ Federated Task
A Task that lives in another Insight Factory but forms part of this Insight Factory's dependency chain.
📄️ HTTP POST to a Durable Azure Function
POST to a Durable Azure Function.
📄️ Information
Add Information, including links, to a Production Line.
📄️ Ingest Amazon S3 Binary File
Copy file(s) from Amazon S3 (accessed via an Access Key) to Azure Data Lake Storage Gen2.
📄️ Ingest Azure Data Lake Binary File (via MSI)
Copy binary file(s) from Azure Data Lake Storage Gen2 to Azure Data Lake Storage Gen2.
📄️ Ingest Azure Data Lake Binary File with URL as Secret
Copy binary file(s) from Azure Data Lake Storage Gen2 to Azure Data Lake Storage Gen2.
📄️ Ingest Azure Data Lake Delimited File (via Acct Key) with URL as Secret
Copy file(s) from Azure Data Lake Storage Gen2 (accessed via an Account Key) to Azure Data Lake Storage Gen2.
📄️ Ingest Azure Data Lake Delimited File with URL as Secret
Copy delimited file(s) from Azure Data Lake Storage Gen2 to Azure Data Lake Storage Gen2.
📄️ Ingest Azure Data Lake Delimited File
Copy delimited file(s) from Azure Data Lake Storage Gen2 to Azure Data Lake Storage Gen2.
📄️ Ingest Azure File Storage Binary File
Copy binary file from file server path to Azure Data Lake Storage Gen2.
📄️ Ingest Azure SQL DB as Parquet
Copy data from Azure SQL DB to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest Common Data Model (via Manifest) as Parquet
Copy data from Azure Data Lake Storage Gen2 in Common Data Model (CDM) format, using a Manifest, to Parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest Common Data Model (via Model) as Parquet
Copy data from Azure Data Lake Storage Gen2 in Common Data Model (CDM) format, using a Model.json file, to Parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest Cosmos DB for NoSQL as JSON
Copy data from Azure Cosmos DB for NoSQL to JSON format in Azure Data Lake Storage Gen2.
📄️ Ingest DB2 as Parquet
Copy data from a DB2 database to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest Dynamics 365 (via Service Principal) as Parquet
Copy data from Dynamics 365 (accessed via Service Principal) to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest Dynamics AX as Parquet
Copy data from Dynamics AX to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest Dynamics CRM Entity as Parquet
Copy data from Dynamics CRM using the Entity Name to Apache Parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest Dynamics CRM using Query as Parquet
Copy data from Dynamics CRM using FetchXML to Apache Parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest File Server Binary File And Deflate
Copy and Deflate binary file from file server path to Azure Data Lake Storage Gen2.
📄️ Ingest File Server Binary File
Copy binary file from file server path to Azure Data Lake Storage Gen2.
📄️ Ingest from ServiceNow as Parquet
Copy data from ServiceNow to Parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest FTP Binary File
Copy file(s) from FTP to Azure Data Lake Storage Gen2.
📄️ Ingest HTTP Binary File
Copy binary file from HTTP endpoint to Azure Data Lake Storage Gen2.
📄️ Ingest Microsoft Access as Parquet
Copy data from Microsoft Access to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest MySQL as Parquet
Copy data from MySQL to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest OData as Parquet
Copy data from OData resource to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest ODBC as Parquet
Copy data from ODBC to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest Oracle Database as Parquet
Copy data from an Oracle database to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest PostgreSQL as Parquet
Copy data from PostgreSQL to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest REST API as JSON
Copy data from a REST API to JSON format in Azure Data Lake Storage Gen2.
📄️ Ingest Salesforce (by Object) as Parquet
Copy data from Salesforce to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest Salesforce (by Query) as Parquet
Copy data from Salesforce to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest SAP C4C as Parquet
Copy data from SAP C4C to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest SAP CDC (Full Load Only) as Parquet
Copy data from SAP CDC (Full Load only) to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest SAP CDC (Full then Incremental Load) as Parquet
Copy data from SAP CDC (Full Load then Incremental Load) to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest SAP CDC (Incremental Load Only) as Parquet
Copy data from SAP CDC (Incremental Load only) to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest SAP HANA as Parquet
Copy data from SAP HANA to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest SFTP Binary File
Copy file(s) from SFTP to Azure Data Lake Storage Gen2.
📄️ Ingest SFTP Zip File (and unzip)
Copy a zipped file from SFTP and unzip into Azure Data Lake Storage Gen2.
📄️ Ingest SharePoint Online Binary File in Document Library
Copy file from SharePoint Online Document Library to Azure Data Lake Storage Gen2.
📄️ Ingest SharePoint Online List as Parquet
Copy data from SharePoint Online List to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest Snowflake as Parquet
Copy data from Snowflake to Parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest SQL Server Database as Delimited File
Copy data from a SQL Server database (via a Self-Hosted Integration Runtime) to delimited format in Azure Data Lake Storage Gen2.
📄️ Ingest SQL Server Database as Parquet
Copy data from SQL Server database to parquet format in Azure Data Lake Storage Gen2.
📄️ Ingest XERO as Parquet
Copy data from XERO to parquet format in Azure Data Lake Storage Gen2.
📄️ Refresh Power BI Dataset
Trigger Power BI Dataset Refresh.
📄️ Run an Inference on an ML Model
Run an inference on an ML Model via an Enrichment Notebook that will create/update a Lakehouse Table.
📄️ Run Azure Container App Job
Start an Azure Container App Job execution.
📄️ Run Databricks Notebook
Run a Databricks Notebook.
📄️ Run dbt Command
Execute a dbt command using Databricks Workflows.
📄️ Run Enrichment Notebook to update Lakehouse Table
Run Enrichment Notebook that will create/update a Lakehouse Table.
📄️ Run Logic App Workflow
Run a workflow in Azure Logic Apps using a trigger.
📄️ Run Production Line
Run Production Line.
📄️ Set Properties in Task State
Set one or more properties in State Config for a particular Task.
📄️ Webhook
Normally used as a trigger where the status is set by an external process.