Microsoft Fabric Integration

Article Summary

Streamline sending data from Tulip to Microsoft Fabric for broader analytics opportunities

Purpose

This guide walks through, step by step, how to send Tulip Tables data to Fabric via a paginated REST API query. This is useful for analyzing Tulip Tables data in an Azure Data Lakehouse, Synapse Data Warehouse, or another Azure data storage location.

The broader architecture for Fabric and Microsoft Cloud for Manufacturing is shown below.

[Image: Fabric and Microsoft Cloud for Manufacturing architecture]

Setup

A detailed setup video is embedded below:



Overall Requirements:

  • Usage of Tulip Tables API (Get API Key and Secret in Account Settings)
  • Tulip Table (Get the Table Unique ID)

Process:

  1. On the Fabric homepage, go to Data Factory
  2. Create a new Data Pipeline in Data Factory
  3. Start with the "Copy Data Assistant" to streamline the creation process
  4. Copy Data Assistant Details:
    1. Data Source: REST
    2. Base URL: https://[instance].tulip.co/api/v3
    3. Authentication type: Basic
    4. Username: API Key from Tulip
    5. Password: API Secret from Tulip
    6. Relative URL: tables/[TABLE_UNIQUE_ID]/records?limit=100&offset={offset}
    7. Request: GET
    8. Pagination Option Name: QueryParameters.{offset}
    9. Pagination Option Value: RANGE:0:10000:100

Note: The limit can be lower than 100 if needed, but the pagination increment must match it
Note: The upper bound of the pagination range must be greater than the number of records in the table
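
Before building the pipeline, you can sanity-check the API key, secret, relative URL, and pagination behavior outside of Fabric. The sketch below is a minimal Python example using the requests library; the instance name, table ID, and credentials are placeholders, and the limit/offset loop mirrors the pagination rule configured above.

```python
# Minimal sketch (Python + requests) to verify the Tulip Tables API call
# that the Copy Data Assistant will make. All values below are placeholders.
import requests

INSTANCE = "your-instance"        # placeholder for [instance] in [instance].tulip.co
TABLE_ID = "TABLE_UNIQUE_ID"      # placeholder for the Tulip Table unique ID
API_KEY = "your-api-key"          # API Key from Account Settings
API_SECRET = "your-api-secret"    # API Secret from Account Settings

BASE_URL = f"https://{INSTANCE}.tulip.co/api/v3"
LIMIT = 100                       # must match the pagination increment (RANGE:0:10000:100)


def fetch_all_records():
    """Page through the table records with limit/offset, like the pipeline's pagination rule."""
    records, offset = [], 0
    while True:
        resp = requests.get(
            f"{BASE_URL}/tables/{TABLE_ID}/records",
            params={"limit": LIMIT, "offset": offset},
            auth=(API_KEY, API_SECRET),  # Basic authentication, as in the pipeline
            timeout=30,
        )
        resp.raise_for_status()
        page = resp.json()           # assumes the endpoint returns a JSON array of records
        records.extend(page)
        if len(page) < LIMIT:        # stop once a partial (or empty) page comes back
            break
        offset += LIMIT
    return records


if __name__ == "__main__":
    rows = fetch_all_records()
    print(f"Fetched {len(rows)} records")
```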

[Image: Copy Data Assistant configuration]

  5. Then you can send this data to a variety of data destinations, such as an Azure Lakehouse
  6. Connect to the data destination, review and rename the fields as needed, and you are all set!

Example Architecture

[Image: Example architecture diagram]

Use Cases and Next Steps

Once you have finalized the connection, you can easily analyze the data with a Spark notebook, Power BI, or a variety of other tools.
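
For example, a minimal PySpark sketch for a Fabric notebook could look like the following; the Lakehouse table name (tulip_records) and the grouping column (station) are hypothetical and depend on your destination settings and Tulip table schema.

```python
# Minimal PySpark sketch for a Fabric notebook. The table name "tulip_records"
# and the column "station" are hypothetical; use the names from your own
# Lakehouse destination and Tulip table schema.
from pyspark.sql import SparkSession, functions as F

# In a Fabric notebook `spark` already exists; getOrCreate() keeps the sketch self-contained.
spark = SparkSession.builder.getOrCreate()

df = spark.read.table("tulip_records")

# Example: record counts per station, e.g. as an input for a Power BI dashboard.
summary = (
    df.groupBy("station")
      .agg(F.count("*").alias("record_count"))
      .orderBy(F.desc("record_count"))
)
summary.show()
```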

1. Defect prediction

  • Identify production defects before they happen and increase right-first-time rates
  • Identify core production drivers of quality in order to implement improvements

2. Cost of quality optimization

  • Identify opportunities to optimize product design without impacting customer satisfaction

3. Production energy optimization

  • Identify production levers to optimize energy consumption

4. Delivery and planning prediction and optimization

  • Optimize the production schedule based on customer demand and the real-time order schedule

5. Global Machine / Line Benchmarking

  • Benchmark similar machines or equipment with normalization

6. Global / regional digital performance management

  • Consolidate data to create real-time dashboards

