---
title: 'Quickstart: Get started ingesting data with One-click (Preview)'
description: In this quickstart, you'll learn to ingest data to Data Explorer pools using One-click.
ms.topic: quickstart
ms.date: 11/02/2021
author: shsagir
ms.author: shsagir
ms.reviewer: tzgitlin
services: synapse-analytics
ms.service: synapse-analytics
ms.subservice: data-explorer
---

# Quickstart: Ingest data using One-click (Preview)

One-click ingestion makes the data ingestion process easy, fast, and intuitive. It helps you ramp up quickly to start ingesting data, creating database tables, and mapping structures. Select data from different kinds of sources in different data formats, either as a one-time or continuous ingestion process.

The following features make one-click ingestion so useful:

* Intuitive experience guided by the ingestion wizard
* Ingest data in a matter of minutes
* Ingest data from different kinds of sources: local file, blobs, and containers (up to 10,000 blobs)
* Ingest data in a variety of [formats](#file-formats)
* Ingest data into new or existing tables
* Table mapping and schema are suggested to you and easy to change
<!-- * Continue ingestion easily and quickly from a container with [Event Grid](one-click-ingestion-new-table.md#create-continuous-ingestion) -->

One-click ingestion is particularly useful when ingesting data for the first time, or when your data's schema is unfamiliar to you.

## Prerequisites

- Create a Data Explorer pool using [Synapse Studio](../data-explorer-create-pool-studio.md) or [the Azure portal](../data-explorer-create-pool-portal.md).

- Create a Data Explorer database.
    1. In Synapse Studio, on the left-side pane, select **Data**.
    1. Select **+** (Add new resource) > **Data Explorer pool**, and use the following information:

       | Setting | Suggested value | Description |
       |--|--|--|
       | Pool name | *contosodataexplorer* | The name of the Data Explorer pool to use. |
       | Name | *TestDatabase* | The database name must be unique within the cluster. |
       | Default retention period | *365* | The time span (in days) for which it's guaranteed that the data is kept available to query. The time span is measured from the time that data is ingested. |
       | Default cache period | *31* | The time span (in days) for which to keep frequently queried data available in SSD storage or RAM, rather than in longer-term storage. |

    1. Select **Create** to create the database. Creation typically takes less than a minute.
- Create a table.
    1. In Synapse Studio, on the left-side pane, select **Develop**.
    1. Under **KQL scripts**, select **+** (Add new resource) > **KQL script**. On the right-side pane, you can name your script.
    1. In the **Connect to** menu, select *contosodataexplorer*.
    1. In the **Use database** menu, select *TestDatabase*.
    1. Paste in the following command, and select **Run** to create a StormEvents table.

       ```kusto
       .create table StormEvents (StartTime: datetime, EndTime: datetime, EpisodeId: int, EventId: int, State: string, EventType: string, InjuriesDirect: int, InjuriesIndirect: int, DeathsDirect: int, DeathsIndirect: int, DamageProperty: int, DamageCrops: int, Source: string, BeginLocation: string, EndLocation: string, BeginLat: real, BeginLon: real, EndLat: real, EndLon: real, EpisodeNarrative: string, EventNarrative: string, StormSummary: dynamic)
       ```

> [!TIP]
> Verify that the table was successfully created. On the left-side pane, select **Data**, select the *contosodataexplorer* more menu, and then select **Refresh**. Under *contosodataexplorer*, expand **Tables** and make sure that the *StormEvents* table appears in the list.

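As an alternative to checking in the **Data** pane, you can verify the table from the same KQL script. The following management commands are a minimal sketch, assuming the script is still connected to *TestDatabase*:

```kusto
// List the tables in the database and confirm StormEvents appears
.show tables

// Inspect the column schema of the new table
.show table StormEvents cslschema
```

If `StormEvents` doesn't appear in the output of `.show tables`, re-run the `.create table` command from the previous step.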
## Access the one-click wizard

The one-click ingestion wizard guides you through the one-click ingestion process.

* To access the wizard from Azure Synapse:

    1. In Synapse Studio, on the left-side pane, select **Data**.
    1. Under **Data Explorer Databases**, right-click the relevant database, and then select **Open in Azure Data Explorer**.

       :::image type="content" source="../media/ingest-data-one-click/open-azure-data-explorer-synapse.png" alt-text="Screenshot of Azure Synapse Studio, showing opening Azure Data Explorer in the context of a specific pool.":::

    1. Right-click the relevant pool, and then select **Ingest new data**.

* To access the wizard from the Azure portal:

    1. In the Azure portal, search for and select the relevant Synapse workspace.
    1. Under **Data Explorer pools**, select the relevant pool.
    1. On the **Welcome to Data Explorer pool** home screen, select **Ingest new data**.

       :::image type="content" source="../media/ingest-data-one-click/open-azure-data-explorer-portal.png" alt-text="Screenshot of the Azure portal, showing opening Azure Data Explorer in the context of a specific pool.":::

* To access the wizard from the Azure Data Explorer web UI:

    1. [!INCLUDE [data-explorer-get-endpoint](../includes/data-explorer-get-endpoint.md)]
    1. Add a connection to the *Query endpoint*.
    1. Select **Query** from the left menu, right-click the **database** or **table**, and select **Ingest new data**.

## One-click ingestion wizard

> [!NOTE]
> This section describes the wizard using Event Hub as the data source. You can also use these steps to ingest data from a blob, file, blob container, or an ADLS Gen2 container.

1. On the **Destination** tab, choose the database and table for the ingested data.

    :::image type="content" source="../media/ingest-data-one-click/select-azure-data-explorer-ingest-destination-table.png" alt-text="Screenshot of the Azure Data Explorer one-click ingestion wizard, showing the selection of a database and table.":::

1. On the **Source** tab:
    1. Select *Event Hub* as the **Source type** for the ingestion.

       :::image type="content" source="../media/ingest-data-one-click/select-azure-data-explorer-ingest-source-type.png" alt-text="Screenshot of the Azure Data Explorer one-click ingestion wizard, showing the selection of the source type.":::

    1. Fill out the Event Hub data connection details using the following information:

       | Setting | Suggested value | Description |
       |--|--|--|
       | Data connection name | *ContosoDataConnection* | The name of the Event Hub data connection. |
       | Subscription | *Contoso_Synapse* | The subscription where the Event Hub resides. |
       | Event Hub namespace | *contosoeventhubnamespace* | The namespace of the Event Hub. |
       | Consumer group | *contosoconsumergroup* | The name of the Event Hub consumer group. |

       :::image type="content" source="../media/ingest-data-one-click/select-azure-data-explorer-ingest-event-hub-details.png" alt-text="Screenshot of the Azure Data Explorer one-click ingestion wizard, showing the Event Hub connection details.":::

    1. Select **Next**.

### Schema mapping

The service automatically generates schema and ingestion properties, which you can change. You can use an existing mapping structure or create a new one, depending on whether you're ingesting to a new or existing table.

On the **Schema** tab, do the following actions:

1. Confirm the autogenerated compression type.
1. Choose the [format of your data](#file-formats). Different formats allow you to make further changes.
1. Change the mapping in the [Editor window](#editor-window).

#### File formats

One-click ingestion supports ingesting from source data in all [data formats supported by Azure Data Explorer for ingestion](data-explorer-ingest-supported-formats.md).

### Editor window

In the **Editor** window of the **Schema** tab, you can adjust data table columns as necessary.

The changes you can make in a table depend on the following parameters:

* **Table** type is new or existing
* **Mapping** type is new or existing

| Table type | Mapping type | Available adjustments |
|---|---|---|
| New table | New mapping | Change data type, Rename column, New column, Delete column, Update column, Sort ascending, Sort descending |
| Existing table | New mapping | New column (on which you can then change data type, rename, and update),<br> Update column, Sort ascending, Sort descending |
| Existing table | Existing mapping | Sort ascending, Sort descending |

> [!NOTE]
> When adding a new column or updating a column, you can change mapping transformations. For more information, see [Mapping transformations](#mapping-transformations).

<!-- >[!NOTE]
> At any time, you can open the [command editor](one-click-ingestion-new-table.md#command-editor) above the **Editor** pane. In the command editor, you can view and copy the automatic commands generated from your inputs. -->

#### Mapping transformations

Some data format mappings (Parquet, JSON, and Avro) support simple ingest-time transformations. To apply mapping transformations, create or update a column in the [Editor window](#editor-window).

Mapping transformations can be performed on a column of **Type** string or datetime, with the **Source** having data type int or long. Supported mapping transformations are:

* DateTimeFromUnixSeconds
* DateTimeFromUnixMilliseconds
* DateTimeFromUnixMicroseconds
* DateTimeFromUnixNanoseconds

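The same transformations can also be expressed directly in an ingestion mapping created by a management command. The following JSON mapping is an illustrative sketch only: the mapping name *StormEventsMapping* and the source path `$.start_ts` are hypothetical, not part of the StormEvents sample data.

```kusto
// Hypothetical mapping: convert a Unix-seconds source field into the datetime column StartTime
.create table StormEvents ingestion json mapping "StormEventsMapping"
'[{"column": "StartTime", "path": "$.start_ts", "datatype": "datetime", "transform": "DateTimeFromUnixSeconds"}]'
```

The wizard generates an equivalent mapping for you when you apply a transformation in the Editor window, so you'd only write one by hand for automated or scripted ingestion.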
### Data ingestion

Once you have completed schema mapping and column manipulations, the ingestion wizard starts the data ingestion process.

* When ingesting data from **non-container** sources, the ingestion takes immediate effect.

* If your data source is a **container**:
    * Azure Data Explorer's [batching policy](/azure/data-explorer/kusto/management/batchingpolicy?context=/azure/synapse-analytics/context/context) aggregates your data.
    * After ingestion, you can download the ingestion report and review the performance of each blob that was addressed.
    <!-- * You can select **Create continuous ingestion** and set up [continuous ingestion using Event Grid](one-click-ingestion-new-table.md#create-continuous-ingestion). -->

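If container ingestion seems slow, the batching policy is usually the first thing to check. You can view it, and optionally tune it, with management commands; the values below are illustrative examples, not recommendations:

```kusto
// View the effective batching policy on the database
.show database TestDatabase policy ingestionbatching

// Example: seal a batch after 30 seconds, 500 items, or 1 GB of raw data, whichever comes first
.alter database TestDatabase policy ingestionbatching
'{"MaximumBatchingTimeSpan": "00:00:30", "MaximumNumberOfItems": 500, "MaximumRawDataSizeMB": 1024}'
```

Shorter batching windows reduce ingestion latency at the cost of more, smaller extents, so change the defaults only if you have a measured latency requirement.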
### Initial data exploration

After ingestion, the wizard gives you options to use **Quick commands** for initial exploration of your data.
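The quick commands run simple queries over the newly ingested table. Queries along these lines (shown here as a sketch against the *StormEvents* table from the prerequisites) are typical starting points:

```kusto
// Sample a few ingested rows to eyeball the data
StormEvents
| take 10

// Count rows to confirm the expected ingestion volume
StormEvents
| count
```

From there, you can refine the query in the same window to filter, aggregate, or chart the data.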