Commit 5e15fa8

Merge pull request #3694 from MicrosoftDocs/master
10/27 AM Publish
2 parents 37a2543 + ef71d56

1,103 files changed: +51,593 −50,508 lines


docs/analysis-services/what-s-new-in-sql-server-analysis-services-2017.md

Lines changed: 3 additions & 3 deletions
@@ -1,6 +1,6 @@
 ---
 title: "What's new in SQL Server 2017 Analysis Services | Microsoft Docs"
-ms.date: "10/03/2017"
+ms.date: "10/27/2017"
 ms.prod: "sql-server-2017"
 ms.reviewer: ""
 ms.suite: ""
@@ -41,9 +41,9 @@ To upgrade an existing tabular model in SSDT, in Solution Explorer, right-click
 It's important to keep in mind, once you upgrade an existing model to 1400, you can't downgrade. Be sure to keep a backup of your 1200 model database.
 
 ## Modern Get Data experience
-When it comes to ingesting data from data sources into your tabular models, SQL Server Data Tools (SSDT) introduces the modern **Get Data** experience for models at the 1400 compatibility level. This new feature is based on similar functionality in Power BI Desktop and Microsoft Excel 2016. The modern Get Data experience provides immense data transformation and data mashup capabilities by using the Get Data query builder and M expressions.
+When it comes to importing data from data sources into your tabular models, SQL Server Data Tools (SSDT) introduces the modern **Get Data** experience for models at the 1400 compatibility level. This new feature is based on similar functionality in Power BI Desktop and Microsoft Excel 2016. The modern Get Data experience provides immense data transformation and data mashup capabilities by using the Get Data query builder and M expressions.
 
-The modern Get Data experience provided support for a wide range of additional data source. Future updates will support additional data sources.
+The modern Get Data experience provides support for a wide range of data sources. Going forward, updates will include support for even more.
 
 ![AS_Get_Data_in_SSDT](../analysis-services/media/as-get-data-in-ssdt.png)
 

docs/integration-services/change-data-capture/process-inserts-updates-and-deletes.md

Lines changed: 1 addition & 1 deletion
@@ -32,7 +32,7 @@ ms.workload: "Inactive"
 
 Matching an ordinal value to its corresponding operation is not as easy as using a mnemonic of the operation. For example, 'D' can easily represent a delete operation and 'I' represent an insert operation. The example query that was created in the topic, [Creating the Function to Retrieve the Change Data](../../integration-services/change-data-capture/create-the-function-to-retrieve-the-change-data.md), makes this conversion from an ordinal value to a friendly string value that is returned in a new column. The following segment of code shows this conversion:
 
-```
+```sql
 select
 ...
 case __$operation
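As a client-side companion to the CASE conversion in the query above, the same ordinal-to-friendly-string mapping can be sketched in Python. The codes (1 = delete, 2 = insert, 3 = update before-image, 4 = update after-image) are the documented `__$operation` values for CDC change functions; the helper name is ours.

```python
# Friendly names for SQL Server CDC __$operation ordinals,
# mirroring the CASE expression in the query above.
CDC_OPERATIONS = {
    1: "Delete",
    2: "Insert",
    3: "Update (before image)",
    4: "Update (after image)",
}

def describe_operation(ordinal):
    """Convert a __$operation ordinal to a friendly string."""
    try:
        return CDC_OPERATIONS[ordinal]
    except KeyError:
        raise ValueError(f"unknown CDC operation ordinal: {ordinal}")

print(describe_operation(2))  # Insert
```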

docs/integration-services/change-data-capture/specify-an-interval-of-change-data.md

Lines changed: 1 addition & 1 deletion
@@ -90,7 +90,7 @@ ms.workload: "Inactive"
 
 4. For **SQLStatement**, enter the following SQL statement:
 
-```
+```sql
 SELECT DATEADD(dd,0, DATEDIFF(dd,0,GETDATE()-1)) AS ExtractStartTime,
 DATEADD(dd,0, DATEDIFF(dd,0,GETDATE())) AS ExtractEndTime
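The DATEADD/DATEDIFF pair above floors GETDATE() to midnight so the interval covers exactly yesterday. A Python sketch of the same date arithmetic (the function name is ours):

```python
from datetime import datetime, timedelta

def extraction_interval(now):
    """Floor 'now' to midnight for ExtractEndTime; ExtractStartTime is
    one day earlier, so the window covers all of yesterday."""
    end = datetime(now.year, now.month, now.day)   # today at 00:00
    start = end - timedelta(days=1)                # yesterday at 00:00
    return start, end

start, end = extraction_interval(datetime(2017, 10, 27, 9, 30))
print(start, end)  # 2017-10-26 00:00:00 2017-10-27 00:00:00
```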

docs/integration-services/data-flow/data-streaming-destination.md

Lines changed: 1 addition & 1 deletion
@@ -24,7 +24,7 @@ ms.workload: "Inactive"
 
 In the following example example, the following query returns output from the Package.dtsx package in the SSISPackagePublishing project in the Power BI folder of the SSIS Catalog. This query uses the linked server named [Default Linked Server for Integration Services] that in turn uses the new OLE DB Provider for SSIS. The query includes folder name, project name, and package name in the SSIS catalog. The OLE DB Provider for SSIS runs the package you specified in the query and returns the tabular result set.
 
-```
+```sql
 SELECT * FROM OPENQUERY([Default Linked Server for Integration Services], N'Folder=Power BI;Project=SSISPackagePublishing;Package=Package.dtsx')
 
 ```
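The N'...' argument to OPENQUERY above is just a semicolon-delimited property list. A small Python sketch that assembles it (the helper name is ours; the Folder/Project/Package keys come from the query above):

```python
def ssis_openquery_properties(folder, project, package):
    """Assemble the property string passed to OPENQUERY against the
    SSIS linked server: Folder=...;Project=...;Package=..."""
    return ";".join([f"Folder={folder}",
                     f"Project={project}",
                     f"Package={package}"])

print(ssis_openquery_properties("Power BI", "SSISPackagePublishing", "Package.dtsx"))
# Folder=Power BI;Project=SSISPackagePublishing;Package=Package.dtsx
```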

docs/integration-services/extending-packages-custom-objects/building-deploying-and-debugging-custom-objects.md

Lines changed: 23 additions & 23 deletions
@@ -1,25 +1,25 @@
----
-title: "Building, Deploying, and Debugging Custom Objects | Microsoft Docs"
-ms.custom: ""
-ms.date: "03/14/2017"
-ms.prod: "sql-server-2016"
-ms.reviewer: ""
-ms.suite: ""
-ms.technology:
-- "docset-sql-devref"
-ms.tgt_pltfrm: ""
-ms.topic: "reference"
-applies_to:
-- "SQL Server 2016 Preview"
-helpviewer_keywords:
-- "custom objects [Integration Services]"
-ms.assetid: b03685bc-5398-4c3f-901a-1219c1098fbe
-caps.latest.revision: 50
-author: "douglaslMS"
-ms.author: "douglasl"
-manager: "jhubbard"
----
-# Building, Deploying, and Debugging Custom Objects
+---
+title: "Building, Deploying, and Debugging Custom Objects | Microsoft Docs"
+ms.custom: ""
+ms.date: "03/14/2017"
+ms.prod: "sql-server-2016"
+ms.reviewer: ""
+ms.suite: ""
+ms.technology:
+- "docset-sql-devref"
+ms.tgt_pltfrm: ""
+ms.topic: "reference"
+applies_to:
+- "SQL Server 2016 Preview"
+helpviewer_keywords:
+- "custom objects [Integration Services]"
+ms.assetid: b03685bc-5398-4c3f-901a-1219c1098fbe
+caps.latest.revision: 50
+author: "douglaslMS"
+ms.author: "douglasl"
+manager: "jhubbard"
+---
+# Building, Deploying, and Debugging Custom Objects
 After you have written the code for a custom object for [!INCLUDE[ssISnoversion](../../includes/ssisnoversion-md.md)], you must build the assembly, deploy it, and integrate it into [!INCLUDE[ssIS](../../includes/ssis-md.md)] Designer to make it available for use in packages, and test and debug it.
 
 ## <a name="top"></a> Steps in Building, Deploying, and Debugging a Custom Object for Integration Services
@@ -65,7 +65,7 @@ manager: "jhubbard"
 
 Here is an example of a post-build event command line for a custom log provider:
 
-```
+```cmd
 "C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\NETFX 4.0 Tools\gacutil.exe" -u $(TargetName)
 "C:\Program Files (x86)\Microsoft SDKs\Windows\v7.0A\bin\NETFX 4.0 Tools\gacutil.exe" -i $(TargetFileName)
 copy $(TargetFileName) "C:\Program Files\Microsoft SQL Server\130\DTS\LogProviders "

docs/integration-services/extending-packages-scripting-data-flow-script-component-examples/parsing-non-standard-text-file-formats-with-the-script-component.md

Lines changed: 27 additions & 27 deletions
@@ -1,28 +1,28 @@
----
-title: "Parsing Non-Standard Text File Formats with the Script Component | Microsoft Docs"
-ms.custom: ""
-ms.date: "03/17/2017"
-ms.prod: "sql-server-2016"
-ms.reviewer: ""
-ms.suite: ""
-ms.technology:
-- "docset-sql-devref"
-ms.tgt_pltfrm: ""
-ms.topic: "reference"
-applies_to:
-- "SQL Server 2016 Preview"
-helpviewer_keywords:
-- "text file reading [Integration Services]"
-- "Script component [Integration Services], non-standard text file formats"
-- "transformations [Integration Services], components"
-- "Script component [Integration Services], examples"
-ms.assetid: 1fda034d-09e4-4647-9a9f-e8d508c2cc8f
-caps.latest.revision: 36
-author: "douglaslMS"
-ms.author: "douglasl"
-manager: "jhubbard"
----
-# Parsing Non-Standard Text File Formats with the Script Component
+---
+title: "Parsing Non-Standard Text File Formats with the Script Component | Microsoft Docs"
+ms.custom: ""
+ms.date: "03/17/2017"
+ms.prod: "sql-server-2016"
+ms.reviewer: ""
+ms.suite: ""
+ms.technology:
+- "docset-sql-devref"
+ms.tgt_pltfrm: ""
+ms.topic: "reference"
+applies_to:
+- "SQL Server 2016 Preview"
+helpviewer_keywords:
+- "text file reading [Integration Services]"
+- "Script component [Integration Services], non-standard text file formats"
+- "transformations [Integration Services], components"
+- "Script component [Integration Services], examples"
+ms.assetid: 1fda034d-09e4-4647-9a9f-e8d508c2cc8f
+caps.latest.revision: 36
+author: "douglaslMS"
+ms.author: "douglasl"
+manager: "jhubbard"
+---
+# Parsing Non-Standard Text File Formats with the Script Component
 When your source data is arranged in a non-standard format, you may find it more convenient to consolidate all your parsing logic in a single script than to chain together multiple [!INCLUDE[ssISnoversion](../../includes/ssisnoversion-md.md)] transformations to achieve the same result.
 
 [Example 1: Parsing Row-Delimited Records](#example1)
@@ -66,7 +66,7 @@ manager: "jhubbard"
 
 3. Select a destination database, and open a new query window. In the query window, execute the following script to create the destination table:
 
-```
+```sql
 create table RowDelimitedData
 (
 FirstName varchar(32),
@@ -220,7 +220,7 @@ public override void Input0_ProcessInputRow(Input0Buffer Row)
 
 3. Select a destination database, and open a new query window. In the query window, execute the following script to create the destination tables:
 
-```
+```sql
 CREATE TABLE [dbo].[Parents]([ParentID] [int] NOT NULL,
 [ParentRecord] [varchar](32) NOT NULL,
 CONSTRAINT [PK_Parents] PRIMARY KEY CLUSTERED
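The consolidate-parsing-in-one-script idea behind this topic can be illustrated with a short Python sketch. The "one 'Field: value' pair per line, blank line between records" input format and the sample names here are assumptions for illustration, not the article's exact sample data:

```python
def parse_row_delimited(lines):
    """Group 'Field: value' lines into records; a blank line ends a record."""
    records, record = [], {}
    for line in lines:
        line = line.strip()
        if not line:                      # blank line: flush current record
            if record:
                records.append(record)
                record = {}
            continue
        field, _, value = line.partition(":")
        record[field.strip()] = value.strip()
    if record:                            # flush the final record
        records.append(record)
    return records

sample = [
    "FirstName: Nancy",
    "LastName: Davolio",
    "",
    "FirstName: Andrew",
    "LastName: Fuller",
]
print(parse_row_delimited(sample))
```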

docs/integration-services/extending-packages-scripting-data-flow-script-component-types/creating-a-source-with-the-script-component.md

Lines changed: 29 additions & 29 deletions
@@ -1,29 +1,29 @@
----
-title: "Creating a Source with the Script Component | Microsoft Docs"
-ms.custom: ""
-ms.date: "03/17/2017"
-ms.prod: "sql-server-2016"
-ms.reviewer: ""
-ms.suite: ""
-ms.technology:
-- "docset-sql-devref"
-ms.tgt_pltfrm: ""
-ms.topic: "reference"
-applies_to:
-- "SQL Server 2016 Preview"
-dev_langs:
-- "VB"
-helpviewer_keywords:
-- "Script component [Integration Services], source components"
-- "output columns [Integration Services]"
-- "sources [Integration Services], components"
-ms.assetid: 547c4179-ea82-4265-8c6f-04a2aa77a3c0
-caps.latest.revision: 59
-author: "douglaslMS"
-ms.author: "douglasl"
-manager: "jhubbard"
----
-# Creating a Source with the Script Component
+---
+title: "Creating a Source with the Script Component | Microsoft Docs"
+ms.custom: ""
+ms.date: "03/17/2017"
+ms.prod: "sql-server-2016"
+ms.reviewer: ""
+ms.suite: ""
+ms.technology:
+- "docset-sql-devref"
+ms.tgt_pltfrm: ""
+ms.topic: "reference"
+applies_to:
+- "SQL Server 2016 Preview"
+dev_langs:
+- "VB"
+helpviewer_keywords:
+- "Script component [Integration Services], source components"
+- "output columns [Integration Services]"
+- "sources [Integration Services], components"
+ms.assetid: 547c4179-ea82-4265-8c6f-04a2aa77a3c0
+caps.latest.revision: 59
+author: "douglaslMS"
+ms.author: "douglasl"
+manager: "jhubbard"
+---
+# Creating a Source with the Script Component
 You use a source component in the data flow of an [!INCLUDE[ssISnoversion](../../includes/ssisnoversion-md.md)] package to load data from a data source to pass on to downstream transformations and destinations. Ordinarily you connect to the data source through an existing connection manager.
 
 For an overview of the Script component, see [Extending the Data Flow with the Script Component](../../integration-services/extending-packages-scripting/data-flow-script-component/extending-the-data-flow-with-the-script-component.md).
@@ -136,7 +136,7 @@ manager: "jhubbard"
 
 6. Create and configure a destination component, such as a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] destination, or the sample destination component demonstrated in [Creating a Destination with the Script Component](../../integration-services/extending-packages-scripting-data-flow-script-component-types/creating-a-destination-with-the-script-component.md), that expects the **AddressID** and **City** columns. Then connect the source component to the destination. (You can connect a source directly to a destination without any transformations.) You can create a destination table by running the following [!INCLUDE[tsql](../../includes/tsql-md.md)] command in the **AdventureWorks** database:
 
-```
+```sql
 CREATE TABLE [Person].[Address2]([AddressID] [int] NOT NULL,
 [City] [nvarchar](30) NOT NULL)
 ```
@@ -269,7 +269,7 @@ manager: "jhubbard"
 
 7. Create and configure a destination component, such as a [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] destination, or the sample destination component demonstrated in [Creating a Destination with the Script Component](../../integration-services/extending-packages-scripting-data-flow-script-component-types/creating-a-destination-with-the-script-component.md). Then connect the source component to the destination. (You can connect a source directly to a destination without any transformations.) You can create a destination table by running the following [!INCLUDE[tsql](../../includes/tsql-md.md)] command in the **AdventureWorks** database:
 
-```
+```sql
 CREATE TABLE [Person].[Address2]([AddressID] [int] NOT NULL,
 [City] [nvarchar](30) NOT NULL)
 ```
@@ -390,4 +390,4 @@ manager: "jhubbard"
 [Creating a Destination with the Script Component](../../integration-services/extending-packages-scripting-data-flow-script-component-types/creating-a-destination-with-the-script-component.md)
 [Developing a Custom Source Component](../../integration-services/extending-packages-custom-objects-data-flow-types/developing-a-custom-source-component.md)
 
-
+

docs/integration-services/import-export-data/connect-to-a-sql-server-data-source-sql-server-import-and-export-wizard.md

Lines changed: 3 additions & 3 deletions
@@ -92,18 +92,18 @@ Specify **Trusted_Connection=Yes** to connect with Windows integrated authentica
 ### Connection string format
 Here's the format of a connection string that uses Windows integrated authentication.
 
-Driver={ODBC Driver 13 for SQL Server};server=<server>;database=<database>;trusted_connection=Yes;
+`Driver={ODBC Driver 13 for SQL Server};server=<server>;database=<database>;trusted_connection=Yes;`
 
 Here's the format of a connection string that uses SQL Server authentication instead of Windows integrated authentication.
 
-Driver={ODBC Driver 13 for SQL Server};server=<server>;database=<database>;uid=<user id>;pwd=<password>;
+`Driver={ODBC Driver 13 for SQL Server};server=<server>;database=<database>;uid=<user id>;pwd=<password>;`
 
 ### Enter the connection string
 Enter the connection string in the **ConnectionString** field, or enter the DSN name in the **Dsn** field, on the **Choose a Data Source** or **Choose a Destination** page. After you enter the connection string, the wizard parses the string and displays the individual properties and their values in the list.
 
 The following example uses this connection string.
 
-Driver={ODBC Driver 13 for SQL Server};server=localhost;database=WideWorldImporters;trusted_connection=Yes;
+`Driver={ODBC Driver 13 for SQL Server};server=localhost;database=WideWorldImporters;trusted_connection=Yes;`
 
 Here's the screen that you see after entering the connection string.
 
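The three connection strings above share one key=value;... shape; a Python sketch that assembles it (the function name is ours; the driver name and keys are taken from the examples above):

```python
def odbc_connection_string(server, database, uid=None, pwd=None):
    """Build an ODBC connection string like the examples above.
    Without uid/pwd it falls back to Windows integrated authentication."""
    parts = ["Driver={ODBC Driver 13 for SQL Server}",
             f"server={server}",
             f"database={database}"]
    if uid is None:
        parts.append("trusted_connection=Yes")
    else:
        parts.extend([f"uid={uid}", f"pwd={pwd}"])
    return ";".join(parts) + ";"

print(odbc_connection_string("localhost", "WideWorldImporters"))
# Driver={ODBC Driver 13 for SQL Server};server=localhost;database=WideWorldImporters;trusted_connection=Yes;
```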

docs/integration-services/lift-shift/ssis-azure-lift-shift-ssis-packages-overview.md

Lines changed: 2 additions & 3 deletions
@@ -17,10 +17,9 @@ You can now move your SQL Server Integration Services (SSIS) packages and worklo
 
 ## Benefits
 Moving your on-premises SSIS workloads to Azure has the following potential benefits:
-- **Reduce operational costs** by reducing on-premises infrastructure.
-- **Increase high availability** with multiple nodes per cluster, as well as the high availability features of Azure and of Azure SQL Database.
+- **Reduce operational costs** and reduce the burden of managing infrastructure that you have when you run SSIS on-premises or on Azure virtual machines.
+- **Increase high availability** with the ability to specify multiple nodes per cluster, as well as the high availability features of Azure and of Azure SQL Database.
 - **Increase scalability** with the ability to specify multiple cores per node (scale up) and multiple nodes per cluster (scale out).
-- **Avoid the limitations** of running SSIS on Azure virtual machines.
 
 ## Architecture overview
 The following table highlights the differences between SSIS on premises and SSIS on Azure. The most significant difference is the separation of storage from compute.

docs/integration-services/packages/legacy-package-deployment-ssis.md

Lines changed: 1 addition & 1 deletion
@@ -150,7 +150,7 @@ manager: "jhubbard"
 
 The following SQL statement shows the default CREATE TABLE statement that the Package Configuration Wizard provides.
 
-```
+```sql
 CREATE TABLE [dbo].[SSIS Configurations]
 (
 ConfigurationFilter NVARCHAR(255) NOT NULL,

docs/integration-services/performance/performance-counters.md

Lines changed: 2 additions & 2 deletions
@@ -52,13 +52,13 @@ manager: "jhubbard"
 
 In the following example, the function returns statistics for a running execution with an ID of 34.
 
-```
+```sql
 select * from [catalog].[dm_execution_performance_counters] (34)
 ```
 
 In the following example, the function returns statistics for all the executions running on the [!INCLUDE[ssISnoversion](../../includes/ssisnoversion-md.md)] server.
 
-```
+```sql
 select * from [catalog].[dm_execution_performance_counters] (NULL)
 
 ```

docs/integration-services/system-views/catalog-execution-component-phases.md

Lines changed: 1 addition & 1 deletion
@@ -41,7 +41,7 @@ manager: "jhubbard"
 > [!WARNING]
 > The catalog.execution_component_phases view provides this information when the logging level of the package execution is set to Performance or Verbose. For more information, see [Enable Logging for Package Execution on the SSIS Server](../../integration-services/performance/integration-services-ssis-logging.md#server_logging).
 
-```
+```sql
 use SSISDB
 select package_name, task_name, subcomponent_name, execution_path,
 SUM(DATEDIFF(ms,start_time,end_time)) as active_time,

docs/relational-databases/in-memory-oltp/atomic-blocks-in-native-procedures.md

Lines changed: 4 additions & 2 deletions
@@ -1,7 +1,7 @@
 ---
 title: "Atomic Blocks | Microsoft Docs"
 ms.custom: ""
-ms.date: "12/02/2016"
+ms.date: "10/26/2017"
 ms.prod: "sql-server-2016"
 ms.reviewer: ""
 ms.suite: ""
@@ -17,6 +17,8 @@ manager: "jhubbard"
 ms.workload: "Inactive"
 ---
 # Atomic Blocks in Native Procedures
+[!INCLUDE[tsql-appliesto-ss2014-asdb-xxxx-xxx_md](../../includes/tsql-appliesto-ss2014-asdb-xxxx-xxx-md.md)]
+
 **BEGIN ATOMIC** is part of the ANSI SQL standard. [!INCLUDE[ssNoVersion](../../includes/ssnoversion-md.md)] supports atomic blocks at the top-level of natively compiled stored procedures, as well as for natively compiled, scalar user-defined functions. For more information about these functions, see [Scalar User-Defined Functions for In-Memory OLTP](../../relational-databases/in-memory-oltp/scalar-user-defined-functions-for-in-memory-oltp.md).
 
 - Every natively compiled stored procedure contains exactly one block of [!INCLUDE[tsql](../../includes/tsql-md.md)] statements. This is an ATOMIC block.
@@ -122,7 +124,7 @@ ORDER BY c1
 GO
 ```
 
-The following error messages specific to memory-optimized tables are transaction dooming. If they occur in the scope of an atomic block, they will cause the transaction to abort: 10772, 41301, 41302, 41305, 41325, 41332, and 41333.
+The following error messages specific to memory-optimized tables are transaction dooming. If they occur in the scope of an atomic block, they will cause the transaction to abort: 10772, 41301, 41302, 41305, 41325, 41332, 41333, and 41839.
 
 ## Session Settings
 The session settings in atomic blocks are fixed when the stored procedure is compiled. Some settings can be specified with **BEGIN ATOMIC** while other settings are always fixed to the same value.
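The updated error list in the hunk above (41839 is the number this commit adds) lends itself to a simple lookup; a minimal Python sketch, with the numbers taken from the + line above:

```python
# Transaction-dooming error numbers for memory-optimized tables,
# as listed in the updated paragraph above (41839 newly added).
DOOMING_ERRORS = frozenset({10772, 41301, 41302, 41305, 41325, 41332, 41333, 41839})

def dooms_transaction(error_number):
    """True if the error aborts the transaction inside an atomic block."""
    return error_number in DOOMING_ERRORS

print(dooms_transaction(41839))  # True
print(dooms_transaction(50000))  # False (user-defined error range)
```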
