Use LEFT and RIGHT arrow keys to navigate between flashcards;
Use UP and DOWN arrow keys to flip the card;
H to show hint;
A reads text to speech;
73 Cards in this Set
- Front
- Back
What does the ActiveX script task do?
|
Runs Microsoft Visual Basic Scripting Edition (VBScript) and JScript code and is included mainly for legacy support when a DTS package is migrated to SSIS
|
|
What does the Analysis Services Execute DDL Task do?
|
Runs XML for Analysis (XMLA) code against an SSAS database. XMLA is the data definition language (DDL) for SSAS;
therefore, this task lets you perform common structure changes such as adding partitions to cubes |
|
What does the Analysis Services Processing Task do?
|
Allows the processing of SSAS objects through an SSIS package.
|
|
What does the Bulk Insert Task do?
|
Allows the execution of bulk copy operations for SQL Server. This task works only against a SQL Server Database Engine.
|
|
What does the Data Flow Task do?
|
Allows data processing from sources to destinations.
|
|
What does the Data Mining Query Task do?
|
Performs data mining queries and lets you capture the results for analysis.
|
|
What does the Data Profiling Task do?
|
Allows the analysis of source data for patterns, missing data, candidate keys, and statistics. These results typically inform developers about what logic to include in their SSIS packages based on their data needs.
|
|
What does the Execute DTS 2000 Package task do?
|
Runs a DTS package within SSIS.
|
|
What does the Execute Package task do?
|
Runs other SSIS packages either deployed to SQL Server or in the file system.
|
|
What does the Execute Process Task do?
|
Runs a command-line operation such as program or batch file execution.
|
|
What does the Execute SQL Task do?
|
Runs SQL code against any underlying database connection in the SQL language of the connected database engine.
|
|
What does the File System Task do?
|
Lets you copy, move, and delete files as well as perform other file and folder operations.
|
|
What does the FTP Task do?
|
Sends and receives files between the file system and an FTP server and performs simple file and folder operations on the FTP server.
|
|
What does the Message Queue Task do?
|
Integrates with message queuing (MSMQ) on a server running Windows to read and send messages.
|
|
What does the Script Task do?
|
Runs Microsoft Visual Basic 2008 or Microsoft Visual C# 2008 code within an SSIS package.
|
|
What does the Send Mail Task do?
|
Sends an e-mail message through an SMTP server.
|
|
What does the Transfer [object] task do?
|
Tasks that copy SQL Server objects from one system to another, including databases, SQL Server Agent jobs, error messages, logins, master stored procedures, and database-level objects.
|
|
What does the Web Service Task do?
|
Lets you connect to a web service to send or receive information.
|
|
What does the WMI Data Reader Task do?
|
Lets you run a Windows Management Instrumentation (WMI) query against the operating system to capture server information.
|
|
What does the WMI Event Watcher Task do?
|
Waits for a particular event before executing.
|
|
What does the XML Task do?
|
Combines, queries, and differentiates multiple XML files on the server.
|
|
What does the Data Profiling Task Column Null Ratio Profile do?
|
Evaluates the column and returns the percent of NULLs in the column relative to the total number of rows in the table.
|
|
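As a rough illustration (plain Python, not SSIS code), the calculation behind the Column Null Ratio profile amounts to this; the function name is made up for the sketch:

```python
# Illustrative sketch of the Column Null Ratio profile: the percent of
# NULL (here, None) values in a column relative to the total row count.
def null_ratio(column_values):
    """Return the percent of NULLs in the column."""
    if not column_values:
        return 0.0  # avoid division by zero on an empty column
    nulls = sum(1 for v in column_values if v is None)
    return 100.0 * nulls / len(column_values)

print(null_ratio([1, None, 3, None]))  # 50.0
```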
What does the Data Profiling Task Column Statistics Profile do?
|
For numeric and datetime columns, returns the spread and averages of the values.
|
|
What does the Data Profiling Task Column Value Distribution Profile do?
|
Identifies the uniqueness of the values in a column across all the rows for that column.
|
|
What does the Data Profiling Task Column Length Distribution profile do?
|
Shows the various value lengths for a text column and the percentage of all the rows that each length takes up.
|
|
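The Column Length Distribution idea can be sketched in plain Python (the function name is invented for illustration): count each distinct text length, then express each count as a percentage of all rows.

```python
from collections import Counter

# Sketch of the Column Length Distribution profile: for a text column,
# map each value length to the percentage of rows having that length.
def length_distribution(values):
    counts = Counter(len(v) for v in values)
    total = len(values)
    return {length: 100.0 * n / total for length, n in counts.items()}

dist = length_distribution(["ab", "cd", "efg"])
print(dist)  # two rows of length 2, one row of length 3
```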
What does the Data Profiling Task Column Pattern Profile do?
|
Displays any patterns found in the column data and returns the regular expression pattern that matches the pattern.
|
|
What does the Data Profiling Task Candidate Key Profile do?
|
Identifies one or more columns that are unique across all rows; the percentage of uniqueness is shown.
|
|
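A minimal sketch of the uniqueness percentage the Candidate Key profile reports (plain Python, names invented for the example): count distinct key tuples against total rows.

```python
# Illustrative only: percent of uniqueness for a candidate key column set.
def uniqueness_percent(rows, key_columns):
    """Percent of rows covered by distinct key-column tuples."""
    keys = [tuple(row[c] for c in key_columns) for row in rows]
    return 100.0 * len(set(keys)) / len(keys)

rows = [
    {"id": 1, "name": "a"},
    {"id": 2, "name": "b"},
    {"id": 2, "name": "c"},  # duplicate id value
]
print(uniqueness_percent(rows, ["id"]))          # id alone: not fully unique
print(uniqueness_percent(rows, ["id", "name"]))  # 100.0: a candidate key
```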
What does the Data Profiling Task Functional Dependency Profile do?
|
Lists any columns that have value dependencies on other columns within the table, where the value of one column can be determined from the value of another column.
|
|
How many sources and destinations does the Import and Export Wizard let you work with?
|
1 of each
|
|
What is the package connection manager?
|
Sometimes called a package connection,
it is independent of project data sources and lets the different components of SSIS communicate with an object outside the package. Package connections can be source adapters, FTP or e-mail servers, or flat files. |
|
Data source:
|
a connection string that resides outside the package and can be used for more than one package
|
|
When are connections updated?
|
When a package is opened for editing in BIDS.
|
|
What are the six data sources?
|
ADO.NET
Excel
Flat File
OLE DB
Raw File
XML |
|
What are the 12 data destinations?
|
ADO.NET
Data Mining Model Training
DataReader
Dimension
Excel
Flat File
OLE DB
Partition Processing
Raw File
Recordset
SQL Server Compact Destination
SQL Server Destination |
|
How are the following data destinations used: ADO.Net
Data mining model training |
Allows data insertion by using an ADO.NET managed provider
Lets you pass data from the data flow into a data mining model in SSAS |
|
How are the following data destinations used: DataReader
Dimension |
Lets you put data in an ADO.NET recordset that can be programmatically referenced
Lets SSAS dimensions be processed directly from data flowing through the data flow |
|
How are the following data destinations used: Excel
Flat File |
Used for inserting data into Excel (including Excel 2007)
Allows insertion of data into a flat file, such as a comma-delimited or tab-delimited file |
|
How are the following data destinations used: OLE DB
Partition Processing |
Uses the OLE DB provider to insert rows into a destination system that allows an OLE DB connection
Lets SSAS partitions be processed directly from data flowing through the data flow |
|
How are the following data destinations used: SQL Server Compact Destination
SQL Server Destination |
Lets you send data to a mobile device running SQL Mobile
Provides a high-speed destination specific to SQL Server 2008 if the package is running on SQL Server |
|
How are the following data destinations used: Raw file; Recordset
|
Stores native SSIS data in a binary file useful for data staging
Takes the data flow data and creates a recordset in a package variable of type Object |
|
What are the nine logical row-level transformations?
|
Audit
Cache Transform
Character Map
Copy Column
Data Conversion
Derived Column
Export Column
Import Column
Row Count |
|
What is the purpose of the audit data flow transformation?
|
Adds additional columns to each row based on system package variables such as ExecutionStartTime and PackageName
|
|
What is the purpose of the Cache Transform Data Flow transformation?
|
Allows data that will be used in a lookup transformation to be cached and available for multiple lookup components
|
|
What is the purpose of the Character Map Data Flow transformation?
|
Performs common text operations, such as uppercase, and allows advanced linguistic bit conversion operations
|
|
What is the purpose of the Copy Column Data Flow transformation?
|
Duplicates column values in each row to new named columns
|
|
What is the purpose of the Data Conversion Data Flow transformation?
|
Creates new columns in each row based on new data types converted from other columns, such as converting text to numeric values
|
|
What is the purpose of the Derived Column Data Flow transformation?
|
Uses the SSIS expression language to perform in-place calculations on existing values;
alternatively, allows the addition of new columns based on expressions and calculations from other columns and variables |
|
What is the purpose of the Export Column Data Flow transformation?
|
Exports binary large object (BLOB) columns, one row at a time, to a file
|
|
What is the purpose of the Import Column Data Flow transformation?
|
Loads binary files such as images into the pipeline, intended for a BLOB data type destination
|
|
What is the purpose of the Row Count Data Flow transformation?
|
Tracks the number of rows that flow through the transformation and stores the number in a package variable after the final row
|
|
What are the six multi-input or multi-output transformations?
|
Conditional Split
Lookup
Merge
Merge Join
Multicast
Union All |
|
What is the purpose of the Conditional Split Data Flow transformation?
|
Routes or filters data based on Boolean expressions to one or more outputs; each row can be sent down only one output path
|
|
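The key behavior, each row going down only the first matching output path, can be sketched in plain Python (this is an analogy of the semantics, not the SSIS API):

```python
# Assumed semantics: test each row against the split conditions in order;
# the row goes to the FIRST matching output only, else to the default output.
def conditional_split(rows, conditions):
    """conditions: list of (output_name, predicate). Returns dict of outputs."""
    outputs = {name: [] for name, _ in conditions}
    outputs["default"] = []
    for row in rows:
        for name, predicate in conditions:
            if predicate(row):
                outputs[name].append(row)
                break  # a row is never sent to more than one output
        else:
            outputs["default"].append(row)
    return outputs

split = conditional_split(
    [{"amount": 5}, {"amount": 50}, {"amount": 500}],
    [("small", lambda r: r["amount"] < 10),
     ("medium", lambda r: r["amount"] < 100)],
)
print(split["small"], split["medium"], split["default"])
```

Note that the row with amount 5 also satisfies the "medium" condition, but the break after the first match keeps it out of that output.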
What is the purpose of the Lookup Data Flow transformation?
|
Allows matching between pipeline column values and external database tables;
additional columns can be added to the data flow from the external table |
|
What is the purpose of the Merge Data Flow transformation?
|
Combines the rows of two similar sorted inputs, one on top of the other, based on a defined sort key
|
|
What is the purpose of the Merge Join Data Flow transformation?
|
Joins the rows of two sorted inputs based on one or more defined join columns, adding columns from each source
|
|
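Why the inputs must be pre-sorted becomes clear from a sketch of the merge-join algorithm (illustrative Python, simplified to unique join keys; the real transformation also handles duplicate keys and outer joins):

```python
# Two-pointer inner join over inputs already sorted on the join key,
# combining columns from both sources in each output row.
def merge_join(left, right, key):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i][key] < right[j][key]:
            i += 1  # advance the side with the smaller key
        elif left[i][key] > right[j][key]:
            j += 1
        else:
            merged = dict(left[i])
            merged.update(right[j])  # add columns from the right source
            out.append(merged)
            i += 1
            j += 1
    return out

joined = merge_join(
    [{"id": 1, "name": "a"}, {"id": 2, "name": "b"}],
    [{"id": 2, "qty": 5}, {"id": 3, "qty": 7}],
    "id")
print(joined)  # only id 2 appears in both inputs
```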
What is the purpose of the Multicast Data Flow transformation?
|
Generates one or more identical outputs; every row is sent out every output path
|
|
What is the purpose of the Union All Data Flow transformation?
|
Combines one or more similar inputs, stacking rows one on top of another, based on matching columns
|
|
What are the six multi-row transformations?
|
Aggregate
Percent Sampling
Pivot
Row Sampling
Sort
Unpivot |
|
What is the purpose of the Aggregate Data Flow transformation?
|
Associates records based on defined groupings and generates aggregations such as SUM, MAX, MIN, and COUNT
|
|
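A minimal sketch of the grouping-plus-aggregation semantics (plain Python, function name invented), here computing SUM and COUNT per group:

```python
from collections import defaultdict

# Sketch of the Aggregate transformation: group rows by a column and
# produce aggregations (here SUM and COUNT) per group.
def aggregate_sum(rows, group_col, value_col):
    """Return {group_value: (sum, count)}."""
    result = defaultdict(lambda: [0, 0])
    for row in rows:
        result[row[group_col]][0] += row[value_col]
        result[row[group_col]][1] += 1
    return {k: tuple(v) for k, v in result.items()}

totals = aggregate_sum(
    [{"region": "E", "sales": 10}, {"region": "W", "sales": 5},
     {"region": "E", "sales": 20}],
    "region", "sales")
print(totals)  # {'E': (30, 2), 'W': (5, 1)}
```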
What is the purpose of the Percent Sampling Data Flow transformation?
|
Filters the input rows by allowing only a defined percent to be passed to the output path
|
|
What is the purpose of the Pivot Data Flow transformation?
|
Takes multiple input rows and pivots the rows to generate an output with more columns based on the original row values
|
|
What is the purpose of the Row Sampling Data Flow transformation?
|
Outputs a fixed number of rows, sampling the data from the entire input, no matter how much larger than the defined output the input is
|
|
What is the purpose of the Sort Data Flow transformation?
|
Orders the input based on defined sort columns and sort direction and allows the removal of duplicates across the sort columns
|
|
What is the purpose of the Unpivot Data Flow transformation?
|
Takes a single row and outputs multiple rows, moving column values to the new rows based on defined columns
|
|
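The wide-to-narrow reshaping can be sketched in plain Python (column names `attribute` and `value` are invented for the example; the real transformation lets you configure them):

```python
# Sketch of Unpivot semantics: one wide input row becomes multiple narrow
# output rows, with the pivoted column names moved into a key column.
def unpivot(row, key_columns, value_columns):
    out = []
    for col in value_columns:
        new_row = {k: row[k] for k in key_columns}  # carry the fixed columns
        new_row["attribute"] = col                  # former column name
        new_row["value"] = row[col]                 # former column value
        out.append(new_row)
    return out

wide = {"product": "bike", "q1": 100, "q2": 150}
narrow = unpivot(wide, ["product"], ["q1", "q2"])
print(narrow)
```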
What is the purpose of the OLE DB Command Data Flow transformation?
|
Performs database operations, such as updates and deletes, one row at a time, based on mapped parameters from input rows
|
|
What is the purpose of the Slowly Changing Dimension Data Flow transformation?
|
Processes dimension changes, including tracking dimension history and updating dimension values. The Slowly Changing Dimension transformation handles these common dimension change types: historical attributes, fixed attributes, and changing attributes
|
|
What is the purpose of the Data Mining Query Data Flow transformation?
|
Applies input rows against a data mining model for prediction
|
|
What is the purpose of the Fuzzy Grouping Data Flow transformation?
|
Associates column values with a set of rows based on similarity, for data cleansing
|
|
What is the purpose of the Fuzzy Lookup Data Flow transformation?
|
Joins a data flow input to a reference table based on column similarity. The similarity threshold setting specifies the closeness of allowed matches; a high setting means that matching values are closer in similarity
|
|
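The threshold idea can be illustrated with Python's `difflib` ratio; note the real Fuzzy Lookup uses its own similarity algorithm, not `difflib`, so this only demonstrates the threshold behavior:

```python
from difflib import SequenceMatcher

# Illustration of a similarity threshold: return the best reference entry
# whose similarity to the input value meets or exceeds the threshold.
def fuzzy_match(value, reference, threshold):
    best, best_score = None, 0.0
    for ref in reference:
        score = SequenceMatcher(None, value.lower(), ref.lower()).ratio()
        if score >= threshold and score > best_score:
            best, best_score = ref, score
    return best

refs = ["Seattle", "Portland", "Spokane"]
print(fuzzy_match("Seatle", refs, 0.8))   # misspelling still matches
print(fuzzy_match("Seatle", refs, 0.99))  # None: threshold too strict
```

Raising the threshold toward 1.0 demands near-exact matches, which mirrors the card's point that a high setting means matching values must be closer in similarity.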
What is the purpose of the Script Component Data Flow transformation?
|
Provides VB.NET scripting capabilities against rows, columns, inputs, and outputs in the data flow pipeline
|
|
What is the purpose of the Term Extraction Data Flow transformation?
|
Analyzes text input columns for English nouns and noun phrases
|
|
What is the purpose of the Term Lookup Data Flow transformation?
|
Analyzes text input columns against a user-defined set of words for association.
|
|
What are the eight advanced data-preparation transformations?
|
OLE DB Command
Slowly Changing Dimension
Data Mining Query
Fuzzy Grouping
Fuzzy Lookup
Script Component
Term Extraction
Term Lookup |