Case study -

Overview -

XYZ is an online training provider. They regularly conduct polls on the effectiveness of their training material. The polling data comes from various sources, such as online surveys and public events.

The polling data would be stored in two locations:

* An on-premises Microsoft SQL Server database

* Azure Data Lake Storage Gen2

The data in the data lake would be queried using PolyBase.

Each poll also has associated metadata, stored as JSON, which records the date of the poll and the number of people who took it.

Polling data is also taken via phone calls. Below are the security requirements for polling data taken via phone calls:

* The poll data must be uploaded by authorized users from authorized devices

* External contractors can only access their own polling data

* Access to the polling data would be granted on a per-Active Directory user basis

Other requirements:

* All data migration processes must be carried out using Azure Data Factory

* All of the data migrations must run automatically and be carried out during non-business hours

* All services and processes must be resilient to regional Azure outages

* All services must be monitored using Azure Monitor

* The performance of the on-premises SQL Server must also be monitored

* After 3 months, all polling data must be moved to low-cost storage

* All deployments must be performed using Azure DevOps

* Deployments must make use of templates

* No credentials or secrets of any kind may be exposed during deployments

You have to implement the deployment of Azure Data Factory pipelines.

Which of the following would you use for authorization of the deployments?
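The template and no-secrets requirements in this scenario are commonly satisfied by referencing Azure Key Vault from an ARM template parameters file, so the secret value never appears in source control or in pipeline logs. A minimal sketch of such a parameters file built as a Python dict (the vault ID, secret name, and parameter name are illustrative assumptions, not from the case study):

```python
import json

# ARM template parameters file that pulls a secret from Key Vault at
# deployment time instead of embedding it. Vault/secret names are examples.
parameters = {
    "$schema": "https://schema.management.azure.com/schemas/2019-04-01/deploymentParameters.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "sqlAdminPassword": {
            "reference": {
                "keyVault": {
                    "id": "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.KeyVault/vaults/<vault-name>"
                },
                "secretName": "sql-admin-password"
            }
        }
    }
}

print(json.dumps(parameters, indent=2))
```

Because the parameters file holds only a reference, the deployment pipeline resolves the secret at run time and nothing sensitive is checked in.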

Case study -

Overview -

XYZ is an online training provider. They regularly conduct polls on the effectiveness of their training material. The polling data comes from various sources, such as online surveys and public events.

The polling data would be stored in two locations:

* An on-premises Microsoft SQL Server database

* Azure Data Lake Storage Gen2

The data in the data lake would be queried using PolyBase.

Each poll also has associated metadata, stored as JSON, which records the date of the poll and the number of people who took it.

Polling data is also taken via phone calls. Below are the security requirements for polling data taken via phone calls:

* The poll data must be uploaded by authorized users from authorized devices

* External contractors can only access their own polling data

* Access to the polling data would be granted on a per-Active Directory user basis

Other requirements:

* All data migration processes must be carried out using Azure Data Factory

* All of the data migrations must run automatically and be carried out during non-business hours

* All services and processes must be resilient to regional Azure outages

* All services must be monitored using Azure Monitor

* The performance of the on-premises SQL Server must also be monitored

* After 3 months, all polling data must be moved to low-cost storage

* All deployments must be performed using Azure DevOps

* Deployments must make use of templates

* No credentials or secrets of any kind may be exposed during deployments

You have to implement the deployment of Azure Data Factory pipelines.

Which of the following would you use for authentication of the deployments?

Case study -

Overview -

XYZ is an online training provider. They regularly conduct polls on the effectiveness of their training material. The polling data comes from various sources, such as online surveys and public events.

The polling data would be stored in two locations:

* An on-premises Microsoft SQL Server database

* Azure Data Lake Storage Gen2

The data in the data lake would be queried using PolyBase.

Each poll also has associated metadata, stored as JSON, which records the date of the poll and the number of people who took it.

Polling data is also taken via phone calls. Below are the security requirements for polling data taken via phone calls:

* The poll data must be uploaded by authorized users from authorized devices

* External contractors can only access their own polling data

* Access to the polling data would be granted on a per-Active Directory user basis

Other requirements:

* All data migration processes must be carried out using Azure Data Factory

* All of the data migrations must run automatically and be carried out during non-business hours

* All services and processes must be resilient to regional Azure outages

* All services must be monitored using Azure Monitor

* The performance of the on-premises SQL Server must also be monitored

* After 3 months, all polling data must be moved to low-cost storage

* All deployments must be performed using Azure DevOps

* Deployments must make use of templates

* No credentials or secrets of any kind may be exposed during deployments

You have to create the storage account that would be used to store the polling data.

Which of the following would you use as the Account type?

Case study -

Overview -

XYZ is an online training provider. They regularly conduct polls on the effectiveness of their training material. The polling data comes from various sources, such as online surveys and public events.

The polling data would be stored in two locations:

* An on-premises Microsoft SQL Server database

* Azure Data Lake Storage Gen2

The data in the data lake would be queried using PolyBase.

Each poll also has associated metadata, stored as JSON, which records the date of the poll and the number of people who took it.

Polling data is also taken via phone calls. Below are the security requirements for polling data taken via phone calls:

* The poll data must be uploaded by authorized users from authorized devices

* External contractors can only access their own polling data

* Access to the polling data would be granted on a per-Active Directory user basis

Other requirements:

* All data migration processes must be carried out using Azure Data Factory

* All of the data migrations must run automatically and be carried out during non-business hours

* All services and processes must be resilient to regional Azure outages

* All services must be monitored using Azure Monitor

* The performance of the on-premises SQL Server must also be monitored

* After 3 months, all polling data must be moved to low-cost storage

* All deployments must be performed using Azure DevOps

* Deployments must make use of templates

* No credentials or secrets of any kind may be exposed during deployments

You have to create the storage account that would be used to store the polling data.

Which of the following would you use as the replication type?
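The "3 months to low-cost storage" requirement in this scenario is typically expressed as a blob lifecycle management policy on the storage account. A sketch of such a policy document as a Python dict (the rule name and container prefix are illustrative assumptions):

```python
import json

# Lifecycle management policy: tier blobs to the Cool access tier 90 days
# after last modification. Rule and container names are illustrative.
lifecycle_policy = {
    "rules": [
        {
            "name": "move-polling-data-to-cool",
            "enabled": True,
            "type": "Lifecycle",
            "definition": {
                "filters": {
                    "blobTypes": ["blockBlob"],
                    "prefixMatch": ["polling-data/"]
                },
                "actions": {
                    "baseBlob": {
                        "tierToCool": {"daysAfterModificationGreaterThan": 90}
                    }
                }
            }
        }
    ]
}

print(json.dumps(lifecycle_policy, indent=2))
```

The policy runs automatically on the storage account, so no scheduled job is needed to perform the tiering.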

Case study -

Overview -

XYZ is an online training provider. They regularly conduct polls on the effectiveness of their training material. The polling data comes from various sources, such as online surveys and public events.

The polling data would be stored in two locations:

* An on-premises Microsoft SQL Server database

* Azure Data Lake Storage Gen2

The data in the data lake would be queried using PolyBase.

Each poll also has associated metadata, stored as JSON, which records the date of the poll and the number of people who took it.

Polling data is also taken via phone calls. Below are the security requirements for polling data taken via phone calls:

* The poll data must be uploaded by authorized users from authorized devices

* External contractors can only access their own polling data

* Access to the polling data would be granted on a per-Active Directory user basis

Other requirements:

* All data migration processes must be carried out using Azure Data Factory

* All of the data migrations must run automatically and be carried out during non-business hours

* All services and processes must be resilient to regional Azure outages

* All services must be monitored using Azure Monitor

* The performance of the on-premises SQL Server must also be monitored

* After 3 months, all polling data must be moved to low-cost storage

* All deployments must be performed using Azure DevOps

* Deployments must make use of templates

* No credentials or secrets of any kind may be exposed during deployments

You have to ensure the polling data security requirements are met.

Which of the following would you set for PolyBase?

Case study -

Overview -

XYZ is an online training provider.

Current Environment -

The company currently has Microsoft SQL databases that are split into different categories or tiers. Some of the databases are used by internal users, and some by external partners and external distributors.

Below is the list of applications, tiers, and their individual requirements:

(image included)

Below are the current requirements of the company:

* For Tier 4 and Tier 5 databases, the backup strategy must include the following:

- Transaction log backup every hour

- Differential backup every day

- Full backup every week

* Backup strategies must be in place for all standalone Azure SQL databases, using the methods available with Azure SQL databases

* The Tier 1 database must implement the following data masking logic:

- For Data type XYZ-A – Mask 4 or fewer string data type characters

- For Data type XYZ-B – Expose the first letter and mask the domain

- For Data type XYZ-C – Mask everything except the characters at the beginning and the end

* All certificates and keys are internally managed in on-premises data stores

* For Tier 2 databases, if there are any conflicts during the data transfer from on-premises, preference should be given to the on-premises data.

* Monitoring must be set up on every database

* Applications with Tiers 6 through 8 must ensure that unexpected resource storage usage is immediately reported to IT data engineers.

* Azure SQL Data Warehouse would be used to gather data from multiple internal and external databases.

* The Azure SQL Data Warehouse must be optimized to use data from its cache

* The following metrics must be available for the cache:

- Metric XYZ-A – Low cache hit %, high cache usage %

- Metric XYZ-B – Low cache hit %, low cache usage %

- Metric XYZ-C – High cache hit %, high cache usage %

* The reporting data for external partners must be stored in Azure storage. The data should be made available during regular business hours in connecting regions.

* The reporting for Tier 9 needs to be moved to Event Hubs.

* The reporting for Tier 10 needs to be moved to Azure Blobs.

The following issues have been identified in the setup:

* The external partners have control over the data formats, types, and schemas

* For external clients, the queries can't be changed or optimized

* The database development staff are familiar with the T-SQL language

* Because of the size and amount of data, some applications and reporting features are not performing at SLA levels.

The Azure Data Factory instance must meet the requirements to move the data from the on-premises SQL Servers to Azure.

Which of the following would you use as the integration runtime?

Case study -

Overview -

XYZ is an online training provider.

Current Environment -

The company currently has Microsoft SQL databases that are split into different categories or tiers. Some of the databases are used by internal users, and some by external partners and external distributors.

Below is the list of applications, tiers, and their individual requirements:

(image included)

Below are the current requirements of the company:

* For Tier 4 and Tier 5 databases, the backup strategy must include the following:

- Transaction log backup every hour

- Differential backup every day

- Full backup every week

* Backup strategies must be in place for all standalone Azure SQL databases, using the methods available with Azure SQL databases

* The Tier 1 database must implement the following data masking logic:

- For Data type XYZ-A – Mask 4 or fewer string data type characters

- For Data type XYZ-B – Expose the first letter and mask the domain

- For Data type XYZ-C – Mask everything except the characters at the beginning and the end

* All certificates and keys are internally managed in on-premises data stores

* For Tier 2 databases, if there are any conflicts during the data transfer from on-premises, preference should be given to the on-premises data.

* Monitoring must be set up on every database

* Applications with Tiers 6 through 8 must ensure that unexpected resource storage usage is immediately reported to IT data engineers.

* Azure SQL Data Warehouse would be used to gather data from multiple internal and external databases.

* The Azure SQL Data Warehouse must be optimized to use data from its cache

* The following metrics must be available for the cache:

- Metric XYZ-A – Low cache hit %, high cache usage %

- Metric XYZ-B – Low cache hit %, low cache usage %

- Metric XYZ-C – High cache hit %, high cache usage %

* The reporting data for external partners must be stored in Azure storage. The data should be made available during regular business hours in connecting regions.

* The reporting for Tier 9 needs to be moved to Event Hubs.

* The reporting for Tier 10 needs to be moved to Azure Blobs.

The following issues have been identified in the setup:

* The external partners have control over the data formats, types, and schemas

* For external clients, the queries can't be changed or optimized

* The database development staff are familiar with the T-SQL language

* Because of the size and amount of data, some applications and reporting features are not performing at SLA levels.

The Azure Data Factory instance must meet the requirements to move the data from the on-premises SQL Servers to Azure.

Which of the following should you use as the masking function for Data type XYZ-A?

Case study -

Overview -

XYZ is an online training provider.

Current Environment -

The company currently has Microsoft SQL databases that are split into different categories or tiers. Some of the databases are used by internal users, and some by external partners and external distributors.

Below is the list of applications, tiers, and their individual requirements:

(image included)

Below are the current requirements of the company:

* For Tier 4 and Tier 5 databases, the backup strategy must include the following:

- Transaction log backup every hour

- Differential backup every day

- Full backup every week

* Backup strategies must be in place for all standalone Azure SQL databases, using the methods available with Azure SQL databases

* The Tier 1 database must implement the following data masking logic:

- For Data type XYZ-A – Mask 4 or fewer string data type characters

- For Data type XYZ-B – Expose the first letter and mask the domain

- For Data type XYZ-C – Mask everything except the characters at the beginning and the end

* All certificates and keys are internally managed in on-premises data stores

* For Tier 2 databases, if there are any conflicts during the data transfer from on-premises, preference should be given to the on-premises data.

* Monitoring must be set up on every database

* Applications with Tiers 6 through 8 must ensure that unexpected resource storage usage is immediately reported to IT data engineers.

* Azure SQL Data Warehouse would be used to gather data from multiple internal and external databases.

* The Azure SQL Data Warehouse must be optimized to use data from its cache

* The following metrics must be available for the cache:

- Metric XYZ-A – Low cache hit %, high cache usage %

- Metric XYZ-B – Low cache hit %, low cache usage %

- Metric XYZ-C – High cache hit %, high cache usage %

* The reporting data for external partners must be stored in Azure storage. The data should be made available during regular business hours in connecting regions.

* The reporting for Tier 9 needs to be moved to Event Hubs.

* The reporting for Tier 10 needs to be moved to Azure Blobs.

The following issues have been identified in the setup:

* The external partners have control over the data formats, types, and schemas

* For external clients, the queries can't be changed or optimized

* The database development staff are familiar with the T-SQL language

* Because of the size and amount of data, some applications and reporting features are not performing at SLA levels.

Which of the following should you use as the masking function for Data type XYZ-B?

Case study -

Overview -

XYZ is an online training provider.

Current Environment -

The company currently has Microsoft SQL databases that are split into different categories or tiers. Some of the databases are used by internal users, and some by external partners and external distributors.

Below is the list of applications, tiers, and their individual requirements:

(image included)

Below are the current requirements of the company:

* For Tier 4 and Tier 5 databases, the backup strategy must include the following:

- Transaction log backup every hour

- Differential backup every day

- Full backup every week

* Backup strategies must be in place for all standalone Azure SQL databases, using the methods available with Azure SQL databases

* The Tier 1 database must implement the following data masking logic:

- For Data type XYZ-A – Mask 4 or fewer string data type characters

- For Data type XYZ-B – Expose the first letter and mask the domain

- For Data type XYZ-C – Mask everything except the characters at the beginning and the end

* All certificates and keys are internally managed in on-premises data stores

* For Tier 2 databases, if there are any conflicts during the data transfer from on-premises, preference should be given to the on-premises data.

* Monitoring must be set up on every database

* Applications with Tiers 6 through 8 must ensure that unexpected resource storage usage is immediately reported to IT data engineers.

* Azure SQL Data Warehouse would be used to gather data from multiple internal and external databases.

* The Azure SQL Data Warehouse must be optimized to use data from its cache

* The following metrics must be available for the cache:

- Metric XYZ-A – Low cache hit %, high cache usage %

- Metric XYZ-B – Low cache hit %, low cache usage %

- Metric XYZ-C – High cache hit %, high cache usage %

* The reporting data for external partners must be stored in Azure storage. The data should be made available during regular business hours in connecting regions.

* The reporting for Tier 9 needs to be moved to Event Hubs.

* The reporting for Tier 10 needs to be moved to Azure Blobs.

The following issues have been identified in the setup:

* The external partners have control over the data formats, types, and schemas

* For external clients, the queries can't be changed or optimized

* The database development staff are familiar with the T-SQL language

* Because of the size and amount of data, some applications and reporting features are not performing at SLA levels.

Which of the following should you use as the masking function for Data type XYZ-C?
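The three masking rules described in this case study correspond to the default, email, and custom-string styles of SQL dynamic data masking. A plain-Python simulation of the described behavior (the function names and the exact mask characters are assumptions for illustration, not the SQL syntax itself):

```python
def mask_default(value: str) -> str:
    # XYZ-A: mask string data with 4 or fewer mask characters.
    return "X" * min(4, len(value))

def mask_email(value: str) -> str:
    # XYZ-B: expose the first letter and mask the rest, including the domain.
    return value[0] + "XXX@XXXX.com"

def mask_partial(value: str, prefix: int = 1, suffix: int = 1) -> str:
    # XYZ-C: expose characters at the beginning and end, mask the middle.
    middle = max(0, len(value) - prefix - suffix)
    return value[:prefix] + "X" * middle + value[len(value) - suffix:]

print(mask_default("secret"))          # XXXX
print(mask_email("john@contoso.com"))  # jXXX@XXXX.com
print(mask_partial("4567891234"))      # 4XXXXXXXX4
```

Note that the masking is applied at query time over unchanged stored data, which is why the rules are declared per column rather than per row.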

Case study -

Overview -

XYZ is an online training provider.

Current Environment -

The company currently has Microsoft SQL databases that are split into different categories or tiers. Some of the databases are used by internal users, and some by external partners and external distributors.

Below is the list of applications, tiers, and their individual requirements:

(image included)

Below are the current requirements of the company:

* For Tier 4 and Tier 5 databases, the backup strategy must include the following:

- Transaction log backup every hour

- Differential backup every day

- Full backup every week

* Backup strategies must be in place for all standalone Azure SQL databases, using the methods available with Azure SQL databases

* The Tier 1 database must implement the following data masking logic:

- For Data type XYZ-A – Mask 4 or fewer string data type characters

- For Data type XYZ-B – Expose the first letter and mask the domain

- For Data type XYZ-C – Mask everything except the characters at the beginning and the end

* All certificates and keys are internally managed in on-premises data stores

* For Tier 2 databases, if there are any conflicts during the data transfer from on-premises, preference should be given to the on-premises data.

* Monitoring must be set up on every database

* Applications with Tiers 6 through 8 must ensure that unexpected resource storage usage is immediately reported to IT data engineers.

* Azure SQL Data Warehouse would be used to gather data from multiple internal and external databases.

* The Azure SQL Data Warehouse must be optimized to use data from its cache

* The following metrics must be available for the cache:

- Metric XYZ-A – Low cache hit %, high cache usage %

- Metric XYZ-B – Low cache hit %, low cache usage %

- Metric XYZ-C – High cache hit %, high cache usage %

* The reporting data for external partners must be stored in Azure storage. The data should be made available during regular business hours in connecting regions.

* The reporting for Tier 9 needs to be moved to Event Hubs.

* The reporting for Tier 10 needs to be moved to Azure Blobs.

The following issues have been identified in the setup:

* The external partners have control over the data formats, types, and schemas

* For external clients, the queries can't be changed or optimized

* The database development staff are familiar with the T-SQL language

* Because of the size and amount of data, some applications and reporting features are not performing at SLA levels.

You have to implement logging for monitoring the data warehousing solution.

Which of the following would you log?

Case study -

Overview -

XYZ is an online training provider.

Current Environment -

The company currently has Microsoft SQL databases that are split into different categories or tiers. Some of the databases are used by internal users, and some by external partners and external distributors.

Below is the list of applications, tiers, and their individual requirements:

(image included)

Below are the current requirements of the company:

* For Tier 4 and Tier 5 databases, the backup strategy must include the following:

- Transaction log backup every hour

- Differential backup every day

- Full backup every week

* Backup strategies must be in place for all standalone Azure SQL databases, using the methods available with Azure SQL databases

* The Tier 1 database must implement the following data masking logic:

- For Data type XYZ-A – Mask 4 or fewer string data type characters

- For Data type XYZ-B – Expose the first letter and mask the domain

- For Data type XYZ-C – Mask everything except the characters at the beginning and the end

* All certificates and keys are internally managed in on-premises data stores

* For Tier 2 databases, if there are any conflicts during the data transfer from on-premises, preference should be given to the on-premises data.

* Monitoring must be set up on every database

* Applications with Tiers 6 through 8 must ensure that unexpected resource storage usage is immediately reported to IT data engineers.

* Azure SQL Data Warehouse would be used to gather data from multiple internal and external databases.

* The Azure SQL Data Warehouse must be optimized to use data from its cache

* The following metrics must be available for the cache:

- Metric XYZ-A – Low cache hit %, high cache usage %

- Metric XYZ-B – Low cache hit %, low cache usage %

- Metric XYZ-C – High cache hit %, high cache usage %

* The reporting data for external partners must be stored in Azure storage. The data should be made available during regular business hours in connecting regions.

* The reporting for Tier 9 needs to be moved to Event Hubs.

* The reporting for Tier 10 needs to be moved to Azure Blobs.

The following issues have been identified in the setup:

* The external partners have control over the data formats, types, and schemas

* For external clients, the queries can't be changed or optimized

* The database development staff are familiar with the T-SQL language

* Because of the size and amount of data, some applications and reporting features are not performing at SLA levels.

You need to fulfill the following requirement from the case study:

"Applications with Tiers 6 through 8 must ensure that unexpected resource storage usage is immediately reported to IT data engineers."

Which of the following would you implement for this requirement?
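The "immediately reported" wording in this requirement points toward an Azure Monitor metric alert on storage usage, wired to an action group for the IT data engineers. A sketch of the alert definition as a Python dict (the metric name, threshold, window, and action group are illustrative assumptions):

```python
# Sketch of an Azure Monitor metric alert for unexpected storage growth on
# the Tier 6-8 databases. All names and values are illustrative.
storage_alert = {
    "name": "tier6-8-unexpected-storage-usage",
    "severity": 1,
    "evaluationFrequency": "PT1M",   # evaluate every minute ("immediately")
    "windowSize": "PT5M",            # over a 5-minute lookback window
    "criteria": {
        "metricName": "storage_percent",
        "operator": "GreaterThan",
        "threshold": 80,
        "timeAggregation": "Maximum"
    },
    "actions": [{"actionGroupId": "<it-data-engineers-action-group>"}]
}

print(storage_alert["criteria"]["operator"], storage_alert["criteria"]["threshold"])
```

Tightening `evaluationFrequency` is what makes the report "immediate"; the action group then fans out to email, SMS, or a webhook.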
