[May-2023] High Quality Braindump2go AZ-305 PDF and VCE AZ-305 310Q Free Share [Q125-Q165]

May/2023 Latest Braindump2go AZ-305 Exam Dumps with PDF and VCE Free Updated Today! Following are some new Braindump2go AZ-305 Real Exam Questions!

QUESTION 125
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
Your company deploys several virtual machines on-premises and to Azure. ExpressRoute is deployed and configured for on-premises to Azure connectivity.
Several virtual machines exhibit network connectivity issues.
You need to analyze the network traffic to identify whether packets are being allowed or denied to the virtual machines.
Solution: Install and configure the Azure Monitoring agent and the Dependency Agent on all the virtual machines. Use VM insights in Azure Monitor to analyze the network traffic.
Does this meet the goal?

A. Yes
B. No

Answer: B
Explanation:
Use the Azure Monitor agent if you need to:
Collect guest logs and metrics from any machine in Azure, in other clouds, or on-premises.
Use the Dependency agent if you need to:
Use the Map feature of VM insights or the Service Map solution.
Note: Instead, use Azure Network Watcher. IP flow verify allows you to detect traffic filtering issues at the VM level.
IP flow verify checks if a packet is allowed or denied to or from a virtual machine. The information consists of direction, protocol, local IP, remote IP, local port, and remote port. If the packet is denied by a security group, the name of the rule that denied the packet is returned. While any source or destination IP can be chosen, IP flow verify helps administrators quickly diagnose connectivity issues from or to the internet and from or to the on-premises environment.
Reference:
https://docs.microsoft.com/en-us/azure/network-watcher/network-watcher-ip-flow-verify-overview
https://docs.microsoft.com/en-us/azure/azure-monitor/agents/agents-overview#dependency-agent
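For reference, the IP flow verify check described above can also be run programmatically. The following is a minimal sketch using the azure-mgmt-network SDK; the subscription, resource group, Network Watcher name, VM name, and IP addresses are placeholders, not values from the question.

```python
# Sketch: run Network Watcher IP flow verify against a VM (placeholder names).
from azure.identity import DefaultAzureCredential
from azure.mgmt.network import NetworkManagementClient
from azure.mgmt.network.models import VerificationIPFlowParameters

subscription_id = "<subscription-id>"
client = NetworkManagementClient(DefaultAzureCredential(), subscription_id)

params = VerificationIPFlowParameters(
    target_resource_id=(
        "/subscriptions/<subscription-id>/resourceGroups/rg-app"
        "/providers/Microsoft.Compute/virtualMachines/vm1"
    ),
    direction="Inbound",                 # or "Outbound"
    protocol="TCP",
    local_ip_address="10.0.0.4",         # private IP of the VM
    local_port="443",
    remote_ip_address="192.168.10.20",   # on-premises client reached over ExpressRoute
    remote_port="60000",
)

# Network Watcher instance that covers the VM's region, e.g. "NetworkWatcher_eastus".
result = client.network_watchers.begin_verify_ip_flow(
    "NetworkWatcherRG", "NetworkWatcher_eastus", params
).result()

print(result.access)     # "Allow" or "Deny"
print(result.rule_name)  # NSG rule that allowed or denied the packet
```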

QUESTION 126
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You plan to deploy multiple instances of an Azure web app across several Azure regions.
You need to design an access solution for the app. The solution must meet the following replication requirements:
Support rate limiting.
Balance requests between all instances.
Ensure that users can access the app in the event of a regional outage.
Solution: You use Azure Traffic Manager to provide access to the app.
Does this meet the goal?

A. Yes
B. No

Answer: B
Explanation:
Azure Traffic Manager is a DNS-based traffic load balancer. This service allows you to distribute traffic to your public facing applications across the global Azure regions. Traffic Manager also provides your public endpoints with high availability and quick responsiveness. It does not provide rate limiting.
Note: Azure Front Door would meet the requirements. The Azure Web Application Firewall (WAF) rate limit rule for Azure Front Door controls the number of requests allowed from clients during a one-minute duration.
Reference:
https://docs.microsoft.com/en-us/azure/app-service/web-sites-traffic-manager https://docs.microsoft.com/en-us/azure/traffic-manager/traffic-manager-overview
https://docs.microsoft.com/en-us/azure/web-application-firewall/afds/waf-front-door-rate-limit-powershell
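To illustrate the rate-limit rule mentioned above, the snippet below sketches the customRules payload of a Front Door WAF policy as a Python dictionary. The rule name, threshold, path, and SKU are assumptions for the example; the structure follows the custom-rule schema described in the referenced article.

```python
# Sketch: a Front Door WAF custom rule that rate-limits clients to 1000 requests
# per minute for URIs containing /promo/ (values are illustrative, not prescriptive).
rate_limit_rule = {
    "name": "RateLimitPromo",
    "priority": 1,
    "ruleType": "RateLimitRule",
    "rateLimitDurationInMinutes": 1,
    "rateLimitThreshold": 1000,          # requests allowed per client per minute
    "matchConditions": [
        {
            "matchVariable": "RequestUri",
            "operator": "Contains",
            "matchValue": ["/promo/"],
        }
    ],
    "action": "Block",
}

waf_policy = {
    "location": "Global",
    "sku": {"name": "Premium_AzureFrontDoor"},
    "properties": {
        "policySettings": {"enabledState": "Enabled", "mode": "Prevention"},
        "customRules": {"rules": [rate_limit_rule]},
    },
}
# waf_policy could then be deployed with an ARM/Bicep template or the
# azure-mgmt-frontdoor SDK and associated with the Front Door profile.
```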

QUESTION 127
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You plan to deploy multiple instances of an Azure web app across several Azure regions.
You need to design an access solution for the app. The solution must meet the following replication requirements:
Support rate limiting.
Balance requests between all instances.
Ensure that users can access the app in the event of a regional outage.
Solution: You use Azure Load Balancer to provide access to the app.
Does this meet the goal?

A. Yes
B. No

Answer: B
Explanation:
Azure Application Gateway and Azure Load Balancer do not support rate or connection limits.
Note: Azure Front Door would meet the requirements. The Azure Web Application Firewall (WAF) rate limit rule for Azure Front Door controls the number of requests allowed from clients during a one-minute duration.
Reference:
https://www.nginx.com/blog/nginx-plus-and-azure-load-balancers-on-microsoft-azure/
https://docs.microsoft.com/en-us/azure/web-application-firewall/afds/waf-front-door-rate-limit-powershell

QUESTION 128
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You plan to deploy multiple instances of an Azure web app across several Azure regions.
You need to design an access solution for the app. The solution must meet the following replication requirements:
Support rate limiting.
Balance requests between all instances.
Ensure that users can access the app in the event of a regional outage.
Solution: You use Azure Application Gateway to provide access to the app.
Does this meet the goal?

A. Yes
B. No

Answer: B
Explanation:
Azure Application Gateway and Azure Load Balancer do not support rate or connection limits.
Note: Azure Front Door would meet the requirements. The Azure Web Application Firewall (WAF) rate limit rule for Azure Front Door controls the number of requests allowed from clients during a one-minute duration.
Reference:
https://www.nginx.com/blog/nginx-plus-and-azure-load-balancers-on-microsoft-azure/
https://docs.microsoft.com/en-us/azure/web-application-firewall/afds/waf-front-door-rate-limit-powershell

QUESTION 129
Your company has an Azure Web App that runs via the Premium App Service Plan.
A development team will be using the Azure Web App.
You have to configure the Azure Web App so that it fulfills the following requirements:
– Provide the ability to switch the web app from the current version to a newer version
– Provide developers with the ability to test newer versions of the application before the switch to the newer version occurs
– Ensure that the application version can be rolled back
– Minimize downtime
Which of the following can be used for this requirement?

A. Create a new App Service Plan
B. Make use of deployment slots
C. Map a custom domain
D. Backup the Azure Web App

Answer: B
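Deployment slots let the team stage a new version, test it, and then swap it into production; the swap can be repeated to roll back. Below is a minimal sketch that drives the Azure CLI from Python; the app, resource group, and slot names are placeholders.

```python
# Sketch: create a staging slot and swap it with production (placeholder names).
import subprocess

app, rg = "app1-web", "rg-app1"

def az(*args):
    subprocess.run(["az", *args], check=True)

# 1. Create a staging slot (Premium plans support slots) and deploy the new version to it.
az("webapp", "deployment", "slot", "create",
   "--name", app, "--resource-group", rg, "--slot", "staging")

# 2. After developers test https://app1-web-staging.azurewebsites.net, swap with production.
az("webapp", "deployment", "slot", "swap",
   "--name", app, "--resource-group", rg,
   "--slot", "staging", "--target-slot", "production")

# 3. To roll back, run the same swap again: the previous version is still
#    sitting in the staging slot after the first swap, so downtime is minimal.
```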

QUESTION 130
You have to deploy an Azure SQL database named db1 for your company. The database must meet the following security requirements:
– When IT help desk supervisors query a database table named customers, they must be able to see the full number of each credit card
– When IT help desk operators query a database table named customers, they must only see the last four digits of each credit card number
– A column named Credit Card rating in the customers table must never appear in plain text in the database system.
– Only client applications must be able to decrypt the information that is stored in this column
Which of the following can be implemented for the Credit Card rating column security requirement?

A. Always Encrypted
B. Azure Advanced Threat Protection
C. Transparent Data Encryption
D. Dynamic Data Masking

Answer: A
Explanation:
https://docs.microsoft.com/en-us/sql/relational-databases/security/encryption/always-encrypted-database-engine?view=sql-server-ver15
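To show what "only client applications can decrypt" means in practice, here is a hedged sketch of a client connection that enables Always Encrypted through the Microsoft ODBC driver; the server, database, credentials, and column name are placeholders based on the scenario.

```python
# Sketch: query the customers table with Always Encrypted enabled on the client.
# The ODBC Driver 17+ for SQL Server decrypts protected columns transparently
# when ColumnEncryption=Enabled and the client can access the column master key.
import pyodbc

conn_str = (
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:<server-name>.database.windows.net,1433;"
    "Database=db1;"
    "Uid=<user>;Pwd=<password>;"
    "Encrypt=yes;"
    "ColumnEncryption=Enabled;"   # client-side decryption of Always Encrypted columns
)

with pyodbc.connect(conn_str) as conn:
    row = conn.execute(
        "SELECT TOP 1 [Credit Card rating] FROM dbo.customers"
    ).fetchone()
    print(row[0])  # plaintext here; the database engine only ever sees ciphertext
```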

QUESTION 131
You have an Azure Active Directory (Azure AD) tenant that syncs with an on-premises Active Directory domain.
Your company has a line-of-business (LOB) application that was developed internally.
You need to implement SAML single sign-on (SSO) and enforce multi-factor authentication (MFA) when users attempt to access the application from an unknown location.
Which two features should you include in the solution? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

A. Azure AD enterprise applications
B. Azure AD Identity Protection
C. Azure Application Gateway
D. Conditional Access policies
E. Azure AD Privileged Identity Management (PIM)

Answer: AD
Explanation:
https://learn.microsoft.com/en-us/azure/active-directory/app-proxy/application-proxy-configure-single-sign-on-on-premises-apps

QUESTION 132
You are designing an Azure governance solution.
All Azure resources must be easily identifiable based on the following operational information: environment, owner, department, and cost center.
You need to ensure that you can use the operational information when you generate reports for the Azure resources.
What should you include in the solution?

A. Azure Active Directory (Azure AD) administrative units
B. an Azure data catalog that uses the Azure REST API as a data source
C. an Azure policy that enforces tagging rules
D. an Azure management group that uses parent groups to create a hierarchy

Answer: C
Explanation:
You use Azure Policy to enforce tagging rules and conventions. By creating a policy, you avoid the scenario of resources being deployed to your subscription that don’t have the expected tags for your organization. Instead of manually applying tags or searching for resources that aren’t compliant, you create a policy that automatically applies the needed tags during deployment.
Note: Organizing cloud-based resources is a crucial task for IT, unless you only have simple deployments. Use naming and tagging standards to organize your resources for these reasons:
Resource management: Your IT teams will need to quickly locate resources associated with specific workloads, environments, ownership groups, or other important information. Organizing resources is critical to assigning organizational roles and access permissions for resource management.
Reference:
https://docs.microsoft.com/en-us/azure/cloud-adoption-framework/decision-guides/resource-tagging
https://docs.microsoft.com/en-us/azure/azure-resource-manager/management/tag-policies
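As an illustration of such a tagging policy, the sketch below defines a policy rule that denies resources deployed without a required tag. The tag name and the use of azure-mgmt-resource's PolicyClient are assumptions for the example, not part of the question.

```python
# Sketch: deny any resource that is deployed without a "costCenter" tag.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import PolicyClient
from azure.mgmt.resource.policy.models import PolicyDefinition

policy_rule = {
    "if": {"field": "tags['costCenter']", "exists": "false"},
    "then": {"effect": "deny"},
}

client = PolicyClient(DefaultAzureCredential(), "<subscription-id>")
client.policy_definitions.create_or_update(
    policy_definition_name="require-costcenter-tag",
    parameters=PolicyDefinition(
        display_name="Require a costCenter tag on resources",
        policy_rule=policy_rule,
        mode="Indexed",
    ),
)
# The same pattern can be repeated for environment, owner, and department tags,
# and the definitions assigned at subscription or management-group scope.
```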

QUESTION 133
You plan to automate the deployment of resources to Azure subscriptions.
What is a difference between using Azure Blueprints and Azure Resource Manager (ARM) templates?

A. ARM templates remain connected to the deployed resources.
B. Only ARM templates can contain policy definitions.
C. Blueprints remain connected to the deployed resources.
D. Only Blueprints can contain policy definitions.

Answer: C
Explanation:
With Azure Blueprints, the relationship between the blueprint definition (what should be deployed) and the blueprint assignment (what was deployed) is preserved. This connection supports improved tracking and auditing of deployments. Azure Blueprints can also upgrade several subscriptions at once that are governed by the same blueprint.
Reference:
https://docs.microsoft.com/en-us/answers/questions/26851/how-is-azure-blue-prints-different-from-resource-m.html

QUESTION 134
A company named Contoso, Ltd. has an Azure Active Directory (Azure AD) tenant that is integrated with Microsoft 365 and an Azure subscription.
Contoso has an on-premises identity infrastructure. The infrastructure includes servers that run Active Directory Domain Services (AD DS) and Azure AD Connect.
Contoso has a partnership with a company named Fabrikam, Inc. Fabrikam has an Active Directory forest and a Microsoft 365 tenant. Fabrikam has the same on-premises identity infrastructure components as Contoso.
A team of 10 developers from Fabrikam will work on an Azure solution that will be hosted in the Azure subscription of Contoso. The developers must be added to the Contributor role for a resource group in the Contoso subscription.
You need to recommend a solution to ensure that Contoso can assign the role to the 10 Fabrikam developers. The solution must ensure that the Fabrikam developers use their existing credentials to access resources.
What should you recommend?

A. Configure a forest trust between the on-premises Active Directory forests of Contoso and Fabrikam.
B. Configure an organization relationship between the Office 365 tenants of Fabrikam and Contoso.
C. In the Azure AD tenant of Contoso, create guest accounts for the Fabrikam developers.
D. In the Azure AD tenant of Contoso, create cloud-only user accounts for the Fabrikam developers.

Answer: C
Explanation:
Collaborate with any partner using their identities
With Azure AD B2B, the partner uses their own identity management solution, so there is no external administrative overhead for your organization. Guest users sign in to your apps and services with their own work, school, or social identities.
The partner uses their own identities and credentials, whether or not they have an Azure AD account.
You don’t need to manage external accounts or passwords.
You don’t need to sync accounts or manage account lifecycles.
Reference:
https://docs.microsoft.com/en-us/azure/active-directory/external-identities/what-is-b2b
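To make the guest-account approach concrete, this sketch invites a Fabrikam developer through the Microsoft Graph invitations endpoint. The token acquisition and email address are placeholders; assigning the Contributor role to a group that contains the guests would be done separately.

```python
# Sketch: invite an external (Fabrikam) user as an Azure AD B2B guest via Microsoft Graph.
import requests

token = "<access-token-with-User.Invite.All>"   # e.g. acquired with MSAL

invitation = {
    "invitedUserEmailAddress": "dev1@fabrikam.com",
    "inviteRedirectUrl": "https://portal.azure.com",
    "sendInvitationMessage": True,
}

resp = requests.post(
    "https://graph.microsoft.com/v1.0/invitations",
    headers={"Authorization": f"Bearer {token}"},
    json=invitation,
    timeout=30,
)
resp.raise_for_status()
guest_id = resp.json()["invitedUser"]["id"]
print(guest_id)  # the guest can now be added to the group that holds the Contributor role
```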

QUESTION 135
You are designing a microservices architecture that will support a web application.
The solution must meet the following requirements:
– Allow independent upgrades to each microservice
– Deploy the solution on-premises and to Azure
– Set policies for performing automatic repairs to the microservices
– Support low-latency and hyper-scale operations
You need to recommend a technology.
What should you recommend?

A. Azure Service Fabric
B. Azure Container Service
C. Azure Container Instance
D. Azure Virtual Machine Scale Set

Answer: A
Explanation:
https://docs.microsoft.com/en-us/azure/service-fabric/service-fabric-overview

QUESTION 136
You plan to deploy an Azure App Service web app that will have multiple instances across multiple Azure regions.
You need to recommend a load balancing service for the planned deployment. The solution must meet the following requirements:
– Maintain access to the app in the event of a regional outage.
– Support Azure Web Application Firewall (WAF).
– Support cookie-based affinity.
– Support URL routing.
What should you include in the recommendation?

A. Azure Front Door
B. Azure Load Balancer
C. Azure Traffic Manager
D. Azure Application Gateway

Answer: A
Explanation:
Azure Front Door is a global, Layer 7 entry point that balances requests across the regional instances and fails over automatically if a region becomes unavailable. It supports Azure Web Application Firewall (WAF), cookie-based session affinity, and URL path-based routing. Azure Load Balancer is a Layer 4 service and supports none of these features, Azure Application Gateway is a regional service, and Azure Traffic Manager provides DNS-based routing only.
Reference:
https://docs.microsoft.com/en-us/azure/frontdoor/front-door-overview

QUESTION 137
You have an Azure subscription.
Your on-premises network contains a file server named Server1. Server1 stores 5 TB of company files that are accessed rarely.
You plan to copy the files to Azure Storage.
You need to implement a storage solution for the files that meets the following requirements:
– The files must be available within 24 hours of being requested.
– Storage costs must be minimized.
Which two possible storage solutions achieve this goal? Each correct answer presents a complete solution. NOTE: Each correct selection is worth one point.

A. Create a general-purpose v1 storage account. Create a blob container and copy the files to the blob container.
B. Create a general-purpose v2 storage account that is configured for the Hot default access tier.
Create a blob container, copy the files to the blob container, and set each file to the Archive access tier.
C. Create a general-purpose v1 storage account. Create a file share in the storage account and copy the files to the file share.
D. Create a general-purpose v2 storage account that is configured for the Cool default access tier.
Create a file share in the storage account and copy the files to the file share.
E. Create an Azure Blob storage account that is configured for the Cool default access tier.
Create a blob container, copy the files to the blob container, and set each file to the Archive access tier.

Answer: BE
Explanation:
To minimize costs: The Archive tier is optimized for storing data that is rarely accessed and stored for at least 180 days with flexible latency requirements (on the order of hours).
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers
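For illustration, moving individual blobs to the Archive tier after upload can be done with the azure-storage-blob SDK; the account, container, and file path below are placeholders.

```python
# Sketch: upload a file to a blob container and then move it to the Archive tier.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobServiceClient

service = BlobServiceClient(
    "https://<storage-account>.blob.core.windows.net",
    credential=DefaultAzureCredential(),
)
container = service.get_container_client("server1-files")

with open(r"\\server1\share\report.docx", "rb") as data:
    blob = container.upload_blob(name="report.docx", data=data, overwrite=True)

# Archive minimizes storage cost; rehydration back to Hot/Cool takes hours,
# which still satisfies the 24-hour availability requirement.
blob.set_standard_blob_tier("Archive")
```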

QUESTION 138
You have 100 Microsoft SQL Server Integration Services (SSIS) packages that are configured to use 10 on-premises SQL Server databases as their destinations. You plan to migrate the 10 on-premises databases to Azure SQL Database.
You need to recommend a solution to host the SSIS packages in Azure.
The solution must ensure that the packages can target the SQL Database instances as their destinations.
What should you include in the recommendation?

A. SQL Server Migration Assistant (SSMA)
B. Azure Data Catalog
C. Data Migration Assistant
D. Azure Data Factory

Answer: D
Explanation:
Migrate on-premises SSIS workloads to SSIS using ADF (Azure Data Factory).
When you migrate your database workloads from SQL Server on premises to Azure database services, namely Azure SQL Database or Azure SQL Managed Instance, your ETL workloads on SQL Server Integration Services (SSIS) as one of the primary value-added services will need to be migrated as well.
Azure-SSIS Integration Runtime (IR) in Azure Data Factory (ADF) supports running SSIS packages. Once Azure-SSIS IR is provisioned, you can then use familiar tools, such as SQL Server Data Tools (SSDT)/SQL Server Management Studio (SSMS), and command-line utilities, such as dtinstall/dtutil/dtexec, to deploy and run your packages in Azure.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/scenario-ssis-migration-overview

QUESTION 139
You have an app named App1 that uses two on-premises Microsoft SQL Server databases named DB1 and DB2.
You plan to migrate DB1 and DB2 to Azure.
You need to recommend an Azure solution to host DB1 and DB2. The solution must meet the following requirements:
– Support server-side transactions across DB1 and DB2.
– Minimize administrative effort to update the solution.
What should you recommend?

A. two databases on the same SQL Server instance on an Azure virtual machine
B. two Azure SQL databases on different Azure SQL Database servers
C. two Azure SQL databases in an elastic pool
D. two databases on the same Azure SQL managed instance

Answer: D
Explanation:
Elastic database transactions for Azure SQL Database and Azure SQL Managed Instance allow you to run transactions that span several databases. On a single managed instance, DB1 and DB2 are hosted together, so server-side transactions across the two databases are supported.
SQL Managed Instance enables system administrators to spend less time on administrative tasks because the service either performs them for you or greatly simplifies those tasks.
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/elastic-transactions-overview?view=azuresql-azure
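Because both databases live on the same managed instance, a server-side transaction can span them with ordinary T-SQL. The sketch below is illustrative only; the instance name, tables, and credentials are placeholders.

```python
# Sketch: a single transaction that writes to DB1 and DB2 on the same
# Azure SQL Managed Instance (both inserts commit or roll back together).
import pyodbc

conn = pyodbc.connect(
    "Driver={ODBC Driver 17 for SQL Server};"
    "Server=tcp:<managed-instance>.database.windows.net,1433;"
    "Database=DB1;Uid=<user>;Pwd=<password>;Encrypt=yes;",
    autocommit=False,   # manage an explicit transaction
)
try:
    cur = conn.cursor()
    cur.execute("INSERT INTO DB1.dbo.Orders (Id, Status) VALUES (?, ?)", 1, "Created")
    cur.execute("INSERT INTO DB2.dbo.AuditLog (OrderId, Event) VALUES (?, ?)", 1, "OrderCreated")
    conn.commit()       # both inserts commit together
except Exception:
    conn.rollback()     # ...or neither is applied
    raise
finally:
    conn.close()
```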

QUESTION 140
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
Your company has deployed several virtual machines (VMs) on-premises and to Azure. Azure ExpressRoute has been deployed and configured for on-premises to Azure connectivity.
Several VMs are exhibiting network connectivity issues.
You need to analyze the network traffic to determine whether packets are being allowed or denied to the VMs.
Solution: Use the Azure Traffic Analytics solution in Azure Log Analytics to analyze the network traffic.
Does the solution meet the goal?

A. Yes
B. No

Answer: B
Explanation:
Instead use Azure Network Watcher to run IP flow verify to analyze the network traffic.
Reference:
https://docs.microsoft.com/en-us/azure/network-watcher/network-watcher-monitoring-overview
https://docs.microsoft.com/en-us/azure/network-watcher/network-watcher-ip-flow-verify-overview

QUESTION 141
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You have an Azure Storage account that contains two 1-GB data files named File1 and File2. The data files are set to use the archive access tier.
You need to ensure that File1 is accessible immediately when a retrieval request is initiated.
Solution: For File1, you set Access tier to Cool.
Does this meet the goal?

A. Yes
B. No

Answer: A
Explanation:
Data in the cool tier is intended to be stored for at least 30 days, but this is not a requirement; you can keep data in the cool tier indefinitely. Once File1 has been rehydrated from the archive tier to the cool tier, it is online and can be read immediately when requested. The referenced article even gives an example of large scientific datasets kept in the cool tier for long durations.
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers?tabs=azure-portal

QUESTION 142
You plan to provision a High Performance Computing (HPC) cluster in Azure that will use a third-party scheduler.
You need to recommend a solution to provision and manage the HPC cluster nodes.
What should you include in the recommendation?

A. Azure Lighthouse
B. Azure CycleCloud
C. Azure Purview
D. Azure Automation

Answer: B
Explanation:
You can dynamically provision Azure HPC clusters with Azure CycleCloud. Azure CycleCloud is the simplest way to manage HPC workloads.
Note: Azure CycleCloud is an enterprise-friendly tool for orchestrating and managing High Performance Computing (HPC) environments on Azure. With CycleCloud, users can provision infrastructure for HPC systems, deploy familiar HPC schedulers, and automatically scale the infrastructure to run jobs efficiently at any scale. Through CycleCloud, users can create different types of file systems and mount them to the compute cluster nodes to support HPC workloads.
Reference:
https://docs.microsoft.com/en-us/azure/cyclecloud/overview

QUESTION 143
You have an Azure Data Lake Storage account that contains 1,000 10-MB CSV files and an Azure Synapse Analytics dedicated SQL pool named sql1. You need to load the files to sql1. The solution must meet the following requirements:
– Maximize data load performance.
– Eliminate the need to define external tables before the data loads.
What should you use?

A. the COPY statement
B. PolyBase
C. BCP
D. the SqlBulkCopy object

Answer: A
Explanation:
The COPY statement provides high-throughput loading into a dedicated SQL pool and, unlike PolyBase, does not require external tables to be defined before the data is loaded.

QUESTION 144
You plan to deploy an Azure Databricks Data Science & Engineering workspace and ingest data into the workspace.
Where should you persist the ingested data?

A. Azure Files
B. Azure Data Lake
C. Azure SQL Database
D. Azure Cosmos DB

Answer: B
Explanation:
In an Azure Databricks Data Science & Engineering workspace, ingested data lands in a data lake for long-term persisted storage, in Azure Blob Storage or Azure Data Lake Storage.
Reference:
https://docs.microsoft.com/en-us/azure/databricks/scenarios/what-is-azure-databricks-ws

QUESTION 145
You plan to migrate data to Azure.
The IT department at your company identifies the following requirements:
– The storage must support 1 PB of data.
– The data must be stored in blob storage.
– The storage must support three levels of subfolders.
– The storage must support access control lists (ACLs).
You need to meet the requirements.
What should you use?

A. a premium storage account that is configured for block blobs
B. a general purpose v2 storage account that has hierarchical namespace enabled
C. a premium storage account that is configured for page blobs
D. a premium storage account that is configured for files shares and supports large file shares

Answer: B
Explanation:
Microsoft recommends that you use a GPv2 storage account for most scenarios. It supports up to 5 PB of data and blob storage, including Azure Data Lake Storage Gen2 when the hierarchical namespace is enabled.
Note: A key mechanism that allows Azure Data Lake Storage Gen2 to provide file system performance at object storage scale and prices is the addition of a hierarchical namespace. This allows the collection of objects/files within an account to be organized into a hierarchy of directories and nested subdirectories in the same way that the file system on your computer is organized. With a hierarchical namespace enabled, a storage account becomes capable of providing the scalability and cost-effectiveness of object storage, with file system semantics that are familiar to analytics engines and frameworks.
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-account-overview
https://docs.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-namespace
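To illustrate the hierarchical namespace and ACL requirements, here is a hedged sketch using the azure-storage-file-datalake SDK; the account, file system, folder names, and group object ID are placeholders.

```python
# Sketch: create nested directories in an ADLS Gen2 (HNS-enabled) account
# and set a POSIX-style ACL on one of them.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

service = DataLakeServiceClient(
    "https://<storage-account>.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)
fs = service.get_file_system_client("migrated-data")

# Three levels of subfolders, as the requirements call for.
directory = fs.create_directory("finance/2023/q2")

# Grant a group read/execute access on the folder via an access control list.
directory.set_access_control(
    acl="user::rwx,group::r-x,other::---,group:<aad-group-object-id>:r-x"
)
```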

QUESTION 146
Hotspot Question
A company is looking for a solution that collects a large amount of data from IoT devices and performs real-time analytics.
The data consists of millions of messages that are sent hourly. This incoming data must be processed and then presented on a dashboard. It must also be connected to referential data that provides business information. All data must be stored in native JSON format so that it can be moved to Azure Synapse Analytics SQL Pool.
You need to determine a storage solution for the historical data and a solution to perform real time data aggregation.
What should you recommend? To answer, select the appropriate options from the drop-down menus.

Answer:

Explanation:
You should recommend Azure Data Lake Storage to store historical data because you can load it directly into Azure Synapse. You can create an external table in the data warehouse and use PolyBase to load the data.
You should recommend a Stream Analytics job as a solution for real time data aggregations. You can also join the input stream with reference data and use this information in the analysis.
You should not recommend Azure Blob storage for historical data because you would need to customize the process of loading data into Azure Synapse. Azure Blob storage is generally used to store unstructured data, such as images, videos, and audio files.
You should not recommend Azure Cosmos DB because you would need to customize the process of loading data to Azure Synapse.
You should not recommend Azure Functions as a solution for real time data aggregation. Azure Functions requires custom code to implement business logic. Besides, Azure Functions cannot efficiently send data to the real-time dashboard.
You should not recommend Azure Analysis Services. Azure Analysis Services is a solution that provides enterprise-grade data models in the cloud.

QUESTION 147
Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are designing an Azure solution for a company that has four departments. Each department will deploy several Azure app services and Azure SQL databases.
You need to recommend a solution to report the costs for each department to deploy the app services and the databases. The solution must provide a consolidated view for cost reporting that displays cost broken down by department.
Solution: Create a separate resource group for each department. Place the resources for each department in its respective resource group.
Does this meet the goal?

A. Yes
B. No

Answer: B
Explanation:
Instead, create a resource group for each resource type and assign tags to each resource group.
Note: Tags enable you to retrieve related resources from different resource groups. This approach is helpful when you need to organize resources for billing or management.
Reference:
https://docs.microsoft.com/en-us/azure/azure-resource-manager/resource-group-using-tags
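As a small illustration of the tagging approach, this sketch creates (or updates) resource groups with cost-reporting tags; the subscription, names, and tag values are placeholders.

```python
# Sketch: tag resource groups so Cost Management can group costs by department.
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

client = ResourceManagementClient(DefaultAzureCredential(), "<subscription-id>")

for department in ["finance", "hr", "sales", "it"]:
    client.resource_groups.create_or_update(
        f"rg-{department}-apps",
        {
            "location": "eastus",
            "tags": {"department": department, "costCenter": f"cc-{department}"},
        },
    )
# In Cost Management + Billing, costs can then be filtered or grouped by the
# "department" tag to build the consolidated per-department report.
```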

QUESTION 148
You are designing a solution that will include containerized applications running in an Azure Kubernetes Service (AKS) cluster.
You need to recommend a load balancing solution for HTTPS traffic. The solution must meet the following requirements:
– Automatically configure load balancing rules as the applications are deployed to the cluster.
– Support Azure Web Application Firewall (WAF).
– Support cookie-based affinity.
– Support URL routing.
What should you include in the recommendation?

A. an NGINX ingress controller
B. Application Gateway Ingress Controller (AGIC)
C. an HTTP application routing ingress controller
D. the Kubernetes load balancer service

Answer: B
Explanation:
Much like the most popular Kubernetes Ingress Controllers, the Application Gateway Ingress Controller provides several features, leveraging Azure’s native Application Gateway L7 load balancer.
To name a few:
URL routing
Cookie-based affinity
Secure Sockets Layer (SSL) termination
End-to-end SSL
Support for public, private, and hybrid web sites
Integrated support of Azure web application firewall
Application Gateway redirection support isn't limited to HTTP-to-HTTPS redirection alone. This is a generic redirection mechanism, so you can redirect from and to any port you define using rules, including redirection to an external site.
Reference:
https://docs.microsoft.com/en-us/azure/application-gateway/features
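For reference, an Ingress that AGIC picks up might look like the following. It is sketched here as a Python dict; the host, service name, and namespace are placeholders, and the annotations shown are the documented AGIC ingress-class and cookie-affinity annotations.

```python
# Sketch: an Ingress manifest handled by the Application Gateway Ingress Controller.
# AGIC watches Ingress resources and configures the Application Gateway listeners,
# URL path maps, and backend pools automatically as apps are deployed.
ingress = {
    "apiVersion": "networking.k8s.io/v1",
    "kind": "Ingress",
    "metadata": {
        "name": "store-ingress",
        "namespace": "default",
        "annotations": {
            "kubernetes.io/ingress.class": "azure/application-gateway",
            "appgw.ingress.kubernetes.io/cookie-based-affinity": "true",
        },
    },
    "spec": {
        "rules": [{
            "host": "store.contoso.com",
            "http": {"paths": [{
                "path": "/",
                "pathType": "Prefix",
                "backend": {"service": {"name": "store-svc", "port": {"number": 80}}},
            }]},
        }],
    },
}
# Applying this manifest (e.g. with kubectl or the kubernetes Python client) is enough
# for AGIC to create the corresponding load-balancing rules; WAF is enabled on the
# Application Gateway WAF_v2 SKU itself.
```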

QUESTION 149
A company has the following on-premises data stores:
– A Microsoft SQL Server 2012 database
– A Microsoft SQL Server 2008 database
The data needs to be migrated to Azure.
– Requirement 1 – The data in the Microsoft SQL Server 2012 database needs to be migrated to an Azure SQL database
– Requirement 2 – The data in a table in the Microsoft SQL Server 2008 database needs to be migrated to an Azure CosmosDB account that uses the SQL API
Which of the following should be used to accomplish Requirement 1?

A. AzCopy
B. Azure CosmosDB Data Migration tool
C. Data Management Gateway
D. Data Migration Assistant

Answer: D
Explanation:
Data Migration Assistant can be used to migrate the data. It supports assessing and migrating multiple versions of Microsoft SQL Server, including SQL Server 2012.
Option A is incorrect because AzCopy works with data in Azure Storage accounts.
Option B is incorrect because the Azure Cosmos DB Data Migration tool is used to migrate data to Cosmos DB.
Option C is incorrect because Data Management Gateway is used to build a gateway to the on-premises infrastructure.
Reference:
https://docs.microsoft.com/en-us/sql/dma/dma-overview?view=sql-server-2017

QUESTION 150
You have an application named App1. App1 generates log files that must be archived for five years. The log files must be readable by App1 but must not be modified.
Which storage solution should you recommend for archiving?

A. Ingest the log files into an Azure Log Analytics workspace
B. Use an Azure Blob storage account and a time-based retention policy
C. Use an Azure Blob storage account configured to use the Archive access tier
D. Use an Azure file share that has access control enabled

Answer: B
Explanation:
Immutable storage for Azure Blob storage enables users to store business-critical data objects in a WORM (Write Once, Read Many) state.
Immutable storage supports:
Time-based retention policy support: Users can set policies to store data for a specified interval. When a time-based retention policy is set, blobs can be created and read, but not modified or deleted. After the retention period has expired, blobs can be deleted but not overwritten.
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-immutable-storage

QUESTION 151
You have an Azure subscription that contains a Windows Virtual Desktop tenant.
You need to recommend a solution to meet the following requirements:
– Start and stop Windows Virtual Desktop session hosts based on business hours.
– Scale out Windows Virtual Desktop session hosts when required.
– Minimize compute costs.
What should you include in the recommendation?

A. Microsoft Intune
B. a Windows Virtual Desktop automation task
C. Azure Automation
D. Azure Service Health

Answer: C
Explanation:
Reference:
https://www.ciraltos.com/automatically-start-and-stop-wvd-vms-with-azure-automation/
https://wvdlogix.net/windows-virtual-desktop-host-pool-automation-2
https://getnerdio.com/academy/how-to-optimize-windows-virtual-desktop-wvd-azure-costs-with-event-based-autoscaling-and-azure-vm-scale-sets/
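A runbook along these lines typically just deallocates and starts the session host VMs on a schedule; production scaling logic usually also checks active user sessions before scaling in. Below is a heavily simplified sketch with placeholder subscription, resource group, and VM names.

```python
# Sketch: stop (deallocate) WVD session hosts outside business hours and start
# them again in the morning. Deallocated VMs stop accruing compute charges.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

compute = ComputeManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg = "rg-wvd-hosts"
session_hosts = ["wvd-host-0", "wvd-host-1", "wvd-host-2"]

def scale(business_hours: bool) -> None:
    for vm in session_hosts:
        if business_hours:
            compute.virtual_machines.begin_start(rg, vm).wait()
        else:
            compute.virtual_machines.begin_deallocate(rg, vm).wait()

# An Azure Automation schedule (or logic inside the runbook) decides which branch runs.
scale(business_hours=False)
```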

QUESTION 152
You have an Azure subscription. The subscription contains an app that is hosted in the East US, Central Europe, and East Asia regions.
You need to recommend a data-tier solution for the app. The solution must meet the following requirements:
– Support multiple consistency levels.
– Be able to store at least 1 TB of data.
– Be able to perform read and write operations in the Azure region that is local to the app instance.
What should you include in the recommendation?

A. an Azure Cosmos DB database
B. a Microsoft SQL Server Always On availability group on Azure virtual machines
C. an Azure SQL database in an elastic pool
D. Azure Table storage that uses geo-redundant storage (GRS) replication

Answer: A
Explanation:
Azure Cosmos DB approaches data consistency as a spectrum of choices. This approach includes more
options than the two extremes of strong and eventual consistency. You can choose from five well-defined levels on the consistency spectrum.
With Cosmos DB any write into any region must be replicated and committed to all configured regions within the account.
Reference:
https://docs.microsoft.com/en-us/azure/cosmos-db/consistency-levels-tradeoffs
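To show the multiple-consistency-level point from the client side, the sketch below connects with a relaxed consistency level and prefers the region local to the app instance; the endpoint, key, names, and region are placeholders.

```python
# Sketch: a Cosmos DB client that requests Session consistency and uses its local region.
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(
    "https://<cosmos-account>.documents.azure.com:443/",
    credential="<account-key>",
    consistency_level="Session",          # one of the five selectable levels
    preferred_locations=["East US"],      # read and write against the app's local region
)

db = client.create_database_if_not_exists("appdb")
container = db.create_container_if_not_exists(
    "orders", partition_key=PartitionKey(path="/customerId")
)
container.upsert_item({"id": "1", "customerId": "c-42", "total": 18.50})
```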

QUESTION 153
The accounting department at your company migrates to a new financial accounting software. The accounting department must keep file-based database backups for seven years for compliance purposes. It is unlikely that the backups will be used to recover data.
You need to move the backups to Azure. The solution must minimize costs.
Where should you store the backups?

A. Azure Blob storage that uses the Archive tier
B. Azure SQL Database
C. Azure Blob storage that uses the Cool tier
D. a Recovery Services vault

Answer: A
Explanation:
The Archive tier is optimized for data that is rarely accessed, can tolerate several hours of retrieval latency, and is stored for at least 180 days. It has the lowest storage cost of all the access tiers, so it minimizes costs for compliance backups that are unlikely to be restored.
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/storage-blob-storage-tiers

QUESTION 154
You have an Azure subscription.
You need to deploy an Azure Kubernetes Service (AKS) solution that will use Windows Server 2019 nodes.
The solution must meet the following requirements:
– Minimize the time it takes to provision compute resources during scale-out operations.
– Support autoscaling of Windows Server containers.
Which scaling option should you recommend?

A. cluster autoscaler
B. horizontal pod autoscaler
C. Kubernetes version 1.20.2 or newer
D. Virtual nodes with Virtual Kubelet ACI

Answer: A
Explanation:
Virtual nodes are based on Virtual Kubelet and schedule pods onto Azure Container Instances (ACI). Although they provision compute very quickly, they support only Linux containers and therefore cannot autoscale Windows Server 2019 containers. The cluster autoscaler does work with Windows Server node pools, so it is the option that meets the requirements.

Note: AKS clusters can scale in one of two ways:
The cluster autoscaler watches for pods that can’t be scheduled on nodes because of resource constraints. The cluster then automatically increases the number of nodes. The horizontal pod autoscaler uses the Metrics Server in a Kubernetes cluster to monitor the resource demand of pods. If an application needs more resources, the number of pods is automatically increased to meet the demand.
Reference:
https://docs.microsoft.com/en-us/azure/aks/concepts-scale

QUESTION 155
You need to design a highly available Azure SQL database that meets the following requirements:
– Failover between replicas of the database must occur without any data loss.
– The database must remain available in the event of a zone outage.
– Costs must be minimized
Which deployment option should you use?

A. Azure SQL Database Standard
B. Azure SQL Database Serverless
C. Azure SQL Managed Instance General Purpose
D. Azure SQL Database Premium

Answer: D
Explanation:
Azure SQL Database Premium tier supports multiple redundant replicas for each database that are automatically provisioned in the same datacenter within a region. This design leverages the SQL Server AlwaysON technology and provides resilience to server failures with 99.99% availability SLA and RPO=0.
With the introduction of Azure Availability Zones, we are happy to announce that SQL Database now offers built-in support of Availability Zones in its Premium service tier.
Reference:
https://azure.microsoft.com/en-us/blog/azure-sql-database-now-offers-zone-redundant-premium-databases-and-elastic-pools/

QUESTION 156
You need to design a highly available Azure SQL database that meets the following requirements:
– Failover between replicas of the database must occur without any data loss.
– The database must remain available in the event of a zone outage.
– Costs must be minimized.
Which deployment option should you use?

A. Azure SQL Database Premium
B. Azure SQL Database Hyperscale
C. Azure SQL Database Basic
D. Azure SQL Managed Instance Business Critical

Answer: A
Explanation:
Azure SQL Database Premium tier supports multiple redundant replicas for each database that are automatically provisioned in the same datacenter within a region. This design leverages the SQL Server AlwaysON technology and provides resilience to server failures with 99.99% availability SLA and RPO=0.
With the introduction of Azure Availability Zones, we are happy to announce that SQL Database now offers built-in support of Availability Zones in its Premium service tier.
Reference:
https://azure.microsoft.com/en-us/blog/azure-sql-database-now-offers-zone-redundant-premium-databases-and-elastic-pools/

QUESTION 157
You plan to migrate App1 to Azure. The solution must meet the authentication and authorization requirements.
Which endpoint should App1 use to obtain an access token?

A. Microsoft identity platform
B. Azure AD
C. Azure Instance Metadata Service (IMDS)
D. Azure Service Management

Answer: A

QUESTION 158
You plan to automate the deployment of resources to Azure subscriptions.
What is a difference between using Azure Blueprints and Azure Resource Manager templates?

A. Azure Resource Manager templates remain connected to the deployed resources.
B. Only Azure Resource Manager templates can contain policy definitions.
C. Azure Blueprints remain connected to the deployed resources.
D. Only Azure Blueprints can contain policy definitions.

Answer: C
Explanation:
With Azure Blueprints, the relationship between the blueprint definition (what should be deployed) and the blueprint assignment (what was deployed) is preserved. This connection supports improved tracking and auditing of deployments. Azure Blueprints can also upgrade several subscriptions at once that are governed by the same blueprint.
Reference:
https://docs.microsoft.com/en-us/answers/questions/26851/how-is-azure-blue-prints-different-from-resource-m.html

QUESTION 159
You have an Azure subscription that contains an Azure SQL database.
You are evaluating whether to use Azure reservations on the Azure SQL database.
Which tool should you use to estimate the potential savings?

A. The Purchase reservations blade in the Azure portal
B. The Advisor blade in the Azure portal
C. The SQL database blade in the Azure portal

Answer: A
Explanation:
Buy reserved capacity
Sign in to the Azure portal.
Select All services > Reservations.
Select Add and then in the Purchase Reservations pane, select SQL Database to purchase a new reservation for SQL Database.
Fill in the required fields. Existing databases in SQL Database and SQL Managed Instance that match the attributes you select qualify to get the reserved capacity discount. The actual number of databases or managed instances that get the discount depends on the scope and quantity selected.

Review the cost of the capacity reservation in the Costs section.
Select Purchase.
Select View this Reservation to see the status of your purchase.
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/reserved-capacity-overview

QUESTION 160
You have an Azure subscription that contains an Azure SQL database.
You plan to use Azure reservations on the Azure SQL database.
To which resource type will the reservation discount be applied?

A. vCore compute
B. DTU compute
C. Storage
D. License

Answer: A
Explanation:
Quantity: The amount of compute resources being purchased within the capacity reservation. The quantity is a number of vCores in the selected Azure region and Performance tier that are being reserved and will get the billing discount. For example, if you run or plan to run multiple databases with the total compute capacity of Gen5 16 vCores in the East US region, then you would specify the quantity as 16 to maximize the benefit for all the databases.
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/reserved-capacity-overview

QUESTION 161
You are designing an Azure Cosmos DB solution that will host multiple writable replicas in multiple Azure regions.
You need to recommend the strongest database consistency level for the design. The solution must meet the following requirements:
– Provide a latency-based Service Level Agreement (SLA) for writes.
– Support multiple regions.
Which consistency level should you recommend?

A. bounded staleness
B. strong
C. session
D. consistent prefix

Answer: A
Explanation:
Each consistency level provides availability and performance tradeoffs. The levels form a spectrum from strong to eventual consistency, with bounded staleness, session, and consistent prefix in between. Strong consistency is not available for accounts configured with multiple write regions, so bounded staleness is the strongest level that supports multiple writable replicas and is covered by the write-latency SLA.
Note: The service offers comprehensive 99.99% SLAs which covers the guarantees for throughput, consistency, availability and latency for the Azure Cosmos DB Database Accounts scoped to a single Azure region configured with any of the five Consistency Levels or Database Accounts spanning multiple Azure regions, configured with any of the four relaxed Consistency Levels.
Reference:
https://azure.microsoft.com/en-us/support/legal/sla/cosmos-db/v1_3/
https://docs.microsoft.com/en-us/azure/cosmos-db/consistency-levels#consistency-levels-and-latency

QUESTION 162
Your company has offices in the United States, Europe, Asia, and Australia.
You have an on-premises app named App1 that uses Azure Table storage. Each office hosts a local instance of App1.
You need to upgrade the storage for App1. The solution must meet the following requirements:
– Enable simultaneous write operations in multiple Azure regions.
– Ensure that write latency is less than 10 ms.
– Support indexing on all columns.
– Minimize development effort.
Which data platform should you use?

A. Azure SQL Database
B. Azure SQL Managed Instance
C. Azure Cosmos DB
D. Table storage that uses geo-zone-redundant storage (GZRS) replication

Answer: C
Explanation:
Azure Cosmos DB (Table API) provides:
- Multi-region writes, so each office can perform simultaneous write operations in its local Azure region.
- Single-digit millisecond latency for reads and writes, backed by <10-ms read and <15-ms write latency at the 99th percentile.
- Automatic and complete indexing on all properties, with no index management.
- Turnkey global distribution from one to 30+ regions, with support for automatic and manual failovers.
Because the Table API is compatible with the Azure Table storage SDKs, App1 can be migrated with minimal development effort. Table storage that uses GZRS replicates data for durability only; it does not enable simultaneous multi-region writes or indexing on all columns.
Reference:
https://docs.microsoft.com/en-us/azure/cosmos-db/table-support
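Because the Table API is wire-compatible with the Table storage SDKs, App1's data-access code changes very little. The sketch below uses the azure-data-tables package against a Cosmos DB Table API endpoint; the account name, key, table, and entity values are placeholders.

```python
# Sketch: the same azure-data-tables code works against Azure Table storage and
# the Cosmos DB Table API; only the endpoint and credential change.
from azure.core.credentials import AzureNamedKeyCredential
from azure.data.tables import TableServiceClient

credential = AzureNamedKeyCredential("<cosmos-account>", "<account-key>")
service = TableServiceClient(
    endpoint="https://<cosmos-account>.table.cosmos.azure.com:443/",
    credential=credential,
)

table = service.create_table_if_not_exists("App1Data")
table.upsert_entity({
    "PartitionKey": "office-europe",
    "RowKey": "order-1001",
    "Status": "Shipped",       # every property is indexed automatically
})
print(table.get_entity("office-europe", "order-1001"))
```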

QUESTION 163
You plan to archive 10 TB of on-premises data files to Azure.
You need to recommend a data archival solution. The solution must minimize the cost of storing the data files.
Which Azure Storage account type should you include in the recommendation?

A. Standard StorageV2 (general purpose v2)
B. Standard Storage (general purpose v1)
C. Premium StorageV2 (general purpose v2)
D. Premium Storage (general purpose v1)

Answer: A
Explanation:
Standard StorageV2 supports the Archive access tier, which would be the cheapest solution.
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-introduction

QUESTION 164
You plan to move a web application named App1 from an on-premises data center to Azure.
App1 depends on a custom COM component that is installed on the host server.
You need to recommend a solution to host App1 in Azure. The solution must meet the following requirements:
– App1 must be available to users if an Azure data center becomes unavailable.
– Costs must be minimized.
What should you include in the recommendation?

A. In two Azure regions, deploy a load balancer and a virtual machine scale set.
B. In two Azure regions, deploy a Traffic Manager profile and a web app.
C. In two Azure regions, deploy a load balancer and a web app.
D. Deploy a load balancer and a virtual machine scale set across two availability zones.

Answer: D
Explanation:
Azure App Service does not allow the registration of COM components on the platform. If your app makes use of any COM components, they need to be rewritten in managed code and deployed with the site or application, so the web app options do not apply here. Deploying a load balancer and a virtual machine scale set across two availability zones keeps App1 available if an Azure data center becomes unavailable, at a lower cost than deploying to two regions.
Note: If your app cannot be migrated directly to App Service, consider App Service using Windows Containers, which enables usage of the GAC, COM components, MSIs, full access to .NET FX APIs, DirectX, and more.
Reference:
https://docs.microsoft.com/en-us/dotnet/azure/migration/app-service#com-and-com-components
https://docs.microsoft.com/en-us/dotnet/azure/migration/app-service

QUESTION 165
You have an Azure subscription.
You need to deploy an Azure Kubernetes Service (AKS) solution that will use Linux nodes. The solution must meet the following requirements:
– Minimize the time it takes to provision compute resources during scale-out operations.
– Support autoscaling of Linux containers.
– Minimize administrative effort.
Which scaling option should you recommend?

A. Virtual Kubelet
B. cluster autoscaler
C. horizontal pod autoscaler
D. AKS virtual nodes

Answer: D
Explanation:
https://docs.microsoft.com/en-us/azure/aks/virtual-nodes


Resources From:

1.2023 Latest Braindump2go AZ-305 Exam Dumps (PDF & VCE) Free Share:
https://www.braindump2go.com/az-305.html

2.2023 Latest Braindump2go AZ-305 PDF and AZ-305 VCE Dumps Free Share:
https://drive.google.com/drive/folders/1ZjEqbV6mv15IiTS1GH-Im-8DkoxHa8tW?usp=sharing

3.2023 Free Braindump2go AZ-305 Exam Questions Download:
https://www.braindump2go.com/free-online-pdf/AZ-305-PDF-Dumps(125-165).df.pdf

Free Resources from Braindump2go - We Are Devoted to Helping You 100% Pass All Exams!

|                      | Braindump2go | Testking | Pass4sure | Actualtests | Others        |
| Price                | $99.99       | $124.99  | $125.99   | $189        | $29.99/$49.99 |
| Up-to-Dated          | ✔            | ✖        | ✖         | ✖           | ✖             |
| Real Questions       | ✔            | ✖        | ✖         | ✖           | ✖             |
| Error Correction     | ✔            | ✖        | ✖         | ✖           | ✖             |
| Printable PDF        | ✔            | ✖        | ✖         | ✖           | ✖             |
| Premium VCE          | ✔            | ✖        | ✖         | ✖           | ✖             |
| VCE Simulator        | ✔            | ✖        | ✖         | ✖           | ✖             |
| One Time Purchase    | ✔            | ✖        | ✖         | ✖           | ✖             |
| Instant Download     | ✔            | ✖        | ✖         | ✖           | ✖             |
| Unlimited Install    | ✔            | ✖        | ✖         | ✖           | ✖             |
| 100% Pass Guarantee  | ✔            | ✖        | ✖         | ✖           | ✖             |
| 100% Money Back      | ✔            | ✖        | ✖         | ✖           | ✖             |