AI-100 | Most Recent AI-100 Preparation Exams For Designing And Implementing An Azure AI Solution Certification

All that matters here is passing the Microsoft AI-100 exam, and all you need is a high score on the AI-100 Designing and Implementing an Azure AI Solution exam. The only thing you need to do is download the Certleader AI-100 exam study guides now. We will not let you down, and we back that with our money-back guarantee.

Free online AI-100 questions and answers from the new version:

NEW QUESTION 1

You deploy an application that performs sentiment analysis on the data stored in Azure Cosmos DB.
Recently, you loaded a large amount of data into the database. The data was for a customer named Contoso, Ltd. You discover that queries for the Contoso data are slow to complete, and that the queries slow the entire application.
You need to reduce the amount of time it takes for the queries to complete. The solution must minimize costs.
What is the best way to achieve the goal? More than one answer choice may achieve the goal. Select the BEST answer.

  • A. Change the request units.
  • B. Change the partitioning strategy.
  • C. Change the transaction isolation level.
  • D. Migrate the data to the Cosmos DB database.

Answer: B

Explanation:
References:
https://docs.microsoft.com/en-us/azure/architecture/best-practices/data-partitioning
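To illustrate the partitioning idea, here is a minimal sketch using the azure-cosmos Python SDK that creates a container partitioned by a hypothetical customerId property and issues a single-partition query. The account URL, key, database, container, and property names are placeholders, not part of the question.

```python
from azure.cosmos import CosmosClient, PartitionKey

# Placeholder endpoint and key; replace with real values.
client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
database = client.create_database_if_not_exists("sales")

# Partitioning by customer keeps each customer's data in its own logical partition.
container = database.create_container_if_not_exists(
    id="sentiment",
    partition_key=PartitionKey(path="/customerId"),
)

# Supplying the partition key scopes the query to one partition and avoids a
# costly cross-partition fan-out, which is what slows the Contoso queries.
items = container.query_items(
    query="SELECT * FROM c WHERE c.customerId = 'Contoso'",
    partition_key="Contoso",
)
for item in items:
    print(item["id"])
```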

NEW QUESTION 2

You are configuring data persistence for a Microsoft Bot Framework application. The application requires a structured NoSQL cloud data store.
You need to identify a storage solution for the application. The solution must minimize costs. What should you identify?

  • A. Azure Blob storage
  • B. Azure Cosmos DB
  • C. Azure HDInsight
  • D. Azure Table storage

Answer: D

Explanation:
Table storage is a NoSQL key-value store for rapid development using massive semi-structured datasets. You can develop applications on Cosmos DB using popular NoSQL APIs.
The two services target different scenarios and have different pricing models.
Azure Table storage is aimed at high capacity in a single region (with an optional secondary read-only region, but no failover), indexing by partition key and row key, and storage-optimized pricing. The Azure Cosmos DB Table API aims for high throughput (single-digit-millisecond latency), global distribution (multiple failover regions), SLA-backed predictable performance with automatic indexing of every attribute/property, and a pricing model focused on throughput.
References:
https://db-engines.com/en/system/Microsoft+Azure+Cosmos+DB%3BMicrosoft+Azure+Table+Storage
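As a rough illustration of the Table storage option, the sketch below uses the azure-data-tables Python package to persist bot state as a key-value entity. The table name, PartitionKey/RowKey values, and property names are illustrative assumptions.

```python
from azure.data.tables import TableServiceClient

# Placeholder connection string for the storage account.
service = TableServiceClient.from_connection_string("<storage-connection-string>")
table = service.create_table_if_not_exists("botstate")

# PartitionKey and RowKey are the only indexed columns in Table storage,
# so choose them to match how the bot looks state up (user + conversation here).
table.upsert_entity({
    "PartitionKey": "user-1234",
    "RowKey": "conversation-5678",
    "lastIntent": "BookRoom",
})
```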

NEW QUESTION 3

You are designing a solution that uses drones to monitor remote locations for anomalies. The drones have Azure IoT Edge devices. The solution must meet the following requirements:
•Email a user the picture and location of an anomaly when an anomaly is detected.
•Use a video stream to detect anomalies at the location.
•Send the pictures and location information to Azure.
•Use the least amount of code possible.
You develop a custom vision Azure Machine Learning module to detect the anomalies.
Which service should you use for each requirement? To answer, drag the appropriate services to the correct requirements. Each service may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
AI-100 dumps exhibit

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
Box 1: Azure IoT Edge
Example: You configure the Remote Monitoring solution to respond to anomalies detected by an IoT Edge device. IoT Edge devices let you process telemetry at the edge to reduce the volume of telemetry sent to the solution and to enable faster responses to events on devices.
Box 2: Azure Functions
Box 3: Azure Logic Apps
References:
https://docs.microsoft.com/en-us/azure/iot-accelerators/iot-accelerators-remote-monitoring-edge

NEW QUESTION 4

You are designing an AI solution in Azure that will perform image classification.
You need to identify which processing platform will provide you with the ability to update the logic over time. The solution must have the lowest latency for inferencing without having to batch.
Which compute target should you identify?

  • A. graphics processing units (GPUs)
  • B. field-programmable gate arrays (FPGAs)
  • C. central processing units (CPUs)
  • D. application-specific integrated circuits (ASICs)

Answer: B

Explanation:
FPGAs, such as those available on Azure, provide performance close to ASICs. They are also flexible and reconfigurable over time, to implement new logic.

NEW QUESTION 5

Your company has an Azure subscription that contains an Azure Active Directory (Azure AD) tenant. Azure AD contains 500 user accounts for your company's employees. Some temporary employees do NOT have user accounts in Azure AD.
You are designing a storage solution for video files and metadata files. You plan to deploy an application to perform analysis of the metadata files.
You need to recommend an authentication solution to provide links to the video files. The solution must provide access to each file for only five minutes.
What should you include in the recommendation?

  • A. Secondary Storage Key
  • B. Primary Storage Key
  • C. Shared Access Signature
  • D. Azure Active Directory

Answer: C

Explanation:
References:
https://docs.microsoft.com/en-us/azure/storage/common/storage-dotnet-shared-access-signature-part-1
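A minimal sketch of the shared access signature approach, assuming the azure-storage-blob Python package; the account, container, and blob names are placeholders. The key point is the five-minute expiry on the token.

```python
from datetime import datetime, timedelta
from azure.storage.blob import BlobSasPermissions, generate_blob_sas

# Generate a read-only SAS token that expires five minutes from now.
sas = generate_blob_sas(
    account_name="<account>",
    container_name="videos",
    blob_name="clip001.mp4",
    account_key="<account-key>",
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(minutes=5),
)

# Append the token to the blob URL to produce the time-limited link.
url = f"https://<account>.blob.core.windows.net/videos/clip001.mp4?{sas}"
print(url)
```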

NEW QUESTION 6

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question in this section, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You create several AI models in Azure Machine Learning Studio. You deploy the models to a production environment.
You need to monitor the compute performance of the models.
Solution: You enable AppInsights diagnostics.
Does this meet the goal?

  • A. Yes
  • B. No

Answer: B

Explanation:
You need to enable Model data collection.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/service/how-to-enable-data-collection
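For context, the linked article describes adding data collectors to the scoring script. The sketch below shows that pattern, assuming the azureml-monitoring package is installed; the model name, feature names, and the parse/model helpers are hypothetical.

```python
from azureml.monitoring import ModelDataCollector

def init():
    global inputs_dc, prediction_dc
    # One collector for incoming features, one for the model output.
    inputs_dc = ModelDataCollector("my_model", designation="inputs",
                                   feature_names=["feature1", "feature2"])
    prediction_dc = ModelDataCollector("my_model", designation="predictions",
                                       feature_names=["prediction"])

def run(raw_data):
    data = parse(raw_data)          # hypothetical helper to deserialize the request
    result = model.predict(data)    # model would be loaded in init() in a full script
    inputs_dc.collect(data)         # streams the inputs to Blob storage for monitoring
    prediction_dc.collect(result)   # streams the predictions to Blob storage
    return result.tolist()
```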

NEW QUESTION 7

You are designing a solution that will ingest temperature data from IoT devices, calculate the average temperature, and then take action based on the aggregated data. The solution must meet the following requirements:
•Minimize the amount of uploaded data.
• Take action based on the aggregated data as quickly as possible.
What should you include in the solution? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
AI-100 dumps exhibit

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
Box 1: Azure Functions
Azure Functions is a serverless service that hosts functions (small pieces of code), which can be used, for example, in event-driven applications.
A general rule is always difficult because everything depends on your requirements, but if you have to analyze a data stream, look at Azure Stream Analytics, and if you want to implement a serverless event-driven or timer-based application, look at Azure Functions or Logic Apps.
Note: Azure IoT Edge allows you to deploy complex event processing, machine learning, image recognition, and other high value AI without writing it in-house. Azure services like Azure Functions, Azure Stream Analytics, and Azure Machine Learning can all be run on-premises via Azure IoT Edge.
Box 2: An Azure IoT Edge device
Azure IoT Edge moves cloud analytics and custom business logic to devices so that your organization can focus on business insights instead of data management.
References:
https://docs.microsoft.com/en-us/azure/iot-edge/about-iot-edge
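As a rough sketch of the "take action on the aggregated data" half, the Azure Function below (Python programming model, with an Event Hub trigger bound in function.json to the IoT hub's built-in Event Hub-compatible endpoint, not shown) reads the aggregated message and reacts to it. The avgTemperature field and the 75-degree threshold are assumptions.

```python
import json
import logging
import azure.functions as func

def main(event: func.EventHubEvent):
    # Each event carries the aggregated temperature computed at the edge.
    body = json.loads(event.get_body().decode("utf-8"))

    if body.get("avgTemperature", 0) > 75:
        # Placeholder action: in practice this could call another service,
        # raise an alert, or queue a work item.
        logging.warning("High average temperature reported: %s", body)
```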

NEW QUESTION 8

You plan to design a solution for an AI implementation that uses data from IoT devices.
You need to recommend a data storage solution for the IoT devices that meets the following requirements:
•Allow data to be queried in real-time as it streams into the solution.
•Provide the lowest amount of latency for loading data into the solution.
What should you include in the recommendation?

  • A. a Microsoft Azure SQL database that has In-Memory OLTP enabled
  • B. a Microsoft Azure HDInsight R Server cluster
  • C. a Microsoft Azure Table Storage solution
  • D. a Microsoft Azure HDInsight Hadoop cluster

Answer: D

Explanation:
Internet of Things (IoT): You can use HDInsight to process streaming data that's received in real time from a variety of devices.
You can use HDInsight to build applications that extract critical insights from data. You can also use Azure Machine Learning on top of that to predict future trends for your business.
By combining enterprise-scale R analytics software with the power of Apache Hadoop and Apache Spark, Microsoft R Server for HDInsight gives you the scale and performance you need. Multi-threaded math libraries and transparent parallelization in R Server handle up to 1000x more data and up to 50x faster speeds than open-source R, which helps you to train more accurate models for better predictions.
References:
https://docs.microsoft.com/en-us/azure/hdinsight/hadoop/apache-hadoop-introduction

NEW QUESTION 9

You have thousands of images that contain text.
You need to process the text from the images into a machine-readable character stream. Which Azure Cognitive Services service should you use?

  • A. Translator Text
  • B. Text Analytics
  • C. Computer Vision
  • D. the Image Moderation API

Answer: C

Explanation:
With Computer Vision you can detect text in an image using optical character recognition (OCR) and extract the recognized words into a machine-readable character stream.
References:
https://azure.microsoft.com/en-us/services/cognitive-services/computer-vision/
https://docs.microsoft.com/en-us/azure/cognitive-services/content-moderator/image-moderation-api
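A minimal sketch of calling the Computer Vision OCR operation over REST with the requests library; the endpoint region, API version (v2.0 here), and file name are assumptions. The parsing step simply flattens the regions/lines/words hierarchy of the response into a character stream.

```python
import requests

endpoint = "https://<region>.api.cognitive.microsoft.com"  # placeholder region
key = "<computer-vision-key>"                              # placeholder key

with open("photo.jpg", "rb") as f:
    image = f.read()

resp = requests.post(
    f"{endpoint}/vision/v2.0/ocr",
    params={"language": "unk", "detectOrientation": "true"},
    headers={
        "Ocp-Apim-Subscription-Key": key,
        "Content-Type": "application/octet-stream",
    },
    data=image,
)
resp.raise_for_status()

# Flatten regions -> lines -> words into a single machine-readable string.
text = " ".join(
    word["text"]
    for region in resp.json().get("regions", [])
    for line in region["lines"]
    for word in line["words"]
)
print(text)
```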

NEW QUESTION 10

You plan to use the Microsoft Bot Framework to develop bots that will be deployed by using the Azure Bot Service.
You need to configure the Azure Bot Service to support the following types of bots:
•Bots that use Azure Functions
•Bots that set a timer
Which template should you use for each bot type? To answer, drag the appropriate templates to the correct bot types. Each template may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.
NOTE: Each correct selection is worth one point.
AI-100 dumps exhibit

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
References:
https://docs.microsoft.com/en-us/azure/bot-service/bot-service-concept-templates?view=azure-bot-service-3.0

NEW QUESTION 11

You have an AI application that uses keys in Azure Key Vault.
Recently, a key used by the application was deleted accidentally and was unrecoverable.
You need to ensure that if a key is deleted, it is retained in the key vault for 90 days.
Which two features should you configure? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  • A. the expiration date on the keys
  • B. soft delete
  • C. purge protection
  • D. auditors
  • E. the activation date on the keys

Answer: BC

Explanation:
References:
https://docs.microsoft.com/en-us/azure/key-vault/general/soft-delete-overview

NEW QUESTION 12

Your company plans to implement an AI solution that will analyze data from IoT devices.
Data from the devices will be analyzed in real time. The results of the analysis will be stored in a SQL database.
You need to recommend a data processing solution that uses the Transact-SQL language. Which data processing solution should you recommend?

  • A. Azure Stream Analytics
  • B. SQL Server Integration Services (SSIS)
  • C. Azure Event Hubs
  • D. Azure Machine Learning

Answer: A

Explanation:
References:
https://www.linkedin.com/pulse/getting-started-azure-iot-services-stream-analytics-rob-tiffany
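To show why Stream Analytics fits the Transact-SQL requirement, the sketch below holds a representative job query as a plain string (it would be pasted into the job's query pane in the portal); the input/output aliases, field names, and window size are illustrative assumptions.

```python
# Representative Stream Analytics query written in its Transact-SQL-like language.
# "iothubinput" and "sqloutput" are placeholder aliases for the job's configured
# input (the IoT device stream) and output (the Azure SQL database).
STREAM_ANALYTICS_QUERY = """
SELECT
    deviceId,
    AVG(temperature) AS avgTemperature,
    System.Timestamp() AS windowEnd
INTO sqloutput
FROM iothubinput TIMESTAMP BY eventTime
GROUP BY deviceId, TumblingWindow(second, 60)
"""
```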

NEW QUESTION 13

You are designing an AI application that will use an Azure Machine Learning Studio experiment. The source data contains more than 200 TB of relational tables. The experiment will run once a month.
You need to identify a data storage solution for the application. The solution must minimize compute costs.
Which data storage solution should you identify?

  • A. Azure Database for MySQL
  • B. Azure SQL Database
  • C. Azure SQL Data Warehouse

Answer: B

Explanation:
References:
https://azure.microsoft.com/en-us/pricing/details/sql-database/single/

NEW QUESTION 14

You have a database that contains sales data.
You plan to process the sales data by using two data streams named Stream1 and Stream2. Stream1 will be used for purchase order data. Stream2 will be used for reference data.
The reference data is stored in CSV files.
You need to recommend an ingestion solution for each data stream.
What two solutions should you recommend? Each correct answer is a complete solution.
NOTE: Each correct selection is worth one point.

  • A. an Azure event hub for Stream1 and Azure Blob storage for Stream2
  • B. Azure Blob storage for Stream1 and Stream2
  • C. an Azure event hub for Stream1 and Stream2
  • D. Azure Blob storage for Stream1 and Azure Cosmos DB for Stream2
  • E. Azure Cosmos DB for Stream1 and an Azure event hub for Stream2

Answer: AB

Explanation:
Stream1: an Azure event hub. Stream2: Azure Blob storage.
Azure Event Hubs is a highly scalable data streaming platform and event ingestion service, capable of receiving and processing millions of events per second. Event Hubs can process and store events, data, or telemetry produced by distributed software and devices. Data sent to an event hub can be transformed and stored using any real-time analytics provider or batching/storage adapters. Event Hubs provides publish-subscribe capabilities with low latency at massive scale, which makes it appropriate for big data scenarios.
Stream1 and Stream2: Azure Blob storage.
Stream Analytics has first-class integration with Azure data streams as inputs from three kinds of resources:
•Azure Event Hubs
•Azure IoT Hub
•Azure Blob storage
These input resources can live in the same Azure subscription as your Stream Analytics job or a different subscription.
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/real-time-ingestion
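A minimal sketch of sending purchase-order events to the Stream1 event hub with the azure-eventhub Python package (v5); the connection string, hub name, and payload are placeholders.

```python
from azure.eventhub import EventData, EventHubProducerClient

# Placeholder namespace connection string and hub name.
producer = EventHubProducerClient.from_connection_string(
    "<event-hub-namespace-connection-string>",
    eventhub_name="purchase-orders",
)

with producer:
    # Batch events before sending to stay within size limits efficiently.
    batch = producer.create_batch()
    batch.add(EventData('{"orderId": 1, "amount": 42.0}'))
    producer.send_batch(batch)
```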

NEW QUESTION 15

You plan to implement a new data warehouse for a planned AI solution. You have the following information regarding the data warehouse:
•The data files will be available in one week.
•Most queries that will be executed against the data warehouse will be ad-hoc queries.
•The schemas of data files that will be loaded to the data warehouse will change often.
•One month after the planned implementation, the data warehouse will contain 15 TB of data.
You need to recommend a database solution to support the planned implementation.
What two solutions should you include in the recommendation? Each correct answer is a complete solution.
NOTE: Each correct selection is worth one point.

  • A. Apache Hadoop
  • B. Apache Spark
  • C. a Microsoft Azure SQL database
  • D. an Azure virtual machine that runs Microsoft SQL Server

Answer: AB

NEW QUESTION 16

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not appear in the review screen.
You are deploying an Azure Machine Learning model to an Azure Kubernetes Service (AKS) container. You need to monitor the accuracy of each run of the model.
Solution: You configure Azure Monitor for containers.
Does this meet the goal?

  • A. Yes
  • B. No

Answer: B

NEW QUESTION 17

You are developing a mobile application that will perform optical character recognition (OCR) from photos. The application will annotate the photos by using metadata, store the photos in Azure Blob storage, and then score the photos by using an Azure Machine Learning model.
What should you use to process the data?

  • A. Azure Event Hubs
  • B. Azure Functions
  • C. Azure Stream Analytics
  • D. Azure Logic Apps

Answer: A

NEW QUESTION 18

Your company recently deployed several hardware devices that contain sensors.
The sensors generate new data on an hourly basis. The data generated is stored on-premises and retained for several years.
During the past two months, the sensors generated 300 GB of data.
You plan to move the data to Azure and then perform advanced analytics on the data. You need to recommend an Azure storage solution for the data.
Which storage solution should you recommend?

  • A. Azure Queue storage
  • B. Azure Cosmos DB
  • C. Azure Blob storage
  • D. Azure SQL Database

Answer: C

Explanation:
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/data-storage
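As a small illustration of landing the sensor files in Blob storage before running advanced analytics, the sketch below uses the azure-storage-blob Python package; the container, folder, and file names are assumptions.

```python
from azure.storage.blob import BlobServiceClient

# Placeholder connection string for the target storage account.
service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("sensor-data")
# container.create_container()  # uncomment on first run

# Upload one of the hourly sensor exports as a block blob.
with open("sensors-2019-06-01.csv", "rb") as f:
    container.upload_blob(name="raw/sensors-2019-06-01.csv", data=f)
```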

NEW QUESTION 19

You are designing a solution that will ingest data from an Azure IoT Edge device, preprocess the data in Azure Machine Learning, and then move the data to Azure HDInsight for further processing.
What should you include in the solution? To answer, select the appropriate options in the answer area. NOTE: Each correct selection is worth one point.
AI-100 dumps exhibit

  • A. Mastered
  • B. Not Mastered

Answer: A

Explanation:
Box 1: Export Data
Use the Export data to Hive option in the Export Data module in Azure Machine Learning Studio. This option is useful when you are working with very large datasets and want to save your machine learning experiment data to a Hadoop cluster or HDInsight distributed storage.
Box 2: Apache Hive
Apache Hive is a data warehouse system for Apache Hadoop. Hive enables data summarization, querying, and analysis of data. Hive queries are written in HiveQL, which is a query language similar to SQL.
Box 3: Azure Data Lake
Default storage for the HDFS file system of HDInsight clusters can be associated with either an Azure Storage account or an Azure Data Lake Storage.
References:
https://docs.microsoft.com/en-us/azure/machine-learning/studio-module-reference/export-to-hive-query
https://docs.microsoft.com/en-us/azure/hdinsight/hadoop/hdinsight-use-hive

NEW QUESTION 20

You design an AI solution that uses an Azure Stream Analytics job to process data from an Azure IoT hub. The IoT hub receives time series data from thousands of IoT devices at a factory.
The job outputs millions of messages per second. Different applications consume the messages as they are available. The messages must be purged.
You need to choose an output type for the job.
What is the best output type to achieve the goal? More than one answer choice may achieve the goal.

  • A. Azure Event Hubs
  • B. Azure SQL Database
  • C. Azure Blob storage
  • D. Azure Cosmos DB

Answer: D

Explanation:
Stream Analytics can target Azure Cosmos DB for JSON output, enabling data archiving and low-latency queries on unstructured JSON data.
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-documentdb-output

NEW QUESTION 21

You need to design an application that will analyze real-time data from financial feeds. The data will be ingested into Azure IoT Hub. The data must be processed as quickly as possible in the order in which it is ingested.
Which service should you include in the design?

  • A. Azure Event Hubs
  • B. Azure Data Factory
  • C. Azure Stream Analytics
  • D. Apache Kafka

Answer: D

NEW QUESTION 22

Which two services should be implemented so that Butler can find available rooms based on the technical requirements? Each correct answer presents part of the solution.
NOTE: Each correct selection is worth one point.

  • A. QnA Maker
  • B. Bing Entity Search
  • C. Language Understanding (LUIS)
  • D. Azure Search
  • E. Content Moderator

Answer: AC

Explanation:
References:
https://azure.microsoft.com/en-in/services/cognitive-services/language-understanding-intelligent-service/

NEW QUESTION 23

You have an Azure Machine Learning experiment that must comply with GDPR regulations.
You need to track compliance of the experiment and store documentation about the experiment.
What should you use?

  • A. Azure Table storage
  • B. Azure Security Center
  • C. an Azure Log Analytics workspace
  • D. Compliance Manager

Answer: D

Explanation:
References:
https://azure.microsoft.com/en-us/blog/new-capabilities-to-enable-robust-gdpr-compliance/

NEW QUESTION 24

Note: This question is part of a series of questions that present the same scenario. Each question in the series contains a unique solution that might meet the stated goals. Some question sets might have more than one correct solution, while others might not have a correct solution.
After you answer a question, you will NOT be able to return to it. As a result, these questions will not
appear in the review screen.
You have Azure IoT Edge devices that generate streaming data.
On the devices, you need to detect anomalies in the data by using Azure Machine Learning models. Once an anomaly is detected, the devices must add information about the anomaly to the Azure IoT Hub stream.
Solution: You deploy Azure Functions as an IoT Edge module.
Does this meet the goal?

  • A. Yes
  • B. No

Answer: B

Explanation:
Instead, use Azure Stream Analytics and the REST API.
Note: Available both in the cloud and on Azure IoT Edge, Azure Stream Analytics offers built-in machine learning-based anomaly detection capabilities that can be used to monitor the two most commonly occurring anomalies: temporary and persistent.
Stream Analytics supports user-defined functions, via REST API, that call out to Azure Machine Learning endpoints.
References:
https://docs.microsoft.com/en-us/azure/stream-analytics/stream-analytics-machine-learning-anomaly-detection
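For reference, the built-in anomaly detection mentioned above is exposed as Stream Analytics functions such as AnomalyDetection_SpikeAndDip. The sketch below keeps a representative query as a plain string (it would run in the Stream Analytics job deployed to the IoT Edge device); the input/output aliases, field names, confidence level, and window sizes are illustrative assumptions.

```python
# Representative Stream Analytics query using the built-in spike-and-dip detector.
# "edgeinput" and "iothuboutput" are placeholder aliases for the job's input and
# the upstream IoT Hub output configured on the edge device.
ANOMALY_QUERY = """
WITH AnomalyDetectionStep AS (
    SELECT
        deviceId,
        temperature,
        AnomalyDetection_SpikeAndDip(CAST(temperature AS float), 95, 120, 'spikesanddips')
            OVER (PARTITION BY deviceId LIMIT DURATION(second, 120)) AS scores
    FROM edgeinput TIMESTAMP BY eventTime
)
SELECT
    deviceId,
    temperature,
    CAST(GetRecordPropertyValue(scores, 'IsAnomaly') AS bigint) AS isAnomaly
INTO iothuboutput
FROM AnomalyDetectionStep
WHERE CAST(GetRecordPropertyValue(scores, 'IsAnomaly') AS bigint) = 1
"""
```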

NEW QUESTION 25
......

Thanks for reading the newest AI-100 exam dumps! We recommend you try the PREMIUM Allfreedumps.com AI-100 dumps in VCE and PDF format here: https://www.allfreedumps.com/AI-100-dumps.html (101 Q&As Dumps)