
ARA-C01 SnowPro Advanced: Architect Certification Exam Questions and Answers

Question 4

A retail company has 2000+ stores spread across the country. Store Managers report that they are having trouble running key reports related to inventory management, sales targets, payroll, and staffing during business hours. The Managers report that performance is poor and time-outs occur frequently.

Currently all reports share the same Snowflake virtual warehouse.

How should this situation be addressed? (Select TWO).

Options:

A.

Use a Business Intelligence tool for in-memory computation to improve performance.

B.

Configure a dedicated virtual warehouse for the Store Manager team.

C.

Configure the virtual warehouse to be multi-clustered.

D.

Configure the virtual warehouse to size 4-XL

E.

Advise the Store Manager team to defer report execution to off-business hours.
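
For reference, a dedicated multi-cluster warehouse of the kind these options describe can be defined in a single statement. This is a minimal sketch; the warehouse name, size, and cluster counts are illustrative, and multi-cluster warehouses require Enterprise edition or higher.

-- Dedicated warehouse for the Store Manager reporting workload;
-- scales out to additional clusters when concurrency spikes.
CREATE WAREHOUSE STORE_MANAGER_WH
  WAREHOUSE_SIZE = 'MEDIUM'
  MIN_CLUSTER_COUNT = 1
  MAX_CLUSTER_COUNT = 4
  SCALING_POLICY = 'STANDARD'
  AUTO_SUSPEND = 60
  AUTO_RESUME = TRUE;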

Question 5

What Snowflake features should be leveraged when modeling using Data Vault?

Options:

A.

Snowflake’s support of multi-table inserts into the data model’s Data Vault tables

B.

Data needs to be pre-partitioned to obtain superior data access performance

C.

Scaling up the virtual warehouses will support parallel processing of new source loads

D.

Snowflake’s ability to hash keys so that hash key joins can run faster than integer joins
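
To make option A concrete, Snowflake's multi-table INSERT can load a Data Vault hub and satellite from one pass over a staging table. This is a sketch; the table and column names are invented for illustration.

-- One scan of the staging table populates two Data Vault tables.
INSERT ALL
  INTO hub_customer (customer_hk, customer_bk, load_dts, rec_src)
    VALUES (customer_hk, customer_bk, load_dts, rec_src)
  INTO sat_customer (customer_hk, customer_name, load_dts, rec_src)
    VALUES (customer_hk, customer_name, load_dts, rec_src)
SELECT
  MD5(customer_id)    AS customer_hk,
  customer_id         AS customer_bk,
  customer_name,
  CURRENT_TIMESTAMP() AS load_dts,
  'CRM'               AS rec_src
FROM stg_customer;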

Question 6

Company A has recently acquired company B. The Snowflake deployment for company B is located in the Azure West Europe region.

As part of the integration process, an Architect has been asked to consolidate company B's sales data into company A's Snowflake account, which is located in the AWS us-east-1 region.

How can this requirement be met?

Options:

A.

Replicate the sales data from company B's Snowflake account into company A's Snowflake account using cross-region data replication within Snowflake. Configure a direct share from company B's account to company A's account.

B.

Export the sales data from company B's Snowflake account as CSV files, and transfer the files to company A's Snowflake account. Import the data using Snowflake's data loading capabilities.

C.

Migrate company B's Snowflake deployment to the same region as company A's Snowflake deployment, ensuring data locality. Then perform a direct database-to-database merge of the sales data.

D.

Build a custom data pipeline using Azure Data Factory or a similar tool to extract the sales data from company B's Snowflake account. Transform the data, then load it into company A's Snowflake account.
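
For context, cross-cloud consolidation of this kind is typically done with database replication; the general command pattern looks like this (a sketch with invented account and database identifiers):

-- On company B's account (Azure West Europe), enable replication of the database:
ALTER DATABASE SALES_DB ENABLE REPLICATION TO ACCOUNTS MYORG.COMPANY_A_ACCOUNT;

-- On company A's account (AWS us-east-1), create and refresh the secondary database:
CREATE DATABASE SALES_DB_REPLICA AS REPLICA OF MYORG.COMPANY_B_ACCOUNT.SALES_DB;
ALTER DATABASE SALES_DB_REPLICA REFRESH;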

Question 7

A company is using a Snowflake account in Azure. The account has SAML SSO set up using ADFS as a SCIM identity provider. To validate Private Link connectivity, an Architect performed the following steps:

* Confirmed Private Link URLs are working by logging in with a username/password account

* Verified DNS resolution by running nslookups against Private Link URLs

* Validated connectivity using SnowCD

* Disabled public access using a network policy set to use the company’s IP address range

However, the following error message is received when using SSO to log into the company account:

IP XX.XXX.XX.XX is not allowed to access snowflake. Contact your local security administrator.

What steps should the Architect take to resolve this error and ensure that the account is accessed using only Private Link? (Choose two.)

Options:

A.

Alter the Azure security integration to use the Private Link URLs.

B.

Add the IP address in the error message to the allowed list in the network policy.

C.

Generate a new SCIM access token using system$generate_scim_access_token and save it to Azure AD.

D.

Update the configuration of the Azure AD SSO to use the Private Link URLs.

E.

Open a case with Snowflake Support to authorize the Private Link URLs’ access to the account.

Question 8

An Architect is troubleshooting a query with poor performance using the QUERY_HISTORY function. The Architect observes that the COMPILATION_TIME is greater than the EXECUTION_TIME.

What is the reason for this?

Options:

A.

The query is processing a very large dataset.

B.

The query has overly complex logic.

C.

The query is queued for execution.

D.

The query is reading from remote storage.

Question 9

A data share exists between a data provider account and a data consumer account. Five tables from the provider account are being shared with the consumer account. The consumer role has been granted the IMPORTED PRIVILEGES privilege.

What will happen to the consumer account if a new table (table_6) is added to the provider schema?

Options:

A.

The consumer role will automatically see the new table and no additional grants are needed.

B.

The consumer role will see the table only after this grant is given on the consumer side:

grant imported privileges on database PSHARE_EDW_4TEST_DB to DEV_ROLE;

C.

The consumer role will see the table only after this grant is given on the provider side:

use role accountadmin;

Grant select on table EDW.ACCOUNTING.Table_6 to share PSHARE_EDW_4TEST;

D.

The consumer role will see the table only after this grant is given on the provider side:

use role accountadmin;

grant usage on database EDW to share PSHARE_EDW_4TEST;

grant usage on schema EDW.ACCOUNTING to share PSHARE_EDW_4TEST;

grant select on table EDW.ACCOUNTING.Table_6 to database PSHARE_EDW_4TEST_DB;
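
As background for this question, objects added to a schema after a share is created are not exposed automatically; the provider must grant privileges on each new object to the share. The general provider-side pattern is:

-- Provider side: expose a newly created table through an existing share.
USE ROLE ACCOUNTADMIN;
GRANT SELECT ON TABLE EDW.ACCOUNTING.TABLE_6 TO SHARE PSHARE_EDW_4TEST;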

Question 10

When loading data into a table that captures the load time in a column with a default value of either CURRENT_TIME() or CURRENT_TIMESTAMP(), what will occur?

Options:

A.

All rows loaded using a specific COPY statement will have varying timestamps based on when the rows were inserted.

B.

Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were read from the source.

C.

Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were created in the source.

D.

All rows loaded using a specific COPY statement will have the same timestamp value.

Question 11

An Architect is integrating an application that needs to read and write data to Snowflake without installing any additional software on the application server.

How can this requirement be met?

Options:

A.

Use SnowSQL.

B.

Use the Snowpipe REST API.

C.

Use the Snowflake SQL REST API.

D.

Use the Snowflake ODBC driver.

Question 12

Files arrive in an external stage every 10 seconds from a proprietary system. The files range in size from 500 KB to 3 MB. The data must be accessible by dashboards as soon as it arrives.

How can a Snowflake Architect meet this requirement with the LEAST amount of coding? (Choose two.)

Options:

A.

Use Snowpipe with auto-ingest.

B.

Use a COPY command with a task.

C.

Use a materialized view on an external table.

D.

Use the COPY INTO command.

E.

Use a combination of a task and a stream.
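
For reference, the low-code path here is a pipe with auto-ingest, which loads files as soon as cloud event notifications arrive. This is a sketch; the stage, table, and file format are illustrative.

-- Files landing in @landing_stage are loaded automatically via event notifications.
CREATE PIPE raw_ingest_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw_events
  FROM @landing_stage
  FILE_FORMAT = (TYPE = 'CSV');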

Question 13

Which command will create a schema without Fail-safe and will restrict object owners from passing on access to other users?

Options:

A.

create schema EDW.ACCOUNTING WITH MANAGED ACCESS;

B.

create schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

C.

create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 1;

D.

create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

Question 14

A Data Engineer is designing a near real-time ingestion pipeline for a retail company to ingest event logs into Snowflake to derive insights. A Snowflake Architect is asked to define security best practices to configure access control privileges for the data load for auto-ingest to Snowpipe.

What are the MINIMUM object privileges required for the Snowpipe user to execute Snowpipe?

Options:

A.

OWNERSHIP on the named pipe, USAGE on the named stage, target database, and schema, and INSERT and SELECT on the target table

B.

OWNERSHIP on the named pipe, USAGE and READ on the named stage, USAGE on the target database and schema, and INSERT and SELECT on the target table

C.

CREATE on the named pipe, USAGE and READ on the named stage, USAGE on the target database and schema, and INSERT and SELECT on the target table

D.

USAGE on the named pipe, named stage, target database, and schema, and INSERT and SELECT on the target table
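
As an illustration, a grant script for a Snowpipe service role might look like the following sketch; all object names are invented, and OWNERSHIP on the pipe is shown because the role that operates a pipe typically owns it.

GRANT USAGE ON DATABASE ingest_db TO ROLE snowpipe_role;
GRANT USAGE ON SCHEMA ingest_db.raw TO ROLE snowpipe_role;
GRANT USAGE ON STAGE ingest_db.raw.event_stage TO ROLE snowpipe_role;  -- external stage; an internal stage needs READ instead
GRANT INSERT, SELECT ON TABLE ingest_db.raw.events TO ROLE snowpipe_role;
GRANT OWNERSHIP ON PIPE ingest_db.raw.event_pipe TO ROLE snowpipe_role;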

Question 15

An Architect needs to grant a group of ORDER_ADMIN users the ability to clean old data in an ORDERS table (deleting all records older than 5 years), without granting any privileges on the table. The group’s manager (ORDER_MANAGER) has full DELETE privileges on the table.

How can the ORDER_ADMIN role be enabled to perform this data cleanup, without needing the DELETE privilege held by the ORDER_MANAGER role?

Options:

A.

Create a stored procedure that runs with caller’s rights, including the appropriate "> 5 years" business logic, and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

B.

Create a stored procedure that can be run using both caller’s and owner’s rights (allowing the user to specify which rights are used during execution), and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

C.

Create a stored procedure that runs with owner’s rights, including the appropriate "> 5 years" business logic, and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

D.

This scenario would actually not be possible in Snowflake – any user performing a DELETE on a table requires the DELETE privilege to be granted to the role they are using.
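
For context, the owner's rights pattern these options refer to looks roughly like this. It is a sketch: the procedure body assumes a hypothetical ORDER_DATE column, and the procedure is created while using the ORDER_MANAGER role so that role owns it.

-- Runs with the owner's privileges, so callers need only USAGE on the
-- procedure, not DELETE on the table.
CREATE OR REPLACE PROCEDURE purge_old_orders()
  RETURNS STRING
  LANGUAGE SQL
  EXECUTE AS OWNER
AS
$$
BEGIN
  DELETE FROM ORDERS WHERE ORDER_DATE < DATEADD(YEAR, -5, CURRENT_DATE());
  RETURN 'Purge complete';
END;
$$;

GRANT USAGE ON PROCEDURE purge_old_orders() TO ROLE ORDER_ADMIN;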

Question 16

A Snowflake Architect is working with Data Modelers and Table Designers to draft an ELT framework specifically for data loading using Snowpipe. The Table Designers will add a timestamp column that inserts the current timestamp as the default value as records are loaded into a table. The intent is to capture the time when each record gets loaded into the table; however, when tested, the timestamps are earlier than the LAST_LOAD_TIME column values returned by the COPY_HISTORY function or the COPY_HISTORY view (Account Usage).

Why is this occurring?

Options:

A.

The timestamps are different because there are parameter setup mismatches. The parameters need to be realigned

B.

The Snowflake timezone parameter is different from the cloud provider's parameters, causing the mismatch.

C.

The Table Designer team has not used the localtimestamp or systimestamp functions in the Snowflake copy statement.

D.

The CURRENT_TIME is evaluated when the load operation is compiled in cloud services rather than when the record is inserted into the table.

Question 17

A group of Data Analysts have been granted the ANALYST_ROLE role. They need a Snowflake database where they can create and modify tables, views, and other objects to load with their own data. The Analysts should not have the ability to give other Snowflake users outside of their role access to this data.

How should these requirements be met?

Options:

A.

Grant ANALYST_ROLE OWNERSHIP on the database, but make sure that ANALYST_ROLE does not have the MANAGE GRANTS privilege on the account.

B.

Grant SYSADMIN ownership of the database, but grant the create schema privilege on the database to the ANALYST_ROLE.

C.

Make every schema in the database a managed access schema, owned by SYSADMIN, and grant create privileges on each schema to the ANALYST_ROLE for each type of object that needs to be created.

D.

Grant ANALYST_ROLE ownership on the database, but grant the ownership on future [object type]s in database privilege to SYSADMIN.

Question 18

Which steps are recommended best practices for prioritizing cluster keys in Snowflake? (Choose two.)

Options:

A.

Choose columns that are frequently used in join predicates.

B.

Choose lower cardinality columns to support clustering keys and cost effectiveness.

C.

Choose TIMESTAMP columns with nanoseconds for the highest number of unique rows.

D.

Choose cluster columns that are most actively used in selective filters.

E.

Choose cluster columns that are actively used in the GROUP BY clauses.

Question 19

A new table and streams are created with the following commands:

CREATE OR REPLACE TABLE LETTERS (ID INT, LETTER STRING);

CREATE OR REPLACE STREAM STREAM_1 ON TABLE LETTERS;

CREATE OR REPLACE STREAM STREAM_2 ON TABLE LETTERS APPEND_ONLY = TRUE;

The following operations are processed on the newly created table:

INSERT INTO LETTERS VALUES (1, 'A');

INSERT INTO LETTERS VALUES (2, 'B');

INSERT INTO LETTERS VALUES (3, 'C');

TRUNCATE TABLE LETTERS;

INSERT INTO LETTERS VALUES (4, 'D');

INSERT INTO LETTERS VALUES (5, 'E');

INSERT INTO LETTERS VALUES (6, 'F');

DELETE FROM LETTERS WHERE ID = 6;

What would be the output of the following SQL commands, in order?

SELECT COUNT(*) FROM STREAM_1;

SELECT COUNT(*) FROM STREAM_2;

Options:

A.

2 & 6

B.

2 & 3

C.

4 & 3

D.

4 & 6

Question 20

A company has a Snowflake environment running in AWS us-west-2 (Oregon). The company needs to share data privately with a customer who is running their Snowflake environment in Azure East US 2 (Virginia).

What is the recommended sequence of operations that must be followed to meet this requirement?

Options:

A.

1. Create a share and add the database privileges to the share

2. Create a new listing on the Snowflake Marketplace

3. Alter the listing and add the share

4. Instruct the customer to subscribe to the listing on the Snowflake Marketplace

B.

1. Ask the customer to create a new Snowflake account in Azure EAST US 2 (Virginia)

2. Create a share and add the database privileges to the share

3. Alter the share and add the customer's Snowflake account to the share

C.

1. Create a new Snowflake account in Azure East US 2 (Virginia)

2. Set up replication between AWS us-west-2 (Oregon) and Azure East US 2 (Virginia) for the database objects to be shared

3. Create a share and add the database privileges to the share

4. Alter the share and add the customer's Snowflake account to the share

D.

1. Create a reader account in Azure East US 2 (Virginia)

2. Create a share and add the database privileges to the share

3. Add the reader account to the share

4. Share the reader account's URL and credentials with the customer
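
As background, the share itself is created and populated with grants as below; because shares do not cross cloud or region boundaries on their own, the shared database must first exist (via replication) in an account in the consumer's region. Identifiers are invented for illustration.

CREATE SHARE SALES_SHARE;
GRANT USAGE ON DATABASE SALES_DB TO SHARE SALES_SHARE;
GRANT USAGE ON SCHEMA SALES_DB.PUBLIC TO SHARE SALES_SHARE;
GRANT SELECT ON TABLE SALES_DB.PUBLIC.DAILY_SALES TO SHARE SALES_SHARE;
ALTER SHARE SALES_SHARE ADD ACCOUNTS = CUSTOMER_ORG.CUSTOMER_ACCOUNT;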

Question 21

A Snowflake Architect is designing a multi-tenant application strategy for an organization in the Snowflake Data Cloud and is considering using an Account Per Tenant strategy.

Which requirements will be addressed with this approach? (Choose two.)

Options:

A.

There needs to be fewer objects per tenant.

B.

Security and Role-Based Access Control (RBAC) policies must be simple to configure.

C.

Compute costs must be optimized.

D.

Tenant data shape may be unique per tenant.

E.

Storage costs must be optimized.

Question 22

What built-in Snowflake features make use of the change tracking metadata for a table? (Choose two.)

Options:

A.

The MERGE command

B.

The UPSERT command

C.

The CHANGES clause

D.

A STREAM object

E.

The CHANGE_DATA_CAPTURE command
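
For reference, the CHANGES clause mentioned in option C queries change tracking metadata directly, provided change tracking is enabled on the table. A sketch:

ALTER TABLE LETTERS SET CHANGE_TRACKING = TRUE;

-- Return the rows that changed in the last hour.
SELECT *
FROM LETTERS
CHANGES (INFORMATION => DEFAULT)
AT (OFFSET => -3600);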

Question 23

An Architect has a VPN_ACCESS_LOGS table in the SECURITY_LOGS schema containing timestamps of the connection and disconnection, username of the user, and summary statistics.

What should the Architect do to enable the Snowflake search optimization service on this table?

Options:

A.

Assume role with OWNERSHIP on future tables and ADD SEARCH OPTIMIZATION on the SECURITY_LOGS schema.

B.

Assume role with ALL PRIVILEGES including ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

C.

Assume role with OWNERSHIP on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

D.

Assume role with ALL PRIVILEGES on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.
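
For reference, once the required privileges are in place, enabling and verifying search optimization is a two-statement exercise (database qualifier omitted for brevity):

ALTER TABLE SECURITY_LOGS.VPN_ACCESS_LOGS ADD SEARCH OPTIMIZATION;

-- Confirm the configuration and build progress.
DESCRIBE SEARCH OPTIMIZATION ON SECURITY_LOGS.VPN_ACCESS_LOGS;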

Question 24

Two queries are run on the customer_address table:

create or replace TABLE CUSTOMER_ADDRESS (
    CA_ADDRESS_SK NUMBER(38,0),
    CA_ADDRESS_ID VARCHAR(16),
    CA_STREET_NUMBER VARCHAR(10),
    CA_STREET_NAME VARCHAR(60),
    CA_STREET_TYPE VARCHAR(15),
    CA_SUITE_NUMBER VARCHAR(10),
    CA_CITY VARCHAR(60),
    CA_COUNTY VARCHAR(30),
    CA_STATE VARCHAR(2),
    CA_ZIP VARCHAR(10),
    CA_COUNTRY VARCHAR(20),
    CA_GMT_OFFSET NUMBER(5,2),
    CA_LOCATION_TYPE VARCHAR(20)
);

ALTER TABLE DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS ADD SEARCH OPTIMIZATION ON SUBSTRING(CA_ADDRESS_ID);

Which queries will benefit from the use of the search optimization service? (Select TWO).

Options:

A.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS Where substring(CA_ADDRESS_ID,1,8) = substring('AAAAAAAAPHPPLBAAASKDJHASLKDJHASKJD',1,8);

B.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS Where CA_ADDRESS_ID = substring('AAAAAAAAPHPPLBAAASKDJHASLKDJHASKJD',1,16);

C.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS Where CA_ADDRESS_ID LIKE '%BAAASKD%';

D.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS Where CA_ADDRESS_ID LIKE '%PHPP%';

E.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS Where CA_ADDRESS_ID NOT LIKE '%AAAAAAAAPHPPL%';

Question 25

A retail company has over 3000 stores all using the same Point of Sale (POS) system. The company wants to deliver near real-time sales results to category managers. The stores operate in a variety of time zones and exhibit a dynamic range of transactions each minute, with some stores having higher sales volumes than others.

Sales results are provided in a uniform fashion using data engineered fields that will be calculated in a complex data pipeline. Calculations include exceptions, aggregations, and scoring using external functions interfaced to scoring algorithms. The source data for aggregations has over 100M rows.

Every minute, the POS sends all sales transactions files to a cloud storage location with a naming convention that includes store numbers and timestamps to identify the set of transactions contained in the files. The files are typically less than 10MB in size.

How can the near real-time results be provided to the category managers? (Select TWO).

Options:

A.

All files should be concatenated before ingestion into Snowflake to avoid micro-ingestion.

B.

A Snowpipe should be created and configured with AUTO_INGEST = true. A stream should be created to process INSERTS into a single target table using the stream metadata to inform the store number and timestamps.

C.

A stream should be created to accumulate the near real-time data and a task should be created that runs at a frequency that matches the real-time analytics needs.

D.

An external scheduler should examine the contents of the cloud storage location and issue SnowSQL commands to process the data at a frequency that matches the real-time analytics needs.

E.

The COPY INTO command with a task scheduled to run every second should be used to achieve the near real-time requirement.

Question 26

Based on the Snowflake object hierarchy, what securable objects belong directly to a Snowflake account? (Select THREE).

Options:

A.

Database

B.

Schema

C.

Table

D.

Stage

E.

Role

F.

Warehouse

Question 27

What are purposes for creating a storage integration? (Choose three.)

Options:

A.

Control access to Snowflake data using a master encryption key that is maintained in the cloud provider’s key management service.

B.

Store a generated identity and access management (IAM) entity for an external cloud provider regardless of the cloud provider that hosts the Snowflake account.

C.

Support multiple external stages using one single Snowflake object.

D.

Avoid supplying credentials when creating a stage or when loading or unloading data.

E.

Create private VPC endpoints that allow direct, secure connectivity between VPCs without traversing the public internet.

F.

Manage credentials from multiple cloud providers in one single Snowflake object.
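
To make the storage integration pattern concrete, here is a minimal sketch for AWS; the role ARN and bucket paths are placeholders. Note that one integration can back multiple stages and that no credentials appear in the stage definition.

CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_access'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/raw/', 's3://my-bucket/curated/');

-- Stages reference the integration instead of embedding credentials.
CREATE STAGE raw_stage
  URL = 's3://my-bucket/raw/'
  STORAGE_INTEGRATION = s3_int;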

Question 28

An Architect on a new project has been asked to design an architecture that meets Snowflake security, compliance, and governance requirements as follows:

1) Use Tri-Secret Secure in Snowflake

2) Share some information stored in a view with another Snowflake customer

3) Hide portions of sensitive information from some columns

4) Use zero-copy cloning to refresh the non-production environment from the production environment

To meet these requirements, which design elements must be implemented? (Choose three.)

Options:

A.

Define row access policies.

B.

Use the Business-Critical edition of Snowflake.

C.

Create a secure view.

D.

Use the Enterprise edition of Snowflake.

E.

Use Dynamic Data Masking.

F.

Create a materialized view.
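
Two of the design elements named in these options reduce to short DDL statements; a sketch with invented object names (the PII_ADMIN role is hypothetical):

-- Hide portions of sensitive information with Dynamic Data Masking.
CREATE MASKING POLICY ssn_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() = 'PII_ADMIN' THEN val ELSE '***MASKED***' END;

ALTER TABLE PATIENTS MODIFY COLUMN SSN SET MASKING POLICY ssn_mask;

-- Share curated data with another Snowflake customer through a secure view.
CREATE SECURE VIEW SHARED_PATIENT_STATS AS
  SELECT REGION, COUNT(*) AS PATIENT_COUNT
  FROM PATIENTS
  GROUP BY REGION;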

Question 29

What step will improve the performance of queries executed against an external table?

Options:

A.

Partition the external table.

B.

Shorten the names of the source files.

C.

Convert the source files' character encoding to UTF-8.

D.

Use an internal stage instead of an external stage to store the source files.

Question 30

A company's Architect needs to find an efficient way to get data from an external partner, who is also a Snowflake user. The current solution is based on daily JSON extracts that are placed on an FTP server and uploaded to Snowflake manually. The files are changed several times each month, and the ingestion process needs to be adapted to accommodate these changes.

What would be the MOST efficient solution?

Options:

A.

Ask the partner to create a share and add the company's account.

B.

Ask the partner to use the data lake export feature and place the data into cloud storage where Snowflake can natively ingest it (schema-on-read).

C.

Keep the current structure but request that the partner stop changing files, instead only appending new files.

D.

Ask the partner to set up a Snowflake reader account and use that account to get the data for ingestion.

Question 31

Which security, governance, and data protection features require, at a MINIMUM, the Business Critical edition of Snowflake? (Choose two.)

Options:

A.

Extended Time Travel (up to 90 days)

B.

Customer-managed encryption keys through Tri-Secret Secure

C.

Periodic rekeying of encrypted data

D.

AWS, Azure, or Google Cloud private connectivity to Snowflake

E.

Federated authentication and SSO

Question 32

Consider the following scenario where a masking policy is applied on the CREDITCARDNO column of the CREDITCARDINFO table. The masking policy definition is as follows:

Sample data for the CREDITCARDINFO table is as follows:

NAME EXPIRYDATE CREDITCARDNO

JOHN DOE 2022-07-23 4321 5678 9012 1234

If the Snowflake system roles have not been granted any additional roles, what will be the result?

Options:

A.

The sysadmin can see the CREDITCARDNO column data in clear text.

B.

The owner of the table will see the CREDITCARDNO column data in clear text.

C.

Anyone with the PI_ANALYTICS role will see the last 4 characters of the CREDITCARDNO column data in clear text.

D.

Anyone with the PI_ANALYTICS role will see the CREDITCARDNO column as '***MASKED***'.
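
Since the policy definition image is not reproduced here, a representative masking policy for this scenario might look like the following sketch; the role name and masking format are assumptions, not taken from the exam.

CREATE MASKING POLICY card_mask AS (val STRING) RETURNS STRING ->
  CASE
    -- Analysts see only the last 4 digits.
    WHEN CURRENT_ROLE() = 'PI_ANALYTICS' THEN CONCAT('**** **** **** ', RIGHT(val, 4))
    ELSE '***MASKED***'
  END;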

Question 33

What is a key consideration when setting up search optimization service for a table?

Options:

A.

Search optimization service works best with a column that has a minimum of 100 K distinct values.

B.

Search optimization service can significantly improve query performance on partitioned external tables.

C.

Search optimization service can help to optimize storage usage by compressing the data into a GZIP format.

D.

The table must be clustered with a key having multiple columns for effective search optimization.

Question 34

Based on the architecture in the image, how can the data from DB1 be copied into TBL2? (Select TWO).

A)

B)

C)

D)

E)

Options:

A.

Option A

B.

Option B

C.

Option C

D.

Option D

E.

Option E

Question 35

A healthcare company wants to share data with a medical institute. The institute is running a Standard edition of Snowflake; the healthcare company is running a Business Critical edition.

How can this data be shared?

Options:

A.

The healthcare company will need to change the institute’s Snowflake edition in the accounts panel.

B.

By default, sharing is supported from a Business Critical Snowflake edition to a Standard edition.

C.

Contact Snowflake and they will execute the share request for the healthcare company.

D.

Set the share_restriction parameter on the shared object to false.

Question 36

A user is executing the following commands sequentially within a timeframe of 10 minutes from start to finish:

What would be the output of this query?

Options:

A.

Table T_SALES_CLONE successfully created.

B.

Time Travel data is not available for table T_SALES.

C.

The offset => is not a valid clause in the clone operation.

D.

Syntax error line 1 at position 58 unexpected 'at'.
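
Although the command sequence image is not reproduced here, a Time Travel clone of the kind this question tests is written like this; the names and offset are illustrative.

-- Clone T_SALES as it existed 5 minutes (300 seconds) ago.
CREATE TABLE T_SALES_CLONE CLONE T_SALES
  AT (OFFSET => -300);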

Question 37

Which of the following ingestion methods can be used to load near real-time data by using the messaging services provided by a cloud provider?

Options:

A.

Snowflake Connector for Kafka

B.

Snowflake streams

C.

Snowpipe

D.

Spark

Question 38

Data is being imported and stored as JSON in a VARIANT column. Query performance was fine, but most recently, poor query performance has been reported.

What could be causing this?

Options:

A.

There were JSON nulls in the recent data imports.

B.

The order of the keys in the JSON was changed.

C.

The recent data imports contained fewer fields than usual.

D.

There were variations in string lengths for the JSON values in the recent data imports.

Question 39

Assuming all Snowflake accounts are using an Enterprise edition or higher, in which development and testing scenarios would copying of data be required, and zero-copy cloning not be suitable? (Select TWO).

Options:

A.

Developers create their own datasets to work against transformed versions of the live data.

B.

Production and development run in different databases in the same account, and Developers need to see production-like data but with specific columns masked.

C.

Data is in a production Snowflake account that needs to be provided to Developers in a separate development/testing Snowflake account in the same cloud region.

D.

Developers create their own copies of a standard test database previously created for them in the development account, for their initial development and unit testing.

E.

The release process requires pre-production testing of changes with data of production scale and complexity. For security reasons, pre-production also runs in the production account.

Question 40

What integration object should be used to place restrictions on where data may be exported?

Options:

A.

Stage integration

B.

Security integration

C.

Storage integration

D.

API integration

Question 41

An Architect entered the following commands in sequence:

USER1 cannot find the table.

Which of the following commands does the Architect need to run for USER1 to find the tables using the Principle of Least Privilege? (Choose two.)

Options:

A.

GRANT ROLE PUBLIC TO ROLE INTERN;

B.

GRANT USAGE ON DATABASE SANDBOX TO ROLE INTERN;

C.

GRANT USAGE ON SCHEMA SANDBOX.PUBLIC TO ROLE INTERN;

D.

GRANT OWNERSHIP ON DATABASE SANDBOX TO USER INTERN;

E.

GRANT ALL PRIVILEGES ON DATABASE SANDBOX TO ROLE INTERN;
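
As background, for a role to "find" a table it needs USAGE on the containing database and schema, plus at least one privilege on the table itself. A typical least-privilege chain (the PUBLIC schema name is an assumption) is:

GRANT USAGE ON DATABASE SANDBOX TO ROLE INTERN;
GRANT USAGE ON SCHEMA SANDBOX.PUBLIC TO ROLE INTERN;
GRANT SELECT ON ALL TABLES IN SCHEMA SANDBOX.PUBLIC TO ROLE INTERN;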

Question 42

A company is designing a process for importing a large amount of IoT JSON data from cloud storage into Snowflake. New sets of IoT data get generated and uploaded approximately every 5 minutes.

Once the IoT data is in Snowflake, the company needs up-to-date information from an external vendor to join to the data. This data is then presented to users through a dashboard that shows different levels of aggregation. The external vendor is a Snowflake customer.

What solution will MINIMIZE complexity and MAXIMIZE performance?

Options:

A.

1. Create an external table over the JSON data in cloud storage.

2. Create a task that runs every 5 minutes to run a transformation procedure on new data, based on a saved timestamp.

3. Ask the vendor to expose an API so an external function can be used to generate a call to join the data back to the IoT data in the transformation procedure.

4. Give the transformed table access to the dashboard tool.

5. Perform the a

B.

1. Create an external table over the JSON data in cloud storage.

2. Create a task that runs every 5 minutes to run a transformation procedure on new data based on a saved timestamp.

3. Ask the vendor to create a data share with the required data that can be imported into the company's Snowflake account.

4. Join the vendor's data back to the IoT data using a transformation procedure.

5. Create views over the larger dataset

C.

1. Create a Snowpipe to bring the JSON data into Snowflake.

2. Use streams and tasks to trigger a transformation procedure when new JSON data arrives.

3. Ask the vendor to expose an API so an external function call can be made to join the vendor's data back to the IoT data in a transformation procedure.

4. Create materialized views over the larger dataset to perform the aggregations required by the dashboard.

5. Give

D.

1. Create a Snowpipe to bring the JSON data into Snowflake.

2. Use streams and tasks to trigger a transformation procedure when new JSON data arrives.

3. Ask the vendor to create a data share with the required data that is then imported into the Snowflake account.

4. Join the vendor's data back to the IoT data in a transformation procedure.

5. Create materialized views over the larger dataset to perform the aggregations required by the dashboard.

Question 43

An Architect has designed a data pipeline that is receiving small CSV files from multiple sources. All of the files are landing in one location. Specific files are filtered for loading into Snowflake tables using the COPY command. The loading performance is poor.

What changes can be made to improve the data loading performance?

Options:

A.

Increase the size of the virtual warehouse.

B.

Create a multi-cluster warehouse and merge smaller files to create bigger files.

C.

Create a specific storage landing bucket to avoid file scanning.

D.

Change the file format from CSV to JSON.

Question 44

An Architect is troubleshooting a query with poor performance using the QUERY_HISTORY function. The Architect observes that the COMPILATION_TIME is greater than the EXECUTION_TIME.

What is the reason for this?

Options:

A.

The query is processing a very large dataset.

B.

The query has overly complex logic.

C.

The query is queued for execution.

D.

The query is reading from remote storage.

Question 45

A company has several sites in different regions from which the company wants to ingest data.

Which of the following will enable this type of data ingestion?

Options:

A.

The company must have a Snowflake account in each cloud region to be able to ingest data to that account.

B.

The company must replicate data between Snowflake accounts.

C.

The company should provision a reader account to each site and ingest the data through the reader accounts.

D.

The company should use a storage integration for the external stage.

Question 46

An Architect is implementing a CI/CD process. When attempting to clone a table from a production to a development environment, the cloning operation fails.

What could be causing this to happen?

Options:

A.

The table is transient.

B.

The table has a masking policy.

C.

The retention time for the table is set to zero.

D.

Tables cannot be cloned from a higher environment to a lower environment.

Question 47

An Architect with the ORGADMIN role wants to change a Snowflake account from an Enterprise edition to a Business Critical edition.

How should this be accomplished?

Options:

A.

Run an ALTER ACCOUNT command and create a tag of EDITION and set the tag to Business Critical.

B.

Use the account's ACCOUNTADMIN role to change the edition.

C.

Failover to a new account in the same region and specify the new account's edition upon creation.

D.

Contact Snowflake Support and request that the account's edition be changed.

Question 48

How can an Architect enable optimal clustering to enhance performance for different access paths on a given table?

Options:

A.

Create multiple clustering keys for a table.

B.

Create multiple materialized views with different cluster keys.

C.

Create super projections that will automatically create clustering.

D.

Create a clustering key that contains all columns used in the access paths.
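
For context, a table supports only one clustering key, but materialized views can each carry their own. A sketch with invented names:

-- Base table clustered for one access path...
ALTER TABLE ORDERS CLUSTER BY (ORDER_DATE);

-- ...and a materialized view clustered for another.
CREATE MATERIALIZED VIEW ORDERS_BY_REGION
  CLUSTER BY (REGION)
AS
  SELECT ORDER_ID, REGION, ORDER_DATE, AMOUNT
  FROM ORDERS;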

Exam Code: ARA-C01
Exam Name: SnowPro Advanced: Architect Certification Exam
Last Update: Nov 25, 2024
Questions: 162