
Snowflake SnowPro Advanced: Architect ARA-C01 practice questions and answers from ValidTests


Viewing page 1 of 5 (questions 1-10)
Question # 1:

A user has the appropriate privilege to see unmasked data in a column.

If the user loads this column data into another column that does not have a masking policy, what will occur?

Options:

A.

Unmasked data will be loaded in the new column.

B.

Masked data will be loaded into the new column.

C.

Unmasked data will be loaded into the new column but only users with the appropriate privileges will be able to see the unmasked data.

D.

Unmasked data will be loaded into the new column and no users will be able to see the unmasked data.

Expert Solution
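
For reference, the behavior described here can be reproduced with a small test such as the sketch below; the policy, table, column, and role names are illustrative and not taken from the question.

-- Masking is applied at query time, so the data written by a load reflects what
-- the executing role is allowed to see.
create masking policy email_mask as (val string) returns string ->
  case when current_role() in ('PAYROLL_ADMIN') then val else '***MASKED***' end;

alter table employees modify column email set masking policy email_mask;

-- Run by a role permitted to see unmasked values: the target column has no
-- masking policy, so clear-text values are written and are visible to any role
-- that can query email_copy.
create table email_copy as select email from employees;
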
Question # 2:

There are two databases in an account, named fin_db and hr_db, which contain payroll and employee data, respectively. Accountants and Analysts in the company require different permissions on the objects in these databases to perform their jobs. Accountants need read-write access to fin_db but only require read-only access to hr_db because that database is maintained by human resources personnel.

An Architect needs to create a read-only role for certain employees working in the human resources department.

Which permission sets must be granted to this role?

Options:

A.

USAGE on database hr_db, USAGE on all schemas in database hr_db, SELECT on all tables in database hr_db

B.

USAGE on database hr_db, SELECT on all schemas in database hr_db, SELECT on all tables in database hr_db

C.

MODIFY on database hr_db, USAGE on all schemas in database hr_db, USAGE on all tables in database hr_db

D.

USAGE on database hr_db, USAGE on all schemas in database hr_db, REFERENCES on all tables in database hr_db

Expert Solution
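
For reference, a read-only permission set of this kind could be granted as sketched below; the role name hr_read_only is an assumption.

-- Read-only access to hr_db: USAGE to traverse the database and its schemas,
-- SELECT to read table data.
create role if not exists hr_read_only;
grant usage on database hr_db to role hr_read_only;
grant usage on all schemas in database hr_db to role hr_read_only;
grant select on all tables in database hr_db to role hr_read_only;
-- Optionally cover tables created later as well:
grant select on future tables in database hr_db to role hr_read_only;
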
Question # 3:

What are characteristics of the use of transactions in Snowflake? (Select TWO).

Options:

A.

Explicit transactions can contain DDL, DML, and query statements.

B.

The autocommit setting can be changed inside a stored procedure.

C.

A transaction can be started explicitly by executing a BEGIN WORK statement and ended explicitly by executing a COMMIT WORK statement.

D.

A transaction can be started explicitly by executing a BEGIN TRANSACTION statement and ended explicitly by executing an END TRANSACTION statement.

E.

Explicit transactions should contain only DML statements and query statements. All DDL statements implicitly commit active transactions.

Expert Solution
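
For reference, a minimal explicit transaction illustrating these points is sketched below; the table names are illustrative.

begin transaction;   -- BEGIN WORK is an accepted synonym for starting a transaction
insert into payroll_archive select * from payroll where pay_period = '2023-06';
commit;              -- COMMIT WORK is an accepted synonym; there is no END TRANSACTION statement
-- Note: a DDL statement such as CREATE TABLE issued while a transaction is open
-- implicitly commits that transaction before it runs.
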
Question # 4:

A company is trying to ingest 10 TB of CSV data into a Snowflake table using Snowpipe as part of its migration from a legacy database platform. The records need to be ingested in the MOST performant and cost-effective way.

How can these requirements be met?

Options:

A.

Use ON_ERROR = CONTINUE in the COPY INTO command.

B.

Use PURGE = TRUE in the COPY INTO command.

C.

Use PURGE = FALSE in the COPY INTO command.

D.

Use ON_ERROR = SKIP_FILE in the COPY INTO command.

Expert Solution
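
For reference, the copy options named in these choices appear in a COPY INTO statement roughly as sketched below; the table, stage, and file format details are assumptions.

copy into landing.events
  from @landing.csv_stage
  file_format = (type = 'CSV' skip_header = 1)
  on_error = 'SKIP_FILE'   -- or 'CONTINUE' to load whatever parses and skip bad rows
  purge = true;            -- remove staged files after they load successfully
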
Question # 5:

A company has a Snowflake account named ACCOUNTA in AWS us-east-1 region. The company stores its marketing data in a Snowflake database named MARKET_DB. One of the company’s business partners has an account named PARTNERB in Azure East US 2 region. For marketing purposes the company has agreed to share the database MARKET_DB with the partner account.

Which of the following steps MUST be performed for the account PARTNERB to consume data from the MARKET_DB database?

Options:

A.

Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA create a share of database MARKET_DB, create a new database out of this share locally in AWS us-east-1 region, and replicate this new database to AZABC123 account. Then set up data sharing to the PARTNERB account.

B.

From account ACCOUNTA create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then make this database the provider and share it with the PARTNERB account.

C.

Create a new account (called AZABC123) in Azure East US 2 region. From account ACCOUNTA replicate the database MARKET_DB to AZABC123 and from this account set up the data sharing to the PARTNERB account.

D.

Create a share of database MARKET_DB, and create a new database out of this share locally in AWS us-east-1 region. Then replicate this database to the partner’s account PARTNERB.

Expert Solution
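
For reference, a minimal sketch of a replicate-then-share flow is shown below, assuming the company already owns an account named AZABC123 in Azure East US 2 and that its organization name is MYORG; all identifiers besides MARKET_DB, ACCOUNTA, and PARTNERB are assumptions.

-- On ACCOUNTA (AWS us-east-1): allow MARKET_DB to be replicated to the Azure account.
alter database MARKET_DB enable replication to accounts MYORG.AZABC123;

-- On AZABC123 (Azure East US 2): create the secondary database and refresh it.
create database MARKET_DB as replica of MYORG.ACCOUNTA.MARKET_DB;
alter database MARKET_DB refresh;

-- On AZABC123: share the replicated database with the partner account.
create share market_share;
grant usage on database MARKET_DB to share market_share;
grant usage on schema MARKET_DB.PUBLIC to share market_share;
grant select on all tables in schema MARKET_DB.PUBLIC to share market_share;
alter share market_share add accounts = PARTNERB;
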
Question # 6:

The Data Engineering team at a large manufacturing company needs to engineer data coming from many sources to support a wide variety of use cases and data consumer requirements which include:

1) Finance and Vendor Management team members who require reporting and visualization

2) Data Science team members who require access to raw data for ML model development

3) Sales team members who require engineered and protected data for data monetization

What Snowflake data modeling approaches will meet these requirements? (Choose two.)

Options:

A.

Consolidate data in the company’s data lake and use EXTERNAL TABLES.

B.

Create a raw database for landing and persisting raw data entering the data pipelines.

C.

Create a set of profile-specific databases that aligns data with usage patterns.

D.

Create a single star schema in a single database to support all consumers’ requirements.

E.

Create a Data Vault as the sole data pipeline endpoint and have all consumers directly access the Vault.

Expert Solution
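
For reference, a profile-aligned set of databases could be laid out as sketched below; all database names and comments are assumptions.

create database raw_db      comment = 'Landing zone that persists raw data entering the pipelines';
create database finance_db  comment = 'Curated, reporting-ready models for Finance and Vendor Management';
create database science_db  comment = 'Raw and lightly transformed data for ML model development';
create database sales_db    comment = 'Engineered, protected data prepared for monetization';
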
Question # 7:

The Business Intelligence team reports that when some team members run queries for their dashboards in parallel with others, the query response time gets significantly slower. What can a Snowflake Architect do to identify what is occurring and troubleshoot this issue?

(The four answer choices for this question are presented as images in the original source and are not reproduced here.)

Options:

A.

Option A

B.

Option B

C.

Option C

D.

Option D

Expert Solution
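
As a generic starting point for this kind of investigation (not a transcription of any of the image-based choices above), warehouse queuing during the slow periods can be checked with a query such as the sketch below.

-- Queries that waited because the warehouse was overloaded in the last 24 hours.
select query_id,
       warehouse_name,
       total_elapsed_time / 1000   as elapsed_seconds,
       queued_overload_time / 1000 as queued_seconds
from snowflake.account_usage.query_history
where start_time >= dateadd('day', -1, current_timestamp())
  and queued_overload_time > 0
order by queued_overload_time desc;
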
Question # 8:

A Snowflake Architect created a new data share and would like to verify that consumers of the share can see only the specific records exposed through its secure views.

What is the recommended way to validate data accessibility by the consumers?

Options:

A.

Create reader accounts as shown below and impersonate the consumers by logging in with their credentials.

create managed account reader_acct1 admin_name = user1, admin_password = 'Sdfed43da!44T', type = reader;

B.

Create a row access policy as shown below and assign it to the data share.

create or replace row access policy rap_acct as (acct_id varchar) returns boolean -> case when 'acct1_role' = current_role() then true else false end;

C.

Set the session parameter called SIMULATED_DATA_SHARING_CONSUMER as shown below in order to impersonate the consumer accounts.

alter session set simulated_data_sharing_consumer = 'Consumer Acct1';

D.

Alter the share settings as shown below, in order to impersonate a specific consumer account.

alter share sales_share set accounts = 'Consumer1' share_restrictions = true;

Expert Solution
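
For reference, the session-parameter approach in option C could be exercised as sketched below; the consumer account, database, schema, and view names are assumptions.

alter session set simulated_data_sharing_consumer = 'PARTNER_ACCT1';
-- Queries against the shared secure views now return only the rows that
-- account PARTNER_ACCT1 would see through the share.
select count(*) from market_db.shared.customer_secure_v;
alter session unset simulated_data_sharing_consumer;
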
Question # 9:

A company is using Snowflake on Azure in the Netherlands. The company's analyst team also wants to analyze JSON-format data that is stored in an Amazon S3 bucket in the AWS Singapore region.

The Architect has been given the following requirements:

1. Provide access to frequently changing data

2. Keep egress costs to a minimum

3. Maintain low latency

How can these requirements be met with the LEAST amount of operational overhead?

Options:

A.

Use a materialized view on top of an external table against the S3 bucket in AWS Singapore.

B.

Use an external table against the S3 bucket in AWS Singapore and copy the data into transient tables.

C.

Copy the data between providers from S3 to Azure Blob storage to collocate, then use Snowpipe for data ingestion.

D.

Use AWS Transfer Family to replicate data between the S3 bucket in AWS Singapore and an Azure Netherlands Blob storage, then use an external table against the Blob storage.

Expert Solution
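
For reference, option A could be set up roughly as sketched below; the storage integration, stage, table, and column expressions are assumptions, and AUTO_REFRESH additionally relies on S3 event notifications being configured for the bucket.

create stage sg_json_stage
  url = 's3://example-sg-marketing-bucket/'
  storage_integration = s3_sg_integration
  file_format = (type = 'JSON');

create external table sg_events_ext
  with location = @sg_json_stage
  auto_refresh = true
  file_format = (type = 'JSON');

-- The materialized view keeps frequently queried attributes local to Snowflake,
-- limiting repeated cross-region reads against the S3 bucket.
create materialized view sg_events_mv as
  select value:event_id::string        as event_id,
         value:event_ts::timestamp_ntz as event_ts
  from sg_events_ext;
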
Question # 10:

An Architect is designing a pipeline to stream event data into Snowflake using the Snowflake Kafka connector. The Architect’s highest priority is to configure the connector to stream data in the MOST cost-effective manner.

Which of the following is recommended for optimizing the cost associated with the Snowflake Kafka connector?

Options:

A.

Utilize a higher buffer.flush.time in the connector configuration.

B.

Utilize a higher buffer.size.bytes in the connector configuration.

C.

Utilize a lower buffer.size.bytes in the connector configuration.

D.

Utilize a lower buffer.count.records in the connector configuration.

Expert Solution
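
For reference, the buffer settings named in these choices are ordinary connector configuration properties; the values below are examples only, not tuning recommendations for this workload.

# Snowflake Kafka connector buffer properties (illustrative values)
buffer.flush.time=300        # seconds between flushes; a higher value produces fewer, larger files
buffer.size.bytes=20000000   # flush once this many bytes are buffered
buffer.count.records=50000   # flush once this many records are buffered
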