
© 2015 CENTRIFY CORPORATION. ALL RIGHTS RESERVED. PAGE 1

Centrify Identity and Access Management for Hortonworks Integration Guide

Abstract

Centrify Server Suite is an enterprise-class solution that secures Hortonworks Data Platform leveraging an organization’s existing Active Directory infrastructure to deliver identity, access control, privilege management and user-level auditing.


Information in this document, including URL and other Internet Web site references, is subject to change without notice. Unless otherwise noted, the example companies, organizations, products, domain names, email addresses, logos, people, places and events depicted herein are fictitious, and no association with any real company, organization, product, domain name, e-mail address, logo, person, place or event is intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical, photocopying, recording, or otherwise), or for any purpose, without the express written permission of Centrify Corporation.

Centrify may have patents, patent applications, trademarks, copyrights, or other intellectual property rights covering subject matter in this document. Except as expressly provided in any written license agreement from Centrify, the furnishing of this document does not give you any license to these patents, trademarks, copyrights, or other intellectual property.

© 2015 Centrify Corporation. All rights reserved.

Centrify, DirectControl and DirectAudit are registered trademarks and Centrify Suite, DirectAuthorize, DirectSecure and DirectManage are trademarks of Centrify Corporation in the United States and/or other countries. Microsoft, Active Directory, Windows, Windows NT, and Windows Server are either registered trademarks or trademarks of Microsoft Corporation in the United States and/or other countries.

The names of actual companies and products mentioned herein may be the trademarks of their respective owners.


Contents

Benefits of Integrating with Centrify
Planning for Active Directory Integration
  Basic prerequisites
  Planning the organizational units to use
  Planning to use Centrify zones for Hadoop clusters
Creating Active Directory Organizational units
Installing Centrify DirectManage Access
Creating zones and defining a user profile
Assigning a role to a user in a zone
Integrating Hortonworks and Centrify
  Prepare the virtual machines
  Install the Centrify agent
  Install the Centrify LDAP proxy
  Install Hortonworks on the first node in the cluster
Enabling security for the cluster
  Console after services have successfully restarted in secure mode
Verifying Active Directory managed Service Accounts
Maintaining your Centrify Hadoop environment
  Keeping the Hadoop service account keytab up to date
  Configuring Active Directory user accounts not to expire
  Configuring Kerberos credentials not to expire
Zone enable Hadoop Accounts
Validating cluster security
Conclusion
How to Contact Centrify


Benefits of Integrating with Centrify

Centrify Server Suite is an enterprise-class solution that supports the Hortonworks implementation of Hadoop. Together, Centrify and Hortonworks allow you to use your organization’s existing Active Directory infrastructure to deliver access control, privilege management, and user-level auditing.

By installing the Centrify agent on each node in the Hadoop cluster, you can provide identity and access management for the users who will log on to computers in the cluster with their Active Directory credentials.

Centrify Server Suite provides identity, access and privilege management for the Hortonworks Data Platform in the following ways:

• Simplifies AD integration for Hortonworks to run in secure mode.

• Automates service account credential management.

• Simplifies access with AD-based user single sign-on authentication.

• Ensures regulatory compliance with least privilege and auditing.

• Uses developer SDKs for secure client application access to Hadoop.

Planning for Active Directory Integration

The default Hadoop security architecture is based on Kerberos, which is also the core infrastructure for Active Directory. As a result, all principals are user principals, and there will be an Active Directory account for each service account that requires a Kerberos key table (keytab) file. For example, a 2-node cluster with 6 unique distributed services will require 12 Active Directory accounts, each of which will require a unique Kerberos keytab file.
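The sizing rule above (accounts = nodes × services) is easy to sketch as a quick shell check; the node and service counts below are the example values from the text:

```shell
# Each distributed service needs one AD account (and one keytab file) per node.
nodes=2       # example node count from the text
services=6    # unique distributed services from the text
accounts=$((nodes * services))
echo "AD accounts / keytab files required: ${accounts}"
```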

The key to managing Hadoop clusters in Active Directory is the addition of a cluster prefix to the associated Hortonworks Kerberos principal. The cluster prefix ensures that the user principal name (UPN) and service principal name (SPN) for the account each cluster depends upon are unique across the Active Directory forest.

After you install the Centrify agent on each node, you can use Centrify to manage user and service principals and corresponding keytab files on those computer nodes or centrally from a Windows console on an administrator’s workstation.

You should outline a naming convention for all Hadoop service principals that will reside in Active Directory. Ideally, you should be able to identify the service, cluster, and host by the naming convention you establish. Keep in mind that the sAMAccountName attribute has a maximum length of 20 characters and must be unique across the Active Directory forest.
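A naming convention can be checked against the 20-character sAMAccountName limit before any accounts are created. The sketch below assumes a hypothetical `<clusterprefix>-<service>` scheme (the service names are illustrative, not prescribed by the text):

```shell
# Hypothetical naming convention: <clusterprefix>-<service>, e.g. hwx4-hdfs.
# Flag any candidate sAMAccountName longer than the 20-character AD limit.
cluster="hwx4"
for svc in hdfs yarn mapred zookeeper ambari-qa; do
  name="${cluster}-${svc}"
  if [ "${#name}" -le 20 ]; then
    echo "ok:       ${name} (${#name} chars)"
  else
    echo "too long: ${name} (${#name} chars)"
  fi
done
```

Running the same check with a longer prefix (or a fully qualified host in the name) quickly shows where the 20-character ceiling bites.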


Basic prerequisites

• Active Directory must be installed and at least one domain controller available.

• You should have a Windows workstation joined to the domain where you can run administrative consoles.

• You should have access to at least three physical or virtual Linux computers to use as Hadoop nodes.

• You should have Centrify Server Suite software installed or available to be installed.

• You can request a free trial of Centrify Server Suite by filling out the form at http://www.centrify.com/free-trial/server-suite-form/ on the Centrify website and specifying Hadoop in the Comments field.

• You should have Centrify Server Suite documentation available for reference.

• You can download documentation from http://community.centrify.com/t5/custom/page/page-id/Centrify-Documentation after you register your free trial and set up your Centrify account.

Planning the organizational units to use

You should use an Active Directory organizational unit (OU) to manage all your Hadoop clusters such as OU=centrifyse (as shown in the figure on page 7). Your Active Directory domain administrators might need to delegate administrative rights of this OU to you or your technical lead. The Linux identity, access information, and privilege information are stored within the OU that was created for you (OU=centrifyse).

Each cluster should have its own OU to independently manage its nodes and service accounts. The OU name should reflect the name of the cluster, for example, OU=HWX4 (the name chosen here represents Hortonworks, cluster 4). This cluster-level OU is usually created within the OU that was created by the Active Directory administrator and delegated to you so that you can create an OU for each Hadoop cluster and manage the accounts and policies yourself.

Planning to use Centrify zones for Hadoop clusters

Centrify uses the Zones container to store the access and privilege permissions for the selected Active Directory users that you authorize to access each Hadoop cluster. A typical setup for Hadoop is to create a Global zone (ou=zones, ou=Global) containing unique child zones for each Hadoop cluster that you deploy. This arrangement ensures separation of duties and enables delegated administration. You should use the same name you used for the cluster prefix for the child zone, for example, ou=hwx4.


Creating Active Directory Organizational units

You are now ready to create Active Directory organizational units for the Hadoop cluster. The figure on page 7 shows Active Directory Users and Computers (ADUC) after you perform this procedure.

1. On the domain controller, open ADUC.

2. Right-click the domain (centrify.vms) and then select New > Organizational Unit.

3. Type the name of the top-level Hadoop OU, then click OK.

For example, you might use a format such as this:

OU=centrifyse, DC=Company, DC=com

4. Create new OUs for each cluster and for other required and optional objects that support Hadoop.

a. Select the top-level Hadoop OU (centrifyse in the figure on page 7) and right-click.

b. Select New > Organizational Unit.

c. Type the name of the cluster OU (for example, HWX1), then click OK.

d. Repeat these steps for additional cluster OUs (for example, HWX2, HWX3, HWX4, and so on).

5. To manage computer nodes in the cluster separately from user accounts and service accounts in the cluster, create additional OUs for computer nodes (OU=Nodes) and user and service accounts (OU=Accounts).

a. Select the cluster-specific OU (for example, HWX4), and right-click.

b. Select New > Organizational Unit.

c. Create a Nodes OU for computer nodes, then repeat these steps to create an Accounts OU.


Installing Centrify DirectManage Access

You are now ready to install Centrify Server Suite on a Windows administrator’s workstation.

Note If DirectManage Access is already installed, go to “Creating zones and defining a user profile” and continue from there.

If you downloaded the documentation, you can use the Centrify Server Suite Quick Start Guide to guide you through the next steps.

1. Open the Centrify Server Suite ISO or ZIP file for Windows 32-bit or Windows 64-bit on the Windows workstation.

2. Click Access on the Getting Started page or run the setup program in the DirectManage folder.

3. Follow the prompts displayed to select the suite edition and components to install.

Creating zones and defining a user profile

Use Access Manager to set up the Active Directory domain and create the zones for the Hadoop cluster.

1. Open Access Manager to start the Setup Wizard.

2. Follow the prompts displayed to create the containers for Licenses and Zones.

Set up the containers so that they match the OU structure that you created in “Creating Active Directory organizational units.”

3. In Access Manager, create a top level zone (for example, Global) that will contain all child zones for the Hadoop integration.


4. Create a child zone for each cluster within the top level (Global) zone. Name the child zones so that they are easily identified as cluster zones (for example, HWX4, which matches the name of the cluster OU that you created in “Creating Active Directory organizational units”). If it would help with the organization of your environment, you can optionally create an intermediary zone (such as the Hadoop zone shown in the figure below) between the top level zone and the cluster zones.

5. Add Active Directory users to either the top level (Global) zone or the intermediary (Hadoop) zone. In this example, it is recommended that you add the users to the Hadoop zone. The users that you add will have permission to log into computers in the clusters.

a. Select the top level zone (Global) or the intermediary zone (Hadoop), and right-click.

b. Select Add User to search for and select an existing Active Directory user.

6. Select Define user UNIX profile, and deselect Assign roles.

7. Accept the defaults for all fields, click Next, then click Finish.


Assigning a role to a user in a zone

User profiles are inherited by child zones, so the users that you added to the top level (Global) zone or intermediary (Hadoop) zone automatically have a profile in the cluster (HWX4) zone. To log on to a computer, however, a user must have both a profile and a role assignment. Access Manager includes a default UNIX Login role that you can assign to enable users to log on.

1. Expand the top level (Global) zone or intermediary (Hadoop) zone, Child Zones, the cluster (HWX4) zone, and Authorization.

2. Right-click Role Assignments, then click Assign Role.

3. Select the UNIX Login role from the list of roles and click OK.

4. Click Add AD Account to search for and select each Active Directory user you added to the Global zone or Hadoop zone, then click OK.

Integrating Hortonworks and Centrify

Prepare the virtual machines

Perform the following steps to prepare the virtual environment for testing with three CentOS computers:

1. Provision three new CentOS 6.x virtual machines:

• For node 1 (named hwx4n1 in this document):

2 processors, 8GB RAM, 1 HD (120 GB).

• For node 2 (named hwx4n2):

2 processors, 8GB RAM, 1 HD (120 GB).

• For node 3 (named hwx4n3):

2 processors, 8GB RAM, 1 HD (120 GB).

2. Ensure that the DNS is configured correctly:

a. Create the corresponding DNS address (A) records in the appropriate DNS zone. In this document we are using the centrifyimage.vms DNS zone.

b. Create the proper reverse DNS entries.


Note: Perform the following steps on each node.

3. Perform a yum update.

4. Disable and stop the iptables service:

chkconfig iptables off && service iptables stop

5. Enable the ntpd service:

chkconfig ntpd on

6. Disable SElinux:

Edit /etc/selinux/config and set SELINUX=disabled.

7. Set enabled=0 in /etc/yum/pluginconf.d/refresh-packagekit.conf.

8. On your first node (hwx4n1):

a. Run the ssh-keygen command and copy the contents of id_rsa.pub to /root/.ssh/authorized_keys.

b. Attempt an ssh connection as root to the second node (hwx4n2.centrify.vms) and the third node (hwx4n3.centrify.vms).

c. Copy the /root/.ssh/authorized_keys file to hwx4n2:/root/.ssh and hwx4n3:/root/.ssh.
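Steps 6 and 7 above can be scripted. The sketch below operates on temporary copies with minimal stand-in contents, so it is safe to run anywhere; on a real node you would run the same sed edits as root against /etc/selinux/config and /etc/yum/pluginconf.d/refresh-packagekit.conf:

```shell
# Sketch of steps 6 and 7 against temporary copies of the files.
# The stand-in file contents are illustrative, not the full real files.
selinux_cfg=$(mktemp)
printf 'SELINUX=enforcing\nSELINUXTYPE=targeted\n' > "$selinux_cfg"
sed -i 's/^SELINUX=.*/SELINUX=disabled/' "$selinux_cfg"

pk_cfg=$(mktemp)
printf '[main]\nenabled=1\n' > "$pk_cfg"
sed -i 's/^enabled=.*/enabled=0/' "$pk_cfg"

grep '^SELINUX=' "$selinux_cfg"   # SELINUX=disabled
grep '^enabled=' "$pk_cfg"        # enabled=0
```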

Install the Centrify agent

You can now install the Centrify agent on each node computer in the cluster and join each node computer to an Active Directory domain.

1. Download the appropriate tarred and zipped Centrify agent for the operating system of the virtual machines. Copy the agent .tgz file to each node computer in the cluster.

2. Unzip and extract the agent package.

3. Run the install.sh script interactively.

4. Open /etc/centrifydc/centrifydc.conf for editing and make the following changes.

Note: This step is required; otherwise the cluster will not start. Centrify must not create a servicePrincipalName for the http service, because Hortonworks needs to create it later.

a. Uncomment the adclient.krb5.service.principals line.

b. Remove http from the adclient.krb5.service.principals line.

5. Join each computer to Active Directory. For example:


adjoin -z zone -c "container" -V -u user domainname
adjoin -z hwx3 -c ou=nodes,ou=hwx3,ou=hadoop -V -u <your AD loginname> company.com

6. Optional: Install the Centrify Audit agent and enable auditing.

rpm -Uvh centrifyda-<version>

You should see confirmation that the computer has successfully joined Active Directory. After the computer restarts, you can log on using the Active Directory user name and password of a user to whom you previously assigned the UNIX Login role.
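The centrifydc.conf edit in step 4 can also be done non-interactively with sed. The sketch below runs against a temporary copy whose commented sample line is illustrative (the service list in your file may differ); on a node you would apply the same edits to /etc/centrifydc/centrifydc.conf:

```shell
# Sketch of step 4 against a temporary copy of centrifydc.conf.
# The commented sample line below is illustrative only.
conf=$(mktemp)
printf '# adclient.krb5.service.principals: ftp cifs http nfs\n' > "$conf"
# a. uncomment the line; b. remove the http service from it
sed -i -e 's/^# *adclient\.krb5\.service\.principals/adclient.krb5.service.principals/' \
       -e '/^adclient\.krb5\.service\.principals/s/ http//' "$conf"
cat "$conf"
```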

Install the Centrify LDAP proxy

Install the Centrify LDAP proxy on the first node (hwx4n1.centrify.vms) in the cluster. Hortonworks 2.2 features an “Enable Kerberos” wizard that requires the ability to communicate with Active Directory over secured LDAP (LDAPS). The procedure for installing and setting up the secured LDAP proxy is described in Centrify’s UNIX Administrator’s Guide (PDF) beginning on page 194.

Install Hortonworks on the first node in the cluster

You can now install Hortonworks on the first node (hwx4n1, which is the name node) in the cluster.

1. Log on as a root-level user to node hwx4n1.

2. Add the Hortonworks repo:

wget -nv http://public-repo-1.hortonworks.com/ambari/centos6/2.x/updates/2.0.0/ambari.repo -O /etc/yum.repos.d/ambari.repo

3. Install the epel repository:

yum install epel-release

4. Confirm the repos:

yum repolist

5. Install the Ambari server:

yum install ambari-server

The server installer will prompt you for dependencies and to accept the Oracle JDK EULA.

6. Run the “ambari-server setup” program and accept all the defaults.

7. Start the ambari server:

ambari-server start

8. Open a web browser, go to your-hostname:8080, and log in with username “admin” and password “admin”.


9. Select “Launch Install Wizard” from the Welcome to Apache Ambari webpage and name your cluster (in this document, HWX4).

10. On the Select Stack page, select HDP 2.2.

11. On the Install Options page, under Target Hosts, enter the FQDNs for the Hadoop servers; on the host registration page, paste the contents of /root/.ssh/id_rsa from the first node.

12. On the Confirm Hosts page, the installation of the Ambari agents will begin.


13. On the Choose Services page, select every service.

14. Accept the defaults on the Assign Masters page and the Assign Slaves and Clients page.

15. On the Customize Services page, select Misc to add a cluster prefix (“hwx4-”, matching the name of the cluster you entered earlier) to all users and groups.

Note: This step allows for multiple clusters within Active Directory and must be done before Hadoop software deployment.


16. Accept the changes to the various services such as “HDFS User” and “Hadoop Group.”

17. Click Next until you get to the Review page. Review your options and click Next to navigate to the Install, Start and Test progress page.


18. On the Summary page, click Complete.

You will be taken to the Ambari Dashboard.

19. Test the unsecured cluster and authorize the dwirth user.

[root@hwx4n1 Desktop]# hadoop fs -ls /user
Found 5 items
drwxrwx---   - hwx4-ambari-qa hwx4-hdfs  0 2015-05-15 09:46 /user/hwx4-ambari-qa
drwxr-xr-x   - hwx4-hcat      hwx4-hdfs  0 2015-05-15 09:44 /user/hwx4-hcat
drwx------   - hwx4-hive      hwx4-hdfs  0 2015-05-15 09:41 /user/hwx4-hive
drwxrwxr-x   - hwx4-oozie     hwx4-hdfs  0 2015-05-15 09:42 /user/hwx4-oozie
drwxrwxr-x   - hwx4-spark     hwx4-hdfs  0 2015-05-15 09:36 /user/hwx4-spark
[root@hwx4n1 Desktop]# cd /usr/hdp/2.2.4.2-2/hadoop-mapreduce/
[root@hwx4n1 hadoop-mapreduce]# su hwx4-hdfs


[hwx4-hdfs@hwx4n1 hadoop-mapreduce]$ hadoop fs -mkdir /user/dwirth
[hwx4-hdfs@hwx4n1 hadoop-mapreduce]$ hadoop fs -chown dwirth:dwirth /user/dwirth
[hwx4-hdfs@hwx4n1 hadoop-mapreduce]$ hadoop jar hadoop-mapreduce-examples.jar pi 5 10
Number of Maps  = 5
Samples per Map = 10
Wrote input for Map #0
Wrote input for Map #1
Wrote input for Map #2
Wrote input for Map #3
Wrote input for Map #4
Starting Job
15/05/15 10:04:34 INFO impl.TimelineClientImpl: Timeline service address: http://hwx4n2.centrify.vms:8188/ws/v1/timeline/
15/05/15 10:04:34 INFO client.RMProxy: Connecting to ResourceManager at hwx4n2.centrify.vms/192.168.1.118:8050
. . .
Job Finished in 21.658 seconds
Estimated value of Pi is 3.28000000000000000000

Enabling security for the cluster

Configure the cluster to operate in secure mode by using the Kerberos infrastructure that was enabled by the Centrify agent on each of the nodes.

1. In Ambari, go to Admin > Kerberos and click Enable Security.

2. On the Get Started page, select Existing Active Directory, then confirm each of the listed prerequisites.

3. Click Next.

4. Complete the KDC and Kadmin page. The required fields are:

• KDC Host: the FQDN of your domain controller.

• REALM Name: your domain name, in all uppercase.

• LDAP url: the URL of your domain controller, in the secured ldaps:// format.

• Container DN: the location where the service principals (accounts) will be created, in domain component format.

• Kadmin Host: the domain controller that the admin principals authenticate to.


• Admin principal: the Active Directory account that has permission to create service principals.

• Admin password: the admin principal’s password.
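Purely as an illustration, with hypothetical host and account names for a company.com forest (these values are not from the original document), the fields might be filled in as:

```
KDC Host:        dc1.company.com
REALM Name:      COMPANY.COM
LDAP url:        ldaps://dc1.company.com
Container DN:    ou=accounts,ou=hwx4,ou=centrifyse,dc=company,dc=com
Kadmin Host:     dc1.company.com
Admin principal: hadoop-admin
```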

5. In Advanced krb5-conf, deselect Manage Kerberos client krb5.conf.

The Centrify agent will update the local Kerberos files.

6. Select Test KDC Connection.

The expected response is Connection OK.

7. Review the Install and Test Kerberos Client page and select Next.


8. Select Advanced > Configure Identities and expand the Storm link.

9. Add the cluster prefix “hwx4-” to the Storm principal storm_principal_name, then click Next.

10. Select Kerberize Cluster.

11. Click Next to re-start and test the service.

Note: In some cases, the Start and Test Services step fails and subsequent retries fail as well. To finish the Kerberos installation, select Complete and then Regenerate Keytabs.


Console after services have successfully restarted in secure mode

Verifying Active Directory managed Service Accounts

In ADUC, browse to the Hadoop/HWX3 OU; you should see your service accounts in Active Directory.


Note: The number of and names of the keytabs will vary from node to node.

On each host, you should see the keytabs with the appropriate permissions.

Maintaining your Centrify Hadoop environment

This section describes the actions you should take to ensure that your integrated Centrify Hadoop environment continues to operate correctly.


Hadoop creates Kerberos principals for service accounts. Those principals are governed by the same Active Directory policies that govern user accounts and computer accounts. That arrangement differs from MIT Kerberos implementations, and it requires the following maintenance procedures after your environment is set up.

Keeping the Hadoop service account keytab up to date

Centrify Server Suite automatically maintains the keytab entries for computer accounts when the Centrify agent updates keytab entries every 28 days (or at a different interval if you specify a value other than the default of 28 days). However, other keytab entries, such as those created for user accounts and that reside on each node, are not automatically refreshed. If you created the Hadoop service account as a user account, you must ensure that keytab entries for Hadoop-specific user principals are automatically updated.

Note: You do not need to perform this procedure if you created the Hadoop service account as a computer account.

You can perform this configuration by writing a script that issues the adkeytab -C command, so that the keytab entry for the specified user account is updated. When the Centrify agent updates the user account, it obtains a new key version number (KVNO). The script must update every keytab on every node in the cluster.
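Such a script might look like the dry-run sketch below. It only prints the commands it would run; the node and account names are hypothetical, and the exact adkeytab arguments should be checked against the adkeytab man page before use:

```shell
# Dry-run sketch of a keytab-refresh loop: commands are printed, not executed.
# Node and account names are hypothetical examples.
nodes="hwx4n1.centrify.vms hwx4n2.centrify.vms hwx4n3.centrify.vms"
accounts="hwx4-hdfs hwx4-yarn hwx4-mapred"
count=0
for node in $nodes; do
  for acct in $accounts; do
    echo ssh "root@${node}" adkeytab -C "$acct"
    count=$((count + 1))
  done
done
echo "${count} keytab updates scheduled"
```

Dropping the `echo` before `ssh` (after verifying the printed commands) would turn the sketch into a working loop, which could then be run from cron at an interval shorter than the account password rotation period.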

Also, you must ensure that Hadoop service accounts are zone enabled.

Configuring Active Directory user accounts not to expire

Active Directory user accounts (user principals) are governed by Active Directory group policy objects for users. Organizations typically change user passwords every 30 to 60 days, or automatically expire accounts.

If you created the Hadoop service account as a user account, you must ensure that passwords for Hadoop-specific user principals are set to never expire.

You do not need to perform this procedure if you created the Hadoop service account as a computer account.

To perform this configuration in Active Directory Users and Computers (ADUC):

1. Go to the Users organizational unit.

2. Right-click the user account that you want to have never expire.

3. Select Properties.

4. Select the Account tab.

5. Select the Password never expires option.
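Behind that checkbox, ADUC sets the documented UF_DONT_EXPIRE_PASSWD bit (0x10000) in the account's userAccountControl attribute. The sketch below shows only the flag arithmetic, which can be handy when auditing Hadoop service accounts in bulk; the helper names are hypothetical, and actually reading or writing the attribute (via LDAP or PowerShell) is environment-specific and left out.

```python
# The ADUC "Password never expires" checkbox maps to one bit of the AD
# userAccountControl attribute. Flag arithmetic only; no directory access.
UF_DONT_EXPIRE_PASSWD = 0x10000   # documented userAccountControl flag
UF_NORMAL_ACCOUNT = 0x200         # typical enabled user account

def password_never_expires(user_account_control: int) -> bool:
    """Return True if the never-expire bit is set on the attribute value."""
    return bool(user_account_control & UF_DONT_EXPIRE_PASSWD)

def set_never_expire(user_account_control: int) -> int:
    """Return the attribute value with the never-expire bit set."""
    return user_account_control | UF_DONT_EXPIRE_PASSWD

print(password_never_expires(UF_NORMAL_ACCOUNT))                    # False
print(password_never_expires(set_never_expire(UF_NORMAL_ACCOUNT)))  # True
```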


Configuring Kerberos credentials not to expire

For your Centrify Hadoop environment to operate correctly in the long term, you must ensure that Kerberos tickets that are linked to user principals do not expire. Starting with Server Suite 2015.1, you can perform this configuration in one of these ways:

• Through the krb5.cache.infinite.renewal parameter in /etc/centrifydc/centrifydc.conf. When you set this parameter to true, user credentials are automatically reissued when they expire. See the Configuration and Tuning Reference Guide for more information about this parameter.

• Through the Renew credentials automatically group policy. See the Group Policy Guide for more information about this group policy.
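For the first option, the change amounts to one line in the agent configuration file on each node. The snippet below is a sketch; confirm the exact syntax in the Configuration and Tuning Reference Guide.

```
# /etc/centrifydc/centrifydc.conf
krb5.cache.infinite.renewal: true
```

After editing the file, reload the agent configuration (for example, with adreload) so the new value takes effect.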

Zone-enabling Hadoop accounts

The Ambari installer automatically creates a number of local accounts on each node with the cluster prefix (for example, "hwx3-"). In addition, the RPM installer creates accounts without the cluster prefix, as shown in this example /etc/passwd excerpt:

postgres:x:26:26:PostgreSQL Server:/var/lib/pgsql:/bin/bash
hwx3-ambari-qa:x:1001:501::/home/hwx3-ambari-qa:/bin/bash
hwx3-nobody:x:502:501::/home/hwx3-nobody:/bin/bash
hwx3-zookeeper:x:503:501::/home/hwx3-zookeeper:/bin/bash
hwx3-mapred:x:504:501::/home/hwx3-mapred:/bin/bash
hwx3-hdfs:x:505:501::/home/hwx3-hdfs:/bin/bash
hwx3-yarn:x:506:501::/home/hwx3-yarn:/bin/bash
hwx3-nagios:x:507:503::/home/hwx3-nagios:/bin/bash
zookeeper:x:496:493:ZooKeeper:/var/run/zookeeper:/bin/bash
yarn:x:495:492:Hadoop Yarn:/var/lib/hadoop-yarn:/bin/bash
mapred:x:494:491:Hadoop MapReduce:/var/lib/hadoop-mapreduce:/bin/bash
hdfs:x:493:490:Hadoop HDFS:/var/lib/hadoop-hdfs:/bin/bash
rrdcached:x:492:489:rrdcached:/var/rrdtool/rrdcached:/sbin/nologin
nagios:x:491:488:nagios:/var/log/nagios:/bin/sh

After zone-enabling all of the above accounts that have a cluster prefix, the local accounts can be removed from every node in the cluster.
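The verify-then-delete step can be sketched as a dry run: confirm that each cluster-prefixed name now resolves through Active Directory before removing its local /etc/passwd entry. adquery user and userdel are standard commands, but the account list and the helper itself are illustrative; run the printed commands only after the AD check succeeds on every node.

```python
# Sketch: before deleting a local Hadoop account from a node, confirm the
# same name now resolves through Centrify/Active Directory. The account
# names reuse the cluster-prefixed examples from this guide.
import subprocess

CLUSTER_ACCOUNTS = ["hwx3-hdfs", "hwx3-yarn", "hwx3-mapred", "hwx3-zookeeper"]

def cleanup_commands(accounts):
    """Pair an 'adquery user' check with the 'userdel' that is safe to run
    only after the check succeeds on the node in question."""
    return [(["adquery", "user", a], ["userdel", a]) for a in accounts]

if __name__ == "__main__":
    for check, delete in cleanup_commands(CLUSTER_ACCOUNTS):
        print("verify:", " ".join(check))
        print("then:  ", " ".join(delete))
        # Example of actually running the check (requires a joined node):
        # subprocess.run(check, check=True)
```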


In the preceding example, the cluster-specific accounts hwc9-nagios and hwc9-yarn are linked to normal AD accounts (Nagios and yarn). The exception is the headless accounts, which are created during keytab creation with specific UPNs and are cluster-wide; these headless accounts must still be zone-enabled.

Validating cluster security

Verify that users cannot access the cluster without first obtaining a Kerberos credential by logging on to Active Directory. If a user does not have Kerberos credentials and tries to run a Hadoop job after logging in to a cluster node, the attempt fails:

Using Kerberos authentication
Using principal [email protected]
Got host ticket host/[email protected]
login as dwirth
Successful Kerberos connection
------------- WARNING -------------
THIS IS A PRIVATE COMPUTER SYSTEM. All computer systems may be monitored for
all lawful purposes. This is to ensure that their use is authorized. During
monitoring, information may be examined, recorded, copied and used for
authorized purposes. All information, including personal information, placed
on or sent over this system may be monitored. Use of this system, authorized
or unauthorized, constitutes consent to monitoring of this system. Use of
this system constitutes consent to monitoring for these purposes.
-----------
Created home directory
[dwirth@hwx4n1 ~]$ klist
klist: No credentials cache found (ticket cache FILE:/tmp/krb5cc_1040188499)
[dwirth@hwx4n1 ~]$ hadoop fs -ls /user
15/05/15 14:03:52 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
ls: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "hwx4n1.centrify.vms/192.168.1.117"; destination host is: "hwx4n1.centrify.vms":8020;
[dwirth@hwx4n1 ~]$ cd /usr/hdp/2.2.4.2-2/hadoop-mapreduce/
[dwirth@hwx4n1 hadoop-mapreduce]$ hadoop jar hadoop-mapreduce-examples.jar pi 5 10
Number of Maps  = 5
Samples per Map = 10
15/05/15 14:05:47 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
java.io.IOException: Failed on local exception: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]; Host Details : local host is: "hwx4n1.centrify.vms/192.168.1.117"; destination host is: "hwx4n1.centrify.vms":8020;
        at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:772)
        at org.apache.hadoop.ipc.Client.call(Client.java:1473)
        at org.apache.hadoop.ipc.Client.call(Client.java:1400)
        at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:232)
        at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:768)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)
        at com.sun.proxy.$Proxy15.getFileInfo(Unknown Source)
        at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:2007)
        at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1136)
        at org.apache.hadoop.hdfs.DistributedFileSystem$19.doCall(DistributedFileSystem.java:1132)
        at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
        at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1132)
        at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1423)
        at org.apache.hadoop.examples.QuasiMonteCarlo.estimatePi(QuasiMonteCarlo.java:278)
        at org.apache.hadoop.examples.QuasiMonteCarlo.run(QuasiMonteCarlo.java:354)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
        at org.apache.hadoop.examples.QuasiMonteCarlo.main(QuasiMonteCarlo.java:363)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:71)
        at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
        at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.io.IOException: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at org.apache.hadoop.ipc.Client$Connection$1.run(Client.java:681)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
        at org.apache.hadoop.ipc.Client$Connection.handleSaslConnectionFailure(Client.java:644)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:731)
        at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:369)
        at org.apache.hadoop.ipc.Client.getConnection(Client.java:1522)
        at org.apache.hadoop.ipc.Client.call(Client.java:1439)
        ... 34 more
Caused by: javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:212)
        at org.apache.hadoop.security.SaslRpcClient.saslConnect(SaslRpcClient.java:413)
        at org.apache.hadoop.ipc.Client$Connection.setupSaslConnection(Client.java:554)
        at org.apache.hadoop.ipc.Client$Connection.access$1800(Client.java:369)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:723)
        at org.apache.hadoop.ipc.Client$Connection$2.run(Client.java:719)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:415)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
        at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:718)
        ... 37 more
Caused by: GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)
        at sun.security.jgss.krb5.Krb5InitCredential.getInstance(Krb5InitCredential.java:147)
        at sun.security.jgss.krb5.Krb5MechFactory.getCredentialElement(Krb5MechFactory.java:121)
        at sun.security.jgss.krb5.Krb5MechFactory.getMechanismContext(Krb5MechFactory.java:187)
        at sun.security.jgss.GSSManagerImpl.getMechanismContext(GSSManagerImpl.java:223)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:212)
        at sun.security.jgss.GSSContextImpl.initSecContext(GSSContextImpl.java:179)
        at com.sun.security.sasl.gsskerb.GssKrb5Client.evaluateChallenge(GssKrb5Client.java:193)
        ... 46 more


If a user has Kerberos credentials and tries to run a Hadoop job, the attempt succeeds:

[dwirth@hwx4n1 hadoop-mapreduce]$ kinit
Password for [email protected]:
[dwirth@hwx4n1 hadoop-mapreduce]$ klist
Ticket cache: FILE:/tmp/krb5cc_1040188499
Default principal: [email protected]
Valid starting     Expires            Service principal
05/15/15 14:05:52  05/16/15 00:05:56  krbtgt/[email protected]
        renew until 05/16/15 14:05:52
[dwirth@hwx4n1 hadoop-mapreduce]$ hadoop fs -ls /user
Found 7 items
drwxr-xr-x   - dwirth         dwirth    0 2015-05-15 10:03 /user/dwirth
drwxrwx---   - hwx4-ambari-qa hwx4-hdfs 0 2015-05-15 09:46 /user/hwx4-ambari-qa
drwxr-xr-x   - hwx4-hcat      hwx4-hdfs 0 2015-05-15 09:44 /user/hwx4-hcat
drwxr-xr-x   - hwx4-hdfs      hwx4-hdfs 0 2015-05-15 10:04 /user/hwx4-hdfs
drwx------   - hwx4-hive      hwx4-hdfs 0 2015-05-15 09:41 /user/hwx4-hive
drwxrwxr-x   - hwx4-oozie     hwx4-hdfs 0 2015-05-15 09:42 /user/hwx4-oozie
drwxrwxr-x   - hwx4-spark     hwx4-hdfs 0 2015-05-15 09:36 /user/hwx4-spark
[dwirth@hwx4n1 hadoop-mapreduce]$ hadoop jar hadoop-mapreduce-examples.jar pi 5 10
Number of Maps  = 5
Samples per Map = 10
Wrote input for Map #0
Wrote input for Map #1
Wrote input for Map #2
Wrote input for Map #3
Wrote input for Map #4
Starting Job
15/05/15 14:06:20 INFO impl.TimelineClientImpl: Timeline service address: http://hwx4n2.centrify.vms:8188/ws/v1/timeline/
15/05/15 14:06:20 INFO client.RMProxy: Connecting to ResourceManager at hwx4n2.centrify.vms/192.168.1.118:8050
15/05/15 14:06:20 INFO hdfs.DFSClient: Created HDFS_DELEGATION_TOKEN token 1 for dwirth on 192.168.1.117:8020
15/05/15 14:06:20 INFO security.TokenCache: Got dt for hdfs://hwx4n1.centrify.vms:8020; Kind: HDFS_DELEGATION_TOKEN, Service: 192.168.1.117:8020, Ident: (HDFS_DELEGATION_TOKEN token 1 for dwirth)
15/05/15 14:06:21 INFO input.FileInputFormat: Total input paths to process : 5
15/05/15 14:06:21 INFO mapreduce.JobSubmitter: number of splits:5
15/05/15 14:06:21 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1431706809700_0001
15/05/15 14:06:21 INFO mapreduce.JobSubmitter: Kind: HDFS_DELEGATION_TOKEN, Service: 192.168.1.117:8020, Ident: (HDFS_DELEGATION_TOKEN token 1 for dwirth)
15/05/15 14:06:22 INFO impl.YarnClientImpl: Submitted application application_1431706809700_0001
15/05/15 14:06:22 INFO mapreduce.Job: The url to track the job: http://hwx4n2.centrify.vms:8088/proxy/application_1431706809700_0001/
15/05/15 14:06:22 INFO mapreduce.Job: Running job: job_1431706809700_0001
15/05/15 14:06:31 INFO mapreduce.Job: Job job_1431706809700_0001 running in uber mode : false
15/05/15 14:06:31 INFO mapreduce.Job:  map 0% reduce 0%
15/05/15 14:06:45 INFO mapreduce.Job:  map 80% reduce 0%
15/05/15 14:06:46 INFO mapreduce.Job:  map 100% reduce 0%
15/05/15 14:06:50 INFO mapreduce.Job:  map 100% reduce 100%
15/05/15 14:06:51 INFO mapreduce.Job: Job job_1431706809700_0001 completed successfully
15/05/15 14:06:51 INFO mapreduce.Job: Counters: 49
        File System Counters
                FILE: Number of bytes read=116
                FILE: Number of bytes written=719379
                FILE: Number of read operations=0
                FILE: Number of large read operations=0
                FILE: Number of write operations=0
                HDFS: Number of bytes read=1375
                HDFS: Number of bytes written=215
                HDFS: Number of read operations=23
                HDFS: Number of large read operations=0
                HDFS: Number of write operations=3
        Job Counters
                Launched map tasks=5
                Launched reduce tasks=1
                Data-local map tasks=5
                Total time spent by all maps in occupied slots (ms)=59954
                Total time spent by all reduces in occupied slots (ms)=2670
                Total time spent by all map tasks (ms)=59954
                Total time spent by all reduce tasks (ms)=2670
                Total vcore-seconds taken by all map tasks=59954
                Total vcore-seconds taken by all reduce tasks=2670
                Total megabyte-seconds taken by all map tasks=61392896
                Total megabyte-seconds taken by all reduce tasks=2734080
        Map-Reduce Framework
                Map input records=5
                Map output records=10
                Map output bytes=90
                Map output materialized bytes=140
                Input split bytes=785
                Combine input records=0
                Combine output records=0
                Reduce input groups=2
                Reduce shuffle bytes=140
                Reduce input records=10
                Reduce output records=0
                Spilled Records=20
                Shuffled Maps =5
                Failed Shuffles=0
                Merged Map outputs=5
                GC time elapsed (ms)=939
                CPU time spent (ms)=5780
                Physical memory (bytes) snapshot=3196092416
                Virtual memory (bytes) snapshot=9394270208
                Total committed heap usage (bytes)=3326083072
        Shuffle Errors
                BAD_ID=0
                CONNECTION=0
                IO_ERROR=0
                WRONG_LENGTH=0
                WRONG_MAP=0
                WRONG_REDUCE=0
        File Input Format Counters
                Bytes Read=590
        File Output Format Counters
                Bytes Written=97
Job Finished in 31.143 seconds
Estimated value of Pi is 3.28000000000000000000
[dwirth@hwx4n1 hadoop-mapreduce]$

Conclusion

Centrify Server Suite is the industry's most widely deployed solution for securing identity on Linux and Windows servers and applications. It provides several benefits for Hadoop and Big Data environments, including:

• Simple and secure access to Hadoop environments. Centrify makes it simple to run Hadoop in secure mode by leveraging an organization's existing identity management infrastructure (Active Directory), without the hassle of introducing alternative solutions that do not scale and are not enterprise-ready. Centrify Server Suite also saves money by letting organizations leverage existing skill sets within the enterprise.


• Single sign-on for IT administrators and big data users. By extending the power of Active Directory's Kerberos and LDAP capabilities to Hadoop clusters, Centrify Server Suite lets organizations leverage existing Active Directory-based authentication for Hadoop administrators and end users. SSO in Big Data environments makes users more productive and secure by allowing them to log in as themselves rather than sharing privileged accounts.

• Secure machine-to-machine communications. Centrify Server Suite automates Hadoop service account management within Active Directory. By automating machine-to-machine credential management, Centrify not only secures user identity but also system account and service account identity.

• Reduced identity-related risks and greater regulatory compliance. Hadoop environments now store most, if not all, of an organization's most important data. Centrify Server Suite ties user activity back to an individual in Active Directory, making that data more secure, and reports on who did what across Hadoop clusters, nodes and services. By enforcing access controls and least-privilege security across Hadoop, Centrify delivers cost-effective compliance through combined access and activity reporting.

• Certified solution for superior compatibility and support. Centrify has worked closely with Hortonworks and has received product certification. This ensures product compatibility and technical support collaboration between customers, Hortonworks, and Centrify.


How to Contact Centrify

North America (and all locations outside EMEA)
Centrify Corporation
3393 Octavius Dr, Suite 100
Santa Clara, CA 95054
United States
Sales: +1 (669) 444-5200
Online: www.centrify.com/contact

Europe, Middle East, Africa (EMEA)
Centrify EMEA
Lilly Hill House
Lilly Hill Road
Bracknell, Berkshire RG12 2SJ
United Kingdom
Sales: +44 (0) 1344 317950

