Installation Instructions for Hot Fix B8T004
Hadoop on Linux x64
Hot fix B8T004 addresses the issue(s) in SAS Threaded Kernel DS2 and FedSQL Language Processors 9.41_M3 for Hadoop on Linux x64 as documented
in the Issue(s) Addressed section of the hot fix download page:
https://tshf.sas.com/techsup/download/hotfix/HF2/U47.html#B8T004
The hot fix download, B8T004hl.tar, contains the updated file required to address the documented issues.
IMPORTANT NOTES
- All currently active SAS sessions, daemons, spawners, and servers must be terminated
before applying this hot fix.
- For the 9.4M5 SAS EP for Hadoop, the account that performs the hot fix installation must either
have passwordless SSH set up on all yarn nodes in the Hadoop cluster, or an account with
superuser privileges must be used. A sketch for checking both prerequisites follows this list.
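The following pre-flight check is a minimal sketch, not part of the official hot fix steps. The
node list in NODES is a hypothetical placeholder that you would replace with the yarn nodes of
your cluster.

#!/bin/bash
# Hypothetical pre-flight checks; run before applying the hot fix.
NODES="node1.example.com node2.example.com"   # replace with your yarn nodes

# Look for SAS sessions, daemons, spawners, or servers that are still running.
# The bracketed pattern keeps grep from matching its own process entry.
if ps -ef | grep -i '[s]as' > /dev/null; then
    echo "WARNING: SAS processes appear to be active; terminate them first."
fi

# Confirm passwordless SSH to every yarn node. BatchMode makes ssh fail
# instead of prompting, so a password-protected node reports FAILED.
for node in $NODES; do
    if ssh -o BatchMode=yes "$node" true > /dev/null 2>&1; then
        echo "passwordless SSH to $node: OK"
    else
        echo "passwordless SSH to $node: FAILED"
    fi
done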
INSTALLATION on Cloudera 5.x, Hortonworks 2.x, and MapR 5.x+
- On the NameNode of your Hadoop cluster, create a directory called "hotfix" within your
<EPInstallDir>/SASEPHome location.
mkdir <EPInstallDir>/SASEPHome/hotfix
- Transfer the file B8T004hl.tar to the hot fix directory you created.
- Extract the file.
cd <EPInstallDir>/SASEPHome/hotfix
tar -xvf ./B8T004hl.tar
- Verify that the B8T004 subdirectory now contains the extracted file, sepcorehadphf-13.00000-2.sh.
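As a quick check (a suggested command, not part of the documented steps), list the directory
contents:
ls -l ./B8T004
The output should include sepcorehadphf-13.00000-2.sh.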
- Navigate to the "bin" directory of your SASEPHome directory.
cd <EPInstallDir>/SASEPHome/bin
- Execute the sasep-admin.sh script with the -hotfix flag, followed by the path to the hot fix file.
./sasep-admin.sh -hotfix ../hotfix/B8T004/sepcorehadphf-13.00000-2.sh
- Validate that the yarn nodes of your cluster have been successfully patched by inspecting the
output from the sasep-admin.sh command. You should see the following output corresponding to each
node in your cluster:
INFO: Applying hotfix for SAS Embedded Process for Hadoop on node <NODE>.
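On a large cluster, one way (a suggested sketch, not part of the documented steps) to confirm
that every node reported success is to capture the script output to a file when you run it and
count the INFO lines; hotfix.log is a hypothetical file name:
./sasep-admin.sh -hotfix ../hotfix/B8T004/sepcorehadphf-13.00000-2.sh | tee hotfix.log
grep -c "Applying hotfix for SAS Embedded Process for Hadoop" hotfix.log
The count printed by grep should equal the number of yarn nodes in your cluster.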
INSTALLATION on Cloudera 6.x and Hortonworks 3.x
- On the NameNode of your Hadoop cluster, create a directory called "hotfix" within your
<EPInstallDir>/SASEPHome location.
mkdir <EPInstallDir>/SASEPHome/hotfix
- Transfer the file B8T004hl.tar to the hot fix directory you created.
- Extract the file.
cd <EPInstallDir>/SASEPHome/hotfix
tar -xvf ./B8T004hl.tar
- Verify that the B8T004 subdirectory now contains the extracted file, sepcorehadphf-13.00000-2.sh.
- Move this file to the <EPInstallDir> directory (which is two levels up from the hotfix directory).
mv B8T004/sepcorehadphf-13.00000-2.sh ../..
- Navigate to this directory and run the hot fix file. Running this script by itself patches the
sasep-admin.sh script so that it supports these newer Hadoop distributions.
cd ../..
./sepcorehadphf-13.00000-2.sh
- Navigate to the "bin" directory of your SASEPHome directory.
cd SASEPHome/bin
- Execute the sasep-admin.sh script with the -hotfix flag, followed by the path to the hot fix file.
./sasep-admin.sh -hotfix ../../sepcorehadphf-13.00000-2.sh
NOTE: This operation is expected to report a failure to expand the SAS Embedded Process libraries
on the current (master) node, because you already ran the hot fix file there manually in an
earlier step.
INSTALLATION Amazon Web Services EMR
- As ec2-user, create a directory named <EPInstallDir> on all yarn nodes with permissions 755.
- As ec2-user, copy the sepcorehadp-13.00000-1.sh file to the <EPInstallDir> directory on all
yarn nodes.
- As ec2-user, navigate to the <EPInstallDir> directory on each node and run:
./sepcorehadp-13.00000-1.sh
- As ec2-user, copy the sepcorehadphf-13.00000-2.sh file to the <EPInstallDir> directory on all
yarn nodes.
- As ec2-user, run <EPInstallDir>/sepcorehadphf-13.00000-2.sh on all yarn nodes.
- As ec2-user, perform the Post-Installation steps. A scripted sketch of the per-node steps
follows this list.
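Because the steps above repeat on every yarn node, a small driver script can save effort. The
following is a minimal sketch only, assuming passwordless SSH as ec2-user from the node that
holds the .sh files; NODES and EPDIR are hypothetical placeholders for your node list and your
<EPInstallDir> path.

#!/bin/bash
# Hypothetical driver for the per-node EMR steps above; run as ec2-user.
NODES="node1.example.com node2.example.com"   # replace with your yarn nodes
EPDIR=/opt/SASEP                              # replace with your <EPInstallDir>

for node in $NODES; do
    # Create the install directory with permissions 755, copy both .sh
    # files, then run the base install followed by the hot fix.
    ssh "$node" "mkdir -p -m 755 $EPDIR"
    scp sepcorehadp-13.00000-1.sh sepcorehadphf-13.00000-2.sh "$node:$EPDIR/"
    ssh "$node" "cd $EPDIR && ./sepcorehadp-13.00000-1.sh && ./sepcorehadphf-13.00000-2.sh"
done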
POST INSTALLATION
Verify that the sasep.jar link in the <EPInstallDir>/SASEPHome/jars directory on each node in the
cluster references the latest version of the sasep-hdp2 jar. If it does not, run these commands
on each affected node:
cd <EPInstallDir>/SASEPHome/bin
./sasep-admin.sh -linksasepjaronly
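One way to inspect the link (a suggested command, not part of the documented steps):
ls -l <EPInstallDir>/SASEPHome/jars/sasep.jar
The link target shown in the output should be the most recent sasep-hdp2 jar in that directory.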
This completes the installation of hot fix B8T004 on Hadoop on Linux x64.
Copyright 2017 SAS Institute Inc. All Rights Reserved.