HPC Cluster - OpenMPI/Ubuntu
1. Time
date
sudo ln -sf /usr/share/zoneinfo/Asia/Seoul /etc/localtime
sudo apt install rdate
sudo rdate -s time.bora.net
date
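On a systemd-based Ubuntu install, the same result can also be had without rdate. A hedged alternative using timedatectl (part of systemd, normally present by default):
sudo timedatectl set-timezone Asia/Seoul
sudo timedatectl set-ntp true
timedatectl *shows the time zone and whether NTP sync is active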
2. Accounts *assume owl is the account that will be used as the administrator
sudo usermod -aG sudo owl
su - owl
sudo whoami *if this prints root, the user can use sudo correctly
exit
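If the owl account does not exist yet, create it first (owl is only an example name):
sudo adduser owl
The usermod command above then adds it to the sudo group.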
3. SSH *allow only user owl and change the port to 41760
In /etc/ssh/sshd_config:
Port 41760
PermitRootLogin no #prohibit-password
AllowUsers owl
PermitEmptyPasswords no
ChallengeResponseAuthentication yes
UsePAM yes
PasswordAuthentication yes
sudo service ssh restart
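The edited configuration can be validated before the restart, and the new port confirmed afterwards:
sudo sshd -t *prints nothing if the configuration is valid
sudo ss -tlnp | grep 41760 *sshd should be listening on the new port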
4. ufw firewall
sudo ufw default deny incoming
sudo ufw default allow outgoing
sudo ufw allow 53
sudo ufw allow out 123/udp
sudo ufw allow 41760/tcp
sudo ufw deny 22
sudo ufw enable *answer y to the prompt
sudo ufw status
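If SSH only needs to be reachable from inside the cluster network, the blanket 41760/tcp rule could instead be scoped to the subnet assumed in step 9:
sudo ufw allow from 192.168.0.0/24 to any port 41760 proto tcp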
5. TCP Wrapper
sudo -i
echo "ALL: ALL" >> /etc/hosts.deny
echo "sshd: ALL" >> /etc/hosts.allow
exit
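Note that recent Ubuntu builds of OpenSSH are typically not linked against libwrap, so hosts.allow/hosts.deny may have no effect on sshd itself. Whether sshd honors them can be checked with:
ldd /usr/sbin/sshd | grep libwrap *no output means sshd ignores the TCP Wrapper files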
6. Updates
sudo apt update
sudo apt upgrade -y
7. SSH key
ssh-keygen -t rsa -b 4096 -P ""
cp ~/.ssh/id_rsa.pub ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
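Because the SSH port was changed to 41760 in step 3, the later ssh/scp calls and mpirun's remote launcher would otherwise try the blocked default port 22. A per-user client configuration covers this (a sketch, assuming the node names used in step 9):
cat >> ~/.ssh/config << 'EOF'
Host node*
    Port 41760
    User owl
EOF
chmod 600 ~/.ssh/config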
8. Install OpenMPI *steps 1 ~ 8 apply to every node
sudo apt install build-essential gfortran -y *install build tools
wget https://download.open-mpi.org/release/open-mpi/v4.1/openmpi-4.1.1.tar.gz *the version may differ
tar xvf openmpi-4.1.1.tar.gz
cd openmpi-4.1.1
./configure --prefix=/usr/local/openmpi-4.1.1 CC=gcc CXX=g++ FC=gfortran --enable-orterun-prefix-by-default --enable-mpi-thread-multiple
make all
sudo make install *install
echo 'export PATH=/usr/local/openmpi-4.1.1/bin:$PATH' >> ~/.bashrc *add to PATH (single quotes keep $PATH unexpanded inside .bashrc)
source ~/.bashrc *apply
echo $PATH *confirm the new PATH
mpirun -V *print the version
mpicc -v *check the compiler wrapper
sudo ufw allow from 192.168.0.0/24 *allow the cluster subnet through the firewall
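Two optional tweaks, offered as suggestions rather than requirements: the build can be parallelized, and programs linked against the non-standard prefix usually also need the library path at run time:
make -j"$(nproc)" all *instead of plain make all, uses all CPU cores
echo 'export LD_LIBRARY_PATH=/usr/local/openmpi-4.1.1/lib:$LD_LIBRARY_PATH' >> ~/.bashrc
source ~/.bashrc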
9. OpenMPI HPC setup *assume the master node is 192.168.0.10 and the compute nodes are 192.168.0.11~15
echo "192.168.0.10 node0" | sudo tee -a /etc/hosts *the IP comes first in /etc/hosts; sudo cannot follow a shell redirection, so tee -a is used
echo "192.168.0.11 node1" | sudo tee -a /etc/hosts
echo "192.168.0.12 node2" | sudo tee -a /etc/hosts
echo "192.168.0.13 node3" | sudo tee -a /etc/hosts
echo "192.168.0.14 node4" | sudo tee -a /etc/hosts
echo "192.168.0.15 node5" | sudo tee -a /etc/hosts
ssh owl@node1 'cat ~/.ssh/id_rsa.pub' >> ~/.ssh/authorized_keys
ssh owl@node2 'cat ~/.ssh/id_rsa.pub' >> ~/.ssh/authorized_keys
ssh owl@node3 'cat ~/.ssh/id_rsa.pub' >> ~/.ssh/authorized_keys
ssh owl@node4 'cat ~/.ssh/id_rsa.pub' >> ~/.ssh/authorized_keys
ssh owl@node5 'cat ~/.ssh/id_rsa.pub' >> ~/.ssh/authorized_keys
scp ~/.ssh/authorized_keys owl@node1:~/.ssh/authorized_keys
scp ~/.ssh/authorized_keys owl@node2:~/.ssh/authorized_keys
scp ~/.ssh/authorized_keys owl@node3:~/.ssh/authorized_keys
scp ~/.ssh/authorized_keys owl@node4:~/.ssh/authorized_keys
scp ~/.ssh/authorized_keys owl@node5:~/.ssh/authorized_keys
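The /etc/hosts entries, the OpenMPI installation, and the merged authorized_keys should be present on every node as well. Passwordless login from the master to each node can then be verified with a quick loop (node names as assumed above):
for i in 1 2 3 4 5; do ssh node$i hostname; done *each node should print its hostname without a password prompt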
10. Connectivity check *e.g. 24 ≤ 4+4+4+4+4+4, the per-node CPU core counts
mpirun -np 24 -host node0:4,node1:4,node2:4,node3:4,node4:4,node5:4 hostname
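The same node list can be kept in a hostfile, which is easier to maintain than a long -host argument (the file name ~/hostfile is an arbitrary choice):
cat > ~/hostfile << 'EOF'
node0 slots=4
node1 slots=4
node2 slots=4
node3 slots=4
node4 slots=4
node5 slots=4
EOF
mpirun -np 24 --hostfile ~/hostfile hostname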
11. Distribute and run the program *assume the executable is testmpifile
scp ~/testmpifile owl@node1:~/
scp ~/testmpifile owl@node2:~/
scp ~/testmpifile owl@node3:~/
scp ~/testmpifile owl@node4:~/
scp ~/testmpifile owl@node5:~/
mpirun -np 24 -host node0:4,node1:4,node2:4,node3:4,node4:4,node5:4 ./testmpifile
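For reference, testmpifile would typically have been built with the mpicc wrapper installed in step 8 (the source file name below is hypothetical), and the copy step can be shortened to a loop:
mpicc -O2 testmpifile.c -o testmpifile *build the MPI program
for i in 1 2 3 4 5; do scp ~/testmpifile owl@node$i:~/; done *same as the five scp lines above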