Note

All yarnman service commands need to be run with sudo

Install

ym-set-static-ip.sh

Info

This command sets a static IP address on the yarnman node

ym-generate-certs.sh

...

ym-install.sh

Info

This command installs yarnman

...

ym-encrypt-at-rest.sh

Info

This command encrypts the local keys and configuration using clevis/tang

Code Block
yarnman@ym-ph-test [ ~ ]$ sudo ym-encrypt-at-rest.sh
Database key found proceeding
Number of pins required for decryption :1
Number of pins this must be equal or greater than the number of pins required for decryption :3
Enter URL for tang server 1 :http://10.101.10.10:6655
Enter THP for tang server 1 :DwLco7FJtXWxFTprQ5M3cojJsZo
Connection successful to : http://10.101.10.10:6655
Enter URL for tang server 2 :http://10.101.10.11:6655
Enter THP for tang server 2 :0Lqk7DroJ0g3patTCgTweMUAHPc
Connection successful to : http://10.101.10.11:6655
Enter URL for tang server 3 :http://10.101.10.12:6655
Enter THP for tang server 3 :GEpmSTQfz8ctVxdgQEp_rnS3za
Connection successful to : http://10.101.10.12:6655
{
  "t": 1,
  "pins": {
    "tang": [
      {
        "url": "http://10.101.10.10:6655",
        "thp": "DwLco7FJtXWxFTprQ5M3cojJsZo"
      },
      {
        "url": "http://10.101.10.11:6655",
        "thp": "0Lqk7DroJ0g3patTCgTweMUAHPc"
      },
      {
        "url": "http://10.101.10.12:6655",
        "thp": "GEpmSTQfz8ctVxdgQEp_rnS3za"
      }
    ]
  }
}
Do you want to encrypt configuration? Y or N
y
encrypt configuration
Encrypting keys
1668397245104 INFO  Encrypting private and SSL keys using settings
1668397245106 INFO    - not overwriting existing encrypted files and not deleting any original files after encryption
1668397245106 INFO  --------------------------------
1668397245106 INFO  Encrypting...
1668397245308 INFO    - 'private-encryption-key.pem' encrypted successfully
1668397245543 INFO    - 'ssl-key.pem' encrypted successfully
1668397245543 INFO  --------------------------------
1668397245543 INFO  Finished encrypting the files
Encrypting config
1668397245643 INFO  Starting the encryption of 1 local configuration fields through Clevis Shamir Secret Sharing
1668397245743 INFO  Attempting to encrypt the following local config fields: couchdb.password
1668397245843 INFO  Local key 'couchdb.password' encrypted successfully
1668397245943 INFO  1 local config fields encrypted, 0 fields omitted
Do you want to take a backup of database key? this will be shown on console Y or N
y
Echo private key to console
-----BEGIN RSA PRIVATE KEY-----
REMOVED
-----END RSA PRIVATE KEY-----
Encrypted private key is 8129 bytes
restarting services
Config encryption is complete

Upgrade
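The JSON emitted by the script is a clevis Shamir Secret Sharing (sss) policy. As a sketch using the example values from the transcript above, "t" is the threshold: with t=1, any one of the three listed tang servers being reachable is enough to unlock the keys.

```shell
# Clevis sss policy sketch (values copied from the example transcript above).
# "t" is the Shamir threshold: how many tang pins must be reachable to decrypt.
SSS_POLICY='{
  "t": 1,
  "pins": {
    "tang": [
      {"url": "http://10.101.10.10:6655", "thp": "DwLco7FJtXWxFTprQ5M3cojJsZo"},
      {"url": "http://10.101.10.11:6655", "thp": "0Lqk7DroJ0g3patTCgTweMUAHPc"},
      {"url": "http://10.101.10.12:6655", "thp": "GEpmSTQfz8ctVxdgQEp_rnS3za"}
    ]
  }
}'
printf '%s\n' "$SSS_POLICY"
```

Raising "t" (e.g. to 2) would require two of the three tang servers to be up whenever the node needs to decrypt its keys and configuration.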

ym-upgrade.sh

Info

This command upgrades yarnman

Copy the upgrade file into /var/opt/yarnlab/upgrade, e.g. wget http://xxxxxxxx or sftp/scp the file onto the server.

SSH into the yarnman host and change into the directory /var/opt/yarnlab/upgrade

Run the command: yarnman@host [ ~ ]$ sudo ym-upgrade.sh upgradefile.tar.gz

Code Block
Yarnman Upgrade file  found /var/opt/yarnlab/upgrade/ym-registry:package-upgrade-yl-ph-8023676b.tar.gz
Do you want to upgrade yarnman to ym-registry:package-upgrade-yl-ph-8023676b.tar.gz ? Y or N
y
Upgrade yarnman
Stopping yarnman services
Stopping local registry containers
Removing local registry images
Loading local registry package tgz
Loaded image: ym-registry:package
Launching  yarnman registry
f39ac12322df9a3add72c0ad135e691c6fc3ca0fc7be463a5b4534b88e8e68e6
Loading upgrade pre-req script from registry container
Starting upgrade pre-req script
TEMP upgrade script
Setting up tang
groupadd: group 'ym-tang-app-gp' already exists
Showing package container registry catalog
{"repositories":["ym-couchdb","ym-ostree-upgrade","ym-redis","ym-tang","ym-yarnman"]}
{"name":"ym-ostree-upgrade","tags":["yl-ph-8023676b"]}
{"name":"ym-yarnman","tags":["yl-ph-8023676b"]}
[+] Running 2/4
*** lots of docker pull output ***
*** lots of ostree output ***
State: idle
Deployments:
  photon:photon/4.0/x86_64/yarnman
                   Version: 4.0_yarnman (2022-11-16T23:54:09Z)
                    Commit: 9941830a095f3a8630eabca846414afa03a935e95462845f7e71cc17f8437438
              GPGSignature: Valid signature by 352365935446AC840528AF8703F9C95608035F3C
                      Diff: 15 added

● photon:photon/4.0/x86_64/yarnman
                   Version: 4.0_yarnman (2022-11-14T04:04:13Z)
                    Commit: 7fe66e8afc639d7a006b60208b5981748426ef4487581924e897d69a7b7c87cd
              GPGSignature: Valid signature by 352365935446AC840528AF8703F9C95608035F3C
Do you want to remove upgrade file ? Y or N
Removing :ym-registry:package-upgrade-yl-ph-n18-a23846af.tar.gz
Removing old containers
Removing old yarnman image :localhost:5000/ym-yarnman:yl-ph-n18-475aac7a
Removing old couchdb image :localhost:5000/ym-couchdb:yl-ph-n18-475aac7a
Removing old redis image :localhost:5000/ym-redis:yl-ph-n18-475aac7a
Removing old tang image :localhost:5000/ym-tang:yl-ph-n18-475aac7a
Do you want to reboot yarnman ? Y or N 
Reboot yarnman
Note

A reboot may be required to apply OS patches if they are bundled into the update.
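As an optional pre-flight before running ym-upgrade.sh, the transferred package can be checked as a readable gzip tarball. This is generic tar usage, not a yarnman command; the sketch below stages a stand-in file under a temp directory so it is safe to run anywhere.

```shell
# Generic tar sanity check (stand-in file; on a real node point tar -tzf at
# the actual package in /var/opt/yarnlab/upgrade instead).
demo_dir=$(mktemp -d)
cd "$demo_dir"
printf 'demo' > payload
tar -czf upgradefile.tar.gz payload    # stand-in for the real upgrade package
tar -tzf upgradefile.tar.gz > /dev/null && echo "package OK"
```

A corrupt or truncated download fails the `tar -tzf` listing instead of printing "package OK".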

Service Commands

ym-service-commands.sh start

Info

This command starts the yarnman services

Code Block
yarnman@yarnman-test [ ~ ]$ sudo ym-service-commands.sh start
starting yarnman.service
● yarnman.service - yarnman
     Loaded: loaded (/usr/lib/systemd/system/yarnman.service; enabled; vendor preset: enabled)
     Active: active (running) since Wed 2022-08-17 08:24:21 UTC; 5ms ago
    Process: 56027 ExecStartPre=/usr/bin/docker-compose -f docker-compose.yml down (code=exited, status=0/SUCCESS)
   Main PID: 56037 (docker-compose)
      Tasks: 5 (limit: 4694)
     Memory: 5.0M
     CGroup: /system.slice/yarnman.service
             └─56037 /usr/bin/docker-compose -f docker-compose.yml -f docker-compose-override.yml up --remove-orphans

ym-service-commands.sh stop

Info

This command stops the yarnman services

Code Block
yarnman@yarnman-test [ ~ ]$ sudo ym-service-commands.sh stop
stopping yarnman.service
● yarnman.service - yarnman
     Loaded: loaded (/usr/lib/systemd/system/yarnman.service; enabled; vendor preset: enabled)
     Active: inactive (dead) since Wed 2022-08-17 08:24:16 UTC; 6ms ago
    Process: 4221 ExecStart=/usr/bin/docker-compose -f docker-compose.yml -f docker-compose-override.yml up --remove-orphans (code=exited, status=0/SUCCESS)
    Process: 55552 ExecStop=/usr/bin/docker-compose -f docker-compose.yml down (code=exited, status=0/SUCCESS)
   Main PID: 4221 (code=exited, status=0/SUCCESS)

Aug 17 08:24:14 yarnman-test docker-compose[4221]: ym-redis exited with code 0
Aug 17 08:24:14 yarnman-test docker-compose[55552]: Container ym-redis  Removed
Aug 17 08:24:15 yarnman-test docker-compose[55552]: Container ym-couchdb  Stopped
Aug 17 08:24:15 yarnman-test docker-compose[55552]: Container ym-couchdb  Removing
Aug 17 08:24:15 yarnman-test docker-compose[4221]: ym-couchdb exited with code 0
Aug 17 08:24:15 yarnman-test docker-compose[55552]: Container ym-couchdb  Removed
Aug 17 08:24:15 yarnman-test docker-compose[55552]: Network yarnman_yl-yarnman  Removing
Aug 17 08:24:16 yarnman-test docker-compose[55552]: Network yarnman_yl-yarnman  Removed
Aug 17 08:24:16 yarnman-test systemd[1]: yarnman.service: Succeeded.
Aug 17 08:24:16 yarnman-test systemd[1]: Stopped yarnman.

ym-service-commands.sh restart

Info

This command restarts the yarnman services

Code Block
yarnman@yarnman-test [ ~ ]$ sudo ym-service-commands.sh restart
restarting yarnman.service
● yarnman.service - yarnman
     Loaded: loaded (/usr/lib/systemd/system/yarnman.service; enabled; vendor preset: enabled)
     Active: active (running) since Wed 2022-08-17 08:27:36 UTC; 6ms ago
    Process: 63277 ExecStartPre=/usr/bin/docker-compose -f docker-compose.yml down (code=exited, status=0/SUCCESS)
   Main PID: 63287 (docker-compose)
      Tasks: 6 (limit: 4694)
     Memory: 4.9M
     CGroup: /system.slice/yarnman.service
             └─63287 /usr/bin/docker-compose -f docker-compose.yml -f docker-compose-override.yml up --remove-orphans

Aug 17 08:27:36 yarnman-test systemd[1]: Starting yarnman...
Aug 17 08:27:36 yarnman-test docker-compose[63277]: yarnman  Warning: No resource found to remove
Aug 17 08:27:36 yarnman-test systemd[1]: Started yarnman.

ym-service-commands.sh status

Info

This command shows the systemd service status

Code Block
yarnman@yarnman-test [ ~ ]$ sudo ym-service-commands.sh status
● yarnman.service - yarnman
     Loaded: loaded (/usr/lib/systemd/system/yarnman.service; enabled; vendor preset: enabled)
     Active: active (running) since Wed 2022-08-17 08:29:13 UTC; 4s ago
    Process: 67157 ExecStartPre=/usr/bin/docker-compose -f docker-compose.yml down (code=exited, status=0/SUCCESS)
   Main PID: 67167 (docker-compose)
      Tasks: 9 (limit: 4694)
     Memory: 15.7M
     CGroup: /system.slice/yarnman.service
             └─67167 /usr/bin/docker-compose -f docker-compose.yml -f docker-compose-override.yml up --remove-orphans

Aug 17 08:29:14 yarnman-test docker-compose[67167]: ym-couchdb  | [info] 2022-08-17T08:29:14.759420Z nonode@nohost <0.11.0> -------- Application ddoc_cache started on node nonode@nohost
Aug 17 08:29:14 yarnman-test docker-compose[67167]: ym-couchdb  | [info] 2022-08-17T08:29:14.769878Z nonode@nohost <0.11.0> -------- Application global_changes started on node nonode@nohost
Aug 17 08:29:14 yarnman-test docker-compose[67167]: ym-couchdb  | [info] 2022-08-17T08:29:14.769962Z nonode@nohost <0.11.0> -------- Application jiffy started on node nonode@nohost
Aug 17 08:29:14 yarnman-test docker-compose[67167]: ym-couchdb  | [info] 2022-08-17T08:29:14.774590Z nonode@nohost <0.11.0> -------- Application mango started on node nonode@nohost
Aug 17 08:29:14 yarnman-test docker-compose[67167]: ym-couchdb  | [info] 2022-08-17T08:29:14.779025Z nonode@nohost <0.11.0> -------- Application setup started on node nonode@nohost
Aug 17 08:29:14 yarnman-test docker-compose[67167]: ym-couchdb  | [info] 2022-08-17T08:29:14.779045Z nonode@nohost <0.11.0> -------- Application snappy started on node nonode@nohost
Aug 17 08:29:15 yarnman-test docker-compose[67167]: ym-yarnman  | 1660724955149 WARN  Setting Default startup.
Aug 17 08:29:15 yarnman-test docker-compose[67167]: ym-couchdb  | [notice] 2022-08-17T08:29:15.166800Z nonode@nohost <0.334.0> 144d89930f localhost:5984 127.0.0.1 undefined GET / 200 ok 70
Aug 17 08:29:16 yarnman-test docker-compose[67167]: ym-couchdb  | [notice] 2022-08-17T08:29:16.252345Z nonode@nohost <0.335.0> 23ea8ef0ca localhost:5984 127.0.0.1 undefined GET / 200 ok 1
Aug 17 08:29:17 yarnman-test docker-compose[67167]: ym-couchdb  | [notice] 2022-08-17T08:29:17.323062Z nonode@nohost <0.465.0> a377eb4c4c localhost:5984 127.0.0.1 undefined GET / 200 ok 0

ym-service-commands.sh status-pm2

Info

This command shows the internal processes of yarnman

Code Block
yarnman@yarnman-test [ ~ ]$ sudo ym-service-commands.sh status-pm2
┌─────┬──────────────────────────────────────────────────────────┬─────────────┬─────────┬─────────┬──────────┬────────┬──────┬───────────┬──────────┬──────────┬──────────┬──────────┐
│ id  │ name                                                     │ namespace   │ version │ mode    │ pid      │ uptime │ ↺    │ status    │ cpu      │ mem      │ user     │ watching │
├─────┼──────────────────────────────────────────────────────────┼─────────────┼─────────┼─────────┼──────────┼────────┼──────┼───────────┼──────────┼──────────┼──────────┼──────────┤
│ 2   │ administration-app-0ca298ae6a834cf29c661930c58cb621      │ default     │ 2.5.18  │ fork    │ 236      │ 10s    │ 0    │ online    │ 0%       │ 137.8mb  │ ym-… │ enabled  │
│ 0   │ arm_fc30b4f5d59f4275829ff8b65d02914b                     │ default     │ 2.5.18  │ fork    │ 121      │ 19s    │ 5    │ online    │ 0%       │ 65.1mb   │ ym-… │ enabled  │
│ 3   │ interconnect-service-49ab91419f064823b8ab85806b3b4ce1    │ default     │ 2.5.18  │ fork    │ 260      │ 8s     │ 0    │ online    │ 0%       │ 138.8mb  │ ym-… │ enabled  │
│ 1   │ jadeberlin_arm_fc30b4f5d59f4275829ff8b65d02914b          │ default     │ N/A     │ fork    │ 0        │ 0      │ 4    │ errored   │ 0%       │ 0b       │ ym-… │ disabled │
│ 4   │ proxy-service-a4500ec67fcc491399dc395e12c1bbe1           │ default     │ 2.5.18  │ fork    │ 271      │ 6s     │ 0    │ online    │ 0%       │ 105.3mb  │ ym-… │ enabled  │
│ 5   │ workflow-service-8b4edbbb287c468cae0f023dd7e0cf44        │ default     │ 2.5.18  │ fork    │ 282      │ 5s     │ 0    │ online    │ 0%       │ 175.4mb  │ ym-… │ enabled  │
└─────┴──────────────────────────────────────────────────────────┴─────────────┴─────────┴─────────┴──────────┴────────┴──────┴───────────┴──────────┴──────────┴──────────┴──────────┘
[PM2][WARN] Current process list is not synchronized with saved list. Type 'pm2 save' to synchronize.

Note that the jadeberlin service will be in an errored state until it is set up

Note that the status-pm2 column layout will change based on the terminal/console width/resolution

ym-service-commands.sh yarnman-logs

Info

This command shows the scrolling output of the yarnman services; press CTRL+C to exit

ym-service-commands.sh couchdb-logs

Info

This command shows the scrolling output of the database logs; press CTRL+C to exit

ym-service-commands.sh redis-logs

Info

This command shows the scrolling output of the message bus logs; press CTRL+C to exit

ym-service-commands.sh tang-logs

Info

This command shows the scrolling output of the NBE logs; press CTRL+C to exit

ym-service-commands.sh tang-thp

Note

Note that this command was previously ym-service-commands.sh tang-adv

Info

This command shows the tang thp (thumbprint) used for setting up configuration encryption

Code Block
yarnman@ym-ph-test [ ~ ]$ sudo ym-service-commands.sh tang-thp
9_CZiwV9PKBlQfehPKZO7cd5ZpM

ym-service-commands.sh update-jtapi

Info

This command updates jtapi for test_mate

Code Block
PENDING

Edit Configuration Commands

ym-edit-config.sh enable-local-admin-access

Info

This command enables local admin access on port 3999

Code Block
PENDING

ym-edit-config.sh disable-local-admin-access

Info

This command disables local admin access on port 3999

Code Block
PENDING

ym-edit-config.sh enable-local-couchdb-access

Info

This command enables couchdb access

Code Block
PENDING

ym-edit-config.sh disable-local-couchdb-access

Info

This command disables couchdb access

Code Block
PENDING

ym-edit-config.sh set-local-yarnman-container-name

Info

This command sets the container hostname for clustered systems

Code Block
PENDING

ym-edit-config.sh unset-local-yarnman-container-name

Info

This command unsets the container hostname for clustered systems

Code Block
PENDING

ym-edit-config.sh enable-yarnman-logs

Info

This command enables yarnman trace logs

Code Block
PENDING

ym-edit-config.sh disable-yarnman-logs

Info

This command disables yarnman trace logs, reverting to debug logs (default)

Code Block
PENDING

Backup

ym-backup-setup.sh

Sets up the local backup service account on the yarnman node, and the passphrase used for the backup
Code Block
yarnman@node1 [ ~ ]$ sudo ym-backup-setup.sh 

Starting yarnman ph4 backup
Backup password not set 
Set Backup password: 
Backup password (again): 
Clevis not setup
using local backup password
no backup configuration file found creating
yarnman@node1 [ ~ ]$
Note

No login access is available to the backup service account

ym-backup-actions.sh

All backup commands are run via this script

Setup sftp as the backup method and ssh public keys
Code Block
yarnman@node1 [ ~ ]$ sudo ym-backup-actions.sh -p sftp -a sftp-user-setup

backup config found
PROFILE_NAME_VAR  = sftp
ACTION_VAR        = sftp-user-setup
RESTORECOMMIT     = 
RESTORE_IP        = 
RESTORE_PATH      = 
settting sftp mode
profile mode :yarnman-sftp
creating keys for ym-backup-user
public key for ssh/sftp
ssh-rsa ****LongStringForPubKey****
yarnman@node1 [ ~ ]$

Copy ssh pub key to sftp server

If SSH access is available to the SFTP server you can copy the SSH public key for login; otherwise provide the key to your SFTP administrator.

Code Block
yarnman@node1 [ ~ ]$ su
Password: 
yarnman@node1 [ /var/home/yarnman ]# sudo -u ym-backup-user ssh-copy-id -i /home/ym-backup-user/.ssh/id_rsa.pub sftpbackup@10.101.10.86

/bin/ssh-copy-id: INFO: Source of key(s) to be installed: "/home/ym-backup-user/.ssh/id_rsa.pub"
The authenticity of host '10.101.10.86 (10.101.10.86)' can't be established.
ED25519 key fingerprint is SHA256:****j7t+o1aQu5FoWlxS0uhKzCe414jt3****
This key is not known by any other names
Are you sure you want to continue connecting (yes/no/[fingerprint])? yes
/bin/ssh-copy-id: INFO: attempting to log in with the new key(s), to filter out any that are already installed
/bin/ssh-copy-id: INFO: 1 key(s) remain to be installed -- if you are prompted now it is to install the new keys
Authorized uses only. All activity may be monitored and reported.
sftpbackup@10.101.10.86's password: 

Number of key(s) added: 1
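As a quick local sanity check (using the example sftpbackup@10.101.10.86 from the transcript above), `ssh -G` prints the client options that would apply without opening a connection, so it is safe to run anywhere; scheduled backups rely on exactly this kind of passwordless, non-interactive auth.

```shell
# 'ssh -G' resolves configuration only; no connection is made. BatchMode
# disables password prompts, which is what automated backups effectively need.
ssh -G -o BatchMode=yes sftpbackup@10.101.10.86 | grep -i '^batchmode'
# prints: batchmode yes
```

If the key has not actually been installed on the SFTP server, a real `ssh -o BatchMode=yes` login attempt will fail rather than prompt, which is how a broken scheduled backup would behave.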

Setup SFTP destination for backup

The script will prompt for the backup path, IP address and user ID of the SFTP server.

Code Block
yarnman@node1 [ ~ ]$ sudo ym-backup-actions.sh -p sftp -a sftp-setup-connection

backup config found
PROFILE_NAME_VAR  = sftp
ACTION_VAR        = sftp-setup-connection
RESTORECOMMIT     = 
RESTORE_IP        = 
RESTORE_PATH      = 
settting sftp mode
profile mode :yarnman-sftp
SFTP config is /var/opt/yarnlab/backup/sftp
enter sftp infomation
SFTP Username: sftpbackup
SFTP Host: 10.101.10.86
SFTP backup directory Path i.e /srv/yarnman/backup: /home/sftpbackup/yarnman
sftp:yarnman@10.101.10.86:/home/sftpbackup/yarnman
yarnman@node1 [ ~ ]$
Info

You may be prompted for a username/password if the SSH public key hasn't been added to the SFTP server. This is OK for the initial setup, however scheduled/automated backups will fail.

Check if backups exist at location

For first-time configuration no backups will be available, nor a backup repository; the repository is set up in the next section.

Code Block
yarnman@node1 [ ~ ]$ sudo ym-backup-actions.sh -p sftp -a snapshots

backup config found
PROFILE_NAME_VAR  = sftp
ACTION_VAR        = snapshots
RESTORECOMMIT     = 
RESTORE_IP        = 
RESTORE_PATH      = 
settting sftp mode
profile mode :yarnman-sftp
Checking snapshots for profile :yarnman-sftp
2023/08/11 04:41:34 profile 'yarnman-sftp': starting 'snapshots'
2023/08/11 04:41:34 unfiltered extra flags: 
subprocess ssh: Authorized uses only. All activity may be monitored and reported.
Fatal: unable to open config file: Lstat: file does not exist
Is there a repository at the following location?
sftp:sftpbackup@10.101.10.86:/home/sftpbackup/yarnman
2023/08/11 04:41:34 snapshots on profile 'yarnman-sftp': exit status 1
Initialise the repository

The password set during the initial ym-backup-setup.sh will be used automatically.

Code Block
yarnman@node1 [ ~ ]$ sudo ym-backup-actions.sh -p sftp -a scheduleinit

Disablebackup Scheduleconfig found
sudo ym-backup-actions.sh -pPROFILE_NAME_VAR  = sftp
-aACTION_VAR unschedule  Check status of schedule  sudo ym-backup-actions.sh -p sftp -a status

Restore backup

To restore a snapshot to an existing node.

List the snapshots available as shown earlier to restore the required snapshot.

the restore script will create a Local backup before starting the restore in the event you need to rollback.

Code Block
yarnman@node1 [ ~ ]$ sudo ym-backup-actions.sh -p sftp -a restore -r fa50ff98

backup config found
PROFILE_NAME_VAR  = sftp
ACTION_VAR        = restore
RESTORECOMMIT     = latest
BACKUP_IP         = 
BACKUP_PATH      = 
settting sftp mode
profile mode :yarnman-sftp
Restore backup for profile :yarnman-sftp
starting restore
Restore backup for profile :yarnman-sftp commit :latest
Are you sure you want to restore backup? Y or Ny
Restore Backup
subprocess ssh: Authorized uses only. All activity may be monitored and reported.
Backup nodeId's match

Stopping yarnman services
Removing exising configuration to prevent duplicates
Starting restic restore

2023/08/16 08:08:33 profile 'yarnman-sftp': finished 'restore'
Resetting permissions
Starting Database and Encryption services
[+] Creating 5/5
 ✔ Network yarnman_yl-yarnman  Created                                                                = init
RESTORECOMMIT     = 
RESTORE_IP        = 
RESTORE_PATH      = 
settting sftp mode
profile mode :yarnman-sftp
Initialise backup for profile :yarnman-sftp
2023/08/11 04:43:57 profile 'yarnman-sftp': starting 'init'
2023/08/11 04:43:57 unfiltered extra flags: 
created restic repository 7180598c67 at sftp:yarnman@10.101.10.86:/home/sftpbackup/yarnman

Please note that knowledge of your password is required to access
the repository. Losing your password means that your data is
irrecoverably lost.
2023/08/11 04:44:00 profile 'yarnman-sftp': finished 'init'
yarnman@node1 [ ~ ]$
Info

Initialising can only be performed once per repository; an error will occur if the repository already exists.
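If you are unsure whether the repository has already been initialised, the snapshots action can be run first: the "unable to open config file" error shown earlier indicates an uninitialised repository. A minimal sketch of that check against captured output (the error text is copied from the restic output above; the `check_repo` helper and the wrapper logic are assumptions for illustration, not part of ym-backup-actions.sh):

```shell
#!/bin/sh
# Decide whether "-a init" is needed, based on output captured from a
# prior "sudo ym-backup-actions.sh -p sftp -a snapshots" run.
# The error text matched below is the one restic prints for an
# uninitialised repository, as shown earlier in this document.
check_repo() {
  case "$1" in
    *"unable to open config file"*) echo "repository not initialised - run -a init" ;;
    *)                              echo "repository present" ;;
  esac
}

check_repo "Fatal: unable to open config file: Lstat: file does not exist"
check_repo "repository 7180598c opened (version 2, compression level auto)"
```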

List backups (snapshots)

Lists all backups available. On a new repository this list will be blank.

Code Block
yarnman@node1 [ ~ ]$ sudo ym-backup-actions.sh -p sftp -a snapshots

backup config found
PROFILE_NAME_VAR  = sftp
ACTION_VAR        = snapshots
RESTORECOMMIT     = 
RESTORE_IP        = 
RESTORE_PATH      = 
settting sftp mode
profile mode :yarnman-sftp
Checking snapshots for profile :yarnman-sftp
2023/08/11 04:44:19 profile 'yarnman-sftp': starting 'snapshots'
2023/08/11 04:44:19 unfiltered extra flags: 
subprocess ssh: Authorized uses only. All activity may be monitored and reported.
repository 7180598c opened (version 2, compression level auto)
2023/08/11 04:44:20 profile 'yarnman-sftp': finished 'snapshots'
yarnman@node1 [ ~ ]$
Info

The line "repository 7180598c opened (version 2, compression level auto)" indicates a valid backup location.
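When scripting a restore, the short snapshot ID in the first column of the listing is what `-a restore -r` expects. A minimal parsing sketch over sample output (the listing is abbreviated from the snapshot table shown later in this document; driving this from a live ym-backup-actions.sh run is an assumption):

```shell
#!/bin/sh
# Pull 8-character restic short IDs out of a snapshots listing.
# The here-listing is abbreviated from the snapshot table shown in
# this document; real input would come from the snapshots action.
listing='ID        Time                 Host   Tags            Paths
-----------------------------------------------------------------
fa50ff98  2023-08-11 04:46:11  node1  ym-backup-sftp  /var/opt/yarnlab/certs
-----------------------------------------------------------------
1 snapshots'

# Keep only rows whose first column is an 8-character hex short ID.
printf '%s\n' "$listing" |
  awk 'length($1) == 8 && $1 ~ /^[0-9a-f]+$/ { print $1 }'   # prints fa50ff98
```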

Manual Backup

Perform a manual backup

Code Block
yarnman@node1 [ ~ ]$ sudo ym-backup-actions.sh -p sftp -a backup

backup config found
PROFILE_NAME_VAR  = sftp
ACTION_VAR        = backup
RESTORECOMMIT     = 
RESTORE_IP        = 
RESTORE_PATH      = 
settting sftp mode
profile mode :yarnman-sftp
Running backup for profile :yarnman-sftp
2023/08/11 04:46:11 profile 'yarnman-sftp': starting 'backup'
2023/08/11 04:46:11 unfiltered extra flags: 
subprocess ssh: Authorized uses only. All activity may be monitored and reported.
repository 7180598c opened (version 2, compression level auto)
lock repository
no parent snapshot found, will read all files
load index files
start scan on [/var/opt/yarnlab/yarnman/config /var/opt/yarnlab/couchdb/config /var/opt/yarnlab/couchdb/data /var/opt/yarnlab/couchdb/certs /var/opt/yarnlab/tang/db /var/opt/yarnlab/certs /var/opt/yarnlab/registry]
start backup on [/var/opt/yarnlab/yarnman/config /var/opt/yarnlab/couchdb/config /var/opt/yarnlab/couchdb/data /var/opt/yarnlab/couchdb/certs /var/opt/yarnlab/tang/db /var/opt/yarnlab/certs /var/opt/yarnlab/registry]
scan finished in 0.233s: 564 files, 5.211 MiB

Files:         564 new,     0 changed,     0 unmodified
Dirs:          348 new,     0 changed,     0 unmodified
Data Blobs:    404 new
Tree Blobs:    349 new
Added to the repository: 5.479 MiB (736.577 KiB stored)

processed 564 files, 5.211 MiB in 0:00
snapshot fa50ff98 saved
2023/08/11 04:46:12 profile 'yarnman-sftp': finished 'backup'
2023/08/11 04:46:12 profile 'yarnman-sftp': cleaning up repository using retention information
2023/08/11 04:46:12 unfiltered extra flags: 
repository 7180598c opened (version 2, compression level auto)
Applying Policy: keep 3 daily, 1 weekly, 1 monthly snapshots and all snapshots with tags [[manual]] and all snapshots within 3m of the newest
snapshots for (host [node1], paths [/var/opt/yarnlab/certs, /var/opt/yarnlab/couchdb/certs, /var/opt/yarnlab/couchdb/config, /var/opt/yarnlab/couchdb/data, /var/opt/yarnlab/registry, /var/opt/yarnlab/tang/db, /var/opt/yarnlab/yarnman/config]):
keep 1 snapshots:
ID        Time                 Host   Tags            Reasons           Paths
-----------------------------------------------------------------------------------------------------------------
fa50ff98  2023-08-11 04:46:11  node1  ym-backup-sftp  within 3m         /var/opt/yarnlab/certs
                                                      daily snapshot    /var/opt/yarnlab/couchdb/certs
                                                      weekly snapshot   /var/opt/yarnlab/couchdb/config
                                                      monthly snapshot  /var/opt/yarnlab/couchdb/data
                                                                        /var/opt/yarnlab/registry
                                                                        /var/opt/yarnlab/tang/db
                                                                        /var/opt/yarnlab/yarnman/config
-----------------------------------------------------------------------------------------------------------------
1 snapshots

yarnman@node1 [ ~ ]$
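A backup run prints the new snapshot on its `snapshot fa50ff98 saved` line; when automating a backup-then-verify flow, that ID can be captured with plain parameter expansion. A sketch over the captured line (the line format is as shown in the output above; the surrounding automation is an assumption):

```shell
#!/bin/sh
# Extract the snapshot ID from the "snapshot <id> saved" line that a
# backup run prints (format as captured in the output above).
saved_line='snapshot fa50ff98 saved'

id=${saved_line#snapshot }   # strip the leading "snapshot "
id=${id% saved}              # strip the trailing " saved"
echo "$id"                   # prints fa50ff98
```

The captured `$id` is then usable directly as the `-r` argument of a later restore.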

Schedule

By default the schedule is set up to back up at 1am UTC every day. This can be modified in the config file as the root user:

Code Block
nano /var/opt/yarnlab/yarnman/config/ym-backup-config.yml
Code Block
PENDING

Enable Schedule

sudo ym-backup-actions.sh -p sftp -a schedule

Disable Schedule

sudo ym-backup-actions.sh -p sftp -a unschedule

Check status of schedule

sudo ym-backup-actions.sh -p sftp -a status

Restore backup

To restore a snapshot to an existing node:

List the snapshots available as shown earlier to identify the required snapshot.

The restore script will create a local backup before starting the restore in the event you need to roll back.

Code Block
yarnman@node1 [ ~ ]$ sudo ym-backup-actions.sh -p sftp -a restore -r fa50ff98

backup config found
PROFILE_NAME_VAR  = sftp
ACTION_VAR        = restore
RESTORECOMMIT     = latest
BACKUP_IP         = 
BACKUP_PATH      = 
settting sftp mode
profile mode :yarnman-sftp
Restore backup for profile :yarnman-sftp
starting restore
Restore backup for profile :yarnman-sftp commit :latest
Are you sure you want to restore backup? Y or Ny
Restore Backup
subprocess ssh: Authorized uses only. All activity may be monitored and reported.
Backup nodeId's match

Stopping yarnman services
Removing exising configuration to prevent duplicates
Starting restic restore

2023/08/16 08:08:33 profile 'yarnman-sftp': finished 'restore'
Resetting permissions
Starting Database and Encryption services
[+] Creating 5/5
 ✔ Network yarnman_yl-yarnman  Created                                0.0s 
 ✔ Container ym-redis          Created                                0.1s 
 ✔ Container ym-couchdb        Created                                0.1s 
 ✔ Container ym-tang           Created                                0.1s 
 ✔ Container ym-yarnman       Created                                0.1s 
[+] Running 1/1
 ✔ Container ym-couchdb  Started                                      0.3s 
[+] Running 1/1
 ✔ Container ym-tang  Started

If you are restoring a node in a multi node deployment you will see an additional message:

Code Block
Checking number of admin nodes
number of admin nodes :x
Yarnman is in distributed mode
Check couchdb replication on other nodes is healthy and after 5 minutes reboot yarnman or run systemctl stop yarnman.service and systemctl start yarnman.service

This allows replication to complete to all nodes and prevents any scheduled jobs/reports from rerunning from the last backup.

Rebuild Disaster recovery

Pre-Req

  • Deploy a new OVA with the same version as the backup

  • Set up as a new install (eg configure IP, user/pass, and generate certificates if prompted)

  • Install yarnman

  • Confirm you can reach the appadmin webpage. Do not log in or accept the EULA, as the restore will overwrite this.

  • Set up backup to the same repo for the node to be restored. Do not initialise the repo or perform a backup.

Info

A new SFTP/SSH key will be created; it will need to be added to the backup server for future automated backups to function again. Interactive (user/pass) authentication can be used for a restore if the new ssh key can't be added to the backup server at the time of restore.

Note

The hostname doesn't need to match the restored backup, however any new backups will be taken under the new hostname.

If building with a different IP address, replication will need to be adjusted to the new IP address, as will Clevis/Tang if set up.

Run the following; refer to the previous detailed command instructions if required:

Code Block
sudo ym-backup-setup.sh 
sudo ym-backup-actions.sh -p sftp -a sftp-user-setup
as root user ;  sudo -u ym-backup-user ssh-copy-id -i /home/ym-backup-user/.ssh/id_rsa.pub sftpbackup@10.101.10.86
sudo ym-backup-actions.sh -p sftp -a sftp-setup-connection
sudo ym-backup-actions.sh -p sftp -a snapshots
sudo ym-backup-actions.sh -p sftp -a restore -r xxxxx

The restore script will warn that you are restoring to a different node; continue.

Code Block
yarnman@node79-restore [ ~ ]$ sudo ym-backup-actions.sh -p sftp -a restore -r 5f13f62b

backup config found
PROFILE_NAME_VAR  = sftp
ACTION_VAR        = restore
RESTORECOMMIT     = 5f13f62b
BACKUP_IP         = 
BACKUP_PATH      = 
settting sftp mode
profile mode :yarnman-sftp
Restore backup for profile :yarnman-sftp
starting restore
Restore backup for profile :yarnman-sftp commit :5f13f62b
Are you sure you want to restore backup? Y or Ny
Restore Backup
subprocess ssh: Authorized uses only. All activity may be monitored and reported.
Current Node Id is :arm_46b194ad3d374b7397fa14b1a3136d56
Backup Node Id is :arm_3110b0b79eb84bd899291d5e0d231009
Do you want to apply this backup that has different nodeId? Y or N

Follow the on-screen instructions after the restore completes.

Alternate Manual Method (not recommended)

*** snapshot command doesn't work in manual mode yet, also requires sudo ym-backup-setup.sh to be run ?

Code Block
sudo ym-backup-actions.sh -p manual -a manual-sftp-snapshots -i 10.101.10.86 -k /home/sftpbackup/path/ 
Code Block
sudo ym-backup-actions.sh -p manual -a manual-sftp-restore -i 10.101.10.86 -k /home/sftpbackup/path/ -r xxxxx

ym-service-commands.sh start

Info

This command starts the yarnman services

Code Block
yarnman@yarnman-test [ ~ ]$ sudo ym-service-commands.sh start
starting yarnman.service
● yarnman.service - yarnman
     Loaded: loaded (/usr/lib/systemd/system/yarnman.service; enabled; vendor preset: enabled)
     Active: active (running) since Wed 2022-08-17 08:24:21 UTC; 5ms ago
    Process: 56027 ExecStartPre=/usr/bin/docker-compose -f docker-compose.yml down (code=exited, status=0/SUCCESS)
   Main PID: 56037 (docker-compose)
      Tasks: 5 (limit: 4694)
     Memory: 5.0M
     CGroup: /system.slice/yarnman.service
             └─56037 /usr/bin/docker-compose -f docker-compose.yml -f docker-compose-override.yml up --remove-orphans

ym-service-commands.sh stop

Info

This command stops the yarnman services

Code Block
yarnman@yarnman-test [ ~ ]$ sudo ym-service-commands.sh stop
stopping yarnman.service
yarnman.service - yarnman
     Loaded: loaded (/usr/lib/systemd/system/yarnman.service; enabled; vendor preset: enabled)
     Active: inactive (dead) since Wed 2022-08-17 08:24:16 UTC; 6ms ago
    Process: 4221 ExecStart=/usr/bin/docker-compose -f docker-compose.yml -f docker-compose-override.yml up --remove-orphans (code=exited, status=0/SUCCESS)
    Process: 55552 ExecStop=/usr/bin/docker-compose -f docker-compose.yml down (code=exited, status=0/SUCCESS)
   Main PID: 4221 (code=exited, status=0/SUCCESS)

Aug 17 08:24:14 yarnman-test docker-compose[4221]: ym-redis exited with code 0
Aug 17 08:24:14 yarnman-test docker-compose[55552]: Container ym-redis  Removed
Aug 17 08:24:15 yarnman-test docker-compose[55552]: Container ym-couchdb  Stopped
Aug 17 08:24:15 yarnman-test docker-compose[55552]: Container ym-couchdb  Removing
Aug 17 08:24:15 yarnman-test docker-compose[4221]: ym-couchdb exited with code 0
Aug 17 08:24:15 yarnman-test docker-compose[55552]: Container ym-couchdb  Removed
Aug 17 08:24:15 yarnman-test docker-compose[55552]: Network yarnman_yl-yarnman  Removing
Aug 17 08:24:16 yarnman-test docker-compose[55552]: Network yarnman_yl-yarnman  Removed
Aug 17 08:24:16 yarnman-test systemd[1]: yarnman.service: Succeeded.
Aug 17 08:24:16 yarnman-test systemd[1]: Stopped yarnman.

ym-service-commands.sh restart

Info

This command restarts the yarnman services

Code Block
yarnman@yarnman-test [ ~ ]$ sudo ym-service-commands.sh restart
restarting yarnman.service
● yarnman.service - yarnman
     Loaded: loaded (/usr/lib/systemd/system/yarnman.service; enabled; vendor preset: enabled)
     Active: active (running) since Wed 2022-08-17 08:27:36 UTC; 6ms ago
    Process: 63277 ExecStartPre=/usr/bin/docker-compose -f docker-compose.yml down (code=exited, status=0/SUCCESS)
   Main PID: 63287 (docker-compose)
      Tasks: 6 (limit: 4694)
     Memory: 4.9M
     CGroup: /system.slice/yarnman.service
             └─63287 /usr/bin/docker-compose -f docker-compose.yml -f docker-compose-override.yml up --remove-orphans

Aug 17 08:27:36 yarnman-test systemd[1]: Starting yarnman...
Aug 17 08:27:36 yarnman-test docker-compose[63277]: yarnman  Warning: No resource found to remove
Aug 17 08:27:36 yarnman-test systemd[1]: Started yarnman.

ym-service-commands.sh status

Info

This command shows the systemd service status

Code Block
yarnman@yarnman-test [ ~ ]$ sudo ym-service-commands.sh status
● yarnman.service - yarnman
     Loaded: loaded (/usr/lib/systemd/system/yarnman.service; enabled; vendor preset: enabled)
     Active: active (running) since Wed 2022-08-17 08:29:13 UTC; 4s ago
    Process: 67157 ExecStartPre=/usr/bin/docker-compose -f docker-compose.yml down (code=exited, status=0/SUCCESS)
   Main PID: 67167 (docker-compose)
      Tasks: 9 (limit: 4694)
     Memory: 15.7M
     CGroup: /system.slice/yarnman.service
             └─67167 /usr/bin/docker-compose -f docker-compose.yml -f docker-compose-override.yml up --remove-orphans

Aug 17 08:29:14 yarnman-test docker-compose[67167]: ym-couchdb  | [info] 2022-08-17T08:29:14.759420Z nonode@nohost <0.11.0> -------- Application ddoc_cache started on node nonode@nohost
Aug 17 08:29:14 yarnman-test docker-compose[67167]: ym-couchdb  | [info] 2022-08-17T08:29:14.769878Z nonode@nohost <0.11.0> -------- Application global_changes started on node nonode@nohost
Aug 17 08:29:14 yarnman-test docker-compose[67167]: ym-couchdb  | [info] 2022-08-17T08:29:14.769962Z nonode@nohost <0.11.0> -------- Application jiffy started on node nonode@nohost
Aug 17 08:29:14 yarnman-test docker-compose[67167]: ym-couchdb  | [info] 2022-08-17T08:29:14.774590Z nonode@nohost <0.11.0> -------- Application mango started on node nonode@nohost
Aug 17 08:29:14 yarnman-test docker-compose[67167]: ym-couchdb  | [info] 2022-08-17T08:29:14.779025Z nonode@nohost <0.11.0> -------- Application setup started on node nonode@nohost
Aug 17 08:29:14 yarnman-test docker-compose[67167]: ym-couchdb  | [info] 2022-08-17T08:29:14.779045Z nonode@nohost <0.11.0> -------- Application snappy started on node nonode@nohost
Aug 17 08:29:15 yarnman-test docker-compose[67167]: ym-yarnman  | 1660724955149 WARN  Setting Default startup.
Aug 17 08:29:15 yarnman-test docker-compose[67167]: ym-couchdb  | [notice] 2022-08-17T08:29:15.166800Z nonode@nohost <0.334.0> 144d89930f localhost:5984 127.0.0.1 undefined GET / 200 ok 70
Aug 17 08:29:16 yarnman-test docker-compose[67167]: ym-couchdb  | [notice] 2022-08-17T08:29:16.252345Z nonode@nohost <0.335.0> 23ea8ef0ca localhost:5984 127.0.0.1 undefined GET / 200 ok 1
Aug 17 08:29:17 yarnman-test docker-compose[67167]: ym-couchdb  | [notice] 2022-08-17T08:29:17.323062Z nonode@nohost <0.465.0> a377eb4c4c localhost:5984 127.0.0.1 undefined GET / 200 ok 0
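For health checks built around ym-service-commands.sh status, matching the systemd "Active:" line is usually enough. A sketch against captured lines (the samples are copied from the status and stop outputs above; the `yarnman_state` helper name is hypothetical):

```shell
#!/bin/sh
# Classify a systemd "Active:" line as running or not (sample lines
# copied from the ym-service-commands.sh outputs above).
yarnman_state() {
  case "$1" in
    *"active (running)"*) echo "running" ;;
    *)                    echo "not running" ;;
  esac
}

yarnman_state "     Active: active (running) since Wed 2022-08-17 08:29:13 UTC; 4s ago"
yarnman_state "     Active: inactive (dead) since Wed 2022-08-17 08:24:16 UTC; 6ms ago"
```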

ym-service-commands.sh status-pm2

Info

This command shows the internal processes of yarnman

Code Block
yarnman@yarnman-test [ ~ ]$ sudo ym-service-commands.sh status-pm2
┌─────┬──────────────────────────────────────────────────────────┬─────────────┬─────────┬─────────┬──────────┬────────┬──────┬───────────┬──────────┬──────────┬──────────┬──────────┐
│ id  │ name                                                     │ namespace   │ version │ mode    │ pid      │ uptime │ ↺    │ status    │ cpu      │ mem      │ user     │ watching │
├─────┼──────────────────────────────────────────────────────────┼─────────────┼─────────┼─────────┼──────────┼────────┼──────┼───────────┼──────────┼──────────┼──────────┼──────────┤
│ 2   │ administration-app-0ca298ae6a834cf29c661930c58cb621      │ default     │ 2.5.18  │ fork    │ 236      │ 10s    │ 0    │ online    │ 0%       │ 137.8mb  │ ym-… │ enabled  │
│ 0   │ arm_fc30b4f5d59f4275829ff8b65d02914b                     │ default     │ 2.5.18  │ fork    │ 121      │ 19s    │ 5    │ online    │ 0%       │ 65.1mb   │ ym-… │ enabled  │
│ 3   │ interconnect-service-49ab91419f064823b8ab85806b3b4ce1    │ default     │ 2.5.18  │ fork    │ 260      │ 8s     │ 0    │ online    │ 0%       │ 138.8mb  │ ym-… │ enabled  │
│ 1   │ jadeberlin_arm_fc30b4f5d59f4275829ff8b65d02914b          │ default     │ N/A     │ fork    │ 0        │ 0      │ 4    │ errored   │ 0%       │ 0b       │ ym-… │ disabled │
│ 4   │ proxy-service-a4500ec67fcc491399dc395e12c1bbe1           │ default     │ 2.5.18  │ fork    │ 271      │ 6s     │ 0    │ online    │ 0%       │ 105.3mb  │ ym-… │ enabled  │
│ 5   │ workflow-service-8b4edbbb287c468cae0f023dd7e0cf44        │ default     │ 2.5.18  │ fork    │ 282      │ 5s     │ 0    │ online    │ 0%       │ 175.4mb  │ ym-… │ enabled  │
└─────┴──────────────────────────────────────────────────────────┴─────────────┴─────────┴─────────┴──────────┴────────┴──────┴───────────┴──────────┴──────────┴──────────┴──────────┘
[PM2][WARN] Current process list is not synchronized with saved list. Type 'pm2 save' to synchronize.

Note that the jadeberlin service will be in an errored state until it is set up

Note that the status-pm2 column layout will change based on the terminal/console width/resolution
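Since jadeberlin sits in an errored state until setup, a quick way to spot unexpected failures is to count errored rows in the status-pm2 table. A sketch over sample rows (abbreviated from the table above; feeding in live `status-pm2` output is an assumption):

```shell
#!/bin/sh
# Count pm2 table rows reporting "errored" status (rows abbreviated
# from the status-pm2 output above).
table='│ 2   │ administration-app-0ca298ae6a834cf29c661930c58cb621 │ online  │
│ 1   │ jadeberlin_arm_fc30b4f5d59f4275829ff8b65d02914b     │ errored │
│ 4   │ proxy-service-a4500ec67fcc491399dc395e12c1bbe1      │ online  │'

# One errored row (jadeberlin) is expected before setup.
printf '%s\n' "$table" | grep -c 'errored'
```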

ym-service-commands.sh yarnman-logs

Info

This command shows the scrolling output of the yarnman services; press CTRL+C to exit

ym-service-commands.sh couchdb-logs

Info

This command shows the scrolling output of the database logs; press CTRL+C to exit

ym-service-commands.sh redis-logs

Info

This command shows the scrolling output of the message bus logs; press CTRL+C to exit

ym-service-commands.sh tang-logs

Info

This command shows the scrolling output of the NBE logs; press CTRL+C to exit

ym-service-commands.sh tang-thp

Note

Note that this command was previously ym-service-commands.sh tang-adv

Info

This command shows the tang thp (thumbprint) used for setting up configuration encryption

Code Block
yarnman@ym-ph-test [ ~ ]$ sudo ym-service-commands.sh tang-adv
9_CZiwV9PKBlQfehPKZO7cd5ZpM

ym-service-commands.sh update-jtapi

Info

This command updates jtapi for test_mate

Code Block
PENDING

ym-edit-config.sh enable-local-admin-access

Info

This command enables local admin access on port 3999

Code Block
PENDING

ym-edit-config.sh disable-local-admin-access

Info

This command disables local admin access on port 3999

Code Block
PENDING

ym-edit-config.sh enable-local-couchdb-access

Info

This command enables couchdb access

Code Block
PENDING

ym-edit-config.sh disable-local-couchdb-access

Info

This command disables couchdb access

Code Block
PENDING

ym-edit-config.sh set-local-yarnman-container-name

Info

This command sets the container hostname for clustered systems

Code Block
PENDING

ym-edit-config.sh unset-local-yarnman-container-name

Info

This command unsets the container hostname for clustered systems

Code Block
PENDING

ym-edit-config.sh enable-yarnman-logs

Info

This command enables yarnman trace logs

Code Block
PENDING

ym-edit-config.sh disable-yarnman-logs

Info

This command enables yarnman debug logs (default)

Code Block
PENDING


Advanced Configuration

ym-encrypt-at-rest.sh

Info

This command encrypts the local keys and configuration using clevis/tang

Code Block
yarnman@ym-ph-test [ ~ ]$ sudo ym-encrypt-at-rest.sh
Database key found proceeding
Number of pins required for decryption :1
Number of pins this must be equal or greater than the number of pins required for decryption :3
Enter URL for tang server 1 :http://10.101.10.10:6655
Enter THP for tang server 1 :DwLco7FJtXWxFTprQ5M3cojJsZo
Connection successful to : http://10.101.10.10:6655
Enter URL for tang server 2 :http://10.101.10.11:6655
Enter THP for tang server 2 :0Lqk7DroJ0g3patTCgTweMUAHPc
Connection successful to : http://10.101.10.11:6655
Enter URL for tang server 3 :http://10.101.10.12:6655
Enter THP for tang server 3 :GEpmSTQfz8ctVxdgQEp_rnS3za
Connection successful to : http://10.101.10.12:6655

{
  "t": 1,
  "pins": {
    "tang": [
      {
        "url": "http://10.101.10.10:6655",
        "thp": "DwLco7FJtXWxFTprQ5M3cojJsZo"
      },
      {
        "url": "http://10.101.10.11:6655",
        "thp": "0Lqk7DroJ0g3patTCgTweMUAHPc"
      },
      {
        "url": "http://10.101.10.12:6655",
        "thp": "GEpmSTQfz8ctVxdgQEp_rnS3za"
      }
    ]
  }
}
Do you want to encrypt configuration? Y or Ny
encrypt configuration
Encrypting keys
1668397245104 INFO  Encrypting private and SSL keys using settings:
1668397245106 INFO    - not overwriting existing encrypted files and not deleting any original files after encryption
1668397245106 INFO  --------------------------------
1668397245106 INFO  Encrypting...
1668397245308 INFO    - 'private-encryption-key.pem' encrypted successfully
1668397245543 INFO    - 'ssl-key.pem' encrypted successfully
1668397245543 INFO  --------------------------------
1668397245543 INFO  Finished encrypting the files
Encrypting config
1668397245643 INFO  Starting the encryption of 1 local configuration fields through Clevis Shamir Secret Sharing
1668397245743 INFO  Attempting to encrypt the following local config fields: couchdb.password
1668397245843 INFO  Local key 'couchdb.password' encrypted successfully
1668397245943 INFO  1 local config fields encrypted, 0 fields omitted
Do you want to take a backup of database key this will be shown on console? Y orNy
Echo private key to console
-----BEGIN RSA PRIVATE KEY-----
REMOVED
-----END RSA PRIVATE KEY-----
Encrypted private key is 8129 bytes
restarting services
Config encryption is complete
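In the Shamir configuration shown above, "t" is the decryption threshold: with t set to 1 and three tang pins, any single reachable tang server can unseal the configuration. A sketch that counts the pins in such an sss policy (the JSON is abbreviated from the output above; jq is deliberately avoided so only POSIX tools are needed, and the comparison logic is illustrative, not part of ym-encrypt-at-rest.sh):

```shell
#!/bin/sh
# Count tang pins in a clevis sss policy (JSON abbreviated from the
# ym-encrypt-at-rest.sh output above) and compare to the threshold t=1.
policy='{"t":1,"pins":{"tang":[
  {"url":"http://10.101.10.10:6655"},
  {"url":"http://10.101.10.11:6655"},
  {"url":"http://10.101.10.12:6655"}]}}'

# Each pin entry carries exactly one "url" key, so counting matching
# lines gives the number of configured tang servers.
pins=$(printf '%s\n' "$policy" | grep -c '"url"')
threshold=1

echo "pins=$pins threshold=$threshold"   # prints pins=3 threshold=1
[ "$pins" -ge "$threshold" ] && echo "enough pins configured"
```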