SA: puppet agent -t shows RabbitMQ permission issues after an upgrade or package installation.
Originally Published: 2016-02-13
Applies To
RSA Product/Service Type: SA Core Appliance, SA Virtual Log Collector
RSA Version/Condition: 10.5.1.2
Platform: CentOS
O/S Version: 6
Issue
Running puppet agent -t on the appliance fails with RabbitMQ errors similar to the following:
[root@XXXXX ~]# puppet agent -t
Info: Retrieving pluginfacts
Info: Retrieving plugin
Info: Loading facts
Info: Caching catalog for ce104637-d54b-44fa-8c44-bb9af55f1659
Info: Applying configuration version '1454009856'
Notice: /Stage[main]/Rabbitmq/File[/etc/rabbitmq]/group: group changed 'root' to 'rabbitmq'
Notice: /Stage[main]/Rabbitmq/File[/etc/rabbitmq]/mode: mode changed '0755' to '0775'
Notice: /Stage[main]/Ssh/Exec[fix-ssh]/returns: executed successfully
Notice: /Stage[main]/Rabbitmq/Service[rabbitmq-server]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Rabbitmq/Service[rabbitmq-server]: Unscheduling refresh on Service[rabbitmq-server]
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns: Error: unable to connect to node sa@localhost: nodedown
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns: DIAGNOSTICS
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns: ===========
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns: attempted to contact: [sa@localhost]
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns: sa@localhost:
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns: * connected to epmd (port 4369) on localhost
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns: * epmd reports: node 'sa' not running at all
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns: other nodes on localhost: ['rabbitmqctl-6830']
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns: * suggestion: start the node
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns: current node details:
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns: - node name: 'rabbitmqctl-6830@rsa-collector-01'
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns: - home dir: /var/lib/rabbitmq
Notice: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns: - cookie hash: jHfuU22qoNu7EVDTiz/tGA==
Error: rabbitmqctl -n sa@localhost add_vhost /rsa/system returned 2 instead of one of [0]
Error: /Stage[main]/Rabbitmq/Exec[create-system-vhost]/returns: change from notrun to 0 failed: rabbitmqctl -n sa@localhost add_vhost /rsa/system returned 2 instead of one of [0]
Notice: /Stage[main]/Rabbitmq/Exec[set-carlos-federation-policy]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Rabbitmq/Exec[set-carlos-federation-policy]: Skipping because of failed dependencies
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns: Error: unable to connect to node sa@localhost: nodedown
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns: DIAGNOSTICS
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns: ===========
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns: attempted to contact: [sa@localhost]
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns: sa@localhost:
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns: * connected to epmd (port 4369) on localhost
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns: * epmd reports: node 'sa' not running at all
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns: other nodes on localhost: ['rabbitmqctl-6977']
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns: * suggestion: start the node
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns: current node details:
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns: - node name: 'rabbitmqctl-6977@rsa-collector-01'
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns: - home dir: /var/lib/rabbitmq
Notice: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns: - cookie hash: jHfuU22qoNu7EVDTiz/tGA==
Error: rabbitmqctl -q -n sa@localhost add_user e5eac11a-8f4d-400d-a540-42cbc325557d e5eac11a-8f4d-400d-a540-42cbc325557d && rabbitmqctl -n sa@localhost clear_password e5eac11a-8f4d-400d-a540-42cbc325557d returned 2 instead of one of [0]
Error: /Stage[main]/Rabbitmq/Exec[create-sa_node_id-user]/returns: change from notrun to 0 failed: rabbitmqctl -q -n sa@localhost add_user e5eac11a-8f4d-400d-a540-42cbc325557d e5eac11a-8f4d-400d-a540-42cbc325557d && rabbitmqctl -n sa@localhost clear_password e5eac11a-8f4d-400d-a540-42cbc325557d returned 2 instead of one of [0]
Notice: /Stage[main]/Rabbitmq/Exec[set-sa_node_id-system-vhost-permissions]: Dependency Exec[create-sa_node_id-user] has failures: true
Notice: /Stage[main]/Rabbitmq/Exec[set-sa_node_id-system-vhost-permissions]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Rabbitmq/Exec[set-sa_node_id-system-vhost-permissions]: Skipping because of failed dependencies
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns: Error: unable to connect to node sa@localhost: nodedown
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns: DIAGNOSTICS
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns: ===========
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns: attempted to contact: [sa@localhost]
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns: sa@localhost:
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns: * connected to epmd (port 4369) on localhost
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns: * epmd reports: node 'sa' not running at all
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns: other nodes on localhost: ['rabbitmqctl-7120']
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns: * suggestion: start the node
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns: current node details:
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns: - node name: 'rabbitmqctl-7120@rsa-collector-01'
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns: - home dir: /var/lib/rabbitmq
Notice: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns: - cookie hash: jHfuU22qoNu7EVDTiz/tGA==
Error: rabbitmqctl -n sa@localhost add_vhost /rsa/sa returned 2 instead of one of [0]
Error: /Stage[main]/Rabbitmq/Exec[create-sa-vhost]/returns: change from notrun to 0 failed: rabbitmqctl -n sa@localhost add_vhost /rsa/sa returned 2 instead of one of [0]
Notice: /Stage[main]/Rabbitmq/Exec[set-guest-user-sa_vhost-permissions]: Dependency Exec[create-sa-vhost] has failures: true
Warning: /Stage[main]/Rabbitmq/Exec[set-guest-user-sa_vhost-permissions]: Skipping because of failed dependencies
Notice: /Stage[main]/Rabbitmq/Exec[set-sa_node_id-user-sa_vhost-permissions]: Dependency Exec[create-sa-vhost] has failures: true
Notice: /Stage[main]/Rabbitmq/Exec[set-sa_node_id-user-sa_vhost-permissions]: Dependency Exec[create-sa_node_id-user] has failures: true
Warning: /Stage[main]/Rabbitmq/Exec[set-sa_node_id-user-sa_vhost-permissions]: Skipping because of failed dependencies
Notice: /Stage[main]/Rabbitmq/Exec[set-carlos-federation-sa_vhost-policy]: Dependency Exec[create-sa-vhost] has failures: true
Warning: /Stage[main]/Rabbitmq/Exec[set-carlos-federation-sa_vhost-policy]: Skipping because of failed dependencies
Notice: /Stage[main]/Rabbitmq/Exec[set-guest-user-system-vhost-permissions]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Rabbitmq/Exec[set-guest-user-system-vhost-permissions]: Skipping because of failed dependencies
Notice: /Stage[main]/Yumconfig/Exec[disable-Centos-Repos]/returns: executed successfully
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns: Error: unable to connect to node sa@localhost: nodedown
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns: DIAGNOSTICS
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns: ===========
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns: attempted to contact: [sa@localhost]
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns: sa@localhost:
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns: * connected to epmd (port 4369) on localhost
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns: * epmd reports: node 'sa' not running at all
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns: other nodes on localhost: ['rabbitmqctl-7335']
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns: * suggestion: start the node
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns:
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns: current node details:
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns: - node name: 'rabbitmqctl-7335@rsa-collector-01'
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns: - home dir: /var/lib/rabbitmq
Notice: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns: - cookie hash: jHfuU22qoNu7EVDTiz/tGA==
Error: rabbitmqctl -q -n sa@localhost add_user ce104637-d54b-44fa-8c44-bb9af55f1659 ce104637-d54b-44fa-8c44-bb9af55f1659 && rabbitmqctl -n sa@localhost clear_password ce104637-d54b-44fa-8c44-bb9af55f1659 returned 2 instead of one of [0]
Error: /Stage[main]/Rabbitmq/Exec[create-node_id-user]/returns: change from notrun to 0 failed: rabbitmqctl -q -n sa@localhost add_user ce104637-d54b-44fa-8c44-bb9af55f1659 ce104637-d54b-44fa-8c44-bb9af55f1659 && rabbitmqctl -n sa@localhost clear_password ce104637-d54b-44fa-8c44-bb9af55f1659 returned 2 instead of one of [0]
Notice: /Stage[main]/Rabbitmq/Exec[set-node_id-user-sa_vhost-permissions]: Dependency Exec[create-node_id-user] has failures: true
Notice: /Stage[main]/Rabbitmq/Exec[set-node_id-user-sa_vhost-permissions]: Dependency Exec[create-sa-vhost] has failures: true
Warning: /Stage[main]/Rabbitmq/Exec[set-node_id-user-sa_vhost-permissions]: Skipping because of failed dependencies
Notice: /Stage[main]/Rabbitmq/Exec[set-node_id-system-vhost-permissions]: Dependency Exec[create-node_id-user] has failures: true
Notice: /Stage[main]/Rabbitmq/Exec[set-node_id-system-vhost-permissions]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Rabbitmq/Exec[set-node_id-system-vhost-permissions]: Skipping because of failed dependencies
Notice: /Stage[main]/Rsa-sms-runtime/File[MessageBusWriteModule.conf]: Dependency Exec[create-node_id-user] has failures: true
Notice: /Stage[main]/Rsa-sms-runtime/File[MessageBusWriteModule.conf]: Dependency Exec[create-sa-vhost] has failures: true
Notice: /Stage[main]/Rsa-sms-runtime/File[MessageBusWriteModule.conf]: Dependency Exec[create-sa_node_id-user] has failures: true
Notice: /Stage[main]/Rsa-sms-runtime/File[MessageBusWriteModule.conf]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Rsa-sms-runtime/File[MessageBusWriteModule.conf]: Skipping because of failed dependencies
Notice: /Stage[main]/Mcollective/File[mcollective-server-private]: Dependency Exec[create-node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/File[mcollective-server-private]: Dependency Exec[create-sa-vhost] has failures: true
Notice: /Stage[main]/Mcollective/File[mcollective-server-private]: Dependency Exec[create-sa_node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/File[mcollective-server-private]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Mcollective/File[mcollective-server-private]: Skipping because of failed dependencies
Notice: /Stage[main]/Mcollective/File[mcollective-client-public]: Dependency Exec[create-node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/File[mcollective-client-public]: Dependency Exec[create-sa-vhost] has failures: true
Notice: /Stage[main]/Mcollective/File[mcollective-client-public]: Dependency Exec[create-sa_node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/File[mcollective-client-public]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Mcollective/File[mcollective-client-public]: Skipping because of failed dependencies
Notice: /Stage[main]/Mcollective/Package[mcollective]: Dependency Exec[create-node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/Package[mcollective]: Dependency Exec[create-sa-vhost] has failures: true
Notice: /Stage[main]/Mcollective/Package[mcollective]: Dependency Exec[create-sa_node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/Package[mcollective]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Mcollective/Package[mcollective]: Skipping because of failed dependencies
Notice: /Stage[main]/Mcollective/File[server.cfg]: Dependency Exec[create-node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/File[server.cfg]: Dependency Exec[create-sa-vhost] has failures: true
Notice: /Stage[main]/Mcollective/File[server.cfg]: Dependency Exec[create-sa_node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/File[server.cfg]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Mcollective/File[server.cfg]: Skipping because of failed dependencies
Notice: /Stage[main]/Mcollective/File[stompgemfile]: Dependency Exec[create-node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/File[stompgemfile]: Dependency Exec[create-sa-vhost] has failures: true
Notice: /Stage[main]/Mcollective/File[stompgemfile]: Dependency Exec[create-sa_node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/File[stompgemfile]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Mcollective/File[stompgemfile]: Skipping because of failed dependencies
Notice: /Package[stompgem]: Dependency Exec[create-node_id-user] has failures: true
Notice: /Package[stompgem]: Dependency Exec[create-sa-vhost] has failures: true
Notice: /Package[stompgem]: Dependency Exec[create-sa_node_id-user] has failures: true
Notice: /Package[stompgem]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Package[stompgem]: Skipping because of failed dependencies
Notice: /Stage[main]/Rsa-sms-runtime/File[MessageBus.conf]: Dependency Exec[create-node_id-user] has failures: true
Notice: /Stage[main]/Rsa-sms-runtime/File[MessageBus.conf]: Dependency Exec[create-sa-vhost] has failures: true
Notice: /Stage[main]/Rsa-sms-runtime/File[MessageBus.conf]: Dependency Exec[create-sa_node_id-user] has failures: true
Notice: /Stage[main]/Rsa-sms-runtime/File[MessageBus.conf]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Rsa-sms-runtime/File[MessageBus.conf]: Skipping because of failed dependencies
Notice: /Stage[main]/Mcollective/File[mcollective-server-public]: Dependency Exec[create-node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/File[mcollective-server-public]: Dependency Exec[create-sa-vhost] has failures: true
Notice: /Stage[main]/Mcollective/File[mcollective-server-public]: Dependency Exec[create-sa_node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/File[mcollective-server-public]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Mcollective/File[mcollective-server-public]: Skipping because of failed dependencies
Notice: /Stage[main]/Mcollective/Service[mcollective]: Dependency Exec[create-node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/Service[mcollective]: Dependency Exec[create-sa-vhost] has failures: true
Notice: /Stage[main]/Mcollective/Service[mcollective]: Dependency Exec[create-sa_node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/Service[mcollective]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Mcollective/Service[mcollective]: Skipping because of failed dependencies
Notice: /Stage[main]/Mcollective/File[/etc/mcollective/facts.yaml]: Dependency Exec[create-node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/File[/etc/mcollective/facts.yaml]: Dependency Exec[create-sa-vhost] has failures: true
Notice: /Stage[main]/Mcollective/File[/etc/mcollective/facts.yaml]: Dependency Exec[create-sa_node_id-user] has failures: true
Notice: /Stage[main]/Mcollective/File[/etc/mcollective/facts.yaml]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Mcollective/File[/etc/mcollective/facts.yaml]: Skipping because of failed dependencies
Notice: /Stage[main]/Collectd/Service[collectd]: Dependency Exec[create-node_id-user] has failures: true
Notice: /Stage[main]/Collectd/Service[collectd]: Dependency Exec[create-sa-vhost] has failures: true
Notice: /Stage[main]/Collectd/Service[collectd]: Dependency Exec[create-sa_node_id-user] has failures: true
Notice: /Stage[main]/Collectd/Service[collectd]: Dependency Exec[create-system-vhost] has failures: true
Warning: /Stage[main]/Collectd/Service[collectd]: Skipping because of failed dependencies
Notice: Finished catalog run in 28.11 seconds
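The failures above can be confirmed manually before attempting the fix. The following is a diagnostic sketch; the node name sa@localhost is taken from the log output above, and the commands assume a CentOS 6 appliance with the RabbitMQ init script installed.

```shell
# Confirm the symptom manually: the 'sa' RabbitMQ node is down even though
# the rabbitmq-server service may appear to start.
service rabbitmq-server status       # check whether the service itself is running
rabbitmqctl -n sa@localhost status   # exits non-zero with a "nodedown" diagnostic if the node is unreachable
epmd -names                          # lists Erlang nodes registered with epmd (port 4369); a healthy broker registers 'sa'
```

If rabbitmqctl reports nodedown and epmd does not list the sa node, proceed with the resolution below.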
Cause
The RabbitMQ node (sa@localhost) fails to start after the upgrade or package installation, so epmd reports node 'sa' not running at all and every rabbitmqctl command exits with a nodedown error, causing the dependent Puppet resources to be skipped. This typically indicates that the rabbitmq-server package was not installed or upgraded cleanly and must be reinstalled.
Resolution
1. Log in to the appliance via SSH as the admin user.
2. Upload the rabbitmq-server-3.4.2-1.noarch.rpm package file to the appliance using a tool such as WinSCP.
3. Set execute permission on the package file with the following command.
chmod +x rabbitmq-server-3.4.2-1.noarch.rpm
4. Reinstall the package with the following command.
yum reinstall rabbitmq-server-3.4.2-1.noarch.rpm
5. Run puppet agent -t on the appliance to apply the configuration changes.
Sample output will be similar to the following.
[root@XXXX ~]# puppet agent -t
Info: Retrieving pluginfacts
Info: Retrieving plugin
Info: Loading facts
Info: Caching catalog for ce104637-d54b-44fa-8c44-bb9af55f1659
Info: Applying configuration version '1454009856'
Notice: /Stage[main]/Logcollector/File[/usr/lib/rabbitmq/lib/rabbitmq_server-3.4.2/plugins/nw_admin.ez]/target: target changed '/opt/netwitness/nw_admin-10.5.1.0.14254.ez' to '/opt/netwitness/nw_admin-10.5.1.2.14400.ez'
Info: /Stage[main]/Logcollector/File[/usr/lib/rabbitmq/lib/rabbitmq_server-3.4.2/plugins/nw_admin.ez]: Scheduling refresh of Service[rabbitmq-server]
Notice: /Stage[main]/Ssh/Exec[fix-ssh]/returns: executed successfully
Notice: /Stage[main]/Appliance/Base::Trusted_connection[appliance-trusted-connection]/Exec[copy-appliance-key_and_cert]/returns: executed successfully
Info: /Stage[main]/Appliance/Base::Trusted_connection[appliance-trusted-connection]/Exec[copy-appliance-key_and_cert]: Scheduling refresh of Exec[tc-restart-nwappliance]
Notice: /Stage[main]/Appliance/Base::Trusted_connection[appliance-trusted-connection]/Exec[copy-appliance-trusted-connections_sa_node_cert]/returns: executed successfully
Info: /Stage[main]/Appliance/Base::Trusted_connection[appliance-trusted-connection]/Exec[copy-appliance-trusted-connections_sa_node_cert]: Scheduling refresh of Exec[tc-restart-nwappliance]
Notice: /Stage[main]/Appliance/Base::Trusted_connection[appliance-trusted-connection]/Exec[copy-appliance-trusted-connections_ca_cert]/returns: executed successfully
Info: /Stage[main]/Appliance/Base::Trusted_connection[appliance-trusted-connection]/Exec[copy-appliance-trusted-connections_ca_cert]: Scheduling refresh of Exec[tc-restart-nwappliance]
Notice: /Stage[main]/Rabbitmq/Service[rabbitmq-server]/ensure: ensure changed 'stopped' to 'running'
Info: /Stage[main]/Rabbitmq/Service[rabbitmq-server]: Unscheduling refresh on Service[rabbitmq-server]
Notice: /Stage[main]/Appliance/Base::Trusted_connection[appliance-trusted-connection]/Exec[copy-appliance-trusted-connections_node_cert]/returns: executed successfully
Info: /Stage[main]/Appliance/Base::Trusted_connection[appliance-trusted-connection]/Exec[copy-appliance-trusted-connections_node_cert]: Scheduling refresh of Exec[tc-restart-nwappliance]
Notice: /Stage[main]/Appliance/Base::Trusted_connection[appliance-trusted-connection]/Exec[tc-restart-nwappliance]: Triggered 'refresh' from 4 events
Notice: /Stage[main]/Yumconfig/Exec[disable-Centos-Repos]/returns: executed successfully
Notice: /Stage[main]/Logcollector/Base::Trusted_connection[logcollector-trusted-connection]/Exec[copy-logcollector-trusted-connections_ca_cert]/returns: executed successfully
Info: /Stage[main]/Logcollector/Base::Trusted_connection[logcollector-trusted-connection]/Exec[copy-logcollector-trusted-connections_ca_cert]: Scheduling refresh of Exec[tc-restart-nwlogcollector]
Notice: /Stage[main]/Logcollector/Base::Trusted_connection[logcollector-trusted-connection]/Exec[copy-logcollector-trusted-connections_node_cert]/returns: executed successfully
Info: /Stage[main]/Logcollector/Base::Trusted_connection[logcollector-trusted-connection]/Exec[copy-logcollector-trusted-connections_node_cert]: Scheduling refresh of Exec[tc-restart-nwlogcollector]
Notice: /Stage[main]/Logcollector/Base::Trusted_connection[logcollector-trusted-connection]/Exec[copy-logcollector-key_and_cert]/returns: executed successfully
Info: /Stage[main]/Logcollector/Base::Trusted_connection[logcollector-trusted-connection]/Exec[copy-logcollector-key_and_cert]: Scheduling refresh of Exec[tc-restart-nwlogcollector]
Notice: /Stage[main]/Logcollector/Base::Trusted_connection[logcollector-trusted-connection]/Exec[copy-logcollector-trusted-connections_sa_node_cert]/returns: executed successfully
Info: /Stage[main]/Logcollector/Base::Trusted_connection[logcollector-trusted-connection]/Exec[copy-logcollector-trusted-connections_sa_node_cert]: Scheduling refresh of Exec[tc-restart-nwlogcollector]
Error: /Stage[main]/Logcollector/Base::Trusted_connection[logcollector-trusted-connection]/Exec[tc-restart-nwlogcollector]: Failed to call refresh: Command exceeded timeout
Error: /Stage[main]/Logcollector/Base::Trusted_connection[logcollector-trusted-connection]/Exec[tc-restart-nwlogcollector]: Command exceeded timeout
Notice: /Stage[main]/Mcollective/File[/etc/mcollective/facts.yaml]/content:
--- /etc/mcollective/facts.yaml 2015-12-16 11:41:54.000000000 +0000
+++ /tmp/puppet-file20160211-30939-1ioo683-0 2016-02-11 13:58:34.064509260 +0000
@@ -44,14 +44,14 @@
is_virtual: "true"
kernel: Linux
kernelmajversion: "2.6"
- kernelrelease: "2.6.32-504.1.3.el6.x86_64"
+ kernelrelease: "2.6.32-573.12.1.el6.x86_64"
kernelversion: "2.6.32"
macaddress: "00:50:56:94:0A:5D"
macaddress_eth0: "00:50:56:94:0A:5D"
management_interface: eth0
manufacturer: "VMware, Inc."
memorysize: "3.74 GB"
- memorysize_mb: "3832.64"
+ memorysize_mb: "3832.53"
module_name: ""
mtu_eth0: "1500"
mtu_lo: "65536"
@@ -64,10 +64,10 @@
node_cert_hash: e09ad79f
node_id: ce104637-d54b-44fa-8c44-bb9af55f1659
ntp_servers: puppetmaster.local
- nw_admin_path: /opt/netwitness/nw_admin-10.5.1.0.14254.ez
+ nw_admin_path: /opt/netwitness/nw_admin-10.5.1.2.14400.ez
operatingsystem: CentOS
operatingsystemmajrelease: "6"
- operatingsystemrelease: "6.6"
+ operatingsystemrelease: "6.7"
osfamily: RedHat
physicalprocessorcount: "2"
pm_node: "false"
Info: Computing checksum on file /etc/mcollective/facts.yaml
Info: /Stage[main]/Mcollective/File[/etc/mcollective/facts.yaml]: Filebucketed /etc/mcollective/facts.yaml to main with sum d544b3cc92cedf8cb25eace4be8f3933
Notice: /Stage[main]/Mcollective/File[/etc/mcollective/facts.yaml]/content: content changed '{md5}d544b3cc92cedf8cb25eace4be8f3933' to '{md5}f7f5bdd753a14d1e16374d8fb3d807b3'
Notice: Finished catalog run in 336.83 seconds
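After the catalog run completes, the broker can be checked to confirm the node is healthy. This is a verification sketch; the node name and the vhost names (/rsa/system, /rsa/sa) are taken from the Puppet log output above.

```shell
# Verify the RabbitMQ node recovered after the reinstall and Puppet run.
rabbitmqctl -n sa@localhost status       # the node should now respond instead of reporting nodedown
rabbitmqctl -n sa@localhost list_vhosts  # the /rsa/system and /rsa/sa vhosts created by Puppet should be listed
rabbitmqctl -n sa@localhost list_users   # the node_id users created by the Puppet run should be listed
```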