Ansible-playbook, ssh hosts unreachable

Hi.

I have just switched to ParrotOS (Home Edition) on my main workstation. My main reason for switching was the extensive use of firejail, but now I suspect that firejail itself may be causing me problems.

I am trying to run an ansible-playbook, but it fails with the hosts unreachable. If I connect directly to the hosts with ssh, without ansible, it works.

I tried adding noblacklist ${HOME}/.ansible to /etc/firejail/ssh.profile, but that didn’t help.

I don’t know if it is relevant, but I installed ansible with pip3.

UPDATE: I got it working by temporarily changing my PATH: I swapped the order of /usr/bin and /usr/local/bin so that /usr/bin comes first, which prevents firejail from being executed. This is not a viable solution, so if anyone has tips on what to put in a custom profile to make this work permanently, that would be great. I use ansible a lot, so I need to get this working, and I really don’t want to disable firejail.
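For reference, the temporary workaround looks something like this (a sketch only; it assumes, as on a stock Parrot install, that firejail's redirect symlinks live in /usr/local/bin while the real binaries are in /usr/bin):

```shell
# Temporary workaround only: search /usr/bin before /usr/local/bin so the
# real ssh binary is found instead of the firejail redirect symlink.
export PATH="/usr/bin:${PATH}"
echo "$PATH"    # /usr/bin now comes first
# Then run the playbook as usual, e.g.:
#   ansible-playbook site.yml    (site.yml is a placeholder name)
```

This only lasts for the current shell session, which is why it is not a real fix.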


What version of Parrot are you running? (include version, edition, and architecture)

Distributor ID: Parrot
Description: Parrot 4.1
Release: 4.1
Codename: stable

What method did you use to install Parrot? (Debian Standard / Debian GTK / parrot-experimental)
Debian Standard

Configured to multiboot with other systems? (yes / no)
No

If there are any similar issues or solutions, link to them below:

If there are any error messages or relevant logs, post them below:

ansible-playbook 2.5.5
  config file = None
  configured module search path = ['/home/user/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.6/dist-packages/ansible
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.6.5 (default, May 11 2018, 13:30:17) [GCC 7.3.0]
No config file found; using defaults                                                                      



Using module file /usr/local/lib/python3.6/dist-packages/ansible/modules/system/setup.py
<IP REMOVED> ESTABLISH SSH CONNECTION FOR USER: USERNAME
<IP REMOVED> SSH: EXEC ssh -C -o ControlMaster=auto -o ControlPersist=60s -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=USERNAME -o ConnectTimeout=10 -o ControlPath=/home/user/.ansible/cp/01a2b7c5f7 IP REMOVED '/bin/sh -c '"'"'echo ~USERNAME && sleep 0'"'"''
<IP REMOVED> (255, b'/home/USERNAME\n', b'')
fatal: [HOSTNAME REMOVED]: UNREACHABLE! => {
    "changed": false,
    "msg": "Failed to connect to the host via ssh: ",
    "unreachable": true
}

If firejail is the problem, you can use firejail --noprofile ansible-playbook to drop the default profile for that one invocation, and then create a custom profile for it to make the fix permanent.
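A minimal custom profile might look like this (an untested sketch: it assumes the jailed ssh is being denied access to ~/.ansible, where ansible keeps its ControlMaster sockets, and relies on firejail's standard mechanism of merging ~/.config/firejail/ssh.local into the stock ssh profile):

```shell
# Sketch: a local override that firejail merges into the stock ssh profile
# via the "include ssh.local" line in /etc/firejail/ssh.profile.
mkdir -p ~/.config/firejail
cat > ~/.config/firejail/ssh.local <<'EOF'
# Let the jailed ssh reach ansible's ControlPath sockets in ~/.ansible/cp
noblacklist ${HOME}/.ansible
EOF
```

Putting the override in ~/.config/firejail (rather than editing /etc/firejail/ssh.profile directly) also keeps it safe across package upgrades.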

If that doesn’t solve your problem, use the -v argument to get more information that might help (or -vvv / -vvvv for even more info).

That did the trick, thank you.

Since ansible itself was not running in firejail (the problem was the ssh process spawned by ansible), I didn’t think that would work, but it did :smiley:
