Oefenweb / ansible-fail2ban

Ansible role to set up fail2ban in Debian-like systems
MIT License

banaction_allports not part of config options? #62

Closed jeanmonet closed 3 years ago

jeanmonet commented 4 years ago

Hi - first of all thank you for your work on this role.

Regarding the possible configuration options, is there a reason why the banaction_allports variable is not offered? Or could it be added?

tersmitten commented 4 years ago

I'm not sure that I know what the purpose of it will be. Can you point me to some documentation or example?

jeanmonet commented 4 years ago

I'm not an expert on fail2ban, but I was assuming banaction_allports can be used, for example, in the following way:

# jail.local
[sshd]
banaction = %(banaction_allports)s
...

The goal in this example would be to have each ban issued by the sshd jail be applicable to all the server's ports. Let me know if I'm mistaken.

By setting the banaction_allports parameter, I'm assuming I could choose, for example, ufw instead of iptables, just as with the banaction parameter.
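For context, the underlying fail2ban mechanism is plain option interpolation: the stock jail.conf defines banaction_allports = iptables-allports in [DEFAULT], and a jail can reference it with %(...)s. A minimal hand-written jail.local illustrating the override (a sketch, assuming a stock fail2ban install that ships the bundled ufw action):

```ini
# /etc/fail2ban/jail.local (sketch)
[DEFAULT]
# override the all-ports action; fail2ban's shipped default is iptables-allports
banaction_allports = ufw

[sshd]
enabled = true
# bans issued by this jail now use the all-ports action
banaction = %(banaction_allports)s
```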

tersmitten commented 4 years ago

I think that this should work:

---
- hosts: all
  roles:
    - fail2ban
  vars:
    fail2ban_services:
      - name: sshd
        banaction: '"%(banaction_allports)s"'

jeanmonet commented 4 years ago

Won't this apply the banaction_allports default (iptables-allports)? If one wants banaction_allports to use ufw instead, one can set banaction_allports = ufw manually (to my understanding).

However, the role doesn't currently provide the ability to set fail2ban_banaction_allports: ufw.

Edit: on my side, I will probably stick to iptables for banaction, but it may be useful to have both parameters available in the role, not just the first:

---
- hosts: all
  roles:
    - fail2ban
  vars:
    fail2ban_banaction: ufw
    fail2ban_banaction_allports: ufw    # role doesn't currently provide this parameter
    fail2ban_services:
      - name: sshd
        banaction: '"%(banaction_allports)s"'

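If the role gained such a variable, the rendered jail.local would presumably look something like this (a sketch only: fail2ban_banaction_allports is the proposed, not yet existing, variable, and the exact output depends on the role's template):

```ini
# jail.local as it might be rendered from the vars above (hypothetical)
[DEFAULT]
banaction = ufw
banaction_allports = ufw

[sshd]
banaction = %(banaction_allports)s
```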
tersmitten commented 3 years ago

Fixed in #64?

Feel free to reopen.