Guimove / terraform-aws-bastion

Terraform module which creates SSH bastion infrastructure on AWS
https://registry.terraform.io/modules/Guimove/bastion/aws
Apache License 2.0

Is the logfile in S3 empty or is it my misconfiguration? #140

Closed Invertisment closed 1 year ago

Invertisment commented 2 years ago

I downloaded the .data file, but it doesn't contain anything. I connected to a database using SSH port tunneling and nothing was logged. Are you sure the logging does anything at all? How do I even test that the logs work?

This was the guide that I used to connect to a DB: https://aws.amazon.com/premiumsupport/knowledge-center/rds-connect-using-bastion-host-linux/

It simply sets up a port-forwarding rule via the bastion so I can connect to the hidden DB as if it were on localhost. I worked like this for some time and browsed the DB, but still nothing was logged at all. Are you sure your logging works? Maybe I misconfigured something... I hope that's the case.
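For context, the tunnel from that guide is set up with something like the following (the host names and ports here are placeholders, not my real values):

```shell
# Forward local port 5432 through the bastion to the RDS instance.
# bastion.example.com and db.internal.example.com are placeholder names;
# -N opens the tunnel without starting a remote shell.
ssh -N -L 5432:db.internal.example.com:5432 ec2-user@bastion.example.com
```

Since -N never starts a shell on the bastion, a session like this may bypass whatever shell-level logging the host does.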

This is my screenshot of the S3 .data file (I tried to delete one of the versions and this is what you see there): (screenshot)

If I understand correctly, there is a batch worker that copies an empty log into S3 every once in a while. That's not good.
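If that's right, the periodic sync is presumably a cron job on the instance doing something along these lines (the bucket name and exact flags are my assumptions, not taken from the module source):

```shell
# Hypothetical sketch of the periodic sync: copy everything under
# /var/log/bastion to the S3 bucket, whether or not the files are empty.
aws s3 cp /var/log/bastion/ s3://my-bastion-bucket/logs/ --recursive
```

That would explain both the empty .data objects and the version history piling up.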

This is the content of that file (I probably logged in via SSH at some point, but SSH port forwarding is still not logged):

Script started on 2022-06-24 10:19:52+0000
]0;ec2-user@ip-_-_-_-_:~[ec2-user@ip-_-_-_-_ ~]$ exit

Script done on 2022-06-24 10:19:53+0000

I tried to log in and run several commands via the SSH session and this was the result:

Script started on 2022-06-24 10:19:52+0000
]0;ec2-user@ip-_-_-_-_:~[ec2-user@ip-_-_-_-_ ~]$ exit

Script done on 2022-06-24 10:19:53+0000

So... nothing was logged at all. Not even a single command. I ran echo hello. To prove that I waited for the batch job to complete, I took this screenshot: (screenshot)

My config (I omitted all values and used _ there):

module "bastion" {
  source                       = "Guimove/bastion/aws"
  vpc_id                       = _
  region                       = _
  is_lb_private                = false // internet-facing load balancer
  create_dns_record            = false
  bucket_name                  = "_"
  bastion_host_key_pair        = "_"
  bastion_launch_template_name = "_"
  bucket_force_destroy         = true
  //bastion_iam_policy_name = "BastionHost"
  elb_subnets                  = [_]
  auto_scaling_group_subnets   = [_]

  depends_on = [_]
}

If I log in to the host and look at /var/log/bastion, I get this:

[ec2-user@ip-_ bastion]$ pwd
/var/log/bastion
[ec2-user@ip-_ bastion]$ du -sh *
4.0K    2022-06-24_10-19-52_ec2-user_UPEkYmm5Qx3kA7S09RfaVC0uzfkimlbk.data
4.0K    2022-06-24_10-19-52_ec2-user_UPEkYmm5Qx3kA7S09RfaVC0uzfkimlbk.time
4.0K    2022-06-25_16-10-57_ec2-user_00cRAqTaQQ0LnKsy3LWp9VRoewWUsA0O.data
4.0K    2022-06-25_16-10-57_ec2-user_00cRAqTaQQ0LnKsy3LWp9VRoewWUsA0O.time
12K 2022-06-25_16-23-51_ec2-user_C6lUiIeIFJMslIjQkBf8fausqsGWXoTU.data
4.0K    2022-06-25_16-23-51_ec2-user_C6lUiIeIFJMslIjQkBf8fausqsGWXoTU.time
Invertisment commented 2 years ago

For some reason when I logged in it created a new log file, and this time it logged quite a lot. I'm not yet sure whether it logs port forwarding, though. (screenshot)

Invertisment commented 2 years ago

No, it doesn't log the database forwarding at all.

I deleted all of the log files, and here is a fresh log from after I forwarded a production DB connection:

Script started on 2022-06-25 16:38:13+0000
]0;ec2-user@ip-_:~[ec2-user@ip-_ ~]$ cd /var/log/bastion
]0;ec2-user@ip-_:/var/log/bastion[ec2-user@ip-_ bastion]$ ls
2022-06-25_16-38-13_ec2-user_Mkwh3jP8dGfU4Bfu3sRbPiA4NASxxL7M.data
2022-06-25_16-38-13_ec2-user_Mkwh3jP8dGfU4Bfu3sRbPiA4NASxxL7M.time
]0;ec2-user@ip-_:/var/log/bastion[ec2-user@ip-_ bastion]$ exit

Script done on 2022-06-25 16:38:21+0000
Invertisment commented 2 years ago

Also, the logging uploads cost money. It's not much, but with multiple bastions it will start adding up. This is not a good approach: it hammers S3 with empty log uploads and stores every version in the history.

Summary of my Free Tier usage:

(screenshot)

I ran the bastion for about a day or two, so consider what uploading logs every 5 minutes adds up to: my bastion instance ran for 63 hours and produced 2222 requests to S3, plus the corresponding read events. It kept uploading an empty logfile, and each log rotation produced multiple uploads at once. So in production I'd expect the log service to upload hundreds of log history snapshots into S3. After I deleted the logs there was nothing left to upload, but deleting them itself left a small log behind. So I can't completely clean up the instance, but at least I can go in and remove the logs I no longer need :thinking:

Guimove commented 1 year ago

I apologize for the confusion. Thank you for bringing this issue to our attention and providing detailed information about your experience with the logging functionality. We have investigated the matter and made improvements to the logging mechanism. The issue you encountered with empty log files and incomplete logging should now be resolved in the latest version of the module.

To test the logging functionality, we recommend performing SSH sessions and executing commands within the session. The logs should capture the commands and activities during the SSH session.

Regarding the cost of logging uploads to S3, we understand your concerns. We have taken steps to optimize the logging process and reduce unnecessary log file uploads. Additionally, we have introduced a new configuration variable, enable_logs_s3_sync, which allows you to disable the synchronization of logs to S3 if it is not required in your environment. By setting this variable to false, you can prevent the module from uploading logs to S3 and avoid incurring additional costs.
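For example, in the module block (other arguments elided):

```hcl
module "bastion" {
  source = "Guimove/bastion/aws"
  # ... your existing settings ...

  # Keep session logs on the instance only; skip the periodic S3 upload.
  enable_logs_s3_sync = false
}
```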

We appreciate your feedback and patience in helping us improve the module. If you encounter any further issues or have additional questions, please don't hesitate to reach out.