Hi @t3chn0m4g3,
Nice to meet you!
Thanks for your contribution, I will take charge of this new feature in the following days :)
Cheers
Mario
@mariocandela Likewise, this is great news! Looking forward to it!
Hi @t3chn0m4g3,
Download the latest version and take a look here: https://github.com/mariocandela/beelzebub?tab=readme-ov-file#honeypot-llm-honeypots
Happy hacking ❤️
Cheers
Mario
@mariocandela Closing this is fine, thank you very much for taking this on with warp speed 🤩🚀
I can now start testing, if everything works out I will be happy to integrate this into T-Pot.
Thanks again ❤️
Getting started 👋
@t3chn0m4g3 if I can help you in any way, write to me; I am happy to help 😊
@mariocandela Thank you, highly appreciated 😍
Lab is set up, now getting to work :)
@mariocandela - In plugins/llm-integration.go I can see the ollamaEndpoint as a const ...
https://github.com/mariocandela/beelzebub/blob/628e20e01fef2b0ad2261fe84ae6b0cc10d64823/plugins/llm-integration.go#L13-L18
... is there any option in the config or the CLI to override the endpoint with a different host (assuming that Ollama will probably not reside on the honeypot)?
@mariocandela Reading helps... sorry...
```yaml
apiVersion: "v1"
protocol: "ssh"
address: ":2222"
description: "SSH Ollama Llama3"
commands:
  - regex: "^(.+)$"
    plugin: "LLMHoneypot"
serverVersion: "OpenSSH"
serverName: "ubuntu"
passwordRegex: "^(root|qwerty|Smoker666|123456|jenkins|minecraft|sinus|alex|postgres|Ly123456)$"
deadlineTimeoutSeconds: 60
plugin:
  llmModel: "llama3"
  host: "http://example.com/api/chat" # default: http://localhost:11434/api/chat
```
@t3chn0m4g3 Don't worry mate, you can find me here for anything 😄
PS: These days I'm working on the documentation ❤️
Thanks for your time!
Thank you @mariocandela! ❤️
@mariocandela It is working 🚀😁❤️
I have some issues (T-Pot => Elastic, Kibana objects) with the logging format. Currently tr.TraceEvent stores the events as nested JSON objects...
https://github.com/mariocandela/beelzebub/blob/628e20e01fef2b0ad2261fe84ae6b0cc10d64823/protocols/strategies/ssh.go#L32-L42
... and does not split RemoteAddr into ip and port.
I am unfamiliar with Go, but I managed to get this working as an example:

```go
// Split "host:port" into separate fields; requires "net" in the import
// list alongside the existing logrus/tracer imports.
remoteAddr, remotePort, err := net.SplitHostPort(sess.RemoteAddr().String())
if err != nil {
	remoteAddr = "unknown"
	remotePort = "unknown"
}
log.WithFields(log.Fields{
	"info":     "New SSH Session",
	"protocol": tracer.SSH.String(),
	"src_ip":   remoteAddr,
	"src_port": remotePort,
	"status":   tracer.Start.String(),
	"id":       uuidSession.String(),
	"environ":  strings.Join(sess.Environ(), ","),
	"user":     sess.User(),
	"plugin":   beelzebubServiceConfiguration.Description,
	"command":  sess.RawCommand(),
}).Info("New SSH Session") // WithFields only builds the entry; Info actually emits it
```
There is probably an easier / better way 😅, maybe you have an idea? I don't know whether these changes would cause any problems on your end. If not, let me know; I'm happy to contribute.
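For reference, here is a self-contained sketch of just the splitting step (the addresses are made-up examples); net.SplitHostPort is in the standard library and also copes with IPv6 bracket notation:

```go
package main

import (
	"fmt"
	"net"
)

func main() {
	// Example addresses only; net.SplitHostPort separates "host:port"
	// and handles IPv6 forms such as "[::1]:2222" correctly.
	for _, addr := range []string{"192.0.2.10:54321", "[::1]:2222"} {
		host, port, err := net.SplitHostPort(addr)
		if err != nil {
			host, port = "unknown", "unknown"
		}
		fmt.Printf("src_ip=%s src_port=%s\n", host, port)
	}
}
```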
Hi @t3chn0m4g3,
Sorry for the delay, I added the two fields that are useful for Kibana :)
https://github.com/mariocandela/beelzebub/releases/tag/v3.2.4
To speed up the process I applied the change myself; your solution is valid 😄
Thanks for your time ❤️
Awesome! Thank you! 🤩
Is your feature request related to a problem? Please describe.
Using the OpenAI API can involve unforeseen costs.

Describe the solution you'd like
Adding support for Llama3 / Ollama-supported LLMs could help streamline investment and enable larger installations.
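If it helps with testing, here is a minimal sketch of a request against a local Ollama /api/chat endpoint, matching the default URL from the config above; the model name and prompt are just examples:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	// Build a non-streaming chat request; "llama3" and the prompt are
	// placeholders, the endpoint matches the plugin's default.
	payload, err := json.Marshal(map[string]any{
		"model":  "llama3",
		"stream": false, // single JSON response instead of a stream
		"messages": []map[string]string{
			{"role": "user", "content": "uname -a"},
		},
	})
	if err != nil {
		panic(err)
	}

	resp, err := http.Post("http://localhost:11434/api/chat", "application/json", bytes.NewReader(payload))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Only the assistant's reply content is of interest here.
	var out struct {
		Message struct {
			Content string `json:"content"`
		} `json:"message"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&out); err != nil {
		panic(err)
	}
	fmt.Println(out.Message.Content)
}
```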