Closed ITMPGIT closed 2 years ago
Great feature, but I have witnessed the same issue. I've created two files (attached) that differ by a single byte (a "z" in the description field on the last row). The "SUCCESS" file imports as expected, but the "FAILED" file reports "Missing host list in session".
ZBX_CSV_Loader-minimal-FAILED.csv
ZBX_CSV_Loader-minimal-SUCCESS.csv
I've also checked the number of characters by copying the contents of the attached files to a Linux VM:
root@zbx-01:/tmp# wc ZBX*
17 17 436 ZBX_CSV_Loader-minimal-FAILED-no-header-row.csv
18 18 537 ZBX_CSV_Loader-minimal-FAILED-with-header-row.csv
17 17 435 ZBX_CSV_Loader-minimal-SUCCESS-no-header-row.csv
18 18 536 ZBX_CSV_Loader-minimal-SUCCESS-with-header-row.csv
There is nothing in my php.ini that gives me any concern, and I've successfully imported Templates and Hosts (in XML format, exported from another Zabbix instance) that are significantly larger than the attached files.
I'm a bit stuck as to what to look for in order to debug this, as my PHP skills are somewhat lacking, so any pointers would be appreciated.
It's also worth noting that if no HOST_GROUPS or TEMPLATES are specified in the CSV, the imported host(s) are added to the first host group in Zabbix and the first template is applied. A host group is now mandatory in Zabbix, but a template isn't.
Also worth noting that either an AGENT_IP/DNS or an SNMP_IP/DNS is required by the loader, even though neither is mandatory in Zabbix.
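For illustration, a minimal CSV exercising those defaults might look like the sketch below. The header names (HOSTNAME, HOST_GROUPS, TEMPLATES, AGENT_IP) are assumptions based on the columns mentioned in this thread, not the loader's documented format:

```shell
# Hypothetical minimal input: HOST_GROUPS and TEMPLATES left empty, only an
# AGENT_IP supplied. Header names here are assumptions, not the loader's spec.
cat > /tmp/ZBX_CSV_Loader-minimal-example.csv <<'EOF'
HOSTNAME,HOST_GROUPS,TEMPLATES,AGENT_IP
web-01,,,10.0.0.10
EOF
cat /tmp/ZBX_CSV_Loader-minimal-example.csv
```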
Thanks
Thank you for the detailed information. There was a problem with the preview step caused by the way session variables are stored client-side in cookies by the Zabbix frontend, rather than on the server as is normally the case in PHP. This collided with the 4096-character limit on cookies, so the preview data was omitted whenever the host list was too big. Both the preview and the import data are now read directly from a temporary file instead, so even megabytes of CSV data should be possible now, as long as the file stays below the upload limit.
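That claim is easy to exercise with a throwaway stress file. A minimal sketch follows; the header names and the values in each row are assumptions based on the columns mentioned earlier in this thread, so adjust them to the loader's real format before importing:

```shell
# Generate a CSV of 10,000 hosts (several hundred KB) to verify the preview
# no longer trips over the old 4096-byte cookie limit. Headers are assumed.
outfile=/tmp/zbx_csv_loader_stress.csv
echo 'HOSTNAME,HOST_GROUPS,TEMPLATES,AGENT_IP' > "$outfile"
for i in $(seq 1 10000); do
  echo "stress-host-$i,Hardware,,10.0.0.1" >> "$outfile"
done
# Report the size; it should be well above the old 4 KB session ceiling.
wc -c "$outfile"
```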
The other problem with the host group and template linking was caused by a chain of smaller issues with the filtering and CSV preprocessing.
Both issues should be fixed in 6.0.3. For Zabbix versions before 6.0, the fix has also been backported as version 5.4.3.
Great work, and quick turn-around for the solution - and the detailed explanation.
I’ve tested on Zabbix 6.x, and can confirm it now works as expected 😊
I did manage to work around the original issue by creating a really basic shell script to produce a YAML file, which did the trick (see below).
However, I much prefer the ability to upload the CSV and will promote your solution whenever the opportunity arises.
Thanks,
David
#------------------------
#!/bin/bash
# Title : update_hosts_from_DCIM.sh
# Author : David Tomkins
# Date : 16th May 2022
# Description
# : Create yaml file by extracting a list of devices from the "fac_Device" table in the DCIM database
outfile=/tmp/hosts_from_DCIM.yaml
# Make the "if" below catch a mysql failure, not just awk's exit status
set -o pipefail
# : Print the header
echo "zabbix_export:
  version: '6.0'
  date: '2022-05-16T10:58:44Z'
  groups:
    -
      uuid: 3e1e7961db9a4f50a696086e9f55b7b5
      name: Hardware
  hosts:" > "$outfile"
# : Loop through the DCIM table, stripping non-alphanumeric characters to allow us to create the zabbix host, and we'll leave the
if mysql -s dcim -e "select concat_ws('|',Label,Label,DeviceID) from fac_Device;" | \
awk -F'|' \
'{ gsub(/[^a-zA-Z0-9]/," ",$2); printf("  -\n    host: DCIM ID %s-%s\n    name: \"DCIM ID %s-%s\"\n    groups:\n      -\n        name: Hardware\n    interfaces:\n      -\n        interface_ref: if1\n    inventory_mode: DISABLED\n",$3,$2,$3,$1) }' >> "$outfile"
then
echo "Your file > $outfile < is ready to import into Zabbix"
else
echo "Error" >&2
fi
exit
#------------------------
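A quick way to sanity-check the awk transformation without a live DCIM database is to feed it a couple of hand-written rows (the sample labels and IDs here are made up):

```shell
# Simulate two "Label|Label|DeviceID" rows instead of querying MySQL; the same
# awk logic the script uses turns them into YAML host entries on stdout.
printf 'Rack-01 PSU/A|Rack-01 PSU/A|101\nCore SW#2|Core SW#2|102\n' | \
awk -F'|' '{ gsub(/[^a-zA-Z0-9]/, " ", $2)
             printf("  -\n    host: DCIM ID %s-%s\n    name: \"DCIM ID %s-%s\"\n", $3, $2, $3, $1) }'
```

Note that the sanitized copy of the label ($2) goes into the technical host name, while the untouched original ($1) is kept as the visible name.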
It loads the file with the information, but the import fails with an error. Any tips?