CloverHackyColor / CloverBootloader

Bootloader for macOS, Windows and Linux in UEFI and in legacy mode
BSD 2-Clause "Simplified" License

Clover does not boot on ProLiant DL 560 Gen 8 #486

Open natiq2004 opened 2 years ago

natiq2004 commented 2 years ago

I boot in legacy BIOS mode, because the machine does not support even the most basic EFI. The goal is to boot the server from a PCIe 3.0 NVMe RAID card.

When booting on the HP ProLiant DL560 Gen8, Clover displays "6" on the screen and hangs. I see only the number 6, in text mode; no Clover GUI, and no response to any key presses.

[screenshot attachment: the number 6 on screen]

The DL560 Gen8 uses the integrated Matrox G200EH video as its graphics output device. I am guessing the hang is caused by Clover not supporting the G200EH.

If you install a new graphics card to run Clover, there is only one PCIe 16x slot, so you cannot also install the HighPoint SSD7540 NVMe RAID controller card.

Many people have faced this problem, but no one has found a solution.

Please help me solve the G200EH/Clover compatibility issue, or let me know whether Clover can be fixed for this problem!

SergeySlice commented 2 years ago

I may propose that you make a USB stick with legacy Clover and choose the BIOS BlockIO loader, i.e. boot7. With boot6 you will see the number 6 on screen, and with boot7 you will see the number 7. The difference is the disk driver: boot6 has its own ATA/ATAPI driver, which obviously will not work with an NVMe RAID card, while boot7 has a driver that goes through BIOS calls. So if the BIOS can work with your controller, then boot7 will be able to boot.

As far as I remember, cdboot is based on boot6, so it works with an ordinary SATA DVD-ROM. There was an unresolved problem with using boot7 there, connected with the default block size in BIOS. If any developer wants to continue the investigation, welcome! My suggestion is to use a USB stick.
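The boot6-vs-boot7 selection above can be sketched as a script. This is a minimal, hedged sketch, assuming the CloverV2 package layout (stage-2 loaders in `Bootloaders/x64/`, with legacy Clover loading whichever file is named `boot` on the active FAT32 partition; boot sectors are assumed already installed, e.g. by Boot Disk Utility). The temp directories stand in for the real package and mounted stick:

```shell
#!/bin/sh
# Hedged sketch: select boot7 (BIOS BlockIO) for a legacy Clover USB stick.
set -e
CLOVER=$(mktemp -d)          # stands in for the unpacked CloverV2 package
USB=$(mktemp -d)             # stands in for the mounted USB boot partition

# Stub package layout for this dry run; a real CloverV2 package already
# ships these files.
mkdir -p "$CLOVER/Bootloaders/x64" "$CLOVER/EFI/CLOVER"
printf 'ata-driver-loader'   > "$CLOVER/Bootloaders/x64/boot6"
printf 'blockio-bios-loader' > "$CLOVER/Bootloaders/x64/boot7"

# The actual selection step: copy boot7 over "boot" so the BIOS-call
# driver is used instead of boot6's built-in ATA/ATAPI driver.
cp "$CLOVER/Bootloaders/x64/boot7" "$USB/boot"
cp -r "$CLOVER/EFI" "$USB/"

ls "$USB"                    # lists the EFI folder and the boot file
```

To switch back to boot6, copy `Bootloaders/x64/boot6` over `boot` instead; only the one file named `boot` is read at boot time.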

natiq2004 commented 2 years ago

The BIOS can detect the RAID controller, and the NVMe disks are visible in Linux. The problem is only with booting.

[screenshot attachment]

Before testing the Clover ISO I used a bootable USB with legacy Clover; I see only the number "6" on the screen. When booting via USB and pressing 7 to select boot7, I see only the number "7". I don't see the Clover GUI either way.


I think this problem occurs because the HP DL560 Gen8 server uses the integrated Matrox G200EH video.

I placed a different video card into the server's PCIe 16x slot. After this, Clover started working from the bootable USB. I checked the DVD-ROM as well, and Clover works.

[screenshot attachment]

When a new video card is placed in the server, the iLO Remote Console stops working, because the Remote Console only works with the integrated video card. Apart from this, if I install a new graphics card to run Clover, there is only one PCIe 16x slot, so we cannot also install the HighPoint SSD7540 NVMe RAID controller card.

HP Gen8 servers are very widely used. Many users have written about this problem, but it is still unsolved. Here are some of their reports:

https://web.archive.org/web/20190611084245/https://sourceforge.net/p/cloverefiboot/tickets/537/?limit=25
https://www.win-raid.com/t2375f50-Guide-NVMe-boot-without-modding-your-UEFI-BIOS-Clover-EFI-bootloader-method-46.html
https://www.win-raid.com/t8846f50-Trying-to-make-a-HP-DL-G-run-on-NVMe-and-windows.html

Please try your best to solve this problem.

jonjof commented 2 years ago

Has there been any update on this issue?

abaumgaertner commented 2 years ago

I can confirm the issue with release 5150 on an HP DL380 G8 with iLO4. Both boot6 and boot7 are stuck after the number is displayed, and the CPU fans ramp up to high after a few seconds. On a DL380 G7 with iLO3, boot6 and boot7 both start the legacy GUI.

I'll try to provide Clover logs.

thmsdelange commented 1 year ago

Same issue here with r4961, r5070 and r5150 on an HPE ML310e Gen8 v2 server. I prepared the USB using Boot Disk Utility. For me too, the only thing I can see is the number 6 with a blinking underscore. Is there anything I can do to solve this issue?

I've also read on multiple forums that the integrated graphics might cause this problem; would slapping in a simple graphics card solve it?

DrMxrcy commented 1 year ago

Has anyone found a workaround to this? I don't have room to slot in a GPU.

tech128 commented 1 year ago

> If you install a new graphics card to run Clover, there is only one PCIe 16x slot, so you cannot also install the HighPoint SSD7540 NVMe RAID controller card.

You can install even a 16x card into a 1x slot; it just won't be able to use the full video bandwidth the card is capable of.

> Has anyone found a workaround to this? I don't have room to slot in a GPU.

Even the base config with one riser gives you three slots; you should be able to get a card into one of those. Or get another riser and use one of the three extra slots that gives you.

WarRick0 commented 1 year ago

I have the same issue on my HP ProLiant DL380p Gen8 server. The Clover legacy boot menu doesn't appear on loading, and I get a blank screen with "7_". If I add an ATI GPU in a PCIe slot and set the PCIe card as the primary video device in the BIOS, the Clover menu appears on the PCIe graphics adapter but not on the integrated Matrox GPU (G200EH).

With the ATI PCIe GPU I can't see the BIOS POST or access the BIOS, Smart Array, or iLO4; I have to switch between them.

Gouster4 commented 1 year ago

> I have the same issue on my HP ProLiant DL380p Gen8 server. The Clover legacy boot menu doesn't appear on loading, and I get a blank screen with "7_". If I add an ATI GPU in a PCIe slot and set the PCIe card as the primary video device in the BIOS, the Clover menu appears on the PCIe graphics adapter but not on the integrated Matrox GPU (G200EH).
>
> With the ATI PCIe GPU I can't see the BIOS POST or access the BIOS, Smart Array, or iLO4; I have to switch between them.

Same problem with the same server. Hopefully Clover will get G200 GPU support before this server becomes just e-waste. I cannot boot NVMe, or an HDD in HBA mode, without Clover or some similar solution.

Matrox develops and manufactures the Mystique, Parhelia, G200, G400 and G550. The company provides free and open-source drivers for its chipsets older than the G550, while chipsets newer than the G550 are supported by a closed-source driver.

So with the open-source drivers I'm sure it's possible, but I have no clue how hard or time-consuming it would be.

vkz commented 1 year ago

Experiencing the same issue on a Dell R720 with the built-in Matrox G200. Both boot6 and boot7 just show their respective numbers followed by a blinking cursor and no GUI whatsoever. I tried limiting the GUI to TextOnly, with no effect. It does look like the problem may be with video support. Sadly, I have no other video adapter to plug in.
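For reference, the TextOnly switch mentioned above lives under the GUI dictionary in Clover's config.plist. This fragment is from memory of Clover's documented keys, so verify the spelling against your Clover revision:

```xml
<key>GUI</key>
<dict>
    <!-- Skip the graphical menu and use text output only -->
    <key>TextOnly</key>
    <true/>
</dict>
```

Note that if Clover hangs before reaching its GUI at all, as in this thread, TextOnly would not be expected to help, since the hang happens earlier in boot.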

Gouster4 commented 1 year ago

> Experiencing the same issue on a Dell R720 with the built-in Matrox G200. Both boot6 and boot7 just show their respective numbers followed by a blinking cursor and no GUI whatsoever. I tried limiting the GUI to TextOnly, with no effect. It does look like the problem may be with video support. Sadly, I have no other video adapter to plug in.

I have video adapters, but no free PCIe slot; all of them are occupied by NVMe-to-PCIe adapters.

But also keep in mind that the iLO Remote Console only works with the built-in G200 GPU; anything displayed on other adapters won't be shown in the Remote Console.

vkz commented 1 year ago

@Gouster4 I expect the same would be the case with Dell's own iDRAC and remote virtual console, which is why I'm not in a rush to buy a PCIe GPU. Sadly, it needs to be the onboard video to make use of remote management, sigh.

Gouster4 commented 1 year ago

> @Gouster4 I expect the same would be the case with Dell's own iDRAC and remote virtual console, which is why I'm not in a rush to buy a PCIe GPU. Sadly, it needs to be the onboard video to make use of remote management, sigh.

Compatibility depends on which GPU Dell uses. But the Remote Console works with the onboard GPU only, because it is somehow bound to the remote-management hardware (iLO or iDRAC). Either the output of the built-in GPU is also wired to the management chip, or the chip has direct access to the GPU's memory and doesn't even need the output; I'm not sure which, but either way it is tied to the internal GPU. That makes perfect sense: hijacking a dedicated GPU from the management controller would be hit or miss, if possible at all, and routing video output from a dedicated GPU back to the management chip outside the case wouldn't be a clean, enterprise-ready solution.

maxexcloo commented 1 month ago

Having the same issue here - has anyone found a fix?