Closed: grisevg closed this issue 7 years ago
Run a Windows VM in your GPU :) As mentioned in another post, if we could use HBM memory we would have 6-8x more bandwidth than any dual-Xeon workstation can provide (currently ~100 GB/s with 8-channel DDR4-2400). Even GDDR5X, at around 300-400 GB/s, would be a benefit if the ImDisk part could somehow be skipped.
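The bandwidth figures quoted above can be sanity-checked with some back-of-the-envelope arithmetic (theoretical peaks only; the ~100 GB/s figure in the comment is a measured/practical number, which is why it sits below the computed peak):

```python
# Rough peak-bandwidth arithmetic behind the DDR4-vs-GPU-memory comparison.
# DDR4-2400 transfers 2400 MT/s over a 64-bit (8-byte) channel.
ddr4_channel_gbs = 2400e6 * 8 / 1e9      # ~19.2 GB/s per channel
dual_xeon_peak = 8 * ddr4_channel_gbs    # 8 channels -> ~153.6 GB/s theoretical

# GDDR5X cards of that era were commonly quoted around 320-480 GB/s,
# and HBM2 parts around 700-900 GB/s, which is where rough multiples
# like 6-8x over a practical ~100 GB/s host figure come from.
print(f"8-channel DDR4-2400 theoretical peak: {dual_xeon_peak:.1f} GB/s")
```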
My partner and I recently created an x16 NVMe M.2 RAID card that holds 4 M.2 drives. For the next version of the card we are working to add HBM2 RAM on the card. Ideally we would like to have the RAM load data from the RAID volume and use the M.2 drives as an L1 cache. It appears you might be a good person to collaborate with on this effort. Let us know what your thoughts are on such a proposal.
Hi, this started merely as a PoC that this could be done on Windows too, but a person I know has found uses for it. As he has been running some IO-intensive apps lately, the lifespan of his old SSD became a concern, and a faulty memory bank slot prevented him from expanding the RAM to balance it out. The extra unused RAM on a dedicated graphics card can be used as a cache for those IO-intensive apps.
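The caching idea described here can be sketched in plain Python: a small fixed-capacity block cache held in fast memory (standing in for the spare VRAM) fronting a slow backing store (the aging SSD). This is purely illustrative — GpuRamDrive itself works by exposing GPU memory as an ImDisk virtual disk, and the `BlockCache` class and its names below are made up for the sketch:

```python
from collections import OrderedDict

class BlockCache:
    """Illustrative LRU write-back block cache: fast memory (a dict here,
    standing in for spare VRAM) in front of a slow backing store (the SSD).
    Dirty blocks reach the backing store only on eviction, which is how a
    cache like this reduces wear on an aging SSD."""

    def __init__(self, backing, capacity_blocks):
        self.backing = backing        # slow store: block_no -> bytes
        self.capacity = capacity_blocks
        self.cache = OrderedDict()    # fast store, kept in LRU order
        self.dirty = set()            # blocks not yet written back

    def read(self, block_no):
        if block_no in self.cache:
            self.cache.move_to_end(block_no)   # mark most-recently-used
            return self.cache[block_no]
        data = self.backing.get(block_no, b"\x00" * 512)
        self._insert(block_no, data)
        return data

    def write(self, block_no, data):
        self._insert(block_no, data)
        self.dirty.add(block_no)      # deferred: hits the SSD on eviction only

    def _insert(self, block_no, data):
        self.cache[block_no] = data
        self.cache.move_to_end(block_no)
        while len(self.cache) > self.capacity:
            old, old_data = self.cache.popitem(last=False)  # evict LRU block
            if old in self.dirty:
                self.backing[old] = old_data   # write back on eviction
                self.dirty.discard(old)
```

For example, with a 2-block cache, writing blocks 0, 1, touching 0, then writing block 2 evicts block 1 (the least recently used) to the backing store; reads of hot blocks never touch the slow device at all.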