Closed nstarke closed 2 years ago
I can take a stab at it.
I have this partially working in a local branch. I'll push it to a new branch once it's done. If there aren't any additional fixes that need to be made, I'll push it to master.
I pushed initial 10.1 changes to the ghidra-10.1 branch (builds from CI are available here; it's the artifact named "artifact"). I did some basic testing and didn't notice any obvious regressions, but let me know if you find anything.
@dev747368 I noticed that getInfo isn't used anymore. If I want to populate the fields in Get Info (shown when right-clicking a file in the FS viewer), I should implement getFileAttributes, right?
Edit: Oops, I included some unrelated analyzer changes. Fixed CI builds are here.
re: getFileAttributes(). Correct.
It can be as detailed as you want (SevenZipFileSystem.getFileAttributes()), or a simple static string (BootImageFileSystem.getFileAttributes())
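For reference, a minimal sketch of what an override could look like, using Ghidra 10.1's GFileSystem/FileAttributes API; the fsIndex field and the CBFSFile getters here are illustrative names, not necessarily what this project uses:

```java
@Override
public FileAttributes getFileAttributes(GFile file, TaskMonitor monitor) {
	// fsIndex is assumed to be a FileSystemIndexHelper<CBFSFile>; getSize()/getTypeString()
	// are hypothetical CBFSFile accessors used for illustration.
	CBFSFile cbfsFile = fsIndex.getMetadata(file);
	FileAttributes result = new FileAttributes();
	if (cbfsFile != null) {
		result.add(FileAttributeType.NAME_ATTR, cbfsFile.getName());
		result.add(FileAttributeType.SIZE_ATTR, cbfsFile.getSize());
		// Free-form key/value pairs also show up in the Get Info dialog.
		result.add("Type", cbfsFile.getTypeString());
	}
	return result;
}
```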
I've got some concerns about your implementation of CBFSFile.
First, I strongly suggest avoiding InputStreamByteProvider. It is all kinds of bad. Second, your usage of CompressorInputStream.getUncompressedCount() may be bad. From reading the Apache Commons javadoc, and the source for one of the decompressors, I don't think it gives you the size of what will be decompressed; it gives you the size of what has been decompressed up to that point. It's possible that it could return a valid number if the data was small enough to fit in the initial compressed block, but I would verify this before relying on it.
I've got a suggested (untested) implementation:
public ByteProvider getByteProvider() throws IOException {
	// Extract the file if compression is used (specified in a compression attribute).
	if (attribute instanceof CBFSCompressionAttribute) {
		// Declare as InputStream so the decompressor wrappers can be assigned below.
		InputStream is = new ByteArrayInputStream(data);
		int compressionType = ((CBFSCompressionAttribute) attribute).getCompressionType();
		switch (compressionType) {
			case CBFSConstants.CompressionAlgorithm.LZMA:
				is = new LZMACompressorInputStream(is);
				break;
			case CBFSConstants.CompressionAlgorithm.LZ4:
				is = new FramedLZ4CompressorInputStream(is);
				break;
			default:
				throw new IOException("Unsupported CBFS compression type: " + compressionType);
		}
		FileCacheEntryBuilder tmpFileBuilder = FileSystemService.getInstance().createTempFile(-1);
		FSUtilities.streamCopy(is, tmpFileBuilder, TaskMonitor.DUMMY);
		FileCacheEntry fce = tmpFileBuilder.finish();
		return FileSystemService.getInstance().getNamedTempFile(fce, name);
	}
	return new ByteArrayProvider(data);
}
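The key point of the sketch is that streaming into a temp file never needs the uncompressed size up front, which is what makes getUncompressedCount() unnecessary. The same pattern can be exercised with only the JDK, using DEFLATE as a stand-in for LZMA/LZ4 and Files.copy() as a stand-in for the FSUtilities/FileCacheEntryBuilder machinery (class and method names here are mine, not from the project):

```java
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;
import java.util.zip.DeflaterInputStream;
import java.util.zip.InflaterInputStream;

public class StreamCopyDemo {
	// Copy a stream of unknown length to a temp file; the final size is only
	// known after the copy finishes, just like the decompression case above.
	static Path copyToTempFile(InputStream is) throws IOException {
		Path tmp = Files.createTempFile("cbfs", ".bin");
		try (is) {
			Files.copy(is, tmp, StandardCopyOption.REPLACE_EXISTING);
		}
		return tmp;
	}

	public static void main(String[] args) throws IOException {
		byte[] original = "hello CBFS".getBytes();
		// Compress-while-reading, then wrap in a decompressing stream.
		InputStream compressed = new DeflaterInputStream(new ByteArrayInputStream(original));
		Path tmp = copyToTempFile(new InflaterInputStream(compressed));
		System.out.println(new String(Files.readAllBytes(tmp))); // hello CBFS
	}
}
```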
Thanks for jumping on this! If you need any help with any of it please feel free to delegate to me.
@dev747368 Thanks, your implementation looks good, though I haven't fully tested it yet =) I wasn't sure about getUncompressedCount; I'll also avoid using InputStreamByteProvider in the future.
I added support for file attributes in the ghidra-10.1 branch. I noticed some issues with CBFS handling, but these don't appear to be regressions. Please try it out and let me know if you encounter any bugs.
Hi,
I tried to install the ghidra-10.1 branch on Windows and received this error when importing a UEFI image in 10.1, which I believe we have seen before:
java.lang.UnsatisfiedLinkError: 'byte[] firmware.common.EFIDecompressor.nativeDecompress(byte[])'
at firmware.common.EFIDecompressor.nativeDecompress(Native Method)
at firmware.common.EFIDecompressor.decompress(EFIDecompressor.java:52)
at firmware.uefi_fv.FFSCompressedSection.<init>(FFSCompressedSection.java:84)
at firmware.uefi_fv.FFSSectionFactory.parseSection(FFSSectionFactory.java:64)
at firmware.uefi_fv.UEFIFFSFile.<init>(UEFIFFSFile.java:150)
at firmware.uefi_fv.UEFIFirmwareVolumeHeader.<init>(UEFIFirmwareVolumeHeader.java:254)
at firmware.uefi_fv.FFSVolumeImageSection.<init>(FFSVolumeImageSection.java:49)
at firmware.uefi_fv.FFSSectionFactory.parseSection(FFSSectionFactory.java:72)
at firmware.uefi_fv.FFSGUIDDefinedSection.parseNestedSections(FFSGUIDDefinedSection.java:170)
at firmware.uefi_fv.FFSGUIDDefinedSection.<init>(FFSGUIDDefinedSection.java:118)
at firmware.uefi_fv.FFSSectionFactory.parseSection(FFSSectionFactory.java:66)
at firmware.uefi_fv.UEFIFFSFile.<init>(UEFIFFSFile.java:150)
at firmware.uefi_fv.UEFIFirmwareVolumeHeader.<init>(UEFIFirmwareVolumeHeader.java:254)
at firmware.uefi_fv.UEFIFirmwareVolumeFileSystem.mount(UEFIFirmwareVolumeFileSystem.java:56)
at firmware.uefi_fv.UEFIFirmwareVolumeFileSystemFactory.create(UEFIFirmwareVolumeFileSystemFactory.java:43)
at firmware.uefi_fv.UEFIFirmwareVolumeFileSystemFactory.create(UEFIFirmwareVolumeFileSystemFactory.java:28)
at ghidra.formats.gfilesystem.factory.FileSystemFactoryMgr.mountUsingFactory(FileSystemFactoryMgr.java:176)
at ghidra.formats.gfilesystem.factory.FileSystemFactoryMgr.probe(FileSystemFactoryMgr.java:338)
at ghidra.formats.gfilesystem.FileSystemService.probeFileForFilesystem(FileSystemService.java:679)
at ghidra.formats.gfilesystem.FileSystemService.probeFileForFilesystem(FileSystemService.java:630)
at ghidra.plugins.fsbrowser.FileSystemBrowserPlugin.doOpenFilesystem(FileSystemBrowserPlugin.java:231)
at ghidra.plugins.fsbrowser.FileSystemBrowserPlugin.lambda$openFileSystem$0(FileSystemBrowserPlugin.java:118)
at ghidra.util.task.TaskLauncher$2.run(TaskLauncher.java:117)
at ghidra.util.task.Task.monitoredRun(Task.java:134)
at ghidra.util.task.TaskRunner.lambda$startTaskThread$0(TaskRunner.java:106)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
at java.base/java.lang.Thread.run(Thread.java:829)
It looks like the problem with the x64 architecture that I outlined in #14, but I was not able to fix it the same way. I'll dig in a little more after the holiday. Thanks again! Cheers
Oops, I forgot to reply to this... the Gradle build script wasn't using the right paths for efidecompress (the os directory names changed in Ghidra 10.1). Could you try the artifacts from the latest commit in the ghidra-10.1 branch again?
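For anyone hitting the same UnsatisfiedLinkError: Ghidra 10.1 renamed the per-platform native directories (e.g. os/win64 became os/win_x86_64, and the Linux/macOS directories were renamed similarly), so a build script that stages the native library under the old names leaves nothing for System.loadLibrary to find. A hedged sketch of the kind of Gradle fragment involved (task and property names are illustrative, not this project's actual build file):

```groovy
// Illustrative only: stage the built native library where Ghidra 10.1 looks for it.
// The os/ subdirectory names changed in 10.1 (e.g. win64 -> win_x86_64).
task copyNativeLibs(type: Copy) {
	from "${buildDir}/efidecompress"   // wherever the native build drops its output
	into "os/${currentPlatformName}"   // must match the 10.1 platform directory name
}
```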
Apparently two classes were removed from the Ghidra 10.1 Java API, so right now it is not possible to build against Ghidra 10.1.
I will look into solutions this week, but I am creating this issue now so I don't forget about it.