This GSE should run in its own process, with the ListenerLogger running in the "background". This GSE will read the log file written by ListenerLogger, transform the packet contents, render plots on the screen, take user input, etc. A key part of this is reading the log file.
Description of the reader
This log file reader should ingest blocks of data (in contrast to reading the log line by line as it is updated) and return structured data that informs the caller of the following; one possible record shape is sketched after this list:
status of the packet (errors, flags, etc)
who the packet came from (which detector, housekeeping system, etc)
what the packet contents are (histogram, light curve, event data, etc)
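As a minimal sketch, one parsed record could be a small Python object carrying those three pieces of information. The field names below are placeholders for illustration, not a settled schema:

from dataclasses import dataclass
from typing import Any

@dataclass
class ParsedPacket:
    timestamp: float   # when the packet was logged
    source: str        # which detector or housekeeping system it came from
    kind: str          # "histogram", "lightcurve", "event", ...
    flags: int         # status/error flags from the packet header
    payload: Any       # decoded contents (array, dict, ...)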
Since there are many packets per batch read, we should decide how packets should be sorted: by detector, by time, by flag, etc. It may be best to return a DataFrame-like structure so the caller can filter as needed.
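A rough sketch of that idea, assuming pandas is available and the reader yields records like the ParsedPacket sketch above (the detector name "det1" in the usage comment is hypothetical):

import pandas as pd

def packets_to_frame(packets):
    # One row per packet; columns come from the record fields (timestamp, source, kind, ...).
    frame = pd.DataFrame([vars(p) for p in packets])
    return frame.sort_values("timestamp")

# Example: pull out only the histogram packets from one detector.
# frame = packets_to_frame(batch)
# det1_hist = frame[(frame["source"] == "det1") & (frame["kind"] == "histogram")]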
Since the log file will be very large (several GB), we don't want to load the whole file into memory for every read operation. Instead, we should use something like this:
batch = []
last_line_read = -1  # index of the last line handled by the previous read
with open("log_file.log", "r") as file:
    for i, line in enumerate(file):
        if i > last_line_read:
            batch.append(line)
            # do other stuff to the line?
We still need to figure out what happens at the end of the file. Note that this should be safe to run while ListenerLogger is writing, since the open() is read-only.
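One caveat with the loop above is that enumerate(file) still scans the file from the first line on every read. A minimal sketch of an alternative, assuming we track the byte offset between reads with tell()/seek() (the function name and the gse_is_running flag are made up for illustration):

def read_new_lines(path, last_offset=0):
    # Jump straight to where the previous read stopped instead of re-scanning from line 0.
    with open(path, "r") as file:
        file.seek(last_offset)
        new_lines = file.readlines()
        new_offset = file.tell()
    # Note: the final entry may be a partial line if ListenerLogger is mid-write;
    # the caller may want to hold it back until the next read completes it.
    return new_lines, new_offset

# offset = 0
# while gse_is_running:
#     batch, offset = read_new_lines("log_file.log", offset)
#     ...parse, sort, plot...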
Todo
[ ] Write a simple reading function and test it on a .log file.
[ ] See if there is a way to enumerate(file) with a delimiter other than \n (see the sketch after this list).
[ ] Come up with a return data structure for the batch data.
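On the delimiter question: iterating over a text-mode file in Python is tied to newlines, so one workaround is a small generator that reads fixed-size chunks and splits on an arbitrary delimiter. A sketch under that assumption (the function name, delimiter argument, and chunk size are illustrative, not part of the packet format):

def iter_records(file, delimiter, chunk_size=4096):
    # Yield delimiter-separated records without loading the whole file at once.
    buffer = ""
    while True:
        chunk = file.read(chunk_size)
        if not chunk:
            break
        buffer += chunk
        while delimiter in buffer:
            record, buffer = buffer.split(delimiter, 1)
            yield record
    if buffer:
        yield buffer  # whatever trails after the final delimiter

# with open("log_file.log", "r") as file:
#     for i, record in enumerate(iter_records(file, delimiter=";")):
#         ...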