Closed: Chacsam closed this issue 1 year ago
I've got the same problem. More logs:
[Nest] 1 - 05/04/2023, 12:12:13 PM LOG [NestFactory] Starting Nest application...
[Nest] 1 - 05/04/2023, 12:12:13 PM LOG [InstanceLoader] TypeOrmModule dependencies initialized +107ms
[Nest] 1 - 05/04/2023, 12:12:13 PM LOG [InstanceLoader] BullModule dependencies initialized +1ms
[Nest] 1 - 05/04/2023, 12:12:13 PM LOG [InstanceLoader] ConfigHostModule dependencies initialized +2ms
[Nest] 1 - 05/04/2023, 12:12:13 PM LOG [InstanceLoader] DiscoveryModule dependencies initialized +1ms
[Nest] 1 - 05/04/2023, 12:12:13 PM LOG [InstanceLoader] ConfigModule dependencies initialized +15ms
[Nest] 1 - 05/04/2023, 12:12:13 PM LOG [InstanceLoader] BullModule dependencies initialized +1ms
[Nest] 1 - 05/04/2023, 12:12:13 PM LOG [InstanceLoader] BullModule dependencies initialized +1ms
[Nest] 1 - 05/04/2023, 12:12:14 PM LOG [InstanceLoader] TypeOrmCoreModule dependencies initialized +316ms
[Nest] 1 - 05/04/2023, 12:12:14 PM LOG [InstanceLoader] TypeOrmModule dependencies initialized +1ms
[Nest] 1 - 05/04/2023, 12:12:14 PM LOG [InstanceLoader] TypeOrmModule dependencies initialized +1ms
[Nest] 1 - 05/04/2023, 12:12:14 PM WARN [MetadataExtractionProcessor] Reverse geocoding is enabled
[Nest] 1 - 05/04/2023, 12:12:14 PM LOG [MetadataExtractionProcessor] Initializing Reverse Geocoding
/usr/src/app/node_modules/local-reverse-geocoder/index.js:746
throw err;
^
CsvError: Invalid Record Length: expect 19, got 16 on line 133063
at Object.__onRecord (/usr/src/app/node_modules/csv-parse/dist/cjs/index.cjs:940:11)
at Object.parse (/usr/src/app/node_modules/csv-parse/dist/cjs/index.cjs:903:36)
at Parser._flush (/usr/src/app/node_modules/csv-parse/dist/cjs/index.cjs:1336:26)
at Parser.final [as _final] (node:internal/streams/transform:112:25)
at callFinal (node:internal/streams/writable:694:27)
at prefinish (node:internal/streams/writable:723:7)
at finishMaybe (node:internal/streams/writable:733:5)
at afterWrite (node:internal/streams/writable:504:3)
at onwrite (node:internal/streams/writable:477:7)
at Parser.Transform._read (node:internal/streams/transform:245:5) {
code: 'CSV_RECORD_INCONSISTENT_FIELDS_LENGTH',
bytes: 22581640,
comment_lines: 0,
empty_lines: 0,
invalid_field_length: 0,
lines: 133063,
records: 133062,
columns: false,
error: undefined,
header: false,
index: 16,
raw: undefined,
column: 16,
quoting: false,
record: [
'5145072',
'Woodstock',
'Woodstock',
'Vudstok,uddosutokku,udeuseutog,wwdastak nywywrk,Вудсток,ووداستاک، نیویورک,ウッドストック,우드스톡',
'42.04092',
'-74.1182',
'P',
'PPL',
,
'',
,
'111',
'83052',
'',
'2088',
''
]
}
There is an unhandled CSV exception that keeps restarting the NestJS process in the microservices container.
If @Chacsam provided that container's logs, I expect they would show something similar.
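For reference, that error code comes straight from csv-parse's column-count check: by default it requires every record to have the same number of fields as the first one. A minimal sketch (with a made-up two-row input) reproduces the same CsvError seen on line 133063 above:

```ts
import { parse } from 'csv-parse/sync';

// Illustrative only: a tab-separated input whose second record has fewer
// fields than the first. Without relax_column_count, csv-parse throws
// CSV_RECORD_INCONSISTENT_FIELDS_LENGTH, just like the trace above.
try {
  parse('a\tb\tc\nd\te\n', { delimiter: '\t' });
} catch (err) {
  console.log((err as any).code); // 'CSV_RECORD_INCONSISTENT_FIELDS_LENGTH'
}
```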
This is an issue with the upstream local-reverse-geocoder library. The current workaround is to shut down all the containers, remove the microservices container, and then bring the whole stack back up to recreate the microservices container.
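Concretely, that translates to roughly the following, assuming the default compose file where the service is named immich-microservices (adjust the service name to your setup):

```sh
docker compose stop                        # shut down all the containers
docker compose rm -f immich-microservices  # remove the microservices container
docker compose up -d                       # recreate it and bring the stack back up
```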
@alextran1502 That is true. Restarting the entire docker-compose stack gets it working again, but I have to pause the EXTRACT METADATA
job, because the same crash recurs after a few more messages are processed.
I'm migrating ~100k files, so the problematic CSV rows get hit over and over.
Yeah, we want to fork the upstream library and rewrite it to handle this situation.
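One possible shape for that rewrite, sketched against csv-parse's documented options rather than the library's actual internals (the file name and the hand-off call are placeholders): pass relax_column_count so short rows no longer throw, then skip or pad them explicitly instead of letting the process die.

```ts
import { createReadStream } from 'node:fs';
import { parse } from 'csv-parse';

const GEONAMES_COLUMNS = 19; // the field count the GeoNames dump is expected to have

const parser = parse({
  delimiter: '\t',          // GeoNames city dumps are tab-separated
  quote: false,             // treat quote characters in place names as plain text
  relax_column_count: true, // tolerate records with a different field count
});

parser.on('readable', () => {
  let record: string[] | null;
  while ((record = parser.read()) !== null) {
    if (record.length !== GEONAMES_COLUMNS) {
      console.warn(`skipping malformed row with ${record.length} fields`);
      continue; // skip instead of throwing and crashing the container
    }
    // handOffToGeocoder(record); // placeholder for the library's real consumer
  }
});

parser.on('error', (err) => console.error('parse error:', err.message));

createReadStream('cities500.txt').pipe(parser); // placeholder file name
```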
The bug
I have set up Immich twice in virtual containers and hit the issue each time after a few thousand successful imports. Importing works for a while, then at some point the jobs just hang and the "Metadata" job automatically pauses itself.
The OS that Immich Server is running on
Ubuntu 22.04 (container), 6 GB RAM, 4 CPUs
Version of Immich Server
v1.54.1
Version of Immich Mobile App
1.54, build 77
Platform with the issue
Your docker-compose.yml content
Your .env content
Reproduction steps
Additional information
Immich-Server Log: (Rebooting didn't help)