edbr-xyz opened 2 months ago
I ended up exporting it directly from my phone using the Google Maps app. Google has really stuffed things up here. Once done from the phone, the import worked as expected.
I am having the same issue, but I cannot figure out how Danielson89 fixed it. Any tips?
I'd suggest splitting the Records.json file into smaller chunks: https://dawarich.app/docs/FAQ#why-my-attempt-to-import-recordsjson-fails
@applesoff In case you haven't gotten your data yet, these are the steps I used now that Google has switched timeline data from online storage to on-device storage by default:
https://support.google.com/maps/thread/280205453/how-do-i-download-my-timeline-history?hl=en
Instructions: on your Android device, go to Settings > Location > Location services > Timeline > Export timeline data.
My Records.json is 1.56 GB and 70 million lines! (data from 2010 till today) How big a chunk can Dawarich handle? And any tips on the best way to split up the file?
@mcfrojd splitting into files of up to 100-150 MB should work
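Something along these lines should work for the split — a minimal sketch, assuming Records.json has the usual top-level "locations" array (loading it this way still needs free RAM on the order of the file size, and the chunk size is just a starting point to tune toward 100-150 MB per file):

```
import json

CHUNK_SIZE = 200_000  # locations per output file; adjust so each file lands around 100-150 MB

with open('Records.json', 'r') as f:
    locations = json.load(f).get('locations', [])

for i in range(0, len(locations), CHUNK_SIZE):
    chunk = {'locations': locations[i:i + CHUNK_SIZE]}
    out_name = f'Records_{i // CHUNK_SIZE:03d}.json'
    with open(out_name, 'w') as out:
        json.dump(chunk, out)
    print(f'wrote {out_name} with {len(chunk["locations"])} locations')
```

Each output file keeps the same {"locations": [...]} shape as the original, so the importer should treat them like smaller Records.json files.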
I come back to this project once a month trying to import my data (around 900 MB) and it always fails.
I can't believe the technology is not there yet to read less than a gigabyte of data and put it into a database. 😭
I ended up solving this with a Python script:
```
import json
from datetime import datetime


def generate_sql(file_path, output_path, import_id, user_id):
    # one created_at/updated_at value for all rows, trimmed to millisecond precision
    now = datetime.utcnow().strftime('%Y-%m-%d %H:%M:%S.%f')[:-3]
    with open(file_path, 'r') as json_file, open(output_path, 'w') as sql_file:
        data = json.load(json_file)
        locations = data.get('locations', [])
        for location in locations:
            parsed = parse_json(location)
            sql = (
                f"INSERT INTO public.points (latitude, longitude, timestamp, raw_data, topic, tracker_id, import_id, user_id, created_at, updated_at) "
                f"VALUES ({parsed['latitude']}, {parsed['longitude']}, {parsed['timestamp']}, "
                f"'{parsed['raw_data']}', 'Google Maps Timeline Export', 'google-maps-timeline-export', "
                f"{import_id}, {user_id}, '{now}', '{now}');\n"
            )
            sql_file.write(sql)


def parse_json(entry):
    # Records.json entries carry either an ISO 8601 'timestamp' or an epoch-milliseconds 'timestampMs'
    timestamp_str = entry.get('timestamp') or entry.get('timestampMs', '')
    if 'T' in timestamp_str:
        timestamp = int(datetime.fromisoformat(timestamp_str.replace('Z', '+00:00')).timestamp())
    else:
        timestamp = int(timestamp_str) // 1000 if timestamp_str else 0
    return {
        "latitude": entry.get('latitudeE7', 0) / 10 ** 7,
        "longitude": entry.get('longitudeE7', 0) / 10 ** 7,
        "timestamp": timestamp,
        "altitude": entry.get('altitude', 'NULL'),
        "velocity": entry.get('velocity', 'NULL'),
        "raw_data": json.dumps(entry).replace("'", "''"),  # escape single quotes for the SQL string literal
    }


input_json_path = 'Records.json'
output_sql_path = 'output.sql'
import_id = 1
user_id = 1

generate_sql(input_json_path, output_sql_path, import_id, user_id)
```
Steps:
1) Create an import.
2) Wait until it fails.
3) Set the user_id and import_id according to the database (they will be 1 and 1 if you run it on a fresh install).
4) Put your Records.json next to the script.
5) Run the script.
6) Modify docker-compose.yml to expose the database port, e.g.
   ports:
     - "127.0.0.1:5432:5432"
7) Take the output.sql and run it against the database over the port you exposed in step 6 (see the example commands after this list).
8) Wait around 10-15 minutes (it took 11 minutes for 2 million rows).
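For step 7 (and looking up the IDs for step 3), this is roughly what I mean, assuming the port mapping from step 6 and the default postgres user. The database name and the imports/users table names are guesses on my part, so check your docker-compose.yml and the actual schema first:

```
# look up the IDs for step 3 (table names assumed from a typical Rails schema)
psql -h 127.0.0.1 -p 5432 -U postgres -d dawarich_development -c "SELECT id FROM imports;"
psql -h 127.0.0.1 -p 5432 -U postgres -d dawarich_development -c "SELECT id, email FROM users;"

# run the generated inserts
psql -h 127.0.0.1 -p 5432 -U postgres -d dawarich_development -f output.sql
```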
After that, I see all my points in the app. I checked the code and that seems to be the only thing that needs to be done. @Freika, please correct me if I'm wrong; I'm seeing Ruby for the first time in my life.
sql = ( f"INSERT INTO public.points (latitude, longitude, timestamp, raw_data, topic, tracker_id, import_id, user_id, created_at, updated_at) " f"VALUES ({parsed['latitude']}, {parsed['longitude']}, {parsed['timestamp']}, " f"'{parsed['raw_data']}', '**Google Maps Timeline Export', 'google-maps-timeline-export**', " f"{import_id}, {user_id}, '{now}', '{now}');\n" )
tempted to do it, but is there any reason to keep the long topic and tracker_id ?
When I follow the steps to import a Records.json from Google Takeout, I get the following output:
(my account email replaced with <my-email>)
The file that I am trying to import is quite large, as seen in the output above.
I have tried upping the CPU and memory limits in docker-compose.yml. If I raise them enough, or remove the limits, the task will run, but it hangs after a while, locking up the whole server, and eventually spits out the following:

I am running Dawarich in openmediavault-7 docker-compose using the :latest dawarich version, on an AMD Ryzen 5 2400G with 8 GB RAM.