Closed: Apathion closed this issue 3 years ago
@Apathion please fill out the issue template completely, thanks!
sure.
Your Operating system and DeepLabCut version
I'm working on macOS High Sierra, using DeepLabCut 2.2rc2 with the CPU env (Radeon Pro 560 4096 MB, 3.6 GHz Intel Core i7, 16 GB 2400 MHz DDR4).
Please complete the following information about your system:
I have no idea what else to write here (or what to write in the first paragraph instead). I hope you still have all the info you need; if not, just let me know.
Describe the problem
I'm trying to get my test script running. While trying to generate frames and create a video, the process seems to fork (whatever that means exactly).
Code output:
(base) UniversitysiMac:~ psydev20$ conda activate DLC-CPU
(DLC-CPU) UniversitysiMac:~ psydev20$ cd /Users/psydev20/Desktop/DeepLabCut
(DLC-CPU) UniversitysiMac:DeepLabCut psydev20$ cd examples
(DLC-CPU) UniversitysiMac:examples psydev20$ pythonw testscript.py
Imported DLC!
On Windows/OSX tensorpack is not tested by default.
CREATING PROJECT
Created "/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/videos"
Created "/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/labeled-data"
Created "/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/training-datasets"
Created "/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/dlc-models"
Copying the videos
/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/videos/reachingvideo1.avi
Generated "/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/config.yaml"
A new project with name TEST-Alex-2021-06-30 is created at /Users/psydev20/Desktop/DeepLabCut/examples and a configurable file (config.yaml) is stored there. Change the parameters in this file to adapt to your project's needs.
Once you have changed the configuration file, use the function 'extract_frames' to select frames for labeling.
. [OPTIONAL] Use the function 'add_new_videos' to add new videos to your project (at any stage).
EXTRACTING FRAMES
Config file read successfully.
Extracting frames based on kmeans ...
Kmeans-quantization based extracting of frames from 0.0 seconds to 8.53 seconds.
Extracting and downsampling... 256 frames from the video.
256it [00:01, 191.51it/s]
Kmeans clustering ... (this might take a while)
Frames were successfully extracted, for the videos of interest.
You can now label the frames using the function 'label_frames' (if you extracted enough frames for all videos).
CREATING-SOME LABELS FOR THE FRAMES
Plot labels...
Creating images with labels by Alex.
100%|█████████████████████████████████████████████| 5/5 [00:01<00:00, 3.51it/s]
If all the labels are ok, then use the function 'create_training_dataset' to create the training dataset!
CREATING TRAININGSET
The training dataset is successfully created. Use the function 'train_network' to start training. Happy training!
CHANGING training parameters to end quickly!
TRAIN
Selecting single-animal trainer
Config:
{'all_joints': [[0], [1], [2], [3]],
'all_joints_names': ['bodypart1', 'bodypart2', 'bodypart3', 'objectA'],
'alpha_r': 0.02,
'batch_size': 1,
'clahe': True,
'claheratio': 0.1,
'crop_pad': 0,
'cropratio': 0.4,
'dataset': 'training-datasets/iteration-0/UnaugmentedDataSet_TESTJun30/TEST_Alex80shuffle1.mat',
'dataset_type': 'default',
'decay_steps': 30000,
'deterministic': False,
'display_iters': 2,
'edge': False,
'emboss': {'alpha': [0.0, 1.0], 'embossratio': 0.1, 'strength': [0.5, 1.5]},
'fg_fraction': 0.25,
'global_scale': 0.8,
'histeq': True,
'histeqratio': 0.1,
'init_weights': '/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/deeplabcut/pose_estimation_tensorflow/models/pretrained/resnet_v1_50.ckpt',
'intermediate_supervision': False,
'intermediate_supervision_layer': 12,
'location_refinement': True,
'locref_huber_loss': True,
'locref_loss_weight': 0.05,
'locref_stdev': 7.2801,
'log_dir': 'log',
'lr_init': 0.0005,
'max_input_size': 1500,
'mean_pixel': [123.68, 116.779, 103.939],
'metadataset': 'training-datasets/iteration-0/UnaugmentedDataSet_TESTJun30/Documentation_data-TEST_80shuffle1.pickle',
'min_input_size': 64,
'mirror': False,
'multi_stage': False,
'multi_step': [[0.001, 5]],
'net_type': 'resnet_50',
'num_joints': 4,
'optimizer': 'sgd',
'pairwise_huber_loss': False,
'pairwise_predict': False,
'partaffinityfield_predict': False,
'pos_dist_thresh': 17,
'project_path': '/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30',
'regularize': False,
'rotation': 25,
'rotratio': 0.4,
'save_iters': 5,
'scale_jitter_lo': 0.5,
'scale_jitter_up': 1.25,
'scoremap_dir': 'test',
'sharpen': False,
'sharpenratio': 0.3,
'shuffle': True,
'snapshot_prefix': '/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/dlc-models/iteration-0/TESTJun30-trainset80shuffle1/train/snapshot',
'stride': 8.0,
'weigh_negatives': False,
'weigh_only_present_joints': False,
'weigh_part_predictions': False,
'weight_decay': 0.0001}
Starting with imgaug pose-dataset loader (=default).
Batch Size is 1
Initializing ResNet
Loading ImageNet-pretrained resnet_50
2021-06-30 17:16:51.863933: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2021-06-30 17:16:51.873591: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7ff80b495150 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2021-06-30 17:16:51.873620: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
Training parameter:
{'stride': 8.0, 'weigh_part_predictions': False, 'weigh_negatives': False, 'fg_fraction': 0.25, 'mean_pixel': [123.68, 116.779, 103.939], 'shuffle': True, 'snapshot_prefix': '/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/dlc-models/iteration-0/TESTJun30-trainset80shuffle1/train/snapshot', 'log_dir': 'log', 'global_scale': 0.8, 'location_refinement': True, 'locref_stdev': 7.2801, 'locref_loss_weight': 0.05, 'locref_huber_loss': True, 'optimizer': 'sgd', 'intermediate_supervision': False, 'intermediate_supervision_layer': 12, 'regularize': False, 'weight_decay': 0.0001, 'crop_pad': 0, 'scoremap_dir': 'test', 'batch_size': 1, 'dataset_type': 'default', 'deterministic': False, 'mirror': False, 'pairwise_huber_loss': False, 'weigh_only_present_joints': False, 'partaffinityfield_predict': False, 'pairwise_predict': False, 'all_joints': [[0], [1], [2], [3]], 'all_joints_names': ['bodypart1', 'bodypart2', 'bodypart3', 'objectA'], 'alpha_r': 0.02, 'clahe': True, 'claheratio': 0.1, 'cropratio': 0.4, 'dataset': 'training-datasets/iteration-0/UnaugmentedDataSet_TESTJun30/TEST_Alex80shuffle1.mat', 'decay_steps': 30000, 'display_iters': 2, 'edge': False, 'emboss': {'alpha': [0.0, 1.0], 'embossratio': 0.1, 'strength': [0.5, 1.5]}, 'histeq': True, 'histeqratio': 0.1, 'init_weights': '/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/deeplabcut/pose_estimation_tensorflow/models/pretrained/resnet_v1_50.ckpt', 'lr_init': 0.0005, 'max_input_size': 1500, 'metadataset': 'training-datasets/iteration-0/UnaugmentedDataSet_TESTJun30/Documentation_data-TEST_80shuffle1.pickle', 'min_input_size': 64, 'multi_stage': False, 'multi_step': [[0.001, 5]], 'net_type': 'resnet_50', 'num_joints': 4, 'pos_dist_thresh': 17, 'project_path': '/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30', 'rotation': 25, 'rotratio': 0.4, 'save_iters': 5, 'scale_jitter_lo': 0.5, 'scale_jitter_up': 1.25, 'sharpen': False, 'sharpenratio': 0.3, 'covering': True, 'elastic_transform': True, 'motion_blur': True, 'motion_blur_params': {'k': 7, 'angle': (-90, 90)}}
Starting training....
iteration: 2 loss: 1.2227 lr: 0.001
iteration: 4 loss: 0.6428 lr: 0.001
2021-06-30 17:17:07.489254: W tensorflow/core/kernels/queue_base.cc:277] _0_fifo_queue: Skipping cancelled enqueue attempt with queue not closed
Exception in thread Thread-3:
Traceback (most recent call last):
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/client/session.py", line 1365, in _do_call
return fn(*args)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/client/session.py", line 1350, in _run_fn
target_list, run_metadata)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/client/session.py", line 1443, in _call_tf_sessionrun
run_metadata)
tensorflow.python.framework.errors_impl.CancelledError: Enqueue operation was cancelled
[[{{node fifo_queue_enqueue}}]]
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/threading.py", line 926, in _bootstrap_inner
self.run()
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/threading.py", line 870, in run
self._target(*self._args, **self._kwargs)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/deeplabcut/pose_estimation_tensorflow/train.py", line 91, in load_and_enqueue
sess.run(enqueue_op, feed_dict=food)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/client/session.py", line 956, in run
run_metadata_ptr)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/client/session.py", line 1180, in _run
feed_dict_tensor, options, run_metadata)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/client/session.py", line 1359, in _do_run
run_metadata)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/client/session.py", line 1384, in _do_call
raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.CancelledError: Enqueue operation was cancelled
[[node fifo_queue_enqueue (defined at /opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/framework/ops.py:1748) ]]
Original stack trace for 'fifo_queue_enqueue':
File "testscript.py", line 146, in
How to Reproduce the problem
Steps to reproduce the behavior:
Additional context: I'm reinstalling DeepLabCut on our university computer and have run into multiple errors (about 10 different ones so far?), so this is the latest in a whole line of troubleshooting attempts.
I hope this is OK as it is; if not, let me know and I'll add to it.
I'm not sure on High Sierra; if you run
import multiprocessing as mp
mp.set_start_method('spawn')
does it work?
Sadly not. I tried both lines and neither did anything:
(DLC-CPU) UniversitysiMac:examples psydev20$ import multiprocessing as mp
-bash: import: command not found
(DLC-CPU) UniversitysiMac:examples psydev20$ mp.set_start_method('spawn')
-bash: syntax error near unexpected token `'spawn''
Run those lines after starting pythonw, or add them near the top of testscript.py.
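For illustration only (this sketch is not part of the original exchange): the two lines are Python statements rather than shell commands, so at the terminal they would be entered inside a Python session, roughly like this:
(DLC-CPU) UniversitysiMac:examples psydev20$ pythonw
>>> import multiprocessing as mp
>>> mp.set_start_method('spawn')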
Thanks for the input. Sadly this also didn't work. Now I get the following error:
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/multiprocessing/context.py", line 242, in set_start_method
raise RuntimeError('context has already been set')
RuntimeError: context has already been set
Traceback (most recent call last):
File "", line 1, in
Did I place it at the wrong place in testscript.py?
Testscript:
#!/usr/bin/env python3
# -*- coding: utf-8 -*-
"""
Created on Tue Oct 2 13:56:11 2018
@author: alex

DEVELOPERS:
This script tests various functionalities in an automatic way.

It should take about 3:30 minutes to run this in a CPU.
It should take about 1:30 minutes on a GPU (incl. downloading the ResNet weights)

It produces nothing of interest scientifically.
"""
import os
import deeplabcut
import platform
import subprocess
from pathlib import Path
import numpy as np
import pandas as pd
import multiprocessing as mp

mp.set_start_method('spawn')

if __name__ == "__main__":
    task = "TEST"  # Enter the name of your experiment Task
    scorer = "Alex"  # Enter the name of the experimenter/labeler

    print("Imported DLC!")
    basepath = os.path.dirname(os.path.realpath(__file__))
    videoname = "reachingvideo1"
    video = [
        os.path.join(
            basepath, "Reaching-Mackenzie-2018-08-30", "videos", videoname + ".avi"
        )
    ]
    # For testing a color video:
    # videoname='baby4hin2min'
    # video=[os.path.join('/home/alex/Desktop/Data',videoname+'.mp4')]

    # to test destination folder:
    dfolder = basepath
    dfolder = None
    net_type = "resnet_50"
    #net_type = "mobilenet_v2_0.35"
    #net_type = "efficientnet-b0"  # to -b6

    augmenter_type = "default"  # = imgaug!!
    augmenter_type2 = "scalecrop"

    if platform.system() == "Darwin" or platform.system() == "Windows":
        print("On Windows/OSX tensorpack is not tested by default.")
        augmenter_type3 = "imgaug"
    else:
        augmenter_type3 = "tensorpack"  # Does not work on WINDOWS

    numiter = 5

    print("CREATING PROJECT")
    path_config_file = deeplabcut.create_new_project(
        task, scorer, video, copy_videos=True
    )

    cfg = deeplabcut.auxiliaryfunctions.read_config(path_config_file)
    cfg["numframes2pick"] = 5
    cfg["pcutoff"] = 0.01
    cfg["TrainingFraction"] = [0.8]
    cfg["skeleton"] = [["bodypart1", "bodypart2"], ["bodypart1", "bodypart3"]]
    deeplabcut.auxiliaryfunctions.write_config(path_config_file, cfg)

    print("EXTRACTING FRAMES")
    deeplabcut.extract_frames(path_config_file, mode="automatic", userfeedback=False)

    print("CREATING-SOME LABELS FOR THE FRAMES")
    frames = os.listdir(os.path.join(cfg["project_path"], "labeled-data", videoname))
    # As this next step is manual, we update the labels by putting them on the diagonal (fixed for all frames)
    for index, bodypart in enumerate(cfg["bodyparts"]):
        columnindex = pd.MultiIndex.from_product(
            [[scorer], [bodypart], ["x", "y"]], names=["scorer", "bodyparts", "coords"]
        )
        frame = pd.DataFrame(
            100 + np.ones((len(frames), 2)) * 50 * index,
            columns=columnindex,
            index=[os.path.join("labeled-data", videoname, fn) for fn in frames],
        )
        if index == 0:
            dataFrame = frame
        else:
            dataFrame = pd.concat([dataFrame, frame], axis=1)

    dataFrame.to_csv(
        os.path.join(
            cfg["project_path"],
            "labeled-data",
            videoname,
            "CollectedData_" + scorer + ".csv",
        )
    )
    dataFrame.to_hdf(
        os.path.join(
            cfg["project_path"],
            "labeled-data",
            videoname,
            "CollectedData_" + scorer + ".h5",
        ),
        "df_with_missing",
        format="table",
        mode="w",
    )

    print("Plot labels...")
    deeplabcut.check_labels(path_config_file)

    print("CREATING TRAININGSET")
    deeplabcut.create_training_dataset(
        path_config_file, net_type=net_type, augmenter_type=augmenter_type
    )

    posefile = os.path.join(
        cfg["project_path"],
        "dlc-models/iteration-"
        + str(cfg["iteration"])
        + "/"
        + cfg["Task"]
        + cfg["date"]
        + "-trainset"
        + str(int(cfg["TrainingFraction"][0] * 100))
        + "shuffle"
        + str(1),
        "train/pose_cfg.yaml",
    )

    DLC_config = deeplabcut.auxiliaryfunctions.read_plainconfig(posefile)
    DLC_config["save_iters"] = numiter
    DLC_config["display_iters"] = 2
    DLC_config["multi_step"] = [[0.001, numiter]]

    print("CHANGING training parameters to end quickly!")
    deeplabcut.auxiliaryfunctions.write_plainconfig(posefile, DLC_config)

    print("TRAIN")
    deeplabcut.train_network(path_config_file)

    print("EVALUATE")
    deeplabcut.evaluate_network(path_config_file, plotting=True)
    # deeplabcut.evaluate_network(path_config_file,plotting=True,trainingsetindex=33)

    print("CUT SHORT VIDEO AND ANALYZE (with dynamic cropping!)")
    # Make super short video (so the analysis is quick!)
    try:  # you need ffmpeg command line interface
        # subprocess.call(['ffmpeg','-i',video[0],'-ss','00:00:00','-to','00:00:00.4','-c','copy',newvideo])
        newvideo = deeplabcut.ShortenVideo(
            video[0],
            start="00:00:00",
            stop="00:00:01",
            outsuffix="short",
            outpath=os.path.join(cfg["project_path"], "videos"),
        )
    except:  # if ffmpeg is broken/missing
        print("using alternative method")
        newvideo = os.path.join(cfg["project_path"], "videos", videoname + "short.mp4")
        from moviepy.editor import VideoFileClip, VideoClip

        clip = VideoFileClip(video[0])
        clip.reader.initialize()

        def make_frame(t):
            return clip.get_frame(1)

        newclip = VideoClip(make_frame, duration=1)
        newclip.write_videofile(newvideo, fps=30)

    vname = Path(newvideo).stem

    deeplabcut.analyze_videos(
        path_config_file,
        [newvideo],
        save_as_csv=True,
        destfolder=dfolder,
        dynamic=(True, 0.1, 5),
    )

    print("analyze again...")
    deeplabcut.analyze_videos(
        path_config_file, [newvideo], save_as_csv=True, destfolder=dfolder
    )

    print("CREATE VIDEO")
    deeplabcut.create_labeled_video(
        path_config_file, [newvideo], destfolder=dfolder, save_frames=True
    )

    print("Making plots")
    deeplabcut.plot_trajectories(path_config_file, [newvideo], destfolder=dfolder)

    print("EXTRACT OUTLIERS")
    deeplabcut.extract_outlier_frames(
        path_config_file,
        [newvideo],
        outlieralgorithm="jump",
        epsilon=0,
        automatic=True,
        destfolder=dfolder,
    )
    deeplabcut.extract_outlier_frames(
        path_config_file,
        [newvideo],
        outlieralgorithm="fitting",
        automatic=True,
        destfolder=dfolder,
    )

    file = os.path.join(
        cfg["project_path"],
        "labeled-data",
        vname,
        "machinelabels-iter" + str(cfg["iteration"]) + ".h5",
    )

    print("RELABELING")
    DF = pd.read_hdf(file, "df_with_missing")
    DLCscorer = np.unique(DF.columns.get_level_values(0))[0]
    DF.columns.set_levels([scorer.replace(DLCscorer, scorer)], level=0, inplace=True)
    DF = DF.drop("likelihood", axis=1, level=2)
    DF.to_csv(
        os.path.join(
            cfg["project_path"],
            "labeled-data",
            vname,
            "CollectedData_" + scorer + ".csv",
        )
    )
    DF.to_hdf(
        os.path.join(
            cfg["project_path"],
            "labeled-data",
            vname,
            "CollectedData_" + scorer + ".h5",
        ),
        "df_with_missing",
        format="table",
        mode="w",
    )

    print("MERGING")
    deeplabcut.merge_datasets(path_config_file)  # iteration + 1

    print("CREATING TRAININGSET")
    deeplabcut.create_training_dataset(
        path_config_file, net_type=net_type, augmenter_type=augmenter_type2
    )

    cfg = deeplabcut.auxiliaryfunctions.read_config(path_config_file)
    posefile = os.path.join(
        cfg["project_path"],
        "dlc-models/iteration-"
        + str(cfg["iteration"])
        + "/"
        + cfg["Task"]
        + cfg["date"]
        + "-trainset"
        + str(int(cfg["TrainingFraction"][0] * 100))
        + "shuffle"
        + str(1),
        "train/pose_cfg.yaml",
    )
    DLC_config = deeplabcut.auxiliaryfunctions.read_plainconfig(posefile)
    DLC_config["save_iters"] = numiter
    DLC_config["display_iters"] = 1
    DLC_config["multi_step"] = [[0.001, numiter]]

    print("CHANGING training parameters to end quickly!")
    deeplabcut.auxiliaryfunctions.write_config(posefile, DLC_config)

    print("TRAIN")
    deeplabcut.train_network(path_config_file)

    try:  # you need ffmpeg command line interface
        # subprocess.call(['ffmpeg','-i',video[0],'-ss','00:00:00','-to','00:00:00.4','-c','copy',newvideo])
        newvideo2 = deeplabcut.ShortenVideo(
            video[0],
            start="00:00:00",
            stop="00:00:01",
            outsuffix="short2",
            outpath=os.path.join(cfg["project_path"], "videos"),
        )
    except:  # if ffmpeg is broken
        newvideo2 = os.path.join(
            cfg["project_path"], "videos", videoname + "short2.mp4"
        )
        from moviepy.editor import VideoFileClip, VideoClip

        clip = VideoFileClip(video[0])
        clip.reader.initialize()

        def make_frame(t):
            return clip.get_frame(1)

        newclip = VideoClip(make_frame, duration=1)
        newclip.write_videofile(newvideo2, fps=30)

    vname = Path(newvideo2).stem

    print("Inference with direct cropping")
    deeplabcut.analyze_videos(
        path_config_file,
        [newvideo2],
        save_as_csv=True,
        destfolder=dfolder,
        cropping=[0, 50, 0, 50],
        allow_growth=True,
    )

    print("Extracting skeleton distances, filter and plot filtered output")
    deeplabcut.analyzeskeleton(
        path_config_file, [newvideo2], save_as_csv=True, destfolder=dfolder
    )
    deeplabcut.filterpredictions(path_config_file, [newvideo2])
    deeplabcut.create_labeled_video(
        path_config_file,
        [newvideo2],
        destfolder=dfolder,
        displaycropped=True,
        filtered=True,
    )
    print("Creating a Johansson video!")
    deeplabcut.create_labeled_video(
        path_config_file, [newvideo2], destfolder=dfolder, keypoints_only=True
    )
    deeplabcut.plot_trajectories(
        path_config_file, [newvideo2], destfolder=dfolder, filtered=True
    )

    print("ALL DONE!!! - default cases without Tensorpack loader are functional.")

    print("CREATING TRAININGSET for shuffle 2")
    print("will be used for 3D testscript...")
    # TENSORPACK could fail in WINDOWS...
    deeplabcut.create_training_dataset(
        path_config_file,
        Shuffles=[2],
        net_type=net_type,
        augmenter_type=augmenter_type3,
    )
    posefile = os.path.join(
        cfg["project_path"],
        "dlc-models/iteration-"
        + str(cfg["iteration"])
        + "/"
        + cfg["Task"]
        + cfg["date"]
        + "-trainset"
        + str(int(cfg["TrainingFraction"][0] * 100))
        + "shuffle"
        + str(2),
        "train/pose_cfg.yaml",
    )
    DLC_config = deeplabcut.auxiliaryfunctions.read_plainconfig(posefile)
    DLC_config["save_iters"] = 10
    DLC_config["display_iters"] = 2
    DLC_config["multi_step"] = [[0.001, 10]]

    print("CHANGING training parameters to end quickly!")
    deeplabcut.auxiliaryfunctions.write_plainconfig(posefile, DLC_config)

    print("TRAINING shuffle 2, with smaller allocated memory")
    deeplabcut.train_network(path_config_file, shuffle=2, allow_growth=True)

    print("ANALYZING some individual frames")
    deeplabcut.analyze_time_lapse_frames(
        path_config_file,
        os.path.join(cfg["project_path"], "labeled-data/reachingvideo1/"),
    )

    print("Export model...")
    deeplabcut.export_model(path_config_file, shuffle=2, make_tar=False)

    print("Merging datasets...")
    trainIndices, testIndices = deeplabcut.mergeandsplit(
        path_config_file, trainindex=0, uniform=True
    )
    print("Creating two identical splits...")
    deeplabcut.create_training_dataset(
        path_config_file,
        Shuffles=[4, 5],
        trainIndices=[trainIndices, trainIndices],
        testIndices=[testIndices, testIndices],
    )

    print("ALL DONE!!! - default cases are functional.")
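For illustration only (not from the thread): in the script above, mp.set_start_method('spawn') is called at import time, before the if __name__ == "__main__": guard. The pattern recommended by the Python multiprocessing docs is to call it once inside that guard; the try/except below is an added assumption to tolerate a context that has already been set, and whether this resolves the DeepLabCut errors is not confirmed here:

import multiprocessing as mp

if __name__ == "__main__":
    # Set the start method once, inside the main guard; ignore the
    # RuntimeError raised if a start method/context was already set.
    try:
        mp.set_start_method("spawn")
    except RuntimeError:
        pass

    # ... rest of testscript.py unchanged (task, scorer, etc.)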
Here is my terminal output, in case the placement was right and something else went wrong:
Last login: Thu Jul 1 01:19:27 on console
(base) UniversitysiMac:~ psydev20$ conda activate DLC-CPU
(DLC-CPU) UniversitysiMac:~ psydev20$ cd /Users/psydev20/Desktop/DeepLabCut
(DLC-CPU) UniversitysiMac:DeepLabCut psydev20$ cd examples
(DLC-CPU) UniversitysiMac:examples psydev20$ pythonw testscript.py
Imported DLC!
On Windows/OSX tensorpack is not tested by default.
CREATING PROJECT
Created "/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-07-01/videos"
Created "/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-07-01/labeled-data"
Created "/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-07-01/training-datasets"
Created "/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-07-01/dlc-models"
Copying the videos
/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-07-01/videos/reachingvideo1.avi
Generated "/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-07-01/config.yaml"
A new project with name TEST-Alex-2021-07-01 is created at /Users/psydev20/Desktop/DeepLabCut/examples and a configurable file (config.yaml) is stored there. Change the parameters in this file to adapt to your project's needs.
Once you have changed the configuration file, use the function 'extract_frames' to select frames for labeling.
. [OPTIONAL] Use the function 'add_new_videos' to add new videos to your project (at any stage).
EXTRACTING FRAMES
Config file read successfully.
Extracting frames based on kmeans ...
Kmeans-quantization based extracting of frames from 0.0 seconds to 8.53 seconds.
Extracting and downsampling... 256 frames from the video.
256it [00:01, 194.44it/s]
Kmeans clustering ... (this might take a while)
Frames were successfully extracted, for the videos of interest.
You can now label the frames using the function 'label_frames' (if you extracted enough frames for all videos).
CREATING-SOME LABELS FOR THE FRAMES
Plot labels...
Creating images with labels by Alex.
100%|█████████████████████████████████████████████| 5/5 [00:01<00:00, 3.50it/s]
If all the labels are ok, then use the function 'create_training_dataset' to create the training dataset!
CREATING TRAININGSET
The training dataset is successfully created. Use the function 'train_network' to start training. Happy training!
CHANGING training parameters to end quickly!
TRAIN
Selecting single-animal trainer
Config:
{'all_joints': [[0], [1], [2], [3]],
'all_joints_names': ['bodypart1', 'bodypart2', 'bodypart3', 'objectA'],
'alpha_r': 0.02,
'batch_size': 1,
'clahe': True,
'claheratio': 0.1,
'crop_pad': 0,
'cropratio': 0.4,
'dataset': 'training-datasets/iteration-0/UnaugmentedDataSet_TESTJul1/TEST_Alex80shuffle1.mat',
'dataset_type': 'default',
'decay_steps': 30000,
'deterministic': False,
'display_iters': 2,
'edge': False,
'emboss': {'alpha': [0.0, 1.0], 'embossratio': 0.1, 'strength': [0.5, 1.5]},
'fg_fraction': 0.25,
'global_scale': 0.8,
'histeq': True,
'histeqratio': 0.1,
'init_weights': '/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/deeplabcut/pose_estimation_tensorflow/models/pretrained/resnet_v1_50.ckpt',
'intermediate_supervision': False,
'intermediate_supervision_layer': 12,
'location_refinement': True,
'locref_huber_loss': True,
'locref_loss_weight': 0.05,
'locref_stdev': 7.2801,
'log_dir': 'log',
'lr_init': 0.0005,
'max_input_size': 1500,
'mean_pixel': [123.68, 116.779, 103.939],
'metadataset': 'training-datasets/iteration-0/UnaugmentedDataSet_TESTJul1/Documentation_data-TEST_80shuffle1.pickle',
'min_input_size': 64,
'mirror': False,
'multi_stage': False,
'multi_step': [[0.001, 5]],
'net_type': 'resnet_50',
'num_joints': 4,
'optimizer': 'sgd',
'pairwise_huber_loss': False,
'pairwise_predict': False,
'partaffinityfield_predict': False,
'pos_dist_thresh': 17,
'project_path': '/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-07-01',
'regularize': False,
'rotation': 25,
'rotratio': 0.4,
'save_iters': 5,
'scale_jitter_lo': 0.5,
'scale_jitter_up': 1.25,
'scoremap_dir': 'test',
'sharpen': False,
'sharpenratio': 0.3,
'shuffle': True,
'snapshot_prefix': '/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-07-01/dlc-models/iteration-0/TESTJul1-trainset80shuffle1/train/snapshot',
'stride': 8.0,
'weigh_negatives': False,
'weigh_only_present_joints': False,
'weigh_part_predictions': False,
'weight_decay': 0.0001}
Starting with imgaug pose-dataset loader (=default).
Batch Size is 1
Initializing ResNet
Loading ImageNet-pretrained resnet_50
2021-07-01 01:25:45.159811: I tensorflow/core/platform/cpu_feature_guard.cc:142] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
2021-07-01 01:25:45.175178: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7fc59e2f2640 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2021-07-01 01:25:45.175218: I tensorflow/compiler/xla/service/service.cc:176] StreamExecutor device (0): Host, Default Version
Training parameter:
{'stride': 8.0, 'weigh_part_predictions': False, 'weigh_negatives': False, 'fg_fraction': 0.25, 'mean_pixel': [123.68, 116.779, 103.939], 'shuffle': True, 'snapshot_prefix': '/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-07-01/dlc-models/iteration-0/TESTJul1-trainset80shuffle1/train/snapshot', 'log_dir': 'log', 'global_scale': 0.8, 'location_refinement': True, 'locref_stdev': 7.2801, 'locref_loss_weight': 0.05, 'locref_huber_loss': True, 'optimizer': 'sgd', 'intermediate_supervision': False, 'intermediate_supervision_layer': 12, 'regularize': False, 'weight_decay': 0.0001, 'crop_pad': 0, 'scoremap_dir': 'test', 'batch_size': 1, 'dataset_type': 'default', 'deterministic': False, 'mirror': False, 'pairwise_huber_loss': False, 'weigh_only_present_joints': False, 'partaffinityfield_predict': False, 'pairwise_predict': False, 'all_joints': [[0], [1], [2], [3]], 'all_joints_names': ['bodypart1', 'bodypart2', 'bodypart3', 'objectA'], 'alpha_r': 0.02, 'clahe': True, 'claheratio': 0.1, 'cropratio': 0.4, 'dataset': 'training-datasets/iteration-0/UnaugmentedDataSet_TESTJul1/TEST_Alex80shuffle1.mat', 'decay_steps': 30000, 'display_iters': 2, 'edge': False, 'emboss': {'alpha': [0.0, 1.0], 'embossratio': 0.1, 'strength': [0.5, 1.5]}, 'histeq': True, 'histeqratio': 0.1, 'init_weights': '/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/deeplabcut/pose_estimation_tensorflow/models/pretrained/resnet_v1_50.ckpt', 'lr_init': 0.0005, 'max_input_size': 1500, 'metadataset': 'training-datasets/iteration-0/UnaugmentedDataSet_TESTJul1/Documentation_data-TEST_80shuffle1.pickle', 'min_input_size': 64, 'multi_stage': False, 'multi_step': [[0.001, 5]], 'net_type': 'resnet_50', 'num_joints': 4, 'pos_dist_thresh': 17, 'project_path': '/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-07-01', 'rotation': 25, 'rotratio': 0.4, 'save_iters': 5, 'scale_jitter_lo': 0.5, 'scale_jitter_up': 1.25, 'sharpen': False, 'sharpenratio': 0.3, 'covering': True, 'elastic_transform': True, 'motion_blur': True, 'motion_blur_params': {'k': 7, 'angle': (-90, 90)}}
Starting training....
iteration: 2 loss: 1.2516 lr: 0.001
iteration: 4 loss: 0.6739 lr: 0.001
2021-07-01 01:26:00.614942: W tensorflow/core/kernels/queue_base.cc:277] _0_fifo_queue: Skipping cancelled enqueue attempt with queue not closed
Exception in thread Thread-2:
Traceback (most recent call last):
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/client/session.py", line 1365, in _do_call
return fn(*args)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/client/session.py", line 1350, in _run_fn
target_list, run_metadata)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/client/session.py", line 1443, in _call_tf_sessionrun
run_metadata)
tensorflow.python.framework.errors_impl.CancelledError: Enqueue operation was cancelled
[[{{node fifo_queue_enqueue}}]]
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/threading.py", line 926, in _bootstrap_inner
self.run()
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/threading.py", line 870, in run
self._target(*self._args, **self._kwargs)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/deeplabcut/pose_estimation_tensorflow/train.py", line 91, in load_and_enqueue
sess.run(enqueue_op, feed_dict=food)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/client/session.py", line 956, in run
run_metadata_ptr)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/client/session.py", line 1180, in _run
feed_dict_tensor, options, run_metadata)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/client/session.py", line 1359, in _do_run
run_metadata)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/client/session.py", line 1384, in _do_call
raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.CancelledError: Enqueue operation was cancelled
[[node fifo_queue_enqueue (defined at /opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/framework/ops.py:1748) ]]
Original stack trace for 'fifo_queue_enqueue':
File "testscript.py", line 148, in
Hello again.
I'm still trying to get my test script running and have run into 3 new errors. Two of them I could find and solve through other issues on here, but this one I can't find outside of the DLC forum, and I really don't want to make it worse again, so I'd rather ask once too often. While trying to generate frames and create a video, the process seems to fork (whatever that means exactly).
Code output (this is the same run as posted in the issue above; the stack trace continues from where it was cut off there):
deeplabcut.train_network(path_config_file)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/deeplabcut/pose_estimation_tensorflow/training.py", line 189, in train_network
allow_growth=allow_growth,
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/deeplabcut/pose_estimation_tensorflow/train.py", line 176, in train
batch, enqueue_op, placeholders = setup_preloading(batch_spec)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/deeplabcut/pose_estimation_tensorflow/train.py", line 77, in setup_preloading
enqueue_op = q.enqueue(placeholders_list)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/ops/data_flow_ops.py", line 346, in enqueue
self._queue_ref, vals, name=scope)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/ops/gen_data_flow_ops.py", line 4410, in queue_enqueue_v2
timeout_ms=timeout_ms, name=name)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/framework/op_def_library.py", line 794, in _apply_op_helper
op_def=op_def)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/util/deprecation.py", line 507, in new_func
return func(*args, **kwargs)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/framework/ops.py", line 3357, in create_op
attrs, op_def, compute_device)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/framework/ops.py", line 3426, in _create_op_internal
op_def=op_def)
File "/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/tensorflow_core/python/framework/ops.py", line 1748, in __init__
self._traceback = tf_stack.extract_stack()
The network is now trained and ready to evaluate. Use the function 'evaluate_network' to evaluate the network.
EVALUATE
Config:
{'all_joints': [[0], [1], [2], [3]],
'all_joints_names': ['bodypart1', 'bodypart2', 'bodypart3', 'objectA'],
'batch_size': 1,
'crop_pad': 0,
'dataset': 'training-datasets/iteration-0/UnaugmentedDataSet_TESTJun30/TEST_Alex80shuffle1.mat',
'dataset_type': 'imgaug',
'deterministic': False,
'fg_fraction': 0.25,
'global_scale': 0.8,
'init_weights': '/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/deeplabcut/pose_estimation_tensorflow/models/pretrained/resnet_v1_50.ckpt',
'intermediate_supervision': False,
'intermediate_supervision_layer': 12,
'location_refinement': True,
'locref_huber_loss': True,
'locref_loss_weight': 1.0,
'locref_stdev': 7.2801,
'log_dir': 'log',
'mean_pixel': [123.68, 116.779, 103.939],
'mirror': False,
'net_type': 'resnet_50',
'num_joints': 4,
'optimizer': 'sgd',
'pairwise_huber_loss': True,
'pairwise_predict': False,
'partaffinityfield_predict': False,
'regularize': False,
'scoremap_dir': 'test',
'shuffle': True,
'snapshot_prefix': '/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/dlc-models/iteration-0/TESTJun30-trainset80shuffle1/test/snapshot',
'stride': 8.0,
'weigh_negatives': False,
'weigh_only_present_joints': False,
'weigh_part_predictions': False,
'weight_decay': 0.0001}
Running DLC_resnet50_TESTJun30shuffle1_5 with # of trainingiterations: 5
Initializing ResNet
Analyzing data...
5it [00:02, 1.70it/s]
Done and results stored for snapshot: snapshot-5
Results for 5 training iterations: 80 1 train error: 346.22 pixels. Test error: 340.4 pixels.
With pcutoff of 0.01 train error: 346.22 pixels. Test error: 340.4 pixels
Thereby, the errors are given by the average distances between the labels by DLC and the scorer.
Plotting...
100%|█████████████████████████████████████████████| 5/5 [00:01<00:00, 3.28it/s]
The network is evaluated and the results are stored in the subdirectory 'evaluation_results'.
If it generalizes well, choose the best model for prediction and update the config file with the appropriate index for the 'snapshotindex'.
Use the function 'analyze_video' to make predictions on new videos.
Otherwise consider retraining the network (see DeepLabCut workflow Fig 2)
CUT SHORT VIDEO AND ANALYZE (with dynamic cropping!)
ffmpeg version 4.3.1 Copyright (c) 2000-2020 the FFmpeg developers
built with clang version 11.0.0
configuration: --prefix=/opt/anaconda3/envs/DLC-CPU --cc=x86_64-apple-darwin13.4.0-clang --disable-doc --disable-openssl --enable-avresample --enable-gnutls --enable-gpl --enable-hardcoded-tables --enable-libfreetype --enable-libopenh264 --enable-libx264 --enable-pic --enable-pthreads --enable-shared --enable-static --enable-version3 --enable-zlib --enable-libmp3lame --pkg-config=/Users/runner/miniforge3/conda-bld/ffmpeg_1609681034781/_build_env/bin/pkg-config
libavutil 56. 51.100 / 56. 51.100
libavcodec 58. 91.100 / 58. 91.100
libavformat 58. 45.100 / 58. 45.100
libavdevice 58. 10.100 / 58. 10.100
libavfilter 7. 85.100 / 7. 85.100
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 7.100 / 5. 7.100
libswresample 3. 7.100 / 3. 7.100
libpostproc 55. 7.100 / 55. 7.100
Input #0, avi, from '/Users/psydev20/Desktop/DeepLabCut/examples/Reaching-Mackenzie-2018-08-30/videos/reachingvideo1.avi':
Duration: 00:00:08.53, start: 0.000000, bitrate: 12642 kb/s
Stream #0:0: Video: mjpeg (Baseline) (MJPG / 0x47504A4D), yuvj420p(pc, bt470bg/unknown/unknown), 832x747 [SAR 1:1 DAR 832:747], 12682 kb/s, 30 fps, 30 tbr, 30 tbn, 30 tbc
Metadata:
title : ImageJ AVI
Stream mapping:
Stream #0:0 -> #0:0 (mjpeg (native) -> mpeg4 (native))
Press [q] to stop, [?] for help
[swscaler @ 0x7fa6bb000600] deprecated pixel format used, make sure you did set range correctly
Output #0, avi, to '/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/videos/reachingvideo1short.avi':
Metadata:
ISFT : Lavf58.45.100
Stream #0:0: Video: mpeg4 (FMP4 / 0x34504D46), yuv420p, 832x747 [SAR 1:1 DAR 832:747], q=2-31, 200 kb/s, 30 fps, 30 tbn, 30 tbc
Metadata:
title : ImageJ AVI
encoder : Lavc58.91.100 mpeg4
Side data:
cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: N/A
frame= 30 fps=0.0 q=31.0 Lsize= 240kB time=00:00:01.00 bitrate=1964.0kbits/s speed=10.8x
video:233kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 2.731339%
Config:
{'all_joints': [[0], [1], [2], [3]],
'all_joints_names': ['bodypart1', 'bodypart2', 'bodypart3', 'objectA'],
'batch_size': 1,
'crop_pad': 0,
'dataset': 'training-datasets/iteration-0/UnaugmentedDataSet_TESTJun30/TEST_Alex80shuffle1.mat',
'dataset_type': 'imgaug',
'deterministic': False,
'fg_fraction': 0.25,
'global_scale': 0.8,
'init_weights': '/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/deeplabcut/pose_estimation_tensorflow/models/pretrained/resnet_v1_50.ckpt',
'intermediate_supervision': False,
'intermediate_supervision_layer': 12,
'location_refinement': True,
'locref_huber_loss': True,
'locref_loss_weight': 1.0,
'locref_stdev': 7.2801,
'log_dir': 'log',
'mean_pixel': [123.68, 116.779, 103.939],
'mirror': False,
'net_type': 'resnet_50',
'num_joints': 4,
'optimizer': 'sgd',
'pairwise_huber_loss': True,
'pairwise_predict': False,
'partaffinityfield_predict': False,
'regularize': False,
'scoremap_dir': 'test',
'shuffle': True,
'snapshot_prefix': '/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/dlc-models/iteration-0/TESTJun30-trainset80shuffle1/test/snapshot',
'stride': 8.0,
'weigh_negatives': False,
'weigh_only_present_joints': False,
'weigh_part_predictions': False,
'weight_decay': 0.0001}
Using snapshot-5 for model /Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/dlc-models/iteration-0/TESTJun30-trainset80shuffle1
Starting analysis in dynamic cropping mode with parameters: (True, 0.1, 5)
Switching batchsize to 1, num_outputs (per animal) to 1 and TFGPUinference to False (all these features are not supported in this mode).
Initializing ResNet
Starting to analyze % /Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/videos/reachingvideo1short.avi
/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/videos already exists!
Loading /Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/videos/reachingvideo1short.avi
Duration of video [s]: 1.0 , recorded with 30.0 fps!
Overall # of frames: 30 found with (before cropping) frame dimensions: 832 747
Starting to extract posture
40it [00:01, 24.94it/s]
Saving results in /Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/videos...
Saving csv poses!
The videos are analyzed. Now your research can truly start!
You can create labeled videos with 'create_labeled_video'
If the tracking is not satisfactory for some videos, consider expanding the training set. You can use the function 'extract_outlier_frames' to extract a few representative outlier frames.
analyze again...
Config:
{'all_joints': [[0], [1], [2], [3]],
'all_joints_names': ['bodypart1', 'bodypart2', 'bodypart3', 'objectA'],
'batch_size': 1,
'crop_pad': 0,
'dataset': 'training-datasets/iteration-0/UnaugmentedDataSet_TESTJun30/TEST_Alex80shuffle1.mat',
'dataset_type': 'imgaug',
'deterministic': False,
'fg_fraction': 0.25,
'global_scale': 0.8,
'init_weights': '/opt/anaconda3/envs/DLC-CPU/lib/python3.7/site-packages/deeplabcut/pose_estimation_tensorflow/models/pretrained/resnet_v1_50.ckpt',
'intermediate_supervision': False,
'intermediate_supervision_layer': 12,
'location_refinement': True,
'locref_huber_loss': True,
'locref_loss_weight': 1.0,
'locref_stdev': 7.2801,
'log_dir': 'log',
'mean_pixel': [123.68, 116.779, 103.939],
'mirror': False,
'net_type': 'resnet_50',
'num_joints': 4,
'optimizer': 'sgd',
'pairwise_huber_loss': True,
'pairwise_predict': False,
'partaffinityfield_predict': False,
'regularize': False,
'scoremap_dir': 'test',
'shuffle': True,
'snapshot_prefix': '/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/dlc-models/iteration-0/TESTJun30-trainset80shuffle1/test/snapshot',
'stride': 8.0,
'weigh_negatives': False,
'weigh_only_present_joints': False,
'weigh_part_predictions': False,
'weight_decay': 0.0001}
Using snapshot-5 for model /Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/dlc-models/iteration-0/TESTJun30-trainset80shuffle1
Initializing ResNet
Starting to analyze % /Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/videos/reachingvideo1short.avi
/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/videos already exists!
The videos are analyzed. Now your research can truly start!
You can create labeled videos with 'create_labeled_video'
If the tracking is not satisfactory for some videos, consider expanding the training set. You can use the function 'extract_outlier_frames' to extract a few representative outlier frames.
CREATE VIDEO
/Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/videos already exists!
Starting to process video: /Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/videos/reachingvideo1short.avi
Loading /Users/psydev20/Desktop/DeepLabCut/examples/TEST-Alex-2021-06-30/videos/reachingvideo1short.avi and data.
Duration of video [s]: 1.0, recorded with 30.0 fps!
Overall # of frames: 30 with cropped frame dimensions: 832 747
Generating frames and creating video.
The process has forked and you cannot use this CoreFoundation functionality safely. You MUST exec().
Break on __THE_PROCESS_HAS_FORKED_AND_YOU_CANNOT_USE_THIS_COREFOUNDATION_FUNCTIONALITY___YOU_MUST_EXEC__() to debug.
The process has forked and you cannot use this CoreFoundation functionality safely. You MUST exec().
Break on __THE_PROCESS_HAS_FORKED_AND_YOU_CANNOT_USE_THIS_COREFOUNDATION_FUNCTIONALITY___YOU_MUST_EXEC__() to debug.
The process has forked and you cannot use this CoreFoundation functionality safely. You MUST exec().
Break on __THE_PROCESS_HAS_FORKED_AND_YOU_CANNOT_USE_THIS_COREFOUNDATION_FUNCTIONALITY___YOU_MUST_EXEC__() to debug.
The process has forked and you cannot use this CoreFoundation functionality safely. You MUST exec().
Break on __THE_PROCESS_HAS_FORKED_AND_YOU_CANNOT_USE_THIS_COREFOUNDATION_FUNCTIONALITY___YOU_MUST_EXEC__() to debug.
The process has forked and you cannot use this CoreFoundation functionality safely. You MUST exec().
Break on __THE_PROCESS_HAS_FORKED_AND_YOU_CANNOT_USE_THIS_COREFOUNDATION_FUNCTIONALITY___YOU_MUST_EXEC__() to debug.
The process has forked and you cannot use this CoreFoundation functionality safely. You MUST exec().
Break on __THE_PROCESS_HAS_FORKED_AND_YOU_CANNOT_USE_THIS_COREFOUNDATION_FUNCTIONALITY___YOU_MUST_EXEC__() to debug.
The process has forked and you cannot use this CoreFoundation functionality safely. You MUST exec().
Break on __THE_PROCESS_HAS_FORKED_AND_YOU_CANNOT_USE_THIS_COREFOUNDATION_FUNCTIONALITY___YOU_MUST_EXEC__() to debug.
The process has forked and you cannot use this CoreFoundation functionality safely. You MUST exec().
Break on __THE_PROCESS_HAS_FORKED_AND_YOU_CANNOT_USE_THIS_COREFOUNDATION_FUNCTIONALITY___YOU_MUST_EXEC__() to debug.
The process has forked and you cannot use this CoreFoundation functionality safely. You MUST exec().
Break on __THE_PROCESS_HAS_FORKED_AND_YOU_CANNOT_USE_THIS_COREFOUNDATION_FUNCTIONALITY___YOU_MUST_EXEC__() to debug.
The process has forked and you cannot use this CoreFoundation functionality safely. You MUST exec().
[the two lines above repeat many more times for the rest of the output]
I found this tip on the internet:
import multiprocessing as mp
mp.set_start_method('spawn')
but I don't know whether this would actually work here, or whether it is even a good idea.
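For what it's worth, here is roughly how I understand that tip would be applied. This is only a minimal sketch, not the real testscript.py: the __main__ guard, the force=True flag, and the placement of the DeepLabCut import are my own assumptions, and I haven't verified that this is the recommended fix.

```python
# Hypothetical sketch (not the actual testscript.py): set the start method
# inside a __main__ guard before any DeepLabCut function spawns workers.
import multiprocessing as mp

if __name__ == "__main__":
    # "spawn" starts fresh interpreter processes instead of fork()ing,
    # which is what the CoreFoundation fork-safety warning on macOS is about.
    # force=True overrides any start method another library may already have set.
    mp.set_start_method("spawn", force=True)

    import deeplabcut  # imported after the start method is set (assumption)

    # ... run the same project-creation / training / analysis steps
    #     as testscript.py from here ...
```

If this is the wrong way to go about it, please let me know what the intended fix is.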