Closed by GoogleCodeExporter 8 years ago
new server is at 63.247.29.216
Original comment by zack...@gmail.com
on 2 Sep 2014 at 8:23
locate was running very slowly; it turned out that having the locale set to UTF-8 was
the culprit. Moving it back to ASCII/C sped things up 4x.
Original comment by zack...@gmail.com
on 10 Sep 2014 at 10:30
I changed /etc/sysconfig/i18n to "C"
Original comment by zack...@gmail.com
on 10 Sep 2014 at 10:35
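For context, the speedup comes from collation: in the C locale, string comparison is plain byte comparison, while UTF-8 locales apply full locale-aware collation rules. A minimal illustrative example (not from the original report):

```shell
# Under LC_ALL=C, sort compares raw bytes, so 'B' (0x42) comes before
# 'a' (0x61); a UTF-8 locale would apply collation rules instead, which
# is the extra work that slowed locate down.
printf 'a\nB\n' | LC_ALL=C sort
```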
Testing the update of eccpgxt-beta
Original comment by natalia....@gmail.com
on 24 Sep 2014 at 1:56
The md5sums of files located under /storage on the new VM are different from those
under /storage on epifire2, so copying the database from epifire2 to the new VM
won't work.
new VM:
[root@calv-epigenomedev C30HWACXX_2_SHE1303A82]# du /storage/hpcc/uec-gs1/laird/shared/production/ga/flowcells/C30HWACXX/results/C30HWACXX/C30HWACXX_2_SHE1303A82/ | md5sum
ea7e871b230c0114c1c4ca53faae59ea  -
epifire2:
[natalia@epifire2 sequencing_v4]$ du /storage/hpcc/uec-gs1/laird/shared/production/ga/flowcells/C30HWACXX/results/C30HWACXX/C30HWACXX_2_SHE1303A82/ | md5sum
6fa07f0b855169f41acc12b91325e26f  -
I will have to rebuild it from scratch on the new VM.
Original comment by natalia....@gmail.com
on 25 Sep 2014 at 8:05
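One caveat with the comparison above: `du` reports allocated blocks, which can differ between filesystems even when file contents are identical. A hedged sketch of a listing-based comparison instead (the /tmp/demo_a and /tmp/demo_b paths are hypothetical, for illustration only):

```shell
# Compare two trees by hashing their sorted relative file listings
# (names only; hashing each file's contents as well would give a
# stronger check).
mkdir -p /tmp/demo_a /tmp/demo_b
echo data > /tmp/demo_a/f1
echo data > /tmp/demo_b/f1
a=$(cd /tmp/demo_a && find . -type f | sort | md5sum)
b=$(cd /tmp/demo_b && find . -type f | sort | md5sum)
[ "$a" = "$b" ] && echo "listings match"
```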
The parallel rebuild failed with this message:
DBD::mysql::st execute failed: Row size too large (> 8126). Changing some
columns to TEXT or BLOB or using ROW_FORMAT=DYNAMIC or ROW_FORMAT=COMPRESSED
may help. In current row format, BLOB prefix of 768 bytes is stored inline. at
./insertQC.pl line 715
This message never showed up on epifire2. Maybe MySQL on the new machine needs
some tuning?
Original comment by natalia....@gmail.com
on 29 Sep 2014 at 7:13
Try a single build, not parallel, to see exactly where it fails.
-zack (mobile)
Original comment by zack...@gmail.com
on 29 Sep 2014 at 7:15
It fails when calling run_metric_dynamic_crosstab()
Original comment by natalia....@gmail.com
on 29 Sep 2014 at 7:23
Try calling the stored proc from outside of Perl, using the command line. Does
it still fail?
-zack (mobile)
Original comment by zack...@gmail.com
on 29 Sep 2014 at 7:25
Just tried:
mysql> CALL run_metric_dynamic_crosstab();
ERROR 1118 (42000): Row size too large (> 8126). Changing some columns to TEXT
or BLOB or using ROW_FORMAT=DYNAMIC or ROW_FORMAT=COMPRESSED may help. In
current row format, BLOB prefix of 768 bytes is stored inline.
Original comment by natalia....@gmail.com
on 29 Sep 2014 at 7:44
Hmmm, it's probably some setting. Let's look into it and figure out the
difference in behavior.
If the db is copied to epifire2, does the stored procedure work there?
-zack (mobile)
Original comment by zack...@gmail.com
on 29 Sep 2014 at 7:58
I copied the db from new VM to epifire2 and ran run_metric_dynamic_crosstab().
This procedure finished without errors.
Original comment by natalia....@gmail.com
on 29 Sep 2014 at 8:39
So that narrows it down. There must be some limit set in MySQL that needs
adjusting.
Original comment by zack...@gmail.com
on 29 Sep 2014 at 8:41
I searched online, and this type of error is reported for MySQL 5.6 InnoDB
tables, with some solutions provided, but our tables are MyISAM. This error
isn't mentioned anywhere in relation to MyISAM tables.
Original comment by natalia....@gmail.com
on 29 Sep 2014 at 11:36
Did you try the suggestion of setting ROW_FORMAT?
Original comment by zack...@gmail.com
on 1 Oct 2014 at 2:40
It turned out that in MySQL 5.6 the default engine is set to InnoDB, so I
decided to try setting it to MyISAM inside the stored procedure. It worked
but took 27 minutes to finish:
mysql> call run_metric_dynamic_crosstab();
Query OK, 0 rows affected, 20 warnings (27 min 46.32 sec)
I wonder if it's going to run faster if I leave the engine as InnoDB and set
ROW_FORMAT to COMPRESSED.
Original comment by natalia....@gmail.com
on 2 Oct 2014 at 12:05
Great work!
-zack (mobile)
Original comment by zack...@gmail.com
on 2 Oct 2014 at 12:37
I need a password to access the hpcc export dir to delete a junk file,
/export/uec-gs1/laird/shared/production/ga/flowcells/815C5ABXX/run1/results/815C5ABXX/._815C5ABXX_qcmetrics.csv, which was the cause of the test update crash.
The file owner is ramjan, and I don't have enough permissions.
Original comment by natalia....@gmail.com
on 3 Oct 2014 at 7:15
Go to the storage server, su root, then su zack, and you will become me. You can
then access and modify anything that belongs to me.
-zack (mobile)
Original comment by zack...@gmail.com
on 3 Oct 2014 at 7:18
Ok, thanks. That worked
Original comment by natalia....@gmail.com
on 3 Oct 2014 at 7:27
I am wondering why /tmp/geneusDump.xml always has a $ symbol inserted after
the XML header. That makes the update crash every time it attempts to read
from that file. I tried the vi command :%s/$//g, but the symbol is still there
after I do :wq and re-open the file.
Also, if you open that file in vi it has lots of @ symbols inserted between the
XML header and the XML tags:
<?xml version="1.0"?>
@
@
@
....
<flowcells>...
Original comment by natalia....@gmail.com
on 7 Oct 2014 at 10:53
Use cat or less.
vi is telling you that the line is too long to display, so it puts an @ there.
I looked at the file just now and did not see the "$"; maybe you already
cleaned it by hand?
Original comment by zack...@gmail.com
on 7 Oct 2014 at 10:57
Still there:
[root@calv-epigenomedev sequencing_beta]# cat -vet /tmp/geneusDump.xml | less
<?xml version="1.0"?>$
<flowcells><flowcell serial="C562CACXX"
Original comment by natalia....@gmail.com
on 7 Oct 2014 at 11:00
You used:
cat -vet /tmp/geneusDump.xml
From the cat man page (man cat):
-e     equivalent to -vE
-E, --show-ends
       display $ at end of each line
Isn't putting a "$" at the end of each line exactly what you're asking cat to
do with the "-e: display $ at end of each line" flag? It does not mean the
"$" is actually present in the file. "^" and "$" are common notations for start
and end of line, respectively.
Original comment by zack...@gmail.com
on 7 Oct 2014 at 11:05
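The behavior is easy to verify in isolation (illustrative input, not the file in question):

```shell
# -E only changes the display: cat prints a '$' at each end of line,
# but the underlying bytes of the data are unchanged.
printf 'line1\nline2\n' | cat -E
# prints:
# line1$
# line2$
```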
I understand that the command I ran just shows there is a newline after the
header. But having that end of line after the header makes the program crash.
How do I change the Perl code so it doesn't insert the newline after the header?
Original comment by natalia....@gmail.com
on 7 Oct 2014 at 11:09
It's strange that this would be the cause. Nevertheless, it should be easy to
fix, since the file gets parsed in as a single string: just s/\n// before
returning. That should drop the first "\n" it finds.
Original comment by zack...@gmail.com
on 7 Oct 2014 at 11:35
Ok, thanks. I will try that after the current update finishes, hopefully
without crashing.
Original comment by natalia....@gmail.com
on 7 Oct 2014 at 11:39
The update crashed yesterday with the message:
DBD::mysql::st execute failed: Data too long for column 'qc_val' at row 1 at
./insertQC.pl line 386
while calling insertMetric with MergedInputDirs =
/storage/hpcc/uec-gs1/laird/shared/production/ga/flowcells/C0TUPACXX/results/C0TUPACXX/C0TUPACXX_1_NIC1254A46/C0TUPACXX_qcmetrics.csv,
/storage/hpcc/uec-gs1/laird/shared/production/ga/flowcells/D12D6ACXX/results/D12D6ACXX/D12D6ACXX_2_NIC1254A46/D12D6ACXX_qcmetrics.csv,
/storage/hpcc/uec-gs1/laird/shared/production/ga/flowcells/C1FD2ACXX/results/C1FD2ACXX/C1FD2ACXX_8_NIC1254A46/C1FD2ACXX_qcmetrics.csv,
/storage/hpcc/uec-gs1/laird/shared/production/ga/flowcells/D10MEACXX/run2/results/D10MEACXX/D10MEACXX_2_NIC1254A46/D10MEACXX_qcmetrics.csv
I am investigating this issue.
Original comment by natalia....@gmail.com
on 8 Oct 2014 at 7:05
The stored procedure insertMetric crashed because the input parameter qc_val
was declared as varchar(255), which is shorter than the length of the input
string MergedInputDirs (540 characters).
I increased the size to varchar(1000), and the stored procedure worked.
But I don't understand why having qc_val set to varchar(255) didn't cause any
problems on epifire2 (it uses MySQL 5.1).
Original comment by natalia....@gmail.com
on 8 Oct 2014 at 8:58
Actually, varchar(1000) wasn't big enough; it crashed again. I just set it to
mediumtext.
Original comment by natalia....@gmail.com
on 8 Oct 2014 at 10:00
Original comment by zack...@gmail.com
on 12 Mar 2015 at 7:27
Original issue reported on code.google.com by
zack...@gmail.com
on 23 Jul 2014 at 8:46