mbcann01 opened this issue 10 months ago
@edambo I think you will take a first pass at this, correct?
I'm not sure I understand @mbcann01. Are you asking if I worked on these already? I think I did, unless you noticed something I missed.
Hi @edambo! I haven't checked yet. We just talked about it on Monday, so I didn't assume that you'd already done it. If so, great!
@mbcann01 Ah, yes. I think I did this already if you mean deleting the files.
@edambo What about the other tasks listed above? Did you happen to do either of them?
@mbcann01 Yes, I think they were done before the holidays. I changed the style as you requested, but it's possible I missed something. Let me know if this is the case.
@edambo, I started looking through the code. In data_01_aps_investigations_import.qmd, line 26, the code to import the data looks like this:
aps_inv <- read_csv("../data/filemaker_pro_exports/aps_investigations_import.csv")
However, that produces an error on my computer because that is not the file path to the data on my machine. Of course, I can just change the file path in the code to match the file path on my computer, but I want it to run on your computer too. What is the path to this file on your computer?
That is very odd because this path works on my system. I'm not sure why it's not working. I just ran it and there were no errors.
Ebie
I haven't changed that specifically, but I've made other changes to the data management files. I was waiting to finish working on the codebook files to pull everything together but I will go ahead and send a pull request now.
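For what it's worth, a project-root-relative path usually resolves this kind of machine-to-machine breakage. A minimal sketch using the `here` package (assuming the CSV lives under data/filemaker_pro_exports/ at the project root):

```r
# install.packages(c("here", "readr"))  # one-time setup, if needed
library(here)
library(readr)

# here() builds the path from the project root (the folder containing the
# .Rproj or .here file), so the same line works on every clone of the repo,
# regardless of which directory the .qmd file is rendered from.
aps_inv <- read_csv(
  here("data", "filemaker_pro_exports", "aps_investigations_import.csv")
)
```

This avoids the ".." relative path, which depends on where the rendering process happens to start.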
2024-03-15
Left off at: data_02.
Review check_consenting_participants.qmd to see if it needs to be a separate file or if it can be combined with data_02. Left off on line 22 -- creating a more efficient way to read in the files.

Copy and paste for commits:
Brad's review of data_02_consent_import.qmd
Part of #33
- Used the `here` package to facilitate file import and export.
- Made headings more consistent.
- Checked for overlap with qaqc/check_consenting_participants.qmd. There was nothing in the QAQC file that wasn't also in the data import file. Deleted the QAQC file.
- Added two carriage returns before level one headings.
2024-03-21
Left off at: data_02.
Review check_consenting_participants.qmd to see if it needs to be a separate file or if it can be combined with data_02. Left off on line 115 -- trying to recreate Ebie's results.

Copy and paste for commits:
Brad's review of check_consenting_participants.qmd
Part of #33
- Used the `here` package to facilitate file import and export.
- Made headings more consistent.
- Checked for overlap with qaqc/check_consenting_participants.qmd. There was nothing in the QAQC file that wasn't also in the data import file. Deleted the QAQC file.
- Added two carriage returns before level one headings.
- Simplified some of the code.
2024-03-22
Left off at: data_02.
Review check_consenting_participants.qmd to see if it needs to be a separate file or if it can be combined with data_02. Left off on line 118 -- trying to recreate Ebie's results.

Copy and paste for commits:
Brad's review of check_consenting_participants.qmd
Part of #33
- Used the `here` package to facilitate file import and export.
- Made headings more consistent.
- Checked for overlap with qaqc/check_consenting_participants.qmd. There was nothing in the QAQC file that wasn't also in the data import file. Deleted the QAQC file.
- Added two carriage returns before level one headings.
- Simplified some of the code.
2024-04-04
Left off at: data_02.
Review check_consenting_participants.qmd to see if it needs to be a separate file or if it can be combined with data_02. Left off on line 118 -- trying to recreate Ebie's results. There is a row with missing data in the self_report_import.rds data frame. I need to remove that row and then come back to check_consenting_participants.qmd.

Reviewing data_06_self_report_import.qmd

Copy and paste for commits:
Brad's review of data_06_self_report_import.qmd
Part of #33
- Used the `here` package to facilitate file import and export.
- Made headings more consistent.
- Checked for overlap with qaqc/data_01_self_report_recode_factors.Rmd. There was nothing in the QAQC file that wasn't also in the data import file. Deleted the QAQC file.
- Added two carriage returns before level one headings.
- Dropped row with missing data.
2024-04-05
Reviewing data_06_self_report_import.qmd. data_01_self_report_recode_factors.Rmd seems more accurate so far.

Note: I got an error saying NAs were introduced by coercion. It had to do with the way "Don't know" was written. Just run unique(self_rep$neglect_go_help), then copy and paste the value it prints. It won't look any different to your eye, but it should fix the problem.
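One likely culprit (an assumption, since the exact string isn't shown in this log) is a typographic apostrophe in "Don't know" that doesn't match the ASCII apostrophe typed in the recoding code. The two render nearly identically but are different characters, which is why copying and pasting the printed value fixes the match:

```r
# These two strings look the same on screen but differ in the apostrophe:
# U+2019 (right single quotation mark) vs. U+0027 (ASCII apostrophe).
curly    <- "Don\u2019t know"
straight <- "Don't know"

identical(curly, straight)   # FALSE
utf8ToInt("\u2019")          # 8217
utf8ToInt("'")               # 39
```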
When you are done reviewing data_06_self_report_import.qmd, change the coding for all "Yes/No" columns from "1/2" to "1/0" in data_01 and data_02. Then, go back to reviewing check_consenting_participants.qmd.
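The 1/2-to-1/0 change can be sketched with dplyr; the data frame and column names below are made up for illustration, and the raw coding is assumed to be 1 = Yes, 2 = No:

```r
library(dplyr)

# Toy stand-in for the imported data; column names are hypothetical.
# Assumed raw coding: 1 = Yes, 2 = No.
self_rep <- tibble(
  neglect_go_help = c(1, 2, 2, 1),
  neglect_food    = c(2, 2, 1, 1)
)

# Recode every Yes/No column from 1/2 to 1/0 (2 becomes 0, 1 stays 1).
self_rep <- self_rep |>
  mutate(across(everything(), ~ if_else(.x == 2, 0, .x)))
```

In the real files, `everything()` would be replaced with a selection of just the Yes/No columns.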
2024-04-09
Reviewing data_06_self_report_import.qmd, comparing it with qaqc/data_01_self_report_recode_factors.Rmd. That file recoded columns more accurately in some cases.

When you are done reviewing data_06_self_report_import.qmd, change the coding for all "Yes/No" columns from "1/2" to "1/0" in data_01 and data_02. Then, go back to reviewing check_consenting_participants.qmd.
2024-04-10, 2024-04-11
Reviewing data_06_self_report_import.qmd, comparing it with qaqc/data_01_self_report_recode_factors.Rmd. That file recoded columns more accurately in some cases.

When you are done reviewing data_06_self_report_import.qmd, change the coding for all "Yes/No" columns from "1/2" to "1/0" in data_01 and data_02. Then, go back to reviewing check_consenting_participants.qmd.

readr package.

Copy and paste for commits:
Brad's review of data_01_aps_investigations_import.qmd
Part of #33
- Started using the functions in recoding_factoring_relocating.R and nums_to_na.R to clean and transform categorical variables.
- Changed coding for all "Yes/No" columns from "1/2" to "1/0".
- Spot-checked the factor code.
- Used the `here` package to facilitate file import and export.
- Made headings more consistent.
- Checked for overlap with qaqc/data_01_aps_recode_factors.Rmd. After a review, I concluded that we are safe to delete the QAQC file.
2024-04-11
Worked through data_06_self_report_import.qmd, data_01_aps_investigations_import.qmd, and data_02_consent_import.qmd. Next: check_consenting_participants.qmd.
2024-04-16
Reviewing check_consenting_participants.qmd. check_consenting_participants.qmd will no longer return the same results. For example, the MedStar ID ending in "...ff587" should not have been included in aps_investigations_import.rds, so we went back to data_01_aps_investigations_import.qmd and removed it. Now, when we run the code below to look for MedStar IDs that appear in the APS Investigations data but not the consent data, "...ff587" will no longer appear. Therefore, we are primarily keeping this file around as a record of what we did rather than something we need to continue doing. Having said that, additional rows could need to be removed in the future if people get on FM Pro and start clicking things. That will sometimes cause FM Pro to automatically generate values (e.g., name) in the survey data.

Next: data_03_clutter_scale_import.qmd
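The referenced code isn't captured in this log, but a check like the one described is typically an anti-join on the ID column. A sketch, where the data frames and the medstar_id column name are assumptions:

```r
library(dplyr)

# Hypothetical stand-ins for aps_investigations_import.rds and the
# consent data; the ID values are made up.
aps_inv <- tibble(medstar_id = c("a1", "b2", "c3"))
consent <- tibble(medstar_id = c("a1", "c3"))

# MedStar IDs present in the APS Investigations data but absent from the
# consent data -- each hit should be explained or removed upstream.
aps_inv |> anti_join(consent, by = "medstar_id")
```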
2024-04-26
Reviewing data_03_clutter_scale_import.qmd
Copy and paste for commits:
Brad's review of data_03_clutter_scale_import.qmd
Part of #33
- Started using the functions in recoding_factoring_relocating.R and nums_to_na.R to clean and transform categorical variables.
- Changed coding for all "Yes/No" columns from "1/2" to "1/0".
- Spot-checked the factor code.
- Used the `here` package to facilitate file import and export.
- Made headings more consistent.
- Checked for overlap with qaqc/data_01_clutter_recode_factors.Rmd. After a review, I concluded that we are safe to delete the QAQC file.
Overview
Several GRAs have worked hard to create an analysis data frame from the separate files exported by FileMaker Pro as part of the DETECT follow-up interviews. I need to review them all for correctness and stylistic consistency. Additionally, I need to create instructions for using and updating these files in the future.
Links

Tasks

- Use the `here` package to facilitate file import and export.