Closed: liberatr closed this issue 5 months ago
This is a great idea.
The main issue is that this project reuses a lot from Backup and Migrate, and the forms and controls in that module are pretty complicated. One option would be to copy the UI instead of trying to use the code directly; another would be to get someone who knows the Backup and Migrate code better than I do.
Perhaps this should be a different issue, but @quicksketch and I contemplated the idea of using a subset backup of just the System and Files tables, instead of making a direct connection to the source database. You can do this by hand with generic, out-of-the-box Backup and Migrate, if you're careful. (Or we could automate it with some programming.)
d2b_migrate would then import that backup using a database prefix, creating (something like) d2b_system and d2b_files in the existing Backdrop database.
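To make the prefix idea concrete, here is a minimal sketch of rewriting table names in a dump before import. This is only an illustration in Python (the real code would be PHP in the module); the mapping of file_managed to d2b_files, the backtick-quoted identifiers, and the function name are all my assumptions.

```python
import re

# Hypothetical mapping: source table name -> prefixed name in the Backdrop DB.
# "d2b_system" and "d2b_files" are the names floated in the comment above.
TABLE_MAP = {"system": "d2b_system", "file_managed": "d2b_files"}

def prefix_tables(sql):
    """Rewrite backtick-quoted table names so the dump imports under a prefix.

    Matching only backtick-quoted identifiers avoids accidentally rewriting
    string data inside INSERT values.
    """
    pattern = re.compile(r"`(system|file_managed)`")
    return pattern.sub(lambda m: "`%s`" % TABLE_MAP[m.group(1)], sql)

line = "INSERT INTO `system` VALUES ('modules/node/node.module');"
renamed = prefix_tables(line)
```

The same substitution would apply to the CREATE TABLE, DROP TABLE, and LOCK TABLES statements in the dump, so the whole file could be streamed through it line by line.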
Yep, agree with @cellear.
In the current code, I believe only two queries are made against the source database:

db_query('SELECT * FROM {system}') (for module analysis)
db_query('SELECT uri FROM {file_managed}') (for migrating files)

So the challenge here is starting with a full database backup and importing only those two tables for the purpose of analysis.
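As a rough sketch of how that trimming could work, the dump could be filtered down to just those two tables before import. This Python sketch assumes the file uses mysqldump's standard "-- Table structure for table `name`" section headers; the function name and overall approach are hypothetical, and the real code would be PHP in the module.

```python
import re

# Tables needed for analysis, per the two queries quoted above.
WANTED = {"system", "file_managed"}

# mysqldump groups each table under a "-- Table structure for table `name`" header.
TABLE_HEADER = re.compile(r"-- Table structure for table `([^`]+)`")

def trim_dump(lines, wanted=WANTED):
    """Yield only the dump lines that belong to the wanted tables."""
    keep = True  # keep the global header (SET statements, etc.) before the first table
    for line in lines:
        m = TABLE_HEADER.search(line)
        if m:
            keep = m.group(1) in wanted
        if keep:
            yield line

# Tiny illustrative dump (not a real site's data).
dump = """\
SET NAMES utf8;
-- Table structure for table `node`
CREATE TABLE `node` (nid int);
INSERT INTO `node` VALUES (1);
-- Table structure for table `system`
CREATE TABLE `system` (filename varchar(255));
INSERT INTO `system` VALUES ('modules/node/node.module');
""".splitlines(keepends=True)

trimmed = "".join(trim_dump(dump))
```

Streaming line by line like this keeps memory flat even for large dumps, which matters if the backup includes big content tables.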
Just a thought on a potential solution: the backup_migrate_filters_restore() function that is called to restore databases runs operations on the database backup file before the import is executed. We could use that to trim the file down to only the tables needed for analysis(?)
It's not quite ideal; it would be preferable to filter when the queries are executed rather than parsing the flat file.
EDIT: I don't think this idea will work :laughing: There are per-table prefixes, but not per-table database engines.
If the db file is uploaded into the manual backups directory, it should be possible to read it as a file. If we could replace "db_query('SELECT * FROM {system}') (for module analysis)" with code that reads the list of modules from lines like
"INSERT INTO system VALUES ('modules/" and
"INSERT INTO system VALUES ('sites/all/modules/contrib/"
it might allow getting the same information from the uploaded file.
@quicksketch @liberatr @docwilmot , what do you think about this approach?
I had a very good brainstorming session with our developer today, and we are proposing the following flow. In this flow, the "Database backup" step becomes part of step one, which streamlines the process for the end user. We are working to figure out whether it is possible to bring the system table into the Backdrop database as D7_system and analyze it locally. We also need an option to upload a files archive in addition to downloading files via URL.
Why would we require the user to have the admin login for their D7 site? Couldn't we ask them to set the admin credentials for the new (upgraded) site during this process?
I realize that's an extra step if you do know the D7 credentials, but it's a quick and easy one. If they want the admin credentials to remain unchanged, they can simply set them to what they were.
Usually only admin users are involved in a website migration. During migration, users from D7 are imported with their username, email, and password. After the update process is completed, it is important to log in, as some settings might need adjustment. It is possible to recover the admin password via email, but that requires access to the email account.
Setting a new password after migration is completed would require additional coding and is not easy.
I think I've found a way to proceed from an SQL file upload. It will take a bit of time to work this out.
@docwilmot, thank you so much for your huge continuing effort on this module! Let me know if we can do a working session together so our team can support your solution.
New branch db-upload allows uploading a file. Work in progress.
@docwilmot , thank you SO MUCH! I will begin working with this branch!
Hi @docwilmot, I am testing the file upload. On my local MAMP it timed out; I will check the configuration again. On a Pantheon instance I am getting the following message:
File upload error. Could not move uploaded file.
The specified file temporary://backup_migrate_dIHzwT could not be moved/copied because the destination
directory is not properly configured. This may be caused by a problem with file or directory permissions.
More information is available in the system log.
In the logs:
Warning: file_put_contents(private://backup_migrate/manual/.info): failed to open stream: "BackdropPrivateStreamWrapper::stream_open" call failed in d2b_migrate_source_database_form_validate() (line 332 of /code/modules/d2b_migrate/d2b_migrate.module).
Thanks for testing. The latest might help, though I can't really test the Pantheon filesystem.
The code changes for this issue have been tested on several sites and work great. @docwilmot, I think it would be great to do the next release.
Similar to Backup and Migrate, allow a user to upload a MySQL Dump as well as connect to an existing database.