Why? What? How? #1
@des-des could this be useful:
OK, this exists for SQL Server. It finds the difference between two DBs and builds a migration script. This is the kind of thing we are interested in, but for PostgreSQL: https://opendbdiff.codeplex.com/
OK, this is some JS diffing two DBs: https://github.com/gimenete/dbdiff/blob/master/dbdiff.js. Works for PostgreSQL!
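The core of such a diffing tool can be sketched in plain Python, assuming the two schemas have already been introspected into `{table: {column: type}}` dicts (the function name and structure here are hypothetical illustrations, not dbdiff's actual API):

```python
def diff_schemas(old, new):
    """Compare two schemas given as {table: {column: type}} dicts.

    Returns a dict of changes; a real tool (like dbdiff) would
    introspect information_schema and emit ALTER statements instead.
    """
    changes = {"added_tables": [], "dropped_tables": [], "altered": {}}
    for table in new:
        if table not in old:
            changes["added_tables"].append(table)
    for table in old:
        if table not in new:
            changes["dropped_tables"].append(table)
    for table in set(old) & set(new):
        added = {c: t for c, t in new[table].items() if c not in old[table]}
        dropped = [c for c in old[table] if c not in new[table]]
        retyped = {c: (old[table][c], new[table][c])
                   for c in set(old[table]) & set(new[table])
                   if old[table][c] != new[table][c]}
        if added or dropped or retyped:
            changes["altered"][table] = {
                "added_columns": added,
                "dropped_columns": dropped,
                "retyped_columns": retyped,
            }
    return changes
```

Turning that change set into a migration script (the hard part) is what the linked tools attempt.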
What are we proposing to build? The title here is postgres-schema-migration-checker. To me this suggests that we are building a CI tool to check whether a schema migration will work, i.e. given a migration script, can we apply it to the database without breaking anything? This seems different from my impression of our discussion, but makes more sense. Given this outlook, I see the possible steps as:

Attempt to apply the migration; if it works, we are OK to apply the migration to production. To outline what I think @nelsonic was suggesting (the whole migration process):

In this process, I am not sure we can depend on the reliability of transactions happening during the copy in step 8.

What problem are we trying to solve? Oxford Abstracts have many migration scripts. As the number of migration scripts grows, this process is becoming hard to manage. The DB schema becomes harder to reason about, because understanding the current schema means taking the effects of many migration scripts into account. I think another problem here is testing time: building a test DB takes a long time, as all the migration scripts need to be applied.

@Conorc1000 @roryc89 Do you feel my description of your problem is accurate? Is there anything you can add?
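The "attempt to apply" check could be sketched like this, using Python's stdlib sqlite3 as a stand-in for Postgres (in practice you would point something like psycopg2 at a throwaway copy of the production schema; all names here are illustrative):

```python
import sqlite3

def migration_applies_cleanly(schema_sql, migration_sql):
    """Build the current schema in a scratch DB, then try the migration.

    Everything happens in an in-memory database, so nothing real is
    touched; the CI check passes iff the migration executes without error.
    """
    conn = sqlite3.connect(":memory:")
    try:
        conn.executescript(schema_sql)
        conn.executescript(migration_sql)
        return True
    except sqlite3.Error:
        return False
    finally:
        conn.close()
```

A CI job would run this for every migration in a pull request and fail the build on `False`.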
@des-des I think this is a good description of the problem. (Yes, we currently run our migrations each time we deploy to Heroku.)
@des-des I agree that this is an accurate description of our most pressing schema migration issues. Thanks for helping out! :)
@nelsonic It seems to me that normally you would apply a migration script to a DB, rather than build a script that moves data between two DBs with different schemas; I think this might be confusing me. This is also related to my previous question: during the actual deployment step, how can we safely keep the client/server live while the copy is in progress? E.g. if we send a POST to both the live DB and the DB being created, can we be sure that the data will not get added twice? @iteles Do you have the photos?
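One common answer to the double-write question (an illustration of a general technique, not something we have agreed on) is to attach a client-generated unique request id to every write, so replaying the same POST against both DBs is idempotent. Sketched with sqlite3; the table and column names are made up:

```python
import sqlite3

def idempotent_insert(conn, request_id, name):
    """Insert a row keyed by a client-generated request id.

    If the same POST is replayed against this DB (e.g. during a
    live copy), the UNIQUE constraint makes the second write a no-op.
    """
    conn.execute(
        "INSERT OR IGNORE INTO signups (request_id, name) VALUES (?, ?)",
        (request_id, name),
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE signups (request_id TEXT UNIQUE, name TEXT)")
idempotent_insert(conn, "req-42", "des-des")
idempotent_insert(conn, "req-42", "des-des")  # replayed write is ignored
```

The Postgres equivalent would be `INSERT ... ON CONFLICT DO NOTHING` against a unique index.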
@des-des So sorry, I remember uploading them and checking the preview to make sure they were there but must have forgotten to hit 'Update comment' after that 😭 I've updated the top comment with them now. Apologies again. |
Hi @des-des, yes, "normally" a migration would be applied to the DB. More "advanced" or "mature" schema migrators like Active Record will attempt to do this for you. Our idea was to investigate the
If we need to clarify the requirements further, I'm happy to do so.
@nelsonic Awesome, that makes sense! Will try to put something together soon!
OK, so I see there being three parts here.

1: Managing the application of migration scripts in production. This is the focus of most of the tools @nelsonic has linked to. The main idea here is to ensure migration scripts are only applied once; the target DB will have a table of previously applied migrations.

2: Automatic creation of migration scripts. Given a pull request with a change of schema, can we automate the creation of a migration script?
and
are the same

3: Testing that a schema migration does not lose data. To me, although it may be tangential to the problem at hand, this project needs a way of being confident that migrations are not losing data before it leaves beta. This is the problem Nelson's drawings are describing.
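The apply-once idea in part 1 could be sketched like this, again with sqlite3 standing in for Postgres (the `schema_migrations` table name and the runner are assumptions for illustration, not any particular tool's design):

```python
import sqlite3

def run_migrations(conn, migrations):
    """Apply (name, sql) migrations in order, skipping any already run.

    The schema_migrations table records what has been applied, so
    re-running the whole list against the same DB is safe.
    """
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_migrations (name TEXT PRIMARY KEY)"
    )
    applied = {row[0] for row in
               conn.execute("SELECT name FROM schema_migrations")}
    for name, sql in migrations:
        if name in applied:
            continue
        conn.executescript(sql)
        conn.execute("INSERT INTO schema_migrations (name) VALUES (?)", (name,))
    conn.commit()
```

This is essentially what the production-focused tools (and, as it turns out later in this thread, OA's own solution) do.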
@des-des Yes, this would be a good approach. 👍
@nelsonic Could you be more specific?
@des-des The steps you have described don't appear logical to me. Migration scripts would be automatically created on the developer's machine by the script. As for having access to production data, we can simulate valid records for both schemas based on the data types of the columns.
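Simulating valid records from column types alone, as suggested above, might look like this (the type mapping and generators are assumptions for a sketch, not a complete treatment of Postgres types):

```python
import random
import string

def fake_value(col_type):
    """Generate a plausible value for a Postgres-ish column type."""
    if col_type in ("integer", "bigint", "serial"):
        return random.randint(0, 1_000_000)
    if col_type in ("text", "varchar"):
        return "".join(random.choices(string.ascii_lowercase, k=8))
    if col_type == "boolean":
        return random.choice([True, False])
    raise ValueError(f"no generator for type {col_type!r}")

def fake_rows(columns, n):
    """Build n fake records for a {column: type} schema."""
    return [{col: fake_value(t) for col, t in columns.items()}
            for _ in range(n)]
```

Insert such rows into the old schema, run the migration, and check they all survive into the new one.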
Sorry to have left this so long. FAC1N happened and kinda took over for a couple of weeks, then updating back here got lost in my todo list. About a month ago I spoke to the OA team; I'll quickly summarise the outcome of that conversation. OA have actually gone ahead and solved a part of their problem: their DB now holds a table of run migrations. This means that for any instance of their DB, they can make sure migrations are run once, in order. Anyway, I think we need to step back a little and think about this problem in the context of dwyl's new stack.
@roryc89 @Conorc1000 @naazy Is this still required for your project? Postgres is still part of our stack, so this functionality may still be required. |
@roryc89 @Conorc1000 @naazy and if the answer is yes, can you be explicit about what is needed? |
Is it possible to write a Python script to migrate production DB data to a staging DB for PostgreSQL?
@khairnarTK It's definitely possible; Django has migrations: https://docs.djangoproject.com/en/2.2/topics/migrations/
@iteles Please add the photos you took of the sketches from today into this description of the issue (or send them to me!) thanks! ⭐️