As @jimtyhurst asked in PR47, do we need to run `migrate` and `import` every time the Django app starts?
Jim says, "Each time the image starts, it runs the migrations and imports the data as specified in docker-entrypoint.sh, right? I don't think it should do those things. In general, the application should be configured to access an existing database, so why should we run the migrations and import the data every time the web app starts?"
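If we do keep these steps in the image, one option is to make them opt-in rather than unconditional, so the container defaults to simply running the app against an existing database. Here is a minimal sketch of a guarded entrypoint, assuming the current docker-entrypoint.sh calls `manage.py migrate` plus a data-import command; the `RUN_MIGRATIONS`/`RUN_IMPORT` environment variable names are hypothetical, not something the project defines today:

```sh
#!/bin/sh
# Sketch only: gate the startup steps behind environment variables so the
# default behavior is "connect to an existing database and start the app".
# RUN_MIGRATIONS and RUN_IMPORT are hypothetical names, and the import
# command should match whatever docker-entrypoint.sh actually calls.
set -e

if [ "${RUN_MIGRATIONS:-false}" = "true" ]; then
    python manage.py migrate --noinput
fi

if [ "${RUN_IMPORT:-false}" = "true" ]; then
    python manage.py import    # substitute the project's real import command
fi

# Hand off to whatever command the container was started with
# (e.g. the Django dev server or gunicorn).
exec "$@"
```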
Does having these operations performed automatically each time harm the database in any way?
If not at application startup, how should we handle the following: creating a local PostgreSQL instance in the future, creating a Production instance of the database (when Hack Oregon lands a production version of the Winter 2017 season's apps), and keeping the database synchronized with the latest accepted changes to the application?
As to the former question, from what I see in PgAdmin at the moment, the tables appear to contain only as many rows as we have in the source CSVs, so the repeated imports don't seem to be duplicating data. At least for now, that gives us room to work out a more deterministic approach.
As to the latter, I can imagine at least these four scenarios we might have to deal with:
- as a Budget team developer, I want to work from a local PostgreSQL installation (1) to reduce the lag time for each query and (2) to protect other developers from unfinished and unvetted changes I'm experimenting with in any API development work I do.
- as a Budget team developer, I want to use an automated script to create the Production version of the Budget database.
- as a Budget team developer, I want to use an automated script to perform database additions (migrations/schema additions and data additions/imports).
- as a Budget team developer, I want to use an automated script to perform necessary data changes such as migrations/schema alterations (not additions but changes to existing schema objects) and data transformations (changes to existing data); a rough sketch of such a script follows below.
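For the automated-script scenarios, one way to keep provisioning and updates out of application startup is a small standalone script that a developer (or a deploy job) runs deliberately against a chosen database. This is only a sketch under assumptions: the script name is hypothetical, it presumes the database connection comes from the app's existing settings/environment, and the import command is a placeholder for whatever the project actually uses.

```sh
#!/bin/sh
# update_db.sh (hypothetical name): run on demand against a local or
# Production database instead of on every container start.
set -e

# Django's migrate applies both schema additions and alterations to existing
# schema objects, in order, and is a no-op when nothing is pending.
python manage.py migrate --noinput

# Data additions/imports and transformations as an explicit, separate step;
# replace with the project's real import or data-migration command.
python manage.py import
```

Because `migrate` is a no-op when there is nothing pending, the same script could cover both initial creation of a database's schema and later synchronization with accepted changes.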