2024-12-13-Friday
created: 2024-12-13 05:30
tags:
  - daily-notes
Friday, December 13, 2024
<< Timestamps/2024/12-December/2024-12-12-Thursday|Yesterday | Timestamps/2024/12-December/2024-12-14-Saturday|Tomorrow >>
🎯 Goal
- [ ] Create a scheduled CRON Job with a Celery worker so that updates to these BigBrain App notes are periodically refreshed to reflect their current state in my Local Version (see the Celery Beat sketch after this list).
- [ ] Fix the uniqueness constraint on Slug and Vault so that a note with the same Slug can appear in a different Vault.
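For the first goal, here is a minimal sketch of what a Celery Beat schedule driving that refresh could look like. The app name, task name (refresh_notes_from_s3), module path, and hourly schedule are all placeholders I made up for illustration, not the actual BigBrain code:

```python
# Hypothetical sketch; module path, task name, and schedule are assumptions.
from celery import Celery
from celery.schedules import crontab

app = Celery("dimmin")


@app.task
def refresh_notes_from_s3():
    """Placeholder task that would re-sync the BigBrain notes from the S3 Bucket."""
    ...


# Celery Beat acts as the CRON-style scheduler: run the refresh every hour.
# The task name string depends on where the task module actually lives.
app.conf.beat_schedule = {
    "refresh-bigbrain-notes": {
        "task": "tasks.refresh_notes_from_s3",
        "schedule": crontab(minute=0),
    },
}
```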
🌟 Results
- Cooked my Local Version of the DIMMiN App and its Database in the pursuit of glory
🌱 Next Time
- Fix my Local Version so that Database Migrations can be applied and the BigBrain App's Note Django Model enforces its uniqueness at the Slug/Vault level (a sketch of the constraint follows below).
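For reference, a minimal sketch of how the Note Django Model's Meta could scope Slug uniqueness to the Vault instead of globally. The field names are guesses based on the bigbrain_note columns shown in the Notes section below, not the real model definition:

```python
# Hypothetical slice of bigbrain/models.py; field names are assumptions
# taken from the bigbrain_note table columns, not the actual model.
from django.db import models


class Note(models.Model):
    title = models.CharField(max_length=256)
    # No unique=True here, otherwise Django keeps the global
    # bigbrain_note_slug_key constraint that causes the duplicate key errors.
    slug = models.SlugField(max_length=256)
    vault = models.ForeignKey("Vault", on_delete=models.CASCADE)

    class Meta:
        constraints = [
            # Uniqueness is scoped to (vault, slug), matching the
            # unique_slug_per_vault constraint visible in psql.
            models.UniqueConstraint(
                fields=["vault", "slug"], name="unique_slug_per_vault"
            ),
        ]
```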
📝 Notes
I should add the LastModified field upon retrieving the Note from the S3 Bucket for the first time. Then, I can compare that against the easily retrievable LastModified Metadata from the S3 API call to only update files that have actually been modified. This helps solve the efficiency problem I was having earlier (i.e. all files were being compared against creation-date Metadata derived from the local files, so they all thought they needed to be updated). I implemented this change and it worked as expected (nice!)
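A rough sketch of what that comparison looks like, assuming boto3 and a Note model with a last_modified_s3 field (matching the column that shows up in the psql output below); the bucket name and helper function are made up for illustration:

```python
# Hypothetical helper; the bucket name and key layout are placeholders.
import boto3

s3 = boto3.client("s3")
BUCKET = "dimmin-notes"  # placeholder, not the real bucket name


def note_needs_refresh(note, key):
    """Return True if the S3 object is newer than the timestamp stored on the Note."""
    head = s3.head_object(Bucket=BUCKET, Key=key)
    s3_last_modified = head["LastModified"]  # timezone-aware datetime from S3
    # Refresh only when S3 has a newer version than what we stored last time.
    return note.last_modified_s3 is None or s3_last_modified > note.last_modified_s3
```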
Next, there was a weird constraint issue where the Slugs were conflicting across different Vaults. Slugs should be unique, but only within the Vault they belong to. I kept running into errors like:
```
Failed to process misc/vaults/DIMMiN Notes/Data Engineering/ETL.md: duplicate key value violates unique constraint "bigbrain_note_slug_key"
DETAIL:  Key (slug)=(etl) already exists.
```
because I uploaded my WGU notes first and the WGU Vault already has a note (and therefore a Slug) about ETL. Interestingly, I don't actually see the bigbrain_note_slug_key constraint defined anywhere in my code, so this probably has something to do with the Database Migrations. I used PostgreSQL's psql CLI to check how the database tables were actually created. Specifically, I used pgAdmin's psql tool on the dimmin-staging Database to check the configuration of the bigbrain_note table and ran
```
\d bigbrain_note
```
to find the following setup:
Table "public.bigbrain_note"
Column | Type | Collation | Nullable | Default
------------------+--------------------------+-----------+----------+----------------------------------
id | bigint | | not null | generated by default as identity
title | character varying(256) | | not null |
slug | character varying(256) | | not null |
content | text | | not null |
summary | character varying(256) | | |
is_private | boolean | | not null |
created_at | timestamp with time zone | | not null |
updated_at | timestamp with time zone | | not null |
vault_id | bigint | | not null |
relative_path | character varying | | |
last_modified_s3 | timestamp with time zone | | |
Indexes:
"bigbrain_note_pkey" PRIMARY KEY, btree (id)
"bigbrain_no_slug_afe83f_idx" btree (slug)
"bigbrain_no_title_8004a7_idx" btree (title)
"bigbrain_note_slug_2b9eb32e_like" btree (slug varchar_pattern_ops)
"bigbrain_note_slug_key" UNIQUE CONSTRAINT, btree (slug)
"bigbrain_note_vault_id_559649d3" btree (vault_id)
"unique_slug_per_vault" UNIQUE CONSTRAINT, btree (vault_id, slug)
Foreign-key constraints:
"bigbrain_note_vault_id_559649d3_fk_bigbrain_vault_id" FOREIGN KEY (vault_id) REFERENCES bigbrain_vault(id) DEFE
RRABLE INITIALLY DEFERRED
Referenced by:
TABLE "bigbrain_linkednote" CONSTRAINT "bigbrain_linkednote_linked_note_id_142c36ea_fk_bigbrain_note_id" FOREIGN
KEY (linked_note_id) REFERENCES bigbrain_note(id) DEFERRABLE INITIALLY DEFERRED
TABLE "bigbrain_linkednote" CONSTRAINT "bigbrain_linkednote_source_note_id_c02ba045_fk_bigbrain_note_id" FOREIGN
KEY (source_note_id) REFERENCES bigbrain_note(id) DEFERRABLE INITIALLY DEFERRED
TABLE "bigbrain_note_tags" CONSTRAINT "bigbrain_note_tags_note_id_027416c6_fk_bigbrain_note_id" FOREIGN KEY (not
e_id) REFERENCES bigbrain_note(id) DEFERRABLE INITIALLY DEFERRED
The global bigbrain_note_slug_key constraint and the per-Vault unique_slug_per_vault constraint are sitting there side by side, so the old global uniqueness on Slug never actually got dropped. It was completely cooked, so I needed to drop the tables themselves and then apply the Database Migrations again:
```sql
DROP TABLE public.bigbrain_vault CASCADE;
```
However, Django was tracking my migrations and thought that the Database Migrations had already been applied, so I needed to access the database shell with
```bash
python manage.py dbshell
```
Then, in the Database shell I needed to run
```sql
DELETE FROM django_migrations WHERE app='bigbrain';
```
to make Django forget about all the previous migrations. Unfortunately, when I then tried to apply my Database Migrations I was told that the tables already exist (as they do for the other tables / Django Models in the BigBrain App). So this was gonna be a little more tricky.
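As an aside, that django_migrations table is exactly where Django's MigrationRecorder keeps track of what has been applied. A roughly equivalent way to clear the bigbrain rows from a `python manage.py shell` session (instead of raw SQL in dbshell) would be something like this sketch, shown only to illustrate what the DELETE does:

```python
# Roughly equivalent to the raw DELETE above; run inside `python manage.py shell`.
from django.db.migrations.recorder import MigrationRecorder

# Each applied migration is recorded as a row in django_migrations; removing
# the bigbrain rows makes Django think none of those migrations have run.
MigrationRecorder.Migration.objects.filter(app="bigbrain").delete()
```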
Since I pretty much nuked the Local Version of my Database, I captured a backup of my Production database, downloaded it, and restored it into my local dimmin-staging Database with
```bash
heroku pg:backups:capture -a dimmin
heroku pg:backups:download b014 --app dimmin
pg_restore -h "localhost" -p "5432" -U "postgres" -d "dimmin-staging" -v "latest.dump"
```
I'll have to look more into fixing this. I nuked my migrations so I'll need to re-configure my app to look like the app in GitHub.
Notes created today
```dataview
List FROM "" WHERE file.cday = date("2024-12-13") SORT file.ctime asc
```
Notes last touched today
```dataview
List FROM "" WHERE file.mday = date("2024-12-13") SORT file.mtime asc
```