2024-12-16-Monday


created: 2024-12-16 06:29
tags:
  - daily-notes


Monday, December 16, 2024

<< Timestamps/2024/12-December/2024-12-15-Sunday|Yesterday | Timestamps/2024/12-December/2024-12-17-Tuesday|Tomorrow >>


🎯 Goal


🌟 Results


🌱 Next Time

  • Start working on a devlog page where I can publish my notes from my WGU MSDADS program as well as my DIMMiN App development (hello!). Maybe place this under the Blog App section or create a new area for exploring my local notes publicly. This will probably be covered in more detail in BASB.

📝 Notes

So last Friday I tried to find a way to re-establish the Vault table in the database. There's an error where notes with the same Slug can't be loaded into different vaults because of a uniqueness constraint enforced early on in the Database Migrations. I started by accessing my database via

python manage.py dbshell

I could then list all my different tables via

\dt

Which includes the following tables from the BigBrain App:

bigbrain_idea
bigbrain_idea_tags
bigbrain_linkednote
bigbrain_note
bigbrain_note_tags
bigbrain_project
bigbrain_project_tags
bigbrain_tag
bigbrain_vault

The trick here is that I can save the data from my other tables, then re-load them back into the database, but it seems sketchy. If I had more data here (or other users were using the BigBrain App) then I would want to figure out a safer way to do this. For now I'll just take the blunt approach.
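If I ever do need to preserve the data, Django's built-in fixture commands would be the obvious route; something like this (a sketch, I didn't actually run these this time around):

python manage.py dumpdata bigbrain --output bigbrain_backup.json
python manage.py loaddata bigbrain_backup.json

(with the loaddata step run only after the tables have been rebuilt).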

First I needed to drop all tables related to bigbrain:

DROP TABLE bigbrain_idea CASCADE;
DROP TABLE bigbrain_idea_tags CASCADE;
DROP TABLE bigbrain_linkednote CASCADE;
DROP TABLE bigbrain_note CASCADE;
DROP TABLE bigbrain_note_tags CASCADE;
DROP TABLE bigbrain_project CASCADE;
DROP TABLE bigbrain_project_tags CASCADE;
DROP TABLE bigbrain_tag CASCADE;
DROP TABLE bigbrain_vault CASCADE;
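(In hindsight, a less manual alternative would have been letting Django unwind the app's schema itself, assuming the migration history still matched the database:

python manage.py migrate bigbrain zero

but dropping the tables directly got the job done.)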

Then I was able to successfully apply Database Migrations. However, I ran into a tricky problem when I ran my custom Django Management Command: uniqueness of the slug was still being enforced across the entire app rather than per vault.
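For context, the offending field on my Note Django Model looked roughly like this (a reconstruction; the surrounding field names are assumptions):

from django.db import models

class Note(models.Model):
    vault = models.ForeignKey('Vault', on_delete=models.CASCADE)  # assumed relation to the Vault model
    title = models.CharField(max_length=255)  # assumed
    slug = models.SlugField(unique=True)  # the culprit: globally unique across ALL vaults

The fix is to drop unique=True and enforce uniqueness per vault instead, e.g. unique_together = ('vault', 'slug') in the model's Meta, which lines up with the alter_note_unique_together migrations that show up below.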

Once I removed the unique=True definition from the slug field on my Note Django Model, I was finally able to get it to work! This was such a simple adjustment that I think I may actually want to reset to my original migrations to avoid having to drop any tables in Production. I'm going to try to reset my codebase using Git:

git reset --hard

Which brought me back to my original migrations. Interestingly, when I removed the unique=True constraint from the Note model again after the Git Reset and re-ran makemigrations, I got the following message:

CommandError: Conflicting migrations detected; multiple leaf nodes in the migration graph: (0003_alter_note_slug, 0013_alter_note_created_at_alter_note_updated_at in bigbrain).
To fix them run 'python manage.py makemigrations --merge'

Which is exactly the type of message I was hoping for: Django now surfaces the two conflicting migration branches (the old unique-slug migrations versus my new ones) instead of silently enforcing the stale constraint. So I ran:

python manage.py makemigrations --merge
python manage.py makemigrations
python manage.py migrate
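For reference, the merge migration that --merge generates is just an empty migration declaring both leaf nodes as parents; 0014_merge_20241216_0705 presumably looked something like this (reconstructed, not copied from the repo):

from django.db import migrations

class Migration(migrations.Migration):

    dependencies = [
        ('bigbrain', '0003_alter_note_slug'),
        ('bigbrain', '0013_alter_note_created_at_alter_note_updated_at'),
    ]

    operations = []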

However, this gave an error. To investigate, I ran the command:

python manage.py showmigrations

to get an explicit look at my Database Migrations, which showed the following:

bigbrain
 [X] 0001_initial
 [ ] 0002_rename_description_idea_details_and_more   
 [ ] 0003_rename_working_title_idea_optional_working_title
 [ ] 0004_tag_alter_project_end_date_alter_project_results_and_more
 [ ] 0005_vault_note_linkednote_and_more
 [ ] 0006_alter_note_tags
 [ ] 0007_rename_project_description_project_description
 [ ] 0008_alter_note_tags
 [ ] 0009_alter_note_unique_together
 [ ] 0010_alter_note_title
 [ ] 0011_vault_slug
 [ ] 0012_note_relative_path
 [ ] 0013_alter_note_created_at_alter_note_updated_at
 [X] 0002_migrate_data
 [X] 0003_alter_note_slug
 [ ] 0014_merge_20241216_0705
 [ ] 0015_alter_note_unique_together_note_last_modified_s3_and_more

I decided to reset the Local Version of my PostgreSQL database so that I could try to re-apply the migrations on a database that reflects the Production database instead. The reset itself was a simple drop-and-recreate of the local database (roughly this, going from memory):
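dropdb -h localhost -p 5432 -U postgres dimmin-staging
createdb -h localhost -p 5432 -U postgres dimmin-staging

Then I restored the latest Heroku dump into the fresh database: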

pg_restore -h "localhost" -p "5432" -U "postgres" -d "dimmin-staging" -v "latest.dump"

which worked! This likely didn't work before because I had reset the database without the prior Database Migrations in place (i.e., I had dropped the tables, whoops). With that sorted, I was able to push my changes to Production and successfully apply them there:

heroku run bash -a dimmin
python manage.py migrate

Now I have a working version of my local notes hosted on the DIMMiN App! Thankfully I didn't have to go around dropping Production tables like ChatGPT initially suggested; it's important to keep in mind the limitations of current LLMs.

The next step is to establish a CRON Job with a Celery Worker to periodically update the vaults. I did this by creating a tasks.py file for the BigBrain App that executes the Django Management Command via the built-in django.core.management.call_command function:

from celery import shared_task
from django.core.management import call_command
import logging

logger = logging.getLogger(__name__)

@shared_task
def run_import_obsidian_vault_from_s3(vault_name):
    """
    Executes the 'import_obsidian_vault_from_s3' management command with the specified vault name.
    """
    try:
        logger.info(f"Starting management command: import_obsidian_vault_from_s3 with vault-name={vault_name}")
        call_command('import_obsidian_vault_from_s3', '--vault-name', vault_name)
        logger.info(f"Successfully completed: import_obsidian_vault_from_s3 with vault-name={vault_name}")
    except Exception as e:
        logger.error(f"Error running management command: import_obsidian_vault_from_s3 with vault-name={vault_name} - {e}")
        raise
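As a quick sanity check, the task can also be fired ad hoc from a Django shell (assuming the Celery worker and broker are running):

from apps.bigbrain.tasks import run_import_obsidian_vault_from_s3

run_import_obsidian_vault_from_s3.delay('WGU MSDADS')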

I then added this code to my Celery Worker's schedule by updating the CELERY_BEAT_SCHEDULE configuration:

from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'reset_tasks_at_midnight': {
        'task': 'taskmaster.tasks.scheduled_reset_task_status',
        'schedule': crontab(minute=0),  # Check at the start of every hour
    },
    'update_wgu_msdads_notes': {
        'task': 'apps.bigbrain.tasks.run_import_obsidian_vault_from_s3',
        'schedule': crontab(hour=12, minute=0),  # Runs daily at noon (without minute=0 it would fire every minute of hour 12)
        'args': ['WGU MSDADS'],  # Pass vault_name as a positional argument
    },
    'update_dimmin_dev_notes': {
        'task': 'apps.bigbrain.tasks.run_import_obsidian_vault_from_s3',
        'schedule': crontab(hour=12, minute=0),  # Runs daily at noon
        'args': ['DIMMiN Notes'],  # Pass vault_name as a positional argument
    },
}

Which should execute an update of the current notes in S3 daily at noon. To confirm the job was actually being scheduled, I temporarily set the WGU notes update to run every 30 seconds (more on that below). I was able to validate it by checking

heroku logs --tail

to see the Celery Worker's output:

2024-12-16T15:50:15.064268+00:00 app[worker.1]: [2024-12-16 15:50:15,064: INFO/ForkPoolWorker-7] Starting management command: import_obsidian_vault_from_s3 with vault-name=WGU MSDADS
2024-12-16T15:50:15.081129+00:00 app[worker.1]: [2024-12-16 15:50:15,080: INFO/ForkPoolWorker-7] Found credentials in environment variables.
2024-12-16T15:50:15.203406+00:00 app[worker.1]: [2024-12-16 15:50:15,203: WARNING/ForkPoolWorker-7] Vault 'WGU MSDADS' already exists. Setting it as the current vault.
2024-12-16T15:50:16.960822+00:00 app[worker.1]: [2024-12-16 15:50:16,960: WARNING/ForkPoolWorker-7]       
2024-12-16T15:50:16.960834+00:00 app[worker.1]: Import complete!
2024-12-16T15:50:16.960835+00:00 app[worker.1]: Updated: 0
2024-12-16T15:50:16.960835+00:00 app[worker.1]: Unchanged: 731
2024-12-16T15:50:16.960835+00:00 app[worker.1]: Failed: 0
2024-12-16T15:50:16.960836+00:00 app[worker.1]:      
2024-12-16T15:50:16.962206+00:00 app[worker.1]: [2024-12-16 15:50:16,962: INFO/ForkPoolWorker-7] Successfully completed: import_obsidian_vault_from_s3 with vault-name=WGU MSDADS
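For the record, the 30-second test was just a matter of temporarily swapping the crontab for a timedelta in the beat schedule; roughly (a sketch, not the exact diff):

from datetime import timedelta

# Temporary override: fire every 30 seconds while testing the scheduling
CELERY_BEAT_SCHEDULE['update_wgu_msdads_notes']['schedule'] = timedelta(seconds=30)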

Then I changed the scheduling of this job back to daily at noon. This could also be adjusted so that a single scheduled task receives a collection of all the vaults I want to update, but for now that's not really necessary; a sketch of that variant is below.
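If I ever want it, the loop version might look something like this (a sketch; the run_import_obsidian_vaults_from_s3 name is made up):

from celery import shared_task
from django.core.management import call_command

@shared_task
def run_import_obsidian_vaults_from_s3(vault_names):
    """Run the S3 import for each vault in the list, one after another."""
    for vault_name in vault_names:
        call_command('import_obsidian_vault_from_s3', '--vault-name', vault_name)

with a single CELERY_BEAT_SCHEDULE entry passing 'args': [['WGU MSDADS', 'DIMMiN Notes']].

I uploaded my most recent snapshot of my vaults to S3 with the command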

python manage.py upload_obsidian_vault_to_s3

and will wait and see whether these are uploaded by the time I start developing tomorrow. All that's left to do now is to display these daily notes in the app.


Notes created today

List FROM "" WHERE file.cday = date("2024-12-16") SORT file.ctime asc

Notes last touched today

List FROM "" WHERE file.mday = date("2024-12-16") SORT file.mtime asc

(Template referenced from Dann Berg, can be found here)


Previous Note 2024-12-13-Friday Next Note 2024-12-17-Tuesday