
Conversation

@Haoming02
Contributor

Description

  • Simple Description: When you download or train a model to experiment with, the Webui caches its hash. Over time, you end up with hundreds, if not thousands, of hashes for models that no longer exist. Therefore, I added a button to clean this up, which can reduce file size by a tiny amount, and improve load time by an even tinier amount...

  • Summary of Changes:

    1. Add a Prune all unused hash button in Settings/Actions which, when clicked, calls:
    2. a new prune_unused_hash function in cache.py (see the wiring sketch below)
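
A rough sketch of how the button could be wired up (an illustration only; the actual layout in modules/ui_settings.py and the exact handler signature may differ; prune_unused_hash is the new function this PR adds to modules/cache.py):

import gradio as gr

from modules import cache

with gr.Blocks() as settings_actions:
    with gr.Row():
        # new button shown under Settings/Actions
        prune_hashes = gr.Button(value='Prune all unused hash')

    # run the pruning synchronously when clicked; it takes no inputs and returns nothing
    prune_hashes.click(fn=cache.prune_unused_hash, inputs=[], outputs=[])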

Checklist


Performance

Checking hundreds of entries took less than a second~
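
For context, the cache keys being checked have the form 'checkpoint/<filename>', 'lora/<filename>', and so on, and the file_exists helper used in the diff below is not part of this excerpt. A minimal version might look like this (an assumption, not necessarily the PR's actual implementation):

import os

def file_exists(directory: str, filename: str) -> bool:
    # assumed helper: report whether the cached model file is still present on disk
    # the real helper may also search subfolders, since models are often nested
    return os.path.isfile(os.path.join(directory, filename))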

Comment on lines +146 to +168
for db in ('hashes', 'hashes-addnet', 'safetensors-metadata'):
    existing_cache = cache(db)
    total_count = len(existing_cache)
    with tqdm.tqdm(total=total_count, desc=f'pruning {db}') as progress:
        # iterate over a copy of the keys so entries can be deleted while looping
        for name in list(existing_cache):
            if '/' not in name:
                progress.update(1)
                continue

            category, filename = name.split('/', 1)
            if category.lower() == 'lora':
                exists = file_exists(os.path.join(models_path, 'Lora'), filename)
            elif category.lower() == 'checkpoint':
                exists = file_exists(os.path.join(models_path, 'Stable-diffusion'), filename)
            elif category.lower() == 'textual_inversion':
                exists = file_exists(cmd_opts.embeddings_dir, filename)
            else:
                progress.update(1)
                continue

            if not exists:
                del existing_cache[name]
            progress.update(1)
Collaborator

@w-e-w w-e-w Dec 17, 2024

this is wrong

  • completely forgot about hypernet

  • you are assuming the user is using the default directories, which may not be the case; the path handling for each model directory is a complicated mess, and some models, such as Lora, can come from multiple directories

  • extensions could also add their own cache entries, and I don't think there's a way to avoid accidentally removing those entries

as you said

which can reduce file size by a tiny amount, and improve load time by an even tinier amount...

so personally I wouldn't bother making it work, not worth the effort

@Haoming02
Contributor Author

Who even uses Hypernetworks anyway

But yeah, not really worth it to deal with the mess that is the current model system...

@Haoming02 Haoming02 closed this Dec 18, 2024
@Haoming02 Haoming02 deleted the prune-btn branch December 18, 2024 01:41