The model was merged using a combination of checkpoint merges, block merges, and CLIP/UNet swapping.
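For reference, CLIP/UNet swapping here just means copying one component's weights from a donor checkpoint into a base checkpoint. A minimal sketch, assuming the standard SD 1.x (ldm) key prefixes ("cond_stage_model." for the CLIP text encoder, "model.diffusion_model." for the UNet) and placeholder file names:

```python
# Swap the CLIP text encoder of base_model.ckpt for the one in donor_model.ckpt.
# Key prefixes assume the standard SD 1.x (ldm) checkpoint layout.
import torch

base = torch.load("base_model.ckpt", map_location="cpu")["state_dict"]
donor = torch.load("donor_model.ckpt", map_location="cpu")["state_dict"]

for key in list(base):
    if key.startswith("cond_stage_model."):  # use "model.diffusion_model." to swap the UNet instead
        base[key] = donor[key]

torch.save({"state_dict": base}, "swapped_model.ckpt")
```

Block merging works the same way in spirit, except individual UNet blocks get their own interpolation weights instead of a whole-component swap.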
The VAE is baked into the model, so a separate VAE is not strictly required.
I still recommend running an external VAE alongside this model, since baked-in VAE injections can carry insufficient data.
Link for the VAE I use - https://huggingface.co/stabilityai/sd-vae-ft-mse-original/tree/main
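If you run the model through diffusers rather than a webui, the external VAE can be attached at load time. This is just a sketch: "stabilityai/sd-vae-ft-mse" is the diffusers-format counterpart of the repo linked above, and "merged_model.safetensors" stands in for this checkpoint.

```python
import torch
from diffusers import AutoencoderKL, StableDiffusionPipeline

# Load the external ft-mse VAE (diffusers-format repo for the linked weights).
vae = AutoencoderKL.from_pretrained("stabilityai/sd-vae-ft-mse", torch_dtype=torch.float16)

# "merged_model.safetensors" is a placeholder path for this merged checkpoint.
pipe = StableDiffusionPipeline.from_single_file(
    "merged_model.safetensors", torch_dtype=torch.float16
)
pipe.vae = vae  # override the baked-in VAE with the external one
pipe.to("cuda")

image = pipe("portrait photo, detailed face, soft lighting").images[0]
image.save("sample.png")
```

In the AUTOMATIC1111 webui, the equivalent is dropping the VAE file into models/VAE and selecting it in the SD VAE setting.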
Models used in merge:
Note: The merging method I used was unconventional, so I can't provide an exact recipe. One tip for getting something similar: make an "ema only" inference model from an existing, popular Dreambooth model (I recommend checking that it doesn't have broken tensors first). For example: f222-pruned-emaonly. Put that model in the Tertiary position of the checkpoint merger when using "Add Difference". I wouldn't use this for every merge, only when it's appropriate; based on my F222 example, that would be when the mixed model you're merging is known to contain F222 weights.
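To make that tip more concrete, below is a rough sketch of the "ema only" extraction plus the broken-tensor check. It assumes the CompVis LitEma naming convention for EMA keys (the live key with "model." stripped and the remaining dots removed, prefixed with "model_ema."), and f222.ckpt is a placeholder input; treat it as an illustration rather than an exact recipe.

```python
# Build an "ema only" checkpoint: EMA copies of the UNet weights replace the
# live training weights, and every tensor is checked for NaN/Inf first.
import torch

ckpt = torch.load("f222.ckpt", map_location="cpu")
sd = ckpt["state_dict"]

# 1) Sanity check: flag broken tensors before using this model in a merge.
for name, tensor in sd.items():
    if torch.is_tensor(tensor) and not torch.isfinite(tensor).all():
        print(f"broken tensor: {name}")

# 2) Overwrite each UNet weight with its EMA counterpart, then drop the EMA copies.
pruned = {}
for key, tensor in sd.items():
    if key.startswith("model_ema."):
        continue  # EMA buffers are folded in below, not kept as separate keys
    if key.startswith("model."):
        ema_key = "model_ema." + key[len("model."):].replace(".", "")
        tensor = sd.get(ema_key, tensor)  # fall back to the live weight if no EMA copy exists
    pruned[key] = tensor

torch.save({"state_dict": pruned}, "f222-pruned-emaonly.ckpt")
```

With that file in the Tertiary slot, "Add Difference" computes primary + (secondary - tertiary) * multiplier, so the ema-only F222 effectively subtracts the shared F222 base out of the secondary model before the difference is added to your primary.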