I always felt it was a shame that the Silicon models never got their due. They're some of the best merges I've ever seen, and they've been used in a fair number of merges on Civitai. As far as I can tell, though, they've languished in obscurity: I've never seen them mentioned on the SD subreddit or the Civitai server. I wanted to share them with this community to bring more attention to them and their creators.
A note on the sample images: the images on the Hugging Face page don't include replication parameters, so I used the prompts from the duo's Medium article (alongside basic quality tags). I don't know whether their recommended parameters are ideal, but I stuck with them out of respect. The sample images also used the baked-in VAE to show what's possible from the baseline model.
Information from the Hugging Face page:
A series of general-purpose models based on the experimental automerger autoMBW.
A collaborative creation of Xerxemi#6423 & Xynon#7407.
All models listed have a baked-in WD1.3 VAE. However, for this model series, an external VAE is also recommended.
This model is open access and available to all, with a CreativeML OpenRAIL-M license further specifying rights and usage. The CreativeML OpenRAIL License specifies:
You can't use the model to deliberately produce or share illegal or harmful outputs or content.
The authors claim no rights over the outputs you generate; you are free to use them, but you are accountable for their use, which must not go against the provisions set in the license.
You may redistribute the weights and use the model commercially and/or as a service. If you do, be aware that you must include the same use restrictions as those in the license and share a copy of the CreativeML OpenRAIL-M with all your users (please read the license entirely and carefully). Read the full license here: https://huggingface.co/spaces/CompVis/stable-diffusion-license
Clearly indicate where modifications have been made.
If you used it for merging, please state what steps you took to do so.
Silicon28: a.k.a. extestg4. The first autoMBW model to match or surpass the quality of manual merge block weight merges.
Silicon29: a.k.a. extesto4. A similar but much larger list of merges, based on the list from Silicon28. The first good model constructed on a semi-stabilized autoMBW codebase.
Silicon28-negzero: a.k.a. extestg4-negzero. A negatively finetuned version of Silicon28, trained for 10 epochs on a dataset of 3,990 images. Better at some things, worse at others.
Silicon29-dark: a.k.a. extesto4-dark. Silicon29 merged with noise offset; gives darker output than the original base.
Recommended generation settings:
Sampler: DPM++ 2M
Steps: 42 + 42 | can probably go lower; I just run at this
Upscaler: Latent (bicubic antialiased)
Denoising: ~0.5 to ~0.6
CFG: 13
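For convenience, the settings above can be collected into a single structure, e.g. to drive a generation script. This is just an illustrative sketch: the key names are my own (loosely webui-style), not an official config format.

```python
# The recommended Silicon29 settings from above, gathered into one place.
# Key names are my own invention, not an official format.
silicon_settings = {
    "sampler": "DPM++ 2M",
    "steps": 42,                   # base pass
    "hires_steps": 42,             # second (hires) pass, i.e. "42 + 42"
    "upscaler": "Latent (bicubic antialiased)",
    "denoising_strength": 0.55,    # recommended range is roughly 0.5 to 0.6
    "cfg_scale": 13,
}

# Sanity check: denoising sits inside the recommended range.
assert 0.5 <= silicon_settings["denoising_strength"] <= 0.6
```

A dictionary like this can be unpacked into whatever txt2img call your tooling exposes, while keeping the recommended values in one auditable place.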
More comparisons here: https://medium.com/@media_97267/the-automated-stable-diffusion-checkpoint-merger-autombw-44f8dfd38871
Note: all comparison photos are pure Silicon29 with the latent bicubic antialiased upscaler.
Q: Why the name "Silicon"?
A: Silicon's atomic number is 14. This line of models was originally supposed to be the 14th experimental model in Xynon/models, a.k.a. experimental14a/b/c.
Q: Where can I find autoMBW?
A: https://github.com/Xerxemi/sdweb-auto-MBW | preliminary article here: https://medium.com/@media_97267/the-automated-stable-diffusion-checkpoint-merger-autombw-44f8dfd38871