SD15HEDControlNetModel
HED soft-edge-conditioned ControlNet pipeline built on Stable Diffusion 1.5.
Takes an input image and a text prompt. Soft edge maps are extracted from
the image using the Holistically-Nested Edge Detection (HED) detector from
lllyasviel/Annotators, then fed as spatial conditioning into the
lllyasviel/sd-controlnet-hed ControlNet backbone together with the
runwayml/stable-diffusion-v1-5 diffusion pipeline. HED produces
sketch-like contours that preserve structural outlines while allowing more
creative variation than hard-edge Canny maps, making this model well-suited
for artistic reinterpretation of existing images.
Requires the controlnet_aux package (pip install controlnet_aux).
References
- [1] Zhang, Rao & Agrawala, "Adding Conditional Control to Text-to-Image Diffusion Models", ICCV 2023. https://arxiv.org/abs/2302.05543
- [2] https://huggingface.co/lllyasviel/sd-controlnet-hed
Parameters
- num_inference_steps : integer, default=20 - Number of denoising steps. Typical range: 20-30 for fast results, 40-50 for higher quality.
- controlnet_conditioning_scale : number, default=1.0 - Weight of the ControlNet soft-edge conditioning (range 0.0-2.0). HED produces soft, sketch-like edge maps that are less strict than Canny. At 1.0 the output closely follows the input edges. Lower values give more creative freedom.
- guidance_scale : number, default=7.5 - Classifier-Free Guidance (CFG) scale. Controls prompt adherence. Values 7-9 are typical for SD 1.5.
- device : string, default=CPU - Hardware device for inference. GPU is strongly recommended. CPU inference is possible but very slow.
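The description and parameters above can be assembled directly with the diffusers and controlnet_aux libraries. The following is a minimal sketch, not this model's actual implementation: the `run_hed_controlnet` helper name and its lazy imports are this sketch's own choices, while the model IDs and parameter defaults are taken from this page.

```python
def run_hed_controlnet(image, prompt,
                       num_inference_steps=20,
                       controlnet_conditioning_scale=1.0,
                       guidance_scale=7.5,
                       device="cpu"):
    """Sketch of the HED -> ControlNet -> SD 1.5 flow described above."""
    # Heavy dependencies are imported lazily so the helper can be defined
    # without diffusers/controlnet_aux installed (hypothetical design choice).
    from controlnet_aux import HEDdetector
    from diffusers import ControlNetModel, StableDiffusionControlNetPipeline

    # 1. Extract a soft, sketch-like edge map with the HED annotator.
    hed = HEDdetector.from_pretrained("lllyasviel/Annotators")
    edge_map = hed(image)

    # 2. Load the HED ControlNet and plug it into the SD 1.5 pipeline.
    controlnet = ControlNetModel.from_pretrained("lllyasviel/sd-controlnet-hed")
    pipe = StableDiffusionControlNetPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", controlnet=controlnet
    ).to(device)

    # 3. Denoise, conditioned jointly on the prompt and the edge map.
    result = pipe(
        prompt,
        image=edge_map,
        num_inference_steps=num_inference_steps,
        controlnet_conditioning_scale=controlnet_conditioning_scale,
        guidance_scale=guidance_scale,
    )
    return result.images
```

A call such as `run_hed_controlnet(Image.open("photo.png"), "watercolor painting")` would then return a list of generated PIL images, mirroring the (image, prompt) input and list output of `generate` below.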
Methods
generate(self, input: Tuple[Image.Image, str]) -> List[Any]
Generate output from the generative model.
Parameters
- input : Tuple[Image.Image, str]
- Input image and text prompt.
Returns
- List[Any]
- Generated output images in a list.
get_schema(cls) -> dict
Generates the component's JSON Schema. (Inherited from ConfigObject.)
Returns
- dict
- Dictionary representing the JSON Schema of the component.
validate_and_transform(self, raw_data: dict) -> dict
Takes the user-provided initialization data and returns it augmented with all the objects the model needs to run. (Inherited from ConfigObject.)
Parameters
- raw_data : dict
- A dictionary with the data provided by the user to initialize the model.
Returns
- dict
- A validated dictionary with the necessary objects.