
feature-preserve-portrait-editing

This is the code for Learning Feature-Preserving Portrait Editing from Generated Pairs.

Setup

We recommend using Miniconda to set up an environment:

conda create --name mcdm python=3.8.5

conda activate mcdm

Install the required packages:

pip install -r requirements.txt 
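After installation, a quick sanity check can confirm that key packages import cleanly. The package list below is an assumption (diffusers is mentioned under Acknowledgement; torch is a typical dependency) — requirements.txt is the authoritative list:

```shell
# Sanity-check the environment: try to import each expected package.
# Package names are assumptions; see requirements.txt for the real list.
check_import() {
  python -c "import $1" 2>/dev/null && echo "$1 OK" || echo "$1 MISSING"
}

for pkg in torch diffusers; do
  check_import "$pkg"
done
```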

Demo on a Single Image

First, download the pretrained model weights for the two tasks (outfit editing and cartoon expression editing) from HuggingFace and put them in the root folder.

We provide examples for the outfit editing task. To apply all four supported outfit editing effects, simply run the following command. The results will be saved in ./demo_results.

bash demo.sh 

Alternatively, you can run the following command to apply a specific edit:

python demo.py --model_dir portrait_editing_models/outfit/checkpoint-200000 --image_path ./data/outfit/test/image1.jpg --prompt "a man, cute flower costume"   

Model Training

We have provided a small sample dataset for outfit editing, located in the ./data directory. This dataset is intended for reference purposes.

To train the model with your own dataset, ensure that your dataset follows the same structure as the provided sample dataset.
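As an illustration only, a custom dataset might be laid out like this — every name below except ./data/outfit and the test image is an assumption, and the sample dataset shipped in ./data is the authoritative reference:

```
data/
└── outfit/
    ├── train/            # hypothetical split name; mirror the shipped sample
    │   ├── image1.jpg
    │   └── ...
    └── test/
        └── image1.jpg    # the image used in the demo command above
```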

To start training, run the following command:

bash train.sh

Acknowledgement

This codebase is adapted from diffusers and DreamPose.

License

Copyright 2024 Bytedance Ltd. and/or its affiliates

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

Citation

@article{chen2024learning,
title={Learning Feature-Preserving Portrait Editing from Generated Pairs},
author={Chen, Bowei and Zhi, Tiancheng and Zhu, Peihao and Sang, Shen and Liu, Jing and Luo, Linjie},
journal={arXiv preprint arXiv:2407.20455},
year={2024}
}