BASNet, 🤗 Transformers-ready ver.

Open In Colab

#
# Get appropriate device
#
import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

#
# Load the model and the processor
#
from transformers import AutoImageProcessor, AutoModel

repo_id = "creative-graphic-design/BASNet"

processor = AutoImageProcessor.from_pretrained(
    repo_id,
    trust_remote_code=True,
)
model = AutoModel.from_pretrained(
    repo_id,
    trust_remote_code=True,
)

#
# Download an image
#
import requests
from PIL import Image

image = Image.open(
    requests.get(
        "https://raw.githubusercontent.com/xuebinqin/BASNet/master/test_data/test_images/0003.jpg",
        stream=True,
    ).raw
)

#
# Preprocess the image
#
width, height = image.size
inputs = processor(images=image)

#
# Move the model and the inputs to the appropriate device
#
model = model.to(device)
inputs = {k: v.to(device) for k, v in inputs.items()}

#
# Run the model
#
with torch.no_grad():
    outputs = model(**inputs)
prediction = outputs[0][0]
assert list(prediction.shape) == [1, 1, 256, 256]

#
# Postprocess the prediction
#
image = processor.postprocess(prediction, width=width, height=height)
image  # Now you can visualize the output image
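
If processor.postprocess returns a single-channel PIL image at the original resolution (as the comment above suggests), the saliency map can be used directly as an alpha matte. The snippet below is a small, hypothetical follow-up that cuts the salient object out of the photo; the names original, background, and cutout are introduced here for illustration only.

#
# (Optional) Use the predicted saliency map as an alpha matte
# Assumes `image` now holds the single-channel map returned by postprocess,
# and reuses the requests/PIL imports from above
#
original = Image.open(
    requests.get(
        "https://raw.githubusercontent.com/xuebinqin/BASNet/master/test_data/test_images/0003.jpg",
        stream=True,
    ).raw
).convert("RGB")

mask = image.convert("L")  # saliency probabilities as an 8-bit matte
background = Image.new("RGB", original.size, "white")
cutout = Image.composite(original, background, mask)
cutout.save("0003_cutout.png")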

BASNet (New Version May 2nd, 2021)

'Boundary-Aware Segmentation Network for Mobile and Web Applications', Xuebin Qin, Deng-Ping Fan, Chenyang Huang, Cyril Diagne, Zichen Zhang, Adria Cabeza Sant’Anna, Albert Suarez, Martin Jagersand, and Ling Shao.

Salient Object Detection (SOD) Qualitative Comparison

SOD Qualitative Comparison

Salient Objects in Clutter (SOC) Qualitative Comparison

SOC Qualitative Comparison

Camouflaged Object Detection (COD) Qualitative Comparison

COD Qualitative Comparison

Predicted maps of SOD, SOC and COD datasets

SOD Results will come soon!
SOC Results will come soon!
COD Results

BASNet (CVPR 2019)

Code for the CVPR 2019 paper 'BASNet: Boundary-Aware Salient Object Detection', Xuebin Qin, Zichen Zhang, Chenyang Huang, Chao Gao, Masood Dehghan and Martin Jagersand.

Contact: xuebin[at]ualberta[dot]ca

(2020-May-09) NEWS! Our new Salient Object Detection model (U^2-Net), which has just been accepted by Pattern Recognition, is now available!

U^2-Net: Going Deeper with Nested U-Structure for Salient Object Detection

Evaluation

Evaluation Code

Required libraries

Python 3.6
numpy 1.15.2
scikit-image 0.14.0
PIL 5.2.0
PyTorch 0.4.0
torchvision 0.2.1
glob

The SSIM loss is adapted from pytorch-ssim.
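
BASNet is trained with a hybrid loss that sums BCE, SSIM and IoU terms. The sketch below illustrates that combination, assuming the pytorch_ssim package referenced above; the soft-IoU term is written inline for illustration and is not the repository's own IoU module.

#
# Illustrative hybrid loss: BCE + (1 - SSIM) + soft IoU
#
import torch.nn as nn
import pytorch_ssim

bce_loss = nn.BCELoss()
ssim_loss = pytorch_ssim.SSIM(window_size=11, size_average=True)

def soft_iou_loss(pred, target, eps=1e-6):
    # pred and target are probability maps in [0, 1], shape (B, 1, H, W)
    inter = (pred * target).sum(dim=(2, 3))
    union = pred.sum(dim=(2, 3)) + target.sum(dim=(2, 3)) - inter
    return 1.0 - ((inter + eps) / (union + eps)).mean()

def hybrid_loss(pred, target):
    # SSIM is a similarity measure, so 1 - SSIM acts as the loss term
    return bce_loss(pred, target) + (1.0 - ssim_loss(pred, target)) + soft_iou_loss(pred, target)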

Usage

  1. Clone this repo:
     git clone https://github.com/NathanUA/BASNet.git

  2. Download the pre-trained model basnet.pth from GoogleDrive or Baidu (extraction code: 6phq) and put it into the directory 'saved_models/basnet_bsi/'.

  3. cd into the directory 'BASNet' and run training or inference with python basnet_train.py or python basnet_test.py, respectively; a minimal inference sketch follows below.
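
The steps above run the stock scripts; the sketch below shows the same inference path as a minimal standalone script. It assumes the BASNet class exported by this repository's model package, the checkpoint location from step 2, and 256x256 preprocessing with ImageNet statistics approximating what basnet_test.py does; treat it as an illustration rather than a replacement for the script.

#
# Minimal inference sketch (assumes the repo layout and checkpoint from step 2)
#
import torch
from PIL import Image
from torchvision import transforms
from model import BASNet  # provided by this repository

net = BASNet(3, 1)  # 3 input channels, 1-channel saliency output
net.load_state_dict(torch.load("saved_models/basnet_bsi/basnet.pth", map_location="cpu"))
net.eval()

preprocess = transforms.Compose([
    transforms.Resize((256, 256)),
    transforms.ToTensor(),
    # ImageNet statistics, approximating the test script's preprocessing
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

img = Image.open("test_data/test_images/0003.jpg").convert("RGB")
x = preprocess(img).unsqueeze(0)  # (1, 3, 256, 256)

with torch.no_grad():
    d1, *_ = net(x)  # the first output is the refined saliency map

pred = d1[:, 0, :, :]
pred = (pred - pred.min()) / (pred.max() - pred.min() + 1e-8)  # min-max normalize
mask = transforms.ToPILImage()(pred.cpu()).resize(img.size)
mask.save("0003_saliency.png")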

We also provide the predicted saliency maps (GoogleDrive, Baidu) for the SOD, ECSSD, DUT-OMRON, PASCAL-S, HKU-IS and DUTS-TE datasets.

Architecture

BASNet architecture
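
As described in the paper, BASNet follows a predict-refine design: a densely supervised encoder-decoder produces a coarse saliency map, and a residual refinement module learns a residual that is added back to sharpen boundaries. The module names in the schematic below are illustrative, not the classes used in model/BASNet.py.

#
# Schematic of the predict-refine data flow (illustrative names)
#
import torch.nn as nn

class PredictRefineSketch(nn.Module):
    def __init__(self, predict_module: nn.Module, refine_module: nn.Module):
        super().__init__()
        self.predict_module = predict_module  # encoder-decoder, coarse saliency map
        self.refine_module = refine_module    # residual refinement module (RRM)

    def forward(self, x):
        coarse = self.predict_module(x)                 # coarse probability map
        refined = coarse + self.refine_module(coarse)   # add learned residual
        return refined, coarse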

Quantitative Comparison

Quantitative Comparison

Qualitative Comparison

Qualitative Comparison

Citation

@article{DBLP:journals/corr/abs-2101-04704,
  author    = {Xuebin Qin and
               Deng{-}Ping Fan and
               Chenyang Huang and
               Cyril Diagne and
               Zichen Zhang and
               Adri{\`{a}} Cabeza Sant'Anna and
               Albert Su{\`{a}}rez and
               Martin J{\"{a}}gersand and
               Ling Shao},
  title     = {Boundary-Aware Segmentation Network for Mobile and Web Applications},
  journal   = {CoRR},
  volume    = {abs/2101.04704},
  year      = {2021},
  url       = {https://arxiv.org/abs/2101.04704},
  archivePrefix = {arXiv},
  eprint    = {2101.04704},
}

Citation

@InProceedings{Qin_2019_CVPR,
  author    = {Qin, Xuebin and Zhang, Zichen and Huang, Chenyang and Gao, Chao and Dehghan, Masood and Jagersand, Martin},
  title     = {BASNet: Boundary-Aware Salient Object Detection},
  booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
  month     = {June},
  year      = {2019}
}
