Guorun Wang

I’m Guorun. I am an MRes student at Imperial College London, supervised by Professor Lucia Specia. I work on debiasing multimodal models. I graduated in Data Science and Big Data Technology from the Computer Science department at Tongji University.

Email  /  CV  /  Bio  /  Google Scholar  /  Twitter  /  GitHub

profile photo
Research

I’m interested in multimodal learning, NLP, and computer vision. I also have experience in 3D vision and efficient NLP.

I was a research assistant in the Cognitive and Intelligent Computing Lab, Tongji University, working on multimodal image captioning and 2D classification with Professor Yaoru Sun.

I was a research assistant in H2lab, University of Washington, working on parameter-efficient NLP with Dr. Qingqing Cao (postdoc) and Professor Hannaneh Hajishirzi.

I was a research intern at contex.ai, working on 3D NeRFs and scene understanding with Professor Lucia Specia and Dr. Viktoriia Sharmanska.

I was a research assistant with Professor Kuo-Yi Lin, working on the robustness of human body reconstruction and segmentation methods.

Publications

Deep Pixel-Wise Textures for Construction Waste Classification
Jun Yang, Guorun Wang, Yaoru Sun, Lizhi Bai, Bohan Yang
IEEE Transactions on Automation Science and Engineering, 2023
IEEE

Task-oriented Memory-efficient Pruning-Adapter
Guorun Wang; Jun Yang; Yaoru Sun
arXiv, 2023
arXiv

Pixel Difference Convolutional Network for RGB-D Semantic Segmentation
Jun Yang, Lizhi Bai, Yaoru Sun, Chunqi Tian, Maoyu Mao, Guorun Wang
IEEE Transactions on Circuits and Systems for Video Technology, 2023
arXiv

What are You Posing: A gesture description dataset based on coarse-grained semantics
Luchun Chen, Guorun Wang, Yaoru Sun, Rui Pang, Chengzhi Zhang
ICNLP, 2023

U2 Net-Plus and background removal-based PIFu-HD: human body reconstruction in complex background
Guorun Wang, Xudong Liu, Kuo-Yi Lin, Fuhjiun Hwang
International Journal of Internet Manufacturing and Services, 2022
website

Friends who have helped me a lot!
Lirong Yao
Gilgamesh

Website template adapted from Jon Barron, many thanks!