ACTION-Net (CVPR 2021)
- The paper reports test accuracy on three datasets: Something-Something V2, Jester, and EgoGesture. Its data loading relies on .pkl files constructed by the authors.
- To train and test on UCF101 or HMDB51, there are two options:
(1) rewrite the data-loading code
(2) construct .pkl files for the UCF101 and HMDB51 datasets
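For completeness, option (2) could look roughly like the sketch below. The exact schema of the authors' .pkl files is not documented here, so this is a hypothetical packing of the standard rawframe split files into `(video_name, frame_count, label)` tuples; the real ACTION-Net loaders may expect a different structure.

```python
import os
import pickle

def build_pkl(annot_path, mode, out_file):
    """Hypothetical sketch: pack a rawframe split file into a pickled
    list of (video_name, frame_count, label) tuples.
    The actual ACTION-Net .pkl schema may differ from this."""
    txt_file = os.path.join(annot_path, 'ucf101_{}_split_1_rgb.txt'.format(mode))
    entries = []
    with open(txt_file, 'r') as f:
        for line in f:
            line = line.strip()
            if not line:
                continue  # skip blank lines
            name, frame_num, label = line.split(' ')
            entries.append((name, int(frame_num), int(label)))
    with open(out_file, 'wb') as f:
        pickle.dump(entries, f)
    return entries
```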
This post presents the implementation steps and code for option (1), using UCF101 as the example:
```python
import os

root = './datasets/UCF101_org/'
annot_path = 'ucf101_splits_101'
rawframe_path = '/data1/han_wu/datasets/UCF101_org/ucf101_rawframes'

def load_video(annot_path, mode):
    # Each line of the split file is "video_name frame_num label"
    txt_file = os.path.join(annot_path, 'ucf101_{}_split_1_rgb.txt'.format(mode))
    video_names = []
    frame_nums = []
    labels = []
    with open(txt_file, 'r') as f:  # 'with' closes the file; the original left it open
        for content in f:
            # strip() removes the trailing newline from every field;
            # the original strip('') stripped nothing, leaving '\n' attached
            name, frame_num, label = content.strip().split(' ')
            video_names.append(name)
            frame_nums.append(int(frame_num))  # cast to int for downstream use
            labels.append(int(label))
    # Collect the sorted frame paths for every video
    rgb_samples = []
    for video_name in video_names:
        video_path = os.path.join(rawframe_path, video_name)
        rgb_list = []
        for num in os.listdir(video_path):
            frame_path = os.path.join(video_path, num)
            rgb_list.append(frame_path)
        rgb_samples.append(sorted(rgb_list))
    print('{}: {} videos have been loaded'.format(mode, len(rgb_samples)))
    return rgb_samples, labels
```
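Once `load_video` returns the per-video frame lists, a loader typically samples a fixed-length clip from each video. The sketch below shows one common strategy, uniform temporal sampling with looping for short videos; the function name and strategy are illustrative and not necessarily the paper's exact sampler.

```python
def sample_indices(num_frames, clip_len):
    """Uniformly sample clip_len frame indices from [0, num_frames).
    Illustrative only; ACTION-Net's own sampler may differ."""
    if num_frames >= clip_len:
        # Take the midpoint of each of clip_len equal segments
        step = num_frames / clip_len
        return [int(step * i + step / 2) for i in range(clip_len)]
    # Video is shorter than the clip: loop its frames
    return [i % num_frames for i in range(clip_len)]
```

Applying `sample_indices(len(rgb_samples[i]), 8)` to each entry of `rgb_samples` would then yield the 8 frame paths fed to the network for video `i`.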