I am testing your model and the results look pretty weird: some values are greater than 1, so when multiplied by 112 they fall outside the range [0, 112].
Here is my code:
import sys

import numpy as np
import torch
from PIL import Image
from torch.utils.data import DataLoader
from torchvision import transforms as tvTransforms

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
# Make the PFLD source code importable
source_code = '../input/d/hariwh0/facial-landmark'
sys.path.append(source_code)
from dataset.datasets import WLFWDatasets
from models.pfld import PFLDInference

# Load the pretrained PFLD backbone from the checkpoint
model_path = source_code + '/checkpoint/snapshot/checkpoint.pth.tar'
checkpoint = torch.load(model_path, map_location=device)
backbone = PFLDInference().to(device)
backbone.load_state_dict(checkpoint['pfld_backbone'])
# Preprocess the face image into a 1 x C x 112 x 112 float tensor
transform = tvTransforms.Compose([tvTransforms.ToTensor()])
face_path = '../input/vn-celeb/VN-celeb/1019/11.png'
face = Image.open(face_path)
face = face.resize((112, 112))
face_arr = np.array(face.getdata()).T.reshape((112, 112, -1))
face_arr = face_arr / 255  # scale pixel values to [0, 1]
face_arr = transform(face_arr).unsqueeze(0)
face_arr = face_arr.type(torch.float)
face_arr = face_arr.to(device)
# Run inference and take the predicted landmarks
_, landmarks = backbone(face_arr)
landmarks = landmarks.detach().cpu().numpy()  # move to CPU before converting to numpy
landmarks = landmarks.reshape(landmarks.shape[0], -1, 2)
# Scale the normalized landmarks up to 112 x 112 image coordinates
face_landmarks = landmarks[0] * [112, 112]
face_landmarks = face_landmarks.astype(int).tolist()
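For context, here is a quick check of the raw output range before scaling (a minimal sketch reusing the landmarks array from above; as I understand it, the model is supposed to output normalized coordinates in [0, 1]):

raw = landmarks[0]
# Print the raw coordinate range and count how many values fall outside [0, 1]
print("min:", raw.min(), "max:", raw.max())
print("out of range:", np.sum((raw < 0) | (raw > 1)), "of", raw.size)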
You can see in the result that many values are larger than 112. Please show me where I am going wrong; thank you in advance!
You can find the image in folder 1019, picture 11.png, at https://www.kaggle.com/hariwu1995/vn-celeb/
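For comparison, here is a simpler preprocessing path that I would expect to be equivalent. This is only a sketch under the assumption that the model wants a 1 x 3 x 112 x 112 float tensor scaled to [0, 1]; it lets ToTensor() handle the scaling and channel reordering instead of the manual getdata()/transpose/reshape and the division by 255:

from PIL import Image
from torchvision import transforms as tvTransforms

transform = tvTransforms.Compose([tvTransforms.ToTensor()])
face = Image.open(face_path).convert('RGB').resize((112, 112))
# ToTensor() turns HWC uint8 PIL data into a CHW float tensor in [0, 1]
face_tensor = transform(face).unsqueeze(0).to(device)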