Open SL06 opened 6 months ago
I assume the demo prints landmark coordinates and, from those, I will be able to calculate the distance between two landmarks as in measureLineBody (e.g. shoulder to elbow) and print the result. I could not find out how to print with .measureLineBody.
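For reference, the distance itself is just the Euclidean norm between two landmark points. A minimal self-contained sketch (the `Point3d` struct below is a hypothetical stand-in for `QuickPose.Point3d`, which exposes the same `x`/`y`/`z` fields; values are in the SDK's normalized coordinates, not centimetres):

```swift
import Foundation

// Stand-in for QuickPose.Point3d: assumes each landmark exposes x/y/z Doubles.
struct Point3d {
    var x: Double
    var y: Double
    var z: Double
}

// Euclidean distance between two landmark points, in normalized units.
func distance(_ p1: Point3d, _ p2: Point3d) -> Double {
    let dx = p2.x - p1.x
    let dy = p2.y - p1.y
    let dz = p2.z - p1.z
    return (dx * dx + dy * dy + dz * dz).squareRoot()
}

let shoulder = Point3d(x: 0.5, y: 0.3, z: 0.0)
let elbow = Point3d(x: 0.5, y: 0.6, z: 0.0)
print(distance(shoulder, elbow)) // ≈ 0.3 in normalized units
```

To get real-world units you still need a scaling reference such as the user's height, which is what the `userHeight` parameter of `measureLineBody` is for.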
Have you solved this issue?
Yes, thanks. I found how to do the calculation using the landmarks. I was trying to measure the leg-length-to-body-height ratio of a person from a camera feed on my iPad, but the results were unstable and imprecise. I am looking at other options for now.
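One common way to tame unstable per-frame readings (not from this thread, just a general approach) is to smooth the ratio over a window of recent frames. A sketch, assuming the ratio arrives once per frame:

```swift
import Foundation

// Simple moving-average smoother for noisy per-frame measurements.
// The window size is a tuning assumption: larger windows are smoother but lag more.
struct MovingAverage {
    private var window: [Double] = []
    private let size: Int

    init(size: Int) { self.size = size }

    // Add a new sample and return the current smoothed value.
    mutating func add(_ value: Double) -> Double {
        window.append(value)
        if window.count > size { window.removeFirst() }
        return window.reduce(0, +) / Double(window.count)
    }
}

var smoother = MovingAverage(size: 5)
for ratio in [0.64, 0.67, 0.65, 0.70, 0.66] {
    print(smoother.add(ratio)) // each print is the average of the samples so far
}
```

Feeding each frame's ratio through the smoother before displaying it would damp the jitter, at the cost of slower response to real movement.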
I have the following error: Value of type 'QuickPose.Landmarks' has no member 'isFrontCamera', related to `flippedHorizontally: landmarks.isFrontCamera` in two places.
I also get: Pattern with associated values does not match enum case 'success', related to this line:
`.success(_, _) = status, let landmarks = landmarks {`
I need help with this issue because right now I am not able to resolve it.
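For the pattern-matching error: the shape of the `.success` case depends on the SDK version you have; in this thread one snippet matches a single associated value and another matches none, so check your SDK's enum definition. The sketch below uses a hypothetical stand-in enum to show the two match styles that compile (`.success(_, _)` fails when the case carries a different number of associated values):

```swift
import Foundation

// Hypothetical stand-in for the SDK's status enum; here .success carries
// one associated value (e.g. a performance/fps payload).
enum Status {
    case success(Double)
    case failure(String)
}

let status = Status.success(16.0)

// Match without binding the associated value (works regardless of payload):
if case .success = status {
    print("success")
}

// Or bind the payload when you need it:
if case .success(let fps) = status {
    print(fps)
}
```

If your SDK's `.success` carries no associated value, only the first form compiles; if it carries one, `.success(_, _)` will produce exactly the error above.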
I abandoned the project several months ago.
I believe this is the code I was using. You have some cleaning to do, but it should help you. I got help from ChatGPT-4.
Sylvain
import SwiftUI
import QuickPoseCore
import QuickPoseSwiftUI
import QuickPoseCamera
import AVFoundation // SL
// Struct to represent a 3D point with visibility and presence, similar to QuickPose's representation
/* struct QuickPose.Point3d { var x: Double; var y: Double; var cameraAspectY: Double; var z: Double; var visibility: Double; var presence: Double } */
struct SLQuickPose1: View {
@State var useFrontCamera: Bool = true
private var quickPose = QuickPose(sdkKey: "XXX") // key redacted
@State private var overlayImage: UIImage?
@State private var cameraViewOpacity: Double = 0
static let fastLabel = "Fast (Body Points only)"
@State private var performance: String = UserDefaults.standard.bool(forKey: "performanceFast") ? SLQuickPose1.fastLabel : "Normal"
// @State private var targetFPS: Double? = UserDefaults.standard.bool(forKey: "performanceFast") ? 10 : nil
@State private var targetFPS: Double? = 15
@State private var selectedFeatures: [QuickPose.Feature] = [.overlay(.wholeBody)]
@State private var feedbackText: String? = "allo!"
@State private var lastResult: Double? = nil
@State private var lastResult0: Double? = nil
@State private var lastResult1: Double? = nil
@State var MeasurementResult: QuickPose.FeatureResult?
@State private var heightInCMText: String = ""
@State private var scalingFactor: Double = 1.0
@State private var showHeightAlert: Bool = false
@State private var dataBuffer: [String] = []
@State private var isRecording: Bool = false
// Function to check camera availability
func checkCameraAvailability() -> Bool {
return UIImagePickerController.isSourceTypeAvailable(.camera) && AVCaptureDevice.default(for: .video) != nil
}
var body: some View {
GeometryReader { geometry in
ZStack(alignment: .top) {
// if ProcessInfo.processInfo.isiOSAppOnMac, let url = Bundle.main.url(forResource: "rain-dance", withExtension: "mov") {
// if !checkCameraAvailability(), let url = Bundle.main.url(forResource: "rain-dance", withExtension: "mov") {
if !checkCameraAvailability() {
// Create a URL directly from the given file path
let url = URL(fileURLWithPath: "/Users/slareau/Documents/XCode/QuickPoseSL/SLQuickPoseDemo/sl-dance.mov")
QuickPoseSimulatedCameraView(useFrontCamera: false, delegate: quickPose, video: url)
} else {
QuickPoseCameraSwitchView(useFrontCamera: $useFrontCamera, delegate: quickPose, frameRate: $targetFPS)
}
//QuickPoseCameraView(useFrontCamera: $useFrontCamera, delegate: quickPose)
// QuickPoseCameraView(useFrontCamera: true, delegate: quickPose)
QuickPoseOverlayView(overlayImage: $overlayImage)
}
.edgesIgnoringSafeArea(.all)
.alert("Add Height", isPresented: $showHeightAlert)
{
TextField("Your height in CM, e.g. 150", text: $heightInCMText).keyboardType(.numberPad)
Button("OK"){
}
} message: {
Text("To show a ruler in CM, please enter your height in CM. ")
}
//work OK
.onAppear {
// let modelConfigLite = QuickPose.ModelConfig(detailedFaceTracking: false, detailedHandTracking: false, modelComplexity: .light)
let modelConfigGood = QuickPose.ModelConfig(detailedFaceTracking: false, detailedHandTracking: false, modelComplexity: .good)
// let modelConfigHeavy = QuickPose.ModelConfig(detailedFaceTracking: false, detailedHandTracking: false, modelComplexity: .heavy)
let bikeStyle = QuickPose.Style(relativeFontSize: 0.33, relativeArcSize: 0.4, relativeLineWidth: 0.3)
// let feature1: QuickPose.Feature = .measureLineBody(p1: .shoulder(side: .left), p2: .elbow(side: .left), userHeight: 168, format: "%.1fcm",style: bikeStyle)
// let feature2: QuickPose.Feature = .measureLineBody(p1: .elbow(side: .left), p2: .wrist(side: .left), userHeight: 168, format: "%.1fcm",style: bikeStyle)
// let feature3: QuickPose.Feature = .measureLineBody(p1: .wrist(side: .left), p2: .index(side: .left), userHeight: 168, format: "%.1fcm",style: bikeStyle)
let feature4: QuickPose.Feature = .measureLineBody(p1: .shoulder(side: .left), p2: .hip(side: .left), userHeight: 168, format: "%.1fcm",style: bikeStyle)
let feature5: QuickPose.Feature = .measureLineBody(p1: .hip(side: .left), p2: .knee(side: .left), userHeight: 168, format: "%.1fcm", style: bikeStyle)
let feature6: QuickPose.Feature = .measureLineBody(p1: .knee(side: .left), p2: .ankle(side: .left), userHeight: 168, format: "%.1fcm", style: bikeStyle)
// let feature7: QuickPose.Feature = .measureLineBody(p1: .ankle(side: .left), p2: .footIndex(side: .left), userHeight: 168, format: "%.1fcm", style: bikeStyle)
// let feature1: QuickPose.Feature = .rangeOfMotion(.shoulder(side: .right, clockwiseDirection: false), style: bikeStyle)
// DispatchQueue.main.asyncAfter(deadline: .now() + 0.3){ // for max stressing of the device the delay stops ios killing thread on startup
// quickPose.start(features: [.showPoints()], modelConfig: modelConfigGood, onFrame: { status, image, features, feedback, landmarks in
quickPose.start(features: [feature4,feature5,feature6], modelConfig: modelConfigGood, onFrame: { status, image, features, feedback, landmarks in
// quickPose.start(features: [.overlay(.upperBody)], onFrame: { status, image, features, feedback, landmarks in
// if case .success(_) = status {
if case .success(let performance) = status {
overlayImage = image
// quickPose.update(features: [feature5])
// quickPose.update (features: [feature5], modelConfig:modelConfigGood)
/* if case .success(let performance) = status {
if performance.latency > 0 {
//let maxFps = Int(1 / performance.latency)
print(performance.fps)
} else {
print(performance.fps, performance.latency)
}
} else {
// show error feedback
}
}
}*/
if performance.latency > 0 {
//let maxFps = Int(1 / performance.latency)
print(performance.fps) // 16
} else {
print(performance.fps, performance.latency)
}
if let landmarks = landmarks {
// let eye_right: QuickPose.Point3d = landmarks.landmark(forBody: .eye(side: .right))
// let eye_left: QuickPose.Point3d = landmarks.landmark(forBody: .eye(side: .left))
// let shoulder_right: QuickPose.Point3d = landmarks.landmark(forBody: .shoulder(side: .right))
let shoulder_left: QuickPose.Point3d = landmarks.landmark(forBody: .shoulder(side: .left))
// let elbow_right: QuickPose.Point3d = landmarks.landmark(forBody: .elbow(side: .right))
// let elbow_left: QuickPose.Point3d = landmarks.landmark(forBody: .elbow(side: .left))
// let index_right: QuickPose.Point3d = landmarks.landmark(forBody: .index(side: .right))
// let index_left: QuickPose.Point3d = landmarks.landmark(forBody: .index(side: .left))
// let hip_right: QuickPose.Point3d = landmarks.landmark(forBody: .hip(side: .right))
let hip_left: QuickPose.Point3d = landmarks.landmark(forBody: .hip(side: .left))
// let knee_right: QuickPose.Point3d = landmarks.landmark(forBody: .knee(side: .right))
let knee_left: QuickPose.Point3d = landmarks.landmark(forBody: .knee(side: .left))
// let ankle_right: QuickPose.Point3d = landmarks.landmark(forBody: .ankle(side: .right))
let ankle_left: QuickPose.Point3d = landmarks.landmark(forBody: .ankle(side: .left))
// let footIndex_left: QuickPose.Point3d = landmarks.landmark(forBody: .footIndex(side: .left))
// let footIndex_right: QuickPose.Point3d = landmarks.landmark(forBody: .footIndex(side: .right))
/*
let personHeightLeft : Double?
let personHeightRight : Double?
let personHeight : Double?
if personHeightLeft = calculateBodyPartLength(point1: eye_left, point2: footIndex_left)
{
print("personHeightLeft:", personHeightLeft)
} else {
print("One or more left points are not sufficiently visible or present.")
}
if personHeightRight = calculateBodyPartLength(point1: eye_right, point2: footIndex_right)
{
print("personHeightLeft:", personHeightRight)
} else {
print("One or more left points are not sufficiently visible or present.")
}
if personHeightLeft != nil && personHeightRight != nil {
let observedPersonHeight = (personHeightLeft + personHeightRight) / 2
scalingFactor = Double(heightInCMText) ?? 100 / (observedPersonHeight * (69.1 / 64.7))
let personHeight = observedPersonHeight * scalingFactor
print("person Height :", personHeight)
*/
/* print(" shoulder_left : ", shoulder_left)
print(" hip_left : ", hip_left)
print(" knee_left : ", knee_left)
print(" ankle_left : ", ankle_left)*/
if let torso_left = calculateBodyPartLength(point1: shoulder_left, point2: hip_left), let upperleg_left = calculateBodyPartLength(point1: hip_left, point2: knee_left), let lowerLeg_left = calculateBodyPartLength(point1: knee_left, point2: ankle_left) , let leg_left = calculateBodyPartLength(point1: hip_left, point2: ankle_left) {
// Save values in a textFiles
print(" torso_left : ", torso_left)
print(" upperleg_left : ", upperleg_left)
print(" lowerLeg_left : ", lowerLeg_left)
print(" leg_left : ", leg_left, upperleg_left + lowerLeg_left)
//let IndexHomme = 0,655642023346304, let IndexHomme = 0,649484536082474
let IndexPersonMoy = 0.652563279714389
let legToBodyRatioIndex = ((upperleg_left + lowerLeg_left ) / (( torso_left + upperleg_left + lowerLeg_left ))) / IndexPersonMoy // Référence: leg : 33.7 torse 17.7 tot 51.4 33.7/51.4
// Save values in a textFiles - end
let formattedRatio = String(format: "%.3f", legToBodyRatioIndex)
feedbackText = "legToShoulderRatioIndex = \(formattedRatio)"
// print("legToShoulderRatioIndex:", formattedRatio)
print (feedbackText ?? "")
print(" ")
if self.isRecording {
let dataString = "torso_left: \(torso_left), upperleg_left: \(upperleg_left), lowerLeg_left: \(lowerLeg_left), leg_left: \(leg_left), legToShoulderRatioIndex : \(formattedRatio)"
self.dataBuffer.append(dataString)
}
} else {
feedbackText = "One or more points not visible enough"
/* print("One or more left points are not sufficiently visible or present.")
print(" shoulder_left : ", shoulder_left.visibility)
print(" hip_left : ", hip_left.visibility)
print(" knee_left : ", knee_left.visibility)
print(" ankle_left : ", ankle_left.visibility)
print(" ")*/
}
} // landmark
} // if case success
else {
// show error feedback
}
}
)} //on appear
//}//on appear
/*.overlay(alignment: .bottom) {
if let feedbackText = feedbackText {
Text(feedbackText)
.font(.system(size: 35, weight: .semibold)).foregroundColor(.white).multilineTextAlignment(.center)
.padding(16)
.background(RoundedRectangle(cornerRadius: 8).foregroundColor(Color("AccentColor").opacity(0.8)))
.padding(.bottom, 16)
}
}*/
.overlay(alignment: .top) {
HStack {
Spacer()
Button(action: {
useFrontCamera.toggle()
}) {
Image(systemName: "arrow.triangle.2.circlepath.camera")
.font(.system(size: 20, weight: .semibold))
.foregroundColor(.white)
.padding(8)
.background(Circle().foregroundColor(Color.blue.opacity(0.8)))
}
.padding()
Button(action: {
self.isRecording.toggle()
if self.isRecording {
// Clear the existing buffer when starting a new recording
self.dataBuffer.removeAll()
} else {
// When stopping, save the buffered data to a file
let data = self.dataBuffer.joined(separator: "\n")
let success = saveMeasurement(data: data)
if success {
feedbackText = "Data saved successfully"
} else {
feedbackText = "Failed to save data"
}
}
}) {
Text(isRecording ? "Stop Recording" : "Start Recording")
.foregroundColor(.white)
.padding()
.background(isRecording ? Color.red : Color.green)
.cornerRadius(10)
}
}
.padding() // Add padding around the HStack content
}
.overlay(alignment: .bottom) {
if let feedbackText = feedbackText {
Text(feedbackText)
.font(.system(size: 40, weight: .semibold)).foregroundColor(.white).multilineTextAlignment(.center)
.padding(16)
// Changed Color("AccentColor") to Color.black for a semi-opaque black background
.background(RoundedRectangle(cornerRadius: 8).fill(Color.black.opacity(0.5)))
.padding(.bottom, 16)
}
}
.onDisappear {
quickPose.stop()
}
}//geometry
} // View
func calculateBodyPartLength(point1: QuickPose.Point3d, point2: QuickPose.Point3d) -> Double? {
// Check visibility and presence for both points
// let ispoint1VisibleAndPresent = point1.visibility >= 0.95 && point1.presence >= 0.95
// let ispoint2VisibleAndPresent = point2.visibility >= 0.95 && point2.presence >= 0.95
let ispoint1VisibleAndPresent = point1.visibility >= 0.95 && point1.presence >= 0.95
let ispoint2VisibleAndPresent = point2.visibility >= 0.6 && point2.presence >= 0.6
// Return nil if either point doesn't meet the criteria
guard ispoint1VisibleAndPresent && ispoint2VisibleAndPresent else {
return nil
}
// Assuming a simplified calculation of distance purely based on y-coordinates (not factoring in 3D space, camera angle, etc.)
// This estimation ignores various complexities for the sake of demonstration.
let xDiff = point2.x - point1.x
let yDiff = point2.y - point1.y
let zDiff = point2.z - point1.z
// Calculate distance using the Euclidean distance formula
return sqrt(xDiff*xDiff + yDiff*yDiff + zDiff*zDiff)
}
func saveMeasurement(data: String) -> Bool {
do {
let fileURL = try FileManager.default
.url(for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: false)
.appendingPathComponent("measurement.txt")
try data.write(to: fileURL, atomically: true, encoding: .utf8)
print("File saved at: \(fileURL)")
return true
} catch {
print("Error saving file: \(error)")
return false
}
}
} // Struct main
I solved the problem the following way: I removed `landmarks.isFrontCamera` from the line `scaledToViewPoint = bodyNose.cgPoint(scaledTo: geometry.size, flippedHorizontally: landmarks.isFrontCamera)`, because 'QuickPose.Landmarks' has no member 'isFrontCamera'. It now reads `scaledToViewPoint = bodyNose.cgPoint(scaledTo: geometry.size, flippedHorizontally: isUsingFrontCamera)`, using a variable that I define in the view. Here is the code:

```swift
import SwiftUI
import QuickPoseCore
import QuickPoseSwiftUI
import AVFoundation

struct QuickPoseBasicView: View {
    private var quickPose = QuickPose(sdkKey: "YOUR KEY") // register for your free key at https://dev.quickpose.ai
    @State private var overlayImage: UIImage?
    @State private var scaledToViewPoint = CGPoint(x: 1080/2, y: 1920/2)
    @State private var isUsingFrontCamera = false // variable that I added

    var body: some View {
        GeometryReader { geometry in
            ZStack(alignment: .top) {
                if true, let url = Bundle.main.url(forResource: "happy-dance", withExtension: "mov") {
                    QuickPoseSimulatedCameraView(useFrontCamera: false, delegate: quickPose, video: url, videoGravity: .resizeAspect)
                } else {
                    QuickPoseCameraView(useFrontCamera: false, delegate: quickPose, videoGravity: .resizeAspect)
                }
                QuickPoseOverlayView(overlayImage: $overlayImage, contentMode: .fit)
            }
            .overlay(alignment: .topLeading) {
                Circle()
                    .position(x: scaledToViewPoint.x, y: scaledToViewPoint.y)
                    .frame(width: 12, height: 12)
                    .foregroundColor(Color.green.opacity(1.0))
            }
            .frame(width: geometry.size.width)
            .edgesIgnoringSafeArea(.all)
            .onAppear {
                quickPose.start(features: [.showPoints()], onFrame: { status, image, features, feedback, landmarks in
                    overlayImage = image
                    if case .success = status, let landmarks = landmarks {
                        print(landmarks)
                        let bodyNose = landmarks.landmark(forBody: .nose)
                        let bodyNoseWorld = landmarks.worldLandmark(forBody: .nose)
                        scaledToViewPoint = bodyNose.cgPoint(scaledTo: geometry.size, flippedHorizontally: isUsingFrontCamera)
                        if let nose = landmarks.landmark(forFace: .faceNose) {
                            //print(nose.cgPoint(scaledTo: geometry.size, flippedHorizontally: landmarks.isFrontCamera))
                            print(nose.cgPoint(scaledTo: geometry.size))
                            print(nose)
                        }
                    } else {
                        // show error feedback
                    }
                })
            }
            .onDisappear {
                quickPose.stop()
            }
            .overlay(alignment: .bottom) {
                Text("Powered by QuickPose.ai v\(quickPose.quickPoseVersion())") // remove logo here, but attribution appreciated
                    .font(.system(size: 16, weight: .semibold)).foregroundColor(.white)
                    .frame(maxHeight: 40 + geometry.safeAreaInsets.bottom, alignment: .center)
                    .padding(.bottom, 0)
            }
        }
    }
}
```