
BIPES: Block based Integrated Platform for Embedded Systems allows text- and block-based programming for several types of embedded systems and Internet of Things modules using MicroPython, CircuitPython, Python or Snek. You can connect, program, debug and monitor several types of boards over the network, USB or Bluetooth. No software install needed!
https://bipes.net.br/ide
GNU General Public License v3.0

The generator code for the following blocks not specified for Python: hello #232

Closed: Rimririn closed this issue 3 days ago

Rimririn commented 4 months ago

I want to modify index.html to display the camera feed and create blocks that control the Pico board based on what the camera detects. I wrote the code below to test this, but I'm getting the error shown in the issue title. How can I use JavaScript blocks and Python blocks together?

ui/core/generator_stubs.js

Blockly.JavaScript["hello"] = (block) => {
  const time = block.getFieldValue("TIME");
  const code = `for(let i = 0; i < ${time}; i += 1) { alert("hello"); }\n`;
  return code;
};
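
The error in the issue title is raised because BIPES generates code for the board through Blockly's Python generator, and the stub above only defines a JavaScript generator, so Blockly.Python has no entry for "hello". A minimal sketch of a matching Python stub, assuming the block should simply print on the board (alert() exists only in the browser, not in MicroPython on the Pico):

// Sketch of a Python generator for the same "hello" block. The returned
// string is MicroPython that BIPES sends to the Pico, so browser APIs
// such as alert() are not available to it.
Blockly.Python["hello"] = (block) => {
  const time = block.getFieldValue("TIME");
  return `for _ in range(${time}):\n    print("hello")\n`;
};

With a Python stub in place the warning goes away, but as the replies below point out, the JavaScript generator above still never runs on the Pico.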

ui/core/block_definitions.js

Blockly.Blocks["hello"] = {
  init() {
    this.jsonInit({
      message0: "%1 hello",
      args0: [
        {
          type: "field_number",
          name: "TIME",
          value: 0,
          min: 0,
          precision: 1,
        },
      ],
      previousStatement: null,
      nextStatement: null,
      colour: "#fdd835",
    });
  },
};

ui/toolbox/rpi_pico.xml

  <category name="New" colour="#fdd835">
    <block type="hello"></block>    
  </category>


ui/index.xml

...
  <style>
      #camera-container {
        z-index: 99;
        position: absolute;
        right: 0;
        height: 100%;
        background-color: white;
        border: 1px solid #000;
      }

      #webcam-container {
        position: relative;
        width: 500px;
        height: 375px;
        background-color: #4d4dff;
      }
      #webcam-container canvas,
      #webcam-container video {
        position: absolute;
        top: 0;
        left: 0;
      }
    </style>
...
    <div id="camera-continer">
      <div id="webcam-container">
        <video
          id="webcam-video"
          autoplay
          playsinline
          width="500"
          height="375"
        ></video>
        <canvas id="webcam-canvas" width="500" height="375"></canvas>
      </div>
      <div id="image box"></div>
    </div>
...
   <!-- ml5 -->
    <script src="https://unpkg.com/ml5@1/dist/ml5.js"></script>
    <script src="core/customML5.js" defer></script>

ui/core/customML5.js

let video;
let canvas;
let ctx;
let handPose;
let hands = []; 

document.addEventListener("DOMContentLoaded", () => {
  initializeCanvas();

  setupButton(); 
});

function initializeCanvas() {
  canvas = document.getElementById("webcam-canvas");
  ctx = canvas.getContext("2d");
}

function setupButton() {
  const cameraButton = document.getElementById("cameraButton");
  cameraButton.addEventListener("click", function () {
    const webcamContainer = document.getElementById("webcam-container");
    // webcamContainer.style.zIndex = "99"; // z-index 
    setupCamera().then(() => {
      loadHandPoseModel();
    });
  });
}

async function setupCamera() {
  video = document.getElementById("webcam-video");
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  video.srcObject = stream;
  return new Promise((resolve) => {
    video.onloadedmetadata = () => {
      resolve(video);
    };
  });
}

function loadHandPoseModel() {
  handPose = ml5.handPose(video, {}, modelReady);
}

function modelReady() {
  console.log("Hand pose model is ready!");
  detectHands();
}

function detectHands() {
  handPose.detect(video, (results) => {
    if (results && results.length > 0) {

      // Keep the results only if every keypoint of every hand is inside the canvas.
      const allHandsInBounds = results.every((hand) =>
        hand.keypoints.every(
          (keypoint) =>
            keypoint.x >= 0 &&
            keypoint.x <= canvas.width &&
            keypoint.y >= 0 &&
            keypoint.y <= canvas.height
        )
      );

      if (allHandsInBounds) {
        hands = results; 
      } else {
        hands = []; 
      }
    } else {
      hands = []; 
    }
    requestAnimationFrame(detectHands);
    requestAnimationFrame(draw);
  });
}

function draw() {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
  if (hands.length > 0) {
    hands.forEach((hand) => {
      hand.keypoints.forEach(drawKeypoint);
    });
  }
}

function drawKeypoint(keypoint) {
  ctx.fillStyle = "lime";
  ctx.beginPath();
  ctx.arc(keypoint.x, keypoint.y, 5, 0, 2 * Math.PI);
  ctx.fill();
}

(screenshots attached)

sk8board commented 4 months ago

@Rimririn What you want is far beyond the scope of BIPES.

Maybe you should check whether the following project works better for your needs:

https://quirkycort.github.io/IoTy/public/editor.html

https://github.com/QuirkyCort/IoTy

jetannenbaum commented 4 months ago

I have to agree with @sk8board that this is a bit beyond what BIPES was intended for. BIPES assumes that the code in the generator_stubs block will be executed as MicroPython code sent to the Pico. You would need to manipulate the DOM in order to send an alert to the webpage.
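
To illustrate that last point, a hypothetical browser-side helper (not a Blockly generator) could write into the #image-box container from index.xml; the hand-detection code in customML5.js could call it directly, since whatever a generator stub returns is executed on the Pico rather than in the page. The function name below is made up for the example:

// Hypothetical helper living alongside customML5.js: it updates the page
// directly instead of routing the message through a block.
function showHelloMessage(times) {
  const box = document.getElementById("image-box");
  for (let i = 0; i < times; i += 1) {
    const p = document.createElement("p");
    p.textContent = "hello";
    box.appendChild(p);
  }
}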

rafaelaroca commented 3 days ago

Yes, I agree with @sk8board and @jetannenbaum. This is outside the scope of BIPES, but you could take a "parallel" approach alongside BIPES or IoTy to make it happen.
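
One way to read the "parallel" suggestion, sketched under the assumption that the Pico is reachable over Web Serial and runs a small MicroPython command listener (none of this is a BIPES API; connectPico and sendToPico are names made up for the example):

// Runs in the browser alongside the BIPES page; uses only the standard
// Web Serial API available in Chromium-based browsers.
let picoWriter;

async function connectPico() {
  const port = await navigator.serial.requestPort(); // user picks the Pico
  await port.open({ baudRate: 115200 });
  picoWriter = port.writable.getWriter();
}

// Could be called from detectHands() whenever a gesture is recognized.
async function sendToPico(command) {
  if (!picoWriter) return;
  await picoWriter.write(new TextEncoder().encode(command + "\r\n"));
}

On the MicroPython side, a short loop reading lines from sys.stdin could map each command to a GPIO action, keeping the camera logic in the browser and the board logic in MicroPython.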