omjadas / hudsucker

Intercepting HTTP/S proxy
https://crates.io/crates/hudsucker
Apache License 2.0

When I visit a download link, I get the error "connection closed before message completed" #118

Closed songjiachao closed 2 months ago

songjiachao commented 2 months ago
(screenshot of the "connection closed before message completed" error)
omjadas commented 2 months ago

Does this error happen on all downloads, or only on a specific download?

songjiachao commented 2 months ago
async fn handle_response(&mut self, _ctx: &HttpContext, mut res: Response<Body>) -> Response<Body> {
    res = decode_response(res).unwrap();

    let req = match self.req().clone() {
      Some(req) => req,
      None => return res,
    };

    let content_type_str: String = res
      .headers()
      .get(CONTENT_TYPE)
      .and_then(|ct| ct.to_str().ok())
      .unwrap_or_default()
      .to_owned();

    // Is this a text content type?
    let is_text_content = ["text/", "application/json", "application/javascript", "application/xml"]
      .iter()
      .any(|&typ| content_type_str.contains(typ));

    if is_text_content {
      let uri_str = remove_default_port(&Url::parse(format!("{}", req.uri()).as_str()).expect("Url parse error"));
      let rule = find_rule(&uri_str, MapType::Local);

      // Check whether effect_tools contains fun_debug
      let web_tools = WEB_TOOL_VEC.lock().unwrap().clone();
      let need_effect_tools = content_type_str.contains("text/html") && !web_tools.is_empty();

      if need_effect_tools || rule.is_some() {
        let body_mut = res.body_mut();
        let body_bytes = body_mut.collect().await.unwrap_or_default().to_bytes();
        // gzip-encoded content needs to be decoded (e.g. via the flate2 crate)
        let mut body_string = String::from_utf8_lossy(&body_bytes).to_string();

        if let Some(rule) = rule {
          let local_content = rule.local_content;
          body_string = local_content.unwrap_or(body_string);
        }

        if need_effect_tools {
          // Append the fun-console scripts
          body_string += web_tools.join("\n").as_str();
        }

        *body_mut = Body::from(body_string.clone());
        // Remove Content-Encoding, since the body is no longer compressed
        res.headers_mut().remove(CONTENT_ENCODING);
        // If Content-Length is present, recompute it for the new body
        if res.headers().get(CONTENT_LENGTH).is_some() {
          let content_length = HeaderValue::from(body_string.len());
          res.headers_mut().insert(CONTENT_LENGTH, content_length);
        }
      }
      // Read the (possibly modified) body so it can be forwarded to the frontend
      let body_mut = res.body_mut();
      let body_bytes = body_mut.collect().await.unwrap_or_default().to_bytes();
      *body_mut = Body::from(Full::new(body_bytes.clone()));

      let output_response = ProxiedResponse::new(
        res.status(),
        res.version(),
        res.headers().clone(),
        body_bytes,
        chrono::Local::now().timestamp_nanos_opt().unwrap(),
        hosts::get_ip(req.uri().host().unwrap()),
      );

      self.set_res(output_response).send_output();
    } else {
      let output_response = ProxiedResponse::new(
        res.status(),
        res.version(),
        res.headers().clone(),
        Bytes::new(),
        chrono::Local::now().timestamp_nanos_opt().unwrap(),
        hosts::get_ip(req.uri().host().unwrap()),
      );
      self.set_res(output_response).send_output();
    }
    res
  }

I know why now. This handle_response processes every response, and for downloads the body is large, so body_mut.collect().await.unwrap_or_default().to_bytes() blocks for a long time while it buffers the entire download.

I have now changed handle_response to only process text content types.
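The gating logic can be factored into a small predicate that decides, from the Content-Type header alone, whether a body should be buffered and rewritten; the list of type prefixes mirrors the snippet above, and the helper name is illustrative rather than part of hudsucker's API:

```rust
/// Decide whether a response body is textual content worth buffering.
/// Anything else (binary downloads, media) should be streamed through
/// untouched so large transfers are not held in memory.
fn is_text_content(content_type: &str) -> bool {
    ["text/", "application/json", "application/javascript", "application/xml"]
        .iter()
        .any(|t| content_type.contains(t))
}

fn main() {
    // Large binary downloads are passed through without buffering.
    assert!(!is_text_content("application/octet-stream"));
    // HTML and JSON bodies are buffered for rewriting.
    assert!(is_text_content("text/html; charset=utf-8"));
    assert!(is_text_content("application/json"));
}
```

Checking this predicate before calling collect() means non-text responses never hit the full-body buffering path that caused the "connection closed before message completed" error on downloads.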