terraform-coop / terraform-provider-foreman

Terraform provider for Foreman
https://registry.terraform.io/providers/terraform-coop/foreman
Mozilla Public License 2.0

host creation fails #18

Closed yuqo2450 closed 2 years ago

yuqo2450 commented 3 years ago

When creating a host, the host is created in Foreman, but Terraform returns:

Error: HTTP Error:{
  endpoint:   [https://foreman01.example.com/api/hosts]
  statusCode: [422]
  respBody:   [{
  "error": {"id":null,"errors":{"name":["has already been taken"]},"full_messages":["Name has already been taken"]}
}
]
}

  on main.tf line 69, in resource "foreman_host" "testhost01":
  69: resource "foreman_host" "testhost01" {

As I said, the host is created in Foreman, but it doesn't appear in the Terraform state. I am trying to deploy a bare-metal host without a compute profile. It would be great if you could help me out!

yuqo2450 commented 3 years ago

Today the error doesn't appear anymore. The host is now created in the state, but Terraform still returns the following error:

Error: HTTP Error:{
  endpoint:   [https://foreman01.example.com/api/hosts/11/power]
  statusCode: [422]
  respBody:   [{
  "error": {"message":"Power operations are not enabled on this host."}
}
]
}

It seems the API tries to power on the host, even though it has no compute profile. Destruction of the host is also not working properly: it is deleted in Foreman but not removed from the Terraform state. terraform destroy just prints the following error: Error: Failed to delete host in retry_count* 2 seconds. Thanks for your help!

lhw commented 3 years ago

Could you provide some information on what kind of machine you are trying to create? Is it a bare-metal machine started via IPMI, or a virtual machine of sorts?

yuqo2450 commented 3 years ago

It is a Proxmox VM.

mgusek-interhyp commented 3 years ago

Can you provide your complete 'resource "foreman_host" "testhost01"'?

yuqo2450 commented 3 years ago

Here is my full resource:

resource "foreman_host" "metal-template" {
  name = var.host_name
  method = "build"
  enable_bmc = false
  bmc_success = false
  compute_profile_id = 0
  compute_resource_id = 0
  environment_id = var.host_env
  hostgroup_id = var.host_hg
  image_id = 0
  medium_id = 0
  domain_id = var.host_domain
  dynamic "interfaces_attributes" {
    iterator = interface
    for_each = var.host_interfaces[*]
    content {
      identifier = interface.value.if_identifier
      type = "interface"
      mac = interface.value.if_mac
      managed = interface.value.if_managed
      primary = interface.value.if_primary
      provision = interface.value.if_provision
      virtual = interface.value.if_virtual
    }
  }
  model_id = var.host_model
  parameters = var.host_params
  retry_count = 5
}

mgusek-interhyp commented 3 years ago

Looks like you are using a compute resource, maybe a Proxmox resource? I assume there is no implementation for power-on there. Can you omit "compute_resource_id" and "compute_profile_id"? Maybe "model_id" too.

yuqo2450 commented 3 years ago

I omitted all attributes and I still get the same error. I don't really understand Go, so I can't read your code, but the Foreman API provides "PUT /api/hosts/:id/power --> Run a power operation on host". Presumably you are using this?

lhw commented 3 years ago

We are currently using this provider with both bare-metal machines and vSphere VMs over Foreman, and it has worked fine so far with the power instructions. Could you provide some debug logs, as described here, by setting the provider to log:

provider "foreman" {
  ...
  provider_loglevel = "DEBUG"
  provider_logfile  = "terraform-provider-foreman.log"
  ...
}

You can shorten the log to the specific host creation part. There should be some lines from SendPowerCommand.

I'm surprised that the Proxmox connector doesn't accept the default power commands.

yuqo2450 commented 3 years ago

I think this is the most important part of the log file:

2021/03/14 17:56:10 [DEBUG] Created ForemanHost: [&{ForemanObject:{Id:4 Name:test-host CreatedAt:2021-03-14 16:56:10 UTC UpdatedAt:2021-03-14 16:56:10 UTC} Build:true Method:build DomainId:1 DomainName:example.org EnvironmentId:1 HostgroupId:1 OperatingSystemId:2 MediumId:5 ImageId:0 ModelId:0 EnableBMC:false BMCSuccess:false Comment: InterfacesAttributes:[{Id:4 SubnetId:1 Identifier:eth0 Name:test-host.example.org Username: Password: Managed:true Provision:true Virtual:false Primary:true IP:10.0.30.115 MAC:e0:fd:a3:e6:87:9b Type:interface Provider: AttachedDevices: AttachedTo: ComputeAttributes:map[] Destroy:false}] HostParameters:[] ComputeResourceId:0 ComputeProfileId:0}]
2021/03/14 17:56:10 [TRACE] resource_foreman_host.go#setResourceDataFromForemanHost
2021/03/14 17:56:10 [DEBUG] Using default Foreman behaviour for startup
2021/03/14 17:56:10 [DEBUG] JSONBytes: [{"power_action":"on"}]
2021/03/14 17:56:10 [TRACE] foreman/api/client.go#NewRequest
2021/03/14 17:56:10 [DEBUG] method: [PUT], endpoint: [/hosts/4/power]
2021/03/14 17:56:10 [DEBUG] reqURL: [https://foreman01.example.org/api/hosts/4/power]
2021/03/14 17:56:10 [DEBUG] SendPower: Retry #[0]
2021/03/14 17:56:10 [TRACE] foreman/api/client.go#SendAndParse
2021/03/14 17:56:10 [TRACE] foreman/api/client.go#Send
2021/03/14 17:56:11 [DEBUG] server response:{
  endpoint:   [https://foreman01.example.org/api/hosts/4/power]
  method:     [PUT]
  statusCode: [422]
  respBody:   [{
  "error": {"message":"Power operations are not enabled on this host."}
}
]
}

So there is definitely a power-on through the API. The Proxmox host is not used as a compute resource in Foreman. My plan is to create a VM in Proxmox and add the VM's MAC address to the Foreman host, so Foreman is not aware of any Proxmox host; I have no compute resource at all in Foreman. The documentation doesn't list a Proxmox compute resource as supported. It also seems that even though I omit the compute resource attributes, they are automatically created. Hope this helps. Let me know if you need anything else.
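
For reference, the registration pattern described above boils down to something like the following minimal sketch (the name and MAC are illustrative values taken from the debug log; no compute_resource_id or compute_profile_id is set, since Foreman only provisions the machine and does not manage its lifecycle):

resource "foreman_host" "registered_vm" {
  name       = "test-host.example.org"
  method     = "build"
  enable_bmc = false

  interfaces_attributes {
    type      = "interface"
    primary   = true
    provision = true
    managed   = true
    mac       = "e0:fd:a3:e6:87:9b" # MAC of the externally created Proxmox VM
  }
}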

julieenn commented 3 years ago

Same issue right here with vCenter. Terraform only creates the Foreman object, but the VM isn't built. Here is my main.tf:

terraform {
  required_providers {
    foreman = {
      source = "HanseMerkur/foreman"
      version = "0.3.4"
    }
  }
}

provider "foreman" {
  provider_loglevel = "DEBUG"
  provider_logfile = "terraform-provider-foreman.log"

  client_username = "admin"
  client_password = "password"
  client_tls_insecure = "true"

  server_hostname = "192.168.1.200"
  server_protocol = "https"

  location_id = "2"
  organization_id = "1"

}

data "foreman_domain" "dev" {
  name = "domain.contenso"
}

data "foreman_environment" "production" {
  name = "production"
}

data "foreman_hostgroup" "Hostgroupattribution" {
  title = "test"

}

data "foreman_operatingsystem" "Centos7" {
  title = "CentOS 7.9.2009"
}

data "foreman_media" "mediumcentos" {
  name = "CentOS 7 mirror"
}

data "foreman_subnet" "networkinfo" {
  name = "WAN"
}

data "foreman_computeresource" "vcenter"{
  name = "leucub4-vvce03"
}

data "foreman_image" "image"{
  name = "leucub4-vtst10"
  compute_resource_id = "${data.foreman_computeresource.vcenter.id}"

}

resource "foreman_host" "TerraformTest" {
  name  = "tst03.domain.contenso"
  method = "build"
  comment = "tttteeesssttt"

  enable_bmc = false
  bmc_success = false
  compute_resource_id = "${data.foreman_computeresource.vcenter.id}"
  domain_id = "${data.foreman_domain.dev.id}"
  environment_id = "${data.foreman_environment.production.id}"
  hostgroup_id = "${data.foreman_hostgroup.Hostgroupattribution.id}"
  image_id = "${data.foreman_image.image.id}"
  operatingsystem_id = "${data.foreman_operatingsystem.Centos7.id}"
  medium_id = "${data.foreman_media.mediumcentos.id}"
}

jzandbergen commented 3 years ago

I have the same error using oVirt.

My foreman_host resource:

resource "foreman_host" "tf01" {
  name = "tf01.virtualt00bz.lan"
  hostgroup_id = "3"
  domain_id = 1
  compute_profile_id = 5
  method = "image"
  image_id = 8
  interfaces_attributes {
    type = "interface"
    managed = true
    primary = true
  }
}

Error in the foreman provider log:

2021/07/20 09:42:12 [DEBUG] server response:{
  endpoint:   [https://foreman.virtualt00bz.lan/api/hosts/337/power]
  method:     [PUT]
  statusCode: [422]
  respBody:   [{
  "error": {"message":"Power operations are not enabled on this host."}
}
]
}

I am also currently looking at the Foreman log (/var/log/foreman/production.log), where I see:

...
2021-07-20T07:42:11 [I|app|3f0aa6fb] Processed 2 tasks from queue 'Host::Managed Post', completed 2/2
2021-07-20T07:42:11 [I|app|3f0aa6fb]   Rendering api/v2/hosts/create.json.rabl
2021-07-20T07:42:11 [I|app|3f0aa6fb]   Rendered api/v2/hosts/create.json.rabl (Duration: 49.1ms | Allocations: 25135)
2021-07-20T07:42:11 [I|app|3f0aa6fb] Completed 201 Created in 2523ms (Views: 41.6ms | ActiveRecord: 46.0ms | Allocations: 120164)
2021-07-20T07:42:11 [I|app|9876fa96] Started GET "/api/hosts/337/vm_compute_attributes" for 10.255.0.1 at 2021-07-20 07:42:11 +0000
2021-07-20T07:42:11 [I|app|9876fa96] Processing by Api::V2::HostsController#vm_compute_attributes as JSON
2021-07-20T07:42:11 [I|app|9876fa96]   Parameters: {"apiv"=>"v2", "id"=>"337", "host"=>{}}
2021-07-20T07:42:11 [I|app|9876fa96] Authorized user terraform(terra form)
2021-07-20T07:42:11 [W|app|9876fa96] Action failed
2021-07-20T07:42:11 [I|app|9876fa96] Backtrace for 'Action failed' error (TypeError): no implicit conversion of nil into String
/opt/theforeman/tfm/root/usr/share/gems/gems/fog-libvirt-0.7.0/lib/fog/libvirt/requests/compute/list_domains.rb:9:in `lookup_domain_by_uuid'
/opt/theforeman/tfm/root/usr/share/gems/gems/fog-libvirt-0.7.0/lib/fog/libvirt/requests/compute/list_domains.rb:9:in `list_domains'
/opt/theforeman/tfm/root/usr/share/gems/gems/fog-libvirt-0.7.0/lib/fog/libvirt/models/compute/servers.rb:15:in `get'
/usr/share/foreman/app/models/compute_resource.rb:183:in `find_vm_by_uuid'
/usr/share/foreman/app/models/compute_resources/foreman/model/libvirt.rb:43:in `find_vm_by_uuid'
/usr/share/foreman/app/models/compute_resource.rb:362:in `vm_compute_attributes_for'
/usr/share/foreman/app/models/compute_resources/foreman/model/libvirt.rb:193:in `vm_compute_attributes_for'
/usr/share/foreman/app/models/host/managed.rb:784:in `vm_compute_attributes'
/usr/share/foreman/app/controllers/api/v2/hosts_controller.rb:228:in `vm_compute_attributes'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_controller/metal/basic_implicit_render.rb:6:in `send_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/abstract_controller/base.rb:195:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_controller/metal/rendering.rb:30:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/abstract_controller/callbacks.rb:42:in `block in process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/callbacks.rb:112:in `block in run_callbacks'
/usr/share/foreman/app/controllers/concerns/foreman/controller/timezone.rb:10:in `set_timezone'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/usr/share/foreman/app/models/concerns/foreman/thread_session.rb:32:in `clear_thread'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/usr/share/foreman/app/controllers/concerns/foreman/controller/topbar_sweeper.rb:12:in `set_topbar_sweeper_controller'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/opt/theforeman/tfm/root/usr/share/gems/gems/audited-4.9.0/lib/audited/sweeper.rb:14:in `around'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/opt/theforeman/tfm/root/usr/share/gems/gems/audited-4.9.0/lib/audited/sweeper.rb:14:in `around'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/callbacks.rb:121:in `block in run_callbacks'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/callbacks.rb:139:in `run_callbacks'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/abstract_controller/callbacks.rb:41:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_controller/metal/rescue.rb:22:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_controller/metal/instrumentation.rb:33:in `block in process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/notifications.rb:180:in `block in instrument'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/notifications/instrumenter.rb:24:in `instrument'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/notifications.rb:180:in `instrument'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_controller/metal/instrumentation.rb:32:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_controller/metal/params_wrapper.rb:245:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/railties/controller_runtime.rb:27:in `process_action'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/abstract_controller/base.rb:136:in `process'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionview-6.0.3.4/lib/action_view/rendering.rb:39:in `process'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_controller/metal.rb:190:in `dispatch'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_controller/metal.rb:254:in `dispatch'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/routing/route_set.rb:50:in `dispatch'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/routing/route_set.rb:33:in `serve'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/routing/mapper.rb:18:in `block in <class:Constraints>'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/routing/mapper.rb:48:in `serve'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/journey/router.rb:49:in `block in serve'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/journey/router.rb:32:in `each'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/journey/router.rb:32:in `serve'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/routing/route_set.rb:834:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/apipie-dsl-2.3.0/lib/apipie_dsl/static_dispatcher.rb:67:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/apipie-rails-0.5.17/lib/apipie/static_dispatcher.rb:66:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/apipie-rails-0.5.17/lib/apipie/extractor/recorder.rb:137:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/middleware/static.rb:126:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/middleware/static.rb:126:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/middleware/static.rb:126:in `call'
/usr/share/foreman/lib/foreman/middleware/telemetry.rb:10:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/apipie-rails-0.5.17/lib/apipie/middleware/checksum_in_headers.rb:27:in `call'
/usr/share/foreman/lib/foreman/middleware/catch_json_parse_errors.rb:9:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/tempfile_reaper.rb:15:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/etag.rb:27:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/conditional_get.rb:27:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/head.rb:12:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/http/content_security_policy.rb:18:in `call'
/usr/share/foreman/lib/foreman/middleware/logging_context_session.rb:22:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/session/abstract/id.rb:266:in `context'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/session/abstract/id.rb:260:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/middleware/cookies.rb:648:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/middleware/callbacks.rb:27:in `block in call'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/callbacks.rb:101:in `run_callbacks'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/middleware/callbacks.rb:26:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/middleware/actionable_exceptions.rb:18:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/middleware/debug_exceptions.rb:32:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/middleware/show_exceptions.rb:33:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.4/lib/rails/rack/logger.rb:37:in `call_app'
/opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.4/lib/rails/rack/logger.rb:28:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/sprockets-rails-3.2.1/lib/sprockets/rails/quiet_assets.rb:13:in `call'
/usr/share/foreman/lib/foreman/middleware/logging_context_request.rb:11:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/middleware/remote_ip.rb:81:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/middleware/request_id.rb:27:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/method_override.rb:24:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/runtime.rb:22:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/activesupport-6.0.3.4/lib/active_support/cache/strategy/local_cache_middleware.rb:29:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/middleware/executor.rb:14:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/middleware/static.rb:126:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/sendfile.rb:110:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/actionpack-6.0.3.4/lib/action_dispatch/middleware/host_authorization.rb:76:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/secure_headers-6.3.0/lib/secure_headers/middleware.rb:11:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.4/lib/rails/engine.rb:527:in `call'
/opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.4/lib/rails/railtie.rb:190:in `public_send'
/opt/theforeman/tfm/root/usr/share/gems/gems/railties-6.0.3.4/lib/rails/railtie.rb:190:in `method_missing'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/urlmap.rb:74:in `block in call'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/urlmap.rb:58:in `each'
/opt/theforeman/tfm/root/usr/share/gems/gems/rack-2.2.3/lib/rack/urlmap.rb:58:in `call'
/usr/share/passenger/phusion_passenger/rack/thread_handler_extension.rb:74:in `process_request'
/usr/share/passenger/phusion_passenger/request_handler/thread_handler.rb:141:in `accept_and_process_next_request'
/usr/share/passenger/phusion_passenger/request_handler/thread_handler.rb:109:in `main_loop'
/usr/share/passenger/phusion_passenger/request_handler.rb:455:in `block (3 levels) in start_threads'
/opt/theforeman/tfm/root/usr/share/gems/gems/logging-2.3.0/lib/logging/diagnostic_context.rb:474:in `block in create_with_logging_context'
2021-07-20T07:42:11 [I|app|9876fa96]   Rendering api/v2/errors/standard_error.json.rabl within api/v2/layouts/error_layout
2021-07-20T07:42:11 [I|app|9876fa96]   Rendered api/v2/errors/standard_error.json.rabl within api/v2/layouts/error_layout (Duration: 3.2ms | Allocations: 5540)
2021-07-20T07:42:11 [I|app|9876fa96] Completed 500 Internal Server Error in 127ms (Views: 6.3ms | ActiveRecord: 13.2ms | Allocations: 21981)
2021-07-20T07:42:12 [I|app|0692dcf9] Started PUT "/api/hosts/337/power" for 10.255.0.1 at 2021-07-20 07:42:12 +0000
2021-07-20T07:42:12 [I|app|0692dcf9] Processing by Api::V2::HostsController#power as JSON
2021-07-20T07:42:12 [I|app|0692dcf9]   Parameters: {"power_action"=>"on", "apiv"=>"v2", "id"=>"337", "host"=>{}}
2021-07-20T07:42:12 [I|app|0692dcf9] Authorized user terraform(terra form)
2021-07-20T07:42:12 [I|app|0692dcf9]   Rendering api/v2/errors/custom_error.json.rabl within api/v2/layouts/error_layout
2021-07-20T07:42:12 [I|app|0692dcf9]   Rendered api/v2/errors/custom_error.json.rabl within api/v2/layouts/error_layout (Duration: 2.9ms | Allocations: 5539)
2021-07-20T07:42:12 [I|app|0692dcf9] Completed 422 Unprocessable Entity in 125ms (Views: 4.9ms | ActiveRecord: 16.5ms | Allocations: 22433)
2021-07-20T07:42:12 [I|app|16cc25d0] Started PUT "/api/hosts/337/power" for 10.255.0.1 at 2021-07-20 07:42:12 +0000
2021-07-20T07:42:12 [I|app|16cc25d0] Processing by Api::V2::HostsController#power as JSON
2021-07-20T07:42:12 [I|app|16cc25d0]   Parameters: {"power_action"=>"on", "apiv"=>"v2", "id"=>"337", "host"=>{}}
2021-07-20T07:42:12 [I|app|16cc25d0] Authorized user terraform(terra form)
2021-07-20T07:42:12 [I|app|16cc25d0]   Rendering api/v2/errors/custom_error.json.rabl within api/v2/layouts/error_layout
2021-07-20T07:42:12 [I|app|16cc25d0]   Rendered api/v2/errors/custom_error.json.rabl within api/v2/layouts/error_layout (Duration: 2.5ms | Allocations: 5536)
2021-07-20T07:42:12 [I|app|16cc25d0] Completed 422 Unprocessable Entity in 123ms (Views: 4.3ms | ActiveRecord: 16.2ms | Allocations: 22430)

So if I read it correctly, the VM is never actually created, so it cannot be found in oVirt and thus cannot be powered on.

OK... so this might be (at least for me) the "how"; now let's figure out the "why" :)

jzandbergen commented 3 years ago

I got it working on my side.

First off, I found in /var/log/foreman/production.log that the compute_resource was set to nil. It should be set to the appropriate value by the hostgroup, but somehow that is not happening, so I've added the compute_resource_id explicitly.

Then I got an error saying that the image_id did not belong to the compute_resource. It definitely does; however, when I checked with the hammer CLI tool, I saw that the API call to the Foreman API does not send the image_id as an integer, but as a string inside a compute attributes block. So I've added that to the Terraform resource.

Then I got a stack trace that mentioned something about a medium_id, which should not happen because I use an image. (This probably fails because we do not pass an image_id directly to Foreman, but inside a compute_attributes block.) So I've created a "dummy" medium, which can be ignored, and added that to the Terraform resource. Success!

So my final terraform resource:

resource "foreman_host" "tf01" {
  name = "tf01.virtualt00bz.lan"
  hostgroup_id = 3
  domain_id = 1
  compute_resource_id = 1
  compute_profile_id = 5
  compute_attributes = jsonencode({"image_id"="/var/lib/libvirt/images/alpine-3.13-cloud.qcow2"})
  medium_id = 14 # dummy, not used, but the foreman api demands it.
  ### Does not work, see `compute_attributes`:  image_id = 8 
  method = "image"
  interfaces_attributes {
    type = "interface"
    managed = true
    primary = true
  }
}

Note: most of my configuration is set in the hostgroup and compute_profile, so you might need to pass a few more options. Keep an eye on the Foreman log at DEBUG level; it might provide a good lead on what is going wrong.

yuqo2450 commented 3 years ago

Thanks for your input, but this doesn't solve my problem. My problem is with creating the Foreman host itself, not the corresponding VM. I want to use Foreman just as an install/Puppet server in my environment, not as an orchestration tool, so I haven't added any compute resources to Foreman.

Glad to hear you got it working though.

jzandbergen commented 3 years ago

Ah yes, I see. I'm really sorry, I didn't read your question properly.

mgusek-interhyp commented 2 years ago

In my case, I've created a VM in vSphere with the resource vsphere_virtual_machine and afterwards created a host in Foreman with the resource foreman_host. The VM in vSphere and the host in Foreman are both created, but the power-on operation fails. I don't need a power-on operation, so maybe we can make it optional?
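
A minimal sketch of that two-resource workflow, assuming a vsphere_virtual_machine resource named vm whose first network_interface exposes its computed mac_address (resource names and values are illustrative, not taken from a working configuration):

resource "foreman_host" "registered" {
  name       = "tst03.domain.contenso"
  method     = "build"
  enable_bmc = false

  interfaces_attributes {
    type    = "interface"
    primary = true
    managed = true
    # Referencing the VM's MAC also orders creation after the VM exists.
    mac     = vsphere_virtual_machine.vm.network_interface[0].mac_address
  }
}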

lhw commented 2 years ago

That's surprising to me, as we also create a fair number of VMs on vSphere via Foreman using this provider. We use the "build" option on these for consistency with other machines, but the power-on should still happen. Here is an example for Kubernetes workers we have in use:

resource "foreman_host" "worker" {
  count      = var.worker.count
  name       = format(var.name_format, (var.master.count + count.index) + 1)
  method     = "build"
  enable_bmc = false

  interfaces_attributes {
    type      = "interface"
    primary   = true
    managed   = true
    subnet_id = data.foreman_subnet.cluster.id
    compute_attributes = {
      "type"    = "VirtualVmxnet3"
      "network" = "vlan_${data.foreman_subnet.cluster.vlanid}_${replace(data.foreman_subnet.cluster.network_address, "/", "%2f")}"
    }
  }

  interfaces_attributes {
    type      = "interface"
    provision = true
    managed   = true
    subnet_id = data.foreman_subnet.infra.id
    compute_attributes = {
      "type"    = "VirtualVmxnet3"
      "network" = "vlan_${data.foreman_subnet.infra.vlanid}_${replace(data.foreman_subnet.infra.network_address, "/", "%2f")}"
    }
  }

  parameters = {
    "node_labels" = jsonencode(
      {
        // Beta Labels
        "beta.kubernetes.io/instance-type"         = "esx.${var.master.compute_profile}"
        // Stable Labels
        "node.kubernetes.io/instance-type" = "esx.${var.master.compute_profile}"
      }
    )
  }

  hostgroup_id        = foreman_hostgroup.worker.id
  environment_id      = data.foreman_environment.prod.id
  compute_resource_id = data.foreman_computeresource.vsphere.id
  compute_profile_id  = local.compute_profiles[var.worker.compute_profile]
}

yuqo2450 commented 2 years ago

As already mentioned, this does not address the problem. You are creating the vSphere virtual machine through a Foreman compute resource. @mgusek-interhyp and I have not connected the vSphere cluster to Foreman as a compute resource, so the VM is created separately (with a Terraform resource from the vsphere provider). I read deeper into the documentation and the code and found that there is a possibility to connect bare-metal hosts, but only if an IPMI interface is available through which the restart is triggered. Perhaps this helps.
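
If going the IPMI route, here is a hedged sketch of what registration with a BMC might look like; this assumes interfaces_attributes accepts a BMC interface, which the Username, Password, and Provider fields visible in the debug log above suggest. All names and values are illustrative:

resource "foreman_host" "ipmi_host" {
  name        = "metal01.example.org"
  method      = "build"
  enable_bmc  = true
  bmc_success = false

  interfaces_attributes {
    type      = "interface"
    primary   = true
    provision = true
    managed   = true
    mac       = "aa:bb:cc:dd:ee:ff" # illustrative MAC of the bare-metal NIC
  }

  # Assumed BMC interface through which Foreman could trigger power operations.
  interfaces_attributes {
    type     = "bmc"
    provider = "IPMI"
    username = "admin"    # illustrative credentials
    password = "changeme"
    managed  = true
  }
}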

lhw commented 2 years ago

Ah, okay. You are just registering the machine. Hmm, yes, the power-on part could be made optional, I suppose. Or maybe just provide a build_method = "register" instead. I will think about it, but it's not a huge priority for me.
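
Purely hypothetical (no such value exists at this point in the discussion), the proposal would read something like:

resource "foreman_host" "registered" {
  name   = "tf01.virtualt00bz.lan"
  method = "register" # hypothetical value: register only, skip power operations
}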

mgusek-interhyp commented 2 years ago

I like the idea of build_method = "register"; maybe I can provide a PR next week.

yuqo2450 commented 2 years ago

Probably this got fixed by @agriffit79 with #46. To be honest, I am not sure what the build option does. Does it only prevent the virtual machine part, or also the operating system (PXE) part? Does anyone know?

lhw commented 2 years ago

> Probably this got fixed by @agriffit79 with #46. To be honest, I am not sure what the build option does. Does it only prevent the virtual machine part, or also the operating system (PXE) part? Does anyone know?

Yes, this was essentially the same issue. build_method differentiates between build and image, the latter only being available for VM providers that support cloning. The idea of build_method = "register" was the same as your solution.
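
To make the distinction concrete, here is a trimmed sketch of the two values as they appear earlier in this thread (hostnames are illustrative; all other attributes omitted):

# "build": Foreman drives a network (PXE) installation of the OS.
resource "foreman_host" "pxe_host" {
  name   = "build01.example.org"
  method = "build"
}

# "image": clone from an existing image; only available on compute
# resource providers that support cloning.
resource "foreman_host" "image_host" {
  name     = "image01.example.org"
  method   = "image"
  image_id = 8
}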