terraform-in-action / manning-code


Problem running terraform apply on chapter8/part2b_multicloud-mmorpg-nomad #23

Open · DustinAlandzes opened this issue 3 years ago

DustinAlandzes commented 3 years ago

I get this error:

Dustins-MBP:part2b_multicloud-mmorpg-nomad dustinalandzes$ terraform apply
╷
│ Warning: Empty provider configuration blocks are not required
│ 
│   on .terraform/modules/mmorpg/main.tf line 1:
│    1: provider "nomad" {
│ 
│ Remove the nomad.aws provider block from module.mmorpg. Add nomad.aws to the list of configuration_aliases for nomad in required_providers to define the provider configuration name.
│ 
│ (and one more similar warning elsewhere)
╵
╷
│ Error: Invalid index
│ 
│   on .terraform/modules/mmorpg/main.tf line 14, in locals:
│   14:   aws_region   = sort(data.nomad_regions.current.regions)[1]
│     ├────────────────
│     │ data.nomad_regions.current.regions is list of string with 1 element
│ 
│ The given key does not identify an element in this collection value: the given index is greater than or equal to the length of the collection.
╵
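
The "Invalid index" error means data.nomad_regions.current.regions came back with only one region, so sort(...)[1] points past the end of the list; the underlying cause is that the two Nomad deployments never federated. As a stopgap only (the real fix is to get federation working), here is a minimal sketch of a defensive version of the module's locals block using Terraform's try() function:

```hcl
locals {
  regions = sort(data.nomad_regions.current.regions)

  # Workaround sketch: fall back to the only reported region when
  # federation has not happened and the list has a single element.
  aws_region = try(local.regions[1], local.regions[0])
}
```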

I will try to give some more detail at a future date; I figured other people would run into this. For now, I ended up skipping this and moving on to part 3.
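
As for the warning about empty provider configuration blocks: it is only advisory, but the modern way for a shared module to declare the provider configurations it expects is configuration_aliases (Terraform 0.15+). A minimal sketch, assuming the module uses nomad.aws and nomad.azure aliases; the empty provider "nomad" {} blocks flagged by the warning would then be removed:

```hcl
terraform {
  required_providers {
    nomad = {
      source = "hashicorp/nomad"
      # Declare the aliased configurations the module expects from its caller.
      configuration_aliases = [nomad.aws, nomad.azure]
    }
  }
}
```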

Maraket commented 2 years ago

Looking into this a little further, it looks like the Nomad instances aren't federating. I realized this when looking at my Consul UI: it doesn't show any region dropdown, and it doesn't seem to be aware it's in a cluster.

Reading through the chapter, I am wondering if the issue is that the Azure Consul cluster now has to be made up of 1 node (due to what appear to be changes to Azure's pricing and quotas), meaning it isn't really a cluster, so when federation is attempted Azure is disregarded since it's not a cluster. It would help if someone with more experience could comment.
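
If this theory holds, the knobs to check are the Azure server count and the WAN join configuration. As a rough sketch only (the option names are standard Consul agent settings, but the values and how the book's templates render them are assumptions), an Azure Consul server would need something like:

```hcl
# Consul server agent configuration (sketch); values are illustrative.
server           = true
datacenter       = "azure"
bootstrap_expect = 3                                  # expects a real 3-node cluster
retry_join_wan   = ["<aws-consul-server-address>"]    # WAN-federate with the AWS datacenter
```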

scottwinkler commented 2 years ago

So it used to be 3, but there was a change to the Azure free tier which limits the number of instances that can be started at a time, and this was causing some issues, so I reduced the number to 1. I understand this makes it not a real cluster anymore, but you can always change the value back. I have not had time to look into this further. If you would like to make a PR, I would be happy to have a fix for this.
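
For anyone making that change, it is a one-value edit. A sketch of the kind of variable involved; the actual name and location in the chapter 8 Azure code may differ, so treat this as illustrative:

```hcl
# Hypothetical variable controlling the Azure server count; check the
# chapter8 Azure module for the real attribute name.
variable "instance_count" {
  description = "Number of Consul/Nomad server VMs in the Azure cluster"
  type        = number
  default     = 3   # was reduced to 1 to fit Azure free-tier quotas
}
```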

cschar commented 1 year ago

For anyone coming across this issue, I followed Scott's instructions above and bumped the cluster back to 3.

Before, there were no regions showing up in my Nomad console:

(screenshot: Nomad console with no regions listed)

I changed the code to adjust the cluster size:

(screenshot: code change bumping the cluster size back to 3)

Afterwards, the regions showed:

(screenshot: Nomad console listing the regions)