Fixes: #17314
We now handle creation of the state file correctly; previously a
ResourceNotFound error was not handled well.
With this change, we first attempt to create the state file and, if that
returns an error, we then look for an existing state file. Previously the
order of these operations was reversed.
If a count field references another count field which is interpolated
but is attached to a resource already in the state, the result of that
first interpolation will be lost when a plan is serialized. This is
because the result of the first interpolation is stored directly in the
module config, in an unexported config field.
This is not a general fix for the above situation, which would require
refactoring how counts are handled throughout the config. Ignoring the
error works because, in most cases, the count will be properly handled
during the resource's own interpolation.
An interpolated count value that is determined during plan is lost
during plan serialization, causing apply to fail when the interpolation
string can't be evaluated.
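The shape of configuration involved looks roughly like the following sketch; the resource type, names, and the exact reference form are illustrative only, not taken from the issue:

variable "a_count" {
  default = 2
}

resource "null_resource" "a" {
  # This count is interpolated, and "a" is already tracked in the state.
  count = "${var.a_count}"
}

resource "null_resource" "b" {
  # Hypothetical reference form: this second count depends on the result
  # of the first count's interpolation, which is what gets lost when the
  # plan is serialized.
  count = "${null_resource.a.count}"
}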
Previously we just ported over the simple "string", "list", and "map" type
hint keywords from the old loader, which exist primarily as hints to the
CLI for whether to treat -var=... arguments and environment variables as
literal strings or as HCL expressions.
However, we have had requests to allow more specific constraints here,
because it's generally better UX for a type error to be detected within an
expression in a calling "module" block rather than at some point deep
inside a third-party module.
To allow for more specific constraints, here we use the type constraint
expression syntax defined as an extension within HCL, which uses the
variable and function call syntaxes to represent types rather than values,
like this:
- string
- number
- bool
- list(string)
- list(any)
- list(map(string))
- object({id=string,name=string})
In native HCL syntax this looks like:
variable "foo" {
type = map(string)
}
In JSON, this looks like:
{
  "variable": {
    "foo": {
      "type": "map(string)"
    }
  }
}
The selection of literal processing or HCL parsing of CLI-set values is
now explicit in the model and separate from the type, though it's still
derived from the type constraint and thus not directly controllable in
configuration.
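As a rough sketch of that behavior (variable names and values here are just examples): a primitive type constraint implies literal processing of a CLI-set value, while a collection constraint implies HCL expression parsing:

variable "image_id" {
  type = string
  # -var="image_id=ami-123456" or TF_VAR_image_id is taken as a literal string.
}

variable "tags" {
  type = map(string)
  # -var='tags={env="dev"}' is parsed as an HCL expression, since the
  # constraint is a collection type.
}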
Since this syntax is more complex than the keywords it replaces, the
simpler keywords are still supported for now, and "list" and "map" are
interpreted as list(any) and map(any) respectively, mimicking how they
were interpreted by Terraform 0.11 and earlier. For the time being our
documentation should continue to recommend these shorthand versions until
we gain more experience with the more specific type constraints; most
users should just make use of the additional primitive type constraints
this enables: bool and number.
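A brief sketch of the equivalences and new primitive constraints described above (variable names are illustrative):

variable "azs" {
  type = "list"        # legacy shorthand, interpreted as list(any)
}

variable "region_tags" {
  type = map(string)   # more specific constraint now possible
}

variable "enable_dns" {
  type = bool          # new primitive constraint
}

variable "instance_count" {
  type = number        # new primitive constraint
}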
As a result of these more-complete type constraints, we can now type-check
the default value at config load time, which has the nice side-effect of
allowing us to produce a tailored error message if an override file
produces an invalid situation; previously the result was rather confusing
because the error message referred to the original definition of the
variable and not the overridden parts.
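For instance (filenames and values here are hypothetical), an override file could change a default so that it no longer matches the declared type, and that mismatch can now be reported at load time with a message pointing at the override:

# variables.tf
variable "allowed_cidrs" {
  type    = list(string)
  default = ["10.0.0.0/8"]
}

# variables_override.tf
variable "allowed_cidrs" {
  # The overridden default no longer conforms to list(string), which can
  # now be caught when the configuration is loaded.
  default = "10.0.0.0/8"
}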
Although we do still consider these quoted keyword forms deprecated for
0.12, we'll defer actually generating warnings for them until a later
minor release so that module authors can retain their quoted identifiers
for a period after the 0.12 release, for backward-compatibility with
Terraform 0.11.
The error-handling behavior of the HCL parser was improved, which causes
the number of diagnostics and the diagnostic messages to differ in cases
where a block-like introduction is given without any following body.
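For example, input like the following hypothetical fragment, where a block header appears with no body after it, now produces different diagnostics than before:

# Block header with no following body, e.g. at the end of a file:
resource "aws_instance" "example"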
This update is largely minor bug fixes for issues found since we last
updated the vendoring. There are some new features here that Terraform is
not yet using, which therefore present little risk.
In particular this includes the HCL-JSON spec change where arrays can now
be used at any level in a block label structure, to allow for preserving
the relative order of blocks.
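Roughly, that spec change permits JSON like the following sketch, where an array appears inside the label structure so that the relative order of the two blocks is preserved (resource names and values are illustrative):

{
  "resource": {
    "aws_instance": [
      { "a": { "ami": "ami-111111" } },
      { "b": { "ami": "ami-222222" } }
    ]
  }
}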
While not normally possible, manual manipulation of the state and config
can cause us to end up with a nil config in
evalTreeManagedResourceNoState.
Regardless of how it got here, we can't ever assume the Config field is
not nil, and EvalInterpolate happily accepts a nil RawConfig.
Also:
- In the getting started guide, the TFE content was all tailored to the older
run-locally workflow. I've replaced it with some brief explanation and a link
to the dedicated TFE getting started guide.
- Fixed a sidebar link glitch in the configuration section. (Both "Terraform"
and "Terraform Enterprise" were marked as active if you were on the TFE page.)
- Renamed the "Terraform Enterprise" page "Terraform Push." (Some people have
gotten confused and landed on this page when trying to set up the `atlas`
remote backend.)