If we encounter something that isn't a file -- for example, a dangling
symlink whose referent has been deleted -- we'll ignore it so that we
can either later produce a "no such plugin" error or auto-install a plugin
that will actually work.
This, in principle, allows us to make use of configuration information
when we populate the Meta structure, though we won't actually make use
of that until a subsequent commit.
We don't usually run "acceptance tests" during a Travis run, but this
particular suite doesn't require any special credentials since it just
accesses releases.hashicorp.com to download plugins, so it's
safe to run in Travis at the expense of adding a few more seconds to
the runtime.
Running it in Travis can therefore give us some extra confidence for
pull requests that may inadvertently break certain details of the
workflow, as well as ensuring that these tests are kept up-to-date as
the system changes.
Since we now have a guide that recommends some specific ways to run
Terraform in automation, we can mimic those suggestions in an e2e test and
thus ensure they keep working.
Here we test the three different approaches suggested in the guide:
- init, plan, apply (main case)
- init, apply (e.g. for deploying to a QA/staging environment)
- init, plan (e.g. for verifying a pull request)
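As a rough illustration, the first of those workflows could be exercised
by a test shaped like the sketch below. The fixture directory and helper
are hypothetical and the real e2e harness differs; this just shows the
command sequence under test, shelling out to a terraform binary on PATH.

    package main_test

    import (
        "os/exec"
        "testing"
    )

    // TestPlanApplyAutomation sketches the "init, plan, apply" workflow.
    func TestPlanApplyAutomation(t *testing.T) {
        run := func(args ...string) {
            t.Helper()
            cmd := exec.Command("terraform", args...)
            cmd.Dir = "testdata/full-workflow" // hypothetical fixture dir
            if out, err := cmd.CombinedOutput(); err != nil {
                t.Fatalf("terraform %v failed: %s\n%s", args, err, out)
            }
        }

        run("init", "-input=false")
        run("plan", "-input=false", "-out=tfplan")
        run("apply", "-input=false", "tfplan")
    }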
In 6712192724 we stopped counting data
source destroys in the destroy tally since they are an implementation
detail.
This caused this test to start failing; since the new behavior is
correct here, we just update the test to match.
A refactor introduced an extra `/` in the download url, which causes an
extra redirect during discovery.
Improve a registry test to verify that detection doesn't require the
registry after the modules have been fetched.
This function takes a map of lists of strings and inverts it so that
the string values become keys and the keys become items within the
corresponding lists.
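For illustration, the inversion looks roughly like this sketch (the
function name is illustrative, not necessarily the one used in the code):

    // invertStringListMap inverts a map of string lists: each string
    // value becomes a key, and each original key is appended to the
    // list for every value it contained.
    func invertStringListMap(in map[string][]string) map[string][]string {
        out := make(map[string][]string)
        for k, vals := range in {
            for _, v := range vals {
                out[v] = append(out[v], k)
            }
        }
        return out
    }

So {"a": ["x", "y"], "b": ["y"]} becomes {"x": ["a"], "y": ["a", "b"]}
(list ordering may vary with map iteration order).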
Locals don't need to be evaluated during destroy. Rather than simply
skipping them, remove them from the state as they are encountered. Even
though locals are not persisted in the state, removing them as we go
keeps the state up to date as the destroy happens and reduces the chance
of other inconsistencies later on.
These tests were written before subtest support was available. By running
them as subtests we can get better output in the event of an error, or
in verbose mode.
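In general terms, the change is from a plain loop over cases to
t.Run-based subtests, so each case is reported under its own name
(sketch only; doSomething stands in for the code under test):

    for name, tc := range testCases {
        t.Run(name, func(t *testing.T) {
            got := doSomething(tc.input) // placeholder for the code under test
            if got != tc.want {
                t.Fatalf("wrong result\ngot:  %q\nwant: %q", got, tc.want)
            }
        })
    }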
Add shell tab completion for all of the subcommands under
"terraform workspace", providing the appropriate kind of auto-complete for
each argument, along with completion for any flags.
This helper is a Predictor for the "complete" package that tries to
auto-complete workspace names from the current backend, if it's
initialized and operable.
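A minimal sketch of such a predictor, using the github.com/posener/complete
API; the backendWorkspaces helper here is hypothetical and stands in for
whatever asks the initialized backend for its workspace names:

    // completePredictWorkspaceName returns a predictor that completes
    // workspace names, or nothing if the backend can't be reached.
    func (m *Meta) completePredictWorkspaceName() complete.Predictor {
        return complete.PredictFunc(func(args complete.Args) []string {
            names, err := m.backendWorkspaces() // hypothetical helper
            if err != nil {
                return nil // backend not initialized or not operable
            }
            return names
        })
    }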
The predictors built in to the "complete" package assume that the same
type of argument is repeated indefinitely, but most Terraform commands
don't work like that, so this helper allows us to define a sequence of
predictors that apply to each argument in turn.
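Roughly, such a sequence helper can dispatch to the predictor whose
position matches the number of arguments already completed. The type name
is illustrative, and the exact index bookkeeping in the real code may
differ:

    // predictorSequence applies a different predictor to each argument
    // position in turn, rather than repeating one predictor indefinitely.
    type predictorSequence []complete.Predictor

    func (s predictorSequence) Predict(args complete.Args) []string {
        idx := len(args.Completed)
        if idx >= len(s) {
            return nil // no completion beyond the defined positions
        }
        return s[idx].Predict(args)
    }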
The CLI package has automatic support for shell autocomplete (bash and
zsh, at time of writing) for subcommands, so all we need to do here is
opt in to it.
Users can install this into their shells by running:
terraform -install-autocomplete
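Opting in amounts to enabling autocomplete on the cli.CLI value, assuming
the github.com/mitchellh/cli package; the surrounding setup (imports,
the commands factory map) is omitted here:

    c := &cli.CLI{
        Name:         "terraform",
        Args:         os.Args[1:],
        Commands:     commands, // existing subcommand factory map
        Autocomplete: true,     // enables shell completion and the install flags
    }

With that set, the CLI package answers completion requests from the shell
and installs or uninstalls the shell hooks when asked.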
This adds NoZeroValues, a small SchemaValidateFunc that checks that a
defined value is not a zero value. It's useful for situations where you
want to keep someone from explicitly entering a zero value (i.e.
literally "0" or an empty string) on a required field, and want to
catch it in the validation stage rather than during apply using GetOk.
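Its shape is roughly the following sketch of a SchemaValidateFunc (uses
fmt; the real implementation may cover more types and word its errors
differently):

    // NoZeroValues rejects values that are the zero value for their
    // type, e.g. "" for strings and 0 for numbers.
    func NoZeroValues(i interface{}, k string) (ws []string, errors []error) {
        switch v := i.(type) {
        case string:
            if v == "" {
                errors = append(errors, fmt.Errorf("%s must not be empty", k))
            }
        case int:
            if v == 0 {
                errors = append(errors, fmt.Errorf("%s must not be zero", k))
            }
        case float64:
            if v == 0 {
                errors = append(errors, fmt.Errorf("%s must not be zero", k))
            }
        }
        return
    }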
Module detection currently requires calling the registry to determine
the subdirectory. Since we're not accessing the subdirectory directly
through FolderStorage, and are now handling it within Terraform so modules
can reference sibling paths, we need to call out to the registry every
time we load a configuration to verify the module's subdirectory, which
is returned during Detect.
Record the subdirectories for each module at the top level of the
FolderStorage path for retrieval during Tree.Load. This lets us bypass
Detection altogether: modules can be loaded without re-detecting.
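One way to picture the record (purely illustrative; the real storage
layout and file name differ, and this uses encoding/json, io/ioutil,
and path/filepath): a small manifest keyed by module key, written at
the top of the storage directory.

    // recordSubDir is a sketch: it stores the resolved subdirectory for
    // a module key so later loads can skip detection.
    func recordSubDir(storageDir, key, subDir string) error {
        manifest := filepath.Join(storageDir, "module-subdirs.json") // hypothetical file name
        records := map[string]string{}
        if data, err := ioutil.ReadFile(manifest); err == nil {
            if err := json.Unmarshal(data, &records); err != nil {
                return err
            }
        }
        records[key] = subDir
        data, err := json.MarshalIndent(records, "", "  ")
        if err != nil {
            return err
        }
        return ioutil.WriteFile(manifest, data, 0644)
    }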
In order to remain backward compatible with some modules, we need to
handle subdirs during Load. This means duplicating part of the go-getter
code path for subDir handling so we can resolve any subDirs and globs
internally, while keeping the entire remote directory structure within
the file storage.
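The internal subDir resolution is, in spirit, something like this sketch
mirroring go-getter's behavior; the function name and error wording are
illustrative:

    // resolveSubDir maps a possibly-globbed subDir onto the downloaded
    // module directory, requiring a glob to match exactly one entry.
    func resolveSubDir(root, subDir string) (string, error) {
        if !strings.ContainsAny(subDir, "*?[") {
            return filepath.Join(root, subDir), nil
        }
        matches, err := filepath.Glob(filepath.Join(root, subDir))
        if err != nil {
            return "", err
        }
        if len(matches) != 1 {
            return "", fmt.Errorf("subdir %q matched %d entries, want exactly 1", subDir, len(matches))
        }
        return matches[0], nil
    }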
Test that we can get a subdirectory from a tarball (or any other
"packed" source that we support).
The 'tar-subdir-to-parent' test highlights a regression where the
subdirectory module references a module in its parent directory. This
breaks the intended use of the subdirectory and the implementation in
go-getter. We need to fix this in terraform, and possibly plan warnings
and deprecations for this type of source.