Moving the transformer wholesale looks like it broke some tests, with
some actually doing legitimate work in normalizing singular resources
from the foo.0 notation to just foo.
Adjusted TestPlanGraphBuilder to account for the extra
meta.count-boundary nodes now present in the graph output, and added
another context test covering this case. The issue appears to happen
during validate, since that is where the state can be left broken if
things are not properly transformed in the plan graph.
This fixes interpolation issues on grandchild data sources that have
multiple instances (i.e. counts). For example, baz depends on bar, which
depends on foo.
In this case, after an initial Terraform run is done and state is saved,
the next refresh/plan is not properly transformed: instead of the
graph/state coming through as data.x.bar.0, it comes through as
data.x.bar, which breaks interpolations that rely on splat operators
such as data.x.bar.*.out.
A couple of tests require lowering the grace period to keep them from
taking the full 30s timeout.
The Retry_hang test also needed to be removed from the Parallel group,
because it modifies the global refreshGracePeriod variable.
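As a minimal sketch of that constraint (assuming the test sits in the same
package as the refreshGracePeriod global; the shortened duration is
illustrative, not the value used in the real test):

```go
func TestRetry_hang(t *testing.T) {
	// No t.Parallel() here: this test swaps out a package-level variable,
	// and any test running in parallel would observe the shortened value.
	old := refreshGracePeriod
	refreshGracePeriod = 25 * time.Millisecond // illustrative value
	defer func() { refreshGracePeriod = old }()

	// ... exercise Retry / WaitForState with the shortened grace period ...
}
```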
Refresh calls may have side effects that need to be recorded if the call
succeeds, which is especially common when WaitForState is called from
resource.Retry.
If the WaitForState timeout is reached while a Refresh call is in
flight, wait up to refreshGracePeriod (set to 30s) for it to complete.
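A sketch of the call pattern in question, assuming a hypothetical provider
client (apiClient, DescribeInstance, and the status names are placeholders,
not part of the change): WaitForState is driven from inside resource.Retry,
and each Refresh call can have remote side effects whose result should not
be discarded if the call ultimately succeeds.

```go
package example

import (
	"time"

	"github.com/hashicorp/terraform/helper/resource"
)

type instance struct{ Status string }

// apiClient stands in for a real provider API client.
type apiClient struct{}

func (c *apiClient) DescribeInstance(id string) (*instance, error) {
	// Hypothetical call; in a real provider this may also create or
	// mutate remote objects as a side effect.
	return &instance{Status: "available"}, nil
}

func waitForInstance(client *apiClient, id string) error {
	return resource.Retry(10*time.Minute, func() *resource.RetryError {
		conf := &resource.StateChangeConf{
			Pending: []string{"provisioning"},
			Target:  []string{"available"},
			Timeout: 2 * time.Minute,
			Refresh: func() (interface{}, string, error) {
				inst, err := client.DescribeInstance(id)
				if err != nil {
					return nil, "", err
				}
				return inst, inst.Status, nil
			},
		}
		// If the timeout fires while a Refresh call is still in flight,
		// WaitForState waits up to refreshGracePeriod so a successful
		// result is still recorded.
		if _, err := conf.WaitForState(); err != nil {
			return resource.RetryableError(err)
		}
		return nil
	})
}
```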
Previously we fixed this specifically for the Enterprise VCS integration,
but we also had some long-standing errors of this sort in the docs on
how to specify module sources on Bitbucket.
This test unfortunately relies on the timing of the loops in
WaitForState, and the text of the error message. Adjust the timing so
the timeout isn't an even multiple of the poll interval, and make sure
we reach a minimum number of retries.
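For illustration only (the concrete durations and refreshFunc are made up,
not the ones used in the test), the kind of timing this refers to:

```go
conf := &resource.StateChangeConf{
	Pending:      []string{"pending"},
	Target:       []string{"done"},
	Refresh:      refreshFunc, // placeholder StateRefreshFunc
	PollInterval: 30 * time.Millisecond,
	// Not an even multiple of PollInterval, so the final poll and the
	// timeout don't land on the same tick, and long enough that several
	// retries happen before the timeout fires.
	Timeout: 205 * time.Millisecond,
}
```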
Make sure that we can cancel the WaitForState refresh loop when the
timeout is reached; otherwise it may run indefinitely. There's no need
to try to store and read the Result concurrently: just pass the value
over a channel.
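A generic Go sketch of that approach (names and durations are illustrative,
not the actual WaitForState internals): the refresh loop sends its result
over a channel instead of writing to a shared Result field, and a cancel
channel stops the loop once the timeout is reached, with a grace period for
an in-flight call as described above.

```go
package example

import (
	"errors"
	"time"
)

type result struct {
	value interface{}
	err   error
}

// waitWithTimeout polls refresh until it yields a value or an error,
// passing the outcome over a channel rather than sharing it between
// goroutines, and cancelling the loop once the timeout is reached.
func waitWithTimeout(refresh func() (interface{}, error), timeout, grace time.Duration) (interface{}, error) {
	resCh := make(chan result, 1) // buffered so a late result never blocks
	cancel := make(chan struct{})

	go func() {
		for {
			select {
			case <-cancel:
				return
			default:
			}
			v, err := refresh()
			if err != nil || v != nil {
				resCh <- result{v, err}
				return
			}
			time.Sleep(100 * time.Millisecond)
		}
	}()

	select {
	case res := <-resCh:
		return res.value, res.err
	case <-time.After(timeout):
		close(cancel)
		// Give an in-flight refresh call a grace period to finish so a
		// successful result (and its side effects) is still recorded.
		select {
		case res := <-resCh:
			return res.value, res.err
		case <-time.After(grace):
			return nil, errors.New("timeout while waiting for state")
		}
	}
}
```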