Remote state issue: Status=413 Code="RequestBodyTooLarge" #36135
Closed
Description
Terraform Version
Terraform v1.9.8
on darwin_arm64
+ provider registry.terraform.io/hashicorp/azurerm v4.12.0
+ provider registry.terraform.io/hashicorp/local v2.5.2
+ provider registry.terraform.io/hashicorp/random v3.6.3

Terraform Configuration Files
provider "azurerm" {
  features {
    resource_group {
      prevent_deletion_if_contains_resources = false
    }
    key_vault {
      purge_soft_delete_on_destroy    = true
      recover_soft_deleted_key_vaults = true
    }
  }
  skip_provider_registration = true
}
terraform {
  required_providers {
    azurerm = {
      source = "hashicorp/azurerm"
      # version = "~> 3.0"
    }
    random = {
      source  = "hashicorp/random"
      version = "~> 3.0"
    }
  }
  backend "azurerm" {
    container_name       = "big-stat"
    key                  = "big_state_test"
    resource_group_name  = "my-dev-tf"
    storage_account_name = "mydevsatf"
  }
}

Debug Output
https://gist.github.com/madejusz/048078a9f2725186c30f6157d36f5fb9
Expected Behavior
State files bigger than 250MB should work with the AzureRM remote state backend (storage account).
Actual Behavior
Error: Failed to save state
│
│ Error saving state: blobs.Client#PutBlockBlob: Failure responding to request: StatusCode=413 -- Original Error: autorest/azure: Service returned an error. Status=413 Code="RequestBodyTooLarge" Message="The request body is too large and exceeds the maximum permissible
│ limit.\nRequestId:c3a3d901-901e-002f-4957-42b6f2000000\nTime:2024-11-29T12:06:10.0500344Z"
╵
╷
│ Error: Failed to persist state to backend
│
│ The error shown above has prevented Terraform from writing the updated state to the configured backend. To allow for recovery, the state has been written to the file "errored.tfstate" in the current working directory.
│
│ Running "terraform apply" again at this point will create a forked state, making it harder to recover.
│
│ To retry writing this state, use the following command:
│ terraform state push errored.tfstate
╵

Steps to Reproduce
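The arithmetic behind the failure is simple to check. A back-of-the-envelope sketch (the 300-resource count and 1 MiB input file come from the reproduction steps; the ~250 MB threshold is the reporter's figure, not a documented service limit; assuming each `local_file` resource embeds the full file content in state):

```shell
# Estimate the state payload this reproduction produces (assumption:
# each of the 300 local_file resources stores the full 1 MiB content).
files=300
bytes_per_file=1048576                 # 1 MiB input file
total=$((files * bytes_per_file))
limit=$((250 * 1024 * 1024))           # ~250 MB threshold reported above
echo "estimated state payload: ${total} bytes"
[ "$total" -gt "$limit" ] && echo "exceeds the reported threshold"
```

The real state file can be larger still, since the content is stored inside a JSON document alongside resource metadata.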
- Create a file with size 1MB:

  yes "This is some meaningful content for encoding." | head -c 750000 | base64 | head -c 1048576 > file.txt

- Use the file as input for:
  resource "local_file" "foo" {
    for_each = { for i in range(1, 300) : format("foo%03d", i) => i }
    content  = file("${path.module}/file.txt")
    filename = "${path.module}/foo/${each.key}.bar"
  }

- Set up a remote backend pointing to an Azure Storage Account blob container.
- Run:

  $ terraform apply -auto-approve
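When the apply fails this way, Terraform leaves the snapshot in errored.tfstate, as the error message above notes. A hedged sketch of the suggested recovery path (the stub file below stands in for the real snapshot, since an actual push needs a reachable backend):

```shell
# Sketch: inspect the failed state snapshot before retrying the push.
# (Stub file stands in for the errored.tfstate Terraform actually writes.)
printf '{"version": 4, "serial": 1}\n' > errored.tfstate
size=$(wc -c < errored.tfstate)
echo "errored.tfstate is ${size} bytes"
# With the real snapshot and a reachable backend, the retry would be:
#   terraform state push errored.tfstate
rm errored.tfstate
```

Pushing the same oversized state will hit the same 413, so the size issue (e.g. the 300 embedded file copies) has to be addressed first.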
Additional Context
No response
References
No response