This page describes troubleshooting methods for common errors you may encounter while using Cloud Storage.

See the Google Cloud Status Dashboard for information about regional or global incidents affecting Google Cloud services such as Cloud Storage.

## Logging raw requests
When using tools such as
gcloud or the Cloud Storage client libraries, much
of the request and response information is handled by the tool. However, it is
sometimes useful to see details to aid in troubleshooting. Use the
following instructions to return request and response headers for your tool:
### Console

Viewing request and response information depends on the browser you're using to access the Google Cloud console. For the Google Chrome browser:

- Click Chrome's main menu button.
- Select **More Tools**.
- Click **Developer Tools**.
- In the pane that appears, click the **Network** tab.

### Command line

#### gcloud

Use global debugging flags in your request. For example:

```
gcloud storage ls gs://my-bucket/my-object --log-http --verbosity=debug
```

#### gsutil

Use the global `-D` flag in your request. For example:

```
gsutil -D ls gs://my-bucket/my-object
```

### Client libraries

#### C++

- Set the environment variable `CLOUD_STORAGE_ENABLE_TRACING=http` to get the full HTTP traffic.
- Set the environment variable `CLOUD_STORAGE_ENABLE_CLOG=yes` to get logging of each RPC.

#### C#

Add a logger via `ApplicationContext.RegisterLogger`, and set logging options on the `HttpClient` message handler. For more information, see the FAQ entry.

#### Go

Set the environment variable `GODEBUG=http2debug=1`. For more information, see the Go package `net/http`.

If you want to log the request body as well, use a custom HTTP client.

#### Java

Create a file named `logging.properties` with the following contents:

```
# Properties file which configures the operation of the JDK logging facility.
# The system will look for this config file to be specified as a system property:
# -Djava.util.logging.config.file=${project_loc:googleplus-simple-cmdline-sample}/logging.properties

# Set up the console handler (uncomment "level" to show more fine-grained messages)
handlers = java.util.logging.ConsoleHandler
java.util.logging.ConsoleHandler.level = CONFIG

# Set up logging of HTTP requests and responses (uncomment "level" to show)
com.google.api.client.http.level = CONFIG
```

Use logging.properties with Maven:

```
mvn -Djava.util.logging.config.file=path/to/logging.properties insert_command
```

For more information, see Pluggable HTTP Transport.

#### Node.js

Set the environment variable `NODE_DEBUG=https` before calling the Node script.

#### PHP

Provide your own HTTP handler to the client using `httpHandler` and set up middleware to log the request and response.

#### Python

Use the logging module. For example:

```python
import logging
import http.client

logging.basicConfig(level=logging.DEBUG)
http.client.HTTPConnection.debuglevel = 5
```

#### Ruby

At the top of your `.rb` file, after `require "google/cloud/storage"`, add the following:

```ruby
Google::Apis.logger.level = Logger::DEBUG
```

## Error codes

The following are common HTTP status codes you may encounter.

### 301: Moved Permanently

**Issue:** I'm setting up a static website, and accessing a directory path returns an empty object and a `301` HTTP response code.

**Solution:** If your browser downloads a zero-byte object and you get a `301` HTTP response code when accessing a directory, such as `http://www.example.com/dir/`, your bucket most likely contains an empty object of that name. To check that this is the case and fix the issue:

- In the Google Cloud console, go to the Cloud Storage **Buckets** page.
- Click the **Activate Cloud Shell** button at the top of the Google Cloud console.
- Run `gcloud storage ls --recursive gs://www.example.com/dir/`. If the output includes `gs://www.example.com/dir/`, you have an empty object at that location.
- Remove the empty object with the command: `gcloud storage rm gs://www.example.com/dir/`

You can now access `http://www.example.com/dir/` and have it return that directory's `index.html` file instead of the empty object.

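You can also perform this check programmatically. Here is a minimal sketch using the Python client library, assuming the bucket and object names from the example above:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("www.example.com")  # placeholder bucket name

# A zero-byte object whose name ends in "/" acts as an empty
# "directory" placeholder and shadows the real dir/index.html.
blob = bucket.get_blob("dir/")
if blob is not None and blob.size == 0:
    blob.delete()
```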

### 400: Bad Request

**Issue:** While performing a resumable upload, I received this error and the message `Failed to parse Content-Range header`.

**Solution:** The value you used in your `Content-Range` header is invalid. For example, `Content-Range: */*` is invalid and instead should be specified as `Content-Range: bytes */*`. If you receive this error, your current resumable upload is no longer active, and you must start a new resumable upload.

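For reference, the following is a minimal sketch of well-formed `Content-Range` headers on resumable upload requests, using Python's `requests` library; the session URI and sizes are placeholder values:

```python
import requests

# Placeholder: the session URI returned when the resumable upload was initiated.
session_uri = (
    "https://storage.googleapis.com/upload/storage/v1/b/my-bucket/o"
    "?uploadType=resumable&upload_id=SESSION_ID"
)

# Check the status of an interrupted upload; total size unknown: "bytes */*".
status = requests.put(session_uri, headers={"Content-Range": "bytes */*"})

# Upload the first 262,144 bytes of a 1,000,000-byte object.
chunk = b"\0" * 262144  # stand-in for real object data
resp = requests.put(
    session_uri,
    data=chunk,
    headers={"Content-Range": "bytes 0-262143/1000000"},
)
```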

### 401: Unauthorized

**Issue:** Requests to a public bucket directly, or via Cloud CDN, are failing with an `HTTP 401: Unauthorized` and an `Authentication Required` response.

**Solution:** Check that your client, or any intermediate proxy, is not adding an `Authorization` header to requests to Cloud Storage. Any request with an `Authorization` header, even if empty, is validated as if it were an authentication attempt.

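One way to observe this behavior is to compare a bare request with one that carries an empty `Authorization` header; a minimal sketch with placeholder bucket and object names:

```python
import requests

url = "https://storage.googleapis.com/my-bucket/my-object"  # placeholders

# requests sends no Authorization header by default, so a public
# object should be served anonymously.
print(requests.get(url).status_code)

# An empty Authorization header is still treated as an authentication
# attempt and can trigger a 401 on an otherwise public object.
print(requests.get(url, headers={"Authorization": ""}).status_code)
```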

### 403: Account Disabled

**Issue:** I tried to create a bucket but got a `403 Account Disabled` error.

**Solution:** This error indicates that you have not yet turned on billing for the associated project. For steps to enable billing, see Enable billing for a project.

If billing is turned on and you continue to receive this error message, you can reach out to support with your project ID and a description of your problem.


### 403: Forbidden

**Issue:** I should have permission to access a certain bucket or object, but when I attempt to do so, I get a `403 - Forbidden` error with a message similar to: `[email protected] does not have storage.objects.get access to the Google Cloud Storage object`.

**Solution:** You are missing an IAM permission for the bucket or object that is required to complete the request. If you expect to be able to make the request but cannot, perform the following checks:

- Is the grantee referenced in the error message the one you expected? If the error message refers to an unexpected email address or to "Anonymous caller", then your request is not using the credentials you intended. This could be because the tool you are using to make the request was set up with the credentials from another alias or entity, or it could be because the request is being made on your behalf by a service account.

- Is the permission referenced in the error message the one you thought you needed? If the permission is unexpected, it's likely because the tool you're using requires additional access in order to complete your request. For example, in order to bulk delete objects in a bucket, `gcloud` must first construct a list of objects in the bucket to delete. This portion of the bulk delete action requires the `storage.objects.list` permission, which might be surprising, given that the goal is object deletion, which normally requires only the `storage.objects.delete` permission. If this is the cause of your error message, make sure you're granted IAM roles that have the additional necessary permissions; a programmatic way to check which permissions you actually hold is shown in the sketch after this list.

- Are you granted the IAM role on the intended resource or parent resource? For example, if you're granted the **Storage Object Viewer** role for a project and you're trying to download an object, make sure the object is in a bucket that's in the project; you might inadvertently have the **Storage Object Viewer** permission for a different project.

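As a quick diagnostic, you can ask Cloud Storage which permissions your current credentials actually hold on a bucket. Here is a minimal sketch using the Python client library; the bucket name and the permissions checked are placeholders:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-bucket")  # placeholder bucket name

# Returns the subset of the listed permissions that the caller has.
granted = bucket.test_iam_permissions(
    ["storage.objects.get", "storage.objects.list", "storage.objects.delete"]
)
print(granted)
```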

### 403: Forbidden

**Issue:** I am downloading my content from `storage.cloud.google.com`, and I receive a `403: Forbidden` error when I use the browser to navigate to the object using the URL: `https://storage.cloud.google.com/BUCKET_NAME/OBJECT_NAME`

**Solution:** Using `storage.cloud.google.com` to download objects is known as authenticated browser downloads, which uses cookie-based authentication.

If you have configured Data Access audit logs in Cloud Audit Logs to track access to objects, one of the restrictions of that feature is that authenticated browser downloads cannot be used to download a tracked object, unless the object is publicly readable. Attempting to use an authenticated browser download for non-public objects results in a `403` response. This restriction exists to prevent phishing for Google IDs, which are used for cookie-based authentication.

To avoid this issue, do one of the following:
- Use direct API calls, which support unauthenticated downloads, instead of using authenticated browser downloads.

- Disable the Cloud Storage Data Access audit logs that are tracking access to the affected objects. Be aware that Data Access audit logs are set at or above the project level and can be enabled simultaneously at multiple levels.

- Set exemptions to exclude specific users from Data Access audit log tracking, which allows those users to perform authenticated browser downloads.

- Make affected objects publicly readable by granting read permission to either `allUsers` or `allAuthenticatedUsers`. Data Access audit logs do not record access to public objects.

### 409: Conflict

**Issue:** I tried to create a bucket but received the following error: `409 Conflict. Sorry, that name is not available. Please try a different one.`

**Solution:** The bucket name you tried to use (e.g., `gs://cats` or `gs://dogs`) is already taken. Cloud Storage has a global namespace, so you may not name a bucket with the same name as an existing bucket. Choose a name that is not being used.

### 429: Too Many Requests

**Issue:** My requests are being rejected with a `429 Too Many Requests` error.

**Solution:** You are hitting a limit to the number of requests Cloud Storage allows for a given resource. See the Cloud Storage quotas for a discussion of limits in Cloud Storage. If your workload consists of thousands of requests per second to a bucket, see Request rate and access distribution guidelines for a discussion of best practices, including ramping up your workload gradually and avoiding sequential filenames.

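If the tool you're using doesn't already retry for you, handling `429` responses with exponential backoff looks roughly like the following sketch; the URL and retry limits are placeholder values:

```python
import random
import time

import requests

def get_with_backoff(url, max_retries=5):
    """Retry on 429 responses using exponential backoff with jitter."""
    for attempt in range(max_retries):
        resp = requests.get(url)
        if resp.status_code != 429:
            return resp
        # Wait 2^attempt seconds plus jitter before retrying.
        time.sleep(2 ** attempt + random.random())
    return resp
```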

## Diagnosing Google Cloud console errors
**Issue:** When using the Google Cloud console to perform an operation, I get a generic error message. For example, I see an error message when trying to delete a bucket, but I don't see details for why the operation failed.

**Solution:** Use the Google Cloud console's notifications to see detailed information about the failed operation:

- Click the **Notifications** button in the Google Cloud console header.

- A dropdown displays the most recent operations performed by the Google Cloud console.

- Click the item you want to find out more about.

- A page opens up and displays detailed information about the operation.

- Click on each row to expand the detailed error information.

For example, the detailed error information for a failed bucket deletion operation might explain that a bucket retention policy prevented the deletion of the bucket.

## Static website errors
The following are common issues that you may encounter when setting up a bucket to host a static website.

### HTTPS serving

**Issue:** I want to serve my content over HTTPS without using a load balancer.

**Solution:** You can serve static content through HTTPS using direct URIs such as `https://storage.googleapis.com/my-bucket/my-object`. For other options to serve your content through a custom domain over SSL, you can:

- Use a third-party Content Delivery Network with Cloud Storage.

- Serve your static website content from Firebase Hosting instead of Cloud Storage.

### Domain verification

**Issue:** I can't verify my domain.

**Solution:** Normally, the verification process in Search Console directs you to upload a file to your domain, but you may not have a way to do this without first having an associated bucket, which you can only create *after* you have performed domain verification.

In this case, verify ownership using the **Domain name provider** verification method. See Ownership verification for steps to accomplish this. This verification can be done before the bucket is created.

### Inaccessible page

**Issue:** I get an `Access denied` error message for a web page served by my website.

**Solution:** Check that the object is shared. If it is not, see Making Data Public for instructions on how to do this.

If you previously uploaded and shared an object, but then upload a new version of it, you must reshare the object; the new upload replaces the previous public permission.

### Permission update failed

**Issue:** I get an error when I attempt to make my data public.

**Solution:** Make sure that you have the `setIamPolicy` permission for your object or bucket. This permission is granted, for example, in the Storage Admin role. If you have the `setIamPolicy` permission and you still get an error, your bucket might be subject to public access prevention, which does not allow access to `allUsers` or `allAuthenticatedUsers`. Public access prevention might be set on the bucket directly, or it might be enforced through an organization policy that is set at a higher level.

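If public access prevention is not in play and you hold `setIamPolicy`, making a bucket's objects public with the Python client library looks roughly like this; a minimal sketch with a placeholder bucket name:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-bucket")  # placeholder bucket name

# Grant read access on the bucket's objects to allUsers.
policy = bucket.get_iam_policy(requested_policy_version=3)
policy.bindings.append(
    {"role": "roles/storage.objectViewer", "members": {"allUsers"}}
)
bucket.set_iam_policy(policy)  # fails if public access prevention is enforced
```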

### Content download

**Issue:** I am prompted to download my page's content, instead of being able to view it in my browser.

**Solution:** If you specify a `MainPageSuffix` as an object that does not have a web content type, then instead of serving the page, site visitors are prompted to download the content. To resolve this issue, update the `Content-Type` metadata entry to a suitable value, such as `text/html`. See Editing object metadata for instructions on how to do this.

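With the Python client library, for example, the fix is a small metadata patch; a minimal sketch with placeholder bucket and object names:

```python
from google.cloud import storage

client = storage.Client()
blob = client.bucket("www.example.com").blob("index.html")  # placeholders

blob.content_type = "text/html"
blob.patch()  # pushes the metadata change to Cloud Storage
```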

## Latency
The following are common latency issues you might encounter. In addition, the Google Cloud Status Dashboard provides information about regional or global incidents affecting Google Cloud services such as Cloud Storage.

### Upload or download latency

**Issue:** I'm seeing increased latency when uploading or downloading.

**Solution:** Use the `gsutil perfdiag` command to run performance diagnostics from the affected environment. Consider the following common causes of upload and download latency:

- CPU or memory constraints: The affected environment's operating system should have tooling to measure local resource consumption, such as CPU usage and memory usage.

- Disk IO constraints: As part of the `gsutil perfdiag` command, use the `rthru_file` and `wthru_file` tests to gauge the performance impact caused by local disk IO.

- Geographical distance: Performance can be impacted by the physical separation of your Cloud Storage bucket and affected environment, particularly in cross-continental cases. Testing with a bucket located in the same region as your affected environment can identify the extent to which geographic separation is contributing to your latency.

- If applicable, the affected environment's DNS resolver should use the EDNS(0) protocol so that requests from the environment are routed through an appropriate Google Front End.

### CLI or client library latency

**Issue:** I'm seeing increased latency when accessing Cloud Storage with `gcloud storage`, `gsutil`, or one of the client libraries.

**Solution:** The CLIs and the client libraries automatically retry requests when it's useful to do so, and this behavior can effectively increase latency as seen from the end user. Use the Cloud Monitoring metric `storage.googleapis.com/api/request_count` to see if Cloud Storage is consistently serving a retryable response code, such as `429` or `5xx`.

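If retries turn out to be the source of the extra latency, the Python client library lets you tighten the retry policy per call; a minimal sketch, with the deadline and object names as illustrative placeholders:

```python
from google.api_core.retry import Retry
from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-bucket").blob("my-object")  # placeholders

# Cap total time spent on automatic retries at 10 seconds so that
# persistent 429/5xx responses surface quickly instead of stalling.
blob.download_to_filename("my-object.dat", retry=Retry(deadline=10.0))
```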

## Proxy servers
**Issue:** I'm connecting through a proxy server. What do I need to do?

**Solution:** To access Cloud Storage through a proxy server, you must allow access to these domains:

- `accounts.google.com` for creating OAuth2 authentication tokens via `gsutil config`
- `oauth2.googleapis.com` for performing OAuth2 token exchanges
- `*.googleapis.com` for storage requests

If your proxy server or security policy doesn't support allowlisting by domain and instead requires allowlisting by IP network block, we strongly recommend that you configure your proxy server for all Google IP address ranges. You can find the address ranges by querying WHOIS data at ARIN. As a best practice, you should periodically review your proxy settings to ensure they match Google's IP addresses.

We do not recommend configuring your proxy with individual IP addresses you obtain from one-time lookups of `oauth2.googleapis.com` and `storage.googleapis.com`. Because Google services are exposed via DNS names that map to a large number of IP addresses that can change over time, configuring your proxy based on a one-time lookup may lead to failures to connect to Cloud Storage.

If your requests are being routed through a proxy server, you may need to check with your network administrator to ensure that the `Authorization` header containing your credentials is not stripped out by the proxy. Without the `Authorization` header, your requests are rejected and you receive a `MissingSecurityHeader` error.

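As a quick check from the affected environment, you can route a client through the proxy explicitly and see whether authenticated requests still succeed. This sketch assumes the client's HTTP transport honors the standard `HTTPS_PROXY` environment variable, and the proxy address is a placeholder:

```python
import os

# Placeholder proxy address; requests-based transports read this variable.
os.environ["HTTPS_PROXY"] = "http://proxy.example.com:3128"

from google.cloud import storage

client = storage.Client()
# If the proxy strips the Authorization header, this call fails with an
# authentication error instead of listing buckets.
print([b.name for b in client.list_buckets()])
```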

## What's next
- Learn about your support options.

- Find answers to additional questions in the Cloud Storage FAQ.

- Explore how Error Reporting can help you identify and understand your Cloud Storage errors.