@@ -1355,7 +1355,7 @@ def update!(**args)
 class GcsDestination
   include Google::Apis::Core::Hashable
 
-  # The URI of the Cloud Storage object. It's the same URI that is used by gsutil.
+  # The URI of the Cloud Storage object. It's the same URI that is used by gcloud storage.
   # Example: "gs://bucket_name/object_name". See [Viewing and Editing Object
   # Metadata](https://cloud.google.com/storage/docs/viewing-editing-metadata) for
   # more information. If the specified Cloud Storage object already exists and
@@ -464,7 +464,7 @@ def update!(**args)
 class GcsDestination
   include Google::Apis::Core::Hashable
 
-  # The URI of the Cloud Storage object. It's the same URI that is used by gsutil.
+  # The URI of the Cloud Storage object. It's the same URI that is used by gcloud storage.
   # For example: "gs://bucket_name/object_name". See [Viewing and Editing Object
   # Metadata](https://cloud.google.com/storage/docs/viewing-editing-metadata) for
   # more information.
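Both hunks above touch the same comment on a generated `GcsDestination` class in two different service surfaces (the file names aren't shown in this view). As a hedged sketch of how such a class is typically defined and used, assuming a placeholder `ExampleServiceV1` module, a `uri` attribute inferred from the comment text, and the `google/apis/core/hashable` require from the google-apis-core gem:

```ruby
# A minimal sketch of the pattern shown in the GcsDestination hunks above.
# "ExampleServiceV1" is a placeholder module; the real service files are not
# named in this diff view.
require "google/apis/core/hashable"

module Google
  module Apis
    module ExampleServiceV1
      class GcsDestination
        include Google::Apis::Core::Hashable

        # The URI of the Cloud Storage object, in the same
        # "gs://bucket_name/object_name" form that `gcloud storage` accepts.
        attr_accessor :uri

        def initialize(**args)
          update!(**args)
        end

        # Update properties of this object.
        def update!(**args)
          @uri = args[:uri] if args.key?(:uri)
        end
      end
    end
  end
end

# Usage: the same URI string works on the command line, e.g.
#   gcloud storage cp report.txt gs://bucket_name/object_name
destination = Google::Apis::ExampleServiceV1::GcsDestination.new(
  uri: "gs://bucket_name/object_name"
)
```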
@@ -787,18 +787,18 @@ def update!(**args)
 # otherwise. The pipeline runner should add a key/value pair to either the
 # inputs or outputs map. The indicated data copies will be carried out before/
 # after pipeline execution, just as if the corresponding arguments were provided
-# to `gsutil cp`. For example: Given the following `PipelineParameter`,
+# to `gcloud storage cp`. For example: Given the following `PipelineParameter`,
 # specified in the `inputParameters` list: ``` `name: "input_file", localCopy: `
 # path: "file.txt", disk: "pd1"`` ``` where `disk` is defined in the `
 # PipelineResources` object as: ``` `name: "pd1", mountPoint: "/mnt/disk/"` ```
 # We create a disk named `pd1`, mount it on the host VM, and map `/mnt/pd1` to `/
 # mnt/disk` in the docker container. At runtime, an entry for `input_file` would
 # be required in the inputs map, such as: ``` inputs["input_file"] = "gs://my-
-# bucket/bar.txt" ``` This would generate the following gsutil call: ``` gsutil
+# bucket/bar.txt" ``` This would generate the following gcloud storage call: ``` gcloud storage
 # cp gs://my-bucket/bar.txt /mnt/pd1/file.txt ``` The file `/mnt/pd1/file.txt`
 # maps to `/mnt/disk/file.txt` in the Docker container. Acceptable paths are:
 # Google Cloud storage pathLocal path file file glob directory For outputs, the
-# direction of the copy is reversed: ``` gsutil cp /mnt/disk/file.txt gs://my-
+# direction of the copy is reversed: ``` gcloud storage cp /mnt/disk/file.txt gs://my-
 # bucket/bar.txt ``` Acceptable paths are: Local pathGoogle Cloud Storage path
 # file file file directory - directory must already exist glob directory -
 # directory will be created if it doesn't exist One restriction due to docker
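The comment in this last hunk describes how a `localCopy` parameter plus an entry in the `inputs` map turns into a copy before the pipeline runs. A small sketch of that same example follows, assuming the Genomics v1alpha2 generated classes (`Google::Apis::GenomicsV1alpha2::PipelineParameter` and `LocalCopy`); the owning file isn't named in this view, so the module and require path are assumptions:

```ruby
require "google/apis/genomics_v1alpha2" # assumed require path for the generated client

# Declared once in the pipeline's `inputParameters` list: whatever the caller
# supplies for "input_file" is copied onto disk "pd1" as "file.txt".
input_param = Google::Apis::GenomicsV1alpha2::PipelineParameter.new(
  name: "input_file",
  local_copy: Google::Apis::GenomicsV1alpha2::LocalCopy.new(
    path: "file.txt", # destination file name on the mounted disk
    disk: "pd1"       # must match a disk declared in PipelineResources
  )
)

# Supplied per run in the inputs map:
inputs = { "input_file" => "gs://my-bucket/bar.txt" }

# Before execution the runner performs the equivalent of:
#   gcloud storage cp gs://my-bucket/bar.txt /mnt/pd1/file.txt
# which the container sees at /mnt/disk/file.txt. For outputs, the same shape
# is used, but the copy runs after the pipeline and in the opposite direction.
```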