[TEP074] Remove Storage pipelineResources #6014

Closed
wants to merge 2 commits into from
33 changes: 0 additions & 33 deletions config/config-artifact-bucket.yaml

This file was deleted.

96 changes: 0 additions & 96 deletions docs/resources.md
@@ -47,8 +47,6 @@ For example:
- [Resource types](#resource-types)
- [Git Resource](#git-resource)
- [Image Resource](#image-resource)
- [Storage Resource](#storage-resource)
- [GCS Storage Resource](#gcs-storage-resource)
- [Why Aren't PipelineResources in Beta?](#why-aren-t-pipelineresources-in-beta)

## Syntax
@@ -540,100 +538,6 @@ status:
If the `index.json` file is not produced, the image digest will not be included
in the `taskRun` output.

### Storage Resource

The `storage` resource represents blob storage that contains either an object
or a directory. Adding a storage resource as an input to a `Task` downloads
the blob and allows the `Task` to act on its contents.

Currently, [Google Cloud Storage](https://cloud.google.com/storage/) (`gcs`) is
the only supported blob storage type, available via the
[GCS storage resource](#gcs-storage-resource).

#### GCS Storage Resource

The `gcs` storage resource points to a
[Google Cloud Storage](https://cloud.google.com/storage/) blob.

To create a GCS type of storage resource using the `PipelineResource` CRD:

```yaml
apiVersion: tekton.dev/v1alpha1
kind: PipelineResource
metadata:
  name: wizzbang-storage
  namespace: default
spec:
  type: storage
  params:
    - name: type
      value: gcs
    - name: location
      value: gs://some-bucket
    - name: dir
      value: "y" # This can have any value to be considered "true"
```

The following params can be added:

1. `location`: the location of the blob storage.
1. `type`: the type of blob storage. For the GCS storage resource, this
   value should be set to `gcs`.
1. `dir`: whether the blob storage is a directory. By default,
   a storage artifact is not considered a directory.

   - If the artifact is a directory, the `-r` (recursive) flag is used to
     copy all files under the source directory to the GCS bucket, e.g.
     `gsutil cp -r source_dir/* gs://some-bucket`.
   - If the artifact is a single file, such as a zip or tar archive, the copy
     is only one level deep (not recursive) and does not include
     subdirectories of the source directory, e.g. `gsutil cp source.tar
     gs://some-bucket.tar`.

Private buckets can also be configured as storage resources. To access private
GCS buckets, service accounts with the correct permissions are required. The
`secrets` field on the storage resource is used to configure this information.
Below is an example of how to create a storage resource with a service account.

1. Refer to the
   [official documentation](https://cloud.google.com/compute/docs/access/service-accounts)
   on how to create service accounts and configure
   [IAM permissions](https://cloud.google.com/storage/docs/access-control/iam-permissions)
   to access buckets.

1. Create a Kubernetes secret from a downloaded service account JSON key:

```bash
kubectl create secret generic bucket-sa --from-file=./service_account.json
```

1. To access the private GCS bucket, the
   [`GOOGLE_APPLICATION_CREDENTIALS`](https://cloud.google.com/docs/authentication/production)
   environment variable must be set, so apply the secret created above to the
   GCS storage resource under the `fieldName` key.

```yaml
apiVersion: tekton.dev/v1alpha1
kind: PipelineResource
metadata:
  name: wizzbang-storage
  namespace: default
spec:
  type: storage
  params:
    - name: type
      value: gcs
    - name: location
      value: gs://some-private-bucket
    - name: dir
      value: "y"
  secrets:
    - fieldName: GOOGLE_APPLICATION_CREDENTIALS
      secretName: bucket-sa
      secretKey: service_account.json
```
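
To show how such a resource would be bound at run time, here is a minimal
sketch of a `TaskRun` that references the `wizzbang-storage` resource above;
the `TaskRun` name and the referenced `Task` are placeholders for this
example:

```yaml
apiVersion: tekton.dev/v1beta1
kind: TaskRun
metadata:
  name: read-from-private-bucket # hypothetical name
spec:
  taskRef:
    name: list-gcs-contents # assumes a Task with a storage input named "workspace"
  resources:
    inputs:
      # Binds the PipelineResource defined above; its service account secret is
      # exposed to the resource's steps as GOOGLE_APPLICATION_CREDENTIALS.
      - name: workspace
        resourceRef:
          name: wizzbang-storage
```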

--------------------------------------------------------------------------------

8 changes: 0 additions & 8 deletions docs/variables.md
@@ -108,14 +108,6 @@ variable via `resources.inputs.<resourceName>.<variableName>` or
| `url` | The complete path to the image. |
| `digest` | The digest of the image. |

#### Variables for the `GCS` type

| Variable | Description |
| -------- | ----------- |
| `name` | The name of the resource. |
| `type` | Type value of `"gcs"`. |
| `location` | The fully qualified address of the blob storage. |
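
As a sketch of how these variables could be used inside a `Task` step (the
resource name `wizzbang-storage` and the step below are assumptions for this
example):

```yaml
steps:
  - name: show-location
    image: busybox
    script: |
      # Expands to the resource's "location" param, e.g. gs://some-bucket
      echo "Reading from $(resources.inputs.wizzbang-storage.location)"
```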

#### Variables for the `Cluster` type

> Review comment (Member): it looks like there's some cloudevent and cluster docs hanging around here

| Variable | Description |
106 changes: 0 additions & 106 deletions examples/v1beta1/pipelineruns/output-pipelinerun.yaml

This file was deleted.

26 changes: 0 additions & 26 deletions examples/v1beta1/taskruns/gcs-resource.yaml

This file was deleted.

5 changes: 0 additions & 5 deletions hack/update-codegen.sh
@@ -72,11 +72,6 @@ ${PREFIX}/deepcopy-gen \
--go-header-file ${REPO_ROOT_DIR}/hack/boilerplate/boilerplate.go.txt \
-i github.com/tektoncd/pipeline/pkg/apis/pipeline/pod

${PREFIX}/deepcopy-gen \
-O zz_generated.deepcopy \
--go-header-file ${REPO_ROOT_DIR}/hack/boilerplate/boilerplate.go.txt \
-i github.com/tektoncd/pipeline/pkg/apis/resource/v1alpha1/storage

${PREFIX}/deepcopy-gen \
-O zz_generated.deepcopy \
--go-header-file ${REPO_ROOT_DIR}/hack/boilerplate/boilerplate.go.txt \