The code for this project was merged into the https://github.com/elyra-ai/elyra repository.
# kfp-notebook

`kfp-notebook` implements the Kubeflow Pipelines operator `NotebookOp`, which supports processing of notebooks, Python scripts, and R scripts in pipelines.
To build and install `kfp-notebook` from source:

```bash
make clean install
```
The example below can easily be added to a Python script or Jupyter notebook for testing purposes.
```python
import os
import kfp

from kfp_notebook.pipeline import NotebookOp
from kubernetes.client.models import V1EnvVar

kfp_url = 'http://dataplatform.ibm.com:32488/pipeline'

cos_endpoint = 'http://s3.us-south.cloud-object-storage.appdomain.cloud'
cos_bucket = 'test-bucket'
cos_username = 'test'
cos_password = 'test123'
cos_directory = 'test-directory'
cos_dependencies_archive = 'test-archive.tar.gz'

inputs = []
outputs = []
image = 'tensorflow/tensorflow:latest'
def run_notebook_op(op_name, notebook_path):

    notebook_op = NotebookOp(name=op_name,
                             notebook=notebook_path,
                             cos_endpoint=cos_endpoint,
                             cos_bucket=cos_bucket,
                             cos_directory=cos_directory,
                             cos_dependencies_archive=cos_dependencies_archive,
                             pipeline_outputs=outputs,
                             pipeline_inputs=inputs,
                             image=image)
    # Pass object storage credentials to the container via environment variables
    notebook_op.container.add_env_variable(V1EnvVar(name='AWS_ACCESS_KEY_ID', value=cos_username))
    notebook_op.container.add_env_variable(V1EnvVar(name='AWS_SECRET_ACCESS_KEY', value=cos_password))
    notebook_op.container.set_image_pull_policy('Always')
    return notebook_op
def demo_pipeline():
    stats_op = run_notebook_op('stats', 'generate-community-overview')
    contributions_op = run_notebook_op('contributions', 'generate-community-contributions')
    run_notebook_op('overview', 'overview').after(stats_op, contributions_op)
# Compile the pipeline, upload it, and start a run
kfp.compiler.Compiler().compile(demo_pipeline, 'pipelines/pipeline.tar.gz')

client = kfp.Client(host=kfp_url)

pipeline_info = client.upload_pipeline('pipelines/pipeline.tar.gz', pipeline_name='pipeline-demo')

experiment = client.create_experiment(name='demo-experiment')

run = client.run_pipeline(experiment.id, 'demo-run', pipeline_id=pipeline_info.id)
```
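The `cos_dependencies_archive` parameter names a `tar.gz` archive in the object storage bucket from which the operator fetches a notebook's file dependencies at run time. As a minimal sketch of how such an archive could be produced with Python's standard `tarfile` module (the file names below are made up for illustration, and the exact layout the operator expects should be checked against the Elyra documentation):

```python
import os
import tarfile
import tempfile

def build_dependencies_archive(archive_path, files):
    """Bundle notebook dependency files into a gzipped tar archive.

    The resulting archive would then be uploaded to the bucket referenced
    by cos_endpoint/cos_bucket/cos_directory in the pipeline example.
    """
    with tarfile.open(archive_path, 'w:gz') as tar:
        for path in files:
            # Store each file at the archive root; this assumes the
            # archive is extracted into the notebook's working directory.
            tar.add(path, arcname=os.path.basename(path))
    return archive_path

# Illustrative usage with a throwaway file
workdir = tempfile.mkdtemp()
dep = os.path.join(workdir, 'requirements.txt')
with open(dep, 'w') as f:
    f.write('pandas\n')

archive = build_dependencies_archive(
    os.path.join(workdir, 'test-archive.tar.gz'), [dep])

with tarfile.open(archive, 'r:gz') as tar:
    print(tar.getnames())  # → ['requirements.txt']
```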
Elyra is an open source set of AI-centric extensions to JupyterLab Notebooks. The project is hosted as an incubation project in the LF AI & Data Foundation.