This repo contains example machine learning models you can use to try out Cog.
Once you've got a working model and want to publish it so others can see it in action, check out replicate.com/docs.
The models in this repo are small and contrived. Here are a few real-world examples:
Having trouble getting a model working? Let us know and we'll help. If you've encountered a problem with Cog, you can file a GitHub issue. Otherwise, chat with us on Discord or send us an email at [email protected].
cog.yaml:
```
build:
  gpu: true
  python_version: "3.8"
  python_packages:
    - "numpy"
    - "pandas"
    - "pillow==9.2.0"
    - "torch==1.13.1"
    - "transformers"
    - "pysbd"
predict: "predict.py:Predictor"
```
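For context, the `predict: "predict.py:Predictor"` line points Cog at a predictor class in `predict.py`. A minimal sketch of what that file could look like (the transformers pipeline and the text input/output here are illustrative assumptions, not taken from the report):

```
# predict.py -- minimal sketch of the Predictor referenced by cog.yaml.
# The sentiment-analysis pipeline and its input/output are illustrative.
from cog import BasePredictor, Input
from transformers import pipeline


class Predictor(BasePredictor):
    def setup(self):
        # Load the model once when the container starts.
        self.pipe = pipeline("sentiment-analysis")

    def predict(self, text: str = Input(description="Text to classify")) -> str:
        # Run inference and return a plain-string result.
        result = self.pipe(text)[0]
        return f"{result['label']} ({result['score']:.3f})"
```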
I'm getting the error below from Docker:
```
Building Docker image from environment in cog.yaml...
[+] Building 394.3s (18/18) FINISHED
=> [internal] load build definition from Dockerfile 0.1s
=> => transferring dockerfile: 2.01kB 0.0s
=> [internal] load .dockerignore 0.1s
=> => transferring context: 2B 0.0s
=> resolve image config for docker.io/docker/dockerfile:1.2 0.3s
=> CACHED docker-image://docker.io/docker/dockerfile:1.2@sha256:e2a8561e419ab1ba6b2fe6cbdf49fd92b95912df1cf7d313c3e2230a333fdbcc 0.0s
=> [internal] load metadata for docker.io/nvidia/cuda:11.2.0-cudnn8-devel-ubuntu20.04 1.6s
=> [internal] load build context 0.1s
=> => transferring context: 40.50kB 0.0s
=> [stage-0 1/10] FROM docker.io/nvidia/cuda:11.2.0-cudnn8-devel-ubuntu20.04@sha256:764fb6d1fc3df959612037cd20908aaff62436ec51a0a2bf445df6bb94cd24e1 217.1s
=> => resolve docker.io/nvidia/cuda:11.2.0-cudnn8-devel-ubuntu20.04@sha256:764fb6d1fc3df959612037cd20908aaff62436ec51a0a2bf445df6bb94cd24e1 0.0s
=> => sha256:780f4c3f099464364ab2449fad3f5b68d46c5d11da2c6f48ec51e4b59aa5cde7 2.43kB / 2.43kB 0.0s
=> => sha256:eaead16dc43bb8811d4ff450935d607f9ba4baffda4fc110cc402fa43f601d83 28.58MB / 28.58MB 2.3s
=> => sha256:764fb6d1fc3df959612037cd20908aaff62436ec51a0a2bf445df6bb94cd24e1 743B / 743B 0.0s
=> => sha256:024b0a00097da87a217d9587198ed0028dbd89b999fdb754bc8af87420f2a925 15.22kB / 15.22kB 0.0s
=> => sha256:c261e48f49ff9aa520143aa2874a7285c27708f6614627b2d2b854401fcc3e58 7.93MB / 7.93MB 19.7s
=> => sha256:f7ab41eda8be2641b5de237727cbea10b05c74620e80c011447aba1bcd3c095a 11.04MB / 11.04MB 2.3s
=> => extracting sha256:eaead16dc43bb8811d4ff450935d607f9ba4baffda4fc110cc402fa43f601d83 0.4s
=> => sha256:b83c93effa812d146d35aebe893aa5e96297e339bff8bf21f307f672f5b1810d 6.43kB / 6.43kB 2.5s
=> => sha256:bb3b051d276e080821222bf451ce90d092070dc93d5869a137f00320937949c1 184B / 184B 2.7s
=> => sha256:8e60e5d0729267f93cdd1f945fad3cd20fd95b1bc645aa4bd4be4d401afc8bf1 1.04GB / 1.04GB 164.9s
=> => sha256:64f7e567a05ce3c8027ecabdfdb1510bfa54cbc3c08889fb34443ce3955e61b6 61.46kB / 61.46kB 3.2s
=> => sha256:bd1330614aec39f7a56a6f5f3777c7110a81284f512da78a0774e7c7f7fdfe3a 1.15GB / 1.15GB 173.2s
=> => extracting sha256:c261e48f49ff9aa520143aa2874a7285c27708f6614627b2d2b854401fcc3e58 0.1s
=> => sha256:f0cc2865d06f056e501f1574935d9c1e60027a3a874cdcac7ec529d404acdd05 84.21kB / 84.21kB 20.2s
=> => extracting sha256:f7ab41eda8be2641b5de237727cbea10b05c74620e80c011447aba1bcd3c095a 0.2s
=> => extracting sha256:bb3b051d276e080821222bf451ce90d092070dc93d5869a137f00320937949c1 0.0s
=> => sha256:6dfa9a68bf6ac64163457cc36a88bc071e086ec16dd8c3c15cc6da4ba04cbb0f 1.31GB / 1.31GB 202.6s
=> => extracting sha256:b83c93effa812d146d35aebe893aa5e96297e339bff8bf21f307f672f5b1810d 0.0s
=> => extracting sha256:8e60e5d0729267f93cdd1f945fad3cd20fd95b1bc645aa4bd4be4d401afc8bf1 8.5s
=> => extracting sha256:64f7e567a05ce3c8027ecabdfdb1510bfa54cbc3c08889fb34443ce3955e61b6 0.0s
=> => extracting sha256:bd1330614aec39f7a56a6f5f3777c7110a81284f512da78a0774e7c7f7fdfe3a 10.9s
=> => extracting sha256:f0cc2865d06f056e501f1574935d9c1e60027a3a874cdcac7ec529d404acdd05 0.0s
=> => extracting sha256:6dfa9a68bf6ac64163457cc36a88bc071e086ec16dd8c3c15cc6da4ba04cbb0f 13.4s
=> [stage-0 2/10] RUN rm -f /etc/apt/sources.list.d/cuda.list && rm -f /etc/apt/sources.list.d/nvidia-ml.list && apt-key del 7fa2af80 0.6s
=> [stage-0 3/10] RUN --mount=type=cache,target=/var/cache/apt set -eux; apt-get update -qq; apt-get install -qqy --no-install-recommends curl; rm -rf /var/lib/a 10.7s
=> [stage-0 4/10] RUN --mount=type=cache,target=/var/cache/apt apt-get update -qq && apt-get install -qqy --no-install-recommends make build-essential libssl- 41.1s
=> [stage-0 5/10] RUN curl -s -S -L https://raw.githubusercontent.com/pyenv/pyenv-installer/master/bin/pyenv-installer | bash && git clone https://github.com/mo 62.7s
=> [stage-0 6/10] COPY .cog/tmp/build1447658136/cog-0.0.1.dev-py3-none-any.whl /tmp/cog-0.0.1.dev-py3-none-any.whl 0.1s
=> [stage-0 7/10] RUN --mount=type=cache,target=/root/.cache/pip pip install /tmp/cog-0.0.1.dev-py3-none-any.whl 8.3s
=> [stage-0 8/10] COPY .cog/tmp/build1447658136/requirements.txt /tmp/requirements.txt 0.1s
=> [stage-0 9/10] RUN --mount=type=cache,target=/root/.cache/pip pip install -r /tmp/requirements.txt 39.4s
=> [stage-0 10/10] WORKDIR /src 0.1s
=> exporting to image 0.1s
=> => exporting layers 0.0s
=> => writing image sha256:e31584317928bf9bae132645d99209f486d15dba63106217e734f13dfc717f71 0.0s
=> => naming to docker.io/library/cog-cogtest-base 0.0s
=> exporting cache 0.0s
=> => preparing build cache for export 0.0s
Running 'bash' in Docker with the current directory mounted as a volume...
docker: Error response from daemon: could not select device driver "" with capabilities: [[gpu]].
ⅹ Docker is missing required device driver
```
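That last error usually means Docker on the host can't reach the NVIDIA container runtime, so the `gpu: true` setting in cog.yaml can't be honored. As a quick sanity check (a hedged diagnostic sketch, not part of Cog; the CUDA image tag is only an example), one could verify that Docker can hand a GPU to any container at all:

```
# check_gpu.py -- hedged diagnostic sketch (not part of Cog): verify that
# Docker on the host can expose a GPU to a container. The CUDA image tag
# below is only an example.
import subprocess


def docker_gpu_available(image: str = "nvidia/cuda:11.2.0-base-ubuntu20.04") -> bool:
    """Return True if `docker run --gpus all ... nvidia-smi` exits cleanly."""
    try:
        result = subprocess.run(
            ["docker", "run", "--rm", "--gpus", "all", image, "nvidia-smi"],
            capture_output=True,
            text=True,
            timeout=300,
        )
    except (FileNotFoundError, subprocess.TimeoutExpired):
        return False
    return result.returncode == 0


if __name__ == "__main__":
    if docker_gpu_available():
        print("Docker can see a GPU")
    else:
        print("Docker cannot see a GPU; check the NVIDIA Container Toolkit setup")
```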
I stumbled into this error after running the notebook example. I'm not sure exactly what is wrong.
```
Building Docker image from environment in cog.yaml...
$ docker buildx build --platform linux/amd64 --file - --build-arg BUILDKIT_INLINE_CACHE=1 --tag cog-notebook-base --progress auto .
[+] Building 1.5s (13/13) FINISHED
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 590B 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> resolve image config for docker.io/docker/dockerfile:1.2 0.4s
=> CACHED docker-image://docker.io/docker/dockerfile:1.2@sha256:e2a8561e419ab1ba6b2fe6cbdf49fd92b95912df1cf7d313c3e2230a333fdbcc 0.0s
=> [internal] load metadata for docker.io/library/python:3.9 0.4s
=> [internal] load build context 0.0s
=> => transferring context: 31.75kB 0.0s
=> [stage-0 1/5] FROM docker.io/library/python:3.9@sha256:475fe86ebf1da48ea27009a8f7d7e96231af4142de918a68010d48d0abb9c9c5 0.0s
=> CACHED [stage-0 2/5] COPY .cog/tmp/build3517166673/cog-0.0.1.dev-py3-none-any.whl /tmp/cog-0.0.1.dev-py3-none-any.whl 0.0s
=> CACHED [stage-0 3/5] RUN --mount=type=cache,target=/root/.cache/pip pip install /tmp/cog-0.0.1.dev-py3-none-any.whl 0.0s
=> CACHED [stage-0 4/5] RUN --mount=type=cache,target=/root/.cache/pip pip install jupyterlab==3.2.4 0.0s
=> CACHED [stage-0 5/5] WORKDIR /src 0.0s
=> exporting to image 0.0s
=> => exporting layers 0.0s
=> => writing image sha256:45a9ad9948b152eb756fb39d04d2adc6a74946793ad3828d8a90f0d1fedb7207 0.0s
=> => naming to docker.io/library/cog-notebook-base 0.0s
=> exporting cache 0.0s
=> => preparing build cache for export 0.0s

Running 'jupyter notebook --allow-root --ip=0.0.0.0' in Docker with the current directory mounted as a volume...
$ docker run --rm --shm-size 8G --interactive --publish 8888:8888 --tty --mount type=bind,source=/Users/onevarez/WebstormProjects/cog-examples/notebook,destination=/src --workdir /src cog-notebook-base jupyter notebook --allow-root --ip=0.0.0.0
WARNING: The requested image's platform (linux/amd64) does not match the detected host platform (linux/arm64/v8) and no specific platform was requested
usage: jupyter [-h] [--version] [--config-dir] [--data-dir] [--runtime-dir] [--paths] [--json] [--debug] [subcommand]

Jupyter: Interactive Computing

positional arguments:
  subcommand     the subcommand to launch

optional arguments:
  -h, --help     show this help message and exit
  --version      show the versions of core jupyter packages and exit
  --config-dir   show Jupyter config dir
  --data-dir     show Jupyter data dir
  --runtime-dir  show Jupyter runtime dir
  --paths        show all Jupyter paths. Add --json for machine-readable format.
  --json         output paths as machine-readable json
  --debug        output debug information about paths

Available subcommands: dejavu execute kernel kernelspec lab labextension labhub migrate nbclassic nbconvert run server troubleshoot trust

Jupyter command `jupyter-notebook` not found.
ⅹ exit status 1
```
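Judging from the output above, the image installs jupyterlab==3.2.4 and the available subcommands include `lab` and `nbclassic` but not `notebook`, which is why `jupyter notebook` fails. A small, hedged way to confirm which Jupyter packages are actually importable inside the container (package names are the usual PyPI ones; this is a diagnostic sketch, not a fix):

```
# check_jupyter.py -- hedged diagnostic sketch: list which Jupyter packages
# are importable inside the image.
import importlib.util

for pkg in ("jupyterlab", "notebook", "nbclassic"):
    spec = importlib.util.find_spec(pkg)
    print(f"{pkg}: {'installed' if spec else 'missing'}")
```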
I made this change locally to bypass the ParamSpec error, but then got a protobuf error:
```
Building Docker image from environment in cog.yaml as cog-resnet...
[+] Building 3.0s (14/14) FINISHED
=> [internal] load build definition from Dockerfile 0.0s
=> => transferring dockerfile: 622B 0.0s
=> [internal] load .dockerignore 0.0s
=> => transferring context: 2B 0.0s
=> resolve image config for docker.io/docker/dockerfile:1.2 0.1s
=> CACHED docker-image://docker.io/docker/dockerfile:1.2@sha256:e2a8561e419ab1ba6b2fe6cbdf49fd92b95912df1cf7d313 0.0s
=> [internal] load metadata for docker.io/library/python:3.8 0.1s
=> [internal] load build context 0.0s
=> => transferring context: 29.32kB 0.0s
=> [stage-0 1/6] FROM docker.io/library/python:3.8@sha256:f8dd6cc493bb667f693293f69927ae7c5ebf430a88b9d384c0c3ee 0.0s
=> CACHED [stage-0 2/6] COPY .cog/tmp/build3752096533/cog-0.0.1.dev-py3-none-any.whl /tmp/cog-0.0.1.dev-py3-none 0.0s
=> CACHED [stage-0 3/6] RUN --mount=type=cache,target=/root/.cache/pip pip install /tmp/cog-0.0.1.dev-py3-none-a 0.0s
=> CACHED [stage-0 4/6] RUN --mount=type=cache,target=/root/.cache/pip pip install pillow==9.1.0 tensorflow==2 0.0s
=> CACHED [stage-0 5/6] WORKDIR /src 0.0s
=> [stage-0 6/6] COPY . /src 0.1s
=> exporting to image 0.0s
=> => exporting layers 0.0s
=> => writing image sha256:034e64239bd4618f4272e0b719fd4263366a34a9a37eacdaf77139887c179680 0.0s
=> => naming to docker.io/library/cog-resnet 0.0s
=> exporting cache 0.0s
=> => preparing build cache for export 0.0s
Adding labels to image...
Traceback (most recent call last):
File "/usr/local/lib/python3.8/runpy.py", line 194, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/usr/local/lib/python3.8/runpy.py", line 87, in _run_code
exec(code, run_globals)
File "/usr/local/lib/python3.8/site-packages/cog/command/openapi_schema.py", line 18, in
More information: https://developers.google.com/protocol-buffers/docs/news/2022-05-06#python-updates
ⅹ Failed to get type signature: exit status 1
```
I tried different versions of protobuf (3.17, 3.20, and 4.21), and none of them worked. I also tried setting PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION=python, but that didn't work either.
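One nuance worth noting: exporting PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION in the host shell may never reach the schema-extraction step, which runs inside the container (that's the `cog/command/openapi_schema.py` process in the traceback). A hedged thing to try, not a confirmed fix, is setting the variable at the very top of predict.py, before TensorFlow (and therefore protobuf) is imported:

```
# predict.py -- hedged sketch of a possible workaround, not a confirmed fix.
# Force the pure-Python protobuf implementation inside the container, before
# TensorFlow (and therefore protobuf) is imported.
import os

os.environ.setdefault("PROTOCOL_BUFFERS_PYTHON_IMPLEMENTATION", "python")

import tensorflow as tf  # must come after the environment variable is set
from cog import BasePredictor, Input, Path


class Predictor(BasePredictor):
    def setup(self):
        # Illustrative model load; the real resnet example loads its own weights.
        self.model = tf.keras.applications.ResNet50(weights=None)

    def predict(self, image: Path = Input(description="Image to classify")) -> str:
        # Placeholder: the real example preprocesses the image and returns labels.
        return f"model has {self.model.count_params()} parameters"
```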
Parallel to https://github.com/replicate/cog/pull/593.
A super tiny change that removes friction around token differences.